CN114596210A - Noise estimation method, device, terminal equipment and computer readable storage medium - Google Patents

Noise estimation method, device, terminal equipment and computer readable storage medium

Info

Publication number
CN114596210A
CN114596210A (application number CN202011418618.1A)
Authority
CN
China
Prior art keywords
image
edge
processed
edge map
noise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011418618.1A
Other languages
Chinese (zh)
Inventor
郑加章
刘阳兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan TCL Group Industrial Research Institute Co Ltd
Original Assignee
Wuhan TCL Group Industrial Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan TCL Group Industrial Research Institute Co Ltd filed Critical Wuhan TCL Group Industrial Research Institute Co Ltd
Priority to CN202011418618.1A priority Critical patent/CN114596210A/en
Publication of CN114596210A publication Critical patent/CN114596210A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/70
    • G06T7/00 Image analysis > G06T7/10 Segmentation; Edge detection > G06T7/13 Edge detection
    • G06T7/181 Segmentation; Edge detection involving edge growing; involving edge linking
    • G06T2207/00 Indexing scheme for image analysis or image enhancement > G06T2207/20 Special algorithmic details > G06T2207/20024 Filtering details
    • G06T2207/20036 Morphological image processing
    • G06T2207/20172 Image enhancement details > G06T2207/20192 Edge enhancement; Edge preservation

Abstract

The application provides a noise estimation method, a noise estimation apparatus, a terminal device, and a computer-readable storage medium. The method includes: performing Laplace filtering on an image to be processed to obtain a Laplace filtering result; performing edge detection operator filtering on the image to be processed to obtain a first edge map; determining a second edge map according to the first edge map, where the second edge map is obtained by performing edge enhancement on the first edge map; and determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map. By combining the second edge map with the Laplace filtering result, the noise level can be estimated accurately, so that the denoising strength can be better controlled.

Description

Noise estimation method, device, terminal equipment and computer readable storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to a noise estimation method, apparatus, terminal device, and computer-readable storage medium.
Background
Image noise refers to unnecessary or redundant interference information present in image data. Noise is one of the most important factors affecting the visual effect of an image, and it hinders subsequent image processing such as segmentation, compression, and image understanding. At night or in a dark environment, a captured image may have insufficient brightness and excessive noise due to insufficient light. In a dim-light environment, a captured picture contains noise caused by the dark current of the photosensitive component, Gaussian noise caused by the sensor's hardware readout, shot noise that varies with the signal, and the like.
In order to suppress noise, reduce the influence of low signal-to-noise ratio caused by noise, improve image quality, improve the visual effect of an image and facilitate higher-level processing, the image needs to be subjected to denoising preprocessing. Most of the traditional denoising methods need to estimate the noise condition of an image first and then denoise.
Disclosure of Invention
The embodiment of the application provides a noise estimation method, a noise estimation device, terminal equipment and a computer readable storage medium, which can effectively and accurately estimate the noise level.
In a first aspect, an embodiment of the present application provides a noise estimation method, including:
performing Laplace filtering on the image to be processed to obtain a Laplace filtering result;
performing edge detection operator filtering on the image to be processed to obtain a first edge map;
determining a second edge map according to the first edge map, wherein the second edge map is obtained by performing edge enhancement on the first edge map;
and determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map.
In a second aspect, an embodiment of the present application provides a noise estimation apparatus, including:
the first filtering module is used for performing Laplace filtering on the image to be processed to obtain a Laplace filtering result;
the second filtering module is used for performing edge detection operator filtering on the image to be processed to obtain a first edge map;
the edge map generation module is used for determining a second edge map according to the first edge map, and the second edge map is obtained by performing edge enhancement on the first edge map;
and the noise estimation module is used for determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a memory, a processor, and a noise estimation program stored in the memory and executable on the processor, and the processor implements the noise estimation method of any one of the first aspects when executing the noise estimation program.
In a fourth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the noise estimation method of any one of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where a noise estimation program is stored, and when the noise estimation program is executed by a processor, the noise estimation method in any one of the first aspects is implemented.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
The embodiments of the application provide a noise estimation method, apparatus, and terminal device. The method includes: performing Laplace filtering on the image to be processed to obtain a Laplace filtering result; performing edge detection operator filtering on the image to be processed to obtain a first edge map; determining a second edge map according to the first edge map, where the second edge map is obtained by performing edge enhancement on the first edge map; and determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map. By combining the edge map with the Laplace filtering result, the noise level can be estimated accurately, so that the denoising strength can be better controlled.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flow chart of a noise estimation method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a noise estimation apparatus provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if a described condition or event is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting the described condition or event", or "in response to detecting the described condition or event".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Referring to fig. 1, a flow chart of a noise estimation method is shown, which may be applied to a terminal device, where the terminal device may be a mobile phone, a tablet, or a computer, and the type of the terminal device is not limited herein. The method may comprise the steps of:
step S101: the laplacian filtering is performed on the image to be processed to obtain a laplacian filtering result, and the laplacian filtering method specifically comprises the following steps:
step S1011: when the image to be processed is an RGB image, the image to be processed is converted into a YUV image, and the chromaticity U, V and the brightness Y of the image can be separated. The image to be processed is a face image or a landscape image. When the image to be processed is a YUV image, step S1012 is performed.
Step S1012: and performing convolution operation on the Y channel of the YUV image by using a Laplace operator to obtain a Laplace filtering result.
The Laplace operator is a convolution kernel representing the second-order difference in the x and y directions of the image to be processed. By performing the convolution operation on the Y channel with the Laplace operator, the edges of the image and the noise information can be better retained and enhanced.
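Steps S1011 and S1012 can be sketched as follows. This is only an illustrative implementation, not the patent's own code: the BT.601 luma weights, the edge padding, and the 4-neighbour Laplace kernel are all assumptions that the text leaves open.

```python
import numpy as np

def rgb_to_y(rgb):
    # Assumed BT.601 luma conversion; the patent does not fix the RGB->YUV matrix.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def conv2d(img, kernel):
    # Same-size 2-D correlation with edge padding; for the symmetric Laplace
    # kernel below, correlation and convolution coincide.
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(k):
        for j in range(k):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

# 4-neighbour Laplace kernel: second-order difference in the x and y directions.
LAPLACE = np.array([[0,  1, 0],
                    [1, -4, 1],
                    [0,  1, 0]], dtype=np.float64)

def laplace_filter(image):
    # Step S1011: RGB input is reduced to its Y (luminance) channel first.
    y = rgb_to_y(image.astype(np.float64)) if image.ndim == 3 else image.astype(np.float64)
    # Step S1012: convolve the Y channel with the Laplace operator.
    return conv2d(y, LAPLACE)
```

On a flat image the response is zero everywhere, while isolated bright pixels (noise) produce strong responses, which is why this result is later averaged over smooth regions.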
Step S102: and filtering the image to be processed by using an edge detection operator to obtain a first edge image.
In some possible embodiments, the filtering, by an edge detection operator, the image to be processed to obtain a first edge map includes:
when the image to be processed is an RGB image, performing a convolution operation in the x direction of the image to be processed with the transverse edge detection operator to obtain a horizontal edge map Gx;
performing a convolution operation in the y direction of the image to be processed with the longitudinal edge detection operator to obtain a vertical edge map Gy;
determining a first edge map according to the horizontal edge map and the vertical edge map;
or,
when the image to be processed is a YUV image, performing edge detection operator filtering on the Y channel of the YUV image to obtain the first edge map.
The edge detection operator may be the Sobel operator, the Roberts operator, or any other operator that can effectively detect image edge information. Taking the Sobel operator as an example, the operation formula is:
Gx = Sx * I,  Gy = Sy * I  (where * denotes two-dimensional convolution)
wherein I represents an image to be processed or an RGB image converted from the image to be processed; the transverse Sobel operator is
     [ -1  0  +1 ]
Sx = [ -2  0  +2 ]
     [ -1  0  +1 ]
The vertical Sobel operator is
     [ -1  -2  -1 ]
Sy = [  0   0   0 ]
     [ +1  +2  +1 ]
Optionally, Gx and Gy are converted and normalized to 8-bit standard image data.
Gx' = |x|, if -255 < x < 0;  Gx' = 255, if x >= 255 or x <= -255;  Gx' = x, if 0 <= x < 255
Gy' = |y|, if -255 < y < 0;  Gy' = 255, if y >= 255 or y <= -255;  Gy' = y, if 0 <= y < 255
If the pixel values of the pixel points in the horizontal edge image and the vertical edge image are larger than-255 and smaller than 0, resetting the pixel values of the pixel points to be the absolute values of the original pixel values;
if the pixel values of the pixel points in the horizontal edge image and the vertical edge image are not less than 255 or not greater than -255, resetting the pixel values of the pixel points to be 255;
and if the pixel values of the pixel points in the horizontal edge image and the vertical edge image are more than or equal to 0 and less than 255, keeping the pixel values of the pixel points unchanged.
Wherein x represents the pixel value of the pixel point in the horizontal edge map Gx, and y represents the pixel value of the pixel point in the vertical edge map Gy.
Optionally, the horizontal edge map and the vertical edge map are each given a weight of one half and superimposed, and the resulting Sobel filtering result S is used as the first edge map.
S=0.5*Gx+0.5*Gy
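The Sobel filtering of step S102 can be sketched as below. This is an assumed implementation: the three-case clamping rule from the text is collapsed into the equivalent min(|v|, 255), and edge padding is an assumption.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float64)   # transverse operator Sx
SOBEL_Y = SOBEL_X.T                                   # longitudinal operator Sy

def conv2d(img, kernel):
    # Same-size 2-D correlation with edge padding (naive, for clarity).
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(k):
        for j in range(k):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def clamp_8bit(g):
    # Equivalent to the piecewise rule in the text: values in (-255, 0) take
    # their absolute value, magnitudes >= 255 saturate at 255, [0, 255) kept.
    return np.minimum(np.abs(g), 255.0)

def sobel_edge_map(y_channel):
    gx = clamp_8bit(conv2d(y_channel, SOBEL_X))  # horizontal edge map Gx
    gy = clamp_8bit(conv2d(y_channel, SOBEL_Y))  # vertical edge map Gy
    return 0.5 * gx + 0.5 * gy                   # S = 0.5*Gx + 0.5*Gy
```

A vertical step in the Y channel yields a strong response from Gx along the step and zero elsewhere, so S localizes the edge.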
According to the method and the device, the edge information of the image to be processed can be extracted through filtering of the edge detection operator. And the first edge map obtained by filtering by the edge detection operator is used for calculating a second edge map of the image to be processed.
Step S103: and determining a second edge map according to the first edge map, wherein the second edge map is obtained by performing edge enhancement on the first edge map.
In some possible embodiments, determining the second edge map from the first edge map comprises:
the first step is as follows: when the first edge map is an RGB image, the first edge map is converted into a YUV image.
If the first edge map is not a YUV image, the first edge map is first converted into a YUV image. The application does not specifically limit how the first edge map is converted into the YUV image. If the first edge map is a YUV image, the second step is directly performed.
The second step is that: selecting a Y-channel pixel value which meets a preset condition in the YUV image as a binarization threshold value;
The binarization threshold is the Y-channel pixel value at a position that meets the preset condition: when the Y-channel pixel values of the YUV image are sorted, the positions meeting the preset condition lie in the 60%-90% range of the sorted sequence; for example, the value at the 85% position may be selected as the binarization threshold. The binarization threshold represents the confidence of an edge, so strong edge information can be retained.
The third step: and carrying out binarization on the YUV image according to a binarization threshold value to obtain a binarization edge map, wherein the binarization edge map is used as a second edge map.
Pixel values on the Y channel that are greater than the binarization threshold are set to 0, and pixel values less than or equal to the binarization threshold are set to 255, yielding the binarized edge map, in which positions with value 0 are edge positions and positions with value 255 belong to smooth regions.
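The second and third steps above amount to a percentile-based binarization, sketched here under the assumption that NumPy's linear-interpolation percentile is an acceptable stand-in for "the value at the 85% position of the sorted sequence":

```python
import numpy as np

def binarize_edges(edge_map, percentile=85.0):
    # Threshold = the value at the chosen position (60%-90% of the sorted
    # sequence; 85% is the example in the text) of the Y-channel pixel values.
    thr = np.percentile(edge_map, percentile)
    # Edge responses above the threshold become 0 (edge positions);
    # everything else becomes 255 (smooth regions).
    return np.where(edge_map > thr, 0, 255).astype(np.uint8)
```

Because the threshold is a high percentile of the edge responses themselves, roughly the strongest 15% of responses are kept as edges, which keeps the edge confidence adaptive to the image content.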
In one embodiment, after obtaining the binarized edge map, three processing cases are included:
The first: performing a noise-point removal operation on the binarized edge map.
The noise-point removal operation on the binarized edge map includes the following steps:
traversing the binary edge map, and judging whether the current pixel point is a noise point;
if the value of pixel I (x, y) in the binarized edge map is 0 and the number of pixels in its designated neighborhood of 0 is less than the first threshold, then pixel I (x, y) is identified as noise.
If the pixel value of the current pixel point is 0 and the number of pixel points with pixel value 0 within the preset neighborhood range of the current pixel point is less than the first threshold, the current pixel point is determined to be a noise point and its pixel value is reset to 255. The traversal terminates when the termination condition is met.
For example, the specified neighborhood is a 3 × 3 region, and the first threshold is the neighborhood area multiplied by 0.3, that is, 9 × 0.3 = 2.7; the second threshold may be, for example, the total number of pixels of the image multiplied by 0.05. The first threshold and the neighborhood range can be adjusted according to actual conditions, provided that obvious noise points are removed and the main edge information of the image is retained.
The traversal termination condition is: the number of pixels that have a pixel value of 0 and are not noise points is less than the second threshold, or no noise points remain.
And circularly traversing the binary edge map for multiple times until the number of pixels which have values of 0 and are not noise points on the Y channel is less than a second threshold value or no noise point pixels exist on the Y channel. Wherein the second threshold value is set according to actual needs.
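The repeated traversal described above can be sketched as follows. The 3 × 3 neighborhood and the 0.3 ratio follow the example in the text; padding the border with 255 (smooth) is an assumption, and for brevity this sketch loops until no noise point remains rather than also checking a second threshold.

```python
import numpy as np

def remove_noise_points(bin_edge, neighborhood=3, ratio=0.3):
    # A 0-valued pixel whose neighborhood contains fewer than
    # ratio * neighborhood**2 zero pixels (e.g. 9 * 0.3 = 2.7, including
    # itself) is treated as an isolated noise point and reset to 255.
    first_threshold = ratio * neighborhood * neighborhood
    pad = neighborhood // 2
    h, w = bin_edge.shape
    out = bin_edge.copy()
    changed = True
    while changed:                       # cyclically traverse the map
        changed = False
        padded = np.pad(out, pad, constant_values=255)
        for i in range(h):
            for j in range(w):
                if out[i, j] != 0:
                    continue
                block = padded[i:i + neighborhood, j:j + neighborhood]
                if np.count_nonzero(block == 0) < first_threshold:
                    out[i, j] = 255      # isolated zero -> noise point
                    changed = True
    return out
```

An isolated zero pixel has only one zero in its neighborhood (itself) and is removed, while a connected zero region whose pixels each see at least three zeros survives as edge information.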
Performing morphological operation on the binary edge image after the noise points are removed to obtain a binary edge image after the morphological operation;
and updating the binary edge map after the morphological operation into a second edge map.
Performing morphological operations, such as multiple erosions, on the binarized edge map with the noise points removed can better enhance the edge information. For example, the size of the erosion kernel is set to 5 × 5, and the erosion operation is performed twice. The specific erosion parameters can be adjusted according to actual conditions, so that the edge information is enhanced without being excessively expanded.
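The erosion step can be sketched as a neighborhood minimum. Note the polarity: since edges are 0 in the binarized edge map, taking the minimum over a square window grows the zero regions, i.e. thickens the edges. The 5 × 5 kernel and 2 iterations follow the example in the text; padding with 255 is an assumption.

```python
import numpy as np

def erode(img, ksize=5, iterations=2):
    # Grayscale erosion: each output pixel is the minimum of a ksize x ksize
    # window around it. On a 0/255 edge map this thickens the 0-valued edges.
    pad = ksize // 2
    out = img.copy()
    h, w = img.shape
    for _ in range(iterations):
        padded = np.pad(out, pad, constant_values=255)
        windows = [padded[i:i + h, j:j + w]
                   for i in range(ksize) for j in range(ksize)]
        out = np.minimum.reduce(windows)
    return out
```

Each iteration expands every edge pixel by 2 in every direction (Chebyshev distance), so two 5 × 5 erosions turn a single edge pixel into a 9 × 9 block.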
The second: removing only the noise points in the binarized edge map.
Specifically, how to remove the noise in the binarized edge map is the same as the above, and is not described here again.
The third: performing a morphological operation on the binarized edge map.
Performing morphological operations, such as multiple erosions, on the binarized edge map can better enhance the edge information, yielding the updated second edge map.
It is understood that Laplace filtering and edge detection operator filtering are not limited in sequence.
Step S104: and determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map.
And traversing the second edge graph, and counting the noise level of the Laplace filtering result of the smooth region, wherein the specific operation is as follows:
counting the total number n and the pixel positions of pixel points with the pixel value of 255 in the second edge image;
calculating the sum V of the absolute values of the Laplace filtering results corresponding to each pixel position;
and dividing the sum V of the absolute values by the total number n of the pixels to obtain the noise level G of the image to be processed.
The noise level reflects the noise mean strength of the image to be processed.
G = V / n = (1/n) * Σ |L(x, y)|, where the sum runs over the n pixel positions with value 255 in the second edge map, and L(x, y) is the Laplace filtering result at position (x, y).
In subsequent applications, the noise mean strength G may be multiplied by different adjustment coefficients according to the application scenario's requirements on the magnitude of the noise level value.
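The statistic of step S104 can be sketched directly from the three operations above; this mirrors the description (sum of |Laplace| over smooth pixels divided by their count), with the n = 0 guard added as an assumption:

```python
import numpy as np

def noise_level(laplace_result, second_edge_map):
    # Smooth-region pixels are those with value 255 in the second edge map.
    smooth = second_edge_map == 255
    n = np.count_nonzero(smooth)              # total number n of smooth pixels
    if n == 0:
        return 0.0                            # no smooth region to measure
    v = np.abs(laplace_result[smooth]).sum()  # sum V of |Laplace| responses
    return float(v / n)                       # G = V / n
```

Restricting the average to smooth regions keeps real edges (which also produce strong Laplace responses) from inflating the noise estimate.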
According to the embodiment of the application, the noise level is determined by combining the second edge map and the Laplace filtering result, the noise information can be accurately estimated, the denoising strength can be better controlled, and more detail information can be reserved while denoising.
After determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map, the method may further include: and removing the noise of the image to be processed according to the noise level, monitoring the noise of the image to be processed or screening the image to be processed.
Fig. 2 is a schematic diagram of a noise estimation apparatus according to an embodiment of the present application, and for convenience of description, only a portion related to the embodiment of the present invention is shown, including:
the first filtering module 21 is configured to perform Laplace filtering on the image to be processed to obtain a Laplace filtering result;
the second filtering module 22 is configured to perform edge detection operator filtering on the image to be processed to obtain a first edge map;
an edge map generating module 23, configured to determine a second edge map according to the first edge map, where the second edge map is obtained by performing edge enhancement on the first edge map;
and the noise estimation module 24 is configured to determine a noise level of the image to be processed according to the Laplace filtering result and the second edge map.
It will be apparent to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely illustrated, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the mobile terminal is divided into different functional units or modules to perform all or part of the above described functions. Each functional module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional modules are only used for distinguishing one functional module from another, and are not used for limiting the protection scope of the application. The specific working process of the module in the mobile terminal may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
Fig. 3 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 3, the terminal device 3 of this embodiment includes: a processor 30, a memory 31, and a noise estimation program 32 stored in the memory 31 and executable on the processor 30. The steps of the noise estimation method described above, such as steps 101 to 104 shown in fig. 1, are implemented when the processor 30 executes the noise estimation program 32. Alternatively, the processor 30, when executing the noise estimation program 32, implements the functions of the various modules/units in the various device embodiments described above, such as the functions of the modules 21 to 24 shown in fig. 2.
Illustratively, the noise estimation program 32 may be partitioned into one or more modules/units, which are stored in the memory 31 and executed by the processor 30 to carry out the invention. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the noise estimation program 32 in the terminal device 3.
The terminal device 3 may be a desktop computer, a notebook, a palm computer, or other computing devices. The terminal equipment may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the terminal device 3 and does not constitute a limitation of the terminal device 3 and may comprise more or less components than those shown, or some components may be combined, or different components, e.g. the terminal device may further comprise input output devices, network access devices, buses, etc.
The Processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The storage 31 may be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may also be an external storage device of the terminal device 3, such as a plug-in hard disk provided on the terminal device 3, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 31 may also include both an internal storage unit of the terminal device 3 and an external storage device. The memory 31 is used to store a noise estimation program 32 and other programs and data required by the terminal device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
The embodiment of the present application further provides a computer-readable storage medium, where the noise estimation program 32 is stored in the computer-readable storage medium, and when being executed by a processor, the noise estimation program 32 implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, a module or a unit may be divided into only one logical function, and may be implemented in other ways, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments of the present invention may also be implemented by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer-readable medium excludes electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (11)

1. A method of noise estimation, comprising:
performing Laplacian (Laplace) filtering on the image to be processed to obtain a Laplace filtering result;
carrying out edge detection operator filtering on the image to be processed to obtain a first edge image;
determining a second edge map according to the first edge map, wherein the second edge map is obtained by performing edge enhancement on the first edge map;
and determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map.
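The four steps of claim 1 can be sketched as follows on a single-channel image. The Sobel operators, the median-based binarization rule, and the edge-replicating border handling are illustrative assumptions; the claim does not fix specific operators, thresholds, or border modes.

```python
import numpy as np

def filter3x3(img, kernel):
    """3x3 neighborhood filter with edge-replicated borders (correlation;
    equivalent to convolution for the symmetric Laplace kernel)."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)

def estimate_noise_level(gray):
    """Noise level = mean |Laplace| response over flat (non-edge) pixels."""
    lap = filter3x3(gray, LAPLACE)                 # Laplace filtering result
    gx = filter3x3(gray, SOBEL_X)                  # horizontal gradient
    gy = filter3x3(gray, SOBEL_X.T)                # vertical gradient
    edge = np.hypot(gx, gy)                        # first edge map
    # Second edge map: keep only flat pixels. The median threshold is a
    # hypothetical "edge enhancement" rule chosen here for illustration.
    flat = edge <= np.median(edge)
    if not flat.any():
        return 0.0
    return float(np.abs(lap[flat]).sum() / flat.sum())
```

On a perfectly flat image the Laplace response vanishes, so the estimate is zero; additive noise raises it.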
2. The method of claim 1, wherein filtering the image to be processed with the edge detection operator to obtain the first edge map comprises:
when the image to be processed is an RGB image, performing convolution operation on an x axis of the image to be processed by adopting a transverse edge detection operator to obtain a horizontal edge image;
performing convolution operation on the y axis of the image to be processed by adopting a longitudinal edge detection operator to obtain a vertical edge image;
determining a first edge map according to the horizontal edge map and the vertical edge map;
or,
and when the image to be processed is a YUV image, carrying out edge detection operator filtering on a Y channel of the YUV image to obtain a first edge image.
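The RGB branch of claim 2 (a transverse operator along x, a longitudinal operator along y, then combining the two maps) might look like this on one channel. The Sobel kernels and the `hypot` magnitude combination are common choices, not mandated by the claim.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T

def correlate3(img, kernel):
    """3x3 filter with edge-replicated borders."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def first_edge_map(channel):
    horizontal = correlate3(channel, SOBEL_X)   # x-axis convolution
    vertical = correlate3(channel, SOBEL_Y)     # y-axis convolution
    # One common way to merge the horizontal and vertical edge maps:
    return np.hypot(horizontal, vertical)
```

A vertical step edge produces a strong response along the boundary columns and zero response in the flat interior.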
3. The method of claim 1, wherein determining a second edge map from the first edge map comprises:
when the first edge map is an RGB image, converting the first edge map into a YUV image;
selecting a Y-channel pixel value which meets a preset condition in the YUV image as a binarization threshold value;
binarizing the YUV image according to the binarization threshold to obtain a binarized edge map, wherein the binarized edge map serves as the second edge map;
or,
when the first edge image is a YUV image, selecting a Y-channel pixel value which meets a preset condition in the YUV image as a binarization threshold value;
and binarizing the YUV image according to the binarization threshold to obtain a binarized edge map, wherein the binarized edge map serves as the second edge map.
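Claim 3's binarization could be sketched as below. Choosing the threshold as a percentile of the Y-channel values is one plausible reading of "a pixel value meeting a preset condition"; the patent does not fix the rule. The orientation (strong edges mapped to 0, flat regions to 255) matches claims 5 and 6, which reset noise points to 255 and accumulate statistics over 255-valued pixels.

```python
import numpy as np

def binarize_edge_map(y_channel, percentile=90):
    """Binarize the Y channel of the first edge map:
    strong edges -> 0, flat regions -> 255 (orientation per claims 5-6).
    The percentile rule is a hypothetical 'preset condition'."""
    threshold = np.percentile(y_channel, percentile)
    second = np.where(y_channel > threshold, 0, 255).astype(np.uint8)
    return second, threshold
```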
4. The method as claimed in claim 3, wherein after the binarizing the YUV image according to the binarization threshold to obtain a binarized edge map, the method further comprises:
performing a noise-point removal operation on the binarized edge map;
performing a morphological operation on the binarized edge map from which the noise points have been removed, to obtain a morphologically processed binarized edge map;
and updating the second edge map to be the morphologically processed binarized edge map.
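One plausible morphological operation for claim 4 is a dilation of the 0-valued edge regions, implemented as a 3x3 minimum filter; the claim does not specify which morphological operation (dilation, erosion, opening, closing) is intended, so this is only an illustrative choice.

```python
import numpy as np

def dilate_edges(edge, iterations=1):
    """Grow the 0-valued (edge) regions with a 3x3 structuring element.
    Taking the neighborhood minimum spreads the 0s into adjacent pixels."""
    out = edge.copy()
    h, w = out.shape
    for _ in range(iterations):
        p = np.pad(out, 1, mode="edge")
        stack = np.stack([p[dy:dy + h, dx:dx + w]
                          for dy in range(3) for dx in range(3)])
        out = stack.min(axis=0)
    return out
```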
5. The method as claimed in claim 4, wherein said performing a de-noising operation on said binarized edge map comprises:
traversing the binarized edge map; if the pixel value of a current pixel is 0 and the number of pixels with pixel value 0 within a preset neighborhood of the current pixel is less than a first threshold, determining the current pixel to be a noise point and resetting its pixel value to 255; and terminating the traversal when a traversal termination condition is met;
wherein the traversal termination condition is: the number of pixels that have a pixel value of 0 and are not noise points is less than a second threshold, or no noise points remain.
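A vectorized single sweep of claim 5's cleanup is sketched below: a 0-valued pixel with fewer than `first_threshold` 0-valued pixels in its 3x3 neighborhood (itself included) is treated as an isolated noise point and reset to 255. The claim's pixel-by-pixel traversal with an early-termination condition is replaced here by one whole-image pass, and the threshold value is an assumption.

```python
import numpy as np

def remove_noise_points(edge, first_threshold=2):
    """Reset isolated 0-valued pixels to 255; 0-pixels that belong to a
    connected edge structure (enough 0-neighbors) are kept."""
    h, w = edge.shape
    p = np.pad(edge, 1, mode="constant", constant_values=255)
    zeros = (p == 0).astype(np.int32)
    count = np.zeros((h, w), dtype=np.int32)   # 0-pixels per 3x3 window
    for dy in range(3):
        for dx in range(3):
            count += zeros[dy:dy + h, dx:dx + w]
    out = edge.copy()
    out[(edge == 0) & (count < first_threshold)] = 255
    return out
```

An isolated 0-pixel is removed, while a short run of connected 0-pixels survives.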
6. The method according to any one of claims 1-5, wherein said determining a noise level of the image to be processed from the Laplace filtering result and the second edge map comprises:
counting the total number and the positions of the pixels whose pixel value is 255 in the second edge map;
calculating the sum of the absolute values of the Laplace filtering results at the counted pixel positions;
and dividing the sum of the absolute values by the total number of the pixels to obtain the noise level of the image to be processed.
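The three steps of claim 6 reduce to a masked mean of the absolute Laplace response; the zero-mask fallback below is an assumption the claim does not address.

```python
import numpy as np

def noise_level(laplace, second_edge_map):
    """Mean |Laplace| over the pixels marked 255 (flat regions) in the
    second edge map: sum of absolute values divided by the pixel count."""
    mask = second_edge_map == 255
    total = mask.sum()            # total number of 255-valued pixels
    if total == 0:
        return 0.0                # assumed fallback when no flat pixels exist
    return float(np.abs(laplace[mask]).sum() / total)
```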
7. The method of claim 1, wherein the laplacian filtering of the image to be processed to obtain a laplacian filtering result comprises:
when the image to be processed is an RGB image, converting the image to be processed into a YUV image;
performing convolution operation on the Y channel of the YUV image by using a Laplace operator to obtain a Laplace filtering result;
or,
and when the image to be processed is a YUV image, performing convolution operation on a Y channel of the image to be processed by using a Laplace operator to obtain a Laplace filtering result.
8. The method according to claim 1, wherein the image to be processed is a face image or a landscape image;
after determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map, the method further includes:
and, according to the noise level, denoising the image to be processed, monitoring noise of the image to be processed, or screening the image to be processed.
9. A noise estimation device, comprising:
the first filtering module is used for performing Laplacian (Laplace) filtering on the image to be processed to obtain a Laplace filtering result;
the second filtering module is used for carrying out edge detection operator filtering on the image to be processed to obtain a first edge image;
an edge map generation module, configured to determine a second edge map according to the first edge map, where the second edge map is obtained by performing edge enhancement on the first edge map;
and the noise estimation module is used for determining the noise level of the image to be processed according to the Laplace filtering result and the second edge map.
10. A terminal device, characterized in that the terminal device comprises a memory, a processor and a noise estimation program stored in the memory and executable on the processor, the processor implementing the noise estimation method according to any of claims 1 to 8 when executing the noise estimation program.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a noise estimation program which, when executed by a processor, implements the noise estimation method according to any one of claims 1 to 8.
CN202011418618.1A 2020-12-07 2020-12-07 Noise estimation method, device, terminal equipment and computer readable storage medium Pending CN114596210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011418618.1A CN114596210A (en) 2020-12-07 2020-12-07 Noise estimation method, device, terminal equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011418618.1A CN114596210A (en) 2020-12-07 2020-12-07 Noise estimation method, device, terminal equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114596210A true CN114596210A (en) 2022-06-07

Family

ID=81802371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011418618.1A Pending CN114596210A (en) 2020-12-07 2020-12-07 Noise estimation method, device, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114596210A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210203900A1 (en) * 2020-04-17 2021-07-01 Beijing Baidu Netcom Science And Technology Co., Ltd. Image processing method and apparatus, electronic device and computer-readable storage medium
US11930307B2 (en) * 2020-04-17 2024-03-12 Beijing Baidu Netcom Science Technology Co., Ltd. Image processing method and apparatus, electronic device and computer-readable storage medium

Similar Documents

Publication Publication Date Title
CN110766679B (en) Lens contamination detection method and device and terminal equipment
CN109146855B (en) Image moire detection method, terminal device and storage medium
CN111080661B (en) Image-based straight line detection method and device and electronic equipment
US11107202B2 (en) Contrast enhancement and reduction of noise in images from cameras
CN110335216B (en) Image processing method, image processing apparatus, terminal device, and readable storage medium
CN109214996B (en) Image processing method and device
CN110708568B (en) Video content mutation detection method and device
CN111861938B (en) Image denoising method and device, electronic equipment and readable storage medium
CN111368587A (en) Scene detection method and device, terminal equipment and computer readable storage medium
CN111383178A (en) Image enhancement method and device and terminal equipment
CN111563517A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114298985B (en) Defect detection method, device, equipment and storage medium
CN111882565A (en) Image binarization method, device, equipment and storage medium
CN113487473B (en) Method and device for adding image watermark, electronic equipment and storage medium
CN114943649A (en) Image deblurring method, device and computer readable storage medium
CN108805838B (en) Image processing method, mobile terminal and computer readable storage medium
CN110751156A (en) Method, system, device and medium for table line bulk interference removal
CN111311619A (en) Method and device for realizing slider verification
CN113744294A (en) Image processing method and related device
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium
CN111311610A (en) Image segmentation method and terminal equipment
US10964028B2 (en) Electronic device and method for segmenting image
CN109255311B (en) Image-based information identification method and system
CN108810407B (en) Image processing method, mobile terminal and computer readable storage medium
CN110633705A (en) Low-illumination imaging license plate recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination