CN103489167A - Automatic image sharpening method - Google Patents

Automatic image sharpening method

Info

Publication number
CN103489167A
CN103489167A
Authority
CN
China
Prior art keywords
image
sharpening
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310494804.7A
Other languages
Chinese (zh)
Inventor
张伟
傅松林
李志阳
张长定
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Original Assignee
XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XIAMEN MEITUWANG TECHNOLOGY Co Ltd filed Critical XIAMEN MEITUWANG TECHNOLOGY Co Ltd
Priority to CN201310494804.7A priority Critical patent/CN103489167A/en
Priority to CN201310652446.8A priority patent/CN103679656B/en
Publication of CN103489167A publication Critical patent/CN103489167A/en
Pending legal-status Critical Current

Landscapes

  • Facsimile Image Signal Circuits (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an automatic image sharpening method. The method comprises the following steps: carrying out graying processing and edge detection on an original image and carrying out histogram statistics on the edge intensity so as to calculate the blur probability of the grayscale image; setting the sharpening degree of the grayscale image according to the blur probability; and finally, combining a Gaussian-blurred image to carry out automatic sharpening of the original image according to the sharpening degree, so that the sharpening amount does not depend on the display conditions or the visual system of the user. Therefore, not only can automatic sharpening be realized, but the sharpening quality of the image is also improved.

Description

Automatic image sharpening method
Technical Field
The invention relates to an image enhancement method, in particular to an automatic image sharpening method.
Background
After a digital image is captured, it is often sharpened to reduce or eliminate blur, enhance image focus, or simulate better resolution. Sharpening can be performed by deconvolution or by using an unsharp mask filter to increase the contrast of edges in the image. The unsharp mask filter identifies pixels that differ from their surrounding pixels by a defined threshold and increases their contrast by a specified sharpening amount. The user may determine and set the sharpening amount, which makes the sharpening amount dependent on the display conditions and the user's visual system. In other words, although the sharpening amount is one of the main parameters of the unsharp mask filter, it is usually set by subjective rather than objective judgment. Therefore, how to set the sharpening amount by objective judgment becomes the key to eliminating image blur.
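The unsharp-mask idea described above can be sketched per pixel as follows. This is an illustrative stand-in rather than the invention's own filter (which is given later in the description); the `amount` parameter stands for the sharpening amount discussed above.

```python
def unsharp_pixel(orig, blurred, amount):
    # Unsharp masking in one pixel: amplify the difference between the
    # original value and its blurred counterpart, then clamp to 8 bits.
    sharpened = orig + amount * (orig - blurred)
    return min(max(int(round(sharpened)), 0), 255)
```

A flat region (where the original equals its blurred copy) is left untouched, while values on either side of an edge are pushed apart.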
Disclosure of Invention
The present invention provides an automatic image sharpening method that sets the sharpening amount by objective determination, namely an image enhancement method that sharpens automatically based on the degree of image blur.
In order to achieve the purpose, the invention adopts the technical scheme that:
an automatic image sharpening method is characterized by comprising the following steps:
10. receiving an original image, and carrying out graying processing on the original image to obtain a grayscale image;
20. carrying out edge detection on the gray level image, and carrying out histogram statistics on edge intensity;
30. calculating the fuzzy probability of the gray level image according to the result of the histogram statistics;
40. performing Gaussian blur processing on the gray level image to obtain a Gaussian blur image;
50. and setting the sharpening degree of the gray level image according to the fuzzy probability, and automatically sharpening the original image according to the sharpening degree and the color value of the Gaussian fuzzy image.
As a preferred embodiment: the step 20 further comprises:
21. detecting strong edges and weak edges of the gray level image, and obtaining strong edge results and weak edge results;
22. and partitioning the gray level image, and counting each block and the strong edge result and the weak edge result corresponding to each block to obtain histogram statistics of the edge strength.
As a preferred embodiment: said step 22 further comprises the steps of:
step 221, partitioning the gray level image;
step 222, analyzing the strong edge result of each block, and judging whether the strong edge result belongs to an edge block; if so, then step 223 is performed;
step 223, calculating the gradient of each pixel point in the edge block;
step 224, calculating the gradient direction corresponding to each pixel point according to the gradient of the pixel point;
step 225, searching the edge continuity strength of each pixel point according to the weak edge result corresponding to each block and the gradient direction in the step 224;
and step 226, calculating the contrast difference of each block, and performing histogram statistics on the edge continuity strength of each block.
As a preferred embodiment: in the step 221, the gray-scale image is partitioned, and the size of each block is 16-128 pixels.
As a preferred embodiment: in step 222, whether a block is an edge block is determined from the strong edge result by checking whether the number of pixels belonging to an edge exceeds a predetermined percentage of the total number of pixels in the block; the predetermined percentage ranges from 0.1% to 2%.
As a preferred embodiment: in step 223, the gradient calculation formula of the pixel point is:
grad=(next-prev)/2,
wherein, grad is the gradient value of the current pixel point; next is the value of the next pixel of the current pixel; prev is the value of the previous pixel of the current pixel.
As a preferred embodiment: in step 224, the calculation of the gradient direction corresponding to the pixel point is an angle obtained by performing arc tangent according to the gradient values of the gradient of the pixel point in the X direction and the Y direction.
As a preferred embodiment: in step 225, the edge continuity strength of a pixel is calculated by checking whether the pixels along its gradient direction belong to an edge in the weak edge result, thereby obtaining how strongly each pixel continues an edge.
As a preferred embodiment: in step 226, the contrast difference is calculated by finding the maximum and minimum pixel values in each block and subtracting the minimum from the maximum to obtain a difference value; the final contrast difference is then assigned as follows:
when the difference value ranges from 0 to 51, the contrast difference is 5;
the contrast difference is 3 when the difference ranges from 52 to 256.
As a preferred embodiment: in step 226, the histogram statistical formula is:
index = 0.5 + 100 × (1 − e^(−(continue[x][y]/block)^3.6));
wherein index is the histogram bin number, ranging from 0 to 100; continue is the edge continuity strength value and block is the contrast difference.
As a preferred embodiment: in the step 10, a calculation formula for performing graying processing on the original image is one of the following formulas:
Gray=0.299*Red+0.587*Green+0.114*Blue;
or
Gray=(Red*306+Green*601+Blue*117+512)/1024;
Wherein Gray is a mean value of Gray values, and Red, Green and Blue are mean values of color values of Red, Green and Blue channels respectively.
As a preferred embodiment: in step 30, the calculation formula of the blur probability of the grayscale image is:
score = 1.0 − (Σ_{i=0}^{64} hist[i]) / nCount;
wherein score is the blur probability, ranging from 0.0 to 1.0, where 1.0 represents the highest blur probability and 0.0 the lowest; hist is the array obtained from the histogram statistics; nCount is the total number of samples counted in the histogram statistics.
As a preferred embodiment: the gaussian blur processing in said step 40 is to calculate the transformation of each pixel in the image with a normal distribution, wherein,
the normal distribution equation in the N-dimensional space is:
G(r) = (1 / (√(2πσ²))^N) · e^(−r²/(2σ²));
the normal distribution equation in two dimensions is:
G(u,v) = (1 / (2πσ²)) · e^(−(u²+v²)/(2σ²));
where r is the blur radius (r² = u² + v²), σ is the standard deviation of the normal distribution, u is the position offset of the original pixel on the x axis, and v is the position offset of the original pixel on the y axis.
As a preferred embodiment: in the step 50, the sharpening degree of the gray image is set according to the fuzzy probability, wherein a formula for setting the sharpening degree is as follows:
depth=max((blur-0.4)/k,0.0);
wherein depth is the sharpening degree of the image; blur is the probability of blurring of the grayscale image; k is a fixed value ranging from 2 to 10.
As a preferred embodiment: in the step 50, the original image is automatically sharpened according to the sharpening degree and the color value of the gaussian blurred image, and the calculation formula is as follows:
resultColor=min(max((depth+1)*color-depth*gaussColor,0),255);
the depth is the set sharpening degree, and the resultColor is the color value of each sharpened pixel point; color is the color value of each pixel point of the gray level image; the gaussColor is the color value of each pixel point after the gray level image is subjected to Gaussian blur.
The invention has the beneficial effects that:
(1) Graying processing and edge detection are carried out on the original image, and histogram statistics of the edge intensity are computed to calculate the blur probability of the grayscale image; the sharpening degree of the grayscale image is then set according to the blur probability, and finally the original image is automatically sharpened according to that degree in combination with a Gaussian-blurred image, so that the sharpening amount does not depend on the display conditions or the user's visual system, automatic sharpening is realized, and the sharpening quality of the image is improved;
(2) the grayscale image is partitioned into blocks, and each block is counted together with its corresponding strong edge result and weak edge result to obtain the histogram statistics of edge intensity, so that the calculated blur probability is more accurate and the sharpening amount more reasonable.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a simplified flowchart of the image automatic sharpening method according to the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects of the present invention more clear and obvious, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, an image automatic sharpening method of the present invention includes the following steps:
10. receiving an original image, and carrying out graying processing on the original image to obtain a grayscale image;
20. carrying out edge detection on the gray level image, and carrying out histogram statistics on edge intensity;
30. calculating the fuzzy probability of the gray level image according to the result of the histogram statistics;
40. performing Gaussian blur processing on the gray level image to obtain a Gaussian blur image;
50. and setting the sharpening degree of the gray level image according to the fuzzy probability, and automatically sharpening the original image according to the sharpening degree and the color value of the Gaussian fuzzy image.
In this embodiment, the step 20 further includes:
21. detecting strong edges and weak edges of the gray level image, and obtaining strong edge results and weak edge results;
22. and partitioning the gray level image, and counting each block and the strong edge result and the weak edge result corresponding to each block to obtain histogram statistics of the edge strength.
Specifically, the method for strong edge detection may be a Canny edge detection algorithm or a threshold edge detection algorithm, and the method for weak edge detection may be a Sobel edge detection algorithm or a Prewitt edge detection algorithm. The edge detection method is a conventional detection method, and will not be described herein.
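As a rough illustration of the gradient-magnitude edge detection mentioned above, here is a minimal pure-Python Sobel sketch; the patent leaves the exact detectors open (Canny, Sobel, Prewitt, threshold), so this stand-in and its threshold are illustrative only.

```python
def sobel_magnitude(img):
    # img: 2-D list of grayscale values. Returns gradient magnitudes for
    # interior pixels using the 3x3 Sobel kernels; borders stay at 0.
    h, w = len(img), len(img[0])
    mag = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            mag[y][x] = (gx * gx + gy * gy) ** 0.5
    return mag

def weak_edges(img, threshold):
    # Thresholding the magnitude yields a binary "weak edge result".
    return [[m > threshold for m in row] for row in sobel_magnitude(img)]
```

Running it on a vertical step edge marks the two columns straddling the step and leaves the flat regions unmarked.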
In this embodiment, the step 22 further includes the following steps:
step 221, partitioning the gray level image;
step 222, analyzing the strong edge result of each block, and judging whether the strong edge result belongs to an edge block; if so, then step 223 is performed;
step 223, calculating the gradient of each pixel point in the edge block;
step 224, calculating the gradient direction corresponding to each pixel point according to the gradient of the pixel point;
step 225, searching the edge continuity strength of each pixel point according to the weak edge result corresponding to each block and the gradient direction in the step 224;
and step 226, calculating the contrast difference of each block, and performing histogram statistics on the edge continuity strength of each block.
In this embodiment, the grayscale image is divided into blocks in step 221, and the size of each block is 16 to 128 pixels; preferably 64 pixels.
In this embodiment, in step 222, whether a block is an edge block is determined from the strong edge result by checking whether the number of pixels belonging to an edge exceeds a predetermined percentage of the total number of pixels in the block; the predetermined percentage ranges from 0.1% to 2%.
In this embodiment, in step 223, the gradient calculation formula of the pixel point is:
grad=(next-prev)/2,
wherein, grad is the gradient value of the current pixel point; next is the value of the next pixel of the current pixel; prev is the value of the previous pixel of the current pixel.
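The central-difference gradient above, extended to both axes and combined with the arctangent of step 224, can be sketched as:

```python
import math

def gradient_and_direction(img, x, y):
    # Central differences along X and Y: grad = (next - prev) / 2,
    # then the gradient direction as the arctangent of gy over gx.
    gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
    gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
    angle = math.atan2(gy, gx)  # radians, in (-pi, pi]
    return gx, gy, angle
```

`atan2` is used rather than a plain arctangent so that the sign of both components is preserved and the full circle of directions is covered.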
In this embodiment, in step 224, the calculation of the gradient direction corresponding to the pixel point is an angle obtained by performing arc tangent according to the gradient values of the gradient of the pixel point in the X direction and the Y direction.
In this embodiment, in step 225, the edge continuity strength of a pixel is calculated by checking whether the pixels along its gradient direction belong to an edge in the weak edge result, thereby obtaining how strongly each pixel continues an edge.
In this embodiment, in step 226, the contrast difference is calculated by finding the maximum and minimum pixel values in each block and subtracting the minimum from the maximum to obtain a difference value; the final contrast difference is then assigned as follows:
when the difference value ranges from 0 to 51, the contrast difference is 5;
the contrast difference is 3 when the difference ranges from 52 to 256.
In this embodiment, in step 226, the histogram statistical formula is:
index = 0.5 + 100 × (1 − e^(−(continue[x][y]/block)^3.6));
wherein index is the histogram bin number, ranging from 0 to 100; continue is the edge continuity strength value and block is the contrast difference.
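Putting the contrast-difference rule and the histogram index formula together gives the sketch below. The exponent form 1 − e^(−(continue/block)^3.6) is reconstructed from a garbled original, so treat it as an assumption.

```python
import math

def contrast_difference(block_pixels):
    # Max minus min of the pixel values in the block, mapped to the
    # two-level contrast difference used by the histogram formula.
    diff = max(block_pixels) - min(block_pixels)
    return 5 if diff <= 51 else 3

def histogram_index(continue_strength, block_contrast):
    # index = 0.5 + 100 * (1 - e^(-(continue/block)^3.6)), kept in 0..100.
    ratio = continue_strength / block_contrast
    index = int(0.5 + 100.0 * (1.0 - math.exp(-(ratio ** 3.6))))
    return min(max(index, 0), 100)
```

Low continuity strength maps to bin 0, while strong continuity saturates at bin 100, matching the stated 0-100 range.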
In this embodiment, in the step 10, a calculation formula for performing graying processing on the original image is one of the following formulas:
Gray=0.299*Red+0.587*Green+0.114*Blue;
or
Gray=(Red*306+Green*601+Blue*117+512)/1024;
Wherein Gray is a mean value of Gray values, and Red, Green and Blue are mean values of color values of Red, Green and Blue channels respectively.
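The two graying formulas are the standard luma weights, once in floating point and once as a fixed-point approximation (weights scaled by 1024, with +512 for rounding). A quick sketch comparing them; the function names are illustrative:

```python
def gray_float(r, g, b):
    # Floating-point luma weights.
    return 0.299 * r + 0.587 * g + 0.114 * b

def gray_fixed(r, g, b):
    # Integer approximation: 306/1024, 601/1024, 117/1024.
    return (r * 306 + g * 601 + b * 117 + 512) // 1024
```

The fixed-point version avoids floating-point arithmetic and agrees with the float version to within one gray level.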
In this embodiment, in the step 30, the calculation formula of the blur probability of the grayscale image is as follows:
score = 1.0 − (Σ_{i=0}^{64} hist[i]) / nCount;
wherein score is the blur probability, ranging from 0.0 to 1.0, where 1.0 represents the highest blur probability and 0.0 the lowest; hist is the array obtained from the histogram statistics; nCount is the total number of samples counted in the histogram statistics.
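The blur probability is one minus the fraction of histogram mass in the first 65 bins. A direct transcription (the inclusive upper bound 64 follows the formula's Σ from i = 0 to 64):

```python
def blur_probability(hist, n_count):
    # score = 1.0 - (sum of hist[0..64]) / nCount; per the patent, a
    # score near 1.0 means a high blur probability.
    return 1.0 - sum(hist[:65]) / n_count
```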
In this embodiment, the gaussian blur processing in step 40 is to calculate the transformation of each pixel in the image by using a normal distribution, wherein,
the normal distribution equation in the N-dimensional space is:
G(r) = (1 / (√(2πσ²))^N) · e^(−r²/(2σ²));
the normal distribution equation in two dimensions is:
G(u,v) = (1 / (2πσ²)) · e^(−(u²+v²)/(2σ²));
where r is the blur radius (r² = u² + v²), σ is the standard deviation of the normal distribution, u is the position offset of the original pixel on the x axis, and v is the position offset of the original pixel on the y axis.
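The two-dimensional equation yields the familiar Gaussian convolution kernel once it is sampled at integer offsets and renormalized (discrete sampling loses exact normalization). A sketch:

```python
import math

def gaussian_kernel(radius, sigma):
    # Sample G(u, v) = 1/(2*pi*sigma^2) * exp(-(u^2 + v^2)/(2*sigma^2))
    # on the offsets -radius..radius and normalize so weights sum to 1.
    k = [[math.exp(-(u * u + v * v) / (2.0 * sigma * sigma))
          / (2.0 * math.pi * sigma * sigma)
          for u in range(-radius, radius + 1)]
         for v in range(-radius, radius + 1)]
    total = sum(sum(row) for row in k)
    return [[w / total for w in row] for row in k]
```

Convolving the grayscale image with this kernel produces the Gaussian-blurred image of step 40.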
In this embodiment, in the step 50, the sharpening degree of the gray image is set according to the fuzzy probability, where a formula for setting the sharpening degree is as follows:
depth=max((blur-0.4)/k,0.0);
wherein depth is the sharpening degree of the image; blur is the probability of blurring of the grayscale image; k is a fixed value ranging from 2 to 10, with k having an optimal value of 6.
In this embodiment, in the step 50, the original image is automatically sharpened according to the sharpening degree and the color value of the gaussian blurred image, and a calculation formula of the sharpening degree is as follows:
resultColor=min(max((depth+1)*color-depth*gaussColor,0),255);
the depth is the set sharpening degree, and the resultColor is the color value of each sharpened pixel point; color is the color value of each pixel point of the gray level image; the gaussColor is the color value of each pixel point after the gray level image is subjected to Gaussian blur.
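The two formulas of step 50 chain together as follows; the default k = 6 is the optimum stated above, and the final clamp to 0..255 mirrors the min/max in the resultColor formula.

```python
def sharpening_depth(blur, k=6):
    # depth = max((blur - 0.4) / k, 0.0): images whose blur probability
    # is below 0.4 are left unsharpened.
    return max((blur - 0.4) / k, 0.0)

def sharpen_pixel(color, gauss_color, depth):
    # resultColor = min(max((depth+1)*color - depth*gaussColor, 0), 255)
    result = (depth + 1) * color - depth * gauss_color
    return min(max(int(result), 0), 255)
```

Note this is algebraically the same unsharp-mask form as in the background section, with depth playing the role of the sharpening amount.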
While the foregoing specification illustrates and describes preferred embodiments of this invention, it is to be understood that the invention is not limited to the forms disclosed herein; it is capable of use in various other combinations, modifications, and environments, and of changes within the scope of the inventive concept as expressed herein, commensurate with the above teachings or the skill and knowledge of the relevant art. Modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (15)

1. An automatic image sharpening method is characterized by comprising the following steps:
10. receiving an original image, and carrying out graying processing on the original image to obtain a grayscale image;
20. carrying out edge detection on the gray level image, and carrying out histogram statistics on edge intensity;
30. calculating the fuzzy probability of the gray level image according to the result of the histogram statistics;
40. performing Gaussian blur processing on the gray level image to obtain a Gaussian blur image;
50. and setting the sharpening degree of the gray level image according to the fuzzy probability, and automatically sharpening the original image according to the sharpening degree and the color value of the Gaussian fuzzy image.
2. The method for automatically sharpening the image according to claim 1, wherein: the step 20 further comprises:
21. detecting strong edges and weak edges of the gray level image, and obtaining strong edge results and weak edge results;
22. and partitioning the gray level image, and counting each block and the strong edge result and the weak edge result corresponding to each block to obtain histogram statistics of the edge strength.
3. The method for automatically sharpening an image according to claim 2, wherein: said step 22 further comprises the steps of:
step 221, partitioning the gray level image;
step 222, analyzing the strong edge result of each block, and judging whether the strong edge result belongs to an edge block; if so, then step 223 is performed;
step 223, calculating the gradient of each pixel point in the edge block;
step 224, calculating the gradient direction corresponding to each pixel point according to the gradient of the pixel point;
step 225, searching the edge continuity strength of each pixel point according to the weak edge result corresponding to each block and the gradient direction in the step 224;
and step 226, calculating the contrast difference of each block, and performing histogram statistics on the edge continuity strength of each block.
4. The method for automatically sharpening an image according to claim 3, wherein: in the step 221, the gray-scale image is partitioned, and the size of each block is 16-128 pixels.
5. The method for automatically sharpening an image according to claim 3, wherein: in step 222, whether a block is an edge block is determined from the strong edge result by checking whether the number of pixels belonging to an edge exceeds a predetermined percentage of the total number of pixels in the block; the predetermined percentage ranges from 0.1% to 2%.
6. The method for automatically sharpening an image according to claim 3, wherein: in step 223, the gradient calculation formula of the pixel point is:
grad=(next-prev)/2,
wherein, grad is the gradient value of the current pixel point; next is the value of the next pixel of the current pixel; prev is the value of the previous pixel of the current pixel.
7. The method for automatically sharpening an image according to claim 3, wherein: in step 224, the calculation of the gradient direction corresponding to the pixel point is an angle obtained by performing arc tangent according to the gradient values of the gradient of the pixel point in the X direction and the Y direction.
8. The method for automatically sharpening an image according to claim 3, wherein: in step 225, the edge continuity strength of a pixel is calculated by checking whether the pixels along its gradient direction belong to an edge in the weak edge result, thereby obtaining how strongly each pixel continues an edge.
9. The method for automatically sharpening an image according to claim 3, wherein: in step 226, the contrast difference is calculated by finding the maximum and minimum pixel values in each block and subtracting the minimum from the maximum to obtain a difference value; the final contrast difference is then assigned as follows:
when the difference value ranges from 0 to 51, the contrast difference is 5;
the contrast difference is 3 when the difference ranges from 52 to 256.
10. The method for automatically sharpening an image according to claim 3, wherein: in step 226, the histogram statistical formula is:
index = 0.5 + 100 × (1 − e^(−(continue[x][y]/block)^3.6));
wherein index is the histogram bin number, ranging from 0 to 100; continue is the edge continuity strength value and block is the contrast difference.
11. The method for automatically sharpening the image according to claim 1, wherein: in the step 10, a calculation formula for performing graying processing on the original image is one of the following formulas:
Gray=0.299*Red+0.587*Green+0.114*Blue;
or
Gray=(Red*306+Green*601+Blue*117+512)/1024;
Wherein Gray is a mean value of Gray values, and Red, Green and Blue are mean values of color values of Red, Green and Blue channels respectively.
12. The method for automatically sharpening the image according to claim 1, wherein: in step 30, the calculation formula of the blur probability of the grayscale image is:
score = 1.0 − (Σ_{i=0}^{64} hist[i]) / nCount;
wherein score is the blur probability, ranging from 0.0 to 1.0, where 1.0 represents the highest blur probability and 0.0 the lowest; hist is the array obtained from the histogram statistics; nCount is the total number of samples counted in the histogram statistics.
13. The method for automatically sharpening the image according to claim 1, wherein: the gaussian blur processing in said step 40 is to calculate the transformation of each pixel in the image with a normal distribution, wherein,
the normal distribution equation in the N-dimensional space is:
G(r) = (1 / (√(2πσ²))^N) · e^(−r²/(2σ²));
the normal distribution equation in two dimensions is:
G(u,v) = (1 / (2πσ²)) · e^(−(u²+v²)/(2σ²));
where r is the blur radius (r² = u² + v²), σ is the standard deviation of the normal distribution, u is the position offset of the original pixel on the x axis, and v is the position offset of the original pixel on the y axis.
14. The method for automatically sharpening the image according to claim 1, wherein: in the step 50, the sharpening degree of the gray image is set according to the fuzzy probability, wherein a formula for setting the sharpening degree is as follows:
depth=max((blur-0.4)/k,0.0);
wherein depth is the sharpening degree of the image; blur is the probability of blurring of the grayscale image; k is a fixed value ranging from 2 to 10.
15. The method for automatically sharpening the image according to claim 1, wherein: in the step 50, the original image is automatically sharpened according to the sharpening degree and the color value of the gaussian blurred image, and the calculation formula is as follows:
resultColor=min(max((depth+1)*color-depth*gaussColor,0),255);
the depth is the set sharpening degree, and the resultColor is the color value of each sharpened pixel point; color is the color value of each pixel point of the gray level image; the gaussColor is the color value of each pixel point after the gray level image is subjected to Gaussian blur.
CN201310494804.7A 2013-10-21 2013-10-21 Automatic image sharpening method Pending CN103489167A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310494804.7A CN103489167A (en) 2013-10-21 2013-10-21 Automatic image sharpening method
CN201310652446.8A CN103679656B (en) 2013-10-21 2013-12-05 A kind of Automated sharpening of images method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310494804.7A CN103489167A (en) 2013-10-21 2013-10-21 Automatic image sharpening method

Publications (1)

Publication Number Publication Date
CN103489167A true CN103489167A (en) 2014-01-01

Family

ID=49829366

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201310494804.7A Pending CN103489167A (en) 2013-10-21 2013-10-21 Automatic image sharpening method
CN201310652446.8A Active CN103679656B (en) 2013-10-21 2013-12-05 A kind of Automated sharpening of images method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201310652446.8A Active CN103679656B (en) 2013-10-21 2013-12-05 A kind of Automated sharpening of images method

Country Status (1)

Country Link
CN (2) CN103489167A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017096820A1 (en) * 2015-12-10 2017-06-15 乐视控股(北京)有限公司 Gradient value and direction based image sharpening method and device
CN106991662A (en) * 2017-04-05 2017-07-28 上海矽奥微电子有限公司 Image sharpening method based on horizontal direction pixel
CN108024103A (en) * 2017-12-01 2018-05-11 重庆贝奥新视野医疗设备有限公司 Image sharpening method and device
CN109934785A (en) * 2019-03-12 2019-06-25 湖南国科微电子股份有限公司 Image sharpening method and device
CN109934785B (en) * 2019-03-12 2021-03-12 湖南国科微电子股份有限公司 Image sharpening method and device
CN110192744A (en) * 2019-04-24 2019-09-03 张金秋 Heat radiator for infant care apparatus
CN112918956A (en) * 2021-02-20 2021-06-08 陆伟凤 Garbage classification system based on image recognition technology

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106709430A (en) * 2016-11-30 2017-05-24 努比亚技术有限公司 Mobile terminal and fingerprint image processing method based on a mobile terminal
CN107481203B (en) * 2017-08-14 2020-05-15 厦门美图之家科技有限公司 Image-oriented filtering method and computing device
CN110111261B (en) * 2019-03-28 2021-05-28 瑞芯微电子股份有限公司 Adaptive balance processing method for image, electronic device and computer readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020176113A1 (en) * 2000-09-21 2002-11-28 Edgar Albert D. Dynamic image correction and imaging systems
CN100420269C * 2005-12-09 2008-09-17 逐点半导体(上海)有限公司 Image enhancement processing system and processing method
US7643698B2 (en) * 2005-12-22 2010-01-05 Apple Inc. Image sharpening using diffusion
CN101794380B (en) * 2010-02-11 2012-08-08 上海点佰趣信息科技有限公司 Enhancement method of fingerprint image
CN103079038B (en) * 2013-01-07 2016-06-29 华为终端有限公司 Image sharpening processing method, device and camera terminal


Also Published As

Publication number Publication date
CN103679656A (en) 2014-03-26
CN103679656B (en) 2016-08-31

Similar Documents

Publication Publication Date Title
CN103679656B (en) Automatic image sharpening method
CN103413311B (en) An edge-based blur detection method
CN111080661B (en) Image-based straight line detection method and device and electronic equipment
CN104732227B (en) A vehicle license plate location method based on sharpness and luminance evaluation
Park et al. Sand-dust image enhancement using successive color balance with coincident chromatic histogram
CN105913396A (en) Noise estimation-based image edge preservation mixed de-noising method
CN109584240B (en) Landslide trailing edge crack displacement image identification method
CN105139391B (en) An edge detection method for traffic images in haze weather
CN101079149A (en) A restoration method for noisy motion-blurred images based on radial basis function neural networks
CN112528868B (en) Illegal line pressing judgment method based on improved Canny edge detection algorithm
CN112561804A (en) Low-illumination underwater image enhancement method based on multi-scale detail enhancement
CN114118144A (en) Anti-interference accurate aerial remote sensing image shadow detection method
CN108898132A (en) A terahertz-image dangerous goods recognition method based on shape context description
CN102306307B (en) Positioning method of fixed point noise in color microscopic image sequence
CN110796615A (en) Image denoising method and device and storage medium
CN103400367A (en) No-reference blurred image quality evaluation method
CN110796626A (en) Image sharpening method and device
CN109377450A (en) An edge-preserving denoising method
CN116664457B (en) Image processing method for enhancing denoising
CN102521800A (en) A denoising and sharpening method for multimodal images
CN107067375A (en) An image defogging method based on dark channel prior and edge information
CN117011291B (en) A visual inspection method for watch case quality
CN110765887A (en) Automatic identification technology and detection method for tunnel lining cracks
CN105741281A (en) Image edge detection method based on neighbourhood dispersion
CN116129195A (en) Image quality evaluation device, image quality evaluation method, electronic device, and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140101