CN115937016A - Contrast enhancement method for ensuring image details - Google Patents


Info

Publication number
CN115937016A
CN115937016A (application CN202211349133.0A)
Authority
CN
China
Prior art keywords
gray
image
value
enhanced
global
Prior art date
Legal status: Granted
Application number
CN202211349133.0A
Other languages
Chinese (zh)
Other versions
CN115937016B (en)
Inventor
赵蓝飞
李国庆
李士俊
刘发强
Current Assignee
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Harbin University of Science and Technology
Priority to CN202211349133.0A
Publication of CN115937016A
Application granted
Publication of CN115937016B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Processing (AREA)

Abstract

A contrast enhancement method that preserves image details, belonging to the technical field of image processing. The invention addresses the problems that existing contrast enhancement algorithms offer only limited improvement of an image's visual effect and tend to blur image details. The method first constructs a nonlinear global gray mapping function, decomposes the gray range into 3 intervals, and computes, for each interval, the cumulative probability at the gray value with the maximum occurrence probability. Taking the most frequent gray value of each interval together with its cumulative probability as known data points, the unknown parameters of the nonlinear global gray mapping function are estimated by Newton's iteration method, and the resulting function is used to perform global gray mapping on the image to be enhanced. Second, the detail enhancement problem is converted into adding a detail compensation term to the global gray mapping, and the compensation term is determined by an iterative algorithm. The method can be applied in the technical field of image processing.

Description

Contrast enhancement method for ensuring image details
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a contrast enhancement method for ensuring image details.
Background
The lighting conditions of the imaging environment directly determine image quality. In many scenes, occlusion by objects and terrain factors make the distribution of visible light uneven, so the acquired digital images contain shadows and highlights that degrade the visual effect. To suppress this uneven illumination, the digital image produced by the camera generally requires back-end processing, i.e., contrast enhancement, to improve image quality.
Classical contrast enhancement algorithms include histogram equalization, gamma correction, logarithmic-domain image enhancement, and contrast stretching. These algorithms can improve the quality of shadow and highlight regions to some extent, but uneven illumination still limits the visual effect of the image. Moreover, classical algorithms typically stretch contrast through a single, globally uniform gray mapping function, which tends to blur image details.
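For reference, the classical histogram equalization baseline mentioned above can be sketched in a few lines. This is a minimal illustration on a flat list of 8-bit gray values, not the patented method:

```python
def equalize_histogram(gray):
    """Classical global histogram equalization on a list of 8-bit gray values."""
    n = len(gray)
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    # Cumulative distribution function over the 256 gray levels.
    cdf = []
    total = 0
    for h in hist:
        total += h
        cdf.append(total / n)
    # Map each gray level through the CDF to stretch contrast globally.
    return [round(255 * cdf[v]) for v in gray]
```

Because the mapping is globally uniform, two pixels with the same gray value always map to the same output, which is exactly why this family of methods can flatten local detail.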
In summary, the existing contrast enhancement algorithm still has a limited capability of improving the visual effect of an image, and details of the image are easily blurred in the contrast enhancement process.
Disclosure of Invention
The invention aims to solve the problems that the existing contrast enhancement algorithm has limited improvement capability on the visual effect of an image and is easy to blur image details, and provides a contrast enhancement method for ensuring image details, which is used for enhancing the contrast of the image on the premise of ensuring the local details of the image and reconstructing an enhanced image with higher image quality.
The technical scheme adopted by the invention for solving the technical problems is as follows:
a contrast enhancement method for guaranteeing image details specifically comprises the following steps:
step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
Step 1-1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0, 255] of the image to be enhanced into three gray intervals: [0, 85], [86, 171], and [172, 255];
Step 1-2: within each gray interval, find the gray value with the maximum occurrence probability; denote the most frequent gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
Step 1-3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m], and [0, T_h], denoted P_l, P_m, and P_h respectively;
Step 1-4: normalize T_l, T_m, and T_h to the unit interval, i.e. T̂_l = T_l/255, T̂_m = T_m/255, T̂_h = T_h/255, where T̂_l, T̂_m, and T̂_h are the normalized values of T_l, T_m, and T_h;
Step 1-5: from the normalized gray values T̂_l, T̂_m, T̂_h, their corresponding cumulative probabilities P_l, P_m, P_h, and the cumulative probabilities corresponding to the normalized gray values 0 and 1 (which are 0 and 1 respectively), obtain 5 two-dimensional coordinates: (0, 0), (T̂_l, P_l), (T̂_m, P_m), (T̂_h, P_h), and (1, 1);
Step 1-6: construct the global nonlinear gray mapping function shown in formula (1):

[formula (1), a five-parameter nonlinear mapping I(x, y) = f(L(x, y); a, b, c, d, e) — the explicit expression was an image in the original and was not extracted]

where a, b, c, d, e are the 5 undetermined coefficients, and I(x, y) represents the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1-7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in Step 1-5 as known data points, and fit formula (1) by Newton's iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂, ê;
Substituting the optimal estimation of the 5 undetermined coefficients into a formula (1) to obtain an optimal global nonlinear gray mapping function, and performing global gray mapping on the image to be enhanced through the optimal global nonlinear gray mapping function to obtain an image subjected to global gray mapping;
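The statistics-gathering steps above — interval modes, cumulative probabilities, normalization, and the five anchor coordinates — can be sketched directly. This is a minimal illustration on a flat list of 8-bit gray values; the mapping function of formula (1) itself is not reproduced because its explicit form is not recoverable from this text:

```python
def anchor_points(gray):
    """Compute the five (normalized gray value, cumulative probability)
    anchor points used to fit the global gray mapping function."""
    n = len(gray)
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    points = [(0.0, 0.0)]  # normalized gray 0 has cumulative probability 0
    for lo, hi in [(0, 85), (86, 171), (172, 255)]:
        # Most frequent gray value within the interval (T_l, T_m, T_h).
        t = max(range(lo, hi + 1), key=lambda g: hist[g])
        # Cumulative probability over [0, T] (P_l, P_m, P_h).
        p = sum(hist[: t + 1]) / n
        points.append((t / 255.0, p))
    points.append((1.0, 1.0))  # normalized gray 1 has cumulative probability 1
    return points
```

The five returned coordinates are exactly the known data points fed to the Newton fitting step.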
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
The invention has the beneficial effects that:
the method comprises the steps of firstly constructing a nonlinear global gray mapping function with 5 unknown parameters, decomposing a gray range into 3 intervals, and respectively counting the cumulative probability distribution of gray values with the maximum occurrence probability in the 3 intervals. Taking the gray value with the maximum occurrence probability in each gray interval and the cumulative probability distribution corresponding to the gray value as known data points, and estimating unknown parameters in a nonlinear global gray mapping function by a Newton iteration method, thereby determining an expression of the global gray mapping function and performing global gray mapping on an image to be enhanced by using the expression; secondly, a detail compensation item is added to the global gray level mapping by converting the detail enhancement problem, and the detail compensation item is determined by an iterative algorithm. Experimental results show that the algorithm can inhibit the phenomenon of uneven distribution of visible light, enhance the whole illumination layering sense of the image, effectively improve the visual effect of the image and enhance the local detail characteristics of the image to be clear.
Drawings
FIG. 1 is a first image to be enhanced;
FIG. 2 is a schematic diagram of a global gray scale mapping function corresponding to FIG. 1;
FIG. 3 is a corresponding enhanced image of FIG. 1;
FIG. 4 is a second image to be enhanced;
FIG. 5 is a diagram illustrating a global gray scale mapping function corresponding to FIG. 4;
FIG. 6 is the enhanced image corresponding to FIG. 4;
FIG. 7 is a third image to be enhanced;
FIG. 8 is a schematic diagram of the global gray scale mapping function corresponding to FIG. 7;
fig. 9 is an enhanced image corresponding to fig. 7.
Detailed Description
The first specific embodiment: the contrast enhancement method for guaranteeing image details of this embodiment specifically comprises the following steps:
step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
Step 1-1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0, 255] of the image to be enhanced into three gray intervals: [0, 85], [86, 171], and [172, 255];
Step 1-2: within each gray interval, find the gray value with the maximum occurrence probability; denote the most frequent gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
That is, among the pixel gray values falling in [0, 85], the gray value occurring most often is T_l; among those in [86, 171], the most frequent gray value is T_m; and among those in [172, 255], the most frequent gray value is T_h.
Step 1-3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m], and [0, T_h], denoted P_l, P_m, and P_h respectively;
The cumulative probability P_l is computed as follows: count the number of pixels whose gray value lies in [0, T_l], then divide this count by the total number of pixels in the image to be enhanced to obtain P_l.
Step 1-4: normalize T_l, T_m, and T_h, i.e. T̂_l = T_l/255, T̂_m = T_m/255, T̂_h = T_h/255, where T̂_l, T̂_m, and T̂_h are the normalized values of T_l, T_m, and T_h;
Step 1-5: from the normalized gray values T̂_l, T̂_m, T̂_h, their corresponding cumulative probabilities P_l, P_m, P_h, and the cumulative probabilities corresponding to the normalized gray values 0 and 1 (the cumulative probability corresponding to normalized gray value 0 is 0, and that corresponding to normalized gray value 1 is 1), obtain 5 two-dimensional coordinates: (0, 0), (T̂_l, P_l), (T̂_m, P_m), (T̂_h, P_h), and (1, 1);
Step 1-6: construct the global nonlinear gray mapping function shown in formula (1):

[formula (1), a five-parameter nonlinear mapping I(x, y) = f(L(x, y); a, b, c, d, e) — the explicit expression was an image in the original and was not extracted]

where a, b, c, d, e are the 5 undetermined coefficients, and I(x, y) represents the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1-7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in Step 1-5 as known data points, and fit formula (1) by Newton's iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂, ê;
Substituting the optimal estimation of the 5 undetermined coefficients into a formula (1) to obtain an optimal global nonlinear gray mapping function, and performing global gray mapping on the image to be enhanced through the optimal global nonlinear gray mapping function to obtain an image subjected to global gray mapping;
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
When detail enhancement is performed on a pixel of the globally gray-mapped image, its eight neighborhood pixels are required. If the pixel lies on or near the image border, i.e., the eight-neighborhood window centered on the pixel does not lie entirely within the globally gray-mapped image, no detail enhancement is applied to that pixel.
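The explicit form of formula (1) is an un-extracted equation image, so the exact mapping cannot be reproduced here. The fitting step itself, however, can be sketched with a generic Newton solver on the residual system "f passes through the 5 anchor points". The quartic polynomial stand-in in the usage below is purely hypothetical, chosen for illustration — it is not the patent's function:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def newton_fit(points, f, n_params, x0=1.0, tol=1e-10, max_iter=50):
    """Fit n_params coefficients so that f(u, theta) passes through the given
    (u, p) points, by Newton's method on the residual system (initial values 1,
    as in step 1-7)."""
    m = len(points)
    assert m == n_params  # as many anchor points as unknown coefficients
    theta = [x0] * n_params

    def residual(th):
        return [f(u, th) - p for u, p in points]

    for _ in range(max_iter):
        r = residual(theta)
        if max(abs(v) for v in r) < tol:
            break
        # Finite-difference Jacobian J[i][j] = d r_i / d theta_j.
        h = 1e-7
        J = []
        for i in range(m):
            row = []
            for j in range(n_params):
                th2 = list(theta)
                th2[j] += h
                row.append((residual(th2)[i] - r[i]) / h)
            J.append(row)
        # Newton step: theta <- theta - J^{-1} r.
        delta = solve(J, [-v for v in r])
        theta = [t + d for t, d in zip(theta, delta)]
    return theta
```

For example, fitting a hypothetical quartic stand-in `f(u, th) = th[0] + th[1]*u + th[2]*u**2 + th[3]*u**3 + th[4]*u**4` through five anchor points yields coefficients whose curve interpolates all five points.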
The second embodiment is as follows: the difference between this embodiment and the first embodiment is that the specific process of the second step is as follows:
the detail-enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H (x, y) represents the grayscale value of the detail-enhanced image at the planar coordinate position (x, y), and Δ I (x, y) is the detail compensation term at the planar coordinate position (x, y).
The third specific embodiment: this embodiment differs from the second in that the detail compensation term ΔI(x, y) satisfies formula (3):

[formula (3) — equation image not extracted]

where μ_I(x, y) represents the average gray of the pixel set N′_{x,y}, consisting of pixel (x, y) in the globally gray-mapped image and the eight neighborhood pixels centered on pixel (x, y), and μ_L(x, y) represents the average gray of the pixel set N_{x,y}, consisting of pixel (x, y) in the image to be enhanced and the eight neighborhood pixels centered on pixel (x, y);
according to the variance computation, formula (3) is converted into the expression shown in formula (5):

[formula (5) — equation image not extracted]

where σ_I²(x, y) represents the gray variance of the pixels in N′_{x,y} and σ_L²(x, y) represents the gray variance of the pixels in N_{x,y};
separating the central pixel from the 8 neighborhood pixels in the expression on the left side of formula (5) yields the expression shown in formula (7):

[formula (7) — equation image not extracted]

where i′ takes the values −1 and 1, and j′ takes the values −1 and 1;
rearranging formula (7) gives the expression of ΔI(x, y) shown in formula (8):

[formula (8) — equation image not extracted]

The detail compensation term in formula (8) is solved by an iterative method.
The fourth specific embodiment: this embodiment differs from the third in that μ_I(x, y) and μ_L(x, y) are computed as in formula (4):

μ_I(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} I(x+i, y+j),   μ_L(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} L(x+i, y+j)    (4)

where i takes the values i = −1, 0, 1 and j takes the values j = −1, 0, 1. (The original equation image was not extracted; the expression above is reconstructed from the stated definition of μ_I and μ_L as 3×3 neighborhood averages.)
The fifth specific embodiment: this embodiment differs from the fourth in that σ_I²(x, y) and σ_L²(x, y) are computed as in formula (6):

σ_I²(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} [I(x+i, y+j) − μ_I(x, y)]²,   σ_L²(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} [L(x+i, y+j) − μ_L(x, y)]²    (6)

(The original equation image was not extracted; the expression above is reconstructed from the stated definition of the 3×3 neighborhood gray variance.)
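The 3×3 neighborhood statistics of formulas (4) and (6) can be sketched directly. This is a minimal illustration on a 2-D list of gray values; border pixels are skipped, matching the note that pixels without a full eight-neighborhood are not detail-enhanced:

```python
def neighborhood_stats(img, x, y):
    """Mean and variance of the 3x3 neighborhood centered at (x, y).

    img is a 2-D list indexed as img[x][y]; (x, y) must not lie on the
    image border, since all eight neighbors are required."""
    vals = [img[x + i][y + j] for i in (-1, 0, 1) for j in (-1, 0, 1)]
    mu = sum(vals) / 9.0                          # formula (4)
    var = sum((v - mu) ** 2 for v in vals) / 9.0  # formula (6)
    return mu, var
```

Applied once to the globally mapped image I and once to the original image L, this yields the μ and σ² pairs that the detail compensation term is built from.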
the sixth specific implementation mode: the difference between this embodiment and the fifth embodiment is that the detail compensation term in the formula (8) is solved by using an iterative method, and the specific process is as follows:
let the initial value Δ I of Δ I (x + I ', y + j') (0) (x + i ', y + j') is represented by formula (9):
ΔI (0) (x+i′,y+j′)=L(x+i′,y+j′)-I(x+i′,y+j′) (9)
the iterative process is shown in equation (10):
Figure BDA0003918242570000067
wherein superscript (t) represents t iterations;
setting an iteration stop condition as shown in equation (11):
max(|ΔI (t) (x,y)-ΔI (t-1) (x,y)|)≤0.01 (11)
satisfying equation (11) means that | Δ I is satisfied for each pixel in an image (t) (x,y)-ΔI (t-1) (x, y) | is less than or equal to 0.01, let t 0 And if the iteration stop condition shown in the formula (11) is met during the secondary iteration, the obtained detail enhanced image is shown in the formula (12):
Figure BDA0003918242570000071
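Since the update rule of formula (10) is an un-extracted equation image, only the iteration scaffolding with the stop condition of formula (11) can be sketched here. The `damp` update below is a hypothetical per-pixel stand-in (simple damping toward zero), not the patent's rule:

```python
def iterate_compensation(delta0, update, tol=0.01, max_iter=1000):
    """Run a per-pixel fixed-point iteration until
    max |delta^(t) - delta^(t-1)| <= tol, the stop condition of formula (11).

    delta0 is a 2-D list of initial compensation values (formula (9));
    update is the per-pixel iteration rule (formula (10) stands in here).
    Returns the final compensation field and the iteration count."""
    prev = [row[:] for row in delta0]
    for t in range(1, max_iter + 1):
        cur = [[update(v) for v in row] for row in prev]
        diff = max(abs(c - p) for crow, prow in zip(cur, prev)
                   for c, p in zip(crow, prow))
        prev = cur
        if diff <= tol:
            return cur, t
    return prev, max_iter

# Hypothetical stand-in update rule: damp the compensation toward zero.
damp = lambda v: 0.5 * v
```

Once the stop condition is met at iteration t₀, the final compensation field is added to the globally mapped image, per formula (2), to obtain the detail-enhanced result.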
results and analysis of the experiments
The hardware platform adopted by the experiment simulation is a notebook computer. The CPU of the computer is AMD Ryzen 7 5800H, and the display card is NVIDIA GeForce RTX 3070. The simulation software was Matlab 2016a. The input and output of the algorithm are bitmap files in the bmp format. The experimental results are shown in fig. 1 to 9, where fig. 1, 4, and 7 are three images to be enhanced, fig. 2, 5, and 8 are global grayscale mapping functions corresponding to fig. 1, 4, and 7, respectively, and fig. 3, 6, and 9 are enhanced images corresponding to fig. 1, 4, and 7, respectively.
Comparing the images to be enhanced with the enhanced images shows that the proposed algorithm effectively improves image contrast. The sharpness, visual effect, and local detail features of the enhanced images are markedly improved: for example, the texture of the cameraman's coat in FIG. 1, the reflective car-window region in FIG. 4, the low-light appearance of the coral reef surface in FIG. 7, and the highlight of the underwater vehicle's searchlight in FIG. 7 are all clearly suppressed or recovered. Compared with the original images, the overall brightness contrast of the enhanced images is better suited to subjective human observation. The proposed algorithm therefore effectively enhances image contrast, and the enhanced images show clear layering and rich image detail.
The above examples merely explain the computational model and workflow of the invention in detail and are not intended to limit its embodiments. Other variations and modifications based on the above description will be apparent to those skilled in the art; it is neither possible nor necessary to enumerate every embodiment here, and all obvious variations and modifications derived from the above fall within the scope of the invention.

Claims (6)

1. A contrast enhancement method for guaranteeing image details, characterized by comprising the following steps: step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
step 1-1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0, 255] of the image to be enhanced into three gray intervals: [0, 85], [86, 171], and [172, 255];
step 1-2: within each gray interval, count the gray value with the maximum occurrence probability; denote the most frequent gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
step 1-3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m], and [0, T_h], denoted P_l, P_m, and P_h respectively;
step 1-4: normalize T_l, T_m, and T_h, i.e. T̂_l = T_l/255, T̂_m = T_m/255, T̂_h = T_h/255, where T̂_l, T̂_m, and T̂_h are the normalized values of T_l, T_m, and T_h;
step 1-5: from the normalized gray values T̂_l, T̂_m, T̂_h, their corresponding cumulative probabilities P_l, P_m, P_h, and the cumulative probabilities corresponding to the normalized gray values 0 and 1, obtain 5 two-dimensional coordinates: (0, 0), (T̂_l, P_l), (T̂_m, P_m), (T̂_h, P_h), and (1, 1);
step 1-6: construct the global nonlinear gray mapping function shown in formula (1):

[formula (1), a five-parameter nonlinear mapping I(x, y) = f(L(x, y); a, b, c, d, e) — the explicit expression was an image in the original and was not extracted]

wherein a, b, c, d, e are the 5 undetermined coefficients, and I(x, y) represents the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
step 1-7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in step 1-5 as known data points, and fit formula (1) by Newton's iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂, ê;
Substituting the optimal estimation of the 5 undetermined coefficients into a formula (1) to obtain an optimal global nonlinear gray mapping function, and performing global gray mapping on the image to be enhanced through the optimal global nonlinear gray mapping function to obtain an image subjected to global gray mapping;
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
2. The contrast enhancement method for guaranteeing image details according to claim 1, wherein the specific process of the second step is as follows:
the detail-enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H (x, y) represents the grayscale value of the detail-enhanced image at the plane coordinate position (x, y), and Δ I (x, y) is the detail compensation term at the plane coordinate position (x, y).
3. The contrast enhancement method for guaranteeing image details according to claim 2, wherein the detail compensation term ΔI(x, y) satisfies formula (3):

[formula (3) — equation image not extracted]

wherein μ_I(x, y) represents the average gray of the pixel set N′_{x,y} consisting of pixel (x, y) in the globally gray-mapped image and the eight neighborhood pixels centered on pixel (x, y), and μ_L(x, y) represents the average gray of the pixel set N_{x,y} consisting of pixel (x, y) in the image to be enhanced and the eight neighborhood pixels centered on pixel (x, y);
according to the variance computation, formula (3) is converted into the expression shown in formula (5):

[formula (5) — equation image not extracted]

wherein σ_I²(x, y) represents the gray variance of the pixels in N′_{x,y} and σ_L²(x, y) represents the gray variance of the pixels in N_{x,y};
separating the central pixel from the 8 neighborhood pixels in the expression on the left side of formula (5) yields the expression shown in formula (7):

[formula (7) — equation image not extracted]

wherein i′ takes the values −1 and 1, and j′ takes the values −1 and 1;
rearranging formula (7) gives the expression of ΔI(x, y) shown in formula (8):

[formula (8) — equation image not extracted]

and the detail compensation term in formula (8) is solved by an iterative method.
4. The contrast enhancement method for guaranteeing image details according to claim 3, wherein μ_I(x, y) and μ_L(x, y) are computed as in formula (4):

μ_I(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} I(x+i, y+j),   μ_L(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} L(x+i, y+j)    (4)

wherein i takes the values i = −1, 0, 1 and j takes the values j = −1, 0, 1 (the original equation image was not extracted; the expression above is reconstructed from the stated definition of μ_I and μ_L as 3×3 neighborhood averages).
5. The contrast enhancement method for guaranteeing image details according to claim 4, wherein σ_I²(x, y) and σ_L²(x, y) are computed as in formula (6):

σ_I²(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} [I(x+i, y+j) − μ_I(x, y)]²,   σ_L²(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} [L(x+i, y+j) − μ_L(x, y)]²    (6)

(the original equation image was not extracted; the expression above is reconstructed from the stated definition of the 3×3 neighborhood gray variance).
6. The contrast enhancement method for guaranteeing image details according to claim 5, wherein the detail compensation term in formula (8) is solved by an iterative method, the specific process being as follows:
let the initial value ΔI^(0)(x+i′, y+j′) of ΔI(x+i′, y+j′) be given by formula (9):

ΔI^(0)(x+i′, y+j′) = L(x+i′, y+j′) − I(x+i′, y+j′)    (9)

the iteration is given by formula (10):

[formula (10) — equation image not extracted]

wherein the superscript (t) denotes the t-th iteration;
the iteration stop condition is set as in formula (11):

max(|ΔI^(t)(x, y) − ΔI^(t−1)(x, y)|) ≤ 0.01    (11)

and, supposing the stop condition of formula (11) is met at the t₀-th iteration, the resulting detail-enhanced image is given by formula (12):

[formula (12) — equation image not extracted]
CN202211349133.0A (priority date 2022-10-31, filed 2022-10-31) — Contrast enhancement method for guaranteeing image details — Active, granted as CN115937016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211349133.0A CN115937016B (en) 2022-10-31 2022-10-31 Contrast enhancement method for guaranteeing image details


Publications (2)

Publication Number Publication Date
CN115937016A true CN115937016A (en) 2023-04-07
CN115937016B CN115937016B (en) 2023-07-21

Family

ID=86653346


Country Status (1)

Country Link
CN (1) CN115937016B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100278423A1 (en) * 2009-04-30 2010-11-04 Yuji Itoh Methods and systems for contrast enhancement
CN101951523A (en) * 2010-09-21 2011-01-19 北京工业大学 Adaptive colour image processing method and system
US20140267378A1 (en) * 2013-03-15 2014-09-18 Eric Olsen Global contrast correction
CN105957030A (en) * 2016-04-26 2016-09-21 成都市晶林科技有限公司 Infrared thermal imaging system image detail enhancing and noise inhibiting method
CN112037144A (en) * 2020-08-31 2020-12-04 哈尔滨理工大学 Low-illumination image enhancement method based on local contrast stretching


Non-Patent Citations (2)

Title
Yi Yufeng; Tian Hong; Liu Tongyu: "Histogram reconstruction image detail enhancement algorithm", Computer Engineering and Applications, no. 13 *
Zhang Tingting; Qi Wei; Cao Feng; Li Wei: "Research on an adaptive infrared image enhancement algorithm based on global and local features", Information & Computer (Theoretical Edition), no. 03 *

Also Published As

Publication number Publication date
CN115937016B (en) 2023-07-21

Similar Documents

Publication Publication Date Title
Agrawal et al. A novel joint histogram equalization based image contrast enhancement
US11127122B2 (en) Image enhancement method and system
Gao et al. Naturalness preserved nonuniform illumination estimation for image enhancement based on retinex
Ju et al. Single image dehazing via an improved atmospheric scattering model
WO2022199583A1 (en) Image processing method and apparatus, computer device, and storage medium
CN111583123A (en) Wavelet transform-based image enhancement algorithm for fusing high-frequency and low-frequency information
CN108564549B (en) Image defogging method based on multi-scale dense connection network
CN103218778B (en) The disposal route of a kind of image and video and device
CN102982513B (en) A kind of adapting to image defogging method capable based on texture
CN104463804B (en) Image enhancement method based on intuitional fuzzy set
CN109872285A (en) A kind of Retinex low-luminance color image enchancing method based on variational methods
CN110956632A (en) Method and device for automatically detecting pectoralis major region in molybdenum target image
Feng et al. Low-light image enhancement algorithm based on an atmospheric physical model
CN112053302A (en) Denoising method and device for hyperspectral image and storage medium
Pandey et al. A fast and effective vision enhancement method for single foggy image
CN104616259A (en) Non-local mean image de-noising method with noise intensity self-adaptation function
CN110992287B (en) Method for clarifying non-uniform illumination video
Hua et al. Low-light image enhancement based on joint generative adversarial network and image quality assessment
CN115660994B (en) Image enhancement method based on regional least square estimation
CN116630198A (en) Multi-scale fusion underwater image enhancement method combining self-adaptive gamma correction
CN103595933A (en) Method for image noise reduction
CN108492264B (en) Single-frame image fast super-resolution method based on sigmoid transformation
CN116452447A (en) Low-illumination high-definition image processing method
CN114240990B (en) SAR image point target segmentation method
Malik et al. Contrast enhancement and smoothing of CT images for diagnosis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant