CN115937016A - Contrast enhancement method for ensuring image details - Google Patents
- Publication number
- CN115937016A CN115937016A CN202211349133.0A CN202211349133A CN115937016A CN 115937016 A CN115937016 A CN 115937016A CN 202211349133 A CN202211349133 A CN 202211349133A CN 115937016 A CN115937016 A CN 115937016A
- Authority
- CN
- China
- Prior art keywords
- gray
- image
- value
- enhanced
- global
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Processing (AREA)
Abstract
A contrast enhancement method for ensuring image details belongs to the technical field of image processing. The invention addresses two problems of existing contrast enhancement algorithms: their limited ability to improve the visual effect of an image, and their tendency to blur image details. The method first constructs a nonlinear global gray mapping function, decomposes the gray range into 3 intervals, and counts, for each interval, the cumulative probability distribution of the gray value with the maximum occurrence probability. Taking the most probable gray value of each interval and its corresponding cumulative probability as known data points, the unknown parameters of the nonlinear global gray mapping function are estimated by the Newton iteration method, and the resulting expression is used to perform global gray mapping on the image to be enhanced. Second, by reformulating the detail enhancement problem, a detail compensation term is added to the global gray mapping and determined by an iterative algorithm. The method can be applied in the technical field of image processing.
Description
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a contrast enhancement method for ensuring image details.
Background
The lighting conditions of the imaging environment directly determine image quality. In many cases, occlusion by objects and terrain factors cause visible light to be unevenly distributed across the shooting scene, so the acquired digital images contain shadow and highlight phenomena that degrade the visual effect. To suppress this uneven distribution of visible light, back-end processing, namely contrast enhancement, is generally applied to the digital image after it is captured by the camera, thereby improving image quality.
Classical contrast enhancement algorithms include: histogram equalization algorithm, gamma correction algorithm, logarithmic domain image enhancement algorithm, contrast stretching algorithm and the like. These contrast enhancement algorithms can improve the image quality of shadow and highlight regions to some extent, but the problem of uneven visible light distribution still limits the visual effect of the image. In addition, classical contrast enhancement algorithms typically stretch the image contrast through a globally uniform gray scale mapping function, which tends to blur the details of the image.
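As a concrete illustration of such a globally uniform mapping, the following is a minimal histogram equalization sketch in Python; the function name and use of numpy are illustrative, not part of the patent.

```python
import numpy as np

def hist_equalize(img):
    """Classical histogram equalization: build one global look-up table,
    gray value g -> 255 * CDF(g), and apply it uniformly to every pixel.
    This is the kind of globally uniform gray mapping that, as noted
    above, tends to blur local image detail."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img.size                       # cumulative distribution
    lut = np.clip(np.round(255.0 * cdf), 0, 255).astype(np.uint8)
    return lut[img]                                        # same mapping for all pixels
```

Because a single look-up table is shared by every pixel, two regions with different local structure but similar gray levels receive identical treatment, which is the limitation the invention targets.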
In summary, the existing contrast enhancement algorithm still has a limited capability of improving the visual effect of an image, and details of the image are easily blurred in the contrast enhancement process.
Disclosure of Invention
The invention aims to solve the problems that existing contrast enhancement algorithms have limited ability to improve the visual effect of an image and easily blur image details. It provides a contrast enhancement method that ensures image details, enhancing image contrast on the premise of preserving local detail and reconstructing an enhanced image of higher quality.
The technical scheme adopted by the invention for solving the technical problems is as follows:
a contrast enhancement method for guaranteeing image details specifically comprises the following steps:
step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
Step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide its gray range [0, 255] into three gray intervals [0, 85], [86, 171] and [172, 255];
Step 1.2: within each gray interval, find the gray value with the maximum occurrence probability; denote the most probable gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
Step 1.3: count the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m] and [0, T_h], denoted P_l, P_m and P_h respectively;
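The statistics of these steps can be sketched as follows in Python; the function name and use of numpy are illustrative assumptions, not part of the patent.

```python
import numpy as np

def interval_stats(img):
    """For an 8-bit grayscale image, return the most probable gray value T
    in each of the intervals [0, 85], [86, 171], [172, 255], and the
    cumulative probability of gray values in [0, T] for each such T."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    cdf = np.cumsum(hist) / img.size          # cumulative probability of [0, g]
    modes, cum_probs = [], []
    for lo, hi in [(0, 85), (86, 171), (172, 255)]:
        t = lo + int(np.argmax(hist[lo:hi + 1]))   # gray value with max probability
        modes.append(t)
        cum_probs.append(float(cdf[t]))
    return modes, cum_probs                   # [T_l, T_m, T_h], [P_l, P_m, P_h]
```

The five data points used later then follow as (0, 0), the three pairs (T/255, P), and (1, 1).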
Step 1.4: normalize T_l, T_m and T_h (dividing each by 255) to obtain the corresponding normalized values T̄_l, T̄_m and T̄_h;
Step 1.5: from the normalized gray values T̄_l, T̄_m and T̄_h, their corresponding cumulative probabilities P_l, P_m and P_h, and the cumulative probabilities corresponding to the gray normalization values 0 and 1 of the image to be enhanced, obtain 5 two-dimensional coordinates, namely (0, 0), (T̄_l, P_l), (T̄_m, P_m), (T̄_h, P_h) and (1, 1);
Step 1.6: construct the global nonlinear gray mapping function shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d and e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by the Newton iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂ and ê;
substitute the optimal estimates of the 5 coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and perform global gray mapping on the image to be enhanced through this function to obtain the globally gray-mapped image;
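The fitting step can be sketched as follows. Note the patent's actual mapping function of formula (1) appears only as an image in the source and is not reproduced here, so this sketch substitutes a quartic polynomial with five coefficients a, b, c, d, e purely as a stand-in; with five coefficients and five data points the fit reduces to exact interpolation rather than a Newton iteration.

```python
import numpy as np

def fit_global_mapping(points):
    """Fit a five-coefficient curve through the five known data points
    (normalized gray value, cumulative probability).  A quartic
    polynomial stands in for the patent's unreproduced formula (1)."""
    xs, ys = zip(*points)
    coeffs = np.polyfit(xs, ys, deg=4)      # 5 points, 5 coefficients: exact fit
    return np.poly1d(coeffs)

def apply_mapping(img, curve):
    """Global gray mapping: push every pixel through the fitted curve on
    the normalized [0, 1] scale, then rescale back to [0, 255]."""
    out = 255.0 * curve(img.astype(float) / 255.0)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Since the curve passes through (0, 0) and (1, 1), black stays black and white stays white, while the three interior points bend the curve toward the image's cumulative distribution.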
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
The invention has the beneficial effects that:
the method comprises the steps of firstly constructing a nonlinear global gray mapping function with 5 unknown parameters, decomposing a gray range into 3 intervals, and respectively counting the cumulative probability distribution of gray values with the maximum occurrence probability in the 3 intervals. Taking the gray value with the maximum occurrence probability in each gray interval and the cumulative probability distribution corresponding to the gray value as known data points, and estimating unknown parameters in a nonlinear global gray mapping function by a Newton iteration method, thereby determining an expression of the global gray mapping function and performing global gray mapping on an image to be enhanced by using the expression; secondly, a detail compensation item is added to the global gray level mapping by converting the detail enhancement problem, and the detail compensation item is determined by an iterative algorithm. Experimental results show that the algorithm can inhibit the phenomenon of uneven distribution of visible light, enhance the whole illumination layering sense of the image, effectively improve the visual effect of the image and enhance the local detail characteristics of the image to be clear.
Drawings
FIG. 1 is a first image to be enhanced;
FIG. 2 is a schematic diagram of a global gray scale mapping function corresponding to FIG. 1;
FIG. 3 is a corresponding enhanced image of FIG. 1;
FIG. 4 is a second image to be enhanced;
FIG. 5 is a diagram illustrating a global gray scale mapping function corresponding to FIG. 4;
FIG. 6 is the enhanced image corresponding to FIG. 4;
FIG. 7 is a third image to be enhanced;
FIG. 8 is a schematic diagram of the global gray scale mapping function corresponding to FIG. 7;
fig. 9 is an enhanced image corresponding to fig. 7.
Detailed Description
The first specific embodiment: the contrast enhancement method for guaranteeing image details of this embodiment specifically includes the following steps:
step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
Step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide its gray range [0, 255] into three gray intervals [0, 85], [86, 171] and [172, 255];
Step 1.2: within each gray interval, find the gray value with the maximum occurrence probability; denote the most probable gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
that is, T_l is the gray value occurring most often among the pixel gray values in [0, 85], T_m the gray value occurring most often in [86, 171], and T_h the gray value occurring most often in [172, 255];
Step 1.3: count the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m] and [0, T_h], denoted P_l, P_m and P_h respectively;
the cumulative probability P_l is calculated as follows: count the number of pixels whose gray value lies in [0, T_l], then divide this count by the total number of pixels of the image to be enhanced to obtain P_l;
Step 1.4: normalize T_l, T_m and T_h (dividing each by 255) to obtain the corresponding normalized values T̄_l, T̄_m and T̄_h;
Step 1.5: from the normalized gray values T̄_l, T̄_m and T̄_h, their corresponding cumulative probabilities P_l, P_m and P_h, and the cumulative probabilities corresponding to the gray normalization values 0 and 1 of the image to be enhanced, obtain 5 two-dimensional coordinates (the cumulative probability corresponding to gray normalization value 0 is 0, and that corresponding to gray normalization value 1 is 1), namely (0, 0), (T̄_l, P_l), (T̄_m, P_m), (T̄_h, P_h) and (1, 1);
Step 1.6: construct the global nonlinear gray mapping function shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d and e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by the Newton iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂ and ê;
substitute the optimal estimates of the 5 coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and perform global gray mapping on the image to be enhanced through this function to obtain the globally gray-mapped image;
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
When detail enhancement is performed on a pixel of the globally gray-mapped image, its eight neighborhood pixels are required. Therefore, if the pixel lies on or near the image border, that is, if the full eight-pixel neighborhood centered on it does not exist within the globally gray-mapped image, no detail enhancement is applied to that pixel.
The second embodiment is as follows: the difference between this embodiment and the first embodiment is that the specific process of the second step is as follows:
the detail-enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H (x, y) represents the grayscale value of the detail-enhanced image at the planar coordinate position (x, y), and Δ I (x, y) is the detail compensation term at the planar coordinate position (x, y).
The third specific embodiment: this embodiment differs from the second in that the detail compensation term ΔI(x, y) satisfies formula (3):
where μ_I(x, y) denotes the average gray of the pixels in the set N′_{x,y} consisting of pixel (x, y) of the globally gray-mapped image and the eight neighborhood pixels centered on it, and μ_L(x, y) denotes the average gray of the pixels in the set N_{x,y} consisting of pixel (x, y) of the image to be enhanced and the eight neighborhood pixels centered on it;
equation (3) is converted into an expression shown in equation (5) according to a variance calculation method:
where σ′²_{x,y} denotes the variance of the gray values of the pixels in set N′_{x,y}, given by formula (4), and σ²_{x,y} denotes the variance of the gray values of the pixels in set N_{x,y};
separating the central pixel and 8 neighborhood pixels in the expression on the left side of the equal sign of the formula (5) to obtain an expression shown in a formula (7):
where i′ and j′ each take the values −1, 0 and 1, excluding the central pixel case (i′, j′) = (0, 0);
the formula (7) is collated to obtain an expression formula of Δ I (x, y) shown in formula (8):
and solving the detail compensation term in the formula (8) by adopting an iterative method.
The fourth specific embodiment: this embodiment differs from the third in the calculation of μ_I(x, y) and μ_L(x, y), which is as follows:
where i takes the values i = −1, 0, 1 and j takes the values j = −1, 0, 1.
The fifth specific embodiment: this embodiment differs from the fourth in that the variances σ′²_{x,y} and σ²_{x,y} are calculated as shown in formula (6):
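The neighborhood statistics described in the fourth and fifth embodiments can be sketched as follows; this is a straightforward Python rendering under the stated eight-neighborhood definition, with illustrative names.

```python
import numpy as np

def local_mean_var(img):
    """For each interior pixel (x, y), compute the mean and variance of
    the 3x3 set formed by the pixel and its eight neighbors.  Border
    pixels, which lack a complete eight-neighborhood, are skipped, as
    the description requires."""
    f = img.astype(float)
    h, w = f.shape
    mean = np.zeros((h, w))
    var = np.zeros((h, w))
    for x in range(1, h - 1):
        for y in range(1, w - 1):
            block = f[x - 1:x + 2, y - 1:y + 2]   # pixel plus eight neighbors
            mean[x, y] = block.mean()
            var[x, y] = block.var()               # population variance over 9 values
    return mean, var
```

Applying this once to the image to be enhanced and once to the globally gray-mapped image yields μ_L, σ² and μ_I, σ′² respectively.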
the sixth specific implementation mode: the difference between this embodiment and the fifth embodiment is that the detail compensation term in the formula (8) is solved by using an iterative method, and the specific process is as follows:
Let the initial value ΔI^(0)(x + i′, y + j′) of ΔI(x + i′, y + j′) be given by formula (9):
ΔI^(0)(x + i′, y + j′) = L(x + i′, y + j′) − I(x + i′, y + j′)   (9)
The iterative process is shown in equation (10):
where the superscript (t) denotes the t-th iteration;
the iteration stop condition is set as shown in equation (11):
max(|ΔI^(t)(x, y) − ΔI^(t−1)(x, y)|) ≤ 0.01   (11)
Satisfying equation (11) means that |ΔI^(t)(x, y) − ΔI^(t−1)(x, y)| ≤ 0.01 holds for every pixel of the image. Let t_0 denote the iteration at which the stop condition of equation (11) is first met; the resulting detail-enhanced image is then given by formula (12):
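The iteration of formulas (9) and (11) can be sketched as a solver scaffold. Since the per-step update of formula (10) appears only as an image in the source and is not reproduced here, this sketch takes it as a caller-supplied function, which is an assumption of the sketch rather than part of the patent.

```python
import numpy as np

def solve_detail_term(L_img, I_img, update, max_iter=100, tol=0.01):
    """Iteratively determine the detail compensation term dI.
    Initialization follows formula (9): dI0 = L - I.  The stop rule
    follows formula (11): max |dI_t - dI_(t-1)| <= 0.01.  The per-step
    update of formula (10) is not reproduced in the source text, so it
    must be supplied by the caller as update(dI)."""
    dI = L_img.astype(float) - I_img.astype(float)   # formula (9)
    for _ in range(max_iter):
        new_dI = update(dI)
        if np.max(np.abs(new_dI - dI)) <= tol:       # formula (11)
            return new_dI
        dI = new_dI
    return dI
```

The detail-enhanced image then follows formula (2): H(x, y) = I(x, y) + ΔI(x, y).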
results and analysis of the experiments
The hardware platform adopted by the experiment simulation is a notebook computer. The CPU of the computer is AMD Ryzen 7 5800H, and the display card is NVIDIA GeForce RTX 3070. The simulation software was Matlab 2016a. The input and output of the algorithm are bitmap files in the bmp format. The experimental results are shown in fig. 1 to 9, where fig. 1, 4, and 7 are three images to be enhanced, fig. 2, 5, and 8 are global grayscale mapping functions corresponding to fig. 1, 4, and 7, respectively, and fig. 3, 6, and 9 are enhanced images corresponding to fig. 1, 4, and 7, respectively.
Comparing each image to be enhanced with its enhanced counterpart shows that the algorithm designed by the invention effectively improves image contrast. The definition, visual effect and local detail features of the enhanced images are clearly improved: for example, the detail on the surface of the cameraman's coat in FIG. 1 is clearer, while the reflective area of the car window in FIG. 4, the low-light effect on the coral reef surface in FIG. 7 and the highlight of the underwater vehicle searchlight in FIG. 7 are obviously suppressed. Compared with the original images, the overall brightness contrast of the enhanced images is better suited to subjective observation by the human eye. The algorithm designed by the invention therefore effectively enhances image contrast, and the enhanced images show clear layering and rich image detail.
The above calculation examples of the invention merely explain its calculation model and calculation flow in detail and are not intended to limit its embodiments. Other variations and modifications based on the above description will be apparent to those skilled in the art; the invention is not limited to the precise form disclosed, and all such modifications and variations fall within its scope.
Claims (6)
1. A contrast enhancement method for guaranteeing image details, characterized by specifically comprising the following steps: step one, carrying out global gray mapping on an image to be enhanced to obtain a globally gray-mapped image;
the specific process of the step one is as follows:
step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide its gray range [0, 255] into three gray intervals [0, 85], [86, 171] and [172, 255];
step 1.2: within each gray interval, find the gray value with the maximum occurrence probability; denote the most probable gray value in [0, 85] as T_l, in [86, 171] as T_m, and in [172, 255] as T_h;
step 1.3: count the cumulative probabilities of the image to be enhanced over the gray intervals [0, T_l], [0, T_m] and [0, T_h], denoted P_l, P_m and P_h respectively;
step 1.4: normalize T_l, T_m and T_h to obtain the corresponding normalized values T̄_l, T̄_m and T̄_h;
step 1.5: from the normalized gray values T̄_l, T̄_m and T̄_h, their corresponding cumulative probabilities P_l, P_m and P_h, and the cumulative probabilities corresponding to the gray normalization values 0 and 1 of the image to be enhanced, obtain 5 two-dimensional coordinates, namely (0, 0), (T̄_l, P_l), (T̄_m, P_m), (T̄_h, P_h) and (1, 1);
step 1.6: construct the global nonlinear gray mapping function shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d and e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by the Newton iteration method to obtain the optimal estimates of the 5 coefficients, denoted â, b̂, ĉ, d̂ and ê;
substitute the optimal estimates of the 5 coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and perform global gray mapping on the image to be enhanced through this function to obtain the globally gray-mapped image;
and step two, performing detail enhancement on the image obtained in the step one after the global gray level mapping to obtain an image after the detail enhancement.
2. The contrast enhancement method for guaranteeing image details according to claim 1, wherein the specific process of the second step is as follows:
the detail-enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H (x, y) represents the grayscale value of the detail-enhanced image at the plane coordinate position (x, y), and Δ I (x, y) is the detail compensation term at the plane coordinate position (x, y).
3. A contrast enhancement method for guaranteeing image details according to claim 2, wherein the detail compensation term Δ I (x, y) satisfies formula (3):
where μ_I(x, y) denotes the average gray of the pixels in the set N′_{x,y} consisting of pixel (x, y) of the globally gray-mapped image and the eight neighborhood pixels centered on it, and μ_L(x, y) denotes the average gray of the pixels in the set N_{x,y} consisting of pixel (x, y) of the image to be enhanced and the eight neighborhood pixels centered on it;
equation (3) is converted into an expression shown in equation (5) according to a variance calculation method:
where σ′²_{x,y} denotes the variance of the gray values of the pixels in set N′_{x,y}, given by formula (4), and σ²_{x,y} denotes the variance of the gray values of the pixels in set N_{x,y};
separating the central pixel and 8 neighborhood pixels in the expression on the left side of the equal sign of the formula (5) to obtain an expression shown in a formula (7):
where i′ and j′ each take the values −1, 0 and 1, excluding the central pixel case (i′, j′) = (0, 0);
the formula (7) is collated to obtain an expression formula of Δ I (x, y) shown in formula (8):
and solving the detail compensation term in the formula (8) by adopting an iterative method.
6. The contrast enhancement method for guaranteeing image details according to claim 5, wherein the detail compensation term in equation (8) is solved by using an iterative method, which comprises the following specific processes:
let the initial value ΔI^(0)(x + i′, y + j′) of ΔI(x + i′, y + j′) be given by formula (9):
ΔI^(0)(x + i′, y + j′) = L(x + i′, y + j′) − I(x + i′, y + j′)   (9)
the iterative process is shown in equation (10):
where the superscript (t) denotes the t-th iteration;
the iteration stop condition is set as shown in equation (11):
max(|ΔI^(t)(x, y) − ΔI^(t−1)(x, y)|) ≤ 0.01   (11)
let t_0 denote the iteration at which the stop condition of formula (11) is met; the resulting detail-enhanced image is then given by formula (12):
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211349133.0A CN115937016B (en) | 2022-10-31 | 2022-10-31 | Contrast enhancement method for guaranteeing image details |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211349133.0A CN115937016B (en) | 2022-10-31 | 2022-10-31 | Contrast enhancement method for guaranteeing image details |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115937016A true CN115937016A (en) | 2023-04-07 |
CN115937016B CN115937016B (en) | 2023-07-21 |
Family
ID=86653346
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211349133.0A Active CN115937016B (en) | 2022-10-31 | 2022-10-31 | Contrast enhancement method for guaranteeing image details |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115937016B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100278423A1 (en) * | 2009-04-30 | 2010-11-04 | Yuji Itoh | Methods and systems for contrast enhancement |
CN101951523A (en) * | 2010-09-21 | 2011-01-19 | 北京工业大学 | Adaptive colour image processing method and system |
US20140267378A1 (en) * | 2013-03-15 | 2014-09-18 | Eric Olsen | Global contrast correction |
CN105957030A (en) * | 2016-04-26 | 2016-09-21 | 成都市晶林科技有限公司 | Infrared thermal imaging system image detail enhancing and noise inhibiting method |
CN112037144A (en) * | 2020-08-31 | 2020-12-04 | 哈尔滨理工大学 | Low-illumination image enhancement method based on local contrast stretching |
- 2022: 2022-10-31 CN application CN202211349133.0A, patent CN115937016B, status Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100278423A1 (en) * | 2009-04-30 | 2010-11-04 | Yuji Itoh | Methods and systems for contrast enhancement |
CN101951523A (en) * | 2010-09-21 | 2011-01-19 | 北京工业大学 | Adaptive colour image processing method and system |
US20140267378A1 (en) * | 2013-03-15 | 2014-09-18 | Eric Olsen | Global contrast correction |
CN105957030A (en) * | 2016-04-26 | 2016-09-21 | 成都市晶林科技有限公司 | Infrared thermal imaging system image detail enhancing and noise inhibiting method |
CN112037144A (en) * | 2020-08-31 | 2020-12-04 | 哈尔滨理工大学 | Low-illumination image enhancement method based on local contrast stretching |
Non-Patent Citations (2)
Title |
---|
- Yi Yufeng; Tian Hong; Liu Tongyu: "Histogram reconstruction image detail enhancement algorithm", Computer Engineering and Applications, no. 13 *
- Zhang Tingting; Qi Wei; Cao Feng; Li Wei: "Research on adaptive infrared image enhancement algorithm based on global and local features", Information & Computer (Theoretical Edition), no. 03 *
Also Published As
Publication number | Publication date |
---|---|
CN115937016B (en) | 2023-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Agrawal et al. | A novel joint histogram equalization based image contrast enhancement | |
US11127122B2 (en) | Image enhancement method and system | |
Gao et al. | Naturalness preserved nonuniform illumination estimation for image enhancement based on retinex | |
Ju et al. | Single image dehazing via an improved atmospheric scattering model | |
WO2022199583A1 (en) | Image processing method and apparatus, computer device, and storage medium | |
CN111583123A (en) | Wavelet transform-based image enhancement algorithm for fusing high-frequency and low-frequency information | |
CN108564549B (en) | Image defogging method based on multi-scale dense connection network | |
CN103218778B (en) | The disposal route of a kind of image and video and device | |
CN102982513B (en) | A kind of adapting to image defogging method capable based on texture | |
CN104463804B (en) | Image enhancement method based on intuitional fuzzy set | |
CN109872285A (en) | A kind of Retinex low-luminance color image enchancing method based on variational methods | |
CN110956632A (en) | Method and device for automatically detecting pectoralis major region in molybdenum target image | |
Feng et al. | Low-light image enhancement algorithm based on an atmospheric physical model | |
CN112053302A (en) | Denoising method and device for hyperspectral image and storage medium | |
Pandey et al. | A fast and effective vision enhancement method for single foggy image | |
CN104616259A (en) | Non-local mean image de-noising method with noise intensity self-adaptation function | |
CN110992287B (en) | Method for clarifying non-uniform illumination video | |
Hua et al. | Low-light image enhancement based on joint generative adversarial network and image quality assessment | |
CN115660994B (en) | Image enhancement method based on regional least square estimation | |
CN116630198A (en) | Multi-scale fusion underwater image enhancement method combining self-adaptive gamma correction | |
CN103595933A (en) | Method for image noise reduction | |
CN108492264B (en) | Single-frame image fast super-resolution method based on sigmoid transformation | |
CN116452447A (en) | Low-illumination high-definition image processing method | |
CN114240990B (en) | SAR image point target segmentation method | |
Malik et al. | Contrast enhancement and smoothing of CT images for diagnosis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||