CN115937016B - Contrast enhancement method for guaranteeing image details - Google Patents
- Publication number
- CN115937016B CN115937016B CN202211349133.0A CN202211349133A CN115937016B CN 115937016 B CN115937016 B CN 115937016B CN 202211349133 A CN202211349133 A CN 202211349133A CN 115937016 B CN115937016 B CN 115937016B
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
A contrast enhancement method that preserves image details, belonging to the technical field of image processing. The invention addresses the problems that traditional contrast enhancement algorithms have limited ability to improve the visual effect of an image and tend to blur image details. First, a nonlinear global gray mapping function is constructed; the gray range is decomposed into three intervals, and the cumulative probability distribution of the gray value with the highest probability of occurrence is computed for each interval. Taking the most probable gray value of each interval and its cumulative probability as known data points, the unknown parameters of the mapping function are estimated by Newton's iteration method, and the resulting expression is used to apply global gray mapping to the image to be enhanced. Second, detail enhancement is achieved by adding a detail compensation term to the globally gray-mapped image, the compensation term being determined by an iterative algorithm. The method can be applied in the technical field of image processing.
Description
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a contrast enhancement method for ensuring image details.
Background
The illumination conditions of the imaging environment directly determine image quality. In many situations, visible light in the photographed scene is unevenly distributed because of object occlusion and terrain, so the acquired digital image contains shadows and highlights that degrade the visual effect. To suppress the problem of uneven visible-light distribution, back-end processing, i.e., contrast enhancement, is generally applied to digital images after camera imaging to improve image quality.
Classical contrast enhancement algorithms include: histogram equalization, gamma correction, logarithmic-domain image enhancement, contrast stretching, and so on. These algorithms can improve image quality in shadow and highlight regions to some extent, but uneven visible-light distribution still limits the visual effect of the image. In addition, classical contrast enhancement algorithms typically stretch the image contrast with a globally uniform gray mapping function, which tends to blur image details.
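For reference, global histogram equalization, one of the classical algorithms named above, can be sketched as follows. It illustrates the "globally uniform gray mapping" the patent contrasts itself with: every pixel with the same input gray value receives the same output value regardless of its neighborhood, which is why fine local detail can be flattened. The function names here are illustrative, not from the patent.

```python
# Minimal sketch of classical global histogram equalization on an 8-bit
# grayscale image stored as a list of lists.
def histogram_equalize(img):
    h, w = len(img), len(img[0])
    n = h * w
    hist = [0] * 256
    for row in img:
        for v in row:
            hist[v] += 1
    # Cumulative distribution function, rescaled back to [0, 255];
    # this single lookup table is applied uniformly to every pixel.
    cdf, acc = [0] * 256, 0
    for g in range(256):
        acc += hist[g]
        cdf[g] = round(255 * acc / n)
    return [[cdf[v] for v in row] for row in img]
```

Because the mapping is one global lookup table, two pixels that differ only in their surroundings are mapped identically, with no local adaptation.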
In summary, the existing contrast enhancement algorithms have limited capabilities for improving the visual effect of the image, and the details of the image are easily blurred during the contrast enhancement process.
Disclosure of Invention
The invention aims to solve the problems that traditional contrast enhancement algorithms have limited ability to improve the visual effect of an image and easily blur image details. It provides a contrast enhancement method that guarantees image details, enhancing the contrast of the image while preserving its local details and reconstructing an enhanced image of higher quality.
The technical scheme adopted by the invention for solving the technical problems is as follows:
a contrast enhancement method for ensuring image detail, the method comprising the steps of:
step one, performing global gray mapping on an image to be enhanced to obtain an image subjected to global gray mapping;
the specific process of the first step is as follows:
Step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0,255] of the image to be enhanced into three gray intervals: [0,85], [86,171] and [172,255];
Step 1.2: find the gray value with the highest probability of occurrence in each gray interval; denote the most probable gray value in [0,85] as T_l, the most probable gray value in [86,171] as T_m, and the most probable gray value in [172,255] as T_h;
Step 1.3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0,T_l], [0,T_m] and [0,T_h]; denote the cumulative probability over [0,T_l] as P_l, the cumulative probability over [0,T_m] as P_m, and the cumulative probability over [0,T_h] as P_h;
Step 1.4: normalize T_l, T_m and T_h, i.e. T_l* = T_l/255, T_m* = T_m/255 and T_h* = T_h/255, where T_l*, T_m* and T_h* are the normalized values corresponding to T_l, T_m and T_h respectively;
Step 1.5: from the normalized gray values T_l*, T_m*, T_h* and their corresponding cumulative probabilities P_l, P_m, P_h, together with the gray normalization values 0 and 1 of the image to be enhanced and their corresponding cumulative probabilities, obtain 5 two-dimensional coordinates, namely (0, 0), (T_l*, P_l), (T_m*, P_m), (T_h*, P_h) and (1, 1);
Step 1.6: construct a global nonlinear gray mapping function as shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by Newton's iteration method to obtain optimal estimates of the 5 undetermined coefficients, denoted a*, b*, c*, d* and e* respectively;
Step 1.8: substitute the optimal estimates of the 5 undetermined coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and apply it to the image to be enhanced to obtain the globally gray-mapped image;
Step 2: perform detail enhancement on the globally gray-mapped image obtained in step 1 to obtain the detail-enhanced image.
The beneficial effects of the invention are as follows:
First, a nonlinear global gray mapping function with 5 unknown parameters is constructed; the gray range is decomposed into three intervals, and the cumulative probability distribution of the most probable gray value in each interval is computed. Taking the most probable gray value of each interval and its cumulative probability as known data points, the unknown parameters are estimated by Newton's iteration method, which determines the expression of the global gray mapping function used to map the image to be enhanced. Second, detail enhancement is achieved by adding a detail compensation term to the globally gray-mapped image, the term being determined by an iterative algorithm. Experimental results show that the algorithm suppresses uneven visible-light distribution, strengthens the overall sense of illumination layering, effectively improves the visual effect of the image, and enhances its local detail features.
Drawings
FIG. 1 is a first image to be enhanced;
FIG. 2 is a schematic diagram of a global gray mapping function corresponding to FIG. 1;
FIG. 3 is an enhanced image corresponding to FIG. 1;
FIG. 4 is a second image to be enhanced;
FIG. 5 is a schematic diagram of the global gray mapping function corresponding to FIG. 4;
FIG. 6 is an enhanced image corresponding to FIG. 4;
FIG. 7 is a third image to be enhanced;
FIG. 8 is a schematic diagram of a global gray mapping function corresponding to FIG. 7;
fig. 9 is an enhanced image corresponding to fig. 7.
Detailed Description
The first embodiment of the present invention provides a contrast enhancement method for ensuring image details, the method specifically including the following steps:
step one, performing global gray mapping on an image to be enhanced to obtain an image subjected to global gray mapping;
the specific process of the first step is as follows:
Step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0,255] of the image to be enhanced into three gray intervals: [0,85], [86,171] and [172,255];
Step 1.2: find the gray value with the highest probability of occurrence in each gray interval; denote the most probable gray value in [0,85] as T_l, the most probable gray value in [86,171] as T_m, and the most probable gray value in [172,255] as T_h;
That is, among the gray values of the pixels in the interval [0,85], the value that occurs most often is T_l; among the gray values in [86,171], the value that occurs most often is T_m; and among the gray values in [172,255], the value that occurs most often is T_h;
Step 1.3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0,T_l], [0,T_m] and [0,T_h]; denote the cumulative probability over [0,T_l] as P_l, the cumulative probability over [0,T_m] as P_m, and the cumulative probability over [0,T_h] as P_h;
The cumulative probability P_l is computed as follows: count the number of pixels whose gray value lies in the interval [0,T_l], then divide this count by the total number of pixels in the image to be enhanced to obtain P_l; P_m and P_h are computed analogously;
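Steps 1.2 and 1.3 can be sketched as follows. This is an illustrative implementation, not the patent's own code; the function name and return convention are assumptions.

```python
# Sketch of steps 1.2-1.3: find the most frequent gray value in each of
# the three intervals [0,85], [86,171], [172,255], then compute the
# cumulative probabilities P_l, P_m, P_h of the ranges [0,T_l], [0,T_m],
# [0,T_h] over the whole image.
def interval_stats(img):
    hist = [0] * 256
    n = 0
    for row in img:
        for v in row:
            hist[v] += 1
            n += 1
    modes = []
    for lo, hi in [(0, 85), (86, 171), (172, 255)]:
        # Gray value with the highest pixel count inside the interval.
        modes.append(max(range(lo, hi + 1), key=lambda g: hist[g]))
    # Cumulative probability up to and including each mode value.
    probs = [sum(hist[: t + 1]) / n for t in modes]
    return modes, probs  # (T_l, T_m, T_h), (P_l, P_m, P_h)
```

The cumulative probabilities are exactly the count-then-divide computation the description prescribes for P_l.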
Step 1.4: normalize T_l, T_m and T_h, i.e. T_l* = T_l/255, T_m* = T_m/255 and T_h* = T_h/255, where T_l*, T_m* and T_h* are the normalized values corresponding to T_l, T_m and T_h respectively;
Step 1.5: from the normalized gray values T_l*, T_m*, T_h* and their corresponding cumulative probabilities P_l, P_m, P_h, together with the gray normalization values 0 and 1 of the image to be enhanced and their corresponding cumulative probabilities (the cumulative probability corresponding to gray normalization value 0 is 0, and that corresponding to gray normalization value 1 is 1), obtain 5 two-dimensional coordinates, namely (0, 0), (T_l*, P_l), (T_m*, P_m), (T_h*, P_h) and (1, 1);
Step 1.6: construct a global nonlinear gray mapping function as shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by Newton's iteration method to obtain optimal estimates of the 5 undetermined coefficients, denoted a*, b*, c*, d* and e* respectively;
Step 1.8: substitute the optimal estimates of the 5 undetermined coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and apply it to the image to be enhanced to obtain the globally gray-mapped image;
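The fitting in steps 1.5 to 1.8 can be sketched as follows. The exact expression of formula (1) appears only as an image in the original publication and is not reproduced in this text, so this sketch substitutes an assumed quartic polynomial f(l) = a·l⁴ + b·l³ + c·l² + d·l + e as the 5-parameter mapping; with 5 data points and 5 coefficients the fit is exact interpolation, and for a polynomial it reduces to a linear system. For a genuinely nonlinear parameterization one would instead run Newton's iteration on the residual equations from the initial guess a = b = c = d = e = 1, as step 1.7 describes. All function names here are illustrative.

```python
# Fit an assumed 5-parameter polynomial mapping through the five
# (normalized gray, cumulative probability) points of step 1.5.
def fit_mapping(points):
    # points: five (l, p) pairs, e.g. (0,0), (T_l*,P_l), ..., (1,1).
    A = [[l**4, l**3, l**2, l, 1.0] for l, _ in points]
    y = [p for _, p in points]
    n = len(A)
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        y[col], y[piv] = y[piv], y[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            y[r] -= f * y[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * coef[c] for c in range(r + 1, n))
        coef[r] = (y[r] - s) / A[r][r]
    return coef  # a, b, c, d, e

def apply_mapping(coef, l):
    # l is a normalized gray value in [0, 1].
    a, b, c, d, e = coef
    return a*l**4 + b*l**3 + c*l**2 + d*l + e
```

Applying `apply_mapping` to every normalized pixel value of L and rescaling to [0, 255] yields the globally gray-mapped image I of step 1.8.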
Step 2: perform detail enhancement on the globally gray-mapped image obtained in step 1 to obtain the detail-enhanced image.
When detail-enhancing a pixel of the globally gray-mapped image, its eight neighborhood pixels are required. If the pixel lies on or near the image border, so that the eight-neighborhood centered on it does not fully exist within the globally gray-mapped image, that pixel is left without detail enhancement.
Second embodiment: this embodiment differs from the first embodiment in that the specific process of step 2 is as follows:
the detail enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H(x, y) denotes the gray value of the detail-enhanced image at plane coordinate (x, y), and ΔI(x, y) is the detail compensation term at plane coordinate (x, y).
Third embodiment: this embodiment differs from the second embodiment in that the detail compensation term ΔI(x, y) satisfies formula (3):
where μ_I(x, y) denotes the average gray level of the pixels in the set N'_{x,y}, consisting of pixel (x, y) and the eight neighborhood pixels centered on it in the globally gray-mapped image, and μ_L(x, y) denotes the average gray level of the pixels in the corresponding set N_{x,y} of pixel (x, y) and its eight neighborhood pixels in the image to be enhanced;
formula (3) is converted, via the variance computation, into the expression shown in formula (5):
where σ_I²(x, y) denotes the variance of the gray levels of the pixels in N'_{x,y}, and σ_L²(x, y) denotes the variance of the gray levels of the pixels in N_{x,y};
separating the center pixel from its 8 neighborhood pixels on the left-hand side of formula (5) gives the expression shown in formula (7):
wherein, i 'has values of-1 and 1, and j' has values of-1 and 1;
rearranging formula (7) yields the expression for ΔI(x, y) shown in formula (8):
and (5) solving a detail compensation term in the formula (8) by adopting an iteration method.
Fourth embodiment: this embodiment differs from the third embodiment in that μ_I(x, y) and μ_L(x, y) are computed as in formula (4), i.e. as the mean over the 3×3 neighborhood:

μ_I(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} I(x+i, y+j),  μ_L(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} L(x+i, y+j)  (4)

where i and j take the values −1, 0, 1.
Fifth embodiment: this embodiment differs from the fourth embodiment in that σ_I²(x, y) and σ_L²(x, y) are computed as in formula (6), i.e. as the variance over the 3×3 neighborhood:

σ_I²(x, y) = (1/9) Σ_{i=−1}^{1} Σ_{j=−1}^{1} (I(x+i, y+j) − μ_I(x, y))²  (6)

and σ_L²(x, y) analogously with L and μ_L.
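Formulas (4) and (6) amount to the mean and variance over the 3×3 neighborhood centered on (x, y). A minimal sketch (illustrative naming, not the patent's code), which skips border pixels whose full eight-neighborhood does not exist, as the description prescribes:

```python
# 3x3 neighborhood mean (formula (4)) and variance (formula (6)) at an
# interior pixel (x, y) of a grayscale image stored as a list of lists.
# Callers must ensure 1 <= x <= h-2 and 1 <= y <= w-2, i.e. the full
# eight-neighborhood exists; border pixels are not detail-enhanced.
def local_mean_var(img, x, y):
    vals = [img[x + i][y + j] for i in (-1, 0, 1) for j in (-1, 0, 1)]
    mu = sum(vals) / 9.0
    var = sum((v - mu) ** 2 for v in vals) / 9.0
    return mu, var
```

Computed once on L and once on I, these give the μ_L, μ_I, σ_L², σ_I² values entering formulas (3), (5) and (8).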
Sixth embodiment: this embodiment differs from the fifth embodiment in that the detail compensation term in formula (8) is solved by an iterative method, as follows:
the initial value ΔI^(0)(x+i′, y+j′) of ΔI(x+i′, y+j′) is given by formula (9):
ΔI^(0)(x+i′, y+j′) = L(x+i′, y+j′) − I(x+i′, y+j′)  (9)
the iterative process is shown in formula (10):
where the superscript (t) denotes the t-th iteration;
setting an iteration stop condition as shown in formula (11):
max(|ΔI^(t)(x, y) − ΔI^(t−1)(x, y)|) ≤ 0.01  (11)
Satisfying formula (11) means that |ΔI^(t)(x, y) − ΔI^(t−1)(x, y)| ≤ 0.01 holds for every pixel of the image. Let t_0 denote the number of iterations at which the stopping condition of formula (11) is first satisfied; the resulting detail-enhanced image is then given by formula (12):
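The iteration of formulas (9) to (11) can be skeletonized as follows. The update rule of formula (10) appears only as an image in the original publication, so here the update step is a hypothetical caller-supplied placeholder; the sketch shows only the initialization (9) and the stopping test (11).

```python
# Skeleton of the iterative solution of the detail compensation term.
# `update` stands in for the (not reproduced) formula (10): it maps the
# current compensation field to the next one, given L and I.
def solve_delta(L, I, update, max_iter=100):
    h, w = len(L), len(L[0])
    # Initialization per formula (9): delta^(0) = L - I.
    delta = [[L[x][y] - I[x][y] for y in range(w)] for x in range(h)]
    for _ in range(max_iter):
        new = update(delta, L, I)
        # Stopping condition per formula (11): max pixelwise change <= 0.01.
        change = max(abs(new[x][y] - delta[x][y])
                     for x in range(h) for y in range(w))
        delta = new
        if change <= 0.01:
            break
    return delta
```

The `max_iter` guard is an added safety cap, not part of the patent's description; the converged field is then added to I per formula (2) to form the detail-enhanced image H.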
experimental results and analysis
The hardware platform used for the experimental simulation is a notebook computer with an AMD Ryzen 7 5800H CPU and an NVIDIA GeForce RTX 3070 graphics card. The simulation software is Matlab 2016a. The input and output of the algorithm are bitmap files in bmp format. The experimental results are shown in FIGS. 1 to 9: FIGS. 1, 4 and 7 are three images to be enhanced, FIGS. 2, 5 and 8 are the global gray mapping functions corresponding to FIGS. 1, 4 and 7 respectively, and FIGS. 3, 6 and 9 are the enhanced images corresponding to FIGS. 1, 4 and 7 respectively.
Comparing the images to be enhanced with the enhanced images shows that the algorithm designed by the invention effectively improves image contrast. The sharpness, visual effect and local detail features of the enhanced images are all noticeably improved: for example, the reflective surface photographed in FIG. 1, the reflective car-window region in FIG. 4, and both the low-illumination coral-reef surface and the highlight of the underwater lamp in FIG. 7 are obviously suppressed. Compared with the original images, the overall brightness contrast of the enhanced images is better suited to subjective human observation. The algorithm designed by the invention therefore effectively enhances image contrast, with vivid illumination layering and rich detail in the enhanced images.
The above examples merely describe the calculation model and calculation flow of the invention in detail and do not limit its embodiments. Other variations and modifications will be apparent to those of ordinary skill in the art; it is not possible to enumerate all embodiments, and all obvious variations derived from the technical scheme of the invention fall within its scope of protection.
Claims (1)
1. A method for contrast enhancement ensuring image detail, said method comprising the steps of:
step one, performing global gray mapping on an image to be enhanced to obtain an image subjected to global gray mapping;
the specific process of the first step is as follows:
Step 1.1: denote the gray value of the image to be enhanced at plane coordinate (x, y) as L(x, y), and divide the gray range [0,255] of the image to be enhanced into three gray intervals: [0,85], [86,171] and [172,255];
Step 1.2: find the gray value with the highest probability of occurrence in each gray interval; denote the most probable gray value in [0,85] as T_l, the most probable gray value in [86,171] as T_m, and the most probable gray value in [172,255] as T_h;
Step 1.3: compute the cumulative probabilities of the image to be enhanced over the gray intervals [0,T_l], [0,T_m] and [0,T_h]; denote the cumulative probability over [0,T_l] as P_l, the cumulative probability over [0,T_m] as P_m, and the cumulative probability over [0,T_h] as P_h;
Step 1.4: normalize T_l, T_m and T_h, i.e. T_l* = T_l/255, T_m* = T_m/255 and T_h* = T_h/255, where T_l*, T_m* and T_h* are the normalized values corresponding to T_l, T_m and T_h respectively;
Step 1.5: from the normalized gray values T_l*, T_m*, T_h* and their corresponding cumulative probabilities P_l, P_m, P_h, together with the gray normalization values 0 and 1 of the image to be enhanced and their corresponding cumulative probabilities, obtain 5 two-dimensional coordinates, namely (0, 0), (T_l*, P_l), (T_m*, P_m), (T_h*, P_h) and (1, 1);
Step 1.6: construct a global nonlinear gray mapping function as shown in formula (1):
where a, b, c, d and e are 5 undetermined coefficients, and I(x, y) denotes the gray value at plane coordinate (x, y) after global gray mapping of the image to be enhanced;
Step 1.7: set the initial values of the 5 undetermined coefficients a, b, c, d, e to 1, take the 5 two-dimensional coordinates obtained in step 1.5 as known data points, and fit formula (1) by Newton's iteration method to obtain optimal estimates of the 5 undetermined coefficients, denoted a*, b*, c*, d* and e* respectively;
Step 1.8: substitute the optimal estimates of the 5 undetermined coefficients into formula (1) to obtain the optimal global nonlinear gray mapping function, and apply it to the image to be enhanced to obtain the globally gray-mapped image;
Step 2: perform detail enhancement on the globally gray-mapped image obtained in step 1 to obtain the detail-enhanced image;
the specific process of the second step is as follows:
the detail enhanced image H is:
H(x,y)=I(x,y)+ΔI(x,y) (2)
where H(x, y) denotes the gray value of the detail-enhanced image at plane coordinate (x, y), and ΔI(x, y) is the detail compensation term at plane coordinate (x, y);
the detail compensation term Δi (x, y) satisfies the formula (3):
where μ_I(x, y) denotes the average gray level of the pixels in the set N'_{x,y}, consisting of pixel (x, y) and the eight neighborhood pixels centered on it in the globally gray-mapped image, and μ_L(x, y) denotes the average gray level of the pixels in the corresponding set N_{x,y} of pixel (x, y) and its eight neighborhood pixels in the image to be enhanced;
formula (3) is converted, via the variance computation, into the expression shown in formula (5):
where σ_I²(x, y) denotes the variance of the gray levels of the pixels in N'_{x,y}, and σ_L²(x, y) denotes the variance of the gray levels of the pixels in N_{x,y};
separating the center pixel from its 8 neighborhood pixels on the left-hand side of formula (5) gives the expression shown in formula (7):
wherein, i 'has values of-1 and 1, and j' has values of-1 and 1;
rearranging formula (7) yields the expression for ΔI(x, y) shown in formula (8):
the detail compensation term in formula (8) is solved by an iterative method;
said μ_I(x, y) and μ_L(x, y) are computed as in formula (4):
where i and j take the values −1, 0, 1;
said σ_I²(x, y) and σ_L²(x, y) are computed as in formula (6):
the detail compensation term in formula (8) is solved by an iterative method as follows:
the initial value ΔI^(0)(x+i′, y+j′) of ΔI(x+i′, y+j′) is given by formula (9):
ΔI^(0)(x+i′, y+j′) = L(x+i′, y+j′) − I(x+i′, y+j′)  (9)
the iterative process is shown in formula (10):
where the superscript (t) denotes the t-th iteration;
setting an iteration stop condition as shown in formula (11):
max(|ΔI^(t)(x, y) − ΔI^(t−1)(x, y)|) ≤ 0.01  (11)
Let t_0 denote the number of iterations at which the stopping condition of formula (11) is first satisfied; the detail-enhanced image obtained is then as shown in formula (12):
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211349133.0A CN115937016B (en) | 2022-10-31 | 2022-10-31 | Contrast enhancement method for guaranteeing image details |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115937016A CN115937016A (en) | 2023-04-07 |
CN115937016B true CN115937016B (en) | 2023-07-21 |
Family
ID=86653346
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211349133.0A Active CN115937016B (en) | 2022-10-31 | 2022-10-31 | Contrast enhancement method for guaranteeing image details |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115937016B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112037144A (en) * | 2020-08-31 | 2020-12-04 | 哈尔滨理工大学 | Low-illumination image enhancement method based on local contrast stretching |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100278423A1 (en) * | 2009-04-30 | 2010-11-04 | Yuji Itoh | Methods and systems for contrast enhancement |
CN101951523B (en) * | 2010-09-21 | 2012-10-24 | 北京工业大学 | Adaptive colour image processing method and system |
US9324138B2 (en) * | 2013-03-15 | 2016-04-26 | Eric Olsen | Global contrast correction |
CN105957030B (en) * | 2016-04-26 | 2019-03-22 | 成都市晶林科技有限公司 | One kind being applied to the enhancing of thermal infrared imager image detail and noise suppressing method |
Non-Patent Citations (2)
Title |
---|
Research on an Adaptive Infrared Image Enhancement Algorithm Based on Global and Local Features; Zhang Tingting, Qi Wei, Cao Feng, Li Wei; Information & Computer (Theory Edition), No. 03 (full text) * |
An Image Detail Enhancement Algorithm via Histogram Reconstruction; Yi Yufeng, Tian Hong, Liu Tongyu; Computer Engineering and Applications, No. 13 (full text) * |
Also Published As
Publication number | Publication date |
---|---|
CN115937016A (en) | 2023-04-07 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |