CN113160096A - Low-light image enhancement method based on retina model - Google Patents


Info

Publication number: CN113160096A (application); CN113160096B (granted publication)
Authority: CN (China)
Application number: CN202110581353.5A
Filing date: 2021-05-27
Publication date (CN113160096A): 2021-07-23
Publication date (CN113160096B, grant): 2023-12-08
Other languages: Chinese (zh)
Inventors: 魏本征, 侯昊, 侯迎坤, 丁鹏
Assignee (original and current): Shandong University of Traditional Chinese Medicine
Legal status: Granted; Active
Prior art keywords: component, low, illumination component, image, final

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image


Abstract

The invention discloses a low-light image enhancement method based on a retina model, which belongs to the technical field of image processing and comprises the following steps. Step S1: obtain a group of similar pixels. Step S2: perform a Haar transform on the similar pixel group and obtain an illumination component and a reflection component in each of the R, G and B channels by means of a pixel-level non-local Haar transform. Step S3: determine the final reflection component. Step S4: determine the enhanced illumination component. Step S5: take the minimum of the enhanced illumination components as the final illumination component. Step S6: apply the final reflection component and the final illumination component to the retina model to obtain the enhanced image. The low-light image enhancement method is fast and effective; the colors of the processed image are not over-bright, the information of the original image is well preserved, the problem of uneven illumination after enhancement is well resolved, no spurious signals are introduced, and the edge information of the image is preserved with high fidelity.

Description

Low-light image enhancement method based on retina model
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a low-light image enhancement method based on a retina model.
Background
Low-light image enhancement aims to enhance images captured with low brightness in dim environments so as to obtain images with a good illumination effect. For example, photographs taken at night or in an enclosed environment are often too dark for their specific content to be recognized effectively, so low-light image enhancement has long been one of the popular research directions in the field of computer vision.
Most current low-light image enhancement methods are based on retina models, such as MSR, MSRCR, SIRE and RRM. Such methods often suffer from drawbacks such as an insignificant enhancement effect, color distortion and artifacts, and they are also time-consuming when processing images.
Disclosure of Invention
The present invention is directed to overcoming at least one of the above drawbacks of the prior art by providing a low-light image enhancement method based on a retina model, which effectively solves the problems of overexposure in locally over-brightened regions and of distortion in locally over-darkened regions, thereby obtaining a better low-light image enhancement effect.
The invention discloses a low-light image enhancement method based on a retina model, which comprises the following steps:
step S1: obtaining a group of similar pixels;
step S2: performing a Haar transform on the similar pixel group, and obtaining an illumination component and a reflection component in each of the R, G and B channels by using the low-frequency and high-frequency coefficients of a pixel-level non-local Haar transform;
step S3: taking the minimum of the reflection components of the R, G and B channels as the final reflection component;
step S4: taking the maximum of the illumination components of the R, G and B channels and enhancing it by a combination of exponential and logarithmic mappings to obtain the enhanced illumination components;
step S5: taking the minimum of the enhanced illumination components as the final illumination component;
step S6: applying the final reflection component and the final illumination component to the retina model to obtain the enhanced image.
Preferably, step S1 specifically comprises:
step S1a: performing block matching and row matching in each of the R, G and B channels of the RGB color space; with a certain sliding step, selecting a reference image block Br of a given size, performing block matching in a neighborhood of given size centered on the upper-left corner coordinate of Br, and obtaining the N2−1 image blocks most similar to Br, so that together with Br a group of N2 similar image blocks is obtained;
step S1b: stretching each of the image blocks into a column vector, denoted Vl (l = 1, 2, ..., N2), and splicing all the Vl into a matrix Mb whose number of rows equals the number of pixels in a block and which has N2 columns;
step S1c: selecting one row Rr of the matrix Mb as the reference row, computing the Euclidean distance between Rr and every other row, and finding the N3−1 rows most similar to it, which together with Rr form a similar pixel matrix Ms of size N3 × N2.
Preferably, step S1c is specifically:
taking the i-th row of the matrix Mb as the reference row and computing the Euclidean distance between the i-th row and every other row j as
d(i, j) = || Mb(i, :) − Mb(j, :) ||2;
then selecting the N3−1 rows with the smallest distance to the i-th row, which together with the i-th row form the similar pixel matrix Ms of size N3 × N2.
Preferably, the Haar transform in step S2 is specifically:
applying a separable lifting Haar transform to the similar pixel matrix Ms in the vertical and the horizontal direction, that is,
Ch = Hl * Ms * Hr,
where Ch is the spectral coefficient matrix after the Haar transform and Hl and Hr are Haar matrices.
Preferably, the illumination component in step S2 is obtained as follows:
define Ch(1,1) as the low-frequency coefficient; using only Ch(1,1), perform the inverse Haar transform and reconstruct the image to obtain the illumination component Il.
Preferably, the reflection component in step S2 is obtained as follows:
define Ch(1,1) as the low-frequency coefficient; using the remaining N3 × N2 − 1 transform coefficients of Ch other than Ch(1,1), perform the inverse Haar transform and reconstruct the image to obtain the reflection component Ir.
Preferably, step S4 specifically comprises:
step S4a: for the illumination component, obtaining the three illumination components of the R, G and B channels respectively, and comparing them to obtain the maximum illumination component;
step S4b: performing the enhancement with two different exponents, γ1 and γ2, where γ1 is computed from the maximum illumination component and γ2 is computed piecewise, taking one value if a brightness condition is satisfied and another value otherwise [the γ1 and γ2 formulas are given as equation images in the original publication];
step S4c: obtaining the first enhanced illumination component using γ1 [equation image not reproduced];
step S4d: obtaining the second enhanced illumination component using γ2 [equation image not reproduced].
Preferably, step S5 specifically comprises:
step S5a: obtaining the final illumination component as the element-wise minimum of the first enhanced illumination component and the second enhanced illumination component;
step S5b: normalizing the image to gray values in [0, 1];
step S5c: compressing the gray-scale range of the image.
Preferably, step S6 specifically comprises:
step S6a: applying the final enhanced illumination component and the final reflection component to the retina model, i.e.
Ie = Il ∘ Ir,
where Ie is the final enhanced image, Il is the final illumination component, Ir is the final reflection component and ∘ denotes the element-wise (dot) product;
step S6b: denoting Ie as Y', replacing the V channel in the HSV color space with Y', and converting back to the RGB color space to obtain the final enhanced color image.
Compared with the prior art, the invention has the following beneficial effects: the retina-model-based low-light image enhancement method is fast and effective; the colors of the processed image are not over-bright, the information of the original image is well preserved, the problem of uneven illumination after enhancement is well resolved, no spurious signals are introduced, and the edge information of the image is preserved with high fidelity.
Drawings
FIG. 1 is a flow chart of the low-light image enhancement method based on a retina model according to the present invention;
FIG. 2 is a first comparison of the image effects produced by the low-light image enhancement method of the present invention and by several prior-art low-light enhancement methods;
FIG. 3 is a second comparison of the image effects produced by the low-light image enhancement method of the present invention and by several prior-art low-light enhancement methods.
Detailed Description
The invention will be further described with reference to the accompanying drawings. The drawings illustrate specific embodiments of the invention only and are not to be construed as limiting the invention in any way. The specific embodiments are as follows.
As shown in FIG. 1, the present invention provides a low-light image enhancement method based on a retina model, which comprises the following steps:
step S1: obtaining a group of similar pixels;
step S2: performing a Haar transform on the similar pixel group, and obtaining an illumination component and a reflection component in each of the R, G and B channels by using the low-frequency and high-frequency coefficients of a pixel-level non-local Haar transform;
step S3: taking the minimum of the reflection components of the R, G and B channels as the final reflection component;
step S4: taking the maximum of the illumination components of the R, G and B channels and enhancing it by a combination of exponential and logarithmic mappings to obtain the enhanced illumination components;
step S5: taking the minimum of the enhanced illumination components as the final illumination component;
step S6: applying the final reflection component and the final illumination component to the retina model to obtain the enhanced image.
The method specifically comprises the following steps.
A group of similar pixels is obtained.
Given a low-light color image I ∈ R^(h×w×c) in the RGB color space, I is simultaneously converted from the RGB color space to the HSV color space.
Block matching and row matching are performed in each of the R, G and B channels of the RGB color space. With a certain sliding step, a reference image block Br of a given size is selected; block matching is performed in a neighborhood of given size centered on the upper-left corner coordinate of Br, and the N2−1 image blocks most similar to Br are obtained, so that together with Br there are N2 similar image blocks. Each of these image blocks is stretched into a column vector, denoted Vl (l = 1, 2, ..., N2), and all the Vl are spliced into a matrix Mb whose number of rows equals the number of pixels in a block and which has N2 columns.
To better exploit the self-similarity of the image, row matching is further performed on Mb.
One row Rr is selected as the reference row and its Euclidean distance to all other rows is computed in order to find the N3−1 rows most similar to it; together with Rr these rows form a similar pixel matrix Ms of size N3 × N2.
Specifically, taking the i-th row as the reference row, the Euclidean distance between the i-th row and every other row j is computed as
d(i, j) = || Mb(i, :) − Mb(j, :) ||2.
The N3−1 rows with the smallest distance to the i-th row are then selected; together with the i-th row they form the similar pixel matrix Ms of size N3 × N2.
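As an illustration of this similar-pixel-group construction (block matching followed by row matching), the following is a minimal NumPy sketch for a single channel and a single reference block. The block size, search-window radius and the values of N2 and N3 are arbitrary example values, and the helper name similar_pixel_matrix is ours, not the patent's.

```python
import numpy as np

def similar_pixel_matrix(channel, r0, c0, block=8, search=20, N2=16, N3=8):
    """Build the N3 x N2 similar-pixel matrix Ms for the reference block
    whose upper-left corner is (r0, c0) in a single image channel."""
    H, W = channel.shape
    ref = channel[r0:r0 + block, c0:c0 + block]

    # Block matching in a search window around the reference block (step S1a)
    candidates = []
    for r in range(max(0, r0 - search), min(H - block, r0 + search) + 1):
        for c in range(max(0, c0 - search), min(W - block, c0 + search) + 1):
            patch = channel[r:r + block, c:c + block]
            candidates.append((np.sum((patch - ref) ** 2), r, c))
    candidates.sort(key=lambda t: t[0])          # most similar first (includes the block itself)
    best = candidates[:N2]

    # Stretch each matched block into a column vector and stack them (step S1b)
    Mb = np.column_stack([
        channel[r:r + block, c:c + block].reshape(-1) for _, r, c in best
    ])                                            # Mb has block*block rows and N2 columns

    # Row matching by Euclidean distance to a reference row, here row 0 (step S1c)
    d = np.linalg.norm(Mb - Mb[0], axis=1)        # distance of every row to the reference row
    rows = np.argsort(d)[:N3]                     # the reference row itself has distance 0
    Ms = Mb[rows]                                 # N3 x N2 similar-pixel matrix
    return Ms

# Example usage on a random dark channel
rng = np.random.default_rng(0)
Y = rng.random((64, 64)) * 0.2
Ms = similar_pixel_matrix(Y, r0=16, c0=16)
print(Ms.shape)                                   # (8, 16)
```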
A separable Haar transform is performed on the similar pixel matrix Ms.
A separable lifting Haar transform is applied to Ms in the vertical and the horizontal direction, namely
Ch = Hl * Ms * Hr,
where Ch is the spectral coefficient matrix after the Haar transform and Hl and Hr are Haar matrices.
Owing to the separable lifting property of the Haar transform, Ch(1,1) is a weighted average of all the pixels of Ms, so it is defined as the low-frequency coefficient. Using only Ch(1,1), the inverse Haar transform is performed and the image is reconstructed to obtain the ideal illumination component Il; conversely, using the remaining N3 × N2 − 1 transform coefficients (i.e. the medium- and high-frequency coefficients), the inverse Haar transform is performed and the image is reconstructed to obtain the desired reflection component Ir. This procedure separates the illumination and reflection components of the image effectively and quickly and is a key step of the low-light image enhancement.
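The decomposition of Ms into illumination and reflection parts can be sketched as follows. For simplicity the sketch uses an orthonormal Haar transform matrix (for power-of-two sizes) rather than the lifting implementation mentioned in the text; the principle is the same: the single low-frequency coefficient Ch(1,1) (here Ch[0, 0]) yields the illumination estimate, and the remaining N3 × N2 − 1 coefficients yield the reflection estimate.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix of size n x n (n must be a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                   # averaging (low-pass) rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])  # differencing (high-pass) rows
    return np.vstack([top, bottom]) / np.sqrt(2.0)

def decompose(Ms):
    """Split Ms into illumination and reflection parts by keeping only,
    respectively discarding, the low-frequency coefficient Ch[0, 0]."""
    N3, N2 = Ms.shape
    Hv, Hh = haar_matrix(N3), haar_matrix(N2)
    Ch = Hv @ Ms @ Hh.T                      # forward transform (cf. Ch = Hl * Ms * Hr)

    C_low = np.zeros_like(Ch)
    C_low[0, 0] = Ch[0, 0]                   # low-frequency coefficient only
    C_high = Ch - C_low                      # the remaining N3*N2 - 1 coefficients

    Il = Hv.T @ C_low @ Hh                   # illumination estimate (inverse transform)
    Ir = Hv.T @ C_high @ Hh                  # reflection estimate (inverse transform)
    return Il, Ir

rng = np.random.default_rng(1)
Ms = rng.random((8, 16)) * 0.2
Il, Ir = decompose(Ms)
print(np.allclose(Il + Ir, Ms))              # True: the two parts sum back to Ms
```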
An enhancement operation is performed on the reflection component:
the three reflection components of the R, G and B channels are obtained respectively; they are compared, and the minimum component is selected as the final reflection component Ir.
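Selecting the final reflection component as the pixel-wise minimum over the three channels is a one-line operation; the array names below are placeholders for the per-channel reflection components obtained above.

```python
import numpy as np

# Ir_R, Ir_G, Ir_B: reflection components of the R, G, B channels (H x W arrays)
Ir_R, Ir_G, Ir_B = (np.random.rand(4, 4) for _ in range(3))   # dummy data for illustration
Ir_final = np.minimum(np.minimum(Ir_R, Ir_G), Ir_B)           # pixel-wise minimum over the three channels
```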
An enhancement operation is performed on the illumination component:
the three illumination components of the R, G and B channels are obtained respectively; they are compared, and the maximum component, i.e. the brightest illumination component, is obtained. The enhancement is then performed with two different exponents, γ1 and γ2, each computed from the maximum illumination component: γ1 is given by a closed-form expression, and γ2 is defined piecewise, taking one value if a brightness condition is satisfied and another value otherwise [the γ1 and γ2 formulas appear as equation images in the original publication]. The first enhanced illumination component is obtained using γ1, and the second enhanced illumination component is obtained using γ2 [equation images not reproduced].
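The structure of this illumination enhancement step can be sketched as follows. Since the γ1 and γ2 formulas are available only as equation images, the sketch substitutes generic adaptive gamma and logarithmic curves as stand-ins; it illustrates the flow (channel-wise maximum, two differently enhanced versions), not the exact formulas of the invention.

```python
import numpy as np

def enhance_illumination(Il_R, Il_G, Il_B, eps=1e-6):
    """Structure of step S4 with stand-in enhancement curves (the exact gamma
    formulas of the patent are not reproduced here)."""
    # Maximum (brightest) illumination component over the three channels
    Il_max = np.maximum(np.maximum(Il_R, Il_G), Il_B)
    Il_max = np.clip(Il_max, eps, 1.0)

    # Hypothetical adaptive exponents; the patent derives gamma1 and gamma2
    # from the illumination statistics with its own formulas.
    gamma1 = 1.0 - np.mean(Il_max)
    gamma2 = 1.0 + np.mean(Il_max)

    Il_e1 = Il_max ** gamma1                                  # first enhanced component (gamma curve)
    Il_e2 = np.log1p(gamma2 * Il_max) / np.log1p(gamma2)      # second enhanced component (log curve)
    return Il_e1, Il_e2

rng = np.random.default_rng(2)
e1, e2 = enhance_illumination(*(rng.random((4, 4)) * 0.3 for _ in range(3)))
```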
In practice it is found that if only an exponential transformation is used, the brightness of the low-brightness parts of the image is not raised sufficiently, which results in insufficient brightness; if only a logarithmic transformation is used, the brightness of the high-brightness parts increases too fast, which results in uneven brightness.
Therefore, in order to solve the above problems, the invention obtains the final illumination component as the element-wise minimum of the first and the second enhanced illumination component.
The image is then normalized to gray values in [0, 1] and its gray-scale range is compressed, so that the dark regions of the original image are brightened to a large degree while the bright regions change only slightly; this realizes the low-light enhancement effect and ensures that the enhancement result adapts itself to regions with different illumination.
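Combining the two enhanced components by a pixel-wise minimum and normalizing to [0, 1] can be sketched as follows; the min–max rescaling is one possible reading of the normalization step, and the subsequent gray-range compression, whose details are not given in the text, is omitted here.

```python
import numpy as np

def final_illumination(Il_e1, Il_e2):
    """Step S5: take the pixel-wise minimum of the two enhanced components
    and normalize the result to gray values in [0, 1]."""
    Il_final = np.minimum(Il_e1, Il_e2)
    lo, hi = Il_final.min(), Il_final.max()
    return (Il_final - lo) / (hi - lo + 1e-12)     # min-max normalization to [0, 1]
```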
The final illumination component and the final reflection component are applied to the retina model, i.e.
Ie = Il ∘ Ir,
where Ie is the final enhanced image and ∘ denotes the element-wise (dot) product.
Ie is denoted as Y'; the V channel in the HSV color space is replaced with Y', and the result is converted back to the RGB color space to obtain the final enhanced color image.
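Recombining the components and writing the result back through the V channel can be sketched as follows. The use of matplotlib.colors for the RGB–HSV conversion is our choice for the sketch, and the fake components in the usage example are for illustration only; any color-space conversion routine can be used.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def recombine(rgb, Il_final, Ir_final):
    """Step S6: Ie = Il * Ir element-wise, then replace the V channel of the
    HSV representation of the input image with Ie and convert back to RGB."""
    Ie = Il_final * Ir_final                  # retina model: element-wise (dot) product
    hsv = rgb_to_hsv(np.clip(rgb, 0.0, 1.0))  # rgb in [0, 1], shape (H, W, 3)
    hsv[..., 2] = np.clip(Ie, 0.0, 1.0)       # replace V with the enhanced luminance Y'
    return hsv_to_rgb(hsv)

# Dummy example: a dark random image with fake illumination/reflection components
rng = np.random.default_rng(3)
rgb = rng.random((32, 32, 3)) * 0.2
enhanced = recombine(rgb, Il_final=np.full((32, 32), 0.8), Ir_final=rgb.max(axis=2))
print(enhanced.shape, enhanced.min() >= 0.0, enhanced.max() <= 1.0)
```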
Enhancement experiments were carried out in MATLAB on a data set consisting of 200 low-light images randomly selected from the CVPR 2021 UG2+ challenge data set and on a further data set of 35 low-light images. The algorithm of the invention was executed to obtain the enhanced result images, which were compared with the classical prior-art methods HE, MSRCR, CVC, NPE, SIRE, MF, WVM, CRM, BIMEF, LIME, JieP and STAR; the image effects are shown in FIG. 2 and FIG. 3. The CVPR 2021 UG2+ challenge data set is available at http://cvpr2021.ug2challenge.org/dataset21_t1.html.
As can be seen from FIG. 2 and FIG. 3, the colors of the image enhanced by the method of the invention are not over-bright, the information of the original image is well preserved, the problem of uneven illumination after enhancement is well resolved, no spurious signals are introduced, and the edge information of the image is preserved with high fidelity.
The NIQE, LOE, TMQI and FSIM values of the images enhanced by the method of the invention and by the prior-art methods are compared in the following table:

Method                       NIQE    LOE      TMQI     FSIM
HE                           3.62    740.30   0.9220   0.7174
MSRCR                        3.17    702.85   0.8506   0.6969
CVC                          3.11    654.82   0.8715   0.8578
NPE                          3.22    710.21   0.8891   0.8193
SIRE                         3.01    637.70   0.8680   0.8991
MF                           3.38    776.41   0.8997   0.8233
WVM                          2.99    633.40   0.8674   0.8999
CRM                          3.13    744.61   0.8964   0.8123
BIMEF                        3.04    703.16   0.9017   0.8898
LIME                         3.39    779.73   0.8791   0.7131
JieP                         2.99    724.52   0.8766   0.8749
STAR                         2.93    677.43   0.8784   0.9047
The method of the invention  2.76    546.63   0.8616   0.9250
It should be noted that lower NIQE, LOE and TMQI values indicate higher image quality, and a higher FSIM value indicates higher image quality.
As can be seen from the data in the table, the four index values of the images processed by the low-light image enhancement method of the invention are better than those of the prior-art low-light image enhancement methods.
Thus, it should be understood by those skilled in the art that while exemplary embodiments of the present invention have been illustrated and described in detail herein, many other variations or modifications which are consistent with the principles of the invention may be directly determined or derived from the disclosure of the present invention without departing from the spirit and scope of the invention. Accordingly, the scope of the invention should be understood and interpreted to cover all such other variations or modifications.

Claims (9)

1. A low-light image enhancement method based on a retina model, comprising:
step S1: obtaining a group of similar pixels;
step S2: performing a Haar transform on the similar pixel group, and obtaining an illumination component and a reflection component in each of the R, G and B channels by using the low-frequency and high-frequency coefficients of a pixel-level non-local Haar transform;
step S3: taking the minimum of the reflection components of the R, G and B channels as the final reflection component;
step S4: taking the maximum of the illumination components of the R, G and B channels and enhancing it by a combination of exponential and logarithmic mappings to obtain the enhanced illumination components;
step S5: taking the minimum of the enhanced illumination components as the final illumination component;
step S6: applying the final reflection component and the final illumination component to the retina model to obtain an enhanced image.
2. The low-light image enhancement method according to claim 1, wherein step S1 specifically comprises:
step S1a: performing block matching and row matching in each of the R, G and B channels of the RGB color space; with a certain sliding step, selecting a reference image block Br of a given size, performing block matching in a neighborhood of given size centered on the upper-left corner coordinate of Br, and obtaining the N2−1 image blocks most similar to Br, so that together with Br a group of N2 similar image blocks is obtained;
step S1b: stretching each of the image blocks into a column vector, denoted Vl (l = 1, 2, ..., N2), and splicing all the Vl into a matrix Mb whose number of rows equals the number of pixels in a block and which has N2 columns;
step S1c: selecting one row Rr of the matrix Mb as the reference row, computing the Euclidean distance between Rr and every other row, and finding the N3−1 rows most similar to it, which together with Rr form a similar pixel matrix Ms of size N3 × N2.
3. The low-light image enhancement method according to claim 2, wherein step S1c is specifically:
taking the i-th row of the matrix Mb as the reference row and computing the Euclidean distance between the i-th row and every other row j as
d(i, j) = || Mb(i, :) − Mb(j, :) ||2;
then selecting the N3−1 rows with the smallest distance to the i-th row, which together with the i-th row form the similar pixel matrix Ms of size N3 × N2.
4. The low-light image enhancement method according to claim 3, wherein the Haar transform in step S2 is specifically:
applying a separable lifting Haar transform to the similar pixel matrix Ms in the vertical and the horizontal direction, that is,
Ch = Hl * Ms * Hr,
where Ch is the spectral coefficient matrix after the Haar transform and Hl and Hr are Haar matrices.
5. The low-light image enhancement method according to claim 4, wherein the illumination component in step S2 is obtained by:
defining Ch(1,1) as the low-frequency coefficient and, using only Ch(1,1), performing the inverse Haar transform and reconstructing the image to obtain the illumination component Il.
6. The low-light image enhancement method according to claim 4, wherein the reflection component in step S2 is obtained by:
defining Ch(1,1) as the low-frequency coefficient and, using the remaining N3 × N2 − 1 transform coefficients of Ch other than Ch(1,1), performing the inverse Haar transform and reconstructing the image to obtain the reflection component Ir.
7. The low-light image enhancement method according to claim 1, wherein step S4 specifically comprises:
step S4a: for the illumination component, obtaining the three illumination components of the R, G and B channels respectively, and comparing them to obtain the maximum illumination component;
step S4b: performing the enhancement with two different exponents, γ1 and γ2, wherein γ1 is computed from the maximum illumination component and γ2 is computed piecewise, taking one value if a brightness condition is satisfied and another value otherwise [the γ1 and γ2 formulas are given as equation images in the original publication];
step S4c: obtaining the first enhanced illumination component using γ1 [equation image not reproduced];
step S4d: obtaining the second enhanced illumination component using γ2 [equation image not reproduced].
8. The low-light image enhancement method according to claim 7, wherein step S5 specifically comprises:
step S5a: obtaining the final illumination component as the element-wise minimum of the first enhanced illumination component and the second enhanced illumination component;
step S5b: normalizing the image to gray values in [0, 1];
step S5c: compressing the gray-scale range of the image.
9. The low-light image enhancement method according to claim 8, wherein step S6 specifically comprises:
step S6a: applying the final enhanced illumination component and the final reflection component to the retina model, i.e.
Ie = Il ∘ Ir,
where Ie is the final enhanced image and ∘ denotes the element-wise (dot) product;
step S6b: denoting Ie as Y', replacing the V channel in the HSV color space with Y', and converting back to the RGB color space to obtain the final enhanced color image.



Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant