CN111489302A - Maritime image enhancement method in fog environment - Google Patents


Info

Publication number
CN111489302A
CN111489302A
Authority
CN
China
Prior art keywords
image
network
noise
transmittance
marine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010231300.6A
Other languages
Chinese (zh)
Other versions
CN111489302B (en
Inventor
刘�文
卢南华
孙睿涵
郭彧
崔振华
胡旭东
马全党
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN202010231300.6A priority Critical patent/CN111489302B/en
Publication of CN111489302A publication Critical patent/CN111489302A/en
Application granted granted Critical
Publication of CN111489302B publication Critical patent/CN111489302B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a maritime image enhancement method for fog environments. The method first estimates an initial transmittance map and the atmospheric light value with the dark channel prior algorithm, then corrects the transmittance map with a sky region soft segmentation method, then optimizes the transmittance map with a convolutional neural network, and finally recovers the fog-free image through the inverted atmospheric light scattering model. In marine fog images the sky region usually occupies a large proportion of the frame and differs in texture structure from ordinary images, so the sky region restored by the traditional dark channel prior algorithm shows obvious distortion; the invention provides an enhancement method that effectively solves this problem.

Description

Maritime image enhancement method in fog environment
Technical Field
The invention relates to an image enhancement technology, in particular to a marine image enhancement method in a fog environment.
Background
The concentration of fog depends on the scene depth at different locations. Various image enhancement methods have been widely used to remove haze from a single image, and these methods can be broadly divided into two main categories: enhancement-based methods and physics-based methods. Enhancement-based defogging methods, such as histogram-based, contrast-based, and saturation-based approaches, do not take into account the intrinsic physics of fog formation; they only enhance image contrast with classical image enhancement techniques, and therefore cannot improve image quality in the presence of non-uniform haze. In contrast, physics-based approaches focus primarily on the degradation mechanism of the image. Both theoretical and experimental studies have shown that physics-based defogging methods generally produce higher image quality under different imaging conditions.
Based on empirical statistics over haze-free images, the Dark Channel Prior (DCP) method was proposed; it observes that in most non-sky patches, at least one color channel has some pixels with very low intensity. Under the dark channel prior, the haze can be estimated and removed using an atmospheric scattering model. However, DCP removes haze poorly in sky regions and is computationally intensive, and improved algorithms have been proposed to overcome these limitations: Nishino et al. model the image with a factorial MRF to estimate scene radiance more accurately; Meng et al. propose an effective regularized dehazing method that recovers the haze-free image through inherent boundary constraints; machine learning frameworks have been used to learn haze-relevant features and transmission coefficients; and convolutional neural networks have been applied to estimate the transmission map directly from the hazy image.
At present, although many researchers have proposed algorithms for fog image enhancement, the sky region of a maritime image occupies a large proportion of the frame and differs from ordinary images in texture structure. Defogging methods designed for ordinary images easily cause color distortion in the sky region of marine images, so a marine image enhancement method for fog environments has important practical significance.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a marine image enhancement method for fog environments that addresses the above defects in the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows: a marine image enhancement method in a fog environment comprises the following steps:
1) obtaining a marine fog image to be processed, and estimating an initial transmittance map and an atmospheric light value through the dark channel prior algorithm;
2) correcting the initial transmittance graph by adopting a sky region soft segmentation method to obtain a corrected transmittance graph;
3) optimizing the modified transmittance map by a convolutional neural network;
4) recovering the defogged, enhanced marine image by applying the inverted atmospheric light scattering model to the optimized transmittance map; a minimal orchestration sketch of the four steps is given below.
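The four steps can be written as a small driver function. The sketch below is a minimal orchestration under assumed helper names (estimate_dcp, soft_sky_correction, refine_cnn, recover are illustrative, not from the patent); concrete sketches of each stage appear in the detailed description.

```python
# Minimal orchestration sketch of the four-step method; the helper
# callables are assumptions (illustrative names, not from the patent).
def enhance_maritime_image(I, estimate_dcp, soft_sky_correction,
                           refine_cnn, recover):
    t0, A = estimate_dcp(I)               # step 1: dark channel prior estimate
    t_corr = soft_sky_correction(I, t0)   # step 2: sky-region soft segmentation
    t_opt = refine_cnn(t_corr)            # step 3: CNN optimization of the map
    return recover(I, t_opt, A)           # step 4: inverted scattering model
```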
According to the scheme, the initial transmittance map and the atmospheric light value in step 1) are estimated by the dark channel prior algorithm as follows.

According to the dark channel prior theory:

min_{y∈Ω(x)} ( min_{c∈{r,g,b}} J^c(y)/A^c ) → 0

Combining this with the atmospheric light scattering model gives:

t̃(x) = 1 - w · min_{y∈Ω(x)} ( min_{c∈{r,g,b}} I^c(y)/A^c )

where t̃(x) is the initial transmittance map, w represents the degree of haze removal, and A represents the atmospheric light value; the larger w is, the more pronounced the haze removal effect, and w is empirically set to 0.95.
According to the scheme, the sky region soft segmentation method is expressed as:

t̂(x) = M(x) · t̃(x) + (1 - M(x)) · t1(x)

where t̃(x) is the initial transmittance map, t̂(x) is the modified transmittance map, and M is a weighting function that blends t̃ with a luminance-based transmittance map t1. M and t1 are expressed in terms of the luminance L of the input image I (the RGB image is converted to the HSI color space and the I channel is taken as L), the value L* at the 95th percentile of the luminance, and the scattering coefficient β; since β is wavelength-dependent from an optical point of view, the β coefficients of the R, G, and B channels of a color image are set to 0.3324, 0.3433, and 0.3502, respectively.
According to the scheme, the convolutional neural network in the step 3) is a blind denoising convolutional neural network.
According to the scheme, the convolutional neural network in step 3) comprises a noise estimation network and a noise removal network, structured as follows:

a. Noise estimation sub-network (CNN_E)

The noise estimation sub-network estimates the noise level of the image from the noisy image g and obtains a noise level map σ̂(g). It uses a 5-layer Conv (convolutional) network.

b. Noise removal sub-network (CNN_D)

The first layer of the noise removal sub-network adopts Conv + ReLU, the middle layers adopt Conv + BN (batch normalization) + ReLU, and the last layer adopts Conv. All filters in the denoising sub-network are of size 3 × 3, and each layer uses zero padding so that the input and output sizes of every layer stay consistent, which prevents boundary artifacts. From the second layer on, a batch normalization operation is added between Conv and ReLU, and the sub-network learns the residual mapping R(g) by residual learning; the noise-free estimated image is finally obtained as f̂ = g + R(g).
According to the scheme, the number of channels of each convolutional layer of the noise estimation sub-network in step 3) is 32, the filter size is 3 × 3, and a ReLU activation function follows each convolutional layer.
According to the scheme, the loss function adopted by the convolutional neural network in step 3) is a hybrid loss comprising three sub-loss functions that constrain the noise level estimate σ̂(g) and the noise-free estimated image f̂. To reliably estimate the noise level, an asymmetric MSE loss L_asymm and a total variation regularization loss L_TV constrain the noise level map σ and the estimate σ̂(g):

L_asymm = Σ_{x∈Ω} |α - β| · (σ̂(x) - σ(x))²

L_TV = ||∇_h σ̂(g)||² + ||∇_v σ̂(g)||²

where Ω represents the domain of the image, ∇_h and ∇_v are the horizontal and vertical gradient operators, and α and β are penalty factors of the loss function; α is empirically set to 0.3, and β = 1 when σ̂(x) - σ(x) < 0, otherwise β = 0. For the noise-free real image f and the estimated image f̂, a reconstruction loss L_rec is used as the constraint:

L_rec = ||f̂ - f||²

Thus, the loss function of the entire network can be expressed as:

L = L_rec + L_asymm + L_TV
According to the scheme, in the first step 10000 images are collected to make the dataset; a dataset of transmittance maps is obtained from the image depth information, and the transmittance maps are cropped to a size of 512 × 512 (10000 maps in total).
According to the scheme, the mathematical formula of the inverted atmospheric light scattering model in step 4) is:

J(x) = (I(x) - A) / max(t(x), t_lb) + A

where max(a, b) selects the larger of a and b, and t_lb represents the lower bound of the transmittance, empirically set to t_lb = 0.1.
The invention has the following beneficial effects:
the invention combines the traditional algorithm with deep learning through the transmissivity map, is more suitable for enhancing the marine fog image, provides a new convolution neural network and can obtain satisfactory optimization effect.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a method of an embodiment of the present invention;
fig. 2 is a schematic diagram of a neural network structure according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, a method for enhancing a marine image in a fog environment includes the following steps:
Firstly, the marine fog image to be processed is input, and the initial transmittance map and the atmospheric light value are estimated through the dark channel prior algorithm.

The dark channel prior algorithm estimates the initial transmittance map and the atmospheric light value as follows:

a. Dark channel prior

In local regions other than the sky, some pixels have at least one color channel with a very low value. Thus, for an arbitrary input image J, its dark channel can be expressed as:

J_dark(x) = min_{y∈Ω(x)} ( min_{c∈{r,g,b}} J^c(y) )

where r, g, b represent the three channels of the color image and Ω(x) represents a square window centered on pixel x. The dark channel prior theory states that:

J_dark → 0
b. atmospheric scattering model
According to the physical characteristics of atmospheric light transmission in the foggy-day degradation process, the atmospheric scattering model can be expressed as:
I(x)=J(x)t(x)+A(1-t(x))
wherein, I represents a fog image, J represents a defogged image, t represents a transmissivity graph, and A represents an atmospheric light value.
c. Initial transmittance map

First, according to the dark channel prior theory:

min_{y∈Ω(x)} ( min_{c∈{r,g,b}} J^c(y)/A^c ) → 0

Substituting the above into the atmospheric light scattering model yields:

t̃(x) = 1 - w · min_{y∈Ω(x)} ( min_{c∈{r,g,b}} I^c(y)/A^c )

where t̃(x) is the initial transmittance map and w represents the degree of haze removal; the larger w is, the more significant the haze removal effect, and w is empirically set to 0.95.
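As a concrete illustration of step 1, the sketch below computes the dark channel, an atmospheric light estimate, and the initial transmittance map with NumPy/SciPy. The patch size of 15 and the brightest-0.1%-of-dark-channel rule for A are common DCP conventions, assumed here rather than stated in the patent:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    # min over the color channels, then a patch-wise minimum over Omega(x)
    return minimum_filter(img.min(axis=2), size=patch)

def estimate_dcp(I, patch=15, w=0.95):
    dark = dark_channel(I, patch)
    # atmospheric light A: mean color of the brightest 0.1% dark-channel
    # pixels (a common DCP convention; the patent does not specify the rule)
    n = max(1, int(dark.size * 0.001))
    idx = np.argsort(dark.ravel())[-n:]
    A = I.reshape(-1, 3)[idx].mean(axis=0)
    # initial transmittance: t~(x) = 1 - w * min_patch( min_c I^c(y)/A^c )
    t0 = 1.0 - w * dark_channel(I / A, patch)
    return t0, A
```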
Secondly, the transmittance map is corrected by the sky region soft segmentation method.

The sky region soft segmentation method can be expressed as:

t̂(x) = M(x) · t̃(x) + (1 - M(x)) · t1(x)

where t̃(x) is the initial transmittance map, t̂(x) is the modified transmittance map, and M is a weighting function that blends t̃ with a luminance-based transmittance map t1. M and t1 are expressed in terms of the luminance L of the input image I (the RGB image is converted to the HSI color space and the I channel is taken as L), the value L* at the 95th percentile of the luminance, and the scattering coefficient β; since β is wavelength-dependent from an optical point of view, the β coefficients of the R, G, and B channels of a color image are set to 0.3324, 0.3433, and 0.3502, respectively.
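The closed forms of M and t1 appear only as equation images in the original, so the sketch below is illustrative: it assumes a sigmoid weight M around L* and an exponential luminance-based t1; only the blending step t̂ = M·t̃ + (1 - M)·t1 follows the text directly.

```python
import numpy as np

def soft_sky_correction(I, t0, L_percentile=95.0,
                        beta=(0.3324, 0.3433, 0.3502), k=10.0):
    L = I.mean(axis=2)                       # HSI intensity channel as luminance
    L_star = np.percentile(L, L_percentile)  # L*: the 95% luminance value
    # ASSUMED form of t1: exponential decay with luminance, averaging the
    # per-channel scattering coefficients; the patent's exact t1 is unknown
    t1 = np.exp(-np.mean(beta) * L)
    # ASSUMED soft weight M in [0, 1]: ~1 outside the bright sky, ~0 inside
    # (sigmoid around L*); k is an assumed steepness parameter
    M = 1.0 / (1.0 + np.exp(k * (L - L_star)))
    return M * t0 + (1.0 - M) * t1           # the soft blend stated in the text
```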
Thirdly, optimizing a transmittance graph through a convolutional neural network;
theoretically, the transmittance map should smooth out the details as much as possible while ensuring that the texture important in the image is sharp. Therefore, the invention adopts a blind denoising convolutional neural network to eliminate artifacts, noise and smooth details in the transmittance map.
a. Noise estimation sub-network (CNN_E)

The noise estimation sub-network estimates the noise level of the image from the noisy image g and obtains a noise level map σ̂(g). It uses a 5-layer Conv network with 32 channels per convolutional layer, a filter size of 3 × 3, and a ReLU activation function after each convolutional layer.
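A PyTorch sketch of CNN_E matching this description; the single-channel output head producing the noise-level map is our assumption, as the text does not state the output layer:

```python
import torch.nn as nn

def make_cnn_e(in_ch=1, width=32):
    # Five 3x3 conv layers, 32 channels, zero padding, ReLU after every
    # conv; the final head mapping back to a 1-channel noise-level map
    # sigma_hat is an assumption (not stated in the text).
    return nn.Sequential(
        nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(width, in_ch, 3, padding=1), nn.ReLU(inplace=True),
    )
```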
b. Noise removal sub-network (CNN_D)

The first layer of the noise removal sub-network adopts Conv + ReLU, the middle layers adopt Conv + BN (batch normalization) + ReLU, and the last layer adopts Conv. All filters in the denoising sub-network are of size 3 × 3, and each layer uses zero padding so that the input and output sizes of every layer stay consistent, which prevents boundary artifacts. From the second layer on, a batch normalization operation is added between Conv and ReLU, and the sub-network learns the residual mapping R(g) by residual learning; the noise-free estimated image is finally obtained as f̂ = g + R(g).
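A PyTorch sketch of CNN_D under the stated layout; the depth (17 layers), width (64 channels), and the concatenation of the noise-level map with the input are assumptions in the spirit of CBDNet-style blind denoisers, not values given in the text:

```python
import torch
import torch.nn as nn

class CNND(nn.Module):
    # Denoising sub-network: Conv+ReLU first, Conv+BN+ReLU in the middle,
    # Conv last, all 3x3 with zero padding; depth, width, and the
    # noise-map concatenation are assumed.
    def __init__(self, in_ch=2, out_ch=1, width=64, depth=17):
        super().__init__()
        layers = [nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1),
                       nn.BatchNorm2d(width), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(width, out_ch, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, g, sigma_map):
        x = torch.cat([g, sigma_map], dim=1)  # condition on the noise-level map
        return g + self.body(x)               # residual learning: f_hat = g + R(g)
```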
c. Loss function

The invention adopts a hybrid loss function to measure the similarity of the images during training; through this loss function, the noise can be effectively estimated and removed.

The hybrid loss function comprises three sub-loss functions that constrain the noise level estimate σ̂(g) and the noise-free estimated image f̂. To reliably estimate the noise level, an asymmetric MSE loss L_asymm and a total variation regularization loss L_TV constrain the noise level map σ and the estimate σ̂(g):

L_asymm = Σ_{x∈Ω} |α - β| · (σ̂(x) - σ(x))²

L_TV = ||∇_h σ̂(g)||² + ||∇_v σ̂(g)||²

where Ω represents the domain of the image, ∇_h and ∇_v are the horizontal and vertical gradient operators, and α and β are penalty factors of the loss function; α is empirically set to 0.3, and β = 1 when σ̂(x) - σ(x) < 0, otherwise β = 0. For the noise-free real image f and the estimated image f̂, a reconstruction loss L_rec is used as the constraint:

L_rec = ||f̂ - f||²

Thus, the loss function of the entire network can be expressed as:

L = L_rec + L_asymm + L_TV
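The three sub-losses can be sketched directly in PyTorch; the combination weights lam_asym and lam_tv are assumed hyperparameters (the text gives only α = 0.3):

```python
import torch
import torch.nn.functional as F

def hybrid_loss(f_hat, f, sigma_hat, sigma, alpha=0.3,
                lam_asym=1.0, lam_tv=1.0):
    # reconstruction loss: L_rec = ||f_hat - f||^2
    l_rec = F.mse_loss(f_hat, f)
    # asymmetric MSE: beta = 1 where the noise level is under-estimated
    beta = (sigma_hat - sigma < 0).float()
    l_asym = torch.mean(torch.abs(alpha - beta) * (sigma_hat - sigma) ** 2)
    # total variation on the estimated noise-level map (horizontal/vertical)
    dh = sigma_hat[..., :, 1:] - sigma_hat[..., :, :-1]
    dv = sigma_hat[..., 1:, :] - sigma_hat[..., :-1, :]
    l_tv = (dh ** 2).mean() + (dv ** 2).mean()
    # combination weights lam_asym / lam_tv are assumed (not given in the text)
    return l_rec + lam_asym * l_asym + lam_tv * l_tv
```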
the training method of the neural network in this embodiment is as follows:
1) 10000 image data sets are collected and manufactured, a data set of a transmittance graph is obtained through image depth information, and the transmittance graph is cut into 10000 × 512 large data sets;
the collected marine fog image comprises a plurality of common marine objects, such as sea, reef, bridge column, ship, bridge and the like;
2) training and learning the original transmittance graph and the noisy transmittance graph by adopting a convolutional neural network training frame respectively to obtain training parameters and test the denoising effect;
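A minimal training-loop sketch under the stated setup, reusing hybrid_loss and the sub-network sketches above; the Adam optimizer and the loader yielding (noisy map, clean map, true noise level) are assumptions:

```python
import torch

def train(cnn_e, cnn_d, loader, epochs=50, lr=1e-3, device="cpu"):
    # loader yields (noisy map g, clean map t, true noise-level map sigma)
    params = list(cnn_e.parameters()) + list(cnn_d.parameters())
    opt = torch.optim.Adam(params, lr=lr)    # optimizer choice is assumed
    for _ in range(epochs):
        for g, t, sigma in loader:
            g, t, sigma = g.to(device), t.to(device), sigma.to(device)
            sigma_hat = cnn_e(g)             # noise-level estimation (CNN_E)
            t_hat = cnn_d(g, sigma_hat)      # residual denoising (CNN_D)
            loss = hybrid_loss(t_hat, t, sigma_hat, sigma)
            opt.zero_grad(); loss.backward(); opt.step()
    return cnn_e, cnn_d
```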
Fourthly, the defogged image is recovered using the inverted atmospheric light scattering model.

The mathematical formula of the inverted atmospheric light scattering model is:

J(x) = (I(x) - A) / max(t(x), t_lb) + A

where max(a, b) selects the larger of a and b, t_lb represents the lower bound of the transmittance and is empirically set to t_lb = 0.1, I denotes the fog image, J denotes the defogged image, and A denotes the atmospheric light value.
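A direct sketch of the recovery step; broadcasting assumes I is H × W × 3 in [0, 1], t is H × W, and A is a length-3 vector:

```python
import numpy as np

def recover(I, t, A, t_lb=0.1):
    # J = (I - A) / max(t, t_lb) + A, applied per color channel
    t = np.maximum(t, t_lb)[..., None]   # clamp the transmittance from below
    return np.clip((I - A) / t + A, 0.0, 1.0)
```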
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.

Claims (9)

1. A marine image enhancement method in a fog environment is characterized by comprising the following steps:
1) obtaining a marine fog image to be processed, and estimating an initial transmittance map and an atmospheric light value through the dark channel prior algorithm;
2) correcting the initial transmittance graph by adopting a sky region soft segmentation method to obtain a corrected transmittance graph;
3) optimizing the modified transmittance map by a convolutional neural network;
4) recovering the defogged image after marine image enhancement by applying the inverted atmospheric light scattering model to the optimized transmittance map.
2. The method for enhancing a maritime image in a fog environment according to claim 1, wherein the initial transmittance map and the atmospheric light value in step 1) are estimated by the dark channel prior algorithm as follows:

according to the dark channel prior theory:

min_{y∈Ω(x)} ( min_{c∈{r,g,b}} J^c(y)/A^c ) → 0

combining with the atmospheric light scattering model:

t̃(x) = 1 - w · min_{y∈Ω(x)} ( min_{c∈{r,g,b}} I^c(y)/A^c )

where t̃(x) is the initial transmittance map, w represents the degree of haze removal, and A represents the atmospheric light value.
3. The marine image enhancement method in a fog environment according to claim 1, wherein the sky region soft segmentation method is expressed as:

t̂(x) = M(x) · t̃(x) + (1 - M(x)) · t1(x)

where t̃(x) is the initial transmittance map, t̂(x) is the modified transmittance map, and the weighting function M and the luminance-based transmittance map t1 are expressed in terms of the luminance L of the input image I, the value L* at the 95th percentile of L, and the scattering coefficient β.
4. The marine image enhancement method in the fog environment of claim 1, wherein the convolutional neural network in the step 3) is a blind denoising convolutional neural network.
5. The method for enhancing a marine image in a fog environment according to claim 1, wherein the convolutional neural network in step 3) comprises a noise estimation network and a noise removal network, structured as follows:

a. noise estimation sub-network

the noise estimation sub-network estimates the noise level of the image from the noisy image g and obtains a noise level map σ̂(g); it uses a 5-layer Conv network;

b. noise removal sub-network

the first layer of the noise removal sub-network adopts Conv + ReLU, the middle layers adopt Conv + BN + ReLU, and the last layer adopts Conv; all filters in the noise removal sub-network are of size 3 × 3, and each layer uses zero padding so that the input and output sizes of every layer stay consistent; from the second layer on, a batch normalization operation is added between Conv and ReLU, and the sub-network learns the residual mapping R(g) by residual learning; the noise-free estimated image is finally obtained as f̂ = g + R(g).
6. The marine image enhancement method in a fog environment according to claim 5, wherein the number of channels of each convolutional layer of the noise estimation sub-network in step 3) is 32, the filter size is 3 × 3, and a ReLU activation function follows each convolutional layer.
7. The method for enhancing a maritime image in a fog environment according to claim 5, wherein the loss function adopted by the convolutional neural network in step 3) is a hybrid loss comprising three sub-loss functions that constrain the noise level estimate σ̂(g) and the noise-free estimated image f̂; to reliably estimate the noise level, an asymmetric MSE loss L_asymm and a total variation regularization loss L_TV constrain the noise level map σ and the estimate σ̂(g):

L_asymm = Σ_{x∈Ω} |α - β| · (σ̂(x) - σ(x))²

L_TV = ||∇_h σ̂(g)||² + ||∇_v σ̂(g)||²

where Ω represents the domain of the image, ∇_h and ∇_v are the horizontal and vertical gradient operators, and α and β are penalty coefficients of the loss function;

for the noise-free real image f and the estimated image f̂, a reconstruction loss L_rec is used as the constraint:

L_rec = ||f̂ - f||²

thus, the loss function of the entire network can be expressed as:

L = L_rec + L_asymm + L_TV
8. The marine image enhancement method in a fog environment according to claim 5, wherein the training method of the convolutional neural network is as follows:

collecting 10000 images to make the dataset, obtaining a dataset of transmittance maps from the image depth information, and cropping the transmittance maps to a size of 512 × 512 (10000 maps in total);

training and learning on the original transmittance maps and the noisy transmittance maps with a convolutional neural network training framework to obtain the training parameters and test the denoising effect.
9. The marine image enhancement method in a fog environment according to claim 1, wherein the mathematical formula of the inverted atmospheric light scattering model in step 4) is:

J(x) = (I(x) - A) / max(t(x), t_lb) + A

where max(a, b) selects the larger of a and b, and t_lb represents the lower bound of the transmittance.
CN202010231300.6A 2020-03-27 2020-03-27 Maritime image enhancement method in fog environment Active CN111489302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010231300.6A CN111489302B (en) 2020-03-27 2020-03-27 Maritime image enhancement method in fog environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010231300.6A CN111489302B (en) 2020-03-27 2020-03-27 Maritime image enhancement method in fog environment

Publications (2)

Publication Number Publication Date
CN111489302A true CN111489302A (en) 2020-08-04
CN111489302B CN111489302B (en) 2023-12-05

Family

ID=71810818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010231300.6A Active CN111489302B (en) 2020-03-27 2020-03-27 Maritime image enhancement method in fog environment

Country Status (1)

Country Link
CN (1) CN111489302B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150708A (en) * 2013-01-18 2013-06-12 上海交通大学 Image quick defogging optimized method based on black channel
CN104299192A (en) * 2014-09-28 2015-01-21 北京联合大学 Single image defogging method based on atmosphere light scattering physical model
US9288458B1 (en) * 2015-01-31 2016-03-15 Hrl Laboratories, Llc Fast digital image de-hazing methods for real-time video processing
CN105761230A (en) * 2016-03-16 2016-07-13 西安电子科技大学 Single image defogging method based on sky region segmentation processing
CN106530246A (en) * 2016-10-28 2017-03-22 大连理工大学 Image dehazing method and system based on dark channel and non-local prior
KR20180050832A (en) * 2016-11-07 2018-05-16 한국과학기술원 Method and system for dehazing image using convolutional neural network
US20190287219A1 (en) * 2018-03-15 2019-09-19 National Chiao Tung University Video dehazing device and method
CN110211067A (en) * 2019-05-27 2019-09-06 哈尔滨工程大学 One kind being used for UUV Layer Near The Sea Surface visible images defogging method
CN110782407A (en) * 2019-10-15 2020-02-11 北京理工大学 Single image defogging method based on sky region probability segmentation
AU2020100274A4 (en) * 2020-02-25 2020-03-26 Huang, Shuying DR A Multi-Scale Feature Fusion Network based on GANs for Haze Removal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112927157A (en) * 2021-03-08 2021-06-08 电子科技大学 Improved dark channel defogging method using weighted least square filtering
CN112927157B (en) * 2021-03-08 2023-08-15 电子科技大学 Improved dark channel defogging method adopting weighted least square filtering

Also Published As

Publication number Publication date
CN111489302B (en) 2023-12-05

Similar Documents

Publication Publication Date Title
CN110148095B (en) Underwater image enhancement method and enhancement device
CN111292258B (en) Image defogging method based on dark channel prior and bright channel prior
CN106530257A (en) Remote sensing image de-fogging method based on dark channel prior model
CN109118446B (en) Underwater image restoration and denoising method
CN110211070B (en) Low-illumination color image enhancement method based on local extreme value
CN110807742B (en) Low-light-level image enhancement method based on integrated network
CN104091310A (en) Image defogging method and device
CN107705258B (en) Underwater image enhancement method based on three-primary-color combined pre-equalization and deblurring
CN110570381B (en) Semi-decoupling image decomposition dark light image enhancement method based on Gaussian total variation
CN110689490A (en) Underwater image restoration method based on texture color features and optimized transmittance
CN114693548B (en) Dark channel defogging method based on bright area detection
CN111462022A (en) Underwater image sharpness enhancement method
CN109345479B (en) Real-time preprocessing method and storage medium for video monitoring data
Cai et al. Underwater image processing system for image enhancement and restoration
CN107451962B (en) Image defogging method and device
CN116029944B (en) Self-adaptive contrast enhancement method and device for gray level image
CN111489302B (en) Maritime image enhancement method in fog environment
CN116823686B (en) Night infrared and visible light image fusion method based on image enhancement
CN116229404A (en) Image defogging optimization method based on distance sensor
CN114170101A (en) Structural texture keeping low-light image enhancement method and system based on high-frequency and low-frequency information
CN110796607B (en) Deep learning low-illumination image enhancement method based on retina cerebral cortex theory
CN113012067B (en) Retinex theory and end-to-end depth network-based underwater image restoration method
CN113379632B (en) Image defogging method and system based on wavelet transmissivity optimization
CN113112429B (en) Universal enhancement frame for foggy images under complex illumination conditions
CN115496694B (en) Method for recovering and enhancing underwater image based on improved image forming model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant