CN107330870A - Dense fog removal method based on accurate estimation of scene light radiation - Google Patents

Dense fog removal method based on accurate estimation of scene light radiation

Info

Publication number
CN107330870A
CN107330870A
Authority
CN
China
Prior art keywords
image
light radiation
scene light
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710509774.0A
Other languages
Chinese (zh)
Other versions
CN107330870B (en)
Inventor
胡海苗
高原原
李波
郭强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201710509774.0A priority Critical patent/CN107330870B/en
Publication of CN107330870A publication Critical patent/CN107330870A/en
Application granted granted Critical
Publication of CN107330870B publication Critical patent/CN107330870B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a dense fog removal method based on accurate estimation of scene light radiation. The method mainly includes: performing maximum-value filtering on each of the 3 channels of the foggy image to obtain an initial estimate of the scene light radiation (S1); jointly filtering each channel of the foggy image with the initial scene light radiation of the corresponding channel to obtain an accurate estimate of the scene light radiation (S2); dividing each channel of the foggy image by the corresponding scene light radiation component to obtain a foggy image from which the influence of scene light attenuation has been eliminated (S3); projecting the image freed from the influence of scene light attenuation into spherical coordinates, clustering the pixels by angle, and computing the transmittance of each pixel by the haze-line method (S4); and obtaining the defogged image from the acquired transmittance and the foggy-day imaging model (S5). A dense fog image processed by the method of the invention has brightness suited to human observation and clear details.

Description

Dense fog removal method based on accurate estimation of scene light radiation
Technical Field
The invention relates to an image enhancement method, in particular to a dense fog removal method based on accurate estimation of scene light radiation, and belongs to the technical field of digital image processing.
Background
The presence of fog in an image greatly reduces its visibility: people cannot obtain accurate information from the image and may consequently misjudge the surrounding environment, which in serious cases can lead to disasters. Under dense fog in particular, visibility drops drastically and a large amount of image information is lost, directly preventing safety monitoring systems from performing their function. Given the image-quality requirements of outdoor monitoring, intensive research on defogging techniques for dense fog images has become an urgent problem for image clarification.
Image defogging algorithms are mainly classified into two categories: one is a non-physical model based approach and the other is a physical model based approach. The difference between these two types of methods is whether or not a foggy day imaging model is utilized.
Defogging methods based on non-physical models do not start from the physical cause of image degradation; instead, they enhance the contrast of the image and correct its color, improving image quality according to visual perception. Typical foggy-day image enhancement methods include histogram equalization, wavelet methods, curvelet transforms, and automatic color equalization. Such algorithms cannot achieve defogging in a real sense; they only improve the visual effect of the image to a certain extent and are prone to incomplete defogging and color distortion.
Defogging methods based on physical models essentially build on the classical atmospheric scattering model and obtain the scene reflectivity or a fog-free image by solving for the relevant parameters of the model. At present, physical-model-based foggy-day image restoration methods mainly include methods based on polarization characteristics, on partial differential equations, on depth information, and on prior knowledge or assumptions. These methods consider the scene light radiation sufficient and account only for the degradation caused by scattering from suspended particles near the surface, so they can handle haze images well.
Disclosure of Invention
However, when such defogging methods are applied to dense fog images, the result may be dark overall, color-distorted, and lacking in detail. This is because, under dense fog conditions, aerosol particles gather near the ground surface; as the optical thickness of the fog gradually increases, the transmittance of visible light gradually decreases, so the energy of the light radiation reaching the ground gradually drops. In addition, the weather under dense fog conditions is more complicated and may be accompanied by a thickening of the aerosol layer. In low-altitude areas close to the surface, the suspended aerosol particles are larger than the wavelength of visible light, and their attenuation coefficients are the same for light of different wavelengths. As altitude increases, however, gravitational attraction weakens and the diameter of the suspended particles gradually decreases. The atmosphere contains many suspended particles whose diameter is smaller than the wavelength of visible light, and these produce Rayleigh scattering. Rayleigh scattering causes light of shorter wavelengths to be scattered and dissipated during propagation, while only light of longer wavelengths passes through the atmosphere to the ground. Therefore, under dense fog conditions the surface light radiation may take on a color cast, and the fog no longer appears pure white.
In order to remove dense fog more effectively, researchers have proposed first adjusting the brightness of the image and then eliminating the fog using dark channel prior knowledge. This improves the brightness of the processed image, but the details of the image may remain unclear because the method estimates the scene radiation inaccurately. In addition, the brightness adjustment may eliminate some shadow areas, so the image no longer fully conforms to the dark channel prior; if the relevant parameters of the scattered light are still estimated with dark channel prior knowledge, the image shows poor depth layering and noise is easily amplified.
Against this background, it is very important to research a dense fog image enhancement method that both maintains the brightness and details of the enhanced image and effectively removes the fog.
The invention aims to provide a dense fog removal method based on accurate estimation of scene light radiation, for removing dense fog from a single image.
In order to achieve the above object, the present invention provides a dense fog removal method based on accurate estimation of scene light radiation, which comprises the following steps:
(1) decomposing the R, G, B color channels of the foggy image I to obtain component images I_r, I_g, I_b of the three color channels, and applying maximum filtering to each of I_r, I_g, I_b to obtain coarse estimates M_r, M_g, M_b of the scene light radiation;
(2) applying a joint edge-preserving filter to the image pairs (I_r, M_r), (I_g, M_g) and (I_b, M_b) to obtain accurate estimates L_r, L_g, L_b of the scene light radiation;
(3) dividing the component images I_r, I_g, I_b by the corresponding accurate estimates L_r, L_g, L_b, and synthesizing the results of the three color channels to obtain a foggy image J from which the influence of scene light radiation attenuation has been eliminated;
(4) projecting each pixel value of the foggy image J into a spherical coordinate system to obtain coordinates (r(x,y), θ(x,y), φ(x,y)), where (x,y) denotes the pixel coordinates, r(x,y) denotes the distance from the point to the origin, i.e. ||J(x,y) - 1||, and θ(x,y) and φ(x,y) denote the corresponding longitude and latitude; clustering the pixels by longitude and latitude into n haze lines (P_1, P_2, …, P_n); and then obtaining the transmittance of each pixel by the haze-line method;
(5) computing the scene reflectivity from the transmittance obtained in step (4), thereby obtaining the defogged image.
The above dense fog removal method is characterized in that, in the step (1), the radius of the maximum filtering, Radius, is given by the following formula:
Radius = ⌊max(height, width)/100⌋ + 1 (1)
where height and width represent the height and width of the image.
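For illustration, step (1) together with formula (1) admits a short sketch in Python; the array layout, the function name and the use of scipy.ndimage.maximum_filter are assumptions of the sketch, not part of the claims:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def coarse_radiation_estimate(I):
    """Step (1): per-channel local maximum filtering of the foggy
    image I (H x W x 3 float array in [0, 1]), with the filter
    radius of formula (1)."""
    height, width = I.shape[:2]
    radius = max(height, width) // 100 + 1      # formula (1)
    size = 2 * radius + 1                       # square window edge length
    return np.stack([maximum_filter(I[..., c], size=size)
                     for c in range(3)], axis=-1)
```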
The above dense fog removal method is characterized in that, in the step (2), the formula used for the joint filtering is as follows:
L^c(x,y) = Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) M^c(i,j) / Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) (2)
where (x,y) and (i,j) are pixel coordinates, Ω is a block centered at (x,y), and M^c is the coarse estimate of the scene light radiation obtained after maximum filtering of each channel.
η^c(i,j) is a step function:
η^c(i,j) = 1 if M^c(i,j) - I^c(x,y) ≥ 0, and 0 otherwise, c ∈ {r,g,b} (3)
ω^c(i,j) is an exponential function:
ω^c(i,j) = exp(-(M^c(i,j) - I^c(x,y))^2 / σ^2) (4)
where I^c(x,y) represents the c-channel of the foggy image, and σ represents the variance of the pixel values.
The above dense fog removal method is characterized in that, in the step (3), in order to avoid the estimated scene light radiation being equal to 0, a very small constant 0.01 is added to each scene light radiation component obtained in the step (2) before the division.
The above dense fog removal method is characterized in that, in the step (4), the pixels projected into the spherical coordinate system are clustered using the k-means algorithm.
The above dense fog removal method is characterized in that the number of cluster centers used by the clustering method is set in the range of 200 to 500.
Drawings
The invention is further described with reference to the following figures and detailed description.
Fig. 1 is a flowchart of a method for removing dense fog based on accurate estimation of scene light radiation according to the present invention.
Fig. 2(a) to 2(c) are 3-D visualizations of the effect of the joint filtering method according to the present invention.
FIGS. 3(a) to 3(f) show experimental results comparing the defogging method according to the present invention with typical defogging methods on a test image: FIG. 3(a) shows the foggy image, FIG. 3(b) the result of contrast-limited adaptive histogram equalization, FIG. 3(c) the result of the Tarel method, FIG. 3(d) the result of combining the dark channel prior with guided filtering, FIG. 3(e) the result of combining the dark channel prior with post-processing, and FIG. 3(f) the image defogged by the method of the invention.
Detailed Description
The dense fog removal method based on accurate estimation of scene light radiation first estimates the scene light radiation with an edge-preserving filter and eliminates the influence of the attenuation of the scene light radiation. It then computes the scene reflectivity using the properties of the pixels in a spherical coordinate system and obtains the defogged image. This is explained in detail below.
In the invention, the dense fog removal method based on accurate estimation of scene light radiation comprises the following steps, as shown in Fig. 1:
Step 1: decompose the r, g and b color channels of the foggy image I to obtain component images I_r, I_g, I_b of the three color channels, and apply maximum filtering to each of I_r, I_g, I_b to obtain coarse estimates M_r, M_g, M_b of the scene light radiation.
To maintain naturalness, the estimation of the scene light radiation needs to satisfy two conditions: (a) the light radiation should be smooth in most areas while preserving the edges between bright and dark regions; (b) the scene light radiation should be no less than the reflected light, to ensure that the defogged image retains as much scene detail as possible.
In Retinex theory, many center-surround methods obtain the scene illumination by low-pass filtering the maximum channel of the image. However, this approach is not applicable when the scene light radiation has a color cast. Furthermore, the maximum channel is only a lower bound on the scene light radiation, and using it as the initial estimate of the illumination lacks physical justification.
In the prior art, based on the assumption that high-luminance areas in an image correspond to white surfaces or bright spots of the light source, researchers proposed the Max-RGB algorithm, which takes the maximum of the 3 channels as the estimate of the illumination. However, this method is not suited to non-uniform illumination. To make the estimation robust, the Max-RGB method is generalized to local areas; in other words, the light reflected by the object with the highest reflectivity in each local area is considered close to the ambient illumination. Assuming I is the observed image, the coarse estimate of the scene illumination is:
M^c(x,y) = max_{(i,j)∈Ω} I^c(i,j), c ∈ {r,g,b}
where Ω represents a local window centered at (x,y) and c represents the color channel.
Step 2: apply the joint edge-preserving filtering to the image pairs (I_r, M_r), (I_g, M_g) and (I_b, M_b) to obtain accurate estimates L_r, L_g, L_b of the scene light radiation.
The method in step 1 can indeed approximate the scene light radiation well. However, as with other methods that assume local constancy, the illumination estimated in this way shows blocking artifacts where the scene light radiation changes abruptly, and therefore needs further refinement. To this end, the invention designs a content-adaptive joint edge-preserving filter to estimate the accurate scene light radiation L^c(x,y):
L^c(x,y) = Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) M^c(i,j) / Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) (2)
ω^c(i,j) is an exponential function controlling the light radiation to satisfy condition (a):
ω^c(i,j) = exp(-(M^c(i,j) - I^c(x,y))^2 / σ^2) (4)
η^c(i,j) is a step function controlling the scene light radiation to satisfy condition (b), defined as follows:
η^c(i,j) = 1 if M^c(i,j) - I^c(x,y) ≥ 0, and 0 otherwise, c ∈ {r,g,b} (3)
where σ is the variance of the pixel gray values.
Unlike other joint edge-preserving filters, this method does not impose the neighborhood relations of the guide image on the image to be filtered when smoothing: when each window is filtered, only a single pixel of the guide image acts as the guide.
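For illustration, formulas (2) to (4) can be sketched literally in Python; this is a minimal sketch of the weighted-average form of formula (2) above, and the function name, the edge padding and the unoptimized double loop are choices of the sketch rather than a definitive implementation:

```python
import numpy as np

def joint_edge_preserving_filter(I_c, M_c, radius, sigma):
    """Content-adaptive joint filter sketch after formulas (2)-(4).
    I_c: one channel of the foggy image (2-D float array), used only
    through its center pixel I_c(x, y), the single guide pixel.
    M_c: coarse, max-filtered estimate of the scene light radiation."""
    H, W = I_c.shape
    L_c = np.empty_like(I_c)
    Mp = np.pad(M_c, radius, mode="edge")
    for y in range(H):
        for x in range(W):
            block = Mp[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            diff = block - I_c[y, x]
            eta = diff >= 0                          # step weight, formula (3)
            omega = np.exp(-diff ** 2 / sigma ** 2)  # exponential weight, formula (4)
            w = eta * omega
            L_c[y, x] = (w * block).sum() / max(w.sum(), 1e-12)  # formula (2)
    return L_c
```

Because η keeps only neighbors whose coarse radiance is at least I_c(x,y), the filtered value never falls below the observed channel, which is condition (b); the double loop mirrors the per-window definition and would be vectorized or tiled in practice.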
Step 3: divide the component images I_r, I_g, I_b by the corresponding accurate estimates L_r, L_g, L_b, and synthesize the results of the three color channels to obtain the foggy image J from which the influence of scene light radiation attenuation has been eliminated.
With an accurate estimate of the scene light radiation, the influence of its attenuation is eliminated by the following formula:
J^c(x,y) = I^c(x,y) / L^c(x,y) (5)
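A one-line sketch of formula (5), with the 0.01 constant of step (3) guarding against zero radiation; the final clipping is an added safeguard of the sketch, not part of the formula:

```python
import numpy as np

def remove_radiation_attenuation(I, L, eps=0.01):
    """Formula (5): divide each channel of the foggy image I by the
    accurately estimated scene light radiation L (both H x W x 3)."""
    return np.clip(I / (L + eps), 0.0, 1.0)
```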
Step 4: project each pixel value of the foggy image J into a spherical coordinate system to obtain coordinates (r(x,y), θ(x,y), φ(x,y)), where (x,y) denotes the pixel coordinates, r(x,y) denotes the distance from the point to the origin, i.e. ||J(x,y) - 1||, and θ(x,y) and φ(x,y) denote the corresponding longitude and latitude; cluster the pixels by longitude and latitude into n haze lines (P_1, P_2, …, P_n); then obtain the transmittance of each pixel by the haze-line method.
After the attenuation of the scene light radiation has been eliminated, obtaining the scene reflectivity reduces to obtaining the transmittance from the following equation:
J(x,y)=R(x,y)t(x,y)+1-t(x,y) (6)
Rearranging the above formula gives:
J(x,y)-1=(R(x,y)-1)t(x,y) (7)
Expressing J(x,y) - 1 in a spherical coordinate system gives:
J(x,y) - 1 = [r(x,y), θ(x,y), φ(x,y)] (8)
where r denotes the distance from the point to the origin, i.e. ||J(x,y) - 1||, and θ and φ denote the longitude and latitude, respectively. Pixels of the same scene color in the foggy image have different t, which in the spherical coordinate system means that the pixels lie at different radii from the origin but share the same longitude and latitude θ and φ. Based on this property, once the number of cluster centers is set, a clustering method yields a number of lines, each of which contains pixel values of similar color.
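The projection and clustering just described can be sketched as follows; the exact longitude/latitude axis convention and the use of scikit-learn's KMeans are assumptions of the sketch, since the text only specifies k-means over the two angles:

```python
import numpy as np
from sklearn.cluster import KMeans

def project_and_cluster(J, n_clusters=300):
    """Step 4 sketch: project J(x,y) - 1 into spherical coordinates and
    cluster the pixels by (longitude, latitude) into haze lines.
    n_clusters is chosen inside the 200-500 range given in the text."""
    V = J.reshape(-1, 3) - 1.0                        # left side of formula (7)
    r = np.linalg.norm(V, axis=1)                     # r = ||J - 1||
    theta = np.arctan2(V[:, 1], V[:, 0])              # longitude (assumed axes)
    phi = np.arcsin(V[:, 2] / np.maximum(r, 1e-12))   # latitude
    labels = KMeans(n_clusters=n_clusters, n_init=4).fit_predict(
        np.column_stack([theta, phi]))
    return r, labels
```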
Then, the transmittance of the farthest pixel in each cluster is recorded as 1, and the transmittance of the other pixels on each line is obtained as the ratio of the radius of their spherical projection to the radius of the projection of the farthest pixel.
For a given point, r(x,y) can be expressed as:
r(x,y) = t(x,y)||R(x,y) - 1||, 0 ≤ t(x,y) ≤ 1 (9)
the longest radiusThe coordinate is set to t 1. Wherein,p is a clustering line. The transmittance at each point can be obtained:
Step 5: compute the scene reflectivity from the transmittance obtained in step 4, thereby obtaining the defogged image.
The reflectivity of the scene, which is the defogged image, can be calculated by the following formula:
R(x,y) = (J(x,y) - 1) / t(x,y) + 1 (11)
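Finally, a sketch of formula (11), together with the five steps chained end to end; the lower bound t_min, the clipping and the example filter parameters are stabilizing assumptions that the text does not specify:

```python
import numpy as np

def recover_scene(J, t, t_min=0.1):
    """Formula (11): invert J - 1 = (R - 1) t pixel-wise."""
    t = np.maximum(t, t_min).reshape(J.shape[0], J.shape[1], 1)
    return np.clip((J - 1.0) / t + 1.0, 0.0, 1.0)

# Illustrative end-to-end chain built from the sketches above:
# M = coarse_radiation_estimate(I)                        # step 1
# L = np.stack([joint_edge_preserving_filter(             # step 2
#         I[..., c], M[..., c], radius=15, sigma=0.1)     #   example values
#         for c in range(3)], axis=-1)
# J = remove_radiation_attenuation(I, L)                  # step 3
# r, labels = project_and_cluster(J)                      # step 4
# t = transmittance_from_haze_lines(r, labels)
# R = recover_scene(J, t.reshape(J.shape[:2]))            # step 5
```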
It is to be understood that the above disclosure describes only specific embodiments of the invention. Any changes conceivable by a person of ordinary skill in the art according to the technical idea provided by the invention shall fall within the protection scope of the invention.

Claims (6)

1. A dense fog removal method based on accurate estimation of scene light radiation, comprising the following steps:
(1) decomposing the R, G, B color channels of the foggy image I to obtain component images I_r, I_g, I_b of the three color channels, and applying maximum filtering to each of I_r, I_g, I_b to obtain coarse estimates M_r, M_g, M_b of the scene light radiation;
(2) applying a joint edge-preserving filter to the image pairs (I_r, M_r), (I_g, M_g) and (I_b, M_b) to obtain accurate estimates L_r, L_g, L_b of the scene light radiation;
(3) dividing the component images I_r, I_g, I_b by the corresponding accurate estimates L_r, L_g, L_b of the scene light radiation, and synthesizing the results of the three color channels to obtain a foggy image J from which the influence of scene light radiation attenuation has been eliminated;
(4) projecting each pixel value of the foggy image J into a spherical coordinate system to obtain coordinates (r(x,y), θ(x,y), φ(x,y)), where (x,y) denotes the pixel coordinates, r(x,y) denotes the distance from the point to the origin, i.e. ||J(x,y) - 1||, and θ(x,y) and φ(x,y) denote the corresponding longitude and latitude; clustering the pixels by longitude and latitude into n haze lines (P_1, P_2, …, P_n); and then obtaining the transmittance of each pixel by the haze-line method;
(5) computing the scene reflectivity from the transmittance obtained in step (4), thereby obtaining the defogged image.
2. The dense fog removal method according to claim 1, wherein:
in the step (1), the radius of the maximum value filtering is determined by the following formula:
Radius = ⌊max(height, width)/100⌋ + 1 (1)
where height and width represent the height and width of the image.
3. The dense fog removal method according to claim 1, wherein:
in the step (2), the joint edge-preserving filtering is represented by the following formula:
L^c(x,y) = Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) M^c(i,j) / Σ_{(i,j)∈Ω} η^c(i,j) ω^c(i,j) (2)
where (x,y) and (i,j) are pixel coordinates, Ω is a block centered at (x,y), and M^c is the coarse estimate of the scene light radiation obtained after maximum filtering of each channel;
η^c(i,j) is a step function:
η^c(i,j) = 1 if M^c(i,j) - I^c(x,y) ≥ 0, and 0 otherwise, c ∈ {r,g,b} (3)
ω^c(i,j) is an exponential function:
ω^c(i,j) = exp(-(M^c(i,j) - I^c(x,y))^2 / σ^2) (4)
where I^c(x,y) represents the c-channel of the foggy image, and σ represents the variance of the pixel values.
4. The dense fog removal method according to claim 1, wherein:
in the step (3), in order to avoid the accurate estimates L_r, L_g, L_b of the scene light radiation being equal to 0, a very small constant 0.01 is added to each of the accurate estimates L_r, L_g, L_b obtained in the step (2) before the division.
5. The dense fog removal method according to claim 1, wherein:
in the step (4), the pixels projected into the spherical coordinate system are clustered using the k-means algorithm.
6. The dense fog removal method according to claim 5, wherein:
the number of cluster centers used by the clustering method is set in the range of 200 to 500.
CN201710509774.0A 2017-06-28 2017-06-28 Dense fog removal method based on accurate estimation of scene light radiation Active CN107330870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710509774.0A CN107330870B (en) 2017-06-28 2017-06-28 Dense fog removal method based on accurate estimation of scene light radiation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710509774.0A CN107330870B (en) 2017-06-28 2017-06-28 Dense fog removal method based on accurate estimation of scene light radiation

Publications (2)

Publication Number Publication Date
CN107330870A true CN107330870A (en) 2017-11-07
CN107330870B CN107330870B (en) 2019-06-18

Family

ID=60198601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710509774.0A Active CN107330870B (en) Dense fog removal method based on accurate estimation of scene light radiation

Country Status (1)

Country Link
CN (1) CN107330870B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102243758A (en) * 2011-07-14 2011-11-16 浙江大学 Fog-degraded image restoration and fusion based image defogging method
CN104809707A (en) * 2015-04-28 2015-07-29 西南科技大学 Method for estimating visibility of single fog-degraded image
CN105469372A (en) * 2015-12-30 2016-04-06 广西师范大学 Mean filtering-based fog-degraded image sharp processing method
CN106846263A (en) * 2016-12-28 2017-06-13 中国科学院长春光学精密机械与物理研究所 The image defogging method being immunized based on fusion passage and to sky

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
DANA BERMAN et al.: 2017 IEEE International Conference on Computational Photography (ICCP), 14 May 2017 *
QINGSONG ZHU et al.: "A Fast Single Image Haze Removal Algorithm Using Color Attenuation Prior", IEEE Transactions on Image Processing *
YUANYUAN GAO et al.: "A fast image dehazing algorithm based on negative correction", Signal Processing *
张晶晶 et al.: "Polarization image dense fog removal algorithm based on the dark channel prior principle", 《计算机应用》 *
陆健强 et al.: "Real-time defogging and clarification system for farmland video based on an improved dark channel prior algorithm", 《农业工程学报》 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108447034A (en) * 2018-03-13 2018-08-24 北京航空航天大学 A kind of marine Misty Image defogging method decomposed based on illumination
CN108447034B (en) * 2018-03-13 2021-08-13 北京航空航天大学 Marine foggy day image defogging method based on illumination decomposition
CN109345479A (en) * 2018-09-28 2019-02-15 中国电子科技集团公司信息科学研究院 A kind of real-time preprocess method and storage medium of video monitoring data
CN109345479B (en) * 2018-09-28 2021-04-06 中国电子科技集团公司信息科学研究院 Real-time preprocessing method and storage medium for video monitoring data
CN111583125A (en) * 2019-02-18 2020-08-25 佳能株式会社 Image processing apparatus, image processing method, and computer-readable storage medium
CN111583125B (en) * 2019-02-18 2023-10-13 佳能株式会社 Image processing apparatus, image processing method, and computer-readable storage medium
US11995799B2 (en) 2019-02-18 2024-05-28 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and non-transitory computer-readable storage medium
CN110335210A (en) * 2019-06-11 2019-10-15 长江勘测规划设计研究有限责任公司 Underwater image restoration method
CN110335210B (en) * 2019-06-11 2022-05-13 长江勘测规划设计研究有限责任公司 Underwater image restoration method
CN112907472A (en) * 2021-02-09 2021-06-04 大连海事大学 Polarization underwater image optimization method based on scene depth information

Also Published As

Publication number Publication date
CN107330870B (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN107330870B (en) Dense fog removal method based on accurate estimation of scene light radiation
CN106548463B (en) Sea fog image automatic defogging method and system based on dark channel and Retinex
CN106530246B (en) Image defogging method and system based on dark channel and non-local priors
Lu et al. Underwater image enhancement using guided trigonometric bilateral filter and fast automatic color correction
CN107103591B (en) Single image defogging method based on image haze concentration estimation
CN106846263B (en) Image defogging method based on fused channels and immunity to the sky region
Tripathi et al. Single image fog removal using bilateral filter
CN103218778B (en) The disposal route of a kind of image and video and device
CN111292258B (en) Image defogging method based on dark channel prior and bright channel prior
CN108389175B (en) Image defogging method integrating variation function and color attenuation prior
CN104253930B (en) Real-time video defogging method
CN107767354A (en) Image defogging algorithm based on the dark channel prior
Singh et al. Single image defogging by gain gradient image filter
CN106875351A (en) Defogging method for images with large sky areas
CN103914813A (en) Colorful haze image defogging and illumination compensation restoration method
CN104182943B (en) Single image defogging method fusing human visual characteristics
CN107977941B (en) Image defogging method for color fidelity and contrast enhancement of bright area
CN105447825A (en) Image defogging method and system
CN106447617A (en) Improved Retinex image defogging method
CN109118450B (en) Low-quality image enhancement method under sand weather condition
CN104331867B (en) Image defogging method, device and mobile terminal
CN107563980A (en) Underwater image clarification method based on underwater imaging model and depth of field
CN111598800B (en) Single image defogging method based on space domain homomorphic filtering and dark channel priori
CN110349113B (en) Adaptive image defogging method based on improved dark channel prior
CN106780362B (en) Road video defogging method based on dichromatic reflection model and bilateral filtering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant