CN113469912B - Deep learning-based foggy day visibility estimation method and system - Google Patents

Info

Publication number
CN113469912B
Authority
CN
China
Prior art keywords
map
image
estimation
neural network
depth
Legal status: Active
Application number
CN202110737978.6A
Other languages
Chinese (zh)
Other versions
CN113469912A (en)
Inventor
Pei Xin
Hu Jianming
You Jing
Jia Shaocheng
Yue Yun
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Application filed by Tsinghua University
Priority to CN202110737978.6A
Publication of CN113469912A
Application granted
Publication of CN113469912B

Classifications

    • G06T5/73: Image enhancement or restoration; Deblurring; Sharpening
    • G06N3/045: Neural networks; Architecture, e.g. interconnection topology; Combinations of networks
    • G06N3/08: Neural networks; Learning methods
    • G06T2207/20081: Indexing scheme for image analysis or image enhancement; Special algorithmic details; Training; Learning
    • G06T2207/20084: Indexing scheme for image analysis or image enhancement; Special algorithmic details; Artificial neural networks [ANN]
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention relates to a deep-learning-based foggy weather visibility estimation method and system, comprising the following steps: 1) for the input foggy image, estimating a transmittance map with a transmittance estimation neural network; 2) recovering a defogged image using the estimated transmittance map and the fog imaging model; 3) inputting the defogged image into a depth estimation neural network to estimate a depth map; 4) calculating an extinction coefficient map from the depth map and the transmittance map; 5) calculating a visibility map from the extinction coefficient map using Koschmieder's law. By analyzing the inherent relation between the fog imaging model and Koschmieder's law and combining the two for visibility estimation, the invention has better interpretability, and it can be widely applied to visibility estimation in severe weather such as fog.

Description

Deep learning-based foggy day visibility estimation method and system
Technical Field
The invention belongs to the technical field of computer vision, and particularly relates to a foggy day visibility estimation method and system based on deep learning.
Background
Visibility is the maximum distance at which a person with normal vision can distinguish an object from its background. It is mainly affected by two factors: the brightness difference between the target object and its background, and the transparency of the atmosphere. Atmospheric transparency is strongly related to weather; rain, fog, snow, sand, and haze all reduce it, and with it the visibility. In general, therefore, visibility depends mainly on weather conditions.
In visibility measurement, the conventional approach is to measure the extinction coefficient with an instrument and calculate the visibility from it. The limitations of these conventional approaches are evident: the transmissometer derives atmospheric transmittance by measuring the transmittance of an air column between two points, and in low-visibility weather such as rain and fog this method has a large error; the automatic laser visibility meter calculates visibility by measuring the atmospheric extinction coefficient, which is accurate, but the instrument is expensive to build, complex to operate, and difficult to popularize.
With the development of computer science, methods based on digital image processing and data-driven techniques have also emerged. However, current non-deep-learning methods can only estimate a single visibility value per image, which is too coarse and insufficiently accurate for visibility estimation under non-uniform weather conditions. In computer vision, visibility estimation based on deep learning has become a research direction thanks to advances in deep neural network theory and computing power; deep learning methods, however, suffer from poor interpretability.
Disclosure of Invention
Aiming at these problems, the invention provides a deep-learning-based foggy day visibility estimation method and system that outputs pixel-level visibility from a single input image. This removes the traditional methods' dependence on measuring instruments and measurement conditions, and the pixel-level output overcomes the coarseness of non-deep-learning visibility estimation in the computer vision field. In addition, the method captures the inherent relation between the fog imaging model and Koschmieder's law and combines the two for visibility estimation, which effectively alleviates the poor interpretability of deep learning methods.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
In a first aspect of the present invention, there is provided a foggy weather visibility estimation method based on deep learning, comprising the steps of:
1) Inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map of the foggy image;
2) Processing the foggy image with the estimated transmittance map and the fog imaging model to recover a defogged image;
3) Inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain a depth map;
4) Calculating an extinction coefficient map from the depth map and the transmittance map;
5) Calculating a visibility map from the extinction coefficient map using Koschmieder's law.
Further, in step 1), inputting the foggy image into the pre-constructed transmittance estimation neural network for transmittance estimation to obtain the transmittance map comprises the following steps:
1.1) Constructing and initializing the transmittance estimation neural network;
1.2) Loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode;
1.3) Preprocessing the input foggy image according to the requirements of the transmittance estimation neural network;
1.4) Inputting the preprocessed foggy image into the transmittance estimation neural network to obtain the transmittance map.
Further, in step 2), recovering the defogged image from the transmittance map and the foggy image comprises the following steps:
2.1) Calculating the atmospheric light component A from the input foggy image;
2.2) Substituting the foggy image, the transmittance map obtained in step 1), and the atmospheric light component A into the fog imaging model to obtain the defogged image.
Further, in step 2.1), calculating the atmospheric light component A from the input foggy image comprises the following steps:
2.1.1) For each pixel of the input foggy image, taking the minimum value over the three RGB channels;
2.1.2) Applying minimum-value filtering to the minimum-value map obtained in step 2.1.1) to obtain a dark channel map;
2.1.3) Taking the brightest 0.1% of pixels of the dark channel map obtained in step 2.1.2), finding the corresponding pixel values in the input foggy image, and taking their average as the atmospheric light component A.
Further, in step 3), inputting the defogged image into the pre-constructed depth estimation neural network for depth estimation to obtain the depth map comprises the following steps:
3.1) Constructing and initializing the depth estimation neural network;
3.2) Loading the pre-trained network parameters of the depth estimation neural network and setting it to test mode;
3.3) Preprocessing the defogged image as required;
3.4) Inputting the preprocessed defogged image into the depth estimation neural network for depth estimation to obtain the depth map.
Further, in step 4), calculating the extinction coefficient map from the depth map and the transmittance map comprises:
first, calculating the extinction coefficient corresponding to each pixel from the transmittance map and the depth map according to the formula:
t(x) = e^(−β(x)d(x))
where t(x) is the transmittance of the transmittance map at a pixel point, β(x) is the extinction coefficient corresponding to that pixel, and d(x) is the depth of the depth map at that pixel;
next, assembling the extinction coefficients β(x) of all pixels into the extinction coefficient map.
Further, in step 5), calculating the visibility map from the extinction coefficient map using Koschmieder's law comprises:
first, calculating the visibility of each pixel point from the extinction coefficient map according to Koschmieder's law:
V(x) = −ln(ε)/β(x)
where V(x) is the visibility at a pixel point, ε is the eye contrast threshold, and β(x) is the extinction coefficient of the extinction coefficient map at that pixel;
next, assembling the visibilities V(x) of all pixels into the visibility map.
In a second aspect of the present invention, there is provided a deep-learning-based foggy weather visibility estimation system, comprising:
a transmittance map determining module, for inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map;
a defogged image determining module, for recovering the defogged image using the estimated transmittance map and the fog imaging model;
a depth map determining module, for inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain a depth map;
an extinction coefficient map determining module, for calculating an extinction coefficient map from the depth map and the transmittance map;
and a visibility map determining module, for calculating the visibility map from the extinction coefficient map using Koschmieder's law.
Further, the transmittance map determining module comprises:
a transmittance estimation neural network construction module, for constructing and initializing the transmittance estimation neural network;
a first network parameter loading module, for loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode;
a first preprocessing module, for preprocessing the input foggy image as required;
and a transmittance estimation module, for inputting the preprocessed foggy image into the transmittance estimation neural network for transmittance estimation to obtain the transmittance map.
Further, the depth map determining module comprises:
a depth estimation neural network construction module, for constructing and initializing the depth estimation neural network;
a second network parameter loading module, for loading the pre-trained network parameters of the depth estimation neural network and setting it to test mode;
a second preprocessing module, for preprocessing the defogged image as required;
and a depth estimation module, for inputting the preprocessed defogged image into the depth estimation neural network for depth estimation to obtain the depth map.
Due to the adoption of the above technical scheme, the invention has the following advantages:
1. The invention analyzes the intrinsic relation between the fog imaging model and Koschmieder's law and combines the two to estimate visibility, so it has better interpretability;
2. The invention is applicable not only to fog but also to other weather conditions that share the same imaging model as fog;
3. Unlike traditional methods in the computer vision field, which can only obtain picture-level visibility estimates, the invention obtains pixel-level visibility estimates, so it handles non-uniform fog better and has stronger robustness and finer granularity;
4. The invention is modular and loosely coupled, so as technology develops, the modules in its framework can be replaced by better-performing methods, improving the overall estimation quality.
Therefore, the invention can be widely applied to visibility estimation in severe weather such as foggy weather.
Drawings
FIG. 1 is a flow chart of the deep-learning-based foggy weather visibility estimation method of the present invention;
FIG. 2 is a flow chart of transmittance estimation of a foggy image using the transmittance estimation neural network in the present invention;
FIG. 3 is a flow chart of solving for the atmospheric light component in the present invention;
FIG. 4 is a flow chart of inputting the defogged image into the depth estimation neural network to obtain its depth map in the present invention;
FIG. 5 is a schematic diagram of the network architecture of DehazeNet constructed in an embodiment of the present invention;
FIG. 6 is a schematic diagram of the network structure of DORN (Deep Ordinal Regression Network) constructed in an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings and examples.
The invention provides a foggy day visibility estimation method based on the fog imaging model, Koschmieder's law, and current related work in the fields of image defogging and depth estimation. The theoretical basis of the invention, namely Koschmieder's law and the fog imaging model, is briefly introduced first:
Koschmieder's law is the basis for determining visibility. When an observer looks horizontally at a distant ground object against the sky, aerosol particles in the atmosphere absorb and scatter direct sunlight, skylight, light from clouds, and light diffusely reflected from the ground, so that the atmosphere appears to the observer to form a veil; the apparent contrast between object and background is reduced, and the distance at which the observer can still make out the object shrinks. Koschmieder's law describes how the visual contrast C varies with the distance d, and can be expressed as:
C = e^(−βd) (1)
where C is the visual contrast, i.e., the relative brightness difference between the target and its background, C = (I_object − I_background)/I_background; β is the extinction coefficient, a characteristic quantity representing the transparency of the atmosphere under any condition; d is the distance from the observer to the target; and I_object and I_background are the brightness of the object and of the background, respectively.
If the object is only just visible, the visual contrast C has dropped to the eye contrast threshold ε, which for a normal observer is ε = 0.02 or ε = 0.05. The distance d between observer and target at this point is the visibility V, and formula (1) above becomes:
ε = e^(−βV) (2)
namely:
V = −ln(ε)/β = ln(1/ε)/β (3)
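For instance, with the common threshold ε = 0.02, formula (3) gives V = ln(50)/β ≈ 3.912/β; an extinction coefficient of β = 0.01 m⁻¹ thus corresponds to a visibility of roughly 391 m.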
furthermore, in computer vision and computer graphics, fog imaging models described by the following equations are widely used:
I(x)=J(x)t(x)+A(1-t(x)) (4)
t(x)=e-β(x)d(x) (5)
Wherein, I (x) is the value of the image to be defogged at a pixel point; j (x) is the value of the defogged image at a pixel point; a is an atmospheric light component, which is constant for one image; t (x) is the transmissivity of the image at one pixel point; beta (x) is an extinction coefficient, the magnitude of which reflects the fog concentration, and can be generally considered to be constant for a particular image when the fog concentration is uniform, and refers to the extinction coefficient of a pixel in the image when the fog concentration is non-uniform; d (x) is the actual physical distance between the target object and the shooting source on one pixel point of the image.
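As an illustration of how formula (4) is inverted in practice, the following is a minimal NumPy sketch; the function name and the clipping floor t_min are illustrative assumptions, not part of the patent text:

```python
import numpy as np

def recover_defogged(I, t, A, t_min=0.1):
    """Invert the fog imaging model I = J*t + A*(1 - t) for J.

    I: (H, W, 3) foggy image, float32 values in [0, 1]
    t: (H, W) transmittance map
    A: scalar or length-3 atmospheric light component
    t_min: lower clip on t, guarding against noise blow-up in dense fog
    """
    t = np.clip(t, t_min, 1.0)[..., np.newaxis]  # broadcast t over the RGB channels
    J = (I - A) / t + A                          # rearranged formula (4)
    return np.clip(J, 0.0, 1.0)
```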
The above theoretical analysis shows that once the extinction coefficient is obtained via the fog imaging model, the visibility can be calculated using Koschmieder's law. As shown in fig. 1, the invention provides a deep-learning-based foggy weather visibility estimation method, comprising the following steps:
1) Inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map of the foggy image;
2) Processing the foggy image with the estimated transmittance map and the fog imaging model to recover a defogged image;
3) Inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain its depth map;
4) Calculating an extinction coefficient map from the obtained depth map and transmittance map;
5) Calculating a visibility map from the extinction coefficient map using Koschmieder's law.
As shown in fig. 2, in the above step 1), inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain its transmittance map comprises the following steps:
1.1) Constructing and initializing the transmittance estimation neural network, which may adopt DehazeNet, DCPDN (Densely Connected Pyramid Dehazing Network), or a similar network;
1.2) Loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode;
1.3) Preprocessing the input foggy image according to the requirements of the transmittance estimation neural network, e.g., cropping, scaling, and mean and variance normalization;
1.4) Inputting the preprocessed foggy image into the transmittance estimation neural network to obtain the transmittance map of the foggy image.
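A minimal PyTorch sketch of steps 1.1)-1.4) follows; the tiny TransmittanceNet below is only a stand-in for a real network such as DehazeNet or DCPDN, and the weights path and normalization constants are illustrative assumptions. A real pipeline would load the published architecture and its pre-trained weights.

```python
import torch
import torch.nn as nn
import torchvision.transforms as T

class TransmittanceNet(nn.Module):
    """Stand-in for a real transmittance estimator (DehazeNet, DCPDN, ...)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),  # t(x) in (0, 1)
        )

    def forward(self, x):
        return self.body(x)

# steps 1.1)-1.2): build the network, load pre-trained weights, switch to test mode
net = TransmittanceNet()
# net.load_state_dict(torch.load("dehazenet.pth"))  # hypothetical weights file
net.eval()

# step 1.3): preprocessing per the network's requirements (values illustrative)
preprocess = T.Compose([
    T.ToTensor(),
    T.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])

def estimate_transmittance(net, foggy_rgb):
    """Step 1.4): run the network on a preprocessed (H, W, 3) foggy image."""
    x = preprocess(foggy_rgb).unsqueeze(0)       # add a batch dimension
    with torch.no_grad():                        # inference only
        t = net(x)
    return t.squeeze(0).squeeze(0).numpy()       # (H, W) transmittance map
```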
In the above step 2), the method for obtaining the defogged image comprises the following steps:
2.1) Calculating the atmospheric light component A from the foggy image;
2.2) Substituting the foggy image, the transmittance map obtained in step 1), and the atmospheric light component A into the fog imaging model (i.e., formula (4)) to obtain the defogged image.
As shown in fig. 3, in the above step 2.1), there are many algorithms for obtaining the atmospheric light component A; a commonly used one takes the brightest 0.1% of pixels of the dark channel map as the basis for A, and comprises the following steps:
2.1.1) For each pixel of the foggy image, taking the minimum value over the three RGB channels;
2.1.2) Applying minimum-value filtering to the resulting minimum-value map to obtain a dark channel map;
2.1.3) Taking the brightest 0.1% of pixels of the dark channel map, finding their corresponding pixel values in the input image, and taking the average of those values as the atmospheric light component A.
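A minimal sketch of steps 2.1.1)-2.1.3), assuming a SciPy environment; the 15×15 filter window follows common dark-channel-prior practice and is an illustrative choice (the embodiment below uses 5×5):

```python
import numpy as np
from scipy.ndimage import minimum_filter

def atmospheric_light(I, patch=15, top_frac=0.001):
    """Estimate A from a foggy image I of shape (H, W, 3), values in [0, 1]."""
    min_rgb = I.min(axis=2)                     # step 2.1.1): per-pixel min over RGB
    dark = minimum_filter(min_rgb, size=patch)  # step 2.1.2): minimum-value filtering
    n = max(1, int(dark.size * top_frac))       # step 2.1.3): brightest 0.1% of pixels
    flat = np.argsort(dark, axis=None)[-n:]
    idx = np.unravel_index(flat, dark.shape)
    return I[idx].mean(axis=0)                  # average RGB of those pixels as A
```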
As shown in fig. 4, in the above step 3), a depth estimation neural network is constructed and its network parameters initialized, and depth estimation using the depth estimation neural network comprises the following steps:
3.1) Constructing and initializing the depth estimation neural network, such as FCRN (Fully Convolutional Residual Networks) or DORN (Deep Ordinal Regression Network);
3.2) Loading the pre-trained depth estimation neural network parameters and setting it to test mode;
3.3) Preprocessing the defogged image according to the requirements of the depth estimation network, e.g., cropping, scaling, and mean and variance normalization;
3.4) Inputting the preprocessed defogged image into the depth estimation neural network to obtain the depth map.
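Steps 3.1)-3.4) mirror the transmittance pipeline above; continuing that sketch, DepthNet is again only a stand-in (a real system would load FCRN or DORN weights), and its Softplus output is a placeholder for metric depth in meters:

```python
class DepthNet(nn.Module):
    """Stand-in for a real monocular depth estimator (FCRN, DORN, ...)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Softplus(),  # depth d(x) > 0
        )

    def forward(self, x):
        return self.body(x)

def estimate_depth(net, defogged_rgb):
    """Steps 3.3)-3.4): preprocess the defogged image and run the depth net."""
    x = preprocess(defogged_rgb).unsqueeze(0)  # reuse the earlier preprocess transform
    with torch.no_grad():
        d = net(x)
    return d.squeeze(0).squeeze(0).numpy()     # (H, W) depth map
```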
In the above step 4), the transmittance map was obtained in step 1) and the depth map in step 3), so t(x) and d(x) in the fog imaging model are known; the extinction coefficient corresponding to each pixel then follows from formula (5) as β(x) = −ln(t(x))/d(x), which yields the extinction coefficient map.
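A per-pixel sketch of this inversion; the clipping floors are illustrative safeguards against log(0) and division by zero, not part of the patent:

```python
import numpy as np

def extinction_map(t, d, t_min=1e-3, d_min=1e-3):
    """Solve t(x) = exp(-beta(x) * d(x)) for beta: beta = -ln(t) / d."""
    t = np.clip(t, t_min, 1.0)   # avoid log(0) where the fog is very dense
    d = np.maximum(d, d_min)     # avoid division by zero at degenerate depths
    return -np.log(t) / d
```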
In the above step 5), since the extinction coefficient map was obtained in step 4), the extinction coefficient β in Koschmieder's law is known; substituting ε = 0.02 or ε = 0.05 into formula (3) gives the visibility V of each pixel, i.e., the visibility map.
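And the corresponding sketch of formula (3) applied per pixel, with beta_min an illustrative floor for near-clear-air pixels where β approaches zero:

```python
import numpy as np

def visibility_map(beta, eps=0.02, beta_min=1e-6):
    """Koschmieder's law per pixel: V(x) = -ln(eps) / beta(x)."""
    return -np.log(eps) / np.maximum(beta, beta_min)
```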
Based on the above deep-learning-based foggy day visibility estimation method, the invention also provides a deep-learning-based foggy day visibility estimation system, comprising: a transmittance map determining module, for inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map of the foggy image; a defogged image determining module, for recovering the defogged image using the estimated transmittance map and the fog imaging model; a depth map determining module, for inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain a depth map; an extinction coefficient map determining module, for calculating an extinction coefficient map from the depth map and the transmittance map; and a visibility map determining module, for calculating the visibility map from the extinction coefficient map using Koschmieder's law.
Further, the transmittance map determining module comprises: a transmittance estimation neural network construction module, for constructing and initializing the transmittance estimation neural network; a first network parameter loading module, for loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode; a first preprocessing module, for preprocessing the input foggy image as required; and a transmittance estimation module, for inputting the preprocessed foggy image into the transmittance estimation neural network for transmittance estimation to obtain the transmittance map.
Further, the depth map determining module comprises: a depth estimation neural network construction module, for constructing and initializing the depth estimation neural network; a second network parameter loading module, for loading the pre-trained network parameters of the depth estimation neural network and setting it to test mode; a second preprocessing module, for preprocessing the defogged image as required; and a depth estimation module, for inputting the preprocessed defogged image into the depth estimation neural network for depth estimation to obtain the depth map.
Embodiment one:
In this embodiment, the steps for estimating visibility on a foggy road are as follows:
1) For the input foggy image, a transmittance map is estimated by the transmittance estimation neural network.
As shown in fig. 5, DehazeNet is selected as the transmittance estimation neural network. The network is built and its parameters initialized in a PyTorch programming environment (version 1.6); the pre-trained network parameters downloaded from the Internet are then loaded and the network is set to test mode. After the input image is preprocessed with mean and variance normalization, it is fed into DehazeNet to obtain the transmittance map.
2) The defogged image is recovered using the estimated transmittance map and the fog imaging model.
Specifically, the method comprises the following steps:
2.1) Calculating the atmospheric light component A:
2.1.1) For each pixel of the input image, taking the minimum value over the three RGB channels;
2.1.2) Applying minimum-value filtering to the minimum-value map obtained in step 2.1.1), with a filter size of 5×5, to obtain a dark channel map;
2.1.3) Taking the brightest 0.1% of pixels of the dark channel map obtained in step 2.1.2), finding their corresponding pixel values in the input image, and taking the average as the atmospheric light component A.
2.2) Substituting t(x), I(x), and A into formula (4) of the fog imaging model to obtain the defogged image.
3) The defogged image is input into the depth estimation neural network to estimate the depth map.
As shown in fig. 6, DORN (Deep Ordinal Regression Network) is selected as the depth estimation neural network. The network is built and its parameters initialized in a PyTorch programming environment (version 1.6); the pre-trained network parameters downloaded from the Internet are then loaded and the network is set to test mode. After the defogged image is preprocessed with mean and variance normalization, it is fed into DORN to obtain the depth map.
4) An extinction coefficient map is calculated from the depth map and the transmittance map.
The transmittance map was obtained in step 1) and the depth map in step 3), so t(x) and d(x) in the fog imaging model are known; the extinction coefficient β corresponding to each pixel is calculated from formula (5) of the fog imaging model, yielding the extinction coefficient map.
5) From the extinction coefficient map, a visibility map is calculated using Koschmieder's law.
The extinction coefficient map was obtained in step 4), so the extinction coefficient β in Koschmieder's law is known; taking ε = 0.02 and substituting into formula (3) gives the visibility V of each pixel, i.e., the visibility map.
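Assuming the helper functions sketched in the preceding sections, the whole embodiment can be composed into one short pipeline; every name here is an illustrative assumption rather than the patent's reference implementation, foggy_rgb is taken to be a (H, W, 3) float32 RGB array in [0, 1], and in practice the transmittance and depth maps must share a common resolution:

```python
def estimate_visibility(foggy_rgb, t_net, depth_net, eps=0.02):
    """End-to-end sketch of steps 1)-5) on a single foggy RGB image."""
    t = estimate_transmittance(t_net, foggy_rgb)   # step 1): transmittance map
    A = atmospheric_light(foggy_rgb)               # step 2.1): atmospheric light A
    J = recover_defogged(foggy_rgb, t, A)          # step 2.2): defogged image
    d = estimate_depth(depth_net, J)               # step 3): depth map
    beta = extinction_map(t, d)                    # step 4): extinction coefficients
    return visibility_map(beta, eps=eps)           # step 5): per-pixel visibility
```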
A specific embodiment has been given above, but the invention is not limited to the described embodiment. On the basis of the above scheme, those skilled in the art can design various modified models, formulas, and parameters according to the teaching of the present invention without creative effort. Variations, modifications, substitutions, and alterations of the embodiments remain possible without departing from the principles and spirit of the present invention.

Claims (10)

1. The foggy day visibility estimation method based on deep learning is characterized by comprising the following steps of:
1) Inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map of the foggy image;
2) Processing the foggy image with the estimated transmittance map and the fog imaging model to recover a defogged image;
3) Inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain a depth map;
4) Calculating an extinction coefficient map from the depth map and the transmittance map;
5) Calculating a visibility map from the extinction coefficient map using Koschmieder's law.
2. The foggy day visibility estimation method based on deep learning as claimed in claim 1, wherein in step 1), inputting the foggy image into the pre-constructed transmittance estimation neural network for transmittance estimation to obtain the transmittance map comprises the following steps:
1.1) Constructing and initializing the transmittance estimation neural network;
1.2) Loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode;
1.3) Preprocessing the input foggy image according to the requirements of the transmittance estimation neural network;
1.4) Inputting the preprocessed foggy image into the transmittance estimation neural network to obtain the transmittance map.
3. The foggy day visibility estimation method based on deep learning as claimed in claim 1, wherein in step 2), recovering the defogged image from the transmittance map and the foggy image comprises the following steps:
2.1) Calculating the atmospheric light component A from the input foggy image;
2.2) Substituting the foggy image, the transmittance map obtained in step 1), and the atmospheric light component A into the fog imaging model to obtain the defogged image.
4. The foggy day visibility estimation method based on deep learning as claimed in claim 3, wherein in step 2.1), calculating the atmospheric light component A from the input foggy image comprises the following steps:
2.1.1) For each pixel of the input foggy image, taking the minimum value over the three RGB channels;
2.1.2) Applying minimum-value filtering to the minimum-value map obtained in step 2.1.1) to obtain a dark channel map;
2.1.3) Taking the brightest 0.1% of pixels of the dark channel map obtained in step 2.1.2), finding the corresponding pixel values in the input foggy image, and taking their average as the atmospheric light component A.
5. The foggy day visibility estimation method based on deep learning as claimed in claim 1, wherein in step 3), inputting the defogged image into the pre-constructed depth estimation neural network for depth estimation to obtain the depth map comprises the following steps:
3.1) Constructing and initializing the depth estimation neural network;
3.2) Loading the pre-trained network parameters of the depth estimation neural network and setting it to test mode;
3.3) Preprocessing the defogged image as required;
3.4) Inputting the preprocessed defogged image into the depth estimation neural network for depth estimation to obtain the depth map.
6. The foggy day visibility estimation method based on deep learning as claimed in claim 1, wherein in step 4), calculating the extinction coefficient map from the depth map and the transmittance map comprises:
first, calculating the extinction coefficient corresponding to each pixel from the transmittance map and the depth map according to the formula:
t(x) = e^(−β(x)d(x))
where t(x) is the transmittance of the transmittance map at a pixel point, β(x) is the extinction coefficient corresponding to that pixel, and d(x) is the depth of the depth map at that pixel;
next, assembling the extinction coefficients β(x) of all pixels into the extinction coefficient map.
7. The foggy day visibility estimation method based on deep learning as claimed in claim 1, wherein in step 5), calculating the visibility map from the extinction coefficient map using Koschmieder's law comprises:
first, calculating the visibility of each pixel point from the extinction coefficient map according to Koschmieder's law:
V(x) = −ln(ε)/β(x)
where V(x) is the visibility at a pixel point, ε is the eye contrast threshold, and β(x) is the extinction coefficient of the extinction coefficient map at that pixel;
next, assembling the visibilities V(x) of all pixels into the visibility map.
8. A deep-learning-based foggy weather visibility estimation system employing the method of any one of claims 1-7, comprising:
a transmittance map determining module, for inputting the foggy image into a pre-constructed transmittance estimation neural network for transmittance estimation to obtain a transmittance map;
a defogged image determining module, for recovering the defogged image using the estimated transmittance map and the fog imaging model;
a depth map determining module, for inputting the defogged image into a pre-constructed depth estimation neural network for depth estimation to obtain a depth map;
an extinction coefficient map determining module, for calculating an extinction coefficient map from the depth map and the transmittance map;
and a visibility map determining module, for calculating the visibility map from the extinction coefficient map using Koschmieder's law.
9. The deep-learning-based foggy weather visibility estimation system as claimed in claim 8, wherein the transmittance map determining module comprises:
a transmittance estimation neural network construction module, for constructing and initializing the transmittance estimation neural network;
a first network parameter loading module, for loading the pre-trained network parameters of the transmittance estimation neural network and setting it to test mode;
a first preprocessing module, for preprocessing the input foggy image as required;
and a transmittance estimation module, for inputting the preprocessed foggy image into the transmittance estimation neural network for transmittance estimation to obtain the transmittance map.
10. The deep-learning-based foggy weather visibility estimation system as claimed in claim 8, wherein the depth map determining module comprises:
a depth estimation neural network construction module, for constructing and initializing the depth estimation neural network;
a second network parameter loading module, for loading the pre-trained network parameters of the depth estimation neural network and setting it to test mode;
a second preprocessing module, for preprocessing the defogged image as required;
and a depth estimation module, for inputting the preprocessed defogged image into the depth estimation neural network for depth estimation to obtain the depth map.
CN202110737978.6A 2021-06-30 2021-06-30 Deep learning-based foggy day visibility estimation method and system Active CN113469912B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110737978.6A CN113469912B (en) 2021-06-30 2021-06-30 Deep learning-based foggy day visibility estimation method and system

Publications (2)

Publication Number Publication Date
CN113469912A CN113469912A (en) 2021-10-01
CN113469912B 2024-06-14

Family

ID=77876590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110737978.6A Active CN113469912B (en) 2021-06-30 2021-06-30 Deep learning-based foggy day visibility estimation method and system

Country Status (1)

Country Link
CN (1) CN113469912B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116664448B (en) * 2023-07-24 2023-10-03 南京邮电大学 Medium-high visibility calculation method and system based on image defogging

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US9288458B1 (en) * 2015-01-31 2016-03-15 Hrl Laboratories, Llc Fast digital image de-hazing methods for real-time video processing
CN104809707B (en) * 2015-04-28 2017-05-31 西南科技大学 A kind of single width Misty Image visibility method of estimation
CN107194924A (en) * 2017-05-23 2017-09-22 重庆大学 Expressway foggy-dog visibility detecting method based on dark channel prior and deep learning
CN107749052A (en) * 2017-10-24 2018-03-02 中国科学院长春光学精密机械与物理研究所 Image defogging method and system based on deep learning neutral net
CN112365467B (en) * 2020-11-11 2022-07-19 武汉长江通信智联技术有限公司 Foggy image visibility estimation method based on single image depth estimation

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN104680494A (en) * 2015-03-14 2015-06-03 西安电子科技大学 Optimal fog image recovery method based on artificial fog addition

Non-Patent Citations (1)

Title
Algorithm analysis of daytime visibility based on image recognition technology; Yu Jiasong, Han Xiaqing; Electronic Measurement Technology; 2020-10-31; full text *

Also Published As

Publication number Publication date
CN113469912A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN104809707B (en) A kind of single width Misty Image visibility method of estimation
CN107301623B (en) Traffic image defogging method and system based on dark channel and image segmentation
Highnam et al. Model-based image enhancement of far infrared images
CN102682443B (en) Rapid defogging algorithm based on polarization image guide
CN103020914B (en) Based on the rapid image defogging method capable of spatial continuity principle
CN102411774A (en) Processing method, device and system based on single image defogging
CN104156536A (en) Visual quantitative calibration and analysis method for cutter abrasion of shield tunneling machine
CN104574387A (en) Image processing method in underwater vision SLAM system
CN105139347A (en) Polarization imaging defogging method combined with dark channel prior principle
CN107505291B (en) Method for estimating visibility through single image
CN104050637A (en) Quick image defogging method based on two times of guide filtration
CN110458029B (en) Vehicle detection method and device in foggy environment
CN111210396A (en) Multispectral polarization image defogging method combined with sky light polarization model
CN113469912B (en) Deep learning-based foggy day visibility estimation method and system
CN104272347A (en) Image processing apparatus for removing haze contained in still image and method thereof
JP2023548127A (en) Correcting camera images in the presence of rain, intruding light and dirt
CN106023108A (en) Image defogging algorithm based on boundary constraint and context regularization
CN113763261B (en) Real-time detection method for far small target under sea fog weather condition
CN114998147A (en) Traffic image fog adding method under digital twin
CN111951339A (en) Image processing method for performing parallax calculation by using heterogeneous binocular cameras
CN113888420A (en) Underwater image restoration method and device based on correction model and storage medium
Bartani et al. An adaptive optic-physic based dust removal method using optimized air-light and transfer function
CN110246102B (en) Method for clearly processing video in rainy days
CN116229404A (en) Image defogging optimization method based on distance sensor
Mittal et al. FEMT: a computational approach for fog elimination using multiple thresholds

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant