CN111274911A - Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning - Google Patents

Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning

Info

Publication number
CN111274911A
Authority
CN
China
Prior art keywords
dense fog
microwave attenuation
network
test set
intensity data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010050327.5A
Other languages
Chinese (zh)
Other versions
CN111274911B (en)
Inventor
杨涛
洪岱
郑鑫
师鹏飞
秦友伟
李振亚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202010050327.5A
Publication of CN111274911A
Application granted
Publication of CN111274911B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02Preprocessing
    • G06F2218/04Denoising
    • G06F2218/06Denoising by applying a scale-space analysis, e.g. using wavelet analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

The invention discloses a dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning. A time-frequency diagram is obtained by performing wavelet transformation on preprocessed microwave attenuation signal intensity data, and the size of the time-frequency diagram is adjusted to obtain adjusted images. The training set of adjusted images is input into an Alexnet network for training; when the Alexnet network reaches the preset requirement, an Alexnet network model is determined according to the current network parameters of the Alexnet network. The Alexnet network model is used to detect the test set and output the network detection result of each adjusted image in the test set. The dense fog liquid water content of each adjusted image in the test set is calculated according to the network detection result and a preset inversion formula, and the visibility corresponding to each adjusted image in the test set is calculated according to the dense fog liquid water content. The method improves the flexibility with which visibility and other dense fog parameters can be monitored, and improves the accuracy of the obtained values.

Description

Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning
Technical Field
The invention relates to the technical field of meteorological factor monitoring, in particular to a dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning.
Background
Dense fog is a meteorological phenomenon with a great influence on human production and life. On one hand, dense fog reduces visibility, severely affecting traffic and transportation and causing serious loss of life and property; on the other hand, air pollutants combining with dense fog to form smog harm animals and plants. Common dense fog monitoring methods include manual observation, transmissometers, scattering-coefficient instruments and satellite monitoring. Manual observation estimates visibility from whether objects at known distances from the observer's current position are visible or obscured; it is strongly affected by human subjectivity and cannot give an accurate magnitude of the fog. A transmissometer estimates the fog concentration by calculating the average extinction coefficient in the horizontal direction: the transmitter emits a modulated luminous flux of constant average power, and the receiver contains a photodetector that measures the light falling on it. Although such instruments are very accurate, they are very expensive. Instruments that measure the scattering coefficient concentrate a light beam on a small volume of air and determine photometrically the proportion of light scattered at sufficiently large angles and in non-critical directions; however, because only a very small sample volume is measured, the resulting visibility is not representative. Satellite monitoring has the advantage of large spatial coverage, but it is difficult to provide accurate ground-level monitoring data and to distinguish whether the observation is actually fog or cloud. Traditional dense fog monitoring is therefore often limited, and the accuracy of the corresponding monitoring results is low.
Disclosure of Invention
To address these problems, the invention provides a dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning.
To realize this aim, the invention provides a dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning, which comprises the following steps:
s10, obtaining microwave attenuation signal intensity data measured for the atmosphere by a microwave signal receiving end, and preprocessing the microwave attenuation signal intensity data;
s20, performing wavelet transformation on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency diagram;
s30, adjusting the size of the time-frequency diagram to obtain an adjusted image, and enabling the adjusted image to meet the input requirement of a pre-trained Alexnet network;
s40, dividing the adjustment images into a test set and a training set, determining the state label of each adjustment image in the training set, inputting the training set into an Alexnet network for training, determining an Alexnet network model according to the current network parameters of the Alexnet network when the difference between the state detection result output by the Alexnet network and the state label of the training set meets the preset requirement, detecting the test set by using the Alexnet network model, and outputting the network detection result of each adjustment image in the test set;
and S50, calculating the content of the dense fog liquid water of each adjustment image in the test set according to the network detection result and a preset inversion formula, and calculating the visibility corresponding to each adjustment image in the test set according to the content of the dense fog liquid water.
In one embodiment, the process of pre-processing the microwave attenuation signal strength data comprises:
and identifying lost data in the microwave attenuation signal intensity data, interpolating the lost data, and eliminating noise data which is larger than a response threshold value in the microwave attenuation signal intensity data.
In one embodiment, the process of performing wavelet transform on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency diagram includes:
s31, determining a scale parameter a, determining a wavelet basis according to the scale parameter a, and comparing the wavelet basis with the preprocessed microwave attenuation signal intensity data to obtain a plurality of segment signals;
s32, calculating a continuous wavelet transform coefficient C1 of a segment signal, wherein the continuous wavelet transform coefficient C1 represents the similarity degree of a wavelet basis and the corresponding segment signal;
s33, adjusting the position parameter b to translate the wavelet, and repeating the step S32 for the other segment signals until each segment signal has been analyzed and its continuous wavelet transform coefficient C1 obtained;
s34, updating the scale parameter a, and returning to execute the step S31 until the continuous wavelet transform coefficient C1 of each segment signal is determined according to each preset scale parameter a;
and S35, drawing a time-frequency graph according to the continuous wavelet transform coefficient C1 corresponding to each scale parameter a.
In one embodiment, the Alexnet network includes 5 convolutional layers, 3 pooling layers, 3 fully-connected layers, and 1 soft-max layer.
In one embodiment, the activation function of the Alexnet network includes:
ReLu(x) = max(0, x),
where ReLu represents the activation function and x represents the input signal.
In one embodiment, the inversion formula comprises:
A_f = φ · LWC,
wherein A_f denotes the network detection result, φ denotes a set constant, and LWC denotes the dense fog liquid water content.
In one embodiment, the visibility calculation formula includes:
V = 1.002 · (LWC × N_D)^(−0.6473),
N_D = −0.071 × T² + 2.213 × T + 141.56,
wherein V represents the visibility, N_D represents the water droplet number concentration, and T represents the current temperature.
The method obtains microwave attenuation signal intensity data measured for the atmosphere by a microwave signal receiving end and preprocesses the microwave attenuation signal intensity data. The preprocessed data are subjected to wavelet transformation to obtain a time-frequency diagram, and the size of the time-frequency diagram is adjusted to obtain adjusted images that meet the input requirement of the Alexnet network to be pre-trained. The adjusted images are divided into a test set and a training set, and the state label of each adjusted image in the training set is determined. The training set is input into the Alexnet network for training; when the difference between the state detection result output by the Alexnet network and the state labels of the training set meets the preset requirement, an Alexnet network model is determined according to the current network parameters of the Alexnet network. The Alexnet network model is used to detect the test set and output the network detection result of each adjusted image in the test set. The dense fog liquid water content of each adjusted image in the test set is calculated according to the network detection result and a preset inversion formula, and the visibility corresponding to each adjusted image in the test set is calculated according to the dense fog liquid water content. In this way, the flexibility of monitoring visibility and other dense fog parameters is improved, and the accuracy of the obtained visibility and other dense fog parameters is improved.
Drawings
Fig. 1 is a flowchart of a dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a flowchart of a dense fog monitoring method based on wireless microwave attenuation feature transfer learning according to an embodiment, including the following steps:
And S10, obtaining microwave attenuation signal intensity data measured for the atmosphere by the microwave signal receiving end, and preprocessing the microwave attenuation signal intensity data.
After the microwave attenuation signal intensity data of the microwave signal receiving end are obtained in the above step, the data are preliminarily processed. The purpose is to eliminate obviously unreasonable abnormal values in the signal, so as to ensure the validity of the microwave attenuation signal intensity data used in the subsequent process.
And S20, performing wavelet transformation on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency diagram.
The obtained microwave attenuation signal intensity data is subjected to wavelet transformation to obtain a time-frequency diagram, and the purpose of doing so is to extract the characteristics of the microwave attenuation signal intensity data in a time domain and a frequency domain.
S30, adjusting the size of the time-frequency diagram to obtain an adjusted image, and enabling the adjusted image to meet the input requirement of the pre-trained Alexnet network.
In the above step, the extracted time-frequency diagram is used as the feature, and its size is transformed to obtain an image of size 227 × 227 × 3 (i.e. the adjusted image), so that the adjusted image meets the input requirement of the large pre-trained network (Alexnet).
S40, dividing the adjustment images into a test set and a training set, determining the state label of each adjustment image in the training set, inputting the training set into an Alexnet network for training, determining an Alexnet network model according to the current network parameters of the Alexnet network when the difference between the state detection result output by the Alexnet network and the state label of the training set reaches a preset requirement, detecting the test set by using the Alexnet network model, and outputting the network detection result of each adjustment image in the test set.
The preset requirement may refer to a condition that a difference between a value of each state detection result output by the Alexnet network and a value of a corresponding state label of the training set is smaller than a preset threshold value.
Dividing the adjusted images into a test set and a training set may specifically take 80% of the adjusted images as the training set and 20% as the test set. Each image in the training set is labeled according to whether it corresponds to a dense fog state. The images of the training set are fed into the improved Alexnet network for training; after training, the trained parameters of the Alexnet network model (such as the neuron weights) are used to classify the divided test set, judging whether each image corresponds to a dense fog state, and the network detection result of each adjusted image in the test set is output.
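For illustration only, a minimal sketch of this 80%/20% split and labeling step is given below. It assumes Python with PyTorch/torchvision (the patent does not name a framework) and a hypothetical folder layout in which the adjusted time-frequency images are stored under class directories named 'fog' and 'no_fog', so that the folder name supplies the state label:

```python
# Hypothetical split/labeling sketch; 'timefreq_images/fog' and
# 'timefreq_images/no_fog' are assumed directory names, not from the patent.
import torch
from torchvision import datasets, transforms

tfm = transforms.Compose([transforms.Resize((227, 227)), transforms.ToTensor()])
dataset = datasets.ImageFolder('timefreq_images', transform=tfm)  # label = folder name

n_train = int(0.8 * len(dataset))                     # 80% training set
train_set, test_set = torch.utils.data.random_split(
    dataset, [n_train, len(dataset) - n_train])       # 20% test set

train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=32)
```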
And S50, calculating the content of the dense fog liquid water of each adjustment image in the test set according to the network detection result and a preset inversion formula, and calculating the visibility corresponding to each adjustment image in the test set according to the content of the dense fog liquid water.
In this step, for the images identified as dense fog, the dense fog liquid water content can be back-calculated from the processed data (the network detection result) according to the inversion formula and the related parameters. From the calculated dense fog liquid water content and the temperature, the fog droplet number concentration is obtained from a formula, and the visibility is then obtained.
The dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning obtains microwave attenuation signal intensity data measured for the atmosphere by a microwave signal receiving end and preprocesses the data. The preprocessed microwave attenuation signal intensity data are subjected to wavelet transformation to obtain a time-frequency diagram, whose size is adjusted to obtain adjusted images that meet the input requirement of the pre-trained Alexnet network. The adjusted images are divided into a test set and a training set, and the state label of each adjusted image in the training set is determined. The training set is input into the Alexnet network for training; when the difference between the state detection result output by the Alexnet network and the state labels of the training set reaches the preset requirement, an Alexnet network model is determined according to the current network parameters of the Alexnet network. The Alexnet network model is used to detect the test set, and the network detection result of each adjusted image in the test set is output. The dense fog liquid water content of each adjusted image in the test set is calculated according to the network detection result and a preset inversion formula, and the visibility corresponding to each adjusted image in the test set is calculated according to the dense fog liquid water content. This improves the flexibility of monitoring dense fog parameters such as visibility and the accuracy of the obtained dense fog parameters such as visibility.
In one embodiment, the process of pre-processing the microwave attenuation signal strength data comprises:
and identifying lost data in the microwave attenuation signal intensity data, interpolating the lost data, and eliminating noise data (interference data) which is larger than a response threshold value in the microwave attenuation signal intensity data.
The embodiment can interpolate individual lost data and remove interference data obviously exceeding the response threshold so as to realize the preprocessing of the microwave attenuation signal intensity data.
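A minimal preprocessing sketch is shown below. It assumes Python with NumPy, a one-dimensional array of received signal levels in which lost samples are marked as NaN, and a hypothetical outlier_threshold parameter standing in for the response threshold mentioned above:

```python
# Preprocessing sketch under the stated assumptions: NaN marks lost samples,
# outlier_threshold is a hypothetical stand-in for the response threshold.
import numpy as np

def preprocess_attenuation(signal, outlier_threshold):
    x = np.asarray(signal, dtype=float)
    x[np.abs(x) > outlier_threshold] = np.nan              # discard noise above the threshold
    idx = np.arange(x.size)
    good = ~np.isnan(x)
    x[~good] = np.interp(idx[~good], idx[good], x[good])   # linearly interpolate lost data
    return x
```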
In one embodiment, the process of performing wavelet transform on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency diagram includes:
And S31, determining a scale parameter a, determining a wavelet basis according to the scale parameter a, and comparing the wavelet basis with the preprocessed microwave attenuation signal intensity data to obtain a plurality of segment signals.
S32, calculating the continuous wavelet transform coefficient C1 of one segment signal; C1 represents the degree of similarity between the wavelet basis and the corresponding segment signal. Specifically, the larger C1 is, the more similar the two are, so the wavelet transform is performed by selecting a wavelet similar to the signal.
And S33, adjusting the position parameter b to translate the wavelet, and repeating the step S32 for the other segment signals until each segment signal has been analyzed and its continuous wavelet transform coefficient C1 obtained.
And S34, updating the scale parameter a, and returning to execute the step S31 until the continuous wavelet transform coefficient C1 of each segment signal is determined according to each preset scale parameter a.
And S35, drawing a time-frequency graph according to the continuous wavelet transform coefficient C1 corresponding to each scale parameter a.
Further, the preprocessed microwave signal data may be subjected to a wavelet transform. Wavelet transformation compares the signal with sub-wavelet functions obtained by applying different time translations and dilations to a mother wavelet function ψ(t), and the wavelet coefficients show the degree of similarity between the signal and the series of wavelets. By analyzing a segment signal with sub-wavelets that can be translated and dilated, information about any local region of the signal can be obtained.
In one example, for signal x (t), the wavelet transform formula is:
Figure BDA0002370920800000061
in the formula, the superscript represents conjugation, a is a scale factor, b is a position parameter, and the sub-wavelet
Figure BDA0002370920800000062
From mother wavelet
Figure BDA0002370920800000063
Translating b and stretching a to obtain:
Figure BDA0002370920800000064
The signal is then discretized by sampling it at equal time intervals Δt, namely:
C(a, b) = Σ_n x(nΔt) · ψ*_{a,b}(nΔt) · Δt.
A generalized Morse wavelet is then used; its waveform and characteristics are defined by two parameters, γ and β, and its frequency-domain definition is:
Ψ_{β,γ}(ω) = U(ω) · a_{β,γ} · ω^β · e^(−ω^γ),
where U(ω) is the unit step function and a_{β,γ} is a normalizing constant.
When γ = 3, the Morse wavelet is completely symmetric in the frequency domain. When suitable parameters are selected, the generalized Morse analytic wavelet can approximate any signal, so it has complete analytic properties, and information is retained to the maximum extent after the wavelet transform.
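For illustration, a sketch of steps S20–S30 (time-frequency diagram generation and resizing) is given below. It assumes Python with PyWavelets, matplotlib and Pillow; because PyWavelets does not provide the generalized Morse wavelet, a complex Morlet wavelet ('cmor1.5-1.0') is used here as a stand-in, and the scale range is an arbitrary example, while the 227 × 227 × 3 output size follows the description above:

```python
# Scalogram sketch under the stated assumptions: a complex Morlet replaces the
# generalized Morse wavelet, and the scale range is illustrative only.
import numpy as np
import pywt
import matplotlib.pyplot as plt
from PIL import Image

def attenuation_to_timefreq_image(segment, out_path, scales=np.arange(1, 129)):
    """CWT of one attenuation segment -> 227x227 RGB time-frequency image."""
    coeffs, _ = pywt.cwt(segment, scales, 'cmor1.5-1.0')    # C1(a, b) for each scale a, shift b
    power = np.abs(coeffs)                                   # similarity of wavelet and signal

    fig, ax = plt.subplots(figsize=(3, 3), dpi=100)
    ax.imshow(power, aspect='auto', cmap='jet')              # draw the time-frequency graph
    ax.axis('off')
    fig.savefig(out_path, bbox_inches='tight', pad_inches=0)
    plt.close(fig)

    Image.open(out_path).convert('RGB').resize((227, 227)).save(out_path)  # AlexNet input size
```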
In one embodiment, the Alexnet network includes 5 convolutional layers, 3 pooling layers, 3 fully-connected layers, and 1 soft-max layer.
As an embodiment, the activation function of the Alexnet network includes:
ReLu(x) = max(0, x),
where ReLu represents the activation function and x represents the input signal.
The Alexnet network comprises 5 convolutional layers, 3 pooling layers, 3 full-connection layers and 1 soft-max layer. The functions involved therein include:
The ReLu function is used as the activation function of the model; specifically,
ReLu(x) = max(0, x),
wherein x is an input;
The Adam algorithm is used to optimize the improved Alexnet network model,
and the cross entropy is used as the cost function:
C = −(1/n) · Σ [ y · ln(a1) + (1 − y) · ln(1 − a1) ],
wherein y is the expected output, a1 is the actual output of the neuron, C is the cost function, and n is the total number of samples in the training set;
a1 = σ(z), where z = Σ_j (w_j × x_j) + b1,
where σ(z) is the activation function applied to z, w_j is the weight of the j-th neuron, x_j is the input to the j-th neuron, and b1 is a bias term (threshold);
The fully-connected layers act as a classifier in the whole network: they map the learned distributed feature representation to the sample label space through a weighted summation of the features extracted by the preceding layers. The softmax layer and the output classification layer perform the final classification and normalization.
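As a non-authoritative sketch of the feature transfer learning step, the snippet below assumes PyTorch/torchvision (the patent does not name a framework): the ImageNet-pretrained AlexNet feature extractor is kept, only the last fully-connected layer is replaced so that the soft-max output has two classes (dense fog / no fog), and training uses the cross-entropy cost and the Adam settings listed in the algorithm below:

```python
# Transfer-learning sketch under the stated assumptions (PyTorch/torchvision).
import torch
import torch.nn as nn
from torchvision import models

def build_fog_classifier(num_classes: int = 2) -> nn.Module:
    net = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)  # pretrained backbone
    for p in net.features.parameters():
        p.requires_grad = False                       # freeze the convolutional feature layers
    net.classifier[6] = nn.Linear(4096, num_classes)  # new head: dense fog / no fog
    return net

net = build_fog_classifier()
criterion = nn.CrossEntropyLoss()                     # cross-entropy cost function
optimizer = torch.optim.Adam(net.classifier.parameters(),
                             lr=1e-3, betas=(0.9, 0.999), eps=1e-8)  # Adam defaults from the text
```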
Further, the goal of training an Alexnet network is to minimize the cost function; the steps of the algorithm are as follows:
(1) Step size ε (suggested default: 0.001);
(2) exponential decay rates of the moment estimates, ρ1 and ρ2, within the interval [0, 1) (suggested defaults: 0.9 and 0.999, respectively);
(3) a small constant δ for numerical stabilization (suggested default: 10⁻⁸);
(4) initialize the parameters θ, initialize the first- and second-moment variables s = 0 and r = 0, and initialize the time step t = 0;
(5) while the stopping criterion is not met, sample a minibatch of m examples {x^(1), ..., x^(m)} from the training set, with corresponding targets y^(i);
(6) compute the gradient g of the cost function on the minibatch and update the time step: t ← t + 1;
(7) update the biased first-moment estimate: s ← ρ1 · s + (1 − ρ1) · g;
update the biased second-moment estimate: r ← ρ2 · r + (1 − ρ2) · g ⊙ g;
(8) correct the bias of the first moment: ŝ ← s / (1 − ρ1^t);
correct the bias of the second moment: r̂ ← r / (1 − ρ2^t);
(9) compute the update: Δθ = −ε · ŝ / (√r̂ + δ) (applied element-wise);
apply the update: θ ← θ + Δθ.
Using the Adam algorithm to optimize the cross-entropy cost function improves both the speed and the accuracy of time-frequency image classification.
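A from-scratch sketch of the Adam update in steps (1)–(9) above is shown below, using NumPy; grad_fn is a hypothetical callback that returns the minibatch gradient of the cost function at θ:

```python
# Adam update sketch mirroring steps (1)-(9); grad_fn is a hypothetical callback.
import numpy as np

def adam(theta, grad_fn, steps, eps_lr=0.001, rho1=0.9, rho2=0.999, delta=1e-8):
    s = np.zeros_like(theta)                    # biased first-moment estimate
    r = np.zeros_like(theta)                    # biased second-moment estimate
    t = 0
    for _ in range(steps):
        g = grad_fn(theta)                      # (6) minibatch gradient of the cost
        t += 1
        s = rho1 * s + (1 - rho1) * g           # (7) update biased first moment
        r = rho2 * r + (1 - rho2) * g * g       #     update biased second moment
        s_hat = s / (1 - rho1 ** t)             # (8) bias-correct first moment
        r_hat = r / (1 - rho2 ** t)             #     bias-correct second moment
        delta_theta = -eps_lr * s_hat / (np.sqrt(r_hat) + delta)  # (9) element-wise update
        theta = theta + delta_theta             #     apply update
    return theta
```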
The trained Alexnet network model is then used to identify dense fog according to the above steps; after successful identification, the dense fog liquid water content (LWC, in g·m⁻³) is calculated.
In one embodiment, the inversion formula comprises:
A_f = φ · LWC,
wherein A_f represents the network detection result in dB·km⁻¹, φ denotes a set constant (specifically, a constant related to frequency and temperature), and LWC denotes the dense fog liquid water content in g·m⁻³.
Further, LWC is obtained by inverting the above formula (the inversion formula): the LWC value is solved from the measured A_f signal and the value of φ, where φ is calculated as follows:
φ = 0.819 · f / (ε″ · (1 + η²)),
where f (GHz) is the link frequency and
η = (2 + ε′) / ε″,
and the complex dielectric constant of water is ε(f, T) = ε′(f, T) + i·ε″(f, T), whose values can be found in the relevant literature.
In one embodiment, the visibility calculation formula includes:
V = 1.002 · (LWC × N_D)^(−0.6473),
N_D = −0.071 × T² + 2.213 × T + 141.56,
wherein V represents the visibility, N_D represents the water droplet number concentration, and T represents the current (local) temperature.
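The inversion and visibility formulas above translate directly into code. The sketch below assumes Python; phi must be supplied externally, since it depends on the link frequency and the temperature:

```python
# Direct transcription of the inversion and visibility formulas; phi is supplied
# by the caller (it depends on link frequency and temperature).
def liquid_water_content(a_f, phi):
    """Invert LWC (g/m^3) from fog-induced attenuation a_f (dB/km): a_f = phi * LWC."""
    return a_f / phi

def visibility(lwc, temperature_c):
    """Visibility from LWC and the droplet number concentration N_D(T)."""
    n_d = -0.071 * temperature_c ** 2 + 2.213 * temperature_c + 141.56
    return 1.002 * (lwc * n_d) ** (-0.6473)
```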
The dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning extracts the features of the attenuated microwave signal by generating a time-frequency graph through wavelet transform, identifies dense fog by feature transfer learning, and deduces the visibility of the dense fog with the related formulas, thereby monitoring and analyzing the dense fog. It has the following advantages:
(1) The characteristics of the microwave signal attenuation caused by dense fog are fully extracted from the time-frequency image obtained by wavelet transform.
(2) Monitoring and analyzing dense fog with machine learning provides a new approach to effective dense fog monitoring and analysis.
(3) Identifying dense fog through feature transfer exploits the principle that dense fog attenuates the microwave signal, and offers a new solution to the problems of high monitoring cost and low coverage of dense fog monitoring.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
It should be noted that the terms "first/second/third" in the embodiments of the present application merely distinguish similar objects and do not imply a specific ordering of those objects; where permitted, "first/second/third" objects may be interchanged, so that the embodiments described herein can be implemented in an order other than that illustrated or described herein.
The terms "comprising" and "having" and any variations thereof in the embodiments of the present application are intended to cover non-exclusive inclusions. For example, a process, method, apparatus, product, or device that comprises a list of steps or modules is not limited to the listed steps or modules but may alternatively include other steps or modules not listed or inherent to such process, method, product, or device.
The above embodiments express only several embodiments of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning is characterized by comprising the following steps:
s10, obtaining microwave attenuation signal intensity data measured for the atmosphere by the microwave signal receiving end, and preprocessing the microwave attenuation signal intensity data;
s20, performing wavelet transformation on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency diagram;
s30, adjusting the size of the time-frequency diagram to obtain an adjusted image, and enabling the adjusted image to meet the input requirement of a pre-trained Alexnet network;
s40, dividing the adjustment images into a test set and a training set, determining the state label of each adjustment image in the training set, inputting the training set into an Alexnet network for training, determining an Alexnet network model according to the current network parameters of the Alexnet network when the difference between the state detection result output by the Alexnet network and the state label of the training set meets the preset requirement, detecting the test set by using the Alexnet network model, and outputting the network detection result of each adjustment image in the test set;
and S50, calculating the content of the dense fog liquid water of each adjustment image in the test set according to the network detection result and a preset inversion formula, and calculating the visibility corresponding to each adjustment image in the test set according to the content of the dense fog liquid water.
2. The dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning as claimed in claim 1, wherein the process of preprocessing the microwave attenuation signal intensity data comprises:
and identifying lost data in the microwave attenuation signal intensity data, interpolating the lost data, and eliminating noise data which is larger than a response threshold value in the microwave attenuation signal intensity data.
3. The dense fog monitoring method based on the wireless microwave attenuation characteristic transfer learning of claim 1, wherein the process of performing wavelet transformation on the preprocessed microwave attenuation signal intensity data to obtain a time-frequency graph comprises the following steps:
s31, determining a scale parameter a, determining a wavelet basis according to the scale parameter a, and comparing the wavelet basis with the preprocessed microwave attenuation signal intensity data to obtain a plurality of segment signals;
s32, calculating a continuous wavelet transform coefficient C1 of a segment signal, wherein the continuous wavelet transform coefficient C1 represents the similarity degree of a wavelet basis and the corresponding segment signal;
s33, adjusting the position parameter b to translate the wavelet, and repeating the step S32 for the other segment signals until each segment signal has been analyzed and its continuous wavelet transform coefficient C1 obtained;
s34, updating the scale parameter a, and returning to execute the step S31 until the continuous wavelet transform coefficient C1 of each segment signal is determined according to each preset scale parameter a;
and S35, drawing a time-frequency graph according to the continuous wavelet transform coefficient C1 corresponding to each scale parameter a.
4. The dense fog monitoring method based on wireless microwave attenuation feature transfer learning of claim 1, wherein the Alexnet network comprises 5 convolutional layers, 3 pooling layers, 3 full-link layers and 1 soft-max layer.
5. The dense fog monitoring method based on wireless microwave attenuation feature transfer learning as claimed in claim 1, wherein the activation function of the Alexnet network comprises:
ReLu(x) = max(0, x),
where ReLu represents the activation function and x represents the input signal.
6. The dense fog monitoring method based on wireless microwave attenuation feature transfer learning according to claim 1, wherein the inversion formula comprises:
A_f = φ · LWC,
wherein A_f denotes the network detection result, φ denotes a set constant, and LWC denotes the dense fog liquid water content.
7. The dense fog monitoring method based on wireless microwave attenuation feature transfer learning as claimed in claim 1, wherein the visibility calculation formula comprises:
V = 1.002 · (LWC × N_D)^(−0.6473),
N_D = −0.071 × T² + 2.213 × T + 141.56,
wherein V represents the visibility, N_D represents the water droplet number concentration, and T represents the current temperature.
CN202010050327.5A 2020-01-17 2020-01-17 Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning Active CN111274911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010050327.5A CN111274911B (en) 2020-01-17 2020-01-17 Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010050327.5A CN111274911B (en) 2020-01-17 2020-01-17 Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning

Publications (2)

Publication Number Publication Date
CN111274911A true CN111274911A (en) 2020-06-12
CN111274911B CN111274911B (en) 2020-12-01

Family

ID=71003505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010050327.5A Active CN111274911B (en) 2020-01-17 2020-01-17 Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning

Country Status (1)

Country Link
CN (1) CN111274911B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1189231A (en) * 1995-05-30 1998-07-29 小维克多J·诺里斯 System for enhancing navigation and surveillance in low visibility conditions
CN1603792A (en) * 2003-10-02 2005-04-06 C.R.F.阿西安尼顾问公司 Detection device to be installed on a road for detecting environmental conditions and monitoring and controlling traffic
WO2011009945A2 (en) * 2009-07-23 2011-01-27 Medfield Diagnostics Ab Classification of microwave scattering data
CN102253013A (en) * 2011-04-29 2011-11-23 陈伟 Transmission method visibility detection device and method applied in field of transportation
CN202075226U (en) * 2011-04-29 2011-12-14 陈伟 Transmission type visibility detection device used in traffic field
CN102879309A (en) * 2012-09-22 2013-01-16 华南理工大学 Gas particle concentration measurement method and device on basis of broadband linear frequency modulation ultrasound
CN104634740A (en) * 2013-11-12 2015-05-20 中国电信股份有限公司 Monitoring method and monitoring device of haze visibility
CN103617617A (en) * 2013-12-05 2014-03-05 淮海工学院 Underwater image quality evaluating and measuring method based on power spectrum description
CN105426837A (en) * 2015-11-17 2016-03-23 南京大学 Haze detection system based on mobile network vision
CN105931220A (en) * 2016-04-13 2016-09-07 南京邮电大学 Dark channel experience and minimal image entropy based traffic smog visibility detection method
CN107907508A (en) * 2017-09-26 2018-04-13 无锡昊瑜节能环保设备有限公司 A kind of atmospheric visibility measuring device
US20190293572A1 (en) * 2018-03-20 2019-09-26 Ramot At Tel-Aviv University Ltd. Methos, system and computer program product for generating a two dimensional fog map from cellular communication network information
CN109214331A (en) * 2018-08-30 2019-01-15 南京邮电大学 A kind of traffic haze visibility detecting method based on image spectrum
CN109255100A (en) * 2018-09-10 2019-01-22 河海大学 A kind of Urban Rain inversion algorithm based on microwave attenuation characteristic response fingerprint recognition
CN109374488A (en) * 2018-11-14 2019-02-22 国网安徽省电力有限公司电力科学研究院 Prediction technique for photovoltaic power station haze weather visibility
CN110659593A (en) * 2019-09-09 2020-01-07 南京邮电大学 Urban haze visibility detection method based on improved DiracNet

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MAGED ABDULLAH ESMAIL ET AL.: ""An Experimental Study of FSO Link Performance in Desert Environment"", 《IEEE COMMUNICATIONS LETTERS》 *
卢昌胜 et al.: "Research on the attenuation of THz wave propagation by clouds and fog", 《电波科学学报》 *
廖碧婷 et al.: "Characteristics of liquid water content under low visibility analyzed with a microwave radiometer", 《中国环境科学》 *
沈广德 et al.: "Study on the influence of sea fog on 3-mm wave propagation characteristics", 《红外与毫米波学报》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113834656A (en) * 2021-08-27 2021-12-24 西安电子科技大学 Bearing fault diagnosis method, system, equipment and terminal
CN113834656B (en) * 2021-08-27 2024-04-30 西安电子科技大学 Bearing fault diagnosis method, system, equipment and terminal
CN114814993A (en) * 2022-03-25 2022-07-29 河海大学 Microwave attenuation snowfall intensity monitoring method based on DCGAN and 2D-CNN
CN115688544A (en) * 2022-06-20 2023-02-03 河海大学 MLP neural network-based microwave attenuation snow field chromatography reconstruction method

Also Published As

Publication number Publication date
CN111274911B (en) 2020-12-01

Similar Documents

Publication Publication Date Title
CN111274911B (en) Dense fog monitoring method based on wireless microwave attenuation characteristic transfer learning
Bulakh et al. Time series classification based on fractal properties
CN112560803A (en) Radar signal modulation identification method based on time-frequency analysis and machine learning
CN105844279A (en) Depth learning and SIFT feature-based SAR image change detection method
CN116879297B (en) Soil moisture collaborative inversion method, device, equipment and medium
Fetterer et al. Sea ice type maps from Alaska synthetic aperture radar facility imagery: An assessment
Chen et al. Spatial–temporal convolutional gated recurrent unit network for significant wave height estimation from shipborne marine radar data
CN113534120B (en) Multi-target constant false alarm rate detection method based on deep neural network
CN114595732B (en) Radar radiation source sorting method based on depth clustering
CN110427878A (en) A kind of sudden and violent signal recognition method of Rapid Radio and system
CN113139594A (en) Airborne image unmanned aerial vehicle target self-adaptive detection method
CN110531039B (en) Grain mildew detection method and device based on WiFi equipment
CN111291818B (en) Non-uniform class sample equalization method for cloud mask
CN115546608A (en) Unmanned aerial vehicle data link electromagnetic interference classification and threat assessment method
CN110516552B (en) Multi-polarization radar image classification method and system based on time sequence curve
Sinha et al. Estimation of Doppler profile using multiparameter cost function method
CN113608193A (en) Radar multi-target distance and speed estimation method based on UNet
Puno et al. Vision system for soil nutrient detection using fuzzy logic
CN105046707B (en) SAR image ship detection method based on n-order polynomial Function Fitting sea clutter
Pelliza et al. Optimal Canny’s parameters regressions for coastal line detection in satellite-based SAR images
CN107644413A (en) SAR image change region detection method based on neighborhood ratio and from step study
Scott Assessment of categorical triple collocation for sea ice/open water observations: Application to the Gulf of Saint Lawrence
CN116010798A (en) YOLOv 5-based rapid identification and classification method for complex typical communication signals
CN115755020A (en) Tracking method for scanning position of mechanical scanning radar antenna
CN109406911B (en) Low-frequency channel radio frequency interference detection and correction method for satellite-borne microwave sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant