CN112949934A - Short-term heavy rainfall prediction method based on deep learning - Google Patents
- Publication number
- CN112949934A (application CN202110317764.3A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- convolution
- network model
- gru neural
- rainfall
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06Q10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G01S13/95 — Radar or analogous systems specially adapted for meteorological use
- G01W1/10 — Devices for predicting weather conditions
- G06N3/044 — Recurrent networks, e.g. Hopfield networks
- G06N3/045 — Combinations of networks
- G06N3/048 — Activation functions
- G06N3/08 — Learning methods
- G06Q50/26 — Government or public services
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention relates to a short-term heavy rainfall prediction method based on deep learning. The method forms an actual rainfall data set of a target area from radar echo map sequences and actual rainfall values collected at different acquisition times, and normalizes that data set. The normalized radar echo map sequence corresponding to each acquisition time is input into a pre-constructed 3D convolution-GRU neural network model, and the model output is taken as the rainfall prediction value for that acquisition time; continuous training yields an optimized 3D convolution-GRU neural network model. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized 3D convolution-GRU neural network model, and the output of the optimized model is taken as the rainfall prediction value of the target area for a future time period, thereby realizing short-term heavy rainfall prediction for the target area.
Description
Technical Field
The invention relates to the technical field of computer vision and meteorological service, in particular to a short-time heavy rainfall prediction method based on deep learning.
Background
Short-term heavy rainfall is a weather process characterized by sudden onset, short rainfall duration, and large rainfall amount. Because the meteorological disasters it causes are usually very hard to guard against, its social harm is extremely large, and the natural disasters it triggers occur year after year, seriously threatening people's lives and property. Accurate prediction of short-term heavy rainfall is therefore of great significance for disaster prevention and mitigation.
In existing short-term heavy rainfall prediction methods, radar echo extrapolation is generally the main technical means of nowcasting: according to the echo data detected by a weather radar, the intensity distribution of the echo and the moving speed and direction of the echo body (such as a rainfall area) are determined, and the radar echo state after a certain period of time is then predicted by linear or nonlinear extrapolation of the echo body.
Chinese invention patent CN105046089B discloses a method for predicting heavy rainfall and flood disasters. It collects event time-series data, constructs a rainfall sequence from the total rainfall data of each historical month, and predicts the total rainfall of a future month by combining a fuzzy subtractive clustering algorithm, statistical learning, a selective structural risk minimization theory, and cluster projection. The scheme clusters the training set with a fuzzy clustering algorithm and determines the number of clusters by the selective structural risk minimization theory, making the clustering result more accurate and ensuring the effectiveness and accuracy of the prediction result.
However, the prediction method of patent CN105046089B has the following disadvantage: it can only predict the total rainfall of a future month, so it cannot predict short-term heavy rainfall weather with sudden onset, short rainfall duration, and large rainfall amount.
Disclosure of Invention
In view of the above prior art, the technical problem to be solved by the invention is to provide a short-term heavy rainfall prediction method based on deep learning.
The technical scheme adopted by the invention for solving the technical problems is as follows: the short-time heavy rainfall prediction method based on deep learning is characterized by comprising the following steps of S1-S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values; in the target area actual rainfall data set, radar echo diagram sequences at the same acquisition time are in one-to-one correspondence with rainfall actual values;
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
step S3, a 3D convolution-GRU neural network model is constructed in advance; the 3D convolution-GRU neural network model comprises a 3D convolution neural network and a GRU neural network, wherein the input of the 3D convolution-GRU neural network model is the input of the 3D convolution neural network, the output of the 3D convolution neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
step S4, taking the radar echo diagram sequence of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
and step S5, acquiring a radar echo diagram sequence of the target area at the current moment, performing normalization processing on each radar echo diagram in the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
In step S2, before normalizing each radar echo map in the target area actual rainfall data set, the method further includes:
processing each radar echo map in the target area actual rainfall data set into a gray-scale map through a linear transformation; the linear transformation is g′(d, e) = K·g(d, e) + B, where g(d, e) denotes the pixel value of the acquired radar echo image, K denotes the slope, B denotes the intercept, and g′(d, e) denotes the corresponding pixel value of the gray-scale map after the linear transformation;
and filtering the obtained gray level image by adopting a bilinear filter, and taking the gray level image after filtering as a radar echo image needing normalization processing.
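The pre-processing pipeline just described (linear transform to gray scale, filtering, then normalization) can be sketched as follows. This is a minimal illustration: the values of K and B, the use of a simple 3x3 mean filter as a stand-in for the filtering stage, and min-max normalization to [0, 1] are all assumptions, since the patent fixes only the form of the linear transform.

```python
import numpy as np

def preprocess_echo_map(img, K=1.2, B=10.0):
    """Gray-scale linear transform g'(d,e) = K*g(d,e) + B, a stand-in
    smoothing filter, then min-max normalization (illustrative choices)."""
    # Step 1: linear transformation, clipped to the 8-bit range
    gray = np.clip(K * img.astype(np.float64) + B, 0.0, 255.0)
    # Step 2: simple 3x3 mean filter as a placeholder for the patent's
    # filtering stage (the exact filter is not specified numerically)
    padded = np.pad(gray, 1, mode="edge")
    smoothed = sum(
        padded[1 + dy:1 + dy + gray.shape[0], 1 + dx:1 + dx + gray.shape[1]]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    # Step 3: min-max normalization to [0, 1]
    lo, hi = smoothed.min(), smoothed.max()
    return (smoothed - lo) / (hi - lo) if hi > lo else np.zeros_like(smoothed)
```

Each radar echo map in the data set would be passed through such a function before being assembled into the normalized data set used for training.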
Further, in the method for predicting short-term heavy rainfall based on deep learning, in step S3, the 3D convolutional neural network of the 3D convolution-GRU neural network model is constructed as follows:
v_{ij}^{xyz} = σ( b_{ij} + Σ_m Σ_{p=0}^{P_i−1} Σ_{q=0}^{Q_i−1} Σ_{r=0}^{R_i−1} w_{ijm}^{pqr} · v_{(i−1)m}^{(x+p)(y+q)(z+r)} )
where v_{ij}^{xyz} denotes the output of the j-th feature map of the i-th layer of neurons of the 3D convolutional neural network; x and y denote the spatial dimensions of the normalized radar echo maps input into the 3D convolutional neural network, and z denotes the time dimension of the normalized radar echo map sequence input into the 3D convolutional neural network; σ(·) denotes the activation function; b_{ij} denotes the bias of the j-th feature map of the i-th layer of neurons; p, q and r are convolution indices, and P_i, Q_i and R_i denote the size of the convolution kernel of the i-th layer; w_{ijm}^{pqr} denotes the weight of the (p, q, r)-th neuron connection to the m-th feature map; and v_{(i−1)m} denotes the m-th feature map of layer i−1, computed from the normalized radar echo map sequence input into the 3D convolutional neural network.
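The 3D convolution of step S3 can be sketched directly in numpy by looping over the output coordinates. This is a didactic, unoptimized sketch of the standard 3D convolution formula; the choice of tanh as the activation σ(·) is an assumption, since the patent leaves σ(·) generic.

```python
import numpy as np

def conv3d_feature_map(v_prev, w, b, sigma=np.tanh):
    """One output feature map of a 3D convolutional layer.

    v_prev: (M, Z, Y, X) array -- the M feature maps of layer i-1
    w:      (M, R, Q, P) array -- kernel weights w^{pqr}_{ijm}
    b:      scalar bias b_ij
    """
    M, Z, Y, X = v_prev.shape
    _, R, Q, P = w.shape
    out = np.empty((Z - R + 1, Y - Q + 1, X - P + 1))
    for z in range(out.shape[0]):
        for y in range(out.shape[1]):
            for x in range(out.shape[2]):
                # b_ij + sum over m, p, q, r of w * v_(i-1)m
                acc = b + sum(
                    np.sum(w[m] * v_prev[m, z:z + R, y:y + Q, x:x + P])
                    for m in range(M)
                )
                out[z, y, x] = sigma(acc)
    return out
```

In practice a framework primitive (e.g. a library 3D convolution layer) would replace the explicit loops, but the arithmetic per output voxel is exactly this weighted sum plus bias followed by the activation.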
Still further, in the method for predicting short-term heavy rainfall based on deep learning, in step S3, the GRU neural network of the 3D convolution-GRU neural network model is constructed as follows:
Z_t = σ(W_Z·[h_{t−1}, X_t]);
r_t = σ(W_r·[h_{t−1}, X_t]);
h′_t = tanh(W·[r_t * h_{t−1}, X_t]);
h_t = (1 − Z_t) * h_{t−1} + Z_t * h′_t;
where σ(·) denotes the activation function; W_Z denotes the weight of the update gate Z_t, W_r denotes the weight of the reset gate r_t, and W denotes the weight of the candidate state h′_t; h_t denotes the output of the current neural unit in the GRU neural network, and h_{t−1} denotes the output of the previous neural unit; X_t denotes the input of the current neural unit; W_Z·[h_{t−1}, X_t] denotes applying the weight W_Z to the concatenation of the output h_{t−1} and the input X_t; h′_t denotes the candidate state, in which the reset gate r_t controls the amount of information obtained from the output h_{t−1}; and tanh(·) denotes the commonly used hyperbolic tangent activation function.
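The GRU equations above can be checked with a small numpy implementation of a single time step. The name W_h for the candidate-state weight is introduced here for illustration; the interpretation of [h, X] as concatenation follows the usual GRU convention.

```python
import numpy as np

def sigmoid(k):
    return 1.0 / (1.0 + np.exp(-k))

def gru_step(x_t, h_prev, W_z, W_r, W_h):
    """One GRU time step. Each weight matrix has shape
    (hidden_size, hidden_size + input_size), matching the
    concatenation [h_{t-1}, X_t]."""
    hx = np.concatenate([h_prev, x_t])
    z_t = sigmoid(W_z @ hx)                                      # update gate Z_t
    r_t = sigmoid(W_r @ hx)                                      # reset gate r_t
    h_cand = np.tanh(W_h @ np.concatenate([r_t * h_prev, x_t]))  # candidate h'_t
    return (1 - z_t) * h_prev + z_t * h_cand                     # new state h_t
```

Note how the update gate Z_t blends the old state and the candidate: with Z_t near 0 the unit keeps h_{t−1} almost unchanged, and with Z_t near 1 it is overwritten by h′_t.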
Further improved, in the method for predicting short-term heavy rainfall based on deep learning, in step S4, an optimized 3D convolution-GRU neural network model is trained as follows:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
where Γ denotes the loss function value of the 3D convolution-GRU neural network model, y(t) denotes the actual rainfall value of the target area at acquisition time t, and y′(t) denotes the rainfall prediction value of the target area output by the 3D convolution-GRU neural network model at acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
Preferably, in the method for predicting short-term heavy rainfall based on deep learning, in step S1, radar echo map sequences of the target area at different acquisition times within 24h are acquired in advance according to an acquisition frequency of 1 frame/6 min.
Compared with the prior art, the invention has the advantages that:
Firstly, the invention collects radar echo map sequences and actual rainfall values of the target area at different acquisition times in advance to form a target area actual rainfall data set, and then normalizes that data set. The normalized radar echo map sequence corresponding to each acquisition time is used as the input of a pre-constructed 3D convolution-GRU neural network model, and the model output is taken as the rainfall prediction value for that acquisition time; the 3D convolution-GRU neural network model is thereby continuously trained on the radar echo map sequences in the normalized data set until an optimized 3D convolution-GRU neural network model is obtained. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized model, and the output is taken as the rainfall prediction value of the target area for a future time period. Short-term rainfall forecasting for the target area is thus realized, which has important application value and practical significance for improving the accuracy of meteorological early warning and mitigating the natural disasters caused by heavy rainfall.
Secondly, the method can adjust the acquisition frequency of the radar echo maps of the target area as required, thereby realizing heavy rainfall forecasts for different future times on demand, and thus has high practicability.
Drawings
FIG. 1 is a schematic flow chart of a short-term heavy rainfall prediction method based on deep learning according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a 3D convolutional-GRU neural network model constructed in an embodiment of the present invention;
FIG. 3 is a predicted output of a radar echo image sequence within 2 hours in the future using a conventional short-term heavy rainfall prediction method;
FIG. 4 is a predicted output of a radar echo image sequence within 2 hours in the future, obtained by using a short-time heavy rainfall prediction method based on deep learning in an embodiment of the present invention;
fig. 5 is a real output of a sequence of radar echo images of a target region acquired from a weather station over the next 2 hours.
Detailed Description
The invention is described in further detail below with reference to the accompanying examples.
This embodiment provides a short-term heavy rainfall prediction method based on deep learning. Specifically, referring to FIG. 1, the method of this embodiment includes the following steps S1 to S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values;
in this embodiment, an "acquisition time" is a time before the current time. Specifically, assume the target area is A and the acquisition times are t_1, t_2, …, t_M; the formed target area actual rainfall data set is denoted List, in which, for each acquisition time from t_1 to t_M, the radar echo map sequence of the target area A collected at that time is stored together with the corresponding actual rainfall value of the target area A at that time.
Since a radar echo map sequence is a sequence of images comprising images at different heights, H_w here denotes the w-th height value of the target area A, where 1 ≤ w ≤ W and W is the total number of height values at which radar echo maps of the target area A are acquired; accordingly, a single radar echo map of the target area A is acquired at each time t and each height H_w;
as is well known to those skilled in the art, when predicting rainfall with radar echo map sequences, the radar echo map sequence at a certain time reflects the rainfall condition at that time; that is, the radar echo map sequence and the rainfall at the same time correspond to each other. In the target area actual rainfall data set List of this embodiment, therefore, the radar echo map sequence at each acquisition time t_1, t_2, …, t_M corresponds one-to-one with the actual rainfall value at that same acquisition time;
in this embodiment, radar echo map sequences of the target area A at different heights and different acquisition times within 24 h are collected in advance at an acquisition frequency of 1 frame per 6 min. With this acquisition frequency, 10 radar echo maps are collected within 1 h, and 240 radar echo maps are collected within 24 h (i.e., one day);
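The frame counts stated above follow directly from the acquisition frequency, as this small sanity check shows:

```python
# Sanity check of the acquisition schedule: one radar echo frame every 6 minutes.
MINUTES_PER_FRAME = 6
frames_per_hour = 60 // MINUTES_PER_FRAME   # 10 echo maps per hour
frames_per_day = 24 * frames_per_hour       # 240 echo maps per 24 h
```

Raising or lowering MINUTES_PER_FRAME is how the acquisition frequency would be adjusted to target forecasts at different future times, as the advantages section notes.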
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
specifically, in step S2, each radar echo map in the obtained target area actual rainfall data set List needs to be normalized; that is, every radar echo map in each radar echo map sequence is normalized, so that the normalized target area actual rainfall data set List′ is obtained through the normalization processing;
specifically, in the normalized target area actual rainfall data set List′, for each acquisition time t_1, t_2, …, t_M, the normalized radar echo map sequence corresponds one-to-one with the actual rainfall value at that acquisition time, each normalized radar echo map being the normalized version of the correspondingly acquired radar echo map;
it should be noted that noise interference usually exists in the actual acquisition of the radar echo map sequences. Therefore, in order to eliminate the adverse effect of noise on the acquired radar echo maps, in this step S2 a noise cancellation process may also be performed before the normalization of the radar echo images. For example, the noise cancellation process here includes: processing each radar echo map in the target area actual rainfall data set List into a gray-scale map through a linear transformation; the linear transformation is g′(d, e) = K·g(d, e) + B, where g(d, e) denotes the pixel value of the acquired radar echo image, K denotes the slope, B denotes the intercept, and g′(d, e) denotes the corresponding pixel value of the gray-scale map after the linear transformation;
and filtering the obtained gray level image by adopting a bilinear filter, and taking the gray level image after filtering as a radar echo image needing normalization processing.
When K > 1, the transformation increases the contrast of the image: the pixel values all increase after conversion and the overall display effect is enhanced. When K = 1, it is often used to adjust image brightness (via the intercept B). When 0 < K < 1, the effect is the opposite of K > 1: both the contrast and the overall effect of the image are weakened. When K < 0, the brighter areas of the source image become dark and the darker areas become bright; in particular, with K = −1 and B = 255 the image achieves a reverse-color (inversion) effect;
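The effect of the slope K and intercept B can be demonstrated on a few sample pixel values. The clipping to the 8-bit range is an assumption added for display purposes:

```python
import numpy as np

def linear_transform(g, K, B):
    """g'(d,e) = K*g(d,e) + B, clipped to the 8-bit display range."""
    return np.clip(K * np.asarray(g, dtype=np.float64) + B, 0, 255)

g = np.array([0.0, 100.0, 200.0])
brighter = linear_transform(g, 1.5, 0)    # K > 1: values spread upward
shifted = linear_transform(g, 1.0, 30)    # K = 1: pure brightness shift
inverted = linear_transform(g, -1, 255)   # K = -1, B = 255: reverse color
```

Here `inverted` maps 0 to 255 and 200 to 55, i.e. bright regions become dark and vice versa, matching the reverse-color case described above.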
step S3, a 3D convolution-GRU neural network model is constructed in advance; referring to fig. 2, the 3D convolution-GRU neural network model includes a 3D convolution neural network and a GRU neural network, an input of the 3D convolution-GRU neural network model is an input of the 3D convolution neural network, an output of the 3D convolution neural network is an input of the GRU neural network, and an output of the GRU neural network is an output of the 3D convolution-GRU neural network model;
assume, initially, that the constructed 3D convolution-GRU neural network model is labeled Conv3D_GRU_0, the 3D convolutional neural network in the initially constructed model is labeled Conv3D_0, and the GRU neural network in the initially constructed model is labeled GRU_0;
for example, in this embodiment, the 3D convolutional neural network of the constructed 3D convolution-GRU neural network model Conv3D_GRU_0 is as follows:
v_{ij}^{xyz} = σ( b_{ij} + Σ_m Σ_{p=0}^{P_i−1} Σ_{q=0}^{Q_i−1} Σ_{r=0}^{R_i−1} w_{ijm}^{pqr} · v_{(i−1)m}^{(x+p)(y+q)(z+r)} )
where v_{ij}^{xyz} denotes the output of the j-th feature map of the i-th layer of neurons of the 3D convolutional neural network; x and y denote the spatial dimensions of the normalized radar echo maps input into the network, and z denotes the time dimension of the input sequence; σ(·) denotes the activation function; b_{ij} denotes the bias of the j-th feature map of the i-th layer of neurons; p, q and r are convolution indices, and P_i, Q_i and R_i denote the size of the convolution kernel of the i-th layer; w_{ijm}^{pqr} denotes the weight of the (p, q, r)-th neuron connection to the m-th feature map; and v_{(i−1)m} denotes the m-th feature map of layer i−1, computed from the normalized radar echo map sequence. In this embodiment, the 3D convolutional neural network used consists of 1 input layer, 3 three-dimensional convolutional layers, and 3 three-dimensional pooling layers;
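The "3 convolutional + 3 pooling layers" structure can be sanity-checked by propagating tensor shapes through the stack. The input size (10 frames of 64x64 echo maps), the 3x3x3 kernels, and the 1x2x2 pooling windows are all illustrative assumptions; the patent does not fix these sizes.

```python
def conv3d_valid(shape, kernel):
    """Output size of a valid (no-padding) 3D convolution."""
    return tuple(s - k + 1 for s, k in zip(shape, kernel))

def pool3d(shape, window):
    """Output size of non-overlapping 3D pooling (floor division)."""
    return tuple(s // w for s, w in zip(shape, window))

# Assumed input: 10 time frames of 64x64 normalized echo maps,
# passed through 3 x (3x3x3 conv + 1x2x2 pooling).
shape = (10, 64, 64)  # (time, height, width)
for _ in range(3):
    shape = conv3d_valid(shape, (3, 3, 3))
    shape = pool3d(shape, (1, 2, 2))
```

Under these assumptions the spatial dimensions shrink from 64x64 to 6x6 while 4 time steps survive, giving a compact spatio-temporal feature volume to feed into the GRU stage.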
the GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:
Z_t = σ(W_Z·[h_{t−1}, X_t]);
r_t = σ(W_r·[h_{t−1}, X_t]);
h′_t = tanh(W·[r_t * h_{t−1}, X_t]);
h_t = (1 − Z_t) * h_{t−1} + Z_t * h′_t;
where σ(·) denotes the activation function; for example, the activation function adopted here is the commonly used sigmoid function, i.e., σ(k) = 1/(1 + e^{−k}), where k is a variable; W_Z denotes the weight of the update gate Z_t, W_r denotes the weight of the reset gate r_t, and W denotes the weight of the candidate state h′_t; h_t denotes the output of the current neural unit in the GRU neural network, h_{t−1} denotes the output of the previous neural unit, and X_t denotes the input of the current neural unit; W_Z·[h_{t−1}, X_t] denotes applying the weight W_Z to the concatenation of the output h_{t−1} and the input X_t; h′_t denotes the candidate state, in which the reset gate r_t controls the amount of information obtained from the output h_{t−1}; and tanh(·) denotes the commonly used hyperbolic tangent activation function;
step S4, taking the radar echo map of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
specifically, when this step S4 is executed, the following processing is executed:
the normalized radar echo map sequence corresponding to acquisition time t_1 in the normalized target area actual rainfall data set List′ is taken as the input of the 3D convolution-GRU neural network model, and the output of the model is taken as the rainfall prediction value for the acquisition time t_1;
the normalized radar echo map sequence corresponding to acquisition time t_2 in List′ is taken as the input of the model, and the output of the model is taken as the rainfall prediction value for the acquisition time t_2;
and so on;
the normalized radar echo map sequence corresponding to acquisition time t_M in List′ is taken as the input of the model, and the output of the model is taken as the rainfall prediction value for the acquisition time t_M;
The normalized radar echo map sequences for acquisition times t1 through tM are thus used in turn to train the 3D convolution-GRU neural network model, yielding the optimized 3D convolution-GRU neural network model. In step S4 of this embodiment, the optimized 3D convolution-GRU neural network model is trained as follows:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
wherein, Γ represents a loss function value of the 3D convolution-GRU neural network model, y (t) represents an actual rainfall value of the target area at the acquisition time t, and y' (t) represents a predicted rainfall value of the target area output by the 3D convolution-GRU neural network model at the acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
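The loop of steps S41-S44 can be sketched in Python. The patent does not print the loss formula, so mean squared error over the acquisition times is an assumption here, and `loss_of_epoch`, `tol`, and `patience` are hypothetical names for the per-pass loss computation and the "preset small numerical range":

```python
def loss_gamma(y_true, y_pred):
    # Assumed form of the loss: mean squared error between the actual
    # rainfall values y(t) and the predicted values y'(t).
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

def train_until_stable(loss_of_epoch, tol=1e-4, patience=3, max_epochs=1000):
    # Repeat steps S41-S43 (one training pass returning the loss value)
    # until the loss change stays below `tol` for `patience` consecutive
    # epochs -- the "change of the loss function value is stable" test
    # of step S44.
    history, stable = [], 0
    for _ in range(max_epochs):
        loss = loss_of_epoch()
        stable = stable + 1 if history and abs(loss - history[-1]) < tol else 0
        history.append(loss)
        if stable >= patience:
            break
    return history
```

In practice the stabilization tolerance and patience would be tuned per data set; the callable interface simply stands in for one forward/backward pass over the normalized training sequences.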
For example, first, the rainfall prediction values output by the 3D convolution-GRU neural network model at acquisition times t1, t2, ……, tM, namely y'(t1), y'(t2), ……, y'(tM), are obtained. Then, the actual rainfall values y(t1), y(t2), ……, y(tM) of target area A at these acquisition times are obtained. Thirdly, the loss function of the 3D convolution-GRU neural network model is constructed, and its loss function value Γ is computed from these predicted and actual values.
when the change of the loss function value Γ of the 3D convolution-GRU neural network model is stable, namely the variation of Γ stays within a preset small numerical range, the 3D convolution-GRU neural network model is taken as the optimized 3D convolution-GRU neural network model; otherwise, the rainfall prediction value output by the 3D convolution-GRU neural network model at each acquisition time is obtained again, and the subsequent steps are executed in sequence.
And step S5, acquiring a radar echo diagram sequence of the target area at the current moment, executing normalization processing on the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
Since an optimized 3D convolution-GRU neural network model has been obtained through the training process of step S4, at this time, the radar echo map sequence of the target area a at the current time is collected again, normalization processing is performed on each radar echo map in the radar echo map sequence, the radar echo map sequence after normalization processing is input into the optimized 3D convolution-GRU neural network model, and the output of the optimized 3D convolution-GRU neural network model is taken as the rainfall prediction value of the target area in the future time period.
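Step S5 can be sketched as below. The patent does not print the normalization formula, so min-max scaling over the whole sequence is assumed here, and the optimized model is treated as an opaque callable (a hypothetical interface):

```python
def normalize_sequence(echo_maps):
    # Min-max normalization over the whole sequence (assumed scheme);
    # echo_maps is a list of 2-D radar echo maps (lists of pixel rows).
    flat = [v for frame in echo_maps for row in frame for v in row]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1.0  # guard against a constant sequence
    return [[[(v - lo) / span for v in row] for row in frame]
            for frame in echo_maps]

def predict_future_rainfall(model, echo_maps):
    # Step S5: normalize the current radar echo map sequence, feed it to
    # the optimized 3D convolution-GRU model, and return its output as
    # the rainfall prediction for the future time period.
    return model(normalize_sequence(echo_maps))
```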
As is well known to those skilled in the art, when predicting rainfall conditions from radar echo map sequences, the sequence at a given time reflects the rainfall condition at that time; that is, the radar echo map sequence and the rainfall at the same time are in a corresponding relationship. Therefore, in order to compare the performance of different short-term heavy rainfall prediction methods more intuitively, this embodiment characterizes the rainfall condition in a predicted future time period by the radar echo map sequence in that period. Specifically, for a comparative illustration of the performance of the short-term heavy rainfall prediction method based on deep learning in this embodiment, this embodiment provides the output image sequence of radar echo maps within the next 2 hours predicted by the method of this embodiment (see fig. 3), the output image sequence predicted by a conventional method based on a 2D convolutional neural network (see fig. 4), and the true image sequence of radar echo maps within the next 2 hours (see fig. 5).
As can be seen from the comparison of fig. 3, fig. 4, and fig. 5, the output image sequence predicted by the short-term heavy rainfall prediction method of this embodiment is clearer, better captures the time-dimension and space-dimension features of the radar echo maps of the target area at different heights, and predicts the future rainfall condition more accurately.
Although preferred embodiments of the present invention have been described in detail hereinabove, it should be clearly understood that modifications and variations of the present invention will be apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (6)
1. The short-time heavy rainfall prediction method based on deep learning is characterized by comprising the following steps of S1-S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values; in the target area actual rainfall data set, radar echo diagram sequences at the same acquisition time are in one-to-one correspondence with rainfall actual values;
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
step S3, a 3D convolution-GRU neural network model is constructed in advance; the 3D convolution-GRU neural network model comprises a 3D convolution neural network and a GRU neural network, wherein the input of the 3D convolution-GRU neural network model is the input of the 3D convolution neural network, the output of the 3D convolution neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
step S4, taking the radar echo diagram sequence of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
and step S5, acquiring a radar echo diagram sequence of the target area at the current moment, performing normalization processing on each radar echo diagram in the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
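The data flow of the claim-1 model (model input → 3D CNN → GRU → model output) amounts to a simple composition of the two sub-networks; `conv3d` and `gru` below are hypothetical callables standing in for them:

```python
def model_forward(echo_seq, conv3d, gru):
    # The input of the model is the input of the 3D CNN; the output of
    # the 3D CNN is the input of the GRU; the output of the GRU is the
    # output of the model.
    features = conv3d(echo_seq)
    return gru(features)
```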
2. The method for forecasting short-term heavy rainfall based on deep learning of claim 1, wherein in step S2, before the normalization process is performed on each radar echo map in the target area actual rainfall data set, the method further comprises:
processing each radar echo map in the target area actual rainfall data set into a gray scale map through linear transformation; wherein the formula of the linear transformation is g'(d, e) = K·g(d, e) + B, where g(d, e) represents the pixel value of the acquired radar echo map, K represents the slope, B represents the intercept, and g'(d, e) represents the corresponding pixel value of the gray scale map after the linear transformation;
and filtering the obtained gray level image by adopting a bilinear filter, and taking the gray level image after filtering as a radar echo image needing normalization processing.
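The linear transformation of claim 2 applies element-wise to the pixel values of a radar echo map; a minimal sketch (the subsequent bilinear filtering step is omitted here):

```python
def to_gray(pixels, K, B):
    # g'(d, e) = K * g(d, e) + B for every pixel position (d, e).
    return [[K * g + B for g in row] for row in pixels]
```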
3. The method for forecasting short-term heavy rainfall based on deep learning of claim 2, wherein in step S3, the 3D convolutional neural network of the constructed 3D convolutional-GRU neural network model is as follows:
v_{ij}^{xyz} = σ( b_{ij} + Σ_m Σ_{p=0}^{P_i-1} Σ_{q=0}^{Q_i-1} Σ_{r=0}^{R_i-1} w_{ijm}^{pqr} · v_{(i-1)m}^{(x+p)(y+q)(z+r)} );
wherein v_{ij}^{xyz} represents the output of the jth feature map of the layer-i neurons of the 3D convolutional neural network, x and y respectively represent the spatial dimensions of a normalized radar echo map input into the 3D convolutional neural network, z represents the time dimension of the sequence of normalized radar echo maps input into the 3D convolutional neural network, σ(·) represents the activation function, b_{ij} represents the bias of the jth feature map of the layer-i neurons of the 3D convolutional neural network, p, q, and r respectively represent the convolution offsets, P_i, Q_i, and R_i respectively represent the sizes of the convolution kernel in the 3D convolutional neural network, w_{ijm}^{pqr} represents the weight of the (p, q, r)-th connection to the mth feature map of the previous layer, and v_{(i-1)m}^{(x+p)(y+q)(z+r)} represents the corresponding value of the mth feature map of layer i-1 for the sequence of normalized radar echo maps input into the 3D convolutional neural network.
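A single output value of the claim-3 formula can be evaluated directly with nested loops over the feature maps m and the kernel offsets (p, q, r). This sketch assumes the logistic function for σ (the claim does not fix the activation), and `prev`/`w` are hypothetical nested-list layouts for the previous layer's feature maps and the kernel weights:

```python
import math

def conv3d_point(prev, w, b, x, y, z):
    # One output value v_ij^{xyz}: bias b_ij plus the sum over
    # previous-layer feature maps m and kernel offsets (p, q, r),
    # passed through sigma (logistic function assumed here).
    s = b
    for m in range(len(w)):
        for p in range(len(w[m])):
            for q in range(len(w[m][p])):
                for r in range(len(w[m][p][q])):
                    s += w[m][p][q][r] * prev[m][x + p][y + q][z + r]
    return 1.0 / (1.0 + math.exp(-s))
```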
4. The method for forecasting short-term heavy rainfall based on deep learning of claim 3, wherein in step S3, the GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:
Zt=σ(WZ·[ht-1, Xt]);
rt=σ(Wr·[ht-1, Xt]);
h't=tanh(W·[rt*ht-1, Xt]);
ht=(1-Zt)*ht-1+Zt*h't;
where σ(·) denotes the sigmoid activation function, WZ represents the weight of the update gate Zt, Wr represents the weight of the reset gate rt, ht represents the output of the current neural unit in the GRU neural network, ht-1 represents the output of the previous neural unit in the GRU neural network, Xt represents the input of the current neural unit in the GRU neural network, WZ·[ht-1, Xt] indicates that the concatenation of the output ht-1 and the input Xt is multiplied by the weight WZ, h't represents the candidate state obtained by using rt to control how much information is retained from the output ht-1, and tanh(·) represents the commonly used hyperbolic tangent activation function.
5. The method for forecasting short-term heavy rainfall based on deep learning of claim 4, wherein in step S4, the optimized 3D convolution-GRU neural network model is trained by:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
wherein, Γ represents a loss function value of the 3D convolution-GRU neural network model, y (t) represents an actual rainfall value of the target area at the acquisition time t, and y' (t) represents a predicted rainfall value of the target area output by the 3D convolution-GRU neural network model at the acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
6. The method for predicting short-term heavy rainfall based on deep learning of any one of claims 1 to 5, wherein in step S1, radar echo map sequences of the target area at different acquisition times within 24h are acquired in advance at an acquisition frequency of 1 frame/6 min.
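As a quick check of the data volume implied by claim 6, at 1 frame/6 min a 24 h acquisition window yields 240 radar echo maps:

```python
# One radar echo map is acquired every 6 minutes (1 frame/6 min).
frames_per_hour = 60 // 6              # 10 frames each hour
frames_per_day = 24 * frames_per_hour  # 240 maps in a 24 h window
```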
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110317764.3A CN112949934A (en) | 2021-03-25 | 2021-03-25 | Short-term heavy rainfall prediction method based on deep learning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112949934A true CN112949934A (en) | 2021-06-11 |
Family
ID=76228075
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110317764.3A Pending CN112949934A (en) | 2021-03-25 | 2021-03-25 | Short-term heavy rainfall prediction method based on deep learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112949934A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113568068A (en) * | 2021-07-22 | 2021-10-29 | 河南大学 | Strong convection weather prediction method based on MPI parallel three-dimensional neural network |
CN114021349A (en) * | 2021-11-05 | 2022-02-08 | 广东电网有限责任公司广州供电局 | Method, system and device for predicting heavy rainfall and computer storage medium |
CN114091765A (en) * | 2021-11-25 | 2022-02-25 | 山西勇利信息科技有限公司 | Future rainfall prediction method based on space-time bidirectional multi-granularity dynamic integration |
CN114509825A (en) * | 2021-12-31 | 2022-05-17 | 河南大学 | Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm |
CN116520459A (en) * | 2023-06-28 | 2023-08-01 | 成都信息工程大学 | Weather prediction method |
CN116755181A (en) * | 2023-08-11 | 2023-09-15 | 深圳市昆特科技有限公司 | Precipitation prediction method and related device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105046089A (en) * | 2015-08-13 | 2015-11-11 | 电子科技大学 | Method for predicting strong rainfall and flood disasters |
CN109376848A (en) * | 2018-09-01 | 2019-02-22 | 哈尔滨工程大学 | A kind of door control unit neural network of simplification |
CN111476713A (en) * | 2020-03-26 | 2020-07-31 | 中南大学 | Intelligent weather image identification method and system based on multi-depth convolution neural network fusion |
CN112415521A (en) * | 2020-12-17 | 2021-02-26 | 南京信息工程大学 | CGRU (China-swarm optimization and RU-based radar echo nowcasting) method with strong space-time characteristics |
2021-03-25: CN application CN202110317764.3A filed, published as CN112949934A; legal status: Pending
Non-Patent Citations (4)
Title |
---|
刘东 (Liu Dong): "Industrial Machine Vision: Development and Application Based on the Lingshan Platform", Shanghai Education Press, pages: 83 *
陈晓平等 (Chen Xiaoping et al.): "Modeling and Prediction of Rainfall Radar Echo Data Based on Machine Learning", Journal of Nanjing University of Information Science & Technology, vol. 12, no. 4, pages 483-494 *
陈程 (Chen Cheng): "Research and Application of Convolutional Neural Networks in Short-term Weather Nowcasting", China Masters' Theses Full-text Database, Basic Sciences, no. 12, pages 009-16 *
陈颖等 (Chen Ying et al.): "Human Action Recognition Based on 3D Two-Stream Convolutional Neural Network and GRU Network", Computer Applications and Software, no. 5, pages 170-174 *
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113568068A (en) * | 2021-07-22 | 2021-10-29 | 河南大学 | Strong convection weather prediction method based on MPI parallel three-dimensional neural network |
CN113568068B (en) * | 2021-07-22 | 2022-03-29 | 河南大学 | Strong convection weather prediction method based on MPI parallel three-dimensional neural network |
CN114021349A (en) * | 2021-11-05 | 2022-02-08 | 广东电网有限责任公司广州供电局 | Method, system and device for predicting heavy rainfall and computer storage medium |
CN114091765A (en) * | 2021-11-25 | 2022-02-25 | 山西勇利信息科技有限公司 | Future rainfall prediction method based on space-time bidirectional multi-granularity dynamic integration |
CN114509825A (en) * | 2021-12-31 | 2022-05-17 | 河南大学 | Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm |
CN114509825B (en) * | 2021-12-31 | 2022-11-08 | 河南大学 | Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm |
CN116520459A (en) * | 2023-06-28 | 2023-08-01 | 成都信息工程大学 | Weather prediction method |
CN116520459B (en) * | 2023-06-28 | 2023-08-25 | 成都信息工程大学 | Weather prediction method |
CN116755181A (en) * | 2023-08-11 | 2023-09-15 | 深圳市昆特科技有限公司 | Precipitation prediction method and related device |
CN116755181B (en) * | 2023-08-11 | 2023-10-20 | 深圳市昆特科技有限公司 | Precipitation prediction method and related device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112949934A (en) | Short-term heavy rainfall prediction method based on deep learning | |
CN109215034B (en) | Weak supervision image semantic segmentation method based on spatial pyramid covering pooling | |
CN111882002B (en) | MSF-AM-based low-illumination target detection method | |
CN110223323B (en) | Target tracking method based on depth feature adaptive correlation filtering | |
CN110751067B (en) | Dynamic expression recognition method combined with biological form neuron model | |
CN112633497A (en) | Convolutional pulse neural network training method based on reweighted membrane voltage | |
CN108447041B (en) | Multi-source image fusion method based on reinforcement learning | |
CN107451999A (en) | foreign matter detecting method and device based on image recognition | |
CN107229929A (en) | A kind of license plate locating method based on R CNN | |
CN113111758B (en) | SAR image ship target recognition method based on impulse neural network | |
CN113239722B (en) | Deep learning based strong convection extrapolation method and system under multi-scale | |
CN109255304B (en) | Target tracking method based on distribution field characteristics | |
CN107977683A (en) | Joint SAR target identification methods based on convolution feature extraction and machine learning | |
CN100565557C (en) | System for tracking infrared human body target based on corpuscle dynamic sampling model | |
CN108563977A (en) | A kind of the pedestrian's method for early warning and system of expressway entrance and exit | |
CN110084201B (en) | Human body action recognition method based on convolutional neural network of specific target tracking in monitoring scene | |
CN110765948A (en) | Target detection and identification method and system based on unmanned aerial vehicle | |
CN109886387A (en) | It is a kind of that the traffic time sequence forecasting method returned is promoted based on gating network and gradient | |
CN111461213A (en) | Training method of target detection model and target rapid detection method | |
US20130039534A1 (en) | Motion detection method for complex scenes | |
CN109214253A (en) | A kind of video frame detection method and device | |
Xu et al. | A bio-inspired motion sensitive model and its application to estimating human gaze positions under classified driving conditions | |
CN112215334A (en) | Neural network model compression method for event camera | |
CN106127740A (en) | A kind of profile testing method based on the association of visual pathway many orientation of sensory field | |
CN115690557A (en) | Construction safety early warning method and device based on attention mechanism neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication ||
Application publication date: 20210611 |