CN112949934A - Short-term heavy rainfall prediction method based on deep learning - Google Patents

Short-term heavy rainfall prediction method based on deep learning

Info

Publication number
CN112949934A
CN112949934A
Authority
CN
China
Prior art keywords
neural network
convolution
network model
gru neural
rainfall
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110317764.3A
Other languages
Chinese (zh)
Inventor
王仁芳
孙德超
李谦
洪鑫华
梁丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Wanli University
Original Assignee
Zhejiang Wanli University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Wanli University filed Critical Zhejiang Wanli University
Priority to CN202110317764.3A priority Critical patent/CN112949934A/en
Publication of CN112949934A publication Critical patent/CN112949934A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/95Radar or analogous systems specially adapted for specific applications for meteorological use
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01WMETEOROLOGY
    • G01W1/00Meteorology
    • G01W1/10Devices for predicting weather conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention relates to a short-term heavy rainfall prediction method based on deep learning. Target area actual rainfall data sets are formed in advance from radar echo map sequences and actual rainfall values collected at different acquisition times, and are then normalized. For any acquisition time, the corresponding normalized radar echo map sequence is input into a pre-constructed 3D convolution-GRU neural network model, whose output is taken as the rainfall prediction value for that acquisition time; an optimized 3D convolution-GRU neural network model is obtained through continuous training. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized model, and the output of the optimized 3D convolution-GRU neural network model is taken as the predicted rainfall of the target area over a future time period. Short-term heavy rainfall prediction for the target area is thereby realized.

Description

Short-term heavy rainfall prediction method based on deep learning
Technical Field
The invention relates to the technical field of computer vision and meteorological service, in particular to a short-time heavy rainfall prediction method based on deep learning.
Background
Short-term heavy rainfall is a weather process characterized by sudden onset, short duration and large rainfall amount. Because the meteorological disasters it causes are usually difficult to guard against, its social harm is extremely large, and the natural disasters it triggers occur year after year, seriously threatening people's lives and property. Accurate prediction of short-term heavy rainfall is therefore of great significance for disaster prevention and mitigation.
In existing short-term heavy rainfall prediction methods, radar echo extrapolation is generally the main technical means of nowcasting: according to the echo data detected by a weather radar, the intensity distribution of the echo and the moving speed and direction of the echo body (e.g. a rainfall area) are determined, and the radar echo state after a certain period is then predicted by linear or nonlinear extrapolation of the echo body.
Chinese invention patent CN105046089B discloses a method for predicting heavy rainfall and flood disasters. It collects event time-series data, constructs a rainfall sequence from the total rainfall of each historical month, and predicts the total rainfall of a future month by combining a fuzzy subtractive clustering algorithm, statistical learning, a selective structural risk minimization theory and cluster projection. The training set is clustered with the fuzzy clustering algorithm, and the number of clusters is determined by the selective structural risk minimization theory, making the clustering result more accurate and ensuring the validity and accuracy of the prediction.
However, the prediction method of patent CN105046089B has the following disadvantage: it predicts only the total rainfall of a future month, and therefore cannot predict short-term heavy rainfall weather with sudden onset, short duration and large rainfall amount.
Disclosure of Invention
The technical problem to be solved by the invention is to provide, in view of the above prior art, a short-term heavy rainfall prediction method based on deep learning.
The technical solution adopted by the invention to solve this problem is as follows: a short-term heavy rainfall prediction method based on deep learning, characterized by comprising the following steps S1 to S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values; in the target area actual rainfall data set, radar echo diagram sequences at the same acquisition time are in one-to-one correspondence with rainfall actual values;
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
step S3, a 3D convolution-GRU neural network model is constructed in advance; the 3D convolution-GRU neural network model comprises a 3D convolution neural network and a GRU neural network, wherein the input of the 3D convolution-GRU neural network model is the input of the 3D convolution neural network, the output of the 3D convolution neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
step S4, taking the radar echo diagram sequence of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
and step S5, acquiring a radar echo diagram sequence of the target area at the current moment, performing normalization processing on each radar echo diagram in the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
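The data flow of steps S1 to S5 can be sketched end to end. The sketch below uses NumPy stand-ins for the real 3D-CNN and GRU (the function names, array shapes and the min-max normalization are illustrative assumptions, not the patent's actual implementation); it only shows how a raw echo sequence is normalized, reduced to features, and mapped to one rainfall value.

```python
import numpy as np

def normalize(echo_seq):
    """Step S2 stand-in: min-max normalize each radar echo map to [0, 1]."""
    lo, hi = echo_seq.min(), echo_seq.max()
    return (echo_seq - lo) / (hi - lo + 1e-8)

def conv3d_features(echo_seq):
    """Stand-in for the 3D CNN: collapse (time, height, H, W) to a feature vector."""
    return echo_seq.mean(axis=(2, 3)).ravel()  # crude spatial pooling

def gru_readout(features, w):
    """Stand-in for the GRU head: a single linear readout to one rainfall value."""
    return float(features @ w)

def predict_rainfall(echo_seq, w):
    """Steps S2 + S5: normalize a raw echo sequence, then run the model."""
    x = normalize(echo_seq)
    return gru_readout(conv3d_features(x), w)

rng = np.random.default_rng(0)
seq = rng.uniform(0, 70, size=(10, 4, 8, 8))  # 10 frames, 4 heights, 8x8 maps
w = rng.normal(size=10 * 4)
pred = predict_rainfall(seq, w)
```

In the patent's method, `conv3d_features` and `gru_readout` are replaced by the trained 3D convolution-GRU model of steps S3 and S4.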
In step S2, before normalizing each radar echo map in the target area actual rainfall data set, the method further includes:
processing each radar echo map in the target area actual rainfall data set into a gray-scale map through a linear transformation, where the linear transformation is g′(d, e) = K · g(d, e) + B, in which g(d, e) denotes the pixel value of the acquired radar echo image, K denotes the slope, B denotes the intercept, and g′(d, e) denotes the corresponding pixel value of the gray-scale map after the linear transformation;
and filtering the obtained gray-scale map with a bilinear filter, taking the filtered gray-scale map as the radar echo map to be normalized.
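A minimal sketch of this preprocessing: the linear transform g′(d, e) = K · g(d, e) + B followed by a smoothing filter. The patent specifies a "bilinear filter"; here a simple 3x3 mean filter stands in for it, which is an assumption for illustration only.

```python
import numpy as np

def linear_transform(img, K, B):
    """Map each pixel g(d, e) to K * g(d, e) + B, clipped to the 8-bit range."""
    return np.clip(K * img.astype(float) + B, 0, 255)

def mean_filter3(img):
    """3x3 mean filter (edges handled by edge-replication padding);
    a stand-in for the patent's bilinear filter."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dr in range(3):
        for dc in range(3):
            out += padded[dr:dr + img.shape[0], dc:dc + img.shape[1]]
    return out / 9.0

echo = np.array([[10, 20], [30, 40]], dtype=float)
gray = linear_transform(echo, K=2.0, B=5.0)  # K > 1: contrast-stretching example
smooth = mean_filter3(gray)
```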
Further, in the method for predicting short-term heavy rainfall based on deep learning, in step S3, the 3D convolutional neural network of the 3D convolution-GRU neural network model is constructed as follows:

$$v_{ij}^{xyz}=\sigma\Big(b_{ij}+\sum_{m}\sum_{p=0}^{P_i-1}\sum_{q=0}^{Q_i-1}\sum_{r=0}^{R_i-1} w_{ijm}^{pqr}\, v_{(i-1)m}^{(x+p)(y+q)(z+r)}\Big)$$

where v_ij^xyz denotes the output at position (x, y, z) of the j-th feature map of the i-th layer of the 3D convolutional neural network; x and y denote the spatial dimensions of the normalized radar echo maps input into the 3D convolutional neural network, and z denotes the time dimension of the normalized radar echo map sequence input into the network; σ(·) denotes the activation function; b_ij denotes the bias of the j-th feature map of the i-th layer; p, q and r index the convolution offsets; P_i, Q_i and R_i denote the size of the convolution kernel of the i-th layer; w_ijm^pqr denotes the weight of the (p, q, r)-th connection to the m-th feature map; and v_(i-1)m^(x+p)(y+q)(z+r) denotes the corresponding value of the m-th feature map of the previous layer (for the first layer, of the normalized radar echo map sequence input into the network).
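The formula above can be rendered directly as nested loops. The sketch below is a naive "valid" 3D convolution for one output feature map v_ij; the input shapes, kernel size and the ReLU activation are illustrative assumptions.

```python
import numpy as np

def relu(a):
    return np.maximum(a, 0.0)

def conv3d_valid(prev_maps, weights, bias):
    """Naive 3D convolution for one output feature map.

    prev_maps: (M, X, Y, Z) feature maps of layer i-1.
    weights:   (M, P, Q, R) kernel connecting them to output map j.
    Returns the (X-P+1, Y-Q+1, Z-R+1) activated output map v_ij.
    """
    M, X, Y, Z = prev_maps.shape
    _, P, Q, R = weights.shape
    out = np.zeros((X - P + 1, Y - Q + 1, Z - R + 1))
    for x in range(out.shape[0]):
        for y in range(out.shape[1]):
            for z in range(out.shape[2]):
                # sum over m, p, q, r of w_ijm^pqr * v_(i-1)m^(x+p)(y+q)(z+r)
                patch = prev_maps[:, x:x + P, y:y + Q, z:z + R]
                out[x, y, z] = np.sum(weights * patch) + bias
    return relu(out)

v_prev = np.ones((2, 4, 4, 4))       # 2 input feature maps, each 4x4x4
w = np.full((2, 3, 3, 3), 0.1)       # one 3x3x3 kernel per input map
v_out = conv3d_valid(v_prev, w, bias=-1.0)
```

Each output value here is 54 terms of 0.1 plus the bias of -1.0, i.e. 4.4 after ReLU.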
Still further, in the method for predicting short-term heavy rainfall based on deep learning, in step S3, the GRU neural network of the 3D convolution-GRU neural network model is constructed as follows:

$$Z_t=\sigma(W_Z\cdot[h_{t-1},X_t])$$
$$r_t=\sigma(W_r\cdot[h_{t-1},X_t])$$
$$h'_t=\tanh(W\cdot[r_t*h_{t-1},X_t])$$
$$h_t=(1-Z_t)*h_{t-1}+Z_t*h'_t$$

where σ(·) denotes the activation function, W_Z denotes the weight of the update gate Z_t, W_r denotes the weight of the reset gate r_t, h_t denotes the output of the current unit of the GRU neural network, h_{t-1} denotes the output of the previous unit, X_t denotes the input of the current unit, W_Z·[h_{t-1}, X_t] denotes multiplying the concatenation of h_{t-1} and X_t by the weight W_Z, the candidate state h′_t obtains information from the output h_{t-1} under the control of the reset gate r_t, and tanh(·) denotes the commonly used hyperbolic tangent activation function.
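The four GRU equations can be written out for a single step with NumPy. The weight shapes below are illustrative assumptions; `[h, x]` is the concatenation that W_Z, W_r and W multiply.

```python
import numpy as np

def sigmoid(k):
    return 1.0 / (1.0 + np.exp(-k))

def gru_step(h_prev, x_t, W_Z, W_r, W):
    hx = np.concatenate([h_prev, x_t])           # [h_{t-1}, X_t]
    Z_t = sigmoid(W_Z @ hx)                      # update gate
    r_t = sigmoid(W_r @ hx)                      # reset gate
    h_cand = np.tanh(W @ np.concatenate([r_t * h_prev, x_t]))  # h'_t
    return (1.0 - Z_t) * h_prev + Z_t * h_cand   # h_t

rng = np.random.default_rng(1)
H, D = 3, 2                       # hidden size, input size (assumed)
W_Z = rng.normal(size=(H, H + D))
W_r = rng.normal(size=(H, H + D))
W = rng.normal(size=(H, H + D))
h = gru_step(np.zeros(H), rng.normal(size=D), W_Z, W_r, W)
```

With a zero initial state, h_t is the update gate times the tanh candidate, so every component stays inside (-1, 1).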
As a further improvement, in the method for predicting short-term heavy rainfall based on deep learning, in step S4, the optimized 3D convolution-GRU neural network model is obtained by training as follows:
step S41, obtaining the rainfall prediction value output by the 3D convolution-GRU neural network model for any acquisition time;
step S42, obtaining the actual rainfall value of the target area at that acquisition time;
step S43, constructing the loss function of the 3D convolution-GRU neural network model and computing its value; the loss function of the 3D convolution-GRU neural network model is:

$$\Gamma=\frac{1}{M}\sum_{t=1}^{M}\big(y(t)-y'(t)\big)^2$$

where Γ denotes the loss function value of the 3D convolution-GRU neural network model, M denotes the number of acquisition times, y(t) denotes the actual rainfall value of the target area at acquisition time t, and y′(t) denotes the rainfall prediction value of the target area output by the 3D convolution-GRU neural network model for acquisition time t;
step S44, judging according to the obtained loss function value of the 3D convolution-GRU neural network model:
when the loss function value of the 3D convolution-GRU neural network model has stabilized, the model is taken as the optimized 3D convolution-GRU neural network model; otherwise, the process returns to step S41.
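Steps S43 and S44 can be sketched as a loss computation plus a "has the loss stabilized" check. The squared-error form of the loss, the tolerance `eps` and the `window` length are assumptions for illustration; the patent only states that training stops when the loss value's change is stable.

```python
def loss_gamma(y_true, y_pred):
    """Mean squared error between actual and predicted rainfall (step S43)."""
    M = len(y_true)
    return sum((a - p) ** 2 for a, p in zip(y_true, y_pred)) / M

def has_stabilized(loss_history, eps=1e-4, window=3):
    """Step S44 check: True once the last `window` consecutive loss
    changes are all smaller than eps."""
    if len(loss_history) <= window:
        return False
    deltas = [abs(loss_history[i] - loss_history[i - 1])
              for i in range(len(loss_history) - window, len(loss_history))]
    return all(d < eps for d in deltas)

losses = [2.0, 1.0, 0.5, 0.49999, 0.499985, 0.4999849]
stable = has_stabilized(losses)
```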
Preferably, in the method for predicting short-term heavy rainfall based on deep learning, in step S1, the radar echo map sequences of the target area at different acquisition times within 24 h are acquired in advance at an acquisition frequency of 1 frame per 6 min.
Compared with the prior art, the invention has the following advantages:
First, the invention collects radar echo map sequences and actual rainfall values of a target area at different acquisition times in advance to form a target area actual rainfall data set, and then normalizes it. The normalized radar echo map sequence corresponding to each acquisition time is input into a pre-constructed 3D convolution-GRU neural network model, whose output is taken as the rainfall prediction value for that acquisition time, so that the model is continuously trained on the normalized data set until an optimized 3D convolution-GRU neural network model is obtained. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized model, and its output is taken as the predicted rainfall of the target area over a future time period. Short-term rainfall forecasting for the target area is thus realized, which has important application value and practical significance for improving the accuracy of meteorological early-warning work and mitigating the natural disasters caused by heavy rainfall.
Second, the method can adjust the acquisition frequency of the radar echo maps for the target area as required, and can therefore forecast heavy rainfall at different future times on demand, giving it high practicability.
Drawings
FIG. 1 is a schematic flow chart of a short-term heavy rainfall prediction method based on deep learning according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a 3D convolutional-GRU neural network model constructed in an embodiment of the present invention;
FIG. 3 is the predicted output, for a radar echo image sequence over the next 2 hours, of a conventional short-term heavy rainfall prediction method;
FIG. 4 is the predicted output, for a radar echo image sequence over the next 2 hours, of the deep-learning-based short-term heavy rainfall prediction method of an embodiment of the present invention;
FIG. 5 is the real output: the radar echo image sequence of the target area over the next 2 hours, as acquired from a weather station.
Detailed Description
The invention is described in further detail below with reference to the accompanying examples.
This embodiment provides a short-term heavy rainfall prediction method based on deep learning. Specifically, referring to fig. 1, the deep-learning-based short-term heavy rainfall prediction method of this embodiment comprises the following steps S1 to S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values;
in this embodiment, the "acquisition time" referred to herein is a time before the current time; specifically, assuming that the target region is a nail, the pre-acquisition times are t1、t2、……、tMAnd marking the formed target area actual rainfall data set as List: wherein:
the pre-collected target area A is collected at the collecting time t1The radar echo diagram sequence is
Figure BDA0002991577530000051
The target area A is at the acquisition time t1Actual value of rainfall in time is marked
Figure BDA0002991577530000052
The pre-collected target area A is collected at the collecting time t2The radar echo diagram sequence is
Figure BDA0002991577530000053
The target area A is at the acquisition time t2Actual value of rainfall in time is marked
Figure BDA0002991577530000054
And so on;
the pre-collected target area A is collected at the collecting time tMThe radar echo diagram sequence is
Figure BDA0002991577530000055
The target area A is at the acquisition time tMActual value of rainfall in time is marked
Figure BDA0002991577530000056
Since the radar echo map sequence is a sequence of images comprising images at different heights, H herewRepresenting the W-th height value of the target area A, wherein W is more than or equal to 1 and less than or equal to W, and W is the total number of height values corresponding to the target area A when the radar echo map is acquired;
Figure BDA0002991577530000057
showing the acquired target area A at time tMHeight of (H)wA single radar echo map of time;
as is well known to those skilled in the art, in the process of predicting rainfall conditions by using a radar echo sequence diagram, a radar echo sequence diagram at a certain time can reflect rainfall conditions at the certain time, that is, the radar echo sequence diagram and the rainfall at the same time are in a corresponding relationship, that is, the following corresponding relationship exists in the actual rainfall data set List of the target area in the embodiment:
for the acquisition time t1Sequence of radar echo diagrams
Figure BDA0002991577530000058
And actual value of rainfall
Figure BDA0002991577530000059
One-to-one correspondence is realized;
for the acquisition time t2Sequence of radar echo diagrams
Figure BDA00029915775300000510
And actual value of rainfall
Figure BDA00029915775300000511
One-to-one correspondence is realized;
and so on;
for the acquisition time tMSequence of radar echo diagrams
Figure BDA00029915775300000512
And actual value of rainfall
Figure BDA00029915775300000513
One-to-one correspondence is realized;
in the embodiment, the radar echo map sequences of the target area A at different heights at different acquisition moments within 24h are acquired in advance according to the acquisition frequency of 1 frame/6 min. Therefore, the acquisition frequency of the radar echo map set by the device is 1 frame/6 min, namely 10 radar echo maps are acquired within 1h, and 240 radar echo maps are acquired within 24h (namely one day);
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
specifically, in step S2, normalization processing needs to be performed on each radar echo map in the obtained target area actual rainfall data set List, that is, a sequence of radar echo maps is obtained
Figure BDA0002991577530000061
Each radar echo map in the target area is subjected to normalization processing, so that a normalized target area actual rainfall data set List' can be obtained through the normalization processing;
specifically, in the target area actual rainfall data set List' after the normalization processing:
for the acquisition time t1Normalized radar echo map sequence
Figure BDA0002991577530000062
And actual value of rainfall
Figure BDA0002991577530000063
One-to-one correspondence, radar echo maps
Figure BDA0002991577530000064
Is a radar echo diagram
Figure BDA0002991577530000065
Normalized radar echo map of (a);
for the acquisition time t2Normalized radar echo map sequence
Figure BDA0002991577530000066
And actual value of rainfall
Figure BDA0002991577530000067
One-to-one correspondence, radar echo maps
Figure BDA0002991577530000068
Is a radar echo diagram
Figure BDA0002991577530000069
Normalized radar echo map of (a);
and so on;
for the acquisition time tMNormalized radar echo map sequence
Figure BDA00029915775300000610
And actual value of rainfall
Figure BDA00029915775300000611
One-to-one correspondence, radar echo maps
Figure BDA00029915775300000612
Is a radar echo diagram
Figure BDA00029915775300000613
Normalized radar echo map of (a);
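The patent does not spell out the normalization formula. A common choice for 8-bit gray-scale radar echo maps is to scale pixel values into [0, 1] by dividing by 255, which is the assumption sketched here:

```python
import numpy as np

def normalize_echo_map(gray_map):
    """Scale an 8-bit gray-scale echo map to [0, 1] (assumed scheme)."""
    return gray_map.astype(float) / 255.0

def normalize_sequence(seq):
    """Apply the per-map normalization to every map in a sequence."""
    return [normalize_echo_map(m) for m in seq]

seq = [np.array([[0, 255], [127, 51]], dtype=np.uint8)]
seq_norm = normalize_sequence(seq)
```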
It should be noted that radar echo map sequences are usually subject to noise interference during actual acquisition. Therefore, to eliminate the adverse effect of noise on the acquired radar echo maps, a noise-cancellation step may also be performed in step S2 before the normalization of the radar echo images. For example, the noise cancellation here includes: processing each radar echo map in the target area actual rainfall data set List into a gray-scale map through a linear transformation, where the linear transformation is g′(d, e) = K · g(d, e) + B, in which g(d, e) denotes the pixel value of the acquired radar echo image, K denotes the slope, B denotes the intercept, and g′(d, e) denotes the corresponding pixel value of the gray-scale map after the linear transformation;
and filtering the obtained gray-scale map with a bilinear filter, taking the filtered gray-scale map as the radar echo map to be normalized.
When K > 1, the transformation increases the contrast of the image: the differences between pixel values are enlarged and the overall display effect is enhanced. When K = 1, the transformation (with B ≠ 0) is often used to adjust image brightness. When 0 < K < 1, the effect is the opposite of K > 1: both the contrast and the overall effect of the image are weakened. When K < 0, the brighter areas of the source image become dark and the darker areas become bright; in particular, with K = −1 and B = 255, the image achieves a color-inversion effect;
step S3, a 3D convolution-GRU neural network model is constructed in advance; referring to fig. 2, the 3D convolution-GRU neural network model comprises a 3D convolutional neural network and a GRU neural network; the input of the 3D convolution-GRU neural network model is the input of the 3D convolutional neural network, the output of the 3D convolutional neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
assume that the initially constructed 3D convolution-GRU neural network model is labeled Conv3D_GRU_0, the 3D convolutional neural network in the initially constructed model Conv3D_GRU_0 is labeled Conv3D_0, and the GRU neural network in the initially constructed model Conv3D_GRU_0 is labeled GRU_0.
For example, in this embodiment, the 3D convolutional neural network of the constructed 3D convolution-GRU neural network model Conv3D_GRU_0 is as follows:

$$v_{ij}^{xyz}=\sigma\Big(b_{ij}+\sum_{m}\sum_{p=0}^{P_i-1}\sum_{q=0}^{Q_i-1}\sum_{r=0}^{R_i-1} w_{ijm}^{pqr}\, v_{(i-1)m}^{(x+p)(y+q)(z+r)}\Big)$$

where v_ij^xyz denotes the output at position (x, y, z) of the j-th feature map of the i-th layer of the 3D convolutional neural network; x and y denote the spatial dimensions of the normalized radar echo maps input into the network, and z denotes the time dimension of the normalized radar echo map sequence; σ(·) denotes the activation function; b_ij denotes the bias of the j-th feature map of the i-th layer; p, q and r index the convolution offsets; P_i, Q_i and R_i denote the size of the convolution kernel of the i-th layer; w_ijm^pqr denotes the weight of the (p, q, r)-th connection to the m-th feature map; and v_(i-1)m^(x+p)(y+q)(z+r) denotes the corresponding value of the m-th feature map of the previous layer (for the first layer, of the normalized radar echo map sequence input into the network). In this embodiment, the 3D convolutional neural network used consists of 1 input layer, 3 three-dimensional convolutional layers and 3 three-dimensional pooling layers;
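The shape bookkeeping for the stated stack of 1 input layer, 3 three-dimensional convolutional layers and 3 three-dimensional pooling layers can be sketched as below. The cubic kernel size of 3 ("valid" convolution) and the 2x2x2 non-overlapping pooling windows are illustrative assumptions; the patent does not give these hyperparameters.

```python
def conv3d_out(shape, k=3):
    """Output shape of a 'valid' 3D convolution with a cubic kernel of size k."""
    return tuple(s - k + 1 for s in shape)

def pool3d_out(shape, k=2):
    """Output shape of non-overlapping 3D pooling with a cubic window of size k."""
    return tuple(s // k for s in shape)

shape = (32, 64, 64)   # assumed (time, height_px, width_px) of the input volume
for _ in range(3):     # three conv + pool stages
    shape = pool3d_out(conv3d_out(shape))
# `shape` is now the volume handed to the GRU part of the model
```

For the assumed 32x64x64 input, the volume shrinks stage by stage to 2x6x6 before the GRU readout.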
the GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:

$$Z_t=\sigma(W_Z\cdot[h_{t-1},X_t])$$
$$r_t=\sigma(W_r\cdot[h_{t-1},X_t])$$
$$h'_t=\tanh(W\cdot[r_t*h_{t-1},X_t])$$
$$h_t=(1-Z_t)*h_{t-1}+Z_t*h'_t$$

where σ(·) denotes the activation function; for example, the activation function employed here is the commonly used sigmoid function σ(k) = 1/(1 + e^{−k}), with variable k. W_Z denotes the weight of the update gate Z_t, W_r denotes the weight of the reset gate r_t, h_t denotes the output of the current unit of the GRU neural network, h_{t-1} denotes the output of the previous unit, X_t denotes the input of the current unit, W_Z·[h_{t-1}, X_t] denotes multiplying the concatenation of h_{t-1} and X_t by the weight W_Z, the candidate state h′_t obtains information from the output h_{t-1} under the control of the reset gate r_t, and tanh(·) denotes the commonly used hyperbolic tangent activation function;
step S4, taking the radar echo map of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
specifically, when this step S4 is executed, the following processing is executed:
collecting time t in the target area actual rainfall data set List' after normalization processing1Corresponding normalized radar echo diagram sequence
Figure BDA0002991577530000082
As an input to a 3D convolution-GRU neural network model, and an output of the 3D convolution-GRU neural network model as an output for the acquisition instantt1E.g. the output of the 3D convolution-GRU neural network model here as the prediction value for the acquisition time t1Is marked as a rainfall prediction value
Figure BDA0002991577530000083
the normalized radar echo map sequence corresponding to acquisition time t_2 in the normalized target area actual rainfall data set List' is used as the input of the 3D convolution-GRU neural network model, and the output of the 3D convolution-GRU neural network model is taken as the rainfall prediction value for the acquisition time t_2, denoted y'(t_2);
And so on;
the normalized radar echo map sequence corresponding to acquisition time t_M in the normalized target area actual rainfall data set List' is used as the input of the 3D convolution-GRU neural network model, and the output of the 3D convolution-GRU neural network model is taken as the rainfall prediction value for the acquisition time t_M, denoted y'(t_M).
The 3D convolution-GRU neural network model is trained in turn on the normalized radar echo map sequences corresponding to acquisition times t_1 through t_M, thereby obtaining the optimized 3D convolution-GRU neural network model. In step S4 of this embodiment, the optimized 3D convolution-GRU neural network model is obtained through training as follows:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
Γ = (1/M) · Σ_{i=1}^{M} [y(t_i) − y'(t_i)]²
wherein, Γ represents a loss function value of the 3D convolution-GRU neural network model, y (t) represents an actual rainfall value of the target area at the acquisition time t, and y' (t) represents a predicted rainfall value of the target area output by the 3D convolution-GRU neural network model at the acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
For example, the rainfall prediction values y'(t_1), y'(t_2), ..., y'(t_M) output by the 3D convolution-GRU neural network model at the acquisition times t_1, t_2, ..., t_M are first obtained; then, the rainfall actual values y(t_1), y(t_2), ..., y(t_M) corresponding to the target area A at the acquisition times t_1, t_2, ..., t_M are obtained;
Third, a loss function of the 3D convolution-GRU neural network model is constructed and its loss function value is obtained; wherein the loss function value of the 3D convolution-GRU neural network model is:
Γ = (1/M) · Σ_{i=1}^{M} [y(t_i) − y'(t_i)]²
When the change of the loss function value Γ of the 3D convolution-GRU neural network model is stable, that is, when Γ varies only within a preset small numerical range, the 3D convolution-GRU neural network model is taken as the optimized 3D convolution-GRU neural network model; otherwise, the rainfall prediction values output by the 3D convolution-GRU neural network model at the acquisition times are obtained again, and the subsequent steps are executed in sequence.
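Steps S41 to S44 amount to iterating until the change in the loss value stabilizes. Below is a minimal sketch of that loop, assuming a mean-squared-error form for Γ and a hypothetical `model_step` interface; the tolerance and patience values are illustrative, not taken from the patent text:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Loss value over the M acquisition times (mean-squared-error form assumed)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))

def train_until_stable(model_step, y_true, tol=1e-4, patience=3, max_iter=1000):
    """Repeat steps S41-S43, stopping (S44) once the change of the loss
    stays below `tol` for `patience` consecutive iterations.
    `model_step()` must return predictions and update the model in place."""
    prev_loss, stable = None, 0
    for i in range(max_iter):
        y_pred = model_step()                       # S41: predictions at t_1..t_M
        loss = mse_loss(y_true, y_pred)             # S42-S43: loss value Γ
        if prev_loss is not None and abs(prev_loss - loss) < tol:
            stable += 1
            if stable >= patience:                  # S44: change is stable
                return loss, i + 1
        else:
            stable = 0
        prev_loss = loss
    return prev_loss, max_iter

# Toy run: a stand-in "model" whose predictions converge toward the truth
y_true = np.array([1.0, 2.0, 3.0])
state = {"y": np.zeros(3)}
def step():
    state["y"] += 0.5 * (y_true - state["y"])       # crude fitting update
    return state["y"]

final_loss, iters = train_until_stable(step, y_true)
print(final_loss < 1e-3, iters < 100)  # prints: True True
```

The stand-in model halves its error each call, so the loss change falls below the tolerance after a handful of iterations and the loop exits via the stability criterion.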
And step S5, acquiring a radar echo diagram sequence of the target area at the current moment, executing normalization processing on the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
Since an optimized 3D convolution-GRU neural network model has been obtained through the training process of step S4, at this time, the radar echo map sequence of the target area a at the current time is collected again, normalization processing is performed on each radar echo map in the radar echo map sequence, the radar echo map sequence after normalization processing is input into the optimized 3D convolution-GRU neural network model, and the output of the optimized 3D convolution-GRU neural network model is taken as the rainfall prediction value of the target area in the future time period.
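The inference step S5 can be sketched as follows; min-max scaling is assumed for the per-frame normalization, and the trained model is represented by a hypothetical callable:

```python
import numpy as np

def normalize_echo_map(img):
    """Scale one radar echo (gray-scale) map to [0, 1].
    Min-max scaling is an assumption; the patent defines its own
    normalization earlier in the description."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return np.zeros_like(img) if hi == lo else (img - lo) / (hi - lo)

def predict_future_rainfall(model, echo_sequence):
    """Step S5: normalize each frame of the current radar echo map
    sequence and feed the stacked sequence to the trained model."""
    batch = np.stack([normalize_echo_map(f) for f in echo_sequence])
    return model(batch)

# Toy example with a stand-in "model" that averages echo intensity
frames = [np.random.default_rng(i).integers(0, 255, (8, 8)) for i in range(5)]
rainfall = predict_future_rainfall(lambda x: float(x.mean()), frames)
print(0.0 <= rainfall <= 1.0)  # prints: True
```

Normalizing every frame before stacking mirrors the order of operations in step S5: acquisition, per-frame normalization, then a single forward pass through the optimized model.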
As is well known to those skilled in the art, when predicting rainfall from radar echo map sequences, the radar echo map sequence at a given time reflects the rainfall condition at that time; that is, the radar echo map sequence and the rainfall at the same time are in one-to-one correspondence. Therefore, to compare the performance of different short-term heavy rainfall prediction methods more intuitively, this embodiment characterizes the rainfall condition in a predicted future time period by the radar echo map sequence in that time period. Specifically, to illustrate the performance of the deep-learning-based short-term heavy rainfall prediction method of this embodiment, this embodiment provides the output image sequence predicted by this method for the radar echo maps within 2 hours in the future (see fig. 3), the corresponding prediction by a conventional method based on a 2D convolutional neural network (see fig. 4), and the true output image sequence of the radar echo maps within 2 hours in the future (see fig. 5).
As can be seen from the comparison among fig. 3, fig. 4, and fig. 5, the output image sequence predicted by the short-term heavy rainfall prediction method of this embodiment is clearer, better captures the time-dimension and space-dimension features of the radar echo maps of the target area at different heights, and predicts the future rainfall condition more accurately.
Although preferred embodiments of the present invention have been described in detail hereinabove, it should be clearly understood that modifications and variations of the present invention will be apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (6)

1. The short-time heavy rainfall prediction method based on deep learning is characterized by comprising the following steps of S1-S5:
step S1, collecting radar echo diagram sequences and rainfall actual values of a target area at different collecting moments in advance, and forming a target area actual rainfall data set by all the collected radar echo diagram sequences and the collected rainfall actual values; in the target area actual rainfall data set, radar echo diagram sequences at the same acquisition time are in one-to-one correspondence with rainfall actual values;
step S2, normalizing each radar echo map in the target area actual rainfall data set to obtain a normalized target area actual rainfall data set;
step S3, a 3D convolution-GRU neural network model is constructed in advance; the 3D convolution-GRU neural network model comprises a 3D convolution neural network and a GRU neural network, wherein the input of the 3D convolution-GRU neural network model is the input of the 3D convolution neural network, the output of the 3D convolution neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
step S4, taking the radar echo diagram sequence of each acquisition time in the target area actual rainfall data set after normalization processing as the input of a 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as a rainfall prediction value aiming at the acquisition time, and training the 3D convolution-GRU neural network model by using the target area actual rainfall data set after normalization processing to obtain an optimized 3D convolution-GRU neural network model through training;
and step S5, acquiring a radar echo diagram sequence of the target area at the current moment, performing normalization processing on each radar echo diagram in the radar echo diagram sequence, inputting the radar echo diagram sequence after the normalization processing into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as a rainfall prediction value of the target area in a future time period.
2. The method for forecasting short-term heavy rainfall based on deep learning of claim 1, wherein in step S2, before the normalization process is performed on each radar echo map in the target area actual rainfall data set, the method further comprises:
processing each radar echo map in the target area actual rainfall data set into a gray scale map through linear transformation; wherein the formula of the linear transformation processing is g'(d, e) = K · g(d, e) + B, g(d, e) represents the pixel value of the acquired radar echo image, K represents the slope, B represents the intercept, and g'(d, e) represents the corresponding pixel value of the gray scale map after the linear transformation processing;
and filtering the obtained gray level image by adopting a bilinear filter, and taking the gray level image after filtering as a radar echo image needing normalization processing.
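For illustration only (outside the claims), the linear transformation of claim 2 can be sketched as follows; the slope K, intercept B, and 8-bit clipping are example values chosen for the sketch, not prescribed by the claim:

```python
import numpy as np

def to_gray(echo_img, K=0.5, B=10.0):
    """Linear transformation g'(d, e) = K * g(d, e) + B applied per pixel.
    K (slope) and B (intercept) are example values; the result is clipped
    to the 8-bit gray-scale range for display purposes."""
    g = echo_img.astype(np.float64)
    return np.clip(K * g + B, 0, 255).astype(np.uint8)

echo = np.array([[0, 100], [200, 255]], dtype=np.uint8)
gray = to_gray(echo)
print(gray.tolist())  # prints: [[10, 60], [110, 137]]
```

The same K and B would be applied to every radar echo map in the data set so that all gray scale maps share one intensity scale before the subsequent filtering and normalization.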
3. The method for forecasting short-term heavy rainfall based on deep learning of claim 2, wherein in step S3, the 3D convolutional neural network of the constructed 3D convolutional-GRU neural network model is as follows:
v_{ij}^{xyz} = σ( b_{ij} + Σ_m Σ_{p=0}^{P_i−1} Σ_{q=0}^{Q_i−1} Σ_{r=0}^{R_i−1} w_{ijm}^{pqr} · v_{(i−1)m}^{(x+p)(y+q)(z+r)} );
wherein v_{ij}^{xyz} represents the output of the jth feature map of the ith layer of neurons of the 3D convolutional neural network, x and y respectively represent the spatial dimensions of a normalized radar echo map input into the 3D convolutional neural network, z represents the time dimension of the sequence of normalized radar echo maps input into the 3D convolutional neural network, σ(·) represents the activation function, b_{ij} represents the bias of the jth feature map of the ith layer of neurons of the 3D convolutional neural network, p, q and r respectively represent the convolution indices, P_i, Q_i and R_i respectively represent the sizes of the convolution kernel in the 3D convolutional neural network along the three dimensions, w_{ijm}^{pqr} represents the weight of the (p, q, r)th neuron connection in the mth feature map, and v_{(i−1)m}^{(x+p)(y+q)(z+r)} represents the value of the mth feature map of layer i−1 of the normalized radar echo map sequence input into the 3D convolutional neural network.
4. The method for forecasting short-term heavy rainfall based on deep learning of claim 3, wherein in step S3, the GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:
Z_t = σ(W_Z · [h_{t-1}, X_t]);
r_t = σ(W_r · [h_{t-1}, X_t]);
h'_t = tanh(W · [r_t * h_{t-1}, X_t]);
h_t = (1 − Z_t) * h_{t-1} + Z_t * h'_t;
wherein σ(·) represents the activation function, W_Z represents the weight of the update gate Z_t, W_r represents the weight of the reset gate r_t, W represents the weight of the candidate state h'_t, h_t represents the output of the current neural unit in the GRU neural network, h_{t-1} represents the output of the previous neural unit in the GRU neural network, X_t represents the input of the current neural unit in the GRU neural network, W_Z · [h_{t-1}, X_t] represents multiplying the concatenation of the output h_{t-1} and the input X_t by the weight W_Z, h'_t represents the amount of information obtained from the output h_{t-1} under the control of r_t, and tanh(·) represents the commonly used hyperbolic tangent activation function.
5. The method for forecasting short-term heavy rainfall based on deep learning of claim 4, wherein in step S4, the optimized 3D convolution-GRU neural network model is trained by:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
Γ = (1/M) · Σ_{i=1}^{M} [y(t_i) − y'(t_i)]²
wherein, Γ represents a loss function value of the 3D convolution-GRU neural network model, y (t) represents an actual rainfall value of the target area at the acquisition time t, and y' (t) represents a predicted rainfall value of the target area output by the 3D convolution-GRU neural network model at the acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
6. The method for predicting short-term heavy rainfall based on deep learning of any one of claims 1 to 5, wherein in step S1, radar echo map sequences of the target area at different acquisition times within 24h are acquired in advance at an acquisition frequency of 1 frame/6 min.
CN202110317764.3A 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning Pending CN112949934A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110317764.3A CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110317764.3A CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Publications (1)

Publication Number Publication Date
CN112949934A true CN112949934A (en) 2021-06-11

Family

ID=76228075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110317764.3A Pending CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Country Status (1)

Country Link
CN (1) CN112949934A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046089A (en) * 2015-08-13 2015-11-11 电子科技大学 Method for predicting strong rainfall and flood disasters
CN109376848A (en) * 2018-09-01 2019-02-22 哈尔滨工程大学 A kind of door control unit neural network of simplification
CN111476713A (en) * 2020-03-26 2020-07-31 中南大学 Intelligent weather image identification method and system based on multi-depth convolution neural network fusion
CN112415521A (en) * 2020-12-17 2021-02-26 南京信息工程大学 CGRU (China-swarm optimization and RU-based radar echo nowcasting) method with strong space-time characteristics


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LIU, Dong: "Industrial Machine Vision: Development and Application Based on the Lingshan Platform" (工业机器视觉 基于灵闪平台的开发及应用), Shanghai Education Press, p. 83 *
CHEN, Xiaoping et al.: "Modeling and prediction of rainfall radar echo data based on machine learning", Journal of Nanjing University of Information Science & Technology, vol. 12, no. 4, pp. 483-494 *
CHEN, Cheng: "Research and application of convolutional neural networks in meteorological nowcasting", China Master's Theses Full-text Database, Basic Sciences, no. 12, pp. 009-16 *
CHEN, Ying et al.: "Human action recognition based on 3D two-stream convolutional neural network and GRU network", Computer Applications and Software, no. 5, pp. 170-174 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113568068A (en) * 2021-07-22 2021-10-29 河南大学 Strong convection weather prediction method based on MPI parallel three-dimensional neural network
CN113568068B (en) * 2021-07-22 2022-03-29 河南大学 Strong convection weather prediction method based on MPI parallel three-dimensional neural network
CN114021349A (en) * 2021-11-05 2022-02-08 广东电网有限责任公司广州供电局 Method, system and device for predicting heavy rainfall and computer storage medium
CN114091765A (en) * 2021-11-25 2022-02-25 山西勇利信息科技有限公司 Future rainfall prediction method based on space-time bidirectional multi-granularity dynamic integration
CN114509825A (en) * 2021-12-31 2022-05-17 河南大学 Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN114509825B (en) * 2021-12-31 2022-11-08 河南大学 Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN116520459A (en) * 2023-06-28 2023-08-01 成都信息工程大学 Weather prediction method
CN116520459B (en) * 2023-06-28 2023-08-25 成都信息工程大学 Weather prediction method
CN116755181A (en) * 2023-08-11 2023-09-15 深圳市昆特科技有限公司 Precipitation prediction method and related device
CN116755181B (en) * 2023-08-11 2023-10-20 深圳市昆特科技有限公司 Precipitation prediction method and related device

Similar Documents

Publication Publication Date Title
CN112949934A (en) Short-term heavy rainfall prediction method based on deep learning
CN109215034B (en) Weak supervision image semantic segmentation method based on spatial pyramid covering pooling
CN111882002B (en) MSF-AM-based low-illumination target detection method
CN110223323B (en) Target tracking method based on depth feature adaptive correlation filtering
CN110751067B (en) Dynamic expression recognition method combined with biological form neuron model
CN112633497A (en) Convolutional pulse neural network training method based on reweighted membrane voltage
CN108447041B (en) Multi-source image fusion method based on reinforcement learning
CN107451999A (en) foreign matter detecting method and device based on image recognition
CN107229929A (en) A kind of license plate locating method based on R CNN
CN113111758B (en) SAR image ship target recognition method based on impulse neural network
CN113239722B (en) Deep learning based strong convection extrapolation method and system under multi-scale
CN109255304B (en) Target tracking method based on distribution field characteristics
CN107977683A (en) Joint SAR target identification methods based on convolution feature extraction and machine learning
CN100565557C (en) System for tracking infrared human body target based on corpuscle dynamic sampling model
CN108563977A (en) A kind of the pedestrian&#39;s method for early warning and system of expressway entrance and exit
CN110084201B (en) Human body action recognition method based on convolutional neural network of specific target tracking in monitoring scene
CN110765948A (en) Target detection and identification method and system based on unmanned aerial vehicle
CN109886387A (en) It is a kind of that the traffic time sequence forecasting method returned is promoted based on gating network and gradient
CN111461213A (en) Training method of target detection model and target rapid detection method
US20130039534A1 (en) Motion detection method for complex scenes
CN109214253A (en) A kind of video frame detection method and device
Xu et al. A bio-inspired motion sensitive model and its application to estimating human gaze positions under classified driving conditions
CN112215334A (en) Neural network model compression method for event camera
CN106127740A (en) A kind of profile testing method based on the association of visual pathway many orientation of sensory field
CN115690557A (en) Construction safety early warning method and device based on attention mechanism neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210611