CN112949934A - Short-term heavy rainfall prediction method based on deep learning - Google Patents

Short-term heavy rainfall prediction method based on deep learning

Info

Publication number
CN112949934A
CN112949934A (application number CN202110317764.3A)
Authority
CN
China
Prior art keywords
neural network
convolution
network model
rainfall
gru neural
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110317764.3A
Other languages
Chinese (zh)
Inventor
王仁芳
孙德超
李谦
洪鑫华
梁丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Wanli University
Original Assignee
Zhejiang Wanli University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Wanli University filed Critical Zhejiang Wanli University
Priority to CN202110317764.3A
Publication of CN112949934A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/95 Radar or analogous systems specially adapted for specific applications for meteorological use
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01W METEOROLOGY
    • G01W1/00 Meteorology
    • G01W1/10 Devices for predicting weather conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Primary Health Care (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Educational Administration (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental Sciences (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)

Abstract

The invention relates to a deep learning-based short-term heavy rainfall prediction method. An actual rainfall data set of a target area at different collection times is formed in advance and then normalized. The normalized radar echo map sequence corresponding to each collection time in the normalized data set is input into a pre-built 3D convolution-GRU neural network model, and the output of the model is taken as the predicted rainfall value for that collection time. Through continuous training, an optimized 3D convolution-GRU neural network model is obtained. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized 3D convolution-GRU neural network model, and its output is taken as the predicted rainfall value of the target area in a future time period, thereby realizing short-term heavy rainfall prediction for the target area.

Figure 202110317764

Description

Short-term heavy rainfall prediction method based on deep learning

Technical Field

The invention relates to the technical field of computer vision and meteorological services, and in particular to a deep learning-based short-term heavy rainfall prediction method.

Background Art

Short-term heavy rainfall is a weather process characterized by sudden onset, short precipitation duration, and large precipitation amounts. The meteorological disasters it causes are usually impossible to guard against and are therefore extremely harmful to society; natural disasters triggered by short-term heavy rainfall occur year after year and seriously threaten people's lives and property. Accurate prediction of short-term heavy rainfall is therefore of great significance for disaster prevention and mitigation.

In existing short-term heavy rainfall prediction methods, radar echo extrapolation is usually the main technical means of nowcasting. Specifically, the intensity distribution of the echo and the moving speed and direction of the echo body (such as a precipitation area) are determined from the echo data detected by a weather radar, and the radar echo state after a certain period of time is then predicted by linear or nonlinear extrapolation of the echo body.

Chinese invention patent CN105046089B discloses a method for predicting heavy rainfall and flood disasters. Based on historical monthly rainfall totals, event time series data are collected and a rainfall sequence is constructed, and the total rainfall of a future month is predicted by combining a fuzzy subtractive clustering algorithm, statistical learning, selective structural risk minimization theory, and cluster projection, thereby predicting heavy rainfall and flood disasters. The prediction scheme in that patent clusters the training set with a fuzzy clustering algorithm, and the number of clusters is determined by selective structural risk minimization theory, which makes the clustering results more accurate and ensures the validity and accuracy of the prediction results.

However, the method of CN105046089B also has a shortcoming: because it can only predict the total rainfall of a future month, it cannot predict short-term heavy rainfall weather characterized by sudden onset, short precipitation duration, and large precipitation amounts.

SUMMARY OF THE INVENTION

The technical problem to be solved by the present invention is to provide, in view of the above prior art, a deep learning-based short-term heavy rainfall prediction method.

The technical solution adopted by the present invention to solve the above technical problem is a deep learning-based short-term heavy rainfall prediction method comprising the following steps S1 to S5:

Step S1: collect in advance radar echo map sequences and actual rainfall values of a target area at different collection times, and form an actual rainfall data set of the target area from all the collected radar echo map sequences and actual rainfall values; in the actual rainfall data set, the radar echo map sequence and the actual rainfall value at the same collection time are in one-to-one correspondence.

Step S2: normalize each radar echo map in the actual rainfall data set of the target area to obtain a normalized actual rainfall data set of the target area.

Step S3: construct a 3D convolution-GRU neural network model in advance; the 3D convolution-GRU neural network model comprises a 3D convolutional neural network and a GRU neural network, the input of the 3D convolution-GRU neural network model is the input of the 3D convolutional neural network, the output of the 3D convolutional neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model.

Step S4: take the radar echo map sequence at each collection time in the normalized actual rainfall data set as the input of the 3D convolution-GRU neural network model, take the output of the model as the predicted rainfall value for that collection time, and train the 3D convolution-GRU neural network model with the normalized actual rainfall data set to obtain an optimized 3D convolution-GRU neural network model.

Step S5: collect the radar echo map sequence of the target area at the current time, normalize each radar echo map in the sequence, input the normalized radar echo map sequence into the optimized 3D convolution-GRU neural network model, and take the output of the optimized model as the predicted rainfall value of the target area in a future time period.

As an improvement, in the deep learning-based short-term heavy rainfall prediction method, step S2 further comprises, before normalizing each radar echo map in the actual rainfall data set of the target area:

converting each radar echo map in the actual rainfall data set of the target area into a grayscale image through a linear transformation, where the linear transformation is g'(d,e) = K·g(d,e) + B, g(d,e) is the pixel value of the collected radar echo map, K is the slope, B is the intercept, and g'(d,e) is the pixel value of the grayscale image after the linear transformation;

and filtering the resulting grayscale image with a bilinear filter, the filtered grayscale image being taken as the radar echo map to be normalized.

Further, in the deep learning-based short-term heavy rainfall prediction method, the 3D convolutional neural network of the 3D convolution-GRU neural network model constructed in step S3 is as follows:

$$v_{ij}^{xyz} = \sigma\Big(b_{ij} + \sum_{m}\sum_{p=0}^{P_i-1}\sum_{q=0}^{Q_i-1}\sum_{r=0}^{R_i-1} w_{ijm}^{pqr}\, v_{(i-1)m}^{(x+p)(y+q)(z+r)}\Big)$$

where v_ij^xyz denotes the output of the j-th feature map of the i-th layer of the 3D convolutional neural network, x and y denote the spatial dimensions of the normalized radar echo maps input into the 3D convolutional neural network, z denotes the time dimension of the normalized radar echo map sequence input into the 3D convolutional neural network, σ(·) denotes the activation function, b_ij denotes the bias of the j-th feature map of the i-th layer of the 3D convolutional neural network, p, q and r denote the convolution indices, P_i, Q_i and R_i denote the sizes of the convolution kernel in the 3D convolutional neural network, w_ijm^pqr denotes the weight of the connection to the (p, q, r)-th neuron of the m-th feature map, and v_(i-1)m^(x+p)(y+q)(z+r) denotes the value of the normalized radar echo map sequence input into the 3D convolutional neural network.

Still further, in the deep learning-based short-term heavy rainfall prediction method, the GRU neural network of the 3D convolution-GRU neural network model constructed in step S3 is as follows:

$$Z_t = \sigma(W_Z \cdot [h_{t-1}, X_t])$$

$$r_t = \sigma(W_r \cdot [h_{t-1}, X_t])$$

$$h'_t = \tanh(W \cdot [r_t * h_{t-1}, X_t])$$

$$h_t = (1 - Z_t) * h_{t-1} + Z_t * h'_t$$

where σ(·) denotes the activation function, W_Z denotes the weight of the update gate Z_t, W_r denotes the weight of the reset gate r_t, h_t denotes the output of the current unit of the GRU neural network, h_{t-1} denotes the output of the previous unit of the GRU neural network, X_t denotes the input of the current unit of the GRU neural network, W_Z·[h_{t-1}, X_t] denotes multiplying the weight W_Z by the result of combining the output h_{t-1} with the input X_t, h'_t denotes the amount of information obtained from the output h_{t-1} under the control of r_t, and tanh(·) denotes the common hyperbolic tangent activation function.

As a further improvement, in the deep learning-based short-term heavy rainfall prediction method, in step S4 the optimized 3D convolution-GRU neural network model is obtained by training as follows:

Step S41: obtain the predicted rainfall value output by the 3D convolution-GRU neural network model at any collection time;

Step S42: obtain the actual rainfall value of the target area at that collection time;

Step S43: construct the loss function of the 3D convolution-GRU neural network model and obtain the loss function value of the 3D convolution-GRU neural network model; the loss function of the 3D convolution-GRU neural network model is as follows:

$$\Gamma = \sum_{t}\big(y(t) - y'(t)\big)^{2}$$

where Γ denotes the loss function value of the 3D convolution-GRU neural network model, y(t) denotes the actual rainfall value of the target area at collection time t, and y'(t) denotes the predicted rainfall value of the target area at collection time t output by the 3D convolution-GRU neural network model;

Step S44: make a judgment according to the obtained loss function value of the 3D convolution-GRU neural network model:

when the change in the loss function value of the 3D convolution-GRU neural network model is stable, take the 3D convolution-GRU neural network model as the optimized 3D convolution-GRU neural network model; otherwise, return to step S41.

Preferably, in the deep learning-based short-term heavy rainfall prediction method, in step S1 the radar echo map sequences of the target area at different collection times within 24 h are collected in advance at a frequency of 1 frame per 6 minutes.

Compared with the prior art, the advantages of the present invention are as follows:

First, the invention collects in advance radar echo map sequences and actual rainfall values of a target area at different collection times to form an actual rainfall data set of the target area, and then obtains a normalized actual rainfall data set. The normalized radar echo map sequence corresponding to each collection time is taken as the input of a pre-built 3D convolution-GRU neural network model, and the output of the model is taken as the predicted rainfall value for that collection time, so that the model is continuously trained with the radar echo map sequences in the normalized data set until an optimized 3D convolution-GRU neural network model is obtained. The radar echo map sequence of the target area at the current time is then normalized and input into the optimized 3D convolution-GRU neural network model, and its output is taken as the predicted rainfall value of the target area in a future time period. Short-term heavy rainfall prediction for the target area is thus achieved, which is of important application value and practical significance for improving the accuracy of meteorological early warning and for mitigating natural disasters caused by heavy rainfall.

Second, the invention can adjust the collection frequency of the radar echo maps of the target area as required, so as to predict heavy rainfall at different future times as needed, which makes it more practical.

Description of Drawings

FIG. 1 is a schematic flowchart of the deep learning-based short-term heavy rainfall prediction method in an embodiment of the present invention;

FIG. 2 is a schematic diagram of the 3D convolution-GRU neural network model constructed in an embodiment of the present invention;

FIG. 3 shows the predicted output of the radar echo image sequence for the next 2 hours obtained with a traditional short-term heavy rainfall prediction method;

FIG. 4 shows the predicted output of the radar echo image sequence for the next 2 hours obtained with the deep learning-based short-term heavy rainfall prediction method in an embodiment of the present invention;

FIG. 5 shows the real output of the radar echo image sequence of the target area for the next 2 hours, obtained from the meteorological station.

Detailed Description of Embodiments

The present invention is described in further detail below with reference to the embodiments and the accompanying drawings.

This embodiment provides a deep learning-based short-term heavy rainfall prediction method. Specifically, as shown in FIG. 1, the deep learning-based short-term heavy rainfall prediction method in this embodiment comprises the following steps S1 to S5:

Step S1: collect in advance radar echo map sequences and actual rainfall values of the target area at different collection times, and form an actual rainfall data set of the target area from all the collected radar echo map sequences and actual rainfall values.

In this embodiment, a "collection time" refers to a time before the current time. Specifically, assume the target area is A and the pre-collected collection times are t1, t2, ..., tM; the actual rainfall data set formed for this target area is denoted List, where:

the radar echo map sequence of target area A collected at collection time t1 consists of the single radar echo maps of target area A at time t1 at each of the heights H1, ..., HW, and the actual rainfall value of target area A at collection time t1 is denoted y(t1);

the radar echo map sequence of target area A collected at collection time t2 consists of the single radar echo maps of target area A at time t2 at each of the heights H1, ..., HW, and the actual rainfall value of target area A at collection time t2 is denoted y(t2);

and so on;

the radar echo map sequence of target area A collected at collection time tM consists of the single radar echo maps of target area A at time tM at each of the heights H1, ..., HW, and the actual rainfall value of target area A at collection time tM is denoted y(tM).

Since a radar echo map sequence is an image sequence comprising images at different heights, Hw here denotes the w-th height value of target area A, with 1 ≤ w ≤ W, where W is the total number of height values at which radar echo maps are collected for target area A, and the element of the sequence at time tM and height Hw is a single radar echo map of the collected target area A.

It is well known to those skilled in the art that, when rainfall is predicted from radar echo map sequences, the radar echo map sequence at a certain time reflects the rainfall at that time, i.e., the radar echo map sequence and the rainfall value at the same time correspond to each other. In the actual rainfall data set List of the target area in this embodiment, the following correspondences therefore exist:

for collection time t1, the radar echo map sequence corresponds one-to-one to the actual rainfall value y(t1);

for collection time t2, the radar echo map sequence corresponds one-to-one to the actual rainfall value y(t2);

and so on;

for collection time tM, the radar echo map sequence corresponds one-to-one to the actual rainfall value y(tM).

Specifically, in this embodiment, the radar echo map sequences of target area A at different heights and at different collection times within 24 h are collected in advance at a frequency of 1 frame per 6 minutes. With this collection frequency, 10 frames of radar echo maps are collected per hour, and 240 frames are collected within 24 h (i.e., within one day).
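
As an illustration only, the paired data set List described in step S1 could be held in memory as a list of (echo sequence, actual rainfall) tuples; the number of height levels, the image size, and the loader functions below are assumptions for this sketch, not values taken from the patent:

```python
import numpy as np

# Hypothetical sketch of the data set "List" built in step S1.
W_HEIGHTS = 4          # assumed number of height levels W
IMG_H, IMG_W = 64, 64  # assumed image size in pixels

def load_echo_sequence(area, t):
    """Placeholder: read the W single-height radar echo maps collected at time t."""
    return np.random.rand(W_HEIGHTS, IMG_H, IMG_W).astype(np.float32)

def load_actual_rainfall(area, t):
    """Placeholder: read the measured rainfall value y(t) for the same time."""
    return float(np.random.rand())

M = 240  # 24 h at 1 frame / 6 min gives 240 collection times per day
rain_list = [(load_echo_sequence("A", t), load_actual_rainfall("A", t)) for t in range(M)]
```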

Step S2: normalize each radar echo map in the actual rainfall data set of the target area to obtain a normalized actual rainfall data set of the target area.

Specifically, in step S2, each radar echo map in the obtained actual rainfall data set List of the target area needs to be normalized, i.e., every radar echo map in every radar echo map sequence is normalized; after this normalization, the normalized actual rainfall data set List' of the target area is obtained.

Specifically, in the normalized actual rainfall data set List' of the target area:

for collection time t1, the normalized radar echo map sequence corresponds one-to-one to the actual rainfall value y(t1), each radar echo map in the sequence being the normalized version of the corresponding collected radar echo map;

for collection time t2, the normalized radar echo map sequence corresponds one-to-one to the actual rainfall value y(t2), each radar echo map in the sequence being the normalized version of the corresponding collected radar echo map;

and so on;

for collection time tM, the normalized radar echo map sequence corresponds one-to-one to the actual rainfall value y(tM), each radar echo map in the sequence being the normalized version of the corresponding collected radar echo map.

It should be noted that noise interference is usually present during the actual collection of radar echo map sequences. Therefore, in order to eliminate the adverse effect of noise on the collected radar echo maps, step S2 may further include a de-noising process performed on the collected radar echo maps before the normalization. For example, the de-noising process here includes converting each radar echo map in the actual rainfall data set List of the target area into a grayscale image through a linear transformation, where the linear transformation is g'(d,e) = K·g(d,e) + B, g(d,e) is the pixel value of the collected radar echo map, K is the slope, B is the intercept, and g'(d,e) is the pixel value of the grayscale image after the linear transformation;

and filtering the resulting grayscale image with a bilinear filter, the filtered grayscale image being taken as the radar echo map to be normalized.

When K > 1, the transformation can be used to increase the contrast of the image: all pixel values increase after the transformation and the overall display effect is enhanced. When K = 1, it is often used to adjust the image brightness. When 0 < K < 1, the effect is the opposite of K > 1: the contrast and the overall effect of the image are weakened. When K < 0, brighter regions of the source image become darker and darker regions become brighter; in this case setting K = -1 and B = 255 inverts the image.
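
A minimal sketch of the pre-processing described above, assuming default values K = 1 and B = 0, a simple box filter as a stand-in for the bilinear filtering step, and min-max scaling as the normalization; none of these parameter choices are specified in the patent:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def preprocess_echo_image(img, K=1.0, B=0.0):
    """Sketch of the step S2 pre-processing chain (assumed parameters)."""
    gray = K * img.astype(np.float32) + B      # linear transform g' = K*g + B
    smoothed = uniform_filter(gray, size=3)    # stand-in for the smoothing filter
    lo, hi = smoothed.min(), smoothed.max()
    return (smoothed - lo) / (hi - lo + 1e-8)  # min-max normalization to [0, 1]
```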

Step S3: construct a 3D convolution-GRU neural network model in advance. As shown in FIG. 2, the 3D convolution-GRU neural network model comprises a 3D convolutional neural network and a GRU neural network; the input of the 3D convolution-GRU neural network model is the input of the 3D convolutional neural network, the output of the 3D convolutional neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model.

Suppose that the initially constructed 3D convolution-GRU neural network model is denoted Conv3D_GRU0, the 3D convolutional neural network in this initially constructed model Conv3D_GRU0 is denoted Conv3D0, and the GRU neural network in this initially constructed model Conv3D_GRU0 is denoted GRU0.

For example, in this embodiment, the 3D convolutional neural network of the constructed 3D convolution-GRU neural network model Conv3D_GRU0 is as follows:

$$v_{ij}^{xyz} = \sigma\Big(b_{ij} + \sum_{m}\sum_{p=0}^{P_i-1}\sum_{q=0}^{Q_i-1}\sum_{r=0}^{R_i-1} w_{ijm}^{pqr}\, v_{(i-1)m}^{(x+p)(y+q)(z+r)}\Big)$$

where v_ij^xyz denotes the output of the j-th feature map of the i-th layer of the 3D convolutional neural network, x and y denote the spatial dimensions of the normalized radar echo maps input into the 3D convolutional neural network, z denotes the time dimension of the normalized radar echo maps input into the 3D convolutional neural network, σ(·) denotes the activation function, b_ij denotes the bias of the j-th feature map of the i-th layer of the 3D convolutional neural network, p, q and r denote the convolution indices, P_i, Q_i and R_i denote the sizes of the convolution kernel in the 3D convolutional neural network, w_ijm^pqr denotes the weight of the connection to the (p, q, r)-th neuron of the m-th feature map, and v_(i-1)m^(x+p)(y+q)(z+r) denotes the value of the normalized radar echo maps input into the 3D convolutional neural network. In this embodiment, the 3D convolutional neural network used consists of one input layer, three 3D convolutional layers, and three 3D pooling layers.
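
For illustration, a PyTorch sketch of a 3D convolutional feature extractor with one input layer, three 3D convolutional layers, and three 3D pooling layers, as described above; the channel counts, kernel sizes, and pooling strides are assumptions, not values from the patent:

```python
import torch
import torch.nn as nn

class Conv3DBackbone(nn.Module):
    """Assumed layout: three Conv3d + MaxPool3d stages after the input layer."""
    def __init__(self, in_channels=1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 8, kernel_size=3, padding=1),  # 3D conv layer 1
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),                  # 3D pooling layer 1
            nn.Conv3d(8, 16, kernel_size=3, padding=1),           # 3D conv layer 2
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),                  # 3D pooling layer 2
            nn.Conv3d(16, 32, kernel_size=3, padding=1),          # 3D conv layer 3
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(2, 2, 2)),                  # 3D pooling layer 3
        )

    def forward(self, x):
        # x: (batch, channels, depth, height, width) volume built from one echo sequence
        return self.features(x)
```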

The GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:

$$Z_t = \sigma(W_Z \cdot [h_{t-1}, X_t])$$

$$r_t = \sigma(W_r \cdot [h_{t-1}, X_t])$$

$$h'_t = \tanh(W \cdot [r_t * h_{t-1}, X_t])$$

$$h_t = (1 - Z_t) * h_{t-1} + Z_t * h'_t$$

where σ(·) denotes the activation function; for example, the activation function used here is the common sigmoid function σ(k) = 1 / (1 + e^{-k}), where k is the variable. W_Z denotes the weight of the update gate Z_t, h_t denotes the output of the current unit of the GRU neural network, h_{t-1} denotes the output of the previous unit of the GRU neural network, X_t denotes the input of the current unit of the GRU neural network, W_Z·[h_{t-1}, X_t] denotes multiplying the weight W_Z by the result of combining the output h_{t-1} with the input X_t, h'_t denotes the amount of information obtained from the output h_{t-1} under the control of r_t, and tanh(·) denotes the common hyperbolic tangent activation function.
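
A sketch of a GRU cell that implements the four equations above, together with a wrapper that feeds the flattened features of the 3D convolutional backbone (from the previous sketch) into the GRU and maps the final hidden state to a single rainfall value; the hidden size, the linear output head, and the tensor layout are assumptions added for illustration, and feature_size must match the flattened output of the backbone for the chosen input size:

```python
import torch
import torch.nn as nn

class GRUCellFromEquations(nn.Module):
    """Update gate Z_t, reset gate r_t, candidate h'_t and output h_t, as above."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.W_z = nn.Linear(input_size + hidden_size, hidden_size)
        self.W_r = nn.Linear(input_size + hidden_size, hidden_size)
        self.W_h = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t, h_prev):
        concat = torch.cat([h_prev, x_t], dim=-1)
        z_t = torch.sigmoid(self.W_z(concat))                  # update gate Z_t
        r_t = torch.sigmoid(self.W_r(concat))                  # reset gate r_t
        h_cand = torch.tanh(self.W_h(torch.cat([r_t * h_prev, x_t], dim=-1)))
        return (1 - z_t) * h_prev + z_t * h_cand               # new hidden state h_t

class Conv3DGRU(nn.Module):
    """Sketch of the combined model: per-step 3D CNN features -> GRU -> rainfall value."""
    def __init__(self, feature_size, hidden_size=64):
        super().__init__()
        self.backbone = Conv3DBackbone()          # defined in the previous sketch
        self.cell = GRUCellFromEquations(feature_size, hidden_size)
        self.head = nn.Linear(hidden_size, 1)     # maps the final state to one value
        self.hidden_size = hidden_size

    def forward(self, seq):
        # seq: (batch, steps, channels, depth, height, width)
        b, steps = seq.shape[:2]
        h = seq.new_zeros(b, self.hidden_size)
        for t in range(steps):
            feat = self.backbone(seq[:, t]).flatten(1)   # features for step t
            h = self.cell(feat, h)                       # GRU state update
        return self.head(h).squeeze(-1)                  # predicted rainfall value
```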

Step S4: take the radar echo map sequence at each collection time in the normalized actual rainfall data set as the input of the 3D convolution-GRU neural network model, take the output of the model as the predicted rainfall value for that collection time, and train the 3D convolution-GRU neural network model with the normalized actual rainfall data set to obtain an optimized 3D convolution-GRU neural network model.

Specifically, step S4 is performed as follows:

the normalized radar echo map sequence corresponding to collection time t1 in the normalized actual rainfall data set List' is taken as the input of the 3D convolution-GRU neural network model, and the output of the model is taken as the predicted rainfall value for collection time t1, denoted y'(t1);

the normalized radar echo map sequence corresponding to collection time t2 in the normalized actual rainfall data set List' is taken as the input of the 3D convolution-GRU neural network model, and the output of the model is taken as the predicted rainfall value for collection time t2, denoted y'(t2);

and so on;

the normalized radar echo map sequence corresponding to collection time tM in the normalized actual rainfall data set List' is taken as the input of the 3D convolution-GRU neural network model, and the output of the model is taken as the predicted rainfall value for collection time tM, denoted y'(tM).

The 3D convolution-GRU neural network model is then trained in turn with the normalized radar echo map sequences of collection times t1 through tM, so as to obtain an optimized 3D convolution-GRU neural network model. In step S4 of this embodiment, the optimized 3D convolution-GRU neural network model is obtained by training as follows:

Step S41: obtain the predicted rainfall value output by the 3D convolution-GRU neural network model at any collection time;

Step S42: obtain the actual rainfall value of the target area at that collection time;

Step S43: construct the loss function of the 3D convolution-GRU neural network model and obtain the loss function value of the 3D convolution-GRU neural network model; the loss function of the 3D convolution-GRU neural network model is as follows:

$$\Gamma = \sum_{t}\big(y(t) - y'(t)\big)^{2}$$

where Γ denotes the loss function value of the 3D convolution-GRU neural network model, y(t) denotes the actual rainfall value of the target area at collection time t, and y'(t) denotes the predicted rainfall value of the target area at collection time t output by the 3D convolution-GRU neural network model;

Step S44: make a judgment according to the obtained loss function value of the 3D convolution-GRU neural network model:

when the change in the loss function value of the 3D convolution-GRU neural network model is stable, take the 3D convolution-GRU neural network model as the optimized 3D convolution-GRU neural network model; otherwise, return to step S41.

For example, first obtain the predicted rainfall values y'(t1), y'(t2), ..., y'(tM) output by the 3D convolution-GRU neural network model at collection times t1, t2, ..., tM; then obtain the corresponding actual rainfall values y(t1), y(t2), ..., y(tM) of target area A at these collection times; next, construct the loss function of the 3D convolution-GRU neural network model and obtain its loss function value, which is:

$$\Gamma = \sum_{t=t_1}^{t_M}\big(y(t) - y'(t)\big)^{2}$$

When the change in the loss function value Γ of the 3D convolution-GRU neural network model is stable, i.e., when Γ varies only within a small range, for example within a preset small value range, the 3D convolution-GRU neural network model is taken as the optimized 3D convolution-GRU neural network model; otherwise, the predicted rainfall values output by the 3D convolution-GRU neural network model at the collection times are obtained again and the subsequent steps are executed in order.
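
A training-loop sketch for steps S41 to S44, assuming a squared-error loss between the predicted value y'(t) and the actual value y(t) (the exact loss formula appears only as an image in the original filing) and the Adam optimizer; the learning rate, epoch limit, and stability tolerance are illustrative assumptions:

```python
import torch
import torch.nn as nn

def train_conv3d_gru(model, loader, epochs=50, lr=1e-3, stability_tol=1e-4):
    """Train until the loss value stops changing noticeably (steps S41-S44, sketched).

    loader yields (seq, rainfall) pairs, where seq has shape
    (batch, steps, channels, depth, height, width) and rainfall has shape (batch,).
    """
    criterion = nn.MSELoss()                              # assumed squared-error loss
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    prev_loss = None
    for _ in range(epochs):
        epoch_loss = 0.0
        for seq, rainfall in loader:
            optimizer.zero_grad()
            pred = model(seq)                             # y'(t): predicted rainfall
            loss = criterion(pred, rainfall)              # compare with y(t)
            loss.backward()
            optimizer.step()
            epoch_loss += loss.item()
        # Step S44: stop once the loss value is stable (change within tolerance).
        if prev_loss is not None and abs(prev_loss - epoch_loss) < stability_tol:
            break
        prev_loss = epoch_loss
    return model
```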

Step S5: collect the radar echo map sequence of the target area at the current time, normalize the radar echo map sequence, input the normalized radar echo map sequence into the optimized 3D convolution-GRU neural network model, and take the output of the optimized 3D convolution-GRU neural network model as the predicted rainfall value of the target area in a future time period.

Since an optimized 3D convolution-GRU neural network model has been obtained through the training in step S4, the radar echo map sequence of target area A at the current time is now collected, each radar echo map in the sequence is normalized, the normalized radar echo map sequence is input into the optimized 3D convolution-GRU neural network model, and the output of the optimized 3D convolution-GRU neural network model is taken as the predicted rainfall value of the target area in the future time period.
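
A short inference sketch for step S5: the current echo sequence is normalized with the same pre-processing used for training and passed through the optimized model; the tensor layout follows the earlier sketches and is an assumption:

```python
import numpy as np
import torch

def predict_future_rainfall(model, current_echo_sequence):
    """current_echo_sequence: numpy array of shape (steps, depth, height, width)."""
    normalized = np.stack(
        [preprocess_echo_image(frame) for frame in current_echo_sequence]
    )
    # add batch and channel dimensions: (1, steps, 1, depth, height, width)
    x = torch.from_numpy(normalized).unsqueeze(0).unsqueeze(2).float()
    model.eval()
    with torch.no_grad():
        return model(x).item()   # predicted rainfall for the future time period
```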

It is well known to those skilled in the art that, when rainfall is predicted from radar echo map sequences, the radar echo map sequence at a certain time reflects the rainfall at that time, i.e., the radar echo map sequence and the rainfall value at the same time correspond to each other. Therefore, in order to compare the performance of different short-term heavy rainfall prediction methods more intuitively, this embodiment uses the radar echo map sequence predicted for a future time period to characterize the rainfall in that period. Specifically, in order to compare the performance of the deep learning-based short-term heavy rainfall prediction method in this embodiment, this embodiment also provides the predicted output image sequence of the radar echo image sequence for the next 2 hours obtained with a traditional method based on a 2D convolutional neural network (see FIG. 3) and the real output image sequence of the radar echo image sequence for the next 2 hours (see FIG. 5).

From a comparison of FIG. 3, FIG. 4, and FIG. 5, it can be seen that the output image sequence predicted by the short-term heavy rainfall prediction method in this embodiment is clearer; the method better captures the temporal and spatial features of the radar echo maps of the target area at different heights and thus predicts future rainfall more accurately.

Although the preferred embodiments of the present invention have been described in detail above, it should be clearly understood that various modifications and variations of the present invention will be apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (6)

1. A short-term heavy rainfall prediction method based on deep learning, characterized by comprising the following steps S1 to S5:
step S1, collecting radar echo map sequences and actual rainfall values of a target area at different collection times in advance, and forming an actual rainfall data set of the target area from all the collected radar echo map sequences and actual rainfall values; wherein, in the actual rainfall data set of the target area, the radar echo map sequence and the actual rainfall value at the same collection time are in one-to-one correspondence;
step S2, normalizing each radar echo map in the actual rainfall data set of the target area to obtain a normalized actual rainfall data set of the target area;
step S3, constructing a 3D convolution-GRU neural network model in advance; wherein the 3D convolution-GRU neural network model comprises a 3D convolutional neural network and a GRU neural network, the input of the 3D convolution-GRU neural network model is the input of the 3D convolutional neural network, the output of the 3D convolutional neural network is the input of the GRU neural network, and the output of the GRU neural network is the output of the 3D convolution-GRU neural network model;
step S4, taking the radar echo map sequence at each collection time in the normalized actual rainfall data set as the input of the 3D convolution-GRU neural network model, taking the output of the 3D convolution-GRU neural network model as the predicted rainfall value for that collection time, and training the 3D convolution-GRU neural network model with the normalized actual rainfall data set to obtain an optimized 3D convolution-GRU neural network model;
and step S5, collecting the radar echo map sequence of the target area at the current time, normalizing each radar echo map in the radar echo map sequence, inputting the normalized radar echo map sequence into the optimized 3D convolution-GRU neural network model, and taking the output of the optimized 3D convolution-GRU neural network model as the predicted rainfall value of the target area in a future time period.
2. The method for forecasting short-term heavy rainfall based on deep learning of claim 1, wherein in step S2, before the normalization process is performed on each radar echo map in the target area actual rainfall data set, the method further comprises:
processing each radar echo map in the target area actual rainfall data set into a gray scale map through linear transformation; wherein the formula of the linear transformation processing is g'(d,e) = K·g(d,e) + B, g(d,e) represents the pixel value of the acquired radar echo image, K represents the slope, B represents the intercept, and g'(d,e) represents the pixel value corresponding to the gray scale image after the linear transformation processing;
and filtering the obtained gray level image by adopting a bilinear filter, and taking the gray level image after filtering as a radar echo image needing normalization processing.
3. The method for forecasting short-term heavy rainfall based on deep learning of claim 2, wherein in step S3, the 3D convolutional neural network of the constructed 3D convolutional-GRU neural network model is as follows:
$$v_{ij}^{xyz} = \sigma\Big(b_{ij} + \sum_{m}\sum_{p=0}^{P_i-1}\sum_{q=0}^{Q_i-1}\sum_{r=0}^{R_i-1} w_{ijm}^{pqr}\, v_{(i-1)m}^{(x+p)(y+q)(z+r)}\Big)$$
wherein v_ij^xyz represents an output of the j-th feature map of the i-th layer of the 3D convolutional neural network, x and y represent spatial dimensions of a normalized radar echo map input into the 3D convolutional neural network, z represents a time dimension of the sequence of normalized radar echo maps input into the 3D convolutional neural network, σ(·) represents an activation function, b_ij represents a bias of the j-th feature map of the i-th layer of the 3D convolutional neural network, p, q and r represent convolution indices, P_i, Q_i and R_i represent the sizes of the convolution kernel in the 3D convolutional neural network, w_ijm^pqr represents the weight of the connection to the (p, q, r)-th neuron of the m-th feature map, and v_(i-1)m^(x+p)(y+q)(z+r) represents a value of the sequence of normalized radar echo maps input into the 3D convolutional neural network.
4. The method for forecasting short-term heavy rainfall based on deep learning of claim 3, wherein in step S3, the GRU neural network of the constructed 3D convolution-GRU neural network model is as follows:
$$Z_t = \sigma(W_Z \cdot [h_{t-1}, X_t])$$
$$r_t = \sigma(W_r \cdot [h_{t-1}, X_t])$$
$$h'_t = \tanh(W \cdot [r_t * h_{t-1}, X_t])$$
$$h_t = (1 - Z_t) * h_{t-1} + Z_t * h'_t$$
where σ(·) denotes the activation function, W_Z denotes the weight of the update gate Z_t, h_t denotes the output of the current neural unit in the GRU neural network, h_{t-1} denotes the output of the previous neural unit in the GRU neural network, X_t denotes the input of the current neural unit in the GRU neural network, W_Z·[h_{t-1}, X_t] denotes multiplying the weight W_Z by the result of combining the output h_{t-1} with the input X_t, h'_t denotes the amount of information obtained from the output h_{t-1} under the control of r_t, and tanh(·) denotes the commonly used hyperbolic tangent activation function.
5. The method for forecasting short-term heavy rainfall based on deep learning of claim 4, wherein in step S4, the optimized 3D convolution-GRU neural network model is trained by:
step S41, acquiring a rainfall prediction value output by the 3D convolution-GRU neural network model at any acquisition time;
step S42, acquiring the rainfall actual value corresponding to the target area at any acquisition time;
step S43, constructing a loss function of the 3D convolution-GRU neural network model, and obtaining a loss function value of the 3D convolution-GRU neural network model; wherein, the loss function of the 3D convolution-GRU neural network model is as follows:
$$\Gamma = \sum_{t}\big(y(t) - y'(t)\big)^{2}$$
wherein, Γ represents a loss function value of the 3D convolution-GRU neural network model, y (t) represents an actual rainfall value of the target area at the acquisition time t, and y' (t) represents a predicted rainfall value of the target area output by the 3D convolution-GRU neural network model at the acquisition time t;
step S44, making judgment according to the loss function value of the obtained 3D convolution-GRU neural network model:
when the change of the loss function value of the 3D convolution-GRU neural network model is stable, taking the 3D convolution-GRU neural network model as an optimized 3D convolution-GRU neural network model; otherwise, the process proceeds to step S41.
6. The method for predicting short-term heavy rainfall based on deep learning of any one of claims 1 to 5, wherein in step S1, radar echo map sequences of the target area at different acquisition times within 24h are acquired in advance at an acquisition frequency of 1 frame/6 min.
CN202110317764.3A 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning Pending CN112949934A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110317764.3A CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110317764.3A CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Publications (1)

Publication Number Publication Date
CN112949934A true CN112949934A (en) 2021-06-11

Family

ID=76228075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110317764.3A Pending CN112949934A (en) 2021-03-25 2021-03-25 Short-term heavy rainfall prediction method based on deep learning

Country Status (1)

Country Link
CN (1) CN112949934A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046089A (en) * 2015-08-13 2015-11-11 University of Electronic Science and Technology of China Method for predicting strong rainfall and flood disasters
CN109376848A (en) * 2018-09-01 2019-02-22 Harbin Engineering University A Simplified Gated Unit Neural Network
CN111476713A (en) * 2020-03-26 2020-07-31 Central South University Weather image intelligent recognition method and system based on multi-depth convolutional neural network fusion
CN112415521A (en) * 2020-12-17 2021-02-26 Nanjing University of Information Science and Technology Nowcasting method of radar echo with strong spatiotemporal characteristics based on CGRU

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
刘东: "《工业机器视觉 基于灵闪平台的开发及应用》", 上海教育出版社, pages: 83 *
陈晓平等: "基于机器学习的降雨量雷达回波数据建模与预测", 《南京信息工程大学学报》, vol. 12, no. 4, pages 483 - 494 *
陈程: "卷积神经网络在气象短临预报的研究与应用", 《中国优秀硕士学位论文全文数据库 基础科学辑》, no. 12, pages 009 - 16 *
陈颖等: "基于3D双流卷积神经网络和GRU网络的人体行为识别", 《计算机应用与软件》, no. 5, pages 170 - 174 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113568068A (en) * 2021-07-22 2021-10-29 Henan University A Prediction Method of Severe Convective Weather Based on MPI Parallel 3D Neural Network
CN113568068B (en) * 2021-07-22 2022-03-29 Henan University Strong convection weather prediction method based on MPI parallel three-dimensional neural network
CN114021349A (en) * 2021-11-05 2022-02-08 Guangzhou Power Supply Bureau of Guangdong Power Grid Co., Ltd. Method, system and device for predicting heavy rainfall and computer storage medium
CN114091765A (en) * 2021-11-25 2022-02-25 Shanxi Yongli Information Technology Co., Ltd. Future rainfall prediction method based on space-time bidirectional multi-granularity dynamic integration
CN114186743A (en) * 2021-12-13 2022-03-15 Ping An Property & Casualty Insurance Company of China, Ltd. Dangerous area prediction method, device, equipment and storage medium
CN114509825A (en) * 2021-12-31 2022-05-17 Henan University Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN114509825B (en) * 2021-12-31 2022-11-08 Henan University Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN116520459A (en) * 2023-06-28 2023-08-01 Chengdu University of Information Technology A Method of Weather Forecasting
CN116520459B (en) * 2023-06-28 2023-08-25 Chengdu University of Information Technology A Method of Weather Forecasting
CN116755181A (en) * 2023-08-11 2023-09-15 Shenzhen Kunte Technology Co., Ltd. Precipitation prediction method and related device
CN116755181B (en) * 2023-08-11 2023-10-20 Shenzhen Kunte Technology Co., Ltd. Precipitation prediction method and related device

Similar Documents

Publication Publication Date Title
CN112949934A (en) Short-term heavy rainfall prediction method based on deep learning
CN112446419B (en) Attention mechanism-based space-time neural network radar echo extrapolation prediction method
CN110033002B (en) License plate detection method based on multi-task cascaded convolutional neural network
WO2022036777A1 (en) Method and device for intelligent estimation of human body movement posture based on convolutional neural network
CN112633497A (en) Convolutional pulse neural network training method based on reweighted membrane voltage
Bi et al. Evacuation route recommendation using auto-encoder and Markov decision process
Zhu et al. Ensemble methodology: Innovations in credit default prediction using lightgbm, xgboost, and localensemble
CN110232169A (en) Track denoising method based on two-way length memory models and Kalman filtering in short-term
CN112945162B (en) A kind of accumulation layer landslide displacement prediction model and prediction method
CN104200471A (en) SAR image change detection method based on adaptive weight image fusion
CN113239722B (en) Deep learning based strong convection extrapolation method and system under multi-scale
CN106384339A (en) Infrared night vision image enhancement method
CN112766165B (en) Falling pre-judging method based on deep neural network and panoramic segmentation
CN108563977A (en) A kind of the pedestrian's method for early warning and system of expressway entrance and exit
CN113011305A (en) SAR image road extraction method and device based on semantic segmentation and conditional random field
Zhang et al. Surface and high-altitude combined rainfall forecasting using convolutional neural network
Chaurasiya et al. Deep dilated CNN based image denoising
CN104091350A (en) Object tracking method achieved through movement fuzzy information
Wang et al. Combined model of air quality index forecasting based on the combination of complementary empirical mode decomposition and sequence reconstruction
Adi et al. Analysis and Detection of Cholesterol by Wavelets based and ANN Classification
CN112232102B (en) Building target recognition method and system based on deep neural network and multi-task learning
CN114638441B (en) An ocean current monitoring and early warning system based on satellite remote sensing images
CN118898545B (en) A multi-level collaborative mapping method for fusion of hyperspectral and multispectral remote sensing images
CN116029420A (en) Equipment life prediction method based on S transformation and CNN network
CN115081920A (en) Attendance check-in scheduling management method, system, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210611)