CN114881286A - Short-time rainfall prediction method based on deep learning - Google Patents

Short-time rainfall prediction method based on deep learning

Info

Publication number
CN114881286A
CN114881286A (application CN202210351766.9A)
Authority
CN
China
Prior art keywords
rainfall
layer
prediction model
module
short
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210351766.9A
Other languages
Chinese (zh)
Inventor
陈晓楠
赵建宇
陈鲁刚
谢志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Maritime University
Original Assignee
Dalian Maritime University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Maritime University filed Critical Dalian Maritime University
Priority to CN202210351766.9A priority Critical patent/CN114881286A/en
Publication of CN114881286A publication Critical patent/CN114881286A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a short-time rainfall prediction method based on deep learning, belonging to the technical fields of computer vision and meteorological service, which comprises the following steps: collecting radar echo sequences of rainfall processes in a target area over a recent period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion; preprocessing the rainfall data in the rainfall data set to obtain a preprocessed rainfall data set; constructing a rainfall prediction model based on an Inception module and an efficient channel attention (ECA) module introduced into the convolutional part, together with a bidirectional long short-term memory (BiLSTM) neural network; training the rainfall prediction model on the preprocessed training set data to obtain a trained rainfall prediction model; and inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to predict rainfall in the target area within the next 1 to 2 hours. The method addresses the large prediction errors of existing rainfall prediction methods and improves rainfall prediction accuracy.

Description

Short-time rainfall prediction method based on deep learning
Technical Field
The invention relates to the technical field of computer vision and meteorological service, in particular to a short-time rainfall prediction method based on deep learning.
Background
Weather disasters caused by short-time rainfall occur frequently, are extremely harmful to society, and seriously threaten people's lives and property, so short-time rainfall forecasting has long been an important research problem in the field of weather forecasting. Its purpose is to predict, accurately and in time, the rainfall intensity over a relatively short period (generally 0-6 hours) for a given local area. At present, the precipitation forecasting method commonly used operationally at home and abroad is radar echo extrapolation based on the optical flow method. This method computes the temporal change of pixels between echo images to obtain pixel motion vectors, predicts the radar echo state at future times by linear or Lagrangian extrapolation, and inverts the rainfall corresponding to the echo through the Z-I relationship. On the one hand, performing rainfall prediction in multiple steps makes errors easy to accumulate; on the other hand, the effective prediction horizon of radar echo extrapolation is typically less than 1 hour. Therefore, using this method to predict rainfall 1-2 hours ahead for a target area degrades prediction accuracy. In recent years, with the development of deep learning, the technique has been applied across many industries; since the meteorological field contains a large amount of data, it is well suited to deep learning as a data-driven method.
Disclosure of Invention
In order to improve the precision of rainfall prediction and reduce errors, and in view of the characteristics of radar echo sequences, the invention adopts a short-time rainfall prediction method based on deep learning, which comprises the following steps:
collecting radar echo sequences of rainfall processes in a target area over a recent period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion;
preprocessing the rainfall data in the rainfall data set to obtain a preprocessed rainfall data set;
constructing a rainfall prediction model based on an Inception module and an efficient channel attention module introduced into the convolutional part, together with a bidirectional long short-term memory neural network;
training the rainfall prediction model on the preprocessed training set data to obtain a trained rainfall prediction model;
and inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to realize short-time rainfall prediction for the target area, where "short time" refers to the next 1 to 2 hours.
Further: the preprocessing of the rainfall data in the rainfall data set comprises the following steps:
S21, using the K-nearest-neighbour algorithm, finding the K points most similar to the neighbourhood of each missing value in the radar echo sequence by Euclidean distance, and taking the mean of these K points as the value of the missing entry to obtain an image with missing values filled in;
S22, removing Gaussian noise from the filled image with a Gaussian filter to obtain a Gaussian-filtered image;
and S23, normalizing the Gaussian-filtered image to obtain a normalized image.
Further: the rainfall prediction model comprises 14 layers, structured as follows:
the first layer is a convolutional layer with 64 convolution kernels of size 3x3;
the second layer is a pooling layer with kernel size 3x3 and stride 2;
the third layer is a convolutional layer with 128 convolution kernels of size 3x3;
the fourth layer is a pooling layer with kernel size 3x3 and stride 2;
the fifth layer is Inception module I;
the sixth layer is Inception module II followed by a channel attention ECA module;
the seventh layer is a pooling layer with kernel size 3x3 and stride 2;
the eighth layer flattens the spatial features of each frame into a vector (the dimension change applied between the convolutional part and the BiLSTM part);
the ninth to eleventh layers are three BiLSTM layers, each with 256 hidden-layer nodes;
the twelfth to fourteenth layers are three fully connected layers with 512, 128 and 1 neurons respectively;
Dropout with parameter 0.5 is applied after each BiLSTM layer and each fully connected layer.
Further: Inception module I and Inception module II have the same structure; Inception module I comprises a first layer group and a second layer group, connected in series;
the first layer group comprises three 1x1 convolutional layers and one 3x3 max-pooling layer;
the second layer group comprises one 1x1 convolutional layer, one 3x3 convolutional layer and one 5x5 convolutional layer.
Further: the rainfall prediction model uses the root mean square error as its loss function, whose formula is as follows:

$$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^2}$$

wherein n denotes the number of samples, and y_i and ŷ_i respectively denote the actual rainfall and the predicted rainfall corresponding to the i-th sample sequence.
A short-term precipitation prediction device based on deep learning, comprising:
an acquisition module: used for collecting radar echo sequences of rainfall processes in a target area over a recent period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion;
a preprocessing module: used for preprocessing the rainfall data in the rainfall data set to obtain a preprocessed rainfall data set;
a construction module: used for constructing a rainfall prediction model based on an Inception module and an efficient channel attention module introduced into the convolutional part, together with a bidirectional long short-term memory neural network;
a training module: used for training the rainfall prediction model on the preprocessed training set data to obtain a trained rainfall prediction model;
and a prediction module: used for inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to realize rainfall prediction for the target area within the next 1-2 hours.
The invention provides a short-time rainfall prediction method based on deep learning. It exploits the strong feature-representation capability of CNNs for image data and the ability of LSTMs to mine latent patterns in historical time-series data, combining the two to extract the spatio-temporal features of echo sequences. A rainfall prediction model based on CNN-BiLSTM is constructed: the radar echo sequence collected over the target area is input to the CNN part to extract spatial features, and the BiLSTM part then analyses and predicts from the temporal context of those spatial features. The advantages and positive effects of the invention are:
(1) rainfall prediction in the target area within the next 1 to 2 hours is realized from the radar echo sequence, improving rainfall prediction accuracy and reducing prediction error;
(2) the deep learning method is applied to weather prediction, making full use of abundant meteorological data and achieving data-driven rainfall prediction.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a general flow diagram of the present method;
FIG. 2 is a diagram of a rainfall prediction model architecture;
FIG. 3 is a structural diagram of the Inception module.
Detailed Description
It should be noted that, in the case of conflict, the embodiments and features of the embodiments of the present invention may be combined with each other, and the present invention will be described in detail with reference to the accompanying drawings and embodiments.
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. Any specific values in all examples shown and discussed herein are to be construed as exemplary only and not as limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In the description of the present invention, it is to be understood that the orientations or positional relationships indicated by directional terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom" are generally based on the orientations or positional relationships shown in the drawings and are used only for convenience and simplicity of description. In the absence of any contrary indication, these directional terms are not intended to indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be considered as limiting the scope of the present invention; the terms "inner" and "outer" refer to inside and outside relative to the profile of the respective component itself.
Spatially relative terms, such as "above", "over" and "on the upper surface of", may be used herein for ease of description to describe the spatial relationship of one device or feature to another as illustrated in the figures. It will be understood that such terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "over" other devices or features would then be oriented "below" or "under" them. Thus, the exemplary term "above" can encompass both an orientation of "above" and one of "below". The device may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein are interpreted accordingly.
It should be noted that the terms "first", "second", and the like are used to define the components, and are only used for convenience of distinguishing the corresponding components, and the terms have no special meanings unless otherwise stated, and therefore, the scope of the present invention should not be construed as being limited.
As shown in FIG. 1, the short-time precipitation prediction method based on deep learning comprises the following steps:
s1, collecting a radar echo sequence of a rainfall process in a target area in a recent certain period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion;
s2, preprocessing rainfall data in the rainfall data set to obtain a preprocessed rainfall data set;
S3: constructing a rainfall prediction model based on an Inception module and an efficient channel attention (ECA) module introduced into the convolutional layers, together with a bidirectional long short-term memory neural network (BiLSTM);
S4: training the rainfall prediction model on the preprocessed training set data to obtain a trained rainfall prediction model;
and S5: inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to realize rainfall prediction for the target area within the next 1 to 2 hours.
Steps S1, S2, S3, S4 and S5 are performed in sequence.
further, a radar echo sequence of a rainfall process in a target area in a recent certain period of time is collected, a rainfall data set is constructed, and the data set is divided into a training set and a testing set according to a preset proportion; the specific process is as follows:
Radar echo sequences of the target area during historical rainfall processes are collected, and radar echo images at 15 consecutive moments, t1 to t15, are selected from each sequence; i denotes the i-th sample, i.e. the i-th radar image sequence, t1 denotes the radar echo map at the first moment of the sequence, and t15 denotes the echo map at its 15th moment. A sample in this application is thus one image sequence of 15 images, the subscripts t1 to t15 expressing the temporal order of the sequence, and the whole data set contains many such samples. Each image is cropped to a size of 101x101 km centred on the target area, and the actual rainfall y_i within 1 to 2 hours after the last acquisition moment is recorded, thereby constructing the rainfall data set. The data set is divided into a training set and a test set according to a preset proportion; the training set is used to train the model, and the test set is used to verify the rainfall prediction accuracy of the model.
Rainfall radar data are collected in units of years, and suitable samples are then selected from them; because rainfall mainly occurs during the flood season, data are generally collected over several years and then screened, as shown in FIG. 1.
When radar echo maps spanning 1.5 hours are used to predict rainfall, a 1.5-hour radar echo sequence is cut out every 1.5 hours within each rainfall process, the rainfall one hour later is recorded, and the echo sequence and the rainfall together form one sample. The data in this embodiment are derived from rainfall data of the Shenzhen Meteorological Bureau for 2014-2016.
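The windowing described above can be sketched as follows. The 6-minute scan interval implied by 15 frames per 1.5 hours, and all function and variable names, are assumptions for illustration:

```python
import numpy as np

def build_samples(echo_maps, rain_gauge, frames_per_sample=15):
    """Cut non-overlapping 15-frame windows (1.5 h at an assumed 6-minute
    radar scan interval) from one rainfall event and pair each window with
    the rainfall observed after its last frame.

    echo_maps  : array of shape (T, 101, 101), one cropped echo map per scan
    rain_gauge : array of shape (T,), aligned so that rain_gauge[t] is the
                 rainfall label following scan t
    """
    samples, labels = [], []
    for start in range(0, len(echo_maps) - frames_per_sample + 1, frames_per_sample):
        end = start + frames_per_sample
        samples.append(echo_maps[start:end])   # the 15-frame echo sequence
        labels.append(rain_gauge[end - 1])     # rainfall after the last frame
    return np.stack(samples), np.array(labels)

# tiny demo with synthetic data: 45 scans -> 3 non-overlapping samples
echo = np.random.rand(45, 101, 101).astype(np.float32)
rain = np.random.rand(45).astype(np.float32)
X, y = build_samples(echo, rain)
print(X.shape, y.shape)  # (3, 15, 101, 101) (3,)
```

Non-overlapping windows match the "every 1.5 hours" wording; a sliding window with a smaller stride would yield more samples at the cost of correlated training data.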
Further, preprocessing rainfall data in the rainfall data set to obtain a preprocessed rainfall data set; the specific process is as follows:
S21, during data acquisition, when the radar encounters obstacles, the echo data become abnormal, i.e. some values are missing; missing values are recorded as -1 on the echo image. The method uses the K-nearest-neighbour algorithm to find the K points most similar to each missing value by Euclidean distance and takes their mean as the value of the missing entry, obtaining an image with missing values filled in; K is set to 3.
S22, Gaussian noise is removed from the filled image with a Gaussian filter: each pixel in the image is scanned with a 3x3 template (a matrix of size 3x3), and the value of the template's central pixel is replaced with the weighted average of the pixels in the neighbourhood determined by the template.
S23, the Gaussian-filtered image is normalized: since the original image values range from 0 to 255, each value in the image is divided by 255 to convert it into the range 0-1.0.
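A compact NumPy sketch of the three preprocessing steps, assuming missing values are marked -1 and using the classic 3x3 Gaussian template with weights 1-2-1 / 2-4-2 / 1-2-1 (the patent does not give the template's weights):

```python
import numpy as np

K = 3  # the patent sets K = 3 for the K-nearest-neighbour fill

def fill_missing(img, missing=-1):
    """S21: replace pixels marked `missing` with the mean of the K nearest
    valid pixels, nearest in Euclidean distance over image coordinates."""
    out = img.astype(np.float64).copy()
    miss = np.argwhere(img == missing)
    valid = np.argwhere(img != missing)
    vals = img[img != missing].astype(np.float64)
    for (r, c) in miss:
        d = np.hypot(valid[:, 0] - r, valid[:, 1] - c)
        out[r, c] = vals[np.argsort(d)[:K]].mean()
    return out

def gaussian_3x3(img):
    """S22: 3x3 Gaussian template -- each pixel becomes the weighted mean
    of its neighbourhood (edges handled by reflection padding)."""
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=np.float64) / 16.0
    p = np.pad(img, 1, mode="reflect")
    out = np.zeros_like(img, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def preprocess(img):
    """S21 fill -> S22 Gaussian filter -> S23 scale 0-255 values into [0, 1]."""
    return gaussian_3x3(fill_missing(img)) / 255.0

# demo: a constant image with one missing pixel stays constant after filling
img = np.full((8, 8), 100.0)
img[3, 3] = -1
out = preprocess(img)
print(out.min(), out.max())  # both 100/255
```

The brute-force neighbour search is O(missing x valid) per image, which is adequate for 101x101 echo maps; a KD-tree would be the natural replacement at larger scales.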
Further, a rainfall prediction model is constructed based on an Inception module and an efficient channel attention module ECA introduced into the convolutional part, together with a bidirectional long short-term memory neural network.
The constructed rainfall prediction model is shown in FIG. 2. To improve rainfall prediction accuracy, an Inception module and an efficient channel attention module (ECA) are introduced into the convolutional part on top of ordinary convolutional layers, and the unidirectional long short-term memory network (LSTM) is replaced by a bidirectional one (BiLSTM). Specifically, the rainfall prediction model comprises 14 layers:
the first layer is a convolutional layer with 64 convolution kernels of size 3x3;
the second layer is a pooling layer with kernel size 3x3 and stride 2; the pooling layer removes redundant information and prevents overfitting;
the third layer is a convolutional layer with 128 convolution kernels of size 3x3;
the fourth layer is a pooling layer with kernel size 3x3 and stride 2;
the fifth layer is Inception module I, whose internal structure is shown in FIG. 3; this module extracts information at different scales from the echo image;
the sixth layer consists of Inception module II with a channel attention ECA module added after it, which assigns different weights to the extracted feature maps according to their importance;
the seventh layer is a pooling layer with kernel size 3x3 and stride 2;
the eighth layer flattens the spatial features extracted for each frame into a vector (the dimension change applied between the convolutional part and the BiLSTM part);
the ninth to eleventh layers are three BiLSTM layers with 256 hidden-layer nodes; the BiLSTM fully considers the mutual influence between earlier and later time-series information;
the twelfth to fourteenth layers are three fully connected layers with 512, 128 and 1 neurons respectively, which produce the prediction result. Dropout with parameter 0.5 is applied after each BiLSTM layer and each fully connected layer to prevent overfitting.
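The convolutional part (layers 1 to 7) can be sketched in PyTorch as below. The Inception branch widths are assumptions chosen so that module II outputs 832 channels, and the pooling layers are assumed to use padding 1; with 3x3 kernels and stride 2 a 101x101 frame then shrinks 101 → 51 → 26 → 13, reproducing the 13x13x832 feature size stated in the embodiment:

```python
import torch
import torch.nn as nn

class Inception(nn.Module):
    """Four-branch Inception block (1x1 / 1x1-3x3 / 1x1-5x5 / pool-1x1).
    Branch widths are illustrative; only the concatenated total matters here."""
    def __init__(self, cin, c1, c3, c5, cp):
        super().__init__()
        self.b1 = nn.Conv2d(cin, c1, 1)
        self.b2 = nn.Sequential(nn.Conv2d(cin, c3, 1), nn.ReLU(),
                                nn.Conv2d(c3, c3, 3, padding=1))
        self.b3 = nn.Sequential(nn.Conv2d(cin, c5, 1), nn.ReLU(),
                                nn.Conv2d(c5, c5, 5, padding=2))
        self.b4 = nn.Sequential(nn.MaxPool2d(3, 1, 1), nn.Conv2d(cin, cp, 1))
    def forward(self, x):
        return torch.relu(torch.cat(
            [b(x) for b in (self.b1, self.b2, self.b3, self.b4)], dim=1))

class ECA(nn.Module):
    """Efficient channel attention: global average pool, a 1-D convolution
    across channels, then a sigmoid gate re-weighting each channel."""
    def __init__(self, k=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, k, padding=k // 2, bias=False)
    def forward(self, x):
        w = self.conv(x.mean(dim=(2, 3)).unsqueeze(1)).squeeze(1)
        return x * torch.sigmoid(w)[:, :, None, None]

# layers 1-7; each pooling uses kernel 3, stride 2, padding 1, which takes a
# 101x101 frame through 51 -> 26 -> 13, matching the 13x13x832 feature map
cnn = nn.Sequential(
    nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),    # layer 1
    nn.MaxPool2d(3, 2, 1),                        # layer 2
    nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),  # layer 3
    nn.MaxPool2d(3, 2, 1),                        # layer 4
    Inception(128, 128, 128, 96, 64),             # layer 5 (416 channels out)
    Inception(416, 256, 320, 128, 128),           # layer 6 (832 channels out)
    ECA(),                                        #   + channel attention
    nn.MaxPool2d(3, 2, 1),                        # layer 7
)

feat = cnn(torch.zeros(2, 1, 101, 101))
print(feat.shape)  # torch.Size([2, 832, 13, 13])
```

In practice the 15 frames of a sample would be stacked into the batch dimension, run through this extractor once, and the resulting per-frame features handed to the BiLSTM part.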
FIG. 3 shows the structure of the Inception module.
Further, Inception module I and Inception module II have the same structure; Inception module I comprises two layer groups, a first layer group and a second layer group:
the first layer group comprises three 1x1 convolutions and one 3x3 max-pooling layer;
the second layer group comprises one 3x3 convolutional layer, one 5x5 convolutional layer and one 1x1 convolutional layer.
Further, the preprocessed rainfall data of the test set are input into the trained rainfall prediction model to realize rainfall prediction for the target area within the next 1 to 2 hours.
The normalized image sequence is input into the rainfall prediction model, and the first to seventh layers extract the spatial features of each echo frame, each image yielding one output feature vector. That is, one sample (comprising 15 radar images) is input at a time; each image produces one output through the CNN part, so one image corresponds to 1 feature vector and the 15 frames correspond to 15 feature vectors.
The spatial features of each frame in the sequence are then concatenated by a direct dimension change (for example, a 5x5 matrix of 5 rows and 5 columns becomes 1 row and 25 columns) into a two-dimensional vector that serves as the input of the BiLSTM part, and the three BiLSTM layers analyse and predict from the temporal context of the extracted features to obtain the prediction result.
The rainfall prediction model uses the Root Mean Square Error (RMSE) as its loss function, calculated as in formula (1):

$$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^2}\tag{1}$$

wherein n denotes the number of samples, and y_i and ŷ_i respectively denote the actual rainfall and the predicted rainfall corresponding to the i-th sample sequence. The loss between the model's prediction and the actual rainfall is computed for each sample, and the model is trained iteratively by gradient descent until the result of the rainfall prediction model meets the threshold requirement, at which point the model is retained. The loss function is evaluated throughout training; when its value no longer decreases, the model has converged and is saved.
The image sequences preprocessed in step S2 (each sequence comprising 15 frames of radar echo images) are input into the model for training. As described in step S3, the model has fourteen layers. The first to seventh layers form the CNN part, which extracts image features: each frame (of dimension 101x101) enters at the first layer, passes through the subsequent layers in turn, and its final features are output by the seventh layer with size 13x13x832; the CNN output for the 15 frames therefore has size 15x13x13x832. The features obtained by the CNN part for each radar echo frame then undergo a dimension change: each 13x13x832 feature is flattened into a two-dimensional vector of 1x140608, so the CNN features of the whole image sequence form a 15x140608 two-dimensional array. This array is input into the three BiLSTM layers (the ninth to eleventh layers), which analyse the temporal context of the input features and output a 1x256 feature vector, which is finally mapped to the sample space through the three fully connected layers to obtain the predicted rainfall.
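A sketch of the recurrent head (layers 9 to 14): three BiLSTM layers of hidden size 256 over the 15-step sequence of flattened frame features, followed by the 512-128-1 fully connected layers with dropout 0.5. The demo deliberately uses a reduced per-frame feature dimension, since instantiating the first BiLSTM layer with the full 140608-dimensional input would allocate hundreds of megabytes of weights:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMHead(nn.Module):
    """Layers 9-14 as described: three BiLSTM layers (hidden size 256) over
    the 15-step sequence of flattened frame features, then fully connected
    layers of 512, 128 and 1 neurons with dropout 0.5."""
    def __init__(self, feat_dim):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, 256, num_layers=3,
                            bidirectional=True, batch_first=True, dropout=0.5)
        self.fc = nn.Sequential(
            nn.Linear(512, 512), nn.ReLU(), nn.Dropout(0.5),  # 2 x 256 -> 512
            nn.Linear(512, 128), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(128, 1),                                # predicted rainfall
        )

    def forward(self, seq_feats):
        # seq_feats: (batch, 15, feat_dim) -- each step is one flattened frame
        out, _ = self.lstm(seq_feats)
        return self.fc(out[:, -1])        # regress from the last time step

# demo with a reduced feature dimension (the embodiment's per-frame features
# are 13*13*832 = 140608-dimensional, too large for a quick example)
head = BiLSTMHead(feat_dim=64)
pred = head(torch.zeros(4, 15, 64))
print(pred.shape)  # torch.Size([4, 1])

# RMSE loss against dummy targets, as used for gradient-descent training
loss = torch.sqrt(F.mse_loss(pred, torch.ones(4, 1)))
```

Taking the last time step of a bidirectional LSTM yields a 512-dimensional vector (256 per direction), which fits the 512-neuron first fully connected layer; other pooling choices over the sequence would also be reasonable.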
Further: a test set is input to verify the performance of the rainfall prediction model.
The test set is input into the trained model to obtain rainfall prediction results, and the overall error is calculated using the RMSE, the mean absolute error (MAE) and the correlation coefficient (CORR) to evaluate the model reasonably. The formulas for the MAE and the correlation coefficient are given in formulas (2) and (3):

$$\mathrm{MAE}=\frac{1}{n}\sum_{i=1}^{n}\left|y(i)-y_p(i)\right|\tag{2}$$

$$\mathrm{CORR}=\frac{\sum_{i=1}^{n}\left(y(i)-y_{avg}(n)\right)\left(y_p(i)-\bar{y}_p(n)\right)}{\sqrt{\sum_{i=1}^{n}\left(y(i)-y_{avg}(n)\right)^2\sum_{i=1}^{n}\left(y_p(i)-\bar{y}_p(n)\right)^2}}\tag{3}$$

wherein y(i) denotes the actual rainfall corresponding to the i-th sample sequence, y_avg(n) the average of the n actual rainfall values, y_p(i) the predicted rainfall corresponding to the i-th sample sequence, and ȳ_p(n) the average of the n predicted rainfall values.
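The three evaluation metrics are straightforward to compute; a NumPy sketch, with CORR implemented as the Pearson correlation coefficient matching formula (3):

```python
import numpy as np

def rmse(y, yp):
    """Root mean square error, as in formula (1)."""
    return float(np.sqrt(np.mean((y - yp) ** 2)))

def mae(y, yp):
    """Mean absolute error, as in formula (2)."""
    return float(np.mean(np.abs(y - yp)))

def corr(y, yp):
    """Pearson correlation between actual and predicted rainfall, formula (3)."""
    dy, dp = y - y.mean(), yp - yp.mean()
    return float(np.sum(dy * dp) / np.sqrt(np.sum(dy ** 2) * np.sum(dp ** 2)))

# demo: predictions with a constant +1 bias are perfectly correlated
y_true = np.array([0.0, 2.0, 4.0, 6.0])
y_pred = np.array([1.0, 3.0, 5.0, 7.0])
print(rmse(y_true, y_pred), mae(y_true, y_pred), corr(y_true, y_pred))
# 1.0 1.0 1.0
```

Note that CORR is insensitive to a constant bias, which is exactly why the table below reports it alongside RMSE and MAE rather than instead of them.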
The method is based on a meteorological data set provided by Shenzhen city, a specific experiment is carried out, each sample sequence is 15 frames, and rainfall in the future 1-2 hours of the corresponding moment of the last frame is predicted. For a total of ten thousand samples, the samples were scaled according to 8: the scale of 2 is divided into a training set and a test set. 8000 training samples are input into the model, continuous iterative training is carried out, finally, the model with optimized parameters is obtained, finally, the model is compared with other algorithms through a test set, and the comparison result is shown in table 1.
Table 1. Comparison with other algorithms on the test set.

| Index       | CNN   | LSTM  | LRCN  | 3DCNN | This model |
| RMSE (mm/h) | 12.41 | 9.72  | 9.55  | 9.24  | 8.41       |
| MAE (mm/h)  | 8.31  | 6.66  | 6.55  | 6.33  | 5.60       |
| CORR        | 70.5% | 75.8% | 77.5% | 78.1% | 82.8%      |
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (6)

1. A short-time rainfall prediction method based on deep learning, characterized in that the method comprises the following steps:
collecting a radar echo sequence of the rainfall process in a target area over a recent period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion;
preprocessing rainfall data in the rainfall data set to obtain a preprocessed rainfall data set;
constructing a rainfall prediction model based on an Inception module and an efficient channel attention module introduced into the convolution part, together with a bidirectional long short-term memory neural network;
training a rainfall prediction model based on the preprocessed training set data to obtain a trained rainfall prediction model;
and inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to realize short-time rainfall prediction for the target area.
2. The method for predicting short-term precipitation based on deep learning according to claim 1, wherein: the preprocessing of the rainfall data in the rainfall data set comprises:
S21, using a K-nearest-neighbor algorithm, finding the K points in the neighborhood of each missing value in the radar echo sequence that are closest by Euclidean distance, and taking the mean of these K points as the value of the missing point, to obtain an image with missing values filled;
S22, removing Gaussian noise from the missing-value-filled image using a Gaussian filter, to obtain a Gaussian-filtered image;
and S23, normalizing the Gaussian-filtered image to obtain a normalized image.
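A minimal sketch of steps S21-S23, assuming missing values are stored as NaN in a single-channel echo image; the brute-force neighbor search and the values k = 4 and sigma = 1.0 are illustrative choices, not taken from the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(frame, k=4, sigma=1.0):
    """S21: fill each missing value (NaN) with the mean of its k nearest
    valid pixels (Euclidean distance in pixel coordinates);
    S22: remove Gaussian noise with a Gaussian filter;
    S23: min-max normalize to [0, 1]."""
    frame = frame.astype(float)
    if np.isnan(frame).any():
        yy, xx = np.nonzero(~np.isnan(frame))   # coordinates of valid pixels
        vals = frame[yy, xx]
        for y, x in zip(*np.nonzero(np.isnan(frame))):
            d = (yy - y) ** 2 + (xx - x) ** 2   # squared Euclidean distances
            frame[y, x] = vals[np.argsort(d)[:k]].mean()
    frame = gaussian_filter(frame, sigma=sigma)
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo) if hi > lo else np.zeros_like(frame)
```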
3. The method for predicting short-term precipitation based on deep learning according to claim 1, wherein: the rainfall prediction model comprises 14 layers:
the first layer is a convolutional layer with 64 convolution kernels of size 3x3;
the second layer is a pooling layer with kernel size 3x3 and step size 2;
the third layer is a convolutional layer with 128 convolution kernels of size 3x3;
the fourth layer is a pooling layer with kernel size 3x3 and step size 2;
the fifth layer is an Inception module I;
the sixth layer is an Inception module II connected with a channel attention (ECA) module;
the seventh layer is a pooling layer with kernel size 3x3 and step size 2;
the ninth to eleventh layers are three BiLSTM layers, each with 256 hidden-layer nodes;
the twelfth to fourteenth layers are three fully connected layers with 512, 128 and 1 neurons, respectively;
a dropout layer with parameter 0.5 is set behind each BiLSTM layer and each fully connected layer.
4. The method for predicting short-term precipitation based on deep learning according to claim 1, wherein: the Inception module I and the Inception module II have the same structure; the Inception module I comprises a first layer group and a second layer group, connected in series;
the first layer group comprises three 1x1 convolutional layers and one 3x3 convolutional layer;
the second layer group comprises one 1x1 convolutional layer, one 3x3 convolutional layer and one 5x5 convolutional layer.
5. The method for predicting short-term precipitation based on deep learning according to claim 1, wherein: the rainfall prediction model uses the root mean square error as its loss function, whose formula is:

\mathrm{RMSE} = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} \left( \hat{y}_i - y_i \right)^2 }

where n denotes the number of samples, and y_i and \hat{y}_i denote the actual and the predicted rainfall corresponding to the i-th sample sequence, respectively.
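The loss in claim 5, in code form (a sketch; the function name is illustrative):

```python
import numpy as np

def rmse(y, y_pred):
    """Root mean square error loss from claim 5."""
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_pred - y) ** 2)))
```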
6. A short-time rainfall prediction device based on deep learning, characterized in that it comprises:
an acquisition module: used for collecting a radar echo sequence of the rainfall process in a target area over a recent period of time, constructing a rainfall data set, and dividing the data set into a training set and a test set according to a preset proportion;
a preprocessing module: used for preprocessing the rainfall data in the rainfall data set to obtain a preprocessed rainfall data set;
a construction module: used for constructing a rainfall prediction model based on an Inception module and an efficient channel attention module introduced into the convolution part, together with a bidirectional long short-term memory neural network;
a training module: used for training the rainfall prediction model on the preprocessed training set data to obtain a trained rainfall prediction model;
a prediction module: used for inputting the preprocessed rainfall data of the test set into the trained rainfall prediction model to realize rainfall prediction for the target area in the next 1-2 hours.
CN202210351766.9A 2022-04-02 2022-04-02 Short-time rainfall prediction method based on deep learning Pending CN114881286A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210351766.9A CN114881286A (en) 2022-04-02 2022-04-02 Short-time rainfall prediction method based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210351766.9A CN114881286A (en) 2022-04-02 2022-04-02 Short-time rainfall prediction method based on deep learning

Publications (1)

Publication Number Publication Date
CN114881286A true CN114881286A (en) 2022-08-09

Family

ID=82669083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210351766.9A Pending CN114881286A (en) 2022-04-02 2022-04-02 Short-time rainfall prediction method based on deep learning

Country Status (1)

Country Link
CN (1) CN114881286A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116451881B (en) * 2023-06-16 2023-08-22 南京信息工程大学 Short-time precipitation prediction method based on MSF-Net network model
CN116755181A (en) * 2023-08-11 2023-09-15 深圳市昆特科技有限公司 Precipitation prediction method and related device
CN116755181B (en) * 2023-08-11 2023-10-20 深圳市昆特科技有限公司 Precipitation prediction method and related device

Similar Documents

Publication Publication Date Title
Zhang et al. Surface defect detection of steel strips based on classification priority YOLOv3-dense network
CN106845401B (en) Pest image identification method based on multi-space convolution neural network
CN110348624B (en) Sand storm grade prediction method based on Stacking integration strategy
CN112949565A (en) Single-sample partially-shielded face recognition method and system based on attention mechanism
CN114881286A (en) Short-time rainfall prediction method based on deep learning
CN113392931B (en) Hyperspectral open set classification method based on self-supervision learning and multitask learning
CN111814685A (en) Hyperspectral image classification method based on double-branch convolution self-encoder
CN116128141B (en) Storm surge prediction method and device, storage medium and electronic equipment
CN111047078A (en) Traffic characteristic prediction method, system and storage medium
CN104281835A (en) Face recognition method based on local sensitive kernel sparse representation
CN116503399B (en) Insulator pollution flashover detection method based on YOLO-AFPS
CN109146925A (en) Conspicuousness object detection method under a kind of dynamic scene
CN112285376A (en) Wind speed prediction method based on CNN-LSTM
CN112818849A (en) Crowd density detection algorithm based on context attention convolutional neural network of counterstudy
CN115965862A (en) SAR ship target detection method based on mask network fusion image characteristics
CN115792853A (en) Radar echo extrapolation method based on dynamic weight loss
CN111242028A (en) Remote sensing image ground object segmentation method based on U-Net
CN113935413A (en) Distribution network wave recording file waveform identification method based on convolutional neural network
CN113536944A (en) Distribution line inspection data identification and analysis method based on image identification
CN112115984A (en) Tea garden abnormal data correction method and system based on deep learning and storage medium
CN113012107A (en) Power grid defect detection method and system
CN117368862A (en) High-efficiency weather radar data quality evaluation system
CN110826810B (en) Regional rainfall prediction method combining spatial reasoning and machine learning
CN116486393A (en) Scene text detection method based on image segmentation
CN111242839A (en) Image scaling and cutting method based on scale grade

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination