CN115761261A - Short-term rainfall prediction method based on radar echo diagram extrapolation - Google Patents

Short-term rainfall prediction method based on radar echo diagram extrapolation

Info

Publication number
CN115761261A
Authority
CN
China
Prior art keywords
short
radar echo
network
feature
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211495440.XA
Other languages
Chinese (zh)
Inventor
牛丹
施春磊
张天宝
臧增亮
陈夕松
陈训来
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangyin Zhixing Industrial Control Technology Co ltd
Southeast University
Original Assignee
Jiangyin Zhixing Industrial Control Technology Co ltd
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangyin Zhixing Industrial Control Technology Co ltd, Southeast University filed Critical Jiangyin Zhixing Industrial Control Technology Co ltd
Priority to CN202211495440.XA priority Critical patent/CN115761261A/en
Publication of CN115761261A publication Critical patent/CN115761261A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a short-term precipitation prediction method based on radar echo diagram extrapolation. The method first adopts a UNet network model framework and improves its feature coding blocks: a multi-scale feature embedding module is introduced to obtain local and global feature information of short-term precipitation, increasing the multi-scale prediction capability of the model. A multi-level fusion module is introduced to realize the transfer of local, medium, and global features among different levels, making full use of the strengths of CNNs at capturing local information and of Transformers at establishing global dependence, and improving the model's feature expression in the spatial dimension of the radar echo sequence. A cross-time Transformer algorithm is adopted to mine the dependency relationships along the time axis of the radar echo sequence and learn the temporal trend of short-term precipitation, enhancing the model's feature expression in the time dimension. The improved model strengthens feature extraction in both the spatial and temporal dimensions of the radar echo sequence and improves the accuracy of short-term precipitation prediction.

Description

Short-term rainfall prediction method based on radar echo diagram extrapolation
Technical Field
The invention belongs to the field of target detection in image processing, and particularly relates to a short-term rainfall prediction method based on radar echo diagram extrapolation.
Background
China has large geographical differences, fragile ecosystems, and complex climatic conditions, and is seriously affected by extreme weather disasters. Short-term precipitation prediction is itself a very complex and difficult field, because such precipitation is characterized by sudden onset, complex and varied mechanisms, and a short life cycle, so research on short-term precipitation prediction is of great significance. However, current accuracy and forecast timeliness are still insufficient to meet operational requirements, so further improving the accuracy of short-term forecasts is very important for people's production and daily life.
Numerical prediction is the traditional forecasting method in China. Because its temporal and spatial resolution is not fine enough, it can only provide simple qualitative descriptions and cannot meet the needs of today's rapidly developing economy and society; more refined short-term forecasting, especially fine-grained forecasting of strong convective weather, has therefore become a main direction of current meteorological service research and development.
With the development of meteorological monitoring equipment and artificial intelligence in recent years, short-term precipitation prediction methods driven by radar echo data and built on machine learning and deep learning have quickly become a research hotspot. Such methods do not model the complex atmospheric dynamics; they simply predict the future radar echo sequence from the past radar echo sequence. They divide short-term rainfall forecasting into two steps: first predict the future radar echo sequence, then estimate precipitation from the obtained echo distribution.
Methods based on deep RNNs are widely used for precipitation prediction via radar echo extrapolation. For example, the ConvLSTM model combines a CNN with the LSTM architecture; the predictive recurrent neural network (PredRNN) model and its improved version (PredRNN++) have also been proposed to realize short-term precipitation prediction. However, these RNN-based methods suffer from vanishing gradients, which limits the accuracy of short-term precipitation prediction.
In addition to RNN-based methods, CNN-based methods have also been proposed for short-term precipitation prediction, for example the UNet-based FCN (Fully Convolutional Network) model and the SmaAt-UNet model. However, CNN-based methods are effective mainly at extracting local features and are insufficient at expressing global features, which again limits the accuracy of short-term precipitation prediction.
Existing methods can predict short-term precipitation by combining the strength of CNNs at local features with the self-attention mechanism of the Transformer for mining global features. The AA-TransUnet network, which combines U-Net and a Transformer, has been proposed for short-term precipitation prediction, reaching a CSI of 0.6651 (r ≥ 0.5). A Rainformer model combining Swin-Transformer, UNet, and gate fusion units has also been proposed, predicting short-term precipitation with a CSI of 0.6678 (r ≥ 0.5). (The CSI index accounts for both misses and false alarms; its value lies between 0 and 1, where 0 represents complete failure and 1 represents complete correctness.) However, these models do not effectively extract the time-series information of the radar echo, so their short-term precipitation prediction accuracy remains limited. Training a short-term precipitation predictor directly on UNet also gives low accuracy; the main problem is insufficient feature expression of the radar echo sequence in the spatial and temporal dimensions. Therefore, the original UNet network needs further improvement to meet practical applications.
Disclosure of Invention
The purpose of the invention is as follows: the invention aims to provide a short-term rainfall prediction method based on radar echo diagram extrapolation. The method first adopts a UNet network model framework and improves its feature coding blocks: a multi-scale feature embedding module is introduced to obtain local and global feature information of short-term precipitation, increasing the multi-scale prediction capability of the model; a multi-level fusion module is introduced to realize the transfer of local, medium, and global features among different levels, fully combining the strengths of CNNs at local information and of Transformers at global dependence, and improving the model's feature expression in the spatial dimension of the radar echo sequence; and a cross-time Transformer algorithm is adopted to mine the dependency relationships along the time axis of the radar echo sequence and learn the temporal trend of short-term precipitation, enhancing the model's feature expression in the time dimension. The improved model strengthens feature extraction in both the spatial and temporal dimensions of the radar echo sequence and improves the accuracy of short-term precipitation prediction.
The technical scheme is as follows: in order to achieve the purpose, the invention adopts the following technical scheme:
the invention relates to a short-term rainfall prediction method based on radar echo diagram extrapolation, which comprises the following steps:
s1: collecting a radar precipitation map, and performing data preprocessing on an original precipitation map;
S2: use a multi-scale feature embedding module in the UNet network framework to construct the front end of the feature extraction network; lead out the output features of its three branches and record them as features X_l, X_m, X_g;
S3: fuse features X_l, X_m, X_g using a multi-level fusion algorithm to construct the middle end of the extraction network;
s4: increasing a Transformer algorithm of cross-time to construct an extraction network rear end;
s5: and (4) performing transfer learning on the improved network model to obtain a short-imminent precipitation prediction model based on radar echo diagram extrapolation.
As a further technical solution of the present invention, in step S1 the data preprocessing includes center cropping of the image:
S11: collect the original radar precipitation map data and crop the center of each original precipitation map to remove the blank background area around the circular scan edge, obtaining training samples.
As a further technical solution of the present invention, in step S2, the method for constructing the front end of the feature extraction network with the multi-scale feature embedding module is:
S21: separate out the first pure-convolution structure of the encoder feature extraction network of the original UNet;
S22: replace the single-branch pure convolution module separated in S21 with a multi-scale feature embedding module with three branches (two Transformer block branches and one CNN block branch), and connect the output of the previous layer to the input of the multi-scale feature embedding module; the module computes several different transformations of the same mapping in parallel, i.e., it performs multi-scale learning on the fixed-size image and extracts the X_l, X_m, X_g features;
S23: output the three branches of the multi-scale feature module in S22 to the next layer for processing, obtaining features of the same map at different scales;
S24: continue processing the pure-convolution feature extraction structures of the original UNet network according to S21 to S22 until all pure-convolution feature extraction modules in the original UNet feature extraction network have been replaced by multi-scale feature embedding modules.
As a further technical solution of the present invention, in step S3, the method for fusing features X_l, X_m, X_g with the multi-level fusion algorithm to construct the middle end of the extraction network is:
S31: lead out the output features of the convolution branch of the new multi-scale feature embedding module in S2 and record them as features X_l; record the output features of the branch with two 3x3 convolutions followed by a Transformer block as features X_m; record the output features of the branch with three 3x3 convolutions followed by a Transformer block as features X_g;
S32: splice feature X_l into feature X_m;
S33: splice feature X_m into feature X_g;
S34: map and transform the outputs obtained in S32 and S33 through a multilayer perceptron to realize feature fusion.
As a further technical solution of the present invention, in step S4, a cross-time Transformer algorithm is added, and the method for constructing the back end of the extraction network is:
S41: connect the output ends of the three branches of S3 to the input end of a cross-time Transformer block;
S42: lead out the output features of the three branches of S31 after the cross-time Transformer block and record them as X'_l, X'_m, X'_g respectively; [the formulas of the cross-time Transformer block, including its adaptive mask parameter, appear only as image equations in the original publication];
S43: splice the outputs of the three branches of S41 and send them to the next-layer model structure through skip connections and convolution operations.
As a further technical solution of the present invention, in step S5, the method for performing transfer learning on the improved network model to obtain the short-term precipitation prediction model based on radar echo diagram extrapolation is:
S51: download pre-trained default parameters from the official UNet repository, perform a weight fine-tuning operation, and load the resulting parameters into the improved UNet network; the optimized objective function appears only as an image equation in the original publication, and the extrapolation mapping is
X_{t+1}, ..., X_{t+k} = f(X_{t-j+1}, ..., X_t)
where X_t is the radar echo sequence at time t;
S52: perform transfer learning on the model with the constructed data set and train until convergence.
Beneficial effects: the invention discloses a short-term rainfall prediction method based on radar echo diagram extrapolation. The method first adopts a UNet network model framework; by introducing a multi-scale feature embedding module and a multi-level fusion module, it fuses multi-scale features of CNN blocks and Transformer blocks at different levels and improves the model's feature expression in the spatial dimension of the radar echo sequence. A cross-time Transformer algorithm is adopted to mine the dependency relationships along the time axis of the radar echo sequence, enhancing the model's feature expression in the time dimension. The method strengthens the feature expression of the radar echo sequence in space and time and significantly improves the accuracy of short-term precipitation prediction.
Drawings
FIG. 1 is a flow chart of the short-term precipitation prediction method based on radar echo diagram extrapolation according to the present invention;
FIG. 2 is a schematic diagram of a center-cropped training sample of the present invention;
FIG. 3 is a schematic diagram of the multi-scale feature embedding module of the present invention;
FIG. 4 is a schematic diagram of the multi-level fusion module of the present invention;
FIG. 5 is a schematic diagram of the cross-time Transformer module introduced into the UNet coding-layer network structure of the present invention;
FIG. 6 is an overall network framework diagram of the improved UNet coding blocks of the present invention;
FIG. 7 is a diagram illustrating the actual prediction effect of the improved UNet encoder-layer network of the present invention.
Detailed Description
The technical solution of the present invention will be further described with reference to the following detailed description and accompanying drawings.
Example 1: the specific embodiment discloses a short-term rainfall prediction method based on radar echo diagram extrapolation, as shown in fig. 1 to 7, comprising the following steps:
S1: as shown in fig. 1, before samples are input into the network for training, preprocessing is required: collect radar precipitation maps and center-crop the original precipitation maps to obtain training samples (as shown in fig. 2);
S2: replace the pure convolution modules of the original UNet feature extraction network with a multi-scale feature embedding module containing Transformer blocks and a CNN block, obtaining the improved front end of the feature extraction network (as shown in fig. 3); lead out the output features of the three branches and record them as features X_l, X_m, X_g;
S3: fuse features X_l, X_m, X_g using a multi-level fusion algorithm to construct the middle end of the extraction network (as shown in fig. 4), realizing information transfer among different levels and strengthening the original model's feature expression in the spatial dimension of the radar echo sequence;
S4: add a cross-time Transformer algorithm to mine the dependency relationships along the time axis of the radar echo sequence, obtaining the improved back end of the feature extraction network (as shown in fig. 5) and strengthening the original model's feature expression in the time dimension;
S5: download pre-trained default parameters from the official UNet repository, perform a weight fine-tuning operation, load the parameters into the improved UNet model, and perform transfer learning on the improved UNet network (as shown in fig. 6); the prediction loss of the network on the test-set pictures gradually decreases and finally converges, yielding the short-term precipitation prediction model based on radar echo diagram extrapolation, whose actual prediction effect is shown schematically in fig. 7.
In step S1, the data are preprocessed by center cropping; the data preprocessing method of the present invention includes:
S11: collect radar precipitation maps, of which 4000 sequences are selected as the training set and 1734 sequences as the validation set;
S12: for each original radar precipitation picture, apply a center-cropping algorithm to remove the blank background area around the circular scan edge, obtaining training and test samples with a resolution of 288 x 288 for training the UNet network; each training step takes 9 consecutive frames as the input of the model and the following 9 frames as the ground truth (GT), where X_{t-j+1}, ..., X_t is the radar echo sequence over the j consecutive time stamps input to the network at once.
In step S2, the method for replacing the pure convolution modules of the original UNet feature extraction network with the multi-scale feature embedding module containing Transformer and CNN blocks is:
S21: separate out the first pure-convolution structure of the encoder feature extraction network of the original UNet;
S22: replace the single-branch pure convolution module separated in S21 with a multi-scale feature embedding module with three branches (two Transformer block branches and one CNN block branch), and connect the output of the previous layer to its input; the module computes several different transformations of the same mapping in parallel, i.e., parallel convolution blocks and Transformer blocks. The convolution branch (the local feature block) consists of a 1x1 convolution, a 3x3 convolution, and a 1x1 convolution; the medium feature branch of the Transformer consists of two 3x3 convolution layers followed by layer normalization (LN), multi-head self-attention (MHSA), and a feed-forward network (FFN); the global feature branch of the Transformer consists of three 3x3 convolution layers followed by LN, MHSA, and FFN. This module performs multi-scale learning on the fixed-size image and extracts the X_l, X_m, X_g features;
S23: output the three branches of the multi-scale feature module in S22 to the next layer for processing, obtaining features of the same map at different scales;
S24: continue processing the pure-convolution feature extraction structures of the original UNet network according to S21 to S22 until all pure-convolution feature extraction modules in the original UNet feature extraction network have been replaced by multi-scale feature embedding modules.
In step S3, the method for fusing features X_l, X_m, X_g with the multi-level fusion algorithm to construct the middle end of the extraction network is:
S31: lead out the output features of the convolution branch of the new multi-scale feature embedding module in S2 and record them as features X_l; record the output features of the medium feature branch, whose two 3x3 convolutions are followed by a Transformer block, as features X_m; record the output features of the global feature branch, whose three 3x3 convolutions are followed by a Transformer block, as features X_g;
S32: splice feature X_l into feature X_m;
S33: splice feature X_m into feature X_g;
S34: map and transform the outputs obtained in S32 and S33 through a multilayer perceptron to realize feature fusion; the resulting feature output captures richer features of the spatial dimension of the radar echo sequence.
In step S4, a cross-time Transformer algorithm is added, and the method for constructing the back end of the extraction network is:
S41: connect the output ends of the three branches of S3 to the input end of a cross-time Transformer block;
S42: lead out the output features of the three branches of S31 after the cross-time Transformer block and record them as X'_l, X'_m, X'_g respectively; [the formulas of the cross-time Transformer block, including its adaptive masking strategy, appear only as image equations in the original publication];
S43: splice the outputs of the three branches of S41 and send them to the next-layer model structure through skip connections and convolution operations.
In step S5, the method for training the improved UNet model through transfer learning to obtain the prediction model is:
S51: download pre-trained default parameters from the official UNet repository, perform a fine-tuning operation on the weights, and load the resulting parameters into the improved UNet network; the optimized objective function appears only as an image equation in the original publication, and the extrapolation mapping is
X_{t+1}, ..., X_{t+k} = f(X_{t-j+1}, ..., X_t)
where X_t is the radar echo sequence at time t;
S52: perform transfer learning on the model with the constructed data set and train until convergence. The server is configured with an NVIDIA RTX 3090 GPU to train and test our model; the initial learning rate is set to 0.0001, an Adam optimizer is used for stochastic gradient descent, 18 radar echo sequences form each mini-batch, and the Balanced Mean Absolute Error (B-MAE) is used as the validation loss function. When the validation loss no longer decreases during training, the model with the minimum validation loss is selected as the best trained prediction model.
The invention is compared with SOTA methods, with the best performance highlighted in bold for ease of comparison, for thresholds r ≥ 10 and r ≥ 30 (table values scaled by 10 and 100, respectively), as shown in Tables 1, 2, and 3. Compared with the other SOTA methods, our model achieves the best performance on four evaluation metrics (CSI, HSS, MSE, MAE).
Table 1 CSI comparison results of the present invention with existing algorithms
[Table 1 appears only as an image in the original publication.]
Table 2 HSS comparison of the present invention to existing algorithms
[Table 2 appears only as an image in the original publication.]
TABLE 3 MSE & MAE comparison of the present invention to the existing algorithms
Method | B-MSE↓ | B-MAE↓
ConvLSTM | 18.4430 | 1.2656
PredRNN++ | 17.7672 | 1.2332
SmaAt-Unet | 17.5860 | 1.2162
AA-TransUnet | 17.0093 | 1.2032
Rainformer | 16.6708 | 1.1836
The invention | 15.9018 | 1.1709
In conclusion, compared with current advanced methods, the improved UNet model of the present invention performs best on all four prediction accuracy indicators and, in particular, improves significantly at the 10 mm/h and 30 mm/h thresholds (heavy rainfall). Compared with the strong PredRNN++ method, the invention shows a remarkable improvement in CSI and HSS for rainstorm prediction (30 mm/h). Even against the newly proposed Rainformer method, CSI and HSS improve by more than 4.2% and 3.1% under the 30 mm/h threshold (HSS, the Heidke Skill Score, compares the actual forecast with a random forecast to measure the quality of the forecasting method; higher is better). The accuracy of short-term precipitation prediction is thus improved, and the model can accurately predict the intensity and extent of short-term precipitation in complicated and changeable weather, meeting the accuracy requirements of short-term precipitation forecasting.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto; any modification or substitution that a person skilled in the art can readily conceive falls within the scope of the present invention, and therefore the scope of protection of the present invention shall be subject to the scope of the claims.

Claims (6)

1. A short-term rainfall prediction method based on radar echo map extrapolation is characterized by comprising the following steps: the method comprises the following steps:
s1: collecting a radar precipitation map, and performing data preprocessing on an original precipitation map;
S2: use a multi-scale feature embedding module in the UNet network framework to construct the front end of the feature extraction network; lead out the output features of its three branches and record them as features X_l, X_m, X_g;
S3: fuse features X_l, X_m, X_g using a multi-level fusion algorithm to construct the middle end of the extraction network;
s4: increasing a Transformer algorithm of cross-time to construct an extraction network rear end;
s5: and (4) performing transfer learning on the improved network model to obtain a short-imminent precipitation prediction model based on radar echo diagram extrapolation.
2. The method for predicting the short-term rainfall based on the radar echo map extrapolation of claim 1, wherein in the step S1, the data is preprocessed by:
s11: and collecting a radar precipitation image, and cutting the center of the original precipitation image to remove a blank background area with a circular edge to obtain a training sample.
3. The method for predicting the short-rainfall based on the radar echo diagram extrapolation of claim 1, wherein in the step S2, the method for constructing the feature extraction network front end by using the multi-scale feature embedding module comprises:
s21: separating the encoder characteristic of the original UNet network to extract a first pure convolution network structure of the network;
s22: single-branch pure convolution module for separating S21Replacing the multi-scale characteristic embedding module with three branches (two transform block branches and one CNN block branch), connecting the output end of the upper layer to the input end of the multi-scale characteristic embedding module, calculating a plurality of different transformations on the same mapping in parallel by the multi-scale characteristic embedding module, namely performing parallel convolution block operation and transform block operation, performing multi-scale learning on the image with fixed size by using the multi-scale characteristic embedding module and extracting X l ,X m ,X g Characteristic;
s23: outputting the three branches of the multi-scale feature module in the S22 to the next layer for processing to obtain features of the same graph under different scales;
s24: and (3) continuously processing the pure convolution feature extraction network structure in the original network according to S21 to S22 until all the pure convolution feature extraction modules in the original UNet feature extraction network are replaced by multi-scale feature embedding modules, and finishing the step.
4. The method for predicting short-term rainfall based on radar echo map extrapolation of claim 1, wherein in step S3 the method for fusing features X_l, X_m, X_g with the multi-level fusion algorithm to construct the middle end of the extraction network is:
S31: lead out the output features of the convolution branch of the new multi-scale feature embedding module in S2 and record them as features X_l; record the output features of the branch with two 3x3 convolutions followed by a Transformer block as features X_m; record the output features of the branch with three 3x3 convolutions followed by a Transformer block as features X_g;
S32: splice feature X_l into feature X_m;
S33: splice feature X_m into feature X_g;
S34: map and transform the outputs obtained in S32 and S33 through a multilayer perceptron to realize feature fusion.
5. The method for predicting short-term rainfall based on radar echo diagram extrapolation according to claim 1, wherein in step S4 a cross-time Transformer algorithm is added, and the back end of the extraction network is constructed as follows:
S41: connecting the output of each of the three branches of S3 to the input of a cross-time Transformer block;
S42: the output features of the three branches of S31 after the cross-time Transformer block are drawn out and recorded as X'_l, X'_m, X'_g, respectively; the cross-time Transformer block is computed as follows:
[The three formulas of the cross-time Transformer block appear only as images FDA0003965609810000021 to FDA0003965609810000023 in the publication, wherein the quantity shown in image FDA0003965609810000024 is an adaptive mask parameter.]
S43: splicing the outputs of the three branches of S41 and feeding the result to the next layer of the model through a skip connection and a convolution operation.
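Since the cross-time formulas are published only as images, the following NumPy sketch shows one plausible reading: attention over the time axis, with an additive matrix M standing in for the claim's adaptive mask parameter. The projections, sizes, and the exact role of M are assumptions, not the published formulas:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_time_attention(seq, Wq, Wk, Wv, M):
    """Attention over the time axis of a (T, d) feature sequence; M is an
    additive (T, T) mask standing in for the adaptive mask parameter."""
    Q, K, V = seq @ Wq, seq @ Wk, seq @ Wv
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d) + M) @ V

rng = np.random.default_rng(2)
T, d = 5, 8                               # 5 radar frames, d-channel tokens
seq = rng.standard_normal((T, d))         # one branch's per-frame features
Wq = 0.1 * rng.standard_normal((d, d))
Wk = 0.1 * rng.standard_normal((d, d))
Wv = 0.1 * rng.standard_normal((d, d))
M = 0.1 * rng.standard_normal((T, T))     # placeholder adaptive mask
out = cross_time_attention(seq, Wq, Wk, Wv, M)
print(out.shape)                          # the time dimension is preserved
```

Because the output keeps the time dimension, the three masked branches can be spliced and passed onward exactly as S43 describes.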
6. The method for predicting short-term rainfall based on radar echo diagram extrapolation according to claim 1, wherein in step S5 the improved UNet model is trained by transfer learning as follows:
S51: downloading the pre-trained default parameters released for UNet, performing a weight-trimming operation, and loading the resulting parameters into the improved UNet network; the optimization objective is:
[The objective function appears only as image FDA0003965609810000025 in the publication.]
X_{t+1}, ..., X_{t+k} = f(X_{t-j+1}, ..., X_t)
wherein X_t is the radar echo sequence at time t;
S52: performing transfer learning on the model with the prepared data set, and training until convergence.
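The extrapolation setting X_{t+1}, ..., X_{t+k} = f(X_{t-j+1}, ..., X_t) and the fine-tuning of S51/S52 can be illustrated with a deliberately tiny stand-in for f: a linear map from the last j flattened frames to the next k, fine-tuned by SGD on a mean-squared-error loss. The history length, horizon, frame size, and linear f are all assumptions, not the patented UNet:

```python
import numpy as np

rng = np.random.default_rng(3)
j, k, px = 4, 2, 16                        # history length, horizon, pixels/frame
frames = rng.standard_normal((50, px))     # toy radar echo sequence

# Stand-in for the pre-trained parameters loaded in S51 (here: small random init)
W = 0.01 * rng.standard_normal((j * px, k * px))

# S52-style fine-tuning: SGD on 0.5 * ||f(history) - future||^2
lr = 1e-3
for step in range(200):
    t = int(rng.integers(j, 50 - k))
    x = frames[t - j:t].reshape(-1)        # last j frames (the model input)
    y = frames[t:t + k].reshape(-1)        # next k frames (the target)
    pred = x @ W
    W -= lr * np.outer(x, pred - y)        # gradient step on the MSE loss

pred = (frames[46 - j:46].reshape(-1) @ W).reshape(k, px)
print(pred.shape)                          # k extrapolated frames
```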
CN202211495440.XA 2022-11-27 2022-11-27 Short-term rainfall prediction method based on radar echo diagram extrapolation Pending CN115761261A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211495440.XA CN115761261A (en) 2022-11-27 2022-11-27 Short-term rainfall prediction method based on radar echo diagram extrapolation


Publications (1)

Publication Number Publication Date
CN115761261A true CN115761261A (en) 2023-03-07

Family

ID=85338603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211495440.XA Pending CN115761261A (en) 2022-11-27 2022-11-27 Short-term rainfall prediction method based on radar echo diagram extrapolation

Country Status (1)

Country Link
CN (1) CN115761261A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116307267A (en) * 2023-05-15 2023-06-23 Chengdu University of Information Technology Rainfall prediction method based on convolution
CN116307267B (en) * 2023-05-15 2023-07-25 Chengdu University of Information Technology Rainfall prediction method based on convolution
CN116629142A (en) * 2023-07-24 2023-08-22 Hefei University of Technology Lightning positioning track prediction method, system and storage medium based on transformer mechanism
CN116629142B (en) * 2023-07-24 2023-09-29 Hefei University of Technology Lightning positioning track prediction method and system based on transformer mechanism
CN117808650A (en) * 2024-02-29 2024-04-02 Nanjing University of Information Science and Technology Precipitation prediction method based on Transform-Flownet and R-FPN
CN117808650B (en) * 2024-02-29 2024-05-14 Nanjing University of Information Science and Technology Precipitation prediction method based on Transform-Flownet and R-FPN
CN117907965A (en) * 2024-03-19 2024-04-19 Jiangsu Provincial Meteorological Observatory Three-dimensional radar echo nowcasting method for the fine structure of convective storms
CN117907965B (en) * 2024-03-19 2024-05-24 Jiangsu Provincial Meteorological Observatory Three-dimensional radar echo nowcasting method for the fine structure of convective storms

Similar Documents

Publication Publication Date Title
CN115761261A (en) Short-term rainfall prediction method based on radar echo diagram extrapolation
CN113469094B (en) Surface coverage classification method based on multi-mode remote sensing data depth fusion
CN112380921A (en) Road detection method based on Internet of vehicles
CN113807432B (en) Air temperature forecast data correction method based on deep learning
CN114943963B (en) Remote sensing image cloud and cloud shadow segmentation method based on double-branch fusion network
CN108111860B (en) Video sequence lost frame prediction recovery method based on depth residual error network
CN109635763B (en) Crowd density estimation method
CN113392960A (en) Target detection network and method based on mixed hole convolution pyramid
CN112464718B (en) Target detection method based on YOLO-Terse network and storage medium
CN111428862B (en) Polar unbalanced space-time combined convection primary short-term prediction method
CN116310350B (en) Urban scene semantic segmentation method based on graph convolution and semi-supervised learning network
CN109787821B (en) Intelligent prediction method for large-scale mobile client traffic consumption
CN111178585A (en) Fault reporting amount prediction method based on multi-algorithm model fusion
CN112989942A (en) Target instance segmentation method based on traffic monitoring video
Zhang et al. Unified density-aware image dehazing and object detection in real-world hazy scenes
CN110222592A (en) A kind of construction method of the timing behavioral value network model generated based on complementary timing behavior motion
CN117593666B (en) Geomagnetic station data prediction method and system for aurora image
CN114881286A (en) Short-time rainfall prediction method based on deep learning
CN116596920B (en) Real-time zero measurement method and system for long-string porcelain insulator unmanned aerial vehicle
CN117610734A (en) Deep learning-based user behavior prediction method, system and electronic equipment
CN117058882A (en) Traffic data compensation method based on multi-feature double-discriminant
CN110826810B (en) Regional rainfall prediction method combining spatial reasoning and machine learning
CN113222209A (en) Regional tail gas migration prediction method and system based on domain adaptation and storage medium
CN112308066A (en) License plate recognition system
CN110599460A (en) Underground pipe network detection and evaluation cloud system based on hybrid convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination