CN112180375A - Meteorological radar echo extrapolation method based on an improved TrajGRU network

Info

Publication number: CN112180375A (other version: CN112180375B, granted)
Application number: CN202010961607.1A
Authority: CN (China)
Original language: Chinese (zh)
Priority/filing date: 2020-09-14
Publication date: 2021-01-05 (grant publication 2022-12-20)
Inventors: 甘建红, 尹麒名, 任宇, 李炜, 刘豪扬, 张艺蓝, 舒红平, 何童丽
Applicant and current assignee: Chengdu University of Information Technology
Legal status: Active (granted)

Classifications

    • G01S13/95 Radar or analogous systems specially adapted for meteorological use (G01S13/958 Theoretical aspects)
    • G01S7/41 Using analysis of echo signal for target characterisation (G01S7/418 Theoretical aspects)
    • G06N3/045 Neural networks; combinations of networks
    • G06N3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G06N3/08 Neural networks; learning methods
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention discloses a meteorological radar echo extrapolation method based on an improved TrajGRU network, comprising: S1, reading multi-layer radar image sequence data from bin files and converting the read images into a grayscale image sequence; S2, inputting the processed grayscale image sequence into a TrajGRU deep learning network whose loss-function weights are computed from the pixel-level prediction accuracy, and training to obtain a prediction model; and S3, inputting the observed radar image sequence used for prediction into the prediction model to obtain an extrapolated radar image sequence. The invention preprocesses radar image sequence data, reads it into a grayscale image sequence, and inputs it into a TrajGRU network model that accepts a multi-layer radar image sequence and assigns different loss-function weights according to pixel prediction accuracy, yielding a prediction model. The improved TrajGRU network model captures the spatio-temporal correlation of radar images more accurately; the extrapolation results preserve image details better, and the radar echo distribution is closer to a real radar scan image.

Description

Meteorological radar echo extrapolation method based on improved TrajGRU network
Technical Field
The invention belongs to the technical field of radar echo extrapolation, and particularly relates to a meteorological radar echo extrapolation method based on an improved TrajGRU network.
Background
Radar echo extrapolation refers to determining the intensity distribution of echoes and the moving speed and direction of echo bodies from the echo data observed by a weather radar, and forecasting the radar echo state after a certain period by performing linear or nonlinear extrapolation on the echoes.
Existing conventional radar echo extrapolation techniques mainly comprise the cell centroid method and the cross-correlation method. The cell centroid method simplifies the target and is suitable for tracking and forecasting larger targets, but when radar echoes are scattered or merge and split, the accuracy of the extrapolated forecast drops sharply. The cross-correlation method establishes a fitting relation by choosing the spatial displacement that optimizes the correlation coefficient between two consecutive times, but for strongly convective weather processes with rapidly changing echoes the tracking accuracy is hard to guarantee and the extrapolation quality also degrades markedly. The optical flow method performs well in motion vector estimation, and algorithms using optical flow to extrapolate consecutive radar images have been applied to scenarios such as heavy rainfall warning. However, the optical flow method suffers from drawbacks such as error accumulation and loss of radar echo image content.
Recently, deep learning algorithms have been applied to radar echo extrapolation. A convolutional LSTM (ConvLSTM) network was proposed for precipitation nowcasting, formulating nowcasting as a spatio-temporal sequence prediction problem; an end-to-end trainable model was built by stacking multiple ConvLSTM layers into an encoding-forecasting structure. However, the recurrent connections in ConvLSTM are spatially invariant, whereas natural motion and deformation are generally location-variant. Although the convolutional recurrent structure used in ConvLSTM captures spatio-temporal correlations better than a fully-connected recurrent structure, it is not optimal. Going beyond ConvLSTM, the trajectory GRU (TrajGRU) model was therefore proposed, which can actively learn the location-variant structure of the recurrent connections and is more flexible than ConvLSTM. The drawback of the TrajGRU algorithm is that the predicted images gradually blur as the extrapolation time increases, and the extrapolation trend is not reflected well.
Disclosure of Invention
In view of the above shortcomings of the prior art, the invention aims to provide a meteorological radar echo extrapolation method based on an improved TrajGRU network, so as to solve the problems that the accuracy of extrapolation tracking is difficult to guarantee and the extrapolation quality degrades markedly.
In order to achieve the above purpose, the invention adopts the following technical solution:
A meteorological radar echo extrapolation method based on an improved TrajGRU network comprises the following steps:
S1, reading radar image sequence data bin files and processing them into a grayscale image sequence;
S2, inputting the processed grayscale image sequence into the constructed TrajGRU deep learning network and training to obtain a prediction model;
S3, inputting the observed radar image sequence used for prediction into the prediction model to obtain an extrapolated image sequence.
Preferably, step S1 of reading the radar image sequence data bin file and processing it into a grayscale image sequence comprises the following specific steps:
S1.1, acquiring a radar bin file and reading it as a color picture;
S1.2, processing the color picture into a grayscale image sequence.
Preferably, the specific steps of step S2 comprise:
S2.1, calculating the weight of the loss function of the TrajGRU deep learning network according to the pixel prediction accuracy;
S2.2, adding at least one further layer of radar image sequence input to the TrajGRU network, wherein the loss-function weight of each layer of radar data is calculated from the magnitude of that layer's loss value, so that different loss values receive different weights;
S2.3, inputting the obtained grayscale image sequence into the improved TrajGRU deep learning network and training to obtain a prediction model.
Preferably, step S2.1 of calculating the weight of the loss function of the TrajGRU deep learning network according to the pixel prediction accuracy comprises:
calculating the loss value point by point, and applying a weight w to the loss of the pixels at false-alarm and miss positions:

$$\mathrm{wMSE} = \frac{1}{n}\sum_{i=1}^{n} w_i \left(y_i - y_i^{p}\right)^{2}, \qquad w_i = \frac{y_i - y_i^{p} + 255}{510}$$

where wMSE is the loss function value, y_i is the pixel value of the radar live-scan echo image, and y_i^p is the pixel value of the model extrapolation result image; 255 is added to ensure the weight is positive, and the division by 510 normalizes the weight.
Preferably, step S2.2 of adding at least one further layer of radar image sequence input to the TrajGRU network, wherein the loss-function weight of each layer of radar data is calculated from the magnitude of that layer's loss value so that different loss values receive different weights, comprises:

$$w_{l1} = \frac{error_1}{error_1 + error_2}, \qquad w_{l2} = \frac{error_2}{error_1 + error_2}$$

$$error = w_{l1} \cdot error_1 + w_{l2} \cdot error_2$$

where error_1 is the loss value obtained from the first-layer input, error_2 is the loss value obtained from the second-layer input, error is the total loss value, and w_l1 and w_l2 are the computed weight values for the first-layer and second-layer inputs.
Preferably, in step S2.2, the method for adding a layer of radar image sequence input to the TrajGRU network comprises:
in the TrajGRU network model, radar images at 7 times are input at once and radar images at 7 future times are predicted; the original input data is a single layer of radar images, and after a further layer is added, the data of the two layers of radar images correspond to each other time by time.
Preferably, step S2.3 of inputting the obtained grayscale image sequence into the improved TrajGRU deep learning network and training to obtain a prediction model comprises:
constructing a two-channel TrajGRU network, inputting the two layers of radar echo image sequences decomposed from the base data into the TrajGRU network for training, and obtaining the prediction model after training.
The meteorological radar echo extrapolation method based on an improved TrajGRU network provided by the invention has the following beneficial effects:
the invention preprocesses radar image sequence data, reads it into a grayscale image sequence, inputs it into the algorithmically modified TrajGRU network model, and obtains a prediction model. The improved TrajGRU network model captures the spatio-temporal correlation of radar images more accurately, further improving image quality; the extrapolated result images preserve image details better, and the radar echo distribution is closer to a real radar scan image.
Drawings
FIG. 1 is a schematic diagram of false alarms and misses, wherein (1) is a real radar image, (2) is an extrapolation prediction image, and (3) is the overlay of the two, in which part a is the hit area, part b is the miss area, and part c is the false-alarm area.
FIG. 2 is a comparison of the ConvLSTM and TrajGRU models.
FIG. 3 compares the TrajGRU model before and after the loss-function modification.
FIG. 4 compares results obtained by increasing only the number of input echo data layers for the TrajGRU model, without improving the loss function.
FIG. 5 shows an application to prediction on live radar image sequence data.
FIG. 6 shows the pixel differences of the TrajGRU model before and after the loss-function improvement.
FIG. 7 shows the pixel differences of the ConvLSTM model before and after the loss-function improvement.
FIG. 8 is a comparison of pixel differences for the TrajGRU and ConvLSTM models.
FIG. 9 compares the pixel differences of the TrajGRU model with one-layer and two-layer input data.
FIG. 10 shows the total loss function value of the TrajGRU network.
FIG. 11 is a flow chart of the invention.
Detailed Description
The following description of the embodiments of the invention is provided to help those skilled in the art understand the invention, but it should be clear that the invention is not limited to the scope of these embodiments. To those of ordinary skill in the art, all variations that embody the inventive concept and fall within the spirit and scope of the invention as defined by the appended claims are protected.
According to an embodiment of the present application, referring to fig. 11, the method for meteorological radar echo extrapolation based on the improved TrajGRU network of the present solution includes:
S1, reading radar image sequence data bin files and processing them into a grayscale image sequence;
S2, inputting the processed grayscale image sequence into the constructed TrajGRU deep learning network and training to obtain a prediction model;
S3, inputting the observed radar image sequence used for prediction into the prediction model to obtain an extrapolated image sequence.
The above steps will be described in detail below according to one embodiment of the present application.
Step S1, preprocessing data;
S1.1, acquiring a radar bin file and reading it as a color picture;
S1.2, processing the color picture into a grayscale image sequence.
The experimental data used are the base reflectivity factors in Doppler radar volume-scan data, and this base data is preprocessed into grayscale image sequences. One base reflectivity file contains radar data at several elevation angles, and the data at the individual elevation angles are decomposed.
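As a minimal sketch of this preprocessing step (the bin layout assumed below, raw uint8 reflectivity planes stored one elevation after another, is hypothetical; the real volume-scan format, array shape, and reflectivity-to-grayscale mapping must match the radar product in use):

```python
import numpy as np

def read_volume_bin(path, n_elev=2, height=501, width=501):
    """Read one volume-scan bin file into per-elevation grayscale planes.

    Assumed layout: raw uint8 reflectivity values stored as n_elev
    consecutive (height, width) planes; adjust to the real product.
    """
    raw = np.fromfile(path, dtype=np.uint8)
    return raw.reshape(n_elev, height, width)  # grayscale values in [0, 255]

def load_sequence(paths, n_elev=2):
    """Stack consecutive volume scans into a (T, n_elev, H, W) sequence."""
    return np.stack([read_volume_bin(p, n_elev) for p in paths], axis=0)
```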
Step S2, constructing a prediction model;
S2.1, calculating the weight of the loss function of the TrajGRU deep learning network according to the pixel prediction accuracy. Specifically, the loss function adopted by TrajGRU is the MSE (mean squared error) loss; the mean squared error is the expected value of the squared difference between the estimate and the true value:

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - y_i^{p}\right)^{2}$$

where y_i and y_i^p have the same dimensions; y_i is the true measured value and y_i^p is the predicted value, and both may be vectors or matrices.
In order to pay more attention to false alarms and misses, different weights are given to the false-alarm, miss and hit areas when constructing the loss function. As shown in Fig. 1, each ellipse represents an echo region: panel (1) is the echo region collected by the radar, i.e. the real observed image, panel (2) is the model extrapolation result, and overlaying the two gives panel (3). Because the model extrapolation result differs somewhat from the truth, the extrapolated and real echoes have overlapping and non-overlapping parts.
The fully overlapping area a is called the hit area, area b is called the miss area (echo that was not predicted), and area c is called the false-alarm area (echo that was wrongly predicted). Therefore, the point-by-point loss is not simply the difference of pixel values; a weight w is applied when computing the loss of the pixels at false-alarm and miss positions:
$$\mathrm{wMSE} = \frac{1}{n}\sum_{i=1}^{n} w_i \left(y_i - y_i^{p}\right)^{2}, \qquad w_i = \frac{y_i - y_i^{p} + 255}{510}$$

where wMSE is the loss value after the algorithmic improvement, y_i is the pixel value of the radar live-scan echo image, and y_i^p is the pixel value of the model extrapolation result image; 255 is added to ensure the weight is positive, and the division by 510 normalizes the weight. A loss function constructed in this way reflects the influence of false alarms, misses and similar situations on the extrapolation process. During training, the neural network model continuously corrects itself toward the real radar echo data, thereby improving the prediction result.
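A minimal PyTorch sketch of this weighted loss follows, assuming grayscale pixel values in [0, 255] stored as float tensors; applying the weight at every pixel (rather than only at mismatched pixels) and detaching it from the gradient are implementation choices of this sketch, not prescribed by the text:

```python
import torch

def weighted_mse(y_true, y_pred):
    """Weighted MSE with w = (y_true - y_pred + 255) / 510, as in the formula above.

    Under-predicted pixels (potential misses, y_true > y_pred) get w > 0.5;
    over-predicted pixels (potential false alarms) get w < 0.5.
    """
    w = ((y_true - y_pred + 255.0) / 510.0).detach()  # weight in [0, 1], treated as constant
    return torch.mean(w * (y_true - y_pred) ** 2)
```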
S2.2, increasing the number of input data layers.
The TrajGRU network is allowed to accept multi-layer radar image sequence input: radar images from additional layers are added at every time step, i.e. the data fed to the recurrent network is no longer a single plane in three-dimensional space but volume data with a certain thickness, and the output of the model is volume data of the same thickness. In the new model, the variation law of the echo in three-dimensional space is thereby reflected in the parameters.
The loss-function weight of each layer of radar data in the TrajGRU deep learning network is calculated from the magnitude of that layer's loss value:

$$w_{l1} = \frac{error_1}{error_1 + error_2}, \qquad w_{l2} = \frac{error_2}{error_1 + error_2}$$

where error_1 is the loss value obtained from the first-layer input, error_2 is the loss value obtained from the second-layer input, and w_l1 and w_l2 are the computed weight values for the first-layer and second-layer inputs.
As shown in FIG. 10, error_1 is the loss function value for the first layer of radar data, error_2 is the loss function value for the second layer, and the total loss function value error of the TrajGRU network is calculated as:

$$error = w_{l1} \cdot error_1 + w_{l2} \cdot error_2$$
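A sketch of combining the two per-layer losses under the reconstruction above (weights proportional to each layer's loss; detaching the weights so they act as constants is an assumption of this sketch):

```python
def layer_weighted_loss(y_true, y_pred):
    """Total loss: per-layer weighted MSE combined with loss-proportional weights.

    y_true, y_pred: tensors of shape (N, 2, H, W); channels 0 and 1 hold the
    two elevation layers. Uses weighted_mse() from the sketch above.
    """
    error1 = weighted_mse(y_true[:, 0], y_pred[:, 0])  # first-layer loss
    error2 = weighted_mse(y_true[:, 1], y_pred[:, 1])  # second-layer loss
    total = (error1 + error2).detach()                 # normalizer for the weights
    wl1 = (error1 / total).detach()                    # larger loss -> larger weight
    wl2 = (error2 / total).detach()
    return wl1 * error1 + wl2 * error2                 # error = wl1*error1 + wl2*error2
```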
the Doppler weather radar scans one body every 6 minutes, and each body scan has a plurality of elevation angles and is converted into a plurality of height layer images in the actual weather service. The micro-physical processes such as weather, water vapor and the like occur in a three-dimensional space, the motion of each height layer has a certain relation, and the motion conditions of the water vapor can be reflected mutually, so that only one layer of radar echo image is extrapolated, and certain limitation is realized. The invention increases the radar layer number image at each moment, namely the data of the circulating network input image is not one surface in the three-dimensional space but the volume data with certain thickness, the output of the model is the volume data with the same thickness, and in the TrajGRU network model, the change rule of the echo in the three-dimensional space is reflected in the parameters.
The experimental data adopted by the invention are basic reflectivity factors in Doppler radar volume sweep data, and the basic data are preprocessed into a gray-scale image. The basic data of the basic reflectivity comprises radar data of a plurality of elevation angles, and the radar data of the plurality of elevation angles is decomposed. The radar images at 7 moments are input at a time, and the radar images at 7 moments in the future are predicted. The input data is a layer of radar image, the algorithm adds a layer of radar image, the two layers of data correspond to each other one by one at any time, and the corresponding relation is shown in the following graph 10.
S2.3, inputting the acquired grayscale image sequence into the improved TrajGRU deep learning network and training to obtain a prediction model.
A two-channel TrajGRU network is constructed as a multi-channel deep learning network: the two layers of radar echo images decomposed from the base data (base reflectivity factors) are input into the network for training, yielding the corresponding network model. On the one hand, the added input layer increases the radar echo image information fed to the network, so the network can learn richer features; on the other hand, it relatively suppresses overfitting of the neural network, so the network represents the radar echo image information more strongly and the model accuracy increases.
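A hedged sketch of one training step for this two-channel setup; `TrajGRUEncoderForecaster` is a placeholder name for any existing TrajGRU encoder-forecaster implementation (its constructor arguments are assumptions), and `loader` is assumed to yield (input, target) pairs shaped (B, 7, 2, H, W) with pixel values in [0, 255]:

```python
model = TrajGRUEncoderForecaster(in_channels=2, out_channels=2)  # placeholder module
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for inputs, targets in loader:             # each: (B, 7, 2, H, W) float tensors
    preds = model(inputs)                  # predict 7 future two-layer frames
    loss = layer_weighted_loss(targets.flatten(0, 1), preds.flatten(0, 1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```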
Step S3, inputting the observed radar image sequence used for prediction into the prediction model to obtain an extrapolated image sequence.
According to one embodiment of the present application, the results are analyzed and ConvLSTM is compared with TrajGRU.
The experimental data is the public dataset of the CIKM AnalytiCup 2017 competition. For data processing, the experimental data was first read and converted into grayscale image sequences using C++ in a Visual Studio 2015 development environment; the deep learning experiments were built in PyCharm with Python 3.6 and PyTorch 0.4.1. Each group of comparison experiments used the same training input data and the same number of training iterations.
Fig. 2 compares the radar image extrapolation results of the ConvLSTM and TrajGRU models. The first column is the prediction time, the second the ConvLSTM prediction, the third the TrajGRU prediction, and the fourth the real image. As the extrapolation time increases the images become more blurred and details are lost, but the blurring is milder for TrajGRU, showing that TrajGRU extrapolates better than ConvLSTM.
Fig. 3 compares the extrapolation results before and after the TrajGRU loss function is modified. The images from the model without the modified loss function gradually darken, and the extrapolation trend of the radar echo is not obvious. In contrast, the extrapolated radar image sequence from the model with the modified loss function is clearer, and its extrapolation trend is more evident and essentially consistent with the real data. The experiment shows that the model with the modified loss function maintains image details and light-dark contrast more effectively.
Fig. 4 compares the results of increasing only the number of input echo data layers for the TrajGRU model, without improving the loss function. The two-layer echo input captures image features more strongly: the small echo block in the lower-right corner is better retained, whereas it is poorly retained in the one-layer result. With the same number of iterations, the two-layer result has slightly worse texture detail and shows some under-fitting; increasing the number of iterations and the number of training samples would improve detail retention.
To verify the effectiveness of the improved model on real radar echo data, the model was applied to the weather radar echo image dataset of the Chengdu station for July 2018. After preprocessing such as desensitization, the data was converted into grayscale image sequences and used as model input. Fig. 5 compares the prediction network with the real data; the modified TrajGRU model produces richer image texture in the echo region and agrees better with the observations.
In the experiment, radar images at 7 consecutive times are used to predict and extrapolate the radar images at the 7 subsequent times. Since the weather radar completes a volume scan every 6 minutes, the interval between adjacent times is 6 minutes; that is, the images within a 42-minute window are used to extrapolate the images of the following 42 minutes. For convenience of description, the first predicted 6-minute time is denoted t1, the second t2, and so on, up to the seventh, t7.
The difference obtained by subtracting images is commonly used to describe their degree of similarity. Accordingly, for each prediction time ti (i = 1, 2, ..., 7), 100 predicted images are subtracted from the measured radar images at the corresponding times, and the absolute values of the differences are summed:

$$S = \frac{1}{100}\sum_{n=1}^{100}\left|P_n - P\right|$$

where P_n denotes the pixel values of the n-th extrapolated image, P denotes the pixel values of the corresponding live real image, and S is the average pixel difference over the 100 images. This expression clearly covers both false alarms and misses: the smaller the value of S, the closer the extrapolated image is to the measured image and the better the prediction; conversely, a larger S means a worse result.
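A minimal sketch of this metric, assuming the absolute pixel differences are summed over each image before averaging over the 100 images:

```python
import numpy as np

def s_metric(preds, truths):
    """Average pixel difference S over 100 images for one lead time ti.

    preds, truths: arrays of shape (100, H, W) for the predicted and
    measured images; returns the mean over images of the per-image sum
    of absolute pixel differences.
    """
    diffs = np.abs(preds.astype(np.float64) - truths.astype(np.float64))
    return diffs.sum(axis=(1, 2)).mean()
```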
In Figs. 6 to 9, the abscissa values 1 to 7 represent times t1 to t7, and the ordinate is the computed value of S. The histograms in the four figures generally trend upward from t1 to t7, indicating that the accumulation of errors during extrapolation lowers the accuracy at later times, which is consistent with the actual situation.
Figs. 6 and 7 compare, for the TrajGRU and ConvLSTM models respectively, the results before and after the false-alarm/miss modification of the loss function; the results show that predictions are more accurate after the loss function is modified.
The experimental results in Fig. 8 show that TrajGRU is more robust along the time dimension and has better overall extrapolation performance; the experiment on increasing the number of input radar layers was therefore carried out on TrajGRU. As shown in Fig. 9, increasing the number of input radar layers clearly improves the image extrapolation results, and the effect is more pronounced at earlier lead times.
The invention preprocesses radar image sequence data, reads it into a grayscale image sequence, inputs it into the algorithmically modified TrajGRU network model, and obtains a prediction model. The improved TrajGRU network model captures the spatio-temporal correlation of radar images more accurately, further improving image quality; the extrapolated result images preserve image details better, and the radar echo distribution is closer to a real radar scan image.
Although the embodiments of the invention have been described in detail with reference to the accompanying drawings, they are not intended to limit the scope of the invention. Various modifications and changes may be made by those skilled in the art, without inventive effort, within the scope of the appended claims.

Claims (7)

1. A meteorological radar echo extrapolation method based on an improved TrajGRU network, characterized by comprising the following steps:
S1, reading radar image sequence data bin files and processing them into a grayscale image sequence;
S2, inputting the processed grayscale image sequence into the constructed TrajGRU deep learning network and training to obtain a prediction model;
S3, inputting the observed radar image sequence used for prediction into the prediction model to obtain an extrapolated image sequence.
2. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 1, wherein step S1 of reading the radar image sequence data bin file and processing it into a grayscale image sequence comprises the following steps:
S1.1, acquiring a radar bin file and reading it as a color picture;
S1.2, processing the color picture into a grayscale image sequence.
3. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 1, wherein the specific steps of S2 comprise:
S2.1, calculating the weight of the loss function of the TrajGRU deep learning network according to the pixel prediction accuracy;
S2.2, adding at least one further layer of radar image sequence input to the TrajGRU network, wherein the loss-function weight of each layer of radar data is calculated from the magnitude of that layer's loss value, so that different loss values receive different weights;
S2.3, inputting the obtained grayscale image sequence into the improved TrajGRU deep learning network and training to obtain a prediction model.
4. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 3, wherein step S2.1 of calculating the weight of the loss function of the TrajGRU deep learning network according to the pixel prediction accuracy comprises:
calculating the loss value point by point, and applying a weight w to the loss of the pixels at false-alarm and miss positions:

$$\mathrm{wMSE} = \frac{1}{n}\sum_{i=1}^{n} w_i \left(y_i - y_i^{p}\right)^{2}, \qquad w_i = \frac{y_i - y_i^{p} + 255}{510}$$

where wMSE is the loss function value, y_i is the pixel value of the radar live-scan echo image, and y_i^p is the pixel value of the model extrapolation result image; 255 is added to ensure the weight is positive, and the division by 510 normalizes the weight.
5. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 3, wherein step S2.2 of adding at least one further layer of radar image sequence input to the TrajGRU network, with the loss-function weight of each layer of radar data calculated from the magnitude of that layer's loss value so that different loss values receive different weights, comprises:

$$w_{l1} = \frac{error_1}{error_1 + error_2}, \qquad w_{l2} = \frac{error_2}{error_1 + error_2}$$

$$error = w_{l1} \cdot error_1 + w_{l2} \cdot error_2$$

where error_1 is the loss value obtained from the first-layer input, error_2 is the loss value obtained from the second-layer input, error is the total loss value, and w_l1 and w_l2 are the computed weight values for the first-layer and second-layer inputs.
6. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 3, wherein in step S2.2 the method for adding a layer of radar image sequence input to the TrajGRU network comprises:
in the TrajGRU network model, inputting radar images at 7 times at once and predicting radar images at 7 future times; the original input data is a single layer of radar images, and after a further layer is added, the data of the two layers of radar images correspond to each other time by time.
7. The meteorological radar echo extrapolation method based on an improved TrajGRU network according to claim 3, wherein step S2.3 of inputting the obtained grayscale image sequence into the improved TrajGRU deep learning network and training to obtain a prediction model comprises:
constructing a two-channel TrajGRU network, inputting the two layers of radar echo image sequences decomposed from the base data into the TrajGRU network for training, and obtaining the prediction model after training.
CN202010961607.1A (priority and filing date 2020-09-14): Weather radar echo extrapolation method based on improved TrajGRU network. Status: Active; granted as CN112180375B.

Publications (2)

Publication Number  Publication Date
CN112180375A       2021-01-05
CN112180375B       2022-12-20

Family ID: 73920928 (CN)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09145851A (en) * 1995-11-28 1997-06-06 Nippon Telegr & Teleph Corp <Ntt> Meteorological forecasting device
CN108734357A (en) * 2018-05-29 2018-11-02 北京佳格天地科技有限公司 Weather prognosis system and method
CN108732550A (en) * 2018-08-01 2018-11-02 北京百度网讯科技有限公司 Method and apparatus for predicting radar return
CN110456355A (en) * 2019-08-19 2019-11-15 河南大学 A kind of Radar Echo Extrapolation method based on long short-term memory and generation confrontation network
CN111487624A (en) * 2020-04-23 2020-08-04 上海眼控科技股份有限公司 Method and equipment for predicting rainfall capacity
CN111521990A (en) * 2020-05-11 2020-08-11 沈阳工业大学 Rainfall analysis method based on multilayer radar echo data

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party

Title
GAN JIANHONG et al., "Research on weather radar nowcasting extrapolation", 2020 International Conference on Computer Vision, Image and Deep Learning (CVIDL) *
JINRUI JING et al., "MLC-LSTM: Exploiting the Spatiotemporal Correlation between Multi-Level Weather Radar Echoes for Echo Sequence Extrapolation", Sensors *
XINGJIAN SHI et al., "Deep Learning for Precipitation Nowcasting: A Benchmark and A New Model", 31st Conference on Neural Information Processing Systems (NIPS 2017) *
FANG Wei et al., "A review of the application of artificial intelligence in precipitation nowcasting", Journal of Nanjing University of Information Science & Technology (Natural Science Edition) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113239722A (en) * 2021-03-31 2021-08-10 成都信息工程大学 Deep learning based strong convection extrapolation method and system under multi-scale
CN113239722B (en) * 2021-03-31 2022-08-30 成都信息工程大学 Deep learning based strong convection extrapolation method and system under multi-scale
CN113128778B (en) * 2021-04-27 2022-08-26 最美天气(上海)科技有限公司 Model training method based on graded TS meteorological scoring
CN113128778A (en) * 2021-04-27 2021-07-16 最美天气(上海)科技有限公司 Model training method based on graded TS meteorological scoring
CN113341419A (en) * 2021-05-25 2021-09-03 成都信息工程大学 Weather extrapolation method and system based on VAN-ConvLSTM
CN113640769A (en) * 2021-08-27 2021-11-12 南京信息工程大学 Weather radar basic reflectivity prediction method based on deep neural network
CN113724287A (en) * 2021-09-02 2021-11-30 北京华云星地通科技有限公司 Satellite cloud picture prediction method and system
CN113936142A (en) * 2021-10-13 2022-01-14 成都信息工程大学 Rainfall approach forecasting method and device based on deep learning
CN113936142B (en) * 2021-10-13 2024-06-18 成都信息工程大学 Precipitation proximity forecasting method and device based on deep learning
CN114488070A (en) * 2022-04-08 2022-05-13 北京弘象科技有限公司 Radar echo extrapolation method and device based on deep learning model
CN114488070B (en) * 2022-04-08 2022-07-19 北京弘象科技有限公司 Radar echo extrapolation method and device based on deep learning model
CN117233724A (en) * 2023-11-15 2023-12-15 中国气象局公共气象服务中心(国家预警信息发布中心) Radar echo extrapolation method and device based on depth space-time attention network
CN117233724B (en) * 2023-11-15 2024-02-06 中国气象局公共气象服务中心(国家预警信息发布中心) Radar echo extrapolation method and device based on depth space-time attention network


Legal Events

Code  Title Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant