CN113640769A - Weather radar basic reflectivity prediction method based on deep neural network - Google Patents

Weather radar basic reflectivity prediction method based on deep neural network Download PDF

Info

Publication number
CN113640769A
Authority
CN
China
Prior art keywords
neural network
deep neural
radar
basic reflectivity
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110997434.3A
Other languages
Chinese (zh)
Other versions
CN113640769B (en)
Inventor
钱代丽
束宇
王兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Meteorological Bureau
Nanjing University of Information Science and Technology
Original Assignee
Nanjing Meteorological Bureau
Nanjing University of Information Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Meteorological Bureau, Nanjing University of Information Science and Technology filed Critical Nanjing Meteorological Bureau
Priority to CN202110997434.3A priority Critical patent/CN113640769B/en
Publication of CN113640769A publication Critical patent/CN113640769A/en
Application granted granted Critical
Publication of CN113640769B publication Critical patent/CN113640769B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/95Radar or analogous systems specially adapted for specific applications for meteorological use
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01WMETEOROLOGY
    • G01W1/00Meteorology
    • G01W1/10Devices for predicting weather conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • Electromagnetism (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a weather radar basic reflectivity prediction method based on a deep neural network which, compared with common weather radar basic reflectivity prediction methods, is more concerned with predicting the evolution trend of the spatial form of the basic reflectivity.

Description

Weather radar basic reflectivity prediction method based on deep neural network
Technical field:
the invention belongs to the field of geoscience, and particularly relates to a weather radar basic reflectivity prediction method based on a deep neural network, which is suitable for short-term weather forecast.
Background art:
The basic reflectivity is one of the fundamental quantities detected by a weather radar, and its value directly reflects the size and density distribution of precipitation particles in the atmosphere. In the nowcasting of severe convective weather, the evolution of the convection is generally analyzed through the prediction of the basic reflectivity. The mainstream approach in current operational nowcasting of severe weather is basic reflectivity echo-area tracking; common methods include the cross-correlation algorithm, the optical flow method and TITAN, and each of these forecasting techniques has advantages and disadvantages under different weather backgrounds. In general, the main advantage of the cross-correlation algorithm is that its calculation is relatively simple, and it is suitable not only for the tracking and nowcasting of severe convective storms but also for general convective precipitation systems and for wide-ranging mixed convective and stratiform precipitation systems. However, for precipitation echoes that are generated locally and whose intensity and shape change rapidly with time, the quality of the motion vector field given by the cross-correlation method deteriorates and tracking failures increase significantly. The advantage of the optical flow method is that, even when the movement and shape change of a thunderstorm are severe, the overall movement trend of the thunderstorm can still be obtained accurately, which the cross-correlation method cannot match. For locally generated, fast-changing moving thunderstorms and frontal trough systems, the optical flow method therefore performs better than the cross-correlation algorithm, whereas for tropical-system precipitation and quasi-stationary, locally enhanced precipitation, the cross-correlation method is slightly superior to the optical flow method. In addition, both the cross-correlation and the optical flow methods diverge seriously as the extrapolation lead time is extended. Object-based storm identification and extrapolation methods such as TITAN or STI can identify and track small isolated thunderstorm cells more effectively, obtaining the thunderstorm motion vector by extrapolating the thunderstorm centroid position, so techniques that take thunderstorm cells as the unit of identification, matching and extrapolation are better suited to small-scale thunderstorms. Their drawback is that TITAN can only identify and track convective precipitation systems, cannot be used for the nowcasting of stratiform precipitation, and does not perform as well as cross-correlation in forecasting large-scale systematic precipitation.
Accurate and reliable radar detection data are crucial to accurate prediction of the basic reflectivity. Although a weather radar can be regularly maintained and calibrated, and data quality control can be performed through various technical means, ground clutter caused by various factors still exists and is difficult to suppress accurately and effectively. When the basic reflectivity is predicted with a deep neural network, a large amount of historical radar detection data is used as training samples, and the clutter information in these samples can seriously interfere with the training of the deep neural network model, which in turn affects the accuracy of the basic reflectivity prediction. Moreover, in practical applications, forecasters tend to pay more attention to predicting the evolution trend of the spatial form of the basic reflectivity than to predicting the basic reflectivity intensity of any particular region.
Summary of the invention:
in order to solve the problems in the prior art, the invention provides a weather radar basic reflectivity prediction method based on a deep neural network, which reduces the negative influence of various ground clutter in a radar detection data sample on the training process of the deep neural network model, optimizes the prediction effect on the evolution trend of a basic reflectivity space structure, and improves the accuracy of weather radar basic reflectivity prediction.
The technical scheme of the invention is as follows:
a weather radar basic reflectivity prediction method based on a deep neural network comprises the following specific steps:
1) acquiring historical detection data of a weather radar, and preprocessing the historical data;
2) constructing a deep neural network model, taking the preprocessed historical data as a training set, and training the deep neural network model;
3) predicting the future basic reflectivity of the weather radar based on the trained deep neural network model.
Preferably, the specific steps of step 1) are as follows:
Step 1.1) extracting the basic reflectivity from the historical detection data of the weather radar, converting it from the polar coordinate format into a plane rectangular coordinate format by using a spatial interpolation algorithm, and expressing the basic reflectivity of a detection point after conversion as D(x, y, z, t); wherein x, y and z are the coordinates of the detection point in the longitude, latitude and height directions respectively; x, y and z are positive integers in [1, N], and N is the total number of detection points in each of the longitude, latitude and height directions; t is the radar detection time, with value range [1, X]; X is the total number of radar detection times in the weather radar historical detection data; a variable n is also defined, with n < N and n a positive integer;
Step 1.2) arbitrarily selecting the basic reflectivity D(x, y, z_R, t_R) at a radar detection time t_R and a height z_R; letting d_{i,j}(x, y) = D(x, y, z_R, t_R), wherein i and j are positive integers starting from 1 and used for counting;
Step 1.3) taking n as the side length and n as the moving step, dividing d_{i,j}(x, y) from left to right and from top to bottom into a number of small square regions with side length n; analyzing each small region of d_{i,j}(x, y) in turn, keeping the maximum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that maximum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_max_{i,j}(x, y); similarly, analyzing each small region of d_{i,j}(x, y) in turn, keeping the minimum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that minimum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_min_{i,j}(x, y);
Step 1.4) interpolating d_max_{i,j}(x, y) and d_min_{i,j}(x, y), filling the coordinate positions set to NULL in d_max_{i,j}(x, y) and d_min_{i,j}(x, y) with the interpolation results, keeping the basic reflectivity values of the coordinate positions that are not NULL unchanged, and recording the results after the interpolation as d_max'_{i,j}(x, y) and d_min'_{i,j}(x, y);
step 1.5) the mean value of each coordinate position is calculated by adopting the following formula:
d_mean_{i,j}(x, y) = [d_max'_{i,j}(x, y) + d_min'_{i,j}(x, y)] / 2
Step 1.6) calculating, with the following formula, the difference between the basic reflectivity detection value of each coordinate position and the mean value d_mean_{i,j}(x, y) of the same coordinate position:
d_rem_{i,j}(x, y) = d_{i,j}(x, y) - d_mean_{i,j}(x, y)
Step 1.7): when j = 1, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3); when j > 1, judging according to the following formula:
(formula image not reproduced here: a stopping criterion that compares d_rem_{i,j}(x, y) with d_rem_{i,j-1}(x, y) and is tested against the threshold TH1)
wherein TH1 is an empirical threshold, TH1 ∈ (0, 1);
when the above formula is satisfied, letting d_i(x, y) = d_rem_{i,j}(x, y) and going to step 1.8); when the above formula is not satisfied, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3);
step 1.8) the judgment is carried out according to the following formula:
i≤TH2
wherein TH2 is an empirical threshold with a value range of [2, 20], used to limit the number of iterations of the counter i; when the above formula holds, letting i = i + 1 and d_{i,1}(x, y) = d_{i-1,j}(x, y) - d_rem_{i-1,j}(x, y), then letting j = 1 and returning to step 1.3); when the above formula does not hold, obtaining the set d_i(x, y), i = 1, 2, 3, ..., TH2, corresponding to the selected height z_R and the selected radar detection time t_R, and proceeding to step 1.9);
Step 1.9) repeating steps 1.2) to 1.8) to obtain the basic reflectivity data for all heights and all radar detection times, recorded as d_i(x, y, z, t), i = 1, 2, 3, ..., TH2.
Preferably, the historical detection data of the weather radar in the step 1) are orderly arranged according to time sequence, and the time intervals of the adjacent data are kept the same.
Preferably, the historical detection data of the weather radar in the step 1) is the detection data of the weather radar for Y years, and the value range of Y is [2, 30 ].
Preferably, the specific steps of step 2) are as follows:
Step 2.1) constructing a deep neural network model, expressed as ds_p(x, y, z, t'+1) = model(ds_t(x, y, z, t')),
wherein model denotes the deep neural network model, and ds_t(x, y, z, t') is the input data of the model, ds_t(x, y, z, t') = [d_i(x, y, z, t'), d_i(x, y, z, t'-1), d_i(x, y, z, t'-2), ..., d_i(x, y, z, t'-r)], r being a positive integer with a value range of [5, 100]; ds_p(x, y, z, t'+1) is the output data of the model, i.e. the training label, and ds_p(x, y, z, t'+1) = D(x, y, z, t'+1);
Step 2.2) training the deep neural network model by taking the historical data preprocessed in step 1) as the training set, using ds_t(x, y, z, t'), with x, y, z = 1, 2, ..., N and t' = r+1, ..., X-2, X-1, as the input data of the model, and using the corresponding ds_p(x, y, z, t'+1) as the label of model training.
Preferably, the deep neural network model in the step 2.1) adopts a Conv-LSTM model or a TrajGRU model.
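For illustration only, the following is a minimal sketch of one possible realization of such a model, built with the Keras ConvLSTM2D layer; the layer widths, the 6-frame input window and the single-frame output head are assumptions introduced for this example rather than architecture details fixed by the patent, and a TrajGRU realization would replace the ConvLSTM2D layers with trajectory GRU units.

# Minimal ConvLSTM sketch (illustrative assumptions, not the patent's fixed architecture).
# Input: a window of past reflectivity fields, each N x N, with the TH2 decomposition
# layers stacked as channels; output: the basic reflectivity field at the next time.
from tensorflow.keras import layers, models

def build_convlstm_model(seq_len=6, grid=100, channels=9):
    inp = layers.Input(shape=(seq_len, grid, grid, channels))
    x = layers.ConvLSTM2D(32, kernel_size=3, padding="same", return_sequences=True)(inp)
    x = layers.BatchNormalization()(x)
    x = layers.ConvLSTM2D(32, kernel_size=3, padding="same", return_sequences=False)(x)
    x = layers.BatchNormalization()(x)
    out = layers.Conv2D(1, kernel_size=1, padding="same", activation="linear")(x)
    model = models.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

During training, the inputs would be the windows ds_t built from the decomposed fields d_i, and the labels would be the basic reflectivity at the next time, as described in step 2.2).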
Preferably, the specific steps of step 3) are as follows:
Step 3.1) recording the current radar detection time as t_0; processing the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r according to the method of steps 1.2) to 1.9) to obtain a data set ds_t(x, y, z, t_0);
Step 3.2) inputting the data set ds_t(x, y, z, t_0) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+1;
Step 3.3) processing the result calculated in step 3.2), together with the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r+1, according to the method of steps 1.2) to 1.9) to obtain a new data set ds_t(x, y, z, t_0+1);
Step 3.4) inputting the new data set ds_t(x, y, z, t_0+1) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+2, and so on, whereby the basic reflectivity at the future times t_0+1, t_0+2, t_0+3, ... can be obtained.
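Purely as a hedged illustration of this rolling prediction, the sketch below assumes a hypothetical preprocess function implementing steps 1.2) to 1.9) and a trained model object following the input/output convention of step 2.1); both names are introduced here only for the example.

import numpy as np

def rollout(model, preprocess, history, steps, r=5):
    # Autoregressive prediction (steps 3.1-3.4): preprocess the last r+1 fields,
    # predict the next field, append the prediction to the history, and repeat.
    history = list(history)              # raw reflectivity fields, most recent last
    predictions = []
    for _ in range(steps):
        ds_t = np.stack([preprocess(f) for f in history[-(r + 1):]], axis=0)
        nxt = model.predict(ds_t[np.newaxis, ...])[0]   # field at the next time
        predictions.append(nxt)
        history.append(nxt)              # the prediction enters the next input window
    return predictions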
Compared with the prior art, the invention has the following beneficial effects:
The invention provides a weather radar basic reflectivity prediction method based on a deep neural network which, compared with common weather radar basic reflectivity prediction methods, is more concerned with predicting the evolution trend of the spatial form of the basic reflectivity. By adopting a layered, iterative decomposition of the basic reflectivity and feeding the two-dimensional basic reflectivity features of the different extremum spaces into the deep neural network model for training, instead of feeding the raw basic reflectivity data directly into the model, the method can to a certain extent reduce the adverse effect of various ground clutter in the data samples on the training of the deep neural network model, optimize the prediction of the evolution trend of the basic reflectivity spatial structure, and further improve the accuracy of weather radar basic reflectivity prediction.
Description of the drawings:
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a flow chart of the steps of the method of the present invention;
FIG. 3 is a schematic diagram of an example division of d_{i,j}(x, y);
the specific implementation mode is as follows:
the first embodiment is as follows:
in this embodiment, as shown in fig. 1, the method for predicting the basic reflectivity of a weather radar based on a deep neural network includes the following specific steps:
1) acquiring historical detection data of the weather radar and preprocessing the historical data, with the following specific steps:
Step 1.1) extracting the basic reflectivity from the historical detection data of the weather radar, converting it from the polar coordinate format into a plane rectangular coordinate format by using a spatial interpolation algorithm, and expressing the basic reflectivity of a detection point after conversion as D(x, y, z, t); wherein x, y and z are the coordinates of the detection point in the longitude, latitude and height directions respectively; x, y and z are positive integers in [1, N], and N is the total number of detection points in each of the longitude, latitude and height directions; t is the radar detection time and represents the sequence number of the radar historical detection data arranged in order of detection time; t has the value range [1, X]; X is the total number of radar detection times in the weather radar historical detection data; a variable n is then defined, with n < N and n a positive integer; the historical detection data of the weather radar are the detection data of the same weather radar for Y consecutive years, the value range of Y being [3, 5]; the historical detection data are arranged in chronological order, and the time intervals between adjacent data are kept the same.
Step 1.2) arbitrarily selecting the basic reflectivity D(x, y, z_R, t_R) at a radar detection time t_R and a height z_R; letting d_{i,j}(x, y) = D(x, y, z_R, t_R), wherein i and j are positive integers starting from 1 and used for counting, and the current values of i and j are both 1;
Step 1.3) taking n as the side length and n as the moving step, dividing d_{i,j}(x, y) from left to right and from top to bottom into a number of small square regions with side length n; analyzing each small region of d_{i,j}(x, y) in turn, keeping the maximum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that maximum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_max_{i,j}(x, y); similarly, analyzing each small region of d_{i,j}(x, y) in turn, keeping the minimum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that minimum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_min_{i,j}(x, y);
Step 1.4) interpolating d_max_{i,j}(x, y) and d_min_{i,j}(x, y), filling the coordinate positions set to NULL in d_max_{i,j}(x, y) and d_min_{i,j}(x, y) with the interpolation results, keeping the basic reflectivity values of the coordinate positions that are not NULL unchanged, and recording the results after the interpolation as d_max'_{i,j}(x, y) and d_min'_{i,j}(x, y);
Step 1.5) the mean value of each coordinate position is calculated by adopting the following formula:
d_mean_{i,j}(x, y) = [d_max'_{i,j}(x, y) + d_min'_{i,j}(x, y)] / 2
Step 1.6) calculating, with the following formula, the difference between the basic reflectivity detection value of each coordinate position and the mean value d_mean_{i,j}(x, y) of the same coordinate position:
d_rem_{i,j}(x, y) = d_{i,j}(x, y) - d_mean_{i,j}(x, y)
Step 1.7): when j = 1, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3); when j > 1, judging according to the following formula:
(formula image not reproduced here: a stopping criterion that compares d_rem_{i,j}(x, y) with d_rem_{i,j-1}(x, y) and is tested against the threshold TH1)
wherein TH1 is an empirical threshold, TH1 ∈ (0, 1);
when the above formula is satisfied, letting d_i(x, y) = d_rem_{i,j}(x, y) and going to step 1.8); when the above formula is not satisfied, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3);
step 1.8) the judgment is carried out according to the following formula:
i≤TH2
wherein TH2 is an empirical threshold with a value range of [2, 20], used to limit the number of iterations of the counter i; when the above formula holds, letting i = i + 1 and d_{i,1}(x, y) = d_{i-1,j}(x, y) - d_rem_{i-1,j}(x, y), then letting j = 1 and returning to step 1.3); when the above formula does not hold, obtaining the set d_i(x, y), i = 1, 2, 3, ..., TH2, corresponding to the selected height z_R and the selected radar detection time t_R, and proceeding to step 1.9);
Step 1.9) repeating steps 1.2) to 1.8) to obtain the basic reflectivity data for all heights and all radar detection times, and recording the result as d_i(x, y, z, t), i = 1, 2, 3, ..., TH2.
2) Constructing a deep neural network model, and training the deep neural network model by using the preprocessed historical data, wherein the step 2) specifically comprises the following steps:
Step 2.1) constructing a deep neural network model, expressed as ds_p(x, y, z, t'+1) = model(ds_t(x, y, z, t')),
wherein model denotes the deep neural network model, which adopts a Conv-LSTM model or a TrajGRU model; ds_t(x, y, z, t') is the input data of the model, ds_t(x, y, z, t') = [d_i(x, y, z, t'), d_i(x, y, z, t'-1), d_i(x, y, z, t'-2), ..., d_i(x, y, z, t'-r)], r being a positive integer with a value range of [5, 100]; ds_p(x, y, z, t'+1) is the output data of the model, and ds_p(x, y, z, t'+1) = D(x, y, z, t'+1);
Taking the historical data preprocessed in step 1) as the training set, successively taking x = 1, 2, ..., N, y = 1, 2, ..., N, z = 1, 2, ..., N and t' = r+1, ..., X-2, X-1, using ds_t(x, y, z, t') as the input data of the model and the corresponding ds_p(x, y, z, t'+1) as the label of model training, the deep neural network model is trained.
3) Predicting the basic reflectivity of the future weather radar based on the trained deep neural network model, wherein the specific steps of the step 3) are as follows:
Step 3.1) recording the current radar detection time as t_0; processing the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r according to the method of steps 1.2) to 1.9) to obtain a data set ds_t(x, y, z, t_0);
Step 3.2) inputting the data set ds_t(x, y, z, t_0) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+1;
Step 3.3) processing the result calculated in step 3.2), together with the basic reflectivity at the times t_0, t_0-1, t_0-2, ..., t_0-r+1, according to the method of steps 1.2) to 1.9) to obtain a new data set ds_t(x, y, z, t_0+1);
Step 3.4) inputting the new data set ds_t(x, y, z, t_0+1) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+2, and so on, whereby the basic reflectivity at the future times t_0+1, t_0+2, t_0+3, ... can be obtained.
Embodiment 2:
In this embodiment, taking 3 consecutive years of historical data from a certain weather radar as an example, the deep neural network model is trained with the prediction method of the present invention and the basic reflectivity is then predicted with the trained model; the specific steps are as follows:
step 1) acquiring historical detection data of a weather radar, and preprocessing the historical data, wherein the method comprises the following specific steps:
Step 1.1) Prepare 3 consecutive years of weather radar historical data. The data come from the same radar operating in VCP21 mode, scanning 9 different elevation angles every 6 minutes. The historical detection data of the weather radar must be normalized so that the data are arranged in chronological order and the time interval between every two adjacent data is the same, here 6 minutes. In this embodiment the parameter r takes the value 5, and the model input each time is ds_t(x, y, z, t') = [d_i(x, y, z, t'), d_i(x, y, z, t'-1), d_i(x, y, z, t'-2), ..., d_i(x, y, z, t'-5)]; thus, the data input into the model each time comprise the radar detection data of the 6 adjacent times t', t'-1, t'-2, ..., t'-5.
Read the weather radar detection data and perform coordinate system conversion on the basic reflectivity in the data. The main process is as follows: extract the basic reflectivity from the data, and then convert the data from the polar coordinate format to a plane rectangular coordinate format using a spatial interpolation algorithm. The converted radar basic reflectivity data are expressed as D(x, y, z, t), where x, y and z respectively denote the coordinates in the longitude, latitude and height directions, and t denotes the radar detection time with value range [1, X]; X is the total number of radar detection times in the weather radar historical detection data. The time interval of the data in this embodiment is 6 minutes, so the total number of radar detection times over the three years is 262800. Assume, without loss of generality, that the number of detection points of the converted basic reflectivity data in each of the longitude, latitude and height directions is N; in this embodiment, N = 100, so x, y and z are all positive integers in [1, 100]. A variable n is defined for the subsequent calculation steps, with n < N and n a positive integer; in this embodiment, n = 5.
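The patent does not prescribe a particular spatial interpolation algorithm for this coordinate conversion; the sketch below is one hedged example that maps polar gates (elevation, azimuth, range) onto a regular Cartesian grid with SciPy's nearest-neighbour gridding. The flat-earth geometry, the 150 km horizontal extent and the 15 km vertical extent are simplifying assumptions introduced only for illustration.

import numpy as np
from scipy.interpolate import griddata

def polar_to_cartesian(refl, ranges_m, azimuths_deg, elevations_deg,
                       grid_size=100, extent_m=150e3, top_m=15e3):
    # refl: reflectivity gates of shape (n_elev, n_az, n_range), in dBZ.
    # Returns D of shape (grid_size, grid_size, grid_size), indexed as (x, y, z).
    el, az, rg = np.meshgrid(np.radians(elevations_deg),
                             np.radians(azimuths_deg),
                             ranges_m, indexing="ij")
    # Simplified geometry: earth curvature and beam refraction are ignored.
    x = rg * np.cos(el) * np.sin(az)
    y = rg * np.cos(el) * np.cos(az)
    z = rg * np.sin(el)
    gx = np.linspace(-extent_m, extent_m, grid_size)
    gz = np.linspace(0.0, top_m, grid_size)
    X, Y, Z = np.meshgrid(gx, gx, gz, indexing="ij")
    points = np.column_stack([x.ravel(), y.ravel(), z.ravel()])
    return griddata(points, refl.ravel(), (X, Y, Z), method="nearest")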
Step 1.2): Take the basic reflectivity D(x, y, 1, 1) at radar detection time t = 1 and height z = 1, and let d_{i,j}(x, y) = D(x, y, 1, 1), where i and j are positive integers starting from 1, used for counting in the subsequent iterations; their current values are both 1.
Step 1.3): With n as the side length and n as the moving step, divide d_{i,j}(x, y) from left to right and from top to bottom into a number of small square regions with side length n, as shown in FIG. 3. When N is not an integer multiple of n, the rightmost and lowermost small regions of d_{i,j}(x, y) are not squares of side length n. In this embodiment, n = 5 and N = 100, so d_{i,j}(x, y) is divided into 20 small square regions in each of the x and y directions. Analyze the maximum value of the basic reflectivity of each small region in turn, retain the basic reflectivity values only at the coordinate positions where the maximum is located, set the basic reflectivity of the other coordinate positions of the small region to NULL, and record the calculated result as d_max_{i,j}(x, y). Similarly, analyze the minimum value of the basic reflectivity of each small region in turn, retain the basic reflectivity values only at the coordinate positions where the minimum is located, set the basic reflectivity of the other coordinate positions of the small region to NULL, and record the calculated result as d_min_{i,j}(x, y).
Step 1.4): Using an interpolation algorithm (this embodiment uses the inverse distance weighting (IDW) interpolation algorithm), interpolate d_max_{i,j}(x, y) and d_min_{i,j}(x, y) respectively, and fill the coordinate positions set to NULL in d_max_{i,j}(x, y) and d_min_{i,j}(x, y) with the interpolation results. The basic reflectivity values of the coordinate positions that are not NULL remain unchanged. The results after the interpolation are recorded as d_max'_{i,j}(x, y) and d_min'_{i,j}(x, y).
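As a hedged illustration of this step, the sketch below fills the NULL (here NaN) positions of d_max_{i,j}(x, y) or d_min_{i,j}(x, y) by inverse distance weighting from all retained extremum positions; the power parameter p = 2 and the use of all non-NULL cells as neighbours are assumptions, since the embodiment does not fix these details.

import numpy as np

def idw_fill(field, power=2.0):
    # Fill NaN cells with an inverse-distance-weighted average of all non-NaN cells;
    # non-NaN cells (the retained block extrema) are left unchanged.
    filled = field.copy()
    known = ~np.isnan(field)
    ky, kx = np.nonzero(known)
    kv = field[known]
    for y, x in zip(*np.nonzero(~known)):
        d2 = (ky - y) ** 2 + (kx - x) ** 2          # squared distances to known cells
        w = 1.0 / np.power(d2, power / 2.0)         # IDW weights 1 / d**power
        filled[y, x] = np.sum(w * kv) / np.sum(w)
    return filled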
Step 1.5): the mean value for each coordinate position is calculated using the following formula:
d_mean_{i,j}(x, y) = [d_max'_{i,j}(x, y) + d_min'_{i,j}(x, y)] / 2
step 1.6): calculating the basic reflectivity detection value of each coordinate position and the mean value d _ mean of the same coordinate position by adopting the following formulai,jDifference between (x, y):
d_remi,j(x,y)=di,j(x,y)-d_meani,j(x,y)
step 1.7): when j is 1, let j be j +1, di,j(x,y)=d_remi,j-1(x, y), and returning to the calculation process of the step 1.3). When j is more than 1, judging according to the following formula:
Figure BDA0003234323820000082
where TH1 ∈ (0, 1) is an empirical threshold, and TH1 is 0.5 in this embodiment. When the above formula is satisfied, let di(x,y)=d_remi,j(x, y), go to step 1.8); when the above formula is not satisfied, let j equal to j +1, di,j(x,y)=d_remi,j-1(x, y), return to step 1.3).
Step 1.8): the judgment is carried out according to the following formula:
i≤TH2
where TH2 is an empirical threshold with a general value range of [2, 20]; in this embodiment, TH2 = 9, and it is used to limit the number of iterations of the counter i. When the above formula is satisfied, let i = i + 1 and d_{i,1}(x, y) = d_{i-1,j}(x, y) - d_rem_{i-1,j}(x, y), then let j = 1 and return to the calculation process of step 1.3). When the above formula does not hold, go to step 1.9).
Step 1.9): From the above calculation process, a set of d_i(x, y), i = 1, 2, 3, ..., 8, 9 is obtained. Successively calculate the radar basic reflectivity data for z + 1 and t + 1 according to steps 1.2) to 1.8), so as to obtain the d_i(x, y) for all heights and all radar detection times, recorded as d_i(x, y, z, t), i = 1, 2, 3, ..., TH2.
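For orientation only, the following is a minimal NumPy/SciPy sketch of the layered iterative decomposition of steps 1.2) to 1.9), using the parameter values of this embodiment (n = 5, TH1 = 0.5, TH2 = 9). The nearest-neighbour fill stands in for the IDW interpolation of step 1.4), and the stopping criterion of step 1.7) is an assumed normalized-change ratio, since the formula image is not reproduced in this text; neither choice is fixed by the patent.

import numpy as np
from scipy.interpolate import griddata

def block_extreme(d, n, mode):
    # Per n x n block, keep only the max (or min) value; set the rest to NaN (NULL).
    out = np.full_like(d, np.nan)
    H, W = d.shape
    for y0 in range(0, H, n):
        for x0 in range(0, W, n):
            blk = d[y0:y0 + n, x0:x0 + n]
            idx = np.nanargmax(blk) if mode == "max" else np.nanargmin(blk)
            by, bx = np.unravel_index(idx, blk.shape)
            out[y0 + by, x0 + bx] = blk[by, bx]
    return out

def fill_nan(field):
    # Fill NaN cells by interpolation from the retained extrema (IDW would also do).
    yy, xx = np.mgrid[0:field.shape[0], 0:field.shape[1]]
    known = ~np.isnan(field)
    return griddata(np.column_stack([yy[known], xx[known]]), field[known],
                    (yy, xx), method="nearest")

def decompose(d0, n=5, TH1=0.5, TH2=9):
    # Layered iterative decomposition of one slice D(., ., z_R, t_R), steps 1.2)-1.9);
    # returns the list [d_1(x, y), ..., d_TH2(x, y)].
    layer_fields = []
    d = d0.astype(float)
    for _ in range(TH2):
        dij = d.copy()
        prev = None
        for _ in range(50):                       # safety cap on the inner (j) iterations
            d_mean = 0.5 * (fill_nan(block_extreme(dij, n, "max")) +
                            fill_nan(block_extreme(dij, n, "min")))
            d_rem = dij - d_mean                  # step 1.6)
            if prev is not None:
                # Step 1.7): ASSUMED stopping criterion (normalized change between
                # successive residual fields), standing in for the original formula.
                if np.sum((prev - d_rem) ** 2) / (np.sum(prev ** 2) + 1e-12) <= TH1:
                    break
            prev, dij = d_rem, d_rem
        layer_fields.append(d_rem)                # d_i(x, y) of the current layer
        d = dij - d_rem                           # step 1.8): start field of the next layer
    return layer_fields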
2) Constructing a deep neural network model, taking the preprocessed historical data as a training set, and training the deep neural network model, wherein the method specifically comprises the following steps:
step 2.1): and constructing a deep neural network model suitable for predicting two-dimensional structure data by using a deep learning correlation technique. The embodiment adopts a TrajGRU network[1][2]The model is expressed as ds _ p (X, y, z, t ' +1) ═ model (ds _ t (X, y, z, t ')), where model represents the deep neural network model, ds _ t (X, y, z, t ') is the input data ds _ p (X, y, z, t ' +1) of the model is the output data of the model, ds _ p (X, y, z, t ' +1) ═ D (X, y, z, t ' +1), t ': 6.
Step 2.2) Take the preprocessed historical data as the training set. The data input into the model each time are the radar detection data of 6 adjacent times, so in this embodiment the input data ds_t(x, y, z, t') for the first model training can be expressed as:
ds_t(x, y, z, 6) = [d_i(x, y, z, 6), d_i(x, y, z, 5), d_i(x, y, z, 4), ..., d_i(x, y, z, 1)].
The output data of the first model training can be expressed as ds_p(x, y, z, 7). The meaning of this step is that the d_i(x, y, z, t') at time t' and the preceding period are used as the input data of the model, and the radar basic reflectivity value at time t'+1 is used as the output. The purpose of model training is to find the relationship between ds_t(x, y, z, t') and ds_p(x, y, z, t'+1).
Following this method, the deep neural network model is trained using ds_t(x, y, z, t') as the input data of the model and ds_p(x, y, z, t'+1) as the output data, with x, y, z = 1, 2, ..., N and t' = 6, ..., X-2, X-1.
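As an illustrative sketch only, the sliding-window construction of the training pairs (ds_t, ds_p) described above could look as follows; the array layout (time, layer, x, y, z) and the function name are assumptions introduced here, not data structures defined by the patent.

import numpy as np

def make_training_pairs(d_layers, D, r=5):
    # d_layers: array of shape (X, TH2, N, N, N) holding d_i(x, y, z, t) for every time.
    # D:        array of shape (X, N, N, N) holding the raw basic reflectivity D(x, y, z, t).
    # Returns inputs of shape (samples, r+1, TH2, N, N, N) and labels of shape (samples, N, N, N).
    X_total = d_layers.shape[0]
    inputs, labels = [], []
    for t in range(r, X_total - 1):               # t' = r+1, ..., X-1 (0-based index)
        window = d_layers[t - r:t + 1][::-1]      # [d(t'), d(t'-1), ..., d(t'-r)]
        inputs.append(window)
        labels.append(D[t + 1])                   # label ds_p = D(x, y, z, t'+1)
    return np.stack(inputs), np.stack(labels)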
3) Predicting the basic reflectivity of the future weather radar based on the trained deep neural network model, wherein the specific steps of the step 3) are as follows:
Step 3.1) recording the current radar detection time as t_0; processing the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r according to the method of steps 1.2) to 1.9) to obtain a data set ds_t(x, y, z, t_0);
Step 3.2) inputting the data set ds_t(x, y, z, t_0) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+1;
Step 3.3) processing the result calculated in step 3.2), together with the basic reflectivity at the times t_0, t_0-1, t_0-2, ..., t_0-r+1, according to the method of steps 1.2) to 1.9) to obtain a new data set ds_t(x, y, z, t_0+1);
Step 3.4) inputting the new data set ds_t(x, y, z, t_0+1) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+2, and so on, whereby the basic reflectivity at the future times t_0+1, t_0+2, t_0+3, ... can be obtained.
Reference documents:
[1] Shi X, Gao Z, Lausen L, et al. Deep Learning for Precipitation Nowcasting: A Benchmark and A New Model [J]. 2017.
[2] Yi Qi, Gan Jianhong, Su Mihui, Wen Dong, Zhang Ying, Li Renguo, Tang Wang. An improved recurrent neural network radar image extrapolation algorithm [J]. Meteorological Science, 2021, 49(01): 18-24+45.

Claims (7)

1. A weather radar basic reflectivity prediction method based on a deep neural network is characterized by comprising the following steps: the method comprises the following specific steps:
1) acquiring historical detection data of a weather radar, and preprocessing the historical data;
2) constructing a deep neural network model, taking the preprocessed historical data as a training set, and training the deep neural network model;
3) predicting the future basic reflectivity of the weather radar based on the trained deep neural network model.
2. The method for predicting weather radar basic reflectivity based on the deep neural network as claimed in claim 1, wherein: the specific steps of step 1) are as follows:
Step 1.1) extracting the basic reflectivity from the historical detection data of the weather radar, converting it from the polar coordinate format into a plane rectangular coordinate format by using a spatial interpolation algorithm, and expressing the basic reflectivity of a detection point after conversion as D(x, y, z, t); wherein x, y and z are the coordinates of the detection point in the longitude, latitude and height directions respectively; x, y and z are positive integers in [1, N], and N is the total number of detection points in each of the longitude, latitude and height directions; t is the radar detection time, with value range [1, X]; X is the total number of radar detection times in the weather radar historical detection data; a variable n is also defined, with n < N and n a positive integer;
Step 1.2) arbitrarily selecting the basic reflectivity D(x, y, z_R, t_R) at a radar detection time t_R and a height z_R; letting d_{i,j}(x, y) = D(x, y, z_R, t_R), wherein i and j are positive integers starting from 1 and used for counting;
Step 1.3) taking n as the side length and n as the moving step, dividing d_{i,j}(x, y) from left to right and from top to bottom into a number of small square regions with side length n; analyzing each small region of d_{i,j}(x, y) in turn, keeping the maximum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that maximum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_max_{i,j}(x, y); similarly, analyzing each small region of d_{i,j}(x, y) in turn, keeping the minimum value of the basic reflectivity in each small region, retaining the basic reflectivity value only at the coordinate position where that minimum is located, setting the basic reflectivity of the other coordinate positions of the small region to NULL, and recording the result as d_min_{i,j}(x, y);
Step 1.4) interpolating d_max_{i,j}(x, y) and d_min_{i,j}(x, y), filling the coordinate positions set to NULL in d_max_{i,j}(x, y) and d_min_{i,j}(x, y) with the interpolation results, keeping the basic reflectivity values of the coordinate positions that are not NULL unchanged, and recording the results after the interpolation as d_max'_{i,j}(x, y) and d_min'_{i,j}(x, y);
Step 1.5) the mean value of each coordinate position is calculated by adopting the following formula:
d_mean_{i,j}(x, y) = [d_max'_{i,j}(x, y) + d_min'_{i,j}(x, y)] / 2
Step 1.6) calculating, with the following formula, the difference between the basic reflectivity detection value of each coordinate position and the mean value d_mean_{i,j}(x, y) of the same coordinate position:
d_rem_{i,j}(x, y) = d_{i,j}(x, y) - d_mean_{i,j}(x, y)
Step 1.7): when j = 1, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3); when j > 1, judging according to the following formula:
(formula image not reproduced here: a stopping criterion that compares d_rem_{i,j}(x, y) with d_rem_{i,j-1}(x, y) and is tested against the threshold TH1)
wherein TH1 is an empirical threshold, TH1 ∈ (0, 1);
when the above formula is satisfied, letting d_i(x, y) = d_rem_{i,j}(x, y) and going to step 1.8); when the above formula is not satisfied, letting j = j + 1 and d_{i,j}(x, y) = d_rem_{i,j-1}(x, y), and returning to step 1.3);
step 1.8) the judgment is carried out according to the following formula:
i≤TH2
wherein TH2 is an empirical threshold with a value range of [2, 20], used to limit the number of iterations of the counter i; when the above formula holds, letting i = i + 1 and d_{i,1}(x, y) = d_{i-1,j}(x, y) - d_rem_{i-1,j}(x, y), then letting j = 1 and returning to step 1.3); when the above formula does not hold, obtaining the set d_i(x, y), i = 1, 2, 3, ..., TH2, corresponding to the selected height z_R and the selected radar detection time t_R, and proceeding to step 1.9);
Step 1.9) repeating steps 1.2) to 1.8) to obtain the basic reflectivity data for all heights and all radar detection times, and recording the result as d_i(x, y, z, t), i = 1, 2, 3, ..., TH2.
3. The method for predicting weather radar basic reflectivity based on the deep neural network as claimed in claim 2, wherein: the historical detection data of the weather radar in the step 1) are orderly arranged according to the time sequence, and the time intervals of the adjacent data are kept the same.
4. The method for predicting weather radar basic reflectivity based on the deep neural network as claimed in claim 2, wherein: the historical detection data of the weather radar in the step 1) is the detection data of the weather radar for Y years continuously, and the value range of Y is [2, 30 ].
5. The method for predicting weather radar basic reflectivity based on the deep neural network as claimed in claim 2, wherein: the specific steps of step 2) are as follows:
Step 2.1) constructing a deep neural network model, expressed as ds_p(x, y, z, t'+1) = model(ds_t(x, y, z, t')),
wherein model denotes the deep neural network model, and ds_t(x, y, z, t') is the input data of the model, ds_t(x, y, z, t') = [d_i(x, y, z, t'), d_i(x, y, z, t'-1), d_i(x, y, z, t'-2), ..., d_i(x, y, z, t'-r)], r being a positive integer with a value range of [5, 100]; ds_p(x, y, z, t'+1) is the output data of the model, i.e. the training label, and ds_p(x, y, z, t'+1) = D(x, y, z, t'+1);
Step 2.2) training the deep neural network model by taking the historical data preprocessed in step 1) as the training set, using ds_t(x, y, z, t'), with x, y, z = 1, 2, ..., N and t' = r+1, ..., X-2, X-1, as the input data of the model, and using the corresponding ds_p(x, y, z, t'+1) as the label of model training.
6. The method of claim 5, wherein the method comprises: the deep neural network model in the step 2.1) adopts a Conv-LSTM model or a TrajGRU model.
7. The method of claim 5, wherein the method comprises: the specific steps of step 3) are as follows:
Step 3.1) recording the current radar detection time as t_0; processing the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r according to the method of steps 1.2) to 1.9) to obtain a data set ds_t(x, y, z, t_0);
Step 3.2) inputting the data set ds_t(x, y, z, t_0) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+1;
Step 3.3) processing the result calculated in step 3.2), together with the basic reflectivity at the radar detection times t_0, t_0-1, t_0-2, ..., t_0-r+1, according to the method of steps 1.2) to 1.9) to obtain a new data set ds_t(x, y, z, t_0+1);
Step 3.4) inputting the new data set ds_t(x, y, z, t_0+1) into the trained deep neural network model to obtain the basic reflectivity at the future time t_0+2, and so on, whereby the basic reflectivity at the future times t_0+1, t_0+2, t_0+3, ... can be obtained.
CN202110997434.3A 2021-08-27 2021-08-27 Weather radar basic reflectivity prediction method based on deep neural network Active CN113640769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110997434.3A CN113640769B (en) 2021-08-27 2021-08-27 Weather radar basic reflectivity prediction method based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110997434.3A CN113640769B (en) 2021-08-27 2021-08-27 Weather radar basic reflectivity prediction method based on deep neural network

Publications (2)

Publication Number Publication Date
CN113640769A true CN113640769A (en) 2021-11-12
CN113640769B CN113640769B (en) 2023-06-13

Family

ID=78424156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110997434.3A Active CN113640769B (en) 2021-08-27 2021-08-27 Weather radar basic reflectivity prediction method based on deep neural network

Country Status (1)

Country Link
CN (1) CN113640769B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0851240A2 (en) * 1996-12-26 1998-07-01 Nippon Telegraph And Telephone Corporation Meteorological radar precipitation pattern prediction method and apparatus
CN105069541A (en) * 2015-09-18 2015-11-18 南京信息工程大学 Short-term weather subjective analysis system for mobile terminals
CN106959475A (en) * 2016-01-08 2017-07-18 株式会社东芝 Estimation unit, method of estimation and computer-readable recording medium
CN106872981A (en) * 2017-02-17 2017-06-20 水利部南京水利水文自动化研究所 The precipitation strong center tracking of rainfall radar and forecasting procedure
CN106886023A (en) * 2017-02-27 2017-06-23 中国人民解放军理工大学 A kind of Radar Echo Extrapolation method based on dynamic convolutional neural networks
CN108761461A (en) * 2018-05-29 2018-11-06 南京信息工程大学 Precipitation forecast method based on Weather Radar sequential image
CN109001736A (en) * 2018-06-12 2018-12-14 中国人民解放军国防科技大学 Radar echo extrapolation method based on deep space-time prediction neural network
US20200132884A1 (en) * 2018-10-30 2020-04-30 Climacell Inc. Forecasting method with machine learning
US20200309993A1 (en) * 2019-03-25 2020-10-01 Yandex Europe Ag Method of and system for generating weather forecast
US20200371230A1 (en) * 2019-05-24 2020-11-26 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for predicting severe convection weather
WO2021064524A1 (en) * 2019-10-04 2021-04-08 International Business Machines Corporation Predicting weather radar images
CN111239704A (en) * 2020-02-12 2020-06-05 中国科学院大气物理研究所 Atmospheric detection radar target echo identification processing method, device, equipment and medium
CN112180375A (en) * 2020-09-14 2021-01-05 成都信息工程大学 Meteorological radar echo extrapolation method based on improved TrajGRU network
CN112666527A (en) * 2020-12-03 2021-04-16 象辑知源(武汉)科技有限公司 Weather radar filtering method fusing live precipitation data

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
BRASIL BONNET: "Precipitation Nowcasting with Weather Radar Images and Deep Learning in São Paulo", 《ATMOSPHERE》 *
SHI EN: "Weather radar echo extrapolation method based on convolutional neural networks", 《JOURNAL OF COMPUTER APPLICATIONS》 *
SINGH, S: "A DEEP LEARNING BASED APPROACH WITH ADVERSARIAL REGULARIZATION FOR DOPPLER WEATHER RADAR ECHO PREDICTION", 《2017 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS)》 *
SHI EN: "Research on radar echo extrapolation methods based on convolutional neural networks", China Master's Theses Full-text Database, Basic Sciences
YANG HONGPING; ZHANG PEIYUAN; CHENG MINGHU; WANG BIN; XIAO YANJIAO; ZHANG YAPING: "Research progress of Doppler weather radar nowcasting techniques", Torrential Rain and Disasters, no. 02
WANG XING: "A severe convective weather identification algorithm based on deep neural networks", Science Technology and Engineering
LAI CAN: "Radar echo extrapolation based on ConvLSTM", Journal of Chengdu University of Information Technology, pages 1-3 *
CHEN XIAOPING; CHEN YIWANG; SHI JIANHUA: "Modeling and prediction of rainfall radar echo data based on machine learning", Journal of Nanjing University of Information Science & Technology (Natural Science Edition), no. 04

Also Published As

Publication number Publication date
CN113640769B (en) 2023-06-13

Similar Documents

Publication Publication Date Title
CN110333554B (en) NRIET rainstorm intelligent similarity analysis method
CN107038717B (en) A method of 3D point cloud registration error is automatically analyzed based on three-dimensional grid
CN108961235B (en) Defective insulator identification method based on YOLOv3 network and particle filter algorithm
CN111832655B (en) Multi-scale three-dimensional target detection method based on characteristic pyramid network
CN110796168A (en) Improved YOLOv 3-based vehicle detection method
CN110648014B (en) Regional wind power prediction method and system based on space-time quantile regression
CN108254750B (en) Down-blast intelligent identification early warning method based on radar data
CN112396619B (en) Small particle segmentation method based on semantic segmentation and internally complex composition
CN102662173A (en) Thunderstorm forecasting method based on level set
CN110334656A (en) Multi-source Remote Sensing Images Clean water withdraw method and device based on information source probability weight
CN113836808A (en) PM2.5 deep learning prediction method based on heavy pollution feature constraint
CN116630802A (en) SwinT and size self-adaptive convolution-based power equipment rust defect image detection method
CN116503760A (en) Unmanned aerial vehicle cruising detection method based on self-adaptive edge feature semantic segmentation
CN113902792A (en) Building height detection method and system based on improved RetinaNet network and electronic equipment
CN117593657A (en) Method and system for processing refined weather forecast data and readable storage medium
CN113640769A (en) Weather radar basic reflectivity prediction method based on deep neural network
CN111738327A (en) Ultra-short-term irradiation prediction method based on typical cloud shielding irradiation difference
CN115984689A (en) Multi-scale earth surface complexity feature extraction and land utilization segmentation method
CN113673534B (en) RGB-D image fruit detection method based on FASTER RCNN
CN114419443A (en) Automatic remote-sensing image cultivated land block extraction method and system
Wang et al. Tiny-RainNet: A Deep CNN-BiLSTM Model for Short-Term Rainfall Prediction
CN112200831A (en) Dense connection twin neural network target tracking method based on dynamic template
CN116630814B (en) Quick positioning and evaluating method for building disasters based on machine learning
CN116129280B (en) Method for detecting snow in remote sensing image
CN117367435B (en) Evacuation path planning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant