CN110456355A - Radar echo extrapolation method based on long short-term memory and a generative adversarial network - Google Patents

Radar echo extrapolation method based on long short-term memory and a generative adversarial network

Info

Publication number
CN110456355A
CN110456355A · CN201910763464.0A · CN110456355A
Authority
CN
China
Prior art keywords
model
image
input
long short-term memory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910763464.0A
Other languages
Chinese (zh)
Other versions
CN110456355B (en)
Inventor
张磊
黄振月
贾培艳
孙俊
高艺华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan University
Original Assignee
Henan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan University filed Critical Henan University
Priority to CN201910763464.0A priority Critical patent/CN110456355B/en
Publication of CN110456355A publication Critical patent/CN110456355A/en
Application granted granted Critical
Publication of CN110456355B publication Critical patent/CN110456355B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/95Radar or analogous systems specially adapted for specific applications for meteorological use
    • G01S13/958Theoretical aspects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/418Theoretical aspects
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention discloses a radar echo extrapolation method based on long short-term memory and a generative adversarial network, comprising the following steps in order: A: obtaining a radar data set, converting it into a grayscale image set, and dividing the set into a training sample set and a test sample set; B: constructing a long short-term memory and generative adversarial network model and training it to convergence; C: inputting the test sample set from step A into the converged model obtained in step B to obtain radar echo extrapolation images. The present invention can effectively predict radar echo extrapolation images, providing a technical foundation for effective nowcasting.

Description

Radar echo extrapolation method based on long short-term memory and a generative adversarial network
Technical field
The present invention relates to the technical field of surface weather observation in atmospheric sounding, and in particular to a radar echo extrapolation method based on long short-term memory and a generative adversarial network.
Background art
Nowcasting refers to high spatial and temporal resolution forecasting of rapidly changing weather phenomena over the next 0-2 hours. Its main objects are fast-evolving, highly destructive severe weather such as thunderstorms, strong convection, heavy precipitation and sandstorms. At present, radar echo extrapolation is the principal technical means of nowcasting.
Radar echo extrapolation refers to determining, from the echo data detected by a weather radar, the intensity distribution of the echoes and the movement speed and direction of echo bodies (such as storm cells and precipitation areas), and then forecasting the radar echo state after a certain period of time by linear or nonlinear extrapolation of the echo bodies. As China's new-generation Doppler radar observation network is gradually put into operation, how to use it for echo extrapolation and prediction so as to minimize meteorological disasters has become a very important task.
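As a toy illustration of the linear extrapolation mentioned above (not part of the patent; the function name and constant-velocity assumption are illustrative), the next position of an echo centroid can be predicted from its last two observed positions:

```python
def extrapolate_centroid(p1, p2):
    """Predict the next centroid position assuming constant velocity.

    p1, p2: (x, y) positions of an echo body at two consecutive radar scans.
    Returns the linearly extrapolated position one scan ahead.
    """
    vx, vy = p2[0] - p1[0], p2[1] - p1[1]  # displacement per scan
    return (p2[0] + vx, p2[1] + vy)

# An echo centroid that moved from (10, 20) to (15, 22) in one scan:
print(extrapolate_centroid((10, 20), (15, 22)))  # (20, 24)
```

Nonlinear extrapolation would replace the constant-velocity model with, for example, a fitted acceleration term.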
Traditional radar echo extrapolation mainly uses the centroid tracking method and the cross-correlation method. The centroid method is suitable for tracking and forecasting large, strong targets; it is unsuitable when the echoes are scattered or when echoes merge or split. The cross-correlation method can track stratiform precipitation systems, but for strong convective weather in which echoes change rapidly it is difficult to guarantee tracking accuracy. A method that can effectively predict radar echo extrapolation images is therefore needed, to provide a technical foundation for effective nowcasting.
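The cross-correlation method referred to above can be sketched as follows: the displacement between two echo frames is taken as the integer shift that maximizes their overlap correlation. This brute-force version is only an illustration of the idea, not the patent's method:

```python
def best_shift(frame1, frame2, max_shift=3):
    """Find the integer (dy, dx) shift such that frame2 shifted back by
    (dy, dx) best matches frame1, by brute-force correlation search."""
    rows, cols = len(frame1), len(frame1[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0
            for y in range(rows):
                for x in range(cols):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < rows and 0 <= x2 < cols:
                        score += frame1[y][x] * frame2[y2][x2]
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# A single echo pixel moving one cell down and one cell right:
f1 = [[0] * 6 for _ in range(6)]
f2 = [[0] * 6 for _ in range(6)]
f1[2][2] = 1
f2[3][3] = 1
print(best_shift(f1, f2))  # (1, 1)
```

When the echo pattern deforms rapidly, as in strong convection, no single shift fits the whole field well, which is exactly the weakness the patent addresses.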
Summary of the invention
The object of the present invention is to provide a radar echo extrapolation method based on long short-term memory and a generative adversarial network that can effectively predict radar echo extrapolation images, providing a technical foundation for effective nowcasting.
The present invention adopts the following technical solution:
A radar echo extrapolation method based on long short-term memory and a generative adversarial network, comprising the following steps in order:
A: obtain a radar data set, resize and sort every item in the set, convert every item into grayscale image data after normalization to obtain a grayscale image set, and finally divide the grayscale image set into a training sample set and a test sample set; each image group in the training sample set contains an input label and a real sample label;
B: first construct the long short-term memory and generative adversarial network model and initialize its weights and biases; the generative model of the network is composed of long short-term memory neural networks, and the discriminative model is composed of fully connected neural networks; then input the training sample set obtained in step A into the generative model to obtain predicted images; next input the predicted images and the real sample labels into the discriminative model, compute the mean absolute error between the predicted images and the real sample labels as well as the loss values of the generative and discriminative models, and update the weights and biases of the network by backpropagation; repeat this process until training ends, obtaining a converged long short-term memory and generative adversarial network model;
C: input the test sample set from step A into the converged long short-term memory and generative adversarial network model obtained in step B to obtain radar echo extrapolation images.
Step A comprises the following specific steps:
A1: obtain the radar data set and sort its N items in order of increasing time;
A2: resize every item in the radar data set and convert it to an image; convert every item into normalized grayscale image data via a normalization operation, obtaining the grayscale image set;
A3: divide the grayscale image set: take every four adjacent grayscale images as one image group, yielding i groups in total; in the i-th group, the first three images (the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images) serve as one input label, and the fourth image (the 4i-th grayscale image) serves as the real sample label, where i is a natural number in [1, N/4]; then divide the resulting i image groups into a training sample set and a test sample set in a 7:3 ratio.
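The grouping and 7:3 split described in step A3 can be sketched as follows (a minimal illustration with integer frame indices standing in for grayscale images; the helper names are not from the patent):

```python
def make_groups(frames):
    """Group a time-ordered list of frames into (input, target) pairs:
    three consecutive frames as the input label, the fourth as the
    real sample label."""
    groups = []
    for g in range(len(frames) // 4):
        chunk = frames[4 * g : 4 * g + 4]
        groups.append((chunk[:3], chunk[3]))
    return groups

def split_7_3(groups):
    """Divide the image groups 7:3 into training and test sample sets."""
    n_train = round(len(groups) * 0.7)
    return groups[:n_train], groups[n_train:]

frames = list(range(40))              # 40 time-ordered frames -> 10 groups
train_set, test_set = split_7_3(make_groups(frames))
print(len(train_set), len(test_set))  # 7 3
```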
Step A2 comprises the following specific steps:
A21: resize every item in the radar data set to 360 × 480;
A22: convert the resized data obtained in step A21 into grayscale images, then normalize the grayscale images, finally obtaining the grayscale image set.
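The patent does not spell out the normalization operation; a common choice (assumed here purely for illustration) is to scale 8-bit grayscale values into [0, 1] by dividing by 255:

```python
def normalize(gray_image):
    """Scale 8-bit grayscale pixel values (0-255) into [0.0, 1.0]."""
    return [[pixel / 255.0 for pixel in row] for row in gray_image]

img = [[0, 51], [102, 255]]
print(normalize(img))  # [[0.0, 0.2], [0.4, 1.0]]
```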
Step B comprises the following specific steps:
B1: first construct the long short-term memory and generative adversarial network model and initialize its weights and biases;
B2: then input the training sample set obtained in step A3 into the generative model; each image group in the training sample set contains an input label input and a real sample label true, where input = {x1, x2, x3} and true = {x4}; x1, x2 and x3 are respectively the first three images (the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images) of the i-th image group from step A3, and x4 is the fourth image (the 4i-th grayscale image) of the i-th group;
B3: obtain a predicted image from the generative model, then input the predicted image and the real sample label into the discriminative model in turn, and compute the mean absolute error between the predicted image and the real sample label and the loss values of the generative and discriminative models;
B4: according to the computed loss value of the discriminative model, backpropagate from the discriminative model's output layer to its input layer, using the learning rate as an input parameter to adjust the weights and biases of every layer, finally obtaining the discriminative model with updated weights and biases;
According to the computed loss value of the generative model, backpropagate from the generative model's output layer to its input layer, using the learning rate as an input parameter to adjust the weights and biases of every layer, finally obtaining the generative model with updated weights and biases;
B5: repeat steps B2 to B4 until the maximum number of iterations is reached and training is complete, finally obtaining the converged long short-term memory and generative adversarial network model.
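The per-layer adjustment in step B4, with the learning rate as an input parameter, is ordinary gradient descent. A one-parameter sketch (illustrative only; the patent's networks have many weights per layer):

```python
def sgd_step(weight, bias, grad_w, grad_b, lr=0.001):
    """One backpropagation update: move the weight and bias against their
    gradients, scaled by the learning rate (0.001 in the patent)."""
    return weight - lr * grad_w, bias - lr * grad_b

w, b = sgd_step(0.5, 0.1, 2.0, 1.0)  # weight -> ~0.498, bias -> ~0.099
```

In step B5 this update is applied to every layer of both models on each of the 150 training iterations.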
In step B1, the network layers of the generative model are, in order: the generative model input layer, a first long short-term memory neural network layer, a second long short-term memory neural network layer, the generative model fully connected layer, and the generative model output layer. The number of training iterations of the generative model is 150 and the learning rate is 0.001; the size of the generative model input layer is 3 × 360 × 480, the hidden-layer node count of the first and second long short-term memory neural network layers is 128, the node count of the generative model fully connected layer is 360 × 480, and the size of the generative model output layer is 360 × 480; the final output image of the generative model, i.e. the predicted image, has size 360 × 480;
The network layers of the discriminative model are, in order: the discriminative model input layer, the first, second and third discriminative model fully connected layers, and the discriminative model output layer. The number of training iterations of the discriminative model is 150 and the learning rate is 0.001; the node counts of the first, second and third discriminative model fully connected layers are 256, 128 and 1 respectively; the size of the discriminative model input layer is 360 × 480 and the size of the discriminative model output layer is 1.
In step B2, the size of the input label input in each image group of the training sample set is 3 × 360 × 480.
In step B3, the mean absolute error MAE between the predicted image and the real sample label is computed as:

MAE = (1/m) Σ_{j=1}^{m} | true_i^(j) − o_i^(j) |

where m is the number of pixels in the predicted image and the real sample label, true_i is the real sample label of the i-th image group from step A3, true_i^(j) is the value of the j-th pixel of true_i, o_i is the predicted image, and o_i^(j) is the value of the j-th pixel of the predicted image;
The predicted image and the real sample label are input into the discriminative model in turn; the discriminative model outputs two scalars between [0, 1], D(G(input_i)) and D(true_i); from these two scalars output by the discriminative model, the loss values of the generative and discriminative models are computed separately;
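The mean absolute error between a predicted image and its real sample label can be sketched over flattened pixel lists (a toy illustration; the images here are actually 360 × 480):

```python
def mae(true_img, pred_img):
    """Mean absolute error over corresponding pixels of two images,
    each given as a flat list of m pixel values."""
    m = len(true_img)
    return sum(abs(t - p) for t, p in zip(true_img, pred_img)) / m

print(mae([0.0, 0.5, 1.0, 0.5], [0.0, 0.25, 0.75, 0.5]))  # 0.125
```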
The loss function of the generative model is:

V1 = (1/n) Σ_{i=1}^{n} log(1 − D(G(input_i)))

where V1 is the loss value of the generative model, D denotes the discriminative model to be optimized, G denotes the generative model to be optimized, n is the number of training sample groups, N is the total number of items in the radar data set, log denotes the log-likelihood function, input_i is the i-th group of input samples, G(input_i) is the predicted image obtained after input_i is fed into the generative model G, and D(G(input_i)) is the discrimination result of the predicted image generated by the generative model after being judged by the discriminative model D;
The loss function of the discriminative model is:

V2 = −(1/n) Σ_{i=1}^{n} [ log D(true_i) + log(1 − D(G(input_i))) ]

where V2 is the loss value of the discriminative model, true_i is the real sample label of the i-th image group, and D(true_i) is the discrimination result of the real sample label after being judged by the discriminative model.
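Assuming the standard generative adversarial loss forms for V1 and V2 (an assumption, since only the symbol definitions are given here), the per-batch loss values can be computed from the discriminator's scalar outputs:

```python
import math

def generator_loss(d_fake):
    """V1: mean of log(1 - D(G(input_i))) over a batch; the generator
    is trained to drive this down by pushing D(G(.)) toward 1."""
    return sum(math.log(1.0 - d) for d in d_fake) / len(d_fake)

def discriminator_loss(d_real, d_fake):
    """V2: negative mean of log D(true_i) + log(1 - D(G(input_i)));
    the discriminator is trained to drive this down."""
    n = len(d_real)
    return -sum(math.log(r) + math.log(1.0 - f)
                for r, f in zip(d_real, d_fake)) / n

# Illustrative discriminator outputs for a batch of two samples:
print(round(generator_loss([0.5, 0.5]), 4))                  # -0.6931
print(round(discriminator_loss([0.5, 0.5], [0.5, 0.5]), 4))  # 1.3863
```

At D = 0.5 everywhere the discriminator cannot distinguish real from generated echoes, which is the equilibrium adversarial training aims for.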
In step C, the size of the radar echo extrapolation image is 360 × 480.
Based on image processing techniques, the present invention converts the radar data set into a normalized grayscale image set, constructs a long short-term memory and generative adversarial network model, and adjusts its weights and biases by backpropagation using the learning rate as an input parameter, obtaining a trained long short-term memory and generative adversarial network model; the sample set is then input into the trained model to obtain predicted images. The invention can provide a technical foundation for improving the accuracy of nowcasting and can effectively predict disastrous weather.
Description of the drawings
Fig. 1 is a flowchart of the present invention.
Specific embodiment
The present invention is described in detail below in conjunction with the drawings and embodiments:
As shown in Fig. 1, the radar echo extrapolation method of the present invention based on long short-term memory and a generative adversarial network comprises the following steps in order:
A: obtain a radar data set, resize and sort every item in the set, convert every item into grayscale image data after normalization to obtain a grayscale image set, and finally divide the grayscale image set into a training sample set and a test sample set; each image group in the training sample set contains an input label and a real sample label;
Step A comprises the following specific steps:
A1: obtain the radar data set and sort its N items in order of increasing time;
A2: resize every item in the radar data set and convert it to an image; convert every item into normalized grayscale image data via a normalization operation, obtaining the grayscale image set;
Step A2 comprises the following specific steps:
A21: resize every item in the radar data set to 360 × 480;
A22: convert the resized data obtained in step A21 into grayscale images, then normalize the grayscale images, finally obtaining the grayscale image set;
A3: divide the grayscale image set: take every four adjacent grayscale images as one image group, yielding i groups in total; in the i-th group, the first three images (the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images) serve as one input label, and the fourth image (the 4i-th grayscale image) serves as the real sample label, where i is a natural number in [1, N/4]; then divide the resulting i image groups into a training sample set and a test sample set in a 7:3 ratio.
In this embodiment, when the grayscale image set is divided, every four adjacent grayscale images form one image group: the first, second, third and fourth grayscale images form the 1st image group, in which the first, second and third grayscale images serve as one input label and the fourth grayscale image serves as the real sample label; the fifth, sixth, seventh and eighth grayscale images form the 2nd image group, in which the fifth, sixth and seventh grayscale images serve as one input label and the eighth grayscale image serves as the real sample label; and so on;
B: first construct the long short-term memory and generative adversarial network model and initialize its weights and biases; the generative model of the network is composed of long short-term memory neural networks, and the discriminative model is composed of fully connected neural networks; then input the training sample set obtained in step A into the generative model to obtain predicted images; next input the predicted images and the real sample labels into the discriminative model simultaneously, compute the mean absolute error between the predicted images and the real sample labels as well as the loss values of the generative and discriminative models, and update the weights and biases of the network by backpropagation; repeat this process until training ends, obtaining a converged long short-term memory and generative adversarial network model;
Step B comprises the following specific steps:
B1: first construct the long short-term memory and generative adversarial network model and initialize its weights and biases;
The network layers of the generative model are, in order: the generative model input layer, a first long short-term memory neural network layer, a second long short-term memory neural network layer, the generative model fully connected layer, and the generative model output layer. The number of training iterations of the generative model is 150 and the learning rate is 0.001; the size of the generative model input layer is 3 × 360 × 480, the hidden-layer node count of the first and second long short-term memory neural network layers is 128, the node count of the generative model fully connected layer is 360 × 480, and the size of the generative model output layer is 360 × 480; the final output image of the generative model, i.e. the predicted image, has size 360 × 480;
The network layers of the discriminative model are, in order: the discriminative model input layer, the first, second and third discriminative model fully connected layers, and the discriminative model output layer. The number of training iterations of the discriminative model is 150 and the learning rate is 0.001; the node counts of the first, second and third discriminative model fully connected layers are 256, 128 and 1 respectively; the size of the discriminative model input layer is 360 × 480 and the size of the discriminative model output layer is 1.
B2: then input the training sample set obtained in step A3 into the generative model; each image group in the training sample set contains an input label input and a real sample label true, where input = {x1, x2, x3} and true = {x4}; x1, x2 and x3 are respectively the first three images (the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images) of the i-th image group from step A3, and x4 is the fourth image (the 4i-th grayscale image) of the i-th group;
In step B2, the size of the input label input in each image group of the training sample set is 3 × 360 × 480;
B3: obtain a predicted image from the generative model, then input the predicted image and the real sample label into the discriminative model, and compute the mean absolute error between the predicted image and the real sample label and the loss values of the generative and discriminative models;
In step B3, the mean absolute error MAE between the obtained predicted image and the real sample label obtained in step A3 is computed as:

MAE = (1/m) Σ_{j=1}^{m} | true_i^(j) − o_i^(j) |

where m is the number of pixels in the predicted image and the real sample label, true_i is the real sample label of the i-th image group from step A3, true_i^(j) is the value of the j-th pixel of true_i, o_i is the predicted image, and o_i^(j) is the value of the j-th pixel of the predicted image;
The predicted image and the real sample label are input into the discriminative model; the discriminative model outputs two scalars between [0, 1], D(G(input_i)) and D(true_i); from these two scalars output by the discriminative model, the loss values of the generative and discriminative models are computed separately;
The loss function of the generative model is:

V1 = (1/n) Σ_{i=1}^{n} log(1 − D(G(input_i)))

where V1 is the loss value of the generative model, D denotes the discriminative model to be optimized, G denotes the generative model to be optimized, n is the number of training sample groups, N is the total number of items in the radar data set, log denotes the log-likelihood function, input_i is the i-th group of input samples, G(input_i) is the predicted image obtained after input_i is fed into the generative model G, and D(G(input_i)) is the discrimination result of the predicted image generated by the generative model after being judged by the discriminative model D;
The loss function of the discriminative model is:

V2 = −(1/n) Σ_{i=1}^{n} [ log D(true_i) + log(1 − D(G(input_i))) ]

where V2 is the loss value of the discriminative model, true_i is the real sample label of the i-th image group, and D(true_i) is the discrimination result of the real sample label after being judged by the discriminative model;
B4: according to the computed loss value V2 of the discriminative model, backpropagate from the discriminative model's output layer to its input layer, using the learning rate as an input parameter to adjust the weights and biases of every layer, finally obtaining the discriminative model with updated weights and biases;
According to the computed loss value V1 of the generative model, backpropagate from the generative model's output layer to its input layer, using the learning rate as an input parameter to adjust the weights and biases of every layer, finally obtaining the generative model with updated weights and biases;
After the weight and bias updates of both the discriminative model and the generative model are completed, the update of the weights and biases of the long short-term memory and generative adversarial network is complete.
B5: repeat steps B2 to B4 until the maximum number of iterations is reached and training is complete, finally obtaining the converged long short-term memory and generative adversarial network model;
C: input the test sample set from step A into the converged long short-term memory and generative adversarial network model obtained in step B to obtain radar echo extrapolation images.
The size of the input label input in each image group of the test sample set is 3 × 360 × 480; the size of the radar echo extrapolation image is 360 × 480.

Claims (8)

1. A radar echo extrapolation method based on long short-term memory and a generative adversarial network, characterized by comprising the following steps in order:
A: obtain a radar data set, resize and sort every item in the set, convert every item into grayscale image data after normalization to obtain a grayscale image set, and finally divide the grayscale image set into a training sample set and a test sample set; each image group in the training sample set contains an input label and a real sample label;
B: first construct the long short-term memory and generative adversarial network model and initialize its weights and biases; the generative model of the network is composed of long short-term memory neural networks, and the discriminative model is composed of fully connected neural networks; then input the training sample set obtained in step A into the generative model to obtain predicted images; next input the predicted images and the real sample labels into the discriminative model, compute the mean absolute error between the predicted images and the real sample labels as well as the loss values of the generative and discriminative models, and update the weights and biases of the network by backpropagation; repeat this process until training ends, obtaining a converged long short-term memory and generative adversarial network model;
C: input the test sample set from step A into the converged long short-term memory and generative adversarial network model obtained in step B to obtain radar echo extrapolation images.
2. The radar echo extrapolation method based on long short-term memory and a generative adversarial network according to claim 1, characterized in that step A comprises the following specific steps:
A1: obtain the radar data set and sort its N items in order of increasing time;
A2: resize every item in the radar data set and convert it to an image; convert every item into normalized grayscale image data via a normalization operation, obtaining the grayscale image set;
A3: divide the grayscale image set: take every four adjacent grayscale images as one image group, yielding i groups in total; in the i-th group, the first three images (the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images) serve as one input label, and the fourth image (the 4i-th grayscale image) serves as the real sample label, where i is a natural number in [1, N/4]; then divide the resulting i image groups into a training sample set and a test sample set in a 7:3 ratio.
3. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 2, characterized in that step A2 comprises the following steps:
A21: performing size conversion on each item of data in the radar data set, converting each item of data in the radar data set to a size of 360 × 480;
A22: converting the size-converted data obtained in step A21 into grayscale images, then performing the normalization operation on the grayscale images, and finally obtaining the grayscale image data set.
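Steps A21–A22 could look roughly like the sketch below. The claim specifies neither the resampling method nor the normalization scheme, so nearest-neighbour resizing and min-max scaling over the 8-bit range are assumed; function names are illustrative.

```python
import numpy as np

TARGET_H, TARGET_W = 360, 480  # target size from step A21


def resize_nearest(img, h=TARGET_H, w=TARGET_W):
    """Nearest-neighbour resize to 360 x 480 (a stand-in for the
    unspecified resampling method of step A21)."""
    src_h, src_w = img.shape
    rows = np.arange(h) * src_h // h
    cols = np.arange(w) * src_w // w
    return img[rows[:, None], cols]


def normalize(img):
    """Map 8-bit intensities into [0, 1] (step A22; the exact
    normalization is an assumption)."""
    return img.astype(np.float32) / 255.0


frame = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)
out = normalize(resize_nearest(frame))
```

The result is a 360 × 480 float image with values in [0, 1], matching the grayscale image data described in the claim.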
4. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 3, characterized in that step B comprises the following steps:
B1: first constructing the long short-term memory and generative adversarial network model, and initializing the weights and biases of the long short-term memory and generative adversarial network;
B2: then inputting the training sample set obtained in step A3 into the generation model, wherein each image group in the training sample set includes an input label input and a real sample label true, with input label input = {x1, x2, x3} and real sample label true = {x4}; x1, x2, x3 are respectively the first three grayscale images, i.e., the (4i-3)-th, (4i-2)-th and (4i-1)-th grayscale images, in the i-th image group of step A3, and x4 denotes the fourth grayscale image, i.e., the (4i)-th grayscale image, in the i-th image group;
B3: obtaining the predicted image through the generation model, then inputting the predicted image and the real sample label into the discrimination model in turn, and calculating the mean absolute error between the predicted image and the real sample label and the loss values of the generation model and the discrimination model;
B4: according to the calculated loss value of the discrimination model, back-propagating from the discrimination model output layer to the discrimination model input layer, using the learning rate as an input parameter and adjusting the weights and biases of each network layer, finally obtaining the discrimination model with updated weights and biases;
according to the calculated loss value of the generation model, back-propagating from the generation model output layer to the generation model input layer, using the learning rate as an input parameter and adjusting the weights and biases of each network layer, finally obtaining the generation model with updated weights and biases;
B5: repeating steps B2 to B4 until the maximum number of iterations is reached and training is complete, finally obtaining the converged long short-term memory and generative adversarial network model.
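The alternating update loop of steps B2–B5 can be reduced to the executable skeleton below. The real LSTM generator and fully connected discriminator are replaced by trivial one-parameter stand-ins so that only the iteration structure is shown (150 iterations and learning rate 0.001 as recited in claim 5, with the discriminator updated before the generator per step B4); the dummy gradients are purely illustrative.

```python
MAX_ITERS = 150        # training iteration count from claim 5
LEARNING_RATE = 0.001  # learning rate from claim 5


class ToyModel:
    """One-weight stand-in for a network; step() plays the role of the
    'update weights and biases by back-propagation' of step B4."""
    def __init__(self):
        self.weight, self.bias = 0.5, 0.0
        self.updates = 0

    def step(self, grad_w, grad_b, lr):
        self.weight -= lr * grad_w
        self.bias -= lr * grad_b
        self.updates += 1


def train(generator, discriminator, iters=MAX_ITERS, lr=LEARNING_RATE):
    for _ in range(iters):
        # B2/B3: forward pass and loss computation would happen here;
        # constant dummy gradients substitute for real back-propagation.
        discriminator.step(0.01, 0.0, lr)  # B4: update D first
        generator.step(0.01, 0.0, lr)      # B4: then update G
    return generator, discriminator


G, D = train(ToyModel(), ToyModel())
```

After the loop both stand-ins have been updated 150 times, mirroring the fixed iteration budget of step B5.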
5. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 4, characterized in that: in step B1, the network layers of the generation model are, in order, a generation model input layer, a first long short-term memory neural network layer, a second long short-term memory neural network layer, a generation model fully connected layer and a generation model output layer; the number of training iterations of the generation model is 150 and the learning rate is 0.001; the size of the generation model input layer is 3 × 360 × 480, the number of hidden-layer nodes of both the first and the second long short-term memory neural network layer is 128, the number of nodes of the generation model fully connected layer is 360 × 480, and the size of the generation model output layer is 360 × 480, so the size of the final output image of the generation model, i.e., of the output predicted image, is 360 × 480;
the network layers of the discrimination model are, in order, a discrimination model input layer, a first discrimination model fully connected layer, a second discrimination model fully connected layer, a third discrimination model fully connected layer and a discrimination model output layer; the number of training iterations of the discrimination model is 150 and the learning rate is 0.001; the number of nodes of the first discrimination model fully connected layer is 256, of the second discrimination model fully connected layer is 128, and of the third discrimination model fully connected layer is 1; the size of the discrimination model input layer is 360 × 480, and the size of the discrimination model output layer is 1.
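The layer sizes recited in claim 5 can be collected into a small bookkeeping sketch that makes the two topologies concrete. This only records the dimensions from the claim; it is not an implementation of the networks, and the layer names are illustrative.

```python
H, W = 360, 480  # frame size from claim 3

# Generation model: input layer, two LSTM layers, fully connected
# layer, output layer (claim 5).
generator = [
    ("input", 3 * H * W),   # three stacked 360x480 frames
    ("lstm_1", 128),        # first LSTM layer, 128 hidden nodes
    ("lstm_2", 128),        # second LSTM layer, 128 hidden nodes
    ("fc", H * W),          # fully connected layer
    ("output", H * W),      # predicted 360x480 frame
]

# Discrimination model: input layer, three fully connected layers,
# scalar output layer (claim 5).
discriminator = [
    ("input", H * W),
    ("fc_1", 256),
    ("fc_2", 128),
    ("fc_3", 1),
    ("output", 1),          # scalar in [0, 1]
]
```

Note the narrowing 256 → 128 → 1 structure of the discriminator, which ends in the single scalar judgement used in claim 7.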
6. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 4, characterized in that: in step B2, the size of the input label input in each image group of the training sample set is 3 × 360 × 480.
7. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 4, characterized in that: in step B3, the mean absolute error MAE between the predicted image and the real sample label is calculated as:
MAE = (1/m) · Σ_{j=1}^{m} |true_i^j − o_i^j|
where m denotes the number of pixels in the predicted image and in the real sample label, true_i denotes the real sample label in the i-th image group of step A3, true_i^j denotes the value of the j-th pixel of the real sample label true_i, o_i denotes the predicted image, and o_i^j denotes the value of the j-th pixel of the predicted image;
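The per-pixel MAE of claim 7 is straightforward to compute; a hypothetical helper, flattening both images so m is the total pixel count:

```python
import numpy as np


def mae(true_img, pred_img):
    """Mean absolute error between a ground-truth frame true_i and a
    predicted frame o_i, matching the per-pixel formula of claim 7."""
    t = np.asarray(true_img, dtype=np.float64).ravel()
    p = np.asarray(pred_img, dtype=np.float64).ravel()
    return np.abs(t - p).mean()


# toy 2x2 example: differences are 0.5, 0.5, 0, 0.5 -> mean 0.375
print(mae([[0.0, 1.0], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.0]]))  # -> 0.375
```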
the predicted image and the real sample label are input into the discrimination model in turn, and the discrimination model outputs two scalars in [0, 1], D(G(input_i)) and D(true_i); the loss values of the generation model and the discrimination model are then calculated separately from the two scalars D(G(input_i)) and D(true_i) output by the discrimination model;
the loss function of the generation model is:
V1 = (1/n) · Σ_{i=1}^{n} log(1 − D(G(input_i)))
where V1 is the loss value of the generation model, D denotes the discrimination model to be optimized, G denotes the generation model to be optimized, n is the number of groups in the training sample set, N is the total number of data items in the radar data set, log denotes the log-likelihood function, input_i is the i-th group of input samples, G(input_i) is the predicted image obtained after input_i is fed into the generation model G, and D(G(input_i)) denotes the discrimination result, after discrimination by the discrimination model D, of the predicted image produced by the generation model;
the loss function of the discrimination model is:
V2 = (1/n) · Σ_{i=1}^{n} [log D(true_i) + log(1 − D(G(input_i)))]
where V2 is the loss value of the discrimination model, true_i denotes the real sample label in the i-th image group, and D(true_i) denotes the discrimination result of the real sample label after discrimination by the discrimination model.
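Assuming the standard minimax GAN form (consistent with the claim's description of the scalars D(G(input_i)) and D(true_i) and the log-likelihood), the two loss values of claim 7 could be computed as below; the function names and the assumed sign conventions are not from the patent.

```python
import numpy as np


def generator_loss(d_on_fake):
    """V1 = (1/n) * sum_i log(1 - D(G(input_i))): the generator is
    trained by minimizing this, i.e. by making D score its predicted
    images close to 1."""
    d = np.asarray(d_on_fake, dtype=np.float64)
    return np.mean(np.log(1.0 - d))


def discriminator_loss(d_on_real, d_on_fake):
    """V2 = (1/n) * sum_i [log D(true_i) + log(1 - D(G(input_i)))]:
    maximized by a discriminator that scores real labels near 1 and
    predicted images near 0."""
    r = np.asarray(d_on_real, dtype=np.float64)
    f = np.asarray(d_on_fake, dtype=np.float64)
    return np.mean(np.log(r) + np.log(1.0 - f))
```

A perfect discriminator (D(true_i) = 1, D(G(input_i)) = 0) attains V2 = 0, the maximum of the sum of two log terms bounded above by zero.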
8. The radar echo extrapolation method based on long short-term memory and generative adversarial network according to claim 1, characterized in that: in step C, the size of the radar echo extrapolation image is 360 × 480.
CN201910763464.0A 2019-08-19 2019-08-19 Radar echo extrapolation method based on long-time and short-time memory and generation countermeasure network Active CN110456355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910763464.0A CN110456355B (en) 2019-08-19 2019-08-19 Radar echo extrapolation method based on long-time and short-time memory and generation countermeasure network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910763464.0A CN110456355B (en) 2019-08-19 2019-08-19 Radar echo extrapolation method based on long-time and short-time memory and generation countermeasure network

Publications (2)

Publication Number Publication Date
CN110456355A true CN110456355A (en) 2019-11-15
CN110456355B CN110456355B (en) 2021-12-24

Family

ID=68487487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910763464.0A Active CN110456355B (en) 2019-08-19 2019-08-19 Radar echo extrapolation method based on long-time and short-time memory and generation countermeasure network

Country Status (1)

Country Link
CN (1) CN110456355B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111008604A (en) * 2019-12-09 2020-04-14 上海眼控科技股份有限公司 Prediction image acquisition method and device, computer equipment and storage medium
CN111028260A (en) * 2019-12-17 2020-04-17 上海眼控科技股份有限公司 Image prediction method, image prediction device, computer equipment and storage medium
CN111915591A (en) * 2020-08-03 2020-11-10 中国海洋大学 Spiral generation network for high-quality image extrapolation
CN112180375A (en) * 2020-09-14 2021-01-05 成都信息工程大学 Meteorological radar echo extrapolation method based on improved TrajGRU network
CN114509825A (en) * 2021-12-31 2022-05-17 河南大学 Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN116499532A (en) * 2023-06-27 2023-07-28 中建三局集团华南有限公司 Complex marine environment deep water pile group construction monitoring system constructed based on hydrologic model

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09257951A * 1996-03-22 1997-10-03 Nippon Telegr &amp; Teleph Corp &lt;Ntt&gt; Weather forecasting device
WO2007005328A2 (en) * 2005-06-30 2007-01-11 Massachusetts Institute Of Technology Weather radar echo tops forecast generation
CN107016406A (en) * 2017-02-24 2017-08-04 中国科学院合肥物质科学研究院 The pest and disease damage image generating method of network is resisted based on production
CN107121679A (en) * 2017-06-08 2017-09-01 湖南师范大学 Recognition with Recurrent Neural Network predicted method and memory unit structure for Radar Echo Extrapolation
CN107545245A (en) * 2017-08-14 2018-01-05 中国科学院半导体研究所 A kind of age estimation method and equipment
CN107632295A (en) * 2017-09-15 2018-01-26 广东工业大学 A kind of Radar Echo Extrapolation method based on sequential convolutional neural networks
CN108846409A (en) * 2018-04-28 2018-11-20 中国人民解放军国防科技大学 Radar echo extrapolation model training method based on cyclic dynamic convolution neural network
CN109003678A (en) * 2018-06-12 2018-12-14 清华大学 A kind of generation method and system emulating text case history
WO2019019199A1 (en) * 2017-07-28 2019-01-31 Shenzhen United Imaging Healthcare Co., Ltd. System and method for image conversion
EP3471005A1 (en) * 2017-10-13 2019-04-17 Nokia Technologies Oy Artificial neural network
CN109872346A (en) * 2019-03-11 2019-06-11 南京邮电大学 A kind of method for tracking target for supporting Recognition with Recurrent Neural Network confrontation study
CN109919032A (en) * 2019-01-31 2019-06-21 华南理工大学 A kind of video anomaly detection method based on action prediction
CN109948693A (en) * 2019-03-18 2019-06-28 西安电子科技大学 Expand and generate confrontation network hyperspectral image classification method based on super-pixel sample

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
LIN TIAN et al.: "A Generative Adversarial Gated Recurrent Unit Model for Precipitation Nowcasting", IEEE Geoscience and Remote Sensing Letters *
YATIAN SHEN et al.: "Entity-Dependent Long-Short Time Memory Network for Semantic Relation Extraction", 2018 5th IEEE International Conference on Cloud Computing and Intelligence Systems (CCIS) *
ZHANG LINGLING: "Short-term Rainfall Prediction Based on Radar Echo Images", China Master's Theses Full-text Database, Information Science and Technology Series *
JIA PEIYAN et al.: "Concept Lattice Isomorphism Decision Algorithm Based on Node Classification", Computer Applications and Software *
CHEN YUANZHAO et al.: "Research on Artificial Intelligence Nowcasting Methods Based on Generative Adversarial Networks (GAN)", Transactions of Atmospheric Sciences *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111008604A (en) * 2019-12-09 2020-04-14 上海眼控科技股份有限公司 Prediction image acquisition method and device, computer equipment and storage medium
CN111028260A (en) * 2019-12-17 2020-04-17 上海眼控科技股份有限公司 Image prediction method, image prediction device, computer equipment and storage medium
CN111915591A (en) * 2020-08-03 2020-11-10 中国海洋大学 Spiral generation network for high-quality image extrapolation
CN112180375A (en) * 2020-09-14 2021-01-05 成都信息工程大学 Meteorological radar echo extrapolation method based on improved TrajGRU network
CN112180375B (en) * 2020-09-14 2022-12-20 成都信息工程大学 Weather radar echo extrapolation method based on improved TrajGRU network
CN114509825A (en) * 2021-12-31 2022-05-17 河南大学 Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN114509825B (en) * 2021-12-31 2022-11-08 河南大学 Strong convection weather prediction method and system for improving three-dimensional confrontation generation neural network based on hybrid evolution algorithm
CN116499532A (en) * 2023-06-27 2023-07-28 中建三局集团华南有限公司 Complex marine environment deep water pile group construction monitoring system constructed based on hydrologic model
CN116499532B (en) * 2023-06-27 2023-09-01 中建三局集团华南有限公司 Complex marine environment deep water pile group construction monitoring system constructed based on hydrologic model

Also Published As

Publication number Publication date
CN110456355B (en) 2021-12-24

Similar Documents

Publication Publication Date Title
CN110456355A (en) A kind of Radar Echo Extrapolation method based on long short-term memory and generation confrontation network
Han et al. Convolutional neural network for convective storm nowcasting using 3-D Doppler weather radar data
Zhou et al. Forecasting different types of convective weather: A deep learning approach
Olatomiwa et al. Adaptive neuro-fuzzy approach for solar radiation prediction in Nigeria
Mahjoobi et al. Hindcasting of wave parameters using different soft computing methods
Litta et al. Artificial neural network model in prediction of meteorological parameters during premonsoon thunderstorms
Akbarifard et al. Predicting sea wave height using Symbiotic Organisms Search (SOS) algorithm
Meng et al. Forecasting tropical cyclones wave height using bidirectional gated recurrent unit
CN111160520A (en) BP neural network wind speed prediction method based on genetic algorithm optimization
CN109001736B (en) Radar echo extrapolation method based on deep space-time prediction neural network
Chaudhuri Convective energies in forecasting severe thunderstorms with one hidden layer neural net and variable learning rate back propagation algorithm
CN113239722B (en) Deep learning based strong convection extrapolation method and system under multi-scale
Li et al. Decomposition integration and error correction method for photovoltaic power forecasting
Elbeltagi et al. Optimizing hyperparameters of deep hybrid learning for rainfall prediction: a case study of a Mediterranean basin
Wang et al. A quantitative comparison of precipitation forecasts between the storm-scale numerical weather prediction model and auto-nowcast system in Jiangsu, China
CN115759445A (en) Machine learning and cloud model-based classified flood random forecasting method
Mar et al. Optimum neural network architecture for precipitation prediction of Myanmar
Zhu et al. Long lead-time radar rainfall nowcasting method incorporating atmospheric conditions using long short-term memory networks
CN115877483A (en) Typhoon path forecasting method based on random forest and GRU
Zhang et al. A multi-site tide level prediction model based on graph convolutional recurrent networks
CN116012618A (en) Weather identification method, system, equipment and medium based on radar echo diagram
Sonkusare et al. Improved performance of multi-model ensemble through the bias correction based on ANN technique
Wang et al. Data-driven modeling of Bay-Ocean wave spectra at bridge-tunnel crossing of Chesapeake Bay, USA
Yuhao et al. Research on Prediction of Ground Settlement of Deep Foundation Pit Based on Improved PSO-BP Neural Network
Wei Real-time extreme rainfall evaluation system for the construction industry using deep convolutional neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant