CN111158068A - Short-term prediction method and system based on simple convolutional recurrent neural network - Google Patents

Short-term prediction method and system based on simple convolutional recurrent neural network

Info

Publication number
CN111158068A
CN111158068A
Authority
CN
China
Prior art keywords
neural network
image
convolutional neural
model
memory unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911410044.0A
Other languages
Chinese (zh)
Other versions
CN111158068B (en)
Inventor
叶允明
李旭涛
董宇
姬喜洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Shenzhen Graduate School Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Harbin Institute of Technology filed Critical Shenzhen Graduate School Harbin Institute of Technology
Priority to CN201911410044.0A priority Critical patent/CN111158068B/en
Publication of CN111158068A publication Critical patent/CN111158068A/en
Application granted granted Critical
Publication of CN111158068B publication Critical patent/CN111158068B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01W METEOROLOGY
    • G01W 1/00 Meteorology
    • G01W 1/10 Devices for predicting weather conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental Sciences (AREA)
  • Image Processing (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)

Abstract

The invention provides a short-term prediction method and system based on a simple convolutional recurrent neural network, wherein the method comprises the following steps: acquiring a radar echo image sequence for short-term weather forecasting; and inputting the radar echo image sequence into an encoding-decoding model to generate a short-term weather forecast result. The beneficial effects of the invention are as follows: by constructing the encoding-decoding model as the prediction model for short-term weather forecasting, only a radar echo image sequence for short-term weather forecasting needs to be input into the model, and the image feature information in the radar echo image sequence is obtained through the model, so that the image feature information can be analysed, the short-term weather forecast is obtained more conveniently, and the forecast is more accurate.

Description

Short-term prediction method and system based on simple convolutional recurrent neural network
Technical Field
The invention relates to the technical field of ground meteorological observation, and in particular to a short-term forecasting method and system based on a simple convolutional recurrent neural network.
Background
In recent years, with the development of Internet technology, public demand for real-time short-term weather forecasting has grown steadily, and the related nowcasting techniques have advanced accordingly. In current weather forecasting, the short-term forecast of rainfall mainly relies on extrapolation of weather radar echo images: a weather radar emits radio waves into the upper air, and by monitoring the echoes reflected by cloud layers at different heights (for example 0.5 km, 1.5 km, 2.5 km and 3.5 km) the water-vapour distribution of those cloud layers can be judged, which can then be used to estimate the rainfall.
Disclosure of Invention
The problem to be solved by the invention is how to improve the accuracy of the weather forecast.
In order to solve the above problems, the present invention provides a short-term prediction method based on a simple convolutional recurrent neural network, comprising the following steps:
acquiring a radar echo image sequence for weather short-term forecasting;
inputting the radar echo image sequence into an encoding-decoding model to generate a weather short-term forecast result;
the encoding-decoding model comprises an encoder and a decoder, wherein the encoder is used for encoding and extracting image characteristic information of the radar echo image sequence, and the decoder is used for decoding and outputting the result of the weather short-term forecast by taking the output of the encoder as input.
Further, the encoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of down-sampling layers, the decoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of up-sampling layers, and the convolutional neural network memory unit of each layer of the encoder corresponds to one layer of convolutional neural network memory units of the decoder;
the down-sampling layers of the encoder are used for reducing the received image data by convolution;
the convolutional neural network memory units of the encoder and the decoder are used for encoding the received image data to obtain an output state image and a memory state image which comprise the image feature information;
the up-sampling layers of the decoder are used for expanding the received image data by deconvolution, wherein the image data received by an up-sampling layer of the decoder comprise the output state image output by the convolutional neural network memory unit in the layer above that up-sampling layer and the memory state image output by the corresponding convolutional neural network memory unit of the encoder.
Further, the image feature information comprises local detail information and overall morphology information of the image; the encoder and the decoder each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the local detail information of the image, and each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the overall morphology information of the image.
Further, the convolutional neural network memory unit comprises:
a forget gate f_t, f_t = σ(W_f * x_t + b_f);
a reset gate r_t, r_t = σ(W_r * x_t + b_r);
the memory state of the convolutional neural network memory unit is c_t:
x̃_t = W * x_t,
c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t;
the output state of the convolutional neural network memory unit is h_t: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
where b_f and b_r denote two biases, * denotes convolution, ⊙ denotes multiplication of corresponding elements of the matrices, σ and g denote activation functions, t denotes the current time step, x_t denotes the data received by the convolutional neural network memory unit, x̃_t denotes the convolved input W * x_t, and W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit.
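For illustration, a minimal sketch of such a convolutional memory unit follows, written in Python with PyTorch (the framework is an assumption; the patent names none). The sketch assumes equal input and hidden channel counts so that the highway term r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t is well defined, and uses sigmoid for σ and ReLU for g, following the activation choices mentioned in the detailed description.

import torch
import torch.nn as nn

class ConvSRUCell(nn.Module):
    """Convolutional simple recurrent unit (illustrative sketch, not the patented code)."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        # W, W_f and W_r share one shape, so they are spliced into a single convolution.
        self.conv = nn.Conv2d(channels, 3 * channels, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x_t, c_prev):
        # x_t, c_prev: (batch, channels, height, width)
        x_tilde, f_in, r_in = torch.chunk(self.conv(x_t), 3, dim=1)
        f_t = torch.sigmoid(f_in)                        # forget gate
        r_t = torch.sigmoid(r_in)                        # reset gate
        c_t = f_t * c_prev + (1.0 - f_t) * x_tilde       # memory state
        h_t = r_t * torch.relu(c_t) + (1.0 - r_t) * x_t  # output state
        return h_t, c_t

Because the three kernels have the same shape and size, splicing them into one convolution allows them to be trained in parallel, as noted later in the detailed description.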
Further, before the step of inputting the radar echo image sequence into an encoding-decoding model to generate a result of the weather forecast, the method further comprises the following steps:
sampling historical radar echo images by taking a first continuous time step and a second continuous time step as sliding windows, wherein the first continuous time step and the second continuous time step are continuous in time;
setting the echo image at each of the first successive time steps as model input data, and setting the echo image at each of the second successive time steps as live data;
establishing a coding-decoding model;
inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained, wherein each time step in the second continuous time steps corresponds to one model prediction data, and the iterative prediction is iterated according to the time steps in the second continuous time steps;
training the encoding-decoding model according to the model prediction data and the corresponding live data until the encoding-decoding model converges.
Further, the inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained specifically includes:
receiving, by the downsampling layer located at a topmost layer in the encoder, the model input data for a current time step in the first continuous time step and convolution reducing the model input data for transmission to the convolutional neural network memory unit at a lower layer;
encoding the received model input data by the convolutional neural network memory unit of the encoder to obtain an output state image and a memory state image which comprise image characteristic information;
transmitting the output state image into the down-sampling layer of a lower layer,
performing convolution reduction on the received output state image through the down-sampling layer positioned in the middle layer of the encoder and transmitting the reduced output state image to the convolution neural network memory unit at the lower layer;
storing the memory state image in the convolutional neural network memory unit;
and when the current time step is the final time step of the first continuous time steps, transmitting the memory state image stored in the convolutional neural network memory unit at the final time step to the corresponding convolutional neural network memory unit of the decoder, and transmitting the memory state image to the up-sampling layer below that convolutional neural network memory unit of the decoder.
Further, the inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained specifically includes:
performing deconvolution expansion on the received output state image through the up-sampling layer of the decoder, and transmitting the output state image to the convolutional neural network memory unit of the layer below or outputting it as the model prediction data, wherein the bottommost up-sampling layer of the decoder outputs the model prediction data;
encoding, through the convolutional neural network memory unit of the decoder, the received data transmitted by the up-sampling layer above it to obtain an output state image and a memory state image, transmitting the output state image to the up-sampling layer below, and receiving, through the convolutional neural network memory unit of the decoder, the memory state image transmitted by the convolutional neural network memory unit of the corresponding layer in the encoder.
Further, the sampling of the historical radar echo image by using the continuous first continuous time step and the continuous second continuous time step as the sliding window specifically includes:
obtaining historical radar echo images of multiple heights at the same time in each time step of the first continuous time step, and obtaining historical radar echo images of multiple heights at the same time in each time step of the second continuous time step;
taking the historical radar echo images at a plurality of heights at the same moment as a plurality of channels of the same image;
and sampling an image area of the historical radar echo image, wherein the echo intensity is within a preset range.
The beneficial effects of the invention are as follows: by constructing the encoding-decoding model as the prediction model for short-term weather forecasting, only a radar echo image sequence for short-term weather forecasting needs to be input into the model, and the image feature information in the radar echo image sequence is obtained through the model, so that the image feature information can be analysed, the short-term weather forecast is obtained more conveniently, and the forecast is more accurate. Samples of a certain time span are taken from the historical radar echo images, the echo images of each time period are extracted, and the subsequent short-term weather forecast is made from the echo images of the earlier time periods through the encoding-decoding model; that is, model prediction data for the later time are obtained from the echo images of the earlier time periods, the model prediction data are compared with the radar data of the same time period in the historical radar echo images, and the encoding-decoding model is trained according to the comparison, so that the trained encoding-decoding model is used for subsequent short-term weather forecasting, the lead time of the short-term weather forecast is longer, and the accuracy is higher.
The short-term prediction system based on the simple convolutional recurrent neural network comprises a computer-readable storage medium and a processor, wherein a computer program is stored in the computer-readable storage medium, and when the computer program is read and run by the processor, the short-term prediction method based on the simple convolutional recurrent neural network is realized.
Further, the system comprises an encoding-decoding model, wherein the encoding-decoding model comprises an encoder in which a plurality of layers of convolutional neural network memory units alternate with a plurality of down-sampling layers, and a decoder in which a plurality of layers of convolutional neural network memory units alternate with a plurality of up-sampling layers, wherein the convolutional neural network memory unit of each layer of the encoder corresponds to the convolutional neural network memory unit of one layer of the decoder, and the convolutional neural network memory unit comprises:
a forget gate f_t, f_t = σ(W_f * x_t + b_f);
a reset gate r_t, r_t = σ(W_r * x_t + b_r);
the memory state of the convolutional neural network memory unit is c_t:
x̃_t = W * x_t,
c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t;
the output state of the convolutional neural network memory unit is h_t: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
where b_f and b_r denote two biases, * denotes convolution, ⊙ denotes multiplication of corresponding elements of the matrices, σ and g denote activation functions, t denotes the current time step, x_t denotes the data received by the convolutional neural network memory unit, x̃_t denotes the convolved input W * x_t, and W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit.
Compared with the prior art, the short-term prediction system based on the simple convolutional recurrent neural network and the short-term prediction method based on the simple convolutional recurrent neural network have the same advantages, and are not described again.
A computer-readable storage medium, which stores a computer program that, when read and executed by a processor, implements the above-described method for short-term prediction based on a simple convolutional recurrent neural network.
The computer readable storage medium of the present invention has the same advantages as the above-mentioned short-term prediction method based on the simple convolutional recurrent neural network over the prior art, and is not described herein again.
Drawings
FIG. 1 is a flow chart of a method for short-term prediction based on a simple convolutional recurrent neural network according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating the operation of the encoding-decoding model according to an embodiment of the present invention over a first continuous time step and a second continuous time step;
FIG. 3 is a block diagram of a convolutional neural network memory unit according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the input and output when the encoding-decoding model according to the embodiment of the present invention is trained.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Also, it is noted that the terms "first," "second," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
Referring to fig. 1 and fig. 2, an embodiment of the present invention provides a short-term prediction method based on a simple convolutional recurrent neural network, comprising the following steps:
s1, acquiring a radar echo image sequence for weather short-term forecasting;
s2, inputting the radar echo image sequence into an encoding-decoding model to generate a weather short-term forecast result;
the encoding-decoding model comprises an encoder and a decoder, wherein the encoder is used for encoding and extracting image characteristic information of the radar echo image sequence, and the decoder is used for decoding and outputting a weather short-term forecast result by taking the output of the encoder as input.
Existing short-term weather forecasting methods mainly include forecasting by the optical-flow method, the cell-centroid method, the cross-correlation method and the like; the forecast results of these methods already show errors within a short time, and the errors of the forecast results become large after a certain period.
In view of this, the invention constructs the encoding-decoding model as the prediction model for short-term weather forecasting: only a radar echo image sequence for short-term weather forecasting needs to be input into the model, and the image feature information in the radar echo image sequence is obtained through the model, so that the image feature information can be analysed, the short-term weather forecast is obtained more conveniently, and the forecast is more accurate.
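As an illustration of steps S1 and S2, a minimal inference sketch follows; the function name, the tensor shapes and the assumption that a trained encoding-decoding model object is already available are hypothetical and not taken from the patent text.

import torch

def nowcast(model, echo_sequence):
    # echo_sequence: tensor of consecutive radar echo images, e.g. (1, T_in, C, H, W)
    model.eval()
    with torch.no_grad():
        forecast = model(echo_sequence)   # predicted echo image sequence, e.g. (1, T_out, 1, H, W)
    return forecast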
In an alternative embodiment of the present invention, referring to fig. 2, the encoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of down-sampling layers, the decoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of up-sampling layers, and each layer of convolutional neural network memory units of the encoder corresponds to one layer of convolutional neural network memory units of the decoder;
the down-sampling layers of the encoder are used for reducing the received image data by convolution;
the convolutional neural network memory units of the encoder and the decoder are used for encoding the received image data to obtain an output state image and a memory state image which comprise the image feature information;
the up-sampling layers of the decoder are used for expanding the received image data by deconvolution, wherein the image data received by an up-sampling layer of the decoder comprise the output state image output by the convolutional neural network memory unit in the layer above that up-sampling layer and the memory state image output by the corresponding convolutional neural network memory unit of the encoder.
Referring to fig. 2 and fig. 4, in this embodiment the encoder and the decoder of the encoding-decoding model are each formed by stacking a plurality of convolutional neural network memory units, with down-sampling layers inserted into the encoder and up-sampling layers inserted into the decoder. When a radar echo image sequence for short-term weather forecasting is input into the encoding-decoding model, the down-sampling layers of the encoder convolve the received image data layer by layer to reduce the scale of the image data; the topmost down-sampling layer of the encoder processes the image data of the radar echo image sequence itself, and each later down-sampling layer processes the image data output by the convolutional neural network memory unit in the layer above it. Reducing the image scale in this way makes it easier for the convolutional neural network memory units to extract the image feature information from the image data, and the units store the extracted information. Correspondingly, the up-sampling layers of the decoder perform deconvolution on the received image data layer by layer, and the lowest up-sampling layer finally outputs the short-term weather forecast result, which is likewise expressed as an image sequence. As shown in fig. 4, the input image sequence is the radar echo image sequence and the output predicted image sequence is the short-term weather forecast result.
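A minimal sketch of this encoder layout follows, reusing the ConvSRUCell sketched after the memory-unit formulas above. The three-layer depth follows the embodiment; the channel counts loosely follow the embodiment's down-sampling kernel counts (8, 64, 192) but are equalised between each down-sampling layer and its ConvSRU unit so that the highway term is well defined. These choices are assumptions, not the patented configuration.

import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, in_channels=3, channels=(8, 64, 192)):
        super().__init__()
        c1, c2, c3 = channels
        # down-sampling layers: 3 x 3 convolutions with stride 2 halve the image scale
        self.down1 = nn.Conv2d(in_channels, c1, 3, stride=2, padding=1)
        self.rnn1 = ConvSRUCell(c1)   # ConvSRUCell: see the earlier sketch (assumed in scope)
        self.down2 = nn.Conv2d(c1, c2, 3, stride=2, padding=1)
        self.rnn2 = ConvSRUCell(c2)
        self.down3 = nn.Conv2d(c2, c3, 3, stride=2, padding=1)
        self.rnn3 = ConvSRUCell(c3)
        self.channels = channels

    def forward(self, frames):
        # frames: (batch, T_in, in_channels, H, W) radar echo images of the input window
        b, t, _, h, w = frames.shape
        # the initial memory states of the three ConvSRU layers are all zero
        c1 = frames.new_zeros(b, self.channels[0], h // 2, w // 2)
        c2 = frames.new_zeros(b, self.channels[1], h // 4, w // 4)
        c3 = frames.new_zeros(b, self.channels[2], h // 8, w // 8)
        for i in range(t):                        # encode the first continuous time steps
            x = frames[:, i]
            h1, c1 = self.rnn1(self.down1(x), c1)
            h2, c2 = self.rnn2(self.down2(h1), c2)
            h3, c3 = self.rnn3(self.down3(h2), c3)
        return c1, c2, c3                         # final memory states handed to the decoder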
In an optional embodiment of the present invention, the image feature information comprises local detail information and overall morphology information of the image; the encoder and the decoder each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the local detail information of the image, and each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the overall morphology information of the image.
In this embodiment, in the encoding stage the lower-layer convolutional neural network memory units are mainly responsible for extracting the local detail information of the image, while the higher-layer units are mainly responsible for extracting the overall morphology information of the image, so the image feature information obtained in the encoding stage proceeds from details to the whole; correspondingly, in the decoding stage the image feature information obtained in the encoding stage can be restored from the overall morphology back to specific details, so that feature extraction is performed on the radar echo image sequence reasonably and the short-term weather forecast result is obtained more accurately.
In an optional embodiment of the present invention, before the step of inputting the radar echo image sequence into an encoding-decoding model to generate a result of the weather forecast, the method further includes the following steps:
sampling historical radar echo images by taking a first continuous time step and a second continuous time step as sliding windows, wherein the first continuous time step and the second continuous time step are continuous in time;
setting the echo image at each of the first successive time steps as model input data, and setting the echo image at each of the second successive time steps as live data;
establishing a coding-decoding model;
inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained, wherein each time step in the second continuous time steps corresponds to one model prediction data, and the iterative prediction is iterated according to the time steps in the second continuous time steps;
training the encoding-decoding model according to the model prediction data and the corresponding live data until the encoding-decoding model converges.
In this embodiment, samples of a certain time span are taken from the historical radar echo images and the echo images of each time period are extracted. Model prediction data for the later time period are produced from the echo images of the earlier time period by the constructed encoding-decoding model, and the encoding-decoding model is trained by combining the model prediction data with the actual data so that the output of the trained model approaches the actual situation. That is, after the model prediction data are obtained they are compared with the radar images of the same time period in the historical radar echo images, and the encoding-decoding model is trained according to the comparison, so that subsequent short-term weather forecast results are generated by the trained model. The short-term weather forecast can thus be obtained more accurately and more quickly; at the same time, choosing the length of the time period of the selected historical radar echo images appropriately helps to increase the forecast time span of the actual weather forecast.
Specifically, referring to fig. 2, the historical radar echo images are sampled by taking a first continuous time step and a second continuous time step as a sliding window, where the first continuous time step and the second continuous time step are consecutive in time. In this embodiment the first continuous time step consists of several consecutive time periods, for example the 5 time periods X0, X1, X2, X3 and X4 in the figure; according to the requirements of the weather forecast, the second continuous time step is correspondingly selected in this embodiment as the 20 consecutive time steps following the first continuous time step, i.e. X5, X6, ..., X24. Assuming that in the historical radar echo images a radar image is generated every 6 minutes, the total length of the first continuous time step is 30 minutes and the total length of the second continuous time step is two hours, so that 5 echo images are collected in the first continuous time step and 20 echo images are collected in the second continuous time step. The echo image at each of the first continuous time steps is set as model input data, and the echo images corresponding to the second continuous time steps are predicted by inputting the model input data into the encoding-decoding model; the echo image at each of the second continuous time steps is set as live data, the live data are compared with the prediction result, and the deviation of the model is determined from the comparison, whereby the model can be corrected.
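A minimal sketch of this sliding-window sampling follows; the function name, the use of NumPy arrays and the stride of one time step are illustrative assumptions, while the 5-frame input window and 20-frame live-data window follow the example above.

import numpy as np

def make_training_samples(frames, n_in=5, n_out=20, stride=1):
    """frames: time-ordered sequence of preprocessed echo images, one every 6 minutes."""
    samples = []
    window = n_in + n_out
    for start in range(0, len(frames) - window + 1, stride):
        model_input = np.stack(frames[start:start + n_in])          # first continuous time steps
        live_data = np.stack(frames[start + n_in:start + window])   # second continuous time steps
        samples.append((model_input, live_data))
    return samples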
After the model input data and the live data have been acquired and set, the model input data are input into the encoding-decoding model for iterative prediction until a plurality of model prediction data are obtained, where each time step in the second continuous time steps corresponds to one item of model prediction data and the iterative prediction proceeds step by step through the second continuous time steps. The encoding-decoding model encodes and decodes the input model input data so as to predict the data corresponding to the second continuous time steps and obtain the model prediction data corresponding to the live data of those time steps. When predicting with the encoding-decoding model, prediction is performed in an iterative manner: after the model prediction data of the first of the second continuous time steps, such as time step X5, have been predicted, the data predicted at that time step are used as the initial data for the next time step. Predicting from the adjacent time step in this way reduces the prediction error and, correspondingly, the error in model training, so that the error of the actual short-term weather forecast is also reduced.
After the model prediction data have been obtained from the model, the live data, i.e. the actual weather, at the same time steps of the second continuous time steps are available, and by combining the model prediction data with the live data the encoding-decoding model can be trained. Once the model training is completed, short-term weather forecasting can be performed with the model. The time spans of the first continuous time step and the second continuous time step can also be changed, so that in subsequent weather forecasting the forecast time span is increased while the forecast error is reduced to a certain extent.
In an optional embodiment of the present invention, the training the coding-decoding model according to the model prediction data and the corresponding live data until the coding-decoding model converges specifically includes:
constructing an error function according to the model prediction data and the corresponding live data, and reversely deriving;
updating parameters in the encoding-decoding model according to an inverse derivative of the error function.
In this embodiment, after the model prediction data are obtained, an error function may be constructed using the mean square error (MSE) or the mean absolute error (MAE) and differentiated in reverse, so that the parameters in the encoding-decoding model are updated according to the reverse derivative of the error function and the encoding-decoding model is trained.
A preset learning rate may also be set; the preset learning rate obtained during model training is used to limit the amplitude of the parameter updates in the encoding-decoding model, and the parameters in the encoding-decoding model are updated according to the preset learning rate and the reverse derivative of the error function, which makes the model training more accurate and improves the accuracy of the subsequent short-term weather forecast.
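A minimal sketch of one training step follows; the optimizer choice (Adam) and the value of the preset learning rate are assumptions, and mse_loss can be swapped for l1_loss if the MAE error function is preferred.

import torch
import torch.nn.functional as F

def train_step(model, optimizer, model_input, live_data):
    # model_input: (batch, 5, C, H, W) echo images of the first continuous time steps
    # live_data:   (batch, 20, 1, H, W) echo images of the second continuous time steps
    optimizer.zero_grad()
    prediction = model(model_input)            # model prediction data
    loss = F.mse_loss(prediction, live_data)   # error function (MSE; MAE also possible)
    loss.backward()                            # reverse derivation of the error function
    optimizer.step()                           # update limited by the preset learning rate
    return loss.item()

# optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)   # preset learning rate (assumed value)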
In an optional embodiment of the present invention, the sampling the historical radar echo image with the first continuous time step and the second continuous time step as the sliding window specifically includes:
obtaining historical radar echo images of multiple heights at the same time in each time step of the first continuous time step, and obtaining historical radar echo images of multiple heights at the same time in each time step of the second continuous time step;
taking the historical radar echo images at a plurality of heights at the same moment as a plurality of channels of the same image;
and sampling an image area of the historical radar echo image, wherein the echo intensity is within a preset range.
In this embodiment, the historical radar echo images detected at three heights (1.5 km, 2.5 km and 3.5 km) at the same time are regarded as three channels of the same image, the image is preprocessed, and the image area of the historical radar echo images whose echo intensity lies within the preset range is sampled and retained, for example with the preset echo intensity range set to 15-80, so as to remove ground clutter and abnormal echoes from the acquired historical echo images; the sampled image area is then used as the echo image of each time step, which improves the accuracy of the weather forecast.
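A minimal sketch of this preprocessing step follows; the array names and the choice to zero out-of-range intensities are assumptions, while the three heights and the preset range 15-80 follow the embodiment.

import numpy as np

def preprocess(echo_1p5km, echo_2p5km, echo_3p5km, lo=15.0, hi=80.0):
    # the echoes observed at 1.5 km, 2.5 km and 3.5 km at one moment
    # become the three channels of a single image: (3, H, W)
    img = np.stack([echo_1p5km, echo_2p5km, echo_3p5km], axis=0).astype(np.float32)
    # keep only the image area whose echo intensity lies in the preset range,
    # removing ground clutter and abnormal echoes
    mask = (img >= lo) & (img <= hi)
    return np.where(mask, img, 0.0)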
In an alternative embodiment of the present invention,
inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained, specifically comprising:
receiving, by the downsampling layer located at a topmost layer in the encoder, the model input data for a current time step in the first continuous time step and convolution reducing the model input data for transmission to the convolutional neural network memory unit at a lower layer;
encoding the received model input data by the convolutional neural network memory unit of the encoder to obtain an output state image and a memory state image which comprise image characteristic information;
transmitting the output state image into the down-sampling layer of a lower layer,
performing convolution reduction on the received output state image through the down-sampling layer positioned in the middle layer of the encoder and transmitting the reduced output state image to the convolution neural network memory unit at the lower layer;
storing the memory state image in the convolutional neural network memory unit;
and when the current time step is the final time step of the first continuous time steps, transmitting the memory state image stored in the convolutional neural network memory unit at the final time step to the corresponding convolutional neural network memory unit of the decoder, and transmitting the memory state image to the up-sampling layer below that convolutional neural network memory unit of the decoder.
Referring to fig. 2, in this embodiment the encoding-decoding model comprises an encoder and a decoder. The encoder is a stack of multiple layers of convolutional neural network memory units (ConvSRU) arranged alternately with multiple down-sampling layers. Each down-sampling layer reduces the scale of the model input data, i.e. the image data, of the current time step, for example to half of the size at which it is received; the down-sampling layer can use a convolution operation with a stride greater than 1 so as to retain as much feature information as possible, which makes it easier to extract the image feature information once the scale of the image data has been reduced. During model training the encoder encodes the model input data of the first continuous time steps. At the first of these time steps, e.g. X0, the topmost down-sampling layer of the encoder receives the model input data of X0. In this embodiment the convolutional neural network memory units (ConvSRU) and the down-sampling layers may each have three layers, which reduces the memory consumption and cost of training while still guaranteeing the training. The size of the model input data is 480 × 3, and the initial states of the three layers of ConvSRU units in the encoder are all zero. The collected model input data are therefore first processed by the first, i.e. topmost, down-sampling layer of the encoder; in this embodiment its convolution kernel size is 3 × 3, its stride is 2 and the number of convolution kernels is 8, which reduces the input model input data to 240 × 8. After encoding by the ConvSRU unit of the layer below, an image of 240 × 64 is obtained; this image is output to the next down-sampling layer as the output state image and is stored as the memory state image. The subsequent image processing in the encoder proceeds in order: the second down-sampling layer, with convolution kernel size 3 × 3, stride 2 and 64 convolution kernels, produces an image of size 120 × 64; the second-layer convolutional neural network memory unit (ConvSRU), with convolution kernel size 3 × 3, stride 1 and 192 convolution kernels, produces an output state image of size 120 × 192 after encoding and stores the memory state image 120 × 192; the third down-sampling layer, with convolution kernel size 3 × 3, stride 2 and 192 convolution kernels, produces an image of size 60 × 192; and the third-layer convolutional neural network memory unit (ConvSRU), with convolution kernel size 3 × 3, stride 1 and 192 convolution kernels, produces an output state image of size 60 × 192, at which point the memory state image 60 × 192 is stored.
At this point, after passing through the encoder, the 480 × 3 image input at this time step has produced three groups of memory states in the three layers of convolutional neural network memory units (ConvSRU), with scales of 240 × 64, 120 × 192 and 60 × 192 respectively. These three memory state images are updated again when the new input data of the next time step arrive, and the memory states store the rule by which the images change between different time steps.
When the current time step is the final time step of the first continuous time steps, e.g. time step X4 in this embodiment, the memory state image stored in the convolutional neural network memory unit at this final time step is transmitted to the corresponding convolutional neural network memory unit of the decoder and to the up-sampling layer below that convolutional neural network memory unit of the decoder, so that when the decoder of the model performs iterative prediction over the second continuous time steps, the prediction is made from these memory state images.
In an optional embodiment of the present invention, the inputting the model input data into the encoding-decoding model for iterative prediction until obtaining a plurality of model prediction data for model training specifically further includes:
performing deconvolution expansion on the received output state image through the up-sampling layer of the decoder, and transmitting the output state image to the convolutional neural network memory unit of the layer below or outputting it as the model prediction data, wherein the bottommost up-sampling layer of the decoder outputs the model prediction data;
encoding, through the convolutional neural network memory unit of the decoder, the received data transmitted by the up-sampling layer above it to obtain an output state image and a memory state image, transmitting the output state image to the up-sampling layer below, and receiving, through the convolutional neural network memory unit of the decoder, the memory state image transmitted by the convolutional neural network memory unit of the corresponding layer in the encoder.
In this embodiment the decoder of the encoding-decoding model is substantially symmetrical to the encoder, except that the initial states of the decoder are the final memory states of the encoder, i.e. the memory state images transmitted by the convolutional neural network memory units at the final time step of the first continuous time steps in the above embodiment, and the up-sampling layer operation of the decoder may use deconvolution. So that the decoder makes better use of the memory state images transmitted by the encoder and the accumulated error during decoding is reduced, the memory state image transmitted by the encoder and the output state image of the corresponding convolutional neural network memory unit (ConvSRU) in the decoder are input into the up-sampling layer of the decoder together. As shown in the figure, the topmost up-sampling layer of the decoder receives the output state image, expands it by deconvolution, and transmits the expanded output state image to the convolutional neural network memory unit (ConvSRU) of the layer below, which encodes it as its received output state image to obtain an output state image and a memory state image, outputs the output state image and stores the memory state image. The last up-sampling layer of the decoder corresponds to a down-sampling layer of the encoder; its number of convolution kernels is 1, and it convolves the 240 × 64 feature map into a single 480 × 1 image as the prediction result. The input of the decoder is 0; one radar echo extrapolation result is obtained at each iteration, and after 20 iterated time steps the final output of the decoder, namely 20 radar echo extrapolation results, is obtained. The 20 prediction images output by the decoder are compared, as model prediction data, with the real radar data of the corresponding 20 time steps, i.e. the live data. By processing the echo images through the multiple up-sampling layers and convolutional neural network memory units (ConvSRU) and predicting iteratively, the encoding-decoding model predicts reasonably, so that the short-term weather forecast is more accurate while the forecast time span is improved.
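A minimal sketch of this iterative decoding follows, mirroring the Encoder sketch above and reusing the ConvSRUCell; the transposed-convolution kernel size of 4 and the zero decoder input are assumptions chosen so that each up-sampling layer exactly doubles the image scale.

import torch
import torch.nn as nn

class Decoder(nn.Module):
    def __init__(self, channels=(8, 64, 192), out_channels=1):
        super().__init__()
        c1, c2, c3 = channels
        self.rnn3 = ConvSRUCell(c3)   # ConvSRUCell: see the earlier sketch (assumed in scope)
        self.up3 = nn.ConvTranspose2d(c3, c2, 4, stride=2, padding=1)   # doubles H and W
        self.rnn2 = ConvSRUCell(c2)
        self.up2 = nn.ConvTranspose2d(c2, c1, 4, stride=2, padding=1)
        self.rnn1 = ConvSRUCell(c1)
        self.up1 = nn.ConvTranspose2d(c1, out_channels, 4, stride=2, padding=1)

    def forward(self, encoder_states, num_steps=20):
        c1, c2, c3 = encoder_states          # final memory states of the encoder
        x = torch.zeros_like(c3)             # the decoder input is 0
        predictions = []
        for _ in range(num_steps):           # one extrapolated echo frame per iteration
            h3, c3 = self.rnn3(x, c3)
            h2, c2 = self.rnn2(self.up3(h3), c2)
            h1, c1 = self.rnn1(self.up2(h2), c1)
            predictions.append(self.up1(h1))
        return torch.stack(predictions, dim=1)   # (batch, num_steps, 1, H, W)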
In an optional embodiment of the present invention, the convolutional neural network memory unit includes:
a forget gate f_t, f_t = σ(W_f * x_t + b_f);
a reset gate r_t, r_t = σ(W_r * x_t + b_r);
the memory state of the convolutional neural network memory unit is c_t:
x̃_t = W * x_t,
c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t;
the output state of the convolutional neural network memory unit is h_t: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
where b_f and b_r denote two biases, * denotes convolution, ⊙ denotes multiplication of corresponding elements of the matrices, σ and g denote activation functions, t denotes the current time step, x_t denotes the data received by the convolutional neural network memory unit, x̃_t denotes the convolved input W * x_t, and W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit.
Referring to fig. 3, in this embodiment a convolutional neural network memory unit (ConvSRU) is used to process the image data, extract the image feature information in the data, store the memory state image and output the output state image. The memory state formula is c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t with x̃_t = W * x_t, and the output state formula is h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t, where t denotes the current time step; for example, when the current time step is X0, x_0 denotes the echo image received at time step X0, which in this embodiment is the multi-channel historical echo image at one moment of the current time step. W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit, which perform the convolution operations on the image; σ and g denote activation functions, where σ may be the sigmoid activation function and g may be the ReLU activation function. After processing by the convolutional neural network memory unit (ConvSRU), the motion of pixels in the image and their appearance and disappearance are recorded by the three convolution kernels of the memory unit. During actual training, the three convolution kernels W, W_f and W_r can be trained simultaneously: because they have the same shape and size they can be spliced directly together, which allows parallel training and improves the training speed.
The short-term prediction system based on the simple convolutional recurrent neural network comprises a computer-readable storage medium and a processor, wherein the computer-readable storage medium is used for storing a computer program, and when the computer program is read and run by the processor, the above short-term prediction method based on the simple convolutional recurrent neural network is realized.
In an optional embodiment of the present invention, the system further comprises an encoding-decoding model, wherein the encoding-decoding model comprises an encoder in which multiple layers of convolutional neural network memory units alternate with multiple down-sampling layers, and a decoder in which multiple layers of convolutional neural network memory units alternate with multiple up-sampling layers, wherein the convolutional neural network memory unit of each layer of the encoder corresponds to the convolutional neural network memory unit of one layer of the decoder, and the convolutional neural network memory unit comprises:
a forget gate f_t, f_t = σ(W_f * x_t + b_f);
a reset gate r_t, r_t = σ(W_r * x_t + b_r);
the memory state of the convolutional neural network memory unit is c_t:
x̃_t = W * x_t,
c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t;
the output state of the convolutional neural network memory unit is h_t: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
where b_f and b_r denote two biases, * denotes convolution, ⊙ denotes multiplication of corresponding elements of the matrices, σ and g denote activation functions, t denotes the current time step, x_t denotes the data received by the convolutional neural network memory unit, x̃_t denotes the convolved input W * x_t, and W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit.
Therefore, by constructing the coding-decoding model as a prediction model of the weather short-term forecast, only a radar echo image sequence for the weather short-term forecast is required to be input into the model, and the image characteristic information in the radar echo image sequence is obtained through the model, so that the image characteristic information can be analyzed, the weather short-term forecast can be more conveniently obtained, and the forecast is more accurate.
A computer-readable storage medium according to another embodiment of the present invention stores a computer program, and when the computer program is read and executed by a processor, the method for short-term prediction based on a simple convolutional recurrent neural network described above is implemented, so as to achieve the beneficial effects of the embodiments of the present invention.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A short-term prediction method based on a simple convolutional recurrent neural network, characterized by comprising the following steps:
acquiring a radar echo image sequence for weather short-term forecasting;
inputting the radar echo image sequence into an encoding-decoding model to generate a weather short-term forecast result; the encoding-decoding model comprises an encoder and a decoder, wherein the encoder is used for encoding and extracting image characteristic information of the radar echo image sequence, and the decoder is used for decoding and outputting the result of the weather short-term forecast by taking the output of the encoder as input.
2. The method according to claim 1, wherein the encoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of down-sampling layers, the decoder comprises a plurality of layers of convolutional neural network memory units alternating with a plurality of up-sampling layers, and the convolutional neural network memory unit of each layer of the encoder corresponds to one layer of convolutional neural network memory units of the decoder;
the down-sampling layers of the encoder are used for reducing the received image data by convolution;
the convolutional neural network memory units of the encoder and the decoder are used for encoding the received image data to obtain an output state image and a memory state image which comprise the image feature information; the up-sampling layers of the decoder are used for expanding the received image data by deconvolution, wherein the image data received by an up-sampling layer of the decoder comprise the output state image output by the convolutional neural network memory unit in the layer above that up-sampling layer and the memory state image output by the corresponding convolutional neural network memory unit of the encoder.
3. The method according to claim 2, wherein the image feature information comprises local detail information and overall morphology information of the image, and the encoder and the decoder each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the local detail information of the image; and the encoder and the decoder each comprise at least one layer of the convolutional neural network memory units for obtaining the output state image and the memory state image that comprise the overall morphology information of the image.
4. The method for short-term prediction based on simple convolutional recurrent neural network as claimed in claim 3, wherein the convolutional neural network memory unit comprises:
a forget gate f_t, f_t = σ(W_f * x_t + b_f);
a reset gate r_t, r_t = σ(W_r * x_t + b_r);
the memory state of the convolutional neural network memory unit is c_t:
x̃_t = W * x_t,
c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ x̃_t;
the output state of the convolutional neural network memory unit is h_t: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
where b_f and b_r denote two biases, * denotes convolution, ⊙ denotes multiplication of corresponding elements of the matrices, σ and g denote activation functions, t denotes the current time step, x_t denotes the data received by the convolutional neural network memory unit, x̃_t denotes the convolved input W * x_t, and W, W_f and W_r respectively denote the three convolution kernels in the convolutional neural network memory unit.
5. The short-term prediction method based on the simple convolutional recurrent neural network as claimed in any one of claims 2 to 4, wherein before the step of inputting the radar echo image sequence into an encoding-decoding model to generate the short-term weather forecast result, the method further comprises the following steps:
sampling historical radar echo images by taking a first continuous time step and a second continuous time step as sliding windows, wherein the first continuous time step and the second continuous time step are continuous in time;
setting the echo image at each of the first successive time steps as model input data, and setting the echo image at each of the second successive time steps as live data;
establishing a coding-decoding model;
inputting the model input data into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained, wherein each time step in the second continuous time steps corresponds to one model prediction data, and the iterative prediction is iterated according to the time steps in the second continuous time steps;
training the encoding-decoding model according to the model prediction data and the corresponding live data until the encoding-decoding model converges.
6. The short-term prediction method based on the simple convolutional recurrent neural network as claimed in claim 5, wherein the inputting of the model input data into the encoding-decoding model for iterative prediction until a plurality of model prediction data for model training are obtained specifically comprises:
receiving, by the downsampling layer located at a topmost layer in the encoder, the model input data for a current time step in the first continuous time step and convolution reducing the model input data for transmission to the convolutional neural network memory unit at a lower layer;
encoding the received model input data by the convolutional neural network memory unit of the encoder to obtain an output state image and a memory state image which comprise image characteristic information;
transmitting the output state image into the down-sampling layer of a lower layer,
performing convolution reduction on the received output state image through the down-sampling layer positioned in the middle layer of the encoder and transmitting the reduced output state image to the convolution neural network memory unit at the lower layer;
storing the memory state image in the convolutional neural network memory unit;
and when the current time step is the final time step of the first continuous time steps, transmitting the memory state image stored in the convolutional neural network memory unit at the final time step to the corresponding convolutional neural network memory unit of the decoder, and transmitting the memory state image to the up-sampling layer below that convolutional neural network memory unit of the decoder.
7. The method of claim 6, wherein the model input data is input into an encoding-decoding model for iterative prediction until a plurality of model prediction data for model training is obtained, and further comprising:
performing deconvolution expansion on the received output state image through the up-sampling layer of the decoder, and transmitting the output state image to the convolutional neural network memory unit of the layer below or outputting it as the model prediction data, wherein the bottommost up-sampling layer of the decoder outputs the model prediction data;
encoding, through the convolutional neural network memory unit of the decoder, the received data transmitted by the up-sampling layer above it to obtain an output state image and a memory state image, transmitting the output state image to the up-sampling layer below, and receiving, through the convolutional neural network memory unit of the decoder, the memory state image transmitted by the convolutional neural network memory unit of the corresponding layer in the encoder.
8. The short-term prediction method based on the simple convolutional recurrent neural network as claimed in claim 5, wherein the sampling of the historical radar echo images with the first continuous time step and the second continuous time step as the sliding window specifically comprises:
obtaining historical radar echo images of multiple heights at the same time in each time step of the first continuous time step, and obtaining historical radar echo images of multiple heights at the same time in each time step of the second continuous time step;
taking the historical radar echo images at a plurality of heights at the same moment as a plurality of channels of the same image;
and sampling an image area of the historical radar echo image in which the echo intensity is within a preset range.
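A minimal data-preparation sketch of the sampling described in this claim. It assumes NumPy; `load_echo_image` is a hypothetical loader returning one 2-D radar echo image for a given time index and height level, and the intensity thresholds are illustrative values, not taken from the patent.

```python
import numpy as np

def sliding_windows(n_frames, first_steps, second_steps):
    """Start indices of sliding windows covering first + second continuous time steps."""
    window = first_steps + second_steps
    return range(0, n_frames - window + 1)

def build_sample(start, first_steps, second_steps, heights,
                 load_echo_image, lo=10.0, hi=70.0):
    """Return (model input, live data) stacks with the heights as image channels."""
    def stack_window(t0, n_steps):
        frames = []
        for t in range(t0, t0 + n_steps):
            # multi-height images at the same moment become channels of one image
            frames.append(np.stack([load_echo_image(t, h) for h in heights], axis=0))
        return np.stack(frames, axis=0)          # (time, heights, H, W)

    model_input = stack_window(start, first_steps)
    live_data = stack_window(start + first_steps, second_steps)

    # keep only samples whose echo intensity falls within the preset range
    if not ((model_input >= lo) & (model_input <= hi)).any():
        return None
    return model_input, live_data
```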
9. A short-term prediction system based on a simple convolutional recurrent neural network, comprising a computer-readable storage medium storing a computer program and a processor, wherein the computer program, when read and executed by the processor, implements the short-term prediction method based on the simple convolutional recurrent neural network according to any one of claims 1 to 8.
10. The short-term prediction system based on simple convolutional recurrent neural network as claimed in claim 9, further comprising an encoding-decoding model, wherein the encoding-decoding model comprises an encoder with multiple layers of convolutional neural network memory units alternating with multiple layers of downsampling layers, and a decoder with multiple layers of convolutional neural network memory units alternating with multiple layers of upsampling layers, wherein the convolutional neural network memory unit of each layer of the encoder corresponds to the convolutional neural network memory unit of one layer of the decoder, and the convolutional neural network memory unit comprises:
a forget gate f_t: f_t = σ(W_f * x_t + b_f);
a reset gate r_t: r_t = σ(W_r * x_t + b_r);
a memory state c_t of the convolutional neural network memory unit, given by the formula of Figure FDA0002349739410000041 (formula image, not reproduced in this text);
an output state h_t of the convolutional neural network memory unit: h_t = r_t ⊙ g(c_t) + (1 - r_t) ⊙ x_t;
wherein b_f and b_r represent two biases, * represents convolution, ⊙ represents element-wise multiplication of corresponding matrix elements, σ and g represent activation functions, t represents the current time step, x_t represents the data received by the convolutional neural network memory unit (see Figure FDA0002349739410000042), and W, W_f, W_r respectively represent the three convolution kernels in the convolutional neural network memory unit.
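A minimal sketch of this memory unit, assuming PyTorch. The forget gate, reset gate and output state follow the formulas in the claim; the memory-state update c_t is only given as a figure, so the recurrence used here, c_t = f_t ⊙ c_{t-1} + (1 - f_t) ⊙ (W * x_t), is an assumption. The biases b_f and b_r are carried by the Conv2d layers.

```python
import torch
import torch.nn as nn

class SimpleConvRNNCell(nn.Module):
    """Convolutional memory unit with the three convolution kernels W, W_f, W_r."""
    def __init__(self, channels, kernel=3):
        super().__init__()
        pad = kernel // 2
        self.W = nn.Conv2d(channels, channels, kernel, padding=pad)    # candidate memory
        self.W_f = nn.Conv2d(channels, channels, kernel, padding=pad)  # forget gate (bias b_f)
        self.W_r = nn.Conv2d(channels, channels, kernel, padding=pad)  # reset gate (bias b_r)
        self.g = torch.tanh                                            # activation g

    def forward(self, x_t, c_prev=None):
        if c_prev is None:
            c_prev = torch.zeros_like(x_t)
        f_t = torch.sigmoid(self.W_f(x_t))                # f_t = sigma(W_f * x_t + b_f)
        r_t = torch.sigmoid(self.W_r(x_t))                # r_t = sigma(W_r * x_t + b_r)
        c_t = f_t * c_prev + (1 - f_t) * self.W(x_t)      # assumed memory-state update
        h_t = r_t * self.g(c_t) + (1 - r_t) * x_t         # output state per the claim
        return h_t, c_t                                   # output state image, memory state image
```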
11. A computer-readable storage medium, characterized in that it stores a computer program which, when read and executed by a processor, implements the method for short-term prediction based on simple convolutional recurrent neural networks as claimed in any one of claims 1 to 8.
CN201911410044.0A 2019-12-31 2019-12-31 Short-term prediction method and system based on simple convolution cyclic neural network Active CN111158068B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911410044.0A CN111158068B (en) 2019-12-31 2019-12-31 Short-term prediction method and system based on simple convolution cyclic neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911410044.0A CN111158068B (en) 2019-12-31 2019-12-31 Short-term prediction method and system based on simple convolution cyclic neural network

Publications (2)

Publication Number Publication Date
CN111158068A true CN111158068A (en) 2020-05-15
CN111158068B CN111158068B (en) 2022-09-23

Family

ID=70559916

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911410044.0A Active CN111158068B (en) 2019-12-31 2019-12-31 Short-term prediction method and system based on simple convolution cyclic neural network

Country Status (1)

Country Link
CN (1) CN111158068B (en)



Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180218254A1 (en) * 2017-02-02 2018-08-02 International Business Machines Corporation Solar power forecasting with volumetric convolutional neural network
CN107121679A (en) * 2017-06-08 2017-09-01 湖南师范大学 Recognition with Recurrent Neural Network predicted method and memory unit structure for Radar Echo Extrapolation
CN108508505A (en) * 2018-02-05 2018-09-07 南京云思创智信息科技有限公司 Heavy showers and thunderstorm forecasting procedure based on multiple dimensioned convolutional neural networks and system
CN109001736A (en) * 2018-06-12 2018-12-14 中国人民解放军国防科技大学 Radar echo extrapolation method based on deep space-time prediction neural network
CN109190752A (en) * 2018-07-27 2019-01-11 国家新闻出版广电总局广播科学研究院 The image, semantic dividing method of global characteristics and local feature based on deep learning
CN108732550A (en) * 2018-08-01 2018-11-02 北京百度网讯科技有限公司 Method and apparatus for predicting radar return
CN109948930A (en) * 2019-03-18 2019-06-28 北京泊远网络科技有限公司 Deep learning method and its application for driving training planning
CN110008953A (en) * 2019-03-29 2019-07-12 华南理工大学 Potential target Area generation method based on the fusion of convolutional neural networks multilayer feature
CN110210485A (en) * 2019-05-13 2019-09-06 常熟理工学院 The image, semantic dividing method of Fusion Features is instructed based on attention mechanism
CN110322009A (en) * 2019-07-19 2019-10-11 南京梅花软件系统股份有限公司 Image prediction method based on the long Memory Neural Networks in short-term of multilayer convolution

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111830595A (en) * 2020-06-09 2020-10-27 上海眼控科技股份有限公司 Meteorological element prediction method and equipment
CN111766641A (en) * 2020-09-01 2020-10-13 南京信大气象科学技术研究院有限公司 Strong convection weather identification method based on deep neural network
CN112698427A (en) * 2020-12-09 2021-04-23 最美天气(上海)科技有限公司 Short-term forecasting method and system based on space-time forecasting model
CN113657477A (en) * 2021-08-10 2021-11-16 南宁五加五科技有限公司 Method, device and system for forecasting short-term rainfall
CN113657477B (en) * 2021-08-10 2022-04-08 南宁五加五科技有限公司 Method, device and system for forecasting short-term rainfall
CN114325880A (en) * 2022-03-08 2022-04-12 浙江工业大学 Rainfall prediction method and device based on radar echo diagram
CN114460555A (en) * 2022-04-08 2022-05-10 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Radar echo extrapolation method and device and storage medium
CN115390164A (en) * 2022-10-27 2022-11-25 南京信息工程大学 Radar echo extrapolation forecasting method and system
CN117313823A (en) * 2023-11-28 2023-12-29 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Mixed distributed parallel training method and system for convolutional neural network
CN117313823B (en) * 2023-11-28 2024-04-12 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Mixed distributed parallel training method and system for convolutional neural network
CN117368881A (en) * 2023-12-08 2024-01-09 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Multi-source data fusion long-sequence radar image prediction method and system
CN117368881B (en) * 2023-12-08 2024-03-26 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Multi-source data fusion long-sequence radar image prediction method and system

Also Published As

Publication number Publication date
CN111158068B (en) 2022-09-23

Similar Documents

Publication Publication Date Title
CN111158068B (en) Short-term prediction method and system based on simple convolution cyclic neural network
WO2021093393A1 (en) Video compressed sensing and reconstruction method and apparatus based on deep neural network
CN108508505A (en) Heavy showers and thunderstorm forecasting procedure based on multiple dimensioned convolutional neural networks and system
CN113936142A (en) Rainfall approach forecasting method and device based on deep learning
CN115390164B (en) Radar echo extrapolation forecasting method and system
CN110765878A (en) Short-term rainfall prediction method
CN111708030B (en) Disaster weather forecast method based on energy generation antagonism predictor
CN112115911A (en) Light-weight SAR image target detection method based on deep learning
CN115113301B (en) Emergency short-term forecasting method and system based on multi-source data fusion
CN111428862B (en) Polar unbalanced space-time combined convection primary short-term prediction method
CN110619427A (en) Traffic index prediction method and device based on sequence-to-sequence learning model
CN116106988A (en) Weather prediction method and device, electronic equipment and storage medium
CN116415730A (en) Fusion self-attention mechanism time-space deep learning model for predicting water level
CN117665825B (en) Radar echo extrapolation prediction method, system and storage medium
CN112543339B (en) Video simulation method and device based on residual error reconstruction
CN115713044B (en) Method and device for analyzing residual life of electromechanical equipment under multi-condition switching
CN117371571A (en) Regional air quality prediction model based on multi-scale dynamic synchronous diagram mechanism
CN109993282B (en) Typhoon wave and range prediction method
CN111985731A (en) Method and system for predicting number of people at urban public transport station
CN115600101B (en) Priori knowledge-based unmanned aerial vehicle signal intelligent detection method and apparatus
CN116563103A (en) Remote sensing image space-time fusion method based on self-adaptive neural network
CN110852189A (en) Low-complexity dense crowd analysis method based on deep learning
CN116152206A (en) Photovoltaic output power prediction method, terminal equipment and storage medium
CN115168327A (en) Large-scale data space-time prediction method based on multilayer tree long-short term memory network
CN114627370A (en) Hyperspectral image classification method based on TRANSFORMER feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant