CN109523082B - CNN-LSTM flight normal clearance rate prediction method - Google Patents
- Publication number
- CN109523082B (application CN201811384267.XA)
- Authority
- CN
- China
- Prior art keywords
- data
- word
- time
- flight
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
Abstract
The invention discloses a CNN-LSTM-based method for predicting the flight normal clearance rate, which comprises the following steps: marking real-time flight status data and real-time weather data with time labels, combining them into formatted data, and taking the formatted data as real-time data; taking flight status data and weather data with the same time sequence as related words, and obtaining the word vector of each word by Word2vec training, wherein the flight status data and the weather data precede the real-time flight status data and the real-time weather data in time sequence; learning the word vectors with a CNN to obtain trained and optimized word vectors, wherein the word vectors after training optimization have richer semantics than the word vectors before training optimization; replacing the flight status data and the weather data with the trained and optimized word vectors and taking them as time-series training set data; training the time-series training set data with an LSTM to obtain the values of all parameters in the LSTM; and calculating the real-time data with the LSTM to obtain a predicted value of the flight normal clearance rate.
Description
Technical Field
The invention relates to the technical field of civil aviation service, and in particular to a CNN-LSTM-based method for predicting the flight normal clearance rate.
Background
Current airport visualization systems mainly use multimedia and multiple channels to support collaboration and communication among departments such as air traffic control, airports, airlines and resident units, integrating the data held by each unit and presenting it in chart form. However, the displayed data are raw data that have not been computed or analyzed, so users must analyze the data themselves after receiving them, which wastes effort.
CNN (Convolutional Neural Network) is a class of feedforward neural networks (Feedforward Neural Networks) that contain convolution or correlation calculations and have a deep structure, and is one of the representative algorithms of deep learning.
LSTM (Long Short-Term Memory) is a long short-term memory network, a type of recurrent neural network suitable for processing and predicting important events with relatively long intervals and delays in a time series.
With the growing demand for civil aviation, airports and their time-critical services are under increasing pressure. The large amounts of information data generated by airport operations are held by different airport departments, so data management is scattered. To better manage and analyze these data, CNN and LSTM are used to build a system that provides real-time special-situation awareness, full-element monitoring, live early warning, and comprehensive situation assessment and prediction for the airport.
Disclosure of Invention
Based on the above, the object of the invention is to provide a CNN-LSTM-based method for predicting the flight normal clearance rate.
A method for predicting a normal clearance rate of a flight based on CNN-LSTM comprises the following steps:
marking the real-time flight status data and the real-time weather data with time labels, combining the real-time flight status data and the real-time weather data into formatted data, and taking the formatted data as real-time data;
taking the flight status data and the weather data with the same time sequence as related words, and obtaining the word vector of each word by Word2vec training, wherein the flight status data and the weather data precede the real-time flight status data and the real-time weather data in time sequence;
learning word vectors by using CNN to obtain word vectors after training optimization, wherein the word vectors after training optimization have richer semantics than word vectors before training optimization;
replacing flight state data and weather data with the word vector after training optimization, and taking the word vector as time sequence training set data;
training the time sequence training set data by using the LSTM to obtain the values of all parameters in the LSTM;
and calculating the real-time data by using the LSTM to obtain a predicted value of the flight normal clearance rate.
Further preferably, the step of using the flight status data and the weather data in the same time sequence as related words includes: and sorting and formatting the flight status data and the weather data with the same time sequence.
Further preferably, the word vector of each word is obtained by Word2vec training, including:
calculating the code of each word using Huffman coding: the Huffman code of a word w is d^w = (d_2^w, d_3^w, …, d_{l_w}^w) with d_i^w ∈ {0, 1}, wherein w represents a word, p^w represents the path from the root node to the leaf node corresponding to w, l_w represents the number of nodes contained in the path p^w, and d_i^w represents the Huffman code corresponding to the i-th node of the path p^w;
the Huffman vectors of the word w are θ^w = (θ_1^w, …, θ_{l_w−1}^w), wherein θ_i^w represents the vector corresponding to the i-th non-leaf node of the path p^w;
according to logistic regression, for the i-th node (i = 2, …, l_w) of the word path p^w the probability of being classified into the positive class is q_i = σ(V_w^T θ_{i−1}^w) = 1/(1 + e^(−V_w^T θ_{i−1}^w)), and the probability of being classified into the negative class is 1 − q_i, wherein σ(·) represents the sigmoid function and V_w represents the word vector of the word;
in the parameter iteration process, the parameters are updated by gradient ascent with step size g_i = α(1 − d_i^w − q_i), wherein α is the learning rate;
the iterative difference accumulated for the vector of the word w is e := e + g_i·θ_{i−1}^w;
the Huffman vector of the word w rises through the gradient as θ_{i−1}^w := θ_{i−1}^w + g_i·V_w;
if the difference between θ^w and its value in the previous iteration is smaller than a threshold, the word vector has converged; otherwise, the method returns to the step-size calculation g_i = α(1 − d_i^w − q_i) and retrains;
when the model converges, the word vector V_w of the word w is calculated as V_w = (1/|Context(w)|) Σ_{u∈Context(w)} v(u), wherein Context(w) represents the words contained in the data together with w and |Context(w)| represents their total number.
Further preferably, the learning the word vector by using CNN to obtain the word vector after training optimization includes:
carrying out normalization processing on the word vectors;
setting the maximum number of iterations, an initialization weight and an acceptance accuracy threshold parameter;
the output of layer l in forward propagation in the CNN is a^l = σ(w^l a^{l−1} + b^l), wherein w^l represents the weight matrix mapping layer l−1 to layer l, b^l represents the bias, and σ(·) represents the sigmoid function, i.e. σ(z) = 1/(1 + e^(−z));
the back-propagation error of layer l in the CNN is δ^l = (w^{l+1})^T δ^{l+1} ⊙ σ′(z^l), wherein ⊙ represents the element-wise product between vectors and z^l = w^l a^{l−1} + b^l represents the input data of layer l;
updating w^l and b^l of layer l using the gradient descent method as w^l := w^l − α·δ^l(a^{l−1})^T and b^l := b^l − α·δ^l, wherein α represents the learning rate;
checking the change values of w and b; if the change values are smaller than the threshold for stopping iteration, jumping out of the loop, outputting w^l and b^l of each layer, and outputting the trained and optimized word vectors.
Further preferably, the training the time sequence training set data by using the LSTM to obtain values of each parameter in the LSTM includes:
setting initial values of weight, iteration times and network layer numbers;
the forget gate state value at time t in forward transmission is f_t = σ(W_f·[h_{t−1}, x_t] + b_f), wherein W_f represents the forget gate weight, h_{t−1} represents the output of the previous cell, x_t represents the input of the current cell, σ represents the sigmoid function, and b_f represents the bias;
the input gate state value at time t in forward transmission is i_t = σ(W_i·[h_{t−1}, x_t] + b_i), and the candidate cell state of the current input is C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), wherein tanh is the hyperbolic tangent function;
according to the output values of the forget gate and the input gate, the cell state is C_t = f_t*C_{t−1} + i_t*C̃_t;
the output gate state value is o_t = σ(W_o·[h_{t−1}, x_t] + b_o), and the hidden layer state is h_t = o_t*tanh(C_t);
calculating the error using back propagation;
when the error is smaller than the error threshold, outputting the gate weights W_f, W_i, W_C, W_o and the biases b_f, b_i, b_C, b_o.
Further preferably, the calculating the real-time data by using the LSTM to obtain a predicted value of a normal clearance rate of the flight includes:
substituting the output gate weights W_f, W_i, W_C, W_o and biases b_f, b_i, b_C, b_o into f_t = σ(W_f·[h_{t−1}, x_t] + b_f), i_t = σ(W_i·[h_{t−1}, x_t] + b_i), C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), C_t = f_t*C_{t−1} + i_t*C̃_t, o_t = σ(W_o·[h_{t−1}, x_t] + b_o) and h_t = o_t*tanh(C_t), and calculating the real-time data to obtain the predicted value of the flight normal clearance rate.
Compared with the prior art, the CNN-LSTM-based flight normal clearance rate prediction method provides a better way of computing with flight status time-series data, making the data analysis more accurate and yielding an accurate predicted value of the flight normal clearance rate. It thereby addresses the problems in the prior art that flight systems use a single data form and predict the flight normal clearance rate inaccurately.
For a better understanding and implementation, the present invention is described in detail below with reference to the drawings.
Drawings
FIG. 1 is an exemplary flow chart of the CNN-LSTM-based flight normal clearance rate prediction method of the present invention.
FIG. 2 is an exemplary flow chart of obtaining the word vector of each word by Word2vec training.
Fig. 3 is an exemplary flow diagram for learning word vectors using CNNs.
FIG. 4 is an exemplary flow diagram for training time series training set data using LSTM.
Detailed Description
The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of methods consistent with some aspects of the disclosure as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
Weather conditions are among the important factors affecting the normal clearance rate of airport flights, and the flight status is related to whether flights take off normally. Both change in real time and are correlated over time, so the flight normal clearance rate can be predicted from time-series weather data and flight status data.
Referring to FIG. 1, FIG. 1 is an exemplary flow chart of a method of the present invention for CNN-LSTM based flight normal clearance prediction. The invention discloses a method for predicting a normal clearance rate of a flight based on CNN-LSTM, which comprises the following steps:
step 101, marking time labels on real-time flight status data and real-time weather data, combining the real-time flight status data and the real-time weather data into formatted data, and taking the formatted data as real-time data;
step 102, taking the flight status data and the weather data with the same time sequence as related words, and obtaining the word vector of each word by Word2vec training, wherein the flight status data and the weather data precede the real-time flight status data and the real-time weather data in time sequence;
step 103, learning word vectors by using CNN to obtain word vectors after training optimization, wherein the word vectors after training optimization are richer in semantics than word vectors before training optimization;
step 104, replacing flight state data and weather data with the word vector after training optimization, and taking the word vector as time sequence training set data;
step 105, training the time sequence training set data by using the LSTM to obtain the values of all parameters in the LSTM;
step 106, calculating the real-time data by using the LSTM to obtain a predicted value of the flight normal clearance rate.
Preferably, in the step 102, the step of using the flight status data and the weather data with the same time sequence as related words includes: and sorting and formatting the flight status data and the weather data with the same time sequence.
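As an illustration of steps 101 and 102, the following Python sketch shows one possible way to tag flight status and weather records with time labels and merge them into formatted "related word" records. The column names, the 5-minute time grid and the field choices are assumptions made for the example, not details recited in the patent.

```python
# Illustrative sketch only: time-tag and merge flight status and weather
# records into formatted "related word" sequences (steps 101/102).
# Column names such as "status", "wind", "visibility" are assumptions.
import pandas as pd

def build_timestep_words(flight_csv: str, weather_csv: str) -> pd.DataFrame:
    flights = pd.read_csv(flight_csv, parse_dates=["time"])
    weather = pd.read_csv(weather_csv, parse_dates=["time"])

    # Align both sources on a common time label (here a 5-minute grid).
    flights["time_label"] = flights["time"].dt.floor("5min")
    weather["time_label"] = weather["time"].dt.floor("5min")

    merged = pd.merge(flights, weather, on="time_label", how="inner")

    # Treat the categorical fields observed at the same time step as one
    # "sentence" of related words for later Word2vec training.
    merged["words"] = (
        merged["status"].astype(str) + " "
        + merged["wind"].astype(str) + " "
        + merged["visibility"].astype(str)
    )
    return merged[["time_label", "words"]].sort_values("time_label")
```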
Referring to fig. 2, fig. 2 is an exemplary flow chart of obtaining the word vector of each word by Word2vec training.
In step 102, obtaining the word vector of each word by Word2vec training includes:
calculating the code of each word using Huffman coding: the Huffman code of a word w is d^w = (d_2^w, d_3^w, …, d_{l_w}^w) with d_i^w ∈ {0, 1}, wherein w represents a word, p^w represents the path from the root node to the leaf node corresponding to w, l_w represents the number of nodes contained in the path p^w, and d_i^w represents the Huffman code corresponding to the i-th node of the path p^w;
the Huffman vectors of the word w are θ^w = (θ_1^w, …, θ_{l_w−1}^w), wherein θ_i^w represents the vector corresponding to the i-th non-leaf node of the path p^w;
according to logistic regression, for the i-th node (i = 2, …, l_w) of the word path p^w the probability of being classified into the positive class is q_i = σ(V_w^T θ_{i−1}^w) = 1/(1 + e^(−V_w^T θ_{i−1}^w)), and the probability of being classified into the negative class is 1 − q_i, wherein σ(·) represents the sigmoid function and V_w represents the word vector of the word;
in the parameter iteration process, the parameters are updated by gradient ascent with step size g_i = α(1 − d_i^w − q_i), wherein α is the learning rate;
the iterative difference accumulated for the vector of the word w is e := e + g_i·θ_{i−1}^w;
the Huffman vector of the word w rises through the gradient as θ_{i−1}^w := θ_{i−1}^w + g_i·V_w;
if the difference between θ^w and its value in the previous iteration is smaller than a threshold, the word vector has converged; otherwise, the method returns to the step-size calculation g_i = α(1 − d_i^w − q_i) and retrains;
when the model converges, the word vector V_w of the word w is calculated as V_w = (1/|Context(w)|) Σ_{u∈Context(w)} v(u), wherein Context(w) represents the words contained in the data together with w and |Context(w)| represents their total number.
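A minimal sketch of this step using the gensim library is shown below. gensim's CBOW mode with hierarchical softmax trains word vectors over a Huffman tree in the spirit of the formulas above; the vector size, window, epoch count and example tokens are illustrative assumptions.

```python
# Minimal sketch of obtaining word vectors for the time-aligned flight/weather
# "words" with Word2vec; hyperparameter values are assumptions.
from gensim.models import Word2Vec

# Each inner list is the set of related words observed at one time step,
# e.g. the "words" column produced by build_timestep_words().
sentences = [
    ["delayed", "heavy_rain", "low_visibility"],
    ["on_time", "clear", "calm_wind"],
]

model = Word2Vec(
    sentences,
    vector_size=64,   # dimension of each word vector
    window=5,
    min_count=1,
    sg=0,             # CBOW: predict a word from its context
    hs=1,             # hierarchical softmax over a Huffman tree
    negative=0,       # disable negative sampling
    epochs=50,
)

vec = model.wv["delayed"]   # 64-dimensional word vector for one token
```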
The word vectors obtained in step 102 are then trained with the CNN so that the hidden features of the data become more distinguishable and the differences between data samples are enlarged.
Referring to fig. 3, fig. 3 is an exemplary flow diagram for learning word vectors using CNNs. In step 103, learning the word vector by using CNN to obtain the word vector after training optimization includes:
carrying out normalization processing on the word vectors;
setting the maximum number of iterations, an initialization weight and an acceptance accuracy threshold parameter;
the output of layer l in forward propagation in the CNN is a^l = σ(w^l a^{l−1} + b^l), wherein w^l represents the weight matrix mapping layer l−1 to layer l, b^l represents the bias, and σ(·) represents the sigmoid function, i.e. σ(z) = 1/(1 + e^(−z));
the back-propagation error of layer l in the CNN is δ^l = (w^{l+1})^T δ^{l+1} ⊙ σ′(z^l), wherein ⊙ represents the element-wise product between vectors and z^l = w^l a^{l−1} + b^l represents the input data of layer l;
updating w^l and b^l of layer l using the gradient descent method as w^l := w^l − α·δ^l(a^{l−1})^T and b^l := b^l − α·δ^l, wherein α represents the learning rate;
checking the change values of w and b; if the change values are smaller than the threshold for stopping iteration, jumping out of the loop, outputting w^l and b^l of each layer, and outputting the trained and optimized word vectors.
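The following numpy sketch illustrates the recited forward pass, back propagation and gradient descent updates with a change-of-weights stopping threshold. It uses plain fully connected layers for brevity, and training the refined vectors to reconstruct the normalized input is an assumption made only to keep the example self-contained.

```python
# Sketch of the layer-wise updates above for one pair of fully connected
# layers; the reconstruction objective and layer sizes are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def refine_vectors(X, hidden=32, alpha=0.5, max_iter=5000, tol=1e-4, seed=0):
    """X: (n_words, dim) matrix of normalized word vectors in [0, 1]."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w1 = rng.normal(scale=0.1, size=(hidden, dim)); b1 = np.zeros((hidden, 1))
    w2 = rng.normal(scale=0.1, size=(dim, hidden)); b2 = np.zeros((dim, 1))

    A0 = X.T                                   # layer-0 activations, (dim, n)
    for _ in range(max_iter):
        # forward propagation: a^l = sigma(w^l a^{l-1} + b^l)
        Z1 = w1 @ A0 + b1; A1 = sigmoid(Z1)
        Z2 = w2 @ A1 + b2; A2 = sigmoid(Z2)

        # back propagation: delta^l = (w^{l+1})^T delta^{l+1} * sigma'(z^l)
        D2 = (A2 - A0) * A2 * (1 - A2)
        D1 = (w2.T @ D2) * A1 * (1 - A1)

        # gradient descent updates of w^l and b^l
        dw2 = D2 @ A1.T / n; db2 = D2.mean(axis=1, keepdims=True)
        dw1 = D1 @ A0.T / n; db1 = D1.mean(axis=1, keepdims=True)
        w2 -= alpha * dw2; b2 -= alpha * db2
        w1 -= alpha * dw1; b1 -= alpha * db1

        # stop when the change of the weights falls below the threshold
        if max(np.abs(dw1).max(), np.abs(dw2).max()) * alpha < tol:
            break

    return A1.T    # hidden-layer activations used as the refined word vectors
```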
Referring to fig. 4, fig. 4 is an exemplary flow diagram for training time series training set data using LSTM. In step 105, training the time sequence training set data by using the LSTM to obtain values of each parameter in the LSTM, including:
setting initial values of weight, iteration times and network layer numbers;
the forget gate state value at time t in forward transmission is f_t = σ(W_f·[h_{t−1}, x_t] + b_f), wherein W_f represents the forget gate weight, h_{t−1} represents the output of the previous cell, x_t represents the input of the current cell, σ represents the sigmoid function, and b_f represents the bias;
the input gate state value at time t in forward transmission is i_t = σ(W_i·[h_{t−1}, x_t] + b_i), and the candidate cell state of the current input is C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), wherein tanh is the hyperbolic tangent function;
according to the output values of the forget gate and the input gate, the cell state is C_t = f_t*C_{t−1} + i_t*C̃_t;
the output gate state value is o_t = σ(W_o·[h_{t−1}, x_t] + b_o), and the hidden layer state is h_t = o_t*tanh(C_t);
calculating the error using back propagation;
when the error is smaller than the error threshold, outputting the gate weights W_f, W_i, W_C, W_o and the biases b_f, b_i, b_C, b_o.
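The gate equations above can be illustrated with a single numpy LSTM cell step as follows. The weight shapes and random initialization are placeholders: in the method itself these values are the parameters learned from the time-series training set.

```python
# Minimal sketch of one LSTM cell step implementing the gate equations above.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, params):
    """One forward step: x_t has shape (input_dim,), h_prev/c_prev (hidden_dim,)."""
    z = np.concatenate([h_prev, x_t])          # [h_{t-1}, x_t]

    f_t = sigmoid(params["W_f"] @ z + params["b_f"])        # forget gate
    i_t = sigmoid(params["W_i"] @ z + params["b_i"])        # input gate
    c_hat = np.tanh(params["W_c"] @ z + params["b_c"])      # candidate state
    c_t = f_t * c_prev + i_t * c_hat                        # cell state
    o_t = sigmoid(params["W_o"] @ z + params["b_o"])        # output gate
    h_t = o_t * np.tanh(c_t)                                # hidden state
    return h_t, c_t

# Example with hidden size 8 and input size 4; real values come from training.
rng = np.random.default_rng(0)
params = {name: rng.normal(scale=0.1, size=(8, 12))
          for name in ("W_f", "W_i", "W_c", "W_o")}
params.update({name: np.zeros(8) for name in ("b_f", "b_i", "b_c", "b_o")})
h, c = np.zeros(8), np.zeros(8)
h, c = lstm_step(rng.normal(size=4), h, c, params)
```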
In step 106, calculating the real-time data by using the LSTM to obtain a predicted value of the flight normal clearance rate includes:
substituting the output gate weights W_f, W_i, W_C, W_o and biases b_f, b_i, b_C, b_o into f_t = σ(W_f·[h_{t−1}, x_t] + b_f), i_t = σ(W_i·[h_{t−1}, x_t] + b_i), C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), C_t = f_t*C_{t−1} + i_t*C̃_t, o_t = σ(W_o·[h_{t−1}, x_t] + b_o) and h_t = o_t*tanh(C_t), and calculating the real-time data to obtain the predicted value of the flight normal clearance rate.
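As a sketch of step 106, the snippet below assumes the trained LSTM is wrapped in a small Keras model whose final sigmoid layer maps the last hidden state to a clearance-rate value in [0, 1]. The window length, feature dimension and layer sizes are illustrative choices rather than values from the patent.

```python
# Sketch of step 106 under the assumption of a Keras LSTM + sigmoid head;
# in practice the layer weights come from the training in step 105.
import numpy as np
from tensorflow import keras

SEQ_LEN, FEAT_DIM = 24, 64          # time steps per sample, word-vector size

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(SEQ_LEN, FEAT_DIM)),   # recurrent layer
    keras.layers.Dense(1, activation="sigmoid"),               # clearance rate in [0, 1]
])
model.compile(optimizer="adam", loss="mse")

# real_time_batch: the latest formatted real-time data, already replaced by
# its trained word vectors and stacked into shape (1, SEQ_LEN, FEAT_DIM).
real_time_batch = np.zeros((1, SEQ_LEN, FEAT_DIM), dtype="float32")
predicted_clearance_rate = float(model.predict(real_time_batch)[0, 0])
```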
The CNN-LSTM-based flight normal clearance rate prediction method predicts the flight normal clearance rate from terminal flight dynamic data and weather condition data. It improves on traditional prediction methods by using CNN-LSTM to analyze time-series data, which greatly improves prediction accuracy while avoiding prediction lag.
Because time series are used, the method handles the airport's real-time data streams well, and predicting the airport flight normal clearance rate improves the efficiency of airport operation scheduling. In addition, the method also provides a reference approach for analyzing other time-series status data of the airport.
Compared with the prior art, the CNN-LSTM-based flight normal clearance rate prediction method provides a better way of computing with flight status time-series data, making the data analysis more accurate and yielding an accurate predicted value of the flight normal clearance rate. It thereby addresses the problems in the prior art that flight systems use a single data form and predict the flight normal clearance rate inaccurately.
The above examples illustrate only a few embodiments of the invention, which are described in detail and are not to be construed as limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention.
Claims (4)
1. A CNN-LSTM-based method for predicting a flight normal clearance rate, comprising:
marking the real-time flight status data and the real-time weather data with time labels, combining the real-time flight status data and the real-time weather data into formatted data, and taking the formatted data as real-time data;
taking the flight status data and the weather data with the same time sequence as related words, and obtaining the word vector of each word by Word2vec training, wherein the flight status data and the weather data precede the real-time flight status data and the real-time weather data in time sequence;
learning word vectors by using CNN to obtain word vectors after training optimization, wherein the word vectors after training optimization have richer semantics than word vectors before training optimization;
replacing flight state data and weather data with the word vector after training optimization, and taking the word vector as time sequence training set data;
training the time sequence training set data by using the LSTM to obtain the values of all parameters in the LSTM;
calculating the real-time data by using the LSTM to obtain a predicted value of the flight normal clearance rate;
the method for using the flight status data and the weather data with the same time sequence as related words comprises the following steps: the flight status data and the weather data with the same time sequence are arranged and formatted;
the word vector of each word is obtained by Word2vec training, which comprises the following steps:
calculating the code of each word using Huffman coding: the Huffman code of a word w is d^w = (d_2^w, d_3^w, …, d_{l_w}^w) with d_i^w ∈ {0, 1}, wherein w represents a word, p^w represents the path from the root node to the leaf node corresponding to w, l_w represents the number of nodes contained in the path p^w, and d_i^w represents the Huffman code corresponding to the i-th node of the path p^w;
the Huffman vectors of the word w are θ^w = (θ_1^w, …, θ_{l_w−1}^w), wherein θ_i^w represents the vector corresponding to the i-th non-leaf node of the path p^w;
according to logistic regression, for the i-th node (i = 2, …, l_w) of the word path p^w the probability of being classified into the positive class is q_i = σ(V_w^T θ_{i−1}^w) = 1/(1 + e^(−V_w^T θ_{i−1}^w)), and the probability of being classified into the negative class is 1 − q_i, wherein σ(·) represents the sigmoid function and V_w represents the word vector of the word;
in the parameter iteration process, the parameters are updated by gradient ascent with step size g_i = α(1 − d_i^w − q_i), wherein α is the learning rate;
the iterative difference accumulated for the vector of the word w is e := e + g_i·θ_{i−1}^w;
the Huffman vector of the word w rises through the gradient as θ_{i−1}^w := θ_{i−1}^w + g_i·V_w;
if the difference between θ^w and its value in the previous iteration is smaller than a threshold, the word vector has converged; otherwise, the method returns to the step-size calculation g_i = α(1 − d_i^w − q_i) and retrains;
when the model converges, the word vector V_w of the word w is calculated as V_w = (1/|Context(w)|) Σ_{u∈Context(w)} v(u), wherein Context(w) represents the words contained in the data together with w and |Context(w)| represents their total number.
2. The method for predicting the normal clearance rate of a CNN-LSTM based flight of claim 1, wherein the learning of the word vector with the CNN to obtain the trained and optimized word vector includes:
carrying out normalization processing on the word vectors;
setting the maximum number of iterations, an initialization weight and an acceptance accuracy threshold parameter;
the output of layer l in forward propagation in the CNN is a^l = σ(w^l a^{l−1} + b^l), wherein w^l represents the weight matrix mapping layer l−1 to layer l, b^l represents the bias, and σ(·) represents the sigmoid function, i.e. σ(z) = 1/(1 + e^(−z));
the back-propagation error of layer l in the CNN is δ^l = (w^{l+1})^T δ^{l+1} ⊙ σ′(z^l), wherein ⊙ represents the element-wise product between vectors and z^l = w^l a^{l−1} + b^l represents the input data of layer l;
updating w^l and b^l of layer l using the gradient descent method as w^l := w^l − α·δ^l(a^{l−1})^T and b^l := b^l − α·δ^l, wherein α represents the learning rate;
checking the change values of w and b; if the change values are smaller than the threshold for stopping iteration, jumping out of the loop, outputting w^l and b^l of each layer, and outputting the trained and optimized word vectors.
3. The method for predicting the normal clearance rate of a CNN-LSTM based flight according to claim 2, wherein training the time series training set data by using the LSTM to obtain the values of each parameter in the LSTM includes:
setting initial values of weight, iteration times and network layer numbers;
the forget gate state value at time t in forward transmission is f_t = σ(W_f·[h_{t−1}, x_t] + b_f), wherein W_f represents the forget gate weight, h_{t−1} represents the output of the previous cell, x_t represents the input of the current cell, σ represents the sigmoid function, and b_f represents the bias;
the input gate state value at time t in forward transmission is i_t = σ(W_i·[h_{t−1}, x_t] + b_i), and the candidate cell state of the current input is C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), wherein tanh is the hyperbolic tangent function;
according to the output values of the forget gate and the input gate, the cell state is C_t = f_t*C_{t−1} + i_t*C̃_t;
the output gate state value is o_t = σ(W_o·[h_{t−1}, x_t] + b_o), and the hidden layer state is h_t = o_t*tanh(C_t);
calculating the error using back propagation;
when the error is smaller than the error threshold, outputting the gate weights W_f, W_i, W_C, W_o and the biases b_f, b_i, b_C, b_o.
4. The method for predicting the normal clearance rate of a flight based on CNN-LSTM according to claim 3, wherein the calculating the real-time data using LSTM to obtain the predicted value of the normal clearance rate of the flight comprises:
substituting the output gate weights W_f, W_i, W_C, W_o and biases b_f, b_i, b_C, b_o into f_t = σ(W_f·[h_{t−1}, x_t] + b_f), i_t = σ(W_i·[h_{t−1}, x_t] + b_i), C̃_t = tanh(W_C·[h_{t−1}, x_t] + b_C), C_t = f_t*C_{t−1} + i_t*C̃_t, o_t = σ(W_o·[h_{t−1}, x_t] + b_o) and h_t = o_t*tanh(C_t), and calculating the real-time data to obtain the predicted value of the flight normal clearance rate.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811384267.XA CN109523082B (en) | 2018-11-20 | 2018-11-20 | CNN-LSTM flight normal clearance rate prediction method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811384267.XA CN109523082B (en) | 2018-11-20 | 2018-11-20 | CNN-LSTM flight normal clearance rate prediction method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109523082A CN109523082A (en) | 2019-03-26 |
CN109523082B true CN109523082B (en) | 2023-12-22 |
Family
ID=65776777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811384267.XA Active CN109523082B (en) | 2018-11-20 | 2018-11-20 | CNN-LSTM flight normal clearance rate prediction method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109523082B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111027767B (en) * | 2019-12-09 | 2023-04-07 | 中国民航大学 | Airport group delay prediction method based on Skip-LSTM network |
CN111047917B (en) * | 2019-12-18 | 2021-01-15 | 四川大学 | Flight landing scheduling method based on improved DQN algorithm |
CN112330114B (en) * | 2020-10-27 | 2024-09-24 | 南京航空航天大学 | Aircraft hazard identification method based on mixed deep neural network |
CN112132366A (en) * | 2020-11-30 | 2020-12-25 | 中航信移动科技有限公司 | Prediction system for flight clearance rate |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10825072B2 (en) * | 2016-09-14 | 2020-11-03 | Microsoft Technology Licensing, Llc | System for producing recommendations and predicting purchases of products based on usage patterns |
- 2018-11-20: application CN201811384267.XA filed in China; granted as patent CN109523082B (active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10002322B1 (en) * | 2017-04-06 | 2018-06-19 | The Boston Consulting Group, Inc. | Systems and methods for predicting transactions |
CN107274383A (en) * | 2017-05-17 | 2017-10-20 | 南京邮电大学 | A kind of haze visibility detecting method based on deep learning |
CN107291693A (en) * | 2017-06-15 | 2017-10-24 | 广州赫炎大数据科技有限公司 | A kind of semantic computation method for improving term vector model |
CN107544957A (en) * | 2017-07-05 | 2018-01-05 | 华北电力大学 | A kind of Sentiment orientation analysis method of business product target word |
CN107967532A (en) * | 2017-10-30 | 2018-04-27 | 厦门大学 | The Forecast of Urban Traffic Flow Forecasting Methodology of integration region vigor |
CN108564227A (en) * | 2018-04-26 | 2018-09-21 | 重庆大学 | A kind of track traffic for passenger flow amount prediction technique based on space-time characteristic |
Non-Patent Citations (1)
Title |
---|
Parallelized flight delay prediction model fusing meteorological data; Wu Renbiao et al.; Signal Processing; 2018-05-31; Vol. 34, No. 05; Sections 2-3 *
Also Published As
Publication number | Publication date |
---|---|
CN109523082A (en) | 2019-03-26 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| CB02 | Change of applicant information | Applicant after: Guangdong Airport Baiyun Information Technology Co.,Ltd.; Applicant before: GUANGDONG AIRPORT BAIYUN INFORMATION TECHNOLOGY CO.,LTD.; Address (before and after): 510000 North Building of secondary company business building, block A4, New Baiyun International Airport, Baiyun District, Guangzhou City, Guangdong Province
| GR01 | Patent grant |