CN112270349B - Individual position prediction method based on GCN-LSTM - Google Patents

Individual position prediction method based on GCN-LSTM

Info

Publication number
CN112270349B
CN112270349B (application CN202011142144.2A)
Authority
CN
China
Prior art keywords
similarity
user
time
users
lstm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011142144.2A
Other languages
Chinese (zh)
Other versions
CN112270349A (en)
Inventor
赵志远
张宇
吴升
李代超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN202011142144.2A priority Critical patent/CN112270349B/en
Publication of CN112270349A publication Critical patent/CN112270349A/en
Application granted granted Critical
Publication of CN112270349B publication Critical patent/CN112270349B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to an individual position prediction method based on GCN-LSTM, which comprises the following steps: step S1, collecting track data of users; step S2, measuring the similarity of the user tracks; step S3, extracting the similarity features of the users with a graph convolution network according to the obtained track similarities; step S4, constructing an improved GCN-LSTM model; and step S5, extracting the temporal features of the user track with the improved GCN-LSTM model, based on the similarity features, to obtain the prediction result. The method takes the similarity of user tracks into account, models the track similarity features with a graph convolution model, effectively extracts the similarity features among users, and makes better use of user similarity to improve the accuracy of individual position prediction.

Description

Individual position prediction method based on GCN-LSTM
Technical Field
The invention belongs to the technical field of spatial information, and particularly relates to an individual position prediction method based on GCN-LSTM.
Background
China is currently undergoing rapid urbanization. The large population concentrating in cities places higher demands on urban resource allocation, and a comparatively lagging level of urban development brings a series of urban problems (public safety events such as traffic congestion and crowd stampedes). Human movement patterns are closely related to the distribution of urban resources; understanding these movement patterns and predicting the future activity locations of individuals can support the rational allocation of urban resources and thus a more scientific response to urban problems.
Disclosure of Invention
In view of the above, the present invention is directed to providing an individual location prediction method based on GCN-LSTM, which can effectively improve the accuracy of individual location prediction.
In order to achieve the purpose, the invention adopts the following technical scheme:
a GCN-LSTM-based individual position prediction method comprises the following steps:
step S1: collecting track data of a user;
s2, measuring the similarity of user tracks;
step S3, extracting the similarity characteristics of the users by using a graph convolution network according to the obtained similarity of the user tracks;
s4, constructing an improved GCN-LSTM model;
and S5, extracting the time characteristics of the user track by adopting an improved GCN-LSTM model based on the similarity characteristics to obtain a prediction result.
Further, the step S2 specifically includes:
s21, preprocessing the trajectory data of the user to remove abnormal values and missing values;
s22, setting the sizes of the grid units in the vertical direction and the horizontal direction, respectively starting from the left side boundary and the lower side boundary of the research region, carrying out grid division on the research region rightward and upward, and coding grids;
step S23, calculating corresponding grid cells according to the position information of the user, replacing the position coordinate sequence with grid numbers, and converting the original user track into a grid track;
and S24, counting the number of time steps at which the two users are in the same grid cell, and taking the ratio of this count to the total number of time steps as the similarity between the two users.
Further, step S24 adopts a similarity measure based on track points, computed as follows:

dist(r_t, s_t) = \begin{cases} 1, & r_t = s_t \\ 0, & r_t \neq s_t \end{cases}    (1)

Eu(R, S) = \sum_{t=1}^{n} dist(r_t, s_t)    (2)

Sim(R, S) = \frac{Eu(R, S)}{n}    (3)

where R and S respectively denote the grid tracks of the two users, r_t and s_t the grid position codes at time t, and n the number of recorded points; dist(r_t, s_t) is 1 if the two users are in the same grid cell at time t and 0 otherwise; Eu(R, S) is the total number of time steps at which the two users occupy the same grid cell, and Sim(R, S) denotes the similarity measure of the two users.
Further, the step S3 specifically includes:
s31, constructing a similarity graph matrix by utilizing the similarity among the users;
and S32, according to the obtained similarity graph matrix, modeling the similarity features with a graph convolution network to extract the similarity features of the users.
Further, step S31 specifically comprises: screening out the users whose similarity with the user to be predicted exceeds a set threshold δ, computing the pairwise similarities among these users, and thereby constructing the similarity graph matrix

A = \begin{bmatrix} sim_{1,1} & \cdots & sim_{1,v} \\ \vdots & \ddots & \vdots \\ sim_{v,1} & \cdots & sim_{v,v} \end{bmatrix}    (4)

where sim_{r,s} denotes the similarity between user r and user s.
Further, the graph convolution model is specifically:

X' = Relu(\hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} X W)    (5)

\hat{A} = A + I, \quad \hat{D}_{ii} = \sum_{j} \hat{A}_{ij}    (6)

Relu(x) = max(0, x)    (7)

where X denotes the matrix formed by the positions, at the current time, of the user to be predicted and of the users whose similarity exceeds the threshold; A is the similarity graph matrix of the users; \hat{D} is the degree matrix; Relu is the activation function; max is the maximum function; and W is the weight matrix.
Further, the improved GCN-LSTM model is specifically: a threshold α is set; based on this threshold, if, for the user to be predicted, a user whose similarity exceeds the threshold can be found in the data set, GCN-LSTM is used to extract the temporal features of the track; otherwise, LSTM is used to extract the temporal features of the track.
Further, extracting the temporal features of the track with LSTM specifically comprises:
(a) Input the user position x_t at time t; use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the forget gate f_t:

f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)    (8)

\sigma(x) = \frac{1}{1 + e^{-x}}    (9)

where h_{t-1} denotes the output at time t-1, x_t the position information of the user to be predicted at time t, and f_t the forget gate at time t; W_f is the weight matrix of the input layer and b_f the bias term of the input layer, both obtaining optimal values through model training; \sigma is the sigmoid function.
(b) Use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the input gate i_t, and use h_{t-1} and x_t to generate the candidate vector \tilde{C}_t:

i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)    (10)

\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)    (11)

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}    (12)

where W_i and W_C denote the weight matrices of the input layer and the state-update layer respectively, b_i and b_C the corresponding bias terms, and tanh is the activation function.
(c) Update the cell state, i.e. update C_{t-1} to C_t. Multiply the forget-gate value f_t with the old cell state C_{t-1}, which stores the historical position information, forgetting part of the history; multiply the input-gate value i_t with the candidate vector \tilde{C}_t, retaining part of the position information of the current time; finally add the two results to determine the new cell state:

C_t = f_t * C_{t-1} + i_t * \tilde{C}_t    (13)

(d) Use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the output gate o_t, then process the cell state C_t with the tanh function and multiply the result by the output-gate value o_t to obtain the output value:

o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)    (14)

h_t = o_t * \tanh(C_t)    (15)

where W_o and b_o are the weight matrix and bias term of the output layer respectively, obtaining optimal values through model training.
Further, extracting the temporal features of the track with GCN-LSTM specifically comprises:
(a) Input the position information X_t ∈ R^v of the user to be predicted and of the users having similarity with it,
where v denotes the total number of the user to be predicted and its similar users;
(b) Extract the similarity features of the users through the GCN model, obtaining a matrix X'_t ∈ R^v;
(c) Take from the matrix X'_t the value x'_t of the user to be predicted as the input of the LSTM model, then feed the hidden layer h_{t-1} at time t-1 and x'_t into the LSTM model to extract the temporal features, finally obtaining the prediction result:

x'_t \leftarrow X'_t = f(A, X_t)    (16)

y_t = LSTM(x'_t)    (17)
Compared with the prior art, the invention has the following beneficial effects:
1. The method takes the similarity of user tracks into account, models the track similarity features with a graph convolution model, effectively extracts the similarity features among users, and makes better use of user similarity to improve the accuracy of individual position prediction;
2. The method sets a threshold when constructing the user similarity feature matrix and retains only similarity values larger than the threshold, effectively reducing the computational load of the model;
3. The invention provides a method for determining the similarity threshold at which user similarity helps individual position prediction, and improves the GCN-LSTM model on this basis, effectively increasing the accuracy.
Drawings
FIG. 1 is a schematic diagram of individual location prediction in accordance with an embodiment of the present invention;
FIG. 2 is a schematic diagram of the method of the present invention;
FIG. 3 is a user trajectory projection in accordance with an embodiment of the present invention;
FIG. 4 is a diagram illustrating the process of extracting user similarity with the graph convolution model according to an embodiment of the present invention;
FIG. 5 is a diagram of the long short-term memory (LSTM) model according to an embodiment of the present invention;
FIG. 6 illustrates the GCN-LSTM model according to an embodiment of the present invention;
FIG. 7 is the improved GCN-LSTM model in accordance with an embodiment of the present invention.
Detailed Description
The invention is further explained by the following embodiments in conjunction with the drawings.
Referring to fig. 2, the present invention provides a GCN-LSTM-based individual location prediction method, which includes the following steps:
step S1: collecting track data of a user;
s2, measuring the similarity of user tracks;
step S3, extracting the similarity characteristics of the users by using a graph convolution network according to the obtained similarity of the user tracks;
s4, constructing an improved GCN-LSTM model;
and S5, extracting the time characteristics of the user track by adopting an improved GCN-LSTM model based on the similarity characteristics to obtain a prediction result.
Referring to fig. 3, in this embodiment, the step S2 is specifically:
s21, preprocessing the trajectory data of the user to remove abnormal values and missing values;
s22, setting the sizes of the grid units in the vertical direction and the horizontal direction, respectively starting from the left side boundary and the lower side boundary of the research area, carrying out grid division on the research area rightward and upward, and coding grids;
step S23, calculating corresponding grid cells according to the position information of the user, replacing the position coordinate sequence with grid numbers, and converting the original user track into a grid track;
and S24, counting the number of time steps at which the two users are in the same grid cell, and taking the ratio of this count to the total number of time steps as the similarity between the two users.
Preferably, a similarity measure based on track points is adopted, computed as follows:

dist(r_t, s_t) = \begin{cases} 1, & r_t = s_t \\ 0, & r_t \neq s_t \end{cases}    (1)

Eu(R, S) = \sum_{t=1}^{n} dist(r_t, s_t)    (2)

Sim(R, S) = \frac{Eu(R, S)}{n}    (3)

where R and S respectively denote the grid tracks of the two users, r_t and s_t the grid position codes at time t, and n the number of recorded points; dist(r_t, s_t) is 1 if the two users are in the same grid cell at time t and 0 otherwise; Eu(R, S) is the total number of time steps at which the two users occupy the same grid cell, and Sim(R, S) denotes the similarity measure of the two users.
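The gridding of steps S22-S23 and the track-point similarity of formulas (1)-(3) can be sketched in Python as follows; the grid size, study-region origin, and sample coordinates are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch: grid two trajectories, then take the fraction of
# time steps spent in the same cell as their similarity (Eqs. (1)-(3)).

def to_grid(track, x0, y0, cell):
    """Convert (x, y) points to grid codes, counting cells rightward and
    upward from the lower-left corner of the study region (steps S22-S23)."""
    return [(int((x - x0) // cell), int((y - y0) // cell)) for x, y in track]

def similarity(R, S):
    """Sim(R, S) = Eu(R, S) / n, where Eu is the number of time steps at
    which both users occupy the same grid cell (Eqs. (1)-(3))."""
    assert len(R) == len(S)
    n = len(R)
    eu = sum(1 for r_t, s_t in zip(R, S) if r_t == s_t)  # dist(r_t, s_t) summed
    return eu / n

# Two users sampled at the same 4 time steps (made-up coordinates)
u1 = [(0.5, 0.5), (1.5, 0.5), (2.5, 1.5), (2.6, 1.6)]
u2 = [(0.6, 0.4), (1.4, 1.6), (2.4, 1.4), (2.7, 1.7)]
R, S = to_grid(u1, 0, 0, 1.0), to_grid(u2, 0, 0, 1.0)
print(similarity(R, S))  # 3 of 4 steps share a cell -> 0.75
```

The grid cell size (1.0 here) trades spatial resolution against sparsity; the patent leaves it as a parameter of step S22.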
In this embodiment, the similarity between users is modeled as a graph structure, constructing a similarity graph G_S = (V, A) between users, where V denotes the set of users and A ∈ R^{V×V} the graph matrix; the specific steps are as follows:
Step S31: screen out the users whose similarity with the user to be predicted exceeds a set threshold δ, compute the pairwise similarities among these users, and thereby construct the similarity graph matrix

A = \begin{bmatrix} sim_{1,1} & \cdots & sim_{1,v} \\ \vdots & \ddots & \vdots \\ sim_{v,1} & \cdots & sim_{v,v} \end{bmatrix}    (4)

where sim_{r,s} denotes the similarity between user r and user s.
And S32, according to the obtained similarity graph matrix, modeling the similarity features with a graph convolution network to extract the similarity features of the users.
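The screening and matrix construction of step S31 can be illustrated as below; the helper `build_graph_matrix`, the toy `tracks` data, and the threshold value δ = 0.3 are hypothetical stand-ins:

```python
# Hedged sketch of step S31: keep only users whose similarity with the
# target exceeds a threshold delta, then fill the matrix of Eq. (4)
# with pairwise similarities.

def build_graph_matrix(users, target, sim, delta=0.3):
    """Return (kept_users, A) where A[i][j] = sim(kept[i], kept[j])."""
    kept = [target] + [u for u in users
                       if u != target and sim(target, u) > delta]
    A = [[sim(r, s) for s in kept] for r in kept]
    return kept, A

# Toy grid tracks for three users (invented for illustration)
tracks = {
    "u1": [(0, 0), (1, 0), (2, 1)],
    "u2": [(0, 0), (1, 0), (2, 2)],
    "u3": [(5, 5), (6, 5), (7, 5)],
}

def sim(r, s):
    """Track-point similarity of Eqs. (1)-(3) on pre-gridded tracks."""
    R, S = tracks[r], tracks[s]
    return sum(a == b for a, b in zip(R, S)) / len(R)

kept, A = build_graph_matrix(tracks, "u1", sim, delta=0.3)
print(kept)  # u3 never shares a cell with u1, so it is screened out
```

Screening before building A keeps the matrix small, which is the computational saving claimed as beneficial effect 2.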
Referring to figure 4, s_1 is the user to be predicted; the graph convolution model obtains the similarity features of the user to be predicted and its similar users. Preferably, in this embodiment, the graph convolution model is specifically:

X' = Relu(\hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} X W)    (5)

\hat{A} = A + I, \quad \hat{D}_{ii} = \sum_{j} \hat{A}_{ij}    (6)

Relu(x) = max(0, x)    (7)

where X denotes the matrix formed by the positions, at the current time, of the user to be predicted and of the users whose similarity exceeds the threshold; A is the similarity graph matrix of the users; \hat{D} is the degree matrix, computed as in formula (6); Relu is the activation function, which converts a negative input to 0 and otherwise keeps the original value, as in formula (7), where max is the maximum function; W is the weight matrix, whose optimal values are obtained by model training.
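A minimal numpy sketch of the graph convolution of formulas (5)-(7) follows; the random weight matrix W is an untrained stand-in for the weights the method learns by training:

```python
# Sketch of X' = Relu(D^-1/2 (A + I) D^-1/2 X W), Eqs. (5)-(7).
import numpy as np

def gcn_layer(A, X, W):
    A_hat = A + np.eye(A.shape[0])        # Eq. (6): add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)       # degree matrix to the power -1/2
    # Eq. (5) with the Relu of Eq. (7) applied elementwise
    return np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)

rng = np.random.default_rng(0)
A = np.array([[0.0, 0.8], [0.8, 0.0]])    # similarity graph of 2 users
X = np.array([[3.0], [5.0]])              # current positions (1 feature each)
W = rng.normal(size=(1, 4))               # stand-in for the trained weights
X_prime = gcn_layer(A, X, W)
print(X_prime.shape)  # (2, 4): one similarity-feature row per user
```

The symmetric normalization keeps the propagated features on a comparable scale regardless of how many similar users a node has.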
Referring to fig. 7, in the present embodiment, the improved GCN-LSTM model is specifically: a threshold α is set; based on this threshold, if, for the user to be predicted, a user whose similarity exceeds the threshold can be found in the data set, GCN-LSTM is used to extract the temporal features of the track; otherwise, LSTM is used to extract the temporal features of the track.
The specific process for obtaining the threshold α is as follows:
(a) Group the users in the training data according to the similarity of their most similar user, and predict each group with both GCN-LSTM and the original LSTM;
(b) Compare the experimental results: evaluate the prediction accuracy of the GCN-LSTM model against that of the LSTM model, and take as the threshold α the similarity at which GCN-LSTM outperforms the plain LSTM.
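The grouping-and-comparison procedure for choosing α might be sketched as follows; the grouping granularity and the accuracy numbers are invented purely for illustration:

```python
# Illustrative sketch: group users by the similarity of their most similar
# neighbour, evaluate both models per group, and take the smallest
# similarity at which GCN-LSTM beats the plain LSTM.

def choose_alpha(groups):
    """groups: iterable of (max_similarity, acc_gcn_lstm, acc_lstm).
    Return the first similarity (in ascending order) where the GCN-LSTM
    accuracy exceeds the LSTM accuracy, or None if it never does."""
    for s, acc_gcn, acc_lstm in sorted(groups):
        if acc_gcn > acc_lstm:
            return s
    return None

# Made-up per-group accuracies for illustration only
groups = [(0.1, 0.52, 0.58), (0.3, 0.57, 0.58), (0.5, 0.64, 0.59)]
print(choose_alpha(groups))  # 0.5
```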
Referring to fig. 5, in this embodiment, the temporal features of the track are extracted with LSTM, specifically as follows:
(a) Input the user position x_t at time t; use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the forget gate f_t:

f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)    (8)

\sigma(x) = \frac{1}{1 + e^{-x}}    (9)

where h_{t-1} denotes the output at time t-1, obtained by iterative loop computation, see formula (15) in the last step of this process; x_t denotes the position information of the user to be predicted at time t; f_t denotes the forget gate at time t; W_f is the weight matrix of the input layer and b_f the bias term of the input layer, both obtaining optimal values through model training; \sigma is the sigmoid function, computed as in formula (9).
(b) Use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the input gate i_t, and use h_{t-1} and x_t to generate the candidate vector \tilde{C}_t:

i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)    (10)

\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)    (11)

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}    (12)

where W_i and W_C denote the weight matrices of the input layer and the state-update layer respectively, b_i and b_C the corresponding bias terms, obtaining optimal values through model training; tanh is the activation function, computed as in formula (12).
(c) Update the cell state, i.e. update C_{t-1} to C_t. Multiply the forget-gate value f_t with the old cell state C_{t-1}, which stores the historical position information, forgetting part of the history; multiply the input-gate value i_t with the candidate vector \tilde{C}_t, retaining part of the position information of the current time; finally add the two results to determine the new cell state:

C_t = f_t * C_{t-1} + i_t * \tilde{C}_t    (13)

(d) Use the output h_{t-1} at time t-1 and the input x_t at the current time to compute the output gate o_t, then process the cell state C_t with the tanh function and multiply the result by the output-gate value o_t to obtain the output value:

o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)    (14)

h_t = o_t * \tanh(C_t)    (15)

where W_o and b_o are the weight matrix and bias term of the output layer respectively, obtaining optimal values through model training.
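The LSTM step of formulas (8)-(15) can be written from scratch with numpy as below; all weights and the toy trajectory are random stand-ins for the values obtained by training:

```python
# From-scratch sketch of one LSTM step, Eqs. (8)-(15). Each gate acts on
# the concatenation [h_{t-1}, x_t]; parameters here are random stand-ins.
import numpy as np

def sigmoid(z):                                      # Eq. (9)
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, C_prev, p):
    hx = np.concatenate([h_prev, x_t])
    f_t = sigmoid(p["W_f"] @ hx + p["b_f"])          # forget gate, Eq. (8)
    i_t = sigmoid(p["W_i"] @ hx + p["b_i"])          # input gate, Eq. (10)
    C_tilde = np.tanh(p["W_C"] @ hx + p["b_C"])      # candidate, Eqs. (11)-(12)
    C_t = f_t * C_prev + i_t * C_tilde               # cell update, Eq. (13)
    o_t = sigmoid(p["W_o"] @ hx + p["b_o"])          # output gate, Eq. (14)
    h_t = o_t * np.tanh(C_t)                         # output, Eq. (15)
    return h_t, C_t

rng = np.random.default_rng(1)
H, D = 8, 2                                          # hidden size, input size
p = {f"W_{g}": rng.normal(size=(H, H + D)) for g in "fiCo"}
p.update({f"b_{g}": np.zeros(H) for g in "fiCo"})
h, C = np.zeros(H), np.zeros(H)
for x_t in rng.normal(size=(5, D)):                  # a 5-step toy trajectory
    h, C = lstm_step(x_t, h, C, p)
print(h.shape)  # (8,)
```

Because h_t = o_t * tanh(C_t) with o_t in (0, 1), every component of the output stays strictly inside (-1, 1).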
Referring to fig. 6, in this embodiment, GCN-LSTM is used to extract the temporal features of the track, specifically as follows:
(a) Input the position information X_t ∈ R^v of the user to be predicted and of the users having similarity with it,
where v denotes the total number of the user to be predicted and its similar users;
(b) Extract the similarity features of the users through the GCN model, obtaining a matrix X'_t ∈ R^v;
(c) Take from the matrix X'_t the value x'_t of the user to be predicted as the input of the LSTM model, then feed the hidden layer h_{t-1} at time t-1 and x'_t into the LSTM model to extract the temporal features, finally obtaining the prediction result:

x'_t \leftarrow X'_t = f(A, X_t)    (16)

y_t = LSTM(x'_t)    (17)
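Formulas (16)-(17) chain the two components: graph convolution over the current positions, then the target user's row fed to the LSTM. In this end-to-end sketch the LSTM is replaced by a simple linear stand-in head, and all parameters are untrained assumptions:

```python
# Hedged end-to-end sketch of Eqs. (16)-(17).
import numpy as np

def gcn(A, X, W):                                    # Eq. (16): X'_t = f(A, X_t)
    A_hat = A + np.eye(len(A))
    D = np.diag(A_hat.sum(1) ** -0.5)
    return np.maximum(0, D @ A_hat @ D @ X @ W)

def predict(A, X_t, W, lstm):                        # Eq. (17): y_t = LSTM(x'_t)
    x_prime = gcn(A, X_t, W)[0]                      # row 0 = user to be predicted
    return lstm(x_prime)

rng = np.random.default_rng(2)
A = np.array([[0., .7, .5], [.7, 0., .4], [.5, .4, 0.]])  # similarity graph
X_t = rng.normal(size=(3, 1))                        # positions of 3 users
W = rng.normal(size=(1, 4))                          # GCN weights (untrained)
W_out = rng.normal(size=(4, 10))
lstm = lambda x: x @ W_out                           # linear stand-in for the LSTM head
y_t = predict(A, X_t, W, lstm)
print(y_t.shape)  # (10,): scores over candidate grid cells
```

Replacing the stand-in `lstm` with the `lstm_step` recurrence over the sequence of x'_t values recovers the full method.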
The above description is only a preferred embodiment of the present invention, and all the changes and modifications made according to the claims should be covered by the present invention.

Claims (3)

1. An individual position prediction method based on GCN-LSTM is characterized by comprising the following steps:
step S1: collecting track data of a user;
s2, measuring the similarity of user tracks;
step S3, extracting the similarity characteristics of the users by using a graph convolution network according to the obtained similarity of the user tracks;
s4, constructing an improved GCN-LSTM model;
s5, extracting the time characteristics of the user track by adopting an improved GCN-LSTM model based on the similarity characteristics to obtain a prediction result;
the step S2 specifically includes:
s21, preprocessing the trajectory data of the user to remove abnormal values and missing values;
step S22, setting the sizes of the grid cells in the vertical direction and the horizontal direction, respectively starting from the left side boundary and the lower side boundary of the research area, carrying out grid division on the research area rightward and upward, and coding grids;
step S23, calculating corresponding grid units according to the position information of the user, replacing the position coordinate sequence with grid numbers, and converting the original user track into a grid track;
step S24, counting the number of time steps at which the two users are in the same grid cell, and taking the ratio of this count to the total number of time steps as the similarity between the two users;
the step S24 adopts similarity measurement based on the track points, and the calculation method is as follows:
Figure FDA0003966157210000011
Figure FDA0003966157210000021
Figure FDA0003966157210000022
wherein R, S respectively represent the grid tracks of two users, R t And s t Respectively representing the grid position codes at the time t, the recording points of the two users are n, dist (r) t ,s t ) Indicating that if two users are marked as 1 in the same grid at the time t, otherwise, marking as 0; eu (R, S) is the total number of two users located in the same grid at the same time, sim (R, S) represents the similarity measure of the two users;
the step S3 specifically includes:
s31, constructing a similarity graph matrix by utilizing the similarity among users;
s32, according to the obtained similarity graph matrix, adopting a graph convolution network to model similarity characteristics to extract the similarity characteristics of the users;
the step S31 specifically comprises: screening out the users whose similarity with the user to be predicted exceeds a set threshold δ, then computing the pairwise similarities among these users, and thereby constructing the similarity graph matrix

A = \begin{bmatrix} sim_{1,1} & \cdots & sim_{1,v} \\ \vdots & \ddots & \vdots \\ sim_{v,1} & \cdots & sim_{v,v} \end{bmatrix}    (4)

wherein sim_{r,s} denotes the similarity between user r and user s;
the graph convolution network is specifically:

X' = Relu(\hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} X W)    (5)

\hat{A} = A + I, \quad \hat{D}_{ii} = \sum_{j} \hat{A}_{ij}    (6)

Relu(x) = max(0, x)    (7)

wherein X denotes the matrix formed by the positions, at the current time, of the user to be predicted and of the users whose similarity exceeds the threshold; A is the similarity graph matrix of the users; \hat{D} is the degree matrix; Relu is the activation function; max is the maximum function; and W is the weight matrix;
the improved GCN-LSTM model specifically comprises the following steps: and setting a threshold delta, and based on the threshold, adopting GCN-LSTM to extract the time characteristics of the track if the user with the similarity exceeding the threshold can be found in the data set for the user to be predicted, or else, adopting LSTM to extract the time characteristics of the track.
2. The GCN-LSTM based individual location prediction method of claim 1, wherein the temporal features of the trajectory extracted using LSTM are as follows:
(a) Input the user position x_t at time t; use the output h_{t-1} at time t-1 and the input x_t at the current time to calculate the forget gate f_t:

f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)    (8)

\sigma(x) = \frac{1}{1 + e^{-x}}    (9)

wherein h_{t-1} denotes the output at time t-1, W_f the weight matrix of the input layer, and b_f the bias term of the input layer, whose optimal values are obtained by model training; \sigma is the sigmoid function;
(b) Use the output h_{t-1} at time t-1 and the input x_t at the current time to calculate the input gate i_t, and use h_{t-1} and x_t to generate the cell state update value \tilde{C}_t:

i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)    (10)

\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)    (11)

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}    (12)

wherein W_i and W_C denote the weight matrices of the input layer and the state-update layer respectively, b_i and b_C the corresponding bias terms, and tanh is the activation function;
(c) Update the cell state, i.e. update C_{t-1} to C_t; multiply the forget gate f_t with the old cell state C_{t-1} that stores the historical position information, forgetting part of the history; multiply the input gate i_t with the cell state update value \tilde{C}_t, retaining part of the position information of the current time; finally add the two results to determine the new cell state:

C_t = f_t * C_{t-1} + i_t * \tilde{C}_t    (13)

(d) Use the output h_{t-1} at time t-1 and the input x_t at the current time to calculate the output gate o_t, then process the cell state C_t with the tanh function and multiply the result by the output gate o_t to obtain the output value:

o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)    (14)

h_t = o_t * \tanh(C_t)    (15)

wherein W_o and b_o are the weight matrix and bias term of the output layer respectively, whose optimal values are obtained by model training.
3. The method for predicting the location of an individual based on GCN-LSTM according to claim 1, wherein said extracting the temporal features of the trajectory using GCN-LSTM is performed as follows:
(a) Input the position information X_t ∈ R^v of the user to be predicted and of the users having similarity with it,
wherein v denotes the total number of the user to be predicted and its similar users;
(b) Extract the similarity features of the users through the graph convolution network, obtaining a matrix X'_t ∈ R^v;
(c) Take from the matrix X'_t the value x'_t of the user to be predicted as the input of the LSTM model, then feed the hidden layer h_{t-1} at time t-1 and x'_t into the LSTM model to extract the temporal features, finally obtaining the prediction result:

x'_t \leftarrow X'_t = f(A, X_t)    (16)

y_t = LSTM(x'_t)    (17).
CN202011142144.2A 2020-10-23 2020-10-23 Individual position prediction method based on GCN-LSTM Active CN112270349B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011142144.2A CN112270349B (en) 2020-10-23 2020-10-23 Individual position prediction method based on GCN-LSTM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011142144.2A CN112270349B (en) 2020-10-23 2020-10-23 Individual position prediction method based on GCN-LSTM

Publications (2)

Publication Number Publication Date
CN112270349A CN112270349A (en) 2021-01-26
CN112270349B true CN112270349B (en) 2023-02-21

Family

ID=74341799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011142144.2A Active CN112270349B (en) 2020-10-23 2020-10-23 Individual position prediction method based on GCN-LSTM

Country Status (1)

Country Link
CN (1) CN112270349B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112990430B (en) * 2021-02-08 2021-12-03 辽宁工业大学 Group division method and system based on long-time and short-time memory network
WO2022201428A1 (en) * 2021-03-25 2022-09-29 楽天グループ株式会社 Estimation system, estimation method, and program
CN113111581B (en) * 2021-04-09 2022-03-11 重庆邮电大学 LSTM trajectory prediction method combining space-time factors and based on graph neural network

Citations (5)

Publication number Priority date Publication date Assignee Title
KR101555876B1 (en) * 2015-03-31 2015-09-30 스타십벤딩머신 주식회사 System and method for object tracking for video synthesis
CN106488405A (en) * 2016-12-29 2017-03-08 电子科技大学 A kind of position predicting method merging individuality and neighbour's movement law
CN110795522A (en) * 2019-11-06 2020-02-14 中国人民解放军战略支援部队信息工程大学 Method and device for predicting track position of mobile user
CN110928993A (en) * 2019-11-26 2020-03-27 重庆邮电大学 User position prediction method and system based on deep cycle neural network
CN111339449A (en) * 2020-03-24 2020-06-26 青岛大学 User motion trajectory prediction method, device, equipment and storage medium


Non-Patent Citations (2)

Title
A hierarchical temporal attention-based LSTM encoder-decoder model for individual mobility prediction;Fa Li et al.;《Neurocomputing》;20200825;第403卷;全文 *
Research on mobile trajectory prediction algorithms;杨鹏程;《China Master's Theses Full-text Database (Electronic Journal), Information Science and Technology》;20190915(No. 09);full text *

Also Published As

Publication number Publication date
CN112270349A (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN112270349B (en) Individual position prediction method based on GCN-LSTM
CN109142171B (en) Urban PM10 concentration prediction method based on feature expansion and fusing with neural network
Chen et al. Fuzzy time series forecasting based on optimal partitions of intervals and optimal weighting vectors
CN108985515B (en) New energy output prediction method and system based on independent cyclic neural network
CN111861013B (en) Power load prediction method and device
CN110138595A (en) Time link prediction technique, device, equipment and the medium of dynamic weighting network
CN113554466B (en) Short-term electricity consumption prediction model construction method, prediction method and device
CN112733417B (en) Abnormal load data detection and correction method and system based on model optimization
CN109116299B (en) Fingerprint positioning method, terminal and computer readable storage medium
CN107748940B (en) Power-saving potential quantitative prediction method
CN113449919B (en) Power consumption prediction method and system based on feature and trend perception
CN113780665B (en) Private car stay position prediction method and system based on enhanced recurrent neural network
CN112329990A (en) User power load prediction method based on LSTM-BP neural network
CN112733997A (en) Hydrological time series prediction optimization method based on WOA-LSTM-MC
CN112910711A (en) Wireless service flow prediction method, device and medium based on self-attention convolutional network
CN112084240B (en) Intelligent identification and linkage treatment method and system for group renting
CN113392587A (en) Parallel support vector machine classification method for large-area landslide risk evaluation
CN116187835A (en) Data-driven-based method and system for estimating theoretical line loss interval of transformer area
CN114461931A (en) User trajectory prediction method and system based on multi-relation fusion analysis
CN117436595B (en) New energy automobile energy consumption prediction method, device, equipment and storage medium
CN114862015A (en) Typhoon wind speed intelligent prediction method based on FEN-ConvLSTM model
CN116957131A (en) Power generation power prediction method based on hierarchical time sequence and Informier model fusion
CN112766240B (en) Residual multi-graph convolution crowd distribution prediction method and system based on space-time relationship
CN113762591B (en) Short-term electric quantity prediction method and system based on GRU and multi-core SVM countermeasure learning
CN111143774B (en) Power load prediction method and device based on influence factor multi-state model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant