CN114492763A - Graph neural network method for enhancing attention by fusing global context information - Google Patents

Graph neural network method for enhancing attention by fusing global context information Download PDF

Info

Publication number
CN114492763A
CN114492763A (application CN202210141079.4A)
Authority
CN
China
Prior art keywords
session
graph
target
global
conversation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210141079.4A
Other languages
Chinese (zh)
Inventor
王永贵
王阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liaoning Technical University
Original Assignee
Liaoning Technical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liaoning Technical University filed Critical Liaoning Technical University
Priority to CN202210141079.4A priority Critical patent/CN114492763A/en
Publication of CN114492763A publication Critical patent/CN114492763A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/25: Fusion techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/30: Semantic analysis
    • G06F 40/35: Discourse or dialogue representation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/06: Buying, selling or leasing transactions
    • G06Q 30/0601: Electronic shopping [e-shopping]
    • G06Q 30/0631: Item recommendations


Abstract

The invention discloses a graph neural network method that enhances attention by fusing global context information, comprising the following steps. First, a global graph is constructed from all session sequence information, and the information related to the target session is extracted from the global graph so as to better infer the user preference of the target session. Second, a GNN is used to learn vector representations of all items involved in the target session graph, and a target attention mechanism and a position attention network are used to obtain the session vector representation. Finally, the session length information is fused and different weights are assigned to the long-term and short-term preferences, so that more effective recommendations can be made for the user. The invention adopts a session-aware attention mechanism to recursively merge the neighbor embeddings of each node on the global graph, and also designs a session-level item representation learning layer, which uses a GNN on the session graph to learn session-level item embeddings within the current session and uses an attention mechanism to aggregate the learned item representations.

Description

Graph neural network method for enhancing attention by fusing global context information
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to a graph neural network method for enhancing attention by fusing global context information.
Background
In recent years, the session-based recommendation system (SRS) has become one of the popular directions of recommendation-system research. Early session-based recommendation models were mostly based on Markov chains, whose main idea is to predict the user's next interest from the user's previous actions. However, such models rely on a strong independence assumption and may be affected by noisy data in practical applications. Recently, many deep-learning-based methods have been proposed that model user preferences for a given session using pairwise item-transition information. These methods have achieved encouraging results but still face the following problems. First, some works model sequential one-way transition relationships between items using recurrent neural networks, but ignore the transition relationships among the other items in the session. Second, other methods based on graph neural networks and self-attention mechanisms, such as SR-GNN, learn a representation of the entire session by computing the correlation between each item and the last item, so their performance depends largely on how well the last item reflects the user preference of the current session.
Disclosure of Invention
In view of the above deficiencies in the prior art, the technical problem to be solved by the invention is to provide a graph neural network method that enhances attention by fusing global context information, using the idea of attention networks to obtain accurate user features and thereby make recommendations to the user.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention provides a graph neural network method for enhancing attention by fusing global context information, comprising the following steps:
Step 1: constructing a global graph from all session sequence information, and extracting the information related to the target session from the global graph;
Step 2: using a GNN model on the session graph to learn session-level item embeddings in the target session;
Step 3: modeling the user preference of the current session by aggregating the session-level item representations, and outputting the predicted probability of each candidate item for recommendation.
Preferably, the specific steps of step 1 are as follows:
Step 1.1: counting the co-occurrence frequency of adjacent items as the weight of each edge of the global session graph to generate a weight matrix;
Step 1.2: computing the local adjacency matrix corresponding to the target session, and generating an item list from the items appearing in the target session;
Step 1.3: extracting the information related to the target session's item list from the weight matrix of all sessions to compute a global adjacency matrix, and obtaining the session adjacency matrix by weighted summation of the global and local adjacency matrices.
Further, the specific steps of step 2 are as follows:
Step 2.1: learning the node vectors of the session graph with a graph neural network;
Step 2.2: representing the user's short-term and long-term preferences using the node vectors involved in the target session;
Step 2.3: computing an attention score between all items in the session and each target item using the attention network.
Further, the specific steps of step 3 are as follows:
Step 3.1: for each item, obtaining a session representation by combining the user's short-term and long-term preferences;
Step 3.2: obtaining the session representation by linearly combining the item representations.
From the above, the graph neural network method of the invention for enhancing attention by fusing global context information first constructs a global graph from all session sequence information and extracts the information related to the target session from it, so as to better infer the user preference of the target session; second, it uses a GNN to learn vector representations of all items involved in the target session graph, and adopts a target attention mechanism and a position attention network to obtain the session vector representation; finally, it fuses the session length information and assigns different weights to the long-term and short-term preferences, thereby making more effective recommendations for the user.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical means of the present invention more clearly understood, the present invention may be implemented in accordance with the content of the description, and in order to make the above and other objects, features, and advantages of the present invention more clearly understood, the following detailed description is given with reference to the preferred embodiments in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings of the embodiments will be briefly described below.
FIG. 1 is a flowchart of the graph neural network method for enhancing attention by fusing global context information according to the invention;
fig. 2 is a flowchart of the session transfer matrix calculation according to the present invention.
Detailed Description
Other aspects, features and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which form a part of this specification, and which illustrate, by way of example, the principles of the invention. In the referenced drawings, the same or similar components in different drawings are denoted by the same reference numerals.
The graph neural network method for enhancing attention by fusing global context information specifically comprises the following steps:
Step 1: constructing a global graph from all session sequence information, and extracting the information related to the target session from the global graph:
Step 1.1: counting the co-occurrence frequency of adjacent items as the weight of each edge of the global session graph to generate a weight matrix;
Step 1.2: computing the local adjacency matrix corresponding to the target session, and generating an item list from the items appearing in the target session;
Step 1.3: extracting the information related to the target session's item list from the weight matrix of all sessions to compute a global adjacency matrix, then obtaining the session adjacency matrix by weighted summation of the global and local adjacency matrices.
Step 2: using a GNN model on the session graph to learn session-level item embeddings in the target session:
Step 2.1: learning the node vectors of the session graph with a graph neural network;
Step 2.2: representing the user's short-term and long-term preferences using the node vectors involved in the target session;
Step 2.3: computing an attention score between all items in the session and each target item using the attention network.
Step 3: modeling the user preference of the current session by aggregating the session-level item representations, and outputting the predicted probability of each candidate item for recommendation:
Step 3.1: for each item, obtaining a session representation by combining the user's short-term and long-term preferences;
Step 3.2: obtaining the session representation by linearly combining the item representations.
Some relevant definitions to which the invention relates are as follows:
the SRS task predicts the user's next action by giving the user a record of their interaction with the item (e.g., clicking, browsing). The following gives the symbolic definitions of the present invention. In the session-based recommendation task, V ═ { V ═ V1,v2...vnDenotes the set of all clicked items in the session, S ═ S1,s2...ssRepresents all session data sets, each session corresponding to a sequence s of sequentially clicked itemsi=(v1,v2,,...,vl) And l denotes each session siLength of (b), wherein vie.V represents the item that the user clicked i-th time in session s. The GCI-GNN model uses session data to predict the user's next actions, i.e., using known session prefixes
Figure BDA0003506963300000051
Predicting the item v to be clicked by the user n +1 stepn+1
To this end, the invention makes use of a session prefix
Figure BDA0003506963300000052
A score is generated for each of the candidate items,
Figure BDA0003506963300000053
a score vector representing the output of the model, wherein
Figure BDA0003506963300000054
Representing the corresponding item viThe fraction of (c). Thus, the GCI-GNN model is based on the input of a conversation siPredictive scoring
Figure BDA0003506963300000055
The classification model of (1). To adapt user session data to a model, user session data is converted into graph structure data, i.e., a session sequence constructs a directed graph Gs=(Vs,Es),VsRepresenting a collection of items, EsRepresenting a set of directed edges between the items. Wherein each node v in the graphiRepresents an item, (v)i,vi+1)∈EsAs an edge, representing a user click v in a sessioniThen click on vi+1
The flowchart of the graph neural network method for enhancing attention by fusing global context information is shown in FIG. 1. The invention provides the global-context-information attention-enhancing graph neural network method (GCI-GNN): because the information contained in a user's target session is very limited, global session information is integrated into the target session to further improve the recommendation effect.
The method first constructs a global session graph over all sessions and generates an item transition matrix according to the co-occurrence frequency of adjacent items; at the same time, to avoid introducing items from the global sessions that are irrelevant to the target session, only the information related to the target session graph is extracted from the global session graph when recommending items of interest to the user. Second, since the same item appearing at different positions does not have the same impact on user preference, a position-aware attention network is proposed that embeds reverse position information into the items. Finally, each user's browsing habits differ and session lengths vary; the longer the session, the more easily the user's interest is affected by the short-term environment and mood. Therefore, the session length information is fused and different weights are assigned to the long-term and short-term preferences, so as to make more effective recommendations for the user.
Existing session recommendation methods focus only on the information inside the target session and ignore the information of other sessions, yet other sessions often contain valuable supplementary information that helps to accurately infer the user preference of the target session. For example, consider the target session sequence: Huawei phone → data cable → Bluetooth headset; session sequence 1: Bluetooth headset → wired headset; session sequence 2: Huawei phone → Huawei laptop → Bluetooth headset. Existing methods consider only the influence of the target session sequence on the user's next action. In session sequence 1, however, the user may intend to buy a headset and is comparing similar items, so the headset category influences the target session; in session sequence 2, the user may favor the brand, so the brand category influences the target session. Exploiting the link between global session information and target session information therefore overcomes the limitation of using the target session alone and mines more hidden information, providing more accurate recommendations for the user.
Therefore, the invention stores sessions as adjacency matrices, constructing each session sequence into an in-degree matrix and an out-degree matrix so that the model can learn rich bidirectional connection relations. A_s^(in) and A_s^(out) denote the in-degree and out-degree matrices of the session graph, and A_s denotes the concatenation of the two adjacency matrices, i.e. A_s = [A_s^(in); A_s^(out)].
First, a global session graph is generated from all sessions; the number of co-occurrences of adjacent items is counted as the weight of each edge of the global session graph, yielding a weight matrix count_s. This process is shown in FIG. 2(a). Second, the local adjacency matrix A_l corresponding to the target session s is computed; this process is shown in FIG. 2(b), and the invention generates an item list, denoted items_s, from the items appearing in the target session s. Finally, the information related to the target session's item list items_s is extracted from the weight matrix count_s of all sessions to compute a global adjacency matrix A_g, and the adjacency matrix A_s of session s is obtained by weighted summation of the global and local adjacency matrices, i.e. equation (1):
A_s = ηA_g + (1 − η)A_l   (1)
The invention next introduces how the representation of item nodes is learned by a gated graph sequence neural network (GGNN). The basic idea is to construct the session graph and embed each item v_i into a unified space R^d, where the embedding vector of item v_i is h_i ∈ R^d and d is the embedding dimension. The item embedding vector h_i^t denotes the representation of v_i in graph G_s after t updates; the update functions are as follows:
a_{s,i}^t = A_{s,i:} [h_1^{t−1}, ..., h_n^{t−1}]^T H + b   (2)
z_{s,i}^t = σ(W_z a_{s,i}^t + U_z h_i^{t−1})   (3)
r_{s,i}^t = σ(W_r a_{s,i}^t + U_r h_i^{t−1})   (4)
h̃_i^t = tanh(W_o a_{s,i}^t + U_o (r_{s,i}^t ⊙ h_i^{t−1}))   (5)
h_i^t = (1 − z_{s,i}^t) ⊙ h_i^{t−1} + z_{s,i}^t ⊙ h̃_i^t   (6)
where A_{s,i:} ∈ R^{1×2n} is the row of A_s corresponding to node v_i, governing information propagation between that node and the other nodes in the graph; the propagation is constrained by the edge types in the graph (the direction of each edge and whether it exists), so a_{s,i}^t collects activations in both directions. H ∈ R^{d×2d} and b ∈ R^d are the weight and bias parameters, [h_1^{t−1}, ..., h_n^{t−1}] are the vectors of all nodes at time t−1, and z_{s,i}^t and r_{s,i}^t are the update gate and reset gate, which respectively determine what information is discarded and what is saved. σ(·) is the sigmoid function, and ⊙ denotes the Hadamard product. The GNN is used to learn the item embedding vectors from the adjacency matrix; all nodes in the session graph are updated until convergence, yielding the node vectors h_1, h_2, ..., h_n.
The user's short-term and long-term preferences are then represented by the node vectors involved in the session, and the user preference is generated taking into account the position information and session length information of each node in the session.
Traditional session recommendation methods use only the item representations within the session to obtain user interest, compressing the session into a fixed vector; given the diversity of target items, such a fixed vector limits the representational capacity of the recommendation model. To remedy this deficiency, after capturing the node embeddings in the session graph, the invention uses a target-aware attention network to activate the user interests related to a target item v_a, where v_a ranges over all candidate items to be predicted. An attention score is computed between every item v_i in session s and each target item v_a ∈ V, applying a weight matrix W ∈ R^{d×d} and normalizing with a softmax function over the target item and each node, defined as follows:
β_{i,a} = softmax(h_a^T W h_i)   (7)
For each session s, the user's interest in target item v_a is represented by s_target, computed as follows:
s_target = Σ_{i=1}^{n} β_{i,a} h_i   (8)
short-term preference: in the browsing process, the next item browsed by the user is always associated with the current interest of the user, so the invention relates the final operation h of the user based on the graph structurenTreated as short-term preference of the user, expressed as local embedding
Figure BDA0003506963300000084
Long-term preference: during browsing, the user's preference for each item differs, and different items contribute unequally to predicting the next item. Intuitively, the importance of an item depends on its position in the session and on its correlation with the most recently visited item. The invention therefore proposes a position-aware attention network that incorporates reverse position information into the item embeddings, using a learnable position embedding matrix P = [p_1, p_2, ..., p_l], where p_i ∈ R^d is the position vector for position i and l denotes the length of the session sequence. The position information z_i is fused by concatenation and a nonlinear transformation, as shown in equation (9):
z_i = tanh(W_1 [h_i || p_{l−i+1}] + b_1)   (9)
where W_1 ∈ R^{d×2d} and b_1 ∈ R^d are trainable parameters.
The corresponding weights are learned through the position attention mechanism, and the long-term preference representation s_long of the session is obtained by linear combination, as shown in equations (10) and (11):
α_i = q^T σ(W_2 h_n + W_3 h_i + W_4 z_i + b_2)   (10)
s_long = Σ_{i=1}^{l} α_i h_i   (11)
where q, b_2 ∈ R^d and W_2, W_3, W_4 ∈ R^{d×d} are weight parameters.
Finally, the Ebbinghaus forgetting curve indicates that human memory decays regularly: the interest reflected by the most recently browsed content matters more than the interest reflected by content browsed long ago, and its importance decays gradually over time. Thus, for different session lengths, the user's long-term and short-term preferences carry different weight; the longer the session, the more important the short-term preference becomes over time. The invention therefore normalizes the session length l and assigns different weights to the long-term and short-term preferences, i.e. the longer the session, the more strongly the user's short-term preference is captured, improving prediction accuracy. The definitions are as follows:
λ_short = l / (l + l̄)   (12)
λ_long = l̄ / (l + l̄)   (13)
s_t = W_5 [λ_long s_long || λ_short s_short || s_target]   (14)
where l̄ denotes the average session length and W_5 ∈ R^{d×3d} embeds the concatenated vectors into the unified space R^d; a different session embedding s_t is generated for each target item, formed by splicing the long-term preference, the short-term preference, and the target embedding.
The invention predicts, from the initial embedding of each item and the session representation, the probability that each candidate item becomes the user's next click, sorts the candidates in descending order of probability, and recommends the top-ranked items as the items of interest to the user. The output is obtained by applying a softmax function, defined as follows:
ŷ_i = softmax(s_t^T v_i)   (15)
The loss function is the cross-entropy function, defined as follows:
L(ŷ) = −Σ_{i=1}^{n} [y_i log(ŷ_i) + (1 − y_i) log(1 − ŷ_i)]   (16)
where y denotes the one-hot encoding vector of the ground-truth item.
the invention firstly utilizes all session sequence information to construct a global graph, extracts information related to the target session from the global graph so as to better infer user preference of the target session, secondly utilizes GNN to learn all item vector representations related in the target session graph, and adopts a target attention mechanism and a position attention network to obtain session vector representations. And finally, the session length information is fused, and different weights are distributed for the long-term preference and the short-term preference, so that more effective recommendation is made for the user. The invention adopts a conversation perception attention mechanism to recursively merge the neighbor embedding of each node on the global graph, and also designs a conversation-level object representation learning layer which uses GNN to learn the conversation-level object embedding in the current conversation on the conversation graph and uses the attention mechanism to aggregate the learned object representations.
While the foregoing is directed to the preferred embodiment of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (4)

1. A graph neural network method for enhancing attention by fusing global context information, characterized by comprising the following steps:
Step 1: constructing a global graph from all session sequence information, and extracting the information related to the target session from the global graph;
Step 2: using a GNN model on the session graph to learn session-level item embeddings in the target session;
Step 3: modeling the user preference of the current session by aggregating the session-level item representations, and outputting the predicted probability of each candidate item for recommendation.
2. The graph neural network method for enhancing attention by fusing global context information according to claim 1, characterized in that the specific steps of step 1 are as follows:
Step 1.1: counting the co-occurrence frequency of adjacent items as the weight of each edge of the global session graph to generate a weight matrix;
Step 1.2: computing the local adjacency matrix corresponding to the target session, and generating an item list from the items appearing in the target session;
Step 1.3: extracting the information related to the target session's item list from the weight matrix of all sessions to compute a global adjacency matrix, and obtaining the session adjacency matrix by weighted summation of the global and local adjacency matrices.
3. The graph neural network method for enhancing attention by fusing global context information according to claim 1, characterized in that the specific steps of step 2 are as follows:
Step 2.1: learning the node vectors of the session graph with a graph neural network;
Step 2.2: representing the user's short-term and long-term preferences using the node vectors involved in the target session;
Step 2.3: computing an attention score between all items in the session and each target item using the attention network.
4. The graph neural network method for enhancing attention by fusing global context information according to claim 1, characterized in that the specific steps of step 3 are as follows:
Step 3.1: for each item, obtaining a session representation by combining the user's short-term and long-term preferences;
Step 3.2: obtaining the session representation by linearly combining the item representations.
CN202210141079.4A 2022-02-16 2022-02-16 Graph neural network method for enhancing attention by fusing global context information Pending CN114492763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210141079.4A CN114492763A (en) 2022-02-16 2022-02-16 Graph neural network method for enhancing attention by fusing global context information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210141079.4A CN114492763A (en) 2022-02-16 2022-02-16 Graph neural network method for enhancing attention by fusing global context information

Publications (1)

Publication Number Publication Date
CN114492763A true CN114492763A (en) 2022-05-13

Family

ID=81480465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210141079.4A Pending CN114492763A (en) 2022-02-16 2022-02-16 Graph neural network method for enhancing attention by fusing global context information

Country Status (1)

Country Link
CN (1) CN114492763A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114971784A (en) * 2022-05-21 2022-08-30 内蒙古工业大学 Graph neural network-based session recommendation method and system integrating self-attention mechanism
CN116304279A (en) * 2023-03-22 2023-06-23 烟台大学 Active perception method and system for evolution of user preference based on graph neural network
CN116662501A (en) * 2023-05-18 2023-08-29 哈尔滨工程大学 Session recommendation method based on session context information
CN118036654A (en) * 2024-03-19 2024-05-14 江苏大学 Session recommendation method based on graph neural network and attention mechanism

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112035746A (en) * 2020-09-01 2020-12-04 湖南大学 Session recommendation method based on space-time sequence diagram convolutional network
CN112364976A (en) * 2020-10-14 2021-02-12 南开大学 User preference prediction method based on session recommendation system
CN113487018A (en) * 2021-07-28 2021-10-08 辽宁工程技术大学 Global context enhancement graph neural network method based on session recommendation
US20210366024A1 (en) * 2020-05-25 2021-11-25 National University Of Defense Technology Item recommendation method based on importance of item in session and system thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210366024A1 (en) * 2020-05-25 2021-11-25 National University Of Defense Technology Item recommendation method based on importance of item in session and system thereof
CN112035746A (en) * 2020-09-01 2020-12-04 Hunan University Session recommendation method based on spatio-temporal sequence graph convolutional network
CN112364976A (en) * 2020-10-14 2021-02-12 Nankai University User preference prediction method based on session recommendation system
CN113487018A (en) * 2021-07-28 2021-10-08 Liaoning Technical University Global context enhanced graph neural network method for session-based recommendation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN Xin; LIU Xuejun; LI Bin; LIANG Ke: "Session sequence recommendation based on graph neural network and temporal attention", Computer Engineering and Design, no. 10, 16 October 2020 (2020-10-16) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114971784A (en) * 2022-05-21 2022-08-30 Inner Mongolia University of Technology Graph neural network-based session recommendation method and system integrating self-attention mechanism
CN114971784B (en) * 2022-05-21 2024-05-14 Inner Mongolia University of Technology Graph neural network-based session recommendation method and system integrating self-attention mechanism
CN116304279A (en) * 2023-03-22 2023-06-23 Yantai University Active perception method and system for evolution of user preference based on graph neural network
CN116304279B (en) * 2023-03-22 2024-01-26 Yantai University Active perception method and system for evolution of user preference based on graph neural network
CN116662501A (en) * 2023-05-18 2023-08-29 Harbin Engineering University Session recommendation method based on session context information
CN116662501B (en) * 2023-05-18 2024-06-14 Harbin Engineering University Session recommendation method based on session context information
CN118036654A (en) * 2024-03-19 2024-05-14 Jiangsu University Session recommendation method based on graph neural network and attention mechanism
CN118036654B (en) * 2024-03-19 2024-09-03 Jiangsu University Session recommendation method based on graph neural network and attention mechanism

Similar Documents

Publication Publication Date Title
CN114492763A (en) Graph neural network method for enhancing attention by fusing global context information
WO2023108324A1 Contrastive learning enhanced two-stream model recommendation system and algorithm
CN113590900A Sequence recommendation method fusing dynamic knowledge graphs
CN112733018A Session recommendation method based on graph neural networks (GNN) and multi-task learning
CN106845644A A heterogeneous network for learning associations between users and mobile applications through correlation
CN112765461A (en) Session recommendation method based on multi-interest capsule network
CN113487018A Global context enhanced graph neural network method for session-based recommendation
CN112364245B (en) Top-K movie recommendation method based on heterogeneous information network embedding
CN114282122A (en) Efficient non-sampling graph convolution network recommendation method
CN113590976A (en) Recommendation method of space self-adaptive graph convolution network
CN116304279B (en) Active perception method and system for evolution of user preference based on graph neural network
CN114282077A (en) Session recommendation method and system based on session data
CN114971784A (en) Graph neural network-based session recommendation method and system integrating self-attention mechanism
CN112819575A (en) Session recommendation method considering repeated purchasing behavior
CN115860179A (en) Trajectory prediction method, apparatus, device, storage medium, and program product
CN113344648B (en) Advertisement recommendation method and system based on machine learning
CN117994011A (en) E-commerce dynamic perception data recommendation method based on memory updating and neighbor transfer
CN114625969A (en) Recommendation method based on interactive neighbor session
Yang et al. Gated graph convolutional network based on spatio-temporal semi-variogram for link prediction in dynamic complex network
CN114528490A Self-supervised sequence recommendation method based on users' long-term and short-term interests
Li et al. Session Recommendation Model Based on Context‐Aware and Gated Graph Neural Networks
CN117556133A (en) Neural time gate time sequence enhanced session recommendation method based on graph neural network
CN115953215A (en) Search type recommendation method based on time and graph structure
CN115293812A Session-aware recommendation prediction method for e-commerce platforms based on long-term and short-term interests
CN114610862A Session recommendation method with graph-enhanced context sequences

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination