CN116186390A - Hypergraph-fused contrast learning session recommendation method - Google Patents

Hypergraph-fused contrast learning session recommendation method

Info

Publication number
CN116186390A
Authority
CN
China
Prior art keywords
session
hypergraph
embedding
representing
item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211692089.3A
Other languages
Chinese (zh)
Inventor
史树敏
刘思辰
刘东阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202211692089.3A priority Critical patent/CN116186390A/en
Publication of CN116186390A publication Critical patent/CN116186390A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G06F16/901 Indexing; Data structures therefor; Storage structures
    • G06F16/9024 Graphs; Linked lists
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a hypergraph-fused contrast learning session recommendation method, and belongs to the technical field of data mining and application. The method obtains high-order information between items by constructing a global hypergraph, and obtains interaction information between items by constructing a local session graph for each session. The constructed global hypergraph is processed with a hypergraph attention network to obtain item embeddings under the global hypergraph, while item embeddings are learned from each constructed local session graph with a gated graph neural network. After the two kinds of item embeddings are obtained, an attention mechanism produces two kinds of session embeddings, and contrast learning is used as an auxiliary task to maximize the mutual information between the two session embeddings. Finally, the recommendation probability of each candidate item is calculated from the session representation. By modeling with a hypergraph structure, the method better captures complex information among items; the contrast learning component effectively overcomes the sparsity of session data and improves the performance of the recommendation system.

Description

Hypergraph-fused contrast learning session recommendation method
Technical Field
The invention relates to a hypergraph-fused contrast learning session recommendation method, and in particular to a method that, in a session-based recommendation task, uses contrast learning to enhance hypergraph information and the interaction information between items. It belongs to the technical field of data mining and application.
Background
In the information age, with the development of Internet technology, the amount of information users receive every day has grown explosively. Such massive information is often highly redundant; a recommendation system can combine user interests, filter and select the information that meets user needs, and recommend it to the user. Recommendation systems are therefore widely applied in scenarios such as e-commerce and video websites.
Conventional recommendation systems mainly rely on known information about users, but in many real-world scenarios user information is unavailable for reasons such as privacy protection, so session-based recommendation has gradually risen as a branch of information recommendation. A session-based recommendation system uses the behavior sequence of an anonymous user over a period of time to predict the next item; it can learn the user's preference from the items interacted with in the session and thereby provide accurate, personalized recommendations.
Existing session recommendation methods mainly fall into four types: methods based on Markov chains, methods based on factorization, methods based on recurrent neural networks, and methods based on graph neural networks. The first two, model-based, types recommend the next item from the previous item and can capture only first-order dependencies. Recurrent-neural-network-based approaches typically treat session data as an ordered sequence and model the order of items in a session. Graph-neural-network-based methods construct the session data as a directed graph and model items as pairwise relationships.
However, in real-world scenarios the appearance of an item tends to be the result of the joint action of a preceding series of items; the relationship between items is not a simple binary relationship but a more complex many-to-many relationship.
Disclosure of Invention
The invention aims to solve the technical problems that cause low performance in session recommendation systems, namely that the prior art fails to adequately model the high-order relations among items and that session data are highly sparse, and creatively provides a hypergraph-fused contrast learning session recommendation method.
Compared with an ordinary graph structure, a hypergraph structure can better model such higher-order relations. A hyperedge of a hypergraph may connect multiple vertices, so the dependencies between items encoded by a hyperedge are no longer binary but ternary, quaternary, or of even higher order. The invention therefore models sessions with a hypergraph structure, which can capture the complex higher-order relations between items and better matches the characteristics of session data.
Meanwhile, within a session, the interaction information between items is very important. The invention constructs a local session graph for each session and obtains the session's characteristics from the interaction information among the items in that session. To address the high sparsity of session data, the session embeddings obtained from the two different graph views are compared through contrast learning, so that the two session embeddings provide new information for each other, improving recommendation accuracy.
First, the related concepts of the present invention are explained:
1. item set and session set
Define V as the set of all items, i.e., the item set: V = {v_1, v_2, v_3, ..., v_n}, where n is the number of items. Each session is represented as a set s = {v_1^s, v_2^s, ..., v_L^s}, where L is the length of the session.
2. Hypergraph
Hypergrams are extensions of the normal graph structure. In a common graph structure, one edge can only connect two vertices. While the edges of the hypergraph (i.e., the hyperedges) can be connected with any number of vertices.
The invention is realized by adopting the following technical scheme.
A contrast learning session recommendation method integrating hypergraphs comprises the following steps:
step 1: and constructing a global hypergraph and a local session graph according to the session data.
Specifically, step 1 includes the steps of:
Step 1.1: Building the global hypergraph G_g = (V_g, E_g) from the session data; V_g represents the set of vertices in the hypergraph; E_g represents the set of hyperedges in the hypergraph, consisting of all historical session sequences.
According to the definition of a hypergraph, one hyperedge can connect multiple vertices; thus each session is constructed as one hyperedge, i.e., each session s forms a hyperedge connecting all the items in s, and each item v_i of the session is a vertex in V_g.
Step 1.2: Building the local session graph G_s = (V_s, E_s) from the session data; V_s represents the vertex set of all items in the session; E_s represents the set of edges in the session graph.
The local session graph is constructed according to the current session. Given a session s = {v_1^s, v_2^s, ..., v_L^s}, the set of all items in the session constitutes V_s, and directed edges e_i = (v_i^s, v_{i+1}^s) are formed in the local session graph according to the interaction order between the items, where v_i^s denotes the item of the i-th interaction in session s.
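To make step 1 concrete, the following sketch builds the two structures from raw session data. It is a minimal illustration under assumed data representations (a hyperedge as the set of items of one session, a local session graph as a list of directed edges); the function names build_global_hypergraph and build_local_session_graph are hypothetical and do not come from the patent.

```python
def build_global_hypergraph(sessions):
    """Each session becomes one hyperedge connecting all the items it contains."""
    vertices = set()
    hyperedges = []
    for session in sessions:
        vertices.update(session)          # every item of every session is a vertex
        hyperedges.append(set(session))   # the hyperedge is the item set of the session
    return vertices, hyperedges


def build_local_session_graph(session):
    """Directed edges follow the interaction order inside a single session."""
    nodes = list(dict.fromkeys(session))  # unique items, first-interaction order preserved
    edges = [(session[i], session[i + 1]) for i in range(len(session) - 1)]
    return nodes, edges


# Example with three anonymous sessions over item ids.
sessions = [[1, 3, 2, 5, 7], [2, 6, 8, 10], [4, 7, 9, 11]]
V_g, E_g = build_global_hypergraph(sessions)              # global hypergraph: 3 hyperedges
nodes_1, edges_1 = build_local_session_graph(sessions[0])
print(len(E_g), edges_1)  # 3 [(1, 3), (3, 2), (2, 5), (5, 7)]
```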
Step 2: and inputting the obtained global hypergraph data into a hypergraph attention network for learning to obtain the project embedded representation under the global hypergraph.
Specifically, step 2 includes the steps of:
Step 2.1: In the initial layer of the network, each item is embedded from a high-dimensional space into a low-dimensional continuous vector space to obtain the initial embedded representation of each item, h_i^(0), where h_n^(0) represents the initial embedding of the n-th item.
Step 2.2: The initial embedding representation of each node is input into the first-layer hypergraph attention network; different degrees of attention are given to each node according to its information, and the feature embedding of each hyperedge is obtained by aggregating the information of the nodes on that hyperedge.
The method comprises the following steps:
[Four formulas, given as equation images in the original document, compute the feature embedding of each hyperedge at layer l as the attention-weighted aggregation of the information passed to the hyperedge by its nodes.]
wherein the feature embedding of hyperedge e_j in the layer-l hypergraph attention network is aggregated from the information that each node i in N_j passes to e_j; N_j represents all the nodes connected by hyperedge e_j; μ^(l) represents the trainable node-level context vector in the layer-l hypergraph attention network; W_1^(l) and a second matrix are trainable transformation matrices; α_{i,j} represents the attention score of node i on hyperedge e_j; h_i^(l-1) represents the feature embedding of node i in the layer-(l-1) hypergraph attention network; s(·,·) represents the similarity between a node embedding and the context vector; d is the dimension, a and b are parameters with no practical meaning, and T represents the transpose operation.
Step 2.3: According to the obtained hyperedge feature embeddings, the information of all hyperedges containing a certain node is aggregated by using an attention mechanism to obtain the information of that node.
The method comprises the following steps:
[Three formulas, given as equation images in the original document, compute the feature embedding of each node at layer l as the attention-weighted aggregation of the information conveyed to it by the hyperedges containing it.]
wherein h_i^(l) represents the feature embedding of node i in the layer-l hypergraph attention network; the information conveyed by hyperedge e_j to node i in the layer-l network is aggregated over Y_i, the set of all hyperedges connected to node i; the two involved matrices, one of which is W_3^(l), are trainable transformation matrices; β_{i,j} represents the attention score of hyperedge e_j on node i.
High-order information is obtained by stacking multiple hypergraph attention layers. The last layer outputs the embedded representation of each node, i.e., of each item, finally giving the item embeddings h_g under the global hypergraph.
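Because the layer formulas of steps 2.2 and 2.3 survive only as images, the sketch below shows one plausible instantiation that matches the listed symbols: node-to-hyperedge attention guided by the context vector μ^(l), followed by hyperedge-to-node attention, with a scaled dot product as the similarity s(·,·). The exact functional forms, the name W2 for the second transformation matrix, and the choice of similarity are assumptions, not the patent's verbatim equations.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hypergraph_attention_layer(h, hyperedges, W1, W2, W3, mu):
    """One hypergraph attention layer (steps 2.2 and 2.3), assumed form.

    h          : (n, d) node embeddings from the previous layer, h_i^(l-1)
    hyperedges : list of lists of node indices, hyperedges[j] = N_j
    W1, W2, W3 : (d, d) trainable matrices (W2 is a hypothetical name)
    mu         : (d,) trainable node-level context vector mu^(l)
    """
    n, d = h.shape
    # Step 2.2: aggregate node information onto each hyperedge with attention alpha_{i,j}.
    f = np.zeros((len(hyperedges), d))
    for j, N_j in enumerate(hyperedges):
        msgs = np.stack([W1 @ h[i] for i in N_j])                        # information node i passes to e_j
        scores = np.array([(W2 @ h[i]) @ mu / np.sqrt(d) for i in N_j])  # similarity with the context vector
        f[j] = softmax(scores) @ msgs                                    # hyperedge embedding at layer l

    # Step 2.3: aggregate, for each node, the hyperedges Y_i containing it with attention beta_{i,j}.
    h_next = np.zeros_like(h)
    for i in range(n):
        Y_i = [j for j, N_j in enumerate(hyperedges) if i in N_j]
        msgs = np.stack([W3 @ f[j] for j in Y_i])                        # information e_j conveys to node i
        scores = np.array([(W3 @ f[j]) @ (W2 @ h[i]) / np.sqrt(d) for j in Y_i])
        h_next[i] = softmax(scores) @ msgs                               # node embedding h_i^(l)
    return h_next
```

Stacking several such layers and taking the output of the last layer gives the item embeddings h_g used in step 4.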
Step 3: and inputting the obtained local session map data into a gating map neural network for learning to obtain the item embedded representation under the local session map.
Specifically, step 3 includes the steps of:
Step 3.1: In the initial layer of the network, each item is embedded from a high-dimensional space into a low-dimensional continuous vector space to obtain the initial embedded representation of each item.
Step 3.2: Each constructed local session graph is learned using a gated graph neural network.
The method comprises the following steps:
[Five formulas, given as equation images in the original document, define the message passing over the session graph and the gated update of each node embedding.]
wherein A_s is the adjacency matrix and H is a weight matrix; the gated graph neural network uses an update gate and a reset gate r_i^(l); the candidate state is an intermediate quantity in the calculation process; h_i^(l) represents the feature embedding of node i in the layer-l gated graph neural network, and h_i^(l-1) represents the feature embedding of node i in the layer-(l-1) network; ⊙ represents element-wise multiplication; tanh represents the hyperbolic tangent function; σ represents the sigmoid function; W_4, W_5, W_6, U_1, U_2, U_3, b_1 are all trainable parameters.
The final output is taken as the feature embedding of each item in the session. Processing all local session graphs in this way yields the item embedding representation h_s under the local session graph.
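The gated update of step 3.2 is likewise available only as images. The sketch below follows the standard gated graph neural network cell that the listed symbols describe (adjacency A_s, weight matrix H, an update gate, a reset gate r, a candidate state, element-wise products); treat the exact equations as an assumption consistent with those symbols rather than the patent's literal formulas.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_graph_layer(h, A_s, H, W4, W5, W6, U1, U2, U3, b1):
    """One gated graph neural network step over a local session graph (assumed form).

    h   : (n, d) node embeddings from the previous step
    A_s : (n, 2n) concatenation of the incoming and outgoing adjacency of the session graph
    H, W4, W5, W6, U1, U2, U3 : (d, d) trainable matrices; b1 : (d,) bias
    """
    a = A_s @ np.concatenate([h, h], axis=0) @ H + b1   # messages passed along the session edges
    z = sigmoid(a @ W4 + h @ U1)                        # update gate
    r = sigmoid(a @ W5 + h @ U2)                        # reset gate
    h_cand = np.tanh(a @ W6 + (r * h) @ U3)             # candidate state (the intermediate quantity)
    return (1.0 - z) * h + z * h_cand                   # element-wise gated combination
```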
Step 4: and respectively aggregating the obtained project embedding information under the global hypergraph and the project embedding information under the local session graph by using an attention mechanism to obtain session embedding representation under the global hypergraph and session embedding representation under the local session graph.
The method comprises the following steps:
x_i = tanh(W_7 (h_i || p_{L-i+1}) + b_2)    (8)
[Formula (9), given as an equation image in the original document, defines the intermediate quantity s from the item embeddings of the session.]
χ_i = q^T (W_8 x_i + W_9 s + b_3)    (10)
[Formula (11), given as an equation image in the original document, aggregates the items of the session with the attention scores χ_i to obtain the session embedding S.]
wherein x_i represents the feature embedded representation of the i-th item in the session after fusing position information; p_{L-i+1} represents the position embedding of the i-th item in the session; h_i represents the feature embedded representation of the i-th item in the session; χ_i represents the attention score of item i with respect to the entire session; s is an intermediate quantity in the calculation process; L represents the length of the session, i.e., the number of items in the session; S represents the embedded representation of the session; W_7, W_8, W_9, q, b_2, b_3 are all trainable parameters; T denotes the transpose operation.
The attention-mechanism operations are performed on the item embeddings h_g under the global hypergraph and the item embeddings h_s under the local session graph, respectively, finally obtaining the session embedding representation S_g under the global hypergraph and the session embedding representation S_s under the local session graph.
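Formulas (8) and (10) are given explicitly, while (9) and (11) survive only as images. The sketch below implements (8) and (10) as written and completes (9) and (11) with the forms suggested by the embodiment of step D (the mean of the item embeddings as the intermediate quantity s, and an attention-weighted sum as the session embedding); those two completions, and the matrix shapes, are assumptions.

```python
import numpy as np

def session_readout(h, p, W7, W8, W9, q, b2, b3):
    """Attention readout of step 4 for one session (assumed completion of (9) and (11)).

    h  : (L, d) item embeddings of the session (from the hypergraph or the session graph)
    p  : (L, d) position embeddings; reversed so that item i is paired with p_{L-i+1} as in (8)
    W7 : (2d, d), W8, W9 : (d, d), q : (d,), b2, b3 : (d,)
    """
    # (8): fuse reversed position information into each item embedding.
    x = np.tanh(np.concatenate([h, p[::-1]], axis=1) @ W7 + b2)
    # (9), assumed: the intermediate quantity s is the mean of the item embeddings.
    s = h.mean(axis=0)
    # (10): attention score of each item with respect to the whole session.
    chi = (x @ W8 + s @ W9 + b3) @ q
    # (11), assumed: the session embedding is the chi-weighted sum of the fused item embeddings.
    return chi @ x

# Applying the readout to h_g yields S_g, and to h_s yields S_s.
```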
Step 5: contrast learning is used to maximize the embedding of the two sessions into mutual information.
The two session embeddings are compared: if the session embedding under the global hypergraph and the session embedding under the local session graph represent the same session, the pair of session embeddings is marked as a positive sample; otherwise it is marked as a negative sample. Maximizing the mutual information between the two session embeddings yields the contrastive loss under contrast learning.
The method comprises the following steps:
L_s = -log σ(ρ_h · ρ_l) - log σ(1 - ρ_h · ρ_l)    (12)
wherein L_s represents the contrast learning loss function, ρ_h represents the positive sample, ρ_l represents the negative sample, and σ represents the sigmoid function.
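A minimal sketch of how the contrastive objective of formula (12) can be evaluated over a batch. The reading used here, with the dot product as the pair score, the two views of the same session as the positive pair, and a row-shuffled pairing as the negative pair, is an assumption; the patent does not spell out how ρ_h and ρ_l are computed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def contrastive_loss(S_g, S_s):
    """Formula (12) over a batch (assumed scoring).

    S_g, S_s : (B, d) session embeddings under the global hypergraph and the local
    session graph; row b of both matrices corresponds to the same session.
    """
    pos = np.sum(S_g * S_s, axis=1)                       # scores of positive pairs (same session)
    neg = np.sum(S_g * np.roll(S_s, 1, axis=0), axis=1)   # scores of negative pairs (mismatched sessions)
    return np.mean(-np.log(sigmoid(pos)) - np.log(sigmoid(1.0 - neg)))
```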
Step 6: and calculating the recommended probability of the candidate item and giving a loss function.
The method comprises the following steps:
S = S_g + S_s    (13)
[Formulas (14) and (15), given as equation images in the original document, define the predicted probability of each candidate item and the cross-entropy loss over these probabilities.]
L = L_c + λL_s    (16)
wherein S is the embedded representation of the session, S_g is the session embedding representation under the global hypergraph, and S_s is the session embedding representation under the local session graph; z_i represents the one-hot encoding of the i-th candidate item, and ẑ_i represents the predicted probability of the i-th candidate item; L_c represents the cross-entropy loss function, L_s represents the contrast learning loss function, L represents the final loss function, and λ is a learnable parameter.
Steps 1 to 6 yield the candidate-item recommendation probabilities for a given session sequence. According to these recommendation probabilities, the hypergraph-fused contrast learning session recommendation is realized.
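Formulas (14) and (15) are available only as images; following the embodiment of step F (inner products between the session representation and the candidate items) and the stated cross-entropy loss, one plausible completion is a softmax over the inner products and the standard cross-entropy against the one-hot target. Both choices are assumptions.

```python
import numpy as np

def score_and_loss(S, item_emb, z, L_s, lam):
    """Candidate scoring and final loss of step 6 (assumed forms of (14) and (15)).

    S        : (d,) session embedding, S = S_g + S_s as in formula (13)
    item_emb : (m, d) embeddings of all candidate items
    z        : (m,) one-hot encoding of the ground-truth next item
    L_s      : contrast learning loss; lam : the weight lambda of formula (16)
    """
    scores = item_emb @ S                         # inner product with every candidate item
    z_hat = np.exp(scores - scores.max())
    z_hat /= z_hat.sum()                          # assumed (14): softmax recommendation probability
    L_c = -np.sum(z * np.log(z_hat + 1e-12))      # assumed (15): cross-entropy loss
    return z_hat, L_c + lam * L_s                 # (16): final loss L = L_c + lambda * L_s
```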
Advantageous effects
Compared with the prior art, the method provided by the invention has the following advantages:
according to the method, the hypergraph structure is used for modeling the global structure, so that higher-order information among projects can be better obtained, and meanwhile, interaction information among the projects is modeled by using the local session graph, so that intra-session information is provided. For two different session information, the method uses contrast learning to compare the two session information, so that the two session information are mutually supplemented, the defect of sparse session data is effectively overcome, and the recommendation performance is improved.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention will now be described in further detail with reference to the drawings and examples.
As shown in FIG. 1, the hypergraph-fused contrast learning session recommendation method comprises the following steps:
Step A: constructing a global hypergraph and a session graph according to the session data;
in this embodiment, the method is the same as step 1 of the summary of the invention;
Step B: inputting the global hypergraph data into the hypergraph attention network to obtain item embeddings;
in this embodiment, the method is the same as step 2 of the summary of the invention;
Step C: inputting the session graph data into the gated graph neural network to obtain item embeddings;
in this embodiment, the method is the same as step 3 of the summary of the invention;
Step D: obtaining the session embeddings under the global hypergraph and under the session graph by using an attention mechanism;
in this embodiment, the method is the same as step 4 of the summary of the invention;
Step E: enhancing the two session embeddings using the contrast learning method;
in this embodiment, the method is the same as step 5 of the summary of the invention;
Step F: calculating the candidate-item recommendation probabilities;
in this embodiment, the method is the same as step 6 of the summary of the invention.
Examples
Taking the session sequences "Session 1: [item 1, item 3, item 2, item 5, item 7]; Session 2: [item 2, item 6, item 8, item 10]; Session 3: [item 4, item 7, item 9, item 11]" as an embodiment, the specific operation steps of the hypergraph-fused contrast learning session recommendation method of the present invention are explained in detail below with a concrete example;
the hypergraph-fused contrast learning session recommendation method, as shown in FIG. 1, comprises the following steps:
Step A: constructing a global hypergraph and a session graph according to the session data;
in this embodiment, a global hypergraph and a local session graph are built according to the order of items in each session. The global hypergraph is built by constructing each session as a hyperedge, so the constructed global hypergraph has 3 hyperedges in total, which respectively connect all the items in sessions 1, 2, and 3. The local session graph constructs an item interaction graph for each session; taking session 1 as an example, the nodes in the graph are item 1, item 3, item 2, item 5, and item 7;
Step B: inputting the global hypergraph data into the hypergraph attention network to obtain item embeddings;
in this embodiment, all items in sessions 1, 2, and 3 are first embedded into a low-dimensional space to obtain a 100-dimensional embedded representation of each item; these embeddings are input into the constructed hypergraph attention network, and the learned 100-dimensional embedded representations of all items in sessions 1, 2, and 3 are obtained;
Step C: inputting the session graph data into the gated graph neural network to obtain item embeddings;
in this embodiment, the local session graphs of sessions 1, 2, and 3 constructed in step A are respectively input into the gated graph neural network, and a 100-dimensional embedded representation of each item is obtained through learning;
Step D: obtaining the session embeddings under the global hypergraph and under the session graph by using an attention mechanism;
specifically, in this embodiment, the item embeddings under the global hypergraph obtained in step B and the item embeddings under the local session graph obtained in step C are processed respectively: position information is fused into the item embeddings, and the average of all item embeddings in a session is used as the initial embedding of that session. Taking session 1 as an example, the average of the embeddings of item 1, item 3, item 2, item 5, and item 7 serves as the global embedding representation of session 1, the last item in the session, namely item 7, serves as the intention embedding representation of the session, and the final representation of the session is obtained using the attention mechanism. Finally, the session embeddings under the global hypergraph and under the session graph are obtained;
Step E: enhancing the two session embeddings using the contrast learning method;
in this embodiment, the embeddings of sessions 1, 2, and 3 under the global hypergraph and under the session graph are obtained in step D. If a session embedding under the global hypergraph and a session embedding under the local session graph represent the same session, the pair of session embeddings is recorded as a positive sample, that is, the embedding of session 1 under the global hypergraph and the embedding of session 1 under the session graph are a pair of positive samples, the embeddings of session 2 under the two graphs are a pair of positive samples, and the embeddings of session 3 under the two graphs are a pair of positive samples, with the rest being negative samples. The mutual information of the two session embeddings is maximized, the loss function under contrast learning is obtained, and it is added to the final loss function for training;
Step F: calculating the candidate-item recommendation probabilities;
specifically, in this embodiment, the items most likely to be recommended for each session are obtained by computing the inner products between the feature representations of session 1, session 2, and session 3 and the candidate items.

Claims (7)

1. The hypergraph-fused contrast learning session recommendation method is characterized by comprising the following steps of:
defining V as the set of all items, i.e., the item set: V = {v_1, v_2, v_3, ..., v_n}, wherein n is the number of items; each session is represented as a set s = {v_1^s, v_2^s, ..., v_L^s}, wherein L is the length of the session;
step 1: constructing a global hypergraph and a local session graph according to session data;
step 2: inputting the obtained global hypergraph data into a hypergraph attention network for learning to obtain the item embedded representation under the global hypergraph;
step 2.1: in the initial layer of the network, embedding each item from a high-dimensional space into a low-dimensional continuous vector space to obtain the initial embedded representation of each item, h_i^(0), wherein h_n^(0) represents the initial embedding of the n-th item;
step 2.2: inputting the initial embedding representation of each node into the first-layer hypergraph attention network, giving different degrees of attention to each node according to its information, and obtaining the feature embedding of each hyperedge by aggregating the information of the nodes on that hyperedge;
step 2.3: according to the obtained hyperedge feature embeddings, aggregating, for each node, the information of all hyperedges containing that node by using an attention mechanism to obtain the information of the node;
obtaining high-order information by stacking a plurality of hypergraph attention layers, outputting the embedded representation of each node, namely each item, at the last layer, and finally obtaining the item embeddings under the global hypergraph;
step 3: inputting the obtained local session graph data into a gated graph neural network for learning to obtain the item embedded representation under the local session graph;
taking the final output result as the feature embedding of each item in the session, and processing all the local session graphs in this way to obtain the item embedded representation under the local session graphs;
step 4: aggregating the obtained item embeddings under the global hypergraph and the item embeddings under the local session graph, respectively, by using an attention mechanism to obtain the session embedding representation under the global hypergraph and the session embedding representation under the local session graph;
respectively carrying out the attention-mechanism operation on the item embeddings under the global hypergraph and the item embeddings under the local session graph to finally obtain the session embedding representation under the global hypergraph and the session embedding representation under the local session graph;
step 5: using contrast learning to maximize the mutual information between the two session embeddings;
comparing the two session embeddings: if the session embedding under the global hypergraph and the session embedding under the local session graph represent the same session, marking the pair of session embeddings as a positive sample, otherwise marking it as a negative sample; maximizing the mutual information between the two session embeddings to obtain the contrastive loss under contrast learning;
step 6: calculating the recommendation probability of the candidate items and giving the loss function;
obtaining the candidate-item recommendation probabilities of a given session sequence from step 1 to step 6; and according to the recommendation probabilities, realizing the hypergraph-fused contrast learning session recommendation.
2. The hypergraph-fused contrast learning session recommendation method as claimed in claim 1, wherein step 1 comprises the steps of:
step 1.1: building the global hypergraph G_g = (V_g, E_g) from the session data, wherein V_g represents the set of vertices in the hypergraph and E_g represents the set of hyperedges in the hypergraph, consisting of all historical session sequences; constructing each session as a hyperedge, i.e., each session s forms a hyperedge connecting all the items in s, and each item v_i of the session is a vertex in V_g;
step 1.2: building the local session graph G_s = (V_s, E_s) from the session data, wherein V_s represents the vertex set of all items in the session and E_s represents the set of edges in the session graph;
the local session graph is constructed according to the current session: given a session s = {v_1^s, v_2^s, ..., v_L^s}, the set of all items in the session constitutes V_s, and directed edges e_i = (v_i^s, v_{i+1}^s) are formed in the local session graph according to the interaction order between the items, wherein v_i^s represents the item of the i-th interaction in session s.
3. The hypergraph-fused contrast learning session recommendation method of claim 1, wherein the implementation method of step 2.2 is as follows:
[Four formulas, given as equation images in the original document, compute the feature embedding of each hyperedge at layer l as the attention-weighted aggregation of the information passed to the hyperedge by its nodes;]
wherein the feature embedding of hyperedge e_j in the layer-l hypergraph attention network is aggregated from the information that each node i in N_j passes to e_j; N_j represents all the nodes connected by hyperedge e_j; μ^(l) represents the trainable node-level context vector in the layer-l hypergraph attention network; W_1^(l) and a second matrix are trainable transformation matrices; α_{i,j} represents the attention score of node i on hyperedge e_j; h_i^(l-1) represents the feature embedding of node i in the layer-(l-1) hypergraph attention network; s(·,·) represents the similarity between a node embedding and the context vector; d is the dimension, a and b are parameters with no practical significance, and T represents the transpose;
the implementation method of step 2.3 is as follows:
[Three formulas, given as equation images in the original document, compute the feature embedding of each node at layer l as the attention-weighted aggregation of the information conveyed to it by the hyperedges containing it;]
wherein h_i^(l) represents the feature embedding of node i in the layer-l hypergraph attention network; the information conveyed by hyperedge e_j to node i in the layer-l network is aggregated over Y_i, the set of all hyperedges connected to node i; the two involved matrices, one of which is W_3^(l), are trainable transformation matrices; β_{i,j} represents the attention score of hyperedge e_j on node i.
4. The hypergraph-fused contrast learning session recommendation method as claimed in claim 1, wherein step 3 comprises the steps of:
step 3.1: in the initial layer of the network, embedding each item from a high-dimensional space into a low-dimensional continuous vector space to obtain the initial embedded representation of each item;
step 3.2: learning each constructed local session graph by using a gated graph neural network, the method being as follows:
[Five formulas, given as equation images in the original document, define the message passing over the session graph and the gated update of each node embedding;]
wherein A_s is the adjacency matrix and H is a weight matrix; the gated graph neural network uses an update gate and a reset gate r_i^(l); the candidate state is an intermediate quantity in the calculation process; h_i^(l) represents the feature embedding of node i in the layer-l gated graph neural network, and h_i^(l-1) represents the feature embedding of node i in the layer-(l-1) network; ⊙ represents element-wise multiplication; tanh represents the hyperbolic tangent function; σ represents the sigmoid function; W_4, W_5, W_6, U_1, U_2, U_3, b_1 are all trainable parameters.
5. The hypergraph-fused contrast learning session recommendation method as claimed in claim 1, wherein the step 4 comprises:
x_i = tanh(W_7 (h_i || p_{L-i+1}) + b_2)    (13)
[Formula (14), given as an equation image in the original document, defines the intermediate quantity s from the item embeddings of the session;]
χ_i = q^T (W_8 x_i + W_9 s + b_3)    (15)
[Formula (16), given as an equation image in the original document, aggregates the items of the session with the attention scores χ_i to obtain the session embedding S;]
wherein x_i represents the feature embedded representation of the i-th item in the session after fusing position information; p_{L-i+1} represents the position embedding of the i-th item in the session; h_i represents the feature embedded representation of the i-th item in the session; χ_i represents the attention score of item i with respect to the entire session; s is an intermediate quantity in the calculation process; L represents the length of the session, i.e., the number of items in the session; S represents the embedded representation of the session; W_7, W_8, W_9, q, b_2, b_3 are all trainable parameters; T represents the transpose.
6. The hypergraph-fused contrast learning session recommendation method as claimed in claim 1, wherein the step 5 comprises:
L_s = -log σ(ρ_h · ρ_l) - log σ(1 - ρ_h · ρ_l)    (17)
wherein L_s represents the contrast learning loss function, ρ_h represents the positive sample, ρ_l represents the negative sample, and σ represents the sigmoid function.
7. The hypergraph-fused contrast learning session recommendation method as claimed in claim 1, wherein step 6 comprises:
S = S_g + S_s    (18)
[Formulas (19) and (20), given as equation images in the original document, define the predicted probability of each candidate item and the cross-entropy loss over these probabilities;]
L = L_c + λL_s    (21)
wherein S represents the embedded representation of the session, S_g is the session embedding representation under the global hypergraph, and S_s is the session embedding representation under the local session graph; z_i represents the one-hot encoding of the i-th candidate item, and ẑ_i represents the predicted probability of the i-th candidate item; L_c represents the cross-entropy loss function, L_s represents the contrast learning loss function, L represents the final loss function, and λ is a learnable parameter.
CN202211692089.3A 2022-12-28 2022-12-28 Hypergraph-fused contrast learning session recommendation method Pending CN116186390A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211692089.3A CN116186390A (en) 2022-12-28 2022-12-28 Hypergraph-fused contrast learning session recommendation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211692089.3A CN116186390A (en) 2022-12-28 2022-12-28 Hypergraph-fused contrast learning session recommendation method

Publications (1)

Publication Number Publication Date
CN116186390A true CN116186390A (en) 2023-05-30

Family

ID=86443438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211692089.3A Pending CN116186390A (en) 2022-12-28 2022-12-28 Hypergraph-fused contrast learning session recommendation method

Country Status (1)

Country Link
CN (1) CN116186390A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116541716A (en) * 2023-07-06 2023-08-04 深圳须弥云图空间科技有限公司 Recommendation model training method and device based on sequence diagram and hypergraph
CN116894122A (en) * 2023-07-06 2023-10-17 黑龙江大学 Cross-view contrast learning group recommendation method based on hypergraph convolutional network
CN116894122B (en) * 2023-07-06 2024-02-13 黑龙江大学 Cross-view contrast learning group recommendation method based on hypergraph convolutional network
CN116541716B (en) * 2023-07-06 2024-04-16 深圳须弥云图空间科技有限公司 Recommendation model training method and device based on sequence diagram and hypergraph

Similar Documents

Publication Publication Date Title
CN110119467B (en) Project recommendation method, device, equipment and storage medium based on session
Castillo Functional networks
CN116186390A (en) Hypergraph-fused contrast learning session recommendation method
CN111881342A (en) Recommendation method based on graph twin network
Li et al. Long-short term spatiotemporal tensor prediction for passenger flow profile
CN112529069B (en) Semi-supervised node classification method, system, computer equipment and storage medium
CN110738314B (en) Click rate prediction method and device based on deep migration network
CN113761250A (en) Model training method, merchant classification method and device
CN116664719B (en) Image redrawing model training method, image redrawing method and device
CN113868451A (en) Cross-modal social network conversation method and device based on context cascade perception
US20240037133A1 (en) Method and apparatus for recommending cold start object, computer device, and storage medium
CN114239675A (en) Knowledge graph complementing method for fusing multi-mode content
CN116821519A (en) Intelligent recommendation method for system filtering and noise reduction based on graph structure
CN113537613B (en) Temporal network prediction method for die body perception
CN112487231B (en) Automatic image labeling method based on double-image regularization constraint and dictionary learning
CN112734519B (en) Commodity recommendation method based on convolution self-encoder network
Huang et al. Short-term traffic flow prediction based on graph convolutional network embedded lstm
He et al. Center‐augmented ℓ2‐type regularization for subgroup learning
CN111737591A (en) Product recommendation method based on heterogeneous heavy-side information network translation model
CN117252665B (en) Service recommendation method and device, electronic equipment and storage medium
Li et al. Contextual Offline Demand Learning and Pricing with Separable Models
CN113065321B (en) User behavior prediction method and system based on LSTM model and hypergraph
Pang et al. A Survey on Dynamic Graph Neural Networks Modeling
CN112307227B (en) Data classification method
CN113420561B (en) Named entity identification method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination