CN114677234B - Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms - Google Patents

Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms

Info

Publication number
CN114677234B
CN114677234B (application CN202210445519.5A)
Authority
CN
China
Prior art keywords
social network
graph
node
matrix
social
Prior art date
Legal status
Active
Application number
CN202210445519.5A
Other languages
Chinese (zh)
Other versions
CN114677234A (en
Inventor
翟锐
张莉博
李绍华
于俊洋
宋亚林
王瑛琦
白晨希
Current Assignee
Henan University
Original Assignee
Henan University
Priority date
Filing date
Publication date
Application filed by Henan University filed Critical Henan University
Priority to CN202210445519.5A priority Critical patent/CN114677234B/en
Publication of CN114677234A publication Critical patent/CN114677234A/en
Application granted granted Critical
Publication of CN114677234B publication Critical patent/CN114677234B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01Social networking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention provides a graph convolution neural network social recommendation method and system integrating a multi-channel attention mechanism, which improves the social recommendation effect through the following work: 1. Node embeddings are learned from the node features and from the topological structure separately, and an embedding of their combination is learned at the same time, so that the characteristics they share are captured and over-reliance on a single kind of feature is alleviated. 2. By learning a scattering embedding of the topological structure, band-pass filtering of signals at different frequencies is realized and the over-smoothing phenomenon is reduced. 3. An attention mechanism is combined to fuse the related information. Experimental results show that, compared with other algorithms, the method and system improve performance on several social network data sets, and the invention also provides a new idea for subsequent research.

Description

Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms
Technical Field
The invention relates to the technical field of social recommendation, in particular to a graph convolution neural network social recommendation method and system integrating a multichannel attention mechanism.
Background
With the development of networks, people's ways of socializing have diversified, and social contact has become ever more convenient, moving from telephone and text messages to today's online social networking. Because the graph is a flexible structure with strong representation capability, more and more data are represented as graphs. In social recommendation, each node can be regarded as a user, and an edge between two nodes indicates that similar information exists between them. However, processing and mining graph data has always been a challenge. As the data scale increases, graph data become more complex and are comparatively troublesome to handle relative to regular data such as images and text, so learning graph data efficiently and accurately has become a very important problem. The social recommendation task, one of the most widespread tasks on graphs, has long been a focus of research.
The graph neural network (Graph Neural Network, GNN) developed from the ideas of convolutional neural networks, recurrent neural networks and deep autoencoders, and defines and designs neural network structures for processing graph data. Later, as research progressed, the family of GNNs grew: Graph Convolutional Networks (GCNs), Graph Attention Networks, Graph Autoencoders, Graph Generative Networks and others evolved for different task demands. In particular, the appearance of graph convolutional networks provided new ideas for the analysis of graph data. GCNs, currently a widely used neural network architecture for learning graph data, essentially bring the idea of CNNs into GNNs. GCNs form an end-to-end learning framework in which, at each convolution layer, a node aggregates its own information and that of its neighbors, continuously updating the node representation until an optimal node representation is obtained, which is then used to recommend similar users to social users.
Recent studies have shown that learning from a single graph structure or from node features alone does not perform well in node classification tasks. The basic GCN framework does not account well for the correlation between graph structure and node features. To address this problem, Wang et al. [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks[C]//KDD'20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020.] explored the correlation between graph structure and node features using a constructed k-nearest-neighbor graph, learned different representations of nodes, and introduced an attention mechanism to improve classification. Jin et al. [Jin W, Derr T, Wang Y, et al. Node Similarity Preserving Graph Convolutional Networks[C]//WSDM'21: The Fourteenth ACM International Conference on Web Search and Data Mining. ACM, 2021.] proposed a semi-supervised method that adaptively fuses graph structure and node feature information and learns from the fused information. However, these methods still do not yield optimal results when learning from graph structure and node features.
On the other hand, as the depth of GCNs increases, node representations become more and more similar, so that individual nodes are no longer sufficiently distinguishable. This greatly increases the difficulty of node classification and degrades the user recommendation effect. To address this problem, some scholars propose using geometric scattering information to alleviate over-smoothing: a geometric scattering network can capture higher-order regularity on the graph and fully learn signal information at different frequencies. Zou et al. [Dongmian Zou and Gilad Lerman, "Graph convolutional neural networks via scattering," Applied and Computational Harmonic Analysis, vol. 49, no. 3, pp. 1046-1074, 2020.] showed that a scattering network is approximately invariant to permutations of the generated features, enabling stable mapping operations. Fernando Gama et al. [Fernando Gama, Alejandro Ribeiro, and Joan Bruna, "Diffusion scattering transforms on graphs," in International Conference on Learning Representations, 2019.] demonstrated that diffusion wavelets can generalize the scattering transform to non-Euclidean domains while capturing high-frequency signals. Zhu et al. [Zhu H, Koniusz P. Simple Spectral Graph Convolution[C]//International Conference on Learning Representations, 2021.] use a simple spectral graph convolution to trade off the low-pass and high-pass filter bands that capture the global and local context of each node, thereby mitigating over-smoothing.
Disclosure of Invention
The invention addresses a problem of the current GCN framework when processing complex relation graphs for tasks such as social recommendation: because node features are relied on too heavily, over-smoothing easily occurs during node aggregation, making node representations hard to distinguish and seriously affecting the social recommendation effect. An attention mechanism is further added to allocate the weights adaptively, so that the node classification task is markedly improved and the social recommendation becomes more accurate.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
The invention provides a graph convolution neural network social recommendation method integrating a multichannel attention mechanism, which comprises the following steps:
Step 1: from the feature matrix X of the social network graph G_t = (A_t, X), computing node similarities with cosine similarity and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
Step 2: learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
Step 3: propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t;
Step 4: propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f;
Step 5: learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network;
Step 6: introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
Step 7: computing the category Y of each social network node based on the final embedding Z, so as to conduct social recommendation.
Further, the step 1 includes:
Firstly, a similarity matrix S ∈ ℝ^(n×n) of the social network data set is calculated with the cosine similarity method; then, for each node in the social network, edges are set to its top-k most similar nodes, so that a k-nearest-neighbor graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X), is constructed. The similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
Further, the step 2 includes:
Geometric scattering on G_t is constructed based on a lazy random walk matrix:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t;
In geometric scattering, wavelet transforms at scales 2^k are introduced, wherein U_0 represents the high-frequency signal of the node itself;
Using the first- and second-order high-frequency signals, namely:
According to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function.
Further, the step 3 includes:
The convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space is calculated as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
wherein W_t^(l) denotes the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X.
Further, the step 4 includes:
The convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X.
Further, the step 5 includes:
A convolution operation is performed once on the original social network graph G_t and once on the social network feature graph G_f to obtain two convolution output representations; the two representations are added and used as the convolution input of the next layer for both graphs, and this is repeated until all convolution operations are finished; the final convolution output is taken as the combined embedding Z_c.
Further, the step 6 includes:
For any social network node k, its embedding in Z_s is denoted z_s^k. Using a weight vector, the attention value ω_s^k of the node is calculated as shown in formula (12):
wherein W is a social network weight matrix;
Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained;
The attention value of social network node k is then normalized with the Softmax function, as shown in formula (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
Similarly, μ_t^k, μ_c^k and μ_f^k are calculated. For all n social network nodes in the graph, the weights are collected as μ_s, μ_t, μ_c and μ_f, which represent the attention values of Z_s, Z_t, Z_c and Z_f, respectively;
According to the attention value of each node of the social network, the final embedding Z of each social network node is calculated:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14).
further, the step 7 includes:
A cross-entropy loss function is adopted as the final objective function. The probability that social network node k belongs to class c is defined as p_kc; the class prediction of the n nodes is then denoted Ŷ, which is obtained from the final embedding Z;
wherein W is the weight vector of class c, and b denotes the bias, which is a constant.
Another aspect of the present invention provides a graph convolution neural network social recommendation system integrating a multi-channel attention mechanism, including:
The feature graph construction module is used for computing node similarities from the feature matrix X of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
The scattering module is used for learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
The topology module is used for propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t;
The feature module is used for propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f;
The combination module is used for learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network;
The attention module is used for introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
The social recommendation module is used for computing the category Y of each social network node based on the final embedding Z so as to conduct social recommendation.
Further, the feature graph construction module is specifically configured to:
firstly calculate a similarity matrix S ∈ ℝ^(n×n) of the social network data set with the cosine similarity method; then, for each node in the social network, set edges to its top-k most similar nodes, so as to construct a k-nearest-neighbor graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
Further, the scattering module is specifically configured to:
Geometric scattering on G_t is constructed based on a lazy random walk matrix:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t;
In geometric scattering, wavelet transforms at scales 2^k are introduced, wherein U_0 represents the high-frequency signal of the node itself;
Using the first- and second-order high-frequency signals, namely:
According to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function.
Further, the topology module is specifically configured to:
The convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space is calculated as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
wherein W_t^(l) denotes the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X.
Further, the feature module is specifically configured to:
The convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X.
Further, the combination module is specifically configured to:
A convolution operation is performed once on the original social network graph G_t and once on the social network feature graph G_f to obtain two convolution output representations; the two representations are added and used as the convolution input of the next layer for both graphs, and this is repeated until all convolution operations are finished; the final convolution output is taken as the combined embedding Z_c.
Further, the attention module is specifically configured to:
For any social network node k, its embedding in Z_s is denoted z_s^k. Using a weight vector, the attention value ω_s^k of the node is calculated as shown in formula (12):
wherein W is a social network weight matrix;
Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained;
The attention value of social network node k is then normalized with the Softmax function, as shown in formula (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
Similarly, μ_t^k, μ_c^k and μ_f^k are calculated. For all n social network nodes in the graph, the weights are collected as μ_s, μ_t, μ_c and μ_f, which represent the attention values of Z_s, Z_t, Z_c and Z_f, respectively;
According to the attention value of each node of the social network, the final embedding Z of each social network node is calculated:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14).
further, the social recommendation module is specifically configured to:
A cross-entropy loss function is adopted as the final objective function. The probability that social network node k belongs to class c is defined as p_kc; the class prediction of the n nodes is then denoted Ŷ, which is obtained from the final embedding Z;
wherein W is the weight vector of class c, and b denotes the bias, which is a constant.
Compared with the prior art, the invention has the beneficial effects that:
The invention considers both the fusion of the common information of node features and network topology and the commonly encountered over-smoothing problem, and effectively addresses them by adding the combined embedding and the geometric scattering. The combined embedding process learns the most relevant information of node features and network topology more effectively, and the scattering process mitigates over-smoothing by learning the signals of different frequencies from the first- and second-order neighbors. Experiments show that the method and system achieve good results on social recommendation tasks.
Drawings
FIG. 1 is a flowchart of a graph convolution neural network social recommendation method integrating a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 2 is an overall architecture diagram of a graph convolution neural network social recommendation system integrating a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 3 is a graph showing the result of analysis of the attention mechanism according to the embodiment of the present invention.
Detailed Description
For a better understanding of the present application, the following explanation is first made:
Given a social network graph G = (A, X), A is its symmetric adjacency matrix and X is its feature matrix; if there is an edge connection between social network nodes i and j, A_ij = 1, otherwise A_ij = 0.
The invention is further illustrated by the following description of specific embodiments in conjunction with the accompanying drawings:
As shown in fig. 1, a graph convolution neural network social recommendation method integrating a multi-channel attention mechanism (SM-GCN for short) is disclosed, which comprises the following steps:
Step S101: from the feature matrix X of the social network graph G_t = (A_t, X), computing node similarities with cosine similarity and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
Step S102: to alleviate the over-smoothing problem, learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
Step S103: propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t, so as to obtain information specific to G_t;
Step S104: propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f, so as to obtain information specific to G_f;
Step S105: learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network, so as to obtain their common characteristic information;
Step S106: introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
Step S107: computing the category Y of each social network node based on the final embedding Z so as to conduct social recommendation, thereby improving the effect on the social recommendation task.
Further, the step S101 includes:
For the construction of the social network feature graph, the cosine similarity method is used in consideration of computational complexity. Cosine similarity measures the difference between two nodes by computing the cosine of the angle between their two vectors: the smaller the angle, the more similar the two nodes, and vice versa. Compared with distance metrics, cosine similarity focuses more on the difference in direction between the two vectors.
Firstly, a similarity matrix S ∈ ℝ^(n×n) of the social network data set is calculated with the cosine similarity method; then, for each node in the social network, edges are set to its top-k most similar nodes, so that a k-nearest-neighbor (kNN) graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X), is constructed. The similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
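For illustration only, the following minimal sketch shows how the similarity matrix S and the kNN feature graph could be computed; the function name build_knn_graph and the NumPy-based implementation are assumptions of this sketch, not the patent's reference code.

```python
import numpy as np

def build_knn_graph(X, k):
    """Build the k-nearest-neighbor feature graph A_f from node features X.

    X: (n, d) feature matrix of the social network graph.
    k: number of most similar neighbors kept per node.
    Returns a symmetric adjacency matrix A_f of shape (n, n).
    """
    # Cosine similarity matrix S: S_ij = x_i . x_j / (|x_i| * |x_j|), formula (1)
    norm = np.linalg.norm(X, axis=1, keepdims=True) + 1e-12
    S = (X / norm) @ (X / norm).T
    np.fill_diagonal(S, -np.inf)          # exclude self-similarity

    A_f = np.zeros_like(S)
    topk = np.argsort(-S, axis=1)[:, :k]  # indices of the k most similar nodes per row
    rows = np.repeat(np.arange(X.shape[0]), k)
    A_f[rows, topk.ravel()] = 1.0
    return np.maximum(A_f, A_f.T)         # symmetrize so A_f is a valid adjacency matrix
```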
Further, the step S102 includes:
The essence of graph convolution is a Laplacian smoothing process, which makes neighboring nodes more and more similar and thus degrades classification. In this step, the invention aggregates the scattering features of the original social network graph and uses wavelet matrices to extract the multi-scale differences of the first- and second-order neighborhoods, compensating for the purely low-pass filtering of traditional GCNs (graph convolution neural networks); the scattering operation is complementary to the traditional GCN operation and slows down the over-smoothing phenomenon.
The geometric scattering on the graph is constructed based on a lazy random walk matrix, which can be expressed as:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t.
Since high-frequency signals can be recovered by a multi-scale wavelet transform, in geometric scattering the invention introduces wavelet transforms at scales 2^k,
wherein U_0 represents the high-frequency signal of the node itself;
Using first and second order high frequency signals, namely:
According to the propagation rule of GCNs, the scattering propagation rule is defined as:
wherein W_s^(l) is the weight matrix of the l-th layer of the original social network graph G_t, and σ is the activation function; the ReLU function is used here.
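For orientation, a sketch of this scattering channel follows. It assumes the lazy random walk matrix of formula (2) and the dyadic wavelets Ψ_k = P^(2^(k−1)) − P^(2^k) that are standard in the geometric scattering literature; because the exact aggregation of formula (5) is not legible in this text, the absolute-value-and-concatenate rule used below is one common choice rather than necessarily the patent's exact rule, and all names are illustrative.

```python
import torch

def lazy_random_walk(A):
    """Lazy random walk P = 1/2 (I + A_tilde D^{-1}), with self-loops added (formula (2))."""
    n = A.shape[0]
    A_tilde = A + torch.eye(n)
    d_inv = 1.0 / A_tilde.sum(dim=0)                 # inverse degrees of A_tilde
    return 0.5 * (torch.eye(n) + A_tilde * d_inv)    # column j scaled by 1 / d_j

def wavelets(P, K=2):
    """Dyadic graph wavelets Psi_0 = I - P and Psi_k = P^{2^{k-1}} - P^{2^k} (k = 1..K),
    which band-pass the first- and second-order high-frequency signals."""
    mats = [torch.eye(P.shape[0]) - P]
    Pk = P
    for _ in range(K):
        P2k = Pk @ Pk
        mats.append(Pk - P2k)
        Pk = P2k
    return mats

def scattering_embedding(A, X, W, sigma=torch.relu):
    """One scattering propagation step: |Psi_k X| over the kept scales, concatenated
    and mixed with the trainable weight matrix W (shape: 3*d_in x d_out)."""
    psi = wavelets(lazy_random_walk(A), K=2)
    U = torch.cat([torch.abs(p @ X) for p in psi], dim=1)
    return sigma(U @ W)
```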
Further, the step S103 includes:
A graph convolution operation is performed on the original social network graph and on the feature graph respectively, learning the topological information representation and the feature information representation of the nodes of each graph.
The convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space is calculated as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
wherein W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function (the ReLU function is used here), Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X.
Further, the step S104 includes:
The convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
wherein W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function (the ReLU function is used here), Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X.
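A compact sketch of the per-graph convolution of formulas (6) and (7) follows, assuming the standard symmetrically normalized GCN propagation with ReLU; the helper normalize_adj and the class GCNLayer are illustrative names.

```python
import torch

def normalize_adj(A):
    """Symmetrically normalized adjacency with self-loops: D~^{-1/2} (A + I) D~^{-1/2}."""
    A_tilde = A + torch.eye(A.shape[0])
    d_inv_sqrt = A_tilde.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * A_tilde * d_inv_sqrt.unsqueeze(0)

class GCNLayer(torch.nn.Module):
    """One layer Z^{(l)} = ReLU(A_hat Z^{(l-1)} W^{(l)}); applied with A_hat built from
    A_t for the topology channel (Z_t) and from A_f for the feature channel (Z_f)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = torch.nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, A_hat, Z):
        return torch.relu(self.W(A_hat @ Z))
```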
Further, the step S105 includes:
This step is designed to learn the common characteristics of the original social network graph and the feature graph, considering that the downstream social recommendation task may be related to information the two graphs have in common. The main idea is to share information between the two graphs by feeding both into the same channel and performing the convolution with the same parameters. First, one convolution operation is applied to the original social network graph and one to the feature graph, producing two convolution output representations; these are then added and used as the convolution input of the next layer for both graphs, and so on. Adding the representations after every convolution layer highlights the common characteristics of the nodes more than convolving the two graphs separately and summing only at the end.
The convolution output Z_ct^(1) of the first layer of the original social network graph G_t = (A_t, X) in the topology space is shown in formula (8):
Z_ct^(1) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) X W_c^(1) )   (8)
where σ is the activation function (the ReLU function is used here) and W_c^(1) is the weight matrix of the first layer of the social network.
The convolution output Z_cf^(1) of the first layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in formula (9):
Z_cf^(1) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) X W_c^(1) )   (9)
where σ is the activation function (the ReLU function is used here) and W_c^(1) is the weight matrix of the first layer of the social network, identical to the weight matrix used in the topology space.
The input of the second layer is the sum of the two outputs, as shown in formula (10):
Z_c^(1) = Z_ct^(1) + Z_cf^(1)   (10)
Thus the convolution of the second and subsequent layers is expressed as shown in formula (11), with the two graphs again sharing the weight matrix of that layer and their outputs being summed:
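The combination channel described above can be sketched as follows, under the assumption that each layer applies one shared weight matrix to both normalized graphs and sums the two outputs before the next layer; class and variable names are illustrative.

```python
import torch

class CommonConv(torch.nn.Module):
    """Combination channel producing Z_c: both graphs share one weight matrix per layer,
    and the two layer outputs are added to form the next layer's input."""
    def __init__(self, dims):
        super().__init__()
        self.layers = torch.nn.ModuleList(
            torch.nn.Linear(dims[i], dims[i + 1], bias=False) for i in range(len(dims) - 1))

    def forward(self, A_hat_t, A_hat_f, X):
        Z = X
        for W in self.layers:
            out_t = torch.relu(W(A_hat_t @ Z))   # shared W on the topology graph, formula (8)
            out_f = torch.relu(W(A_hat_f @ Z))   # shared W on the feature graph, formula (9)
            Z = out_t + out_f                    # summed input of the next layer, formula (10)
        return Z                                 # combined embedding Z_c
```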
further, the step S106 includes:
In order to improve the node classification task and the accuracy of social network recommendation, four embeddings of the social network nodes, namely Z_s, Z_t, Z_c and Z_f, have been obtained. The method uses an attention mechanism to compute the weight of each embedding, so that the more relevant information is emphasized. μ_s, μ_t, μ_c and μ_f represent the attention values of Z_s, Z_t, Z_c and Z_f, respectively.
For any social network node k, its embedding in Z_s is denoted z_s^k. The invention uses a weight vector to calculate the attention value ω_s^k of the node, as shown in formula (12):
wherein W is a social network weight matrix. Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f can be obtained.
The attention value of social network node k is then normalized with the Softmax function, as shown in formula (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
Similarly, μ_t^k, μ_c^k and μ_f^k can be calculated. For all n social network nodes in the graph, the weights are collected as μ_s, μ_t, μ_c and μ_f.
According to the attention values, the final embedding Z of each social network node is calculated:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14)
Specifically, the obtained final embedding Z is used to predict which category a node belongs to, and according to the obtained category, similar social interests and the like can be recommended to the user, making the recommendation more accurate.
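A sketch of the attention fusion of formulas (12)-(14) is given below. Since formula (12) is not legible here, the tanh-projection scoring follows the AM-GCN-style attention cited in the background rather than the patent's exact expression; the Softmax normalization and the weighted sum match formulas (13) and (14). Names are illustrative.

```python
import torch

class ChannelAttention(torch.nn.Module):
    """Fuse the four node embeddings Z_s, Z_t, Z_c, Z_f with node-wise attention."""
    def __init__(self, emb_dim, hidden=64):
        super().__init__()
        self.project = torch.nn.Sequential(
            torch.nn.Linear(emb_dim, hidden),
            torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1, bias=False))    # scoring vector, assumed form of formula (12)

    def forward(self, Zs, Zt, Zc, Zf):
        Z_all = torch.stack([Zs, Zt, Zc, Zf], dim=1)   # (n, 4, emb_dim)
        w = self.project(Z_all)                        # (n, 4, 1) per-channel attention values
        mu = torch.softmax(w, dim=1)                   # formula (13): Softmax over the 4 channels
        return (mu * Z_all).sum(dim=1)                 # formula (14): weighted sum -> final Z
```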
Further, the step S107 includes:
Cross entropy measures the degree of difference between two probability distributions of the same random variable; in machine learning it expresses the difference between the true probability distribution and the predicted probability distribution. The smaller the cross entropy, the better the model prediction. The invention adopts the cross-entropy loss function as the final objective function. The probability that social network node k belongs to class c is defined as p_kc; the class prediction of the n nodes is then denoted Ŷ, which is obtained from the final embedding Z.
Wherein W is a weight vector of class c; b denotes the bias, which is a constant.
Assuming the training set is L, for each l ∈ L the actual label is Y_l and the predicted label is Ŷ_l. The loss for node classification is expressed as the cross entropy over the training set:
Loss = − Σ_{l∈L} Σ_c Y_{lc} ln Ŷ_{lc}
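A minimal sketch of this step, assuming a linear classifier with softmax and the cross-entropy restricted to the labelled training nodes; names are illustrative.

```python
import torch
import torch.nn.functional as F

def classification_loss(Z, labels, train_mask, classifier):
    """Predict node categories from the fused embedding Z and compute the
    cross-entropy loss over the labelled training nodes only."""
    logits = classifier(Z)                      # classifier: e.g. torch.nn.Linear(emb_dim, C)
    loss = F.cross_entropy(logits[train_mask], labels[train_mask])
    preds = logits.argmax(dim=1)                # predicted category per node, used downstream
    return loss, preds                          # for the social recommendation
```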
On the basis of the above embodiment, as shown in fig. 2, the present invention further provides a graph convolution neural network social recommendation (SM-GCN for short) system integrating the multi-channel attention mechanism, including:
The feature graph construction module is used for computing node similarities from the feature matrix X of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
The scattering module is used for learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
The topology module is used for propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t;
The feature module is used for propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f;
The combination module is used for learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network;
The attention module is used for introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
The social recommendation module is used for computing the category Y of each social network node based on the final embedding Z so as to conduct social recommendation.
Further, the feature graph construction module is specifically configured to:
firstly calculate a similarity matrix S ∈ ℝ^(n×n) of the social network data set with the cosine similarity method; then, for each node in the social network, set edges to its top-k most similar nodes, so as to construct a k-nearest-neighbor graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
Further, the scattering module is specifically configured to:
Geometric scattering on G_t is constructed based on a lazy random walk matrix:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t;
In geometric scattering, wavelet transforms at scales 2^k are introduced, wherein U_0 represents the high-frequency signal of the node itself;
Using the first- and second-order high-frequency signals, namely:
According to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function.
Further, the topology module is specifically configured to:
The convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space is calculated as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
wherein W_t^(l) denotes the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X.
Further, the feature module is specifically configured to:
The convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X.
Further, the combination module is specifically configured to:
A convolution operation is performed once on the original social network graph G_t and once on the social network feature graph G_f to obtain two convolution output representations; the two representations are added and used as the convolution input of the next layer for both graphs, and this is repeated until all convolution operations are finished; the final convolution output is taken as the combined embedding Z_c.
Further, the attention module is specifically configured to:
For any social network node k, its embedding in Z_s is denoted z_s^k. Using a weight vector, the attention value ω_s^k of the node is calculated as shown in formula (12):
wherein W is a social network weight matrix;
Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained;
The attention value of social network node k is then normalized with the Softmax function, as shown in formula (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
Similarly, μ_t^k, μ_c^k and μ_f^k are calculated. For all n social network nodes in the graph, the weights are collected as μ_s, μ_t, μ_c and μ_f, which represent the attention values of Z_s, Z_t, Z_c and Z_f, respectively;
According to the attention value of each node of the social network, the final embedding Z of each social network node is calculated:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14).
further, the social recommendation module is specifically configured to:
A cross-entropy loss function is adopted as the final objective function. The probability that social network node k belongs to class c is defined as p_kc; the class prediction of the n nodes is then denoted Ŷ, which is obtained from the final embedding Z;
wherein W is the weight vector of class c, and b denotes the bias, which is a constant.
To verify the effect of the invention, the following experiments were performed:
Specifically, the present invention has been evaluated on three social network data sets (UAI2010, BlogCatalog and Flickr); the data set information is shown in Table 1.
Table 1 Dataset statistics
Dataset  Nodes  Edges  Classes  Features
BlogCatalog  5196  171743  6  8189
Flickr  7575  239738  9  12047
UAI2010  3067  28311  19  4973
To demonstrate the effectiveness of the method, the method of the present invention was compared to the following six popular node classification methods:
GCN [Thomas N Kipf and Max Welling. 2016. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016).]: one of the most popular semi-supervised graph convolutional network models at present; it achieves better representations by aggregating the information of neighbor nodes.
kNN-GCN [Luca Franceschi, Mathias Niepert, Massimiliano Pontil, and Xiao He. 2019. Learning discrete structures for graph neural networks. ICML (2019).]: a variant of GCN in which a k-nearest-neighbor graph is used as the input graph.
GAT: a graph neural network model widely used as a GNN baseline; node features are aggregated according to learned scores between different nodes.
DEMO-Net [Pei H, Wei B, Chang C, et al. Geom-GCN: Geometric Graph Convolutional Networks [J]. 2020.]: a degree-specific graph neural network for node classification.
MixHop [Sami Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Nazanin Alipourfard, Kristina Lerman, Hrayr Harutyunyan, Greg Ver Steeg, and Aram Galstyan. 2019. MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing. In ICML. 21-29.]: a method that builds layered feature representations on graph convolutions by mixing neighbors at various distances.
AM-GCN [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks[C]//KDD'20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020.]: a network model that fuses the feature graph, the original graph and their common feature representations.
The invention selects 20 labeled nodes per class as the training set and 1000 nodes as the test set. Two GCN layers are trained, with hidden-layer dimensions nhid1 ∈ {512, 768} and nhid2 ∈ {32, 128, 256}. The learning rate is set to 0.0002-0.0005, the dropout rate to 0.5, the weight decay ∈ {5e-3, 5e-4}, and k for the k-nearest-neighbor graph ∈ {2, …, 9}. For all baselines, the parameters reported in their papers are used. All experiments were run 5 times and the results averaged. Model performance is evaluated with Accuracy (ACC) and the macro F1 score (F1). Accuracy is the ratio of correctly classified samples to the total number of samples on the given test set, i.e., the probability of a correct prediction. A single accuracy metric has a significant defect when positive and negative samples are unbalanced, so the macro F1 score, the harmonic mean of precision and recall, is added.
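The two criteria could be computed, for example, with scikit-learn (listed in the software environment below); this snippet is only illustrative.

```python
from sklearn.metrics import accuracy_score, f1_score

def evaluate(y_true, y_pred):
    """Accuracy (ACC) and macro-averaged F1, the criteria reported in Tables 2 and 3."""
    return accuracy_score(y_true, y_pred), f1_score(y_true, y_pred, average="macro")
```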
The experimental environment is as follows:
Operating system: ubuntu Linux release 16.04.7LTS;
·CPU:Intel(R)Xeon(R)Silver CPU@2.20GHz;
·GPU:Quadro P4000;
Software version: python 3.7; pytorch.1.0; numpy 1.16.2; sciPy.3.1; networkx 2.4.4; scikit-learn 0.21.3.
Under the above settings, the results on the social recommendation task are shown in Table 2. The proposed SM-GCN obtains the best performance on both evaluation criteria for all data sets. In particular, on the BlogCatalog data set ACC is increased by 9.82% and F1 by 10.07%. Through analysis, the improvement is thought to come from the following factors.
1. Comparing the performance of GCN and KNN-GCN on the data sets shows that the two behave differently on different data sets, which illustrates that both the topology graph and the feature graph are important.
2. Comparing the behavior of KNN-GCN and MixHop on the data sets shows that KNN-GCN performs well on paper citation networks while MixHop performs well on social networks, which suggests that learning the characteristics of neighbor nodes is highly necessary.
3. The proposed SM-GCN learns the common node information of the topology graph and the feature graph better, and, owing to the scattering module, over-smoothing is alleviated; compared with AM-GCN, the experimental results improve by at least 1%.
Table 2 Social recommendation results in percentage (best results in bold)
To demonstrate the effectiveness of the proposed modules, ablation experiments were performed on the SM-GCN model. TF-GCN is the model containing only the topology module and the feature module; S-GCN contains the topology module, the feature module and the scattering module; C-GCN contains the topology module, the feature module and the combination module. The experimental results are shown in Table 3.
Table 3 Ablation experiment results in percentage (best results in bold)
On the BlogCatalog data set, the model of the invention improves greatly. From the experimental results of TF-GCN and S-GCN it can reasonably be inferred that much of the improvement comes from the combination module learning the common characteristics of the original graph and the feature graph. S-GCN, which adds the scattering module, improves over TF-GCN, which only uses the convolution of the original graph and the feature graph, because the scattering module adds signals of other frequencies from the original graph.
C-GCN likewise improves the classification effect compared with TF-GCN, which demonstrates the necessity of both the scattering module and the combination module.
To understand which embedding the classification task favors, a distribution analysis of the attention values of the model was performed, as shown in fig. 3.
Before the attention mechanism is applied, the model has four node embeddings: the original-graph node embedding, the feature-graph node embedding, the combined node embedding and the original-graph scattering node embedding. From fig. 3 it can be seen that on all three data sets (BlogCatalog, Flickr, UAI2010) the combined node embedding plays a crucial role. The scattering node embedding is more important than the topology node embedding and the feature node embedding, and on the UAI2010 data set the feature node embedding is second only to the combined node embedding, indicating that this data set leans more on the information carried by the features. The attention value of the scattering node embedding being higher than that of the topology node embedding indicates that the high-frequency signals contribute more to the classification task than the low-frequency signals generated by the GCN, so adding a scattering module is very necessary. In summary, the model distributes the weights adaptively, so that the social recommendation task achieves a better effect.
In summary, the proposed SM-GCN considers both the fusion of the common information of node features and network topology and the frequently encountered over-smoothing problem, and effectively addresses them by adding the combined embedding and the geometric scattering. The combined embedding process learns the most relevant information of node features and network topology more effectively, and the scattering process mitigates over-smoothing by learning the signals of different frequencies from the first- and second-order neighbors. Experiments show that the invention achieves good results on social recommendation tasks.
The foregoing is merely illustrative of the preferred embodiments of this invention, and it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of this invention, and it is intended to cover such modifications and changes as fall within the true scope of the invention.

Claims (6)

1. A graph convolution neural network social recommendation method integrating a multi-channel attention mechanism, characterized by comprising the following steps:
Step 1: from the feature matrix X of the social network graph G_t = (A_t, X), computing node similarities with cosine similarity and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
Step 2: learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
Step 3: propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t;
Step 4: propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f;
Step 5: learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network;
Step 6: introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
Step 7: computing the category Y of each social network node based on the final embedding Z so as to conduct social recommendation;
The step 2 comprises:
Geometric scattering on G_t is constructed based on a lazy random walk matrix:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t;
In geometric scattering, wavelet transforms at scales 2^k are introduced, wherein U_0 represents the high-frequency signal of the node itself;
Using the first- and second-order high-frequency signals, namely:
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function;
The step 5 comprises:
performing a convolution operation once on the original social network graph G_t and once on the social network feature graph G_f to obtain two convolution output representations, adding the two representations as the convolution input of the next layer for both graphs, and repeating this until all convolution operations are finished; the final convolution output is taken as the combined embedding Z_c;
The step 6 comprises:
For any social network node k, its embedding in Z_s is denoted z_s^k. Using a weight vector, the attention value ω_s^k of the node is calculated as shown in formula (12):
wherein W is a social network weight matrix;
Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained;
The attention value of social network node k is then normalized with the Softmax function, as shown in formula (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
Similarly, μ_t^k, μ_c^k and μ_f^k are calculated. For all n social network nodes in the graph, the weights are collected as μ_s, μ_t, μ_c and μ_f, which represent the attention values of Z_s, Z_t, Z_c and Z_f, respectively;
According to the attention value of each node of the social network, the final embedding Z of each social network node is calculated:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14)
The step 7 comprises:
adopting a cross-entropy loss function as the final objective function; the probability that social network node k belongs to class c is defined as p_kc; the class prediction of the n nodes is then denoted Ŷ, which is obtained from the final embedding Z;
wherein W is the weight vector of class c, and b denotes the bias, which is a constant.
2. The graph convolution neural network social recommendation method integrating a multi-channel attention mechanism according to claim 1, wherein the step 1 comprises:
firstly calculating a similarity matrix S ∈ ℝ^(n×n) of the social network data set with the cosine similarity method; then, for each node in the social network, setting edges to its top-k most similar nodes, so as to construct a k-nearest-neighbor graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
3. The graph convolution neural network social recommendation method integrating a multi-channel attention mechanism according to claim 1, wherein the step 3 comprises:
calculating the convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
where W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X.
4. The graph convolution neural network social recommendation method integrating a multi-channel attention mechanism according to claim 1, wherein the step 4 comprises:
calculating the convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space as shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
where W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X.
5. A graph convolution neural network social recommendation system integrating a multi-channel attention mechanism, based on the social recommendation method of any one of claims 1-4, comprising:
The feature graph construction module, used for computing node similarities from the feature matrix X of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing a k-nearest-neighbor graph of G_t based on the features, namely the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t, and A_f denotes the symmetric adjacency matrix of G_f;
The scattering module, used for learning the embedding Z_S of the multiple signals of the social network data graph G_t based on the graph convolution neural network;
The topology module, used for propagating the features of the G_t nodes over the topology space, performing the convolution operation, and learning the node embedding Z_t of G_t;
The feature module, used for propagating the features of the G_f nodes over the feature space, performing the convolution operation, and learning the node embedding Z_f of G_f;
The combination module, used for learning the combined embedding Z_c of G_t and G_f based on the graph convolution neural network;
The attention module, used for introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node based on the adjusted Z_S, Z_t, Z_f, Z_c;
And the social recommendation module, used for computing the category Y of each social network node based on the final embedding Z so as to conduct social recommendation.
6. The graph convolution neural network social recommendation system integrating a multi-channel attention mechanism of claim 5, wherein the feature graph construction module is specifically configured to:
firstly calculate a similarity matrix S ∈ ℝ^(n×n) of the social network data set with the cosine similarity method; then, for each node in the social network, set edges to its top-k most similar nodes, so as to construct a k-nearest-neighbor graph of the original graph, namely the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as:
S_ij = (x_i · x_j) / (|x_i| · |x_j|)   (1)
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively;
The scattering module is specifically used for:
constructing geometric scattering on G_t based on a lazy random walk matrix:
P = (1/2)(I_n + Ã_t D^(−1))   (2)
where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t;
in geometric scattering, wavelet transforms at scales 2^k are introduced, wherein U_0 represents the high-frequency signal of the node itself;
using the first- and second-order high-frequency signals, namely:
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function;
The topology module is specifically configured to:
calculate the convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topology space as shown in formula (6):
Z_t^(l) = σ( D̃_t^(−1/2) Ã_t D̃_t^(−1/2) Z_t^(l−1) W_t^(l) )   (6)
where W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and Z_t^(0) = X;
The feature module is specifically used for:
calculating the convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space as shown in formula (7):
Z_f^(l) = σ( D̃_f^(−1/2) Ã_f D̃_f^(−1/2) Z_f^(l−1) W_f^(l) )   (7)
where W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and Z_f^(0) = X;
The combination module is specifically used for:
performing a convolution operation once on the original social network graph G_t and once on the social network feature graph G_f to obtain two convolution output representations, adding the two representations as the convolution input of the next layer for both graphs, and repeating this until all convolution operations are finished; the final convolution output is taken as the combined embedding Z_c;
the attention module is specifically configured to:
For any social network node k, the embedding in Z s is represented as Using weight vector/>Calculate the attention value/>, of the nodeAs shown in formula (12):
Wherein the method comprises the steps of A social network weight matrix;
Similarly, the attention value of the social network node k in Z t,Zc and Z f is obtained And/>
The attention value of the social network node k is then derived using the Softmax function, as shown in equation (13):
And the same is calculated to obtain For all of the social network nodes n in the graph,AndMu stcf represents the attention value of Z s,Zt,Zc,Zf, respectively;
according to the attention value of each social network node, the final embedding Z of each social network node is calculated as shown in formula (14):
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f  (14);
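A minimal sketch of the attention fusion in formulas (12)-(14), assuming NumPy and a single attention projection W_att, attention vector q and bias b shared across the four channels (whether the parameters are shared or per-channel is an assumption of the sketch):

```python
import numpy as np

def channel_scores(Z, W_att, q, b):
    """omega^k = q^T tanh(W_att z^k + b) for every node k; returns shape [n_nodes]."""
    return (np.tanh(Z @ W_att.T + b) @ q).ravel()

def fuse_channels(Z_s, Z_t, Z_c, Z_f, W_att, q, b):
    """Per-node softmax over the four channel scores, then the weighted sum of formula (14)."""
    scores = np.stack([channel_scores(Z, W_att, q, b) for Z in (Z_s, Z_t, Z_c, Z_f)], axis=1)
    scores -= scores.max(axis=1, keepdims=True)                        # numerical stability
    mu = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)    # [n_nodes, 4]
    return (mu[:, 0:1] * Z_s + mu[:, 1:2] * Z_t +
            mu[:, 2:3] * Z_c + mu[:, 3:4] * Z_f)
```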
the social recommendation module is specifically configured to:
adopting a cross entropy loss function as the final objective function; the probability that the social network node k belongs to class c is defined as ŷ_kc, so that the class prediction for the n nodes is Ŷ = [ŷ_kc], which is obtained from the final embedding Z as:
Ŷ = softmax(W·Z + b)
wherein W is the weight vector of class c, and b denotes the bias, which is a constant.
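A minimal sketch of the classification head and cross-entropy objective used for the recommendation step, assuming NumPy, a one-hot label matrix Y_true and an index array of labeled nodes; the parameter names W_cls and b_cls are hypothetical:

```python
import numpy as np

def predict_classes(Z, W_cls, b_cls):
    """Y_hat = softmax(Z W_cls^T + b_cls): one class-probability row per node."""
    logits = Z @ W_cls.T + b_cls
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)

def cross_entropy(Y_hat, Y_true, labeled_idx):
    """Cross-entropy loss summed over the labeled social network nodes."""
    return float(-np.sum(Y_true[labeled_idx] * np.log(Y_hat[labeled_idx] + 1e-12)))
```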
CN202210445519.5A 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms Active CN114677234B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210445519.5A CN114677234B (en) 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms

Publications (2)

Publication Number Publication Date
CN114677234A (en) 2022-06-28
CN114677234B (en) 2024-04-30

Family

ID=82080899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210445519.5A Active CN114677234B (en) 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms

Country Status (1)

Country Link
CN (1) CN114677234B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104601438A (en) * 2014-04-28 2015-05-06 腾讯科技(深圳)有限公司 Friend recommendation method and device
CN106980659A (en) * 2017-03-20 2017-07-25 华中科技大学鄂州工业技术研究院 A kind of doings based on isomery graph model recommend method
KR101872733B1 (en) * 2017-06-14 2018-06-29 조선대학교산학협력단 System for recommending social networking service following and method for recommending social networking service following using it
CN108320187A (en) * 2018-02-02 2018-07-24 合肥工业大学 A kind of recommendation method based on depth social networks
CN109410080A (en) * 2018-10-16 2019-03-01 合肥工业大学 A kind of social image recommended method based on level attention mechanism
CN110009093A (en) * 2018-12-07 2019-07-12 阿里巴巴集团控股有限公司 For analyzing the nerve network system and method for relational network figure
CN111259142A (en) * 2020-01-14 2020-06-09 华南师范大学 Specific target emotion classification method based on attention coding and graph convolution network
CN111428147A (en) * 2020-03-25 2020-07-17 合肥工业大学 Social recommendation method of heterogeneous graph volume network combining social and interest information
CN111523051A (en) * 2020-04-24 2020-08-11 山东师范大学 Social interest recommendation method and system based on graph volume matrix decomposition
CN113158071A (en) * 2021-03-19 2021-07-23 广东工业大学 Knowledge social contact recommendation method, system and equipment based on graph neural network
CN114036405A (en) * 2021-11-02 2022-02-11 扬州大学 Social contact recommendation method and system based on graph convolution network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210248461A1 (en) * 2020-02-11 2021-08-12 Nec Laboratories America, Inc. Graph enhanced attention network for explainable poi recommendation

Also Published As

Publication number Publication date
CN114677234A (en) 2022-06-28

Similar Documents

Publication Publication Date Title
Wang et al. Comparative study of monthly inflow prediction methods for the Three Gorges Reservoir
CN113407759B (en) Multi-modal entity alignment method based on adaptive feature fusion
CN107391670A (en) A kind of mixing recommendation method for merging collaborative filtering and user property filtering
CN113656596A (en) Multi-modal entity alignment method based on triple screening fusion
CN115358487A (en) Federal learning aggregation optimization system and method for power data sharing
CN106411572A (en) Community discovery method combining node information and network structure
CN116340646A (en) Recommendation method for optimizing multi-element user representation based on hypergraph motif
CN113054651B (en) Network topology optimization method, device and system
Kadavankandy et al. The power of side-information in subgraph detection
Zhou et al. Intelligent analysis system for signal processing tasks based on LSTM recurrent neural network algorithm
CN103793747A (en) Sensitive information template construction method in network content safety management
CN114037014A (en) Reference network clustering method based on graph self-encoder
Li et al. Evolutionary multiobjective optimization with clustering-based self-adaptive mating restriction strategy
CN103824285A (en) Image segmentation method based on bat optimal fuzzy clustering
CN114677234B (en) Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms
CN105159918A (en) Trust correlation based microblog network community discovery method
Cavallo et al. GCNH: A Simple Method For Representation Learning On Heterophilous Graphs
Li et al. Research on feature importance evaluation of wireless signal recognition based on decision tree algorithm in cognitive computing
CN113159160A (en) Semi-supervised node classification method based on node attention
Hao et al. An adaptive stochastic resonance detection method with a knowledge-based improved artificial fish swarm algorithm
Liang et al. Optimization of basic clustering for ensemble clustering: an information-theoretic perspective
Hafidi et al. Graph-Assisted Bayesian Node Classifiers
Zhai et al. A multi-channel attention graph convolutional neural network for node classification
Purnawansyah et al. K-Means clustering implementation in network traffic activities
Sun et al. RLIM: representation learning method for influence maximization in social networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant