CN114677234A - Graph convolutional neural network social recommendation method and system fusing a multi-channel attention mechanism - Google Patents
- Publication number
- CN114677234A (application number CN202210445519.5A)
- Authority
- CN
- China
- Prior art keywords
- social network
- graph
- node
- convolution
- matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention provides a social recommendation method and system based on a graph convolutional neural network fusing a multi-channel attention mechanism, which improves the social recommendation effect through the following work. First, in addition to learning node embeddings from the node features and from the topological structure separately, a combined embedding of the two is learned to obtain their common characteristics, relieving over-dependence on a single feature source. Second, by learning a scattering embedding of the topological structure, band-pass filtering of signals at different frequencies is realized, reducing the over-smoothing phenomenon. Third, an attention mechanism is combined to fuse the related information. Experimental results show that, compared with other algorithms, the method and system achieve improved performance on several social network data sets, and they also provide a new direction for subsequent research.
Description
Technical Field
The invention relates to the technical field of social recommendation, and in particular to a graph convolutional neural network social recommendation method and system fusing a multi-channel attention mechanism.
Background
With the development of networks, people's ways of socializing have diversified, moving from the earlier short messages to today's online social platforms, and socializing has become ever more convenient. The graph is flexible in structure and has powerful representation capability, so more and more data are represented as graphs. In social recommendation, each node can be regarded as a user, and an edge between two nodes indicates that similar information exists between them. However, processing and mining graph data has long been a challenge. As the data size increases, graph data become more complex and are relatively troublesome to process compared with signal data such as images and language, so how to learn representations of graph data efficiently and accurately is a very important issue. The social recommendation task, one of the most widespread tasks on graphs, has long been a focus of researchers.
The Graph Neural Network (GNN) draws on the ideas of convolutional neural networks, recurrent neural networks and deep autoencoders to define and design neural network structures for processing graph data. As research progressed, the GNN family grew: Graph Convolutional Networks (GCNs), Graph Attention Networks (GATs), Graph Autoencoders (GAEs), Graph Generative Networks and others emerged for different task requirements. The appearance of graph convolutional networks in particular provided new ideas for the analysis of graph data. GCNs are currently a widely used neural network architecture for learning graph data, and their basic idea is to bring the idea of the CNN into the GNN. GCNs form an end-to-end learning framework: through each layer of convolution, every node aggregates the information of itself and its neighbours, so the node information is continuously updated toward an optimal representation, and similar users can then be recommended to social users.
Recent studies have shown that learning a single graph structure or node features alone does not work well in the node classification task, and the basic GCN framework does not take the correlation between graph structure and node features well into account. Aiming at this problem, Wang et al. [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [C] // KDD '20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020] use a constructed k-order adjacency graph to mine the correlation between graph structure and node features, learn different representations of nodes, and introduce an attention mechanism to improve the classification effect. Jin et al. [Jin W, Derr T, Wang Y, et al. Node Similarity Preserving Graph Convolutional Networks [C] // WSDM '21: The 14th ACM International Conference on Web Search and Data Mining. ACM, 2021] propose a semi-supervised method that adaptively fuses graph structure and node feature information and learns from the fused information. However, these methods still do not achieve the best learning effect on graph structure and node features.
On the other hand, as the depth of GCNs increases, the node representations become more and more similar, so the personalized features of each node cannot be fully embodied; this greatly increases the difficulty of node classification and degrades the user recommendation effect. Aiming at this problem, some scholars propose to relieve over-smoothing using geometric scattering information: a geometric scattering network can capture high-order regular patterns on a graph and fully learn signal information at different frequencies. Zou et al. [Dongmian Zou and Gilad Lerman, "Graph convolutional neural networks via scattering," Applied and Computational Harmonic Analysis, vol. 49, no. 3, pp. 1046-1074, 2020] demonstrated that the scattering network makes the generated features nearly invariant to permutation, enabling stable graph operation. Fernando Gama et al. [Fernando Gama, Alejandro Ribeiro, and Joan Bruna, "Diffusion scattering transforms on graphs," in International Conference on Learning Representations, 2019] demonstrated that, using diffusion wavelets, the scattering transform can be generalized to non-Euclidean domains while still capturing high-frequency signals. Zhu et al. [Zhu H, Koniusz P. Simple Spectral Graph Convolution [C] // International Conference on Learning Representations 2021] use a simple spectral graph convolution to balance the low-pass and high-pass filter bands of the global and local context captured for each node, thereby mitigating over-smoothing.
Disclosure of Invention
Aiming at the problems of current GCN frameworks in processing complex relation graphs, the invention is proposed. In social recommendation, for example, over-dependence on node features and the over-smoothing phenomenon that easily occurs during node aggregation make node representations difficult to distinguish, which seriously affects the recommendation effect. The invention learns multiple complementary embeddings and adds an attention mechanism to adaptively distribute their weights, so that the node classification effect is obviously improved and social recommendation becomes more accurate.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a graph convolution neural network social contact recommendation method fusing a multi-channel attention mechanism, which comprises the following steps:
Step 1: computing the similarity matrix of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing the k-nearest-neighbour graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t and A_f denotes the symmetric adjacency matrix of G_f;
Step 2: learning the embedding Z_s of multiple signals of the social network data graph G_t based on the graph convolutional neural network;
Step 3: propagating the features of the nodes of G_t over the topological space, performing convolution operations, and learning the node embedding Z_t of G_t;
Step 4: propagating the features of the nodes of G_f over the feature space, performing convolution operations, and learning the node embedding Z_f of G_f;
Step 5: learning the combined embedding Z_c of G_t and G_f based on the graph convolutional neural network;
Step 6: introducing an attention mechanism to dynamically adjust the weights of Z_s, Z_t, Z_f and Z_c, and calculating the final embedding Z of each social network node based on the adjusted Z_s, Z_t, Z_f and Z_c;
Step 7: calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
Further, the step 1 comprises:
First, the similarity matrix S of the social network data set is calculated using the cosine similarity method. Then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbour graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / ( ||X_i|| ||X_j|| )   (1)

wherein X_i and X_j are the feature vectors of social network nodes i and j, respectively.
Further, the step 2 comprises:
construction G based on inertia random walk matrixtUpper geometric scattering:
wherein InIs a matrix of units, and is,as an original social network diagram GtAn adjacency matrix with added self-loops, D isA diagonal matrix of (a);
the wavelet transform at dyadic scales 2^k is defined as

Ψ_k = P^(2^(k-1)) − P^(2^k)   (3)

wherein U_0 = X represents the high-frequency signal of the node itself; the first-order and second-order high-frequency signals are used, namely:

U_1 = Ψ_1 X,  U_2 = Ψ_2 X   (4)
according to the propagation rule of the graph convolutional neural network, the following scattering propagation rule is defined:

Z_s^(l+1) = σ( Ψ_k Z_s^(l) W_s^(l) )   (5)

wherein W_s^(l) is the weight matrix of the l-th scattering layer and σ is the activation function.
Further, the step 3 comprises:
The convolution output H_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is calculated as shown in equation (6):

H_t^(l) = σ( D̃_t^(-1/2) Ã_t D̃_t^(-1/2) H_t^(l-1) W_t^(l) )   (6)

wherein W_t^(l) denotes the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and the input of the first layer is H_t^(0) = X.
Further, the step 4 comprises:
The convolution output H_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

H_f^(l) = σ( D̃_f^(-1/2) Ã_f D̃_f^(-1/2) H_f^(l-1) W_f^(l) )   (7)

wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the input of the first layer is H_f^(0) = X.
Further, the step 5 comprises:
The original social network graph G_t and the social network feature graph G_f are each put through one convolution operation to obtain two convolution output representations; the two outputs are then added together to serve as the convolution input of the next layer of convolution of both graphs, and this cycle repeats until all convolution operations are finished; the final convolution output serves as the combined embedding Z_c.
Further, the step 6 comprises:
For any social network node k, its embedding in Z_s is z_s^k. Using the weight vector q_s, the attention value ω_s^k of the node is calculated as shown in equation (12):

ω_s^k = q_s^T · tanh( W · (z_s^k)^T + b )   (12)
Then the attention value μ_s^k of social network node k is obtained using the Softmax function, as shown in equation (13):

μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
In the same way, ω_t^k, ω_c^k and ω_f^k, and then μ_t^k, μ_c^k and μ_f^k, are calculated. For all n social network nodes in the graph, μ_s, μ_t, μ_c and μ_f respectively denote the attention values of Z_s, Z_t, Z_c and Z_f;
the final embedding Z of each social network node is calculated from the attention values of the nodes:

Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14)
further, the step 7 includes:
A cross-entropy loss function is adopted as the final objective function. The probability that social network node k belongs to class c is defined as Ŷ_kc, and the classes of the n nodes are predicted as Ŷ, derived from the final embedding Z:

Ŷ = softmax( W·Z + b )   (15)

wherein W is the weight vector of class c and b denotes the bias, which is a constant.
In another aspect, the invention provides a graph convolutional neural network social recommendation system fusing a multi-channel attention mechanism, which comprises:
a feature graph construction module, configured to compute the similarity matrix of the social network graph G_t = (A_t, X) using cosine similarity and then construct the k-nearest-neighbour graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t and A_f denotes the symmetric adjacency matrix of G_f;
a scattering module, configured to learn the embedding Z_s of multiple signals of the social network data graph G_t based on the graph convolutional neural network;
a topology module, configured to propagate the features of the nodes of G_t over the topological space, perform convolution operations, and learn the node embedding Z_t of G_t;
a feature module, configured to propagate the features of the nodes of G_f over the feature space, perform convolution operations, and learn the node embedding Z_f of G_f;
a combination module, configured to learn the combined embedding Z_c of G_t and G_f based on the graph convolutional neural network;
an attention module, configured to introduce an attention mechanism to dynamically adjust the weights of Z_s, Z_t, Z_f and Z_c, and calculate the final embedding Z of each social network node based on the adjusted Z_s, Z_t, Z_f and Z_c;
and a social recommendation module, configured to calculate the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
Further, the feature map construction module is specifically configured to:
First, the similarity matrix S of the social network data set is calculated using the cosine similarity method. Then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbour graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / ( ||X_i|| ||X_j|| )   (1)

wherein X_i and X_j are the feature vectors of social network nodes i and j, respectively.
Further, the scattering module is specifically configured to:
The geometric scattering on G_t is constructed based on the inert random walk matrix:

P = (1/2)( I_n + Â D^(-1) )   (2)

wherein I_n is the identity matrix, Â is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Â;
the wavelet transform at dyadic scales 2^k is defined as

Ψ_k = P^(2^(k-1)) − P^(2^k)   (3)

wherein U_0 = X represents the high-frequency signal of the node itself; the first-order and second-order high-frequency signals are used, namely:

U_1 = Ψ_1 X,  U_2 = Ψ_2 X   (4)
according to the propagation rule of the graph convolutional neural network, the following scattering propagation rule is defined:

Z_s^(l+1) = σ( Ψ_k Z_s^(l) W_s^(l) )   (5)

wherein W_s^(l) is the weight matrix of the l-th scattering layer and σ is the activation function.
Further, the topology module is specifically configured to:
The convolution output H_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is calculated as shown in equation (6):

H_t^(l) = σ( D̃_t^(-1/2) Ã_t D̃_t^(-1/2) H_t^(l-1) W_t^(l) )   (6)

wherein W_t^(l) denotes the weight matrix of the l-th layer of G_t, σ is the activation function, Ã_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and the input of the first layer is H_t^(0) = X.
Further, the feature module is specifically configured to:
The convolution output H_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

H_f^(l) = σ( D̃_f^(-1/2) Ã_f D̃_f^(-1/2) H_f^(l-1) W_f^(l) )   (7)

wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the input of the first layer is H_f^(0) = X.
Further, the combination module is specifically configured to:
The original social network graph G_t and the social network feature graph G_f are each put through one convolution operation to obtain two convolution output representations; the two outputs are then added together to serve as the convolution input of the next layer of convolution of both graphs, and this cycle repeats until all convolution operations are finished; the final convolution output serves as the combined embedding Z_c.
Further, the attention module is specifically configured to:
For any social network node k, its embedding in Z_s is z_s^k. Using the weight vector q_s, the attention value ω_s^k of the node is calculated as shown in equation (12):

ω_s^k = q_s^T · tanh( W · (z_s^k)^T + b )   (12)
Then the attention value μ_s^k of social network node k is obtained using the Softmax function, as shown in equation (13):

μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
In the same way, ω_t^k, ω_c^k and ω_f^k, and then μ_t^k, μ_c^k and μ_f^k, are calculated. For all n social network nodes in the graph, μ_s, μ_t, μ_c and μ_f respectively denote the attention values of Z_s, Z_t, Z_c and Z_f;
the final embedding Z of each social network node is calculated from the attention values of the nodes:

Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14)
further, the social recommendation module is specifically configured to:
A cross-entropy loss function is adopted as the final objective function. The probability that social network node k belongs to class c is defined as Ŷ_kc, and the classes of the n nodes are predicted as Ŷ, derived from the final embedding Z:

Ŷ = softmax( W·Z + b )   (15)

wherein W is the weight vector of class c and b denotes the bias, which is a constant.
Compared with the prior art, the invention has the following beneficial effects:
The invention simultaneously considers fusing the common information of node features and network topology and the frequently encountered over-smoothing problem, and effectively solves both by combining the combined embedding with the addition of geometric scattering. The combined embedding process learns the most relevant information of node features and network topology more effectively, while the scattering process relieves over-smoothing by learning the different frequency signals of first-order and second-order neighbours. Experiments show that the method and system achieve good results in social recommendation tasks.
Drawings
FIG. 1 is a flowchart of the graph convolutional neural network social recommendation method fusing a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 2 is a diagram of an overall architecture of a convolutional neural network social recommendation system incorporating a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 3 is a graph of the results of an attention mechanism analysis in accordance with an embodiment of the present invention.
Detailed Description
For a better understanding of the present application, the following explanations are first made:
given a social networking graph G ═ (a, X), a is its symmetric adjacency matrix, andx is its feature matrix, anIf there is an edge connection between social network nodes i and j, then Aij1, otherwise Aij=0。
The invention is further illustrated by the following examples in conjunction with the accompanying drawings:
As shown in FIG. 1, the graph convolutional neural network social recommendation method fusing a multi-channel attention mechanism (SM-GCN for short) includes:
Step S101: computing the similarity matrix of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing the k-nearest-neighbour graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); wherein A_t denotes the symmetric adjacency matrix of G_t and A_f denotes the symmetric adjacency matrix of G_f;
Step S102: to alleviate the over-smoothing problem, learning the embedding Z_s of multiple signals of the social network data graph G_t based on the graph convolutional neural network;
Step S103: propagating the features of the nodes of G_t over the topological space, performing convolution operations, and learning the node embedding Z_t of G_t, so as to obtain the information unique to G_t;
Step S104: propagating the features of the nodes of G_f over the feature space, performing convolution operations, and learning the node embedding Z_f of G_f, so as to obtain the information unique to G_f;
Step S105: learning the combined embedding Z_c of G_t and G_f based on the graph convolutional neural network, so as to obtain their common characteristic information;
Step S106: introducing an attention mechanism to dynamically adjust the weights of Z_s, Z_t, Z_f and Z_c, and calculating the final embedding Z of each social network node based on the adjusted Z_s, Z_t, Z_f and Z_c;
Step S107: calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation and improve the effect of the social recommendation task.
Further, the step S101 includes:
For the construction of the social network feature graph, the cosine similarity method is used in consideration of computational complexity. The cosine similarity method measures the difference between two nodes by calculating the cosine of the angle between their two space vectors: the smaller the angle, the more similar the two nodes, and vice versa. Compared with distance metrics, cosine similarity focuses more on the difference between two vectors in direction.
First, the similarity matrix S of the social network data set is calculated using the cosine similarity method. Then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbour (kNN) graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / ( ||X_i|| ||X_j|| )   (1)

wherein X_i and X_j are the feature vectors of social network nodes i and j, respectively.
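As an illustration of this construction, the following is a minimal NumPy sketch (not part of the patent text; function and variable names are illustrative) that computes the cosine similarity matrix of equation (1) and builds a symmetric k-nearest-neighbour adjacency matrix A_f from it:

```python
import numpy as np

def knn_feature_graph(X, k):
    """Build a k-nearest-neighbour feature-graph adjacency A_f from features X.

    S[i, j] is the cosine similarity between X[i] and X[j]; each node keeps
    edges to its k most similar other nodes, and the result is symmetrised.
    """
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    S = (X @ X.T) / (norms * norms.T)      # cosine similarity matrix, eq. (1)
    np.fill_diagonal(S, -np.inf)           # exclude self-similarity
    A = np.zeros_like(S)
    for i in range(S.shape[0]):
        top_k = np.argsort(S[i])[-k:]      # indices of the k most similar nodes
        A[i, top_k] = 1.0
    return np.maximum(A, A.T)              # symmetrise: edge if either side selected
```

With k tuned per data set, the returned matrix plays the role of A_f in G_f = (A_f, X).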
Further, the step S102 includes:
The essence of convolution is a process of Laplacian smoothing, which makes neighbouring nodes more and more similar and thereby degrades the classification effect. In this step, the scattering characteristics of the original social network graph are aggregated, and the wavelet matrices are used to extract the multi-scale differences of the first-order and second-order neighbourhoods, so as to make up for the defect that traditional GCNs (graph convolutional neural networks) act only as low-pass filters; the scattering operation is complementary to the traditional GCN operation and relieves the over-smoothing phenomenon.
The construction of geometric scattering on the graph is based on the inert random walk matrix, which can be expressed as:

P = (1/2)( I_n + Â D^(-1) )   (2)

wherein I_n is the identity matrix, Â is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Â.
Since high-frequency signals can be recovered by multi-scale wavelet transforms, in geometric scattering the invention introduces the wavelet transform at dyadic scales 2^k:

Ψ_k = P^(2^(k-1)) − P^(2^k)   (3)

wherein U_0 = X represents the high-frequency signal of the node itself; the first-order and second-order high-frequency signals are used, namely:

U_1 = Ψ_1 X,  U_2 = Ψ_2 X   (4)
According to the propagation rule of GCNs, the scattering propagation rule is defined as:

Z_s^(l+1) = σ( Ψ_k Z_s^(l) W^(l) )   (5)

wherein W^(l) is the weight matrix of the l-th layer of the original social network graph G_t, and σ is the activation function, here the ReLU function.
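For illustration, the inert random walk matrix and the dyadic wavelet filters described above can be sketched in NumPy as follows; the dyadic form Ψ_k = P^(2^(k-1)) − P^(2^k) is an assumption based on standard geometric scattering, and all names are illustrative:

```python
import numpy as np

def lazy_walk_matrix(A):
    """Inert (lazy) random-walk matrix P = 1/2 (I_n + Â D^{-1}), where
    Â = A + I_n adds self-loops and D is the diagonal degree matrix of Â."""
    n = A.shape[0]
    A_hat = A + np.eye(n)
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return 0.5 * (np.eye(n) + A_hat @ D_inv)

def wavelet_filters(P):
    """Dyadic wavelet filters Psi_1 = P - P^2 and Psi_2 = P^2 - P^4.
    Applied to the feature matrix X they give the first- and second-order
    high-frequency signals U_1 and U_2 of equation (4)."""
    P2 = P @ P
    return P - P2, P2 - P2 @ P2
```

Each Ψ_k is a band-pass operator: its columns sum to zero, so constant (fully smoothed) signals are suppressed and only the higher-frequency components survive.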
Further, the step S103 includes:
and respectively carrying out graph convolution operation on the original social network graph and the characteristic graph, and learning the topological information representation and the characteristic information representation of the respective nodes.
Original social network graph Gt=(AtX) convolution output of the l-th layer in topological spaceThe calculation method of (2) is shown in equation (6):
wherein the content of the first and second substances,represents GtThe weight matrix of layer i, σ is the activation function, here the ReLU function is used,is GtAn adjacency matrix of self-loops is added, is composed ofA diagonal matrix of
Further, the step S104 includes:
The convolution output H_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

H_f^(l) = σ( D̃_f^(-1/2) Ã_f D̃_f^(-1/2) H_f^(l-1) W_f^(l) )   (7)

wherein W_f^(l) denotes the weight matrix of the l-th layer of G_f, σ is the activation function, here the ReLU function, Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the input of the first layer is H_f^(0) = X.
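The convolution of equations (6) and (7) is the standard symmetrically normalised graph convolution layer; the following minimal NumPy sketch (illustrative names, ReLU activation as stated above) applies to both the topology graph A_t and the feature graph A_f:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D~^{-1/2} A~ D~^{-1/2} H W),
    with A~ = A + I_n (self-loops) and D~ its diagonal degree matrix."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)
    d = A_tilde.sum(axis=1)                      # degrees of A~
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)       # ReLU activation
```

Stacking such layers with H^(0) = X yields Z_t (when called with A_t) or Z_f (when called with A_f).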
Further, the step S105 includes:
given that downstream social recommendation tasks may be related to common information of the original social networking graph and the feature graph, the present invention devises this step to learn common characteristics of the original social networking graph and the feature graph. The main idea is to enter data of two graphs into the same channel, and perform convolution operation by using the same parameters, so as to realize information sharing between the two graphs. Firstly, the original social network graph and the feature graph are respectively subjected to convolution operation once to obtain two convolution output representations, and then the two convolution output representations are added together to serve as the convolution input of the next convolution of the two graphs, so that the operation is repeated in a cycle. Compared with a method of performing convolution and adding on two graphs respectively, the method of the invention can highlight the common characteristic of the nodes by adding the representations after each layer of convolution.
The convolution output H_ct^(1) of the first layer of the original social network graph G_t = (A_t, X) in the topological space is shown in equation (8):

H_ct^(1) = σ( D̃_t^(-1/2) Ã_t D̃_t^(-1/2) X W_c^(1) )   (8)

wherein σ is the activation function, here the ReLU function, and W_c^(1) is the weight matrix of the first shared layer.
The convolution output H_cf^(1) of the first layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (9):

H_cf^(1) = σ( D̃_f^(-1/2) Ã_f D̃_f^(-1/2) X W_c^(1) )   (9)

wherein σ is the activation function, here the ReLU function, and W_c^(1) is the same weight matrix as the one used in the topological space.
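The shared-parameter scheme described above can be sketched as follows; this is an illustrative NumPy rendering of the described idea (names are hypothetical), not the patented implementation itself:

```python
import numpy as np

def norm_adj(A):
    """Symmetrically normalised adjacency with self-loops: D~^{-1/2} A~ D~^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(A_tilde.sum(axis=1) ** -0.5)
    return d_inv_sqrt @ A_tilde @ d_inv_sqrt

def combined_embedding(A_t, A_f, X, weights):
    """Common-information channel: at every layer the SAME weight matrix W is
    applied to both graphs, the two ReLU convolution outputs are summed, and
    the sum is the shared input of the next layer; the final sum is Z_c."""
    S_t, S_f = norm_adj(A_t), norm_adj(A_f)
    H = X
    for W in weights:                        # one shared W per layer
        H_t = np.maximum(S_t @ H @ W, 0.0)   # convolution on the topology graph
        H_f = np.maximum(S_f @ H @ W, 0.0)   # convolution on the feature graph
        H = H_t + H_f                        # per-layer sum feeds both graphs
    return H
```

Because the sum is taken after every layer rather than once at the end, the shared parameters are pushed toward what the two graphs have in common.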
further, the step S106 includes:
In order to improve the effect of the node classification task and the accuracy of social network recommendation, the four social network node embeddings obtained above, namely Z_s, Z_t, Z_c and Z_f, are fused. The invention uses an attention mechanism to calculate the weight of each social network node embedding and obtain the information with higher relevance. μ_s, μ_t, μ_c and μ_f respectively denote the attention values of Z_s, Z_t, Z_c and Z_f.
For any social network node k, its representation in Zs is z_s^k. The invention uses a weight vector q to calculate the attention value ω_s^k of the node, as shown in equation (12):

ω_s^k = q · tanh(Ws · (z_s^k)^T + b) #(12)

where Ws is a social network weight matrix and b is a bias. Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Zt, Zc and Zf may be obtained.
then, the normalized attention value μ_s^k of social network node k is obtained by using a Softmax function, as shown in formula (13):

μ_s^k = exp(ω_s^k) / (exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k)) #(13)
according to the attention value, calculating to obtain the final embedding Z of each social network node:
Z=μs·Zs+μt·Zt+μc·Zc+μf·Zf#(14)
specifically, the category to which a node belongs can be predicted from the obtained final embedding Z, and users can then be recommended others with similar social interests according to the obtained category, making the recommendation more accurate.
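A minimal NumPy sketch of the per-node attention fusion of equations (12)-(14); `W_att` and `q` stand in for the learned weight matrix and weight vector (both hypothetical names), and the bias is omitted for brevity:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(Z_s, Z_t, Z_c, Z_f, W_att, q):
    """Per-node attention over the four embeddings: score each channel
    with q . tanh(W_att Z^T), softmax across the four channels for each
    node, then take the weighted sum (sketch of Eq. 12-14)."""
    Zs = [Z_s, Z_t, Z_c, Z_f]
    # omega[i, k]: unnormalized attention of channel i for node k
    omega = np.stack([q @ np.tanh(W_att @ Z.T) for Z in Zs])   # (4, n)
    mu = softmax(omega, axis=0)            # per-node channel weights
    Z = sum(mu[i][:, None] * Zs[i] for i in range(4))
    return Z, mu
```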
Further, the step S107 includes:
cross entropy measures the degree of difference between two probability distributions over the same random variable; in machine learning it expresses the difference between the true probability distribution and the predicted probability distribution. The smaller the cross entropy, the better the model's prediction. The invention adopts a cross-entropy loss function as the final objective function. Defining the probability that social network node k belongs to class c as Ŷ_kc, the predicted classes Ŷ of the n nodes are obtained from the final embedding Z as Ŷ = softmax(W·Z + b),
Wherein W is a weight vector of class c; b represents the bias, which is a constant.
Assuming that the training set is L, for each l ∈ L the actual label is Y_l and the predicted label is Ŷ_l. The loss in node classification is then expressed as:

Loss = −Σ_{l∈L} Σ_{c=1}^{C} Y_lc · ln Ŷ_lc
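The cross-entropy objective above can be sketched as follows; the classifier parameters `W` and `b` are hypothetical, and a small constant guards the logarithm:

```python
import numpy as np

def node_classification_loss(Z, W, b, labels, train_idx):
    """Cross-entropy over the training set L: predict class probabilities
    with softmax(Z W + b) and average the negative log-likelihood of the
    true labels of the training nodes."""
    logits = Z @ W + b                                  # (n, C)
    logits -= logits.max(axis=1, keepdims=True)         # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    nll = -np.log(probs[train_idx, labels[train_idx]] + 1e-12)
    return nll.mean()
```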
on the basis of the above embodiment, as shown in fig. 2, the present invention further provides a graph convolution neural network social recommendation (abbreviated as SM-GCN) system fusing a multi-channel attention mechanism, including:
a feature graph construction module for calculating node feature similarity of a social network graph Gt=(At, X) by using cosine similarity, and then constructing the k-nearest-neighbor graph of Gt based on the features, i.e. the social network feature graph Gf=(Af, X); wherein At represents the symmetric adjacency matrix of Gt, and Af represents the symmetric adjacency matrix of Gf;
a scattering module for learning a multi-signal scattering embedding Zs of the social network data graph Gt based on a graph convolution neural network;
a topology module for propagating the node features of Gt in topological space, performing convolution operations, and learning the node embedding Zt of Gt;
a feature module for propagating the node features of Gf in feature space, performing convolution operations, and learning the node embedding Zf of Gf;
a combination module for learning the combined embedding Zc of Gt and Gf based on a graph convolution neural network;
an attention module for dynamically adjusting the weights of Zs, Zt, Zf and Zc with an attention mechanism, and calculating the final embedding Z of each social network node based on the adjusted Zs, Zt, Zf and Zc;
and a social recommendation module for calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
Further, the feature map construction module is specifically configured to:
first, the similarity matrix S of the social network data set is calculated by the cosine similarity method; then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbor graph of the original graph, i.e. the social network feature graph, denoted Gf=(Af, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / (|X_i| |X_j|)

where X_i and X_j are the feature vectors of social network nodes i and j, respectively.
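A sketch of this feature-graph construction in NumPy (the symmetrization step and the tie-breaking of `argsort` are implementation assumptions):

```python
import numpy as np

def build_feature_graph(X, k):
    """Build the k-nearest feature graph: cosine similarity
    S_ij = (x_i . x_j) / (|x_i| |x_j|), then connect each node to its
    top-k most similar other nodes, symmetrizing the result."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    S = (X @ X.T) / (norms @ norms.T + 1e-12)
    np.fill_diagonal(S, -np.inf)            # exclude self-similarity
    A_f = np.zeros_like(S)
    for i in range(X.shape[0]):
        nbrs = np.argsort(S[i])[-k:]        # indices of top-k similar nodes
        A_f[i, nbrs] = 1.0
    return np.maximum(A_f, A_f.T)           # symmetric adjacency matrix
```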
Further, the scattering module is specifically configured to:
geometric scattering on Gt is constructed based on the lazy random walk matrix P:

P = (1/2)(I_n + Ã D^(-1))

where I_n is the identity matrix, Ã is the adjacency matrix of the original social network graph Gt with self-loops added, and D is the diagonal degree matrix of Ã; U_0 is a high-frequency signal representing the node itself.

First- and second-order high-frequency signals are used, namely the band-pass wavelets:

Ψ_1 = P − P², Ψ_2 = P² − P⁴

According to the propagation rule of the graph convolution neural network, the corresponding scattering propagation rule is defined to obtain the scattering embedding Zs.
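The scattering channel can be sketched as follows; the wavelet orders Ψ_1 = P − P² and Ψ_2 = P² − P⁴ are assumptions in the style of scattering networks on graphs, since the patent's own equations are not reproduced here:

```python
import numpy as np

def scattering_embedding(A, X):
    """Geometric scattering sketch: lazy random walk
    P = (I + A_hat D^{-1}) / 2, band-pass wavelets Psi_j that extract
    higher-frequency signals which plain GCN low-pass filtering discards,
    followed by an absolute-value nonlinearity."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                        # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    P = 0.5 * (np.eye(n) + A_hat @ D_inv)        # lazy random walk matrix
    psi1 = P - P @ P                             # first-order wavelet
    psi2 = P @ P - np.linalg.matrix_power(P, 4)  # second-order wavelet
    # concatenate first- and second-order scattering features per node
    return np.concatenate([np.abs(psi1 @ X), np.abs(psi2 @ X)], axis=1)
```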
Further, the topology module is specifically configured to:
the convolution output Z_t^(l) of the l-th layer of the original social network graph Gt=(At, X) in topological space is calculated as shown in equation (6):

Z_t^(l) = σ(D̃t^(-1/2) Ãt D̃t^(-1/2) Z_t^(l-1) Wt^(l)) #(6)

where Wt^(l) represents the weight matrix of the l-th layer of Gt, σ is the activation function, Ãt is the adjacency matrix of Gt with self-loops added, and D̃t is the diagonal degree matrix of Ãt.
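One propagation step of equation (6) in NumPy (a sketch with a fixed, rather than learned, weight matrix; ReLU is assumed as σ, as elsewhere in the description):

```python
import numpy as np

def gcn_layer(A, Z_prev, W):
    """One GCN propagation step:
    Z^{(l)} = ReLU(D^{-1/2} (A + I) D^{-1/2} Z^{(l-1)} W^{(l)})."""
    A_hat = A + np.eye(A.shape[0])               # adjacency with self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ Z_prev @ W, 0.0)  # ReLU activation
```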
Further, the feature module is specifically configured to:
the convolution output Z_f^(l) of the l-th layer of the social network feature graph Gf=(Af, X) in feature space is shown in equation (7):

Z_f^(l) = σ(D̃f^(-1/2) Ãf D̃f^(-1/2) Z_f^(l-1) Wf^(l)) #(7)

where Wf^(l) represents the weight matrix of the l-th layer of Gf, σ is the activation function, Ãf is the adjacency matrix of Gf with self-loops added, and D̃f is the diagonal degree matrix of Ãf.
Further, the combination module is specifically configured to:
the original social network graph Gt and the social network feature graph Gf are each subjected to one convolution operation to obtain two convolution output representations; the two outputs are then added together to serve as the convolution input of the next convolution layer for both graphs. This is repeated until all convolution operations are finished, and the final convolution output is taken as the combined embedding Zc.
Further, the attention module is specifically configured to:
for any social network node k, its representation in Zs is z_s^k; the attention value ω_s^k of the node is calculated using a weight vector q, as shown in equation (12);

then, the normalized attention value μ_s^k of social network node k is obtained using the Softmax function, as shown in formula (13);

ω_t^k, ω_c^k and ω_f^k are calculated in the same way; over all n social network nodes in the graph, μs, μt, μc and μf respectively denote the attention values of Zs, Zt, Zc and Zf;
calculating to obtain the final embedding Z of each social network node according to the attention value of each node of the social network:
Z=μs·Zs+μt·Zt+μc·Zc+μf·Zf#(14)。
further, the social recommendation module is specifically configured to:
defining the probability that social network node k belongs to class c as Ŷ_kc, the predicted classes Ŷ of the n nodes are derived from the final embedding Z as Ŷ = softmax(W·Z + b),
Wherein W is a weight vector of class c; b represents the bias, which is a constant.
To verify the effect of the present invention, the following experiment was performed:
Specifically, the invention is evaluated on three social network datasets (UAI2010, BlogCatalog and Flickr); the dataset information is shown in Table 1.
Table 1. Dataset statistics
Dataset | Nodes | Edges | Classes | Features |
BlogCatalog | 5196 | 171743 | 6 | 8189 |
Flickr | 7575 | 239738 | 9 | 12047 |
UAI2010 | 3067 | 28311 | 19 | 4973 |
In order to prove the effectiveness of the method, the method of the invention is compared with the following six popular node classification methods:
GCN [Thomas N. Kipf and Max Welling. 2016. Semi-Supervised Classification with Graph Convolutional Networks. arXiv preprint arXiv:1609.02907 (2016)]: one of the most popular semi-supervised graph convolution network models at present; it achieves better representations by aggregating information from neighbor nodes.
KNN-GCN [Luca Franceschi, Mathias Niepert, Massimiliano Pontil, and Xiao He. 2019. Learning Discrete Structures for Graph Neural Networks. ICML (2019)]: a variant of GCN that uses a k-nearest-neighbor graph as the input graph.
GAT: a graph neural network model widely used as a GNN baseline; it aggregates node features according to learned attention scores between nodes.
DEMO-Net [Pei H, Wei B, Chang C, et al. Geom-GCN: Geometric Graph Convolutional Networks [J]. 2020]: a degree-specific graph neural network for node classification.
MixHop [Sami Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Nazanin Alipourfard, Kristina Lerman, Hrayr Harutyunyan, Greg Ver Steeg, and Aram Galstyan. 2019. MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing. In ICML. 21-29]: a method that builds graph convolution layers from the features of neighbors at various mixed distances.
AM-GCN [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [C]// KDD '20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020]: a network model combining feature graphs, original graphs, and their common representations.
The invention selects 20 labeled nodes per class as the training set and 1000 nodes as the test set. Two GCN layers are trained, with hidden-layer sizes nhid1 ∈ {512, 768} and nhid2 ∈ {32, 128, 256}. The learning rate is set to 0.0002-0.0005, the dropout rate to 0.5, the weight decay ∈ {5e-3, 5e-4}, and the order k of the k-nearest-neighbor graph ∈ {2, ..., 9}. For all baselines, the parameters reported in their papers are used. All experiments are run 5 times and averages are reported. Model performance is evaluated with Accuracy (ACC) and the macro F1 score (F1). Accuracy is the ratio of correctly classified samples to the total number of samples in the test set, i.e. the probability of a correct prediction. Since accuracy alone is a poor indicator when positive and negative samples are imbalanced, the macro F1 score is also used; it is the harmonic mean of precision and recall, averaged over classes.
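For concreteness, the two evaluation metrics can be computed as follows (a plain NumPy sketch equivalent to the usual library implementations):

```python
import numpy as np

def accuracy(y_true, y_pred):
    """ACC: fraction of correctly classified test nodes."""
    return float(np.mean(y_true == y_pred))

def macro_f1(y_true, y_pred):
    """Macro F1: unweighted mean over classes of the harmonic mean of
    per-class precision and recall (robust to class imbalance)."""
    f1s = []
    for c in np.unique(y_true):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return float(np.mean(f1s))
```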
The experimental environment is as follows:
the operating system: ubuntu Linux release 16.04.7 LTS;
·CPU:Intel(R)Xeon(R)Silver CPU@2.20GHz;
·GPU:Quadro P4000;
software version: python 3.7; pytrch 1.1.0; numpy 1.16.2; SciPy 1.3.1; networkx 2.4; scikit-leann 0.21.3.
The results of the social recommendation task in the above environment are shown in Table 2. The proposed SM-GCN achieves the best performance on both evaluation criteria across all datasets; in particular, ACC and F1 improve by 9.82% and 10.07% respectively on the BlogCatalog dataset. From this analysis, the improvement is attributed to the following factors.
1. By comparing the performance of GCN and KNN-GCN on the datasets, it can be seen that the two have different effects on different datasets, which indicates that both the topology graph and the feature graph are important.
2. By comparing the performance of KNN-GCN and MixHop on the data sets, it can be seen that KNN-GCN performs well in the paper citation network, while MixHop performs well in the social network, which suggests that learning the characteristics of neighbor nodes is very necessary.
3. SM-GCN better learns the common node information of the topology graph and the feature graph; the scattering module alleviates the occurrence of over-smoothing, and the experimental results improve by at least 1% over AM-GCN.
Table 2. Social recommendation results in percent (best results bolded)
To show the effectiveness of each proposed module, ablation experiments are carried out on the proposed model SM-GCN. TF-GCN is the model containing only the topology module and the feature module; S-GCN contains the topology, feature and scattering modules; C-GCN contains the topology, feature and combination modules. The experimental results are shown in Table 3.
Table 3. Ablation experiment results in percent (best results bolded)
On the BlogCatalog dataset, the model of the invention improves greatly. From the experimental results of TF-GCN and C-GCN it can be inferred that this improvement results from the combination module learning the common characteristics of the original graph and the feature graph. S-GCN, which adds the scattering module, improves over TF-GCN, which uses only the original graph and the feature graph, because the scattering module adds signals of other frequencies to the original graph.
Compared with TF-GCN, the full model's classification effect is improved, proving the necessity of learning with the scattering module and the combination module simultaneously.
To understand which embedding the classification task favors, a distribution analysis of the attention mechanism of the model is carried out, as shown in Fig. 3.
The model has four node embeddings before the attention mechanism is applied: original-graph node embedding, feature-graph node embedding, combined node embedding, and scattering node embedding of the original graph. From Fig. 3 it can be seen that the combined node embedding plays a crucial role on all three datasets (BlogCatalog, Flickr, UAI2010). Scattering node embedding is more important than topology node embedding and feature node embedding; on the UAI2010 dataset, feature node embedding is second only to combined node embedding, showing that this dataset leans toward feature-information representation. The attention value of the scattering node embedding being higher than that of the topology node embedding indicates that high-frequency signals contribute more to the classification task than the low-frequency signals produced by GCN, so adding the scattering module is very necessary. In conclusion, the model can distribute weights more adaptively, so the social recommendation task achieves a better effect.
In summary, the proposed SM-GCN simultaneously considers fusing the common information of node features and network topology and the frequently encountered over-smoothing problem; the combined embedding and the addition of geometric scattering effectively address both. The combined embedding process learns the most relevant information of node features and network topology more effectively, and the scattering process alleviates over-smoothing by learning the different frequency signals of first- and second-order neighbors. Experiments show that the method and system achieve good results on social recommendation tasks.
The above shows only the preferred embodiments of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.
Claims (10)
1. A graph convolution neural network social recommendation method fusing a multi-channel attention mechanism is characterized by comprising the following steps:
step 1: computing social network graph G using cosine similarityt=(AtX) and then constructing G based on the featurestK-nearest graph of (i.e. social network profile G)f=(AfX); wherein A istRepresents GtA symmetric adjacency matrix offRepresents GfA symmetric adjacency matrix of (a);
and 2, step: learning social network data graph G based on graph convolution neural networktEmbedding of multiple signals ZS;
And 3, step 3: g is to betThe characteristics of the nodes are spread on the topological space, convolution operation is carried out, and G is learnedtNode of (2) embedding Zt;
And 4, step 4: g is to befThe characteristics of the nodes are spread on the characteristic space, convolution operation is carried out, and G is learnedfNode of (2) embedding Zf;
And 5: neural network learning G based on graph convolutiontAnd GfCombined insertion of Zc;
Step 6: dynamic adjustment of attention-drawing mechanism ZS、Zt、Zf、ZcBased on the adjusted ZS、Zt、Zf、ZcCalculating to obtain a final embedding Z of each social network node;
and 7: and calculating the category Y of the social network node based on the final embedding Z so as to perform social recommendation.
2. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 1 comprises:
first, the similarity matrix S of the social network data set is calculated by the cosine similarity method; then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbor graph of the original graph, i.e. the social network feature graph, denoted Gf=(Af, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / (|X_i| |X_j|)

where X_i and X_j are the feature vectors of social network nodes i and j, respectively.
3. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 2 comprises:
geometric scattering on Gt is constructed based on the lazy random walk matrix P:

P = (1/2)(I_n + Ã D^(-1))

where I_n is the identity matrix, Ã is the adjacency matrix of the original social network graph Gt with self-loops added, and D is the diagonal degree matrix of Ã; U_0 is a high-frequency signal representing the node itself;

first- and second-order high-frequency signals are used, namely the band-pass wavelets:

Ψ_1 = P − P², Ψ_2 = P² − P⁴

according to the propagation rule of the graph convolution neural network, the corresponding scattering propagation rule is defined to obtain the scattering embedding Zs.
4. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 3 comprises:
the convolution output Z_t^(l) of the l-th layer of the original social network graph Gt=(At, X) in topological space is calculated as shown in equation (6):

Z_t^(l) = σ(D̃t^(-1/2) Ãt D̃t^(-1/2) Z_t^(l-1) Wt^(l)) #(6)

where Wt^(l) represents the weight matrix of the l-th layer of Gt, σ is the activation function, Ãt is the adjacency matrix of Gt with self-loops added, and D̃t is the diagonal degree matrix of Ãt.
5. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 4 comprises:
the convolution output Z_f^(l) of the l-th layer of the social network feature graph Gf=(Af, X) in feature space is shown in equation (7):

Z_f^(l) = σ(D̃f^(-1/2) Ãf D̃f^(-1/2) Z_f^(l-1) Wf^(l)) #(7)

where Wf^(l) represents the weight matrix of the l-th layer of Gf, σ is the activation function, Ãf is the adjacency matrix of Gf with self-loops added, and D̃f is the diagonal degree matrix of Ãf.
6. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 5 comprises:
the original social network graph Gt and the social network feature graph Gf are each subjected to one convolution operation to obtain two convolution output representations; the two outputs are then added together to serve as the convolution input of the next convolution layer for both graphs; this is repeated until all convolution operations are finished, and the final convolution output is taken as the combined embedding Zc.
7. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 6 comprises:
for any social network node k, its representation in Zs is z_s^k; the attention value ω_s^k of the node is calculated using a weight vector q, as shown in equation (12):

ω_s^k = q · tanh(Ws · (z_s^k)^T + b) #(12)

then, the normalized attention value μ_s^k of social network node k is obtained by using a Softmax function, as shown in formula (13):

μ_s^k = exp(ω_s^k) / (exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k)) #(13)

ω_t^k, ω_c^k and ω_f^k are calculated in the same way; over all n social network nodes in the graph, μs, μt, μc and μf respectively denote the attention values of Zs, Zt, Zc and Zf;
calculating to obtain the final embedding Z of each social network node according to the attention value of each node of the social network:
Z=μs·Zs+μt·Zt+μc·Zc+μf·Zf#(14)。
8. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 7 comprises:
defining the probability that social network node k belongs to class c as Ŷ_kc, the predicted classes Ŷ of the n nodes are derived from the final embedding Z as Ŷ = softmax(W·Z + b),
Wherein W is a weight vector of class c; b represents the bias, which is a constant.
9. A graph convolution neural network social recommendation system fusing a multi-channel attention mechanism is characterized by comprising:
a feature graph construction module for calculating node feature similarity of a social network graph Gt=(At, X) by using cosine similarity, and then constructing the k-nearest-neighbor graph of Gt based on the features, i.e. the social network feature graph Gf=(Af, X); wherein At represents the symmetric adjacency matrix of Gt, and Af represents the symmetric adjacency matrix of Gf;
a scattering module for learning a multi-signal scattering embedding Zs of the social network data graph Gt based on a graph convolution neural network;
a topology module for propagating the node features of Gt in topological space, performing convolution operations, and learning the node embedding Zt of Gt;
a feature module for propagating the node features of Gf in feature space, performing convolution operations, and learning the node embedding Zf of Gf;
a combination module for learning the combined embedding Zc of Gt and Gf based on a graph convolution neural network;
an attention module for dynamically adjusting the weights of Zs, Zt, Zf and Zc with an attention mechanism, and calculating the final embedding Z of each social network node based on the adjusted Zs, Zt, Zf and Zc;
and a social recommendation module for calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
10. The system of claim 9, wherein the feature map construction module is specifically configured to:
first, the similarity matrix S of the social network data set is calculated by the cosine similarity method; then, for each node in the social network, edges are set to its top-k most similar node pairs, constructing the k-nearest-neighbor graph of the original graph, i.e. the social network feature graph, denoted Gf=(Af, X). The similarity matrix S is calculated as follows:

S_ij = (X_i · X_j) / (|X_i| |X_j|)

where X_i and X_j are the feature vectors of social network nodes i and j, respectively;
the scattering module is specifically configured to:
geometric scattering on Gt is constructed based on the lazy random walk matrix P:

P = (1/2)(I_n + Ã D^(-1))

where I_n is the identity matrix, Ã is the adjacency matrix of the original social network graph Gt with self-loops added, and D is the diagonal degree matrix of Ã; U_0 is a high-frequency signal representing the node itself;

first- and second-order high-frequency signals are used, namely the band-pass wavelets:

Ψ_1 = P − P², Ψ_2 = P² − P⁴

according to the propagation rule of the graph convolution neural network, the corresponding scattering propagation rule is defined to obtain the scattering embedding Zs;
the topology module is specifically configured to:
the convolution output Z_t^(l) of the l-th layer of the original social network graph Gt=(At, X) in topological space is calculated as shown in equation (6):

Z_t^(l) = σ(D̃t^(-1/2) Ãt D̃t^(-1/2) Z_t^(l-1) Wt^(l)) #(6)

where Wt^(l) represents the weight matrix of the l-th layer of Gt, σ is the activation function, Ãt is the adjacency matrix of Gt with self-loops added, and D̃t is the diagonal degree matrix of Ãt.
The feature module is specifically configured to:
the convolution output Z_f^(l) of the l-th layer of the social network feature graph Gf=(Af, X) in feature space is shown in equation (7):

Z_f^(l) = σ(D̃f^(-1/2) Ãf D̃f^(-1/2) Z_f^(l-1) Wf^(l)) #(7)

where Wf^(l) represents the weight matrix of the l-th layer of Gf, σ is the activation function, Ãf is the adjacency matrix of Gf with self-loops added, and D̃f is the diagonal degree matrix of Ãf.
The combination module is specifically configured to:
the original social network graph Gt and the social network feature graph Gf are each subjected to one convolution operation to obtain two convolution output representations; the two outputs are then added together to serve as the convolution input of the next convolution layer for both graphs; this is repeated until all convolution operations are finished, and the final convolution output is taken as the combined embedding Zc;
The attention module is specifically configured to:
for any social network node k, its representation in Zs is z_s^k; the attention value ω_s^k of the node is calculated using a weight vector q, as shown in equation (12):

ω_s^k = q · tanh(Ws · (z_s^k)^T + b) #(12)

then, the normalized attention value μ_s^k of social network node k is obtained by using a Softmax function, as shown in formula (13):

μ_s^k = exp(ω_s^k) / (exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k)) #(13)

ω_t^k, ω_c^k and ω_f^k are calculated in the same way; over all n social network nodes in the graph, μs, μt, μc and μf respectively denote the attention values of Zs, Zt, Zc and Zf;
calculating to obtain the final embedding Z of each social network node according to the attention value of each node of the social network:
Z=μs·Zs+μt·Zt+μc·Zc+μf·Zf#(14);
the social recommendation module is specifically configured to:
defining the probability that social network node k belongs to class c as Ŷ_kc, the predicted classes Ŷ of the n nodes are derived from the final embedding Z as Ŷ = softmax(W·Z + b),
Wherein W is a weight vector of class c; b represents the bias, which is a constant.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210445519.5A CN114677234B (en) | 2022-04-26 | 2022-04-26 | Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114677234A true CN114677234A (en) | 2022-06-28 |
CN114677234B CN114677234B (en) | 2024-04-30 |
Family
ID=82080899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210445519.5A Active CN114677234B (en) | 2022-04-26 | 2022-04-26 | Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114677234B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104601438A (en) * | 2014-04-28 | 2015-05-06 | 腾讯科技(深圳)有限公司 | Friend recommendation method and device |
CN106980659A (en) * | 2017-03-20 | 2017-07-25 | 华中科技大学鄂州工业技术研究院 | A kind of doings based on isomery graph model recommend method |
KR101872733B1 (en) * | 2017-06-14 | 2018-06-29 | 조선대학교산학협력단 | System for recommending social networking service following and method for recommending social networking service following using it |
CN108320187A (en) * | 2018-02-02 | 2018-07-24 | 合肥工业大学 | A kind of recommendation method based on depth social networks |
CN109410080A (en) * | 2018-10-16 | 2019-03-01 | 合肥工业大学 | A kind of social image recommended method based on level attention mechanism |
CN110009093A (en) * | 2018-12-07 | 2019-07-12 | 阿里巴巴集团控股有限公司 | For analyzing the nerve network system and method for relational network figure |
CN111259142A (en) * | 2020-01-14 | 2020-06-09 | 华南师范大学 | Specific target emotion classification method based on attention coding and graph convolution network |
CN111428147A (en) * | 2020-03-25 | 2020-07-17 | 合肥工业大学 | Social recommendation method of heterogeneous graph volume network combining social and interest information |
CN111523051A (en) * | 2020-04-24 | 2020-08-11 | 山东师范大学 | Social interest recommendation method and system based on graph volume matrix decomposition |
CN113158071A (en) * | 2021-03-19 | 2021-07-23 | 广东工业大学 | Knowledge social contact recommendation method, system and equipment based on graph neural network |
US20210248461A1 (en) * | 2020-02-11 | 2021-08-12 | Nec Laboratories America, Inc. | Graph enhanced attention network for explainable poi recommendation |
CN114036405A (en) * | 2021-11-02 | 2022-02-11 | 扬州大学 | Social contact recommendation method and system based on graph convolution network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |