CN114677234A - Graph convolutional neural network social recommendation method and system integrating a multi-channel attention mechanism - Google Patents


Info

Publication number
CN114677234A
CN114677234A (application CN202210445519.5A); granted as CN114677234B
Authority
CN
China
Prior art keywords
social network
graph
node
convolution
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210445519.5A
Other languages
Chinese (zh)
Other versions
CN114677234B (en)
Inventor
翟锐
张莉博
李绍华
于俊洋
宋亚林
王瑛琦
白晨希
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan University
Original Assignee
Henan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan University
Priority to CN202210445519.5A
Publication of CN114677234A
Application granted
Publication of CN114677234B
Legal status: Active
Anticipated expiration: (not listed)

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q50/00 — Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 — Social networking
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 — Computing arrangements based on biological models
    • G06N3/02 — Neural networks
    • G06N3/04 — Architecture, e.g. interconnection topology
    • G06N3/045 — Combinations of networks
    • G06N3/08 — Learning methods


Abstract

The invention provides a graph convolutional neural network social recommendation method and system integrating a multi-channel attention mechanism, which improve social recommendation through three measures. First, besides learning node embeddings from the node features and from the topological structure separately, the method learns a combined embedding of the two, capturing their common characteristics and avoiding over-reliance on a single feature source. Second, by learning a scattering embedding of the topological structure, it realizes band-pass filtering of different signals and reduces the over-smoothing phenomenon. Third, an attention mechanism fuses the related information. Experimental results show that, compared with other algorithms, the method and system achieve improved performance on multiple social network datasets and provide a new direction for subsequent research.

Description

Graph convolutional neural network social recommendation method and system integrating a multi-channel attention mechanism
Technical Field
The invention relates to the technical field of social recommendation, and in particular to a graph convolutional neural network social recommendation method and system fusing a multi-channel attention mechanism.
Background
With the development of networks, people's ways of socializing have diversified: from earlier short messages to today's online platforms, socializing has become increasingly convenient. Graphs are structurally flexible and have powerful representation capability, so more and more data is represented as graphs. In social recommendation, each node can be regarded as a user; an edge between two nodes indicates that similar information exists between them. However, processing and mining graph data has long been a challenge. As data scale increases, graph data becomes more complex and is harder to process than signal data such as images and language, so learning representations of graph data efficiently and accurately is a very important problem. The social recommendation task, one of the most widespread tasks on graphs, has long been a focus of research.
Graph Neural Networks (GNNs) draw on the ideas of convolutional neural networks, recurrent neural networks and deep autoencoders to define and design neural network structures for processing graph data. As research progressed, the GNN family grew: Graph Convolutional Networks (GCNs), Graph Attention Networks, Graph Autoencoders, Graph Generative Networks and others emerged for different task requirements. The appearance of graph convolutional networks in particular provided new ideas for the analysis of graph data. GCNs are currently a widely used neural network architecture for learning on graph data; the basic idea is to bring the idea of CNNs into GNNs. GCNs are end-to-end learning frameworks in which, at each convolutional layer, a node aggregates the information of itself and its neighbors, so the node representation is continually updated toward an optimal one, which is then used to recommend similar users to social users.
Recent studies have shown that learning from the graph structure alone or from the node features alone does not work well in the node classification task, and the basic GCN framework does not account well for the correlation between graph structure and node features. Addressing this problem, Wang et al. [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [C] // KDD '20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020] use a constructed k-order adjacency graph to mine the correlation between graph structure and node features, learn different representations of nodes, and introduce an attention mechanism to improve the classification effect. Jin et al. [Jin W, Derr T, Wang Y, et al. Node Similarity Preserving Graph Convolutional Networks [C] // WSDM '21: The Fourteenth ACM International Conference on Web Search and Data Mining. ACM, 2021] propose a semi-supervised method that adaptively fuses graph structure and node feature information and learns from the fused result. However, these methods still do not achieve the best learning effect on graph structure and node features.
On the other hand, as the depth of GCNs increases, node representations become more and more similar, so the individual characteristics of each node can no longer be fully expressed. This greatly increases the difficulty of node classification and degrades user recommendation. Addressing this problem, some scholars propose using geometric scattering information to alleviate over-smoothing: a geometric scattering network can capture high-order regular patterns on a graph and fully learn signal information at different frequencies. Zou et al. [Dongmian Zou and Gilad Lerman, "Graph convolutional neural networks via scattering," Applied and Computational Harmonic Analysis, vol. 49, no. 3, pp. 1046-1074, 2020] showed that the scattering network makes the generated features nearly invariant to permutations, enabling stable graph operations. Fernando Gama et al. [Fernando Gama, Alejandro Ribeiro, and Joan Bruna, "Diffusion scattering transforms on graphs," in International Conference on Learning Representations, 2019] showed that, using diffusion wavelets, the scattering transform can be generalized to non-Euclidean domains while capturing high-frequency signals. Zhu et al. [Zhu H, Koniusz P. Simple Spectral Graph Convolution [C] // International Conference on Learning Representations, 2021] use simple spectral graph convolution to balance the low-pass and high-pass filter bands capturing the global and local context of each node, thereby mitigating over-smoothing.
Disclosure of Invention
Aiming at tasks on complex relational graphs that challenge the current GCN framework — for example, in social recommendation, over-reliance on node features and the over-smoothing phenomenon that easily arises during node aggregation make node representations hard to distinguish and seriously harm the recommendation effect — the invention addresses these problems, and further adds an attention mechanism to adaptively distribute weights, which significantly improves the node classification task and makes social recommendation more accurate.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a graph convolution neural network social contact recommendation method fusing a multi-channel attention mechanism, which comprises the following steps:
step 1: computing social network graph G using cosine similarityt=(AtX) and then constructing G based on the featurestK-nearest graph of (i.e. social network profile G)f=(AfX); wherein A istRepresents GtA symmetric adjacency matrix offRepresents GfA symmetric adjacency matrix of (a);
step 2: learning social network data graph G based on graph convolution neural networktEmbedding of multiple signals ZS
And step 3: g is to betThe characteristics of the nodes are spread on the topological space, convolution operation is carried out, and G is learnedtNode of (2) embedding Zt
And 4, step 4: g is to befThe characteristics of the nodes are spread on the characteristic space, convolution operation is carried out, and G is learnedfNode of (2) embedding Zf
And 5: neural network learning G based on graph convolutiontAnd GfCombined insertion of Zc
Step 6: dynamic adjustment of attention-drawing mechanism ZS、Zt、Zf、ZcBased on the adjusted ZS、Zt、Zf、ZcCalculating to obtain a final embedding Z of each social network node;
and 7: and calculating the category Y of the social network node based on the final embedding Z so as to perform social recommendation.
Further, the step 1 comprises:
First, the similarity matrix S ∈ ℝ^{n×n} of the social network dataset is computed using the cosine similarity method. Then, for each node in the social network, edges are set to its top k most similar node pairs, constructing the k-nearest-neighbor graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is computed as follows:

    S_ij = (x_i · x_j) / (‖x_i‖ ‖x_j‖)    (1)

where x_i and x_j are the feature vectors of social network nodes i and j, respectively.
Further, the step 2 comprises:
construction G based on inertia random walk matrixtUpper geometric scattering:
Figure BDA0003616632700000041
wherein InIs a matrix of units, and is,
Figure BDA0003616632700000042
as an original social network diagram GtAn adjacency matrix with added self-loops, D is
Figure BDA0003616632700000043
A diagonal matrix of (a);
in geometric scattering, 2 is introducedkWavelet transform of scale
Figure BDA0003616632700000044
Figure BDA0003616632700000045
Wherein U is0A high frequency signal representing the node itself;
using first and second order high frequency signals, namely:
Figure BDA0003616632700000046
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
Figure BDA0003616632700000047
wherein
Figure BDA0003616632700000048
Is GtThe weight matrix of the l-th layer, σ, is the activation function.
Further, the step 3 comprises:
The convolution output Z_t^{(l)} of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is computed as shown in equation (6):

    Z_t^{(l)} = σ(D̃_t^{-1/2} Ã_t D̃_t^{-1/2} Z_t^{(l−1)} W_t^{(l)})    (6)

where W_t^{(l)} denotes the weight matrix of layer l of G_t, σ is the activation function, Ã_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and the initial embedding is Z_t^{(0)} = X.
Further, the step 4 comprises:
The convolution output Z_f^{(l)} of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

    Z_f^{(l)} = σ(D̃_f^{-1/2} Ã_f D̃_f^{-1/2} Z_f^{(l−1)} W_f^{(l)})    (7)

where W_f^{(l)} denotes the weight matrix of layer l of G_f, σ is the activation function, Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the initial embedding is Z_f^{(0)} = X.
Further, the step 5 comprises:
The original social network graph G_t and the social network feature graph G_f each undergo one convolution operation, yielding two convolution output representations; the two representations are then added together and used as the convolution input for the next layer of both graphs. This cycle repeats until all convolution operations are finished, and the final convolution output is taken as the combined embedding Z_c.
Further, the step 6 comprises:
For any social network node k, its embedding in Z_s is denoted z_s^k. A weight vector q is used to compute the attention value ω_s^k of the node, as shown in equation (12):

    ω_s^k = q^T · tanh(W_s · (z_s^k)^T + b_s)    (12)

where W_s is a social network weight matrix. The attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained in the same way. The attention value of node k is then normalized with the Softmax function, as shown in equation (13):

    μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / (exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k))    (13)

μ_t^k, μ_c^k and μ_f^k are calculated on the same principle. Over all n social network nodes in the graph, these per-node values are collected into μ_s, μ_t, μ_c and μ_f, which denote the attention values of Z_s, Z_t, Z_c and Z_f, respectively.

The final embedding Z of each social network node is then computed from the attention values of the nodes of the social network:

    Z = μ_s · Z_s + μ_t · Z_t + μ_c · Z_c + μ_f · Z_f    (14)
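A minimal numerical sketch of the attention fusion of equations (12)-(14); the array shapes, random toy data and helper name `attention_fuse` are illustrative assumptions, not from the patent:

```python
import numpy as np

def attention_fuse(embeddings, Ws, q, b):
    """Sketch of eqs. (12)-(14): per-node attention values
    w^k = q^T tanh(W z^k + b) are softmax-normalized across the four
    channels and used to weight the corresponding embeddings."""
    # omega[m, k]: unnormalized attention of node k in channel m
    omega = np.stack([np.tanh(Z @ W.T + b) @ q for Z, W in zip(embeddings, Ws)])
    mu = np.exp(omega) / np.exp(omega).sum(axis=0)   # softmax over channels, per node
    # final embedding: Z = sum_m mu_m * Z_m (mu broadcast per node)
    return sum(mu[m][:, None] * embeddings[m] for m in range(len(embeddings)))

# toy check: with four identical channels the fused embedding is unchanged,
# because each node's softmax weights sum to 1 across channels
rng = np.random.default_rng(2)
Z0 = rng.normal(size=(3, 2))                 # 3 nodes, embedding dim 2
Ws = [rng.normal(size=(5, 2)) for _ in range(4)]
q, b = rng.normal(size=5), rng.normal(size=5)
Z = attention_fuse([Z0, Z0, Z0, Z0], Ws, q, b)
```

The per-channel weight matrices W_s, W_t, W_c, W_f map embeddings into a shared attention space before the shared vector q scores them, mirroring the per-channel attention of equation (12).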
further, the step 7 includes:
A cross-entropy loss function is adopted as the final objective function, and the probability that social network node k belongs to class c is defined accordingly. The predicted classes Ŷ of the n nodes are derived from the final embedding Z:

    Ŷ = softmax(W · Z + b)    (15)

where W is the weight vector of class c, and b represents the bias, a constant.
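The classification step above can be sketched as follows; an illustrative NumPy version assuming row-wise softmax over class logits and cross-entropy over labeled nodes (the toy embeddings are hypothetical):

```python
import numpy as np

def predict(Z, W, b):
    """Row-wise class probabilities softmax(Z W^T + b); predicted class = argmax."""
    logits = Z @ W.T + b
    e = np.exp(logits - logits.max(axis=1, keepdims=True))  # numerically stable softmax
    probs = e / e.sum(axis=1, keepdims=True)
    return probs.argmax(axis=1), probs

def cross_entropy(probs, labels):
    """Cross-entropy loss over the labeled nodes (the training objective)."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

# toy final embeddings Z for two nodes and two classes (illustrative numbers)
Z = np.array([[5.0, 0.0], [0.0, 5.0]])
W, b = np.eye(2), np.zeros(2)
pred, probs = predict(Z, W, b)
```

During training, only the labeled nodes contribute to the cross-entropy loss, which is the usual semi-supervised node-classification setup.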
In another aspect, the invention provides a graph convolutional neural network social recommendation system fusing a multi-channel attention mechanism, comprising:
a feature graph construction module, for computing the similarity of nodes in the social network graph G_t = (A_t, X) using cosine similarity, and then constructing the k-nearest-neighbor graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); where A_t denotes the symmetric adjacency matrix of G_t and A_f denotes the symmetric adjacency matrix of G_f;
a scattering module, for learning the embedding Z_S of multiple signals of the social network data graph G_t based on a graph convolutional neural network;
a topology module, for propagating the node features of G_t over the topological space, performing convolution operations, and learning the node embedding Z_t of G_t;
a feature module, for propagating the node features of G_f over the feature space, performing convolution operations, and learning the node embedding Z_f of G_f;
a combination module, for learning the combined embedding Z_c of G_t and G_f based on a graph convolutional neural network;
an attention module, for introducing an attention mechanism to dynamically adjust the weights of Z_S, Z_t, Z_f and Z_c, and computing the final embedding Z of each social network node from the adjusted Z_S, Z_t, Z_f and Z_c;
a social recommendation module, for computing the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
Further, the feature map construction module is specifically configured to:
firstly, calculating a similarity matrix of a social network data set by adopting a cosine similarity method
Figure BDA0003616632700000061
Then, the front k similar node pairs of each node in the social network are selected to set edges, and a k-nearest graph of the original graph, namely a social network characteristic graph, which is represented as G, is constructedf=(AfX), the calculation method of the similarity matrix S is as follows:
Figure BDA0003616632700000062
wherein XiAnd XjFeature vectors for social network nodes i and j, respectively.
Further, the scattering module is specifically configured to:
construction G based on inertia random walk matrixtUpper geometric scattering:
Figure BDA0003616632700000063
wherein InIs a matrix of units, and is,
Figure BDA0003616632700000064
as an original social network diagram GtAn adjacency matrix with added self-loops, D is
Figure BDA0003616632700000065
A diagonal matrix of (a);
in geometric scattering, 2 is introducedkWavelet of scaleTransformation of
Figure BDA0003616632700000066
Figure BDA0003616632700000071
Wherein U is0A high frequency signal representing the node itself;
using first and second order high frequency signals, namely:
Figure BDA0003616632700000072
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
Figure BDA0003616632700000073
wherein
Figure BDA0003616632700000074
Is GtThe weight matrix of the l-th layer, σ, is the activation function.
Further, the topology module is specifically configured to:
The convolution output Z_t^{(l)} of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is computed as shown in equation (6):

    Z_t^{(l)} = σ(D̃_t^{-1/2} Ã_t D̃_t^{-1/2} Z_t^{(l−1)} W_t^{(l)})    (6)

where W_t^{(l)} denotes the weight matrix of layer l of G_t, σ is the activation function, Ã_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and the initial embedding is Z_t^{(0)} = X.
Further, the feature module is specifically configured to:
The convolution output Z_f^{(l)} of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

    Z_f^{(l)} = σ(D̃_f^{-1/2} Ã_f D̃_f^{-1/2} Z_f^{(l−1)} W_f^{(l)})    (7)

where W_f^{(l)} denotes the weight matrix of layer l of G_f, σ is the activation function, Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the initial embedding is Z_f^{(0)} = X.
Further, the combination module is specifically configured to:
The original social network graph G_t and the social network feature graph G_f each undergo one convolution operation, yielding two convolution output representations; the two representations are then added together and used as the convolution input for the next layer of both graphs. This cycle repeats until all convolution operations are finished, and the final convolution output is taken as the combined embedding Z_c.
Further, the attention module is specifically configured to:
For any social network node k, its embedding in Z_s is denoted z_s^k. A weight vector q is used to compute the attention value ω_s^k of the node, as shown in equation (12):

    ω_s^k = q^T · tanh(W_s · (z_s^k)^T + b_s)    (12)

where W_s is a social network weight matrix. The attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained in the same way. The attention value of node k is then normalized with the Softmax function, as shown in equation (13):

    μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / (exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k))    (13)

μ_t^k, μ_c^k and μ_f^k are calculated on the same principle. Over all n social network nodes in the graph, these per-node values are collected into μ_s, μ_t, μ_c and μ_f, which denote the attention values of Z_s, Z_t, Z_c and Z_f, respectively.

The final embedding Z of each social network node is then computed from the attention values of the nodes of the social network:

    Z = μ_s · Z_s + μ_t · Z_t + μ_c · Z_c + μ_f · Z_f    (14)
further, the social recommendation module is specifically configured to:
The probability that social network node k belongs to class c is defined, and the predicted classes Ŷ of the n nodes are derived from the final embedding Z:

    Ŷ = softmax(W · Z + b)    (15)

where W is the weight vector of class c, and b represents the bias, a constant.
Compared with the prior art, the invention has the following beneficial effects:
the invention simultaneously considers the common information of the fusion node characteristics and the network topology and the frequently encountered over-smooth problem, and effectively solves the problems by combining the embedding and the addition of the geometric scattering. The most relevant information of node features and network topology is learned more effectively through a combined embedding process, and the occurrence of over-smoothing is relieved through a scattering process by learning different frequency signals of first-order and second-order neighbors. Experiments show that the social recommendation method and the social recommendation system can achieve good effects in social recommendation tasks.
Drawings
FIG. 1 is a flowchart of a method for social recommendation of a convolutional neural network incorporating a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 2 is a diagram of an overall architecture of a convolutional neural network social recommendation system incorporating a multi-channel attention mechanism according to an embodiment of the present invention;
FIG. 3 is a graph of the results of an attention mechanism analysis in accordance with an embodiment of the present invention.
Detailed Description
For a better understanding of the present application, the following explanations are first made:
given a social networking graph G ═ (a, X), a is its symmetric adjacency matrix, and
Figure BDA0003616632700000091
x is its feature matrix, an
Figure BDA0003616632700000092
If there is an edge connection between social network nodes i and j, then Aij1, otherwise Aij=0。
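To make the notation above concrete, a minimal NumPy sketch of a hypothetical four-user social graph (the edges and feature dimension below are illustrative, not from the patent):

```python
import numpy as np

# Hypothetical social graph G = (A, X) with 4 users and edges (0,1), (1,2), (2,3)
n, d = 4, 3
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1          # undirected edge => symmetric adjacency matrix

X = np.random.rand(n, d)           # feature matrix: one d-dimensional row per user

assert (A == A.T).all()            # A is symmetric, as the definition requires
assert A[0, 1] == 1 and A[0, 2] == 0
```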
The invention is further illustrated by the following examples in conjunction with the accompanying drawings:
As shown in FIG. 1, a graph convolutional neural network social recommendation method fusing a multi-channel attention mechanism (SM-GCN for short) includes:
step S101: computing a social network graph G using cosine similarityt=(AtX) and then constructing G based on the featurestK-nearest graph of (i.e. social network profile G)f=(AfX); wherein A istRepresents GtA symmetric adjacency matrix offRepresents GfA symmetric adjacency matrix of (a);
step S102: to alleviate the problem of over-smoothing, social network data graph G is learned based on graph convolution neural networkstEmbedding of multiple signals ZS
Step S103: g is to betThe characteristics of the nodes are spread on the topological space, convolution operation is carried out, and G is learnedtNode of (2) embedding ZtTo obtain GtUnique information;
step S104: g is to befThe characteristics of the nodes are spread on the characteristic space, convolution operation is carried out, and G is learnedfNode of (2) embedding ZfTo obtain GfUnique information;
step S105: neural network learning G based on graph convolutiontAnd GfCombined insertion of ZcTo obtain their common characteristic information;
step S106: dynamic adjustment of attention-drawing mechanism ZS、Zt、Zf、ZcBased on the adjusted ZS、Zt、Zf、ZcCalculating to obtain a final embedding Z of each social network node;
step S107: and calculating the category Y of the social network node based on the final embedding Z, so as to perform social recommendation, and improve the effect in the social recommendation task.
Further, the step S101 includes:
In constructing the social network feature graph, the cosine similarity method is used in view of computational complexity. Cosine similarity measures the difference between two nodes by computing the cosine of the angle between their two space vectors: the smaller the angle, the more similar the two nodes, and vice versa. Compared with distance metrics, cosine similarity focuses more on the difference between two vectors in direction.
First, the similarity matrix S ∈ ℝ^{n×n} of the social network dataset is computed using the cosine similarity method. Then, for each node in the social network, edges are set to its top k most similar node pairs, constructing the k-nearest-neighbor (kNN) graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is computed as follows:

    S_ij = (x_i · x_j) / (‖x_i‖ ‖x_j‖)    (1)

where x_i and x_j are the feature vectors of social network nodes i and j, respectively.
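The feature-graph construction described above can be sketched as follows; this is an illustrative NumPy implementation that assumes a straightforward top-k selection per node followed by symmetrization (details the patent does not specify):

```python
import numpy as np

def knn_feature_graph(X, k):
    """Build the social-network feature graph adjacency A_f from features X.

    Sketch of step S101: compute the cosine-similarity matrix S of eq. (1),
    keep edges from each node to its k most similar other nodes, and
    symmetrize so A_f is a valid undirected adjacency matrix.
    """
    norm = np.linalg.norm(X, axis=1, keepdims=True)
    S = (X @ X.T) / (norm @ norm.T)       # S_ij = x_i . x_j / (|x_i| |x_j|)
    np.fill_diagonal(S, -np.inf)          # exclude self-similarity
    A_f = np.zeros_like(S)
    for i in range(len(S)):
        for j in np.argsort(S[i])[-k:]:   # indices of the k most similar nodes
            A_f[i, j] = 1
    return np.maximum(A_f, A_f.T)         # symmetrize

# toy features: nodes 0,1 point one way, nodes 2,3 the other
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
A_f = knn_feature_graph(X, k=1)
```

With k = 1 on this toy data, nodes 0 and 1 select each other, as do nodes 2 and 3, so A_f contains exactly those two undirected edges.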
Further, the step S102 includes:
The essence of convolution is a process of Laplacian smoothing, which makes neighboring nodes more and more similar and thereby degrades the classification effect. In this step, the scattering features of the original social network graph are aggregated, and the wavelet matrix is used to extract the multi-scale differences of the first- and second-order neighborhoods, compensating for the purely low-pass filtering of traditional GCNs (graph convolutional networks); the scattering operation is complementary to the traditional GCN operation and alleviates the over-smoothing phenomenon.
The construction of geometric scattering on the graph is based on the lazy (inertial) random walk matrix, which can be expressed as:

    P = ½ (I_n + Ã_t D^{-1})    (2)

where I_n is the identity matrix, Ã_t is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Ã_t.

Since high-frequency signals can be recovered by multi-scale wavelet transforms, in geometric scattering the invention introduces wavelet transforms of scale 2^k:

    Ψ_0 = I_n − P,  Ψ_k = P^{2^{k−1}} − P^{2^k} = P^{2^{k−1}} (I_n − P^{2^{k−1}}), k ≥ 1    (3)

where the k = 0 term U_0 represents the high-frequency signal of the node itself. The first- and second-order high-frequency signals are used, namely:

    U = [Ψ_1, Ψ_2]    (4)

According to the propagation rule of GCNs, the scattering propagation rule is defined:

    Z_S^{(l+1)} = σ(U Z_S^{(l)} W_S^{(l)})    (5)

where W_S^{(l)} is the weight matrix of the l-th layer for the original social network graph G_t, and σ is the activation function; the ReLU function is used here.
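A minimal numerical sketch of the scattering operators, assuming the standard lazy-random-walk form P = ½(I + ÃD⁻¹) and band-pass wavelets Ψ_k = P^(2^(k−1)) − P^(2^k); the helper names are illustrative, not from the patent:

```python
import numpy as np

def lazy_walk(A):
    """Lazy random walk matrix P of eq. (2), with self-loops added to A."""
    A_tilde = A + np.eye(len(A))
    D_inv = np.diag(1.0 / A_tilde.sum(axis=0))
    return 0.5 * (np.eye(len(A)) + A_tilde @ D_inv)

def wavelet(P, k):
    """Psi_k = P^(2^(k-1)) - P^(2^k) for k >= 1: a band-pass filter at scale 2^k."""
    return np.linalg.matrix_power(P, 2 ** (k - 1)) - np.linalg.matrix_power(P, 2 ** k)

# toy 3-node path graph 0-1-2
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
P = lazy_walk(A)
U = [wavelet(P, k) for k in (1, 2)]   # first- and second-order signals of eq. (4)
```

P is column-stochastic (each column sums to 1), so each wavelet's columns sum to 0: the wavelets respond only to signal differences across scales, which is the band-pass behavior the scattering channel relies on.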
Further, the step S103 includes:
Graph convolution operations are performed on the original social network graph and on the feature graph respectively, learning the topological information representation and the feature information representation of their nodes.
The convolution output Z_t^{(l)} of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is computed as shown in equation (6):

    Z_t^{(l)} = σ(D̃_t^{-1/2} Ã_t D̃_t^{-1/2} Z_t^{(l−1)} W_t^{(l)})    (6)

where W_t^{(l)} denotes the weight matrix of layer l of G_t, σ is the activation function (the ReLU function is used here), Ã_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̃_t is the diagonal degree matrix of Ã_t, and the initial embedding is Z_t^{(0)} = X.
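The layer-wise propagation of equation (6) can be sketched as follows; a hypothetical dense NumPy implementation assuming a ReLU activation (real implementations typically use sparse matrices):

```python
import numpy as np

def gcn_layer(A, Z, W):
    """One graph-convolution layer in the form of eq. (6):
    Z' = ReLU(D~^{-1/2} A~ D~^{-1/2} Z W), with self-loops added to A."""
    A_tilde = A + np.eye(len(A))
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalization
    return np.maximum(A_hat @ Z @ W, 0)         # ReLU activation

rng = np.random.default_rng(0)
A = np.array([[0, 1], [1, 0]], dtype=float)     # toy 2-node graph
X = rng.normal(size=(2, 4))                     # Z_t^(0) = X
W = rng.normal(size=(4, 8))                     # layer weight matrix
Z1 = gcn_layer(A, X, W)                          # Z_t^(1)
```

The same function applied to A_f instead of A_t gives the feature-space convolution of equation (7), since the two layers differ only in the adjacency matrix and weights.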
Further, the step S104 includes:
The convolution output Z_f^{(l)} of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

    Z_f^{(l)} = σ(D̃_f^{-1/2} Ã_f D̃_f^{-1/2} Z_f^{(l−1)} W_f^{(l)})    (7)

where W_f^{(l)} denotes the weight matrix of layer l of G_f, σ is the activation function (the ReLU function is used here), Ã_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̃_f is the diagonal degree matrix of Ã_f, and the initial embedding is Z_f^{(0)} = X.
Further, the step S105 includes:
given that downstream social recommendation tasks may be related to common information of the original social networking graph and the feature graph, the present invention devises this step to learn common characteristics of the original social networking graph and the feature graph. The main idea is to enter data of two graphs into the same channel, and perform convolution operation by using the same parameters, so as to realize information sharing between the two graphs. Firstly, the original social network graph and the feature graph are respectively subjected to convolution operation once to obtain two convolution output representations, and then the two convolution output representations are added together to serve as the convolution input of the next convolution of the two graphs, so that the operation is repeated in a cycle. Compared with a method of performing convolution and adding on two graphs respectively, the method of the invention can highlight the common characteristic of the nodes by adding the representations after each layer of convolution.
The convolution output Z_ct^(1) of the first layer of the original social network graph G_t = (A_t, X) in the topological space is shown in equation (8):

Z_ct^(1) = σ( D̂_t^(-1/2) Â_t D̂_t^(-1/2) X W_c^(1) )   (8)

where σ is the activation function (the ReLU function is used here) and W_c^(1) is the weight matrix of the first layer of the social network, shared by both graphs.
The convolution output Z_cf^(1) of the first layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (9):

Z_cf^(1) = σ( D̂_f^(-1/2) Â_f D̂_f^(-1/2) X W_c^(1) )   (9)

where σ is the activation function (the ReLU function is used here) and W_c^(1) is the weight matrix of the first layer of the social network, with the same parameters as the weight matrix used in the topological space.
The input Z_c^(1) of the second layer is therefore given by equation (10):

Z_c^(1) = Z_ct^(1) + Z_cf^(1)   (10)

Repeating this over two or more convolution layers, the combined output Z_c^(l) is expressed as shown in equation (11):

Z_c^(l) = σ( D̂_t^(-1/2) Â_t D̂_t^(-1/2) Z_c^(l-1) W_c^(l) ) + σ( D̂_f^(-1/2) Â_f D̂_f^(-1/2) Z_c^(l-1) W_c^(l) )   (11)
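The shared-channel recursion of equations (8)-(11) can be sketched as follows (NumPy, toy matrices; the helper names and sizes are illustrative). The essential point is that both graphs are convolved with the same weight matrix at every layer and the two outputs are summed before the next layer:

```python
import numpy as np

def norm_adj(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def combined_embedding(A_t, A_f, X, weights):
    """Shared-parameter convolution over topology and feature graphs (eqs. 8-11)."""
    S_t, S_f = norm_adj(A_t), norm_adj(A_f)
    Z = X
    for W_c in weights:                      # the SAME W_c is used on both graphs
        Z = np.maximum(0.0, S_t @ Z @ W_c) + np.maximum(0.0, S_f @ Z @ W_c)
    return Z

A_t = np.array([[0., 1.], [1., 0.]])         # toy topology graph
A_f = np.array([[0., 0.], [0., 0.]])         # toy feature graph (no edges)
X = np.eye(2)
Zc = combined_embedding(A_t, A_f, X, [np.full((2, 2), 0.3)] * 2)
```

Because the weights are tied across the two branches, the gradient of each layer receives signal from both graphs, which is what forces the channel to learn their common characteristics.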
Further, the step S106 includes:
In order to improve the node classification task and the accuracy of social network recommendation, four social network node embeddings have been obtained, namely Z_s, Z_t, Z_c and Z_f. The method uses an attention mechanism to calculate the weight of each embedding and thereby emphasizes the information with the highest relevance. μ_s, μ_t, μ_c and μ_f denote the attention values of Z_s, Z_t, Z_c and Z_f respectively.
For any social network node k, its embedding in Z_s is denoted z_s^k. The invention uses a weight vector q to calculate the attention value ω_s^k of the node, as shown in equation (12):

ω_s^k = q^T · tanh( W · (z_s^k)^T + b )   (12)

where W is the social network weight matrix and b is a bias. Similarly, the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f can be obtained.
Then, the normalized attention value of social network node k is obtained with a Softmax function, as shown in equation (13):

μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)

Similarly, μ_t^k, μ_c^k and μ_f^k can be calculated. For all n social network nodes in the graph, stacking the per-node values yields the attention weights μ_s, μ_t, μ_c and μ_f.
according to the attention value, calculating to obtain the final embedding Z of each social network node:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14)
Specifically, the category to which each node belongs can be predicted based on the obtained final embedding Z, and recommendations of similar social interests and the like can then be made for the user according to the obtained category, making the recommendation more accurate.
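A minimal NumPy sketch of the attention fusion in equations (12)-(14); the dimensions, random values and scoring parameters (W, q, b) are illustrative placeholders:

```python
import numpy as np

def attention_fuse(embeddings, W, q, b):
    """Fuse per-channel node embeddings with softmax attention.

    embeddings: four (n, h) matrices [Z_s, Z_t, Z_c, Z_f];
    W: (h2, h) scoring matrix, q: (h2,) weight vector, b: (h2,) bias.
    """
    # omega[ch, i]: attention value of node i in channel ch -> shape (4, n)
    omega = np.stack([np.tanh(Z @ W.T + b) @ q for Z in embeddings])
    e = np.exp(omega - omega.max(axis=0))       # softmax across the 4 channels
    mu = e / e.sum(axis=0)
    # Z = mu_s*Z_s + mu_t*Z_t + mu_c*Z_c + mu_f*Z_f, weights applied per node
    return sum(m[:, None] * Z for m, Z in zip(mu, embeddings)), mu

rng = np.random.default_rng(0)
Zs, Zt, Zc, Zf = [rng.normal(size=(5, 4)) for _ in range(4)]
Z, mu = attention_fuse([Zs, Zt, Zc, Zf],
                       rng.normal(size=(3, 4)), rng.normal(size=3), np.zeros(3))
```

Per node, the four channel weights form a convex combination, so each fused row stays inside the range spanned by the four channel embeddings.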
Further, the step S107 includes:
Cross entropy measures the degree of difference between two probability distributions over the same random variable; in machine learning it expresses the difference between the true probability distribution and the predicted probability distribution. The smaller the cross entropy, the better the model prediction. The invention adopts a cross-entropy loss function as the final objective function. The probability that social network node k belongs to class c is defined as Ŷ_kc, so the predicted classes of the n nodes are Ŷ = [Ŷ_kc]. From the final embedding Z:

Ŷ = softmax( W · Z + b )

where W is the weight vector of class c and b represents the bias, which is a constant.

Assuming that the training set is L, for each l ∈ L the actual label is Y_l and the predicted label is Ŷ_l. The loss of node classification is then expressed as:

Loss = − Σ_{l∈L} Σ_{c=1}^{C} Y_lc · ln Ŷ_lc
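The node-classification loss above can be sketched as follows (NumPy; the one-hot labels, predicted probabilities and training indices are illustrative):

```python
import numpy as np

def cross_entropy_loss(Y, Y_hat, train_idx):
    """Mean cross-entropy over the labelled training nodes.

    Y: (n, C) one-hot true labels; Y_hat: (n, C) predicted class probabilities.
    """
    eps = 1e-12                                   # guard against log(0)
    per_node = -(Y * np.log(Y_hat + eps)).sum(axis=1)
    return per_node[train_idx].mean()

Y = np.array([[1., 0.], [0., 1.], [1., 0.]])      # true labels (one-hot)
Y_hat = np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]])
loss = cross_entropy_loss(Y, Y_hat, [0, 1])       # only nodes 0 and 1 are labelled
# loss = -(ln 0.9 + ln 0.8) / 2 ≈ 0.1643
```

Only the labelled nodes of the training set contribute to the sum, matching the semi-supervised setting used throughout the patent.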
On the basis of the above embodiment, as shown in Fig. 2, the present invention further provides a graph convolution neural network social recommendation (abbreviated as SM-GCN) system fusing a multi-channel attention mechanism, comprising:

a feature graph construction module, for calculating the feature similarity of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing the k-nearest graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); wherein A_t represents the symmetric adjacency matrix of G_t and A_f represents the symmetric adjacency matrix of G_f;

a scattering module, for learning the embedding Z_s of multiple signals of the social network data graph G_t based on a graph convolution neural network;

a topology module, for propagating the features of the nodes of G_t over the topological space, performing convolution operations, and learning the node embedding Z_t of G_t;

a feature module, for propagating the features of the nodes of G_f over the feature space, performing convolution operations, and learning the node embedding Z_f of G_f;

a combination module, for learning the combined embedding Z_c of G_t and G_f based on a graph convolution neural network;

an attention module, for dynamically adjusting Z_s, Z_t, Z_f and Z_c with an attention mechanism, and calculating the final embedding Z of each social network node based on the adjusted Z_s, Z_t, Z_f and Z_c;

and a social recommendation module, for calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
Further, the feature map construction module is specifically configured to:
First, the similarity matrix S of the social network data set is calculated with the cosine similarity method; then, for each node in the social network, edges are set to its top k most similar node pairs, constructing the k-nearest graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X). The similarity matrix S is calculated as follows:

S_ij = (x_i · x_j) / ( |x_i| · |x_j| )

where x_i and x_j are the feature vectors of social network nodes i and j, respectively.
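The construction described by the module can be sketched in NumPy as follows; k and the toy feature matrix are illustrative:

```python
import numpy as np

def knn_feature_graph(X, k):
    """Cosine similarity matrix S and the adjacency A_f of its k-nearest graph."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    S = (X @ X.T) / (norms * norms.T)        # S_ij = x_i . x_j / (|x_i| |x_j|)
    S_masked = S.copy()
    np.fill_diagonal(S_masked, -np.inf)      # a node is not its own neighbour
    A_f = np.zeros_like(S)
    for i in range(len(S)):
        for j in np.argsort(S_masked[i])[-k:]:   # top-k most similar nodes
            A_f[i, j] = A_f[j, i] = 1.0          # set symmetric edges
    return S, A_f

X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
S, A_f = knn_feature_graph(X, k=1)
# nodes 0 and 1 are most similar to each other; node 2 links to node 1
```

Symmetrizing the top-k edges keeps A_f a symmetric adjacency matrix, as required by the later convolutions.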
Further, the scattering module is specifically configured to:
The geometric scattering on G_t is constructed from the lazy random walk matrix:

P = (1/2) ( I_n + Â_t D^(-1) )

where I_n is the identity matrix, Â_t = A_t + I_n is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Â_t.

In geometric scattering, wavelet transforms of scale 2^k are introduced:

Ψ_k = P^(2^(k-1)) − P^(2^k), k ≥ 1

where U_0 = I_n − P represents the high-frequency signal of the node itself. The first- and second-order high-frequency signals are used, namely:

Ψ_1 = P − P^2,  Ψ_2 = P^2 − P^4

According to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:

Z_s^(l) = σ( Ψ_k Z_s^(l-1) W_s^(l) )

where W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function.
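Under the lazy-walk and wavelet definitions above (which follow the geometric-scattering literature and are partly reconstructed here, since the original equations are images), the operators can be sketched in NumPy; the toy graph is illustrative:

```python
import numpy as np

def lazy_walk(A):
    """Lazy random walk P = 1/2 (I + A_hat D^-1), with self-loops added to A."""
    n = A.shape[0]
    A_hat = A + np.eye(n)
    D_inv = np.diag(1.0 / A_hat.sum(axis=0))     # inverse degree matrix
    return 0.5 * (np.eye(n) + A_hat @ D_inv)

def wavelet(P, k):
    """Scale-2^k wavelet Psi_k = P^(2^(k-1)) - P^(2^k), k >= 1."""
    return (np.linalg.matrix_power(P, 2 ** (k - 1))
            - np.linalg.matrix_power(P, 2 ** k))

A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])
P = lazy_walk(A)
Psi1, Psi2 = wavelet(P, 1), wavelet(P, 2)        # first/second-order signals
```

Each column of P sums to 1 (a walk distribution), so each wavelet's columns sum to 0: the wavelets pass band-limited, higher-frequency components that a plain GCN layer smooths away.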
Further, the topology module is specifically configured to:
The convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is calculated as shown in equation (6):

Z_t^(l) = σ( D̂_t^(-1/2) Â_t D̂_t^(-1/2) Z_t^(l-1) W_t^(l) )   (6)

where W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function, Â_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̂_t is the diagonal degree matrix of Â_t, and Z_t^(0) = X.
Further, the feature module is specifically configured to:
The convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):

Z_f^(l) = σ( D̂_f^(-1/2) Â_f D̂_f^(-1/2) Z_f^(l-1) W_f^(l) )   (7)

where W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function, Â_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̂_f is the diagonal degree matrix of Â_f, and Z_f^(0) = X.
Further, the combination module is specifically configured to:
The original social network graph G_t and the social network feature graph G_f are each subjected to one convolution operation, yielding two convolution output representations; the two representations are added together and used as the convolution input of the next layer for both graphs, and this repeats until all convolution operations are finished; the final convolution output is used as the combined embedding Z_c.
Further, the attention module is specifically configured to:
For any social network node k, its embedding in Z_s is denoted z_s^k. Using a weight vector q, the attention value ω_s^k of the node is calculated as shown in equation (12):

ω_s^k = q^T · tanh( W · (z_s^k)^T + b )   (12)

where W is the social network weight matrix and b is a bias;

the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained in the same way; then the normalized attention value of social network node k is obtained with a Softmax function, as shown in equation (13):

μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)

μ_t^k, μ_c^k and μ_f^k are calculated in the same way; for all n social network nodes in the graph this yields μ_s, μ_t, μ_c and μ_f, the attention values of Z_s, Z_t, Z_c and Z_f respectively;

the final embedding Z of each social network node is calculated according to the attention value of each node of the social network:

Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14).
further, the social recommendation module is specifically configured to:
The probability that social network node k belongs to class c is defined as Ŷ_kc; the predicted classes of the n nodes are Ŷ = [Ŷ_kc], derived from the final embedding Z as:

Ŷ = softmax( W · Z + b )

where W is the weight vector of class c and b represents the bias, which is a constant.
To verify the effect of the present invention, the following experiment was performed:
Specifically, the invention was evaluated on three social networks (UAI2010, BlogCatalog and Flickr), with the data set information shown in Table 1.
Table 1 data set statistics
Dataset       Nodes   Edges    Classes   Features
BlogCatalog   5196    171743   6         8189
Flickr        7575    239738   9         12047
UAI2010       3067    28311    19        4973
In order to prove the effectiveness of the method, the method of the invention is compared with the following six popular node classification methods:
GCN [Thomas N. Kipf and Max Welling. 2016. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)]: one of the most popular semi-supervised graph convolution network models at present; it achieves better representations by aggregating the information of neighbor nodes.
KNN-GCN [Luca Franceschi, Mathias Niepert, Massimiliano Pontil, and Xiao He. 2019. Learning discrete structures for graph neural networks. ICML (2019)]: a variant of GCN that uses a k-nearest-neighbor graph computed from the features as the input graph.
GAT [Petar Veličković et al. 2018. Graph Attention Networks. ICLR (2018)]: a graph neural network model widely used as a GNN baseline; it aggregates node features according to attention scores learned between different nodes.
DEMO-Net [Pei H, Wei B, Chang C, et al. Geom-GCN: Geometric Graph Convolutional Networks [J]. 2020]: a degree-specific graph neural network for node classification.
MixHop [Sami Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Nazanin Alipourfard, Kristina Lerman, Hrayr Harutyunyan, Greg Ver Steeg, and Aram Galstyan. 2019. MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing. In ICML. 21-29]: a method whose graph convolution layers represent features by mixing neighbors at various distances.
AM-GCN [Wang X, Zhu M, Bo D, et al. AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [C] // KDD '20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, 2020]: a network model combining the feature graph, the original graph, and a common representation of the two.
The invention selects 20 labelled nodes in each class as the training set and 1000 nodes as the test set. Two GCN layers are trained, with hidden sizes nhid1 ∈ {512, 768} and nhid2 ∈ {32, 128, 256}. The learning rate is set to 0.0002-0.0005, the dropout rate to 0.5, the weight decay ∈ {5e-3, 5e-4}, and the order k of the k-nearest graph ∈ {2, ..., 9}. For all baselines, the parameters reported in their papers are used. All experiments were run 5 times and the results averaged. Model performance is evaluated with Accuracy (ACC) and the macro F1 score (F1). Accuracy is the ratio of the number of samples correctly classified by the classifier to the total number of samples on a given test set, i.e. the probability that a prediction is correct. Because accuracy alone is a poor indicator when positive and negative samples are unbalanced, the macro F1 score is added: the per-class harmonic mean of precision and recall, averaged over classes.
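The two metrics can be computed as in the sketch below (pure NumPy, hypothetical label vectors; in practice scikit-learn's `f1_score` with `average='macro'` gives the same macro-F1):

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of correctly classified samples."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

def macro_f1(y_true, y_pred):
    """Unweighted mean over classes of the per-class F1 score."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    f1s = []
    for c in np.unique(y_true):
        tp = np.sum((y_pred == c) & (y_true == c))
        prec = tp / max(np.sum(y_pred == c), 1)   # precision for class c
        rec = tp / max(np.sum(y_true == c), 1)    # recall for class c
        f1s.append(0.0 if prec + rec == 0 else 2 * prec * rec / (prec + rec))
    return float(np.mean(f1s))

y_true = [0, 0, 1, 1, 2, 2]                       # hypothetical labels
y_pred = [0, 1, 1, 1, 2, 0]
acc = accuracy(y_true, y_pred)                    # 4 of 6 correct
```

Because macro-F1 averages classes with equal weight, rare classes count as much as frequent ones, which is why it complements plain accuracy on unbalanced data.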
The experimental environment is as follows:
the operating system: ubuntu Linux release 16.04.7 LTS;
·CPU:Intel(R)Xeon(R)Silver CPU@2.20GHz;
·GPU:Quadro P4000;
software version: python 3.7; pytrch 1.1.0; numpy 1.16.2; SciPy 1.3.1; networkx 2.4; scikit-leann 0.21.3.
The results of the social recommendation task in this environment are shown in Table 2. The SM-GCN proposed by the invention achieves the best performance on both evaluation criteria for all datasets. In particular, on the BlogCatalog data set ACC and F1 are improved by 9.82% and 10.07% respectively. From this analysis, the improvement may be attributed to the following factors.
1. By comparing the performance of GCN and KNN-GCN on the datasets, it can be seen that the two have different effects on different datasets, which indicates that both the topology graph and the feature graph are important.
2. By comparing the performance of KNN-GCN and MixHop on the datasets, it can be seen that KNN-GCN performs well on paper-citation networks while MixHop performs well on social networks, which indicates that learning the features of neighbor nodes is very necessary.
3. SM-GCN better learns the common node information of the topology graph and the feature graph, the scattering module alleviates over-smoothing, and the experimental results improve on AM-GCN by at least 1%.
Table 2 Social recommendation results in percent (best results bolded)
In order to show the effectiveness of the proposed modules, an ablation experiment is carried out on the proposed SM-GCN model. TF-GCN is the model containing only the topology module and the feature module; S-GCN contains the topology, feature and scattering modules; C-GCN contains the topology, feature and combination modules. The experimental results are shown in Table 3.
Table 3 Ablation test results in percent (best results bolded)
On the BlogCatalog data set the model is greatly improved. From the experimental results of TF-GCN and S-GCN, it can be speculated that the improvement comes from the combination module learning the common characteristics of the original graph and the feature graph. S-GCN, with the scattering module added, improves on TF-GCN, which uses only the original graph and the feature graph, because the scattering module adds signals of other frequencies to the original graph.
Compared with TF-GCN, the classification effect of the full model is improved, which demonstrates the necessity of learning with the scattering module and the combination module at the same time.
In order to understand which embeddings the classification task favors, a distribution analysis of the attention mechanism of the model is carried out, as shown in Fig. 3.
Before the attention mechanism is added, the model has four node embeddings: original-graph node embedding, feature-graph node embedding, combined node embedding, and the scattering node embedding of the original graph. From Fig. 3 it can be seen that on all three data sets (BlogCatalog, Flickr, UAI2010) the combined node embedding plays a crucial role. The scattering node embedding is more important than the topology and feature node embeddings, and on the UAI2010 data set the feature node embedding is second only to the combined node embedding, so that data set leans more toward feature information. The attention value of the scattering node embedding being higher than that of the topology node embedding indicates that high-frequency signals contribute more to the classification task than the low-frequency signals produced by GCN, so adding the scattering module is very necessary. In conclusion, the model distributes the weights more adaptively, so the social recommendation task achieves a better effect.
In summary, the proposed SM-GCN simultaneously considers the fusion of the common information of node features and network topology and the frequently encountered over-smoothing problem; the combined embedding and the addition of geometric scattering effectively address these problems. The combined embedding process learns the most relevant information of node features and network topology more effectively, and the scattering process alleviates over-smoothing by learning the different frequency signals of first- and second-order neighbors. Experiments show that the invention achieves good results on social recommendation tasks.
The above shows only the preferred embodiments of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (10)

1. A graph convolution neural network social recommendation method fusing a multi-channel attention mechanism is characterized by comprising the following steps:
step 1: computing social network graph G using cosine similarityt=(AtX) and then constructing G based on the featurestK-nearest graph of (i.e. social network profile G)f=(AfX); wherein A istRepresents GtA symmetric adjacency matrix offRepresents GfA symmetric adjacency matrix of (a);
and 2, step: learning social network data graph G based on graph convolution neural networktEmbedding of multiple signals ZS
And 3, step 3: g is to betThe characteristics of the nodes are spread on the topological space, convolution operation is carried out, and G is learnedtNode of (2) embedding Zt
And 4, step 4: g is to befThe characteristics of the nodes are spread on the characteristic space, convolution operation is carried out, and G is learnedfNode of (2) embedding Zf
And 5: neural network learning G based on graph convolutiontAnd GfCombined insertion of Zc
Step 6: dynamic adjustment of attention-drawing mechanism ZS、Zt、Zf、ZcBased on the adjusted ZS、Zt、Zf、ZcCalculating to obtain a final embedding Z of each social network node;
and 7: and calculating the category Y of the social network node based on the final embedding Z so as to perform social recommendation.
2. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 1 comprises:
first, the similarity matrix S of the social network data set is calculated with the cosine similarity method; then, for each node in the social network, edges are set to its top k most similar node pairs, constructing the k-nearest graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as follows:
S_ij = (x_i · x_j) / ( |x_i| · |x_j| )
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively.
3. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 2 comprises:
the geometric scattering on G_t is constructed from the lazy random walk matrix:
P = (1/2) ( I_n + Â_t D^(-1) )
wherein I_n is the identity matrix, Â_t = A_t + I_n is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Â_t;
in geometric scattering, wavelet transforms of scale 2^k are introduced:
Ψ_k = P^(2^(k-1)) − P^(2^k), k ≥ 1
wherein U_0 = I_n − P represents the high-frequency signal of the node itself;
the first- and second-order high-frequency signals are used, namely:
Ψ_1 = P − P^2,  Ψ_2 = P^2 − P^4
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
Z_s^(l) = σ( Ψ_k Z_s^(l-1) W_s^(l) )
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function.
4. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 3 comprises:
the convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is calculated as shown in equation (6):
Z_t^(l) = σ( D̂_t^(-1/2) Â_t D̂_t^(-1/2) Z_t^(l-1) W_t^(l) )   (6)
wherein W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function, Â_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̂_t is the diagonal degree matrix of Â_t, and Z_t^(0) = X.
5. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 4 comprises:
the convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):
Z_f^(l) = σ( D̂_f^(-1/2) Â_f D̂_f^(-1/2) Z_f^(l-1) W_f^(l) )   (7)
wherein W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function, Â_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̂_f is the diagonal degree matrix of Â_f, and Z_f^(0) = X.
6. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 5 comprises:
the original social network graph G_t and the social network feature graph G_f are each subjected to one convolution operation, yielding two convolution output representations; the two representations are added together and used as the convolution input of the next layer for both graphs, and this repeats until all convolution operations are finished; the final convolution output is used as the combined embedding Z_c.
7. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 6 comprises:
for any social network node k, its embedding in Z_s is denoted z_s^k; using a weight vector q, the attention value ω_s^k of the node is calculated as shown in equation (12):
ω_s^k = q^T · tanh( W · (z_s^k)^T + b )   (12)
wherein W is the social network weight matrix and b is a bias;
the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained in the same way; then the normalized attention value of social network node k is obtained with a Softmax function, as shown in equation (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
μ_t^k, μ_c^k and μ_f^k are calculated in the same way; for all n social network nodes in the graph this yields μ_s, μ_t, μ_c and μ_f, the attention values of Z_s, Z_t, Z_c and Z_f respectively;
the final embedding Z of each social network node is calculated according to the attention value of each node of the social network:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14).
8. The graph convolution neural network social recommendation method fusing a multi-channel attention mechanism according to claim 1, wherein the step 7 comprises:
the probability that social network node k belongs to class c is defined as Ŷ_kc; the predicted classes of the n nodes are Ŷ = [Ŷ_kc], derived from the final embedding Z as:
Ŷ = softmax( W · Z + b )
wherein W is the weight vector of class c and b represents the bias, which is a constant.
9. A graph convolution neural network social recommendation system fusing a multi-channel attention mechanism is characterized by comprising:
a feature graph construction module, for calculating the feature similarity of the social network graph G_t = (A_t, X) using cosine similarity, and then constructing the k-nearest graph of G_t based on the features, i.e. the social network feature graph G_f = (A_f, X); wherein A_t represents the symmetric adjacency matrix of G_t and A_f represents the symmetric adjacency matrix of G_f;
a scattering module, for learning the embedding Z_s of multiple signals of the social network data graph G_t based on a graph convolution neural network;
a topology module, for propagating the features of the nodes of G_t over the topological space, performing convolution operations, and learning the node embedding Z_t of G_t;
a feature module, for propagating the features of the nodes of G_f over the feature space, performing convolution operations, and learning the node embedding Z_f of G_f;
a combination module, for learning the combined embedding Z_c of G_t and G_f based on a graph convolution neural network;
an attention module, for dynamically adjusting Z_s, Z_t, Z_f and Z_c with an attention mechanism, and calculating the final embedding Z of each social network node based on the adjusted Z_s, Z_t, Z_f and Z_c;
and a social recommendation module, for calculating the category Y of each social network node based on the final embedding Z, so as to perform social recommendation.
10. The system of claim 9, wherein the feature graph construction module is specifically configured to:
first, calculate the similarity matrix S of the social network data set with the cosine similarity method; then, for each node in the social network, set edges to its top k most similar node pairs, constructing the k-nearest graph of the original graph, i.e. the social network feature graph, denoted G_f = (A_f, X); the similarity matrix S is calculated as follows:
S_ij = (x_i · x_j) / ( |x_i| · |x_j| )
wherein x_i and x_j are the feature vectors of social network nodes i and j, respectively;
the scattering module is specifically configured to:
the geometric scattering on G_t is constructed from the lazy random walk matrix:
P = (1/2) ( I_n + Â_t D^(-1) )
wherein I_n is the identity matrix, Â_t = A_t + I_n is the adjacency matrix of the original social network graph G_t with self-loops added, and D is the diagonal degree matrix of Â_t;
in geometric scattering, wavelet transforms of scale 2^k are introduced:
Ψ_k = P^(2^(k-1)) − P^(2^k), k ≥ 1
wherein U_0 = I_n − P represents the high-frequency signal of the node itself;
the first- and second-order high-frequency signals are used, namely:
Ψ_1 = P − P^2,  Ψ_2 = P^2 − P^4
according to the propagation rule of the graph convolution neural network, the following scattering propagation rule is defined:
Z_s^(l) = σ( Ψ_k Z_s^(l-1) W_s^(l) )
wherein W_s^(l) is the weight matrix of the l-th layer of G_t, and σ is the activation function;
the topology module is specifically configured to:
the convolution output Z_t^(l) of the l-th layer of the original social network graph G_t = (A_t, X) in the topological space is calculated as shown in equation (6):
Z_t^(l) = σ( D̂_t^(-1/2) Â_t D̂_t^(-1/2) Z_t^(l-1) W_t^(l) )   (6)
wherein W_t^(l) represents the weight matrix of the l-th layer of G_t, σ is the activation function, Â_t = A_t + I_n is the adjacency matrix of G_t with self-loops added, D̂_t is the diagonal degree matrix of Â_t, and Z_t^(0) = X;
The feature module is specifically configured to:
the convolution output Z_f^(l) of the l-th layer of the social network feature graph G_f = (A_f, X) in the feature space is shown in equation (7):
Z_f^(l) = σ( D̂_f^(-1/2) Â_f D̂_f^(-1/2) Z_f^(l-1) W_f^(l) )   (7)
wherein W_f^(l) represents the weight matrix of the l-th layer of G_f, σ is the activation function, Â_f = A_f + I_n is the adjacency matrix of G_f with self-loops added, D̂_f is the diagonal degree matrix of Â_f, and Z_f^(0) = X;
The combination module is specifically configured to:
the original social network graph G_t and the social network feature graph G_f are each subjected to one convolution operation, yielding two convolution output representations; the two representations are added together and used as the convolution input of the next layer for both graphs, and this repeats until all convolution operations are finished; the final convolution output is used as the combined embedding Z_c;
The attention module is specifically configured to:
for any social network node k, its embedding in Z_s is denoted z_s^k; using a weight vector q, the attention value ω_s^k of the node is calculated as shown in equation (12):
ω_s^k = q^T · tanh( W · (z_s^k)^T + b )   (12)
wherein W is the social network weight matrix and b is a bias;
the attention values ω_t^k, ω_c^k and ω_f^k of social network node k in Z_t, Z_c and Z_f are obtained in the same way; then the normalized attention value of social network node k is obtained with a Softmax function, as shown in equation (13):
μ_s^k = softmax(ω_s^k) = exp(ω_s^k) / ( exp(ω_s^k) + exp(ω_t^k) + exp(ω_c^k) + exp(ω_f^k) )   (13)
μ_t^k, μ_c^k and μ_f^k are calculated in the same way; for all n social network nodes in the graph this yields μ_s, μ_t, μ_c and μ_f, the attention values of Z_s, Z_t, Z_c and Z_f respectively;
the final embedding Z of each social network node is calculated according to the attention value of each node of the social network:
Z = μ_s·Z_s + μ_t·Z_t + μ_c·Z_c + μ_f·Z_f   (14);
the social recommendation module is specifically configured to:
defining the probability that social network node k belongs to class c as pc^k, the predicted classes Ŷ of the n nodes are derived from the final embedding Z as:

Ŷ = softmax(W·Z + b)

wherein W is the weight vector of class c, and b is a constant bias term.
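A minimal sketch of this prediction step, assuming Z is laid out with one node per row so the linear transform is written Z·W (the claim's W·Z form corresponds to the transposed layout):

```python
import numpy as np

def predict_classes(Z, W, b):
    """Row-wise softmax over a linear transform of the final embedding Z:
    Y_hat = softmax(Z·W + b). Row k of Y_hat holds the class probabilities
    of social network node k; its argmax is the predicted class."""
    logits = Z @ W + b
    ex = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
    Y_hat = ex / ex.sum(axis=1, keepdims=True)
    return Y_hat, Y_hat.argmax(axis=1)
```

For recommendation, the per-class probabilities (or the node embeddings Z themselves) can then be ranked to produce candidate friends or items for each user node.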
CN202210445519.5A 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms Active CN114677234B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210445519.5A CN114677234B (en) 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms


Publications (2)

Publication Number Publication Date
CN114677234A true CN114677234A (en) 2022-06-28
CN114677234B CN114677234B (en) 2024-04-30

Family

ID=82080899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210445519.5A Active CN114677234B (en) 2022-04-26 2022-04-26 Graph convolution neural network social recommendation method and system integrating multichannel attention mechanisms

Country Status (1)

Country Link
CN (1) CN114677234B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104601438A (en) * 2014-04-28 2015-05-06 腾讯科技(深圳)有限公司 Friend recommendation method and device
CN106980659A (en) * 2017-03-20 2017-07-25 华中科技大学鄂州工业技术研究院 A social activity recommendation method based on a heterogeneous graph model
KR101872733B1 (en) * 2017-06-14 2018-06-29 조선대학교산학협력단 System for recommending social networking service following and method for recommending social networking service following using it
CN108320187A (en) * 2018-02-02 2018-07-24 合肥工业大学 A recommendation method based on deep social networks
CN109410080A (en) * 2018-10-16 2019-03-01 合肥工业大学 A kind of social image recommended method based on level attention mechanism
CN110009093A (en) * 2018-12-07 2019-07-12 阿里巴巴集团控股有限公司 Neural network system and method for analyzing relational network graphs
CN111259142A (en) * 2020-01-14 2020-06-09 华南师范大学 Specific target emotion classification method based on attention coding and graph convolution network
CN111428147A (en) * 2020-03-25 2020-07-17 合肥工业大学 Social recommendation method of heterogeneous graph volume network combining social and interest information
CN111523051A (en) * 2020-04-24 2020-08-11 山东师范大学 Social interest recommendation method and system based on graph convolution matrix factorization
CN113158071A (en) * 2021-03-19 2021-07-23 广东工业大学 Knowledge social contact recommendation method, system and equipment based on graph neural network
US20210248461A1 (en) * 2020-02-11 2021-08-12 Nec Laboratories America, Inc. Graph enhanced attention network for explainable poi recommendation
CN114036405A (en) * 2021-11-02 2022-02-11 扬州大学 Social contact recommendation method and system based on graph convolution network


Also Published As

Publication number Publication date
CN114677234B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
CN113407759B (en) Multi-modal entity alignment method based on adaptive feature fusion
CN106411572B (en) A kind of community discovery method of combination nodal information and network structure
CN113656596A (en) Multi-modal entity alignment method based on triple screening fusion
CN102879677A (en) Intelligent fault diagnosis method based on rough Bayesian network classifier
CN107391670A (en) A kind of mixing recommendation method for merging collaborative filtering and user property filtering
Feng et al. Computational social indicators: a case study of chinese university ranking
CN107784327A (en) A kind of personalized community discovery method based on GN
CN112311608B (en) Multilayer heterogeneous network space node characterization method
CN116340646A (en) Recommendation method for optimizing multi-element user representation based on hypergraph motif
CN115310005A (en) Neural network recommendation method and system based on meta-path fusion and heterogeneous network
CN116416478A (en) Bioinformatics classification model based on graph structure data characteristics
CN103793747A (en) Sensitive information template construction method in network content safety management
Hu et al. M-gcn: Multi-scale graph convolutional network for 3d point cloud classification
Li et al. PC-Conv: Unifying Homophily and Heterophily with Two-fold Filtering
CN105159918A (en) Trust correlation based microblog network community discovery method
CN114677234A (en) Graph convolution neural network social contact recommendation method and system integrating multi-channel attention mechanism
Choi et al. Finding heterophilic neighbors via confidence-based subgraph matching for semi-supervised node classification
Kek et al. Multi-timescale wavelet scattering with genetic algorithm feature selection for acoustic scene classification
Tabak et al. Topological properties of bank networks: the case of Brazil
CN110580280A (en) Method, device and storage medium for discovering new words
CN111444454A (en) Dynamic community dividing method based on spectrum method
Zhao et al. ENADPool: The Edge-Node Attention-based Differentiable Pooling for Graph Neural Networks
Fu Pairwise constraint propagation via low-rank matrix recovery
CN115134305B (en) Dual-core cooperation SDN big data network flow accurate classification method
Zhang et al. Personalized web page ranking based graph convolutional network for community detection in attribute networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant