CN111709474A - Graph embedding link prediction method fusing topological structure and node attributes - Google Patents

Graph embedding link prediction method fusing topological structure and node attributes

Info

Publication number
CN111709474A
CN111709474A
Authority
CN
China
Prior art keywords
node
network
nodes
attribute
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010547872.5A
Other languages
Chinese (zh)
Inventor
周明强
孔亦涵
张申申
张程
金海江
刘丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN202010547872.5A priority Critical patent/CN111709474A/en
Publication of CN111709474A publication Critical patent/CN111709474A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9536Search customisation based on social or collaborative filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01Social networking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention provides a graph embedding link prediction method fusing a topological structure and node attributes, comprising the following steps: S1, learning the topological structure information of the network through a graph embedding algorithm and embedding it into a vector space to obtain node-based embedded structure feature vectors; S2, uniformly encoding the node attributes in the network to obtain node attribute feature vectors; and S3, nonlinearly fusing the structural features and the attribute features with a deep neural network and outputting a link prediction score. The invention integrates the topological structure features and the node attribute information in the network well and helps users obtain more accurate relevance-based recommendations.

Description

Graph embedding link prediction method fusing topological structure and node attributes
Technical Field
The invention relates to a link prediction algorithm, in particular to a graph embedding link prediction method fusing a topological structure and node attributes.
Background
The invention patent with publication number CN109492133A proposes a link prediction method based on motifs and network embedding. Its specific steps are as follows: first, the first-order and second-order similarities of nodes are redefined according to the higher-order network structure motif; then a multi-encoder model with shared parameters is designed, which inputs the neighbor information of all nodes in one motif at a time and adds a first-order similarity constraint on the generated vector representations. This link prediction method considers only the topological structure information of the nodes in the network and ignores the rich and diverse attribute information of the nodes.
The invention patent publication CN109214599A proposes an end-to-end link prediction model based on a graph attention network. First, the topology of an unweighted, undirected homogeneous network is input; then first-order and second-order neighbor sampling is performed on all nodes according to the topology of the training set in order to batch the network; the batched training set is then input into the model to train the model parameters; finally, a node pair to be predicted is input, and the model outputs the probability that an edge exists between the pair.
Both methods consider only the topological structure information of the nodes in the network and depend in particular on hand-defined structural similarity indices, so their prediction results differ widely across networks; a universal structural similarity index is difficult to provide, and the rich and diverse attribute information of the nodes is ignored.
Network data obtained in the real world contains not only topological structure information but also rich attribute information. For example, in a social network, besides the friend relationships among users, the network includes a large amount of user attribute label information; this information is also important for tasks such as friend recommendation in the social network and can greatly improve prediction accuracy. Therefore, how to make full use of both the structural information and the attribute information in the network is the main research problem addressed here.
Disclosure of Invention
The problem to be solved by the present invention is to provide, in view of the above deficiencies in the prior art, a graph embedding link prediction method fusing topological structure and node attributes, and thereby solve the prior-art problem that topological structure and node attribute information cannot be fused simultaneously for link prediction.
In order to achieve the purpose, the invention adopts the following technical scheme:
a graph embedding link prediction method fusing topological structure and node attributes comprises the following steps: s1, learning the topological structure information of the network through a graph embedding algorithm, and embedding the topological structure information of the network into a vector space to obtain an embedded structure feature vector based on the node; s2, uniformly coding the node attributes in the network to obtain node attribute feature vectors; and S3, carrying out nonlinear fusion on the structural features and the attribute features by using a deep neural network and outputting a link prediction score.
Compared with the prior art, the invention has the following beneficial effects: the proposed link prediction algorithm realizes link prediction by fusing the topological structure and the node attribute information. First, graph embedding is applied to the study of the link prediction problem: to address the difficulty of finding a universal structural similarity index, the topological structure information of the network is learned through a graph embedding algorithm and embedded into a low-dimensional, dense vector space to obtain node-based embedded structure feature vectors. The node attributes in the network are then uniformly encoded to obtain node attribute feature vectors, and a deep neural network nonlinearly fuses the structural and attribute features and outputs a link prediction score.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a diagram illustrating attribute encoding of a network node according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a depth feature fusion network according to an embodiment of the present invention;
FIG. 3 is a general flow diagram of one embodiment of the present invention.
Detailed Description
To make the technical means, characteristics, purposes, and effects of the invention clearer and easier to understand, the invention is further explained below with reference to the drawings and specific embodiments:
the link prediction algorithm disclosed by the invention is mainly divided into three steps: firstly, applying a graph embedding technology to a link prediction problem research, and aiming at the problem that a structure similarity index is difficult to be universal, learning the topological structure information of the network through a graph embedding algorithm, and embedding the topological structure information of the network into a low-dimensional and dense vector space to obtain an embedded structure feature vector based on a node. Then, the node attributes in the network are uniformly coded to obtain a node attribute feature vector. And finally, carrying out nonlinear fusion on the structural features and the attribute features by using a deep neural network and outputting a link prediction score.
Before further elaborating the preferred algorithm steps, the basic concepts are first defined. A simple unweighted, undirected network is represented as G = (V, E), where V is the set of nodes in the network and E is the set of edges. For an undirected network, the node pairs (v_i, v_j) and (v_j, v_i) represent the same link. A simple unweighted undirected network is generally defined by: (1) nodes have no self-loops, i.e., a node cannot be connected to itself; (2) at most one edge exists between any two nodes, and edges carry no weight; (3) edges between nodes do not depend on the connection direction, i.e., the two endpoints are considered equivalent in the process of link formation. An adjacency matrix A represents the connection relations between the nodes of the network: if v_i and v_j are connected by an edge, the element of A at the corresponding position is 1, and all other elements are 0.
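The adjacency-matrix definition above can be sketched in a few lines; the node count and edge list below are illustrative, not from the patent.

```python
# Build the adjacency matrix A of a simple, unweighted, undirected network
# G = (V, E): A[i][j] = 1 iff nodes i and j share an edge, and A is symmetric.
def adjacency_matrix(num_nodes, edges):
    A = [[0] * num_nodes for _ in range(num_nodes)]
    for i, j in edges:
        if i == j:          # simple network: no self-loops
            continue
        A[i][j] = 1         # undirected: (i, j) and (j, i) are the same link
        A[j][i] = 1
    return A

A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3), (0, 1)])  # duplicate edge stored once
```

Because at most one edge may exist between two nodes, repeating an edge in the input leaves the matrix unchanged.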
The steps of learning the structural feature vector of the node are shown as follows, and the algorithm mainly comprises three parts:
First, node sequences are sampled on the network using a truncated random walk approach.
The random walk generator samples node sequences in the network. A node v_i in the network G = (V, E) is selected as the root node; starting from the root node, the random walk repeatedly selects a neighbor node at random from the neighbors of the last visited node as the next visited node, until the maximum path length t is reached. The path traversed by the random walk is taken as the sampling sequence W_{v_i}, with |W_{v_i}| = t. The sampling is repeated λ times at each node v_i.
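The truncated random walk described above can be sketched as follows; the toy graph, seed, and walk parameters are hypothetical.

```python
import random

# Truncated random walk of maximum length t from a root node, repeated
# lambda_ times per node (names follow the description above).
def random_walk(neighbors, root, t, rng):
    walk = [root]
    while len(walk) < t:
        nbrs = neighbors[walk[-1]]
        if not nbrs:                      # dead end: truncate early
            break
        walk.append(rng.choice(nbrs))     # uniform choice among neighbors
    return walk

neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
rng = random.Random(42)
walks = [random_walk(neighbors, v, t=5, rng=rng)
         for v in neighbors for _ in range(2)]   # lambda_ = 2 walks per node
```

Each walk is a node sequence of length at most t, later treated as a "sentence" for SkipGram training.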
Second, the nodes are mapped into vector representations.
The sampled node sequences are treated as sentences and input into a SkipGram model for training, with the window size set to w. In the network representation learning task, a window of size w is set over each collected node sequence W_{v_i}. Each node v_j is mapped into the current vector space as Φ(v_j) ∈ R^d. For each given node v_j, the goal is to maximize the probability of its neighbor nodes within the window of its node sequence W_{v_j}:

Pr({v_{j−w}, …, v_{j+w}} \ v_j | Φ(v_j))
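Before SkipGram training, each sampled sequence yields (center, context) training pairs within the window; a minimal sketch, with an illustrative walk and window size:

```python
# Extract (center, context) pairs within a window of size w from a sampled
# node sequence -- these are the pairs whose probability SkipGram maximizes.
def window_pairs(walk, w):
    pairs = []
    for j, center in enumerate(walk):
        for k in range(max(0, j - w), min(len(walk), j + w + 1)):
            if k != j:
                pairs.append((center, walk[k]))
    return pairs

pairs = window_pairs([0, 1, 2, 3], w=1)   # e.g. node 1 has context {0, 2}
```

A library such as gensim's Word2Vec with `sg=1` performs this pair extraction internally when given the walks as sentences.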
Third, hierarchical Softmax is used to reduce the amount of computation.
Hierarchical Softmax is used to reduce computational complexity. For a given node v_k ∈ V, directly calculating the conditional probability Pr(v_k | Φ(v_j)) requires a large amount of computation, so this conditional probability is calculated with hierarchical softmax instead. The nodes are placed as leaf nodes in a binary tree; for each leaf node there is always a unique path from the root to that node, and the probability of the leaf node appearing is estimated along this unique path.
The path to node v_k is recorded as (b_0, b_1, …, b_{⌈log|V|⌉}), where b_0 is the root and b_{⌈log|V|⌉} = v_k. Then:

Pr(v_k | Φ(v_j)) = ∏_{l=1}^{⌈log|V|⌉} Pr(b_l | Φ(v_j))
A binary classifier can be built at the parent node of each tree node b_l to estimate Pr(b_l | Φ(v_j)):

Pr(b_l | Φ(v_j)) = 1 / (1 + exp(−Φ(v_j) · Ψ(b_l)))

where Ψ(b_l) denotes the representation assigned to the parent node of b_l. This reduces the time complexity of computing Pr(v_k | Φ(v_j)) from O(|V|) to O(log|V|). After iterative training, the weight matrix of the hidden layer of the SkipGram model is used as the latent vector representation of the nodes.
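A minimal sketch of the hierarchical-softmax idea — the leaf probability as a product of per-node binary decisions — assuming hypothetical path vectors and left/right codes in place of Ψ(b_l) and the tree structure:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# The probability of reaching leaf v_k is a product of binary decisions along
# its root-to-leaf path: O(log|V|) factors instead of a |V|-term softmax sum.
def leaf_probability(phi_vj, path_vectors, path_codes):
    p = 1.0
    for psi, go_right in zip(path_vectors, path_codes):
        score = sum(a * b for a, b in zip(phi_vj, psi))  # Phi(v_j) . Psi(b_l)
        s = sigmoid(score)
        p *= s if go_right else (1.0 - s)
    return p

p = leaf_probability([0.3, -0.1], [[0.5, 0.2], [-0.4, 0.1]], [1, 0])
```

At each internal node the two branch probabilities sum to 1, so the leaf probabilities over the whole tree also sum to 1.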
The node attributes comprise discrete attributes and continuous attributes. A general attribute coding scheme is provided for these different attribute types: the discrete and continuous node attributes in the network are encoded into a uniform real-valued attribute feature vector form, as shown in FIG. 1.
So-called discrete attributes are category-type attributes, such as the gender and age of a user in a social network. Discrete attributes can be one-hot encoded into a vector representation. For example, the gender attribute has two values {male, female}; a two-element vector v can describe the gender of a user, where a value of 1 in the first element represents male and a value of 1 in the second element represents female. The method one-hot encodes all discrete node attributes in the network. Finally, the coding sequences of each attribute type are spliced to obtain the final discrete attribute feature vector.
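The one-hot encoding and splicing of discrete attributes can be sketched as follows; the attribute schema and user record are illustrative:

```python
# One-hot encode each discrete attribute, then splice the codes head to tail.
def one_hot(value, categories):
    return [1 if value == c else 0 for c in categories]

def encode_discrete(user, schema):
    vec = []
    for attr, categories in schema:        # fixed order keeps dimensions aligned
        vec += one_hot(user.get(attr), categories)
    return vec

schema = [("gender", ["male", "female"]), ("age_group", ["<18", "18-35", ">35"])]
vec = encode_discrete({"gender": "female", "age_group": "18-35"}, schema)
```

Keeping the schema order fixed guarantees every node's discrete vector has the same layout.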
Continuous attributes are common in social networks, for example blog posts published by a user in a microblog network, or picture data shared by a user on a picture-sharing website. Such attributes cannot be compared directly, but text-type attributes can be processed with TF-IDF. The specific steps are as follows: (1) calculate the word frequency, normalized so that texts of different lengths can be compared; (2) calculate the inverse document frequency; (3) calculate the TF-IDF value as the product of the word frequency and the inverse document frequency; (4) splice the TF-IDF values of each word in the document to form the feature vector of the text attribute.
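The four TF-IDF steps can be sketched as below; this uses the plain log(N/df) inverse document frequency, without the smoothing some implementations add, and the toy documents are illustrative:

```python
import math
from collections import Counter

# TF-IDF following the steps above: length-normalized term frequency times
# inverse document frequency, one component per vocabulary word.
def tfidf_vectors(docs):
    vocab = sorted({w for d in docs for w in d})
    n = len(docs)
    df = {w: sum(1 for d in docs if w in d) for w in vocab}
    idf = {w: math.log(n / df[w]) for w in vocab}
    vecs = []
    for d in docs:
        tf = Counter(d)
        vecs.append([tf[w] / len(d) * idf[w] for w in vocab])
    return vocab, vecs

docs = [["graph", "node", "node"], ["graph", "edge"]]
vocab, vecs = tfidf_vectors(docs)
```

A word appearing in every document gets idf = 0 and contributes nothing, which is why distinctive words dominate the text vector.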
After the continuous attributes are expressed as real-valued vectors, the final attribute feature vector of each node in the network is obtained by splicing the node's attribute coding feature vectors head to tail. To facilitate inputting the attribute feature vectors into a machine learning algorithm or neural network, the attribute information of each node is processed into feature vectors of equal dimension: for a node lacking some kind of attribute, the corresponding attribute feature positions are filled with 0, so that all nodes obtain attribute features of the same dimension.
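The head-to-tail splicing with zero-filling for missing attributes can be sketched as follows; the dimension schema and example nodes are hypothetical:

```python
# Splice per-attribute feature vectors head to tail; pad a missing attribute
# with zeros so every node gets an equal-dimension feature vector.
def node_feature(attr_vectors, schema_dims):
    vec = []
    for name, dim in schema_dims:
        part = attr_vectors.get(name)
        if part is None:
            part = [0.0] * dim       # attribute missing for this node
        assert len(part) == dim
        vec += part
    return vec

dims = [("discrete", 5), ("text", 3)]
v1 = node_feature({"discrete": [0, 1, 0, 1, 0], "text": [0.1, 0.0, 0.2]}, dims)
v2 = node_feature({"discrete": [1, 0, 0, 0, 1]}, dims)   # no text attribute
```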
A deep learning model is established to perform nonlinear fusion between the features: with the link prediction task as the target guide, a neural network is built to fuse the network structure features and the node attribute features nonlinearly. The structure of the depth feature fusion network is shown in FIG. 2.
Specifically, f is defined as the similarity function of two nodes v_i, v_j, and the Softmax function is used to define the conditional probability of node v_j given v_i:

p(v_j | v_i) = exp(f(v_i, v_j)) / Σ_{k=1}^{|V|} exp(f(v_i, v_k))
where p(v_j | v_i) measures the likelihood of a connection between v_j and v_i. The structural similarity between nodes needs to be extracted from their neighbor nodes; based on the conditional-independence assumption, the conditional probability between node v_i and its adjacent node set N_i is defined as:

p(N_i | v_i) = ∏_{v_j ∈ N_i} p(v_j | v_i)
where N_i denotes the set of neighbor nodes of v_i. The likelihood function L of the entire model can be defined as:

L = ∏_{v_i ∈ V} p(N_i | v_i)
taking the likelihood function as a learning target of the model, the structure of the feature depth fusion model is as follows:
① Input layer: the node structure vector representation, denoted u, contains the node structure information; the node attribute code vector representation, denoted u′, contains the node attribute information.
② Hidden layers: the structural feature vector u and the attribute feature vector u′ are weighted and input into a multi-layer perceptron, the hidden representation of each layer being denoted h_0, h_1, …, h_n. The hidden-layer representations are defined as follows:

h_0 = u ⊕ γu′
h_k = σ_k(W_k h_{k−1} + b_k),  k = 1, 2, …, n

where γ represents the weight of the attribute features, σ_k represents the activation function of layer k, and n is the number of hidden layers.
The hidden layers adopt the tower structure shown in FIG. 2: the number of neurons in each layer is half that of the previous layer. Such a tower structure can learn more abstract features from the data. u and u′ are adjusted by the weight coefficient γ when input to the hidden layer.
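A forward-pass sketch of the tower structure, assuming the concatenation h_0 = u ⊕ γu′ and ReLU activations with randomly initialized weights (both assumptions for illustration; the patent does not fix the activation or initialization):

```python
import random

def relu(x):
    return [max(0.0, v) for v in x]

def layer(x, W, b):
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(W, b)]

# Tower-structured MLP: h_0 concatenates u with gamma-scaled u', and each
# hidden layer halves the neuron count of the previous one.
def tower_forward(u, u_attr, gamma, rng, n_layers=2):
    h = u + [gamma * a for a in u_attr]      # h_0 = u concat gamma * u'
    sizes = [len(h)]
    for _ in range(n_layers):
        sizes.append(sizes[-1] // 2)         # halve each layer
    for k in range(n_layers):
        W = [[rng.uniform(-0.1, 0.1) for _ in range(sizes[k])]
             for _ in range(sizes[k + 1])]
        b = [0.0] * sizes[k + 1]
        h = relu(layer(h, W, b))
    return h

h = tower_forward([0.2] * 8, [0.5] * 8, gamma=0.7, rng=random.Random(0))
```

With 8-dimensional u and u′, the layer sizes are 16 → 8 → 4.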
③ Output layer: the output h of the last hidden layer is converted into a probability vector o, which contains the link predictions of the input node v_i for all other nodes:

o = [p(v_1 | v_i), p(v_2 | v_i), …, p(v_{|V|} | v_i)]
Taking the corresponding row of the weight matrix between the last hidden layer and the output layer as an abstract representation of node v_j, denoted u_j, the similarity function of nodes v_i, v_j is defined as:

f(v_i, v_j) = u_j^T h
further, a formula for calculating the connection probability in the vector o is obtained:
p(v_j | v_i) = exp(u_j^T h) / Σ_{k=1}^{|V|} exp(u_k^T h)
The optimization objective of the entire model, denoted Θ, is then:

Θ = arg max_Θ ∏_{v_i ∈ V} p(N_i | v_i) = arg max_Θ Σ_{v_i ∈ V} Σ_{v_j ∈ N_i} log p(v_j | v_i)
From the formula for calculating the connection probability, one can obtain:

Θ = arg max_Θ Σ_{v_i ∈ V} Σ_{v_j ∈ N_i} ( u_j^T h − log Σ_{k=1}^{|V|} exp(u_k^T h) )
the optimization objective function is to make the similarity of nodes in the neighbor nodes larger and the similarity of nodes not in the neighbor nodes smaller.
After the model is obtained (the overall flow is shown in FIG. 3), the constructed depth model is trained. During network training, the Adam optimization framework is preferably used. The Adam optimizer designs an independent adaptive learning rate for each parameter by computing first- and second-order moment estimates of the gradient: a higher learning rate is used for infrequently updated parameters and a lower learning rate for frequently updated ones. Finally, the output layer of the neural network outputs the probability value of a connection between the node and every other node in the network. The invention therefore embeds the network topology information into a low-dimensional, dense vector space, uniformly encodes the node attributes in the network to obtain node attribute feature vectors, and nonlinearly fuses the structural and attribute features with a tower-structured deep neural network. With the rapid development of Internet technology, interaction between individuals in social networks has become more convenient and frequent; better study of the interaction relationships between individuals in the network greatly helps in understanding social-network evolution mechanisms and improving the communication experience between individuals. The link prediction provided by the invention can help users obtain more accurate recommendations or rediscover lost friends. In addition, the method can be widely applied in fields such as biology, electric power, and communication, with great economic value and good social benefit.
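A single Adam update as described — bias-corrected first- and second-moment estimates of the gradient giving per-parameter adaptive steps — sketched with the optimizer's commonly used default hyperparameters:

```python
import math

# One Adam parameter update: exponential moving averages of the gradient (m)
# and squared gradient (v) are bias-corrected, then each parameter takes a
# step scaled by its own estimated gradient magnitude.
def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m = [b1 * mi + (1 - b1) * g for mi, g in zip(m, grad)]
    v = [b2 * vi + (1 - b2) * g * g for vi, g in zip(v, grad)]
    m_hat = [mi / (1 - b1 ** t) for mi in m]   # bias-corrected first moment
    v_hat = [vi / (1 - b2 ** t) for vi in v]   # bias-corrected second moment
    theta = [p - lr * mh / (math.sqrt(vh) + eps)
             for p, mh, vh in zip(theta, m_hat, v_hat)]
    return theta, m, v

theta, m, v = adam_step([1.0, -1.0], [0.5, -0.5], [0.0, 0.0], [0.0, 0.0], t=1)
```

On the first step the bias correction makes the update roughly lr times the sign of the gradient, regardless of the gradient's scale, which is what gives rarely updated parameters effectively larger steps over time.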
Finally, the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, and all such modifications should be covered by the claims of the present invention.

Claims (10)

1. A graph embedding link prediction method fusing a topological structure and node attributes is characterized by comprising the following steps: s1, learning the topological structure information of the network through a graph embedding algorithm, and embedding the topological structure information of the network into a vector space to obtain an embedded structure feature vector based on the node; s2, uniformly coding the node attributes in the network to obtain node attribute feature vectors; and S3, carrying out nonlinear fusion on the structural features and the attribute features by using a deep neural network and outputting a link prediction score.
2. The graph-embedded link prediction method fusing topology and node attributes according to claim 1, wherein the network is a simple unweighted undirected network G = (V, E), where V represents the set of nodes in the network and E represents the set of edges; there are no self-loops between nodes, at most one edge connects any two nodes, edges carry no weight, and edges between nodes do not consider the connection direction.
3. The method for predicting graph-embedded links fusing topology and node attributes according to claim 2, wherein the steps of obtaining the node-based embedded structure feature vectors in step S1 are as follows: S11, sampling node sequences on the network using truncated random walks; S12, mapping the nodes into vector representations; and S13, using hierarchical Softmax in the optimization calculation of the conditional probability given a node.
4. The graph-embedded link prediction method fusing topology and node attributes according to claim 3, wherein in step S11 the random walk generator samples node sequences in the network: a node v_i in the network G = (V, E) is selected as the root node; starting from the root node, a neighbor node of the last visited node is randomly selected as the next visited node until the maximum path length t is reached; the path traversed by the random walk is taken as the sampling sequence W_{v_i}, with |W_{v_i}| = t; and the sampling is repeated λ times at node v_i.
5. The graph-embedded link prediction method fusing topology and node attributes according to claim 4, wherein in step S12 the sampled node sequences are treated as sentences and input into a SkipGram model for training, with the window size set to w; in the network representation learning task, a window of size w is set over each collected node sequence W_{v_i}; each node v_j is mapped into the current vector space as Φ(v_j) ∈ R^d; and for each given node v_j, the probability of its neighbor nodes within the window of its node sequence W_{v_j} is maximized:

Pr({v_{j−w}, …, v_{j+w}} \ v_j | Φ(v_j))
6. The graph-embedded link prediction method fusing topology and node attributes according to claim 5, wherein in step S13 the nodes are placed as leaf nodes in a binary tree and the probability of each leaf node appearing is estimated along its unique path from the root; the path to node v_k is recorded as (b_0, b_1, …, b_{⌈log|V|⌉}), where b_0 is the root and b_{⌈log|V|⌉} = v_k; then:

Pr(v_k | Φ(v_j)) = ∏_{l=1}^{⌈log|V|⌉} Pr(b_l | Φ(v_j))

a binary classifier is then built at the parent node of each tree node b_l to estimate Pr(b_l | Φ(v_j)):

Pr(b_l | Φ(v_j)) = 1 / (1 + exp(−Φ(v_j) · Ψ(b_l)))

where Ψ(b_l) denotes the representation assigned to the parent node of b_l; and after the final iterative training, the weight matrix of the hidden layer of the SkipGram model is used as the latent vector representation of the nodes.
7. The method of claim 6, wherein the node attributes in step S2 comprise a discrete attribute feature vector and a continuous attribute feature vector, obtained as follows:
Obtaining the discrete attribute feature vector: performing one-hot coding on all discrete node attributes in a network, and splicing the coding sequence of each type of attribute to obtain a final discrete attribute feature vector;
obtaining the continuity attribute feature vector: (1) calculating the word frequency and normalizing it; (2) calculating the inverse document frequency; (3) calculating the TF-IDF value as the product of the word frequency and the inverse document frequency; (4) splicing the TF-IDF values of each word in the document to form the feature vector of the text attribute;
and splicing the attribute coding feature vectors of the nodes end to obtain the node attribute feature vector in the network.
8. The graph embedding link prediction method fusing the topology and the node attributes according to claim 7, wherein the attribute information of each node is processed into equal-dimension feature vectors, and the nodes missing some kind of attributes are filled with 0 at the corresponding attribute feature positions, thereby obtaining the attribute features of the nodes with the same dimension.
9. The graph embedding link prediction method fusing topology and node attributes according to any one of claims 2 to 8, wherein a link prediction task is used as the target guide and a neural network is established to deeply fuse the network structure features and node attribute features; f is defined as the similarity function of two nodes v_i, v_j, and the Softmax function is used to define the conditional probability of node v_j given v_i:

p(v_j | v_i) = exp(f(v_i, v_j)) / Σ_{k=1}^{|V|} exp(f(v_i, v_k))

where p(v_j | v_i) measures the likelihood of a connection between v_j and v_i;

the structural similarity between nodes needs to be extracted from their neighbor nodes; based on the conditional-independence assumption, the conditional probability between node v_i and its adjacent node set N_i is defined as:

p(N_i | v_i) = ∏_{v_j ∈ N_i} p(v_j | v_i)

where N_i denotes the set of neighbor nodes of v_i; the likelihood function L of the whole model is:

L = ∏_{v_i ∈ V} p(N_i | v_i)
the likelihood function L is taken as a learning target of the model, and the structure of the characteristic depth fusion model comprises an input layer, a hidden layer and an output layer, specifically,
the input layer: the node structure vector representation is marked as u and contains the node structure information, and the node attribute coding vector representation is marked as u' and contains the node attribute information;
the hidden layer: the structure feature vector u and the attribute feature vector u′ are weighted and input into a multi-layer perceptron, the hidden representation of each layer being denoted h_0, h_1, …, h_n; the hidden layers are defined as follows:

h_0 = u ⊕ γu′

h_k = σ_k(W_k h_{k−1} + b_k),  k = 1, 2, …, n
where γ represents the weight of the attribute features, σ_k represents the activation function of layer k, and n represents the number of hidden layers; the hidden layers adopt a tower structure in which the number of neurons in each layer is half that of the previous layer;
the output layer: the output h of the last hidden layer is converted into a probability vector o comprising the link predictions of the input node v_i for the other nodes:

o = [p(v_1 | v_i), p(v_2 | v_i), …, p(v_{|V|} | v_i)]
taking the corresponding row of the weight matrix between the hidden layer and the output layer as an abstract representation of node v_j, denoted u_j, the similarity function of nodes v_i, v_j is defined as:

f(v_i, v_j) = u_j^T h
further, a formula for calculating the connection probability in the vector o is obtained:
p(v_j | v_i) = exp(u_j^T h) / Σ_{k=1}^{|V|} exp(u_k^T h)
the optimization objective function of the entire model, denoted Θ, is then:

Θ = arg max_Θ ∏_{v_i ∈ V} p(N_i | v_i) = arg max_Θ Σ_{v_i ∈ V} Σ_{v_j ∈ N_i} log p(v_j | v_i)
from the formula for calculating the connection probability, one can obtain:

Θ = arg max_Θ Σ_{v_i ∈ V} Σ_{v_j ∈ N_i} ( u_j^T h − log Σ_{k=1}^{|V|} exp(u_k^T h) )
10. The graph embedding link prediction method fusing topological structure and node attributes according to claim 9, wherein in the network training process of the constructed depth model, the Adam optimization framework is used. The Adam optimizer designs an independent adaptive learning rate for each parameter by computing first-moment and second-moment estimates of the gradient, applying a higher learning rate to parameters that are updated infrequently and a lower learning rate to parameters that are updated frequently. Finally, the probability values of connection between the input node and the other nodes in the network are produced at the output layer of the neural network.
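The Adam update referred to in the claim can be sketched as below, using the commonly published default hyperparameters (β1 = 0.9, β2 = 0.999, ε = 1e-8). This is a generic illustration of adaptive moment estimation, not the patent's training code.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: per-parameter step sizes from gradient moment estimates."""
    m = b1 * m + (1 - b1) * grad           # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second-moment (uncentred variance)
    m_hat = m / (1 - b1 ** t)              # bias correction for zero init
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy run: minimise f(theta) = theta^2 / 2, whose gradient is simply theta
theta = np.array([1.0, -1.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 101):
    theta, m, v = adam_step(theta, theta, m, v, t, lr=0.1)
print(np.abs(theta).max() < 0.5)  # True -- parameters have moved toward 0
```

Rarely-updated parameters accumulate small second-moment estimates v, so the division by sqrt(v_hat) gives them effectively larger steps, matching the behaviour the claim describes.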
CN202010547872.5A 2020-06-16 2020-06-16 Graph embedding link prediction method fusing topological structure and node attributes Pending CN111709474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010547872.5A CN111709474A (en) 2020-06-16 2020-06-16 Graph embedding link prediction method fusing topological structure and node attributes

Publications (1)

Publication Number Publication Date
CN111709474A true CN111709474A (en) 2020-09-25

Family

ID=72540797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010547872.5A Pending CN111709474A (en) 2020-06-16 2020-06-16 Graph embedding link prediction method fusing topological structure and node attributes

Country Status (1)

Country Link
CN (1) CN111709474A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104376382A (en) * 2014-11-18 2015-02-25 重庆大学 Asymmetric distributed type constraint optimization algorithm and system for large multi-Agent system
CN108923983A (en) * 2018-07-13 2018-11-30 南昌航空大学 Prediction technique, device and the readable storage medium storing program for executing of opportunistic network link

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MINGQIANG ZHOU ET AL.: "The Deep Fusion of Topological Structure and Attribute Information for Link Prediction", 《IEEE ACCESS》 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112395512A (en) * 2020-11-06 2021-02-23 中山大学 Method for constructing complex attribute network representation model based on path aggregation
CN112288495A (en) * 2020-11-09 2021-01-29 北京理工大学 ICT supply chain key node identification method combining network topology and business attributes
CN112508085A (en) * 2020-12-05 2021-03-16 西安电子科技大学 Social network link prediction method based on perceptual neural network
CN112508085B (en) * 2020-12-05 2023-04-07 西安电子科技大学 Social network link prediction method based on perceptual neural network
WO2022160431A1 (en) * 2021-01-26 2022-08-04 中山大学 Attribute heterogeneous network embedding method, apparatus, and device, and medium
CN113111224A (en) * 2021-03-17 2021-07-13 中山大学 Network embedding learning method based on topology perception text representation
CN113111224B (en) * 2021-03-17 2023-08-18 中山大学 Network embedded learning method based on topology perception text characterization
CN112989199A (en) * 2021-03-30 2021-06-18 武汉大学 Cooperative network link prediction method based on multidimensional adjacent attribute network
CN113362071A (en) * 2021-06-21 2021-09-07 浙江工业大学 Pompe fraudster identification method and system for Ether house platform
CN113988464A (en) * 2021-11-17 2022-01-28 国家电网有限公司客户服务中心 Network link attribute relation prediction method and equipment based on graph neural network
CN115225509A (en) * 2022-07-07 2022-10-21 天津大学 Internet of things topological structure generation method based on neural evolution
CN115225509B (en) * 2022-07-07 2023-09-22 天津大学 Internet of things topological structure generation method based on neural evolution
CN115529290A (en) * 2022-08-30 2022-12-27 中国人民解放军战略支援部队信息工程大学 IP street level positioning method and device based on graph neural network
CN115712755A (en) * 2022-09-30 2023-02-24 中国人民解放军战略支援部队信息工程大学 Big data access control entity relation prediction method based on GNN double-source learning
CN115883401A (en) * 2022-11-16 2023-03-31 华南师范大学 End-to-end network performance prediction method, system and platform based on flow interaction graph
CN117151279A (en) * 2023-08-15 2023-12-01 哈尔滨工业大学 Isomorphic network link prediction method and system based on line graph neural network

Similar Documents

Publication Publication Date Title
CN111709474A (en) Graph embedding link prediction method fusing topological structure and node attributes
CN112214685B (en) Knowledge graph-based personalized recommendation method
CN112150210B (en) Improved neural network recommendation method and system based on GGNN (global warming network)
CN109543180B (en) Text emotion analysis method based on attention mechanism
CN108073677B (en) Multi-level text multi-label classification method and system based on artificial intelligence
CN111368074B (en) Link prediction method based on network structure and text information
CN112380435B (en) Document recommendation method and system based on heterogeneous graph neural network
CN112487143A (en) Public opinion big data analysis-based multi-label text classification method
CN110413844A (en) Dynamic link prediction technique based on space-time attention depth model
CN111709518A (en) Method for enhancing network representation learning based on community perception and relationship attention
CN106897254B (en) Network representation learning method
CN113628059B (en) Associated user identification method and device based on multi-layer diagram attention network
CN112559764A (en) Content recommendation method based on domain knowledge graph
CN112507245B (en) Social network friend recommendation method based on graph neural network
CN110659411A (en) Personalized recommendation method based on neural attention self-encoder
CN112988917A (en) Entity alignment method based on multiple entity contexts
CN112100486B (en) Deep learning recommendation system and method based on graph model
CN114971784B (en) Session recommendation method and system based on graph neural network by fusing self-attention mechanism
CN114942998B (en) Knowledge graph neighborhood structure sparse entity alignment method integrating multi-source data
CN113591478A (en) Remote supervision text entity relation extraction method based on deep reinforcement learning
CN115391563A (en) Knowledge graph link prediction method based on multi-source heterogeneous data fusion
CN115496072A (en) Relation extraction method based on comparison learning
CN114356990A (en) Base named entity recognition system and method based on transfer learning
CN114840777B (en) Multi-dimensional endowment service recommendation method and device and electronic equipment
CN116401353A (en) Safe multi-hop question-answering method and system combining internal knowledge patterns and external knowledge patterns

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200925