CN110688537A - Method for computing low-dimensional representations of graph nodes and related applications - Google Patents

Method for computing low-dimensional representations of graph nodes and related applications

Info

Publication number: CN110688537A
Application number: CN201910911795.4A
Authority: CN (China)
Prior art keywords: node, information, vector, nodes, attention
Prior art date: 2019-09-25
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 李金龙 (Li Jinlong), 吴钰泽 (Wu Yuze)
Current and original assignee: University of Science and Technology of China (USTC) (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Filing date: 2019-09-25
Publication date: 2020-01-14
Application filed by University of Science and Technology of China (USTC)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/901 - Indexing; Data structures therefor; Storage structures
    • G06F16/9024 - Graphs; Linked lists
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Software Systems (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a method for computing low-dimensional representations of graph nodes and related applications, comprising the following steps: computing an importance parameter for each node from the raw graph-structure data, so that the whole graph structure is represented by a set of node sequences; computing a pointwise mutual information matrix of the graph structure from the node sequences, and performing a preliminary encoding with an autoencoder to obtain, for each node, a vector representation containing topological structure information; introducing an attention mechanism over the node sequences and computing, for each node, a vector representation containing attention information and node attribute information; and fusing the vector representation containing topological structure information with the vector representation containing attention information and node attribute information to obtain, for each node, a final vector representation retaining the node's own information. The method preserves the information of each node, and thus the information of the whole graph structure, to the greatest extent, and supports data mining and analysis tasks in practical applications represented by graph structures.

Description

Method for computing low-dimensional representations of graph nodes and related applications
Technical Field
The invention relates to the technical fields of computing, artificial intelligence, and graph representation learning, and in particular to a method for computing low-dimensional representations of graph nodes and related applications.
Background
Relationships in many disciplines and fields can naturally be represented as graphs. For example, citation relationships among academic papers form a paper citation graph, and friend relationships on an online social platform form a social relationship graph. Graphs derived from real-world data are very large in scale, so learning low-dimensional representations of graphs is of great importance for better solving practical problems based on graph structures.
For related applications of graph structures, some solutions already exist; for example, CN2018100543114 proposes a graph node classification method based on a recurrent neural network. However, most existing methods assume that nodes are homogeneous, ignore the differences among nodes, and treat all nodes in the graph as equivalent individuals. As a result, the final node representation retains only the topological structure information of a node, namely its edge information and the information of its neighbor nodes, and cannot adequately retain the node's own labels, attributes, and other information, which in turn degrades performance on real data.
Disclosure of Invention
The invention aims to provide a method for computing low-dimensional representations of graph nodes and related applications, which retains the information of each node, and thus the information of the whole graph structure, to the greatest extent, and supports data mining and analysis tasks in practical applications represented by graph structures (such as node classification and link prediction).
The object of the invention is achieved by the following technical solution:
a low-dimensional representation and related application method of a computational graph node comprises the following steps:
calculating the importance parameters of each node according to the original data of the graph structure, so that the whole graph structure is represented by a series of node sequences to obtain a node sequence set;
calculating a point mutual information matrix of the graph structure by using the obtained node sequence set, and performing primary coding by using a self-coder to obtain vector representation containing topological structure information of each node;
according to the obtained node sequence set, introducing an attention mechanism, and calculating vector representations of each node, wherein the vector representations comprise attention information and node attribute information;
fusing the vector representation of each node containing topological structure information and the vector representation of each node containing attention information and node attribute information to obtain the final vector representation of each node retaining the information of the node;
and classifying the nodes and predicting the edges based on the final vector representation of each node for retaining the self information.
It can be seen from the technical solution provided by the invention that, by performing a preliminary encoding with an autoencoder and introducing an attention mechanism, high-dimensional sparse network nodes are represented as low-dimensional dense vectors, so that the nodes of the graph structure retain more information while being represented as low-dimensional vectors, yielding better results in applications on real data.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are evidently only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for computing low-dimensional representations of graph nodes and related applications according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an autoencoder according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The attention mechanism is a model recently proposed for natural language processing. It simulates the way humans process visual information: more weight is placed on the important parts of the information being processed, and the weight of unimportant parts is reduced. Based on this, an embodiment of the present invention provides a method for computing low-dimensional representations of graph nodes; that is, a computer is used to represent the nodes of a graph structure as low-dimensional vectors. Each node in the graph structure represents an entity in an actual relationship network, and an edge represents a certain relationship between two entities; for example, each node in a social relationship graph represents a user, and an edge between two nodes indicates that a relationship exists between the corresponding users. A low-dimensional representation of the nodes of a graph not only requires that high-dimensional sparse network nodes be represented as low-dimensional dense vectors, but also that the information of each node, and thus of the whole graph, be retained to the greatest extent.
As shown in FIG. 1, the method for computing low-dimensional representations of graph nodes and related applications comprises the following steps:
step 1, calculating importance parameters of each node according to graph structure original data, and accordingly representing the whole graph structure by a series of node sequences to obtain a node sequence set.
The preferred embodiment of this step is as follows:
Calculate the PageRank value of each node from the adjacency matrix of the graph structure. PageRank is a value measuring the importance of a graph node; the PageRank value of node $w_i$ is computed as:

$$PR(w_i) = \frac{1-d}{N} + d \sum_{w_j \in N(w_i)} \frac{PR(w_j)}{L(w_j)}$$

where $d$ is a manually set parameter, $N$ is the total number of nodes in the graph structure, $N(w_i)$ is the set of neighbors of node $w_i$ in the graph, and $L(w_j)$ is the out-degree of node $w_j$; if the graph structure is an undirected graph, the degree of the node is used instead.
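As an illustration of this stage, a minimal power-iteration sketch of the PageRank computation follows; the damping value, function names, and array layout are illustrative assumptions rather than details fixed by the patent.

```python
import numpy as np

def pagerank(adj, d=0.85, iters=100, tol=1e-8):
    """Power iteration for PageRank on an adjacency matrix.

    adj[i, j] = 1 means an edge from node i to node j.
    For an undirected graph pass a symmetric matrix, so the
    out-degree coincides with the degree, as the patent notes.
    """
    n = adj.shape[0]
    out_deg = adj.sum(axis=1).astype(float)
    out_deg[out_deg == 0] = 1.0        # avoid division by zero for sink nodes
    pr = np.full(n, 1.0 / n)
    for _ in range(iters):
        # PR(w_i) = (1-d)/N + d * sum_j adj[j, i] * PR(w_j) / L(w_j)
        new_pr = (1 - d) / n + d * (adj.T @ (pr / out_deg))
        if np.abs(new_pr - pr).sum() < tol:
            return new_pr
        pr = new_pr
    return pr
```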
In the embodiment of the present invention, a walking method based on node importance is used: a node of the graph structure is selected at random to start the walk, and the next node of the walk is chosen according to:

$$p(w_j \mid w_i) \propto PR(w_j), \quad w_j \in N(w_i)$$

where the symbol $\propto$ denotes proportionality; that is, the walk at node $w_i$ moves to a neighbor $w_j$ with probability proportional to the neighbor's PageRank value.
Each walk produces one node sequence. The walking step is repeated a specified number of times (the exact number can be set according to the actual situation), so that the whole graph structure is represented by a set of node sequences; the sequences obtained by walking finally form the node sequence set.
In actual operation, a parameter specifying the length of a node sequence can also be set; once the walking step has been repeated the required number of times and the sequence lengths meet the set condition, walking stops, and the final node sequence set is obtained.
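A sketch of the importance-based walk described above is given below. Since the transition formula survives only as an image placeholder, this assumes the reading given above: the next node is drawn from the current node's neighbors with probability proportional to their PageRank values. All names and the walk counts are illustrative.

```python
import numpy as np

def importance_walks(adj, pr, num_walks=10, walk_len=40, rng=None):
    """Generate node sequences by PageRank-biased random walks."""
    rng = np.random.default_rng() if rng is None else rng
    n = adj.shape[0]
    walks = []
    for _ in range(num_walks):
        for start in range(n):
            walk = [start]
            for _ in range(walk_len - 1):
                nbrs = np.flatnonzero(adj[walk[-1]])
                if nbrs.size == 0:
                    break
                # move to a neighbor with probability proportional to PageRank
                probs = pr[nbrs] / pr[nbrs].sum()
                walk.append(int(rng.choice(nbrs, p=probs)))
            walks.append(walk)
    return walks
```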
Those skilled in the art will understand that a graph structure can be formed by taking users as nodes, or by taking text information or other types of entities as nodes; the present invention does not limit the specific form of the graph structure.
Step 2: compute the pointwise mutual information matrix of the graph structure from the obtained node sequence set, and perform a preliminary encoding with an autoencoder to obtain, for each node, a vector representation containing topological structure information.
This step comprises two stages: computing the pointwise mutual information matrix, and encoding with the autoencoder. The preferred implementation of each stage is as follows:
1) Computing the pointwise mutual information matrix.
Step 1 yields a node sequence set containing the graph structure information, from which the pointwise mutual information (PMI) of the graph can be computed. Pointwise mutual information is a probability-based measure of the correlation between two variables, computed as:

$$PMI(w_i, w_j) = \log \frac{p(w_i, w_j)}{p(w_i)\, p(w_j)}$$

where the probability $p(w_i, w_j)$ is the probability that nodes $w_i$ and $w_j$ occur as adjacent nodes anywhere in the node sequence set, and $p(w_i)$ and $p(w_j)$ are the probabilities that nodes $w_i$ and $w_j$, respectively, appear in a node sequence, i.e. the ratio of the number of sequences containing the node to the total number of sequences.
The pointwise mutual information between any two nodes can be computed by this formula, yielding the pointwise mutual information matrix $PMI_G \in R^{N \times N}$ of all nodes in the graph structure, where $N$ is the total number of nodes. Each entry $PMI_{ij} = PMI(i, j)$ of the matrix relates nodes $w_i$ and $w_j$, and the matrix serves as a preliminary representation of the graph structure information.
2) Encoding with the autoencoder.
An autoencoder with a K-layer encoder and a K-layer decoder is constructed; FIG. 2 shows, as an example, an autoencoder with a six-layer encoder and a six-layer decoder.
Each row of the pointwise mutual information matrix, plus manually added Gaussian noise, is fed into the autoencoder as the original vector; the Gaussian noise is added to improve the robustness of the representation vectors. Every layer of the autoencoder is a fully connected layer, and the loss function minimizes the discrepancy between the encoded-then-decoded vector and the original vector. The vector produced by the last encoder layer is the vector representation $X_i$ of each node containing topological structure information.
The loss function is:

$$L = \left\| \hat{X} - X_{input} \right\|_2^2$$

where $\hat{X}$ is the decoded vector and $X_{input}$ is the original vector.
It will be understood by those skilled in the art that each row of the aforementioned pointwise mutual information matrix holds the pointwise mutual information between one node and all other nodes; that is, each row of the matrix is a vector.
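A minimal PyTorch sketch of the denoising autoencoder is given below. The patent fixes only fully connected layers, Gaussian noise on the input, and a reconstruction loss; the layer sizes, noise scale, and training-loop details here are assumptions.

```python
import torch
import torch.nn as nn

class DenoisingAE(nn.Module):
    """K-layer fully connected encoder/decoder over PMI rows."""
    def __init__(self, in_dim, dims=(512, 256, 128)):
        super().__init__()
        enc, d_prev = [], in_dim
        for d in dims:                                  # K encoder layers
            enc += [nn.Linear(d_prev, d), nn.ReLU()]
            d_prev = d
        dec, out_dims = [], list(reversed(dims[:-1])) + [in_dim]
        for i, d in enumerate(out_dims):                # mirrored K decoder layers
            dec.append(nn.Linear(d_prev, d))
            if i < len(out_dims) - 1:
                dec.append(nn.ReLU())                   # linear output layer
            d_prev = d
        self.encoder, self.decoder = nn.Sequential(*enc), nn.Sequential(*dec)

    def forward(self, x):
        z = self.encoder(x + 0.1 * torch.randn_like(x))  # Gaussian noise on input
        return z, self.decoder(z)

# one training step: minimize || decoded - original ||^2
model = DenoisingAE(in_dim=1000)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
pmi_rows = torch.randn(32, 1000)          # stand-in for rows of the PMI matrix
z, recon = model(pmi_rows)
loss = nn.functional.mse_loss(recon, pmi_rows)
opt.zero_grad(); loss.backward(); opt.step()
```

The encoder output `z` plays the role of the topological representation $X_i$ of each node.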
Step 3: introduce an attention mechanism over the node sequences and compute, for each node, a vector representation containing attention information and node attribute information.
The attention representation of the nodes within a path is computed from the nodes' attribute information. The attribute information of a node is a vectorized description of the attribute features of the entity that the node represents in the real relationship network; it reflects the features of the entity itself, apart from its connection relationships, and is usually a set of manually assigned labels.
The node sequences obtained in step 1 are kept, but the nodes in them are now represented by their attribute vectors. Each node sequence $\Phi \in R^{s \times Dimen}$, where $s$ is the sequence length and $Dimen$ is the dimension of the attribute vectors, forms one random walk path. Each entry $\Phi_k$ of a node sequence is the attribute vector of the kth node of the sequence. Let the parameter vector be $w$; the attention weight of the kth node in the corresponding sequence is then $\alpha_k = \mathrm{softmax}(w \odot \Phi_k)$.
The aggregation of the attention weights of all neighbor nodes of the kth node within the same random walk path is taken as the attention representation of the node in the path:

$$e_k = \sigma\Big( \sum_{k' \in \mathcal{N}(k)} \alpha_{k'} \Big)$$

where $\sigma$ is a nonlinear activation function, $\mathcal{N}(k)$ is the set of neighbors of the kth node, and $\alpha_{k'}$ is the attention weight of the k'th neighbor node in the corresponding sequence.
For node $w_i$ this yields a set of attention representations $\{e_i^p\}$, $p = 0, 1, \ldots, P$, where $\Phi_p$ denotes the pth random walk path containing node $w_i$. Based on the node attention representations in each path, the attention weight of each path is computed as:

$$\beta_p = \mathrm{softmax}\big( q^\top \tanh( W e_i^p + b ) \big)$$

where $W$ is a weight matrix, $q$ is an attention vector, and $b$ is a bias vector, all three being parameters to be trained; $e_i^p$ denotes the aggregation of the attention weights of all neighbors of node $w_i$ in random walk path $\Phi_p$, i.e. the attention representation of the node in that path.
For node $w_i$, the per-path attention representations $e_i^p$ and the path attention weights $\beta_p$ are combined to compute the vector representation of the node containing attention information and node attribute information:

$$Z_i = \sum_{p=0}^{P} \beta_p\, e_i^p$$
in the embodiment of the invention, the loss function is designed to minimize the errors of the expected classification and the label of the node, and all the parameters to be trained can be obtained through training, so that the node representation of the node containing attention information and node attribute information is obtained.
Step 4: fuse the vector representation containing topological structure information with the vector representation containing attention information and node attribute information to obtain, for each node, a final vector representation retaining the node's own information.
In this step, the two representation vectors obtained in step 2 and step 3, which have the same dimension but different contents, are fused. A fusion coefficient $\zeta$ is first computed from the two vectors of each node, and the final low-dimensional representation vector of node $w_i$ is then:

$$u_i = \zeta X_i + (1 - \zeta) Z_i$$
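The fusion then reduces to a convex combination of the two representations. Since the formula computing $\zeta$ survives only as an image placeholder, the sketch below simply treats $\zeta$ as a scalar in $[0, 1]$ (a hyperparameter or learned gate), which is an assumption:

```python
import torch

def fuse(X, Z, zeta=0.5):
    """u_i = ζ·X_i + (1-ζ)·Z_i, with ζ assumed scalar here."""
    return zeta * X + (1.0 - zeta) * Z

# X, Z: (N, dim) matrices of the representations from steps 2 and 3
X, Z = torch.randn(100, 128), torch.randn(100, 128)
U = fuse(X, Z, zeta=0.6)   # final low-dimensional node representations
```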
and 5, classifying the nodes and predicting the continuous edges based on the final vector representation of each node, which retains the self information.
This step is not shown in fig. 1.
In the embodiment of the present invention, the low-dimensional vector characterization of the node obtained in the foregoing step 4 may be further applied to node classification and edge prediction of a graph.
Node classification means feeding the low-dimensional representations of nodes into a classifier, training on the representation vectors of some of the nodes, and classifying the rest. For example, for a social network in which users are nodes and social relationships between users are edges, node classification can assign labels to the user nodes.
Link prediction means predicting whether an edge exists between two nodes from their low-dimensional vector representations. For example, for a network whose nodes are entities in text information and whose edges are relationships between entities, link prediction can judge, for a new piece of text, the relevance between its entities and other information, and thus the relevance of that piece of information to an investigated event and its truthfulness.
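As a sketch of how the final representations might feed these two applications (the classifier choice and the cosine-similarity edge score are assumptions; the patent does not fix them):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# U: (N, dim) final node representations; y: labels for a training subset
U = np.random.randn(100, 128)
y = np.random.randint(0, 3, size=100)
train, test = np.arange(60), np.arange(60, 100)

# node classification: train on some nodes' vectors, classify the rest
clf = LogisticRegression(max_iter=1000).fit(U[train], y[train])
pred = clf.predict(U[test])

# link prediction: score a candidate edge (i, j) by cosine similarity
def edge_score(U, i, j):
    ui, uj = U[i], U[j]
    return float(ui @ uj / (np.linalg.norm(ui) * np.linalg.norm(uj) + 1e-12))

print(edge_score(U, 0, 1))   # higher score means an edge is more likely
```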
Beyond these two tasks, the final vector representations retaining each node's own information can also be used in other applications; the specific manner of application can follow conventional techniques.
According to the scheme of the embodiment of the invention, the preliminary encoding with the autoencoder and the introduced attention mechanism represent high-dimensional sparse network nodes as low-dimensional dense vectors, so that the nodes of the graph structure retain more information while being represented as low-dimensional vectors, yielding better results in applications on real data.
Through the above description of the embodiments, it is clear to those skilled in the art that the above embodiments can be implemented by software, and can also be implemented by software plus a necessary general hardware platform. With this understanding, the technical solutions of the embodiments can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.), and includes several instructions for enabling a computer device (which can be a personal computer, a server, or a network device, etc.) to execute the methods according to the embodiments of the present invention.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (6)

1. A method for computing low-dimensional representations of graph nodes and related applications, characterized by comprising the following steps:
computing an importance parameter for each node from the raw graph-structure data, so that the whole graph structure is represented by a set of node sequences, obtaining a node sequence set;
computing a pointwise mutual information matrix of the graph structure from the obtained node sequence set, and performing a preliminary encoding with an autoencoder to obtain, for each node, a vector representation containing topological structure information;
introducing an attention mechanism over the obtained node sequence set and computing, for each node, a vector representation containing attention information and node attribute information;
fusing the vector representation containing topological structure information with the vector representation containing attention information and node attribute information to obtain, for each node, a final vector representation retaining the node's own information;
and performing node classification and link prediction based on the final vector representations retaining each node's own information.
2. The method of claim 1, wherein computing an importance parameter for each node from the raw graph-structure data, so that the whole graph structure is represented by a set of node sequences to obtain a node sequence set, comprises:
calculating the PageRank value of each node from the adjacency matrix of the graph structure, PageRank being a value measuring the importance of a graph node, the PageRank value of node $w_i$ being computed as

$$PR(w_i) = \frac{1-d}{N} + d \sum_{w_j \in N(w_i)} \frac{PR(w_j)}{L(w_j)}$$

where $d$ is a manually set parameter, $N$ is the total number of nodes in the graph structure, $N(w_i)$ is the set of neighbors of node $w_i$ in the graph, and $L(w_j)$ is the out-degree of node $w_j$, the degree of the node being used instead if the graph structure is an undirected graph;
using a walking method based on node importance, i.e. randomly selecting a node of the graph structure to start a walk, and choosing the next node of the walk according to

$$p(w_j \mid w_i) \propto PR(w_j), \quad w_j \in N(w_i)$$

where the symbol $\propto$ denotes proportionality;
and obtaining one node sequence per walk, repeating the walking step a specified number of times so that the whole graph structure is represented by a set of node sequences, the sequences obtained by walking finally forming the node sequence set.
3. The method of claim 1, wherein computing the pointwise mutual information matrix of the graph structure from the node sequence set comprises:
pointwise mutual information being a probability-based measure of the correlation between two variables, computed as

$$PMI(w_i, w_j) = \log \frac{p(w_i, w_j)}{p(w_i)\, p(w_j)}$$

where the probability $p(w_i, w_j)$ is the probability that nodes $w_i$ and $w_j$ occur as adjacent nodes anywhere in the node sequence set, and $p(w_i)$ and $p(w_j)$ are the probabilities that nodes $w_i$ and $w_j$, respectively, appear in a node sequence, i.e. the ratio of the number of sequences containing the node to the total number of sequences;
computing the pointwise mutual information between any two nodes by this formula, thereby obtaining the pointwise mutual information matrix $PMI_G \in R^{N \times N}$ of all nodes in the graph structure as a preliminary representation of the graph structure information, where $N$ is the total number of nodes and each entry $PMI_{ij} = PMI(i, j)$ of the matrix relates nodes $w_i$ and $w_j$.
4. The method for computing low-dimensional representations of graph nodes and related applications according to claim 1, 2 or 3, wherein performing the preliminary encoding with the autoencoder to obtain, for each node, a vector representation containing topological structure information comprises:
constructing an autoencoder with a K-layer encoder and a K-layer decoder;
feeding each row of the pointwise mutual information matrix, plus Gaussian noise, into the autoencoder as the original vector, every layer of the autoencoder being a fully connected layer and the loss function minimizing the discrepancy between the encoded-then-decoded vector and the original vector;
and taking the vector produced by the last encoder layer as the vector representation of each node containing topological structure information.
5. The method for computing low-dimensional representations of graph nodes and related applications according to claim 1, 2 or 3, wherein introducing an attention mechanism over the node sequences and computing, for each node, a vector representation containing attention information and node attribute information comprises:
for each node sequence $\Phi \in R^{s \times Dimen}$, where $s$ is the sequence length and $Dimen$ is the dimension of the attribute vectors, one node sequence forming one random walk path, each entry $\Phi_k$ of the sequence being the attribute vector of the kth node, and the parameter vector being $w$, taking the attention weight of the kth node in the corresponding sequence as $\mathrm{softmax}(w \odot \Phi_k)$;
taking the aggregation of the attention weights of all neighbor nodes of the kth node within the same random walk path as the attention representation of the node in the path:

$$e_k = \sigma\Big( \sum_{k' \in \mathcal{N}(k)} \alpha_{k'} \Big)$$

where $\sigma$ is a nonlinear activation function, $\mathcal{N}(k)$ is the set of neighbors of the kth node, and $\alpha_{k'}$ is the attention weight of the k'th neighbor node in the corresponding sequence;
obtaining for node $w_i$ a set of attention representations $\{e_i^p\}$, $p = 0, 1, \ldots, P$, where $\Phi_p$ denotes the pth random walk path containing node $w_i$, and computing the attention weight of each path from the node attention representations in that path as

$$\beta_p = \mathrm{softmax}\big( q^\top \tanh( W e_i^p + b ) \big)$$

where $W$ is a weight matrix, $q$ is an attention vector, and $b$ is a bias vector, all three being parameters to be trained, and $e_i^p$ is the aggregation of the attention weights of all neighbors of node $w_i$ in random walk path $\Phi_p$, i.e. the attention representation of the node in that path;
and computing the vector representation of the node containing attention information and node attribute information:

$$Z_i = \sum_{p=0}^{P} \beta_p\, e_i^p$$
6. The method according to claim 1, wherein fusing the vector representation of each node containing topological structure information with the vector representation containing attention information and node attribute information to obtain a final vector representation retaining the node's own information comprises:
for node $w_i$, denoting the vector representation containing topological structure information by $X_i$ and the vector representation containing attention information and node attribute information by $Z_i$, computing a fusion coefficient $\zeta$ from the two vectors and obtaining the final vector representation $u_i$ of node $w_i$ as

$$u_i = \zeta X_i + (1 - \zeta) Z_i$$
CN201910911795.4A 2019-09-25 2019-09-25 Method for computing low-dimensional representations of graph nodes and related applications Pending CN110688537A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910911795.4A 2019-09-25 2019-09-25 Method for computing low-dimensional representations of graph nodes and related applications

Publications (1)

Publication Number Publication Date
CN110688537A (en) 2020-01-14

Family

ID=69110185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910911795.4A Pending CN110688537A (en) 2019-09-25 2019-09-25 Calculation graph node low-dimensional representation and related application method

Country Status (1)

Country Link
CN (1) CN110688537A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109376857A (en) * 2018-09-03 2019-02-22 上海交通大学 A kind of multi-modal depth internet startup disk method of fusion structure and attribute information
CN109886401A (en) * 2019-01-10 2019-06-14 南京邮电大学 A kind of complex network representative learning method
CN109740039A (en) * 2019-01-11 2019-05-10 西南大学 Dynamic network community structure recognition methods based on stack self-encoding encoder
CN110046252A (en) * 2019-03-29 2019-07-23 北京工业大学 A kind of medical textual hierarchy method based on attention mechanism neural network and knowledge mapping
CN110046698A (en) * 2019-04-28 2019-07-23 北京邮电大学 Heterogeneous figure neural network generation method, device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HONGXU CHEN et al.: "Exploiting Centrality Information with Graph Convolutions for Network Representation Learning", IEEE *
LI Haisheng (李海生): "Simulation of mobile network attack node localization under the Internet of Things" (物联网下的移动网络攻击节点定位仿真), Computer Simulation (《计算机仿真》) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428091A (en) * 2020-03-19 2020-07-17 腾讯科技(深圳)有限公司 Encoder training method, information recommendation method and related device
CN111428091B (en) * 2020-03-19 2020-12-08 腾讯科技(深圳)有限公司 Encoder training method, information recommendation method and related device
CN111598093A (en) * 2020-05-25 2020-08-28 深圳前海微众银行股份有限公司 Method, device, equipment and medium for generating structured information of characters in picture
CN111598093B (en) * 2020-05-25 2024-05-14 深圳前海微众银行股份有限公司 Method, device, equipment and medium for generating structured information of characters in picture
CN111538870A (en) * 2020-07-07 2020-08-14 北京百度网讯科技有限公司 Text expression method and device, electronic equipment and readable storage medium
CN111538870B (en) * 2020-07-07 2020-12-18 北京百度网讯科技有限公司 Text expression method and device, electronic equipment and readable storage medium
CN112347260A (en) * 2020-11-24 2021-02-09 深圳市欢太科技有限公司 Data processing method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN112529168B (en) GCN-based attribute multilayer network representation learning method
WO2023065545A1 (en) Risk prediction method and apparatus, and device and storage medium
CN110688537A (en) Method for computing low-dimensional representations of graph nodes and related applications
Kuo et al. Integration of ART2 neural network and genetic K-means algorithm for analyzing Web browsing paths in electronic commerce
US8990128B2 (en) Graph-based framework for multi-task multi-view learning
CN109376857A (en) A kind of multi-modal depth internet startup disk method of fusion structure and attribute information
CN112529071B (en) Text classification method, system, computer equipment and storage medium
CN113158071A (en) Knowledge social contact recommendation method, system and equipment based on graph neural network
CN113051440A (en) Link prediction method and system based on hypergraph structure
CN116091152A (en) Recommendation method and system based on multi-level comparison learning and multi-mode knowledge graph
CN112667920A (en) Text perception-based social influence prediction method, device and equipment
CN112699222A (en) Text classification method and mail classification method based on quantum heuristic neural network
Ens et al. CAEMSI: A Cross-Domain Analytic Evaluation Methodology for Style Imitation.
CN113254652A (en) Social media posting authenticity detection method based on hypergraph attention network
Tran et al. Building interpretable predictive models with context-aware evolutionary learning
CN114880427A (en) Model based on multi-level attention mechanism, event argument extraction method and system
CN113704393A (en) Keyword extraction method, device, equipment and medium
CN112905906A (en) Recommendation method and system fusing local collaboration and feature intersection
Serratosa A general model to define the substitution, insertion and deletion graph edit costs based on an embedded space
CN113297385B (en) Multi-label text classification system and method based on improved GraphRNN
CN113065321B (en) User behavior prediction method and system based on LSTM model and hypergraph
Cathcart Toward a deep dialectological representation of Indo-Aryan
CN115879507A (en) Large-scale graph generation method based on deep confrontation learning
Salas et al. A global prediction model for sudden stops of capital flows using decision trees
US11734573B2 (en) Image element matching via graph processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2020-01-14