CN114863234A - Graph representation learning method and system based on topological structure maintenance - Google Patents
- Publication number
- CN114863234A (Application number CN202210464131.XA)
- Authority
- CN
- China
- Prior art keywords
- graph
- node
- topology
- task
- tgssl
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/806—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Multimedia (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
The invention discloses a graph representation learning method and system based on topological structure preservation, which improve the quality of graph representation learning by fusing topological structure and semantic features on the premise of maintaining the topological invariance of the graph data. The method comprises the following steps: designing self-supervised tasks for topology preservation according to the characteristics of the graph data; inputting the graph data and feature-coding it with a graph convolutional neural network, thereby learning an initial vector representation for each node; and inputting the learned initial node vector representations into a TGSSL (Topology-preserving Graph Self-Supervised Learning) model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation. The invention effectively addresses the problem that existing graph representation learning methods cannot effectively fuse structural information when learning node vectors.
Description
Technical Field
The invention relates to the field of graph data and representation learning, in particular to a graph representation learning method and system based on topological structure maintenance.
Background
When mapping graph data into a low-dimensional vector space, the key question of graph representation learning research is how to make the vectors preserve as much of the semantic and structural information of the nodes as possible. Graph data generally exhibit structural characteristics such as community and hierarchy, and this structure is crucial for graph reasoning. However, existing graph representation learning methods such as graph neural networks mostly retain the semantic features of nodes and the information of their low-order neighbors, while losing higher-order proximity and other kinds of structural characteristics among the nodes of the graph. Therefore, when performing representation learning on graph data, the quality of representation learning needs to be further improved by fusing the topological structure and node features on the premise of preserving topological invariance.
Disclosure of Invention
The invention mainly aims to solve the problem that existing graph representation learning methods cannot effectively fuse structural information when learning node vectors, and provides a graph representation learning method and system based on topology preservation.
The invention adopts the following technical scheme:
In one aspect, a graph representation learning method based on topology preservation comprises the following steps:
Step 1, designing self-supervised tasks for topology preservation according to the characteristics of the graph data;
Step 2, inputting graph data and feature-coding it with a graph convolutional neural network, thereby learning the initial vector representation of each node;
Step 3, inputting the learned initial node vector representations into a TGSSL model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation.
Preferably, the step 1 specifically includes:
Step 1.1, designing a graph topology partitioning self-supervised task based on the structure of the graph data, namely partitioning the graph according to the connection density of its edges and predicting the partition index to which each node belongs;
Step 1.2, designing a mask node self-supervised task based on the structure of the graph data, namely randomly masking the features of some nodes and reconstructing each masked node's features from the features of its neighbor nodes.
Preferably, the step 1.1 specifically includes:
Step 1.1.1, dividing the graph data into K ∈ {1, 2, …, |V|} communities using the graph partitioning algorithm METIS, and outputting the node sets with partition indexes {V_{p_1}, …, V_{p_k}, …, V_{p_K} | V_{p_k} ⊆ V, k = 1, …, K}; wherein |V| is the number of nodes of the graph data;
Step 1.1.2, using the node partition indexes as the pseudo label y_TP of the graph topology partitioning task for the TGSSL model to learn; the partition index of the nth node being k can be formally expressed as: y_{TP_n} = k, if v_n ∈ V_{p_k}, n = 1, …, |V|, k = 1, …, K;
Step 1.1.3, based on the pseudo label y_TP, defining the graph topology partitioning task loss function L_TP as the multi-class cross entropy:
L_TP = −(1/N) Σ_{n=1}^{N} Σ_{k=1}^{K} y_{TP_nk} · log(ŷ_{TP_nk}) (1)
wherein N is the total number of node samples in the graph topology partitioning task, and ŷ_{TP_nk} is the predicted probability that the nth node belongs to the kth community.
Preferably, the step 1.2 specifically includes:
Step 1.2.1, in the graph data, randomly masking the features x of |M_a| nodes; wherein M_a ⊆ V is the set of masked nodes;
Step 1.2.2, using the original node feature vectors x before the mask operation in step 1.2.1 as the pseudo label y_MN of the mask node task for the TGSSL model to learn;
Step 1.2.3, based on the pseudo label y_MN, defining the mask node task loss function L_MN as the mean absolute error:
L_MN = (1/|M_a|) Σ_{v_i ∈ M_a} |x_i − z_i| (2)
wherein x_i is the original feature vector of node v_i, and z_i is the vector representation of v_i learned by the TGSSL model.
Preferably, the step 2 specifically includes:
Inputting the node initial vector matrix X and the normalized adjacency matrix Â of the graph data, and performing graph representation learning with a two-layer graph convolutional neural network, thereby learning the initial vector representation H of the graph nodes:
H = Â · ReLU(Â X W_1) · W_2 (3)
wherein Â = D̃^{−1/2} Ã D̃^{−1/2} is the normalized adjacency matrix, Ã = A + I is the adjacency matrix with self-loops added, I is the identity matrix, D̃ is the degree matrix of Ã, and W_1 is the training parameter matrix of the first convolution layer.
Preferably, the step 3 specifically includes:
Step 3.1, calculating the loss L_TP of the graph topology partitioning task in the TGSSL model;
Step 3.2, calculating the loss L_MN of the mask node task in the TGSSL model;
Step 3.3, calculating the multi-class cross entropy loss L_main of the downstream task in the TGSSL model:
L_main = −(1/I) Σ_{i=1}^{I} Σ_{c=1}^{C} y_ic · log(ŷ_ic) (4)
wherein I and C are respectively the total number of node samples and the number of node classes in the downstream node classification task, and ŷ_ic is the predicted probability that the ith sample belongs to class c.
Step 3.4, updating the network parameter W_1 according to the total loss L until the maximum number of iterations T is reached, finally obtaining a high-quality node vector representation Z on the basis of structure preservation and a well-trained TGSSL model, as follows:
L = λ_1 L_main + λ_2 L_TP + λ_3 L_MN (5)
wherein λ_1, λ_2 and λ_3 are respectively the weights of the downstream task, the graph topology partitioning task and the mask node task.
In another aspect, a graph representation learning system based on topology preservation includes:
the self-supervised task design module, configured to design topology-preserving self-supervised tasks according to the characteristics of the graph data;
the node initial vector representation learning module, configured to input graph data and feature-code it with a graph convolutional neural network, thereby learning the initial node vector representations; and
the node vector representation acquisition module, configured to input the learned initial node vector representations into the TGSSL model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation.
As can be seen from the above description of the present invention, compared with the prior art, the present invention has the following advantages:
According to the structural characteristics of the graph data, the designed graph topology partitioning and mask node self-supervised tasks preserve the topological structure of the graph data, and at the same time merge structural information into the message passing mechanism of the graph convolutional neural network, thereby improving the quality of graph representation learning. On one hand, this provides a solution for topology-preserving graph representation learning; on the other hand, the learned high-quality node vectors can serve graph reasoning tasks.
Drawings
FIG. 1 is a flow chart of the graph representation learning method based on topology preservation of the present invention;
FIG. 2 is a diagram of the research framework for topology-preserving graph representation learning;
FIG. 3 is a schematic representation of the TGSSL algorithm pseudo-code;
FIG. 4 is a block diagram of the graph representation learning system based on topology preservation of the present invention.
Detailed Description
The invention will be further illustrated with reference to the following specific examples. It should be understood that these examples are for illustrative purposes only and are not intended to limit the scope of the present invention. Further, it should be understood that various changes or modifications of the present invention may be made by those skilled in the art after reading the teaching of the present invention, and such equivalents may fall within the scope of the present invention as defined in the appended claims.
Referring to fig. 1 to 3, the graph representation learning method based on topology preservation of the present invention includes the following steps:
Step 1, designing self-supervised tasks for topology preservation according to the characteristics of the graph data;
Step 2, inputting graph data and feature-coding it with a graph convolutional neural network, thereby learning the initial vector representation of each node;
Step 3, inputting the learned initial node vector representations into the TGSSL model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation.
The embodiment is described by a node classification task on a public citation network PubMed data set. The data set is a graph formed by medical papers and reference relations thereof, nodes of the graph are papers, edges in the graph represent the reference relations between the papers, the characteristic dimension of each node is 500, and the labels of the nodes are categories to which the papers belong. The data set statistics are shown in table 1.
Table 1 citation network diagram data statistics
The step 1 specifically comprises:
Step 1.1, designing a graph topology partitioning self-supervised task based on the structure of the graph data, namely partitioning the graph according to the connection density of its edges and predicting the partition index to which each node belongs;
Step 1.2, designing a mask node self-supervised task based on the structure of the graph data, namely randomly masking the features of some nodes and reconstructing each masked node's features from the features of its neighbor nodes;
further, the step 1.1 specifically includes:
Step 1.1.1, dividing the graph data into K ∈ {1, 2, …, |V|} communities using the graph partitioning algorithm METIS, and outputting the node sets with partition indexes {V_{p_1}, …, V_{p_k}, …, V_{p_K} | V_{p_k} ⊆ V, k = 1, …, K}. Here, the number of nodes of the graph data is |V| = 19717, and the number of partitions is K = 16;
Step 1.1.2, using the node partition indexes as the pseudo label y_TP of the graph topology partitioning task for the TGSSL model to learn. The partition index of the nth node being k can be formally expressed as: y_{TP_n} = k, if v_n ∈ V_{p_k}, n = 1, …, |V|, k = 1, …, K;
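The pseudo-label construction of step 1.1.2 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the METIS partition itself is an external algorithm, so here it is assumed to be already available as a list of node sets, and the function name `partition_pseudo_labels` is hypothetical.

```python
def partition_pseudo_labels(partitions, num_nodes):
    """Map each node n to the index k of the community V_p_k containing it.

    partitions: list of node-index sets, one per community (assumed METIS output)
    returns: list y_tp where y_tp[n] = k iff node n is in partitions[k]
    """
    y_tp = [None] * num_nodes
    for k, nodes in enumerate(partitions):
        for n in nodes:
            y_tp[n] = k
    return y_tp

# toy graph with 6 nodes split into K = 2 communities
labels = partition_pseudo_labels([{0, 1, 2}, {3, 4, 5}], 6)
```

On the PubMed graph the same call would take the K = 16 METIS communities and |V| = 19717 nodes.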
Step 1.1.3, based on the pseudo label y_TP, defining the graph topology partitioning task loss function L_TP as the multi-class cross entropy:
L_TP = −(1/N) Σ_{n=1}^{N} Σ_{k=1}^{K} y_{TP_nk} · log(ŷ_{TP_nk}) (1)
wherein N is the total number of node samples in the graph topology partitioning task, and ŷ_{TP_nk} is the predicted probability that the nth node belongs to the kth community.
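The multi-class cross entropy of equation (1) can be sketched as follows, assuming one-hot pseudo-labels so that only the log-probability of each node's true community contributes; the function name is illustrative.

```python
import math

def topology_partition_loss(true_idx, probs):
    """Multi-class cross entropy: -(1/N) * sum_n log(p_n[y_n]).

    true_idx: list of pseudo-label partition indices y_TP (one per node)
    probs:    list of predicted probability distributions over the K communities
    """
    n = len(true_idx)
    return -sum(math.log(p[k]) for k, p in zip(true_idx, probs)) / n

# two nodes, K = 2 communities, confident-but-imperfect predictions
loss = topology_partition_loss([0, 1], [[0.9, 0.1], [0.2, 0.8]])
```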
Further, the step 1.2 specifically includes:
step 1.2.2, using the symptom node feature vector x before the mask operation is not done in step 1.2.1 as thePseudo label y of mask node task MN For TGSSL model to learn; in the experiment, because the dimensionality of the initial vector of the node is large, the dimensionality of the initial vector is reduced to 28 by using a singular value decomposition method, and the dimensionality-reduced node vector is used as a pseudo label of the self-supervision task;
Step 1.2.3, based on the pseudo label y_MN, defining the mask node task loss function L_MN as the mean absolute error:
L_MN = (1/|M_a|) Σ_{v_i ∈ M_a} |x_i − z_i| (2)
wherein x_i is the original feature vector of node v_i, and z_i is the vector representation of v_i learned by the TGSSL model.
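The mask-node reconstruction error of equation (2) can be sketched as follows, assuming the mean absolute error is averaged over all feature entries of the masked nodes (the averaging granularity is an assumption, and the function name is illustrative).

```python
def mask_node_loss(features, learned, masked):
    """Mean absolute error between original features x_i and learned vectors z_i,
    computed only over the masked node set M_a."""
    total = 0.0
    count = 0
    for i in masked:
        total += sum(abs(a - b) for a, b in zip(features[i], learned[i]))
        count += len(features[i])
    return total / count

# toy example: node 0 is masked, node 1 is not
x = {0: [1.0, 2.0], 1: [3.0, 3.0]}   # original (pseudo-label) features
z = {0: [1.5, 1.0], 1: [3.0, 3.0]}   # TGSSL-reconstructed vectors
loss = mask_node_loss(x, z, masked={0})
```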
The step 2 specifically comprises:
inputting the node initial vector matrix X and the normalized adjacency matrix Â of the graph data, and performing graph representation learning with a two-layer graph convolutional neural network, thereby learning the initial vector representation H of the graph nodes:
H = Â · ReLU(Â X W_1) · W_2 (3)
wherein Â = D̃^{−1/2} Ã D̃^{−1/2} is the normalized adjacency matrix, Ã = A + I is the adjacency matrix with self-loops added, I is the identity matrix, D̃ is the degree matrix of Ã, and W_1 is the training parameter matrix of the first convolution layer.
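The two-layer graph convolutional encoding of step 2 can be sketched with NumPy as follows. This is a minimal dense sketch: the ReLU activation and the second weight matrix `w2` are assumptions (the patent only names W_1 explicitly), and in practice a sparse implementation would be used.

```python
import numpy as np

def normalize_adjacency(adj):
    """Â = D̃^{-1/2} (A + I) D̃^{-1/2}: add self-loops, then symmetrically normalize."""
    a_tilde = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
    return d_inv_sqrt @ a_tilde @ d_inv_sqrt

def gcn_encode(adj, x, w1, w2):
    """Two-layer GCN encoding H = Â · ReLU(Â X W1) · W2 (activation is assumed)."""
    a_hat = normalize_adjacency(adj)
    h1 = np.maximum(a_hat @ x @ w1, 0.0)   # first convolution + ReLU
    return a_hat @ h1 @ w2                 # second convolution

# toy 3-node path graph, 4-dimensional features, 2-dimensional output
adj = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
x = np.ones((3, 4))
h = gcn_encode(adj, x, np.ones((4, 2)), np.ones((2, 2)))
```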
The step 3 specifically includes:
Step 3.1, calculating the loss L_TP of the graph topology partitioning task in the TGSSL model (equation 1);
Step 3.2, calculating the loss L_MN of the mask node task in the TGSSL model (equation 2);
Step 3.3, calculating the multi-class cross entropy loss L_main of the downstream task in the TGSSL model:
L_main = −(1/I) Σ_{i=1}^{I} Σ_{c=1}^{C} y_ic · log(ŷ_ic) (4)
wherein I and C are respectively the total number of node samples and the number of node classes in the downstream node classification task, and ŷ_ic is the predicted probability that the ith sample belongs to class c.
Step 3.4, updating the network parameter W according to the total loss L (formula 5) 1 Until the maximum iteration number T is 10000, finally obtaining a high-quality node vector representation Z on the basis of structure preservation and a well-trained TGSSL model.
L=λ 1 L main +λ 2 L TP +λ 3 L MN (5)
Wherein λ is 1 ,λ 2 ,The weights of the downstream task, the graph topology partitioning task and the mask node task are respectively. The specific settings of the TGSSL model in the experiment were: learning rate is set to 0.01, dropout is set to 0.5, hidden layer neuron number is set to 16, L2 regularization weight is set to 5 × 10 -4 。
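The weighted combination of equation (5) can be sketched as follows; the default weights shown are placeholders, since the patent treats λ_1, λ_2, λ_3 as tunable hyperparameters.

```python
def total_loss(l_main, l_tp, l_mn, lambdas=(1.0, 1.0, 1.0)):
    """Total training objective L = λ1·L_main + λ2·L_TP + λ3·L_MN (equation 5)."""
    l1, l2, l3 = lambdas
    return l1 * l_main + l2 * l_tp + l3 * l_mn

# example: down-weight the two self-supervised auxiliary losses
l = total_loss(1.0, 2.0, 3.0, lambdas=(0.5, 0.25, 0.25))
```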
Step 3.5, based on the trained TGSSL model and the learned high-quality node vector representation Z, outputting the prediction result of the downstream task.
In the experiment, the node classification test set is fed into the trained TGSSL model, which outputs the predicted labels for node classification. Accuracy is used as the evaluation index for node classification. Furthermore, to verify the effectiveness of the TGSSL model, the two types of models in Table 2 are used as baselines for comparison: the first type consists of existing related models; the second type consists of ablation models of TGSSL, namely the plain graph convolutional network (GCN), the graph self-supervised model TP_GCN based only on graph topology partitioning, and the graph self-supervised model MN_GCN based only on mask nodes. The experimental results are shown in Table 3.
TABLE 2 Baseline model settings
TABLE 3 node classification accuracy (%) of model on PubMed citation network data set
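The accuracy metric used in Table 3 can be sketched as the fraction of correctly predicted node labels; the function name is illustrative.

```python
def accuracy(y_true, y_pred):
    """Node-classification accuracy: fraction of test nodes whose predicted
    label matches the ground-truth label."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# toy example: 3 of 4 test nodes classified correctly
acc = accuracy([0, 1, 2, 1], [0, 1, 1, 1])
```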
This completes the graph representation learning method based on topology preservation. According to the structural characteristics of the graph data, the invention designs graph topology partitioning and mask node self-supervised tasks, and merges structural information into the message passing mechanism of the graph convolutional neural network on the premise of preserving the topological structure, thereby improving the quality of graph representation learning and effectively serving graph data reasoning tasks such as node classification.
Referring to fig. 4, a graph representation learning system based on topology preservation includes:
a self-supervised task design module 401, configured to design topology-preserving self-supervised tasks according to the characteristics of the graph data;
a node initial vector representation learning module 402, configured to input graph data and feature-code it with a graph convolutional neural network, thereby learning the initial node vector representations;
and a node vector representation acquisition module 403, configured to input the learned initial node vector representations into the TGSSL model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation.
The graph representation learning system based on topology preservation of the present invention specifically implements the same graph representation learning method based on topology preservation described above, and the details are not repeated here.
The above description is only an embodiment of the present invention, but the design concept of the present invention is not limited thereto, and any insubstantial modification made using this design concept shall fall within the protection scope of the present invention.
Claims (7)
1. A graph representation learning method based on topological structure preservation is characterized by comprising the following steps:
Step 1, according to the characteristics of the graph data, designing self-supervised tasks for topology preservation;
step 2, inputting graph data, and performing feature coding on the graph data by using a graph convolution neural network so as to learn node initial vector representation;
and 3, inputting the learned node initial vector representation into a TGSSL model for graph self-supervision learning, and finally obtaining high-quality node vector representation on the basis of structure maintenance.
2. The graph representation learning method based on topology maintenance according to claim 1, wherein the step 1 specifically comprises:
Step 1.1, designing a graph topology partitioning self-supervised task based on the structure of the graph data, namely partitioning the graph according to the connection density of its edges and predicting the partition index to which each node belongs;
Step 1.2, designing a mask node self-supervised task based on the structure of the graph data, namely randomly masking the features of some nodes and reconstructing each masked node's features from the features of its neighbor nodes.
3. The graph representation learning method based on topology maintenance according to claim 2, wherein the step 1.1 specifically comprises:
Step 1.1.1, dividing the graph data into K ∈ {1, 2, …, |V|} communities using the graph partitioning algorithm METIS, and outputting the node sets with partition indexes {V_{p_1}, …, V_{p_k}, …, V_{p_K} | V_{p_k} ⊆ V, k = 1, …, K}; wherein |V| is the number of nodes of the graph data;
Step 1.1.2, using the node partition indexes as the pseudo label y_TP of the graph topology partitioning task for the TGSSL model to learn; the partition index of the nth node being k is formally expressed as: y_{TP_n} = k, if v_n ∈ V_{p_k}, n = 1, …, |V|, k = 1, …, K;
Step 1.1.3, based on the pseudo label y_TP, defining the graph topology partitioning task loss function L_TP as the multi-class cross entropy:
L_TP = −(1/N) Σ_{n=1}^{N} Σ_{k=1}^{K} y_{TP_nk} · log(ŷ_{TP_nk}) (1)
4. The graph representation learning method based on topology maintenance according to claim 3, wherein the step 1.2 specifically comprises:
Step 1.2.1, in the graph data, randomly masking the features x of |M_a| nodes; wherein M_a ⊆ V is the set of masked nodes;
Step 1.2.2, using the original node feature vectors x before the mask operation in step 1.2.1 as the pseudo label y_MN of the mask node task for the TGSSL model to learn;
Step 1.2.3, based on the pseudo label y_MN, defining the mask node task loss function L_MN as the mean absolute error:
L_MN = (1/|M_a|) Σ_{v_i ∈ M_a} |x_i − z_i| (2)
5. The graph representation learning method based on topology maintenance according to claim 4, wherein the step 2 specifically comprises:
inputting the node initial vector matrix X and the normalized adjacency matrix Â of the graph data, and performing graph representation learning with a two-layer graph convolutional neural network, thereby learning the initial vector representation H of the graph nodes:
H = Â · ReLU(Â X W_1) · W_2 (3)
6. The graph representation learning method based on topology maintenance according to claim 5, wherein the step 3 specifically comprises:
Step 3.1, calculating the loss L_TP of the graph topology partitioning task in the TGSSL model;
Step 3.2, calculating the loss L_MN of the mask node task in the TGSSL model;
Step 3.3, calculating the multi-class cross entropy loss L_main of the downstream task in the TGSSL model:
L_main = −(1/I) Σ_{i=1}^{I} Σ_{c=1}^{C} y_ic · log(ŷ_ic) (4)
wherein I and C are respectively the total number of node samples and the number of node classes in the downstream node classification task, and ŷ_ic is the predicted probability that the ith sample belongs to class c;
Step 3.4, updating the network parameter W_1 according to the total loss L until the maximum number of iterations T is reached, finally obtaining a high-quality node vector representation Z on the basis of structure preservation and a well-trained TGSSL model, as follows:
L = λ_1 L_main + λ_2 L_TP + λ_3 L_MN (5)
7. A graph representation learning system based on topology preservation, comprising:
the self-supervised task design module, configured to design topology-preserving self-supervised tasks according to the characteristics of the graph data;
the node initial vector representation learning module, configured to input graph data and feature-code it with a graph convolutional neural network, thereby learning the initial node vector representations;
and the node vector representation acquisition module, configured to input the learned initial node vector representations into the TGSSL model for graph self-supervised learning, finally obtaining high-quality node vector representations on the basis of structure preservation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210464131.XA CN114863234A (en) | 2022-04-29 | 2022-04-29 | Graph representation learning method and system based on topological structure maintenance |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114863234A true CN114863234A (en) | 2022-08-05 |
Family
ID=82635945
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210464131.XA Pending CN114863234A (en) | 2022-04-29 | 2022-04-29 | Graph representation learning method and system based on topological structure maintenance |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114863234A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180247224A1 (en) * | 2017-02-28 | 2018-08-30 | Nec Europe Ltd. | System and method for multi-modal graph-based personalization |
CN111950594A (en) * | 2020-07-14 | 2020-11-17 | 北京大学 | Unsupervised graph representation learning method and unsupervised graph representation learning device on large-scale attribute graph based on sub-graph sampling |
CN113065649A (en) * | 2021-02-22 | 2021-07-02 | 中国互联网络信息中心 | Complex network topology graph representation learning method, prediction method and server |
CN113378937A (en) * | 2021-06-11 | 2021-09-10 | 西安电子科技大学 | Small sample image classification method and system based on self-supervision enhancement |
CN113378913A (en) * | 2021-06-08 | 2021-09-10 | 电子科技大学 | Semi-supervised node classification method based on self-supervised learning |
CN114067177A (en) * | 2021-11-18 | 2022-02-18 | 中国人民解放军国防科技大学 | Remote sensing image classification network robustness improving method based on self-supervision learning |
CN114140645A (en) * | 2021-11-23 | 2022-03-04 | 杭州电子科技大学 | Photographic image aesthetic style classification method based on improved self-supervision feature learning |
CN114357221A (en) * | 2022-03-15 | 2022-04-15 | 南京航空航天大学 | Self-supervision active learning method based on image classification |
- 2022-04-29: Application CN202210464131.XA filed; publication CN114863234A, status Pending.
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||