CN111159425A - Temporal knowledge graph representation method based on historical relationship and double-graph convolution network - Google Patents


Info

Publication number
CN111159425A
Authority
CN
China
Prior art keywords: graph, historical, relationship, representation, edge
Legal status: Granted
Application number: CN201911392419.5A
Other languages: Chinese (zh)
Other versions: CN111159425B (en)
Inventors: 陈岭, 汤星
Current Assignee: Zhejiang University (ZJU)
Original Assignee: Zhejiang University (ZJU)
Application filed by Zhejiang University (ZJU)
Priority claimed from application CN201911392419.5A
Publication of CN111159425A
Application granted; publication of CN111159425B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F 16/30: Information retrieval of unstructured textual data
    • G06F 16/36: Creation of semantic tools, e.g. ontology or thesauri
    • G06F 16/367: Ontology
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06N 3/048: Activation functions
    • G06N 3/08: Learning methods

Abstract

The invention discloses a temporal knowledge graph representation method based on historical relationships and a double-graph convolution network, comprising the following steps: 1) preprocess the data in the temporal knowledge graph and extract events and historical relationships; 2) construct an original graph and an edge graph to represent, respectively, the interactions between entities and the interactions between historical relationships; 3) use a parameter-determined historical relationship encoder based on a temporal self-attention mechanism to model multi-range time dependencies over the original graph and obtain historical relationship representations; 4) use a parameter-determined double-graph convolution network to obtain entity representations from the historical relationship representations, the original graph, and the edge graph; 5) use a parameter-determined semantic matching model to predict, from the entity representations and the historical relationship representations, the relationships between entities over a future period, and construct a new temporal knowledge graph from the relationships likely to occur. The method improves model performance and has broad application prospects in fields such as international relation prediction and social network analysis.

Description

Temporal knowledge graph representation method based on historical relationship and double-graph convolution network
Technical Field
The invention relates to the field of representation of a temporal knowledge graph, in particular to a temporal knowledge graph representation method based on a historical relationship and a double-graph convolution network.
Background
The temporal knowledge graph contains a large amount of time-stamped knowledge, can be regarded as a multi-relational graph, and is widely applied in fields such as international relation prediction and social network analysis. The Global Database of Events, Language, and Tone (GDELT) and the Integrated Crisis Early Warning System (ICEWS) are typical event-based temporal knowledge graphs that represent time-varying knowledge as event quadruples (head entity, relationship, tail entity, timestamp). Temporal knowledge graph representation learning introduces time information to map the entities and relationships in the knowledge graph into low-dimensional, continuous vector representations, which is of great significance for temporal knowledge graph completion.
Traditional temporal knowledge graph representation learning methods explicitly model timestamps as hyperplanes, vector representations, or fixed-format encodings. These methods simply embed the corresponding timestamp into a low-dimensional vector space and ignore time dependence. To address this problem, researchers have proposed temporal knowledge graph representation learning methods based on sequence learning models to model time dependencies.
Temporal knowledge graph representation learning methods based on sequence learning models use a sequence learning model to capture time dependence and fall into two categories. The first uses a recurrent neural network (RNN) to model the sequence of relationships for a given entity pair and thereby obtain entity representations; however, such approaches ignore the simultaneous interactions between different entities. The second uses a gated recurrent unit (GRU) to model the sequence of tail entities for a given head entity and relationship, and designs three different aggregation modes to obtain neighbor-based entity representations; however, such methods cannot account for the effects of different relationships. In addition, existing temporal knowledge graph representation learning methods ignore the interactions between relationships, particularly between historical relationships (i.e., the time-stamped sequences of relationships between entity pairs).
Disclosure of Invention
The technical problem to be solved by the invention is how to obtain a temporal knowledge graph representation that simultaneously accounts for the interactions between entities and between historical relationships.
To solve the above problem, the present invention provides a temporal knowledge graph representation method based on a historical relationship and a double-graph convolution network, comprising the following steps:
1) preprocess the data in the temporal knowledge graph and extract events and historical relationships;
2) construct an original graph and an edge graph to represent, respectively, the interactions between entities and between historical relationships: take entities as nodes and the historical relationships between entities as edges to construct the original graph; take the edges of the original graph as nodes and the mutual influences between historical relationships as edges to construct the edge graph;
3) use a parameter-determined historical relationship encoder based on a temporal self-attention mechanism to model multi-range time dependencies over the original graph and obtain historical relationship representations;
4) use a parameter-determined double-graph convolution network to obtain entity representations from the historical relationship representations, the original graph, and the edge graph;
5) use a parameter-determined semantic matching model to predict, from the entity representations and the historical relationship representations, the relationships between entities over a future period, and construct a new temporal knowledge graph from the relationships likely to occur.
By constructing the original graph and the edge graph and introducing the double-graph convolution network for representation learning, the method can capture the interactions between entities and between historical relationships simultaneously, improving model performance. Compared with the prior art, the method has the following advantages:
1) A historical relationship encoder based on a temporal self-attention mechanism is introduced to model multi-range time dependencies.
2) The original graph is constructed from the historical relationships, the edge graph is constructed from the mutual influences between edges, and a double-graph convolution network is introduced to model the interactions between entities and between historical relationships.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of the temporal knowledge graph representation method based on historical relationships and a double-graph convolution network according to an embodiment;
FIG. 2 is a schematic diagram of the structure of the historical relationship encoder based on the temporal self-attention mechanism according to an embodiment;
FIG. 3 is a block diagram of the double-graph convolution network according to an embodiment.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the invention.
FIG. 1 is a flowchart of the temporal knowledge graph representation method based on historical relationships and a double-graph convolution network according to an embodiment. Referring to FIG. 1, the method comprises a data preprocessing stage, an original graph and edge graph construction stage, a temporal knowledge graph representation learning stage, and an application stage; the application stage reuses the data preprocessing, graph construction, and representation learning procedures and is therefore not separately shown in FIG. 1. Each stage is described in detail below.
Data preprocessing stage
The specific flow of data preprocessing is as follows:
step 1-1, inputting a temporal knowledge map TKB, and extracting events to obtain an event set.
The temporal knowledge graph TKB comprises a large amount of time-labeled knowledge, and event extraction is performed on the temporal knowledge graph TKB to form an event set. Events are represented in the form of quadruplets (s, r, o, t), where s represents the head entity, r represents the relationship, o represents the tail entity, and t represents the timestamp.
Figure BDA0002345338270000045
Represents a collection of entities, and
Figure BDA0002345338270000042
Figure BDA0002345338270000043
represents a set of relationships, and
Figure BDA0002345338270000044
t represents a set of timestamps, and T ∈ T.
Step 1-2: traverse the event set by entity pair (head entity, tail entity) to extract the historical relationships, obtaining the complete training data set.
A historical relationship is a time-stamped sequence of relationships between an entity pair. In this step, the historical relationship h is extracted by traversing the event set for each entity pair (s, o); it is formally represented as h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)}, where t_i < t_j means that relationship r_i occurs before relationship r_j.
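The grouping-and-sorting described in this step can be sketched in Python; the event tuples, entity names, and function name below are illustrative, not taken from the patent:

```python
from collections import defaultdict

def extract_historical_relations(events):
    """Group time-stamped events (s, r, o, t) by entity pair (s, o) and sort
    each group by timestamp, yielding the historical relationship per pair."""
    history = defaultdict(list)
    for s, r, o, t in events:
        history[(s, o)].append((r, t))
    # Sort so that t_i < t_j implies r_i appears before r_j in the sequence.
    return {pair: sorted(seq, key=lambda rt: rt[1]) for pair, seq in history.items()}

# Toy event set in (head entity, relationship, tail entity, timestamp) form.
events = [
    ("A", "consult", "B", 3),
    ("A", "visit", "B", 1),
    ("A", "accuse", "B", 2),
    ("C", "aid", "D", 1),
]
histories = extract_historical_relations(events)
```

Here `histories[("A", "B")]` is the time-ordered sequence of relations observed between A and B, i.e. the historical relationship h for that entity pair.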
Original and edge graph construction stage
The specific flow of the construction of the original graph and the edge graph is as follows:
step 2-1 is to construct the original image G ═ V, E, and a from the history relationship, taking the entity as a node and the history relationship as an edge.
Where V denotes a node set (i.e., an entity set), E denotes an edge set (i.e., a history relation set), and a denotes an adjacency matrix of the original image. If a historical relationship exists between two entities, the two entities are considered to have an edge therebetween, and the adjacency matrix A of the original graph has the following calculation formula:
Figure BDA0002345338270000041
wherein A isi,jConnectivity between entities is represented.
Step 2-2: regard the edges of the original graph as nodes, define edges according to whether the edges of the original graph are connected, and construct the edge graph G_edge = (V_edge, E_edge, A_edge). V_edge denotes the node set (i.e., the historical relationship set), E_edge denotes the edge set (the edges of the edge graph are defined by whether the corresponding edges of the original graph are connected), and A_edge denotes the adjacency matrix of the edge graph.
In this step, the edge graph is obtained by transforming the original graph: its nodes are the edges of the original graph, and its edges follow the rule that if two edges of the original graph share a vertex (i.e., the two edges are connected), then there is an edge between the corresponding two nodes of the edge graph. The adjacency matrix A_edge of the edge graph is computed as:
A_edge_{i→j, u→v} = w_{i→j, u→v} if edges i→j and u→v share a vertex in the original graph, and 0 otherwise,
where A_edge_{i→j, u→v} ≠ 0 indicates connectivity between the historical relationships i→j and u→v, and the weight w_{i→j, u→v} measures the degree of mutual influence between the historical relationships (i.e., the two edges of the original graph). The weight, controlled by a hyperparameter α with 0 < α < 0.5, is determined by the degree of the shared vertex in the original graph: the larger the degree of the shared vertex, the smaller the influence of edge i→j on edge u→v, and conversely, the larger the influence.
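A minimal sketch of the two constructions follows. Since the exact weight formula is not reproduced in this text, the sketch substitutes the assumed weight 1/deg(shared vertex), which keeps only the stated property that a larger shared-vertex degree means less mutual influence; the names and the weight choice are illustrative:

```python
def build_graphs(pairs):
    """Build the original-graph adjacency A (entities as nodes, one directed
    edge per entity pair with a historical relationship) and the edge-graph
    adjacency A_edge (original edges as nodes, connected when they share a vertex)."""
    nodes = sorted({v for pair in pairs for v in pair})
    index = {v: i for i, v in enumerate(nodes)}
    n, m = len(nodes), len(pairs)
    A = [[0] * n for _ in range(n)]
    degree = {v: 0 for v in nodes}
    for s, o in pairs:
        A[index[s]][index[o]] = 1  # edge s -> o iff a historical relationship exists
        degree[s] += 1
        degree[o] += 1
    A_edge = [[0.0] * m for _ in range(m)]
    for i, (s1, o1) in enumerate(pairs):
        for j, (s2, o2) in enumerate(pairs):
            if i == j:
                continue
            shared = {s1, o1} & {s2, o2}
            if shared:
                # Assumed weight: inverse degree of the shared vertex, so a
                # higher-degree shared vertex yields a smaller influence.
                A_edge[i][j] = 1.0 / degree[shared.pop()]
    return A, A_edge

pairs = [("A", "B"), ("B", "C"), ("A", "C")]
A, A_edge = build_graphs(pairs)
```

In this toy graph every vertex has degree 2, so every pair of connected edges in the edge graph gets weight 0.5.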
Temporal knowledge graph representation learning phase
The specific flow of the tense knowledge graph representation learning stage is as follows:
and 3-1, sampling the training data set for Z times according to a fixed size, sampling one subgraph every time, wherein the subgraph comprises P nodes representing entities and all edges representing historical relations among the nodes, and executing the steps 3-2 to 3-5 for each subgraph.
Step 3-2: for each historical relationship h in the subgraph, model the multi-range time dependencies with the historical relationship encoder based on the temporal self-attention mechanism to obtain the historical relationship representation h.
The self-attention mechanism can model the context of each part of a sequence by assigning weights to the different parts. As shown in FIG. 2, the historical relationship encoder based on the temporal self-attention mechanism uses self-attention as its building block; it consists of a time-based intra-block self-attention mechanism and a time-based inter-block self-attention mechanism, and can model local and global time dependencies simultaneously.
In this step, the historical relationship h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)} is split into M blocks, formally h = [z_1, z_2, ..., z_M]. Each block contains N relationships; for example, the first block is z_1 = {(r_1, t_1), ..., (r_{N-1}, t_{N-1}), (r_N, t_N)}. The time-based intra-block self-attention mechanism assigns a weight a_i^intra to each relationship within a block. Taking the first block as an example:
e_i^intra = v_intra^T σ(W_intra [r_i ; p_i] + b_intra)
a_i^intra = exp(e_i^intra) / Σ_{n=1}^{N} exp(e_n^intra)
where W_intra and v_intra are learnable parameters, σ(·) is the activation function, b_intra is the bias, p_i = {(t_1 - t_i), ..., (t_{N-1} - t_i), (t_N - t_i)} is the vector of relative times between each relationship in the first block and relationship r_i, and r_i is the representation of relationship r_i, obtained by random initialization. The representation z_k of each block is obtained by the weighted sum of the relationships in the block; for the first block:
z_1 = Σ_{i=1}^{N} a_i^intra r_i
The time-based inter-block self-attention mechanism assigns a weight a_k^inter to each block:
e_k^inter = v_inter^T σ(W_inter [z_k ; q_k] + b_inter)
a_k^inter = exp(e_k^inter) / Σ_{m=1}^{M} exp(e_m^inter)
where W_inter and v_inter are learnable parameters, σ(·) is the activation function, b_inter is the bias, and q_k is the vector of relative times between the first relationship of each block and the first relationship of block z_k, with t_1^(k) the timestamp of the first relationship in block z_k and t_1^(M) the timestamp of the first relationship in block z_M. The representation h of each historical relationship is obtained by the weighted sum of the block representations:
h = Σ_{k=1}^{M} a_k^inter z_k
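The block structure of the encoder can be sketched as follows. This is a deliberately simplified stand-in: the learnable parameters (W_intra, W_inter, the biases, and trained relation embeddings) are replaced by a fixed time-decay attention score, so only the split-into-blocks, intra-block softmax, and inter-block softmax structure is illustrated:

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def encode_history(history, block_size, decay=0.1):
    """history: [(relation_vector, timestamp), ...] in time order.
    Splits the sequence into blocks of `block_size` relations, applies
    intra-block attention (scores from relative time to the block's first
    relation), then inter-block attention (scores from relative time between
    block-leading relations), and returns the weighted-sum representation."""
    blocks = [history[i:i + block_size] for i in range(0, len(history), block_size)]
    block_reprs, block_times = [], []
    for block in blocks:
        t_first = block[0][1]
        # Intra-block attention: fixed decaying score in place of the learned one.
        w = softmax([-decay * abs(t - t_first) for _, t in block])
        dim = len(block[0][0])
        block_reprs.append([sum(w[i] * block[i][0][d] for i in range(len(block)))
                            for d in range(dim)])
        block_times.append(t_first)
    # Inter-block attention over the first timestamp of each block.
    w = softmax([-decay * abs(t - block_times[-1]) for t in block_times])
    dim = len(block_reprs[0])
    return [sum(w[k] * block_reprs[k][d] for k in range(len(blocks))) for d in range(dim)]
```

With identical relation vectors inside a block, the intra-block weighted sum leaves them unchanged; across blocks, the inter-block weights favor blocks closer in time to the most recent one.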
step 3-3, representing all historical relations by the set of h, the adjacent matrixes A and A of the original graph and the edge graphedgeThe two-graph convolutional network is input together, and the representation of each node (namely the representation of the entity) is obtained by modeling the interaction between the entities and the historical relationship.
The graph convolution network (GCN) is a deep neural network that performs convolution operations on a graph; it can model message passing between nodes and is widely used in fields such as traffic prediction and image classification. The graph convolution used here is:
gc(X, A) = ReLU(L̂ X Θ)
where gc(·) denotes the graph convolution operation, X is the input node representation, L̂ = D^{-1/2}(A + I)D^{-1/2} is the normalized Laplacian matrix of the graph, I is the identity matrix, D is the diagonal degree matrix of the nodes with D_ii = Σ_j A_ij, ReLU is taken as the activation function, and Θ is the parameter of the graph convolution network. One graph convolution layer aggregates, for each node, the messages of its 1-hop neighbors; stacking multiple layers enlarges the neighborhood over which messages are passed. As shown in FIG. 3, the double-graph convolution network uses graph convolution as its building block and comprises k layers of original-graph convolution and k-1 layers of edge-graph convolution, so that the interactions between entities and between historical relationships can be modeled simultaneously.
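A single layer gc(X, A) = ReLU(L̂ X Θ) can be sketched with plain Python lists (no deep-learning framework assumed; Θ would normally be learned). Note the sketch computes degrees from A + I to avoid division by zero, a common convention:

```python
def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def gcn_layer(X, A, Theta):
    """One graph convolution: ReLU(L_hat @ X @ Theta), where
    L_hat = D^{-1/2} (A + I) D^{-1/2} and D is the diagonal degree matrix."""
    n = len(A)
    A_hat = [[A[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    d = [sum(row) for row in A_hat]  # degrees including the added self-loop
    L_hat = [[A_hat[i][j] / ((d[i] * d[j]) ** 0.5) for j in range(n)] for i in range(n)]
    H = matmul(matmul(L_hat, X), Theta)
    return [[max(0.0, value) for value in row] for row in H]  # ReLU

X = [[1.0], [1.0]]    # two nodes, 1-dimensional features
A = [[0, 1], [1, 0]]  # a single undirected edge
Theta = [[1.0]]       # identity-like parameter for illustration
```

Stacking calls to `gcn_layer` (with different `Theta` per layer) enlarges each node's receptive field by one hop per layer, as described above.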
In this step, a node-edge indication matrix M_ne is first constructed to express the correspondence between the nodes and edges of the original graph when the original graph and the edge graph are fused. The node-edge indication matrix M_ne is computed as:
(M_ne)_{v,e} = 1 if node v is an endpoint of edge e, and 0 otherwise.
Then the set of historical relationship representations h, the adjacency matrix A of the original graph, the adjacency matrix A_edge of the edge graph, and the node-edge indication matrix M_ne are input into the double-graph convolution network to obtain the node representations (i.e., the entity representations).
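The indication matrix can be sketched directly from its definition; node and edge labels are illustrative:

```python
def node_edge_matrix(nodes, edges):
    """M_ne[v][e] = 1 when node v is an endpoint of original-graph edge e, else 0."""
    index = {v: i for i, v in enumerate(nodes)}
    M = [[0] * len(edges) for _ in nodes]
    for e, (s, o) in enumerate(edges):
        M[index[s]][e] = 1
        M[index[o]][e] = 1
    return M

M_ne = node_edge_matrix(["A", "B", "C"], [("A", "B"), ("B", "C")])
```

Multiplying M_ne by an edge-representation matrix Y aggregates, for each node, the representations of its incident edges, which is how edge-graph information flows into the original-graph convolution.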
In the original-graph convolution network, the input of layer k (except for the first layer) is the output X^(k-1) of layer k-1 of the original-graph convolution network together with the output Y^(k-1) of layer k-1 of the edge-graph convolution network:
H^(k) = L̂ X^(k-1) Θ_node^(k-1)
X^(k) = σ([H^(k), M_ne Y^(k-1)] Θ_en^(k-1))
where X^(0) is the initial representation of the original-graph nodes, obtained by random initialization; L̂ is the normalized Laplacian matrix of the original graph; Θ_node^(k-1) is the parameter of the layer-(k-1) original-graph convolution; Θ_en^(k-1) is the parameter of the layer-(k-1) edge-graph-to-original-graph fusion convolution; [·, ·] denotes the splicing (concatenation) operation; and σ(·) denotes activation of the spliced result.
in the edge graph convolution network, the input of the k-1 layer is the output Y of the edge graph convolution network of the k-2 layer(k-2)And the output X of the original convolutional network of the k-1 layer(k-1)Except for the first layer. The calculation formula is as follows:
Figure BDA0002345338270000084
Figure BDA0002345338270000085
wherein, Y(0)For an initial representation of the nodes of the edge graph,
Figure BDA0002345338270000086
normalized Laplace matrix, Y, for edge maps(0)Is obtained by linear conversion of the historical relationship expression hedge (k-2)Is a parameter of the k-2 layer edge map convolution, thetaen (k-1)The parameters of the k-1 layer original image-edge image fusion convolution are shown.
Step 3-4: based on the entity representations output in step 3-3, use the semantic matching model to predict the relationships that may exist between entity pairs over a future period.
The semantic matching model adopts DistMult, a knowledge graph representation learning model that measures the degree of matching between an entity pair and each relationship with a bilinear function. That is, the matching score is obtained with the DistMult model, whose score function is defined as:
f(s, r, o) = s^T M_r o    (16)
where s and o denote the representations of the head and tail entities, obtained from the output of the double-graph convolution network; M_r = diag(m_r) is the diagonal matrix corresponding to each relationship, where diag(·) converts the vector m_r into the diagonal matrix M_r, m_r = M h_r, h_r is the historical relationship representation of the entity pair (s, o), and M is a learnable weight matrix. Semantic matching with the DistMult model predicts the relationships r ∈ R that may arise between entity pairs over a future period.
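Since M_r is diagonal, the DistMult score reduces to an element-wise product of the three vectors summed over dimensions, which can be sketched as (values illustrative):

```python
def distmult_score(s, m_r, o):
    """f(s, r, o) = s^T diag(m_r) o = sum over d of s[d] * m_r[d] * o[d]."""
    return sum(sd * md * od for sd, md, od in zip(s, m_r, o))

score = distmult_score([1.0, 2.0], [0.5, 1.0], [2.0, 1.0])
```

Higher scores indicate a better match between the entity pair and the relationship; in the method above, m_r is produced from the historical relationship representation of the pair.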
and 3-5, generating negative samples for all training samples in the subgraph by a method of randomly replacing head and tail entities, calculating the prediction loss of all samples in the subgraph, and then adjusting network parameters of a historical relationship encoder, a dual-graph convolution network and a semantic matching model based on a time self-attention mechanism according to a loss function.
In this step, the cross-entropy loss over all samples in the subgraph is computed from the prediction results:
L = - Σ_{(s,r,o)∈D} [ y · log sig(f(s, r, o)) + (1 - y) · log(1 - sig(f(s, r, o))) ]
where D is the set of positive and negative samples; for each positive sample (s, r, o), negative samples are obtained by randomly replacing s and o; sig(·) is the sigmoid function; and y ∈ {0, 1}, with y = 1 for positive samples and y = 0 for negative samples.
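The loss and the head/tail corruption used for negative sampling can be sketched as follows; the triple format and helper names are illustrative:

```python
import math
import random

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

def cross_entropy(scored_samples):
    """L = -sum over (f, y) of [y * log sig(f) + (1 - y) * log(1 - sig(f))],
    where f is the matching score and y is 1 for positives, 0 for negatives."""
    return -sum(y * math.log(sig(f)) + (1 - y) * math.log(1.0 - sig(f))
                for f, y in scored_samples)

def corrupt(triple, entities, rng):
    """Negative sample: randomly replace either the head or the tail entity."""
    s, r, o = triple
    if rng.random() < 0.5:
        return (rng.choice([e for e in entities if e != s]), r, o)
    return (s, r, rng.choice([e for e in entities if e != o]))
```

A perfectly uninformative score of 0 contributes log 2 to the loss per sample, and corrupted triples keep the relation while swapping one entity, matching the sampling scheme described above.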
Application phase
After the network parameters of the historical relationship encoder based on the temporal self-attention mechanism, the double-graph convolution network, and the semantic matching model have been determined, i.e., once the model comprising these parameter-determined components has been built, future relationships between entities can be predicted. The specific process is as follows:
(a) preprocess the data in the temporal knowledge graph and extract events and historical relationships;
(b) construct the original graph and the edge graph according to steps 2-1 and 2-2;
(c) use the parameter-determined historical relationship encoder based on the temporal self-attention mechanism to model multi-range time dependencies over the original graph and obtain the historical relationship representations;
(d) use the parameter-determined double-graph convolution network to obtain the entity representations from the historical relationship representations, the original graph, and the edge graph;
(e) use the parameter-determined semantic matching model to predict, from the entity representations and the historical relationship representations, the relationships between entities over a future period, and construct a new temporal knowledge graph from the relationships likely to occur.
Specifically, the parameter-determined semantic matching model computes matching scores from the entity representations and the historical relationship representations and predicts the relationships likely to occur over a future period according to those scores; that is, the relationships with the larger matching scores may be selected as those likely to occur, and a new temporal knowledge graph may be constructed from them.
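The selection step can be sketched as a simple ranking; the score values and threshold are illustrative:

```python
def predict_relations(scores, threshold=0.0, top_k=None):
    """Keep the relations whose matching score exceeds `threshold`,
    ordered from highest to lowest score; `top_k` optionally truncates."""
    ranked = sorted((r for r, f in scores.items() if f > threshold),
                    key=lambda r: scores[r], reverse=True)
    return ranked[:top_k] if top_k is not None else ranked

scores = {"consult": 2.3, "accuse": -0.4, "visit": 0.9}
likely = predict_relations(scores)
```

The surviving (entity pair, relation) tuples, stamped with the future time window, form the edges of the new temporal knowledge graph.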
This temporal knowledge graph representation method, by constructing the original graph and the edge graph and introducing the double-graph convolution network for representation learning, can exploit the correlations between entities and between historical relationships, improving model performance, i.e., the accuracy of predicting the relationships likely to occur over a future period; it has broad application prospects in fields such as international relation prediction and social network analysis.
The above-described embodiments are intended to illustrate the technical solutions and advantages of the present invention. It should be understood that they are only preferred embodiments of the present invention and are not intended to limit it; any modifications, additions, or equivalents made within the scope of the principles of the present invention shall fall within the protection scope of the present invention.

Claims (6)

1. A temporal knowledge graph representation method based on a historical relationship and a double-graph convolution network, comprising the following steps:
1) preprocess the data in the temporal knowledge graph and extract events and historical relationships;
2) construct an original graph and an edge graph to represent, respectively, the interactions between entities and between historical relationships: take entities as nodes and the historical relationships between entities as edges to construct the original graph; take the edges of the original graph as nodes and the mutual influences between historical relationships as edges to construct the edge graph;
3) use a parameter-determined historical relationship encoder based on a temporal self-attention mechanism to model multi-range time dependencies over the original graph and obtain historical relationship representations;
4) use a parameter-determined double-graph convolution network to obtain entity representations from the historical relationship representations, the original graph, and the edge graph;
5) use a parameter-determined semantic matching model to predict, from the entity representations and the historical relationship representations, the relationships between entities over a future period, and construct a new temporal knowledge graph from the relationships likely to occur.
2. The temporal knowledge graph representation method based on the historical relationship and the double-graph convolution network as claimed in claim 1, wherein in step 3), the historical relationship encoder based on the temporal self-attention mechanism uses self-attention as its building block, consists of a time-based intra-block self-attention mechanism and a time-based inter-block self-attention mechanism, and can model local and global time dependencies simultaneously;
the historical relationship h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)} is split into M blocks, formally h = [z_1, z_2, ..., z_M]; each block contains N relationships, e.g., the first block is z_1 = {(r_1, t_1), ..., (r_{N-1}, t_{N-1}), (r_N, t_N)}; the time-based intra-block self-attention mechanism assigns a weight a_i^intra to each relationship within a block; taking the first block as an example:
e_i^intra = v_intra^T σ(W_intra [r_i ; p_i] + b_intra)
a_i^intra = exp(e_i^intra) / Σ_{n=1}^{N} exp(e_n^intra)
where W_intra and v_intra are learnable parameters, σ(·) is the activation function, b_intra is the bias, p_i = {(t_1 - t_i), ..., (t_{N-1} - t_i), (t_N - t_i)} is the vector of relative times between each relationship in the first block and relationship r_i, and r_i is the representation of relationship r_i, obtained by random initialization; the representation z_k of each block is obtained by the weighted sum of the relationships in the block; for the first block:
z_1 = Σ_{i=1}^{N} a_i^intra r_i
the time-based inter-block self-attention mechanism assigns a weight a_k^inter to each block:
e_k^inter = v_inter^T σ(W_inter [z_k ; q_k] + b_inter)
a_k^inter = exp(e_k^inter) / Σ_{m=1}^{M} exp(e_m^inter)
where W_inter and v_inter are learnable parameters, σ(·) is the activation function, b_inter is the bias, and q_k is the vector of relative times between the first relationship of each block and the first relationship of block z_k, with t_1^(k) the timestamp of the first relationship in block z_k and t_1^(M) the timestamp of the first relationship in block z_M; the representation h of each historical relationship is obtained by the weighted sum of the block representations:
h = Σ_{k=1}^{M} a_k^inter z_k
3. The temporal knowledge graph representation method based on historical relationship and dual-graph convolutional network as claimed in claim 1, wherein in step 4), a node-edge indication matrix M_ne is first constructed to represent the correspondence between the nodes and the edges of the original graph when the original graph and the edge graph are merged; its entries are defined as follows:

(M_ne)_{i,j} = 1, if node v_i is an endpoint of edge e_j; 0, otherwise;

then, the historical relation representation set h, the adjacency matrix A of the original graph, the adjacency matrix A_edge of the edge graph, and the node-edge indication matrix M_ne are input into the dual-graph convolutional network to obtain the node representations;
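An illustrative construction of a node-edge indication matrix. The endpoint convention used here (`M_ne[i, j] = 1` iff node `i` is an endpoint of edge `j`) is an assumption of this sketch, since the claim describes only the role of the matrix:

```python
import numpy as np

def node_edge_indicator(num_nodes, edges):
    """Build a node-edge indication matrix M_ne of shape (num_nodes, num_edges).

    Assumed convention: M_ne[i, j] = 1 iff node i is an endpoint of
    edge j = (u, v), else 0.
    """
    M_ne = np.zeros((num_nodes, len(edges)))
    for j, (u, v) in enumerate(edges):
        M_ne[u, j] = 1.0
        M_ne[v, j] = 1.0
    return M_ne

edges = [(0, 1), (1, 2), (0, 2)]     # a triangle on 3 nodes
M_ne = node_edge_indicator(3, edges)
```

Each column then has exactly two nonzero entries, one per endpoint of the corresponding edge.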
in the original-graph convolutional network, the input of layer k is the output X^(k-1) of the layer-(k-1) original-graph convolution and the output Y^(k-1) of the layer-(k-1) edge-graph convolution, except for the first layer; the calculation formula is as follows:

X^(k) = σ([L X^(k-1) θ_node^(k-1), M_ne Y^(k-1) θ_en^(k-1)])

wherein X^(0) is the initial representation of the original-graph nodes, obtained by random initialization, L is the Laplacian matrix of the original graph, θ_node^(k-1) is the parameter of the layer-(k-1) original-graph convolution, θ_en^(k-1) is the parameter of the layer-(k-1) edge-graph-to-original-graph fusion convolution, [·,·] denotes the concatenation operation, and σ(·) denotes that the concatenation result is activated;
in the edge-graph convolutional network, the input of layer k−1 is the output Y^(k-2) of the layer-(k-2) edge-graph convolution and the output X^(k-1) of the layer-(k-1) original-graph convolution, except for the first layer; the calculation formula is as follows:

Y^(k-1) = σ([L_edge Y^(k-2) θ_edge^(k-2), M_ne^T X^(k-1) θ_en^(k-1)])

wherein Y^(0) is the initial representation of the edge-graph nodes, obtained by a linear transformation of the historical relation representations h, L_edge is the normalized Laplacian matrix of the edge graph, θ_edge^(k-2) is the parameter of the layer-(k-2) edge-graph convolution, and θ_en^(k-1) is the parameter of the layer-(k-1) original-graph-to-edge-graph fusion convolution.
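A minimal sketch of one dual-graph convolution layer (not the claimed implementation): node features are updated by concatenating an own-graph convolution with an edge-to-node fusion through M_ne, and edge features symmetrically. The symmetric-normalized propagation operator, the tanh activation, and the use of M_ne^T for the node-to-edge direction are all assumptions of this sketch:

```python
import numpy as np

def norm_prop(A):
    """Symmetric-normalised propagation operator D^{-1/2}(A + I)D^{-1/2},
    a common GCN stand-in for the Laplacian-based operator."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def dual_gcn_layer(X, Y, A, A_edge, M_ne, th_node, th_en, th_edge, th_ne):
    """One layer: each side concatenates its own-graph convolution with a
    fusion term from the other graph, then activates the result."""
    L, L_edge = norm_prop(A), norm_prop(A_edge)
    X_new = np.tanh(np.concatenate([L @ X @ th_node,
                                    M_ne @ Y @ th_en], axis=1))
    Y_new = np.tanh(np.concatenate([L_edge @ Y @ th_edge,
                                    M_ne.T @ X_new @ th_ne], axis=1))
    return X_new, Y_new

rng = np.random.default_rng(2)
n, m, d = 3, 3, 4                                     # triangle: 3 nodes, 3 edges
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)       # original graph
A_edge = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)  # edges sharing a node
M_ne = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1]], float)    # edges (0,1),(1,2),(0,2)
X = rng.standard_normal((n, d))
Y = rng.standard_normal((m, d))
theta = lambda: rng.standard_normal((d, d // 2))      # halved so the concat keeps dim d
X1, Y1 = dual_gcn_layer(X, Y, A, A_edge, M_ne, theta(), theta(), theta(), theta())
```

Halving each convolution's output width so the concatenation restores the input dimension lets layers be stacked; the claim does not fix this choice.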
4. The temporal knowledge graph representation method based on historical relationship and dual-graph convolutional network as claimed in claim 1, wherein in step 5), a parameter-determined semantic matching model is used to calculate matching scores from the entity representations and the historical relation representations, and the relations likely to occur in the future are predicted according to the matching scores.
5. The temporal knowledge graph representation method based on historical relationship and dual-graph convolutional network as claimed in claim 4, wherein the semantic matching model adopts a DistMult model, the DistMult model is used to obtain the matching score, and the DistMult score function is defined as follows:

f(s, r, o) = s^T M_r o

wherein s and o are the representations of the head entity and the tail entity, obtained from the output of the dual-graph convolutional network; M_r is the diagonal matrix corresponding to each relation, M_r = diag(m_r), where diag(·) converts the vector m_r into the diagonal matrix M_r, m_r = M h_r, h_r is the historical relation representation of the entity pair (s, o), and M is a weight matrix.
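The DistMult score with a relation-specific diagonal matrix reduces to an elementwise product, so it can be computed without materialising M_r; a short sketch (dimensions are illustrative):

```python
import numpy as np

def distmult_score(s, o, h_r, M):
    """Semantic-matching score f(s, r, o) = s^T M_r o with M_r = diag(M @ h_r).

    s, o: head/tail entity vectors (here, stand-ins for the dual-GCN output).
    h_r:  historical relation representation for the pair (s, o).
    M:    weight matrix mapping h_r to the diagonal entries m_r.
    """
    m_r = M @ h_r                 # relation-specific diagonal entries
    return float(s @ (m_r * o))   # s^T diag(m_r) o, via the elementwise product

rng = np.random.default_rng(3)
d = 5
s, o, h_r = (rng.standard_normal(d) for _ in range(3))
M = rng.standard_normal((d, d))
score = distmult_score(s, o, h_r, M)
```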
6. The temporal knowledge graph representation method based on historical relationship and dual-graph convolutional network as claimed in claim 4, wherein, during training, the cross-entropy loss over all samples in the subgraph is calculated by the following formula:

L = − Σ_{(s,r,o)∈D} [ y log sig(f(s, r, o)) + (1 − y) log(1 − sig(f(s, r, o))) ]

wherein D is the set of positive and negative samples; for a positive sample (s, r, o), negative samples are obtained by randomly replacing s and o; sig(·) is the sigmoid function; y takes a value in {0, 1}, with y = 1 for positive samples and y = 0 for negative samples;

the network parameters of the time-based self-attention historical relation encoder, the dual-graph convolutional network, and the semantic matching model are adjusted according to the loss function, and the parameters are determined.
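The training loss above is a standard binary cross-entropy over scored positive and negative samples; a compact sketch with illustrative scores:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce_loss(scores, labels):
    """Cross-entropy over positive (y = 1) and negative (y = 0) samples:
    L = -sum_i [ y_i log sig(f_i) + (1 - y_i) log(1 - sig(f_i)) ]."""
    p = sigmoid(np.asarray(scores, float))
    y = np.asarray(labels, float)
    eps = 1e-12                               # guard against log(0)
    return float(-np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

# positives scored high, negatives (corrupted s or o) scored low
loss = bce_loss([2.0, 1.5, -2.0, -1.0], [1, 1, 0, 0])
```

A model that ranks positives above negatives yields a smaller loss than one with the scores reversed, which is what gradient descent on L encourages.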
CN201911392419.5A 2019-12-30 2019-12-30 Temporal knowledge graph representation method based on historical relationship and double-graph convolution network Active CN111159425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911392419.5A CN111159425B (en) 2019-12-30 2019-12-30 Temporal knowledge graph representation method based on historical relationship and double-graph convolution network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911392419.5A CN111159425B (en) 2019-12-30 2019-12-30 Temporal knowledge graph representation method based on historical relationship and double-graph convolution network

Publications (2)

Publication Number Publication Date
CN111159425A true CN111159425A (en) 2020-05-15
CN111159425B CN111159425B (en) 2023-02-10

Family

ID=70558928

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911392419.5A Active CN111159425B (en) 2019-12-30 2019-12-30 Temporal knowledge graph representation method based on historical relationship and double-graph convolution network

Country Status (1)

Country Link
CN (1) CN111159425B (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111753054A (en) * 2020-06-22 2020-10-09 神思电子技术股份有限公司 Machine reading inference method based on graph neural network
CN111812450A (en) * 2020-06-01 2020-10-23 复旦大学 Method for identifying dangerous faults of power grid
CN111949892A (en) * 2020-08-10 2020-11-17 浙江大学 Multi-relation perception temporal interaction network prediction method
CN112085293A (en) * 2020-09-18 2020-12-15 支付宝(杭州)信息技术有限公司 Method and device for training interactive prediction model and predicting interactive object
CN112364108A (en) * 2020-11-13 2021-02-12 四川省人工智能研究院(宜宾) Time sequence knowledge graph completion method based on space-time architecture
CN112527915A (en) * 2020-11-17 2021-03-19 北京科技大学 Linear cultural heritage knowledge graph construction method, system, computing device and medium
CN112529071A (en) * 2020-12-08 2021-03-19 广州大学华软软件学院 Text classification method, system, computer equipment and storage medium
CN112668700A (en) * 2020-12-30 2021-04-16 广州大学华软软件学院 Width map convolutional network model based on grouping attention and training method thereof
CN112711664A (en) * 2020-12-31 2021-04-27 山西三友和智慧信息技术股份有限公司 Text emotion classification method based on TCN + LSTM
CN112905738A (en) * 2021-02-05 2021-06-04 中山大学 Social relationship evolution prediction method based on temporal knowledge graph reasoning
CN112925953A (en) * 2021-03-09 2021-06-08 南京航空航天大学 Dynamic network representation method and system
CN112989060A (en) * 2020-11-24 2021-06-18 杭州电子科技大学 GCN-based major event trend prediction method
CN113033209A (en) * 2021-05-25 2021-06-25 腾讯科技(深圳)有限公司 Text relation extraction method and device, storage medium and computer equipment
CN113051408A (en) * 2021-03-30 2021-06-29 电子科技大学 Sparse knowledge graph reasoning method based on information enhancement
CN113204647A (en) * 2021-04-29 2021-08-03 哈尔滨工程大学 Joint weight-based encoding and decoding framework knowledge graph embedding method
CN113377968A (en) * 2021-08-16 2021-09-10 南昌航空大学 Knowledge graph link prediction method adopting fused entity context
CN113392229A (en) * 2021-08-13 2021-09-14 四川新龟科技有限公司 Supply chain relation construction and prediction method, device, equipment and storage medium
CN113627676A (en) * 2021-08-18 2021-11-09 湘潭大学 Traffic prediction method and system based on multi-attention causal relationship
CN113641829A (en) * 2021-07-13 2021-11-12 北京百度网讯科技有限公司 Method and device for training neural network of graph and complementing knowledge graph
CN113688253A (en) * 2021-08-12 2021-11-23 浙江大学 Hierarchical perception temporal knowledge map representation learning method
CN113722510A (en) * 2021-09-13 2021-11-30 中国人民解放军国防科技大学 Knowledge graph complex problem generation method and system based on graph neural network
CN113742491A (en) * 2021-08-12 2021-12-03 上海熙业信息科技有限公司 Representation learning-based time knowledge graph reasoning method
CN113761337A (en) * 2020-12-31 2021-12-07 国家计算机网络与信息安全管理中心 Event prediction method and device based on implicit elements and explicit relations of events
CN113849163A (en) * 2021-10-09 2021-12-28 中国科学院软件研究所 API (application program interface) document map-based operating system intelligent programming method and device
CN114186069A (en) * 2021-11-29 2022-03-15 江苏大学 Deep video understanding knowledge graph construction method based on multi-mode heteromorphic graph attention network
CN114745183A (en) * 2022-04-14 2022-07-12 浙江网商银行股份有限公司 Alarm method and device
WO2023039901A1 (en) * 2021-09-18 2023-03-23 京东方科技集团股份有限公司 Text recommendation method and apparatus, model training method and apparatus, and readable storage medium
CN116340524A (en) * 2022-11-11 2023-06-27 华东师范大学 Method for supplementing small sample temporal knowledge graph based on relational adaptive network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06161854A (en) * 1992-07-16 1994-06-10 Internatl Business Mach Corp <Ibm> Source of relationship and solution of target in version-finished database management system
US20140149418A1 (en) * 2012-11-28 2014-05-29 Share This Inc. Method and system for measuring social influence and receptivity of users
CN109213872A (en) * 2018-09-11 2019-01-15 中国电子科技集团公司第二十八研究所 Knowledge based indicates the entity relationship prediction technique and forecasting system of study
CN109902183A (en) * 2019-02-13 2019-06-18 北京航空航天大学 A kind of knowledge mapping embedding grammar based on various figure attention mechanism


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MARTIN ATZMUELLER: "Onto Model-based Anomalous Link Pattern Mining", 《ACM ISBN 978-1-4503-6675-5/19/05》 *

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111812450A (en) * 2020-06-01 2020-10-23 复旦大学 Method for identifying dangerous faults of power grid
CN111812450B (en) * 2020-06-01 2022-03-18 复旦大学 Method for identifying dangerous faults of power grid
CN111753054A (en) * 2020-06-22 2020-10-09 神思电子技术股份有限公司 Machine reading inference method based on graph neural network
CN111753054B (en) * 2020-06-22 2023-02-03 神思电子技术股份有限公司 Machine reading inference method based on graph neural network
CN111949892A (en) * 2020-08-10 2020-11-17 浙江大学 Multi-relation perception temporal interaction network prediction method
CN111949892B (en) * 2020-08-10 2022-04-05 浙江大学 Multi-relation perception temporal interaction network prediction method
CN112085293A (en) * 2020-09-18 2020-12-15 支付宝(杭州)信息技术有限公司 Method and device for training interactive prediction model and predicting interactive object
CN112085293B (en) * 2020-09-18 2022-09-09 支付宝(杭州)信息技术有限公司 Method and device for training interactive prediction model and predicting interactive object
CN112364108A (en) * 2020-11-13 2021-02-12 四川省人工智能研究院(宜宾) Time sequence knowledge graph completion method based on space-time architecture
CN112527915A (en) * 2020-11-17 2021-03-19 北京科技大学 Linear cultural heritage knowledge graph construction method, system, computing device and medium
CN112989060A (en) * 2020-11-24 2021-06-18 杭州电子科技大学 GCN-based major event trend prediction method
CN112989060B (en) * 2020-11-24 2022-04-15 杭州电子科技大学 GCN-based major event trend prediction method
CN112529071B (en) * 2020-12-08 2023-10-17 广州大学华软软件学院 Text classification method, system, computer equipment and storage medium
CN112529071A (en) * 2020-12-08 2021-03-19 广州大学华软软件学院 Text classification method, system, computer equipment and storage medium
CN112668700A (en) * 2020-12-30 2021-04-16 广州大学华软软件学院 Width map convolutional network model based on grouping attention and training method thereof
CN112668700B (en) * 2020-12-30 2023-11-28 广州大学华软软件学院 Width graph convolution network model system based on grouping attention and training method
CN113761337B (en) * 2020-12-31 2023-10-27 国家计算机网络与信息安全管理中心 Event prediction method and device based on implicit event element and explicit connection
CN112711664A (en) * 2020-12-31 2021-04-27 山西三友和智慧信息技术股份有限公司 Text emotion classification method based on TCN + LSTM
CN112711664B (en) * 2020-12-31 2022-09-20 山西三友和智慧信息技术股份有限公司 Text emotion classification method based on TCN + LSTM
CN113761337A (en) * 2020-12-31 2021-12-07 国家计算机网络与信息安全管理中心 Event prediction method and device based on implicit elements and explicit relations of events
CN112905738A (en) * 2021-02-05 2021-06-04 中山大学 Social relationship evolution prediction method based on temporal knowledge graph reasoning
CN112905738B (en) * 2021-02-05 2022-04-22 中山大学 Social relationship evolution prediction method based on temporal knowledge graph reasoning
CN112925953A (en) * 2021-03-09 2021-06-08 南京航空航天大学 Dynamic network representation method and system
CN112925953B (en) * 2021-03-09 2024-02-20 南京航空航天大学 Dynamic network representation method and system
CN113051408B (en) * 2021-03-30 2023-02-14 电子科技大学 Sparse knowledge graph reasoning method based on information enhancement
CN113051408A (en) * 2021-03-30 2021-06-29 电子科技大学 Sparse knowledge graph reasoning method based on information enhancement
CN113204647A (en) * 2021-04-29 2021-08-03 哈尔滨工程大学 Joint weight-based encoding and decoding framework knowledge graph embedding method
CN113204647B (en) * 2021-04-29 2023-01-03 哈尔滨工程大学 Joint weight-based encoding and decoding framework knowledge graph embedding method
CN113033209A (en) * 2021-05-25 2021-06-25 腾讯科技(深圳)有限公司 Text relation extraction method and device, storage medium and computer equipment
CN113641829A (en) * 2021-07-13 2021-11-12 北京百度网讯科技有限公司 Method and device for training neural network of graph and complementing knowledge graph
CN113641829B (en) * 2021-07-13 2023-11-24 北京百度网讯科技有限公司 Training and knowledge graph completion method and device for graph neural network
CN113688253A (en) * 2021-08-12 2021-11-23 浙江大学 Hierarchical perception temporal knowledge map representation learning method
CN113742491A (en) * 2021-08-12 2021-12-03 上海熙业信息科技有限公司 Representation learning-based time knowledge graph reasoning method
CN113392229A (en) * 2021-08-13 2021-09-14 四川新龟科技有限公司 Supply chain relation construction and prediction method, device, equipment and storage medium
CN113377968A (en) * 2021-08-16 2021-09-10 南昌航空大学 Knowledge graph link prediction method adopting fused entity context
CN113627676A (en) * 2021-08-18 2021-11-09 湘潭大学 Traffic prediction method and system based on multi-attention causal relationship
CN113627676B (en) * 2021-08-18 2023-09-01 湘潭大学 Traffic prediction method and system based on multi-attention causal relationship
CN113722510B (en) * 2021-09-13 2024-04-05 中国人民解放军国防科技大学 Knowledge graph complex problem generation method and system based on graph neural network
CN113722510A (en) * 2021-09-13 2021-11-30 中国人民解放军国防科技大学 Knowledge graph complex problem generation method and system based on graph neural network
WO2023039901A1 (en) * 2021-09-18 2023-03-23 京东方科技集团股份有限公司 Text recommendation method and apparatus, model training method and apparatus, and readable storage medium
CN113849163B (en) * 2021-10-09 2022-10-14 中国科学院软件研究所 API (application program interface) document map-based operating system intelligent programming method and device
CN113849163A (en) * 2021-10-09 2021-12-28 中国科学院软件研究所 API (application program interface) document map-based operating system intelligent programming method and device
CN114186069B (en) * 2021-11-29 2023-09-29 江苏大学 Depth video understanding knowledge graph construction method based on multi-mode different-composition attention network
CN114186069A (en) * 2021-11-29 2022-03-15 江苏大学 Deep video understanding knowledge graph construction method based on multi-mode heteromorphic graph attention network
CN114745183B (en) * 2022-04-14 2023-10-27 浙江网商银行股份有限公司 Alarm method and device
CN114745183A (en) * 2022-04-14 2022-07-12 浙江网商银行股份有限公司 Alarm method and device
CN116340524A (en) * 2022-11-11 2023-06-27 华东师范大学 Method for supplementing small sample temporal knowledge graph based on relational adaptive network
CN116340524B (en) * 2022-11-11 2024-03-08 华东师范大学 Method for supplementing small sample temporal knowledge graph based on relational adaptive network

Also Published As

Publication number Publication date
CN111159425B (en) 2023-02-10

Similar Documents

Publication Publication Date Title
CN111159425B (en) Temporal knowledge graph representation method based on historical relationship and double-graph convolution network
CN110263280B (en) Multi-view-based dynamic link prediction depth model and application
CN109887282A (en) A kind of road network traffic flow prediction technique based on level timing diagram convolutional network
CN114265986B (en) Information pushing method and system fusing knowledge graph structure and path semantics
CN113868474A (en) Information cascade prediction method based on self-attention mechanism and dynamic graph
CN113780002A (en) Knowledge reasoning method and device based on graph representation learning and deep reinforcement learning
CN113688253A (en) Hierarchical perception temporal knowledge map representation learning method
CN114039871B (en) Method, system, device and medium for cellular traffic prediction
CN114723037A (en) Heterogeneous graph neural network computing method for aggregating high-order neighbor nodes
CN115051929A (en) Network fault prediction method and device based on self-supervision target perception neural network
Ke et al. AutoSTG+: An automatic framework to discover the optimal network for spatio-temporal graph prediction
CN117236492B (en) Traffic demand prediction method based on dynamic multi-scale graph learning
CN113537613B (en) Temporal network prediction method for die body perception
CN116110232A (en) Traffic flow prediction method based on hierarchical dynamic residual map convolution network
CN115545833A (en) Recommendation method and system based on user social information
CN115526293A (en) Knowledge graph reasoning method considering semantic and structural information
CN114254738A (en) Double-layer evolvable dynamic graph convolution neural network model construction method and application
CN115063251A (en) Social communication propagation dynamic network representation method based on relationship strength and feedback mechanism
CN114153996A (en) Multi-map attention cooperative geoscience knowledge map updating method and device
CN111275564A (en) Method and system for detecting community number of microblog network
Cheng et al. Network SpaceTime AI: Concepts, Methods and Applications.
CN113159409B (en) National city air quality prediction method based on group perception map neural network
CN117133116B (en) Traffic flow prediction method and system based on space-time correlation network
CN116501924B (en) Graph link prediction method based on robust enhancement loss function
CN112925953B (en) Dynamic network representation method and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant