CN113987203A - Knowledge graph reasoning method and system based on affine transformation and bias modeling - Google Patents

Knowledge graph reasoning method and system based on affine transformation and bias modeling Download PDF

Info

Publication number
CN113987203A
CN113987203A CN202111251386.XA CN202111251386A CN113987203A CN 113987203 A CN113987203 A CN 113987203A CN 202111251386 A CN202111251386 A CN 202111251386A CN 113987203 A CN113987203 A CN 113987203A
Authority
CN
China
Prior art keywords
entity
knowledge graph
relation
embedded
bias
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111251386.XA
Other languages
Chinese (zh)
Inventor
宋勃升
苏祥
曾湘祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN202111251386.XA priority Critical patent/CN113987203A/en
Publication of CN113987203A publication Critical patent/CN113987203A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Machine Translation (AREA)

Abstract

The invention discloses a knowledge graph reasoning method and a knowledge graph reasoning system based on affine transformation and bias modeling, which are used for acquiring problems (including an entity e and a relation r) from a user and inputting a trained knowledge graph to be embedded into a model to obtain a prediction result. The invention provides a concrete representation of entities and relations, maps the entities to a real-domain space, and learns a bias for each entity to solve entity uncertainty. And expressing the relation as two vectors, and carrying out affine transformation on the head entity after the uncertainty of the entity is solved. The invention carries out modeling representation on the knowledge graph by introducing bias and affine transformation, can solve the problem that the current embedded model cannot effectively learn and reason various relation modes in the knowledge graph, and thus can effectively improve the representation capability of entities and relations and the accuracy of knowledge graph link prediction.

Description

Knowledge graph reasoning method and system based on affine transformation and bias modeling
Technical Field
The invention belongs to the field of knowledge graph embedding, and particularly relates to a knowledge graph reasoning method and system based on affine transformation and bias modeling.
Background
Knowledge Graph (KG) is a multi-relationship directed Graph composed of fact triples. In a triplet (h, r, t), we denote the head and tail entities by h and t, respectively, and r is the relationship between the head and tail entities. Thus, in a multi-relational directed graph, head entities point to tail entities through relationships. In recent years, researchers have constructed a number of large-scale knowledge maps, such as WordNet, YAGO and Freebase, which capture knowledge in various fields. After pre-processing, they can be well used for various downstream tasks of Natural Language Processing (NLP), including question answering, recommendation and dialog systems, and so on. However, due to the complexity of reality, Knowledge-graphs still lack a large number of fact triplets, and the task of inferring missing links between entities from known fact triplets is called link prediction, which has been widely used in Knowledge-Graph Completion (KGC).
One effective method for link prediction is Knowledge Graph Embedding (KGE). The key idea is to map the entities and relations in KG to a continuous vector space to simplify the operation while preserving the inherent semantics and structure of the knowledge-graph. The existing knowledge graph embedding method comprises a shallow semantic model method and a deep learning model method; aiming at the mainstream method based on the shallow semantic model, after an entity and a vector are initialized, various relation modes are fully fitted by designing an effective scoring function, and the low complexity and the better performance can be obtained; for the mainstream method based on deep learning, by introducing excellent models such as a convolutional neural network, an attention mechanism and a graph neural network, richer features between entities and relations are captured, and then an effective scoring function is used for measurement.
However, the above-mentioned method of knowledge-graph embedding still has some non-negligible drawbacks: although the existing shallow semantic models begin to model various relation modes implicitly or explicitly, the existing shallow semantic models have limitations, and the existing relation models cannot be modeled, so that the precision is low; the deep learning-based method requires a large number of parameters and nonlinear operations, is too complex, and the final scoring still depends on a scoring function designed in a shallow semantic model.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a knowledge graph reasoning method and a knowledge graph reasoning system based on affine transformation and bias modeling, and aims to solve the technical problems that the precision is low due to the fact that the existing shallow semantic model cannot be modeled, and the complexity is overlarge due to the fact that the existing deep learning-based method needs a large number of parameters and nonlinear operation, and the final scoring still depends on a scoring function designed in the shallow semantic model.
To achieve the above object, according to one aspect of the present invention, there is provided a knowledge graph inference method based on affine transformation and bias modeling, comprising the steps of:
(1) acquiring a head entity h and a relation r;
(2) inputting the head entity h and the relation r obtained in the step (1) into a trained knowledge graph to be embedded into a model so as to obtain a corresponding score list;
(3) and (3) after all the scores in the score list obtained in the step (2) are sorted from high to low, taking out a first tail entity in all tail entities, and forming a triple together with the head entity h and the relation r obtained in the step (1) to serve as a knowledge graph reasoning result.
Preferably, the knowledge-graph embedding model is obtained by training through the following method:
(2-1) acquiring a knowledge graph data set, and randomly dividing the knowledge graph data set into a training set, a verification set and a test set, wherein each knowledge graph data is composed of a triple (h, R, t) composed of a head entity h, a relation R and a tail entity t, all entities form an entity set E, and all relations form a relation set R;
(2-2) randomly initializing the entity set E and the relation set R in the training set obtained in the step (2-1) to obtain an embedded representation of each entity and an embedded representation of each relation;
and (2-3) inputting the embedded representations of all the entities and the embedded representations of the relations in the training set obtained in the step (2-2) into a knowledge graph embedding model, and iteratively training the embedded representations of the entities and the embedded representations of the relations in the knowledge graph embedding model by using a self-confrontation negative sampling-based method until the knowledge graph embedding model converges, thereby obtaining the trained knowledge graph embedding model.
And (2-4) verifying the preliminarily trained knowledge graph embedded model according to the knowledge graph verification set obtained in the step (2-1) until the obtained minimum ranking target based on self-confrontation negative sampling reaches the optimum, so that the trained knowledge graph embedded model is obtained.
Preferably, the knowledge-graph dataset is a public knowledge-graph dataset FB15k-237, WN18RR, or YAGO 3-10.
Preferably, in the step (2-2), the size of the embedded representation of each entity and each relation is 2d, the value range of d is 1-1000, and each embedded representation meets the requirement of uniform distribution; wherein the front d dimension embedding of an entity is represented as e (h or t), and the back d dimension embedding is represented as the bias b of the entitye(bhOr bt) (ii) a And the front d-dimensional embedded representation of the relationship is the weight w of the relationshiprThe d-dimension-after embedded representation is the bias b of the relationshipr
Preferably, the step (2-3) is specifically:
firstly, constructing a scoring function according to the embedded representation of all the entities in the training set and the embedding of the relationship obtained in the step (2-2):
Figure BDA0003322625680000031
where h represents the embedded representation of the head entity h, t represents the embedded representation of the tail entity t, bhDenotes the offset of the head entity h, btRepresenting the offset, w, of the tail entity trAn embedded representation of the weights representing the relation r, brAn embedded representation of the bias representing the relationship r,
Figure BDA0003322625680000032
representing a regularization operation, drAnd (h, t) represents the corresponding fraction of the triple consisting of the head entity h, the relation r and the tail entity t.
And (3) then, constructing a negative triple set T' according to the triple set T formed by all the triples in the training set obtained in the step (2-1):
Figure BDA0003322625680000041
h' is an entity randomly selected from the entity set E as a head entity, so that the triplet formed by the head entity, r and T does not belong to the triplet set T; t' is an entity randomly selected from E as the tail entity so that its triplet with h, r does not belong to the triplet set T.
And then, minimizing a loss function L based on self-confrontation negative sampling according to the negative triple set T' and the triple set T in the training set, thereby obtaining a preliminarily trained knowledge graph embedding model.
Preferably, the loss function L is embodied as:
Figure BDA0003322625680000042
wherein | · | purple2Represents L2Regularization []Representing the splicing operation between vectors, lambda is a penalty factor of a regular term, the value range of lambda is 0 to 0.1, gamma is a boundary value, the value range of gamma is 1 to 24, E represents the embedded representation of all entities in an entity set E, beEmbedded representation, p (h), representing the bias of all entities in entity set Ej′,r,tj') denotes the jth (where j ∈ [1, num)]Num is the total number of negative triples in the negative triplet set T').
Preferably, the sampling weight p (h) of the jth negative tripletj′,r,tj') is as follows:
Figure BDA0003322625680000043
where α represents the self-antagonistic sampling temperature, when α is 0, the self-antagonistic sampling degenerates to a uniform sampling, and when α >0, the less scored negative triplet corresponds to a larger sampling weight.
According to another aspect of the invention, a knowledge-graph inference system based on affine transformation and bias modeling is provided, comprising:
a first module for obtaining a head entity h and a relationship r;
and the second module is used for inputting the head entity h and the relation r obtained by the first module into the trained knowledge graph embedding model so as to obtain a corresponding score list.
And the third module is used for sorting all the scores in the score list obtained by the second module from high to low, then taking out the first tail entity of all the tail entities to form a triple together with the head entity h and the relation r obtained by the first module, and using the triple as a knowledge graph reasoning result.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
(1) the method adopts the steps (2) and (2-2), adopts the shallow semantic model, expands the translational attribute of additive interaction to affine transformation, and simultaneously learns a bias vector for each entity to correct the uncertainty of the entity, so that the model has less parameters and low complexity and can be better expanded to a large-scale knowledge map, thereby solving the technical problems that the complexity is overlarge due to the fact that a large number of parameters and nonlinear operations are needed in the conventional deep learning-based method and the final scoring still depends on the scoring function designed in the shallow semantic model while the performance is kept;
(2) because the steps (2) and (2-3) are adopted, an effective scoring function is designed, so that the combined mode of symmetry/antisymmetry, inversion, exchangeable/non-exchangeable can be effectively modeled, and complex relations of 1 to 1, more than 1 and more than many are modeled, thereby effectively solving the technical problem of low precision caused by the fact that the existing shallow semantic model can not model the existing relational models;
(3) the technical scheme of the invention has the advantages of less parameters, low complexity and high performance, and can fully meet the requirements of users on knowledge graph completion
Drawings
FIG. 1 is a model diagram of a knowledge graph inference method based on affine transformation and bias modeling;
FIG. 2 is a comparison of the modeling capabilities of a baseline model and the modeling capabilities of a model of the present invention;
FIG. 3 is a schematic representation of link prediction in three classical knowledge-graph datasets according to the present invention;
FIG. 4 is a comparison of a prior knowledge-graph embedding method with the present invention on three knowledge-graphs to link the prediction task;
FIG. 5 is a comparison of the complex relationship modeling capabilities of the prior art classical model and the present invention;
FIG. 6 is a flow chart of the knowledge graph inference method based on affine transformation and bias modeling.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The basic idea of the invention is to provide an efficient, simple and practical knowledge-graph embedding model. The method expands the additive interaction in the translation model to affine transformation, and simultaneously learns a bias for each entity to correct the affine transformation.
As shown in fig. 1 and fig. 6, the invention provides a knowledge graph inference method based on affine transformation and bias modeling, which specifically comprises the following steps:
(1) acquiring a head entity h and a relation r;
(2) inputting the head entity h and the relation r obtained in the step (1) into a trained knowledge graph to be embedded into a model so as to obtain a corresponding score list;
in the score list obtained by the invention, the mapping relation between different scores and corresponding tail entities is reflected.
Specifically, the knowledge graph embedding model is obtained by training through the following method:
(2-1) acquiring a knowledge graph data set, and randomly dividing the knowledge graph data set into a training set, a verification set and a test set, wherein each knowledge graph data is composed of a triple (h, R, t) composed of a head entity h, a relation R and a tail entity t, all entities form an entity set E, and all relations form a relation set R;
specifically, the knowledge-graph dataset in this step is a public knowledge-graph dataset, such as FB15k-237, WN18RR, or YAGO 3-10.
(2-2) randomly initializing the entity set E and the relation set R in the training set obtained in the step (2-1) to obtain an embedded representation of each entity and an embedded representation of each relation;
the size of each entity and each relation embedded representation is 2d, the value range of d is 1-1000, preferably 500, and each embedded representation meets the requirement of uniform distribution; wherein the front d dimension embedding of an entity is represented as e (h or t), and the back d dimension embedding is represented as the bias b of the entitye(bhOr bt) (ii) a And the front d-dimensional embedded representation of the relationship is the weight w of the relationshiprThe d-dimension-after embedded representation is the bias b of the relationshipr
The sub-step has the advantages that the relationship is initialized into two vectors, so that the change of the characteristics can be better captured by applying affine transformation, and compared with the common translation characteristic of a single relationship vector, the accuracy of model prediction is improved; in addition, the initialized entity bias is more helpful to solve the uncertainty problem of the entity, so as to correct the potential deviation of the entity.
(2-3) inputting the embedded representations of all the entities and the embedded representations of the relations in the training set obtained in the step (2-2) into a knowledge graph embedded model, and performing iterative training on the embedded representations of the entities and the embedded representations of the relations in the knowledge graph embedded model by using a method based on self-confrontation negative sampling until the knowledge graph embedded model converges, so as to obtain a trained knowledge graph embedded model;
specifically, the method of self-countervailing negative sampling can be referred to IN the article "ROTATE: KNOWLEDGE GRAPH EMBEDDING BY RELATIONAL ROTATION IN COMPLEX SPACE".
The step (2-3) is specifically as follows:
firstly, constructing a scoring function according to the embedded representation of all the entities in the training set and the embedding of the relationship obtained in the step (2-2):
Figure BDA0003322625680000071
where h represents the embedded representation of the head entity h, t represents the embedded representation of the tail entity t, bhDenotes the offset of the head entity h, btRepresenting the offset, w, of the tail entity trAn embedded representation of the weights representing the relation r, brAn embedded representation of the bias representing the relationship r,
Figure BDA0003322625680000072
representing a regularization operation, drAnd (h, t) represents the corresponding fraction of the triple consisting of the head entity h, the relation r and the tail entity t.
And (3) then, constructing a negative triple set T' according to the triple set T formed by all the triples in the training set obtained in the step (2-1):
Figure BDA0003322625680000081
h' is selected randomly from the entity set E as a head entity, so that the triple formed by the head entity, r and T does not belong to one entity of the triple set T; t' is an entity randomly selected from E as the tail entity so that its triplet with h, r does not belong to the triplet set T.
The neural network parameters of the knowledge-graph embedding model can all be learned by minimizing the loss function based on self-antagonistic negative sampling, the goal encouraging the scores of positive relationships (triplets) to be higher than the scores of any negative relationships (triplets). Triples that typically occur in a knowledge-graph are referred to as positive triples. Therefore, the set of all triples in the training set obtained in step (2-1) is regarded as a positive triplet set T, and a negative triplet set T' of negative triples is constructed by destroying the head entity or the tail entity of the positive triples.
Then, according to the negative triple set T' and the triple set T in the training set, minimizing a loss function L based on self-confrontation negative sampling, thereby obtaining a preliminarily trained knowledge graph embedding model, wherein the loss function L specifically comprises:
Figure BDA0003322625680000082
wherein | · | purple2Represents L2Regularization []The splicing operation between vectors is represented, lambda is a penalty factor of a regular term, the value range of lambda is 0 to 0.1, preferably 0.1, and the fraction of a positive triplet is recorded as dr(h, t), denote the fraction of negative triplets as dr(h ', t'), setting a boundary value gamma, and judging whether the value of the scoring function exceeds the boundary value gamma, if so, indicating that the triplet is not established, otherwise, establishing the triplet, wherein the value range of gamma is 1 to 24, and is preferably 4; e represents an embedded representation of all entities in the entity set E, beAn embedded representation representing the bias of all entities in entity set E;
p(hj′,r,tj') denotes the jth (where j ∈ [1, num)]Num is the total number of negative triples in the negative triplet set T') is calculated as follows:
Figure BDA0003322625680000091
where α represents the self-antagonistic sampling temperature, when α is 0, the self-antagonistic sampling degenerates to a uniform sampling, and when α >0, the less scored negative triplet corresponds to a larger sampling weight.
The sub-step has the advantages that a grading function with stronger expansibility is designed, modeling can be carried out on various existing relation models, and meanwhile the problems of 1 to more, more to 1 and more to more in complex relations are solved. The performance of the model is improved by the capability of modeling the relation mode, and therefore the method effectively improves the performance. Meanwhile, in order to ensure the convergence of the model, the loss function of self-confrontation negative sampling is expanded, the entity and the bias thereof are limited, and the stability of the method is effectively improved.
(2-4) verifying the preliminarily trained knowledge graph embedded model according to the knowledge graph verification set obtained in the step (2-1) until the obtained minimum ranking target based on self-confrontation negative sampling reaches the optimum, so as to obtain the trained knowledge graph embedded model;
(3) and (3) after all the scores in the score list obtained in the step (2) are sorted from high to low, taking out a first tail entity in all tail entities, and forming a triple together with the head entity h and the relation r obtained in the step (1) to serve as a knowledge graph reasoning result.
The invention carries out link prediction tasks in the three classical knowledge graph data sets shown in figure 3, namely, head entities and relations are given, and tail entities are predicted; or given a relationship and a tail entity, predicting a head entity; or the scoring function is used for scoring and ranking the test triples, and the results are evaluated;
the evaluation indexes of the invention are average ranking (MR), average to number ranking (MRR) and hit rate (Hits @ K) of the first K ranking. More specifically, MR represents the average ranking of the correct triples in all the prediction triples, MRR represents the average ranking of the correct triples in all the prediction triples after taking the reciprocal, and Hits @ K is the hit rate of the first K triples of the correct triples in all the prediction triples;
a good link prediction model should achieve lower MR and higher MRR and Hits @ K. Figure 4 shows the results of the present invention and the prior model in a filtering setting. Obviously, the model is basically superior to all existing models, and the knowledge graph can be well modeled by the method.
FIG. 5 also shows the superiority of the two classical models and the model in modeling the complex relationship of the data sets FB15k-237, the model is only slightly inferior in more than 1 pair, and the model is improved in other aspects.
Experiments on a plurality of advanced models show that the model has better performance in link prediction and can model various existing relational modes.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (8)

1. A knowledge graph reasoning method based on affine transformation and bias modeling is characterized by comprising the following steps:
(1) acquiring a head entity h and a relation r;
(2) inputting the head entity h and the relation r obtained in the step (1) into a trained knowledge graph to be embedded into a model so as to obtain a corresponding score list;
(3) and (3) after all the scores in the score list obtained in the step (2) are sorted from high to low, taking out a first tail entity in all tail entities, and forming a triple together with the head entity h and the relation r obtained in the step (1) to serve as a knowledge graph reasoning result.
2. The knowledge graph inference method based on affine transformation and bias modeling according to claim 1, wherein the knowledge graph embedding model is trained by the following method:
(2-1) acquiring a knowledge graph data set, and randomly dividing the knowledge graph data set into a training set, a verification set and a test set, wherein each knowledge graph data is composed of a triple (h, R, t) composed of a head entity h, a relation R and a tail entity t, all entities form an entity set E, and all relations form a relation set R;
(2-2) randomly initializing the entity set E and the relation set R in the training set obtained in the step (2-1) to obtain an embedded representation of each entity and an embedded representation of each relation;
and (2-3) inputting the embedded representations of all the entities and the embedded representations of the relations in the training set obtained in the step (2-2) into a knowledge graph embedding model, and iteratively training the embedded representations of the entities and the embedded representations of the relations in the knowledge graph embedding model by using a self-confrontation negative sampling-based method until the knowledge graph embedding model converges, thereby obtaining the trained knowledge graph embedding model.
And (2-4) verifying the preliminarily trained knowledge graph embedded model according to the knowledge graph verification set obtained in the step (2-1) until the obtained minimum ranking target based on self-confrontation negative sampling reaches the optimum, so that the trained knowledge graph embedded model is obtained.
3. The knowledge-graph reasoning method based on affine transformation and bias modeling according to claim 1 or 2, wherein the knowledge-graph dataset is a public knowledge-graph dataset FB15k-237, WN18RR or YAGO 3-10.
4. The knowledge graph reasoning method based on affine transformation and bias modeling according to any one of claims 1 to 3, wherein in the step (2-2), the size of each entity and each embedded representation of each relationship is 2d, the value range of d is 1-1000, and each embedded representation satisfies uniform distribution; wherein the front d dimension embedding of an entity is represented as e (h or t), and the back d dimension embedding is represented as the bias b of the entitye(bhOr bt) (ii) a And the front d-dimensional embedded representation of the relationship is the weight w of the relationshiprThe d-dimension-after embedded representation is the bias b of the relationshipr
5. The knowledge graph inference method based on affine transformation and bias modeling according to claim 1 or 2, wherein the step (2-3) is specifically:
firstly, constructing a scoring function according to the embedded representation of all the entities in the training set and the embedding of the relationship obtained in the step (2-2):
Figure FDA0003322625670000021
where h represents the embedded representation of the head entity h, t represents the embedded representation of the tail entity t, bhDenotes the offset of the head entity h, btRepresenting the offset, w, of the tail entity trAn embedded representation of the weights representing the relation r, brAn embedded representation of the bias representing the relationship r,
Figure FDA0003322625670000022
representing a regularization operation, drAnd (h, t) represents the corresponding fraction of the triple consisting of the head entity h, the relation r and the tail entity t.
And (3) then, constructing a negative triple set T' according to the triple set T formed by all the triples in the training set obtained in the step (2-1):
Figure FDA0003322625670000023
h' is an entity randomly selected from the entity set E as a head entity, so that the triplet formed by the head entity, r and T does not belong to the triplet set T; t' is an entity randomly selected from E as the tail entity so that its triplet with h, r does not belong to the triplet set T.
And then, minimizing a loss function L based on self-confrontation negative sampling according to the negative triple set T' and the triple set T in the training set, thereby obtaining a preliminarily trained knowledge graph embedding model.
6. The method for knowledge-graph inference based on affine transformation and bias modeling according to claim 5, wherein the loss function L is specifically:
Figure FDA0003322625670000031
wherein | · | purple2Represents L2Regularization []Representing the splicing operation between vectors, lambda is a penalty factor of a regular term, the value range of lambda is 0 to 0.1, gamma is a boundary value, the value range of gamma is 1 to 24, E represents the embedded representation of all entities in an entity set E, beEmbedded representation, p (h'j,r,t′j) Represents the jth (where j ∈ [1, num)]Num is the total number of negative triples in the negative triplet set T').
7. The knowledge graph reasoning method based on affine transformation and bias modeling as claimed in claim 5, wherein a sampling weight p (h ') of a jth negative triple is'j,r,t′j) The calculation formula of (a) is as follows:
Figure FDA0003322625670000032
where α represents the self-antagonistic sampling temperature, when α is 0, the self-antagonistic sampling degenerates to a uniform sampling, and when α >0, the less scored negative triplet corresponds to a larger sampling weight.
8. A knowledge graph inference system based on affine transformation and bias modeling, comprising:
a first module for obtaining a head entity h and a relationship r;
and the second module is used for inputting the head entity h and the relation r obtained by the first module into the trained knowledge graph embedding model so as to obtain a corresponding score list.
And the third module is used for sorting all the scores in the score list obtained by the second module from high to low, then taking out the first tail entity of all the tail entities to form a triple together with the head entity h and the relation r obtained by the first module, and using the triple as a knowledge graph reasoning result.
CN202111251386.XA 2021-10-27 2021-10-27 Knowledge graph reasoning method and system based on affine transformation and bias modeling Pending CN113987203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111251386.XA CN113987203A (en) 2021-10-27 2021-10-27 Knowledge graph reasoning method and system based on affine transformation and bias modeling

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111251386.XA CN113987203A (en) 2021-10-27 2021-10-27 Knowledge graph reasoning method and system based on affine transformation and bias modeling

Publications (1)

Publication Number Publication Date
CN113987203A true CN113987203A (en) 2022-01-28

Family

ID=79742045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111251386.XA Pending CN113987203A (en) 2021-10-27 2021-10-27 Knowledge graph reasoning method and system based on affine transformation and bias modeling

Country Status (1)

Country Link
CN (1) CN113987203A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116564408A (en) * 2023-04-28 2023-08-08 上海科技大学 Synthetic lethal gene pair prediction method, device, equipment and medium based on knowledge-graph reasoning
CN117493582A (en) * 2023-12-29 2024-02-02 珠海格力电器股份有限公司 Model result output method and device, electronic equipment and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116564408A (en) * 2023-04-28 2023-08-08 上海科技大学 Synthetic lethal gene pair prediction method, device, equipment and medium based on knowledge-graph reasoning
CN116564408B (en) * 2023-04-28 2024-03-01 上海科技大学 Synthetic lethal gene pair prediction method, device, equipment and medium based on knowledge-graph reasoning
CN117493582A (en) * 2023-12-29 2024-02-02 珠海格力电器股份有限公司 Model result output method and device, electronic equipment and storage medium
CN117493582B (en) * 2023-12-29 2024-04-05 珠海格力电器股份有限公司 Model result output method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
Bai et al. Stabilizing equilibrium models by jacobian regularization
CN109639710B (en) Network attack defense method based on countermeasure training
CN111753101B (en) Knowledge graph representation learning method integrating entity description and type
CN110555455A (en) Online transaction fraud detection method based on entity relationship
CN113177132B (en) Image retrieval method based on depth cross-modal hash of joint semantic matrix
CN112699247A (en) Knowledge representation learning framework based on multi-class cross entropy contrast completion coding
CN113987203A (en) Knowledge graph reasoning method and system based on affine transformation and bias modeling
CN112633403A (en) Graph neural network classification method and device based on small sample learning
CN112464004A (en) Multi-view depth generation image clustering method
CN112000689B (en) Multi-knowledge graph fusion method based on text analysis
CN110264372B (en) Topic community discovery method based on node representation
CN112580728B (en) Dynamic link prediction model robustness enhancement method based on reinforcement learning
CN108052683B (en) Knowledge graph representation learning method based on cosine measurement rule
CN114299362A (en) Small sample image classification method based on k-means clustering
CN116055175A (en) Intrusion detection method for optimizing neural network by combining symmetric uncertainty and super parameters
CN109933720B (en) Dynamic recommendation method based on user interest adaptive evolution
CN113361279B (en) Medical entity alignment method and system based on double neighborhood graph neural network
CN109472712A (en) A kind of efficient Markov random field Combo discovering method strengthened based on structure feature
CN111126758B (en) Academic team influence propagation prediction method, academic team influence propagation prediction equipment and storage medium
CN113806559A (en) Knowledge graph embedding method based on relationship path and double-layer attention
CN115860119A (en) Low-sample knowledge graph completion method and system based on dynamic meta-learning
CN116108189A (en) Quaternion-based knowledge graph embedding method and system
CN115131605A (en) Structure perception graph comparison learning method based on self-adaptive sub-graph
CN117033997A (en) Data segmentation method, device, electronic equipment and medium
CN114254738A (en) Double-layer evolvable dynamic graph convolution neural network model construction method and application

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination