CN113190691A - Link prediction method and system of knowledge graph - Google Patents

Link prediction method and system of knowledge graph Download PDF

Info

Publication number
CN113190691A
Authority
CN
China
Prior art keywords: vectors, entity, embedding, vector, knowledge graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110593313.2A
Other languages
Chinese (zh)
Other versions
CN113190691B (en
Inventor
李爱民
李稼川
刘腾
刘笑含
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qilu University of Technology
Original Assignee
Qilu University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qilu University of Technology filed Critical Qilu University of Technology
Priority to CN202110593313.2A priority Critical patent/CN113190691B/en
Publication of CN113190691A publication Critical patent/CN113190691A/en
Application granted granted Critical
Publication of CN113190691B publication Critical patent/CN113190691B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods


Abstract

The invention belongs to the field of knowledge-graph link prediction and provides a link prediction method and system for a knowledge graph. The method comprises the steps of: obtaining entity vectors and relation embedding vectors in a knowledge graph; generating random permutations of the entity vectors and relation embedding vectors; reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form; batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix; and convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, thereby predicting links of the knowledge graph.

Description

Link prediction method and system of knowledge graph
Technical Field
The invention belongs to the field of knowledge graph link prediction, and particularly relates to a knowledge graph link prediction method and a knowledge graph link prediction system.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
A Knowledge Graph (KG) is a knowledge base with a graph structure in which nodes represent entities and edges represent relations between entities. These relations are represented in the form of (s, r, o) triples (e.g., subject entity s = Yao Ming, relation r = nationality, object entity o = China). KGs are used in many applications such as information retrieval, natural language understanding, question answering systems, and recommendation systems.
Although a knowledge graph may contain millions of entities and triples, it is far from complete compared with the existing facts and newly added knowledge of the real world. Completion of the knowledge graph is therefore very important. Link prediction accomplishes this by inferring missing facts from the facts already known in the KG. The current mainstream approach is to learn low-dimensional representations of all entities and relations (called embeddings) and use them to predict new facts. The embeddings are learned through a scoring function that assigns higher scores to true facts than to invalid ones. Since the initial TransE model, a number of knowledge-graph embedding models have been proposed, such as TransH, DistMult, TransR, TransD, and TransA.
These shallow models have a limited ability to extract features, and the only way to increase the number of features (and hence their expressiveness) is to increase the embedding size, which causes a dramatic increase in the number of parameters and limits their scalability to larger knowledge graphs. Convolutional Neural Networks (CNNs) have the advantage of using multiple layers, increasing expressive power while maintaining parameter efficiency. Dettmers et al. exploited these properties and proposed a 2D convolutional model, ConvE, which applies convolution filters to entity and relation embeddings. Their aim was to increase the number of interactions between embedding components, but ConvE does not maximize the interactions between entities and relations.
In summary, the inventors found that, in the field of knowledge-graph link prediction, the limited ability of shallow models to capture interactions between entity and relation embeddings leads to low link prediction accuracy, so that the knowledge graph cannot be properly completed.
Disclosure of Invention
In order to solve the technical problems in the background art, the invention provides a method and a system for predicting a link of a knowledge graph, which can enhance the interaction between relationship embedding and entity embedding and enhance the accuracy of the link prediction of the knowledge graph.
In order to achieve the purpose, the invention adopts the following technical scheme:
a first aspect of the invention provides a method of link prediction for a knowledge-graph.
A method of link prediction for a knowledge graph, comprising:
acquiring entity vectors and relation embedding vectors in a knowledge graph;
generating random permutations of the entity vectors and relation embedding vectors;
reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form;
batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix;
and convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, and predicting links of the knowledge graph.
A second aspect of the invention provides a system for link prediction of a knowledge-graph.
A link prediction system for a knowledge graph, comprising:
a vector acquisition module for acquiring entity vectors and relation embedding vectors in the knowledge graph;
a random permutation module for generating random permutations of the entity vectors and relation embedding vectors;
a reshaping module for reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form;
a batch normalization module for batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix;
and a link prediction module for convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, and predicting links of the knowledge graph.
A third aspect of the invention provides a computer-readable storage medium.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method for link prediction of a knowledge-graph as set forth above.
A fourth aspect of the invention provides a computer apparatus.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing the steps in the method of link prediction of a knowledge-graph as described above.
Compared with the prior art, the invention has the beneficial effects that:
the invention randomly arranges entity vectors and relationship embedded vectors, shapes the randomly arranged entity vectors and relationship embedded vectors into a matrix form, normalizes the shaped entity vectors and relationship embedded vectors in the matrix in batches, convolutes the normalized entity vectors and relationship embedded vectors by using cyclic convolution, outputs the convolution vectorization and feeds back the convolution vectorization to a full connection layer to obtain an entity embedded matrix, predicts a link of a knowledge graph, enhances the interaction between the relationship embedding and the entity embedding by using three operations of random arrangement, reshaping and cyclic convolution to maximize the interaction between the entity and the relationship, improves the link prediction performance by increasing the number of the interaction between the entity and the relationship, and greatly improves the training speed of a model applied by the link prediction method of the knowledge graph by adding a batch normalization preprocessing method, and relieve the overfitting to some extent.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and together with the description serve to explain the invention, without limiting it.
FIG. 1 is a flow diagram of a method for link prediction of a knowledge graph in accordance with an embodiment of the present invention;
FIG. 2 is a link prediction overall architecture diagram of a knowledge graph of an embodiment of the invention.
Detailed Description
The invention is further described with reference to the following figures and examples.
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit exemplary embodiments according to the invention. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and it should be understood that the terms "comprises" and/or "comprising" specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
Example one
As shown in fig. 1 and fig. 2, the present embodiment provides a method for predicting a link of a knowledge graph, which specifically includes the following steps:
step S101: and acquiring entity vectors and relationship embedded vectors in the knowledge graph.
The data for this embodiment are taken from three knowledge-graph datasets: FB15K-237, WN18RR, and YAGO3-10. The entity vector and the relation embedding vector in the knowledge graph are denoted $e_s$ and $e_r$, respectively.
Step S102: generating a random permutation of the entity vectors and the relationship embedding vectors.
The input order is not fixed; instead, multiple permutations are generated at random in order to capture more interactions.
First, $n$ random permutations of $e_s$ and $e_r$ are generated, denoted

$$P = \{(e_s^i, e_r^i) : 1 \le i \le n\}.$$

In most cases, for different $i$, the interaction sets captured by the permutations $(e_s^i, e_r^i)$ are disjoint. From elementary combinatorics, the number of distinct interactions over all permutations is huge: compared with a single permutation, $n$ different permutations capture $n$ times as many interactions.
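As an illustration only, the random-permutation step (S102) can be sketched in NumPy; the function name, seed, and toy dimensions below are assumptions for the sketch, not part of the invention:

```python
import numpy as np

def random_permutations(e_s, e_r, n_perm, seed=0):
    """Generate n_perm random permutations of the concatenated subject
    and relation embeddings (an illustrative sketch of step S102)."""
    rng = np.random.default_rng(seed)
    combined = np.concatenate([e_s, e_r])                     # length 2d
    return np.stack([combined[rng.permutation(combined.size)]
                     for _ in range(n_perm)])                 # (n_perm, 2d)

e_s = np.arange(4, dtype=float)        # toy entity embedding, d = 4
e_r = np.arange(4, 8, dtype=float)     # toy relation embedding
P = random_permutations(e_s, e_r, n_perm=3)
print(P.shape)  # (3, 8)
```

Each permutation rearranges the same 2d components, so n permutations expose up to n times as many component pairings to the subsequent convolution.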
Step S103: and shaping the randomly arranged entity vectors and the relationship embedding vectors into a matrix form.
A shaping function $\gamma: \mathbb{R}^d \times \mathbb{R}^d \to \mathbb{R}^{m \times n}$ converts the embedding vectors $e_s$ and $e_r$ into a matrix $\gamma(e_s, e_r)$, where $m \times n = 2d$. We compared three shaping functions and finally decided to apply the checkerboard shaping function in the method of this embodiment.

The shaping function used in ConvE is $\gamma_{stk}(\cdot)$, which has a limited interaction capacity; the $\gamma_{chr}(\cdot)$ function used in this embodiment captures the maximum number of heterogeneous interactions between an entity and a relation. The three shaping functions are defined as follows:
① Stack ($\gamma_{stk}$): reshape $e_s$ and $e_r$ each into a matrix of shape $(m/2) \times n$, then simply stack them along the height into an $m \times n$ matrix.

② Alternate ($\gamma_{alt}$): reshape $e_s$ and $e_r$ each into a matrix of shape $(m/2) \times n$, then stack their rows alternately into an $m \times n$ matrix.

③ Chequer ($\gamma_{chr}$): arrange the elements of $e_s$ and $e_r$ in an $m \times n$ matrix such that no two adjacent elements of the matrix come from the same embedding.
Here m relates to the number of elements constituting the entity vector, which equals the number of elements constituting the relation embedding vector, and n to the kind of arrangement.
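A minimal sketch of the checkerboard (Chequer) reshaping, assuming a simple alternating fill order (the exact element order used by the invention is not specified in the text, so the fill rule below is an assumption):

```python
import numpy as np

def chequer_reshape(e_s, e_r, m, n):
    """Interleave entity and relation components so that no two adjacent
    cells of the m x n matrix come from the same embedding (requires
    m * n == 2d and an even row length n)."""
    assert m * n == e_s.size + e_r.size
    out = np.empty((m, n), dtype=e_s.dtype)
    s_it, r_it = iter(e_s), iter(e_r)
    for i in range(m):
        for j in range(n):
            # checkerboard pattern: the source alternates with (i + j) parity
            out[i, j] = next(s_it) if (i + j) % 2 == 0 else next(r_it)
    return out

e_s = np.arange(1, 5, dtype=float)     # toy d = 4, positive = entity
e_r = -np.arange(1, 5, dtype=float)    # negative = relation
M = chequer_reshape(e_s, e_r, m=2, n=4)
print(M)  # adjacent cells always mix entity and relation components
```

Using opposite signs for the two toy embeddings makes it easy to verify by eye that horizontally and vertically adjacent cells never come from the same vector.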
Step S104: batch-normalize the reshaped entity vectors and relation embedding vectors in the matrix.
Step S105: convolve the normalized entity vectors and relation embedding vectors using circular convolution, flatten the convolution output and feed it to a fully connected layer to obtain an entity embedding matrix, and predict links of the knowledge graph.
Given a filter $w \in \mathbb{R}^{k \times k}$, the circular convolution of a two-dimensional input matrix $A \in \mathbb{R}^{m \times n}$ is defined as:

$$[A \star w]_{p,q} = \sum_{i=-\lfloor k/2 \rfloor}^{\lfloor k/2 \rfloor} \sum_{j=-\lfloor k/2 \rfloor}^{\lfloor k/2 \rfloor} A_{[p-i]_m,\,[q-j]_n}\, w_{i,j}$$

where $[z]_n$ denotes $z$ modulo $n$ and $\lfloor \cdot \rfloor$ denotes the floor function.
Each reshaped permutation is stacked as a single channel. By applying circular convolution depth-wise, different filter banks can use different filter arrangements; in practice, filters shared across channels work better, and more instances can then be used to train a single set of kernel weights.
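The circular convolution above can be illustrated directly with modulo indexing. This brute-force NumPy sketch (names and shapes are illustrative) assumes a square filter with odd side k:

```python
import numpy as np

def circular_conv2d(A, w):
    """2-D circular convolution: indices wrap around modulo the input
    size, matching [z]_n = z mod n in the definition above."""
    m, n = A.shape
    k = w.shape[0]          # square k x k filter, k odd
    half = k // 2
    out = np.zeros_like(A, dtype=float)
    for p in range(m):
        for q in range(n):
            for i in range(-half, half + 1):
                for j in range(-half, half + 1):
                    out[p, q] += A[(p - i) % m, (q - j) % n] * w[i + half, j + half]
    return out

A = np.arange(12, dtype=float).reshape(3, 4)
w = np.zeros((3, 3)); w[1, 1] = 1.0            # delta filter
print(np.allclose(circular_conv2d(A, w), A))   # True: identity
```

Because the convolution wraps around, every input element also influences the border positions, which is what lets the filter capture interactions across the whole reshaped matrix.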
During training, the distribution of the data at intermediate layers of the network changes; this change is called internal covariate shift. A Batch Normalization (BN) algorithm is therefore added to the convolutional network. The algorithm preprocesses the input data and is defined by the following formula:
$$\hat{x}^{(l)} = \frac{x^{(l)} - E[x^{(l)}]}{\sqrt{\mathrm{Var}[x^{(l)}]}}$$

where $E[x^{(l)}]$ denotes the mean activation of neuron $x^{(l)}$ over each batch of training data and $\sqrt{\mathrm{Var}[x^{(l)}]}$ denotes the corresponding standard deviation.
Using the normalization alone would destroy the feature distribution learned by the previous layer of the network, so two learnable parameters $\delta$ and $\eta$ are introduced:

$$y^{(l)} = \delta^{(l)} \hat{x}^{(l)} + \eta^{(l)}$$

Each neuron $x^{(l)}$ has its own pair of parameters $\delta$, $\eta$. When

$$\delta^{(l)} = \sqrt{\mathrm{Var}[x^{(l)}]}, \qquad \eta^{(l)} = E[x^{(l)}],$$

the features learned by that layer can be recovered. The forward pass of the BN layer is then given by:
$$\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i$$

$$\sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m}\left(x_i - \mu_B\right)^2$$

$$\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}$$

$$y_i = \delta\,\hat{x}_i + \eta$$

where $m$ denotes the mini-batch size and $\mu_B$, $\sigma_B^2$ are the batch mean and variance, from which the mean and variance required for the test phase are computed:

$$E[x] \leftarrow E_B[\mu_B], \qquad \mathrm{Var}[x] \leftarrow \frac{m}{m-1}\,E_B[\sigma_B^2]$$

Thus, the BN formula at the test stage is

$$y = \delta \cdot \frac{x - E[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} + \eta$$

where $x$ is the data before batch normalization and $y$ the data after batch normalization.
With this method, each feature map is normalized as a single processable unit, and hyperparameters such as the learning rate, Dropout, and L2 regularization need not be carefully tuned during training. The algorithm converges quickly and improves the generalization ability of the network, which greatly increases the training speed of the model used by the method of this embodiment. During training, BN also thoroughly shuffles the training data, preventing the same samples from repeatedly appearing in each batch, and further improves the entity-relation interaction capability of this embodiment.
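The training-time BN forward pass described above can be sketched as follows (a standard batch-normalization sketch using the document's δ/η notation; the batch size and feature count are illustrative assumptions):

```python
import numpy as np

def batch_norm_train(x, delta, eta, eps=1e-5):
    """BN forward pass on a mini-batch x of shape (m, features):
    normalize per feature with the batch mean/variance, then apply the
    learnable scale delta and shift eta."""
    mu = x.mean(axis=0)                      # mu_B
    var = x.var(axis=0)                      # sigma_B^2
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalize
    return delta * x_hat + eta, mu, var      # y = delta * x_hat + eta

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=(64, 3))       # batch of 64 samples, 3 features
y, mu, var = batch_norm_train(x, delta=1.0, eta=0.0)
print(np.allclose(y.mean(axis=0), 0.0, atol=1e-6))  # True: zero mean
```

At test time the running statistics E[x] and Var[x] replace the per-batch mean and variance, as the test-stage formula in the text describes.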
The knowledge-graph embedding link prediction method of this embodiment is denoted FICN.

Table 1 shows some scoring functions. We flatten and concatenate the output of each circular convolution into a vector, and then project this vector into the embedding space $\mathbb{R}^d$. The scoring function used in FICN is defined as follows:

$$\psi(s, r, o) = h\big(\mathrm{vec}\big(f(\Theta(\gamma(e_s, e_r), w))\big)\, W\big)\, e_o$$

where $\mathrm{vec}(\cdot)$ denotes vector concatenation, $\Theta$ denotes circular convolution with BN preprocessing, $W$ denotes a learnable weight matrix, and $e_o$ denotes the embedding matrix of the object entity. Sigmoid and ELU are used as the functions $h$ and $f$, respectively.
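A toy sketch of the FICN scoring step, assuming illustrative shapes (the weight-matrix size, number of feature maps, and candidate-entity count below are arbitrary choices for the sketch, not values from the invention):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elu(z, alpha=1.0):
    return np.where(z > 0, z, alpha * (np.exp(z) - 1.0))

def ficn_score(conv_maps, W, E_o):
    """Flatten each circular-convolution feature map, concatenate them
    (vec), project with W into the embedding space, applying f = ELU
    inside and h = sigmoid outside, to score every candidate object."""
    v = np.concatenate([elu(c).ravel() for c in conv_maps])  # vec(f(...))
    proj = v @ W                    # project into R^d
    return sigmoid(E_o @ proj)      # one score per candidate entity

rng = np.random.default_rng(1)
maps = [rng.normal(size=(4, 4)) for _ in range(2)]  # two feature maps
W = rng.normal(size=(32, 8)) * 0.1                  # 2 * 16 -> d = 8
E_o = rng.normal(size=(5, 8))                       # 5 candidate entities
scores = ficn_score(maps, W, E_o)
print(scores.shape)  # (5,)
```

The scores lie in (0, 1) because of the sigmoid, so during training a true triple is pushed toward 1 and an invalid one toward 0.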
Table 1. Scoring functions of representative knowledge-graph embedding models (rendered as an image in the original publication; the contents are not recoverable from the text).
The experimental results are shown in Table 2.

Table 2. Link prediction evaluation on the three datasets (rendered as an image in the original publication; the contents are not recoverable from the text).
Example two
The embodiment provides a link prediction system of a knowledge graph, which specifically includes:
a vector acquisition module for acquiring entity vectors and relation embedding vectors in the knowledge graph;
a random permutation module for generating random permutations of the entity vectors and relation embedding vectors;
a reshaping module for reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form;
a batch normalization module for batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix;
and a link prediction module for convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, and predicting links of the knowledge graph.
It should be noted that, each module in the link prediction system of the knowledge graph in the embodiment corresponds to each step in the link prediction method of the knowledge graph in the first embodiment one to one, and the specific implementation process is the same, which will not be described herein again.
EXAMPLE III
The present embodiment provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in the method for link prediction of a knowledge-graph as described above.
Example four
The present embodiment provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the steps of the method for link prediction of a knowledge-graph as described above.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for link prediction of a knowledge graph, comprising:
acquiring entity vectors and relation embedding vectors in a knowledge graph;
generating random permutations of the entity vectors and relation embedding vectors;
reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form;
batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix;
and convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, and predicting links of the knowledge graph.
2. The method for link prediction of a knowledge graph of claim 1, wherein the randomly permuted entity vectors and relation embedding vectors are reshaped into a matrix form using a shaping function.
3. The method for link prediction of a knowledge graph of claim 2, wherein the shaping function is a Chequer function for interleaving the elements of the randomly permuted entity vectors and relation embedding vectors into an m × n matrix, where m relates to the number of elements constituting an entity vector, which equals the number of elements constituting a relation embedding vector, and n to the kind of arrangement.
4. The method for link prediction of a knowledge graph of claim 2, wherein the shaping function is a Stack function or an Alternate function.
5. The method for link prediction of a knowledge graph of claim 1 wherein each shaped permutation is stacked into a single channel.
6. The method for link prediction of a knowledge graph of claim 1, wherein the batch normalization process is formulated as:

$$y = \delta \cdot \frac{x - E[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} + \eta$$

wherein $E[x]$ and $\mathrm{Var}[x]$ are the mean and variance, respectively; $\mu$ and $\sigma$ are the batch mean and standard deviation from which they are computed; $\delta$ and $\eta$ are learnable parameters; $x$ is the data before batch normalization and $y$ the data after batch normalization.
7. The method of link prediction of a knowledge-graph of claim 1 wherein the output of each cyclic convolution is flattened and concatenated into a vector, and then the vector is projected into an embedding space to obtain the entity embedding matrix.
8. A system for link prediction of a knowledge graph, comprising:
a vector acquisition module for acquiring entity vectors and relation embedding vectors in the knowledge graph;
a random permutation module for generating random permutations of the entity vectors and relation embedding vectors;
a reshaping module for reshaping the randomly permuted entity vectors and relation embedding vectors into a matrix form;
a batch normalization module for batch-normalizing the reshaped entity vectors and relation embedding vectors in the matrix;
and a link prediction module for convolving the normalized entity vectors and relation embedding vectors using circular convolution, flattening the convolution output and feeding it to a fully connected layer to obtain an entity embedding matrix, and predicting links of the knowledge graph.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps in the method for link prediction of a knowledge-graph according to any one of claims 1-7.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps in the method of link prediction of a knowledge-graph according to any of claims 1-7 when executing the program.
CN202110593313.2A 2021-05-28 2021-05-28 Link prediction method and system of knowledge graph Active CN113190691B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110593313.2A CN113190691B (en) 2021-05-28 2021-05-28 Link prediction method and system of knowledge graph

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110593313.2A CN113190691B (en) 2021-05-28 2021-05-28 Link prediction method and system of knowledge graph

Publications (2)

Publication Number Publication Date
CN113190691A true CN113190691A (en) 2021-07-30
CN113190691B CN113190691B (en) 2022-09-23

Family

ID=76985813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110593313.2A Active CN113190691B (en) 2021-05-28 2021-05-28 Link prediction method and system of knowledge graph

Country Status (1)

Country Link
CN (1) CN113190691B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190122111A1 (en) * 2017-10-24 2019-04-25 Nec Laboratories America, Inc. Adaptive Convolutional Neural Knowledge Graph Learning System Leveraging Entity Descriptions
CN110990580A (en) * 2019-11-02 2020-04-10 国网辽宁省电力有限公司电力科学研究院 Knowledge graph construction method and device, computer equipment and storage medium
CN111784081A (en) * 2020-07-30 2020-10-16 南昌航空大学 Social network link prediction method adopting knowledge graph embedding and time convolution network
CN112182245A (en) * 2020-09-28 2021-01-05 中国科学院计算技术研究所 Knowledge graph embedded model training method and system and electronic equipment
CN112380345A (en) * 2020-11-20 2021-02-19 山东省计算中心(国家超级计算济南中心) COVID-19 scientific literature fine-grained classification method based on GNN
CN112685609A (en) * 2021-01-04 2021-04-20 福州大学 Knowledge graph complementing method combining translation mechanism and convolutional neural network
CN112765369A (en) * 2021-01-31 2021-05-07 西安电子科技大学 Knowledge graph information representation learning method, system, equipment and terminal
CN112765287A (en) * 2021-02-05 2021-05-07 中国人民解放军国防科技大学 Method, device and medium for mining character relation based on knowledge graph embedding
CN112818136A (en) * 2021-02-26 2021-05-18 福州大学 Time convolution-based interactive knowledge representation learning model TCIM prediction method


Also Published As

Publication number Publication date
CN113190691B (en) 2022-09-23

Similar Documents

Publication Publication Date Title
Rangarajan et al. Tomato crop disease classification using pre-trained deep learning algorithm
Zhong et al. Practical block-wise neural network architecture generation
CN109325516B (en) Image classification-oriented ensemble learning method and device
CN107943938A (en) A kind of large-scale image similar to search method and system quantified based on depth product
CN110462639A (en) Information processing equipment, information processing method and computer readable storage medium
CN112200300B (en) Convolutional neural network operation method and device
JP6977886B2 (en) Machine learning methods, machine learning devices, and machine learning programs
Zhou et al. Online filter clustering and pruning for efficient convnets
CN111967573A (en) Data processing method, device, equipment and computer readable storage medium
JP2024502225A (en) Method and system for convolution with activation sparsity with workload leveling
Easom-McCaldin et al. Efficient quantum image classification using single qubit encoding
CN113674156B (en) Method and system for reconstructing image super-resolution
Kishore et al. Impact of autotuned fully connected layers on performance of self-supervised models for image classification
CN114443862A (en) Knowledge graph completion method and system based on weighted graph convolution network
Kajkamhaeng et al. SE-SqueezeNet: SqueezeNet extension with squeeze-and-excitation block
CN113190691B (en) Link prediction method and system of knowledge graph
CN110210419A (en) The scene Recognition system and model generating method of high-resolution remote sensing image
Dutta et al. Effective building block design for deep convolutional neural networks using search
US11379731B2 (en) Relating complex data
Duan et al. Hamiltonian-based data loading with shallow quantum circuits
Nowotny Parallel implementation of a spiking neuronal network model of unsupervised olfactory learning on NVidia® CUDA™
Karmakar et al. Multilevel Random Forest algorithm in image recognition for various scientific applications
Arce et al. Dendrite morphological neural networks trained by differential evolution
Mitsuno et al. Channel planting for deep neural networks using knowledge distillation
CN108333940B (en) Method and device for optimizing designated controller parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant