CN111582509A - Knowledge graph representation learning and neural network based collaborative recommendation method - Google Patents

Knowledge graph representation learning and neural network based collaborative recommendation method

Info

Publication number
CN111582509A
Authority
CN
China
Prior art keywords
knowledge
training
kem
vector
low
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010378310.2A
Other languages
Chinese (zh)
Other versions
CN111582509B (en)
Inventor
Wang Pan (王攀)
Huang Chen (黄琛)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202010378310.2A priority Critical patent/CN111582509B/en
Publication of CN111582509A publication Critical patent/CN111582509A/en
Application granted granted Critical
Publication of CN111582509B publication Critical patent/CN111582509B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/20Ensemble learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a knowledge graph representation learning and neural network based collaborative recommendation method. The items in a data set are mapped to public knowledge graph triples and fed into the OpenKE framework as a training set for model training, where a knowledge graph representation learning method is selected via parameter settings. The vector matrix E' corresponding to the entity set is mapped back, in order, to the individual items, yielding a well-constructed low-dimensional dense feature vector I_kem for each item. The positive-example low-dimensional dense feature vector I_kem-pos and the negative-example low-dimensional dense feature vector I_kem-neg are read into the model, replacing the traditional vector layer operation, i.e. the knowledge embedding vector layer produces the final output, after which the neural network training layer starts training. The method can alleviate the problems of a sparse rating matrix and cold start, and enhances the performance and accuracy of collaborative filtering recommendation.

Description

Knowledge graph representation learning and neural network based collaborative recommendation method
Technical Field
The invention provides a collaborative recommendation method based on knowledge graph representation learning and neural networks, and belongs to the technical field of deep learning and recommendation systems.
Background
Conventional recommendation systems rely on matrix-factorization collaborative filtering, so the problems of cold start and data sparsity inevitably arise. The data sparsity problem means that on a large e-commerce platform the numbers of users and items are very large, but the average number of items each user has interacted with is small, so the resulting user-item matrix is sparse. The cold start problem refers to how to make personalized recommendations for new users without a large amount of user data. Data sparsity ultimately makes it impossible to capture the relationships between different users and different items, which reduces the accuracy of the recommendation system. A neural network can analyse objects and the relationships between them from a higher dimensionality, which mitigates the data sparsity problem. The cold start problem, at its root, comes down to the data lacking information dimensions. A knowledge graph contains factual relationships between objects in the real world, which is equivalent to providing an additional information dimension for the data to be trained in the model, and thus alleviates the cold start problem to a certain extent.
Disclosure of Invention
The purpose of the invention is as follows: to overcome the defects of the prior art, the invention provides a collaborative recommendation method based on knowledge graph representation learning and neural networks, which can alleviate the problems of a sparse rating matrix and cold start and enhance the performance and accuracy of collaborative filtering recommendation.
The technical scheme is as follows: to achieve the above purpose, the invention adopts the following technical scheme.
a knowledge graph representation learning and neural network based collaborative recommendation method comprises the following steps:
step 1, acquiring a data set, mapping the items in the data set to a public knowledge graph triple set K, and constructing an entity set E, a relation set R and a training set S corresponding to the items;
step 2, inputting the constructed entity set E, relation set R and training set S as training data into the OpenKE framework for model training, wherein a knowledge graph representation learning method is selected via parameter settings;
step 3, during the training in step 2, outputting the vector matrix E' corresponding to the entity set, mapping E' back to the individual items according to the item order of the entity set E input in step 2, and finally constructing a corresponding low-dimensional dense feature vector I_kem for each item;
step 4, performing positive/negative example selection on the low-dimensional dense feature vector I_kem obtained in step 3 to generate a positive-example low-dimensional dense feature vector I_kem-pos and a negative-example low-dimensional dense feature vector I_kem-neg, then adding a read-in module that reads I_kem-pos and I_kem-neg into the model, replacing the traditional vector layer operation, i.e. the knowledge embedding vector layer produces the final output, after which the neural network training layer starts training.
Preferably: the mapping back in step 3 means associating each low-dimensional dense feature vector I_kem output after training, in order, with the item numbers of the input entity set E.
Preferably: the knowledge representation learning method includes a TransE method, a TransR method, a TransH method, and a TransD method.
Preferably: the knowledge embedding vector layer is a component that processes data offline.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a low-dimensional dense vector which is obtained by extracting the prior semantic data from a public knowledge base by using a knowledge graph structured representation learning method to enrich the information dimension of the prior data set, replaces the vectorization result of the traditional method with the low-dimensional dense vector which is obtained by representing the knowledge graph, and embeds the process into a collaborative filtering model of a neural network as a middle layer, thereby not only excavating the linear and nonlinear relations between users and projects, but also further integrating the knowledge relation of the projects, so that the neural network model can fully utilize a great deal of prior knowledge in the knowledge graph to excavate the relations between the projects, and further deeply excavate the interactive information between the users and the projects.
In its use of the neural network, the invention embeds the matrix decomposition module and the deep neural network module independently, and then feeds them respectively into the hidden layer of the matrix decomposition module and the hidden layer of the deep neural network module for computation. A knowledge graph embedding layer is added at the bottom of the deep neural network; conceptually this layer is an offline component that performs representation learning on the items with the existing knowledge graph to generate dense low-dimensional vectors carrying knowledge, which are spliced into the collaborative deep model as a tributary. A neural network prediction layer is then added to fuse the outputs of the two hidden layers, finally producing a relatively accurate recommendation list.
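The two-stream design described above can be sketched in Keras 2 (the training framework used in the experiments reported later) roughly as follows. This is our illustrative reconstruction, not the patent's code: the layer sizes are assumptions, and I_kem stands for the item knowledge-vector matrix produced by the representation learning step.

```python
# Sketch: matrix factorization stream + deep neural network stream, each using the
# frozen knowledge embedding layer on the item side, fused by a prediction layer.
from keras.layers import Input, Embedding, Flatten, Multiply, Concatenate, Dense
from keras.models import Model

def build_fusion_model(num_users, num_items, I_kem, dim=128):
    u_in = Input(shape=(1,), dtype="int32")
    i_in = Input(shape=(1,), dtype="int32")

    # Matrix factorization stream: element-wise product of user and item vectors
    u_mf = Flatten()(Embedding(num_users, dim)(u_in))
    i_mf = Flatten()(Embedding(num_items, dim, weights=[I_kem], trainable=False)(i_in))
    mf_vec = Multiply()([u_mf, i_mf])

    # Deep neural network stream: concatenation followed by a hidden layer
    u_mlp = Flatten()(Embedding(num_users, dim)(u_in))
    i_mlp = Flatten()(Embedding(num_items, dim, weights=[I_kem], trainable=False)(i_in))
    mlp_vec = Dense(64, activation="relu")(Concatenate()([u_mlp, i_mlp]))

    # Prediction layer fusing the outputs of the two hidden layers
    score = Dense(1, activation="sigmoid")(Concatenate()([mf_vec, mlp_vec]))
    model = Model([u_in, i_in], score)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```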
Drawings
Fig. 1 is a system architecture diagram of the collaborative recommendation method based on knowledge graph representation learning and neural networks.
Fig. 2 is a flow chart of the collaborative recommendation method based on knowledge graph representation learning and neural networks.
FIG. 3 shows the entity set, relation set and training (triple) set used for knowledge graph representation learning.
Fig. 4 shows 128-dimensional dense vectors generated by the TransH representation learning method.
FIG. 5 is a graph of the results of training on Movielens-1M using TransE and the original model.
Fig. 6 is a screenshot 1 of performance comparison experiments for different knowledge representation learning methods.
FIG. 7 is a screenshot 2 of performance comparison experiments for different knowledge representation learning methods.
Detailed Description
The present invention is further illustrated below in conjunction with the accompanying drawings and specific embodiments. It should be understood that these examples are given solely for the purpose of illustration and are not intended to limit the scope of the invention; various equivalent modifications will occur to those skilled in the art upon reading the present invention, and such modifications fall within the scope of the appended claims.
A collaborative recommendation method based on knowledge graph representation learning and neural networks, as shown in FIG. 1 and FIG. 2, aims to estimate matching scores between users and items and then generate a personalized item recommendation list for each user according to those scores. The framework is composed of an input layer, a knowledge embedding vector layer, a vector layer, a neural network training layer and an output layer. The core of the invention is the knowledge embedding vector layer. It can be thought of as a parallel component: it works in parallel with the layers above it while also forming the first part of the model, so it plays both roles. Its input is the item vector i in the implicit feedback data set, and its output is the knowledge low-dimensional dense feature vector i' obtained with a knowledge representation learning method (TransE, TransR, TransH or TransD). Relative to the other parts, the knowledge embedding vector layer is a component that processes data offline. Specifically, the original item number i is associated with the constructed external knowledge graph, and this association is ultimately reflected in the output low-dimensional dense feature vector carrying knowledge. The relevant symbols are defined as follows: a knowledge graph is denoted by K = (E, R, S), where E is the set of all entities in the knowledge graph, R is the set of all relations, and S ⊆ E × R × E is the set of triples in the knowledge graph. A particular triple is denoted (h, r, t), where h and t are the head and tail entities and r is the relation between them.
As introduced above, the purpose of knowledge graph representation learning is to learn vector representations of all the entities and relations of the triples in a low-dimensional continuous space while preserving the semantic information of the triple structure in the knowledge graph. The technical scheme of the invention is divided into two parts: the first part is the knowledge representation learning module, and the second part is the knowledge embedding vector layer module.
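For reference, TransE (the simplest of the representation learning methods named in this document) models each relation as a translation in the embedding space; the scoring function and margin-based ranking loss below are the standard formulation from the TransE literature, added here for concreteness rather than reproduced from the patent text.

```latex
% TransE: for a valid triple (h, r, t), the embeddings should satisfy h + r ≈ t
f_r(h, t) = \lVert \mathbf{h} + \mathbf{r} - \mathbf{t} \rVert_2
% Margin-based ranking loss over the training triples S and corrupted triples S'
\mathcal{L} = \sum_{(h,r,t) \in S} \; \sum_{(h',r,t') \in S'_{(h,r,t)}}
              \max\bigl(0,\; \gamma + f_r(h,t) - f_r(h',t')\bigr)
```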
First part, the knowledge representation learning module: the items in the data set and the open knowledge graph triples S are taken as the input of the knowledge representation learning module, whose processing consists of three steps.
1. First, an open data set is obtained, the items in the data set are mapped to the open knowledge graph triple set K, and an entity set E, a relation set R and a training set S corresponding to the items are constructed, i.e. for each item entity in the data set the triples related to that item are collected.
2. The entity set E, relation set R and training set S constructed in the previous step are fed as training data into the OpenKE framework for model training (OpenKE is an open-source project based on TensorFlow that provides an optimized and stable framework for knowledge graph embedding models); different knowledge graph representation learning methods (TransE, TransR, TransH, TransD) can be selected via parameter settings. The framework is mainly intended for building knowledge graph triple embeddings; the invention uses an intermediate step of this framework, namely using different knowledge graph representation learning methods to generate the low-dimensional dense vectors. The invention holds that the low-dimensional dense vector corresponding to each item therefore carries a certain amount of structural knowledge, rather than being a simple vectorization result by sequence number.
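As an illustration of this step, the sketch below uses the configuration interface of the TensorFlow version of OpenKE as we recall it; the benchmark path, hyperparameter values and output file names are assumptions for illustration, not values prescribed by the patent.

```python
# Sketch: knowledge graph representation learning with the TensorFlow version of
# OpenKE (interface as recalled; paths and hyperparameters are assumptions).
import config   # OpenKE's config module
import models   # OpenKE's model zoo: TransE, TransH, TransR, TransD

con = config.Config()
con.set_in_path("./benchmarks/Movielens-YAGO/")   # entity2id.txt, relation2id.txt, train2id.txt
con.set_work_threads(4)
con.set_train_times(500)        # training epochs
con.set_nbatches(100)
con.set_alpha(0.001)            # learning rate
con.set_margin(1.0)
con.set_dimension(128)          # 128-dimensional vectors, as in the experiments below
con.set_ent_neg_rate(1)
con.set_rel_neg_rate(0)
con.set_opt_method("SGD")
# Export the learned entity/relation vectors to JSON for the offline reflection step
con.set_export_files("./res/model.vec.tf", 0)
con.set_out_files("./res/embedding.vec.json")
con.init()
con.set_model(models.TransE)    # swap in models.TransH / TransR / TransD here
con.run()
```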
3. During the training in step 2, the vector matrix E' corresponding to the entity set is output, and the vectors are written out in json format to a text file for separate processing. The vector matrix E' is then mapped back to the individual items according to the item order of the entity set E fed into the model in step 2. To explain this mapping: each low-dimensional dense feature vector I_knowledge-embedding (hereinafter I_kem) output after training is associated, in order, with the item number of the corresponding entry of the input entity set E; this is a separate offline operation. Finally, a corresponding low-dimensional dense feature vector I_kem is constructed for each item.
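The reflection is then a simple offline join between the exported vectors and the item numbering. A minimal sketch, assuming the exported JSON carries the entity vectors in entity-id order under an "ent_embeddings" key and that the entity names in entity2id.txt are the original movie ids (both assumptions):

```python
# Sketch: map the learned entity vectors back to item ids (the offline "reflection").
import json
import numpy as np

with open("./res/embedding.vec.json") as f:
    ent_embeddings = np.array(json.load(f)["ent_embeddings"])   # (num_entities, 128), assumed key

item_to_entity = {}
with open("./benchmarks/Movielens-YAGO/entity2id.txt") as f:
    next(f)                                    # first line holds the entity count
    for line in f:
        name, eid = line.strip().split("\t")
        item_to_entity[name] = int(eid)

def build_item_vectors(item_ids):
    """I_kem: one low-dimensional dense feature vector per item, in item order."""
    return np.stack([ent_embeddings[item_to_entity[str(i)]] for i in item_ids])
```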
Second part, the knowledge embedding vector layer module: since a corresponding low-dimensional dense feature vector I_kem has already been constructed for each item, this step is equivalent to the vector layer operation in the model. The difference is that the ordinary vector layer vectorizes the items according to the sequence numbers of the input training set to produce low-dimensional feature vectors, whereas the invention replaces this traditional vector layer operation with the knowledge-based low-dimensional dense feature vectors I_kem constructed in the first part. This component is called the knowledge embedding vector layer module, and it mainly implements the logic of integrating the vectors into the model and selecting positive and negative examples. Because the generation of I_kem in the first part is an independent offline operation, in the second part I_kem is first subjected to positive and negative example selection to generate I_kem-pos and I_kem-neg; a read-in module is then added to the model to read I_kem-pos and I_kem-neg in, replacing the traditional vector layer operation, i.e. the knowledge embedding vector layer produces the final output, which is sent to the neural network training layer for training.
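A minimal sketch of the positive/negative example selection feeding the knowledge embedding vector layer, under the implicit-feedback assumption that every observed (user, item) pair is a positive example and unobserved items are sampled as negatives; the sampling ratio and function names are ours.

```python
# Sketch: positive/negative example selection for the knowledge embedding vector layer.
# Each observed (user, item) pair is a positive example (its row of I_kem acts as
# I_kem-pos); neg_per_pos unobserved items per positive act as I_kem-neg (label 0).
import numpy as np

def sample_pos_neg(interactions, num_items, neg_per_pos=4):
    observed = set(interactions)
    users, items, labels = [], [], []
    for u, i in interactions:
        users.append(u); items.append(i); labels.append(1)       # I_kem-pos
        for _ in range(neg_per_pos):
            j = np.random.randint(num_items)
            while (u, j) in observed:
                j = np.random.randint(num_items)
            users.append(u); items.append(j); labels.append(0)   # I_kem-neg
    return np.array(users), np.array(items), np.array(labels)

# Usage with the fusion model sketched earlier (all names are ours):
# model = build_fusion_model(num_users, num_items, I_kem)
# u, i, y = sample_pos_neg(interactions, num_items)
# model.fit([u, i], y, batch_size=256, epochs=20)
```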
Conventional recommendation systems rely on matrix-factorization collaborative filtering, so the problems of cold start and data sparsity inevitably arise. The data sparsity problem means that on a large e-commerce platform the numbers of users and items are very large, but the average number of items each user has interacted with is small, so the resulting user-item matrix is sparse. The cold start problem refers to how to make personalized recommendations for new users without a large amount of user data. Data sparsity ultimately makes it impossible to capture the relationships between different users and different items, which reduces the accuracy of the recommendation system. Implicit feedback captures preferences that users do not state explicitly: user preferences can be obtained in many ways beyond explicit ratings, which enriches the user-item matrix and thus alleviates the data sparsity problem. A neural network can analyse objects and the relationships between them from a higher dimensionality, further mitigating data sparsity. The cold start problem, at its root, comes down to the data lacking information dimensions. A knowledge graph contains factual relationships between objects in the real world, which is equivalent to providing an additional information dimension for the data to be trained in the model, and thus alleviates the cold start problem to a certain extent.
2. The experimental process comprises the following steps:
We evaluate the performance of the method on the real public movie data set Movielens-1M. The experimental results show a clear improvement on the recommendation metrics.
a. Experimental Environment
We implemented the algorithm on CentOS Linux 7.2.1511 using a GeForce GTX 1080. The training framework was Keras 2.0.0 with a TensorFlow-gpu 1.4.0 backend; the corresponding CUDA and cuDNN versions were 8.0 and 6.0, respectively.
b. Description of data sets
We used the Movielens-1M dataset, containing 1,000,209 rating records from 6,040 users on 3,883 movies. Each record describes a user's rating of a movie on a scale from 1 to 5. Since our experiments use implicit feedback, all rated records are treated as "like" and unrated ones as "dislike". We use the open knowledge graph YAGO, which consists of triples <h, r, t>. Each movie entity in Movielens-1M is matched with the head or tail entity of a triple in YAGO to construct triples, and finally 5 movie-related relations are extracted: <wroteMusicFor>, <directed>, <created>, <actedIn>, <edited>, forming a total of 43,847 triples that cover 3,221 movie entities.
Since some movies in Movielens-1M are not associated with any YAGO entity, as shown in FIG. 1, we use the movie genres provided in the original Movielens-1M dataset and construct YAGO-style triples with the attribute "genre" for such movies, e.g. <One Flew Over the Cuckoo's Nest, genre, Drama>, adding a new relation: <genre>.
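One way to implement this fallback is sketched below; the function and variable names are hypothetical, and movie_genres is assumed to come from the movies.dat file of Movielens-1M.

```python
# Sketch: keep YAGO triples that touch a matched movie entity, and give unmatched
# movies a <movie, genre, Genre> triple built from the Movielens-1M genre field.
def build_triples(yago_triples, movie_entities, movie_genres):
    """yago_triples: iterable of (head, relation, tail); movie_entities: movie names
    matched in YAGO; movie_genres: {movie_name: [genre, ...]}."""
    triples = [(h, r, t) for h, r, t in yago_triples
               if h in movie_entities or t in movie_entities]
    matched = {h for h, _, _ in triples} | {t for _, _, t in triples}
    for movie, genres in movie_genres.items():
        if movie not in matched:                       # no YAGO entity for this movie
            triples += [(movie, "genre", g) for g in genres]
    return triples
```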
Table 1: description of data sets
Movielens-1M contains 1,000,209 anonymous ratings of 3,883 movies made by 6,040 users and is the main experimental data set of our experiments. First, to keep the data set as balanced as possible, we filtered out users with fewer than 10 ratings and movies with fewer than 10 ratings. Then we sort each user's interactions by timestamp; the last interaction and the second-to-last interaction are used as the test set and the validation set, respectively, each containing 6,015 records. The basic statistics of the data set are given in Table 1, where the Dataset column shows the training data actually used for training.
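A minimal pandas sketch of the 10-core filtering and the timestamp-based leave-one-out split described above; the column names follow the usual ratings.dat layout of Movielens-1M and are assumptions, not taken from the patent.

```python
# Sketch: 10-core filtering and leave-one-out split by timestamp for Movielens-1M.
import pandas as pd

ratings = pd.read_csv("ratings.dat", sep="::", engine="python",
                      names=["user", "item", "rating", "timestamp"])

# Keep users and movies with at least 10 ratings
ratings = ratings[ratings.groupby("user")["item"].transform("count") >= 10]
ratings = ratings[ratings.groupby("item")["user"].transform("count") >= 10]

# Sort each user's interactions by time; last -> test, second-to-last -> validation
ratings = ratings.sort_values(["user", "timestamp"])
test = ratings.groupby("user").tail(1)
rest = ratings.drop(test.index)
valid = rest.groupby("user").tail(1)
train = rest.drop(valid.index)
```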
As shown in FIG. 3, the entity set, relation set and triple (training) set for knowledge graph representation learning are the three training files we constructed from the Movielens-1M and YAGO data sets solely for representation learning. The first file, entity2id, numbers the 25,360 movie-related entities extracted from Movielens-1M and renumbered in order of occurrence. The second file, relation2id, lists and numbers the relation words taken from the YAGO knowledge graph, 6 in total. The third file, train2id, contains 49,791 triples built from the renumbered movie entities, the knowledge graph relations and the user rating table of the original Movielens-1M data set; the re-matched rating table is fed into the model as the training set for knowledge graph representation learning.
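The three files follow the benchmark layout expected by OpenKE: the first line is the record count, entity2id/relation2id rows are "name<TAB>id", and train2id rows are "head_id tail_id relation_id". The writer below is a sketch under that assumption; the function name and output directory are ours.

```python
# Sketch: write entity2id.txt, relation2id.txt and train2id.txt in OpenKE's
# benchmark format (first line = count; train2id rows are "head tail relation").
def write_openke_benchmark(triples, out_dir="./benchmarks/Movielens-YAGO"):
    """triples: iterable of (head_name, relation_name, tail_name)."""
    entities, relations = {}, {}
    def idx(table, key):
        return table.setdefault(key, len(table))

    encoded = [(idx(entities, h), idx(entities, t), idx(relations, r))
               for h, r, t in triples]

    for fname, table in (("entity2id.txt", entities), ("relation2id.txt", relations)):
        with open(f"{out_dir}/{fname}", "w") as f:
            f.write(f"{len(table)}\n")
            for name, i in table.items():
                f.write(f"{name}\t{i}\n")

    with open(f"{out_dir}/train2id.txt", "w") as f:
        f.write(f"{len(encoded)}\n")
        for h, t, r in encoded:
            f.write(f"{h} {t} {r}\n")
```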
After the knowledge graph representation learning is completed with OpenKE, only the dense vectors generated during training need to be extracted. FIG. 4 shows 128-dimensional dense vectors generated by the TransH representation learning method; due to space limitations only a small excerpt is shown as a sample. The 128-dimensional dense knowledge vector in each group of parentheses corresponds to a movie entity in input order.
c. Evaluation index
1) Hit Ratio (HR): in Top-K recommendation, HR is a commonly used recall-oriented metric. The denominator |GT| is the total number of test items, and the numerator NumberOfHits@K is the number of test items that appear in each user's Top-K list, summed over all users. The calculation formula is:
HR@K = NumberOfHits@K / |GT|
2) Normalized Discounted Cumulative Gain (NDCG): NDCG evaluates ranking performance by taking the position of the correct item into account; the ideal recommendation list returned to a user ranks results by relevance, with the most relevant results at the top. The NDCG@K for each user is calculated as:
NDCG@K = Z_K · Σ_{i=1}^{K} (2^{rel_i} − 1) / log_2(i + 1)
z is a normalized operation, reliThe correlation of the recommendation result at position i is indicated, and k indicates the size of the recommendation list. The value of NDCG is between (0, 1)]. From the above equation it follows: 1) the greater the correlation of the recommendation, the greater the NDCG. 2) If the list is ranked in front of the list with good correlation, the better the recommendation, the larger the NDCG.
d. Results of the experiment
1. The experimental results are shown in FIG. 5 and in Tables 2-1, 2-2 and 2-3, which compare the model using the representation learning method (TransE) with the original model that does not use it.
Table 2-1: Training loss of MF+TransE and MF on Movielens-1M over 1-50 iterations.
Table 2-2: HR@10 of MF+TransE and MF on Movielens-1M over 1-50 iterations.
Table 2-3: NDCG@10 of MF+TransE and MF on Movielens-1M over 1-50 iterations.
2. Performance comparison experiments were performed using different knowledge representation learning methods TransE, TransR, TransD, TransH, with the vector dimension set to 128 dimensions, and the experimental results are shown in FIGS. 6 and 7.
Multiple groups of comparison experiments show that, by introducing the knowledge graph and fusing its rich relational knowledge with the implicit-feedback rating matrix, the problems of a sparse rating matrix and cold start are effectively alleviated, and the performance and accuracy of collaborative filtering recommendation are enhanced. The experimental results show that, compared with learning user and item features separately, the proposed method achieves better recommendation performance.
The invention uses a knowledge graph structured representation learning method to extract existing semantic data from a public knowledge base in order to enrich the information dimensions of the existing data set, and accordingly embeds the low-dimensional dense vectors learned by knowledge graph representation into the collaborative filtering model of the neural network as an intermediate layer. Multiple groups of comparison experiments on the public data set show that the proposed method effectively improves the accuracy of the recommendation system.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.

Claims (4)

1. A knowledge graph representation learning and neural network based collaborative recommendation method is characterized by comprising the following steps:
step 1, acquiring a data set, mapping the items in the data set to a public knowledge graph triple set K, and constructing an entity set E, a relation set R and a training set S corresponding to the items;
step 2, inputting the constructed entity set E, relation set R and training set S as training data into the OpenKE framework for model training, wherein a knowledge graph representation learning method is selected via parameter settings;
step 3, during the training in step 2, outputting the vector matrix E' corresponding to the entity set, mapping E' back to the individual items according to the item order of the entity set E input in step 2, and finally constructing a corresponding low-dimensional dense feature vector I_kem for each item;
step 4, performing positive/negative example selection on the low-dimensional dense feature vector I_kem obtained in step 3 to generate a positive-example low-dimensional dense feature vector I_kem-pos and a negative-example low-dimensional dense feature vector I_kem-neg, then adding a read-in module that reads I_kem-pos and I_kem-neg into the model, replacing the traditional vector layer operation, i.e. the knowledge embedding vector layer produces the final output, after which the neural network training layer starts training.
2. The knowledge graph representation learning and neural network based collaborative recommendation method according to claim 1, wherein: the mapping back in step 3 means associating each low-dimensional dense feature vector I_kem output after training, in order, with the item numbers of the input entity set E.
3. The knowledge graph representation learning and neural network based collaborative recommendation method according to claim 1, wherein: the knowledge representation learning method includes a TransE method, a TransR method, a TransH method, and a TransD method.
4. The knowledge graph representation learning and neural network based collaborative recommendation method according to claim 1, wherein: the knowledge embedding vector layer is a component that processes data offline.
CN202010378310.2A 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method Active CN111582509B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010378310.2A CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010378310.2A CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Publications (2)

Publication Number Publication Date
CN111582509A true CN111582509A (en) 2020-08-25
CN111582509B CN111582509B (en) 2022-09-02

Family

ID=72126366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010378310.2A Active CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Country Status (1)

Country Link
CN (1) CN111582509B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738000A (en) * 2020-07-22 2020-10-02 腾讯科技(深圳)有限公司 Phrase recommendation method and related device
CN112182131A (en) * 2020-09-28 2021-01-05 中国电子科技集团公司第五十四研究所 Remote sensing image recommendation method based on multi-attribute fusion
CN112395506A (en) * 2020-12-04 2021-02-23 上海帜讯信息技术股份有限公司 Information recommendation method and device, electronic equipment and storage medium
CN112487200A (en) * 2020-11-25 2021-03-12 吉林大学 Improved deep recommendation method containing multi-side information and multi-task learning
CN112765339A (en) * 2021-01-21 2021-05-07 山东师范大学 Personalized book recommendation method and system based on reinforcement learning
CN113326384A (en) * 2021-06-22 2021-08-31 四川大学 Construction method of interpretable recommendation model based on knowledge graph
CN113987200A (en) * 2021-10-19 2022-01-28 云南大学 Recommendation method, system, terminal and medium combining neural network with knowledge graph
WO2022057671A1 (en) * 2020-09-16 2022-03-24 浙江大学 Neural network–based knowledge graph inconsistency reasoning method
CN114491055A (en) * 2021-12-10 2022-05-13 浙江辰时科技集团有限公司 Recommendation algorithm based on knowledge graph

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190122111A1 (en) * 2017-10-24 2019-04-25 Nec Laboratories America, Inc. Adaptive Convolutional Neural Knowledge Graph Learning System Leveraging Entity Descriptions
CN110347847A (en) * 2019-07-22 2019-10-18 西南交通大学 Knowledge mapping complementing method neural network based
CN110489540A (en) * 2019-08-21 2019-11-22 合肥天源迪科信息技术有限公司 A kind of learning Content recommended method of knowledge based map

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190122111A1 (en) * 2017-10-24 2019-04-25 Nec Laboratories America, Inc. Adaptive Convolutional Neural Knowledge Graph Learning System Leveraging Entity Descriptions
CN110347847A (en) * 2019-07-22 2019-10-18 西南交通大学 Knowledge mapping complementing method neural network based
CN110489540A (en) * 2019-08-21 2019-11-22 合肥天源迪科信息技术有限公司 A kind of learning Content recommended method of knowledge based map

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738000A (en) * 2020-07-22 2020-10-02 腾讯科技(深圳)有限公司 Phrase recommendation method and related device
CN111738000B (en) * 2020-07-22 2020-11-24 腾讯科技(深圳)有限公司 Phrase recommendation method and related device
WO2022057671A1 (en) * 2020-09-16 2022-03-24 浙江大学 Neural network–based knowledge graph inconsistency reasoning method
CN112182131A (en) * 2020-09-28 2021-01-05 中国电子科技集团公司第五十四研究所 Remote sensing image recommendation method based on multi-attribute fusion
CN112487200A (en) * 2020-11-25 2021-03-12 吉林大学 Improved deep recommendation method containing multi-side information and multi-task learning
CN112395506A (en) * 2020-12-04 2021-02-23 上海帜讯信息技术股份有限公司 Information recommendation method and device, electronic equipment and storage medium
CN112765339A (en) * 2021-01-21 2021-05-07 山东师范大学 Personalized book recommendation method and system based on reinforcement learning
CN113326384A (en) * 2021-06-22 2021-08-31 四川大学 Construction method of interpretable recommendation model based on knowledge graph
CN113987200A (en) * 2021-10-19 2022-01-28 云南大学 Recommendation method, system, terminal and medium combining neural network with knowledge graph
CN113987200B (en) * 2021-10-19 2024-03-15 云南大学 Recommendation method, system, terminal and medium for combining neural network with knowledge graph
CN114491055A (en) * 2021-12-10 2022-05-13 浙江辰时科技集团有限公司 Recommendation algorithm based on knowledge graph

Also Published As

Publication number Publication date
CN111582509B (en) 2022-09-02

Similar Documents

Publication Publication Date Title
CN111582509B (en) Knowledge graph representation learning and neural network based collaborative recommendation method
Balsmeier et al. Machine learning and natural language processing on the patent corpus: Data, tools, and new measures
CN111858954B (en) Task-oriented text-generated image network model
Guan et al. Matrix factorization with rating completion: An enhanced SVD model for collaborative filtering recommender systems
Averkiou et al. Shapesynth: Parameterizing model collections for coupled shape exploration and synthesis
Sang et al. Context-dependent propagating-based video recommendation in multimodal heterogeneous information networks
CN110599592B (en) Three-dimensional indoor scene reconstruction method based on text
Zhang et al. Multimodal dialog system: Relational graph-based context-aware question understanding
Liu et al. Cross-attentional spatio-temporal semantic graph networks for video question answering
CN112256981A (en) Rumor detection method based on linear and nonlinear propagation
CN107193882A (en) Why not query answer methods based on figure matching on RDF data
CN112016002A (en) Mixed recommendation method integrating comment text level attention and time factors
CN115618128A (en) Collaborative filtering recommendation system and method based on graph attention neural network
Wei et al. I know what you want to express: sentence element inference by incorporating external knowledge base
Yang Research on music content recognition and recommendation technology based on deep learning
Chen et al. Deciphering the noisy landscape: Architectural conceptual design space interpretation using disentangled representation learning
CN105354339B (en) Content personalization providing method based on context
Tavares et al. Trace encoding in process mining: A survey and benchmarking
Wen et al. Visual background recommendation for dance performances using deep matrix factorization
CN116664253B (en) Project recommendation method based on generalized matrix decomposition and attention shielding
Barbon Jr et al. Trace encoding in process mining: a survey and benchmarking
Du et al. Employ Multimodal Machine Learning for Content Quality Analysis
Gupta et al. Machine Learning enabled models for YouTube Ranking Mechanism and Views Prediction
Xu et al. A collaborative filtering framework based on variational autoencoders and generative adversarial networks
Qin et al. Voice of the customer oriented new product synthesis over knowledge graphs

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 210000, 66 new model street, Gulou District, Jiangsu, Nanjing

Applicant after: NANJING University OF POSTS AND TELECOMMUNICATIONS

Address before: 210000 Jiangsu city of Nanjing province Ya Dong new Yuen Road No. 9

Applicant before: NANJING University OF POSTS AND TELECOMMUNICATIONS

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant