CN111582509B - Knowledge graph representation learning and neural network based collaborative recommendation method - Google Patents


Info

Publication number
CN111582509B
Authority
CN
China
Prior art keywords
knowledge
training
vector
kem
dimensional dense
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010378310.2A
Other languages
Chinese (zh)
Other versions
CN111582509A (en
Inventor
王攀
黄琛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202010378310.2A priority Critical patent/CN111582509B/en
Publication of CN111582509A publication Critical patent/CN111582509A/en
Application granted granted Critical
Publication of CN111582509B publication Critical patent/CN111582509B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/20Ensemble learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Animal Behavior & Ethology (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a collaborative recommendation method based on knowledge graph representation learning and a neural network. Items in a data set are mapped to public knowledge graph triples and input into the OpenKE framework as a training set for model training, where a knowledge graph representation learning method is selected for learning via parameter settings. The vector matrix E' corresponding to the entity set is mapped back to the individual items in input order, constructing for each item a corresponding low-dimensional dense feature vector I_kem. The positive-example low-dimensional dense feature vectors I_kem-pos and negative-example low-dimensional dense feature vectors I_kem-neg are read into the model, replacing the traditional vector layer operation, that is, the knowledge embedding vector layer produces the final output, after which the neural network training layer begins training.

Description

Knowledge graph representation learning and neural network based collaborative recommendation method
Technical Field
The invention provides a knowledge graph representation learning and neural network based collaborative recommendation method, and belongs to the technical field of deep learning and recommendation systems.
Background
Traditional recommendation systems rely on matrix-factorization collaborative filtering, which inevitably suffers from the cold start and data sparsity problems. Data sparsity arises because, on a large-scale e-commerce platform, the numbers of users and items are both very large while each user interacts with only a few items on average, so the user-item matrix is sparse. The cold start problem is the question of how to make personalized recommendations for new users for whom little user data exists. Sparsity ultimately prevents the model from capturing the relationships between different users and different items, reducing the accuracy of the recommendation system. A neural network can analyze objects and the relationships between them in higher dimensions, alleviating the data sparsity problem. The cold start problem, in turn, comes down to insufficient information dimensions in the data. A knowledge graph contains factual real-world relationships between objects, in effect providing an extra information dimension for the data to be trained in the model, and thus mitigates the cold start problem to a certain extent.
Disclosure of Invention
Purpose of the invention: to overcome the defects of the prior art, the invention provides a collaborative recommendation method based on knowledge graph representation learning and a neural network that addresses the sparse scoring matrix and cold start problems and improves the performance and accuracy of collaborative filtering recommendation.
Technical scheme: to achieve this purpose, the invention adopts the following technical scheme:
a knowledge graph representation learning and neural network based collaborative recommendation method comprises the following steps:
Step 1, acquire a data set, map the items in the data set to the public knowledge graph triple set K, and construct for the items the corresponding entity set E, relation set R, and training set S;
Step 2, input the constructed entity set E, relation set R, and training set S into the OpenKE framework for model training, where a knowledge graph representation learning method is selected for learning via parameter settings;
Step 3, output the vector matrix E' corresponding to the entity set during the training of step 2, map E' back to the individual items following the order in which the entity set E was input in step 2, and finally construct a corresponding low-dimensional dense feature vector I_kem for each item;
Step 4, apply positive and negative example selection to the low-dimensional dense feature vectors I_kem obtained in step 3 to generate positive-example low-dimensional dense feature vectors I_kem-pos and negative-example low-dimensional dense feature vectors I_kem-neg, then add a read-in module that reads I_kem-pos and I_kem-neg into the model, replacing the traditional vector layer operation, that is, the knowledge embedding vector layer produces the final output, after which the neural network training layer begins training.
Preferably, the following components: the reflection in step 3 refers to each low-dimensional dense feature vector I output after the training is finished kem Corresponding to the item numbers of each input entity set E in order.
Preferably, the following components: the knowledge representation learning method includes a TransE method, a TransR method, a TransH method and a TransD method.
Preferably: the knowledge embedding vector layer is a component that processes data offline.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a low-dimensional dense vector which is obtained by extracting the prior semantic data from a public knowledge base by using a knowledge graph structured representation learning method to enrich the information dimension of the prior data set, replaces the vectorization result of the traditional method with the low-dimensional dense vector which is obtained by representing the knowledge graph, and embeds the process into a collaborative filtering model of a neural network as a middle layer, thereby not only excavating the linear and nonlinear relations between users and projects, but also further integrating the knowledge relation of the projects, so that the neural network model can fully utilize a great deal of prior knowledge in the knowledge graph to excavate the relations between the projects, and further deeply excavate the interactive information between the users and the projects.
When using the neural network, the invention lets the matrix factorization module and the deep neural network module embed independently and then enter their respective hidden layers for computation. A knowledge graph embedding layer is added at the bottom of the deep neural network; in principle this layer operates offline. Items are represented and learned with the existing knowledge graph to generate dense low-dimensional vectors carrying knowledge, which are spliced into the collaborative deep model as a tributary. A neural network prediction layer is then added to fuse the outputs of the two hidden layers, finally producing a relatively accurate recommendation list.
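The two-branch fusion described above (an MF branch and a deep branch whose hidden outputs are merged by a prediction layer, in the spirit of neural collaborative filtering) can be sketched in a few lines of numpy. The dimensions and random weights below are illustrative stand-ins, not the trained model; in the proposed method the item vector would be the knowledge vector I_kem rather than a random initialization:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding size (the paper's experiments use 128)

# Hypothetical user and item vectors; item_kem stands in for I_kem.
user_vec = rng.normal(size=d)
item_kem = rng.normal(size=d)

# Matrix factorization branch: element-wise product of the two embeddings.
mf_out = user_vec * item_kem

# Deep branch: concatenation followed by one ReLU hidden layer.
W1 = rng.normal(size=(2 * d, d))
b1 = np.zeros(d)
mlp_out = np.maximum(0.0, np.concatenate([user_vec, item_kem]) @ W1 + b1)

# Prediction layer: fuse both hidden outputs into one matching score in (0, 1).
w_out = rng.normal(size=2 * d)
score = 1.0 / (1.0 + np.exp(-np.concatenate([mf_out, mlp_out]) @ w_out))
print(float(score))
```

Ranking the scores over all candidate items for a user would then yield the Top-K recommendation list.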
Drawings
Fig. 1 is a system architecture diagram representing a collaborative recommendation method for learning and neural networks based on a knowledge graph.
Fig. 2 is a flow chart diagram illustrating a collaborative recommendation method for learning and neural networks based on knowledge graph representation.
FIG. 3 shows the entity set, relation set, and triple set used for knowledge graph representation learning.
Fig. 4 shows 128-dimensional dense vectors generated by the TransH representation learning method.
FIG. 5 is a graph of the results of training on Movielens-1M using TransE and the original model.
Fig. 6 is a screenshot 1 of performance comparison experiments for different knowledge representation learning methods.
FIG. 7 is a screenshot 2 of performance comparison experiments for different knowledge representation learning methods.
Detailed Description
The present invention is further illustrated by the following description in conjunction with the accompanying drawings and the specific embodiments, it is to be understood that these examples are given solely for the purpose of illustration and are not intended as a definition of the limits of the invention, since various equivalent modifications will occur to those skilled in the art upon reading the present invention and fall within the limits of the appended claims.
A collaborative recommendation method based on knowledge graph representation learning and neural networks is shown in Figures 1 and 2. The goal is to estimate matching scores between users and items and then generate a personalized item recommendation list for each user from those scores. The framework consists of an input layer, a knowledge embedding vector layer, a vector layer, a neural network training layer, and an output layer. The core of the invention is the knowledge embedding vector layer, which can be pictured as a parallel component: it runs in parallel with the layers above while also serving as the first part of the model; in other words, it plays both roles. Its input is the item vector i from the implicit feedback data set, and its output is the knowledge low-dimensional dense feature vector i' obtained with a knowledge representation learning method (TransE, TransR, TransH, or TransD). Relative to the other parts, the knowledge embedding vector layer is in fact a component that processes data offline. Specifically, the original item number i is associated with the constructed external knowledge graph, and this association is ultimately reflected in the output low-dimensional dense feature vector carrying knowledge. The relevant symbols are defined as follows: a knowledge graph is denoted by the triple K = (E, R, S), where E is the set of all entities in the knowledge graph and R is the set of all relations,
and S ⊆ E × R × E is the set of triples in the knowledge graph. A specific triple is written (h, r, t), where h and t denote the head entity and the tail entity, respectively, and r denotes the relation between h and t.
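The (h, r, t) representation, together with the entity2id/relation2id numbering convention used by OpenKE-style input files, can be illustrated with a toy triple set (the triples themselves are made-up examples in the YAGO style):

```python
# Hypothetical miniature triple set in the (h, r, t) form described above.
triples = [
    ("Milos_Forman", "directed", "One_Flew_Over_the_Cuckoo's_Nest"),
    ("Jack_Nicholson", "actedIn", "One_Flew_Over_the_Cuckoo's_Nest"),
]

# Build entity2id / relation2id style numberings.
entities = sorted({e for h, _, t in triples for e in (h, t)})
relations = sorted({r for _, r, _ in triples})
entity2id = {e: i for i, e in enumerate(entities)}
relation2id = {r: i for i, r in enumerate(relations)}

# train2id: triples rewritten as (head_id, tail_id, relation_id) index rows.
train2id = [(entity2id[h], entity2id[t], relation2id[r]) for h, r, t in triples]
print(len(entity2id), len(relation2id), train2id)
```

These three index structures correspond to the entity2id, relation2id, and train2id files described in the experiments below.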
As introduced above, the purpose of knowledge graph representation learning is to learn, from the semantic information of the triple structures in the knowledge graph, vector representations of all the entities and relations of the triples in a low-dimensional continuous space. The technical scheme of the invention is divided into two parts: the first is the knowledge representation learning module, and the second is the knowledge embedding vector layer module.
First part, knowledge representation learning module: the items in the data set and the open knowledge graph triples are taken as the input of the knowledge representation learning module, whose processing consists of three steps.
1. First, obtain an open data set, map its items to the open knowledge graph triple set K, and construct for the items the corresponding entity set E, relation set R, and training set S; that is, build the triples related to each item entity in the data set.
2. The entity set E, relation set R, and training set S constructed in the previous step are input as training data into the OpenKE framework for model training (OpenKE is a TensorFlow-based project that provides an optimized and stable framework for knowledge graph embedding models). Different knowledge graph representation learning methods (TransE, TransR, TransH, TransD) can be selected for learning via parameter settings. The framework is mainly used to work with knowledge graph triples, and the invention makes use of one of its intermediate steps, namely generating low-dimensional dense vectors with different knowledge graph representation learning methods. In principle, the low-dimensional dense vectors corresponding to the items carry a certain amount of structural knowledge, rather than being simple vectorizations by sequence number.
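OpenKE performs this training internally; purely to illustrate what the selected method learns, here is a minimal numpy sketch of TransE's translation idea (h + r ≈ t for true triples, enforced with a margin loss against corrupted triples). The entity count, data, and hyperparameters are toy values, not those of the experiments:

```python
import numpy as np

rng = np.random.default_rng(42)
n_ent, d = 5, 16
E = rng.normal(scale=0.1, size=(n_ent, d))  # entity vectors (the matrix E')
R = rng.normal(scale=0.1, size=(1, d))      # relation vectors

def score(h, r, t):
    """TransE energy ||h + r - t||: low for plausible triples."""
    return float(np.linalg.norm(E[h] + R[r] - E[t]))

pos = (0, 0, 1)   # an observed triple (head, relation, tail)
neg = (0, 0, 3)   # corrupted copy with the tail replaced
margin, lr = 1.0, 0.1

for _ in range(200):
    if margin + score(*pos) - score(*neg) > 0:   # hinge loss is active
        h, r, t = pos
        g = (E[h] + R[r] - E[t]) / (score(h, r, t) + 1e-12)
        E[h] -= lr * g; R[r] -= lr * g; E[t] += lr * g        # pull pos together
        h2, r2, t2 = neg
        g2 = (E[h2] + R[r2] - E[t2]) / (score(h2, r2, t2) + 1e-12)
        E[h2] += lr * g2; R[r2] += lr * g2; E[t2] -= lr * g2  # push neg apart

print(score(*pos), score(*neg))
```

After training, the observed triple scores lower (better) than the corrupted one, which is the structural knowledge the dense vectors encode.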
3. During the training of the second step, the vector matrix E' corresponding to the entity set is output; we write these vectors to a text file in json format for separate processing. The matrix E' is then mapped back to the individual items following the order in which the entity set E was input to the model in the second step. To explain this mapping: each low-dimensional dense feature vector I_knowledge-embedding (hereinafter I_kem) output after training is associated, in order, with the item number of the corresponding entry in the input entity set E; this is a separate offline operation. Finally, a corresponding low-dimensional dense feature vector I_kem is constructed for each item.
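Assuming a json export in OpenKE's style (a json object whose entity vectors appear in the same order as the entities that were read in; the key name ent_embeddings and the values here are illustrative), the mapping back to item numbers is a simple positional join:

```python
import json

# Hypothetical json export: entity vectors listed in input order.
export = json.dumps({"ent_embeddings": [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]})

# Item numbers in the order they appeared in the entity set E.
item_ids = [12, 7, 31]

vectors = json.loads(export)["ent_embeddings"]
# "Map back": the i-th output vector belongs to the i-th input item.
I_kem = {item: vec for item, vec in zip(item_ids, vectors)}
print(I_kem[7])
```

The resulting dictionary is the per-item I_kem lookup used by the knowledge embedding vector layer.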
Second part, knowledge embedding vector layer module: since each item now has its corresponding low-dimensional dense feature vector I_kem, this step is equivalent to the vector layer operation in the model, except that the model's vector layer vectorizes items by the sequence numbers of the input training set to produce the final low-dimensional feature vectors, whereas the invention replaces that traditional operation with the knowledge-based low-dimensional dense feature vectors I_kem constructed in the first part. This component, called the knowledge embedding vector layer module, mainly completes the logic of integrating positive and negative example selection into the model. Because the generation of I_kem in the first part is an independent offline operation, the second part first applies positive/negative example selection to I_kem to generate I_kem-pos and I_kem-neg, then adds a read-in module that reads I_kem-pos and I_kem-neg into the model, replacing the traditional vector layer operation; that is, the knowledge embedding vector layer produces the final output, which is fed into the neural network training layer for training.
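A minimal sketch of the positive/negative selection step under the usual implicit-feedback convention (observed items are positives, negatives are sampled from the unobserved items); all names and data are hypothetical:

```python
import random

random.seed(0)
all_items = set(range(10))
interactions = {1: {2, 5, 7}}               # user 1 interacted with items 2, 5, 7
I_kem = {i: [float(i)] for i in all_items}  # stand-in knowledge vectors

def select_examples(user, n_neg=2):
    """Return (I_kem_pos, I_kem_neg): vectors for observed items and for
    n_neg randomly sampled unobserved items per positive."""
    pos_items = interactions[user]
    neg_pool = list(all_items - pos_items)
    neg_items = random.sample(neg_pool, n_neg * len(pos_items))
    return [I_kem[i] for i in pos_items], [I_kem[i] for i in neg_items]

I_kem_pos, I_kem_neg = select_examples(1)
print(len(I_kem_pos), len(I_kem_neg))
```

The read-in module would then feed both lists into the model in place of the traditional sequence-number embedding lookup.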
As described in the background, traditional matrix-factorization collaborative filtering inevitably suffers from the cold start and data sparsity problems. Implicit feedback refers to implicit expressions of preference: a user's preferences can be obtained in many ways beyond explicit expression, which enriches the user-item matrix and thus alleviates the data sparsity problem. The neural network analyzes objects and their relationships in higher dimensions, further improving on sparsity, while the knowledge graph supplies the additional information dimension that mitigates the cold start problem to a certain extent.
2. The experimental process comprises the following steps:
we will use a real public movie dataset Movielens-1M to evaluate the performance results of this method. The experimental result is obviously improved on the recommendation index.
a. Experimental Environment
We implemented this algorithm on CentOS Linux distribution 7.2.1511 using GeForce GTX 1080. The training frame was Keras 2.0.0, the back end was TensorFlow-gpu 1.4.0, and the corresponding CUDA and CuDNN versions were 8.0 and 6.0, respectively.
b. Description of data sets
We used the Movielens-1M dataset, containing 1,000,209 rating records from 6,040 users over 3,883 movies. Each data record describes a user's rating of a movie, on a scale increasing from 1 to 5. Since our experiments use implicit feedback, all rated records are treated as "like" and unrated ones as "dislike". We use the open knowledge graph YAGO, which is composed of triples <h, r, t>. We map each movie entity in Movielens-1M to the head and tail entities of triples in YAGO to construct triples, and finally extract 5 movie-related relations: <wroteMusicFor>, <directed>, <created>, <actedIn>, <edited>, forming 43,847 triples in total, which contain 3,221 movie entities.
Since some movies in Movielens-1M cannot be associated with a YAGO entity, as shown in fig. 1, we use the movie genres provided in the original Movielens-1M dataset and construct YAGO-like triples for those movies with a "genre" attribute, such as <One Flew Over the Cuckoo's Nest, genre, Drama>, adding a new relation: <genre>.
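The genre fallback can be sketched as follows, assuming the `::`-separated line format of the Movielens-1M movies file with `|`-separated genres; the rows shown are illustrative:

```python
# Movies with no YAGO match fall back to triples built from the
# Movielens genre field, using the new <genre> relation.
movielens_rows = [
    "1::Toy Story (1995)::Animation|Children's|Comedy",
    "1193::One Flew Over the Cuckoo's Nest (1975)::Drama",
]

triples = []
for row in movielens_rows:
    movie_id, title, genres = row.split("::")
    for g in genres.split("|"):
        triples.append((title, "genre", g))

print(len(triples))
print(triples[-1])
```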
Table 1: description of data sets
Movielens-1M contains 1,000,209 anonymous ratings of 3,883 movies made by 6,040 users and is the main experimental dataset of our experiments. First, to keep the dataset as balanced as possible, we filter out users with fewer than 10 ratings and movies with fewer than 10 ratings. We then sort each user's interactions by timestamp and use the last interaction as the test set and the second-to-last as the validation set; each contains 6,015 records. The basic statistics of the dataset are given in the table, where the Dataset column shows the training data actually used for training.
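The leave-one-out split described above can be sketched with toy (user, item, timestamp) records, holding out each user's last interaction for testing and the second-to-last for validation:

```python
# Hypothetical (user, item, timestamp) records.
records = [
    (1, 10, 100), (1, 11, 200), (1, 12, 300), (1, 13, 400),
    (2, 20, 50),  (2, 21, 150), (2, 22, 250),
]

train_set, valid_set, test_set = [], [], []
by_user = {}
for u, i, ts in records:
    by_user.setdefault(u, []).append((ts, i))

for u, hist in by_user.items():
    hist.sort()                          # order interactions by timestamp
    items = [i for _, i in hist]
    test_set.append((u, items[-1]))      # most recent interaction
    valid_set.append((u, items[-2]))     # second most recent
    train_set.extend((u, i) for i in items[:-2])

print(sorted(test_set), sorted(valid_set), len(train_set))
```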
Fig. 3 shows the entity set, relation set, and triple set used for knowledge graph representation learning. These are the three training files we constructed from the Movielens-1M and YAGO datasets solely for representation learning. The first file, entity2id, numbers the 25,360 movie entities we extracted from Movielens-1M, renumbering each occurrence. The second file, relation2id, numbers the 6 relation words we use from the YAGO knowledge graph. The third file, train2id, contains 49,791 groups built from the renumbered movie entities, the knowledge graph relation entries, and the user rating table in the original Movielens-1M dataset; the re-matched new rating table is fed into the model as the training set for knowledge graph representation learning.
After knowledge graph representation learning is completed with OpenKE, only the dense vectors generated during training need to be extracted. Fig. 4 shows 128-dimensional dense vectors generated by the TransH representation learning method; due to space limitations only a small portion is shown as a sample. The 128-dimensional dense knowledge vectors in each group of parentheses represent the movie entities input in order.
c. Evaluation index
1) Hit Ratio (HR): in Top-K recommendation, HR is a commonly used recall metric. The denominator |GT| is the size of the whole test set, and the numerator NumberOfHits@K is the total number of test items that appear in each user's Top-K list. The calculation formula is:
HR@K = NumberOfHits@K / |GT|
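Under the definitions above, and with a single held-out test item per user as in the leave-one-out protocol, HR@K can be computed as:

```python
def hit_ratio_at_k(top_k_lists, test_items):
    """HR@K = NumberOfHits@K / |GT| over per-user Top-K lists."""
    hits = sum(1 for u, items in top_k_lists.items() if test_items[u] in items)
    return hits / len(test_items)

# Two users; user 1's held-out item appears in their Top-3 list, user 2's does not.
top_k = {1: [5, 9, 2], 2: [4, 8, 6]}
held_out = {1: 9, 2: 7}
print(hit_ratio_at_k(top_k, held_out))  # 0.5
```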
2) Normalized Discounted Cumulative Gain (NDCG): this score evaluates ranking performance by considering the position of the correct item; it assumes the returned results are ranked by relevance, with the most relevant results placed at the top. The NDCG@K for each user is calculated as:
NDCG@K = Z_K * Σ_{i=1}^{K} (2^{rel_i} − 1) / log2(i + 1)
z is the normalized operation, rel i The correlation of the recommendation result at position i is indicated, and k indicates the size of the recommendation list. The value of NDCG is between (0, 1)]. From the above equation it follows: 1) the greater the correlation of the recommendation, the greater the NDCG. 2) If the list is ranked in front of the list with good correlation, the better the recommendation, the larger the NDCG.
d. Results of the experiment
1. The experimental results are shown in Fig. 5 and Tables 2-1, 2-2, and 2-3, comparing the model using the representation learning method (TransE) against the original model without it.
Table 2-1: and (3) iterating the MF + TransE and the MF at Movielens-1M for 1-50 rounds of training loss.
Tables 2 to 2: HR @10 of MF + TransE and MF in Movielens-1M iteration for 1-50 rounds
Tables 2 to 3: NDCG @10 of MF + TransE and MF in Movielens-1M iteration for 1-50 rounds
2. Performance comparison experiments were performed using different knowledge representation learning methods TransE, TransR, TransD, TransH, with the vector dimension set to 128 dimensions, and the experimental results are shown in FIGS. 6 and 7.
Multiple groups of comparison experiments demonstrate that introducing the knowledge graph and fusing its rich relational knowledge with the implicit-feedback rating matrix effectively addresses the sparse rating matrix and cold start problems and enhances the performance and accuracy of collaborative filtering recommendation. The experimental results show that, compared with learning user and item features separately, the proposed method achieves better recommendation performance.
The invention uses a knowledge graph structured representation learning method to extract existing semantic data from a public knowledge base to enrich the information dimensions of the existing dataset, and therefore proposes embedding the low-dimensional dense vectors learned by knowledge graph representation into a neural-network collaborative filtering model as an intermediate layer. Multiple groups of comparison experiments on the public dataset show that the proposed method genuinely and effectively improves the accuracy of the recommendation system.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.

Claims (4)

1. A knowledge graph representation learning and neural network based collaborative recommendation method is characterized by comprising the following steps:
step 1, acquiring a data set, mapping items in the data set to a public knowledge map triple K, and respectively constructing an entity set E, a relation set R and a training set S corresponding to the items;
the data set is the movie data set Movielens-1M; each data record of the public movie data set Movielens-1M describes a user's rating of a movie, on a scale increasing from 1 to 5; the public knowledge graph YAGO is used, which is formed of triples <h, r, t>, and each movie entity in the movie data set Movielens-1M is mapped to the head and tail entities of triples in the public knowledge graph YAGO to construct triples;
step 2, inputting the constructed entity set E, the relationship set R and the training set S as training sets into an OpenKE framework for model training, wherein a knowledge graph representation learning method is selected for learning in a parameter setting mode;
outputting the vector matrix E' corresponding to the entity set during training, mapping E' back to the individual items following the input order of the entity set E, and finally constructing a corresponding low-dimensional dense feature vector I_kem for each item;
generating 128-dimensional dense vectors with the TransH representation learning method, where the 128-dimensional dense knowledge vectors represent the movie entities input in order;
step 3, knowledge embeddingThe vector layer is used for enabling the low-dimensional dense feature vector I obtained in the step 2 kem Positive and negative case selection processing is carried out to complete the logic of integrating and embedding the positive and negative case selection into the model, and a positive case low-dimensional dense feature vector I is generated kem-pos Sum-case low-dimensional dense feature vector I kem-neg Then adding a read-in module to read the positive low-dimensional dense feature vector I kem-pos Sum-case low-dimensional dense feature vector I kem-neg And reading in the model, replacing the traditional vector layer operation, namely embedding the knowledge into the vector layer for final output, and then starting training by the neural network training layer.
2. The knowledge graph representation learning and neural network-based collaborative recommendation method of claim 1, wherein: the mapping back in step 2 means that each low-dimensional dense feature vector I_kem output after training is associated, in order, with the item number of the corresponding entry in the input entity set E.
3. The knowledge graph representation learning and neural network-based collaborative recommendation method of claim 1, wherein: the knowledge representation learning method includes a TransE method, a TransR method, a TransH method, and a TransD method.
4. The knowledge-graph-based collaborative recommendation method for representing learning and neural networks according to claim 1, wherein: the knowledge embedding vector layer is a component that processes data offline.
CN202010378310.2A 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method Active CN111582509B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010378310.2A CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010378310.2A CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Publications (2)

Publication Number Publication Date
CN111582509A CN111582509A (en) 2020-08-25
CN111582509B true CN111582509B (en) 2022-09-02

Family

ID=72126366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010378310.2A Active CN111582509B (en) 2020-05-07 2020-05-07 Knowledge graph representation learning and neural network based collaborative recommendation method

Country Status (1)

Country Link
CN (1) CN111582509B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738000B (en) * 2020-07-22 2020-11-24 腾讯科技(深圳)有限公司 Phrase recommendation method and related device
CN112100403A (en) * 2020-09-16 2020-12-18 浙江大学 Knowledge graph inconsistency reasoning method based on neural network
CN112182131B (en) * 2020-09-28 2021-11-09 中国电子科技集团公司第五十四研究所 Remote sensing image recommendation method based on multi-attribute fusion
CN112487200B (en) * 2020-11-25 2022-06-07 吉林大学 Improved deep recommendation method containing multi-side information and multi-task learning
CN112395506A (en) * 2020-12-04 2021-02-23 上海帜讯信息技术股份有限公司 Information recommendation method and device, electronic equipment and storage medium
CN112765339B (en) * 2021-01-21 2022-10-04 山东师范大学 Personalized book recommendation method and system based on reinforcement learning
CN113326384A (en) * 2021-06-22 2021-08-31 四川大学 Construction method of interpretable recommendation model based on knowledge graph
CN113987200B (en) * 2021-10-19 2024-03-15 云南大学 Recommendation method, system, terminal and medium for combining neural network with knowledge graph
CN114491055B (en) * 2021-12-10 2022-11-08 浙江辰时科技集团有限公司 Recommendation method based on knowledge graph

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190122111A1 (en) * 2017-10-24 2019-04-25 Nec Laboratories America, Inc. Adaptive Convolutional Neural Knowledge Graph Learning System Leveraging Entity Descriptions
CN110347847A (en) * 2019-07-22 2019-10-18 西南交通大学 Knowledge mapping complementing method neural network based
CN110489540A (en) * 2019-08-21 2019-11-22 合肥天源迪科信息技术有限公司 A kind of learning Content recommended method of knowledge based map

Similar Documents

Publication Publication Date Title
CN111582509B (en) Knowledge graph representation learning and neural network based collaborative recommendation method
Balsmeier et al. Machine learning and natural language processing on the patent corpus: Data, tools, and new measures
CN111858954B (en) Task-oriented text-generated image network model
CN110874439B (en) Recommendation method based on comment information
CN110599592B (en) Three-dimensional indoor scene reconstruction method based on text
CN112256981B (en) Rumor detection method based on linear and nonlinear propagation
CN113592609B (en) Personalized clothing collocation recommendation method and system utilizing time factors
Zhang et al. Multimodal dialog system: Relational graph-based context-aware question understanding
Liu et al. Cross-attentional spatio-temporal semantic graph networks for video question answering
CN112016002A (en) Mixed recommendation method integrating comment text level attention and time factors
CN112464100B (en) Information recommendation model training method, information recommendation method, device and equipment
CN114461907B (en) Knowledge graph-based multi-element environment perception recommendation method and system
CN115630153A (en) Research student literature resource recommendation method based on big data technology
CN115840853A (en) Course recommendation system based on knowledge graph and attention network
Yang [Retracted] Research on Music Content Recognition and Recommendation Technology Based on Deep Learning
Wei et al. I know what you want to express: sentence element inference by incorporating external knowledge base
CN114417161A (en) Virtual article time sequence recommendation method, device, medium and equipment based on special-purpose map
CN118013134A (en) Enhanced social recommendation method and model based on score deviation offset
CN116664253B (en) Project recommendation method based on generalized matrix decomposition and attention shielding
Gupta et al. Machine learning enabled models for YouTube ranking mechanism and views prediction
Xu et al. A collaborative filtering framework based on variational autoencoders and generative adversarial networks
Zhou et al. Multi-modal multi-hop interaction network for dialogue response generation
CN116304372A (en) Collaborative knowledge graph network recommendation method integrating groups
Qin et al. Voice of the customer oriented new product synthesis over knowledge graphs
Lyu et al. Color matching generation algorithm for animation characters based on convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 210000, No. 66 New Model Street, Gulou District, Nanjing, Jiangsu

Applicant after: Nanjing University of Posts and Telecommunications

Address before: 210000 Jiangsu city of Nanjing province Ya Dong new Yuen Road No. 9

Applicant before: Nanjing University of Posts and Telecommunications

GR01 Patent grant