CN112686736B - System recommendation method - Google Patents
- Publication number
- CN112686736B (application CN202110027315.5A)
- Authority
- CN
- China
- Prior art keywords
- user
- network
- representing
- commodity
- vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention discloses a system recommendation method comprising the following specific processing steps: 101) a data relation association step, 102) a user relation data fusion step, 103) a commodity data fusion step, and 104) a recommendation score prediction step. The invention provides a system recommendation method that performs representation learning on nodes through a heterogeneous network, enhancing node feature representation and improving the capacity to process sparse data.
Description
Technical Field
The invention relates to the technical field of recommendation, in particular to a system recommendation method.
Background
With the development of the internet and big-data technology, an age of information explosion has arrived, and all kinds of information are presented to people. As a technology that helps people select useful information, recommendation systems have appeared in fields such as medical treatment, business, and education, and have shown strong capability.
With the development of the times, network data have become more and more complex: the data in a network are no longer composed of a single type of element, and large amounts of data of different types form various network graphs, while traditional recommendation algorithms can only process regular topological graphs composed of a single data type. The Heterogeneous Information Network (HIN) arose in response. Heterogeneous information networks have been applied to the recommendation field as a powerful modeling method. A heterogeneous information network is a special type of network that contains many different types of nodes connected by different types of edges, which represent different relationships between network nodes. Integrating heterogeneous information networks into recommendation systems has greatly accelerated the development of the field.
In recent years, the graph neural network has also been widely applied to recommendation systems; its role is to fuse the neighbor information of a node to express more of the node's feature information. However, the graph neural network has its own shortcomings: the GraphRec model fuses only a node's first-order neighbors and cannot extract deeper node relationships. If the data set is sparse, a node has relatively few neighbor nodes, its feature information cannot be fully fused, and prediction inaccuracy therefore increases.
Disclosure of Invention
The invention solves the above technical problems by providing a system recommendation method that performs representation learning on nodes through a heterogeneous network, enhances node feature representation, and improves the capacity to process sparse data.
The technical scheme of the invention is as follows:
a system recommendation method comprises the following specific processing steps:
101) data relation association step: U = {u1, u2, ..., un} denotes the user data set, V = {v1, v2, ..., vm} denotes the commodity data set, and P denotes the meta-paths between data, where A represents a user or a commodity and R represents a social relationship or a buy-and-sell relationship; the meta-path is generated as follows:
wherein nt is the current node and nt+1 is the next node, drawn from the neighbors of v that belong to type At+1; ρ represents the meta-path rule, and the length of the meta-path is set to L; φ(x) represents the type of the next node on the path, which belongs to the node types of the network; v and x represent the current node and the next node, respectively, and belong to the nodes of the network;
the meta-path of the user-commodity interaction relationship generates vector representations for the user and the commodity, the meta-path of the user social relationship generates a user vector representation, and er denotes the score vector;
102) user relationship data fusion step: the user data set comprises the user-item interaction network u_v and the user social network u_u; u_v is combined with the attention network to generate corresponding weights, yielding the implicit vector representation of the user in the interaction network, specifically given by the following formula:
wherein the pictured symbol represents the implicit vector of user i in the interaction network; σ is an activation function; w and b are neural-network parameters; C represents all scores the user has given to commodities; Ni denotes the neighbors of node i; softmax denotes the softmax function; xjr represents the fusion of the vector of commodity j with its corresponding score r; gτ is a multi-layer fully connected neural network; the pictured operator denotes vector concatenation; and w1, w2, b1, b2 represent the parameters of the two fully connected layers of the attention network;
generating corresponding weight by combining u _ u of the user social network with the attention network to obtain implicit vector representation of the user in the social network, wherein the implicit vector representation is specifically represented by the following formula:
wherein the pictured symbol represents the implicit vector of user i in the social network, and i′ denotes a neighbor user;
the user vector formula that ultimately includes the user project interaction network and the user social network is as follows:
hi represents the final user vector, ln−1 indicates the number of network layers, and wn, bn are the parameters w and b of the multi-layer neural network;
103) commodity data fusion step: the deep relationships of the commodity in the network are collected by means of meta-paths, and the feature vector of the commodity is obtained by fusing the features of its neighbor users through the GNN; the specific formula is as follows:
104) recommendation score prediction step: the user's score for the commodity is predicted from the user vectors and commodity vectors obtained in steps 101) to 103); the concatenation of the user vector and the commodity vector is taken as the input of a fully connected network and the predicted score as its output, i.e., a multi-layer fully connected neural network yields the final predicted score; the specific formula is as follows:
Further, the specific process of implicit vector representation of the user in the user interaction network in step 102) is as follows:
where aggregate denotes the fusion function, xjr represents the fusion of the vector of commodity j with its corresponding score r, and C represents all scores the user has given to commodities; the formula for xjr is as follows:
the Aggregation fusion function adopts a fully-connected neural network, and the concrete formula is as follows:
σ is the activation function, w and b are neural-network parameters, and αji is the weight of commodity j for user i; since each commodity matters to each user to a different degree, the weights are trained through an Attention network; the specific formula is as follows:
Further, the implicit vector of the user in the user social network in step 102) is obtained by fusing the vectors of the user's neighbors; the weight-training process combined with the Attention network is as follows:
wherein βi′i is the weight index of neighbor i′ for user i; the specific formula is as follows:
Further, the method also comprises 105) a parameter adjusting step: the parameters of the recommendation model established in steps 101) to 104) are adjusted; specifically, an objective function is defined and optimized so that the parameters of the recommendation model are updated by back-propagation; the Adam method is selected for the adjustment and optimization, and the loss formula for the adjustment is as follows:
wherein the pictured symbol is the predicted score of user i for commodity j, yij is the true score, and n + m is the total number of users and commodities.
Compared with the prior art, the invention has the advantages that:
the invention has the advantage of processing data with increased sparsity. The scheme model introduces a method for generating vectors by heterogeneous network element paths on the basis of a graph neural network, and learns the feature representation of the enhanced node through the heterogeneous network representation. Tests are carried out on the two public data sets, the results show that the model is superior to other models, and the model is proved to be more competitive in solving the sparse problem by changing the sparsity of the data sets. At present, a large number of data sets on a network contain a plurality of node attribute elements, and the attributes are fully utilized to effectively help a model and accurate prediction.
Drawings
FIG. 1 is a frame diagram of the present invention;
FIG. 2 is a user merchandise interaction data diagram of the present invention;
FIG. 3 is a diagram of a user's social network of the present invention;
FIG. 4 is a first set of learning rate experimental statistics of the present invention;
FIG. 5 is a second set of learning rate experimental statistics of the present invention;
FIG. 6 is a comparison graph of the ciao data set embedding-dimension experiment index RMSE of the present invention;
FIG. 7 is a comparison graph of the ciao data set embedding dimension experiment index MAE of the present invention.
Detailed Description
The present invention is further described in the following detailed description in conjunction with the drawings, and portions not described or illustrated in detail herein can be implemented using conventional technology.
As shown in fig. 1 to 7, a system recommendation method includes the following specific processing steps:
101) data relation association step: from U ═ U1,u2,...,unDenotes a user data set, and n denotes the number of users. V ═ V1,v2,...,vmDenotes a commodity data set, and m denotes the number of commodities. P represents a meta-path between data, the specific meta-path beingWherein, A represents the entity in the network diagram, namely the user or the commodity, and R represents the interactive relationship between the entities, namely the social relationship or the buying and selling relationship;T∈Rn*mfor the user commodity interaction matrix, rijRepresents the grade given by the user i to the commodity j, and r is given if the user does not mark the gradeijThe score may be regarded as 0, and the higher the score is, the more the user likes the product. One user can score multiple items, as can one item scored by multiple users. G is belonged to Rn*nA social matrix for the user. And delta-R represents a network structure diagram, and is divided into a user commodity interaction diagram u _ v and a user social relationship diagram u _ u. The meta path is generated as follows:
wherein nt is the current node and nt+1 is the next node, drawn from the neighbors of v that belong to type At+1. ρ represents the meta-path rule, and the length of the meta-path is set to L. φ(x) represents the type of the next node on the path, which belongs to the node types of the network; v and x represent the current node and the next node, respectively, and belong to the nodes of the network.
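The meta-path generation rule described above (the formula itself survives only as an image) can be sketched as a type-constrained random walk. This is a minimal illustration; the graph, node names, and type pattern below are hypothetical, not data from the patent:

```python
import random

def metapath_walk(graph, node_types, start, pattern, length):
    """Generate one meta-path instance of at most `length` nodes.

    graph: dict mapping each node to its list of neighbors
    node_types: dict mapping each node to its type label ('u' or 'v')
    pattern: cyclic type sequence the walk must follow, e.g. ('u', 'v')
    """
    walk = [start]
    for t in range(length - 1):
        want = pattern[(t + 1) % len(pattern)]  # required type A_{t+1} of the next node
        candidates = [n for n in graph[walk[-1]] if node_types[n] == want]
        if not candidates:  # no neighbor of the required type: stop the walk early
            break
        walk.append(random.choice(candidates))
    return walk

# demo on a tiny user(u)/commodity(v) bipartite graph (hypothetical data)
graph = {"u1": ["v1", "v2"], "u2": ["v1"], "v1": ["u1", "u2"], "v2": ["u1"]}
types = {"u1": "u", "u2": "u", "v1": "v", "v2": "v"}
walk = metapath_walk(graph, types, "u1", ("u", "v"), 5)
```

Each generated walk alternates user and commodity nodes, matching the u1 → v1 → u3 → v2 → u4 example given later in the description.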
Then a large number of path instances are generated in the above manner, and the classical model word2vec generates the vector representation of each node; the specific formula is as follows:
wherein context and NEG respectively denote the positive- and negative-sample sets of sample w. The window size is related to the number of positive samples: the window size minus one equals the number of positive samples. The positive samples of w are a certain number of nodes selected from the path instance, determined by the path instance and the window size, and all contained in the path instance; the negative-sample set consists of nodes selected outside the positive samples, which may or may not lie on the path. The window size determines how many nodes are associated with a node: the larger the window, the more associated nodes are selected. For example, for the path instance u1 → v1 → u2 → v2 → u3 → v3 → u4 → v4 with window size 3, the positive samples of node v3 are u3 and u4, i.e., 2 of them.
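Window-based context extraction over a path instance can be sketched as follows. Note this uses the common symmetric skip-gram window; the patent's own example (window 3 yielding exactly window − 1 positives for v3) suggests a filtered variant whose exact rule is not recoverable, so this sketch is an assumption:

```python
def window_contexts(walk, window):
    """Collect positive (center, context) pairs within a symmetric window
    over one path instance, skip-gram style."""
    pairs = []
    for i, center in enumerate(walk):
        lo, hi = max(0, i - window), min(len(walk), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # a node is not its own context
                pairs.append((center, walk[j]))
    return pairs

# demo on a short hypothetical walk with window size 1
pairs = window_contexts(["u1", "v1", "u2", "v2"], 1)
```

Negative samples would then be drawn from nodes outside each center's positive set, as the description states.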
The meta-path of the user-commodity interaction relationship generates vector representations for the user and the commodity, the meta-path of the user social relationship generates a user vector representation, and er denotes the score vector.
Specifically, as shown in FIG. 2, one of the meta-paths is u1 → v1 → u3 → v2 → u4, with path length L = 5. As shown in FIG. 3, one of the meta-paths is u2 → u4 → u3 → u1 → u6, with path length L = 5. Different meta-paths are generated for different network graphs and treated as sentences; word2vec then generates a feature vector for each node, thereby capturing the multi-order neighbor relationships of each node and overcoming the inability of the graph neural network to fuse high-order neighbor relationships.
102) user relationship data fusion step: the user data set comprises the user-item interaction network u_v and the user social network u_u. u_v is combined with the attention network to generate corresponding weights, yielding the implicit vector representation of the user in the interaction network. That is, the user's hidden vector is represented by the neighbor commodities and the user's scores for them, corresponding weights are generated through the attention network, and finally all hidden vectors are fused to obtain the implicit vector representation of the user in the interaction network.
The following formula is shown:
wherein the pictured symbol represents the implicit vector of user i in the interaction network; σ is an activation function; w and b are neural-network parameters; C represents all scores the user has given to commodities; Ni denotes the neighbors of node i; softmax denotes the softmax function; xjr represents the fusion of the vector of commodity j with its corresponding score r; gτ is a multi-layer fully connected neural network; the pictured operator denotes vector concatenation; and w1, w2, b1, b2 represent the parameters of the two fully connected layers of the attention network, i.e., the attention network comprises two fully connected layers.
The specific process of implicit vector representation of the user in the user interaction network is as follows:
where aggregate denotes the fusion function, xjr represents the fusion of the vector of commodity j with its corresponding score r, and C represents all scores the user has given to commodities; the formula for xjr is as follows:
the Aggregation fusion function adopts a fully-connected neural network, and the concrete formula is as follows:
σ is the activation function, w and b are neural-network parameters, and αji is the weight of commodity j for user i; since each commodity matters to each user to a different degree, the weights are trained through an Attention network using two neural-network layers; the specific formula is as follows:
In combination with the above, the user vector is used in the weight calculation, and the final result can be expressed as formula (2):
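The two-layer attention weighting described above (its formulas appear only as images in the original) can be sketched with numpy. The ReLU activation, dimensions, and random parameters are assumptions for illustration only:

```python
import numpy as np

def attention_weights(item_vecs, user_vec, W1, b1, w2, b2):
    """Score each fused item vector x_jr against the user vector with two
    fully connected layers, then softmax-normalize into weights alpha_ji."""
    scores = []
    for x in item_vecs:
        z = np.concatenate([x, user_vec])      # vector concatenation
        h = np.maximum(0.0, W1 @ z + b1)       # first attention layer (ReLU assumed)
        scores.append(float(w2 @ h + b2))      # second layer -> scalar score
    s = np.exp(np.array(scores) - max(scores))  # numerically stable softmax
    return s / s.sum()

# demo with hypothetical random vectors and parameters
rng = np.random.default_rng(0)
d = 4
items = [rng.normal(size=d) for _ in range(3)]   # x_jr for three neighbor commodities
user = rng.normal(size=d)
alpha = attention_weights(items, user,
                          rng.normal(size=(8, 2 * d)), np.zeros(8),
                          rng.normal(size=8), 0.0)
```

The resulting weights are positive and sum to one, so the subsequent aggregation is a convex combination of the neighbor-commodity vectors.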
generating corresponding weight by combining u _ u of the social network of the user with the attention network to obtain implicit vector representation of the user in the social network, wherein the implicit vector representation is specifically represented by the following formula:
wherein the pictured symbol represents the implicit vector of user i in the social network, and i′ denotes a neighbor user;
The implicit vector of the user in the social network is obtained by fusing the vectors of the user's neighbors; the weight-training process combined with the Attention network is the following formula:
wherein βi′i is the weight index of neighbor i′ for user i; the specific formula is as follows:
When the weight index is obtained, the vector representations of the user and its neighbors in the social network are used, so the final formula is formula (3):
Finally, the user-commodity interaction relationships are aggregated in u_v and the user-user social relationships are aggregated in u_u, and the user vectors from the user-item interaction network and the user social network are fused to generate the final user vector. The specific formula is as follows:
hi represents the final user vector, ln−1 indicates the number of network layers, and wn, bn are the parameters w and b of the neural networks;
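The final fusion of the two user views can be sketched as stacked fully connected layers over the concatenated vectors. The layer count, ReLU activation, and dimensions are assumptions, since the patent's fusion formula is an image:

```python
import numpy as np

def fuse_user(h_inter, h_social, layers):
    """Final user vector h_i: concatenate the interaction-network and
    social-network implicit vectors, then apply stacked FC layers
    h = sigma(W_n h + b_n)."""
    h = np.concatenate([h_inter, h_social])
    for W, b in layers:
        h = np.maximum(0.0, W @ h + b)  # ReLU as the per-layer activation (assumed)
    return h

# demo with hypothetical dimensions and random parameters
rng = np.random.default_rng(1)
d = 4
layers = [(rng.normal(size=(d, 2 * d)), np.zeros(d)),
          (rng.normal(size=(d, d)), np.zeros(d))]
h_i = fuse_user(rng.normal(size=d), rng.normal(size=d), layers)
```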
103) commodity data fusion step: the deep relationships of the commodity in the network are collected by means of meta-paths, and the feature vector of the commodity is obtained by fusing the features of its neighbor users through the GNN; the specific formula is as follows:
wherein the pictured symbol represents the implicit vector of commodity j in the interaction network; the other parameters are as defined above.
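The commodity side mirrors the user side: neighbor-user vectors are weighted, summed, and passed through a fully connected layer. A minimal sketch, with weights, activation, and dimensions assumed for illustration:

```python
import numpy as np

def item_vector(neighbor_user_vecs, weights, W, b):
    """Commodity feature vector: weighted aggregation of neighbor-user
    vectors followed by one FC layer with activation."""
    agg = np.zeros_like(neighbor_user_vecs[0])
    for w_i, u in zip(weights, neighbor_user_vecs):
        agg += w_i * u                      # attention-weighted neighbor sum
    return np.maximum(0.0, W @ agg + b)     # ReLU activation (assumed)

# demo: two hypothetical neighbor users with fixed weights
rng = np.random.default_rng(2)
users = [rng.normal(size=4) for _ in range(2)]
z_j = item_vector(users, [0.6, 0.4], rng.normal(size=(4, 4)), np.zeros(4))
```

In the full model the weights would come from the same Attention mechanism used on the user side rather than being fixed constants.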
104) recommendation score prediction step: the user vectors and commodity vectors obtained in steps 101) to 103) are used to predict the user's score for the commodity. The concatenation of the user vector and the commodity vector is fed as input to a fully connected network, the predicted score is its output, and a multi-layer (here, three-layer) fully connected neural network yields the final predicted score. The specific formula is as follows:
wherein the pictured symbol is the predicted score. The process can be described as follows: the final vector representations of the user and the commodity are first concatenated, then passed through a multi-layer fully connected neural network with an activation function at each layer, and the predicted value is finally output.
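The three-layer predictor just described can be sketched as follows; dimensions, ReLU activation, and the linear output layer are assumptions, since the prediction formula is an image:

```python
import numpy as np

def predict_score(user_vec, item_vec, hidden_layers, w_out, b_out):
    """Concatenate the final user and commodity vectors and pass them
    through stacked FC layers (three per the description) to a scalar."""
    h = np.concatenate([user_vec, item_vec])
    for W, b in hidden_layers:
        h = np.maximum(0.0, W @ h + b)   # activation at every hidden layer
    return float(w_out @ h + b_out)      # scalar predicted score

# demo with hypothetical random parameters
rng = np.random.default_rng(3)
d = 4
hidden = [(rng.normal(size=(d, 2 * d)), np.zeros(d)),
          (rng.normal(size=(d, d)), np.zeros(d)),
          (rng.normal(size=(d, d)), np.zeros(d))]
score = predict_score(rng.normal(size=d), rng.normal(size=d),
                      hidden, rng.normal(size=d), 0.0)
```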
105) parameter adjusting step: the parameters of the recommendation model established in steps 101) to 104) are adjusted; specifically, an objective function is defined and optimized so that the parameters of the recommendation model are updated by back-propagation; the Adam method is selected for the adjustment and optimization, and the loss formula for the adjustment is as follows:
wherein the pictured symbol is the predicted score of user i for commodity j, yij is the true score, and n + m is the total number of users and commodities. The advantage of the Adam method is that, after bias correction, the learning rate of each iteration stays within a certain range, so the parameters remain relatively stable. The scheme's method is adopted as a whole; the loss formula is used only to correct the parameters in the loss-adjustment step.
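The objective minimized by Adam can be sketched as a squared-error loss over observed ratings. The exact normalization in the patent's loss image is not recoverable, so the 1/2 × mean scaling below is an assumption:

```python
import numpy as np

def rating_loss(pred, true):
    """Squared-error objective over observed ratings (scaling assumed)."""
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    return 0.5 * float(np.mean((pred - true) ** 2))
```

In training, this scalar would be back-propagated through the prediction network, with Adam supplying the bias-corrected per-parameter update steps.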
The validity of the model was verified using data from the Ciao and Epinions websites, crawled from both sites by Arizona State University researchers for social-network studies. They contain 283319 and 764352 ratings, respectively. The accuracy of the recommendation model is verified with the RMSE and MAE indices; the smaller their values, the better the model. Other algorithms are also compared to show the accuracy of the scheme's model. The compared algorithms are as follows:
PMF: the main idea of the model is that the user's preference for movies can be determined by a linear combination of a few factors.
SoRec: the main idea is to tie the user's social network relationships to the scoring matrix.
SocialMF: the main idea is to introduce trust propagation in matrix factorization, users representing users that are close to their trust.
NeuMF: the main idea of the model is to combine the traditional matrix decomposition and the multilayer perceptron, and can extract low-dimensional and high-dimensional features at the same time.
DeepsoR: the main idea is that the user representation learned from social relationships is integrated into the probability matrix decomposition.
GraphRec: the main idea is to merge the vectors of the user commodities in different networks for prediction.
TABLE 1
In the ciao data set used, the sparsity of the data set used by the original model is 98.96%; some users in the data set participate in scoring many times while others participate few times. To ensure that every user participates in an interaction and to prevent the cold-start problem, users with many scoring participations were selected for score deletion. Specifically, users with 60 or more scores were chosen; the commodity cold-start problem was considered during deletion so that every commodity still participates in an interaction, and the deletion was then carried out.
Reducing the number of user scores reduces the connections between user nodes and commodity nodes in the heterogeneous network graph and the corresponding interaction relationships, so the neighborhood aggregation of the graph neural network can fuse fewer features, which ultimately increases the recommendation difficulty. The effect of the model is examined in this sparse setting. Two sets of sparsity-variation comparison experiments were performed, with the following results:
TABLE 2
By reducing the commodities of users with many interactions, the number of scores decreases and the sparsity increases. Deleting 166 scores raises the sparsity to 98.98%, and deleting 541 scores raises it to 99%. The experimental data show that the sparser the data set, the larger the improvement of the scheme's model over the comparison models.
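The sparsity figures above follow the usual definition, the fraction of empty cells in the n × m rating matrix; a one-line sketch (the counts in the demo are illustrative, not the ciao data set's actual dimensions):

```python
def sparsity(num_ratings, num_users, num_items):
    """Fraction of empty cells in the n x m user-commodity rating matrix."""
    return 1.0 - num_ratings / (num_users * num_items)

# demo: 104 ratings in a 100 x 100 matrix gives 98.96% sparsity
s = sparsity(104, 100, 100)
```

Deleting ratings raises this value, which is how the 98.98% and 99% experimental settings were produced.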
Ablation experiments were performed by adding the two network-representation-learning sections separately and comparing with the model that does not use the scheme. Two groups of experiments were carried out: the first adds only the user-item interaction network, generates user and item vectors through network representation learning, and fuses them through the graph neural network; the second adds only the user social network to generate the user vector representation. The experimental data table below shows that both added network representation learnings increase prediction accuracy.
In the setting of parameters, the parameters included in the model include path length, window size, learning rate and embedding dimension. The following experiment comparison is respectively carried out on the parameters, the path length is 10 and 20, the corresponding window size is 5 and 10, the comparison experiment is carried out by adjusting different embedding dimensions, the learning rate and other parameters, and the experiment result is as follows:
As shown in fig. 4, with path length 10 and window size 5, the MAE and RMSE indices are plotted against the learning rate. Fig. 5 shows the variation of the MAE and RMSE indices with the learning rate for path length 20 and window size 10. According to the experimental results, the scheme performs best with path length 20, window size 10, and learning rate 0.004. Comparing the ciao data set embedding-dimension experiment indices RMSE (fig. 6) and MAE (fig. 7) shows that the scheme's model performs best with an embedding dimension of 64.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make several modifications and refinements without departing from the spirit of the present invention, and these modifications and refinements should also be regarded as within the scope of the present invention.
Claims (3)
1. A system recommendation method, characterized by comprising the following specific processing steps:
101) data relation association step: from U ═ U1,u2,...,unDenotes a user data set, V ═ V1,v2,...,vmThe representation of the data set of the goods,representing meta-paths between data, a representing a user or a commodity, R representing a social relationship or a buy and sell relationship; the meta path is generated as follows:
wherein nt is the current node and nt+1 is the next node, drawn from the neighbors of v that belong to type At+1; ρ represents the meta-path rule, and the length of the meta-path is set to L; φ(x) represents the type of the next node on the path, which belongs to the node types of the network; v and x represent the current node and the next node, respectively, and belong to the nodes of the network;
the meta-path of the user-commodity interaction relationship generates vector representations for the user and the commodity, the meta-path of the user social relationship generates a user vector representation, and er denotes the score vector;
102) user relationship data fusion step: the user data set comprises the user-item interaction network u_v and the user social network u_u; u_v is combined with the attention network to generate corresponding weights, yielding the implicit vector representation of the user in the interaction network, specifically given by the following formula:
wherein the pictured symbol represents the implicit vector of user i in the interaction network; σ is an activation function; w and b are neural-network parameters; C represents all scores the user has given to commodities; Ni denotes the neighbors of user i; softmax denotes the softmax function; xjr represents the fusion of the vector of commodity j with its corresponding score r; gτ is a multi-layer fully connected neural network; the pictured operator denotes vector concatenation; and w1, w2, b1, b2 represent the parameters of the two fully connected layers of the attention network;
generating corresponding weight by combining u _ u of the user social network with the attention network to obtain implicit vector representation of the user in the social network, wherein the implicit vector representation is specifically represented by the following formula:
wherein the pictured symbol represents the implicit vector of user i in the social network, and i′ denotes a neighbor user;
the user vector formula that ultimately includes the user project interaction network and the user social network is as follows:
hi represents the final user vector, ln−1 indicates the number of network layers, and wn, bn are the parameters w and b of the multi-layer neural network;
103) commodity data fusion step: the deep relationships of the commodity in the network are collected by means of meta-paths, and the feature vector of the commodity is obtained by fusing the features of its neighbor users through the GNN; the specific formula is as follows:
104) recommendation score prediction step: the user's score for the commodity is predicted from the user vectors and commodity vectors obtained in steps 101) to 103); the concatenation of the user vector and the commodity vector is taken as the input of a fully connected network and the predicted score as its output, i.e., a multi-layer fully connected neural network yields the final predicted score; the specific formula is as follows:
2. The system recommendation method of claim 1, wherein: the specific process of implicit vector representation of the user in the user interaction network in step 102) is as follows:
where aggregate denotes the fusion function, xjr represents the fusion of the vector of commodity j with its corresponding score r, and C represents all scores the user has given to commodities; the formula for xjr is as follows:
the Aggregation fusion function adopts a fully-connected neural network, and the concrete formula is as follows:
σ is the activation function, w and b are neural-network parameters, and αji is the weight of commodity j for user i; since each commodity matters to each user to a different degree, the weights are trained through an Attention network; the specific formula is as follows:
3. The system recommendation method of claim 1, wherein the method further comprises 105) a parameter adjusting step: the parameters of the recommendation model established in steps 101) to 104) are adjusted; specifically, an objective function is defined and optimized so that the parameters of the recommendation model are updated by back-propagation; the Adam method is selected for the adjustment and optimization, and the loss formula for the adjustment is as follows:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110027315.5A CN112686736B (en) | 2021-01-09 | 2021-01-09 | System recommendation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112686736A CN112686736A (en) | 2021-04-20 |
CN112686736B true CN112686736B (en) | 2022-07-05 |
Family
ID=75456885
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110027315.5A Active CN112686736B (en) | 2021-01-09 | 2021-01-09 | System recommendation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112686736B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104573103A (en) * | 2015-01-30 | 2015-04-29 | 福州大学 | Coauthor recommending method under scientific and technical literature heterogeneous network |
CN109871504A (en) * | 2019-01-24 | 2019-06-11 | 中国科学院软件研究所 | A kind of Course Recommendation System based on Heterogeneous Information network and deep learning |
CN110633422A (en) * | 2019-09-16 | 2019-12-31 | 安徽大学 | Microblog friend recommendation method based on heterogeneous information network |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107944629B (en) * | 2017-11-30 | 2020-08-07 | 北京邮电大学 | Recommendation method and device based on heterogeneous information network representation |
CN109002488B (en) * | 2018-06-26 | 2020-10-02 | 北京邮电大学 | Recommendation model training method and device based on meta-path context |
CN111191081B (en) * | 2019-12-17 | 2022-02-22 | 安徽大学 | Developer recommendation method and device based on heterogeneous information network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111523047B (en) | Multi-relation collaborative filtering algorithm based on graph neural network | |
CN111563164B (en) | Specific target emotion classification method based on graph neural network | |
CN112989064B (en) | Recommendation method for aggregating knowledge graph neural network and self-adaptive attention | |
CN112232925A (en) | Method for carrying out personalized recommendation on commodities by fusing knowledge maps | |
Qu et al. | An end-to-end neighborhood-based interaction model for knowledge-enhanced recommendation | |
CN112950324B (en) | Knowledge graph assisted pairwise sorting personalized merchant recommendation method and system | |
CN109190030B (en) | Implicit feedback recommendation method fusing node2vec and deep neural network | |
CN103971161B (en) | Hybrid recommendation method based on Cauchy distribution quantum-behaved particle swarm optimization | |
Guo et al. | Trust-aware recommendation based on heterogeneous multi-relational graphs fusion | |
Duma et al. | Sparseness reduction in collaborative filtering using a nearest neighbour artificial immune system with genetic algorithms | |
AU2020101604A4 (en) | A Recommendation with Item Cooccurrence based on Metric Factorization | |
CN111881363A (en) | Recommendation method based on graph interaction network | |
CN108470075A (en) | A kind of socialization recommendation method of sequencing-oriented prediction | |
CN109710835A (en) | Heterogeneous information network recommendation method with time weight | |
CN115374347A (en) | Social recommendation method based on knowledge graph attention network | |
CN113051468A (en) | Movie recommendation method and system based on knowledge graph and reinforcement learning | |
CN115329215A (en) | Recommendation method and system based on self-adaptive dynamic knowledge graph in heterogeneous network | |
CN114817712A (en) | Project recommendation method based on multitask learning and knowledge graph enhancement | |
CN117495481A (en) | Article recommendation method based on heterogeneous timing diagram attention network | |
CN115840853A (en) | Course recommendation system based on knowledge graph and attention network | |
CN113342994B (en) | Recommendation system based on non-sampling cooperative knowledge graph network | |
CN114385804A (en) | Comment recommendation method of heterogeneous graph attention neural network based on meta-learning | |
Zhang et al. | Knowledge graph driven recommendation model of graph neural network | |
Chen et al. | Graph enhanced neural interaction model for recommendation | |
CN114842247B (en) | Characteristic accumulation-based graph convolution network semi-supervised node classification method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||