CN111127142B - Item recommendation method based on generalized neural attention - Google Patents

Item recommendation method based on generalized neural attention

Info

Publication number
CN111127142B
CN111127142B · CN201911291806.XA · CN201911291806A
Authority
CN
China
Prior art keywords
model
attention
user
generalized
recommendation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911291806.XA
Other languages
Chinese (zh)
Other versions
CN111127142A (en
Inventor
郑莹
吕艳霞
魏方娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University Qinhuangdao Branch
Original Assignee
Northeastern University Qinhuangdao Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University Qinhuangdao Branch filed Critical Northeastern University Qinhuangdao Branch
Priority to CN201911291806.XA priority Critical patent/CN111127142B/en
Publication of CN111127142A publication Critical patent/CN111127142A/en
Application granted Critical
Publication of CN111127142B publication Critical patent/CN111127142B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Abstract

The invention provides an item recommendation method based on generalized neural attention, and relates to the technical field of information processing. The invention combines a generalized matrix factorization model GMF and a neural attention item similarity model NAIS to establish a generalized neural attention recommendation model GNAS, optimizes the model by using an attention mechanism that integrates GMF and a multi-layer perceptron MLP (Multilayer Perceptron), then predicts the preference degree of a user for a target item through the optimized generalized neural attention recommendation model, and generates a personalized recommendation list for the user. The method mines the potential interests and hobbies of the user and improves the interpretability and diversity of the recommendation system; furthermore, by adopting an attention mechanism that combines the GMF model and the MLP model to estimate the weight of each historical item when predicting the preference degree for the target item, it greatly improves recommendation accuracy at a small time cost, so as to recommend items that better match the user's interests.

Description

Item recommendation method based on generalized neural attention
Technical Field
The invention relates to the technical field of information processing, in particular to an item recommendation method based on generalized neural attention.
Background
Today, we are experiencing a transition from the Information Technology (IT) age to the Data Technology (DT) age, the clearest sign of which is information overload. How can a particular user be helped to quickly find interesting information among a huge amount of information? There are two related solutions: search engines and recommendation systems. A search engine requires the user to describe his or her own needs accurately, whereas a recommendation system discovers the personalized needs and interest characteristics of the user by analyzing and mining the user's behavior, and recommends information or items that may interest the user. An excellent recommendation system connects users, merchants and platforms and benefits all three parties; consequently, recommendation systems not only attract a great deal of attention and research in academia, but are also widely deployed in many application scenarios, gradually becoming standard in most fields.
E-commerce websites are a major application field of personalized recommendation systems. Personalized recommendation is also an important application in movie and video websites, where it helps users find videos of interest in a vast video library. In social networks, a user's social information can be used for personalized item recommendation, friend recommendation, and so on. Personalized advertising is another ongoing hotspot. Further applications include personalized music recommendation, news reading recommendation and location-based services. In summary, recommendation systems can now be seen everywhere; they not only have extremely high commercial value, but also bring great convenience to our study and life.
The biggest advantage of personalized recommendation is that it can collect user profiles and actively make personalized recommendations based on user characteristics such as interest preferences. Moreover, the recommendations given by the system can be updated in real time: when the commodity library or the user feature library in the system changes, the recommendation sequence changes automatically. This greatly improves the simplicity and effectiveness of e-commerce activities, as well as the service level of the enterprise. If the recommendation quality is high, users come to rely on the recommendation system. Therefore, a personalized recommendation system can not only provide a personalized recommendation service but also establish a long-term, stable relationship with users, thereby effectively retaining customers, improving customer loyalty and website click-through rate, and preventing customer churn. In an increasingly fierce competitive environment, a personalized recommendation system can effectively retain customers, improve the service capability of an e-commerce system, bring great convenience to users' daily lives, and bring great economic benefits to companies.
The most important module in a recommendation system is the recommendation algorithm, and the most widely used recommendation algorithms are collaborative filtering (Collaborative Filtering, CF) algorithms. CF algorithms largely fall into two categories: user-based collaborative filtering (User-based CF) and item-based collaborative filtering (Item-based CF). The core idea of Item-based CF is to recommend to the user items similar to those they liked before, so the algorithm consists of two main steps: (1) calculate the similarity between items; (2) generate a recommendation list for the user according to the item similarities and the user's historical behavior.
Early Item-based CF calculated the similarity between items using only heuristics such as the Pearson coefficient and cosine similarity. This approach is too simple: it requires manual tuning, and the tuned method cannot be applied directly to a new dataset. In recent years, model-based methods have customized an objective function to learn the similarity matrix directly from the data, by minimizing the loss between the original user-item interaction matrix and the interaction matrix reconstructed by the Item-based CF model. These methods outperform traditional heuristic-based methods in both scoring tasks and Top-k recommendation. However, the number of items in a recommendation system is often huge, so learning the full similarity matrix has high complexity. Moreover, such a model can only estimate the similarity between two items that are purchased or scored together; it cannot estimate the similarity between unrelated items, so transitive relationships between items cannot be captured. Later, Kabbur et al. proposed a factored item similarity model (FISM) that represents each item as an embedding vector and parameterizes the similarity between two items as the inner product of their embedding vectors. When a user has a new interaction, only the similarity between the new item and the item to be predicted (also called the target item) needs to be computed and accumulated with the existing similarities to obtain the user's preference degree for the target item. The method is therefore well suited to online recommendation tasks, and experimental results on several datasets of different sparsity indicate that it handles sparse datasets effectively. This model nevertheless has a drawback: it assumes that all historical items the user has interacted with make the same contribution when predicting the user's preference for the target item.
This does not match real recommendation scenarios, so He et al. proposed a neural attentive item similarity model, called NAIS, that uses an attention mechanism to assign a weight to each historical item so as to distinguish their different contributions to the user's preference. However, user interests change over time; and although a single deep neural model has strong generalization ability thanks to its depth and complexity, it ignores the most primitive interaction information between users and items and lacks memorization ability, so some recommended items may deviate from the user's interests.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an item recommendation method based on generalized neural attention, which improves the accuracy of item recommendation.
In order to solve the above technical problem, the technical scheme adopted by the invention is an item recommendation method based on generalized neural attention, comprising the following steps:
step 1, establishing a generalized neural attention recommendation model GNAS by combining a generalized matrix factorization model GMF and a neural attention item similarity model NAIS;
the generalized matrix factorization model GMF is shown in the following formula:

ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} q_j )

wherein ŷ_ui represents the preference degree of user u for the target item i, j is a historical item that user u has interacted with before, p_i and q_j represent the target item vector to be predicted and a historical item vector the user has interacted with, respectively, ⊙ is the element-wise product, h^T is the prediction-layer weight vector, used to extract more feature information between users and items and improve generalization ability, and excluding the target item i from the history set R_u^+ prevents self-recommendation;
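As an illustrative sketch only (the function name and the toy values are the editor's assumptions, not part of the patent), the item-based GMF prediction above — project the element-wise product of the target item vector p_i and the summed historical item vectors q_j with h^T — can be written as:

```python
import numpy as np

def gmf_score(p_i, Q_hist, h):
    """Item-based GMF: h^T (p_i ⊙ Σ_j q_j), summing over the user's
    historical items (the target item i is assumed already excluded)."""
    q_sum = Q_hist.sum(axis=0)       # Σ_j q_j over the history
    return float(h @ (p_i * q_sum))  # element-wise product, then h^T projection

# toy data: embedding size 4, two historical items
p_i = np.array([0.1, 0.2, 0.3, 0.4])
Q_hist = np.array([[1.0, 0.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0, 1.0]])
h = np.ones(4)
score = gmf_score(p_i, Q_hist, h)
```

With h fixed to all ones, the score reduces to a plain inner product between p_i and the summed history, akin to FISM without normalization; a trainable h is what gives GMF its extra expressiveness.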
step 1.1, constructing the generalized neural attention recommendation model;
generating a sparse latent vector of the target item through one-hot encoding, and generating the user's latent vector through multi-hot encoding of the historical items j that user u has interacted with; both are passed through an embedding layer to obtain the user and item embedding vectors; the generalized matrix factorization model GMF and the neural attention item similarity model NAIS share the user and item embedding vectors, yielding the generalized neural attention recommendation model GNAS, shown in the following formula:
ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} a_ij q_j )

wherein h^T is the prediction-layer weight vector, whose purpose is to prevent the gradient vanishing that would be caused by adding the dot-product result directly into the generalized neural attention recommendation model GNAS; a_ij is the attention weight used to calculate the contribution of the interacted historical item j when predicting user u's preference for the target item i, and it is parameterized as a softmax variant of the attention function f, as shown in the following formula:

a_ij = exp( f(p_i, q_j) ) / [ Σ_{j ∈ R_u^+ \ {i}} exp( f(p_i, q_j) ) ]^β
wherein β is a penalty coefficient with value range [0,1], used to reduce the model's penalty on active users whose number of historical interacted items exceeds a threshold;
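A minimal sketch of this smoothed softmax, with the denominator raised to β (toy logits; variable names are illustrative):

```python
import numpy as np

def smoothed_softmax(f_scores, beta):
    """Softmax variant with the denominator raised to beta in [0, 1]:
    beta = 1 recovers the standard softmax, while a smaller beta weakens
    the penalty on users with many historical items."""
    e = np.exp(f_scores)
    return e / (e.sum() ** beta)

scores = np.array([0.0, 0.0, 0.0, 0.0])        # toy attention logits
a_std = smoothed_softmax(scores, beta=1.0)     # standard softmax weights
a_smooth = smoothed_softmax(scores, beta=0.5)  # larger weights, no longer sum to 1
```

Note the smoothed weights intentionally do not sum to 1 when β < 1; that is exactly the mechanism that keeps very active users' histories from being averaged away.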
the attention function f combines the two models, generalized matrix factorization GMF and MLP, and its result is mapped to the output layer by the vector h_a, as shown in the following formula:

f(p_i, q_j) = h_a^T ( ReLU( W [p_i; q_j] + b ) + p_i ⊙ q_j )

wherein the inputs of the attention function f are p_i and q_j, W and b represent the weight matrix and the bias vector respectively, ReLU is the activation function, h_a is a set of vectors requiring training whose purpose is to project the result from the hidden layer to the output layer, and the dimension of the weight matrix W corresponds to the dimension of h_a^T;
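The exact way the GMF and MLP branches are fused inside f is only described qualitatively in the text, so the following sketch assumes the simplest fusion — summing the MLP hidden layer ReLU(W[p_i;q_j]+b) with the GMF product p_i ⊙ q_j before projecting with h_a — with hidden size equal to embedding size so that the sum is well-defined:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def attention_f(p_i, q_j, W, b, h_a):
    """Sketch of the integrated attention function: an MLP branch
    ReLU(W [p_i; q_j] + b) plus a GMF branch p_i ⊙ q_j, projected to a
    scalar by h_a. The fusion-by-sum is the editor's assumption."""
    mlp = relu(W @ np.concatenate([p_i, q_j]) + b)  # deep branch
    gmf = p_i * q_j                                 # wide (memorization) branch
    return float(h_a @ (mlp + gmf))

k = 4                                  # embedding size = hidden size
rng = np.random.default_rng(0)
W = rng.normal(size=(k, 2 * k))
b = np.zeros(k)
h_a = np.ones(k)
s = attention_f(rng.normal(size=k), rng.normal(size=k), W, b, h_a)
z = attention_f(np.zeros(k), np.zeros(k), W, b, h_a)
```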
step 1.2, pre-training the constructed generalized neural attention recommendation model;
in the pre-training process, the item embedding vectors trained by the factored item similarity model FISM are used to initialize the item vectors in the generalized neural attention recommendation model GNAS, replacing random initialization; the other parameters to be learned, h, h_a, W and b, are initialized with Gaussian distributions;
step 2, optimizing the model by using the attention mechanism, integrated from GMF and the multi-layer perceptron MLP (Multilayer Perceptron), within the model;
step 2.1, establishing the objective function of the model, as shown in the following formula:

L = − Σ_{(u,i) ∈ R+} log σ(ŷ_ui) − Σ_{(u,j) ∈ R−} log( 1 − σ(ŷ_uj) ) + λ ‖Θ‖²

wherein L is the loss; σ is the sigmoid function, whose purpose is to restrict the predicted result ŷ to the range (0,1); R+ and R− denote the positive example set of items the user has interacted with and the negative example set of items the user has not interacted with, whose total size is the number of training instances N; Θ denotes all training parameters, including p_i, q_j, h, h_a, W and b; and λ is the hyper-parameter controlling the degree of L2 regularization to prevent overfitting;
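A sketch of this pointwise log loss with L2 regularization (toy scores; the function and variable names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce_loss(pos_scores, neg_scores, params, lam):
    """Pointwise log loss over the positive set R+ and the sampled
    negative set R-, plus lam * ||Theta||^2 regularization."""
    loss = -np.log(sigmoid(pos_scores)).sum()        # observed interactions
    loss -= np.log(1.0 - sigmoid(neg_scores)).sum()  # sampled negatives
    loss += lam * sum(np.sum(p ** 2) for p in params)
    return float(loss)

pos = np.array([2.0, 1.5])    # predicted scores of interacted items
neg = np.array([-1.0, -0.5])  # predicted scores of sampled negatives
loss = bce_loss(pos, neg, params=[np.array([0.5])], lam=0.01)
```

Confident, correctly signed predictions (large positive scores for R+, large negative for R−) drive the loss toward zero, which is what Adagrad minimizes in step 2.2.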
step 2.2, in order to minimize the objective function, the adaptive gradient algorithm Adagrad is adopted to automatically adjust the learning rate of the parameters during training; for each positive instance (u, i), a proportion of negative instances is randomly drawn from the unobserved interactions to pair with it.
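The pairing in step 2.2 can be sketched as follows (a hypothetical helper, not the embodiment's code; the ratio of 4 matches the NAIS-consistent setting mentioned later in the description):

```python
import random

def sample_negatives(pos_items, n_items, ratio=4, rng=random):
    """For each positive instance, draw `ratio` items the user has not
    interacted with, by rejection sampling over the item catalogue."""
    observed = set(pos_items)
    negatives = []
    for _ in pos_items:
        drawn = []
        while len(drawn) < ratio:
            j = rng.randrange(n_items)
            if j not in observed:   # keep only unobserved interactions
                drawn.append(j)
        negatives.append(drawn)
    return negatives

negs = sample_negatives([0, 5, 9], n_items=100, ratio=4)
```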
Step 3, after the model is optimized, predicting the preference degree of the user for the target item through the optimized generalized neural attention recommendation model, and generating a personalized recommendation list for the user;
The beneficial effects produced by adopting the above technical scheme are as follows:

According to the item recommendation method based on generalized neural attention, a generalized matrix factorization model is used to memorize second-order relations between users and items, and the neural attention similarity method is combined to mine the potential interests and hobbies of the user, improving the interpretability and diversity of the recommendation system; furthermore, an attention mechanism combining the GMF model and the MLP model is adopted to estimate the weight of each historical item when predicting the preference degree for the target item, greatly improving recommendation accuracy at a small time cost, so as to recommend items that better match the user's interests.
Drawings
FIG. 1 is a basic framework diagram of the generalized neural attention recommendation model GNAS provided by an embodiment of the present invention;
FIG. 2 shows the score comparison of the three recommendation models GNAS, NAIS and FISM on the two evaluation indicators HR and NDCG over two real-world datasets according to an embodiment of the invention;
wherein (a) is the HR comparison of the three models on the Movielens dataset, (b) is the NDCG comparison of the three models on the Movielens dataset, (c) is the HR comparison of the three models on the Pinterest-20 dataset, and (d) is the NDCG comparison of the three models on the Pinterest-20 dataset.
Detailed Description
The following describes the embodiments of the present invention in further detail with reference to the drawings and examples. The following examples are illustrative of the invention and are not intended to limit its scope.
An item recommendation method based on generalized neural attention, as shown in FIG. 1, combines a generalized matrix factorization model (GMF) and a neural attention item similarity model (NAIS) to establish a generalized neural attention recommendation model GNAS, optimizes the model using an attention mechanism integrating GMF and MLP, predicts the preference degree of a user for a target item through the optimized generalized neural attention recommendation model, and generates a personalized recommendation list for the user;
the generalized matrix factorization model (GMF) is as follows:

ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} q_j )

wherein ŷ_ui represents the preference degree of user u for the target item i, j is a historical item that user u has interacted with before, p_i and q_j represent the target item vector to be predicted and a historical item vector the user has interacted with, respectively, ⊙ is the element-wise product, h^T is the prediction-layer weight vector, used to extract more feature information between users and items and improve generalization ability, and excluding the target item i from the history set R_u^+ prevents self-recommendation;
matrix Factorization (MF) is the most popular collaborative filtering algorithm in the recommendation field. The idea of item-based matrix factorization is to simulate a real user's click rate or scoring matrix for items by multiplying the user's low-dimensional potential vector matrix by the item's low-dimensional potential vector matrix. Generating respective sparse feature vectors by using a user and an object to be predicted through one-hot coding, and respectively obtaining embedded vectors of the user and the object to be predicted through an embedded layer; the generalized MF model may be more expressive in modeling interactions between users and historical items than the general MF model, and is therefore named GMF.
The specific method of establishing the generalized neural attention recommendation model GNAS by combining the generalized matrix factorization model (GMF) and the neural attention item similarity model (NAIS) is as follows:
(1) Construction of generalized neural attention recommendation model
Sparse latent vectors of the target item are generated through one-hot encoding, and the latent vector of the user is obtained by multi-hot encoding the historical items the user has interacted with; both are passed through the embedding layer to obtain the user and item embedding vectors; the generalized matrix factorization model GMF and the neural attention item similarity model NAIS share the user and item embedding vectors, yielding the generalized neural attention recommendation model GNAS, shown in the following formula:
ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} a_ij q_j )

wherein h^T is the prediction-layer weight vector, whose purpose is to prevent the gradient vanishing that would be caused by adding the dot-product result directly into the generalized neural attention recommendation model GNAS; a_ij is the attention weight used to calculate the contribution of the interacted historical item j when predicting user u's preference for the target item i, and it is parameterized as a softmax variant of the attention function f, as shown in the following formula:

a_ij = exp( f(p_i, q_j) ) / [ Σ_{j ∈ R_u^+ \ {i}} exp( f(p_i, q_j) ) ]^β
the variant of the Softmax function is mainly characterized in that an index is added on a denominator and is used for converting attention weight into a probability distribution, wherein beta is a penalty coefficient, the value range of beta is [0,1] and the penalty of the model on active users with historical interaction items exceeding a threshold value is reduced;
the GNAS model uses only MLP as the attention mechanism to model deep relationships between historical items and target items, lacking a wide kernel to memorize the most primitive information between users and items. To solve this problem, GMF method is also added to the attention function mechanismAn integrated attention network is constructed to calculate the contribution of historical items to the representation of user preferences, such that the calculated weights more fully model the complex user-item interactions in the user decision process. Integral framework of model and attention mechanism a ij Is unchanged, the attention function f is combined by two models of generalized matrix factorization model GMF and MLP, and is calculated by vectorMapped to the output layer as shown in the following formula:
wherein the input of the attention function f is p i and qj W and b represent the weight matrix and the bias vector, respectively, reLU is the activation function,is a set of vectors that need to be trained in order to project the result from the hidden layer to the output layer, h T Is a convolution layer whose dimension corresponds to the weight matrix W, < >>h T W, b is learned from experimental data;
the GNAS model established by the invention integrates the GMF model on the basis of the original neural attention similarity model (NAIS), calculates the weight of the historical articles interacted by the user by adopting an integrated attention mechanism, and provides the most advanced performance in the article recommendation scene based on implicit feedback. (2) Pre-training of constructed generalized neural attention recommendation model
Training the parameters of the attention network and the item embedding vectors at the same time leads to slow convergence and limits the improvement of model performance. Therefore, during pre-training, the item embedding vectors trained by FISM are used to initialize the item vectors in the GNAS model instead of random initialization; since the FISM model does not involve optimizing attention weights, more representative item vectors can be learned directly. This setting speeds up the convergence of the model and greatly improves the training of the attention network and the other parameters. Because the model uses FISM pre-trained item embedding vectors, the other parameters to be learned are initialized with a Gaussian distribution.
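The initialization scheme can be sketched as follows, with a stand-in array playing the role of the FISM-pretrained item embeddings (all names and the 0.01 standard deviation are the editor's assumptions):

```python
import numpy as np

def init_gnas_params(pretrained_q, k, rng):
    """Item embeddings Q come from a FISM pre-training run (here a
    stand-in array); the remaining parameters are drawn from a Gaussian."""
    return {
        "Q": pretrained_q.copy(),                     # FISM-pretrained, not random
        "h": rng.normal(0.0, 0.01, size=k),           # prediction-layer vector
        "h_a": rng.normal(0.0, 0.01, size=k),         # attention projection vector
        "W": rng.normal(0.0, 0.01, size=(k, 2 * k)),  # attention MLP weights
        "b": rng.normal(0.0, 0.01, size=k),           # attention MLP bias
    }

rng = np.random.default_rng(42)
Q_fism = rng.normal(size=(10, 4))  # stand-in for the FISM output
params = init_gnas_params(Q_fism, k=4, rng=rng)
```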
The specific method of optimizing the generalized neural attention recommendation model GNAS is as follows:
the objective function of the model is built as follows:
wherein σ is a sigmoid function for the purpose of predicting the resultThe range of (1, 0), R + and R- A positive example set representing the articles interacted by the user and a negative example set representing the articles not interacted by the user, wherein the sum of the positive example set and the negative example set is a training example number N, Θ represents all training parameters including p i 、q j 、/>h T W, b, lambda is control L 2 The degree of regularization to prevent overfitting of the hyper-parameters;
In order to minimize the objective function, the adaptive gradient algorithm Adagrad is adopted to automatically adjust the learning rate of the parameters during training; if the accumulated gradients are large, the learning rate decays faster. For each positive instance (u, i), a proportion of negative instances is randomly drawn from the unobserved interactions to pair with it. An appropriate negative sampling rate has a positive impact on model performance. Consistent with the setting of NAIS, this embodiment sets the number of negative instances to 4.
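One Adagrad update, showing how the accumulated squared gradient shrinks the effective learning rate (a generic sketch of the algorithm, not the embodiment's code):

```python
import numpy as np

def adagrad_step(param, grad, cache, lr=0.01, eps=1e-8):
    """One Adagrad update: `cache` accumulates squared gradients, so
    coordinates that have seen large gradients get a smaller step."""
    cache += grad ** 2
    param -= lr * grad / (np.sqrt(cache) + eps)
    return param, cache

w = np.array([1.0, 1.0])
cache = np.zeros(2)
g = np.array([0.5, 2.0])  # larger gradient on the second coordinate
w, cache = adagrad_step(w, g, cache)
```

On the very first step every coordinate moves by roughly lr in the gradient's direction; the per-coordinate decay only shows up over repeated updates as the cache grows.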
In this example, the generalized neural attention recommendation model GNAS established by the invention was experimentally verified on two real-world datasets, Movielens and Pinterest-20. Model performance is judged by two recommendation metrics, Hit Ratio (HR) and Normalized Discounted Cumulative Gain (NDCG), which have been widely used to evaluate Top-K recommendation and information retrieval. HR@10 can be interpreted as a recall-based measure, representing the percentage of users successfully recommended to (i.e. the positive instance appears in the top 10), while NDCG@10 also takes into account the position of the positive instance within the top 10; larger values of both indicators represent better performance.
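HR@10 and NDCG@10 for a single held-out positive item can be computed as follows (a standard leave-one-out evaluation sketch; item ids are toy values):

```python
import math

def hr_at_k(ranked_items, pos_item, k=10):
    """Hit Ratio@k: 1 if the held-out positive item appears in the top-k."""
    return 1.0 if pos_item in ranked_items[:k] else 0.0

def ndcg_at_k(ranked_items, pos_item, k=10):
    """NDCG@k with a single positive: log2 discount by rank position
    (IDCG = 1, since the ideal ranking puts the positive first)."""
    topk = ranked_items[:k]
    if pos_item not in topk:
        return 0.0
    rank = topk.index(pos_item)       # 0-based position in the top-k
    return 1.0 / math.log2(rank + 2)

ranking = [7, 3, 42, 5, 1, 8, 2, 9, 6, 0]  # toy top-10 item ids
hr = hr_at_k(ranking, pos_item=42)
ndcg = ndcg_at_k(ranking, pos_item=42)
```

Averaging these per-user values over all test users gives the dataset-level scores reported in FIG. 2 and Table 1.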
When the size of the embedding vectors obtained by passing the sparse vectors of users and items through the embedding layer is set to 16, the scores of the GNAS model of the invention and the NAIS model on the two evaluation indicators are shown in FIG. 2. In the experiment we ran the three models GNAS, NAIS and FISM for 100 epochs until convergence, and the results of the last 50 epochs are plotted in FIG. 2.
It is clear from FIG. 2 that the GNAS model of the invention performs far better than the NAIS model alone, demonstrating the effectiveness of combining a deep model and a wide model to model user preferences. Specifically, compared with NAIS's scores of 69.70% and 41.94% on HR and NDCG, the GNAS model of the invention raised the two indicators to 70.88% and 42.69% on the MovieLens dataset, a significant improvement in recommendation accuracy over the NAIS model. In addition, the GNAS model shows a larger performance improvement on non-sparse datasets than on sparse ones, so it is better suited to dense datasets. The scores of the GNAS model on both indicators are far higher than those of the FISM model, which fully demonstrates the great advantages of the integrated recommendation model in recommendation accuracy and interpretability, and reflects the necessity of adding the attention mechanism.
This example also compares the performance of the GNAS model of the invention with other recent recommendation methods, as shown in Table 1. Some of these models are embedding-based; for fairness, the embedding size is uniformly set to 16.
TABLE 1 comparison of the performance of GNAS and basic methods on the HR@10 and NDCG@10 indicators at an embedding size of 16
TABLE 2 training time for each round of model
As can be seen from Table 1, the GNAS model obtained the highest score on both indicators. This benefits from GMF enhancing the memorization of user-item interactions, especially on the non-sparse dataset MovieLens, and it emphasizes the necessity of applying an integrated model to recommendation tasks. At the same time, attention-based models (such as NAIS and GNAS) perform significantly better than the other recommendation methods on both datasets. In addition, GNAS outperforms NAIS, reflecting the effectiveness of designing an integrated model within the attention network. The performance of user-based methods (MF, MLP) is inferior to item-based methods (NAIS, GNAS) due to the different ways in which users are represented.
The training time per epoch for the GNAS model and the baseline models is also given in this example, as shown in Table 2. Training times for the other models are not shown because they are implemented in Java rather than TensorFlow. The latter two models are more time-consuming due to the addition of the attention mechanism. From Table 2 it can be seen that, compared with NAIS, the GNAS model of the invention achieves a significant performance improvement at a small additional time cost. This is reasonable, because generalized matrix factorization can simply and effectively capture low-order connections between users and items.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical scheme of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments may still be modified, or some or all of their technical features replaced with equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical schemes as defined by the scope of the appended claims.

Claims (1)

1. An item recommendation method based on generalized neural attention, characterized by comprising the following steps:
step 1, establishing a generalized neural attention recommendation model GNAS by combining a generalized matrix factorization model GMF and a neural attention item similarity model NAIS;
the generalized matrix factorization model GMF is shown in the following formula:

ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} q_j )

wherein ŷ_ui represents the preference degree of user u for the target item i, j is a historical item that user u has interacted with before, p_i and q_j represent the target item vector to be predicted and a historical item vector the user has interacted with, respectively, ⊙ is the element-wise product, h^T is the prediction-layer weight vector, used to extract more feature information between users and items and improve generalization ability, and excluding the target item i from the history set R_u^+ prevents self-recommendation;
step 1.1, constructing the generalized neural attention recommendation model;
generating a sparse latent vector of the target item through one-hot encoding, and generating the user's latent vector through multi-hot encoding of the historical items j that user u has interacted with; both are passed through an embedding layer to obtain the user and item embedding vectors; the generalized matrix factorization model GMF and the neural attention item similarity model NAIS share the user and item embedding vectors, yielding the generalized neural attention recommendation model GNAS, shown in the following formula:
ŷ_ui = h^T ( p_i ⊙ Σ_{j ∈ R_u^+ \ {i}} a_ij q_j )

wherein h^T is the prediction-layer weight vector, whose purpose is to prevent the gradient vanishing that would be caused by adding the dot-product result directly into the generalized neural attention recommendation model GNAS; a_ij is the attention weight used to calculate the contribution of the interacted historical item j when predicting user u's preference for the target item i, and it is parameterized as a softmax variant of the attention function f, as shown in the following formula:

a_ij = exp( f(p_i, q_j) ) / [ Σ_{j ∈ R_u^+ \ {i}} exp( f(p_i, q_j) ) ]^β
wherein β is a penalty coefficient with value range [0,1], used to reduce the model's penalty on active users whose number of historical interacted items exceeds a threshold;
the attention function f is combined by two models of a generalized matrix factorization model GMF and an MLP and is combined by vectorsMapped to the output layer as shown in the following formula:
where the inputs of the attention function f are p_i and q_j; W and b denote the weight matrix and the bias vector, respectively; ReLU is the activation function; h is a set of trainable vectors whose purpose is to project the result from the hidden layer to the output layer, and the dimension of the weight matrix W corresponds to that of h^T;
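Assembling the formulas of step 1.1, the forward pass for one (user, target item) pair can be sketched in NumPy. The sizes, the use of two separate embedding tables for p_i and q_j, and the use of distinct projection vectors inside f (`h_att`) and in the output (`h_out`) are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

rng = np.random.default_rng(1)
n_items, d, k = 6, 4, 8                 # illustrative sizes
P = rng.normal(size=(n_items, d))       # target-item embedding table (p_i)
Q = rng.normal(size=(n_items, d))       # historical-item embedding table (q_j)
W = rng.normal(size=(k, d))             # attention MLP weight matrix
b = np.zeros(k)                         # bias vector
h_att = rng.normal(size=k)              # projection vector inside f
h_out = rng.normal(size=d)              # outer projection h^T in the prediction

def gnas_score(i, history, beta=0.5):
    """Predict user u's preference for target item i from u's interacted items."""
    hist = [j for j in history if j != i]        # j != i: prevent self-recommendation
    p_i, q_hist = P[i], Q[hist]                  # embedding lookups
    prod = p_i * q_hist                          # p_i ⊙ q_j for each history item, (m, d)
    hidden = np.maximum(0.0, prod @ W.T + b)     # ReLU(W(p_i ⊙ q_j) + b)
    f = hidden @ h_att                           # attention logits f(p_i, q_j), (m,)
    e = np.exp(f)
    a = e / (e.sum() ** beta)                    # smoothed softmax: denominator raised to beta
    return h_out @ (a[:, None] * prod).sum(axis=0)   # h^T Σ_j a_ij (p_i ⊙ q_j)

score = gnas_score(2, [1, 3, 4, 2])
print(float(score))
```

With β = 1 the weights reduce to a standard softmax; smaller β shrinks the denominator and softens the normalization for users with long histories.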
Step 1.2, pre-training the constructed generalized neural attention recommendation model;
During pre-training, the item embedding vectors trained by the factored item similarity model FISM are used to initialize the item vectors in the generalized neural attention recommendation model GNAS, replacing random initialization; the other parameters to be learned, h^T, W, and b, are initialized from Gaussian distributions;
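A sketch of this initialization step. The array shapes and the Gaussian standard deviation of 0.01 are illustrative assumptions, and `fism_q` stands in for item vectors that would come from an actually trained FISM model:

```python
import numpy as np

rng = np.random.default_rng(42)
n_items, d, k = 100, 16, 16             # illustrative sizes

fism_q = rng.normal(size=(n_items, d))  # stand-in for FISM-pretrained item vectors

params = {
    "Q": fism_q.copy(),                    # item vectors: copied from FISM, not random
    "P": fism_q.copy(),                    # target-item table, also warm-started
    "W": rng.normal(0.0, 0.01, (k, d)),    # remaining parameters: Gaussian init
    "b": np.zeros(k),
    "h": rng.normal(0.0, 0.01, k),
}
print(sorted(params))
```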
Step 2, optimizing the model using the attention mechanism formed by integrating GMF with the multilayer perceptron MLP;
Step 2.1, establishing the objective function of the model:

$$L = -\sum_{(u,i) \in \mathcal{R}^{+}} \log \sigma(\hat{y}_{ui}) - \sum_{(u,j) \in \mathcal{R}^{-}} \log\big(1 - \sigma(\hat{y}_{uj})\big) + \lambda \lVert \Theta \rVert^{2}$$
where L is the loss and σ is the sigmoid function, which maps the prediction ŷ into the range (0,1); R^+ and R^- denote the positive-instance set of items the user has interacted with and the negative-instance set of items the user has not interacted with, and together they contain the N training instances; Θ denotes all trainable parameters, including p_i, q_j, h^T, W, and b; λ is the hyper-parameter controlling the strength of the L_2 regularization that prevents overfitting;
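This objective is the standard binary log loss over positive and sampled negative instances, plus L2 weight decay. A minimal sketch (the λ value and the parameter list passed as Θ are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gnas_loss(pos_scores, neg_scores, theta, lam=1e-6):
    """Log loss over positive set R+ and sampled negative set R-,
    plus lambda * ||Theta||^2 regularization."""
    pos = -np.log(sigmoid(pos_scores)).sum()        # terms for (u, i) in R+
    neg = -np.log(1.0 - sigmoid(neg_scores)).sum()  # terms for (u, j) in R-
    reg = lam * sum(float((p ** 2).sum()) for p in theta)
    return pos + neg + reg

loss = gnas_loss(np.array([2.0, 1.5]), np.array([-1.0, 0.2, -0.3]),
                 [np.ones(4), np.zeros(3)], lam=1e-6)
print(float(loss))
```

Raising a positive score (or lowering a negative one) strictly decreases the loss, which is what drives the ranking behaviour.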
Step 2.2, to minimize the objective function, the adaptive gradient algorithm Adagrad is adopted to automatically adjust the learning rate of each parameter during training; for each positive instance (u, i), a fixed proportion of negative instances is randomly sampled from the unobserved interactions to pair with it;
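A sketch of the negative-sampling pairing and the Adagrad update rule. The sampling ratio of 4 negatives per positive and the learning rate are illustrative assumptions; Adagrad divides each step by the root of the accumulated squared gradients, so frequently updated parameters get smaller effective learning rates:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_negatives(all_items, interacted, ratio=4):
    """For one positive instance, draw `ratio` items the user never interacted with."""
    pool = np.array(sorted(set(all_items) - set(interacted)))
    return rng.choice(pool, size=ratio, replace=False)

def adagrad_step(theta, grad, accum, lr=0.01, eps=1e-8):
    """Adagrad: per-parameter learning rate shrinks with the accumulated gradient."""
    accum += grad ** 2
    theta -= lr * grad / (np.sqrt(accum) + eps)
    return theta, accum

negs = sample_negatives(range(10), interacted=[0, 2, 5], ratio=4)
theta, accum = adagrad_step(np.ones(3), np.array([0.5, -1.0, 0.0]), np.zeros(3))
print(negs, theta)
```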
Step 3, after the model has been optimized, predicting the user's preference for target items with the optimized generalized neural attention recommendation model and generating a personalized recommendation list for the user.
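Generating the list reduces to scoring every candidate item and keeping the highest-scoring ones. A minimal sketch with stand-in scores in place of the trained model's predictions (the list length K and the exclusion of already-interacted items are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n_items, K = 10, 3
interacted = {0, 2, 5}
scores = rng.normal(size=n_items)  # stand-in for the model's predictions y_ui

# Rank only items the user has not yet interacted with, keep the top-K.
candidates = [i for i in range(n_items) if i not in interacted]
top_k = sorted(candidates, key=lambda i: scores[i], reverse=True)[:K]
print(top_k)
```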
CN201911291806.XA 2019-12-16 2019-12-16 Article recommendation method based on generalized neural attention Active CN111127142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911291806.XA CN111127142B (en) Article recommendation method based on generalized neural attention

Publications (2)

Publication Number Publication Date
CN111127142A CN111127142A (en) 2020-05-08
CN111127142B true CN111127142B (en) 2023-09-08

Family

ID=70499116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911291806.XA Active CN111127142B (en) 2019-12-16 2019-12-16 Article recommendation method based on generalized nerve attention

Country Status (1)

Country Link
CN (1) CN111127142B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111626827B (en) * 2020-05-28 2023-06-13 苏州大学 Article recommendation method, device, equipment and medium based on sequence recommendation model
CN111737578B (en) * 2020-06-22 2024-04-02 陕西师范大学 Recommendation method and system
CN112328893B (en) * 2020-11-25 2022-08-02 重庆理工大学 Recommendation method based on memory network and cooperative attention
CN112597392B (en) * 2020-12-25 2022-09-30 厦门大学 Recommendation system based on dynamic attention and hierarchical reinforcement learning
CN112631560B (en) * 2020-12-29 2023-07-07 上海海事大学 Method and terminal for constructing objective function of recommendation model
CN112862007B (en) * 2021-03-29 2022-12-13 山东大学 Commodity sequence recommendation method and system based on user interest editing
CN113362034A (en) * 2021-06-15 2021-09-07 南通大学 Position recommendation method
CN113643817A (en) * 2021-06-25 2021-11-12 合肥工业大学 Medical case knowledge recommendation method and system considering implicit feedback and man-machine interaction
CN114791983B (en) * 2022-04-13 2023-04-07 湖北工业大学 Sequence recommendation method based on time sequence article similarity
CN114581161B (en) * 2022-05-06 2022-08-16 深圳市明珞锋科技有限责任公司 Information pushing method and system based on deep learning
CN115187343B (en) * 2022-07-20 2023-08-08 山东省人工智能研究院 Attention graph convolution neural network-based multi-behavior recommendation method
CN115953215B (en) * 2022-12-01 2023-09-05 上海交通大学 Search type recommendation method based on time and graph structure

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108874914A (en) * 2018-05-29 2018-11-23 吉林大学 Information recommendation method based on graph convolution and neural collaborative filtering
CN109087130A (en) * 2018-07-17 2018-12-25 深圳先进技术研究院 Recommender system and recommendation method based on attention mechanism
CN109299396A (en) * 2018-11-28 2019-02-01 东北师范大学 Convolutional neural network collaborative filtering recommendation method and system fusing an attention model
CN109670121A (en) * 2018-12-18 2019-04-23 辽宁工程技术大学 Item-level and feature-level deep collaborative filtering recommendation algorithm based on attention mechanism
CN109785062A (en) * 2019-01-10 2019-05-21 电子科技大学 Hybrid neural network recommender system based on collaborative filtering model

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190325293A1 (en) * 2018-04-19 2019-10-24 National University Of Singapore Tree enhanced embedding model predictive analysis methods and systems


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Deep collaborative filtering model based on attention mechanism; Xie Enning et al.; Journal of China University of Metrology; Vol. 30, No. 2; pp. 219-225, 242 *


Similar Documents

Publication Publication Date Title
CN111127142B (en) Article recommendation method based on generalized neural attention
Darban et al. GHRS: Graph-based hybrid recommendation system with application to movie recommendation
CN109657156B (en) Personalized recommendation method based on recurrent generative adversarial network
CN110717098B (en) Meta-path-based context-aware user modeling method and sequence recommendation method
CN112529168B (en) GCN-based attribute multilayer network representation learning method
CN109389151B (en) Knowledge graph processing method and device based on semi-supervised embedded representation model
CN112364976B (en) User preference prediction method based on session recommendation system
CN110781409B (en) Article recommendation method based on collaborative filtering
CN113918832B (en) Graph convolution collaborative filtering recommendation system based on social relationship
CN114519145A (en) Sequence recommendation method for mining long-term and short-term interests of users based on graph neural network
CN112307332A (en) Collaborative filtering recommendation method and system based on user portrait clustering and storage medium
CN116542720B (en) Time enhancement information sequence recommendation method and system based on graph convolution network
CN112364242A (en) Graph convolution recommendation system for context-aware type
Daneshvar et al. A social hybrid recommendation system using LSTM and CNN
Jin et al. Neighborhood-aware web service quality prediction using deep learning
US20200401880A1 (en) Generating a recommended target audience based on determining a predicted attendance utilizing a machine learning approach
Wang et al. Session-based recommendation with time-aware neural attention network
CN113590976A (en) Recommendation method of space self-adaptive graph convolution network
CN115809374B (en) Method, system, device and storage medium for correcting mainstream deviation of recommendation system
Mu et al. Auxiliary stacked denoising autoencoder based collaborative filtering recommendation
CN111475744A (en) Personalized position recommendation method based on ensemble learning
CN116541592A (en) Vector generation method, information recommendation method, device, equipment and medium
CN113704318A (en) Recurrent neural network FPGNN behavior trajectory prediction method based on frequent pattern graph embedding
Li et al. A collaborative filtering recommendation method based on TagIEA expert degree model
Wei et al. A novel image recommendation model based on user preferences and social relationships

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant