CN112085158A - Book recommendation method based on stack noise reduction self-encoder - Google Patents

Book recommendation method based on stack noise reduction self-encoder

Info

Publication number
CN112085158A
CN112085158A
Authority
CN
China
Prior art keywords
book
user
data
encoder
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010703474.8A
Other languages
Chinese (zh)
Inventor
薛涛
赵雪青
许挺娟
高岭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Polytechnic University
Original Assignee
Xian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Polytechnic University
Priority to CN202010703474.8A
Publication of CN112085158A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Abstract

The invention discloses a book recommendation method based on a stacked denoising autoencoder (SDAE). First, the obtained book training set is preprocessed to obtain a complete book training set. Next, a feature extraction module is constructed, and the SDAE module performs feature conversion and feature extraction on the data in the complete book training set to obtain a user feature vector, a user bias vector, a book feature vector and a book bias vector representing each book sample; these vectors are used as the input of a deep learning model. The deep learning model is then trained by stochastic gradient descent and the back-propagation algorithm, and the finished model is obtained once the loss function converges. Finally, the model is applied to the university library to verify its validity and feasibility. By adding auxiliary information and bias vectors, the method breaks the barrier between deep learning and the traditional recommendation algorithm and links the two, further improving the performance of the recommendation algorithm.

Description

Book recommendation method based on stack noise reduction self-encoder
Technical Field
The invention belongs to the technical field of deep learning-based recommendation, and particularly relates to a recommendation technology research based on a stack noise reduction self-encoder, which is used for realizing intelligent recommendation in the field of books.
Background
A paper published by YouTube in 2016 showed very good results from applying deep learning to video recommendation; since then, deep learning techniques have been applied to recommender systems ever more widely, with a steady stream of papers and industrial applications. The well-known international recommender-systems conference RecSys has organized a workshop dedicated to deep learning since 2016, and deep learning is increasingly valued in the recommendation community. Traditional recommendation algorithms must recommend from a user-item scoring matrix; such methods not only suffer from the sparsity of the scoring matrix but also give unsatisfactory recommendation precision. Relative to traditional approaches, the biggest feature of the stacked denoising autoencoder (SDAE) is its ability to learn hidden vectors for users and items.
On the one hand, in order to alleviate problems such as cold start and data sparsity, auxiliary information, mainly auxiliary information about users and books, is used as input to the deep learning model; the results show that using auxiliary information can effectively improve recommendation precision. On the other hand, ever since deep learning was introduced into the recommendation-algorithm field, improving recommendation accuracy has been an important issue in the recommendation field. On the basis of learning the hidden vector matrices U and V of users and items with the improved SDAE depth model, the user-item scoring matrix R is combined with auxiliary information and bias vectors to learn U and V, the missing values in the scoring matrix R are predicted, and the filled scoring matrix is used to recommend books to users.
Disclosure of Invention
The invention provides a book recommendation method based on a stack noise reduction self-encoder which, by adding auxiliary information and bias vectors, breaks the barrier between deep learning and the traditional recommendation algorithm and links the two, thereby further improving the performance of the recommendation algorithm.
The technical scheme provided by the invention is as follows:
a book recommendation method based on a stack noise reduction self-encoder specifically comprises the following steps:
step 1: downloading the public Book-Crossing book data set;
step 2: preprocessing the book data set prepared in step 1, solving the problems of inconsistent and disordered data, and obtaining the processed data set;
step 3: defining hyper-parameters; the hyper-parameters used for model training in the invention are: e_epochs = 100; num_layers = 3; learning_rate = 0.1;
step 4: extracting the data features of users and books with a deep-learning SDAE (stacked denoising autoencoder) and constructing a user feature vector and a book feature vector;
step 5: constructing two stacked-denoising-autoencoder depth models and using the two feature vectors obtained in step 4 as their input, thereby constructing the deep learning model; during training, optimizing the loss function with stochastic gradient descent (SGD) and back-propagation and adjusting the weights layer by layer until the loss function converges; the network training then finishes and yields a deep learning model with accurate weights, so that the feature vectors of users and books are more accurate;
step 6: adding the user and book biases to the user and book feature vectors obtained by the training in step 5, then feeding the two feature vectors into the corresponding vectors of the matrix factorization and filling the scoring matrix to achieve book recommendation;
step 7: applying the model provided herein to the university library to recommend books for readers.
Further, the book data is preprocessed in step 2 as follows:
(1) filling the data whose fields are empty in the data set, using the mean value;
(2) backing up and deleting the data fields that are not needed in the data set;
(3) manually processing the data for the cases where data fields and data in the data set do not match;
(4) randomly selecting and duplicating data from the book training set processed by (1) to (3) and adding them to the original book data set to obtain the complete book data set.
Further, the process of training the deep network model in step 5 is as follows:
(1) dividing the complete book data set of step 2 into several data sample batches;
(2) encoding the user and book information in the data set into vectors; the feature expressions of encoding and decoding are as follows:
h = f(X),  X_hat = g(h)

where f and g are the mappings of the encoder and decoder of the self-encoder respectively; after solving, the hidden-layer feature h output by the encoder, i.e. the coding feature, can be regarded as a representation of the input data X, and the objective function is:

min_{f,g} ||X - g(f(X))||^2

i.e. the squared loss between the predicted value X_hat = g(f(X)) and the original value X;
(3) training the user data features and bias information and the book data features and bias information respectively with the stacked denoising autoencoder, training layer by layer with stochastic gradient descent and back-propagation, learning the weight matrix W and bias value b of each layer, and obtaining the user feature vector, user bias vector, book feature vector and book bias vector respectively.
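The layer-wise training described above can be sketched as a single denoising-autoencoder layer trained by plain gradient descent; the layer sizes, corruption level and learning rate below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))                       # clean input batch

# one denoising autoencoder layer: corrupt -> encode -> decode
W1 = rng.normal(scale=0.1, size=(8, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.1, size=(4, 8)); b2 = np.zeros(8)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(200):
    Xn = X + 0.1 * rng.normal(size=X.shape)        # corrupt the input
    H = sigmoid(Xn @ W1 + b1)                      # encoder: hidden feature h
    Xr = H @ W2 + b2                               # decoder: reconstruction
    D = Xr - X                                     # error against the CLEAN input
    losses.append(float((D ** 2).mean()))
    # back-propagation of the squared reconstruction loss
    gW2 = H.T @ D; gb2 = D.sum(0)
    gH = D @ W2.T * H * (1 - H)
    gW1 = Xn.T @ gH; gb1 = gH.sum(0)
    lr = 0.01 / len(X)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

Stacking then means feeding the learned hidden feature H as the input of the next such layer.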
Further, the user-book interaction model in step 6 proceeds as follows:
(1) taking the implicit feature matrices and bias vectors of users and items obtained from formula (1) as the input matrices of the matrix factorization, and adopting stochastic gradient descent for optimization during the learning of the stacked denoising autoencoder; when predicting a score, the implicit feature vectors of the user and the item produce the predicted score through the model of this text, namely:
R_ui = f(P_u, Q_i | P, Q, θ_f)    (3)
where θ_f is the model parameter of the interaction function f(·), which here represents the self-encoder; the output of one hidden layer serves as the input of the next hidden layer, and P and Q denote the user scoring matrix and the book scoring matrix;
(2) adding the bias vectors obtained after training into the model, i.e. modifying formula (4), so that the objective function of the model becomes:
L = Σ_{u,i} ( R_ui - (P_u^T Q_i + b_u + b_i) )^2 + μ · f_regulation    (4)

where μ is a trade-off parameter and f_regulation is a regularization term that prevents overfitting:

f_regulation = ||W||^2 + ||b||^2    (5)

where W denotes the weight matrices of the two SDAEs and b the corresponding bias vectors;
(3) the implicit feature vector of the user can be obtained through step (2), and the implicit feature vector of the book can be obtained in the same way. Having obtained the implicit feature vectors of users and books through the above process, and the user bias vector and the book bias vector from the stacked denoising autoencoder, these are used as the input of the matrix factorization to fill the matrix; finally, books are recommended to the user according to the scoring matrix.
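The fill-and-recommend step above amounts to biased matrix-factorization scoring followed by a top-k selection; the following is a sketch under assumed shapes and hypothetical helper names:

```python
import numpy as np

def fill_scores(P, Q, b_u, b_v):
    """R_hat[u, i] = P_u . Q_i + b_u + b_i: fill the whole scoring matrix."""
    return P @ Q.T + b_u[:, None] + b_v[None, :]

def top_k(R_hat, rated_mask, k):
    """Recommend the k highest-scored, not-yet-rated books per user."""
    scores = np.where(rated_mask, -np.inf, R_hat)
    return np.argsort(-scores, axis=1)[:, :k]

# tiny illustrative example: 1 user, 3 books, user already rated book 2
P = np.array([[1.0, 0.0]])
Q = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 0.0]])
R_hat = fill_scores(P, Q, np.zeros(1), np.zeros(3))
recs = top_k(R_hat, np.array([[False, False, True]]), 2)
```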
Compared with the prior art, the invention has the following beneficial effects: the book recommendation method based on the stack noise reduction self-encoder provided by the invention, by adding auxiliary information and bias vectors, breaks the barrier between deep learning and the traditional recommendation algorithm and links the two, thereby further improving the performance of the recommendation algorithm.
Drawings
Fig. 1 is a structural diagram of the fusion auxiliary information of the present invention.
FIG. 2 is a block diagram of a fused offset vector.
Fig. 3 is a model of the present text.
Fig. 4 is a reference model used in comparison.
FIG. 5 is a graph comparing the RMSE performance of the deep learning model constructed by the invention on the Book-Crossing book data set with several other conventional models.
FIG. 6 is a graph comparing the Recall effect of the deep learning model constructed by the present invention on Book-Crossing Book data set with several other conventional models.
FIG. 7 is a new user recommendation list.
FIG. 8 is an old user recommendation list.
Detailed Description
The technical solution of the present invention is clearly and completely described below with reference to specific embodiments. It is obvious that the described embodiments are only some of the embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention relates to research on a deep-learning-based book recommendation technology and its application in a library; the specific implementation steps are shown in FIG. 2, where b_u and b_v are the user bias vector and the book bias vector obtained by training, respectively.
the method of the present invention is illustrated below by taking the Book-cross Book data set as an example.
Step 1: preparation of book data set
The Book-Crossing book data set, downloaded from the Internet, contains the ratings of 271,379 books by 278,858 users, including explicit and implicit ratings. Demographic attributes such as the ages of these users are kept and analyzed in anonymized form. The data set mainly comprises three tables: a user table, a book table and a book-rating table. The user table contains the basic information of the 278,858 users, the book table contains the information of the 271,379 books, and the book-rating table contains 1,149,780 ratings of the 271,379 books by the 278,858 users.
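The rating table of this data set is distributed as quoted, semicolon-separated CSV; the short excerpt below follows that convention, though the exact column names and file layout should be treated as assumptions:

```python
import csv, io

# two illustrative rows in the ';'-separated, quoted Book-Crossing format
sample = (
    '"User-ID";"ISBN";"Book-Rating"\n'
    '"276725";"034545104X";"0"\n'
    '"276726";"0155061224";"5"\n'
)
rows = list(csv.DictReader(io.StringIO(sample), delimiter=';'))
# keep (user, isbn, rating) triples; a rating of 0 is an implicit interaction
ratings = [(r['User-ID'], r['ISBN'], int(r['Book-Rating'])) for r in rows]
```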
Step 2: book training set preprocessing.
(1) filling the data whose fields are empty in the data set, using the mean value;
(2) backing up and deleting the data fields that are not needed in the data set;
(3) manually processing the data for the cases where data fields and data in the data set do not match;
(4) randomly selecting and duplicating data from the book training set processed by (1) to (3) and adding them to the original book training sample set to obtain the complete book training set.
Step 3: constructing a user feature vector and a book feature vector.
First, the complete book data set obtained in step 2 is taken as the input; then a deep-learning feature extraction module is constructed to extract features from the input data; finally, the extracted feature vectors are used as the input of the next step. The feature expressions for feature extraction are as follows:
f:χ→F (1)
g:F→χ (2)
χ_hat = g(f(χ))

where χ denotes the feature input, F the feature space, and f and g the encoding function and decoding function respectively.
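The mappings f: χ→F and g: F→χ of formulas (1) and (2) form an encode/decode pair; a toy sketch follows, in which the dimensions and the sigmoid activation are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
W_enc = rng.normal(scale=0.1, size=(6, 3)); b_enc = np.zeros(3)
W_dec = rng.normal(scale=0.1, size=(3, 6)); b_dec = np.zeros(6)

def f(X):
    """encoder f: chi -> F (produces the extracted feature vector h)"""
    return 1.0 / (1.0 + np.exp(-(X @ W_enc + b_enc)))

def g(H):
    """decoder g: F -> chi (reconstructs the input from the feature)"""
    return H @ W_dec + b_dec

X = rng.normal(size=(5, 6))
h = f(X)            # used as the feature vector for the next step
X_rec = g(h)
```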
Step 4: building the deep learning model.
Two stacked-denoising-autoencoder depth models are constructed, and the two feature vectors obtained in step 3 are used respectively as the inputs of the depth models.
Step 5: training the deep learning model.
The whole network model from input to output has been constructed through steps 1 to 4. In this step, the network model obtained above is trained on the given training data set, adjusting its weights to optimize the loss function until the loss converges; the final weights yield the trained network model, shown in fig. 1, where S, Uside and Vside in fig. 1 are the user-book scoring matrix, the user auxiliary information and the book auxiliary information, respectively.
(1) Setting the objective function of the network model:

L = Σ_{u,i} ( R_ui - (P_u^T Q_i + b_u + b_i) )^2 + μ · f_regulation

where μ is a trade-off parameter and f_regulation is a regularization term that prevents overfitting:

f_regulation = ||W||^2 + ||b||^2
(2) Updating the weights by stochastic gradient descent.
Each variable in the objective function is optimized alternately while the others are held fixed.
For u_i and v_j, these latent factors are learned with a stochastic gradient descent algorithm. Let L(U, V) denote the objective function when the other variables, independent of U and V, are fixed. The update rules are then:

u_i ← u_i - φ · ∂L/∂u_i
v_j ← v_j - φ · ∂L/∂v_j

where φ is the learning rate; the weights are continuously updated through repeated iterative training, the loss gradually converges, and the weight updates finally stop, giving the final network model.
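The update rules above can be sketched as alternating gradient steps on the squared loss; the learning rate φ (`phi`) and regularization constant below are illustrative assumptions:

```python
import numpy as np

def sgd_step(U, V, R, mask, phi=0.01, lam=0.1):
    """One update u_i <- u_i - phi * dL/du_i (and likewise for v_j) for
    L = 0.5 * sum over observed entries of (R - U V^T)^2 + regularization."""
    E = mask * (R - U @ V.T)            # prediction error on observed entries
    U_new = U + phi * (E @ V - lam * U)
    V_new = V + phi * (E.T @ U - lam * V)
    return U_new, V_new

rng = np.random.default_rng(2)
R = rng.normal(size=(10, 8)); mask = np.ones_like(R)
U = 0.1 * rng.normal(size=(10, 3)); V = 0.1 * rng.normal(size=(8, 3))
losses = []
for _ in range(100):
    losses.append(float(((mask * (R - U @ V.T)) ** 2).sum()))
    U, V = sgd_step(U, V, R, mask)      # loss decreases as training iterates
```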
Step 6: book recommendation.
The user and book biases are added to the user and book feature vectors obtained by the training of step 5, as shown in fig. 2; the two feature vectors are then fed into the corresponding vectors of the matrix factorization to obtain the final recommendation model shown in fig. 3, into which the user feature vectors and book feature vectors trained as in fig. 1 are added respectively. The matrix is filled during the recommendation process using the following formula:
R_ui = f(P_u, Q_i | P, Q, θ_f)    (8)
where R_ui denotes the predicted entry of the scoring matrix.
Step 7: evaluation using evaluation indexes.
(1) Taking Root Mean Square Error (RMSE) as one of the evaluation indexes:

RMSE = sqrt( Σ_{(i,j)∈T} (R_ij - R_hat_ij)^2 / |T| )

where R_ij is the score given by user i to item j, R_hat_ij is the corresponding prediction score, T is the test set and |T| is the total number of scores in the test set.
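The RMSE above can be computed directly; `rmse` is a hypothetical helper name for illustration:

```python
import math

def rmse(R, R_hat, test_pairs):
    """RMSE = sqrt( sum over (i,j) in T of (R_ij - R_hat_ij)^2 / |T| )"""
    se = sum((R[i][j] - R_hat[i][j]) ** 2 for i, j in test_pairs)
    return math.sqrt(se / len(test_pairs))
```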
(2) Adopting the recall rate for evaluation:

Recall = T_rs / T_s

where T_rs denotes the number of books in the recommendation list that the reader is interested in, and T_s denotes the total number of books in the recommendation list. Precision and recall rate influence each other; in general, a higher precision comes with a lower recall rate. The evaluation results are shown in fig. 4, fig. 5 and fig. 6.
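The recall measure, as defined in this text (T_rs over T_s), can be computed as follows; the function name is a hypothetical helper:

```python
def recall(recommended, interesting):
    """T_rs / T_s: share of the recommendation list the reader is interested
    in, following the definition used in this text."""
    t_rs = sum(1 for book in recommended if book in interesting)
    return t_rs / len(recommended)
```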
Step 8: The model of this text is applied to the university library to recommend books for readers, verifying the feasibility and practicability of the model. Readers are divided into new readers and old readers, and recommendations are made for each group; the specific recommendation lists are shown in fig. 7 and fig. 8.

Claims (4)

1. A book recommendation method based on a stack noise reduction self-encoder is characterized by specifically comprising the following steps:
step 1: downloading the public Book-Crossing book data set;
step 2: preprocessing the book data set prepared in step 1, solving the problems of inconsistent and disordered data, and obtaining the processed data set;
step 3: defining hyper-parameters; the hyper-parameters used for model training are: e_epochs = 100; num_layers = 3; learning_rate = 0.1;
step 4: extracting the data features of users and books with a deep-learning SDAE (stacked denoising autoencoder) and constructing a user feature vector and a book feature vector;
step 5: constructing two stack-noise-reduction-self-encoder depth models and using the two feature vectors obtained in step 4 as their input, thereby constructing the deep learning model; during training, optimizing the loss function with stochastic gradient descent (SGD) and back-propagation and adjusting the weights layer by layer until the loss function converges; the network training then finishes and yields a deep learning model with accurate weights, so that the feature vectors of users and books are more accurate;
step 6: adding the user and book biases to the user and book feature vectors obtained by the training in step 5, then feeding the two feature vectors into the corresponding vectors of the matrix factorization and filling the scoring matrix to achieve book recommendation;
step 7: applying the model provided herein to the university library to recommend books for readers.
2. The book recommendation method based on the stack noise reduction self-encoder as claimed in claim 1, wherein the book data is preprocessed in step 2 as follows:
(1) filling the data whose fields are empty in the data set, using the mean value;
(2) backing up and deleting the data fields that are not needed in the data set;
(3) manually processing the data for the cases where data fields and data in the data set do not match;
(4) randomly selecting and duplicating data from the book training set processed by (1) to (3) and adding them to the original book data set to obtain the complete book data set.
3. The book recommendation method based on the stack noise reduction self-encoder as claimed in claim 1, wherein the process of training the deep network model in step 5 is as follows:
(1) dividing the complete book data set of step 2 into several data sample batches;
(2) encoding the user and book information in the data set into vectors; the feature expressions of encoding and decoding are as follows:

h = f(X),  X_hat = g(h)

where f and g are the mappings of the encoder and decoder of the self-encoder respectively; after solving, the hidden-layer feature h output by the encoder, i.e. the coding feature, can be regarded as a representation of the input data X, and the objective function is:

min_{f,g} ||X - g(f(X))||^2

i.e. the squared loss between the predicted value X_hat = g(f(X)) and the original value X;
(3) training the user data features and bias information and the book data features and bias information respectively with the stack noise reduction self-encoder, training layer by layer with stochastic gradient descent and back-propagation, learning the weight matrix W and bias value b of each layer, and obtaining the user feature vector, user bias vector, book feature vector and book bias vector respectively.
4. The book recommendation method based on the stack noise reduction self-encoder as claimed in claim 1, wherein the process of the user-book interaction model in step 6 is as follows:
(1) taking the implicit feature matrices and bias vectors of users and items obtained from formula (1) as the input matrices of the matrix factorization, and adopting stochastic gradient descent for optimization during the learning of the stack noise reduction self-encoder; when predicting a score, the implicit feature vectors of the user and the item produce the predicted score through the model of this text, namely:

R_ui = f(P_u, Q_i | P, Q, θ_f)    (3)

where θ_f is the model parameter of the interaction function f(·), which here represents the self-encoder; the output of one hidden layer serves as the input of the next hidden layer, and P and Q denote the user scoring matrix and the book scoring matrix;
(2) adding the bias vectors obtained after training into the model, i.e. modifying formula (4), so that the objective function of the model becomes:

L = Σ_{u,i} ( R_ui - (P_u^T Q_i + b_u + b_i) )^2 + μ · f_regulation    (4)

where μ is a trade-off parameter and f_regulation is a regularization term that prevents overfitting:

f_regulation = ||W||^2 + ||b||^2    (5)

where W denotes the weight matrices of the two SDAEs and b the corresponding bias vectors;
(3) the implicit feature vector of the user can be obtained through step (2), and the implicit feature vector of the book can be obtained in the same way. Having obtained the implicit feature vectors of users and books through the above process, and the user bias vector and the book bias vector from the stack noise reduction self-encoder, these are used as the input of the matrix factorization to fill the matrix; finally, books are recommended to the user according to the scoring matrix.
CN202010703474.8A 2020-07-21 2020-07-21 Book recommendation method based on stack noise reduction self-encoder Pending CN112085158A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010703474.8A CN112085158A (en) 2020-07-21 2020-07-21 Book recommendation method based on stack noise reduction self-encoder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010703474.8A CN112085158A (en) 2020-07-21 2020-07-21 Book recommendation method based on stack noise reduction self-encoder

Publications (1)

Publication Number Publication Date
CN112085158A true CN112085158A (en) 2020-12-15

Family

ID=73735476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010703474.8A Pending CN112085158A (en) 2020-07-21 2020-07-21 Book recommendation method based on stack noise reduction self-encoder

Country Status (1)

Country Link
CN (1) CN112085158A (en)


Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102317962A (en) * 2008-12-12 2012-01-11 纽约市哥伦比亚大学理事会 Machine optimization devices, methods, and systems
CN104679835A (en) * 2015-02-09 2015-06-03 浙江大学 Book recommending method based on multi-view hash
CN107516113A (en) * 2017-08-28 2017-12-26 深圳市唯特视科技有限公司 A kind of visual search target decoder method based on image generation model
CN109241449A (en) * 2018-10-30 2019-01-18 国信优易数据有限公司 A kind of item recommendation method and device
CN110162601A (en) * 2019-05-22 2019-08-23 吉林大学 A kind of biomedical publication submission recommender system based on deep learning
CN110275964A (en) * 2019-06-26 2019-09-24 程淑玉 The recommended models of knowledge based map and Recognition with Recurrent Neural Network
LU101290B1 (en) * 2018-08-17 2019-11-29 Univ Qilu Technology Method, System, Storage Medium and Electric Device of Medical Automatic Question Answering
CN110609960A (en) * 2019-09-25 2019-12-24 华中师范大学 Learning resource recommendation method and device, data processing equipment and storage medium
CN110717103A (en) * 2019-10-09 2020-01-21 东北大学 Improved collaborative filtering method based on stack noise reduction encoder
CN110781401A (en) * 2019-11-07 2020-02-11 电子科技大学 Top-n project recommendation method based on collaborative autoregressive flow
CN110807154A (en) * 2019-11-08 2020-02-18 内蒙古工业大学 Recommendation method and system based on hybrid deep learning model
CN110826056A (en) * 2019-11-11 2020-02-21 南京工业大学 Recommendation system attack detection method based on attention convolution self-encoder
CN111309996A (en) * 2019-12-31 2020-06-19 马鞍山佰兆信息科技有限公司 Intelligent library auxiliary management system
WO2020135144A1 (en) * 2018-12-29 2020-07-02 北京世纪好未来教育科技有限公司 Method and apparatus for predicting object preference, and computer-readable medium


Non-Patent Citations (1)

Title
Gao Quanli et al., "Multi-feature kernel linear discriminant analysis recommendation method", Journal of Southeast University, vol. 49, no. 5, pp. 883-889 *

Cited By (5)

Publication number Priority date Publication date Assignee Title
CN112765339A (en) * 2021-01-21 2021-05-07 山东师范大学 Personalized book recommendation method and system based on reinforcement learning
CN112989196A (en) * 2021-03-30 2021-06-18 北京工业大学 Book recommendation method based on personalized recall algorithm LFM
CN112989196B (en) * 2021-03-30 2024-04-19 北京工业大学 Book recommendation method based on personalized recall algorithm LFM
CN113641907A (en) * 2021-08-17 2021-11-12 中国科学院重庆绿色智能技术研究院 Hyper-parameter self-adaptive depth recommendation method and device based on evolutionary algorithm
CN113641907B (en) * 2021-08-17 2023-11-28 中国科学院重庆绿色智能技术研究院 Super-parameter self-adaptive depth recommendation method and device based on evolutionary algorithm

Similar Documents

Publication Publication Date Title
CN110807154B (en) Recommendation method and system based on hybrid deep learning model
CN108460619B (en) Method for providing collaborative recommendation model fusing explicit and implicit feedback
CN110674407B (en) Hybrid recommendation method based on graph convolution neural network
CN111310063B (en) Neural network-based article recommendation method for memory perception gated factorization machine
CN108509573B (en) Book recommendation method and system based on matrix decomposition collaborative filtering algorithm
CN110674850A (en) Image description generation method based on attention mechanism
CN104063481A (en) Film individuation recommendation method based on user real-time interest vectors
Jiao et al. A novel learning rate function and its application on the SVD++ recommendation algorithm
CN109033294B (en) Mixed recommendation method for integrating content information
CN113918833B (en) Product recommendation method realized through graph convolution collaborative filtering of social network relationship
CN112734104B (en) Cross-domain recommendation method fusing generation countermeasure network and self-encoder
CN112149734B (en) Cross-domain recommendation method based on stacked self-encoder
CN113918832A (en) Graph convolution collaborative filtering recommendation system based on social relationship
CN112085158A (en) Book recommendation method based on stack noise reduction self-encoder
CN113918834A (en) Graph convolution collaborative filtering recommendation method fusing social relations
CN113449200B (en) Article recommendation method and device and computer storage medium
CN113342994A (en) Recommendation system based on non-sampling cooperative knowledge graph network
CN113051468A (en) Movie recommendation method and system based on knowledge graph and reinforcement learning
CN111160859A (en) Human resource post recommendation method based on SVD + + and collaborative filtering
CN111930926A (en) Personalized recommendation algorithm combined with comment text mining
CN110795640A (en) Adaptive group recommendation method for compensating group member difference
Contardo et al. Representation learning for cold-start recommendation
CN116186384A (en) Article recommendation method and system based on article implicit feature similarity
CN115545833A (en) Recommendation method and system based on user social information
CN111125541B (en) Method for acquiring sustainable multi-cloud service combination for multiple users

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination