CN110222258A - Feature matrix initialization method based on attribute mapping and autoencoder neural network - Google Patents
Feature matrix initialization method based on attribute mapping and autoencoder neural network
- Publication number
- CN110222258A (application CN201910416224.3A)
- Authority
- CN
- China
- Prior art keywords
- article
- matrix
- attribute
- user
- eigenmatrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
Abstract
The invention discloses a feature matrix initialization method based on attribute mapping and an autoencoder neural network, belonging to the field of personalized recommendation. The method first obtains item attributes and constructs an item attribute matrix, then uses the number of attribute types to determine the dimension of the item feature matrix and uses the attribute matrix to initialize that matrix, which both avoids selecting the dimension through repeated experiments and improves convergence efficiency. To address the low precision, low efficiency, and poor interpretability of random-value initialization in classical matrix factorization, the method initializes the feature matrix with the values of the item attribute matrix and fits the rating matrix through an attribute mapping mechanism to obtain the item feature vectors, improving convergence efficiency. When there are many attribute types and the feature matrix dimension is large, the method applies an autoencoder to reduce the dimensionality of the attribute-initialized feature matrix, reducing the complexity of the algorithm.
Description
Technical field
The invention belongs to the field of personalized recommendation, and in particular relates to a feature matrix initialization method based on attribute mapping and an autoencoder neural network.
Background art
The scale and coverage of the Internet are growing rapidly, so that the text, voice, and video information generated by various terminals accumulates explosively on the network, and a user looking for information that is useful to him or her is very likely to be submerged in an ocean of data. Personalized recommendation is currently one of the most effective tools for alleviating information overload.
Recommendation algorithms are an important component of personalized recommendation and can generally be divided into content-based recommendation, collaborative filtering, knowledge-based recommendation, and hybrid recommendation. Collaborative filtering is of great significance to the development of recommender systems and can be divided into model-based and memory-based algorithms. Model-based algorithms mainly build a prediction model from rating data, use the model to predict a user's rating of an item, and then generate recommendations. Matrix factorization is the basis of model-based recommendation: it decomposes the user-item rating matrix into a user feature matrix and an item feature matrix. The most common matrix factorization methods include SVD (Singular Value Decomposition), SVD++ (Singular Value Decomposition plus plus), etc. Feature matrix initialization is the first step and a precondition of matrix factorization; the quality of the initialization directly affects the efficiency and accuracy of the recommendation model, so the initialization process plays a crucial role in the quality of the prediction model.
Traditional matrix factorization algorithms have two problems during matrix initialization: the feature matrix is usually initialized with random values, which carry no practical meaning and lead to low efficiency and low precision; and the dimension of the item feature matrix must be determined through repeated experiments, wasting a large amount of time and effort.
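The decomposition described above can be illustrated with a minimal numpy sketch. All values here are hypothetical toy numbers, not data from the patent: the rating matrix is approximated by the product of a user feature matrix P and the transpose of an item feature matrix Q.

```python
import numpy as np

# Toy matrix factorization: R is approximated by P @ Q.T.
P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # 3 users x 2 latent features (made-up values)
Q = np.array([[5.0, 1.0],
              [1.0, 4.0]])   # 2 items x 2 latent features (made-up values)

R_hat = P @ Q.T              # predicted user-item ratings, shape (3, 2)
print(R_hat)
```

In a real system P and Q are learned from observed ratings; the point of the patent is how Q is initialized before that learning starts.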
Summary of the invention
To address the above technical problems in the prior art, the invention proposes a feature matrix initialization method based on attribute mapping and an autoencoder neural network, which is reasonably designed, overcomes the deficiencies of the prior art, and achieves good results.
To achieve the goals above, the present invention adopts the following technical scheme:
A feature matrix initialization method based on attribute mapping and an autoencoder neural network. For ease of description, the following definitions are used:
U is the user set, I is the item set, R is the user-item rating matrix, P is the user feature matrix, Q is the item feature matrix, and r_ui denotes the rating of user u for item i.
The feature matrix initialization method based on attribute mapping and the autoencoder neural network includes the following steps:
Step 1: input the item attribute matrix A, the training data, the learning rate γ, and the regularization parameter λ;
Step 2: initialize the item feature matrix Q using the item attribute matrix A;
Step 3: reduce the dimensionality of the item feature matrix Q using an autoencoder;
Wherein the item feature matrix Q is expressed as {x_1, x_2, …, x_i, …, x_m}, where m is the number of items and x_i is the high-dimensional feature vector of item i. For x_i:
h(x_i; W, b) = σ(W x_i + b)  (1);
where W is the weight matrix between the input layer and the hidden layer, b is the bias of the input layer, and σ is the activation function.
The item feature vector after dimensionality reduction by the autoencoder neural network is:
q̂_i = (a_1, a_2, …, a_j, …, a_k)  (2);
where q̂_i denotes the feature vector of item i after dimensionality reduction and serves as the input item feature vector q_i in the SVD++ model, a_j (1 ≤ j ≤ k) denotes the j-th feature, and k is the feature dimension after reduction;
Step 4: randomly initialize P, b_u, and b_i;
where b_u and b_i are the rating biases of user u and item i, respectively;
Step 5: construct a loss function from the predicted ratings;
First, predict the rating of user u for item i using formula (3):
r̂_ui = μ + b_u + b_i + q_iᵀ(p_u + |N_u|^(−1/2) Σ_{j∈N_u} y_j)  (3);
where p_u and q_i are the feature vectors of user u and item i respectively, μ denotes the average rating, N_u denotes the set of items rated by user u, and y_j denotes the implicit feedback vector;
Then, construct the target loss function from the predicted ratings using the squared error, as shown in (4):
min Σ_{(u,i)∈κ} (r_ui − r̂_ui)² + λ(b_u² + b_i² + ‖p_u‖² + ‖q_i‖² + Σ_{j∈N_u} ‖y_j‖²)  (4);
where λ is the regularization parameter, used to prevent overfitting, and κ denotes the set of ratings;
Step 6: train the model using stochastic gradient descent (Stochastic Gradient Descent, SGD); with e_ui = r_ui − r̂_ui, the iterative update formulas are shown in (5):
b_u ← b_u + γ(e_ui − λ b_u)
b_i ← b_i + γ(e_ui − λ b_i)
p_u ← p_u + γ(e_ui q_i − λ p_u)
q_i ← q_i + γ(e_ui (p_u + |N_u|^(−1/2) Σ_{j∈N_u} y_j) − λ q_i)
y_j ← y_j + γ(e_ui |N_u|^(−1/2) q_i − λ y_j), for all j ∈ N_u  (5);
where γ is the learning rate;
Step 7: judge whether the loss function has converged;
if the loss function has converged, go to step 8;
otherwise, return to step 6;
Step 8: output the trained model;
Step 9: evaluate the trained model;
Step 10: end.
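Steps 4 and 5 can be sketched in Python as follows. All dimensions, ratings, and hyperparameter values below are hypothetical, and the prediction and loss expressions follow the standard SVD++ formulation that the steps refer to; this is an illustrative sketch, not the patent's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3                                  # feature dimension (number of attribute types)
mu = 3.5                               # average rating (toy value)
b_u, b_i = 0.1, -0.2                   # rating biases of user u and item i (toy values)
p_u = 0.1 * rng.standard_normal(k)     # user feature vector, random init (step 4)
q_i = np.array([1.0, 0.0, 1.0])        # item feature vector initialized from attributes
Y = 0.1 * rng.standard_normal((4, k))  # implicit feedback vectors y_j (toy values)
N_u = [0, 2]                           # items rated by user u

def predict(mu, b_u, b_i, p_u, q_i, Y, N_u):
    """Formula (3): mu + b_u + b_i + q_i . (p_u + |N_u|^(-1/2) * sum_{j in N_u} y_j)."""
    implicit = Y[N_u].sum(axis=0) / np.sqrt(len(N_u))
    return mu + b_u + b_i + q_i @ (p_u + implicit)

# Formula (4): squared error plus L2 regularization for one observed rating.
r_ui, lam = 4.0, 0.02
e = r_ui - predict(mu, b_u, b_i, p_u, q_i, Y, N_u)
loss = e**2 + lam * (b_u**2 + b_i**2 + p_u @ p_u + q_i @ q_i + (Y[N_u]**2).sum())
```

In the full method this loss would be summed over the rating set κ and minimized by the SGD updates of step 6.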
Advantageous effects of the present invention:
To address the complexity of selecting the feature matrix dimension, the application first obtains the item attributes and constructs the item attribute matrix, then determines the dimension of the item feature matrix from the number of attribute types while using the attribute matrix to initialize the feature matrix, which both avoids selecting the dimension through repeated experiments and improves convergence efficiency.
To address the low precision, low efficiency, and poor interpretability of random-value initialization in classical matrix factorization, the application initializes the item feature matrix with the values of the item attribute matrix and fits the rating matrix through the attribute mapping mechanism to obtain the item feature vectors, improving convergence efficiency.
When there are many attribute types and the feature matrix dimension is large, the application applies an autoencoder to reduce the dimensionality of the attribute-initialized feature matrix, reducing the complexity of the algorithm.
Brief description of the drawings
Fig. 1 is a schematic diagram of the attribute information of the film "Toy Story".
Fig. 2 is a schematic diagram of the iteration of the item-attribute-based initialization method.
Fig. 3 is a schematic diagram of the dimensionality reduction of the item feature matrix.
Fig. 4 is the flow chart of the feature matrix initialization method based on the attribute mapping mechanism.
Fig. 5 is the flow chart of the feature matrix initialization method based on the autoencoder neural network.
Specific embodiment
The invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
For ease of description, the following definitions are used:
U and I are the user set and the item set respectively, R is the user-item rating matrix, P and Q are the user and item feature matrices respectively, and r_ui denotes the rating of user u for item i.
1. Feature matrix initialization method based on the attribute mapping mechanism
Taking a movie recommender system as an example, movie data generally contains descriptive information for each film, including its attribute information: 18 categories such as action, adventure, crime, documentary, fantasy, film noir, horror, musical, mystery, romance, science fiction, thriller, war, western, animation, children's, comedy, and drama. The application fits the rating matrix through the attribute mapping mechanism to obtain the item feature vectors.
(1) Construction of the movie attribute matrix
The item attribute matrix A is constructed from the movie categories, with a_i denoting the attribute vector of item i. If item i has attribute j, then attribute j is an explicit attribute of item i and a_ij = 1; otherwise a_ij = 0. Fig. 1 shows the attribute information of the film "Toy Story" in the data set.
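The construction of A can be sketched as follows. The genre names and films here are illustrative stand-ins, not the patent's actual data set:

```python
# a_ij = 1 if item i has attribute (genre) j, else 0.
genres = ["Action", "Adventure", "Animation", "Comedy", "Drama"]
films = {
    "Toy Story": ["Adventure", "Animation", "Comedy"],
    "Heat":      ["Action", "Drama"],
}

# Each row of A is the attribute vector a_i of one film.
A = [[1 if g in gs else 0 for g in genres] for gs in films.values()]
```

With the full 18-category list, each row of A would be an 18-dimensional 0/1 vector, which is exactly the initial item feature vector used in step (2) below.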
(2) Item feature matrix initialization
The dimension k of the item feature matrix is determined by the number of attribute types, and the attribute vector a_i is used to initialize the item feature vector q_i; q̂_i denotes the initialized item feature vector.
(3) Parameter training
The application uses q̂_i as the initial item feature vector and initializes the user feature vector p_u with random values. On this basis, the feature vectors are learned and trained using the SVD++ model.
The rating of user u for item i is predicted as follows:
r̂_ui = μ + b_u + b_i + q_iᵀ(p_u + |N_u|^(−1/2) Σ_{j∈N_u} y_j)  (3);
where p_u and q_i are the feature vectors of user u and item i respectively, μ denotes the average rating, b_u and b_i are the rating biases of user u and item i respectively, N_u denotes the set of items rated by user u, and y_j denotes the implicit feedback vector.
The loss function is constructed from the predicted ratings:
min Σ_{(u,i)∈κ} (r_ui − r̂_ui)² + λ(b_u² + b_i² + ‖p_u‖² + ‖q_i‖² + Σ_{j∈N_u} ‖y_j‖²)  (4);
The loss function is trained with stochastic gradient descent; the iterative update formulas are shown in (5).
The iterative process is illustrated in Fig. 2, where p̂_u and p_u denote the initialized feature vector and the trained feature vector of user u respectively, "random" represents a randomly initialized value, and "X" represents a value after training.
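The parameter training above can be sketched end to end as a small, self-contained toy run. All ratings, dimensions, and hyperparameters are hypothetical, and the SGD updates follow the standard SVD++ update rules, which is an assumption about the unpublished formula (5) images:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 3, 4, 2
A = np.array([[1, 0], [1, 1], [0, 1], [1, 0]], float)  # toy item attribute matrix
ratings = [(0, 0, 5.0), (0, 1, 4.0), (1, 1, 3.0),
           (1, 2, 4.0), (2, 2, 5.0), (2, 3, 2.0)]      # (user, item, rating)

mu = np.mean([r for _, _, r in ratings])
Q = A.copy()                                 # attribute-mapped initialization of Q
P = 0.1 * rng.standard_normal((n_users, k))  # random initialization of P
Y = 0.1 * rng.standard_normal((n_items, k))  # implicit feedback vectors y_j
bu, bi = np.zeros(n_users), np.zeros(n_items)
rated = {u: [i for uu, i, _ in ratings if uu == u] for u in range(n_users)}
gamma, lam = 0.02, 0.02                      # learning rate, regularization

def rmse():
    err = []
    for u, i, r in ratings:
        N = rated[u]
        z = P[u] + Y[N].sum(axis=0) / np.sqrt(len(N))
        err.append((r - (mu + bu[u] + bi[i] + Q[i] @ z)) ** 2)
    return float(np.sqrt(np.mean(err)))

before = rmse()
for _ in range(300):                         # SGD epochs (formula (5) updates)
    for u, i, r in ratings:
        N = rated[u]
        z = P[u] + Y[N].sum(axis=0) / np.sqrt(len(N))
        e = r - (mu + bu[u] + bi[i] + Q[i] @ z)
        q_old = Q[i].copy()
        bu[u] += gamma * (e - lam * bu[u])
        bi[i] += gamma * (e - lam * bi[i])
        P[u] += gamma * (e * q_old - lam * P[u])
        Q[i] += gamma * (e * z - lam * Q[i])
        for j in N:
            Y[j] += gamma * (e * q_old / np.sqrt(len(N)) - lam * Y[j])
after = rmse()
```

On this toy data the training RMSE drops well below its initial value, illustrating the convergence check of step 7.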
2. Item feature initialization method based on the autoencoder neural network
Because there are many attribute types in the data set, the item feature dimension after attribute-based initialization is large, making the algorithm complex. To solve this problem, building on the attribute-based feature initialization, the application proposes a method that reduces the dimensionality of the item feature matrix using an autoencoder neural network. Fig. 3 illustrates the dimensionality reduction process of the item feature matrix.
(1) Item feature matrix initialization
The same as steps (1) and (2) of method 1.
(2) Item feature matrix dimensionality reduction
The autoencoder neural network consists of three layers: an input layer, a hidden layer, and an output layer. The input layer and the hidden layer constitute the encoder, which maps high-dimensional data into a low-dimensional space.
The item feature matrix Q can be expressed as {x_1, x_2, …, x_i, …, x_m}, where m is the number of items and x_i is the high-dimensional feature vector of item i. For x_i:
h(x_i; W, b) = σ(W x_i + b)  (1);
where W is the weight matrix between the input layer and the hidden layer, b is the bias of the input layer, and σ is the activation function.
The item feature vector after dimensionality reduction by the autoencoder neural network is:
q̂_i = (a_1, a_2, …, a_j, …, a_k)  (2);
where q̂_i denotes the feature vector of item i after dimensionality reduction and serves as the input item feature vector q_i in the SVD++ model, a_j (1 ≤ j ≤ k) denotes the j-th feature, and k is the feature dimension after reduction.
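The encoder mapping of formula (1) can be sketched as follows. The logistic sigmoid, the random weights, and all dimensions are illustrative assumptions; in the method, W and b would be trained (together with a decoder) to minimize reconstruction error before the encoder output is used:

```python
import numpy as np

rng = np.random.default_rng(1)

m, d, k = 6, 18, 4                            # items, attribute types, reduced dimension
Q = (rng.random((m, d)) < 0.3).astype(float)  # attribute-initialized item feature matrix
W = 0.1 * rng.standard_normal((k, d))         # input-to-hidden weights (untrained placeholders)
b = np.zeros(k)                               # input-layer bias

def encode(x, W, b):
    # Formula (1): h(x; W, b) = sigma(W x + b); the logistic sigma is an assumption,
    # since the patent only specifies "activation function".
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

Q_low = np.array([encode(x, W, b) for x in Q])  # k-dimensional vectors q_hat_i
```

Each row of `Q_low` is a k-dimensional q̂_i that would be fed into the SVD++ model in place of the 18-dimensional attribute vector.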
The flow of the feature matrix initialization method based on the attribute mapping mechanism is shown in Fig. 4.
The flow of the feature matrix initialization method based on the autoencoder neural network is shown in Fig. 5.
Of course, the above description is not a limitation of the present invention, and the invention is not limited to the above examples; variations, modifications, additions, or substitutions made by those skilled in the art within the essential scope of the present invention also belong to the protection scope of the invention.
Claims (1)
1. A feature matrix initialization method based on attribute mapping and an autoencoder neural network, characterized in that the following definitions are made first:
U is the user set, I is the item set, R is the user-item rating matrix, P is the user feature matrix, Q is the item feature matrix, and r_ui denotes the rating of user u for item i;
the feature matrix initialization method based on attribute mapping and the autoencoder neural network includes the following steps:
Step 1: input the item attribute matrix A, the training data, the learning rate γ, and the regularization parameter λ;
Step 2: initialize the item feature matrix Q using the item attribute matrix A;
Step 3: reduce the dimensionality of the item feature matrix Q using an autoencoder;
wherein the item feature matrix Q is expressed as {x_1, x_2, …, x_i, …, x_m}, where m is the number of items and x_i is the high-dimensional feature vector of item i; for x_i:
h(x_i; W, b) = σ(W x_i + b)  (1);
where W is the weight matrix between the input layer and the hidden layer, b is the bias of the input layer, and σ is the activation function;
the item feature vector after dimensionality reduction by the autoencoder neural network is:
q̂_i = (a_1, a_2, …, a_j, …, a_k)  (2);
where q̂_i denotes the feature vector of item i after dimensionality reduction and serves as the input item feature vector q_i in the SVD++ model, a_j (1 ≤ j ≤ k) denotes the j-th feature, and k is the feature dimension after reduction;
Step 4: randomly initialize P, b_u, and b_i;
where b_u and b_i are the rating biases of user u and item i, respectively;
Step 5: construct a loss function from the predicted ratings;
first, predict the rating of user u for item i using formula (3):
r̂_ui = μ + b_u + b_i + q_iᵀ(p_u + |N_u|^(−1/2) Σ_{j∈N_u} y_j)  (3);
where p_u and q_i are the feature vectors of user u and item i respectively, μ denotes the average rating, N_u denotes the set of items rated by user u, and y_j denotes the implicit feedback vector;
then, construct the target loss function from the predicted ratings using the squared error, as shown in (4):
min Σ_{(u,i)∈κ} (r_ui − r̂_ui)² + λ(b_u² + b_i² + ‖p_u‖² + ‖q_i‖² + Σ_{j∈N_u} ‖y_j‖²)  (4);
where λ is the regularization parameter, used to prevent overfitting, and κ denotes the set of ratings;
Step 6: train the model using stochastic gradient descent; with e_ui = r_ui − r̂_ui, the iterative update formulas are shown in (5):
b_u ← b_u + γ(e_ui − λ b_u)
b_i ← b_i + γ(e_ui − λ b_i)
p_u ← p_u + γ(e_ui q_i − λ p_u)
q_i ← q_i + γ(e_ui (p_u + |N_u|^(−1/2) Σ_{j∈N_u} y_j) − λ q_i)
y_j ← y_j + γ(e_ui |N_u|^(−1/2) q_i − λ y_j), for all j ∈ N_u  (5);
where γ is the learning rate;
Step 7: judge whether the loss function has converged;
if the loss function has converged, go to step 8;
otherwise, return to step 6;
Step 8: output the trained model;
Step 9: evaluate the trained model;
Step 10: end.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910416224.3A CN110222258A (en) | 2019-05-20 | 2019-05-20 | Feature matrix initialization method based on attribute mapping and autoencoder neural network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910416224.3A CN110222258A (en) | 2019-05-20 | 2019-05-20 | Feature matrix initialization method based on attribute mapping and autoencoder neural network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110222258A (en) | 2019-09-10 |
Family
ID=67821479
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910416224.3A Pending CN110222258A (en) | 2019-05-20 | 2019-05-20 | Feature matrix initialization method based on attribute mapping and autoencoder neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110222258A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111797319A (en) * | 2020-07-01 | 2020-10-20 | 喜大(上海)网络科技有限公司 | Recommendation method, device, equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3188111A1 (en) * | 2015-12-28 | 2017-07-05 | Deutsche Telekom AG | A method for extracting latent context patterns from sensors |
CN108829763A (en) * | 2018-05-28 | 2018-11-16 | 电子科技大学 | An attribute prediction method for movie-review website users based on a deep neural network |
CN108874914A (en) * | 2018-05-29 | 2018-11-23 | 吉林大学 | An information recommendation method based on graph convolution and neural collaborative filtering |
CN109740924A (en) * | 2018-12-29 | 2019-05-10 | 西安电子科技大学 | An item rating prediction method fusing attribute information networks and matrix factorization |
- 2019-05-20: application CN201910416224.3A filed in China; published as CN110222258A (en); status: Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3188111A1 (en) * | 2015-12-28 | 2017-07-05 | Deutsche Telekom AG | A method for extracting latent context patterns from sensors |
CN108829763A (en) * | 2018-05-28 | 2018-11-16 | 电子科技大学 | An attribute prediction method for movie-review website users based on a deep neural network |
CN108874914A (en) * | 2018-05-29 | 2018-11-23 | 吉林大学 | An information recommendation method based on graph convolution and neural collaborative filtering |
CN109740924A (en) * | 2018-12-29 | 2019-05-10 | 西安电子科技大学 | An item rating prediction method fusing attribute information networks and matrix factorization |
Non-Patent Citations (1)
Title |
---|
JIANLI ZHAO: "Attribute mapping and autoencoder neural network based matrix factorization initialization for recommendation systems", Knowledge-Based Systems * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111797319A (en) * | 2020-07-01 | 2020-10-20 | 喜大(上海)网络科技有限公司 | Recommendation method, device, equipment and storage medium |
CN111797319B (en) * | 2020-07-01 | 2023-10-27 | 喜大(上海)网络科技有限公司 | Recommendation method, recommendation device, recommendation equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111310063B (en) | Neural network-based article recommendation method for memory perception gated factorization machine | |
CN111523047B (en) | Multi-relation collaborative filtering algorithm based on graph neural network | |
CN111797321B (en) | Personalized knowledge recommendation method and system for different scenes | |
CN107563841B (en) | Recommendation system based on user score decomposition | |
CN112989064B (en) | Recommendation method for aggregating knowledge graph neural network and self-adaptive attention | |
CN108460619B (en) | Method for providing collaborative recommendation model fusing explicit and implicit feedback | |
CN110119467A (en) | A kind of dialogue-based item recommendation method, device, equipment and storage medium | |
CN107273438A (en) | A kind of recommendation method, device, equipment and storage medium | |
CN112417306B (en) | Method for optimizing performance of recommendation algorithm based on knowledge graph | |
CN104598611B (en) | Method and system for ranking search entries | |
CN109783739A (en) | A collaborative filtering recommendation method based on enhanced stacked sparse denoising autoencoders | |
CN109325875B (en) | Implicit group discovery method based on hidden features of online social users | |
CN113918832B (en) | Graph convolution collaborative filtering recommendation system based on social relationship | |
CN109190030A (en) | Merge the implicit feedback recommended method of node2vec and deep neural network | |
CN109446420B (en) | Cross-domain collaborative filtering method and system | |
CN113918834B (en) | Graph convolution collaborative filtering recommendation method fusing social relations | |
CN107766742A (en) | Multi-correlation differential privacy matrix factorization method under a dependent, non-identically-distributed environment | |
CN111488524A (en) | Attention-oriented semantic-sensitive label recommendation method | |
CN114357312B (en) | Community discovery method and personality recommendation method based on graph neural network automatic modeling | |
CN106886559A (en) | Collaborative filtering method incorporating friend features and similar-user features simultaneously | |
CN112418525A (en) | Method and device for predicting social topic group behaviors and computer storage medium | |
CN110837603A (en) | Integrated recommendation method based on differential privacy protection | |
CN110688585A (en) | Personalized movie recommendation method based on neural network and collaborative filtering | |
CN114329233A (en) | Cross-region cross-scoring collaborative filtering recommendation method and system | |
CN110222258A (en) | Feature matrix initialization method based on attribute mapping and autoencoder neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190910 ||