CN108647226A - A hybrid recommendation method based on a variational autoencoder - Google Patents

A hybrid recommendation method based on a variational autoencoder

Info

Publication number
CN108647226A
CN108647226A CN201810253803.6A
Authority
CN
China
Prior art keywords
user
item
latent
autoencoder
vector encoding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810253803.6A
Other languages
Chinese (zh)
Other versions
CN108647226B (en)
Inventor
张寅
林建实
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201810253803.6A priority Critical patent/CN108647226B/en
Publication of CN108647226A publication Critical patent/CN108647226A/en
Application granted granted Critical
Publication of CN108647226B publication Critical patent/CN108647226B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0631: Item recommendations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks


Abstract

The invention discloses a hybrid recommendation method based on a variational autoencoder. The method models the rating features and content features of users and items with variational autoencoders, encodes sparse features through a factorization machine, and combines higher-order features automatically. At the same time, the multi-view data features of users and items are fused into the variational autoencoder framework to address the cold-start problem, and variational inference over the latent vector encodings of users and items lends interpretability to the encodings the autoencoder generates. By inputting the features of a user and of candidate items, the user's preference values over the candidate item set are obtained, and the candidates are ranked by preference value to produce the recommendation results. Compared with conventional recommendation methods, the present invention achieves better recommendation performance.

Description

A hybrid recommendation method based on a variational autoencoder
Technical field
The present invention relates to computer recommender systems, and in particular to a hybrid recommendation method based on a variational autoencoder.
Background technology
In recent years, with the continuous development of network and information technology, the scale, generation speed and complexity of online information have all increased sharply. Personalized recommender systems have become an important technical means of extracting information from very complex data, and are widely used in industry.
Traditional recommendation methods based on collaborative filtering, especially the family of matrix factorization methods, have proven highly effective in industry. Although implicit feedback data such as browsing, clicking and bookmarking are easier to collect than explicit feedback data such as movie ratings and product reviews, the cold-start problem and feature sparsity remain important factors limiting the performance of recommender systems.
Meanwhile, deep learning has made breakthroughs in fields such as image processing and natural language processing in recent years, demonstrating its excellent performance in feature processing; applying deep learning to recommender systems has therefore become an important direction in this field. However, existing deep-neural-network models mostly process the content features of users and items, while the key to a recommender system is modeling the interaction between users and items. Deep learning has not been applied directly to this aspect; most existing work simply feeds the generated latent vector encodings back into a matrix factorization framework.
Summary of the invention
In view of the gaps and shortcomings of the prior art, the present invention provides a hybrid recommendation method based on a variational autoencoder. The specific technical solution adopted by the present invention is as follows:
A hybrid recommendation method based on a variational autoencoder, comprising the following steps:
(1) according to the specific application environment, process the log data to obtain the interaction information between users and items, comprising two classes, implicit feedback and explicit feedback, and perform feature processing according to the information type: for implicit feedback data, label interactions as 1 and non-interactions as 0; for explicit feedback data, record the specific rating value, then normalize the feature values;
(2) collect the multi-view information of users and items, including user profile information and item content information, to address the cold-start problem;
(3) collect the feedback expressing users' dislike of items other than those with existing historical behavior, generate negative samples, and use negative sampling so that the overall numbers of positive and negative samples are equal;
(4) build the model of the hybrid recommendation method based on the variational autoencoder, perform gradient updates of the variables in an alternating-iteration manner to train the model, and save the final model parameters; for items and users with historical interactions, retain the corresponding latent vector encodings;
(5) in the prediction stage, for users and items that already have latent vector encodings, use them directly as input to the generalized matrix factorization module in the model to compute the user's preference value for a specific item; for users and items lacking latent vector encodings, first compute the corresponding latent vector encodings with the trained model, then compute the preference values;
(6) for a given user, compute the preference values for the items in the candidate item set and sort them by preference value to obtain the user's recommendation item list;
during execution of the method, periodically consolidate the logs and repeat steps (1)-(4) to recompute the model and update the latent vector encodings of users and items.
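Step (6) amounts to sorting candidates by their computed preference values. A minimal sketch, with toy item names and scores chosen purely for illustration:

```python
# Rank a candidate item set by precomputed preference values (step 6).
# The item identifiers and preference values are illustrative only.
def recommend(preferences, top_k=None):
    """preferences: dict mapping item -> preference value.
    Returns items sorted by descending preference value."""
    ranked = sorted(preferences, key=preferences.get, reverse=True)
    return ranked if top_k is None else ranked[:top_k]

rec_list = recommend({"i1": 0.2, "i2": 0.9, "i3": 0.5}, top_k=2)
```

Any top-k cutoff is an application choice; the method itself only requires the ordering by preference value.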
Preferably, step (3) comprises: for each user, dividing the items into positive samples and negative samples according to the existing interactions, and for items without interaction records, filtering out a portion as negative samples by sampling.
Preferably, the model of the hybrid recommendation method based on the variational autoencoder in step (4) consists of three modules, including the user-side variational autoencoder, the item-side variational autoencoder and the generalized matrix factorization module, each variational autoencoder being divided into a decoder and an encoder; after receiving the user and item feature values obtained in the aforementioned steps (2), (3) and (4) together with the corresponding positive/negative-sample preference values, the model is trained.
Preferably, the gradient update formulas for the variables in step (4) are as follows:

Φu ← Φu − ηu·∂J/∂Φu    Θu ← Θu − ηu·∂J/∂Θu
Φv ← Φv − ηv·∂J/∂Φv    Θv ← Θv − ηv·∂J/∂Θv
Ψ ← Ψ − ηΨ·∂J/∂Ψ

where J denotes the final loss function evaluated on the current mini-batch; Φu, Φv, Θu and Θv are respectively the encoder parameters of the user autoencoder, the encoder parameters of the item autoencoder, the decoder parameters of the user autoencoder and the decoder parameters of the item autoencoder, and Ψ is the parameter set of the generalized matrix factorization module; Θ and Φ denote the decoder-module and encoder-module parameters collectively; ηu, ηv and ηΨ are respectively the update rates of the user-side autoencoder parameters, the item-side autoencoder parameters and the generalized matrix factorization module parameters; Zu and Zv are respectively the latent vector encodings generated by the user-side autoencoder and the item-side autoencoder; XB and UB are respectively the user multi-view features and the user rating features of a stochastic-gradient-descent mini-batch of size B, and YB and VB are respectively the item multi-view features and the item rating features of a mini-batch of size B; U and V are respectively the rating features of users and items, and f_pooling(U) and f_pooling(V) are respectively the outputs of the user and item rating features after the pooling operation.
Preferably, step (5) comprises the following steps:
1) saving the model training parameters Φu, Φv, Θu, Θv and Ψ obtained after step (4) for prediction;
2) for users and items with interaction behavior, directly reading the saved latent vector encodings; for unknown users and items, computing the latent vector encodings through the encoder part;
3) for the encoder part of the user side, computing the latent vector encoding z_i^u of user i as follows:

h_1 = g(W_1·u_i + V_1·x_i + b_1)
h_k = g(W_k·h_(k-1) + V_k·x_i + b_k), k = 2, 3, ..., L
μ_i = W_μ·h_L + b_μ
log(σ_i^2) = W_σ·h_L + b_σ
z_i^u = μ_i + σ_i·ε

where g(·) is the activation function of each layer; u_i and x_i are respectively the rating feature and the multi-view feature of user i; μ_i, σ_i^2 and z_i^u are respectively the mean vector, the variance vector and the latent vector encoding generated for user i via the variational autoencoder; h_k is the output vector of the k-th hidden layer when computing the user latent vector encoding; W_k and V_k are the weight terms of the k-th hidden layer, applied respectively to the output of the previous hidden layer and to the multi-view feature input; b_k is the bias term of the k-th hidden layer, with k taking 2, 3, ..., L and L the number of hidden layers; W_μ and b_μ are respectively the weight term and bias term for the mean-vector output μ_i; W_σ and b_σ are respectively the weight term and bias term for the variance-vector output σ_i^2; and ε is a value sampled from the normal distribution with mean 0 and variance 1;
4) for the encoder part of the item side, computing the latent vector encoding z_i^v of item i as follows:

h_1 = g(W_1·v_i + V_1·y_i + b_1)
h_k = g(W_k·h_(k-1) + V_k·y_i + b_k), k = 2, 3, ..., L
μ_i = W_μ·h_L + b_μ
log(σ_i^2) = W_σ·h_L + b_σ
z_i^v = μ_i + σ_i·ε

where g(·) is the activation function of each layer; v_i and y_i are respectively the rating feature and the multi-view feature of item i; μ_i, σ_i^2 and z_i^v are respectively the mean vector, the variance vector and the latent vector encoding generated for item i via the variational autoencoder; h_k is the output vector of the k-th hidden layer when computing the item latent vector encoding; W_k and V_k are the weight terms of the k-th hidden layer, applied respectively to the output of the previous hidden layer and to the multi-view feature input; b_k is the bias term of the k-th hidden layer, with k taking 2, 3, ..., L and L the number of hidden layers; W_μ and b_μ are respectively the weight term and bias term for the mean-vector output μ_i; W_σ and b_σ are respectively the weight term and bias term for the variance-vector output σ_i^2; and ε is a value sampled from the normal distribution with mean 0 and variance 1;
5) computing the user's rating preference value for the item by the formula:
R = f_Ψ(Zu, Zv)
where Zu is the latent vector encoding of the user, Zv is the latent vector encoding of the item, and f_Ψ(·) is the function fitted by the neural network architecture with Ψ as parameters.
By modeling the rating features and content features of users and items with variational autoencoders, fusing the multi-view data features of users and items into the variational autoencoder framework, and performing variational inference over the latent vector encodings of users and items, the present invention obtains the user's preference values over the candidate item set and derives the recommendation results. Compared with conventional recommendation methods, the present invention achieves better recommendation performance.
Description of the drawings
Fig. 1 is the overall model diagram of the hybrid recommendation method based on the variational autoencoder;
Fig. 2 is the network architecture diagram of the user-side variational autoencoder.
Detailed description of the embodiments
To make the purpose, technical solution and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings.
The hybrid recommendation method based on a variational autoencoder comprises the following steps:
(1) According to the specific application environment, process the system log data: through data-warehouse construction and feature cleaning, obtain interaction information such as the browsing, bookmarking, clicking and commenting of users on items, mainly comprising two classes, implicit feedback and explicit feedback. Perform feature processing according to the information type: for implicit feedback data, label interactions as 1 and non-interactions as 0; for explicit feedback data, record the specific rating value; then normalize the feature values.
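The feature processing in step (1) can be sketched as follows; the field names and the choice of min-max normalization are illustrative assumptions rather than details fixed by the method:

```python
# Step (1) sketch: binary labels for implicit feedback, and
# min-max normalization of explicit ratings into [0, 1].
# The rating range 1-5 is an illustrative assumption.

def label_implicit(interactions, all_items):
    """Map each item to 1 if the user interacted with it, else 0."""
    interacted = set(interactions)
    return {item: (1 if item in interacted else 0) for item in all_items}

def normalize_explicit(ratings, lo=1.0, hi=5.0):
    """Scale explicit ratings (e.g. 1-5 stars) into [0, 1]."""
    return {item: (r - lo) / (hi - lo) for item, r in ratings.items()}

labels = label_implicit(["i1", "i3"], ["i1", "i2", "i3"])
scores = normalize_explicit({"i1": 5.0, "i2": 3.0})
```

Other normalizations (z-scoring, per-user centering) would fit the same slot; the method only requires that explicit values be normalized.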
(2) Collect the multi-view information of users and items separately and manage it in the form of a data warehouse, to address the cold-start problem: collect user profile information such as age, gender, school, occupation and past behavior records; collect item content information such as the visual features of pictures, the features of description text extracted by natural language processing methods, and item click-through rates, bookmarking rates and the like.
(3) Collect other feedback from users beyond the items with existing historical behavior records, for example items toward which a user has expressed dislike, generate negative samples, and use negative sampling to make the overall numbers of positive and negative samples roughly equal.
This step comprises: for each user, dividing the items into positive samples and negative samples according to the existing interactions, and for items without interaction records, filtering out a portion as negative samples by sampling.
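The division and sampling described above can be sketched as follows, assuming a uniform sampling strategy over unobserved items (the method itself does not fix the strategy):

```python
# Step (3) sketch: items the user never interacted with are sampled
# uniformly so that positives and negatives are balanced per user.
# The uniform strategy and fixed seed are illustrative assumptions.
import random

def sample_negatives(positives, all_items, seed=0):
    """Draw as many unobserved items as there are positives."""
    rng = random.Random(seed)
    candidates = [i for i in all_items if i not in set(positives)]
    k = min(len(positives), len(candidates))
    return rng.sample(candidates, k)

negs = sample_negatives(["i1", "i4"], ["i1", "i2", "i3", "i4", "i5"])
```

Popularity-weighted sampling is a common alternative when uniform sampling draws too many trivially irrelevant items.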
(4) Build the model of the hybrid recommendation method based on the variational autoencoder, perform gradient updates of the variables in an alternating-iteration manner to train the model, and save the final model parameters; for items and users with historical interactions, retain the corresponding latent vector encodings.
In this step, the model of the hybrid recommendation method based on the variational autoencoder is constructed as follows:
The model consists mainly of three parts: the autoencoder architectures on the left and right sides, and the multilayer neural network in the middle, i.e. the MLP (Multi-Layer Perceptron) module in Fig. 1. The two autoencoder architectures encode users and items respectively, each generating a latent vector representation (Latent Vector), and the pooling-layer output of a factorization machine serves as the input to the multilayer neural network. At the same time, the multi-view features x_i of the user and y_j of the item are cascaded into the input of each hidden layer of the encoder module, so that the latent vector representations of users and items can learn information from the multi-view sources; for a new user or new item without rating information, a latent vector can still be generated from the data of the other views to estimate ratings, thereby alleviating the cold-start problem.
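The factorization-machine pooling that feeds the MLP module can be sketched with the standard second-order interaction term of a factorization machine; the toy feature vector and embedding values below are illustrative, not learned parameters:

```python
# Factorization-machine-style pooling over sparse features:
# 0.5 * ((sum_i v_i x_i)^2 - sum_i (v_i x_i)^2), the pairwise
# interaction vector that the patent's pooling layer passes to the MLP.
# The feature vector x and embeddings V are toy values.
import numpy as np

def fm_pooling(x, V):
    """x: (n,) sparse feature vector; V: (n, k) feature embeddings.
    Returns the (k,) pooled second-order interaction vector."""
    weighted = V * x[:, None]
    summed_sq = weighted.sum(axis=0) ** 2   # (sum_i v_i x_i)^2
    sq_summed = (weighted ** 2).sum(axis=0)  # sum_i (v_i x_i)^2
    return 0.5 * (summed_sq - sq_summed)

x = np.array([1.0, 0.0, 1.0])                       # two active features
V = np.array([[1.0, 2.0], [5.0, 5.0], [3.0, 4.0]])  # toy embeddings
pooled = fm_pooling(x, V)  # interactions only among active features
```

With exactly two active features the pooled vector equals their elementwise embedding product, which makes the identity easy to check by hand.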
As shown in Fig. 2, taking the user-side autoencoder as an example, U is defined as the user rating data and X as the multi-view feature data of the user, which on the encoder side is simply a cascade of features; Φu is the parameter configuration of the encoder side, Zu is the intermediate layer holding the generated latent vector encoding, and the decoder Θu reconstructs the rating data and the multi-view data respectively.
The user side is derived first; the item-side architecture is similar, and for notational convenience this section does not distinguish Θu from Θ. The process of reconstructing the original inputs U and X from the low-dimensional latent vector Z is represented by the conditional probability P_θ(X, U | Z), where θ denotes the parameters of the reconstruction process. Following maximum-likelihood estimation, the objective is to maximize the likelihood P(X, U) = P(X, U; Z, θ), i.e. to find the unknown latent vector encoding Z and reconstruction parameters θ that maximize the probability of reconstructing the original inputs X and U.
The posterior P_θ(Z | X, U) of the latent vector Z is intractable; the approach of the variational autoencoder is to introduce Q_φ(Z | X, U) to approximate P_θ(Z | X, U).
Specifically, adding log Q_φ(Z | X, U) − log Q_φ(Z | X, U) to the right-hand side of the log-likelihood and taking the expectation of both sides with respect to Q_φ(Z | X, U) gives

log P_θ(X, U) = E_Qφ[log P_θ(X, U, Z) − log Q_φ(Z | X, U)] + KL(Q_φ(Z | X, U) ‖ P_θ(Z | X, U))

The goal of maximum-likelihood estimation is to make the sample likelihood P_θ(X, U) as large as possible, and because Q_φ(Z | X, U) is an approximating distribution of P_θ(Z | X, U), with KL(Q_φ(Z | X, U) ‖ P_θ(Z | X, U)) ≥ 0, it follows that

log P_θ(X, U) ≥ E_Qφ[log P_θ(X, U, Z) − log Q_φ(Z | X, U)] = L(θ, φ)

L(θ, φ) is therefore a lower bound of the log-likelihood, referred to as the variational lower bound (Variational Lower Bound).
Applying Bayes' rule, P_θ(X, U, Z) = P_θ(X, U | Z)·P_θ(Z), transforms it into

L(θ, φ) = E_Qφ[log P_θ(X, U | Z)] − KL(Q_φ(Z | X, U) ‖ P_θ(Z))

Maximizing the lower bound of the likelihood thus requires that, under the assumed approximating distribution Q_φ(Z | X, U), the expected probability of generating X and U is as large as possible, while the assumed distribution is simultaneously driven toward the prior distribution of Z. Once the optimal θ and φ maximizing this lower bound are found, an autoencoder can be designed: the process P_θ(X, U | Z) is expressed as a generator which, given the prior probability distribution P_θ(Z), produces the data points (samples) in X and U with the highest probability, at which point the reconstruction error with respect to the original inputs X and U is minimal.
Similarly, the optimized variational lower bound of the item-side network architecture can be obtained; for consistency of notation, the optimization objectives of the user side and the item side are restated respectively as follows:

L_u = E_QΦu[log P_Θu(X, U | Zu)] − KL(Q_Φu(Zu | X, U) ‖ P(Zu))
L_v = E_QΦv[log P_Θv(Y, V | Zv)] − KL(Q_Φv(Zv | Y, V) ‖ P(Zv))
Consistent with traditional recommendation methods based on matrix factorization, the distributions p(z) and q(z | x, u) of the latent vector encoding are assumed to be Gaussian. Expanding again yields the optimization objectives, i.e. the variational lower bounds, of the user side and the item side.
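Under the Gaussian assumption, the KL term in each variational lower bound has the standard closed form for a diagonal Gaussian measured against a standard normal prior (a textbook identity, stated here for reference rather than reproduced from the patent's own formula images):

```latex
\mathrm{KL}\!\left(\mathcal{N}\!\left(\mu,\operatorname{diag}(\sigma^{2})\right)\,\middle\|\,\mathcal{N}(0,I)\right)
  = \frac{1}{2}\sum_{j=1}^{d}\left(\mu_{j}^{2}+\sigma_{j}^{2}-\log\sigma_{j}^{2}-1\right)
```

This closed form is what makes the Gaussian assumption computationally convenient: the KL regularizer can be evaluated and differentiated analytically instead of by sampling.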
Combining the optimization objective of the generalized matrix factorization module yields the final loss function.
The model of the hybrid recommendation method based on the variational autoencoder consists of three modules: the user-side variational autoencoder, the item-side variational autoencoder and the generalized matrix factorization module; each variational autoencoder is further divided into a decoder and an encoder, so there are five groups of parameters in total, whose update recursions are as follows:

Φu ← Φu − ηu·∂J/∂Φu    Θu ← Θu − ηu·∂J/∂Θu
Φv ← Φv − ηv·∂J/∂Φv    Θv ← Θv − ηv·∂J/∂Θv
Ψ ← Ψ − ηΨ·∂J/∂Ψ

where J denotes the final loss function evaluated on the current mini-batch; Φu, Φv, Θu and Θv are respectively the encoder parameters of the user autoencoder, the encoder parameters of the item autoencoder, the decoder parameters of the user autoencoder and the decoder parameters of the item autoencoder, and Ψ is the parameter set of the generalized matrix factorization module; Θ and Φ denote the decoder-module and encoder-module parameters collectively; ηu, ηv and ηΨ are respectively the update rates of the user-side autoencoder parameters, the item-side autoencoder parameters and the generalized matrix factorization module parameters; Zu and Zv are respectively the latent vector encodings generated by the user-side autoencoder and the item-side autoencoder; XB and UB are respectively the user multi-view features and the user rating features of a stochastic-gradient-descent mini-batch of size B, and YB and VB are respectively the item multi-view features and the item rating features of a mini-batch of size B; U and V are respectively the rating features of users and items, and f_pooling(U) and f_pooling(V) are respectively the outputs of the user and item rating features after the pooling operation. The five groups of parameters are optimized by alternating iteration, using optimization methods such as gradient descent. The final values of all five groups are retained, and the latent vector encodings of the existing users and items are computed and saved, so that when an existing user or item is encountered later its encoding can be used directly as input to the generalized matrix factorization module, accelerating computation.
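The alternating-iteration update can be illustrated on a toy loss, with two scalar parameter groups standing in for the five groups Φu, Φv, Θu, Θv and Ψ; the quadratic loss and learning rate are purely illustrative, not the patent's objective:

```python
# Alternating gradient descent: each parameter group takes a step on
# the shared loss while the others are held fixed, then the next group
# updates. Toy loss L = (a - 1)^2 + (b + 2)^2, minimized at (1, -2).
def grad_a(a, b):
    return 2.0 * (a - 1.0)   # dL/da

def grad_b(a, b):
    return 2.0 * (b + 2.0)   # dL/db

def alternating_descent(a, b, eta=0.1, steps=200):
    for _ in range(steps):
        a -= eta * grad_a(a, b)   # update group 1, group 2 fixed
        b -= eta * grad_b(a, b)   # then update group 2
    return a, b

a, b = alternating_descent(0.0, 0.0)  # converges toward (1, -2)
```

With a separable loss the alternation is exact; in the patent's model the groups interact through the shared loss, which is precisely why they are updated in turn at their own rates.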
(5) In the prediction stage, for users and items that already have latent vector encodings, use them directly as input to the generalized matrix factorization module in the model to compute the user's preference value for a specific item; for users and items lacking latent vector encodings, first compute the corresponding latent vector encodings with the trained model, then compute the preference values.
This step comprises the following sub-steps:
1) Save the model training parameters Φu, Φv, Θu, Θv and Ψ obtained after step (4) for prediction;
2) For users and items with interaction behavior, directly read the saved latent vector encodings; for unknown users and items, compute them through the encoder part;
3) For the encoder part of the user side, the latent vector encoding z_i^u of user i is computed as follows:

h_1 = g(W_1·u_i + V_1·x_i + b_1)
h_k = g(W_k·h_(k-1) + V_k·x_i + b_k), k = 2, 3, ..., L
μ_i = W_μ·h_L + b_μ
log(σ_i^2) = W_σ·h_L + b_σ
z_i^u = μ_i + σ_i·ε

where g(·) is the activation function of each layer; u_i and x_i are respectively the rating feature and the multi-view feature of user i; μ_i, σ_i^2 and z_i^u are respectively the mean vector, the variance vector and the latent vector encoding generated for user i via the variational autoencoder; h_k is the output vector of the k-th hidden layer when computing the user latent vector encoding; W_k and V_k are the weight terms of the k-th hidden layer, applied respectively to the output of the previous hidden layer and to the multi-view feature input; b_k is the bias term of the k-th hidden layer, with k taking 2, 3, ..., L and L the number of hidden layers; W_μ and b_μ are respectively the weight term and bias term for the mean-vector output μ_i; W_σ and b_σ are respectively the weight term and bias term for the variance-vector output σ_i^2; and ε is a value sampled from the normal distribution with mean 0 and variance 1;
4) For the encoder part of the item side, the latent vector encoding z_i^v of item i is computed as follows:

h_1 = g(W_1·v_i + V_1·y_i + b_1)
h_k = g(W_k·h_(k-1) + V_k·y_i + b_k), k = 2, 3, ..., L
μ_i = W_μ·h_L + b_μ
log(σ_i^2) = W_σ·h_L + b_σ
z_i^v = μ_i + σ_i·ε

where g(·) is the activation function of each layer; v_i and y_i are respectively the rating feature and the multi-view feature of item i; μ_i, σ_i^2 and z_i^v are respectively the mean vector, the variance vector and the latent vector encoding generated for item i via the variational autoencoder; h_k is the output vector of the k-th hidden layer when computing the item latent vector encoding; W_k and V_k are the weight terms of the k-th hidden layer, applied respectively to the output of the previous hidden layer and to the multi-view feature input; b_k is the bias term of the k-th hidden layer, with k taking 2, 3, ..., L and L the number of hidden layers; W_μ and b_μ are respectively the weight term and bias term for the mean-vector output μ_i; W_σ and b_σ are respectively the weight term and bias term for the variance-vector output σ_i^2; and ε is a value sampled from the normal distribution with mean 0 and variance 1;
5) Compute the user's rating preference value for the item by the formula:
R = f_Ψ(Zu, Zv)
where Zu is the latent vector encoding of the user, Zv is the latent vector encoding of the item, and f_Ψ(·) is the function fitted by the neural network architecture with Ψ as parameters; the form of this function can be chosen as needed.
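Sub-steps 3) through 5) can be sketched in numpy with a single hidden layer; the random weights, the dimensions and the dot-product stand-in for f_Ψ are illustrative assumptions, not the trained model:

```python
# One-hidden-layer sketch of the encoder (h = g(W u + V x + b);
# mu, log sigma^2 from h; z = mu + sigma * eps) and a toy scorer.
# For brevity the same toy encoder weights serve both sides; in the
# method the user and item sides have separate parameters.
import numpy as np

rng = np.random.default_rng(0)

def encode(u, x, W, V, b, W_mu, b_mu, W_sig, b_sig, eps):
    h = np.tanh(W @ u + V @ x + b)              # multi-view cascaded in
    mu = W_mu @ h + b_mu                        # mean vector
    sigma = np.exp(0.5 * (W_sig @ h + b_sig))   # std-dev from log-variance
    return mu + sigma * eps                     # reparameterized encoding

d_u, d_x, d_h, d_z = 4, 3, 5, 2
params = [rng.normal(size=s) for s in
          [(d_h, d_u), (d_h, d_x), (d_h,), (d_z, d_h), (d_z,),
           (d_z, d_h), (d_z,)]]
z_u = encode(rng.normal(size=d_u), rng.normal(size=d_x), *params,
             eps=np.zeros(d_z))                 # eps = 0 -> mean encoding
z_v = encode(rng.normal(size=d_u), rng.normal(size=d_x), *params,
             eps=np.zeros(d_z))
score = float(z_u @ z_v)  # toy stand-in for R = f_Psi(Z_u, Z_v)
```

Setting eps to zero yields the deterministic mean encoding, which is the usual choice at prediction time; a learned MLP over (z_u, z_v) would replace the dot product as f_Ψ.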
(6) For a given user, compute the preference values for the items in the candidate item set and sort them by preference value to obtain the user's recommendation item list.
During execution of the whole recommendation method, system logs are continuously generated, so the logs must be periodically consolidated and steps (1)-(4) repeated to recompute the model and update the latent vector encodings of users and items.

Claims (5)

1. A hybrid recommendation method based on a variational autoencoder, characterized by comprising the following steps:
(1) according to the specific application environment, processing the log data to obtain the interaction information between users and items, comprising two classes, implicit feedback and explicit feedback, and performing feature processing according to the information type: for implicit feedback data, labeling interactions as 1 and non-interactions as 0; for explicit feedback data, recording the specific rating value, then normalizing the feature values;
(2) collecting the multi-view information of users and items, including user profile information and item content information, to address the cold-start problem;
(3) collecting the feedback expressing users' dislike of items other than those with existing historical behavior, generating negative samples, and using negative sampling so that the overall numbers of positive and negative samples are equal;
(4) building the model of the hybrid recommendation method based on the variational autoencoder, performing gradient updates of the variables in an alternating-iteration manner to train the model, and saving the final model parameters; for items and users with historical interactions, retaining the corresponding latent vector encodings;
(5) in the prediction stage, for users and items that already have latent vector encodings, using them directly as input to the generalized matrix factorization module in the model to compute the user's preference value for a specific item; and for users and items lacking latent vector encodings, first computing the corresponding latent vector encodings with the trained model and then computing the preference values;
(6) for a given user, computing the preference values for the items in the candidate item set and sorting them by preference value to obtain the user's recommendation item list;
during execution of the method, periodically consolidating the logs and repeating steps (1)-(4) to recompute the model and update the latent vector encodings of users and items.
2. The hybrid recommendation method based on a variational autoencoder according to claim 1, characterized in that step (3) comprises: for each user, dividing the items into positive samples and negative samples according to the existing interactions, and for items without interaction records, filtering out a portion as negative samples by sampling.
3. The hybrid recommendation method based on a variational autoencoder according to claim 1, characterized in that the model of the hybrid recommendation method based on the variational autoencoder in step (4) consists of three modules, including the user-side variational autoencoder, the item-side variational autoencoder and the generalized matrix factorization module, each variational autoencoder being divided into a decoder and an encoder; after receiving the user and item feature values obtained in the aforementioned steps (2), (3) and (4) together with the corresponding positive/negative-sample preference values, the model is trained.
4. The hybrid recommendation method based on a variational autoencoder according to claim 1, characterized in that the gradient update formulas for the variables in step (4) are as follows:

Φu ← Φu − ηu·∂J/∂Φu    Θu ← Θu − ηu·∂J/∂Θu
Φv ← Φv − ηv·∂J/∂Φv    Θv ← Θv − ηv·∂J/∂Θv
Ψ ← Ψ − ηΨ·∂J/∂Ψ

where J denotes the final loss function evaluated on the current mini-batch; Φu, Φv, Θu and Θv are respectively the encoder parameters of the user autoencoder, the encoder parameters of the item autoencoder, the decoder parameters of the user autoencoder and the decoder parameters of the item autoencoder, and Ψ is the parameter set of the generalized matrix factorization module; Θ and Φ denote the decoder-module and encoder-module parameters collectively; ηu, ηv and ηΨ are respectively the update rates of the user-side autoencoder parameters, the item-side autoencoder parameters and the generalized matrix factorization module parameters; Zu and Zv are respectively the latent vector encodings generated by the user-side autoencoder and the item-side autoencoder; XB and UB are respectively the user multi-view features and the user rating features of a stochastic-gradient-descent mini-batch of size B, and YB and VB are respectively the item multi-view features and the item rating features of a mini-batch of size B; U and V are respectively the rating features of users and items, and f_pooling(U) and f_pooling(V) are respectively the outputs of the user and item rating features after the pooling operation.
5. The hybrid recommendation method based on a variational autoencoder according to claim 1, characterized in that the step (5) comprises the following steps:
1) the model training parameters Φu, Φv, Θu, Θv and Ψ obtained after step (4) has been carried out are saved for prediction;
2) for users and articles with interaction behavior, the saved hidden vector encodings are read directly; for unknown users and articles, the hidden vector encodings are computed by the encoder part;
3) for the encoder part on the user side, the hidden vector encoding of user i is calculated by the following formula:
Wherein g(·) is the activation function of each layer; ui and xi are respectively the rating feature and the multi-view feature of user i; the mean vector, the variance vector and the hidden vector encoding of user i are generated via the variational autoencoder; when calculating the user's hidden vector encoding, the k-th hidden layer produces an output result vector and has corresponding weight vectors, used respectively for the output of the previous hidden layer and for the multi-view feature input, together with a corresponding bias term, where k takes 2, 3, …, L and L is the number of hidden layers; the mean vector output has its own weight term and bias term, and likewise the variance vector output; ε is a value sampled from a normal distribution with mean 0 and variance 1;
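The user-side encoder described above can be sketched as follows. The patent's actual formula is an image and is not reproduced; the tanh activation, the log-variance parameterization of the variance head, and the weight layout are assumptions made for illustration.

```python
import numpy as np

def encode_user(x_i, u_i, weights, rng=None):
    """Each hidden layer consumes the previous layer's output together with
    the multi-view feature x_i; two linear heads then produce the mean and
    log-variance, and the hidden vector encoding z is drawn by the
    reparameterization z = mu + sigma * eps with eps ~ N(0, 1)."""
    rng = rng or np.random.default_rng(0)
    g = np.tanh  # activation g(.); the actual choice is not stated in the text
    # first layer takes the rating feature u_i and multi-view feature x_i
    h = g(weights["W1_u"] @ u_i + weights["W1_x"] @ x_i + weights["b1"])
    # hidden layers k = 2..L, each with weights for h and for x_i plus a bias
    for W_h, W_x, b in weights["hidden"]:
        h = g(W_h @ h + W_x @ x_i + b)
    mu = weights["W_mu"] @ h + weights["b_mu"]             # mean vector
    log_var = weights["W_sigma"] @ h + weights["b_sigma"]  # variance head
    eps = rng.standard_normal(mu.shape)                    # eps ~ N(0, 1)
    z = mu + np.exp(0.5 * log_var) * eps                   # hidden encoding
    return z, mu, log_var
```

The article-side encoder of step 4) has the same shape, with (vi, yi) in place of (ui, xi).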
4) for the encoder part on the article side, the hidden vector encoding of article i is calculated by the following formula:
Wherein g(·) is the activation function of each layer; vi and yi are respectively the rating feature and the multi-view feature of article i; the mean vector, the variance vector and the hidden vector encoding of article i are generated via the variational autoencoder; when calculating the article's hidden vector encoding, the k-th hidden layer produces an output result vector and has corresponding weight vectors, used respectively for the output of the previous hidden layer and for the multi-view feature input, together with a corresponding bias term, where k takes 2, 3, …, L and L is the number of hidden layers; the mean vector output has its own weight term and bias term, and likewise the variance vector output; ε is a value sampled from a normal distribution with mean 0 and variance 1;
5) the rating preference value of the user for the article is calculated by the following formula:
R=fΨ(Zu, Zv)
Wherein Zu is the hidden vector encoding of the user, Zv is the hidden vector encoding of the article, and fΨ(·) is the function fitted by the neural network architecture parameterized by Ψ.
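As a concrete illustration of step 5), fΨ might be a small feed-forward network over the pair of encodings. The concatenate-then-MLP form below is an assumption: the claim only states that fΨ is a neural network parameterized by Ψ.

```python
import numpy as np

def predict_preference(z_u, z_v, psi):
    """Computes R = f_Psi(Z_u, Z_v) as a one-hidden-layer network over the
    concatenated hidden vector encodings (an illustrative choice)."""
    h = np.concatenate([z_u, z_v])
    h = np.tanh(psi["W1"] @ h + psi["b1"])
    r = psi["w2"] @ h + psi["b2"]  # scalar rating preference value
    return float(r)
```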
CN201810253803.6A 2018-03-26 2018-03-26 Hybrid recommendation method based on variational automatic encoder Active CN108647226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810253803.6A CN108647226B (en) 2018-03-26 2018-03-26 Hybrid recommendation method based on variational automatic encoder


Publications (2)

Publication Number Publication Date
CN108647226A true CN108647226A (en) 2018-10-12
CN108647226B CN108647226B (en) 2021-11-02

Family

ID=63744507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810253803.6A Active CN108647226B (en) 2018-03-26 2018-03-26 Hybrid recommendation method based on variational automatic encoder

Country Status (1)

Country Link
CN (1) CN108647226B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109408729A * 2018-12-05 2019-03-01 Guangzhou Baiguoyuan Information Technology Co., Ltd. Recommended material determination method and device, storage medium and computer equipment
CN110659411A * 2019-08-21 2020-01-07 Guilin University of Electronic Technology Personalized recommendation method based on neural attention self-encoder
CN110765353A * 2019-10-16 2020-02-07 Tencent Technology (Shenzhen) Co., Ltd. Processing method and device of project recommendation model, computer equipment and storage medium
WO2020088126A1 * 2018-10-31 2020-05-07 Beijing Dajia Internet Information Technology Co., Ltd. Video recommendation method and device, and computer readable storage medium
CN111709231A * 2020-04-30 2020-09-25 Kunming University of Science and Technology Class case recommendation method based on self-attention variational self-coding
CN112188487A * 2020-12-01 2021-01-05 Suoxinda (Beijing) Data Technology Co., Ltd. Method and system for improving user authentication accuracy
CN112231582A * 2020-11-10 2021-01-15 Nanjing University Website recommendation method and equipment based on variational self-coding data fusion
CN113536116A * 2021-06-29 2021-10-22 Ocean University of China Cross-domain recommendation method based on double-current sliced wasserstein self-encoder
CN115809374A * 2023-02-13 2023-03-17 Sichuan University Method, system, device and storage medium for correcting mainstream deviation of recommendation system
US11915121B2 (en) 2019-11-04 2024-02-27 International Business Machines Corporation Simulator-assisted training for interpretable generative models

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160155136A1 (en) * 2014-12-02 2016-06-02 Fair Isaac Corporation Auto-encoder enhanced self-diagnostic components for model monitoring
CN107424016A * 2017-08-10 2017-12-01 Anhui University Real-time bidding method and its system for online wanted-advertisement recommendation
CN107533683A * 2015-04-28 2018-01-02 Microsoft Technology Licensing, LLC Relevant group suggestion


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
ANGSHUL MAJUMDAR et al.: "Cold-start, warm-start and everything in between: An autoencoder based approach to recommendation", 《2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)》 *
WONSUNG LEE et al.: "Augmented Variational Autoencoders for Collaborative Filtering with Auxiliary Information", 《CIKM '17: PROCEEDINGS OF THE 2017 ACM ON CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT》 *
XIAOPENG LI et al.: "Collaborative Variational Autoencoder for Recommender Systems", 《PROCEEDINGS OF THE 23RD ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING》 *
YU CHENGUANG et al.: "Collaborative filtering recommendation algorithm based on autoencoder", 《Microcomputer Applications》 *
GUO YUDONG et al.: "Information recommendation method based on denoising autoencoder network and word vectors", 《Computer Engineering》 *
HUO HUAN et al.: "Tag collaborative filtering recommendation algorithm based on stacked denoising autoencoders", 《Journal of Chinese Computer Systems》 *
HUANG LIWEI et al.: "A survey of recommender system research based on deep learning", 《Chinese Journal of Computers》 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11244228B2 2018-10-31 2022-02-08 Beijing Dajia Internet Information Technology Co., Ltd. Method and device for recommending video, and computer readable storage medium
WO2020088126A1 * 2018-10-31 2020-05-07 Beijing Dajia Internet Information Technology Co., Ltd. Video recommendation method and device, and computer readable storage medium
CN109408729A * 2018-12-05 2019-03-01 Guangzhou Baiguoyuan Information Technology Co., Ltd. Recommended material determination method and device, storage medium and computer equipment
CN109408729B * 2018-12-05 2022-02-08 Guangzhou Baiguoyuan Information Technology Co., Ltd. Recommended material determination method and device, storage medium and computer equipment
CN110659411A * 2019-08-21 2020-01-07 Guilin University of Electronic Technology Personalized recommendation method based on neural attention self-encoder
CN110659411B * 2019-08-21 2022-03-11 Guilin University of Electronic Technology Personalized recommendation method based on neural attention self-encoder
CN110765353A * 2019-10-16 2020-02-07 Tencent Technology (Shenzhen) Co., Ltd. Processing method and device of project recommendation model, computer equipment and storage medium
CN110765353B * 2019-10-16 2022-03-08 Tencent Technology (Shenzhen) Co., Ltd. Processing method and device of project recommendation model, computer equipment and storage medium
US11915121B2 2019-11-04 2024-02-27 International Business Machines Corporation Simulator-assisted training for interpretable generative models
CN111709231A * 2020-04-30 2020-09-25 Kunming University of Science and Technology Class case recommendation method based on self-attention variational self-coding
CN111709231B * 2020-04-30 2022-11-18 Kunming University of Science and Technology Class case recommendation method based on self-attention variational self-coding
CN112231582A * 2020-11-10 2021-01-15 Nanjing University Website recommendation method and equipment based on variational self-coding data fusion
CN112231582B * 2020-11-10 2023-11-21 Nanjing University Website recommendation method and equipment based on variational self-coding data fusion
CN112188487B * 2020-12-01 2021-03-12 Suoxinda (Beijing) Data Technology Co., Ltd. Method and system for improving user authentication accuracy
CN112188487A * 2020-12-01 2021-01-05 Suoxinda (Beijing) Data Technology Co., Ltd. Method and system for improving user authentication accuracy
CN113536116A * 2021-06-29 2021-10-22 Ocean University of China Cross-domain recommendation method based on double-stream sliced wasserstein self-encoder
CN113536116B * 2021-06-29 2023-11-28 Ocean University of China Cross-domain recommendation method based on double-stream sliced wasserstein self-encoder
CN115809374A * 2023-02-13 2023-03-17 Sichuan University Method, system, device and storage medium for correcting mainstream deviation of recommendation system

Also Published As

Publication number Publication date
CN108647226B (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN108647226A Hybrid recommendation method based on variational autoencoder
Wu et al. Session-based recommendation with graph neural networks
Dew et al. Letting logos speak: Leveraging multiview representation learning for data-driven branding and logo design
CN110046304B (en) User recommendation method and device
CN109325112B (en) A kind of across language sentiment analysis method and apparatus based on emoji
Leng et al. A deep learning approach for relationship extraction from interaction context in social manufacturing paradigm
Liu et al. Dynamic attention-based explainable recommendation with textual and visual fusion
CN108229582A Multi-task named entity recognition dual training method for the medical domain
CN111414476A (en) Attribute-level emotion analysis method based on multi-task learning
CN111858944A (en) Entity aspect level emotion analysis method based on attention mechanism
Zhou et al. Time series forecasting and classification models based on recurrent with attention mechanism and generative adversarial networks
CN114519145A (en) Sequence recommendation method for mining long-term and short-term interests of users based on graph neural network
CN114238577B (en) Multi-task learning emotion classification method integrating multi-head attention mechanism
CN109902201A Recommendation method based on CNN and BP neural networks
Sadr et al. Convolutional neural network equipped with attention mechanism and transfer learning for enhancing performance of sentiment analysis
CN111582506A (en) Multi-label learning method based on global and local label relation
CN112463989A (en) Knowledge graph-based information acquisition method and system
Kumar et al. Sentic computing for aspect-based opinion summarization using multi-head attention with feature pooled pointer generator network
Aich et al. Convolutional neural network-based model for web-based text classification.
CN117972218A (en) User demand accurate matching method and system based on big data
Chan et al. A correlation-embedded attention module to mitigate multicollinearity: An algorithmic trading application
Jovanovic et al. Trends and challenges of real-time learning in large language models: A critical review
Hoffmann et al. Using graph embedding techniques in process-oriented case-based reasoning
Li et al. Graph Neural Networks for Tabular Data Learning: A Survey with Taxonomy and Directions
Hammoud et al. New Arabic medical dataset for diseases classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant