CN113343113A - Cold start entity recommendation method for knowledge distillation based on graph convolution network - Google Patents

Cold start entity recommendation method for knowledge distillation based on graph convolution network

Info

Publication number
CN113343113A
CN113343113A
Authority
CN
China
Prior art keywords: user, product, matrix, output, layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110755889.4A
Other languages
Chinese (zh)
Inventor
张琨
汪帅
吴乐
洪日昌
汪萌
曾宪锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguanwen Language Technology Co ltd
Hefei University of Technology
Original Assignee
Guangzhou Boguanwen Language Technology Co ltd
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguanwen Language Technology Co ltd, Hefei University of Technology filed Critical Guangzhou Boguanwen Language Technology Co ltd
Priority to CN202110755889.4A priority Critical patent/CN113343113A/en
Publication of CN113343113A publication Critical patent/CN113343113A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9535: Search customisation based on user profiles and personalisation
    • G06F 16/9536: Search customisation based on social or collaborative filtering
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/28: Databases characterised by their database models, e.g. relational or object models
    • G06F 16/284: Relational databases
    • G06F 16/288: Entity relationship models
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06N 3/08: Learning methods
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/01: Social networking


Abstract

The invention discloses a cold start entity recommendation method for knowledge distillation based on a graph convolution network, which comprises the following steps: 1. constructing an implicit feedback matrix of users to products, a user attribute matrix, a product attribute matrix, a user graph adjacency matrix for the user student model, and a product graph adjacency matrix for the product student model; 2. constructing an input layer by means of one-hot encoding and random initialization; 3. performing feature propagation through graph convolution in the teacher model, the user student model and the product student model respectively; 4. constructing a prediction layer to predict users' scores for products; 5. fitting the real labels according to the output of the prediction layer to update the feature matrices and the embedded characterization matrices of the attribute nodes; 6. repeating steps 3-5 until the recommendation effect for new users and new products reaches the optimum. The method can fully mine the high-order information of the graph and the potential associations among users, products and attribute nodes, thereby achieving accurate recommendation for cold start entities.

Description

Cold start entity recommendation method for knowledge distillation based on graph convolution network
Technical Field
The invention relates to the field of cold start recommendation, in particular to a cold start entity recommendation method for knowledge distillation based on a graph convolution network.
Background
The information overload problem of the internet era interferes with users' judgment, and recommendation systems have been successfully applied in many industries, including e-commerce, music, video and education. A recommendation system mainly recommends commodities to a user in a personalized manner according to the user's historical interaction and click records. The collaborative filtering model is the most popular approach in traditional recommendation; it obtains user preferences and product characteristics by mining historical records. However, new users and new products appear without any history, so recommendation models based on collaborative filtering are of limited use when serving new users or recommending new products.
To address the recommendation problem of cold start entities (new users and new products), collaborative filtering systems that incorporate attribute information have been introduced: user attributes (gender, age, occupation, etc.) and product attributes (category, service, environment, etc.) are used to model the characterizations of users and products, and the relationship between the collaborative information space and the attribute feature space is learned, so that personalized recommendation can be provided for cold start entities. However, such methods only learn a simple transfer function between the two embedded characterization spaces, which limits the recommendation performance for cold start entities.
By modeling user behavior data on products as a user-product bipartite graph, existing collaborative filtering embedded characterization models based on the scoring matrix can be converted into a graph problem. Introducing attribute information into the graph makes it possible to learn attribute representations effectively and thus to represent new users or new products. Existing attribute-enhanced graph recommendation models (node attribute initialization, attribute feature fusion embedded characterization models, etc.) can produce attribute-based characterizations for new users or new products that lack historical interaction information, but the accuracy of these attribute characterizations still needs to be improved. Moreover, attribute characterization and entity embedding characterization reinforce each other in the graph and should not be optimized independently. How to exploit attribute information in a graph model to achieve accurate personalized recommendation for cold start entities (new users and new products) has therefore become an urgent problem.
Disclosure of Invention
The invention provides a cold-start entity recommendation method for knowledge distillation based on a graph convolution network, aiming at solving the defects of the prior art, so that the internal interaction and potential association between entity nodes and attribute nodes in a graph can be fully mined, and the relationship between an attribute characterization space and a corresponding entity embedding space is mined, thereby realizing more accurate recommendation of cold-start entities.
The invention adopts the following technical scheme for solving the technical problems:
the invention relates to a cold start entity recommendation method for knowledge distillation based on a graph convolution network, which is characterized by comprising the following steps of:
step 1, let U represent a user set, and U ═ U1,...,ui,...,ub,...,uM},uiRepresents the ith user, ubRepresenting the b-th user, M represents the total number of users, i is more than or equal to 1, and b is more than or equal to M; let V denote the product set, and V ═ V1,...,vj,...,vN},vjIt is indicated that the product of the jth,n represents the total number of products, j is more than or equal to 1 and less than or equal to N; let RijRepresents the ith user uiFor jth product vjIf the implicit feedback exists, the implicit feedback matrix R of the product is set as Rij}M×N
Let the user attribute matrix be
Figure BDA0003147422900000021
Figure BDA0003147422900000022
Represents the ith user uiD of (A)uA dimension attribute vector; order product attribute matrix
Figure BDA0003147422900000023
Figure BDA0003147422900000024
Denotes the jth product vjD of (A)vA dimension attribute vector;
defining an embedded characterization matrix of user attribute nodes, and randomly initializing for the first time to
Figure BDA0003147422900000025
Figure BDA0003147422900000026
A K-dimensional embedded characterization vector representing a kth user attribute node; randomly initializing the embedded characterization matrix of the user attribute node for the second time to be
Figure BDA0003147422900000027
ekA K-dimensional embedded characterization vector representing a kth user attribute node;
defining an embedded characterization matrix of product attribute nodes and randomly initializing for the first time to
Figure BDA0003147422900000028
Figure BDA0003147422900000029
To representEmbedding a characterization vector into the K dimension of the ith product attribute node; randomly initializing the embedded characterization matrix of the product attribute nodes for the second time to
Figure BDA00031474229000000210
flA K-dimensional embedded characterization vector representing the ith product attribute node;
using implicit feedback matrix R ═ Rij}M×NAnd a user attribute matrix of
Figure BDA00031474229000000211
Constructing a user graph adjacency matrix SU
Using implicit feedback matrix R ═ Rij}M×NAnd a product attribute matrix
Figure BDA00031474229000000212
Construct the product drawing adjacency matrix SV
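As a concrete illustration of step 1, the following sketch (Python with NumPy) builds a toy implicit feedback matrix R and one plausible form of the user graph adjacency matrix S^U and product graph adjacency matrix S^V. The exact block layout of S^U and S^V is not spelled out above, so linking products to user attribute nodes (and users to product attribute nodes) through R is an assumption, and all variable names are illustrative.

```python
import numpy as np

# Toy sizes: M users, N products, d_u user attributes, d_v product attributes.
M, N, d_u, d_v = 4, 5, 3, 6

# Implicit feedback matrix R: R[i, j] = 1 if user u_i has an implicit feedback
# record (click, purchase, ...) for product v_j, otherwise 0.
R = np.zeros((M, N))
for i, j in [(0, 1), (0, 3), (1, 0), (2, 2), (3, 4)]:
    R[i, j] = 1.0

# Binary attribute matrices: row i is the d_u-dimensional attribute vector of user u_i,
# row j is the d_v-dimensional attribute vector of product v_j.
rng = np.random.default_rng(0)
user_attr = rng.integers(0, 2, size=(M, d_u)).astype(float)
prod_attr = rng.integers(0, 2, size=(N, d_v)).astype(float)

# Assumed construction of S^U: connect each product to the attribute nodes of the
# users who interacted with it, so the user student model can propagate between
# product nodes and user attribute nodes.
S_U = (R.T @ user_attr > 0).astype(float)   # shape (N, d_u)

# Assumed construction of S^V: connect each user to the attribute nodes of the
# products that the user interacted with.
S_V = (R @ prod_attr > 0).astype(float)     # shape (M, d_v)

print(R.shape, S_U.shape, S_V.shape)
```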
Step 2: Obtain the feature matrices, namely a user feature matrix and a product feature matrix, by one-hot encoding:
Step 2.1: One-hot encode the user set U to construct the user feature matrix P = {p_1, ..., p_i, ..., p_M}, where p_i denotes the K-dimensional user feature vector of the i-th user u_i.
Step 2.2: One-hot encode the product set V to construct the product feature matrix Q = {q_1, ..., q_j, ..., q_N}, where q_j denotes the K-dimensional product feature vector of the j-th product v_j.
step 3, constructing a characteristic initialization layer:
step 3.1, defining the current updating times as t, and initializing t to be 0;
step 3.2, define and initialize ith user u updated for the t timeiUser feature vector of
Figure BDA00031474229000000213
Define and initialize the product of the jth product of the tth updateFeature vector
Figure BDA00031474229000000214
Defining and initializing K-dimensional embedded characterization vectors for kth updated user attribute nodes
Figure BDA0003147422900000031
Defining and initializing K-dimensional embedded characterization vectors for the ith updated product attribute node
Figure BDA0003147422900000032
Step 4: Perform feature propagation in the teacher model through graph convolution:
Step 4.1: Define the teacher model to contain T' graph convolution layers.
Step 4.2: Feed the t-th-update user feature vector of the i-th user u_i, the t-th-update product feature vector of the j-th product v_j, the t-th-update K-dimensional embedded characterization vector of the k-th user attribute node, and the t-th-update K-dimensional embedded characterization vector of the l-th product attribute node into the teacher model for feature propagation. Equations (1) and (2) (whose formulas appear only as images in the source) then give, after the t-th update, the user feature vector of u_i, the product feature vector of v_j, the embedded characterization vector of the k-th user attribute node, and the embedded characterization vector of the l-th product attribute node output by the next graph convolution layer.
In equations (1) and (2), A_i is the set of product nodes and user attribute nodes with which the i-th user u_i has interacted; A_j is the set of user nodes and product attribute nodes with which the j-th product v_j has interacted; A_k is the set of users interacting with the k-th user attribute node; A_l is the set of products interacting with the l-th product attribute node; and the remaining terms are the feature vectors and embedded characterization vectors of these neighboring nodes output by the previous graph convolution layer after the t-th update.
Step 5: Construct the prediction layer of the teacher model from the output of the T'-th graph convolution layer:
Step 5.1: Obtain the teacher model's user characterization U_i and product characterization V_j by equation (3), whose inputs are the user feature vector of the i-th user u_i and the product feature vector of the j-th product v_j output by the T'-th layer of the teacher model.
Step 5.2: Obtain the teacher model's predicted score of the i-th user u_i for the j-th product v_j by equation (4) as the vector inner product of U_i and V_j, where <,> in equation (4) denotes the vector inner product.
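Because equations (1) to (4) are only available as images in the source, the sketch below shows one common graph-convolution propagation rule (neighbour aggregation over the sets A_i, A_j, A_k, A_l, normalized by neighbourhood size) followed by the inner-product prediction of step 5.2. It is an assumption about the exact form of equations (1) to (3), not a reproduction of them, and the node names are illustrative.

```python
import numpy as np

def propagate(layer_in, neighbours):
    """One graph-convolution layer: each node's next-layer vector is the normalized
    sum of its neighbours' current vectors. `layer_in` maps node id -> K-dim vector,
    `neighbours` maps node id -> list of neighbour ids (the sets A_i, A_j, A_k, A_l)."""
    layer_out = {}
    for node, nbrs in neighbours.items():
        if nbrs:
            layer_out[node] = sum(layer_in[n] for n in nbrs) / np.sqrt(len(nbrs))
        else:
            layer_out[node] = layer_in[node]
    return layer_out

def teacher_forward(init_vectors, neighbours, T_prime):
    """Stack T' propagation layers (step 4) and return the last-layer vectors (step 5.1)."""
    layer = dict(init_vectors)
    for _ in range(T_prime):
        layer = propagate(layer, neighbours)
    return layer

def predict(user_vec, product_vec):
    """Step 5.2: the predicted score is the inner product of the two characterizations."""
    return float(np.dot(user_vec, product_vec))

# Tiny example: u0 interacted with v0, a0 is a user attribute node of u0, v1 is isolated.
K = 8
rng = np.random.default_rng(0)
init = {n: rng.normal(size=K) for n in ["u0", "v0", "v1", "a0"]}
nbrs = {"u0": ["v0", "a0"], "v0": ["u0"], "v1": [], "a0": ["u0"]}
final = teacher_forward(init, nbrs, T_prime=2)
print(predict(final["u0"], final["v1"]))
```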
step 6, constructing an input layer of the user student model, and inputting an embedded characterization matrix of the user attribute nodes for the second random initialization
Figure BDA0003147422900000046
Product characteristic matrix Q ═ Q1,...,qj,...,qN}, user graph adjacency matrix SU
Step 7, constructing a characteristic initialization layer:
step 7.1, initializing t to be 0;
step 7.2, define and initialize the kth user attribute node of the t-th update
Figure BDA0003147422900000047
Defining the product feature vector of the jth updated product
Figure BDA0003147422900000048
And 8, carrying out feature propagation on the user student model through the graph convolution:
step 8.1, defining the user student model to comprise a T' layer graph convolution layer;
step 8.2, embedding the K-dimension of the kth updated user attribute node into the characterization vector
Figure BDA0003147422900000049
And the t updated product feature vector of the jth product
Figure BDA00031474229000000410
Inputting the user student model for feature propagation, and calculating the embedded characterization vector of the user attribute node output by the t +1 th convolution layer after the kth updating of the kth user attribute node by using the formula (5)
Figure BDA00031474229000000411
And jth product vjThe characteristic vector of the product output by the t +1 th convolution layer after the t-th update
Figure BDA00031474229000000412
Figure BDA00031474229000000413
In the formula (5), the reaction mixture is,
Figure BDA0003147422900000051
is the k-th user attribute node in the user graph adjacency matrix SUThe corresponding set of products in (a) to (b),
Figure BDA0003147422900000052
is the embedded characterization vector output by the kth layer convolution layer after the kth updating of the kth user attribute node,
Figure BDA0003147422900000053
is the jth product vjThe characteristic vector output by the t-th layer convolution layer after the t-th updating;
Figure BDA0003147422900000054
is the jth product vjIn the user graph adjacency matrix SUTo a corresponding set of user attribute nodes.
Step 9, constructing a prediction layer of the user student model according to the user characteristics output by the T' th layer of the user student model:
step 9.1, obtaining the ith user u output by the user student model by using the formula (6)iUser characterization of
Figure BDA0003147422900000055
And jth product vjProduct characterization of
Figure BDA0003147422900000056
Figure BDA0003147422900000057
In the formula (6), the reaction mixture is,
Figure BDA0003147422900000058
an embedded token vector representing the kth user attribute node output by the T' th layer of the user student model,
Figure BDA0003147422900000059
representing users studentsJth product v output from T' th layer of modeljThe feature vector of the product of (1),
Figure BDA00031474229000000510
represents the ith user uiThe user attribute node set of (2);
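Equation (6) is likewise only available as an image. One plausible reading, sketched below, is that the user student model characterizes a user by aggregating the T'-th-layer embeddings of the attribute nodes in that user's attribute set; the use of mean aggregation and all names are assumptions for illustration. The same rule can then be applied unchanged to a brand-new user.

```python
import numpy as np

def student_user_characterization(attr_embeddings, user_attr_sets):
    """Hypothetical prediction layer of the user student model (step 9.1): each user is
    represented by the mean of the final-layer embeddings of its attribute nodes."""
    return {
        user: np.mean([attr_embeddings[a] for a in attrs], axis=0)
        for user, attrs in user_attr_sets.items()
    }

K = 8
rng = np.random.default_rng(1)
attr_emb = {f"a{k}": rng.normal(size=K) for k in range(5)}   # T'-th layer attribute embeddings
user_attrs = {"u0": ["a0", "a2"], "u_new": ["a1", "a3", "a4"]}  # u_new has no interaction history
chars = student_user_characterization(attr_emb, user_attrs)
print(chars["u_new"].shape)
```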
Step 10: Construct the input layer of the product student model; its inputs are the second random initialization of the embedded characterization matrix of the product attribute nodes (with vectors f_l), the user feature matrix P = {p_1, ..., p_i, ..., p_M}, and the product graph adjacency matrix S^V.
Step 11: Following the procedure of steps 7 to 9, perform feature initialization and feature propagation in the product student model, so as to obtain the user characterization of the i-th user u_i and the product characterization of the j-th product v_j output by the product student model.
Step 12: Use equation (7) to obtain the predicted score of the i-th user u_i for the j-th product v_j given by the user student model and the product student model. In equation (7), the prediction is computed from the user characterization of u_i output by the user student model and the product characterization of v_j output by the product student model.
Step 12.1: Construct the loss function L_r of the teacher model according to equation (8). In equation (8), σ denotes the sigmoid activation function; D_u denotes the training data of the u-th user u_u; the two score terms are the teacher model's predicted scores of the u-th user u_u for the i-th product v_i and for the j-th product v_j; θ denotes the parameters to be optimized; and γ denotes the coefficient of the regularization term.
Step 12.2: Construct the knowledge distillation loss function L_u of the user student model and the knowledge distillation loss function L_v of the product student model using equation (9). In equation (9), U_i is the user characterization of the i-th user u_i output by the teacher model and V_j is the product characterization of the j-th product v_j output by the teacher model, while the corresponding student terms are the user characterization of u_i output by the user student model and the product characterization of v_j output by the product student model.
Step 12.3: Construct the loss function L_s for the score prediction of the user student model and the product student model using equation (10):
L_s = ||U_g(V_g)^T - U^U(V^I)^T||   (10)
In equation (10), U_g is the user feature matrix output by the teacher model; V_g is the product feature matrix output by the teacher model; U^U is the user characterization matrix output by the user student model; and V^I is the product characterization matrix output by the product student model.
Step 12.4: Obtain the overall loss function Loss(θ) of the whole network according to equation (11):
Loss(θ) = L_r + λL_u + μL_v + ηL_s   (11)
In equation (11), λ, μ and η are hyperparameters that balance the distillation loss functions of the different parts. Minimize Loss(θ) by gradient descent to obtain the updated optimal parameters θ*.
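The combined objective of step 12.4 can be sketched as follows. Equations (8) and (9) are shown only as images above, so the pairwise BPR-style form of L_r and the squared-distance form of the distillation losses L_u and L_v are assumptions; L_s and the weighted sum follow equations (10) and (11) directly. Names and hyperparameter values are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def teacher_loss(U, V, triples, theta_l2, gamma):
    """Assumed form of L_r (eq. 8): for each (u, i, j) with product i observed and j not,
    maximise sigma(<U_u, V_i> - <U_u, V_j>), plus an L2 regulariser weighted by gamma."""
    loss = 0.0
    for u, i, j in triples:
        loss += -np.log(sigmoid(U[u] @ V[i] - U[u] @ V[j]) + 1e-10)
    return loss + gamma * theta_l2

def distillation_loss(teacher_reprs, student_reprs):
    """Assumed form of L_u / L_v (eq. 9): squared distance between teacher and student
    characterizations of the same entities."""
    return sum(np.sum((teacher_reprs[n] - student_reprs[n]) ** 2) for n in teacher_reprs)

def score_prediction_loss(U_g, V_g, U_stu, V_stu):
    """L_s (eq. 10): norm of the difference between teacher and student score matrices."""
    return np.linalg.norm(U_g @ V_g.T - U_stu @ V_stu.T)

def total_loss(L_r, L_u, L_v, L_s, lam, mu, eta):
    """Eq. (11): Loss(theta) = L_r + lambda * L_u + mu * L_v + eta * L_s."""
    return L_r + lam * L_u + mu * L_v + eta * L_s

# Toy usage with random characterizations.
rng = np.random.default_rng(2)
M, N, K = 3, 4, 8
U_g, V_g = rng.normal(size=(M, K)), rng.normal(size=(N, K))
U_s, V_s = rng.normal(size=(M, K)), rng.normal(size=(N, K))
L_r = teacher_loss(U_g, V_g, [(0, 1, 2), (1, 0, 3)], theta_l2=1.0, gamma=1e-4)
L_u = distillation_loss({0: U_g[0]}, {0: U_s[0]})
L_v = distillation_loss({0: V_g[0]}, {0: V_s[0]})
L_s = score_prediction_loss(U_g, V_g, U_s, V_s)
print(total_loss(L_r, L_u, L_v, L_s, lam=0.1, mu=0.1, eta=0.01))
```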
Step 13: Use the updated user student model to obtain the best characterization of a new user u_c, and use the updated product student model to obtain the best characterization of a new product v_d, according to equation (12). In equation (12), the characterization of the new user u_c is obtained from the embedded characterization vectors, output by the T'-th layer of the updated user student model, of the user attribute nodes in u_c's user attribute node set, and the characterization of the new product v_d is obtained from the embedded characterization vectors f_l^{T'} of the product attribute nodes in v_d's product attribute node set, output by the T'-th layer of the updated product student model.
Compared with the prior art, the invention has the beneficial effects that:
1. Aiming at the problem that users and products in a recommendation system lack historical interaction records under the cold start condition (new users and new products), the invention provides a cold start entity recommendation method for knowledge distillation based on a graph convolution network, deeply mines the relationship between collaborative filtering information and the features of entities and attributes, and iteratively updates the attribute and entity embedded characterizations, thereby effectively improving the accuracy of cold start entity recommendation.
2. The entity (user and product) sets are processed by one-hot encoding, which makes index operations and matrix computations fast and acts as a feature expansion of the representations used in recommendation.
3. A heterogeneous graph is designed: the users' implicit feedback data on products, the user attribute data and the product attribute data are modeled as nodes and connection relations of the teacher model, and the indirect connections between entities and attributes are constructed as the graph adjacency matrices of the user student model and the product student model. A knowledge distillation technique is introduced so that the entity characterizations output by the teacher model guide the corresponding entity attribute characterizations output by the student models, thereby solving the recommendation problem of cold start entities.
4. Representation learning of the interaction information between users and products and between entities and attributes is carried out by graph convolution; propagating through graph convolution captures high-order similarity better and learns more accurate attribute characterizations.
5. The node embedding matrices of entities and attributes are updated according to the prediction results of the teacher model and the distilled knowledge of the student models, and the whole neural network is learned iteratively, which effectively improves the recommendation precision for cold start entities.
Drawings
FIG. 1 is a flow chart of the cold start entity recommendation method for knowledge distillation based on graph convolution network of the present invention.
Detailed Description
In this embodiment, as shown in FIG. 1, the cold start entity recommendation method for knowledge distillation based on a graph convolution network is carried out by performing steps 1 to 13 exactly as described above, with the implicit feedback matrix R = {R_ij}_{M×N} specified as follows: if the i-th user u_i has an implicit feedback record for the j-th product v_j, then R_ij = 1, otherwise R_ij = 0. In step 13, the cold start entities are the new users and the new products, whose best characterizations are obtained from the updated user student model and the updated product student model respectively.
Example:
To verify the effectiveness of the method, three public data sets commonly used in recommendation are adopted: Yelp, Amazon-Video Games and XING. Users with fewer than three rating records are filtered out of each data set, 30% of the users are randomly drawn from all users as new users, and 30% of the products are drawn from all products as new products. The interaction records and corresponding attributes of the old users and old products are used as training data. Recommendation of new users and new products under different scenarios is divided into three tasks: recommending old products to new users (task one), recommending new products to old users (task two), and recommending new products to new users (task three).
For the product recommendation tasks, Hit Ratio (HR) and Normalized Discounted Cumulative Gain (NDCG) are adopted as evaluation criteria. Eight methods are selected for comparison: KNN, DropoutNet, LinMap, xDeepFM, CDL, Heater, PinSage and Student.
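For reference, HR@K and NDCG@K can be computed as in the sketch below. This is the standard single-held-out-item formulation; the cutoff K, the ranked list and the item names are illustrative, since the exact evaluation protocol is not detailed here.

```python
import numpy as np

def hit_ratio_at_k(ranked_items, held_out_item, k):
    """HR@K: 1 if the held-out item appears in the top-K recommendations, else 0."""
    return 1.0 if held_out_item in ranked_items[:k] else 0.0

def ndcg_at_k(ranked_items, held_out_item, k):
    """NDCG@K with a single relevant item: 1 / log2(rank + 2) if it is in the top-K, else 0."""
    if held_out_item in ranked_items[:k]:
        rank = ranked_items.index(held_out_item)   # 0-based position in the ranking
        return 1.0 / np.log2(rank + 2)
    return 0.0

ranked = ["v7", "v2", "v9", "v1", "v5"]
print(hit_ratio_at_k(ranked, "v9", k=3), ndcg_at_k(ranked, "v9", k=3))
```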
Table 1. Recommendation results of the method of the invention and the comparison methods on the three tasks of the Yelp data set (table given as an image in the original).
Table 2. Recommendation results of the method of the invention and the comparison methods on task two of the Amazon-Video Games data set (table given as an image in the original).
Table 3. Recommendation results of the method of the invention and the comparison methods on the three tasks of the XING data set (table given as an image in the original).
Tables 1, 2 and 3 report the experimental results on the Yelp, Amazon-Video Games and XING data sets respectively. It can be seen that on all three data sets the proposed method outperforms the eight comparison methods in both the HR and the NDCG metrics.
Taken over the three tasks under the cold start condition, the proposed method (PGD) is significantly superior to the comparison methods on all three data sets, which demonstrates the feasibility of the proposed method.

Claims (1)

1. A cold start entity recommendation method for knowledge distillation based on a graph convolution network, characterized by comprising the following steps, carried out as set forth in the description above:
step 1, constructing the implicit feedback matrix R = {R_ij}_{M×N} of the users to the products, the user attribute matrix, the product attribute matrix, the twice randomly initialized embedded characterization matrices of the user attribute nodes and of the product attribute nodes, the user graph adjacency matrix S^U and the product graph adjacency matrix S^V;
step 2, obtaining the user feature matrix P and the product feature matrix Q by one-hot encoding;
step 3, constructing the feature initialization layer;
steps 4 and 5, performing feature propagation in the teacher model through its T' graph convolution layers and constructing the prediction layer of the teacher model, whose predicted score is the vector inner product of the user characterization U_i and the product characterization V_j;
steps 6 to 9, constructing the input layer, feature initialization layer, graph convolution propagation and prediction layer of the user student model;
steps 10 and 11, constructing the input layer of the product student model and performing its feature initialization and feature propagation following steps 7 to 9;
step 12, obtaining the predicted scores of the user student model and the product student model, constructing the teacher loss L_r, the knowledge distillation losses L_u and L_v, the score prediction loss L_s = ||U_g(V_g)^T - U^U(V^I)^T||, and the overall loss Loss(θ) = L_r + λL_u + μL_v + ηL_s, and minimizing Loss(θ) by gradient descent to obtain the optimal parameters θ*;
step 13, obtaining the best characterization of a new user u_c from the updated user student model and the best characterization of a new product v_d from the updated product student model.
CN202110755889.4A 2021-07-05 2021-07-05 Cold start entity recommendation method for knowledge distillation based on graph convolution network Pending CN113343113A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110755889.4A CN113343113A (en) 2021-07-05 2021-07-05 Cold start entity recommendation method for knowledge distillation based on graph convolution network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110755889.4A CN113343113A (en) 2021-07-05 2021-07-05 Cold start entity recommendation method for knowledge distillation based on graph convolution network

Publications (1)

Publication Number Publication Date
CN113343113A true CN113343113A (en) 2021-09-03

Family

ID=77482453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110755889.4A Pending CN113343113A (en) 2021-07-05 2021-07-05 Cold start entity recommendation method for knowledge distillation based on graph convolution network

Country Status (1)

Country Link
CN (1) CN113343113A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115545822A (en) * 2022-09-20 2022-12-30 China Telecom Corp., Ltd. Product attribute recommendation method and device, computer storage medium and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200311552A1 (en) * 2019-03-25 2020-10-01 Samsung Electronics Co., Ltd. Device and method for compressing machine learning model
CN111291270A (en) * 2020-03-02 2020-06-16 合肥工业大学 Attribute reasoning and product recommendation method based on self-adaptive graph convolution network
CN112861936A (en) * 2021-01-26 2021-05-28 北京邮电大学 Graph node classification method and device based on graph neural network knowledge distillation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHUAI WANG et al.: "Privileged Graph Distillation for Cold Start Recommendation", arXiv *


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20210903)