CN113609398B - Social recommendation method based on heterogeneous graph neural network - Google Patents
- Publication number
- CN113609398B CN113609398B CN202110942348.2A CN202110942348A CN113609398B CN 113609398 B CN113609398 B CN 113609398B CN 202110942348 A CN202110942348 A CN 202110942348A CN 113609398 B CN113609398 B CN 113609398B
- Authority
- CN
- China
- Prior art keywords
- user
- client
- local
- social
- project
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F16/9535—Search customisation based on user profiles and personalisation
- G06F16/9536—Search customisation based on social or collaborative filtering
- G06N3/045—Combinations of networks
- G06N3/08—Learning methods
- G06Q10/06393—Score-carding, benchmarking or key performance indicator [KPI] analysis
- G06Q10/103—Workflow collaboration or project management
- G06Q50/01—Social networking
Abstract
The invention discloses a social recommendation method based on a heterogeneous graph neural network. A local heterogeneous graph is constructed from the local data on each client; the client requests model parameters from the server and performs embedding learning on the local heterogeneous graph with a graph attention network model, handling the heterogeneity of the local graph and the client's personalized information. Each user is associated with a client; after the client adds pseudo-item labels, the client's gradient is calculated with the loss function and uploaded to the server after passing through a local differential privacy model. The server collects the gradients of multiple clients and updates the model parameters to train the social recommendation model; the local client embeddings output by the social recommendation model are then used for social recommendation. The method decentralizes data storage, fully fuses the clients' local private user data, and trains the social recommendation model cooperatively with the server, so that social recommendation is achieved effectively while data privacy is protected.
Description
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to a social recommendation method based on a heterogeneous graph neural network.
Background
With the rapid development of the internet and information technology, massive amounts of data are generated. We have entered an era of information explosion in which vast quantities of information are produced at every moment, and it is increasingly difficult for users to find information useful to them. Since each person's interests differ, recommendations tailored to the individual are needed; recommendation systems thus emerged and have become a current research hotspot. A recommendation system explores a user's behavior to recommend suitable information, meeting the user's personalized needs. It is designed to predict a user's potential interest in an item by learning embeddings. In addition, the recent development of graph neural networks provides a powerful backbone for recommendation systems to learn embeddings from the user-item graph. However, relying only on user-item interactions suffers from the cold-start problem, owing to the difficulty of data collection. Social information is therefore fused with user-item interactions to alleviate it; this is the social recommendation problem.
The goal of social recommendation is to predict a user's rating of an item given social interactions and user-item interactions. Existing social recommendation methods can be divided into methods based on social matrix factorization and methods based on graph neural networks. Social matrix factorization methods either jointly factorize the rating and social relationship matrices or regularize the embeddings of users or items with social connection constraints. However, the rating matrix and the social relationship matrix are highly sparse and unevenly distributed, which in turn causes low recommendation performance and cold-start problems. Graph neural network methods infer node embeddings directly from the graph. However, existing methods that use graph neural networks to aggregate social connections and user-item interactions simultaneously require centralized storage of users' social connections and item interactions, which raises privacy concerns.
Disclosure of Invention
In order to solve the above problems, the invention provides a social recommendation method based on a heterogeneous graph neural network, which decentralizes data storage, fully fuses the clients' local private user data, and trains the social recommendation model cooperatively with a server, so that social recommendation is achieved effectively while data privacy is protected.
In order to achieve the above purpose, the invention adopts the following technical scheme: a social recommendation method based on a heterogeneous graph neural network, comprising the steps of:
S10, constructing a local heterogeneous graph from local data on a client;
S20, the client requests model parameters from the server and performs embedding learning on the local heterogeneous graph with a graph attention network model to handle the heterogeneity of the local graph and the client's personalized information;
S30, each user is associated with a client; after the client adds pseudo-item labels, the client's gradient is calculated with the loss function, passed through a local differential privacy model, and uploaded to the server;
S40, the server collects the gradients of multiple clients and updates the model parameters to train a social recommendation model; the local client embeddings output by the social recommendation model are used for social recommendation.
Further, step S10 comprises the steps of:
S11, the local data of each client stores rating data and social data, and each client is associated with one user;
S12, establishing the local heterogeneous graph of the client, wherein the local heterogeneous graph contains user nodes and item nodes, has two edge types, namely user-item edges and user-user edges, and includes the first-order neighbors of the client's user.
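The local graph construction of steps S11 and S12 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the dictionary-based data layout (a rating dict and a friend list per client) is a hypothetical choice made here for clarity.

```python
# Sketch of S10: building one client's local heterogeneous graph.
# The rating/friend data layout is a hypothetical assumption for illustration.

def build_local_hetero_graph(user_id, ratings, friends):
    """ratings: {item_id: score} rated by this client's user;
    friends: first-order social neighbors (user ids) of this user."""
    return {
        "user_nodes": [user_id] + list(friends),
        "item_nodes": list(ratings),
        # the two edge types of step S12:
        "user_item_edges": [(user_id, item, score) for item, score in ratings.items()],
        "user_user_edges": [(user_id, friend) for friend in friends],
    }

g = build_local_hetero_graph("u1", {"t1": 4, "t2": 5}, ["u2", "u3"])
```

Because each graph contains only the client's own ratings and first-order neighbors, the raw interaction data never has to leave the device.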
Further, step S20 comprises the steps of:
S21, for the n-th client, obtaining its item embeddings and user embeddings from the local heterogeneous graph;
S22, inputting the obtained embeddings into the graph attention network model for embedding learning to obtain the attention weights of the user's social relationships and of the item neighbor relations;
S23, aggregating the user's social relationships and item neighbor relations, concatenating the hidden embeddings with the relationship vectors of the user's social relationships and item neighbors, and learning the aggregation weights with a self-attention mechanism;
S24, obtaining the inferred node embedding of the user from the aggregation result;
S25, taking the dot product between the user's inferred node embedding and the item embedding to predict the local item rating.
Further, in step S22, the obtained embeddings are input into the graph attention network model for embedding learning to obtain the attention weights of the user's social relationships and of the item neighbor relations, comprising the steps of:
learning the weight of each neighbor using the attention layer and obtaining the attention score of the social pair; then computing all neighbor weights of the center node to obtain the final attention weight of the user's social relationship;
learning the weight of each neighbor using the attention layer and obtaining the attention score of the item pair; then computing all neighbor weights of the center node to obtain the final attention weight of the item neighbor relation.
Further, in step S23, the user's social relationships and item neighbor relations are aggregated, the hidden embeddings are concatenated with their relationship vectors, and the aggregation weights are learned with a self-attention mechanism, as follows:
γ_x = softmax_x(c·[h_x ‖ v_x]), x ∈ {u, t, s};  h = γ_u·h_u + γ_t·h_t + γ_s·h_s
wherein γ_u, γ_t and γ_s are the attention weights of the hidden user-social embedding, the hidden item-neighbor embedding and the center-node embedding; h_u and h_t are the hidden embeddings of the user's social contacts and the item neighbors, respectively; v_u denotes the social relationship vector, v_t the user-item relationship vector and v_s the center-node vector; h_s denotes the embedding of the center node itself, and c denotes the weight vector of the attention layer.
Further, step S30 comprises the steps of:
S31, first, q items that are not among the neighboring rated items are sampled as pseudo items; then the local model is used to predict the ratings of these pseudo items, and the predicted ratings are rounded to obtain the pseudo ratings;
S32, calculating the error between the true values and the predicted values via the root mean square error;
S33, calculating the gradient of the client according to the obtained error;
S34, inputting the calculated gradient into the local differential privacy model, adding dynamic noise to the gradient, and optimizing the gradient.
Further, step S40 comprises the steps of:
S41, the server collects gradients from multiple clients and then aggregates them;
S42, after aggregation, the server updates the model parameters; this learning process is repeated until convergence.
The beneficial effects of adopting this technical scheme are:
In this method, private user data is stored in a distributed manner, the clients and the server cooperate, and user-item interactions and users' social connections are aggregated for social recommendation. The data on the clients are formulated as multiple local graphs, and the heterogeneity of the local graphs is handled using the relation-aware attention and aggregation of a graph attention network. User embeddings are inferred from the local data to preserve the personalized information of each client. Each user is associated with a client; the client's gradient is calculated with the loss function, the server collects the gradients of multiple clients, and the updated parameters are sent back to the clients. Finally, the local client embeddings output by the model are used for social recommendation. However, uploading gradients directly to the server would raise privacy issues, which are solved by using dynamic local differential privacy and pseudo-item labels at the client. The invention decentralizes data storage, fully fuses the clients' local private user data, and trains the federated social recommendation system cooperatively with the server, so that social recommendation is achieved effectively while data privacy is protected.
To integrate user-item interactions and social connection information more effectively, the method formulates the data on the clients as multiple local graphs, performs embedding learning on the local graphs with a graph attention neural network, and uses its relation-aware attention and aggregation to distinguish social contacts from item neighbors, preserving the heterogeneity of the data. Personalized information is preserved for each client through local user-embedding inference.
To effectively protect data privacy, distributed data storage is used: the original private data is kept on the local client and never uploaded to the server; only the calculated gradient data is uploaded to the server for updating the parameters, training the social recommendation system cooperatively. In addition, pseudo-item labels and a dynamic differential privacy technique are added on the client to protect the gradients from leaking private data. The pseudo-item labels also provide additional rating information, which can alleviate the cold-start problem of the data. Moreover, the pseudo items are sampled, and the differences between the derived ratings and the predicted ratings are random, which enhances the robustness of the local model. Overall comparison experiments show that the method clearly outperforms SOTA federated learning frameworks on the social recommendation problem while effectively protecting the privacy of user data.
Drawings
FIG. 1 is a schematic flow chart of a social recommendation method based on a heterogeneous graph neural network;
FIG. 2 is a schematic diagram of a social recommendation method based on a heterogeneous graph neural network in an embodiment of the invention;
FIG. 3 is a schematic diagram of attention weight calculation in an embodiment of the present invention;
fig. 4 is a schematic diagram of the graph attention neural network in an embodiment of the invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent.
In this embodiment, referring to fig. 1 and fig. 2, the invention provides a social recommendation method based on a heterogeneous graph neural network, comprising the steps of:
S10, constructing a local heterogeneous graph from local data on a client;
S20, the client requests model parameters from the server and performs embedding learning on the local heterogeneous graph with the graph attention network model to handle the heterogeneity of the local graph and the client's personalized information;
S30, each user is associated with a client; after the client adds pseudo-item labels, the client's gradient is calculated with the loss function, passed through a local differential privacy model, and uploaded to the server;
S40, the server collects the gradients of multiple clients and updates the model parameters to train a social recommendation model; the local client embeddings output by the social recommendation model are used for social recommendation.
As an optimization of the above embodiment, step S10 comprises the steps of:
S11, the local data of each client comprises stored rating data and social data, and each client C_n is associated with user n;
S12, establishing the local heterogeneous graph G_n of client C_n, wherein the local heterogeneous graph G_n contains user nodes and item nodes, has two edge types, namely user-item edges and user-user edges, and includes the first-order neighbors of the client's user.
As an optimization of the above embodiment, step S20 comprises the steps of:
S21, for the n-th client, obtaining its item embeddings and user embeddings from the local heterogeneous graph;
wherein the item embeddings and user embeddings form embedding matrices, K denotes the total number of items, and P denotes the total number of users;
S22, the obtained embeddings are input into the graph attention network model for embedding learning to obtain the attention weights of the user's social relationships and of the item neighbor relations; specifically:
learning the weight of each neighbor using the attention layer: for a social pair (u_n, u_p), the attention score is o_np = LeakyReLU(a·[W_1·e_un ‖ W_1·e_up]), wherein e_un denotes the embedding of user n, W_1 is the linear mapping matrix for social relationships, the attention layer is a single-layer feed-forward neural network parameterized by the weight vector a and the activation function LeakyReLU, and ‖ denotes the concatenation of two vectors; all neighbor weights of the center node u are then computed to obtain the final attention weight of the user's social relationship, α_np = softmax_p(o_np);
learning the weight of each neighbor using the attention layer: for an item pair (u_n, t_k), the attention score is ν_nk = LeakyReLU(b·[W_2·e_un ‖ W_2·e_tk]), wherein W_2 is the linear mapping matrix for user-item relations, the attention layer is a single-layer feed-forward neural network parameterized by the weight vector b and the activation function LeakyReLU, and ‖ denotes the concatenation of two vectors; all neighbor weights of the center node u are then computed to obtain the final attention weight of the item neighbor relation, β_nk = softmax_k(ν_nk).
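The attention-weight computation of step S22 can be sketched numerically as follows. This is an illustrative single attention head in the style the description names (shared linear map, concatenation, LeakyReLU, softmax over neighbors); the dimensions and random initialization are assumptions, not values from the patent.

```python
import numpy as np

# Sketch of S22: score o_np = LeakyReLU(a . [W e_center || W e_neighbor]),
# then softmax over the center node's neighbors. Shapes are illustrative.

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def attention_weights(e_center, e_neighbors, W, a):
    """e_center: (d,), e_neighbors: (p, d), W: (d, d), a: (2d,).
    Returns softmax-normalized attention weights over the p neighbors."""
    h_c = W @ e_center
    scores = np.array([
        leaky_relu(a @ np.concatenate([h_c, W @ e_p]))
        for e_p in e_neighbors
    ])
    exp = np.exp(scores - scores.max())   # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
d = 8
alpha = attention_weights(rng.normal(size=d), rng.normal(size=(3, d)),
                          rng.normal(size=(d, d)), rng.normal(size=2 * d))
```

The same routine serves both edge types: with W_1 and weight vector a it yields the social weights α_np, and with W_2 and weight vector b it yields the item-neighbor weights β_nk.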
S23, the user's social relationships and item neighbor relations are aggregated, the hidden embeddings are concatenated with the relationship vectors of the user's social relationships and item neighbors, and the aggregation weights are learned with a self-attention mechanism, as follows:
γ_x = softmax_x(c·[h_x ‖ v_x]), x ∈ {u, t, s};  h = γ_u·h_u + γ_t·h_t + γ_s·h_s
wherein γ_u, γ_t and γ_s are the attention weights of the hidden user-social embedding, the hidden item-neighbor embedding and the center-node embedding; h_u and h_t are the hidden embeddings of the user's social contacts and the item neighbors, respectively; v_u denotes the social relationship vector, v_t the user-item relationship vector and v_s the center-node vector; h_s denotes the embedding of the center node itself, and c denotes the weight vector of the attention layer.
S24, the inferred node embedding h of the user is obtained from the aggregation result.
S25, the dot product between the user's inferred node embedding and the item embedding predicts the local item rating, r̂ = h·e_t, wherein e_t denotes the item embedding.
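Steps S23 to S25 can be sketched together: the three hidden embeddings are scored by a self-attention vector c against their concatenated relation vectors, combined into the inferred node embedding, and the rating is a dot product. All shapes and the random inputs are illustrative assumptions.

```python
import numpy as np

# Sketch of S23-S25: self-attention over {social, item-neighbor, center}
# embeddings, each concatenated with its relation vector, then a dot-product
# rating prediction. Shapes and inputs are illustrative.

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def infer_and_predict(h_u, h_t, h_s, v_u, v_t, v_s, c, item_emb):
    # self-attention score for each branch: c . [h || v]
    scores = np.array([
        c @ np.concatenate([h_u, v_u]),
        c @ np.concatenate([h_t, v_t]),
        c @ np.concatenate([h_s, v_s]),
    ])
    g_u, g_t, g_s = softmax(scores)            # gamma_u, gamma_t, gamma_s
    h = g_u * h_u + g_t * h_t + g_s * h_s      # inferred node embedding (S24)
    return float(h @ item_emb)                 # predicted local rating (S25)

rng = np.random.default_rng(1)
d = 8
args = [rng.normal(size=d) for _ in range(6)] + [rng.normal(size=2 * d),
                                                 rng.normal(size=d)]
rating = infer_and_predict(*args)
```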
As an optimization of the above embodiment, step S30 comprises the steps of:
S31, q items that are not among the neighboring rated items are sampled as pseudo items; the local model is then used to predict the ratings of these pseudo items, and the predicted ratings are rounded to obtain the pseudo ratings;
S32, the error between the true values and the predicted values is calculated via the root mean square error, as follows:
L = sqrt( (1/|R|) Σ_{i∈R} (r_i − r̂_i)² )
wherein R is the set of rated items, including the pseudo items, r_i is the true or pseudo rating, and r̂_i is the predicted rating;
S33, the gradient g^(n) of the client is calculated from the obtained error;
wherein the gradient comprises the item embedding gradients, the user embedding gradients and the model gradients of the client, which are the trainable parameters;
S34, the calculated gradient is input into the local differential privacy model, formulated as
g̃^(n) = clip(g^(n), δ) + Laplacian(0, λ)
wherein clip(g^(n), δ) denotes limiting the gradient g^(n) by the threshold δ, and Laplacian(0, λ) denotes Laplace noise with zero mean and intensity λ; a constant noise intensity is not appropriate when dealing with gradients of different magnitudes;
therefore, dynamic noise whose intensity scales with the magnitude of the gradient is added to the gradient, and the formula is optimized accordingly.
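The local differential privacy step S34 can be sketched as clipping followed by Laplace noise. Scaling the noise intensity by the mean absolute value of the clipped gradient is an illustrative choice for the "dynamic" rule; the patent does not fix the exact scaling formula here.

```python
import numpy as np

# Sketch of S34: clip the gradient to threshold delta, then add Laplace noise
# whose scale grows with the clipped gradient's magnitude ("dynamic noise").
# The mean-absolute-value scaling rule is an assumption for illustration.

def dp_protect(grad, delta=0.5, lam=0.1, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    clipped = np.clip(grad, -delta, delta)          # clip(g, delta)
    scale = lam * np.mean(np.abs(clipped))          # dynamic noise intensity
    noise = rng.laplace(loc=0.0, scale=scale, size=grad.shape)
    return clipped + noise

g = np.array([3.0, -0.2, 0.4])
protected = dp_protect(g, delta=0.5, lam=0.1, rng=np.random.default_rng(0))
```

Only the protected gradient leaves the client; the raw ratings, social edges, and the exact gradient values stay local.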
As an optimization of the above embodiment, step S40 comprises the steps of:
S41, the server collects the gradients from multiple clients and then aggregates them, weighting each client's gradient by its interaction count:
g = Σ_n R_n·g̃^(n) / Σ_n R_n
wherein R_n is the total number of interactions used in computing the gradient of client n, including real interactions and pseudo interactions; the item-related and user-related parts of the gradient are weighted by R_n^t and R_n^u, the numbers of interactions related to items and to users, respectively;
S42, after aggregation, the server updates the model parameters Θ as Θ ← Θ − η·g, wherein η denotes the learning rate; this learning process is repeated until convergence.
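The server side of steps S41 and S42 can be sketched as an interaction-weighted mean of client gradients followed by a gradient step. Treating the aggregation as a weighted mean over the counts R_n is an assumption consistent with the description, not the patent's exact formula.

```python
import numpy as np

# Sketch of S41-S42: aggregate client gradients weighted by interaction
# counts R_n, then apply Theta <- Theta - eta * g.

def aggregate(gradients, counts):
    """gradients: list of per-client gradient arrays; counts: list of R_n."""
    total = sum(counts)
    return sum(r * g for r, g in zip(counts, gradients)) / total

def server_update(theta, gradients, counts, eta=0.01):
    return theta - eta * aggregate(gradients, counts)

theta = np.zeros(4)
grads = [np.ones(4), 3 * np.ones(4)]
new_theta = server_update(theta, grads, counts=[10, 30], eta=0.1)
# weighted mean gradient = (10*1 + 30*3)/40 = 2.5, so theta becomes -0.25
```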
As a specific embodiment, fig. 2 shows a two-client scenario. In each client, the local GAT graph attention layer is used to infer node embeddings, and an attention layer is employed to aggregate social neighbors and item neighbors. Then a set of pseudo items bound to the local data is sampled, and the loss and the gradient are calculated. After the differential privacy model is applied, the embedding gradients and model gradients are uploaded to the server for aggregation. The server updates the parameters and then sends them back to the clients.
As shown in fig. 3, the attention weight between two embeddings is calculated using a linear mapping matrix W, the concatenation of the two user embeddings e_un and e_up fed into the attention layer a, and the activation function LeakyReLU.
As shown in fig. 4, the local graph neural network (GNN) learns the embedding of user u1 by aggregating the embeddings of its neighbors, which is then uploaded to the server.
The foregoing has shown and described the basic principles and main features of the present invention and the advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and that the above embodiments and descriptions are merely illustrative of the principles of the present invention, and various changes and modifications may be made without departing from the spirit and scope of the invention, which is defined in the appended claims. The scope of the invention is defined by the appended claims and equivalents thereof.
Claims (4)
1. A social recommendation method based on a heterogeneous graph neural network, characterized by comprising the steps of:
S10, constructing a local heterogeneous graph from local data on a client, comprising the steps of:
S11, the local data of the client stores rating data and social data, and each client is associated with one user;
S12, establishing the client's local heterogeneous graph, which contains user nodes and item nodes, has two edge types, namely user-item edges and user-user edges, and includes the first-order neighbors of the client's user;
S20, the client requests model parameters from a server and performs embedding learning on the local heterogeneous graph with a graph attention network model to handle the heterogeneity of the local graph and the client's personalized information, comprising the steps of:
S21, for the n-th client, obtaining its item embeddings and user embeddings from the local heterogeneous graph;
S22, inputting the obtained embeddings into the graph attention network model for embedding learning to obtain the attention weights of the user's social relationships and of the item neighbor relations;
S23, aggregating the user's social relationships and item neighbor relations, concatenating the hidden embeddings with the relationship vectors of the user's social relationships and item neighbors, and learning the aggregation weights with a self-attention mechanism;
S24, obtaining the inferred node embedding of the user from the aggregation result;
S25, taking the dot product between the user's inferred node embedding and the item embedding to predict the local item rating;
S30, associating a user with a client; after the client adds pseudo-item labels, calculating the client's gradient with the loss function, passing it through a local differential privacy model, and uploading it to the server, comprising the steps of:
S31, first, sampling q items that are not among the neighboring rated items as pseudo items; then using the local model to predict the ratings of these pseudo items, the predicted ratings being rounded to obtain the pseudo ratings;
S32, calculating the error between the true values and the predicted values via the root mean square error;
S33, calculating the gradient of the client according to the obtained error;
S34, inputting the calculated gradient into the local differential privacy model, adding dynamic noise to the gradient, and optimizing the gradient;
S40, the server collects the gradients of multiple clients and updates the model parameters to train a social recommendation model; the local client embeddings output by the social recommendation model are used for social recommendation.
2. The social recommendation method based on a heterogeneous graph neural network according to claim 1, wherein in step S22 the obtained embeddings are input into a graph attention network model for embedding learning, yielding the attention weights of the user's social relationships and of the item-neighbor relationships, comprising the steps of:
learning the weight of each neighbor with an attention layer to obtain the attention score of each social pair, then computing the weights over all neighbors of the center node to obtain the final attention weights of the user's social relationships;
learning the weight of each neighbor with an attention layer to obtain the attention score of each item pair, then computing the weights over all neighbors of the center node to obtain the final attention weights of the item-neighbor relationships.
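The per-neighbor weighting in claim 2 can be sketched as one graph-attention-style scoring layer. The LeakyReLU nonlinearity and the concatenation scheme follow the common graph attention formulation and are assumptions here, since the claim does not fix them:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def neighbor_attention(center, neighbors, a):
    """Score each (center ++ neighbor) pair with the attention vector a,
    then softmax over all neighbors of the center node to obtain the
    final attention weights."""
    scores = np.array([leaky_relu(a @ np.concatenate([center, n]))
                       for n in neighbors])
    exp = np.exp(scores - scores.max())      # stable softmax
    return exp / exp.sum()
```

The same routine applies to both relation types: with social neighbors it yields the social attention weights, with item neighbors the item-neighbor attention weights.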
3. The social recommendation method based on a heterogeneous graph neural network according to claim 2, wherein in step S23 the user's social relationships and the item-neighbor relationships are aggregated, the hidden embeddings are concatenated with their relationship vectors, and the aggregation weights are learned with a self-attention mechanism, according to the following formula:
wherein γ_u, γ_t, and γ_s are the attention weights of the hidden user social embedding, the hidden item-neighbor embedding, and the hidden center-node embedding, respectively; v_u denotes the social relationship vector, v_t the user-item relationship vector, and v_s the center-node vector; h_s denotes the embedding of the center node itself, and c is the weight vector of the attention layer.
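Claim 3's aggregation weights γ_u, γ_t, γ_s can be sketched as a softmax over the three concatenated (hidden embedding, relationship vector) channels. This is a minimal reading of the claim, assuming the attention vector c scores each concatenation directly:

```python
import numpy as np

def aggregation_weights(h_u, v_u, h_t, v_t, h_s, v_s, c):
    """Concatenate each hidden embedding with its relationship vector,
    score the three channels (user social, item neighbor, center node)
    with the attention-layer weight vector c, and softmax to obtain
    (gamma_u, gamma_t, gamma_s)."""
    scores = np.array([c @ np.concatenate([h, v])
                       for h, v in ((h_u, v_u), (h_t, v_t), (h_s, v_s))])
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()
```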
4. The social recommendation method based on a heterogeneous graph neural network according to claim 1, wherein step S40 comprises the steps of:
S41, the server collects gradients from multiple clients and aggregates them;
S42, after aggregation, the server updates the model parameters; this learning process is repeated until convergence.
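One round of the server-side procedure in S41–S42 can be sketched as federated averaging of the uploaded gradients followed by a gradient step. The plain mean and the learning rate are illustrative assumptions; the claim only requires that the server aggregate the gradients and update the parameters:

```python
import numpy as np

def server_round(params, client_grads, lr=0.01):
    """S41: average the gradients uploaded by the clients;
    S42: apply one gradient step to the shared model parameters."""
    avg_grad = np.mean(np.asarray(client_grads, float), axis=0)
    return np.asarray(params, float) - lr * avg_grad
```

Repeating `server_round` over successive batches of client uploads gives the "operated a number of times until convergence" loop of S42.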
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110942348.2A CN113609398B (en) | 2021-08-17 | 2021-08-17 | Social recommendation method based on heterogeneous graph neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113609398A CN113609398A (en) | 2021-11-05 |
CN113609398B true CN113609398B (en) | 2023-09-19 |
Family
ID=78340928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110942348.2A Active CN113609398B (en) | 2021-08-17 | 2021-08-17 | Social recommendation method based on heterogeneous graph neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113609398B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113821732B (en) * | 2021-11-24 | 2022-02-18 | 阿里巴巴达摩院(杭州)科技有限公司 | Item recommendation method and equipment for protecting user privacy and learning system |
CN114398538B (en) * | 2021-12-08 | 2024-02-06 | 西安电子科技大学 | Cross-domain recommendation method and system for privacy protection, storage medium and computer equipment |
CN114118388B (en) * | 2022-01-25 | 2022-04-19 | 湖南工商大学 | Heterogeneous network graph link prediction method facing user privacy protection and related equipment |
CN114510652B (en) * | 2022-04-20 | 2023-04-07 | 宁波大学 | Social collaborative filtering recommendation method based on federal learning |
CN115310762A (en) * | 2022-07-04 | 2022-11-08 | 上海淇玥信息技术有限公司 | Target service determination method and device based on heterogeneous graph neural network |
CN115081024B (en) * | 2022-08-16 | 2023-01-24 | 杭州金智塔科技有限公司 | Decentralized business model training method and device based on privacy protection |
CN116226540B (en) * | 2023-05-09 | 2023-09-26 | 浙江大学 | End-to-end federation personalized recommendation method and system based on user interest domain |
CN117520665B (en) * | 2024-01-05 | 2024-03-26 | 江西财经大学 | Social recommendation method based on generation of countermeasure network |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111177781A (en) * | 2019-12-30 | 2020-05-19 | 北京航空航天大学 | Differential privacy recommendation method based on heterogeneous information network embedding |
CN112084428A (en) * | 2020-09-17 | 2020-12-15 | 辽宁工程技术大学 | Collaborative filtering recommendation method based on coupling network embedding and knowledge graph |
CN112990972A (en) * | 2021-03-19 | 2021-06-18 | 华南理工大学 | Recommendation method based on heterogeneous graph neural network |
CN113254803A (en) * | 2021-06-24 | 2021-08-13 | 暨南大学 | Social recommendation method based on multi-feature heterogeneous graph neural network |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110245301A (en) * | 2018-11-29 | 2019-09-17 | 腾讯科技(深圳)有限公司 | A kind of recommended method, device and storage medium |
US11163803B2 (en) * | 2019-04-29 | 2021-11-02 | Adobe Inc. | Higher-order graph clustering |
Non-Patent Citations (2)
Title |
---|
Collaborative filtering recommendation algorithm based on a graph embedding model; Gao Haiyan; Mao Lin; Dou Kaiqi; Ni Wenye; Zhao Weibin; Yu Yonghong; Journal of Data Acquisition and Processing (03); full text *
Research on a social network recommendation algorithm based on probabilistic matrix factorization; Chen Yongfeng; Zhu Zhenyu; Science and Technology Square (01); full text *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113609398B (en) | Social recommendation method based on heterogeneous graph neural network | |
US20230039182A1 (en) | Method, apparatus, computer device, storage medium, and program product for processing data | |
Ray et al. | A surrogate assisted parallel multiobjective evolutionary algorithm for robust engineering design | |
CN112215604B (en) | Method and device for identifying transaction mutual-party relationship information | |
CN112364976A (en) | User preference prediction method based on session recommendation system | |
CN113961759A (en) | Anomaly detection method based on attribute map representation learning | |
He et al. | MTAD‐TF: Multivariate Time Series Anomaly Detection Using the Combination of Temporal Pattern and Feature Pattern | |
CN112600697B (en) | QoS prediction method and system based on federal learning, client and server | |
CN114595396A (en) | Sequence recommendation method and system based on federal learning | |
Long et al. | Fedsiam: Towards adaptive federated semi-supervised learning | |
Wang et al. | Digital-twin-aided product design framework for IoT platforms | |
Devi et al. | Smoothing approach to alleviate the meager rating problem in collaborative recommender systems | |
Cheng et al. | Dynamic games for social model training service market via federated learning approach | |
Yin et al. | An efficient recommendation algorithm based on heterogeneous information network | |
CN115098692A (en) | Cross-domain recommendation method and device, electronic equipment and storage medium | |
CN113361928B (en) | Crowd-sourced task recommendation method based on heterogram attention network | |
Chen et al. | Integrating dual user network embedding with matrix factorization for social recommender systems | |
CN117217820A (en) | Intelligent integrated prediction method and system for purchasing demand of supply chain | |
Sahu et al. | Matrix factorization in cross-domain recommendations framework by shared users latent factors | |
Zou et al. | Dynamic games in federated learning training service market | |
CN117035059A (en) | Efficient privacy protection recommendation system and method for communication | |
CN115391638A (en) | Recommendation model training method and device based on social network | |
CN110334283A (en) | Information recommendation method, device, server and storage medium | |
CN114996566A (en) | Intelligent recommendation system and method for industrial internet platform | |
Singh et al. | Self-Attention Mechanism Based Federated Learning Model for Cross Context Recommendation System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |