CN113761350A - Data recommendation method, related device and data recommendation system - Google Patents

Data recommendation method, related device and data recommendation system

Info

Publication number
CN113761350A
Authority
CN
China
Prior art keywords
data
recommendation model
global
local
parameters
Prior art date
Legal status
Pending
Application number
CN202110252347.5A
Other languages
Chinese (zh)
Inventor
李干
张翔
史贤伟
李昆
Current Assignee
Beijing Jingdong Zhenshi Information Technology Co Ltd
Original Assignee
Beijing Jingdong Zhenshi Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Zhenshi Information Technology Co Ltd
Priority: CN202110252347.5A
Publication: CN113761350A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G06F16/9535 - Search customisation based on user profiles and personalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G06F16/9538 - Presentation of query results
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0631 - Item recommendations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00 - Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/02 - Banking, e.g. interest calculation or account maintenance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 - Social networking

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Tourism & Hospitality (AREA)
  • Technology Law (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a data recommendation method, a related device and a data recommendation system, and relates to the field of computer technology. In one embodiment of the method, a first client trains a local first local recommendation model with a local user behavior data set and sends the first sharable data obtained from training to a server; the first client downloads global shared data from the server, the global shared data being calculated by the server from the first sharable data and a second sharable data set; the first client then updates the first local recommendation model with the global shared data so as to train a global recommendation model and, once training is complete, uses the global recommendation model to recommend object data to the user. This embodiment solves the problem that the feature data available for a user cover only a single dimension, improves recommendation accuracy, protects private data against malicious attacks during learning, and improves the robustness of learning.

Description

Data recommendation method, related device and data recommendation system
Technical Field
The invention relates to the technical field of computers, in particular to a data recommendation method, a related device and a data recommendation system.
Background
Depending on business needs, the home page of a website, APP (application) or applet recommends relevant object data, such as commodity data, to a user according to the user's relevant characteristics. At present, e-commerce institutions mainly recommend commodity data to a user according to the user's own feature data, for example by running a recommendation algorithm over the user's browsing records, purchase records, collection records, attention records and other related data held within the e-commerce institution.
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
the feature data available for a user cover only a single dimension, so recommendation accuracy is low.
Disclosure of Invention
In view of this, embodiments of the present invention provide a data recommendation method, a related apparatus and a data recommendation system, which can solve the problem that the feature data of a user cover only a single dimension, improve recommendation accuracy, and ensure the security of private data against malicious attacks during learning, thereby improving the robustness of learning.
To achieve the above object, according to an aspect of an embodiment of the present invention, a data recommendation method is provided.
A method of data recommendation, comprising: the first client side trains a local first local recommendation model by using a local user behavior data set and sends first sharable data obtained by training to the server; the first client downloads global shared data from the server, wherein the global shared data are calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set comprises second sharable data sent to the server by one or more second clients; and the first client updates the first local recommendation model by using the global sharing data so as to train a global recommendation model, and recommends object data matched with corresponding user behavior data to a user by using the global recommendation model after the training of the global recommendation model is completed.
Optionally, the global shared data includes parameters of the global recommendation model and a loss function gradient of the global recommendation model. The first client updating the first local recommendation model with the global shared data includes: the first client updating the parameters and the loss function gradient of the first local recommendation model according to the parameters and the loss function gradient of the global recommendation model, where the first sharable data includes the parameters and the loss function gradient of the first local recommendation model.
Optionally, the first client updates the parameters of the first local recommendation model by: and the first client calculates the product of the learning rate of the first local recommendation model and the loss function gradient of the global recommendation model, and calculates the difference value between the current parameter of the first local recommendation model and the product to obtain the updated parameter of the first local recommendation model.
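This update rule (current parameter minus the product of the local learning rate and the global loss gradient) can be sketched in Python; function and variable names below are illustrative assumptions, not terms from the patent:

```python
def update_local_params(local_params, global_grad, learning_rate):
    """Update each local parameter w to w - lr * g, where g is the
    corresponding component of the global model's loss-function gradient."""
    return [w - learning_rate * g for w, g in zip(local_params, global_grad)]
```

For example, with learning rate 0.1, a parameter of 1.0 under a global gradient component of 0.5 updates to 0.95.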
Optionally, the user behavior data set local to the first client is a set of one of the following data: the data of user consumption behaviors, the data of user social behaviors and the data of user operation behaviors on the object.
According to another aspect of the embodiments of the present invention, a data recommendation method is provided.
A method of data recommendation, comprising: the method comprises the steps that a server receives sharable data sent by clients in a client set, wherein the sharable data are obtained by training respective local recommendation models by the clients through respective user behavior data sets; and the server calculates global shared data according to the sharable data, the global shared data is used for updating respective local recommendation models of the clients so as to train a global recommendation model, and the global recommendation model is used for recommending object data matched with the corresponding user behavior data to respective users by the clients.
Optionally, the sharable data sent by a client includes parameters of the client's local recommendation model, and the global shared data includes parameters of the global recommendation model; the server calculating global shared data from the sharable data includes: randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting the parameters of the target local recommendation models according to a first sorting rule; then removing a first number of the largest parameters and a second number of the smallest parameters from the sorted parameters, and averaging the remaining parameters to determine the parameters of the global recommendation model.
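A minimal sketch of this trimmed-mean aggregation, applied position-by-position across the selected models (the names and data layout are assumptions; the patent does not fix an implementation):

```python
def trimmed_mean(model_params, n_largest, n_smallest):
    """For each parameter position, sort the values reported by the target
    local models, drop the n_largest biggest and n_smallest smallest
    values, and average what remains to form the global parameter."""
    aggregated = []
    for values in zip(*model_params):   # same position across all models
        ordered = sorted(values)
        kept = ordered[n_smallest:len(ordered) - n_largest]
        aggregated.append(sum(kept) / len(kept))
    return aggregated
```

Dropping the extremes limits the effect of a single client reporting an outlier (possibly malicious) parameter value.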
Optionally, the sharable data sent by a client includes parameters of the client's local recommendation model, and the global shared data includes parameters of the global recommendation model; the server calculating global shared data from the sharable data includes: randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting their parameters according to a second sorting rule; computing the median of the sorted parameters of the target local recommendation models and, if the total number of parameters is even, averaging the two middle parameters, so as to determine the parameters of the global recommendation model.
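The median variant can be sketched the same way (illustrative code only, not the patent's implementation):

```python
def coordinate_median(model_params):
    """For each parameter position, take the median of the sorted values
    across the target local models; with an even count, average the two
    middle values."""
    aggregated = []
    for values in zip(*model_params):
        ordered = sorted(values)
        n = len(ordered)
        if n % 2 == 1:
            aggregated.append(ordered[n // 2])
        else:
            aggregated.append((ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    return aggregated
```

Like the trimmed mean, the coordinate-wise median is robust to a minority of extreme parameter values.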
Optionally, the global shared data further includes the loss function gradient of the global recommendation model; a global loss function is obtained as the weighted average of the loss functions of the target local recommendation models, and the loss function gradient of the global recommendation model is calculated from the global loss function.
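Since differentiation is linear, the gradient of a weighted-average loss equals the same weighted average of the per-model loss gradients; a hedged sketch under that reading (the weights and names are assumptions):

```python
def global_gradient(local_grads, weights):
    """Weighted average of the per-model loss gradients, which by
    linearity equals the gradient of the weighted-average (global)
    loss function."""
    total = sum(weights)
    dim = len(local_grads[0])
    return [sum(w * g[i] for w, g in zip(weights, local_grads)) / total
            for i in range(dim)]
```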
Optionally, according to a set percentage, randomly selecting a local recommendation model of some or all of the clients in the client set as the target local recommendation model.
According to another aspect of the embodiment of the invention, a first client for data recommendation is provided.
A first client for data recommendation, comprising: the local recommendation module is used for training a local recommendation model by using a local user behavior data set and sending first sharable data obtained by training to the server; a global shared data downloading module, configured to download global shared data from the server, where the global shared data is calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set includes second sharable data sent to the server by one or more second clients; and the first local recommendation model updating module is used for updating the first local recommendation model by using the global shared data so as to train a global recommendation model, and recommending object data matched with corresponding user behavior data to a user by using the global recommendation model after the global recommendation model is trained.
Optionally, the global shared data includes parameters of the global recommendation model and a loss function gradient of the global recommendation model. The first local recommendation model update module is further configured to: update the parameters and the loss function gradient of the first local recommendation model according to the parameters and the loss function gradient of the global recommendation model, where the first sharable data includes the parameters and the loss function gradient of the first local recommendation model.
Optionally, the first local recommendation model updating module updates the parameters of the first local recommendation model by: and calculating a product of the learning rate of the first local recommendation model and the gradient of the loss function of the global recommendation model, and calculating a difference value between a current parameter of the first local recommendation model and the product to obtain an updated parameter of the first local recommendation model.
Optionally, the user behavior data set local to the first client is a set of one of the following data: the data of user consumption behaviors, the data of user social behaviors and the data of user operation behaviors on the object.
According to still another aspect of an embodiment of the present invention, there is provided a server for data recommendation.
A server for data recommendation, comprising: the sharable data receiving module is used for receiving sharable data sent by the clients in the client set, wherein the sharable data are obtained by training respective local recommendation models by the clients by using respective user behavior data sets; and the global shared data calculation module is used for calculating global shared data according to the sharable data, the global shared data is used for updating respective local recommendation models of the clients so as to train a global recommendation model, and the global recommendation model is used for recommending object data matched with the corresponding user behavior data to respective users by the clients.
Optionally, the sharable data sent by a client includes parameters of the client's local recommendation model, and the global shared data includes parameters of the global recommendation model; the global shared data calculation module is further configured to: randomly select the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sort the parameters of the target local recommendation models according to a first sorting rule; then remove a first number of the largest parameters and a second number of the smallest parameters from the sorted parameters, and average the remaining parameters to determine the parameters of the global recommendation model.
Optionally, the sharable data sent by a client includes parameters of the client's local recommendation model, and the global shared data includes parameters of the global recommendation model; the global shared data calculation module is further configured to: randomly select the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sort their parameters according to a second sorting rule; compute the median of the sorted parameters of the target local recommendation models and, if the total number of parameters is even, average the two middle parameters, so as to determine the parameters of the global recommendation model.
Optionally, the global shared data further includes the loss function gradient of the global recommendation model; a global loss function is obtained as the weighted average of the loss functions of the target local recommendation models, and the loss function gradient of the global recommendation model is calculated from the global loss function.
Optionally, according to a set percentage, randomly selecting a local recommendation model of some or all of the clients in the client set as the target local recommendation model.
According to yet another aspect of an embodiment of the present invention, a data recommendation system is provided.
A data recommendation system, comprising: the data recommendation system comprises a plurality of clients and a server for data recommendation provided by the embodiment of the invention, wherein the plurality of clients comprise a first client for data recommendation provided by the embodiment of the invention.
According to yet another aspect of an embodiment of the present invention, an electronic device is provided.
An electronic device, comprising: one or more processors; a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the data recommendation method provided by embodiments of the present invention.
According to yet another aspect of an embodiment of the present invention, a computer-readable medium is provided.
A computer-readable medium, on which a computer program is stored, which, when executed by a processor, implements a data recommendation method provided by an embodiment of the present invention.
One embodiment of the above invention has the following advantages or benefits. The first client trains a local first local recommendation model with a local user behavior data set and sends the first sharable data obtained from training to the server. The first client downloads global shared data from the server, where the global shared data is calculated by the server from the first sharable data and a second sharable data set, and the second sharable data set includes second sharable data sent to the server by one or more second clients. The first client updates the first local recommendation model with the global shared data so as to train the global recommendation model and, once training is complete, uses the global recommendation model to recommend to the user object data matched to the corresponding user behavior data. This solves the problem that the feature data of a user cover only a single dimension, improves recommendation accuracy, ensures the security of private data against malicious attacks during learning, and improves the robustness of learning.
Further effects of the above non-obvious alternatives are described below in connection with specific embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
FIG. 1 is a schematic main flow diagram of a data recommendation method according to one embodiment of the present invention;
FIG. 2 is a schematic main flow chart of a data recommendation method according to another embodiment of the present invention;
FIG. 3 is a flow diagram of federated learning, in accordance with one embodiment of the present invention;
FIG. 4 is a schematic diagram of the main modules of a first client for data recommendation according to one embodiment of the present invention;
FIG. 5 is a schematic diagram of the main modules of a server for data recommendation according to one embodiment of the present invention;
FIG. 6 is a schematic diagram of the main components of a data recommendation system according to one embodiment of the present invention;
FIG. 7 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 8 is a schematic structural diagram of a computer system suitable for implementing a terminal device or a server according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a main flow diagram of a data recommendation method according to an embodiment of the present invention.
As shown in fig. 1, the data recommendation method according to an embodiment of the present invention mainly includes steps S101 to S103 as follows. The data recommendation method of the present embodiment may be executed by a first client for data recommendation.
Step S101: the first client side trains a local first local recommendation model by using a local user behavior data set, and sends the first sharable data obtained by training to the server.
The user behavior data set local to the first client may be a collection of one of: the data of user consumption behaviors, the data of user social behaviors and the data of user operation behaviors on the object.
The object may be a commodity, a file, or the like.
The user consumption behavior data may be consumption flow data of users at a banking institution; the user social behavior data may be social behavior data of users at a social institution, such as users' chat records and browsing records in social software; and the operation behavior data of users on objects may be behavior data such as users' browsing, following and purchasing of goods at an e-commerce institution.
Accordingly, the first client may specifically be a client of an organization such as a bank platform, a social platform, or an e-commerce platform.
The first shareable data may include parameters of the first local recommendation model and a loss function gradient. The first local recommendation model is a recommendation model local to the first client.
Step S102: the first client downloads global shared data from the server, the global shared data is calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set comprises one or more second sharable data sent by the second client to the server.
The second client is likewise a client for data recommendation and has the same functions as the first client; at least one of the second clients is of a different type from the first client. Taking three clients as an example, the first client is a client of a bank platform, and the two second clients are a client of a social platform and a client of an e-commerce platform.
The global shared data includes parameters of the global recommendation model and a loss function gradient of the global recommendation model.
The global recommendation model refers to a unified recommendation model jointly trained by the clients (the first client and the second client).
Step S103: and the first client updates the first local recommendation model by using the global shared data so as to train the global recommendation model, and recommends object data matched with the corresponding user behavior data to the user by using the global recommendation model after the training of the global recommendation model is completed.
The first client updating the first local recommendation model with the global shared data may include: and the first client updates the parameters of the first local recommendation model and the loss function gradient of the first local recommendation model according to the parameters of the global recommendation model and the loss function gradient of the global recommendation model.
The first client may update the parameters of the first local recommendation model by: the first client calculates the product of the learning rate of the first local recommendation model and the loss function gradient of the global recommendation model, and calculates the difference value between the current parameter of the first local recommendation model and the product to obtain the updated parameter of the first local recommendation model.
The first client may update the gradient of the loss function of the first local recommendation model by: and the first client calculates the updated loss function gradient of the first local recommendation model according to the parameters of the global recommendation model and the updated parameters of the local first local recommendation model.
Fig. 2 is a main flow diagram of a data recommendation method according to another embodiment of the present invention.
As shown in fig. 2, the data recommendation method according to an embodiment of the present invention mainly includes steps S201 to S202 as follows. The data recommendation method of the present embodiment may be executed by a server for data recommendation.
Step S201: the server receives sharable data sent by the clients in the client set, wherein the sharable data is obtained by training the local recommendation models of the clients by using the user behavior data sets of the clients.
The shareable data sent by the client may include parameters of a local recommendation model of the client.
Step S202: the server calculates global shared data according to the sharable data, the global shared data are used for the client to update respective local recommendation models so as to train a global recommendation model, and the global recommendation model is used for the client to recommend object data matched with corresponding user behavior data to respective users.
The global shared data may include parameters of a global recommendation model.
The global shared data may also include the loss function gradient of the global recommendation model; a global loss function is obtained as the weighted average of the loss functions of the target local recommendation models, and the loss function gradient of the global recommendation model is calculated from the global loss function.
The server calculating global shared data from the sharable data may include: randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting the parameters of the target local recommendation models according to a first sorting rule; then removing a first number of the largest parameters and a second number of the smallest parameters from the sorted parameters, and averaging the remaining parameters to determine the parameters of the global recommendation model.
Alternatively, the server calculating global shared data from the sharable data may include: randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting their parameters according to a second sorting rule; computing the median of the sorted parameters of the target local recommendation models and, if the total number of parameters is even, averaging the two middle parameters; the parameters of the global recommendation model are determined from the computed median or average.
The local recommendation models of part or all of the clients in the client set can be randomly selected as the target local recommendation model according to the set percentage.
The data recommendation method of the embodiments of the invention is described in detail below, taking as an example an intelligent retail commodity recommendation sharing model based on the Krum aggregation rule and federated learning. Krum is an SGD (stochastic gradient descent) aggregation method with Byzantine fault tolerance.
To address the shortcomings of current commodity recommendation (a single algorithm and a single data source) and the stability problems of existing federated learning, the embodiments of the invention propose an intelligent retail commodity recommendation sharing model based on the Krum aggregation rule and federated learning. Federated learning is a machine learning framework that allows multiple organizations to use data and build machine learning models while meeting the requirements of user privacy protection, data security and regulatory compliance. The data source of commodity recommendation (i.e. the user behavior data set) should not be limited to e-commerce data (i.e. operation behavior data of users on objects); the data relevant to commodity recommendation cover three aspects: user purchasing power, users' personal preferences and commodity characteristics. User purchasing power can be obtained from capital flows at a bank (i.e. user consumption behavior data); users' personal preferences can be obtained from chat records and browsing records in social software (i.e. user social behavior data); and commodity characteristics can be obtained from commodity information at an e-commerce institution. Integrated deep learning over the data of these three parties then yields the final commodity recommendation model (i.e. the global recommendation model). However, the user behavior data sets come from different institutions: the data involved are each institution's private data and cannot be exposed externally, and the institutions' data are usually heterogeneous, so a traditional machine learning model cannot learn directly on them.
At present, traditional machine learning methods cannot effectively solve the above problems, so the embodiment of the invention provides an intelligent retail commodity recommendation sharing model based on the krum aggregation rule and federated learning. Through federated learning, the clients of all organizations jointly build a machine learning model without exporting any organization's data, which fully protects user privacy and data security, provides personalized commodity recommendation services to users, realizes mutual benefit for multiple parties, and solves the heterogeneity problem of user feature data. Aiming at the problem that federated learning is vulnerable to attacks by malicious participants, the embodiment of the invention adopts the krum aggregation rule to ensure data security in federated learning.
FIG. 3 is a flow diagram of federated learning, in accordance with one embodiment of the present invention.
The federated learning process is shown in fig. 3. Let a banking institution be enterprise A, a social software institution be enterprise B and an e-commerce institution be enterprise C. Each institution constructs a commodity recommendation model (i.e. a global recommendation model) based on the basic information of users and the basic information of commodities (i.e. user behavior data sets); data are not transmitted between the institutions, and the clients of the institutions have equal status.
In federated learning, encrypted sample alignment is first performed on the user behavior data sets. Since the user groups of the institutions do not completely overlap, the collaborator D (i.e. the server) uses an encryption-based user sample alignment technique to identify the users shared by the institutions, on the premise that enterprise A, enterprise B and enterprise C (specifically, the client of each enterprise) do not disclose their respective data, and without exposing the non-overlapping users, so that model training can proceed on the information of the common users.
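The alignment step can be illustrated with a toy sketch. The text does not name a concrete protocol, so the following Python example uses salted hashing as a simple stand-in for encryption-based private set intersection: each enterprise uploads only hashes of its user IDs, the collaborator intersects the hash sets, and non-overlapping users are never revealed in plaintext to the other parties. The function names and the salt are illustrative assumptions.

```python
import hashlib

def hash_ids(user_ids, salt):
    """Each client hashes its user IDs with a shared salt before upload,
    so raw IDs are never sent to the collaborator."""
    return {hashlib.sha256((salt + uid).encode()).hexdigest(): uid
            for uid in user_ids}

def align_samples(clients_ids, salt="shared-secret-salt"):
    """Collaborator D intersects the hashed ID sets; each client can then
    map the common hashes back to its own local records."""
    hashed = [hash_ids(ids, salt) for ids in clients_ids]
    common = set(hashed[0])
    for h in hashed[1:]:
        common &= set(h.keys())
    # the first client recovers its own plaintext IDs for the overlap
    return sorted(hashed[0][h] for h in common)

bank = ["u1", "u2", "u3", "u5"]      # enterprise A
social = ["u2", "u3", "u4"]          # enterprise B
ecom = ["u2", "u3", "u5"]            # enterprise C
print(align_samples([bank, social, ecom]))  # users shared by all three
```

Only the intersection is ever mapped back to plaintext; the collaborator sees hashes alone, which is the property the encrypted alignment step relies on.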
After the encrypted sample alignment of the user behavior data set, the encryption model training is performed. In order to ensure the confidentiality of data in the training process, the method needs to perform encryption training by means of a collaborator D, and the training process can be as follows: (1) the collaborator D distributes the public key to the enterprise A, the enterprise B and the enterprise C (particularly to the client of each enterprise) to encrypt data (such as sharable data and the like) needing to be exchanged in the training process; (2) intermediate results (e.g., sharable data, etc.) for computing the gradient are interacted with between enterprise a, enterprise B, and enterprise C in encrypted form; (3) enterprise A, enterprise B and enterprise C respectively calculate based on the encrypted gradient values, and gather the results to collaborator D, and collaborator D calculates the total gradient value (namely the loss function gradient of the global recommendation model) through the gathered results and decrypts the total gradient value; (4) the collaborator D respectively transmits the decrypted gradient (namely the loss function gradient of the global recommendation model) back to the enterprise A, the enterprise B and the enterprise C, and the enterprise A, the enterprise B and the enterprise C update the parameters of the respective models (namely the local recommendation models) according to the gradient.
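As a minimal illustration of steps (2)–(3) — the collaborator recovering a total gradient without seeing any individual one — the sketch below uses pairwise additive masking as a stand-in for the homomorphic encryption the scheme describes; the masks cancel in the sum, so collaborator D learns only the aggregate. This is an assumed simplification for illustration, not the exact protocol of the embodiment.

```python
import random

def mask_gradients(gradients, seed=0):
    """Each pair of clients agrees on a random mask; one adds it, the other
    subtracts it, so all masks cancel in the sum. A stand-in for the
    encrypted gradient exchange via collaborator D."""
    rng = random.Random(seed)
    n = len(gradients)
    masked = list(gradients)
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.uniform(-1e6, 1e6)
            masked[i] += m   # client i adds the pairwise mask
            masked[j] -= m   # client j subtracts the same mask
    return masked

grads = [0.5, -1.25, 2.0]   # each client's local gradient (private)
masked = mask_gradients(grads)
total = sum(masked)         # collaborator sees only masked values
print(round(total, 6))      # equals sum(grads) = 1.25
```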
The number of the banking institutions, the number of the social software institutions and the number of the e-commerce institutions can be one or more.
The technical scheme of the embodiment of the invention mainly comprises a local model training part and a combined model training part.
In the local model training, the banking institution, the social software institution and the e-commerce institution construct and train their respective local recommendation models based on a convolutional neural network from their respective local user behavior data sets; the local recommendation models can be used to judge users' consumption levels, suitable commodity types and the like. Because the convolutional neural network has the characteristics of local connection and weight sharing, the local recommendation model can identify tiny local features and extract the complete feature information of the user feature data, which better supports the classification task. Moreover, the structure of the convolutional neural network fits well with the federated learning framework, which improves the uniformity of the model.
In the joint model training, all the organization clients participating in federated learning collaboratively train a joint commodity recommendation model, namely the global recommendation model. The number of organization clients (banks, social software, e-commerce and the like) participating in federated learning is set as C, and the c-th organization client has a local private data set (namely a local user behavior data set)

D_c = {(x_i^c, y_i^c)}, i = 1, …, n_c

wherein x_i^c is a feature vector, y_i^c is the corresponding label, and n_c is the size of the data set of the c-th organization client participating in federated learning. Throughout federated learning, the commodity recommendation model established through the shared data set is the common target of all organizations, while the user behavior data of each organization remains on its local client, which guarantees the security of each organization's private user data.
Before participating in training of the commodity recommendation model, each organization client calculates the loss function gradient of its local recommendation model from its local user behavior data set and the parameters of the local recommendation model. The learning objective for the non-convex neural network model is:

min_w F(w), F(w) = (1/n) Σ_{i=1}^{n} l(x_i, y_i; w)

that is, the parameter vector w minimizes the loss function defined on the existing data samples (x, y), and the loss function of the whole system is defined as the average of the per-sample loss l over the n data samples.
In the joint commodity recommendation model, each of the C organization clients has its own local private data set with |D_c| = n_c, and n denotes the total size of the data participating in building the whole federated commodity recommendation model, so that:

n = Σ_{c=1}^{C} n_c

The loss function of the whole system can therefore be further read as the aggregation of the local recommendation models trained by the C organization clients on their local data sets D_c, so that the objective is expressed as:

F(w) = Σ_{c=1}^{C} (n_c / n) F_c(w)

F_c(w) = (1/n_c) Σ_{i ∈ D_c} l(x_i, y_i; w)
the method comprises the steps that a server initializes commodity recommendation model parameters (namely parameters of a global recommendation model), in the process of each iteration (the iteration time or iteration frequency of each iteration is recorded as t being 1,2 and 3.), mechanism clients with a certain percentage F are randomly selected from all mechanism clients participating in joint training and serve as target mechanism clients to be in direct communication with the server, parameters of the global recommendation model are determined through parameters of local recommendation models (namely target local recommendation models) of the target mechanism clients, and loss function gradients of the global recommendation model are calculated based on a global loss function.
The server can randomly select the local recommendation models of part or all of the clients in the client set as target local recommendation models through the krum aggregation rule. The krum aggregation rule selects, from a plurality of local models, the model most similar to the other models as the global model. Even if the selected local model comes from a compromised worker node device, it must be similar to the local models of the other, presumably normal, worker node devices, so the influence a local model from a compromised worker node device can exert is limited.
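This selection step can be sketched following the published Krum rule (the score definition and the neighbor count n − f − 2, where f is the assumed number of compromised clients, are taken from that rule and are not spelled out in the text): each local model is scored by the summed squared distances to its closest neighbors, and the lowest-scoring model becomes the global model.

```python
import numpy as np

def krum_select(models, f):
    """Return the index of the local model whose summed squared distance
    to its n - f - 2 nearest neighbors is smallest (the Krum score)."""
    models = np.asarray(models, dtype=float)
    n = len(models)
    k = n - f - 2                    # neighbors scored per model
    scores = []
    for i in range(n):
        d = np.sum((models - models[i]) ** 2, axis=1)
        d = np.sort(d)[1:k + 1]      # skip the zero distance to itself
        scores.append(d.sum())
    return int(np.argmin(scores))

# four honest local models near [1, 1] and one corrupted outlier
local = [[1.0, 1.1], [0.9, 1.0], [1.1, 0.9], [1.0, 1.0], [50.0, -50.0]]
best = krum_select(local, f=1)
print(best)  # → 3, an honest model; the outlier at index 4 is never chosen
```

Because the outlier is far from every other model, its score is enormous, which is exactly how the rule bounds the influence of a compromised worker node.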
The krum aggregation rule may include two modes: trimmed mean and median. The trimmed mean aggregation rule sorts the parameters of each target local recommendation model according to a first sorting rule, removes a first number of the largest parameters and a second number of the smallest parameters from the sorted parameters, and calculates the average of the remaining parameters to determine the parameters of the global recommendation model. Specifically, under the first sorting rule, for each parameter of the local recommendation model — taking the j-th parameter as an example — the server sorts the j-th parameters of the m local recommendation models, deletes the β largest and β smallest parameters, and calculates the average of the remaining m − 2β parameters as the j-th parameter of the global recommendation model, where β is an enhancement coefficient, generally greater than 2, chosen so that m − 2β remains close to the number of participants (namely clients) participating in building the model. The median aggregation rule sorts the parameters of each target local recommendation model according to a second sorting rule, which is the same as the first sorting rule and is not repeated here; it then takes the median of the sorted parameters of each target local recommendation model — if the total number of parameters is even, the average of the two middle parameters in the sorted order — to determine the parameters of the global recommendation model.
Specifically, for the j-th model parameter, the server sorts the j-th parameters of the m local recommendation models and uses the median as the j-th parameter of the global recommendation model; when m is even, the median is the mean of the two middle parameters, the same as in the trimmed mean aggregation rule. When the objective function is strongly convex, the median aggregation rule can also achieve an order-optimal error rate.
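Both coordinate-wise modes can be sketched in a few lines of numpy (an illustrative sketch; the function and parameter names are assumptions):

```python
import numpy as np

def trimmed_mean_aggregate(params, beta):
    """Coordinate-wise trimmed mean: for each parameter index j, sort the
    m client values, drop the beta largest and beta smallest, and average
    the remaining m - 2*beta values."""
    p = np.sort(np.asarray(params, dtype=float), axis=0)  # sort per coordinate
    return p[beta:len(p) - beta].mean(axis=0)

def median_aggregate(params):
    """Coordinate-wise median; for an even number of clients this is the
    mean of the two middle values, as described in the text."""
    return np.median(np.asarray(params, dtype=float), axis=0)

# five clients' parameter vectors; the last one is an outlier
client_params = [[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [1.0, 2.0], [100.0, -100.0]]
print(trimmed_mean_aggregate(client_params, beta=1))  # ≈ [1.03, 1.97]
print(median_aggregate(client_params))                # [1.0, 2.0]
```

In both modes the outlier client's extreme values are discarded or outvoted per coordinate, which is what makes the aggregation robust to a malicious participant.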
Each client participating in federated learning downloads the parameters of the global recommendation model and the global loss function of the global recommendation model (namely the global shared data) from the server. Each client then updates the parameters of its local recommendation model using a fixed learning rate η, the parameters of the global recommendation model and the global loss function of the global recommendation model: each client calculates the product of the learning rate of its local recommendation model and the loss function gradient of the global recommendation model, and calculates the difference between the current parameters of its local recommendation model and that product to obtain the updated parameters of its local recommendation model. Based on the updated parameters of its local recommendation model and the commodity recommendation model parameters w_t, each client updates the gradient f_c of the average loss over its own private data set:

f_c = ∇L_c(x_c, y_c; w_t)

and sends sharable data, including the updated parameters of the local recommendation model, to the server to update the global recommendation model. The local recommendation model may be a non-convex neural network model.
Specifically, the model parameter vector at time t + 1 is obtained by iterating on the model parameter vector at time t: the model parameters w_{t+1} at time t + 1 (i.e. the updated parameters of the local recommendation model) equal the model parameters w_t at time t (i.e. the current parameters of the local recommendation model) minus the fixed learning rate η (i.e. the learning rate of the local recommendation model) times the loss function gradient ∇l(x, y; w) (specifically, the loss function gradient of the global recommendation model):

w_{t+1} ← w_t − η∇l(x, y; w)
Refining the expression of the loss function gradient further, it can be written as the aggregation of the local loss functions (namely the loss functions of the local recommendation models) of the C organization clients participating in federated learning:

∇l(x, y; w) = Σ_{c=1}^{C} (n_c / n) f_c

f_c = ∇F_c(w_t)

wherein f_c is the gradient of the average loss calculated by the c-th organization client over its own private data set with the model parameters w_t at the current time t.
For each organization client c,

f_c = ∇F_c(w_t) = (1/n_c) Σ_{i ∈ D_c} ∇l(x_i, y_i; w_t)

then:

w_{t+1} ← w_t − η Σ_{c=1}^{C} (n_c / n) f_c
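This weighted aggregation — the global gradient as the (n_c / n)-weighted sum of the per-client average gradients — can be checked numerically on a toy squared loss (the data and model here are illustrative, not from the embodiment):

```python
import numpy as np

def avg_grad(X, y, w):
    """Average gradient of the squared loss 0.5*(x·w - y)^2 over one data set."""
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
w = rng.normal(size=3)
# three clients with private data sets of sizes n_c = 4, 6, 10
clients = [(rng.normal(size=(n_c, 3)), rng.normal(size=n_c)) for n_c in (4, 6, 10)]
n = sum(len(y) for _, y in clients)

# weighted sum of local average gradients: sum_c (n_c / n) * f_c
weighted = sum(len(y) / n * avg_grad(X, y, w) for X, y in clients)

# gradient over the pooled data, as if all data were centralized
X_all = np.vstack([X for X, _ in clients])
y_all = np.concatenate([y for _, y in clients])
pooled = avg_grad(X_all, y_all, w)

print(np.allclose(weighted, pooled))  # True
```

The two quantities agree exactly (up to floating point), which is why the clients can keep their data local yet jointly descend the same global loss.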
in one embodiment, each institution client takes appropriate gradient updating measures, the commodity recommendation model is evaluated by utilizing respective local private client information data, and the server returns updated model parameters to all institution clients participating in federal learning in the next round through a shared commodity recommendation model formed by weighted average of the commodity recommendation model parameters local to the server. The entire process will continue for T iterations. The algorithm process is as follows:
the input is as follows: user private data of banks, social software merchants and e-merchants, namely local user behavior data sets of all clients;
the output is: a commodity recommendation model system based on federal learning, namely a global recommendation model;
ServerUpdate: // initialize the commodity recommendation model parameters w_0, namely the initialization parameters of the global recommendation model;
For each round t = 1, 2, …, T:
    Randomly select max(F × C, 1) organization clients from all organization clients participating in federated learning, recorded as the set N_t;
    For each organization client c ∈ N_t (|N_t| = K), execute in parallel:
        w_{t+1}^c ← ClientUpdate(c, w_t);
    w_{t+1} ← Σ_{c ∈ N_t} (n_c / n) w_{t+1}^c;
ClientUpdate(c, w): // executed on the c-th organization client
    Data preprocessing: balance the original data set with SMOTE (Synthetic Minority Oversampling Technique);
    w ← w_t;
    B ← split the local data set D_c into data blocks (batches) of size BatchSize;
    For each local epoch i from 1 to E:
        For each data block (batch) b ∈ B:
            w ← w − η∇l(x, y; w);
    Return the model parameters w and the commodity recommendation accuracy α to the server.
Where F is the fraction of clients participating in each round of federated learning training, B is the set of local data blocks, BatchSize is the batch size, i.e. the number of samples selected per training step, and E is the number of local epochs, i.e. the number of times training runs over all samples in the training set; t is the index of the current round, η is the learning rate, and α is the accuracy of the detection model, which can be calculated through the loss function.
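Putting the pieces together, the server/client loop above can be sketched as a FedAvg-style procedure on a toy linear model (a hedged sketch: the SMOTE preprocessing, encryption and krum selection steps are omitted, and all names are illustrative):

```python
import numpy as np

def client_update(X, y, w, eta=0.1, epochs=2, batch_size=4):
    """ClientUpdate(c, w): run E local epochs of mini-batch SGD on the
    local private data set, starting from the global parameters w."""
    w = w.copy()
    idx = np.arange(len(y))
    for _ in range(epochs):
        np.random.shuffle(idx)
        for start in range(0, len(y), batch_size):
            b = idx[start:start + batch_size]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)  # squared-loss gradient
            w -= eta * grad
    return w

def server_loop(clients, dim, rounds=20, frac=1.0):
    """ServerUpdate: each round, sample max(F*C, 1) clients, run local
    updates (here sequentially) and aggregate by n_c / n weights."""
    w = np.zeros(dim)                     # w_0: initial global parameters
    C = len(clients)
    for _ in range(rounds):
        m = max(int(frac * C), 1)
        picked = np.random.choice(C, m, replace=False)
        n = sum(len(clients[c][1]) for c in picked)
        w = sum(len(clients[c][1]) / n * client_update(*clients[c], w)
                for c in picked)
    return w

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
clients = []
for n_c in (20, 30, 25):                  # three institutions' private data
    X = rng.normal(size=(n_c, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n_c)))
w = server_loop(clients, dim=2)
print(np.round(w, 2))                     # ≈ [ 2. -1.]
```

No raw data ever leaves a client: only the locally updated parameter vectors travel to the server, mirroring the privacy property the embodiment relies on.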
FIG. 4 is a schematic diagram of the main modules of a first client for data recommendation according to one embodiment of the present invention.
As shown in fig. 4, a first client 400 for data recommendation according to an embodiment of the present invention mainly includes: a first local recommendation model training module 401, a global shared data downloading module 402, and a first local recommendation model updating module 403.
The first local recommendation model training module 401 is configured to train a local first local recommendation model using a local user behavior data set, and send the trained first sharable data to the server.
And a global shared data downloading module 402, configured to download global shared data from the server, where the global shared data is calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set includes second sharable data sent to the server by one or more second clients.
And a first local recommendation model updating module 403, configured to update the first local recommendation model with the global shared data to train the global recommendation model, and after the global recommendation model is trained, recommend, to the user, object data that matches the corresponding user behavior data with the global recommendation model.
In one embodiment, the global shared data may include parameters of the global recommendation model and a gradient of a loss function of the global recommendation model; the first local recommendation model update module is specifically configured to: and updating the parameters of the first local recommendation model and the loss function gradient of the first local recommendation model according to the parameters of the global recommendation model and the loss function gradient of the global recommendation model, wherein the first sharable data comprises the parameters of the first local recommendation model and the loss function gradient.
In one embodiment, the first local recommendation model update module updates parameters of the first local recommendation model by: and calculating the product of the learning rate of the first local recommendation model and the gradient of the loss function of the global recommendation model, and calculating the difference value between the current parameter of the first local recommendation model and the product to obtain the updated parameter of the first local recommendation model.
In one embodiment, the local user behavior data set of the first client may be a collection of data that is one of the following: user consumption behavior data, user social behavior data, and data of users' operation behaviors on objects.
For the content already described in the above embodiments, the description of the present embodiment is omitted.
FIG. 5 is a schematic diagram of the main modules of a server for data recommendation according to one embodiment of the present invention.
As shown in fig. 5, a server 500 for data recommendation according to an embodiment of the present invention mainly includes: a sharable data receiving module 501 and a global shared data calculation module 502.
A sharable data receiving module 501, configured to receive sharable data sent by clients in the client set, where the sharable data is obtained by the clients training respective local recommendation models by using respective user behavior data sets.
The global shared data calculation module 502 is configured to calculate global shared data according to each sharable data, where the global shared data is used by the clients to update respective local recommendation models so as to train a global recommendation model, and the global recommendation model is used by the clients to recommend object data matched with corresponding user behavior data to respective users.
In one embodiment, the sharable data sent by the client may include parameters of a local recommendation model of the client, and the global shared data includes parameters of a global recommendation model; the global shared data calculation module is specifically configured to: randomly selecting a partial recommendation model of part or all of the clients in the client set as a target partial recommendation model, and sequencing parameters of each target partial recommendation model according to a first sequencing rule; and removing the maximum parameters of the first quantity and the minimum parameters of the second quantity according to the sorted parameters of the target local recommendation models, and calculating the average value of the remaining parameters to determine the parameters of the global recommendation model.
In one embodiment, the sharable data sent by the client may include parameters of a local recommendation model of the client, and the global shared data includes parameters of a global recommendation model; the global shared data calculation module is specifically configured to: randomly selecting partial recommendation models of part or all of the clients in the client set as target partial recommendation models, and sequencing parameters of the target partial recommendation models according to a second sequencing rule; and calculating the median of the parameters of each sorted target local recommendation model, and calculating the average value of two parameters positioned in the middle in the sorted parameters if the total number of the parameters of each target local recommendation model is an even number so as to determine the parameters of the global recommendation model.
In one embodiment, the global shared data may further include a loss function gradient of the global recommendation model, the global loss function is obtained by performing weighted average on the loss functions of the target local recommendation model, and the loss function gradient of the global recommendation model is calculated based on the global loss function.
In one embodiment, according to a set percentage, the local recommendation models of part or all of the clients in the client set are randomly selected as the target local recommendation model.
For the content already described in the above embodiments, the description of the present embodiment is omitted.
Fig. 6 is a main configuration diagram of a data recommendation system according to an embodiment of the present invention.
As shown in fig. 6, a data recommendation system 600 according to an embodiment of the present invention mainly includes: the first client 601 for data recommendation and the server 602 for data recommendation further include a second client 603, the number of the second clients 603 may be multiple, and one second client 603 is shown in fig. 6 as an example.
For the content already described in the above embodiments, the description of the present embodiment is omitted.
Fig. 7 shows an exemplary system architecture 700 of a data recommendation method, client, server, or data recommendation system to which embodiments of the invention may be applied.
As shown in fig. 7, the system architecture 700 may include terminal devices 701, 702, 703, a network 704, and a server 705. The network 704 serves to provide a medium for communication links between the terminal devices 701, 702, 703 and the server 705. Network 704 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal devices 701, 702, 703 to interact with a server 705 over a network 704, to receive or send messages or the like. The terminal devices 701, 702, 703 may have installed thereon various communication client applications, such as a shopping-like application, a web browser application, a search-like application, an instant messaging tool, a mailbox client, social platform software, etc. (by way of example only).
The terminal devices 701, 702, 703 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 705 may be a server providing various services, such as a background management server (for example only) providing support for shopping websites browsed by users using the terminal devices 701, 702, 703. The backend management server may analyze and perform other processing on the received data such as the product information query request, and feed back a processing result (for example, target push information, product information — just an example) to the terminal device.
It should be noted that the data recommendation method provided in the embodiment of the present invention may be executed by the server 705 or the terminal devices 701, 702, and 703, and accordingly, the server 705 may be used as a server for data recommendation in the embodiment of the present invention, clients (such as a first client and a second client) for data recommendation in the embodiment of the present invention may be disposed in the terminal devices 701, 702, and 703, and the server 705 and the terminal devices 701, 702, and 703 may form a data recommendation system in the embodiment of the present invention.
It should be understood that the number of terminal devices, networks, and servers in fig. 7 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 8, shown is a block diagram of a computer system 800 suitable for use with a terminal device implementing an embodiment of the present invention. The terminal device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 8, the computer system 800 includes a Central Processing Unit (CPU)801 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the system 800 are also stored. The CPU 801, ROM 802, and RAM 803 are connected to each other via a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
The following components are connected to the I/O interface 805: an input portion 806 including a keyboard, a mouse, and the like; an output portion 807 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 808 including a hard disk and the like; and a communication portion 809 including a network interface card such as a LAN card, a modem, or the like. The communication portion 809 performs communication processing via a network such as the internet. A drive 810 is also connected to the I/O interface 805 as necessary. A removable medium 811 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 810 as necessary, so that a computer program read out therefrom is installed into the storage portion 808 as necessary.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 809 and/or installed from the removable medium 811. The computer program executes the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 801.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor includes a first local recommendation model training module, a global shared data download module, and a first local recommendation model update module. Where the names of these modules do not in some cases constitute a limitation on the modules themselves, for example, the first local recommendation model training module may also be described as "a module for training a local first local recommendation model using a local user behavior data set and sending the trained first shareable data to the server".
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be separate and not incorporated into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to comprise: the first client side trains a local first local recommendation model by using a local user behavior data set and sends first sharable data obtained by training to the server; the method comprises the steps that a first client downloads global shared data from a server, the global shared data are obtained through calculation of the server according to first sharable data and a second sharable data set, and the second sharable data set comprises one or more second sharable data sent to the server by a second client; and the first client updates the first local recommendation model by using the global shared data so as to train the global recommendation model, and recommends object data matched with the corresponding user behavior data to the user by using the global recommendation model after the training of the global recommendation model is completed.
According to the technical scheme of the embodiments of the present invention, a first client trains a local first local recommendation model using a local user behavior data set and sends first sharable data obtained by the training to a server; the first client downloads global shared data from the server, the global shared data being calculated by the server according to the first sharable data and a second sharable data set, the second sharable data set comprising second sharable data sent to the server by one or more second clients; and the first client updates the first local recommendation model using the global shared data so as to train a global recommendation model, and after the training of the global recommendation model is completed, uses the global recommendation model to recommend to the user object data matched with the corresponding user behavior data. The method can solve the problem that the user's feature data is limited to a single dimension, improve the accuracy of commodity recommendation, ensure the safety of private data when the federated learning system is maliciously attacked, and improve the robustness of federated learning.
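The round summarized above (local training, upload of sharable data, server-side aggregation, download and local update) can be sketched as follows. This is an illustrative sketch only, not part of the claimed method; the function names are hypothetical, and plain coordinate-wise averaging stands in for the aggregation rules detailed in the claims below.

```python
# Hypothetical sketch of one federated round: clients upload shareable
# parameter vectors, the server aggregates them into global shared data,
# and each client adopts the aggregated result.

def server_aggregate(shared_params):
    """Average the clients' uploaded parameter vectors coordinate-wise."""
    n = len(shared_params)
    dim = len(shared_params[0])
    return [sum(p[i] for p in shared_params) / n for i in range(dim)]

def client_update(global_params):
    """A client replaces its local parameters with the downloaded global ones."""
    return list(global_params)

# Two clients upload locally trained parameters; the server averages them.
uploads = [[1.0, 2.0], [3.0, 4.0]]
global_params = server_aggregate(uploads)   # averaged global shared data
local_params = client_update(global_params)
```

In a real deployment each client would continue local training from the downloaded parameters rather than merely copying them; the copy here only marks where the update step occurs in the round.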
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (14)

1. A method for recommending data, comprising:
a first client trains a local first local recommendation model using a local user behavior data set, and sends first sharable data obtained by the training to a server;
the first client downloads global shared data from the server, wherein the global shared data are calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set comprises second sharable data sent to the server by one or more second clients;
and the first client updates the first local recommendation model using the global shared data so as to train a global recommendation model, and after the training of the global recommendation model is completed, uses the global recommendation model to recommend to a user object data matched with the corresponding user behavior data.
2. The method of claim 1, wherein the global shared data comprises parameters of the global recommendation model and a loss function gradient of the global recommendation model;
the first client updating the first local recommendation model with the global shared data, including:
the first client updates the parameters of the first local recommendation model and the loss function gradient of the first local recommendation model according to the parameters of the global recommendation model and the loss function gradient of the global recommendation model, wherein the first sharable data comprises the parameters and the loss function gradient of the first local recommendation model.
3. The method of claim 1, wherein the first client updates the parameters of the first local recommendation model by:
and the first client calculates the product of the learning rate of the first local recommendation model and the loss function gradient of the global recommendation model, and subtracts the product from the current parameters of the first local recommendation model to obtain the updated parameters of the first local recommendation model.
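The update in claim 3 is a standard gradient step: the new parameter equals the current parameter minus the product of the local learning rate and the global loss function gradient. A minimal sketch, with hypothetical names:

```python
def update_parameters(current_params, global_gradient, learning_rate):
    """Claim 3 update: new_param = current_param - learning_rate * gradient."""
    return [p - learning_rate * g
            for p, g in zip(current_params, global_gradient)]

# With learning rate 0.5 and global gradient [1.0, 2.0], the parameters
# [2.0, 4.0] become [1.5, 3.0].
updated = update_parameters([2.0, 4.0], [1.0, 2.0], 0.5)
```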
4. The method of claim 1, wherein the user behavior data set local to the first client is a set of one of the following: data of user consumption behaviors, data of user social behaviors, and data of user operation behaviors on objects.
5. A method for recommending data, comprising:
the method comprises the steps that a server receives sharable data sent by clients in a client set, wherein the sharable data are obtained by training respective local recommendation models by the clients through respective user behavior data sets;
and the server calculates global shared data according to the sharable data, the global shared data is used for updating respective local recommendation models of the clients so as to train a global recommendation model, and the global recommendation model is used for recommending object data matched with the corresponding user behavior data to respective users by the clients.
6. The method of claim 5, wherein the shareable data sent by the client comprises parameters of a local recommendation model of the client, and wherein the global shared data comprises parameters of the global recommendation model;
the server calculating the global shared data according to each sharable data comprises:
randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting the parameters of the target local recommendation models according to a first sorting rule;
removing a first quantity of the largest parameters and a second quantity of the smallest parameters from the sorted parameters, and averaging the remaining parameters to determine the parameters of the global recommendation model.
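One common reading of claim 6 is a coordinate-wise trimmed mean: for each parameter position, the values across the selected models are sorted, the first quantity of largest and second quantity of smallest values are dropped, and the rest are averaged, which limits the influence of any single malicious client. An illustrative sketch under that assumption (function names are hypothetical):

```python
def trimmed_mean(values, n_max, n_min):
    """Sort values, drop the n_max largest and n_min smallest, average the rest."""
    s = sorted(values)
    kept = s[n_min:len(s) - n_max]
    return sum(kept) / len(kept)

def aggregate_trimmed(param_sets, n_max=1, n_min=1):
    """Apply the trimmed mean to each parameter position across the models."""
    dim = len(param_sets[0])
    return [trimmed_mean([p[i] for p in param_sets], n_max, n_min)
            for i in range(dim)]

# Four clients upload one parameter each; the outlier 100.0 (e.g. from a
# poisoned model) and the smallest value are discarded before averaging.
robust = aggregate_trimmed([[1.0], [2.0], [3.0], [100.0]])
```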
7. The method of claim 5, wherein the shareable data sent by the client comprises parameters of a local recommendation model of the client, and wherein the global shared data comprises parameters of the global recommendation model;
the server calculating the global shared data according to each sharable data comprises:
randomly selecting the local recommendation models of some or all of the clients in the client set as target local recommendation models, and sorting the parameters of the target local recommendation models according to a second sorting rule;
taking the median of the sorted parameters of the target local recommendation models as the parameters of the global recommendation model, wherein if the total number of parameters is even, the average of the two middle parameters is calculated.
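Claim 7 describes the usual median rule, including the even-count case it spells out. An illustrative sketch (the function name is hypothetical):

```python
def coordinate_median(values):
    """Median as described in claim 7: the middle value of the sorted
    parameters, or the average of the two middle values if the count is even."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2
```

Like the trimmed mean of claim 6, the median ignores extreme uploaded values, so a single client sending wildly wrong parameters cannot shift the global model arbitrarily.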
8. The method according to claim 6 or 7, wherein the global shared data further comprises a loss function gradient of the global recommendation model, a global loss function is obtained as a weighted average of the loss functions of the target local recommendation models, and the loss function gradient of the global recommendation model is calculated based on the global loss function.
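The weighted average of claim 8 can be sketched as follows. The claim does not specify how the weights are chosen (per-client data sizes are one common choice), so they appear here as an explicit hypothetical argument:

```python
def global_loss(client_losses, weights):
    """Weighted average of per-client loss values, as claim 8 describes."""
    return sum(l * w for l, w in zip(client_losses, weights)) / sum(weights)

# Equal weights reduce to a plain average; unequal weights tilt the global
# loss toward the more heavily weighted client.
equal = global_loss([1.0, 3.0], [1.0, 1.0])
tilted = global_loss([2.0, 4.0], [3.0, 1.0])
```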
9. The method according to claim 6 or 7, wherein the local recommendation models of some or all of the clients in the client set are randomly selected as the target local recommendation model according to a set percentage.
10. A first client for data recommendation, comprising:
a first local recommendation model training module, configured to train a local first local recommendation model using a local user behavior data set, and to send first sharable data obtained by the training to the server;
a global shared data downloading module, configured to download global shared data from the server, where the global shared data is calculated by the server according to the first sharable data and a second sharable data set, and the second sharable data set includes second sharable data sent to the server by one or more second clients;
and the first local recommendation model updating module is used for updating the first local recommendation model by using the global shared data so as to train a global recommendation model, and recommending object data matched with corresponding user behavior data to a user by using the global recommendation model after the global recommendation model is trained.
11. A server for data recommendation, comprising:
the sharable data receiving module is used for receiving sharable data sent by the clients in the client set, wherein the sharable data are obtained by training respective local recommendation models by the clients by using respective user behavior data sets;
and the global shared data calculation module is used for calculating global shared data according to the sharable data, the global shared data is used for updating respective local recommendation models of the clients so as to train a global recommendation model, and the global recommendation model is used for recommending object data matched with the corresponding user behavior data to respective users by the clients.
12. A data recommendation system, comprising: a plurality of clients for data recommendation and a server for data recommendation as claimed in claim 11, wherein said plurality of clients comprises the first client of claim 10.
13. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-9.
14. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-9.
CN202110252347.5A 2021-03-08 2021-03-08 Data recommendation method, related device and data recommendation system Pending CN113761350A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110252347.5A CN113761350A (en) 2021-03-08 2021-03-08 Data recommendation method, related device and data recommendation system


Publications (1)

Publication Number Publication Date
CN113761350A 2021-12-07

Family

ID=78786688

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625977A (en) * 2022-05-16 2022-06-14 深圳市万物云科技有限公司 Service recommendation method and device based on federal learning and related medium
WO2023226947A1 (en) * 2022-05-23 2023-11-30 阿里巴巴达摩院(杭州)科技有限公司 Terminal-cloud collaborative recommendation system and method, and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160371589A1 (en) * 2015-06-17 2016-12-22 Yahoo! Inc. Systems and methods for online content recommendation
CN108985899A (en) * 2018-07-13 2018-12-11 合肥工业大学 Recommended method, system and storage medium based on CNN-LFM model
CN110069715A (en) * 2019-04-29 2019-07-30 腾讯科技(深圳)有限公司 A kind of method of information recommendation model training, the method and device of information recommendation
US20190385043A1 (en) * 2018-06-19 2019-12-19 Adobe Inc. Asynchronously training machine learning models across client devices for adaptive intelligence
CN112329940A (en) * 2020-11-02 2021-02-05 北京邮电大学 Personalized model training method and system combining federal learning and user portrait
CN112446507A (en) * 2020-12-01 2021-03-05 平安科技(深圳)有限公司 Recommendation model training method and device, terminal device and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination