CN111368196A - Model parameter updating method, device, equipment and readable storage medium - Google Patents

Model parameter updating method, device, equipment and readable storage medium

Info

Publication number
CN111368196A
CN111368196A (application CN202010142907.7A)
Authority
CN
China
Prior art keywords
terminal
data
updating
model parameter
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010142907.7A
Other languages
Chinese (zh)
Inventor
黄安埠
刘洋
陈天健
杨强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN202010142907.7A priority Critical patent/CN111368196A/en
Publication of CN111368196A publication Critical patent/CN111368196A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/067Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/03Credit; Loans; Processing thereof

Abstract

The invention discloses a model parameter updating method, device, equipment and readable storage medium, relating to the field of financial technology. The method comprises the following steps: a second terminal receives encrypted first feature data sent by a first terminal and acquires second feature data and label data corresponding to the second feature data, where the first feature data is obtained by the first terminal and at least one terminal of the same type through horizontal federated learning; a residual is calculated according to the first feature data, the second feature data and the label data, and the residual is sent to the first terminal, so that the first terminal can calculate a first gradient value according to the residual and update a first model parameter corresponding to the first feature data according to the first gradient value; and a second gradient value is calculated according to the residual, and a second model parameter corresponding to the second feature data is updated according to the second gradient value. The invention realizes model training that integrates the data of two different terminals, so as to improve the performance of the trained model.

Description

Model parameter updating method, device, equipment and readable storage medium
Technical Field
The invention relates to the technical field of data processing of financial technology (Fintech), in particular to a method, a device, equipment and a readable storage medium for updating model parameters.
Background
With the development of computer technology, more and more technologies are applied in the financial field, and the traditional financial industry is gradually shifting to financial technology (Fintech). Data processing technology is no exception; however, the financial industry's requirements for security and real-time performance impose higher demands on data processing technology.
A typical application is a recommendation system. In many cases, however, label data is difficult to obtain directly. For example, in an e-commerce recommendation scenario, a user does not directly state whether he or she likes a certain item; we can only rely on the user's behavior and define labels according to artificial rules. For example, if the user purchases an item, we consider that the user likes it (label 1); conversely, if the user only browses without purchasing, we consider that the user dislikes it (label 0). But label data defined by such artificial rules is not real after all, and may sometimes differ considerably from the actual situation.
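The artificial labelling rule described in this passage can be sketched as follows. The event names and helper function are illustrative, not part of the patent:

```python
# Toy illustration of the manual labelling rule: a purchase is treated
# as "like" (label 1), a browse without a purchase as "dislike" (label 0).
def behavior_to_label(events):
    """Derive a binary label from one user's behavior log for one item."""
    return 1 if "purchase" in events else 0

labels = [behavior_to_label(e) for e in (["browse", "purchase"], ["browse"])]
```

As the passage notes, labels derived this way are only proxies for the user's true preference.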
Currently, the combined application of the mobile phone side and the enterprise side is a trend in the AI field. With the development of the mobile internet, the mobile phone has become the main tool through which people acquire information, so the mobile phone side holds very rich user behavior data; but as described above, the authenticity of its data labels is not high, so a large amount of data preprocessing work is needed to make use of mobile-side user behavior. On the enterprise side, by contrast, user information data is relatively scarce, but it is very sensitive and important privacy data, such as a bank's personal credit investigation and loan information, and its authenticity is much higher than that of the user behavior data on the mobile phone side. Therefore, if this important and real data could be combined with the data on the mobile phone side, the performance of the recommendation model could be greatly improved.

It can thus be seen that how to combine the data of two different terminals for model training, so as to improve the performance of the trained model, is an urgent problem to be solved.
Disclosure of Invention
The invention mainly aims to provide a method, a device and equipment for updating model parameters and a readable storage medium, and aims to solve the technical problem of how to combine data of two different terminals to train a model so as to improve the performance of the trained model.
In order to achieve the above object, the present invention provides a method for updating model parameters, which comprises the steps of:
the method comprises the steps that a second terminal receives encrypted first feature data sent by a first terminal, and obtains second feature data and label data corresponding to the second feature data, wherein the first feature data are obtained by the first terminal and at least one terminal of the same type through horizontal federated learning, and the first terminal and the second terminal are different types of terminals;
calculating to obtain a residual according to the first characteristic data, the second characteristic data and the label data, and sending the residual to the first terminal, so that the first terminal can calculate a first gradient value according to the residual, and update a first model parameter corresponding to the first characteristic data according to the first gradient value;
and calculating a second gradient value according to the residual error, and updating a second model parameter corresponding to the second characteristic data according to the second gradient value.
Preferably, after the steps of receiving, by the second terminal, the encrypted first feature data sent by the first terminal and acquiring the second feature data and the tag data corresponding to the second feature data, the method further includes:
acquiring an encrypted first characteristic parameter corresponding to the first characteristic data and acquiring a second characteristic parameter corresponding to the second characteristic data, wherein the device identifications corresponding to the first characteristic data and the second characteristic data are the same;
calculating a loss value according to the first characteristic data, the second characteristic data, the label data, the first characteristic parameter and the second characteristic parameter;
sending the loss value to a third terminal, so that the third terminal can judge whether the first model parameter and the second model parameter meet the update stop condition according to the loss value, and returning a prompt message;
and if the first model parameter and the second model parameter are determined not to meet the updating stop condition according to the prompt message, returning to the step of receiving the encrypted first characteristic data sent by the first terminal and acquiring the second characteristic data and the label data by the second terminal.
Preferably, the step of sending the loss value to a third terminal, so that the third terminal determines whether the first model parameter and the second model parameter meet the update stop condition according to the loss value includes:
and sending the loss value to a third terminal, so that the third terminal obtains a loss history value when the first model parameter and the second model parameter are updated last time after receiving the loss value, determines that the first model parameter and the second model parameter meet an update stop condition when detecting that an absolute value of a difference value between the loss value and the loss history value is smaller than a preset threshold value, and determines that the first model parameter and the second model parameter do not meet the update stop condition when detecting that the absolute value is larger than or equal to the preset threshold value.
Preferably, before the step of calculating the second gradient value according to the residual error, the method further includes:
acquiring training data corresponding to the second feature data, and acquiring a third feature parameter corresponding to the second feature data;
the step of calculating a second gradient value from the residual error comprises:
and calculating a second gradient value according to the training data, the third characteristic parameter and the residual error.
Preferably, the step of updating the second model parameter corresponding to the second feature data according to the second gradient value includes:
acquiring an updating coefficient corresponding to the second terminal;
and updating a second model parameter corresponding to the second characteristic data according to the second gradient value and the updating coefficient.
Preferably, the step of calculating a residual according to the first feature data, the second feature data and the tag data includes:
calculating a difference value between the second characteristic data and the label data corresponding to the second characteristic data, and encrypting the difference value to obtain an encrypted difference value;
calculating to obtain an encrypted residual error according to the encrypted difference value and the encrypted first characteristic data;
the step of calculating a second gradient value according to the residual error, and updating a second model parameter corresponding to the second feature data according to the second gradient value includes:
calculating according to the encrypted residual error to obtain an encrypted second gradient value, sending the encrypted second gradient value to the third terminal so that the third terminal can decrypt the encrypted second gradient value, and returning the decrypted second gradient value;
and receiving the decrypted second gradient value returned by the third terminal, and updating the second model parameter corresponding to the second characteristic data according to the decrypted second gradient value.
Preferably, after the step of calculating a second gradient value according to the residual error and updating a second model parameter corresponding to the second feature data according to the second gradient value, the method further includes:
taking the updated second model parameter as the model parameter of the prediction model in the second terminal, so as to obtain a loan prediction model;
after receiving data to be predicted, inputting the data to be predicted into the loan prediction model to obtain loan probability corresponding to the data to be predicted, and pushing information according to the loan probability.
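A minimal sketch of this inference step: score the data to be predicted with the updated second model parameters and convert the score to a loan probability. The logistic link and all values here are assumptions for illustration; the patent does not fix the form of the prediction model:

```python
import math

# Hypothetical inference with the updated second model parameters:
# compute the linear score, squash it to a probability, and decide
# whether to push information based on that probability.
def loan_probability(features, weights):
    score = sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-score))

p = loan_probability([1.0, 2.0], [0.3, -0.1])  # score = 0.3 - 0.2 = 0.1
push_offer = p > 0.5                           # push information if likely
```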
In addition, to achieve the above object, the present invention provides an updating apparatus for model parameters, including:
a receiving module, configured to receive encrypted first characteristic data sent by a first terminal, where the first characteristic data is obtained by the first terminal and at least one terminal of the same type through horizontal federated learning, and the first terminal and the second terminal are different types of terminals;
the acquisition module is used for acquiring second characteristic data and label data corresponding to the second characteristic data;
the calculation module is used for calculating to obtain a residual error according to the first characteristic data, the second characteristic data and the label data;
a sending module, configured to send the residual to the first terminal, so that the first terminal calculates a first gradient value according to the residual, and updates a first model parameter corresponding to the first feature data according to the first gradient value;
the calculation module is further used for calculating a second gradient value according to the residual error;
and the updating module is used for updating the second model parameter corresponding to the second characteristic data according to the second gradient value.
In addition, in order to achieve the above object, the present invention further provides a model parameter updating device, which includes a memory, a processor, and a model parameter updating program stored in the memory and operable on the processor; when the model parameter updating program is executed by the processor, the steps of the model parameter updating method described above are implemented.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon an update program of model parameters, which when executed by a processor, implements the steps of the update method of model parameters as described above.
In the invention, the second terminal performs joint training by combining the data encrypted by the first terminal and updates the model parameters in the prediction model. On the basis of ensuring the data privacy of the first terminal, that is, without the first terminal revealing its own original data, the second terminal can integrate its own data with the data encrypted by the first terminal for joint training and update the model parameters, that is, update the corresponding prediction model. This realizes model training that integrates the data of two different terminals, improving the performance of the trained model.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating a first embodiment of a method for updating model parameters according to the present invention;
FIG. 2 is a flowchart illustrating a second embodiment of a method for updating model parameters according to the present invention;
FIG. 3 is a flow chart illustrating a model parameter update process according to an embodiment of the present invention;
FIG. 4 is a block diagram of a preferred embodiment of an apparatus for updating model parameters according to the present invention;
fig. 5 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a method for updating model parameters, and referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of the method for updating model parameters of the invention.
While a logical order is shown in the flow chart, in some cases, the steps shown or described may be performed in an order different than that shown.
The updating method of the model parameters is applied to a server or a terminal. The terminal may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, or a Personal Digital Assistant (PDA), and a fixed terminal such as a digital TV or a desktop computer. In the embodiments of the model parameter updating method, for convenience of description, the execution subject is omitted from the description of each step. The updating method of the model parameters comprises the following steps:
step S10, the second terminal receives the encrypted first feature data sent by the first terminal, and obtains second feature data and tag data corresponding to the second feature data, where the first feature data is obtained by the first terminal and at least one terminal of the same type through horizontal federal learning, and the first terminal and the second terminal are different types of terminals.
And the second terminal receives the encrypted first characteristic data sent by the first terminal. Specifically, after the first terminal obtains the first feature data through encryption, the first terminal sends the encrypted first feature data to the second terminal. Or when the second terminal needs to update the second model parameter, the second terminal generates an acquisition request and sends the acquisition request to the first terminal. After the first terminal receives the acquisition request, the first terminal performs horizontal federal learning with at least one terminal of the same type to obtain first characteristic data, and encrypts the first characteristic data to obtain encrypted first characteristic data. And the first terminal sends the encrypted first characteristic parameter to the second terminal. In this embodiment, the first terminal and the second terminal are different types of terminals, and if the first terminal is a mobile phone terminal, a terminal performing horizontal federal learning with the first terminal is also a mobile phone terminal, at this time, feature data of the mobile phone terminal is rich, but tag data is less, and data authenticity is not high; the second terminal is an enterprise terminal, and the enterprise terminal has less data but high data authenticity. The first terminal may be one terminal or a plurality of terminals. In both the first terminal and the second terminal, there are corresponding prediction models, wherein the prediction models may be linear regression models, machine learning models, deep learning models, or the like. For convenience of distinction, the prediction model in the first terminal is referred to as a first prediction model, and the prediction model in the second terminal is referred to as a second prediction model.
Specifically, the first terminal calculates the first feature data according to the first model parameters of the first prediction model and the first training data. If the first model parameter is recorded as $w^A$, the first training data as $x_i^A$, and the first feature data as $u_i^A$, the formula for the first terminal to calculate the first feature data can be expressed as:

$$u_i^A = w^A x_i^A$$

After the first terminal obtains the first feature data through calculation, the first terminal encrypts the first feature data by adopting a preset encryption algorithm, obtains the encrypted first feature data, and sends it to the second terminal. In the present embodiment, $[[a]]$ represents $a$ after encryption; for example, the encrypted first feature data is represented as $[[u_i^A]]$.
In this embodiment, the encryption algorithm in the first terminal is not limited; the first feature data may be encrypted with an algorithm such as DES (Data Encryption Standard), IDEA (International Data Encryption Algorithm), or AES (Advanced Encryption Standard). It can be understood that, by encrypting the first feature data before sending it to the second terminal, the first terminal ensures the privacy, that is, the security, of its data.
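As a concrete sketch of the first terminal's side of this step, the toy code below computes $u_i^A = w^A x_i^A$ per sample and then "encrypts" the result before sending it. A simple additive mask stands in for the preset encryption algorithm so the arithmetic stays visible; it offers no real security, and all values are illustrative:

```python
# Toy stand-in for encryption: an additive mask assumed to be known
# only to the trusted third terminal. A deployment would use a real
# preset encryption algorithm as the embodiment describes.
MASK = 12345.0

def encrypt(v):   # toy stand-in for [[v]]
    return v + MASK

def decrypt(v):
    return v - MASK

w_A = [0.5, -0.2]               # first model parameters (illustrative)
x_A = [[1.0, 2.0], [3.0, 0.0]]  # first training data, one row per sample
u_A = [sum(w * x for w, x in zip(w_A, row)) for row in x_A]
enc_u_A = [encrypt(u) for u in u_A]  # what is actually sent
```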
The second terminal acquires the second feature data and the label data corresponding to the second feature data. It should be noted that the process of obtaining the second feature data by the second terminal is similar to the process of obtaining the first feature data by the first terminal: the second terminal obtains the second model parameter and the second training data, and then calculates the product between the second model parameter and the second training data to obtain the second feature data. If the second model parameter is recorded as $w^B$, the second training data as $x_i^B$, and the second feature data as $u_i^B$, the formula for the second terminal to calculate the second feature data can be expressed as:

$$u_i^B = w^B x_i^B$$
It should be noted that, for convenience of description, this embodiment takes two model parameters as an example. In a specific application process, the number of model parameters in the first prediction model and the second prediction model can be set according to specific needs.

The label data corresponds to the second training data: each piece of second training data has corresponding label data. In this embodiment, only the second training data in the second terminal has label data; the first training data in the first terminal does not. Specifically, if the second training data indicates that the user likes item A, the label data is "1"; if the user dislikes item A, the label data is "0". The expression form of the label data is not limited in this embodiment.
Step S20, obtaining a residual error according to the first feature data, the second feature data, and the label data, and sending the residual error to the first terminal, so that the first terminal calculates a first gradient value according to the residual error, and updates a first model parameter corresponding to the first feature data according to the first gradient value.
After the second terminal obtains the encrypted first feature data, the second feature data, and the label data, the second terminal calculates a residual according to them and sends the residual to the first terminal. The residual transmitted from the second terminal to the first terminal is the encrypted residual. When the first terminal receives the encrypted residual, the first terminal calculates a first gradient value according to the encrypted residual, the first training data, and the first model parameter.
Specifically, the first terminal calculates a first product between the encrypted residual and the first training data, calculates a second product between the first model parameter and the gradient coefficient, encrypts the second product to obtain an encrypted second product, and calculates the encrypted first gradient value according to the first product and the encrypted second product. If the encrypted residual is recorded as $[[d_i]]$, the gradient coefficient as $\lambda$ (the magnitude of the gradient coefficient can be set according to specific needs and is not specifically limited in this embodiment), and the encrypted first gradient value as $[[g^A]]$, the formula for calculating the encrypted first gradient value can be expressed as:

$$[[g^A]] = \sum_i [[d_i]]\, x_i^A + [[\lambda w^A]]$$

After the first terminal calculates the encrypted first gradient value, the first terminal sends it to the third terminal. It should be noted that the third terminal is a terminal trusted by both the first terminal and the second terminal; it knows the encryption algorithms with which the first and second terminals encrypt their data, and can decrypt the encrypted data they send. When the third terminal receives the encrypted first gradient value sent by the first terminal, the third terminal decrypts it to obtain the decrypted first gradient value $g^A$ and sends it to the first terminal. When the first terminal receives the decrypted first gradient value, the first terminal obtains a first update coefficient, recorded as $\eta_1$ (the size of the first update coefficient is not limited in this embodiment and can be set according to specific needs), and updates the first model parameter according to the decrypted first gradient value and the first update coefficient. If the updated first model parameter is recorded as $w^A{}'$, the formula for the first terminal to calculate the updated first model parameter can be expressed as:

$$w^A{}' = w^A - \eta_1 g^A$$
as can be seen from the above formula, in the present embodiment, the model parameters are updated by using a gradient descent algorithm.
And step S30, calculating a second gradient value according to the residual error, and updating a second model parameter corresponding to the second characteristic data according to the second gradient value.
After the second terminal calculates the residual, the second terminal calculates a second gradient value according to the residual and updates the second model parameter corresponding to the second feature data according to the second gradient value, that is, updates the second model parameter of the second prediction model to obtain the updated second model parameter. The second terminal then applies the updated second model parameter to the second prediction model, obtaining an updated second prediction model for data prediction.
Further, the step of calculating a residual according to the first feature data, the second feature data, and the tag data includes:
step a, calculating a difference value between the second characteristic data and the label data corresponding to the second characteristic data, and encrypting the difference value to obtain an encrypted difference value.
And b, calculating to obtain an encrypted residual error according to the encrypted difference value and the encrypted first characteristic data.
Specifically, the second terminal calculates a difference between the second feature data and the tag data corresponding to the second feature data, and encrypts the calculated difference to obtain an encrypted difference. It can be understood that the label data corresponding to the second feature data is the label data corresponding to the second training data. In the present embodiment, the encryption algorithm by which the second terminal encrypts the difference is not limited. In this embodiment, in order to improve the updating efficiency of the model parameters, the encryption algorithm used by the second terminal and the encryption algorithm used by the first terminal to encrypt the data are the same. In other embodiments, the encryption algorithms used by the first terminal and the second terminal to encrypt the data may also be different.
After the second terminal obtains the encrypted difference value, the second terminal calculates the encrypted residual according to the encrypted difference value and the encrypted first feature data. If the label data is recorded as $y_i$, the calculation formula for the second terminal to calculate the encrypted residual can be expressed as:

$$[[d_i]] = [[u_i^B - y_i]] + [[u_i^A]]$$
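Steps a and b can be sketched with a toy additively homomorphic ciphertext class that supports adding two ciphertexts, which is all this computation needs. The class and all values are illustrative and provide no real security:

```python
# Toy additively homomorphic ciphertext [[v]]; a deployment would use
# a scheme such as Paillier that supports ciphertext addition.
class Enc:
    def __init__(self, v):
        self.v = v
    def __add__(self, other):
        return Enc(self.v + other.v)

def encrypted_residual(u_B, y, enc_u_A):
    """Encrypt the difference u_i^B - y_i, then add the encrypted first
    feature data to obtain the encrypted residual [[d_i]]."""
    return [Enc(u - t) + e for u, t, e in zip(u_B, y, enc_u_A)]

enc_u_A = [Enc(0.1), Enc(1.5)]  # received from the first terminal
u_B, y = [0.8, 0.2], [1.0, 0.0]
enc_d = encrypted_residual(u_B, y, enc_u_A)  # d = [-0.1, 1.7]
```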
the step S30 includes:
and c, calculating according to the encrypted residual error to obtain an encrypted second gradient value, sending the encrypted second gradient value to the third terminal so that the third terminal can decrypt the encrypted second gradient value, and returning the decrypted second gradient value.
And d, receiving the decrypted second gradient value returned by the third terminal, and updating the second model parameter corresponding to the second characteristic data according to the decrypted second gradient value.
And after the second terminal calculates the encrypted residual error, the second terminal calculates an encrypted second gradient value according to the encrypted residual error and sends the encrypted second gradient value to the third terminal. And after the third terminal receives the encrypted second gradient value, the third terminal decrypts the encrypted second gradient value and sends the decrypted second gradient value to the second terminal. And when the second terminal receives the decrypted second gradient value sent by the third terminal, the second terminal updates the second model parameter corresponding to the second characteristic data according to the decrypted second gradient value.
Further, the method for updating the model parameters further comprises:
and e, acquiring training data corresponding to the second characteristic data and acquiring a third characteristic parameter corresponding to the second characteristic data.
The step of calculating a second gradient value from the residual error comprises:
and f, calculating a second gradient value according to the training data, the third characteristic parameter and the residual error.
After the second terminal obtains the second feature data, it obtains the training data corresponding to the second feature data, that is, the second training data, and obtains the third feature parameter corresponding to the second feature data. Specifically, the second terminal obtains a gradient coefficient, computes the product of the gradient coefficient and the second model parameter, records it as the third product, and encrypts the third product. It then computes the product of the second training data and the encrypted residual, records it as the fourth product, and obtains the encrypted second gradient value from the fourth product and the encrypted third product. Denote the encrypted second gradient value by [[g_B]], the gradient coefficient by λ, the second model parameter by θ_B, and the third feature parameter (the encrypted third product) by [[λ·θ_B]]. With x_i^B the second training data of sample i and [[d_i]] the encrypted residual, the second terminal may compute the encrypted second gradient value as:

[[g_B]] = Σ_{i∈D} [[d_i]]·x_i^B + [[λ·θ_B]]
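For ease of understanding, the encrypted-gradient step may be sketched as follows. The `Enc` class is only a placeholder standing in for a real additively homomorphic cipher such as Paillier (it mimics the ciphertext-plus-ciphertext and ciphertext-times-scalar interface without actually encrypting); all names and numbers are illustrative.

```python
class Enc:
    """Placeholder ciphertext: supports ct + ct and ct * scalar, as Paillier does."""
    def __init__(self, v):
        self.v = v
    def __add__(self, other):
        return Enc(self.v + other.v)
    def __mul__(self, scalar):
        return Enc(self.v * scalar)
    __rmul__ = __mul__

def encrypted_second_gradient(enc_residuals, x_b, theta_b, lam):
    """[[g_B]] = sum_i [[d_i]] * x_i^B  (fourth products)  +  [[lam * theta_B]] (third product)."""
    grad = []
    for j in range(len(theta_b)):           # one gradient entry per feature
        acc = Enc(0.0)
        for d_i, x_i in zip(enc_residuals, x_b):
            acc = acc + d_i * x_i[j]        # homomorphic accumulate of d_i * x_ij
        grad.append(acc + Enc(lam * theta_b[j]))
    return grad

# tiny example: 2 samples, 2 features on the second terminal
enc_d = [Enc(0.5), Enc(-0.25)]
x_b = [[1.0, 2.0], [3.0, 4.0]]
g = encrypted_second_gradient(enc_d, x_b, theta_b=[0.1, 0.2], lam=0.01)
```

In a real deployment the second terminal never sees the plaintext of `g`; it forwards the ciphertexts to the third terminal for decryption, as described above.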
further, the step of updating the second model parameter corresponding to the second feature data according to the second gradient value includes:
and g, acquiring an updating coefficient corresponding to the second terminal.
And h, updating the second model parameter corresponding to the second characteristic data according to the second gradient value and the updating coefficient.
Specifically, it should be noted that in updating the second model parameter, the second gradient value used by the second terminal is the decrypted second gradient value. The second terminal obtains the update coefficient corresponding to the second model parameter, that is, the second update coefficient, and updates the second model parameter corresponding to the second feature data according to the decrypted second gradient value and the second update coefficient. Denote the second update coefficient by η_2, the decrypted second gradient value by g_B, and the updated second model parameter by θ_B′. The second terminal may compute the updated second model parameter as:

θ_B′ = θ_B − η_2·g_B
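The update rule above is plain gradient descent and may be sketched as follows, with `eta` standing in for the second update coefficient η_2 (all values illustrative):

```python
def update_params(theta, grad, eta):
    """Gradient-descent step: theta_j <- theta_j - eta * g_j for every feature j."""
    return [t - eta * g for t, g in zip(theta, grad)]

theta_b = [0.10, 0.20]          # current second model parameters
g_b = [0.5, -0.3]               # decrypted second gradient values
new_theta = update_params(theta_b, g_b, eta=0.1)
```

The first terminal applies the same rule to its own parameters with its own update coefficient.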
In this embodiment, the second terminal performs joint training with the data encrypted by the first terminal and updates the model parameters of the prediction model. On the basis that the data privacy of the first terminal is guaranteed, that is, the first terminal's original data is not revealed, the second terminal can combine its own data with the first terminal's encrypted data for joint training and update the model parameters, that is, update the corresponding prediction model. Data from two different terminals is thus integrated for model training, improving the performance of the trained model.
Further, a second embodiment of the method for updating model parameters of the present invention is provided. The second embodiment of the method for updating model parameters differs from the first embodiment of the method for updating model parameters in that, with reference to fig. 2, the method for updating model parameters further comprises:
step S40, obtaining an encrypted first feature parameter corresponding to the first feature data, and obtaining a second feature parameter corresponding to the second feature data, where the device identifiers corresponding to the first feature data and the second feature data are the same.
The second terminal acquires the encrypted first feature parameter corresponding to the first feature data and the second feature parameter corresponding to the second feature data. The encrypted first feature parameter may be sent together with the encrypted first feature data when the first terminal sends data to the second terminal; the way the first terminal sends the encrypted first feature parameter may be the same as or different from the way it sends the encrypted first feature data. It should be noted that the device identifiers corresponding to the first feature data and the second feature data are the same; that is, the first training data and the second training data in this embodiment share the same device identifiers. A device identifier uniquely identifies a terminal, so different terminals can be distinguished by their device identifiers. In this embodiment, the device identifier may be a package name or other terminal information, or the identity card number or phone number of the user corresponding to the terminal.
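The "same device identifiers" requirement means both terminals train only on the samples whose identifiers appear on both sides. A minimal sketch of that alignment step (in practice a private set intersection protocol would be used so that non-overlapping identifiers are not revealed; identifiers here are illustrative):

```python
def id_intersection(ids_a, ids_b):
    """Keep only samples whose device identifier appears on both terminals,
    preserving terminal A's ordering (the 'duplicate checking' step)."""
    common = set(ids_a) & set(ids_b)
    return [i for i in ids_a if i in common]

# illustrative identifiers held by the first and second terminals
D = id_intersection(["dev1", "dev2", "dev3"], ["dev2", "dev3", "dev4"])
```

The resulting set D is the index set over which all sums in the formulas below run.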
Specifically, when the first terminal needs to send the encrypted first feature parameter to the second terminal, the first terminal obtains the first feature data and computes its square, obtains the first model parameter and the gradient coefficient, computes the feature parameter to be calculated from the first model parameter and the gradient coefficient, and obtains the first feature parameter from the feature parameter to be calculated and the square of the first feature data. If the intersection of the device identifiers in the first training data and the second training data is recorded as D, the first feature data of sample i is denoted u_i^A, and the first model parameter is denoted θ_A, the first feature parameter encrypted by the first terminal may be represented as:

[[U_A]] = [[ Σ_{i∈D} (u_i^A)² + (λ/2)·θ_A² ]]

Further, the first terminal may obtain the encrypted first feature parameter from the square of the first feature data, encrypted as [[(u_i^A)²]], and the encrypted feature parameter to be calculated; in that case the process may be represented as:

[[U_A]] = Σ_{i∈D} [[(u_i^A)²]] + [[(λ/2)·θ_A²]]

When the second terminal obtains the second feature parameter, it obtains the gradient coefficient and the second model parameter and computes the second feature parameter from them. Specifically, the encrypted second feature parameter may be expressed as:

[[U_B]] = [[(λ/2)·θ_B²]]
In this embodiment, the loss value is calculated by the second characteristic parameter, and the loss value is to be sent to the third terminal, so in order to protect the security of the data of the second terminal, the data for calculating the loss value needs to be encrypted. In the embodiment of the present invention, for convenience of calculation, the encryption algorithm of the encrypted data of the first terminal is the same as the encryption algorithm of the encrypted data of the second terminal, and in other embodiments, the encryption algorithm of the encrypted data of the first terminal may not be the same as the encryption algorithm of the encrypted data of the second terminal.
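For ease of understanding, the two feature parameters may be sketched in plaintext as follows. This is a non-authoritative reading of the description above, in which the first feature parameter combines the squared first feature data with the first terminal's regularization term and the second feature parameter is the second terminal's regularization term; encryption is omitted, and function names and values are illustrative.

```python
def first_feature_param(u_a, theta_a, lam):
    """Sum of squared forward values on terminal A plus A's L2-regularization term."""
    return sum(u * u for u in u_a) + (lam / 2) * sum(t * t for t in theta_a)

def second_feature_param(theta_b, lam):
    """Terminal B's L2-regularization term."""
    return (lam / 2) * sum(t * t for t in theta_b)

# illustrative values: two aligned samples, one model parameter per side
ua = first_feature_param([0.5, -0.5], [0.1], lam=0.02)   # 0.25 + 0.25 + 0.0001
ub = second_feature_param([0.2], lam=0.02)               # 0.0004
```

In the actual protocol each terminal encrypts its parameter before sending, so the second terminal assembles the loss from ciphertexts only.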
Step S50, calculating a loss value according to the first feature data, the second feature data, the label data, the first feature parameter, and the second feature parameter.
And after the second terminal obtains the first characteristic parameter and the second characteristic parameter, the second terminal calculates to obtain a loss value according to the first characteristic data, the second characteristic data, the label data, the first characteristic parameter and the second characteristic parameter. Note that, since each data for calculating the loss value is encrypted, the calculated loss value is also encrypted.
Specifically, if the encrypted loss value is recorded as [[L]], with u_i^A and u_i^B the first and second feature data of sample i, y_i the label data, and [[U_A]] and [[U_B]] the encrypted first and second feature parameters, the loss function by which the second terminal calculates the loss value may be expressed as:

[[L]] = Σ_{i∈D} ( (u_i^B − y_i)² + 2·(u_i^B − y_i)·[[u_i^A]] ) + [[U_A]] + [[U_B]]

Specifically, the loss function is derived as follows. The plaintext loss with regularization on both model parameters is:

L = Σ_{i∈D} (u_i^A + u_i^B − y_i)² + (λ/2)·(θ_A² + θ_B²)

Since:

(u_i^A + u_i^B − y_i)² = (u_i^A)² + 2·u_i^A·(u_i^B − y_i) + (u_i^B − y_i)²

thus:

[[L]] = Σ_{i∈D} (u_i^B − y_i)² + 2·Σ_{i∈D} (u_i^B − y_i)·[[u_i^A]] + Σ_{i∈D} [[(u_i^A)²]] + [[(λ/2)·θ_A²]] + [[(λ/2)·θ_B²]]
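As a plaintext sanity check, the following sketch (assuming a squared-error loss with L2 regularization on both parties' parameters; all values illustrative) verifies that assembling the loss from the feature parameters matches computing it directly:

```python
def loss_from_parts(u_a, u_b, y, U_a, theta_b, lam):
    """B-side assembly: B's own terms plus [[u_A]] cross terms plus [[U_A]]."""
    s = sum((ub - yi) ** 2 + 2 * (ub - yi) * ua
            for ua, ub, yi in zip(u_a, u_b, y))
    return s + U_a + (lam / 2) * sum(t * t for t in theta_b)

def loss_direct(u_a, u_b, y, theta_a, theta_b, lam):
    """Reference: squared-error loss plus L2 regularization on both sides."""
    s = sum((ua + ub - yi) ** 2 for ua, ub, yi in zip(u_a, u_b, y))
    return s + (lam / 2) * (sum(t * t for t in theta_a)
                            + sum(t * t for t in theta_b))

u_a, u_b, y = [0.5, -0.2], [0.3, 0.1], [1.0, 0.0]
theta_a, theta_b, lam = [0.1], [0.2], 0.02
U_a = sum(u * u for u in u_a) + (lam / 2) * sum(t * t for t in theta_a)
l1 = loss_from_parts(u_a, u_b, y, U_a, theta_b, lam)
l2 = loss_direct(u_a, u_b, y, theta_a, theta_b, lam)
```

The two values agree because the cross-term expansion of the square is exact, which is why the second terminal needs only [[u_A]] and [[U_A]] from the first terminal, never the raw data.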
step S60, sending the loss value to a third terminal, so that the third terminal can judge whether the first model parameter and the second model parameter meet the update stop condition according to the loss value, and returning a prompt message.
After computing the encrypted loss value, the second terminal sends it to the third terminal. The second terminal may send the loss value to the third terminal at the same time as it sends the residual to the first terminal, or before or after sending the residual. After receiving the encrypted loss value, the third terminal decrypts it to obtain the decrypted loss value and judges, according to the decrypted loss value, whether the first model parameter and the second model parameter satisfy the update stop condition. The update stop condition may be set according to specific needs; for example, it may be determined that the first model parameter and the second model parameter satisfy the update stop condition when the decrypted loss value is smaller than a preset loss value, and that they do not satisfy the update stop condition when the decrypted loss value is greater than or equal to the preset loss value, where the size of the preset loss value may be set according to specific needs.
After determining whether the first model parameter and the second model parameter satisfy the update stop condition, the third terminal generates prompt information and sends it to the second terminal, or to the first terminal. The prompt information carries a keyword through which it can be determined whether the first model parameter and the second model parameter satisfy the update stop condition. If the keyword in the prompt information is "true", the first model parameter and the second model parameter satisfy the update stop condition; if the keyword is "false", they do not. It should be noted that the keyword examples in this embodiment are only for ease of understanding and do not limit the keywords; for example, the keywords may instead be set to "0" and "1".
Further, step S60 includes:
step g, sending the loss value to a third terminal, so that the third terminal obtains a loss history value when the first model parameter and the second model parameter are updated last time after receiving the loss value, determines that the first model parameter and the second model parameter meet an update stop condition when detecting that an absolute value of a difference between the loss value and the loss history value is smaller than a preset threshold, and determines that the first model parameter and the second model parameter do not meet the update stop condition when detecting that the absolute value is larger than or equal to the preset threshold.
Further, when the second terminal sends the encrypted loss value to the third terminal, and the third terminal receives the encrypted loss value and decrypts the encrypted loss value to obtain a decrypted loss value, the third terminal obtains the loss history value when the first model parameter and the second model parameter are updated last time. It should be noted that, each time the model parameters are updated, a loss value is generated, that is, each time the model parameters are updated, the third terminal receives the loss value sent by the second terminal and stores the loss value. And after the third terminal acquires the historical loss value, the third terminal calculates the difference between the currently received loss value and the historical loss value, calculates the absolute value of the difference, and detects whether the absolute value of the difference is smaller than a preset threshold value. If the third terminal detects that the absolute value of the difference is smaller than the preset threshold, the third terminal determines that the first model parameter and the second model parameter meet the update stop condition; and if the third terminal detects that the absolute value of the difference is greater than or equal to the preset threshold, the third terminal determines that the first model parameter and the second model parameter do not meet the update stop condition. The size of the preset threshold value can be set according to specific needs, for example, can be set to 0.00001.
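The third terminal's convergence check described above may be sketched as follows; the default of 1e-5 mirrors the example threshold of 0.00001, and all loss values are illustrative:

```python
def should_stop(loss, loss_history, eps=1e-5):
    """True when |current loss - previous loss| < eps, i.e. training has converged."""
    return abs(loss - loss_history) < eps

stop1 = should_stop(0.050000, 0.050004)   # difference 4e-6 is below the threshold
stop2 = should_stop(0.050, 0.060)         # difference 1e-2 is above the threshold
```

The returned boolean corresponds to the "true"/"false" keyword carried by the prompt information.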
Step S70, if it is determined that the first model parameter and the second model parameter do not satisfy the update stop condition according to the prompt information, returning to the step of receiving, by the second terminal, the encrypted first feature data sent by the first terminal, and acquiring the second feature data and the tag data.
When the second terminal receives the prompt information and determines from it that the first model parameter and the second model parameter do not satisfy the update stop condition, the second terminal again receives the encrypted first feature data sent by the first terminal and acquires the second feature data and the label data, that is, it continues the update operation of the model parameters. Further, when the second terminal determines from the prompt information that the first model parameter and the second model parameter satisfy the update stop condition, the second terminal performs no further update, that is, no further iteration, and loads the updated second model parameter into the second prediction model to obtain the trained second prediction model. Likewise, when the first terminal determines from the received prompt information that the first model parameter and the second model parameter satisfy the update stop condition, the first terminal loads the updated first model parameter into the first prediction model to obtain the trained first prediction model. It is understood that when the first model parameter and the second model parameter satisfy the update stop condition, the update of the model parameters satisfies the convergence condition.
In this embodiment, the loss value determines whether the update operation on the first model parameter and the second model parameter is finished. When the update is determined not to be finished, that is, the first model parameter and the second model parameter do not satisfy the update stop condition, the update operation continues. This improves the performance of the prediction model obtained from the updated model parameters and thus the accuracy of the model's predictions.
Specifically, referring to fig. 3, the whole process of the model parameter updating method is as follows. Step one: a user may choose how to initialize the first model parameters and the second model parameters as needed; for example, all first model parameters may be initialized to the same value, or each may be initialized to a different value. Step two: after the first model parameters and the second model parameters are initialized, device ID duplicate checking is performed, that is, first training data and second training data with the same device identifiers are selected. Step three: training data of some mobile-phone-side devices is selected for training, where the set of mobile-phone-side devices is the first terminal (Part A), and A_1, A_i, A_k, etc. represent different mobile-phone-side devices. Steps four and five: the first terminal carries out its part of the calculation for the encrypted loss function and the encrypted residual. Step six: the loss value and the residual are calculated at the second terminal (Part B). Steps seven and eight: the first terminal operates on the received encrypted residual; in step eight, the first terminal and the second terminal both calculate encrypted gradients, that is, the first gradient value and the second gradient value. Step nine: the third terminal (third party C) decrypts the gradient values sent by the first terminal and the second terminal and returns the decrypted gradient values correspondingly. Step ten: the first terminal and the second terminal update their respective model parameters according to the gradient values returned by the third terminal. Step eleven: the third terminal decrypts the loss value to determine whether the iteration is finished, that is, whether the first model parameter and the second model parameter satisfy the update stop condition. If the iteration is finished, the update operation of the model parameters is complete; if the iteration is unfinished, the process returns to step three and the update operation continues.
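For ease of understanding, the loop of fig. 3 may be simulated end to end in plaintext. Encryption and the coordinating third terminal are omitted, and the vertically split toy data, learning rate, and thresholds are all illustrative; this is a sketch of the training dynamics, not of the privacy protocol itself.

```python
def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def train(XA, XB, y, lam=0.0, eta=0.1, eps=1e-8, max_iter=1000):
    wA = [0.0] * len(XA[0])          # step one: initialize both parameter sets
    wB = [0.0] * len(XB[0])
    prev_loss = None
    for _ in range(max_iter):
        uA = [dot(wA, x) for x in XA]            # forward values on each side
        uB = [dot(wB, x) for x in XB]
        d = [a + b - yi for a, b, yi in zip(uA, uB, y)]   # residual
        loss = sum(di * di for di in d) + (lam / 2) * (
            sum(t * t for t in wA) + sum(t * t for t in wB))
        if prev_loss is not None and abs(loss - prev_loss) < eps:
            break                                 # step eleven: converged
        prev_loss = loss
        n = len(y)
        gA = [sum(d[i] * XA[i][j] for i in range(n)) + lam * wA[j]
              for j in range(len(wA))]
        gB = [sum(d[i] * XB[i][j] for i in range(n)) + lam * wB[j]
              for j in range(len(wB))]
        wA = [w - eta * g for w, g in zip(wA, gA)]  # step ten: update each side
        wB = [w - eta * g for w, g in zip(wB, gB)]
    return wA, wB, loss

# toy vertically split data generated from y = 1*xA + 2*xB
XA = [[1.0], [2.0], [0.5]]
XB = [[0.5], [1.0], [2.0]]
y = [2.0, 4.0, 4.5]
wA, wB, loss = train(XA, XB, y)
```

On this data the loop recovers the generating weights, with neither side ever holding the other's features.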
It should be noted that this embodiment is an extension of the existing federal learning scheme, and federal learning is a technology for performing collaborative modeling on the premise of protecting user data privacy, and it is ensured that data can still be effectively modeled on the premise that the data is not out of the local area. According to the division of data and characteristics, the method can be generally divided into a horizontal federation and a vertical federation, wherein the horizontal federation requires each client to have the same characteristics and label data at the same time, and the problem to be solved by the vertical federation is that only one party has the label data, and the other parties only have the characteristics.
Further, a third embodiment of the method for updating model parameters of the present invention is provided.
The third embodiment of the model parameter updating method differs from the first and/or second embodiment of the model parameter updating method in that the model parameter updating method further comprises:
and h, taking the updated second model parameter as the model parameter of the loan prediction model in the second terminal to obtain the loan prediction model.
Step i, after receiving data to be predicted, inputting the data to be predicted into the loan prediction model to obtain loan probability corresponding to the data to be predicted, and pushing information according to the loan probability.
And when the second terminal obtains the updated second model parameter, the second terminal takes the updated second model parameter as the model parameter of the loan prediction model. It is understood that, at this time, the second prediction model in the second terminal is a loan prediction model, and the user may set the second prediction model as a character recognition model or a picture recognition model, etc. as needed. It should be noted that, the training data used in the foregoing is different according to the type of the model, for example, the training data corresponding to the loan prediction model is data generated when the user makes a loan, and the training data corresponding to the image recognition model is image-related data.
After the second terminal obtains the loan prediction model, the second terminal inputs the data to be predicted into the loan prediction model to obtain a loan probability corresponding to the data to be predicted, and it can be understood that the loan probability is an output result of the loan prediction model. The second terminal can push the loan-related information according to the loan probability.
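A minimal sketch of the prediction step, assuming the loan prediction model scores a sample linearly and maps the score to a probability with a sigmoid. The embodiment does not fix the link function, so this is one plausible choice; the weights, input, and push threshold are illustrative.

```python
import math

def loan_probability(theta, x):
    """Linear score with the trained parameters, mapped to (0, 1) by a sigmoid."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return 1.0 / (1.0 + math.exp(-z))

def push_decision(p, threshold=0.5):
    """Push loan-related information only when the predicted probability is high enough."""
    return "push loan offer" if p >= threshold else "no push"

p = loan_probability([0.8, -0.3], [1.0, 2.0])   # score z = 0.8 - 0.6 = 0.2
decision = push_decision(p)
```

The same scoring path applies unchanged if the second prediction model is configured as a character or picture recognition model, with the training data swapped accordingly.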
In this embodiment, the updated second model parameter is used as the model parameter of the loan prediction model in the second terminal to obtain the loan prediction model, and information is then pushed according to the loan probability output by the loan prediction model, improving the accuracy of pushing loan-related information.
In addition, the present invention provides an updating apparatus of model parameters, and referring to fig. 4, the updating apparatus of model parameters includes:
a receiving module 10, configured to receive encrypted first feature data sent by a first terminal, where the first feature data is obtained by the first terminal and at least one terminal of the same type through horizontal federal learning, and the first terminal and the second terminal are different types of terminals;
an obtaining module 20, configured to obtain second feature data and tag data corresponding to the second feature data;
a calculating module 30, configured to calculate a residual according to the first feature data, the second feature data, and the tag data;
a sending module 40, configured to send the residual to the first terminal, so that the first terminal calculates a first gradient value according to the residual, and updates a first model parameter corresponding to the first feature data according to the first gradient value;
the calculating module 30 is further configured to calculate a second gradient value according to the residual error;
and an updating module 50, configured to update a second model parameter corresponding to the second feature data according to the second gradient value.
Further, the obtaining module 20 is further configured to obtain an encrypted first feature parameter corresponding to the first feature data, and obtain a second feature parameter corresponding to the second feature data, where device identifiers corresponding to the first feature data and the second feature data are the same;
the calculation module 30 is further configured to calculate a loss value according to the first feature data, the second feature data, the tag data, the first feature parameter, and the second feature parameter;
the sending module 40 is further configured to send the loss value to a third terminal, so that the third terminal determines, according to the loss value, whether the first model parameter and the second model parameter meet an update stop condition, and returns a prompt message;
the device for updating the model parameters further comprises:
and the execution module is used for returning to execute the steps of receiving the encrypted first characteristic data sent by the first terminal by the second terminal and acquiring the second characteristic data and the label data if the first model parameter and the second model parameter do not meet the update stopping condition according to the prompt message.
Further, the sending module 40 is further configured to send the loss value to a third terminal, so that the third terminal obtains a loss history value when the first model parameter and the second model parameter are updated last time after receiving the loss value, determines that the first model parameter and the second model parameter satisfy an update stop condition when detecting that an absolute value of a difference between the loss value and the loss history value is smaller than a preset threshold, and determines that the first model parameter and the second model parameter do not satisfy the update stop condition when detecting that the absolute value is greater than or equal to the preset threshold.
Further, the obtaining module 20 is further configured to obtain training data corresponding to the second feature data, and obtain a third feature parameter corresponding to the second feature data;
the calculation module 30 is further configured to calculate a second gradient value according to the training data, the third feature parameter and the residual error.
Further, the update module 50 includes:
the acquisition unit is used for acquiring an update coefficient corresponding to the second terminal;
and the updating unit is used for updating the second model parameter corresponding to the second characteristic data according to the second gradient value and the updating coefficient.
Further, the calculation module 30 includes:
a calculation unit configured to calculate a difference between the second feature data and the tag data;
the encryption unit is used for encrypting the difference value to obtain an encrypted difference value;
the computing unit is further used for computing to obtain an encrypted residual error according to the encrypted difference value and the encrypted first characteristic data; calculating according to the encrypted residual error to obtain an encrypted second gradient value;
the sending unit is used for sending the encrypted second gradient value to the third terminal so that the third terminal can decrypt the encrypted second gradient value and return the decrypted second gradient value;
the updating module 50 is further configured to receive the decrypted second gradient value returned by the third terminal, and update the second model parameter corresponding to the second feature data according to the decrypted second gradient value.
Further, the device for updating the model parameters further comprises:
the processing module is used for taking the updated second model parameters as model parameters of a loan prediction model in the second terminal to obtain the loan prediction model;
and the input module is used for inputting the data to be predicted into the loan prediction model after receiving the data to be predicted, obtaining the loan probability corresponding to the data to be predicted, and pushing information according to the loan probability.
The specific implementation of the model parameter updating apparatus of the present invention is substantially the same as that of each embodiment of the model parameter updating method described above, and will not be described herein again.
In addition, the invention also provides equipment for updating the model parameters. As shown in fig. 5, fig. 5 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention.
It should be noted that fig. 5 is a schematic structural diagram of a hardware operating environment of the model parameter updating apparatus. The updating device of the model parameters of the embodiment of the invention can be a terminal device such as a PC, a portable computer and the like.
As shown in fig. 5, the apparatus for updating the model parameters may include: a processor 1001, such as a CPU, a memory 1005, a user interface 1003, a network interface 1004, a communication bus 1002. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display screen (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the model parameter update device configuration shown in fig. 5 does not constitute a limitation of the model parameter update device, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
As shown in fig. 5, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an update program of model parameters. The operating system is a program for managing and controlling the hardware and software resources of the equipment, and supports the operation of the updating program of the model parameters and other software or programs.
In the updating apparatus of model parameters shown in fig. 5, the user interface 1003 is mainly used for connecting the first terminal and the third terminal and performing data communication with each of them; the network interface 1004 is mainly used for connecting to a background server and performing data communication with it; and the processor 1001 may be configured to call the update program of model parameters stored in the memory 1005 and perform the steps of the update method of model parameters described above.
The specific implementation of the model parameter updating device of the present invention is substantially the same as the embodiments of the model parameter updating method described above, and will not be described herein again.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, where an update program of model parameters is stored on the computer-readable storage medium, and when the update program of model parameters is executed by a processor, the steps of the update method of model parameters described above are implemented.
The specific implementation of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the above-mentioned model parameter updating method, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for updating model parameters, characterized by comprising the following steps:
receiving, by a second terminal, encrypted first feature data sent by a first terminal, and acquiring second feature data and label data corresponding to the second feature data, wherein the first feature data is obtained by the first terminal and at least one terminal of the same type through horizontal federated learning, and the first terminal and the second terminal are terminals of different types;
calculating a residual according to the first feature data, the second feature data and the label data, and sending the residual to the first terminal, so that the first terminal calculates a first gradient value according to the residual and updates a first model parameter corresponding to the first feature data according to the first gradient value; and
calculating a second gradient value according to the residual, and updating a second model parameter corresponding to the second feature data according to the second gradient value.
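The exchange in claim 1 can be sketched as a round of vertical federated training, read here as a linear model (an assumption; the claim itself is model-agnostic). Party A (the first terminal) holds features `x_a`, party B (the second terminal) holds features `x_b` plus the labels; all names and the synthetic data are illustrative, and encryption is omitted so the arithmetic stays visible (claim 6 adds the homomorphic step):

```python
import random

random.seed(0)
N, DA, DB = 50, 3, 2
x_a = [[random.gauss(0, 1) for _ in range(DA)] for _ in range(N)]
x_b = [[random.gauss(0, 1) for _ in range(DB)] for _ in range(N)]

# Synthetic labels that ARE linear in the joint features, so the
# protocol can drive the residual to zero.
true_wa, true_wb = [1.0, -1.0, 0.5], [2.0, 0.5]
dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
y = [dot(xa, true_wa) + dot(xb, true_wb) for xa, xb in zip(x_a, x_b)]

w_a, w_b = [0.0] * DA, [0.0] * DB   # first / second model parameters
lr = 0.1
for _ in range(1000):
    # Party A sends its (encrypted, in the patent) partial score.
    partial_a = [dot(xa, w_a) for xa in x_a]
    # Party B computes the residual from both partial scores and the
    # labels, then sends it back to party A (claim 1, second step).
    residual = [pa + dot(xb, w_b) - yi
                for pa, xb, yi in zip(partial_a, x_b, y)]
    # Each terminal computes the gradient for ITS OWN parameters only.
    grad_a = [sum(x_a[i][j] * residual[i] for i in range(N)) / N
              for j in range(DA)]
    grad_b = [sum(x_b[i][j] * residual[i] for i in range(N)) / N
              for j in range(DB)]
    w_a = [w - lr * g for w, g in zip(w_a, grad_a)]
    w_b = [w - lr * g for w, g in zip(w_b, grad_b)]

mse = sum((dot(xa, w_a) + dot(xb, w_b) - yi) ** 2
          for xa, xb, yi in zip(x_a, x_b, y)) / N
```

Note that neither party ever sees the other's raw features: only the partial score and the residual cross the boundary, which is the privacy property the claim relies on.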
2. The method for updating model parameters according to claim 1, wherein after the steps of receiving, by the second terminal, the encrypted first feature data sent by the first terminal and acquiring the second feature data and the label data corresponding to the second feature data, the method further comprises:
acquiring an encrypted first feature parameter corresponding to the first feature data, and acquiring a second feature parameter corresponding to the second feature data, wherein the first feature data and the second feature data correspond to the same device identifier;
calculating a loss value according to the first feature data, the second feature data, the label data, the first feature parameter and the second feature parameter;
sending the loss value to a third terminal, so that the third terminal judges, according to the loss value, whether the first model parameter and the second model parameter satisfy an update stop condition, and returns a prompt message; and
if it is determined from the prompt message that the first model parameter and the second model parameter do not satisfy the update stop condition, returning to the step of receiving, by the second terminal, the encrypted first feature data sent by the first terminal and acquiring the second feature data and the label data.
3. The method for updating model parameters according to claim 2, wherein the step of sending the loss value to a third terminal so that the third terminal judges, according to the loss value, whether the first model parameter and the second model parameter satisfy the update stop condition comprises:
sending the loss value to the third terminal, so that, after receiving the loss value, the third terminal acquires the historical loss value recorded when the first model parameter and the second model parameter were last updated; determines that the first model parameter and the second model parameter satisfy the update stop condition when the absolute value of the difference between the loss value and the historical loss value is smaller than a preset threshold; and determines that they do not satisfy the update stop condition when the absolute value is larger than or equal to the preset threshold.
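The third terminal's decision in claim 3 reduces to a small comparison; a minimal sketch, with an illustrative default threshold (the claim leaves its value open):

```python
def should_stop(loss_value, loss_history, threshold=1e-4):
    """Claim 3's stop rule: compare the new loss with the loss from
    the previous update round and stop when the absolute difference
    falls below a preset threshold."""
    if loss_history is None:        # first round: nothing to compare
        return False
    return abs(loss_value - loss_history) < threshold
```

When the difference equals the threshold, the rule does not stop, matching the claim's "larger than or equal to" branch.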
4. The method for updating model parameters according to claim 1, wherein before the step of calculating a second gradient value according to the residual, the method further comprises:
acquiring training data corresponding to the second feature data, and acquiring a third feature parameter corresponding to the second feature data;
the step of calculating a second gradient value from the residual error comprises:
and calculating a second gradient value according to the training data, the third characteristic parameter and the residual error.
5. The method for updating model parameters according to claim 1, wherein the step of updating the second model parameters corresponding to the second feature data according to the second gradient values comprises:
acquiring an updating coefficient corresponding to the second terminal;
and updating a second model parameter corresponding to the second characteristic data according to the second gradient value and the updating coefficient.
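Claim 5's update coefficient plays the role of a per-terminal learning rate; a one-line sketch (the default value and function name are illustrative):

```python
def update_second_model_parameter(w, grad, update_coefficient=0.05):
    """Claim 5: the second terminal scales its second gradient value
    by a terminal-specific update coefficient (a learning rate, in
    conventional terms) before applying it to the second model
    parameter.  `w` and `grad` are plain lists of floats."""
    return [wi - update_coefficient * gi for wi, gi in zip(w, grad)]
```

Giving each terminal its own coefficient lets parties with differently scaled features or data volumes take differently sized steps in the same joint round.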
6. The method for updating model parameters according to claim 1, wherein said step of calculating a residual from said first feature data, said second feature data and said tag data comprises:
calculating a difference value between the second characteristic data and the label data, and encrypting the difference value to obtain an encrypted difference value;
calculating to obtain an encrypted residual error according to the encrypted difference value and the encrypted first characteristic data;
the step of calculating a second gradient value according to the residual error, and updating a second model parameter corresponding to the second feature data according to the second gradient value includes:
calculating according to the encrypted residual error to obtain an encrypted second gradient value, sending the encrypted second gradient value to the third terminal so that the third terminal can decrypt the encrypted second gradient value, and returning the decrypted second gradient value;
and receiving the decrypted second gradient value returned by the third terminal, and updating the second model parameter corresponding to the second characteristic data according to the decrypted second gradient value.
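The operations claim 6 performs on ciphertexts (adding two encrypted values, scaling one by a plaintext) are exactly what an additively homomorphic scheme provides. The sketch below uses a toy multiplicative mask Enc(m) = k·m mod p as a stand-in; it is NOT secure (one known plaintext reveals the key) and real deployments typically use a scheme such as Paillier, but it lets the claim-6 data flow run end to end. All names are illustrative:

```python
P = 2**61 - 1       # prime modulus
SCALE = 10**6       # fixed-point scale for real-valued residuals

class ThirdTerminal:
    """Key holder: decrypts the second gradient value for the second
    terminal, as in the last two steps of claim 6."""
    def __init__(self, key=123456789):
        self._k = key % P
        self._k_inv = pow(self._k, -1, P)   # modular inverse (Python 3.8+)

    def encrypt(self, x):
        return (self._k * round(x * SCALE)) % P

    def decrypt(self, c):
        m = (self._k_inv * c) % P
        if m > P // 2:                      # recover signed values
            m -= P
        return m / SCALE

def c_add(c1, c2):
    """Homomorphic addition: Dec(c_add(E(a), E(b))) == a + b."""
    return (c1 + c2) % P

def c_scale(c, s):
    """Scale by a plaintext integer: Dec(c_scale(E(a), s)) == s * a."""
    return (c * s) % P

# Claim 6 in miniature: the second terminal combines its own encrypted
# difference with the encrypted first-terminal contribution to form an
# encrypted residual, scales it into an encrypted second gradient, and
# has the third terminal decrypt the result.
t3 = ThirdTerminal()
enc_diff = t3.encrypt(1.5)        # encrypted (prediction - label) part
enc_first = t3.encrypt(2.25)      # encrypted first-terminal contribution
enc_residual = c_add(enc_diff, enc_first)
enc_gradient = c_scale(enc_residual, 3)   # toy integer feature value
gradient = t3.decrypt(enc_gradient)
```

The point of the construction is that the second terminal computes its gradient entirely on ciphertexts, so the raw residual never appears in the clear outside the key holder.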
7. The method for updating model parameters according to any one of claims 4 to 6, wherein after the step of calculating a second gradient value according to the residual and updating a second model parameter corresponding to the second feature data according to the second gradient value, the method further comprises:
taking the updated second model parameter as the model parameter of the loan prediction model in the second terminal to obtain the loan prediction model;
after receiving data to be predicted, inputting the data to be predicted into the loan prediction model to obtain loan probability corresponding to the data to be predicted, and pushing information according to the loan probability.
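Claim 7's prediction step, read as logistic regression (an assumption; the claim only says the updated parameters yield a loan probability), might look like the following, with an illustrative push threshold:

```python
import math

def loan_probability(x, w, bias=0.0):
    """Score the data to be predicted with the updated second model
    parameters and squash the score to a probability via a sigmoid."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + bias
    return 1.0 / (1.0 + math.exp(-score))

def push_information(probability, threshold=0.5):
    # Push loan information only when the predicted probability clears
    # the threshold; the claim leaves the actual push rule open.
    return "push loan offer" if probability >= threshold else "no push"
```
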
8. An apparatus for updating model parameters, comprising:
a receiving module, configured to receive encrypted first feature data sent by a first terminal, wherein the first feature data is obtained by the first terminal and at least one terminal of the same type through horizontal federated learning, and the first terminal and the second terminal are terminals of different types;
the acquisition module is used for acquiring second characteristic data and label data corresponding to the second characteristic data;
the calculation module is used for calculating to obtain a residual error according to the first characteristic data, the second characteristic data and the label data;
a sending module, configured to send the residual to the first terminal, so that the first terminal calculates a first gradient value according to the residual, and updates a first model parameter corresponding to the first feature data according to the first gradient value;
the calculation module is further used for calculating a second gradient value according to the residual error;
and the updating module is used for updating the second model parameter corresponding to the second characteristic data according to the second gradient value.
9. A device for updating model parameters, characterized in that the device comprises a memory, a processor, and a model parameter updating program stored on the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the method for updating model parameters according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a model parameter updating program is stored on the computer-readable storage medium, and the program, when executed by a processor, implements the steps of the method for updating model parameters according to any one of claims 1 to 7.
CN202010142907.7A 2020-03-03 2020-03-03 Model parameter updating method, device, equipment and readable storage medium Pending CN111368196A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010142907.7A CN111368196A (en) 2020-03-03 2020-03-03 Model parameter updating method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN111368196A true CN111368196A (en) 2020-07-03

Family

ID=71208508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010142907.7A Pending CN111368196A (en) 2020-03-03 2020-03-03 Model parameter updating method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111368196A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111839495A (en) * 2020-07-30 2020-10-30 深圳前海微众银行股份有限公司 Detection method, device and storage medium
CN111839495B (en) * 2020-07-30 2023-04-07 深圳前海微众银行股份有限公司 Detection method, device and storage medium
CN111931216A (en) * 2020-09-16 2020-11-13 支付宝(杭州)信息技术有限公司 Method and system for obtaining joint training model based on privacy protection
CN113190871A (en) * 2021-05-28 2021-07-30 脸萌有限公司 Data protection method and device, readable medium and electronic equipment
CN113190871B (en) * 2021-05-28 2023-10-31 脸萌有限公司 Data protection method and device, readable medium and electronic equipment

Similar Documents

Publication Publication Date Title
WO2020177392A1 (en) Federated learning-based model parameter training method, apparatus and device, and medium
CN109255444B (en) Federal modeling method and device based on transfer learning and readable storage medium
CN110851869B (en) Sensitive information processing method, device and readable storage medium
CN109241770B (en) Information value calculation method and device based on homomorphic encryption and readable storage medium
CN111368196A (en) Model parameter updating method, device, equipment and readable storage medium
US20140033267A1 (en) Type mining framework for automated security policy generation
CN111368901A (en) Multi-party combined modeling method, device and medium based on federal learning
CN108848058A (en) Intelligent contract processing method and block catenary system
CN111401277A (en) Face recognition model updating method, device, equipment and medium
CN109325357B (en) RSA-based information value calculation method, device and readable storage medium
WO2020011200A1 (en) Cross-domain data fusion method and system, and storage medium
WO2020233137A1 (en) Method and apparatus for determining value of loss function, and electronic device
WO2018156461A1 (en) Configuring image as private within storage container
CN110837653A (en) Label prediction method, device and computer readable storage medium
CN112116008A (en) Target detection model processing method based on intelligent decision and related equipment thereof
CN105574430A (en) Novel privacy protection method in collaborative filtering recommendation system
CN112785002A (en) Model construction optimization method, device, medium, and computer program product
CN111324812A (en) Federal recommendation method, device, equipment and medium based on transfer learning
CN110162722A (en) Products Show method, server and storage medium based on two dimensional code
US20190158473A1 (en) Generating bridge match identifiers for linking identifers from server logs
WO2023134055A1 (en) Privacy-based federated inference method and apparatus, device, and storage medium
US20170308714A1 (en) Data management for combined data using structured data governance metadata
CN109902742B (en) Sample completion method, terminal, system and medium based on encryption migration learning
CN114036567A (en) Authority authentication method and system for information secure sharing
JP4594078B2 (en) Personal information management system and personal information management program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination