CN115329385B - Model training method and device based on block chain cross-chain privacy protection - Google Patents
- Publication number
- CN115329385B (application CN202211238647.9A)
- Authority
- CN
- China
- Prior art keywords
- local model
- parameters
- model
- privacy
- global
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
The present disclosure relates to the field of blockchain technology, and in particular to a method and an apparatus for model training based on blockchain cross-chain privacy protection. In the method, each first blockchain trains a local model to obtain local model parameters and calculates a loss value; a disturbance is added to the local model parameters on the first blockchain to obtain local model privacy parameters; and the local model privacy parameters and the loss value are sent to a relay chain. The relay chain aggregates the received local model privacy parameters to update the global model parameters, calculates a global loss value from the loss value of each first blockchain, and performs iterative training according to whether the global loss value has converged. The method guarantees the robustness of model training while protecting the privacy and security of the data of the three judicial parties (court, procuratorate, and judicial administration), thereby solving the prior-art problems that the data of the three parties' blockchains is at risk of leakage and that the computation cost of model training is large.
Description
Technical Field
The present disclosure relates to the field of blockchain technologies, and in particular, to a method and an apparatus for model training based on blockchain cross-chain privacy protection.
Background
In the judicial setting, a core task for legal professionals is to provide reliable, high-quality legal advisory services. However, because qualified legal professionals are in short supply, ensuring that non-professionals can obtain adequate, high-quality advisory services is a matter of intense concern in the art.
At present, a judicial question-answering system can address this problem. The method for constructing a blockchain-based collaborative question-answering model disclosed in the prior patent CN114528392B ensures, through a relay-chain training method, that the collaborative question-answering model of the three judicial parties (court, procuratorate, and judicial administration) can be trained without the data of the three parties' blockchains ever being exported. However, during parameter transmission among the three parties, the parameters are not privacy-protected: if an attacker intercepts the model parameters of the three parties' blockchains, the underlying data can be reversely deduced from the intercepted parameters, so the data of the three parties' blockchains is at risk of leakage. Moreover, the three parties' blockchains each aggregate the model parameters separately, so the computation cost of model training is large.
A model training method based on blockchain cross-chain privacy protection is therefore urgently needed, so as to solve the prior-art problems that the data of the three judicial parties' blockchains is at risk of leakage and that the computation cost of model training is large.
Disclosure of Invention
In order to solve the problems that the data of the three judicial parties' (court, procuratorate, and judicial administration) blockchains is at risk of leakage and that the computation cost of model training is large, embodiments of the invention provide a model training method and device based on blockchain cross-chain privacy protection, in which the parameter data uploaded to the relay chain is protected by local differential privacy through smart contracts on each blockchain, so that the robustness of model training is guaranteed while the privacy and security of the three parties' data is also guaranteed.
In order to solve the technical problems, the specific technical scheme is as follows:
in one aspect, embodiments herein provide a method for model training based on blockchain cross-chain privacy protection, performed by a first blockchain, comprising,
training a local model according to a first data set for constructing the model and randomly generated local model initial parameters, to obtain local model parameters, and calculating a loss value of the local model;
adding disturbance into the local model parameters according to the set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed to obtain the local model privacy parameters;
sending the local model privacy parameters and the loss values to a relay chain so that the relay chain aggregates a plurality of received local model privacy parameters sent by a plurality of first blockchains under the condition that the result of consensus verification on the local model privacy parameters and the loss values is passed to obtain global model privacy parameters, calculating global loss values according to the received loss values sent by the plurality of first blockchains, judging whether the global loss values are converged, if so, notifying the plurality of first blockchains to stop training, taking a local model of any one first blockchain in the plurality of first blockchains as a target model, and if not, sending the global model privacy parameters to the plurality of first blockchains;
and under the condition that the received consensus verification result of the global model privacy parameters is passed, updating the local model initial parameters according to the global model privacy parameters, and repeatedly executing the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Further, the formula for calculating the loss value of the local model is:

$$L_i(\omega) = \frac{1}{|D_i|} \sum_{j=1}^{|D_i|} l_j(\omega, D_i)$$

wherein $L_i(\omega)$ denotes the loss value of the $i$-th first blockchain; $\omega$ denotes the local model parameters; $D_i$ denotes the first data set of the $i$-th first blockchain; $|D_i|$ denotes the number of data samples in the first data set; and $l_j(\omega, D_i)$ denotes the loss of the model parameters $\omega$ on the $j$-th data sample $(x, y)$.
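As a minimal sketch of the averaged per-sample loss above (the patent does not fix a concrete per-sample loss $l_j$, so a squared-error loss on a linear model is assumed here purely for illustration):

```python
import numpy as np

def local_loss(omega, dataset, per_sample_loss):
    """L_i(omega): mean of per-sample losses over the first data set D_i."""
    return sum(per_sample_loss(omega, x, y) for x, y in dataset) / len(dataset)

# Assumed per-sample loss: squared error of a linear model (illustrative only).
def squared_error(omega, x, y):
    return float((np.dot(omega, x) - y) ** 2)

# Hypothetical first data set D_i of one first blockchain.
D_i = [(np.array([1.0, 2.0]), 5.0), (np.array([0.5, 1.0]), 2.5)]
omega = np.array([1.0, 2.0])
loss = local_loss(omega, D_i, squared_error)  # both samples fit exactly -> 0.0
```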
Further, adding a disturbance to the local model parameters according to the set differential privacy parameter and the random algorithm to obtain the local model privacy parameters further comprises:

verifying that the random algorithm $A$ satisfies the constraint

$$\Pr[A(D) = O] \le e^{\varepsilon} \cdot \Pr[A(D') = O]$$

wherein $A$ denotes the random algorithm; $D$ denotes a local model parameter data set; $D'$ denotes any local model parameter data set adjacent to $D$, i.e., $D$ differs from $D'$ in only one record; $O$ denotes the output of the random algorithm $A$ after it adds a disturbance to an input local model parameter data set; $\Pr[A(D) = O]$ denotes the probability that the random algorithm $A$, after adding a disturbance to the input local model parameter data set, outputs $O$; and $\varepsilon$ denotes the set differential privacy parameter. If the formula is not satisfied, the random algorithm $A$ is adjusted or replaced until the final random algorithm $A$ satisfies the formula; if the random algorithm $A$ satisfies the formula, the random algorithm $A$ is used to add a disturbance to the local model parameter data set, obtaining the local model privacy parameters.
Further, the formula for updating the initial parameters of the local model according to the global model privacy parameters is:

$$\omega_k^{t} = w^{t-1} - r \, \nabla L_k(\omega)$$

wherein $\omega_k^{t}$ denotes the updated local model initial parameters of the $k$-th first blockchain in round $t$; $w^{t-1}$ denotes the global model privacy parameters of round $t-1$; $r$ denotes the learning rate; $\nabla$ denotes the gradient operator; $L_k(\omega)$ denotes the loss value of the $k$-th first blockchain; and $\omega$ denotes the local model parameters of round $t-1$.
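The update rule above can be sketched as follows (the gradient of $L_k$ is supplied by the caller; the two-parameter values are hypothetical):

```python
import numpy as np

def update_initial_params(w_global_prev, grad_local, learning_rate):
    """omega_k^t = w^{t-1} - r * grad L_k(omega).

    w_global_prev : global model privacy parameters of round t-1
    grad_local    : gradient of the k-th chain's loss at its round t-1 parameters
    learning_rate : the learning rate r
    """
    return np.asarray(w_global_prev) - learning_rate * np.asarray(grad_local)

# Hypothetical round-t update for a two-parameter local model.
omega_t = update_initial_params([1.0, -2.0], [0.5, 0.5], learning_rate=0.1)
# -> array([ 0.95, -2.05])
```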
In another aspect, embodiments herein further provide a block chain cross-chain privacy protection-based model training apparatus, including,
the local model training unit is used for training a local model according to a first data set for constructing the model and randomly generated local model initial parameters to obtain local model parameters and calculating a loss value of the local model;
the local model parameter privacy unit is used for adding disturbance into the local model parameters according to set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain the local model privacy parameters;
a local model privacy parameter sending unit, configured to send the local model privacy parameters and the loss values to a relay chain, so that the relay chain aggregates received multiple local model privacy parameters sent by multiple first blockchains when a result of performing consensus verification on the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass, to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the multiple first blockchains, determines whether the global loss value converges, notifies the multiple first blockchains to stop training if the global loss value converges, takes a local model of any one first blockchain in the multiple first blockchains as a target model, and sends the global model privacy parameters to the multiple first blockchains if the global loss value does not converge;
and the iterative training unit is used for updating the initial parameters of the local model according to the global model privacy parameters under the condition that the consensus verification result of the received global model privacy parameters is passed, and repeatedly executing the step of training the local model according to the constructed first data set and the updated initial parameters of the local model.
Based on the same inventive concept, the embodiment herein also provides a model training method based on block chain cross-chain privacy protection, which is executed by a relay chain, the method includes,
receiving a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, wherein the local model privacy parameters are obtained by the first blockchains after adding a disturbance to the local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchains training local models according to a first data set for constructing the model and randomly generated local model initial parameters, and the loss values are calculated by the first blockchains on the local models;
when the result of the consensus verification of the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass, aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters;
calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
judging whether the global loss value is converged, if so, notifying a plurality of first block chains to stop training, and taking a local model of any one of the first block chains as a target model;
if not, the global model privacy parameters are sent to the first block chains, so that the first block chains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
Further, the global loss value is calculated from the received loss values sent by the plurality of first blockchains according to the formula:

$$L^{*} = \sum_{i=1}^{|K|} \frac{|D_i|}{n} L_i(\omega)$$

wherein $L^{*}$ denotes the global loss value; $n$ denotes the total number of data samples of the plurality of first blockchains; $|K|$ denotes the total number of first blockchains; $L_i(\omega)$ denotes the loss value of the $i$-th first blockchain; and $\omega$ denotes the local model parameters of the $i$-th first blockchain.
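The sample-weighted global loss above can be sketched as follows (the per-chain losses and sample counts are hypothetical):

```python
def global_loss(local_losses, sample_counts):
    """L* = sum_i (|D_i| / n) * L_i, with n the total sample count."""
    n = sum(sample_counts)
    return sum(m / n * loss for loss, m in zip(local_losses, sample_counts))

# Three first blockchains (e.g. court, procuratorate, judicial administration).
L_star = global_loss([0.2, 0.4, 0.6], [100, 100, 200])
# -> 0.25*0.2 + 0.25*0.4 + 0.5*0.6 = 0.45
```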
Further, the received local model privacy parameters sent by the plurality of first blockchains are aggregated into the global model privacy parameters according to the formula:

$$w^{t} = \frac{1}{|K|} \sum_{k=1}^{|K|} \omega_k^{t}$$

wherein $w^{t}$ denotes the global model privacy parameters of round $t$; $|K|$ denotes the total number of first blockchains; and $\omega_k^{t}$ denotes the local model privacy parameters of the $k$-th first blockchain in round $t$.
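The aggregation formula above, an element-wise average over the $|K|$ first blockchains, can be sketched as:

```python
import numpy as np

def aggregate(privacy_params):
    """w^t: element-wise average of the |K| local model privacy parameters."""
    return np.mean(np.stack([np.asarray(p) for p in privacy_params]), axis=0)

# Three hypothetical perturbed parameter vectors, one per first blockchain.
w_t = aggregate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
# -> array([3., 4.])
```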
Further, receiving a plurality of local model privacy parameters and loss values transmitted by a plurality of first blockchains further comprises,
receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains according to a set time threshold;
if the local model privacy parameters and the loss values sent by one or more first blockchains are not received within the time range corresponding to the time threshold, continuing to perform a step of aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters, and notifying one or more first blockchains corresponding to the local model privacy parameters and the loss values which are not received to train the constructed target model according to the first data set of the one or more first blockchains, so as to update the target model.
In another aspect, embodiments herein further provide a block chain cross-chain privacy protection-based model training apparatus, including,
a local model privacy parameter receiving unit, configured to receive a plurality of local model privacy parameters and a plurality of loss values that are sent by a plurality of first blockchains, where the local model privacy parameters are obtained by the first blockchains after adding a disturbance to the local model parameters according to a set differential privacy parameter and a random algorithm, the local model parameters are obtained by the first blockchains training local models according to a first data set of a constructed model and randomly generated local model initial parameters, and the loss values are obtained by the first blockchains calculating the local models;
a local model privacy parameter aggregation unit, configured to, in a case that a result of performing consensus verification on the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass, aggregate the received multiple local model privacy parameters sent by the multiple first blockchains, and obtain global model privacy parameters;
a global loss value calculation unit, configured to calculate a global loss value according to the received loss values sent by the plurality of first blockchains;
the iterative training unit is used for judging whether the global loss value is converged, if so, informing the plurality of first block chains to stop training, and taking a local model of any one of the plurality of first block chains as a target model; if not, the global model privacy parameters are sent to the first blockchains, so that the first blockchains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
In some embodiments the plurality of first blockchains correspond respectively to a court, a procuratorate, and a judicial administration department. With the above embodiments, each first blockchain trains a local model using its own first data set and randomly generated local model initial parameters, obtains local model parameters, and calculates a loss value of the local model; each first blockchain then adds a disturbance to its local model parameters to obtain local model privacy parameters, and sends the local model privacy parameters and the loss value to the relay chain. Adding the disturbance strengthens the privacy of the local model parameters: even if an attacker intercepts the local model privacy parameters, the data of the first data set cannot be reversely deduced from them, so the privacy and security of each first blockchain's data is guaranteed. The relay chain then aggregates the received local model privacy parameters sent by the first blockchains to obtain global model privacy parameters; compared with a scheme in which each first blockchain performs parameter aggregation itself, this reduces the computation cost of model training to a certain extent. The relay chain next calculates a global loss value from the loss values of the first blockchains and judges whether the global loss value has converged. If not, the global model privacy parameters are sent back to the first blockchains for another round of training, so that the first blockchains train the collaborative model without exporting their source data, until the global loss value converges. At that point the local model trained by each first blockchain satisfies the collaborative requirements of all first blockchains, and the local model of any first blockchain can be taken as the final target model, which satisfies the business requirements of the institutions (here, the court, the procuratorate, and the judicial administration) corresponding to the first blockchains. The embodiments thus guarantee the robustness of model training while guaranteeing the privacy and security of the three judicial parties' data, solving the prior-art problems that the data of the three parties' blockchains is at risk of leakage and that the computation cost of model training is large.
Drawings
In order to more clearly illustrate the embodiments or technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of a system for implementing a model training method based on block chain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a method for model training based on block chain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 3 is a flowchart illustrating a method for model training based on block chain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 4 illustrates a collaborative training process performed according to the time threshold set in the embodiments herein;
FIG. 5 is a schematic structural diagram of a model training apparatus based on block chain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram illustrating a model training apparatus based on block chain cross-chain privacy protection according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present disclosure.
[ description of reference ]:
101. a first block chain;
102. a relay chain;
501. a local model training unit;
502. a local model parameter privacy unit;
503. a local model privacy parameter sending unit;
504. an iterative training unit;
601. a local model privacy parameter receiving unit;
602. a local model privacy parameter aggregation unit;
603. a global loss value calculation unit;
604. an iterative training unit;
702. a computer device;
704. a processing device;
706. a storage resource;
708. a drive mechanism;
710. an input/output module;
712. an input device;
714. an output device;
716. a presentation device;
718. a graphical user interface;
720. a network interface;
722. a communication link;
724. a communication bus.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments herein without making any creative effort, shall fall within the scope of protection.
It should be noted that the terms "first," "second," and the like in the description and claims herein and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments herein described are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or device.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than here.
Fig. 1 is a schematic diagram of an implementation system of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure. The implementation system may include a plurality of first blockchains 101 and a relay chain 102, where the first blockchains 101 and the relay chain 102 communicate with each other through a network. The network may include a Local Area Network (LAN), a Wide Area Network (WAN), the internet, or a combination thereof, and connects websites, user devices (e.g., computing devices), and backend systems. The plurality of first blockchains 101 may respectively correspond to a court chain, a procuratorate chain, and a judicial administration chain, and each first blockchain 101 constructs its own model from its own stored data. The relay chain 102 is responsible for the collaborative training among the plurality of first blockchains 101. A first blockchain 101 may construct the target model through one or more servers, on which a data processing system oriented to judicial data is deployed. Alternatively, the servers may be nodes of a cloud computing system (not shown), or each server may be a separate cloud computing system comprising multiple computers interconnected by a network and operating as a distributed processing system. Each server may run any suitable computing system that enables it to act as a node in the blockchain network of the first blockchain 101.
In addition, it should be noted that fig. 1 is only one application environment provided by the present disclosure, and in practical applications, other application environments may also be included, for example, training a collaborative model for constructing multiple systems (e.g., a traffic management system, a vehicle management system, and a traffic police management system) may also be implemented on the multiple first block chains 101 and the relay chains 102 shown in fig. 1, and the number of the first block chains for training the collaborative model may also be adjusted according to the specific number of the systems, which is not limited in this specification.
Specifically, embodiments herein provide a model training method based on blockchain cross-chain privacy protection, which may be performed by any one of the first blockchains used to construct the collaborative model: the model parameters trained by the plurality of first blockchains are transmitted and updated through a blockchain cross-chain system, and the target model of each first blockchain is constructed while the privacy and security of the data in each first blockchain is ensured. Fig. 2 is a flowchart of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure. The figure describes the process of training a model based on a blockchain, but the method may include more or fewer steps based on conventional or non-creative effort. The order of steps recited in the embodiments is merely one of many possible orders of execution and does not represent the only order of performance; an actual system or device product may execute the method in the order of the embodiments or drawings, or in parallel. Specifically, as shown in fig. 2, the method may include:
step 201: training a local model according to a first data set of the constructed model and randomly generated initial parameters of the local model to obtain parameters of the local model, and calculating a loss value of the local model;
step 202: adding disturbance into the local model parameters according to the set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain the local model privacy parameters;
step 203: sending the local model privacy parameters and the loss values to a relay chain;
in this step, after sending the local model privacy parameters and the loss values to a relay chain, the relay chain aggregates the received multiple local model privacy parameters sent by the multiple first blockchains under the condition that the result of consensus verification on the local model privacy parameters and the loss values is passed, so as to obtain global model privacy parameters, calculates global loss values according to the received loss values sent by the multiple first blockchains, judges whether the global loss values are converged, if yes, informs the multiple first blockchains to stop training, takes the local model of any one first blockchain in the multiple first blockchains as a target model, and if not, sends the global model privacy parameters to the multiple first blockchains;
step 204: and under the condition that the received consensus verification result of the global model privacy parameters is passed, updating the local model initial parameters according to the global model privacy parameters, and repeatedly executing the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Correspondingly, the embodiment herein also provides a model training method based on block chain cross-chain privacy protection, which is performed by a relay chain, as shown in fig. 3, including,
step 301: receiving a plurality of local model privacy parameters and a plurality of loss values transmitted by a plurality of first blockchains;
In this step, the local model privacy parameters are obtained by the first blockchain after adding a disturbance to the local model parameters according to the set differential privacy parameter and a random algorithm; the local model parameters are obtained by the first blockchain training the local model according to the first data set for constructing the model and randomly generated local model initial parameters; and the loss value is calculated by the first blockchain on the local model;
step 302: under the condition that the result of the consensus verification of the local model privacy parameters and the loss values is passed, aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters;
step 303: calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
step 304: judging whether the global loss value is converged, if so, notifying a plurality of first block chains to stop training, and taking a local model of any one of the first block chains as a target model;
step 305: if not, the global model privacy parameters are sent to the first block chains, so that the first block chains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
According to the method, each first blockchain trains a local model with its own first data set and randomly generated local model initial parameters to obtain local model parameters and calculate the loss value of the local model. Each first blockchain then adds perturbation to its local model parameters to obtain local model privacy parameters, and sends the local model privacy parameters and the loss value to the relay chain. The relay chain aggregates the received local model privacy parameters sent by the first blockchains to obtain a global privacy parameter; compared with a method in which each first blockchain performs the parameter aggregation itself, this reduces the computation load of model training to a certain extent. The relay chain then calculates a global loss value from the loss values of the first blockchains and determines whether the global loss value has converged. If it has not converged, the global privacy parameter is sent back to the first blockchains for further training, so that the first blockchains train a collaborative model without exporting source data until the global loss value converges. At that point, the local model trained by each first blockchain can meet the collaborative requirements of all the first blockchains, and the local model of any first blockchain is taken as the final target model, which can meet the service requirements of the institutions (such as the three cooperating judicial-system parties) corresponding to the first blockchains. The method thus ensures both the stability of model training and the privacy and security of the three judicial-system parties' data, and solves the prior-art problems of data leakage among the three parties' blockchains and the large computation load of model training.
In this embodiment, the data in the first data set is stored only on the first blockchain, and when the target model of each first blockchain is constructed, each blockchain acquires the respective stored data to construct the first data set and train the local model.
In this embodiment, when the first blockchain trains the local model according to the first data set and the randomly generated local model initial parameters, it obtains the local model parameters and calculates the loss value of the local model. To prevent the local model parameters and the loss value from being tampered with, the chain nodes of the first blockchain perform consensus verification on them and generate and store the corresponding blocks, so that the security of the local model parameters and the loss value is ensured by the consensus verification technology of the blockchain. Then, to prevent an attacker from inferring the source data of the first blockchain from the local model parameters, the first blockchain adds perturbation to the local model parameters according to the set differential privacy parameter and a random algorithm to obtain the local model privacy parameters. The local model parameters correspond to a plurality of neurons, and the values of designated neurons can be changed to obtain the local model privacy parameters.
The first blockchain then sends the local model privacy parameters and the loss value to the relay chain. To prevent the received local model privacy parameters and loss value from being tampered with, the chain nodes of the relay chain perform consensus verification on them, and generate and store a block. The relay chain then aggregates the received local model privacy parameters sent by the plurality of first blockchains to obtain the global model privacy parameters, which incorporate the local model privacy parameters obtained after training the respective local models of the first blockchains. The global model privacy parameters are sent to the first blockchains, each of which updates its local model parameters accordingly and then trains its local model with the updated parameters and its local first data set. In this way, each first blockchain can train a target model that meets both its own service requirements and those of the other first blockchains, without the source data of any other first blockchain being exported.
In the collaborative training process, after each round of local model training is completed, each first blockchain calculates a loss value of its local model; however, this loss value only indicates whether the local model of that first blockchain has converged. Because the first blockchains are trained collaboratively, each first blockchain must also add perturbation to the newly trained local model parameters to obtain the new round's local model privacy parameters, and send the new round's local model privacy parameters and loss value to the relay chain. The relay chain again aggregates the received local model privacy parameters of the first blockchains to obtain the global model privacy parameters, and computes a global loss value from the loss values of the first blockchains. It can be understood that, since the global loss value is calculated from the loss values of the first blockchains, determining whether the global loss value has converged is equivalent to determining whether the collaboratively trained model on each first blockchain has converged. If so, the model training of each first blockchain is stopped, and the trained local model of any first blockchain may be used as the target model.
According to one embodiment herein, the loss value of the local model is calculated according to formula (1):

L_i(ω) = (1/|D_i|) × Σ_{j=1}^{|D_i|} l_j(ω, D_i)    (1)

wherein L_i(ω) denotes the loss value of the i-th first blockchain, ω denotes the local model parameters, D_i denotes the first data set of the i-th first blockchain, |D_i| denotes the number of data samples in the first data set, and l_j(ω, D_i) denotes the loss value of the model parameters ω on the j-th data sample (x, y).
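As a minimal sketch of formula (1): the patent leaves the per-sample loss l_j abstract, so the squared error of a linear model is used below purely as an illustrative assumption.

```python
import numpy as np

def local_loss(w, X, y):
    """Formula (1): L_i(w) = (1/|D_i|) * sum_j l_j(w, D_i).
    l_j is taken here as the squared error of a linear model on the
    j-th sample (x, y); any differentiable per-sample loss would do."""
    per_sample = (X @ w - y) ** 2      # l_j(w, D_i) for each sample
    return float(per_sample.mean())    # average over the |D_i| samples
```

Each first blockchain would evaluate this on its own first data set (X, y), which never leaves the chain.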
In this embodiment, although adding the disturbance to the local model parameter can avoid the source data from being leaked, if the added disturbance is too large, the accuracy of the obtained target model is also reduced, and therefore, the added disturbance needs to be controlled within a reasonable range, which can ensure that the source data cannot be leaked, and can also ensure the accuracy of the target model to a certain extent. In view of the above, according to one embodiment herein, adding perturbation to the local model parameters according to the set differential privacy parameters and a random algorithm, obtaining the local model privacy parameters further comprises,
according to formula (2):

Pr{A(D) = O} ≤ e^ε × Pr{A(D') = O}    (2)

the random algorithm A satisfies the above constraint, wherein A represents the random algorithm, D represents a local model parameter data set, D' represents any local model parameter data set adjacent to D, i.e., D differs from D' in only one record, O represents the perturbed output of the random algorithm A on an input local model parameter data set, Pr{A(D) = O} denotes the probability that the random algorithm A outputs O after adding perturbation to the input local model parameter data set, and ε denotes the set differential privacy parameter. If the formula is not satisfied, the random algorithm A is adjusted or replaced until it satisfies the formula; if the random algorithm A satisfies the formula, the random algorithm A is used to add perturbation to the local model parameter data set to obtain the local model privacy parameters.
In the present example, the above formula (2) indicates that, for any two adjacent local model parameter data sets, the probabilities with which the random algorithm A outputs the same result should be close. The size of the differential privacy parameter affects the accuracy of the final target model; optionally, the differential privacy parameter may be set according to the actual accuracy requirement. It can be understood that differential privacy consists in designing a random algorithm A such that a data set D and any adjacent data set D' satisfy Pr{A(D) = O} ≤ e^ε × Pr{A(D') = O}; the output of the random algorithm on the data set D (namely the obtained local model privacy parameters) then guarantees the privacy of the source data while keeping the accuracy of the finally obtained target model within a set range.
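A standard random algorithm A that satisfies the ε-differential-privacy constraint of formula (2) is the Laplace mechanism. The sketch below is an assumption for illustration, not the patent's specific algorithm: it adds Laplace noise with scale sensitivity/ε to each local model parameter.

```python
import numpy as np

def laplace_perturb(params, sensitivity, epsilon, rng=None):
    """Add Laplace(0, sensitivity/epsilon) noise to each parameter.
    For adjacent data sets whose parameter vectors differ by at most
    `sensitivity` in L1 norm, this mechanism satisfies
    Pr{A(D)=O} <= e^eps * Pr{A(D')=O}."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.laplace(0.0, sensitivity / epsilon, size=np.shape(params))
    return np.asarray(params) + noise
```

A smaller ε gives a larger noise scale and thus stronger privacy but lower accuracy of the target model, which matches the trade-off discussed above.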
According to one embodiment herein, the local model initial parameters are updated according to the global model privacy parameters by formula (3):

ω_k^t = w^{t-1} − r × ∇L_k(ω)    (3)

wherein ω_k^t denotes the updated local model initial parameters of the k-th first blockchain in round t, w^{t-1} denotes the global model privacy parameter received by the k-th first blockchain in round t−1, r denotes the learning rate, ∇ denotes the gradient operator, L_k(w) denotes the loss value of the k-th first blockchain, and ω denotes the local model parameters of round t−1.
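Formula (3) is an ordinary gradient step taken from the previous round's global privacy parameter; a sketch (function and argument names are assumptions):

```python
import numpy as np

def update_initial_params(global_params, grad_local_loss, lr):
    """Formula (3): w_k^t = w^{t-1} - r * grad L_k, where `global_params`
    is the round t-1 global model privacy parameter and `grad_local_loss`
    evaluates the gradient of the k-th chain's local loss."""
    return np.asarray(global_params) - lr * grad_local_loss(global_params)
```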
According to one embodiment herein, the global loss value is calculated from the received loss values sent by the plurality of first blockchains by formula (4):

L* = Σ_i (|D_i| / |D|) × L_i(ω)    (4)

wherein L* denotes the global loss value, |D| denotes the total number of data samples of the plurality of first blockchains, L_i(ω) denotes the loss value of the i-th first blockchain, and ω denotes the local model parameters of the i-th first blockchain.
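Formula (4) is a sample-count-weighted average of the per-chain losses; a sketch (assuming each chain reports its sample count |D_i| alongside its loss):

```python
def global_loss(local_losses, sample_counts):
    """Formula (4): L* = sum_i (|D_i| / |D|) * L_i(w), with
    |D| = sum_i |D_i| the total number of samples across all chains."""
    total = sum(sample_counts)
    return sum(n * loss for n, loss in zip(sample_counts, local_losses)) / total
```

Weighting by sample count keeps a chain with very little data from dominating the convergence decision.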
According to one embodiment herein, the received local model privacy parameters sent by the plurality of first blockchains are aggregated into the global model privacy parameters according to formula (5):

w_t = (1/|K|) × Σ_{k=1}^{|K|} ω_k^t    (5)

wherein w_t denotes the global model privacy parameters of round t, |K| denotes the total number of first blockchains, and ω_k^t denotes the local model privacy parameters of the k-th first blockchain in round t.
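The aggregation in formula (5) is an elementwise mean over the |K| perturbed parameter vectors; as a sketch:

```python
import numpy as np

def aggregate_privacy_params(local_privacy_params):
    """Formula (5): w_t = (1/|K|) * sum_k w_k^t - the elementwise mean
    of the perturbed local parameter vectors received by the relay chain."""
    return np.mean(np.stack(local_privacy_params), axis=0)
```

Because only perturbed parameters reach the relay chain, the aggregation never touches any chain's raw model parameters or source data.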
In this embodiment, the collaborative model is trained by the first blockchains concurrently: in the iterative training process, after each first blockchain receives the global model privacy parameters sent by the relay chain, it performs the next iteration of local model training at the same time as the others. However, collaborative training requires cross-chain data transmission. If, after a certain first blockchain finishes its n-th round of training, the relay chain fails to receive that chain's local model privacy parameters and loss value in time (for example, due to a network failure), the relay chain cannot aggregate the global model privacy parameters, while the first blockchains that have completed their n-th round of training keep waiting for the relay chain to send the global model privacy parameters. This seriously affects the efficiency of collaborative training and occupies the resources of the first blockchains that have completed the n-th round of training. In view of the above, according to one embodiment herein, as shown in fig. 4, receiving the plurality of local model privacy parameters and loss values sent by the plurality of first blockchains further comprises,
step 401: receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains according to a set time threshold;
step 402: if the local model privacy parameters and the loss values sent by one or more first blockchains are not received within the time range corresponding to the time threshold, continuing to perform a step of aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters, and notifying one or more first blockchains corresponding to the local model privacy parameters and the loss values which are not received to train the constructed target model according to the first data set of the one or more first blockchains, so as to update the target model.
In this embodiment, a time threshold may be set according to actual service needs. If, within the time range corresponding to the set time threshold, the local model privacy parameters and loss value sent by one or more first blockchains are not received, the relay chain does not continue to wait for them; instead, it aggregates the local model privacy parameters it has received to obtain the global model privacy parameters and performs the subsequent steps. That is, in the subsequent collaborative training process, the data of the one or more first blockchains whose local model privacy parameters and loss values were not received are no longer included, which avoids occupying the resources of the other first blockchains.
And after the training is finished, notifying one or more first block chains corresponding to the local model privacy parameters and the loss values which are not received to train the constructed target model according to the first data set of the first block chains, so that the target model is updated. The updated target model can meet the service requirement of each first block chain.
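The timeout rule of steps 401 and 402 can be sketched as a simple partition of the submissions received before the deadline; all identifiers below are assumptions for illustration.

```python
def split_by_deadline(submissions, expected_chains):
    """`submissions` maps a first-blockchain id to the (privacy_params,
    loss) pair that arrived before the time threshold expired. The relay
    chain aggregates only what was received; the stragglers are returned
    so that, after training finishes, they can be notified to fine-tune
    the constructed target model on their own first data sets."""
    received = {cid: submissions[cid] for cid in expected_chains
                if cid in submissions}
    stragglers = [cid for cid in expected_chains if cid not in submissions]
    return received, stragglers
```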
Based on the same inventive concept, the embodiment herein further provides a model training apparatus based on block chain cross-chain privacy protection, as shown in fig. 5, including,
a local model training unit 501, configured to train a local model according to a first data set of a constructed model and randomly generated local model initial parameters, to obtain local model parameters, and calculate a loss value of the local model;
a local model parameter privacy unit 502, configured to add a disturbance to the local model parameter according to a set differential privacy parameter and a random algorithm to obtain a local model privacy parameter when a consensus verification result of the local model parameter and the loss value passes;
a local model privacy parameter sending unit 503, configured to send the local model privacy parameter and the loss value to a relay chain, so that the relay chain aggregates the received multiple local model privacy parameters sent by the multiple first blockchains when a result of performing consensus verification on the local model privacy parameter and the loss value is passed, to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the multiple first blockchains, determines whether the global loss value is converged, if yes, notifies the multiple first blockchains to stop training, takes a local model of any one of the multiple first blockchains as a target model, and if not, sends the global model privacy parameter to the multiple first blockchains;
an iterative training unit 504, configured to, when a result of consensus verification on the received global model privacy parameters is that the result passes, update the local model initial parameters according to the global model privacy parameters, and repeatedly perform a step of training the local model according to the constructed first data set and the updated local model initial parameters.
Correspondingly, the embodiment herein further provides a model training device based on block chain cross-chain privacy protection, as shown in fig. 6, including,
a local model privacy parameter receiving unit 601, configured to receive a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, where the local model privacy parameters are obtained by the first blockchain after adding perturbation to the local model parameters according to a set differential privacy parameter and a random algorithm, the local model parameters are obtained by the first blockchain training a local model according to a first data set of a constructed model and randomly generated local model initial parameters, and the loss values are calculated by the first blockchain for the local model;
a local model privacy parameter aggregation unit 602, configured to, in a case that a result of performing consensus verification on the local model privacy parameters and the loss values is that the result passes, aggregate the multiple received local model privacy parameters sent by the multiple first blockchains, so as to obtain global model privacy parameters;
a global loss value calculation unit 603, configured to calculate a global loss value according to the received loss values sent by the plurality of first blockchains;
an iterative training unit 604, configured to determine whether the global loss value converges, and if so, notify the first blockchains to stop training, where a local model of any one of the first blockchains is used as a target model; if not, the global model privacy parameters are sent to the first block chains, so that the first block chains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
The beneficial effects obtained by the above device or system are consistent with those obtained by the above method, and the embodiments of this specification are not described in detail.
Fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present invention, where an apparatus in the present invention may be the computer device in the embodiment, and execute the method of the present invention. Computer device 702 may include one or more processing devices 704, such as one or more Central Processing Units (CPUs), each of which may implement one or more hardware threads. The computer device 702 may also include any storage resources 706 for storing any kind of information, such as code, settings, data, etc. For example, and without limitation, the storage resources 706 may include any one or more of the following in combination: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, etc. More generally, any storage resource may use any technology to store information. Further, any storage resource may provide volatile or non-volatile reservation of information. Further, any storage resources may represent fixed or removable components of computer device 702. In one case, when the processing device 704 executes associated instructions that are stored in any storage resource or combination of storage resources, the computer device 702 can perform any of the operations of the associated instructions. The computer device 702 also includes one or more drive mechanisms 708, such as a hard disk drive mechanism, an optical disk drive mechanism, or the like, for interacting with any storage resource.
Communication link 722 may be implemented in any manner, such as over a local area network, a wide area network (e.g., the Internet), a point-to-point connection, etc., or any combination thereof. Communication link 722 may include any combination of hardwired links, wireless links, routers, gateway functions, name servers, etc., as dictated by any protocol or combination of protocols.
It should be noted that, when the computer device 702 described in this embodiment serves as a node on the first blockchain or the relay chain and implements the method described in the embodiments of the present disclosure, it may not include the presentation device 716 and the associated graphical user interface (GUI) 718; for example, it may be a minimal computer system comprising only the processing device 704, the storage resources 706, and the network interface 720.
An embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the above method.
An embodiment of the present invention further provides a computer program product, where the computer program product includes a computer program, and when the computer program is executed by a processor, the computer program implements the method described above.
It should be understood that, in various embodiments herein, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments herein.
It should also be understood that, in the embodiments herein, the term "and/or" is only one kind of association relation describing an associated object, meaning that three kinds of relations may exist. For example, a and/or B, may represent: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two. The components and steps of the examples have been described above in general functional terms to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided herein, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purposes of the embodiments herein.
In addition, functional units in the embodiments herein may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present invention may be implemented in a form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The principles and embodiments of this document are explained herein using specific examples, which are presented only to aid in understanding the methods and their core concepts; meanwhile, for a person skilled in the art, according to the idea of the present disclosure, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present disclosure should not be construed as a limitation to the present disclosure.
Claims (10)
1. A method for model training based on blockchain cross-chain privacy protection, performed by a first blockchain, the method comprising,
training a local model according to a first data set of the constructed model and randomly generated initial parameters of the local model to obtain parameters of the local model, and calculating a loss value of the local model;
adding disturbance into the local model parameters according to the set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain the local model privacy parameters;
sending the local model privacy parameters and the loss values to a relay chain so that the relay chain aggregates a plurality of received local model privacy parameters sent by a plurality of first blockchains under the condition that the result of consensus verification on the local model privacy parameters and the loss values is passed to obtain global model privacy parameters, calculating global loss values according to the received loss values sent by the plurality of first blockchains, judging whether the global loss values are converged, if yes, notifying the plurality of first blockchains to stop training, taking a local model of any one first blockchain in the plurality of first blockchains as a target model, and if not, sending the global model privacy parameters to the plurality of first blockchains;
and under the condition that the received result of the consensus verification of the global model privacy parameters is passed, updating the local model initial parameters according to the global model privacy parameters, and repeatedly executing the step of training the local model according to the constructed first data set and the updated local model initial parameters.
2. The method of claim 1, wherein the formula for calculating the loss value of the local model is:
L_i(ω) = (1/|D_i|) × Σ_{j=1}^{|D_i|} l_j(ω, D_i)

wherein L_i(ω) denotes the loss value of the i-th first blockchain, ω denotes the local model parameters, D_i denotes the first data set of the i-th first blockchain, |D_i| denotes the number of data samples in the first data set, and l_j(ω, D_i) denotes the loss value of the local model parameters ω on the j-th data sample (x, y).
3. The method of claim 1, wherein adding perturbation to the local model parameters according to the set differential privacy parameters and a stochastic algorithm to obtain local model privacy parameters further comprises,
according to the formula:

Pr{A(D) = O} ≤ e^ε × Pr{A(D') = O}

the random algorithm A satisfies the above constraint, wherein A represents the random algorithm, D represents a local model parameter data set, D' represents any local model parameter data set adjacent to D, i.e., D differs from D' in only one record, O represents the perturbed output of the random algorithm A on an input local model parameter data set, Pr{A(D) = O} denotes the probability that the random algorithm A outputs O after adding perturbation to the input local model parameter data set, and ε denotes the set differential privacy parameter; if the formula is not satisfied, the random algorithm A is adjusted or replaced until it satisfies the formula; if the random algorithm A satisfies the formula, the random algorithm A is used to add perturbation to the local model parameter data set to obtain the local model privacy parameters.
4. The method of claim 1, wherein the local model initial parameters are updated according to the global model privacy parameters by the formula:

ω_k^t = w^{t-1} − r × ∇L_k(ω)

wherein ω_k^t denotes the updated local model initial parameters of the k-th first blockchain in round t, w^{t-1} denotes the global model privacy parameter received by the k-th first blockchain in round t−1, r denotes the learning rate, ∇ denotes the gradient operator, L_k(w) denotes the loss value of the k-th first blockchain, and ω denotes the local model parameters of round t−1.
5. A model training device based on block chain cross-chain privacy protection is characterized by comprising,
the local model training unit is used for training a local model according to a first data set for constructing the model and randomly generated local model initial parameters to obtain local model parameters and calculating a loss value of the local model;
the local model parameter privacy unit is used for adding disturbance into the local model parameters according to set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain the local model privacy parameters;
a local model privacy parameter sending unit, configured to send the local model privacy parameter and the loss value to a relay chain, so that the relay chain aggregates the received multiple local model privacy parameters sent by the multiple first blockchains when a result of performing consensus verification on the local model privacy parameter and the loss value passes, to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the multiple first blockchains, determines whether the global loss value converges, notifies the multiple first blockchains to stop training if the global loss value converges, takes a local model of any one of the multiple first blockchains as a target model, and sends the global model privacy parameter to the multiple first blockchains if the global loss value does not converge;
and an iterative training unit, configured to update the local model initial parameters according to the global model privacy parameters when consensus verification of the received global model privacy parameters passes, and to repeatedly perform the step of training the local model according to the constructed first data set and the updated local model initial parameters.
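Claim 5's privacy unit adds a disturbance governed by "set differential privacy parameters and a random algorithm" without naming the mechanism. The sketch below assumes the Laplace mechanism, whose noise scale is sensitivity/ε; this is one common choice, not necessarily the patent's:

```python
import numpy as np

def perturb_params(params, epsilon, sensitivity=1.0, rng=None):
    """Add differential-privacy noise to local model parameters (sketch).
    The Laplace mechanism is assumed; smaller epsilon means stronger
    privacy and a larger disturbance."""
    rng = np.random.default_rng() if rng is None else rng
    params = np.asarray(params, dtype=float)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=params.shape)
    return params + noise

# Perturb a zero vector so the output is exactly the injected noise.
priv = perturb_params(np.zeros(1000), epsilon=10.0, rng=np.random.default_rng(0))
```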
6. A model training method based on blockchain cross-chain privacy protection, characterized in that the method is executed by a relay chain and comprises the following steps:
receiving a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, wherein the local model privacy parameters are obtained by the first blockchains by adding a disturbance to their local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchains by training local models according to a first data set constructed for the model and randomly generated local model initial parameters, and the loss values are calculated by the first blockchains from the local models;
when consensus verification of the local model privacy parameters and the loss values passes, aggregating the received plurality of local model privacy parameters sent by the plurality of first blockchains to obtain global model privacy parameters;
calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
judging whether the global loss value converges; if so, notifying the plurality of first blockchains to stop training, and taking the local model of any one of the first blockchains as a target model;
if not, sending the global model privacy parameters to the first blockchains, so that the first blockchains update the local model initial parameters according to the global model privacy parameters and repeatedly perform the step of training the local model according to the constructed first data set and the updated local model initial parameters.
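Claims 6 through 8 together describe one relay-chain round: aggregate the chains' privacy parameters, compute a global loss, and test convergence. A sketch assuming uniform (FedAvg-style) averaging, since the patent's exact formulas do not appear in this text:

```python
import numpy as np

def relay_round(local_priv_params, local_losses, prev_global_loss=None, tol=1e-3):
    """One relay-chain round (sketch of claims 6-8): average the chains'
    local model privacy parameters into global model privacy parameters,
    average their loss values into a global loss value, and report whether
    the global loss has converged (changed by less than tol)."""
    params = np.mean(np.asarray(local_priv_params, dtype=float), axis=0)
    global_loss = float(np.mean(local_losses))
    converged = (prev_global_loss is not None
                 and abs(prev_global_loss - global_loss) < tol)
    return params, global_loss, converged

# Two chains submit parameters and losses; first round has no previous loss.
g_params, g_loss, done = relay_round([[1.0, 1.0], [3.0, 3.0]], [2.0, 4.0])
```

On convergence the relay chain would notify the first blockchains to stop and take any one chain's local model as the target model; otherwise the aggregated parameters are broadcast back for the next round.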
7. The method of claim 6, wherein the global loss value is calculated from the loss values received from the plurality of first blockchains according to the formula,
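The formula itself does not survive in this text. A standard federated-learning global loss, consistent with the $L_k$ notation of claim 4 and weighting each chain by its data-set size $n_k$ (an assumed form, not the patent's verbatim formula), is:

```latex
L(w) = \sum_{k=1}^{K} \frac{n_k}{n}\, L_k(w), \qquad n = \sum_{k=1}^{K} n_k
```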
8. The method of claim 6, wherein the received plurality of local model privacy parameters sent by the plurality of first blockchains are aggregated to obtain the global model privacy parameter according to the formula,
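As with claim 7, the aggregation formula is absent from this text. An assumed FedAvg-style weighted average of the $K$ chains' local model privacy parameters $\widetilde{w}_k^{t}$ into the global model privacy parameter would be:

```latex
\widetilde{w}^{t} = \sum_{k=1}^{K} \frac{n_k}{n}\, \widetilde{w}_k^{t}
```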
9. The method of claim 6, wherein receiving the plurality of local model privacy parameters and loss values sent by the plurality of first blockchains further comprises:
receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains according to a set time threshold;
if the local model privacy parameters and loss values sent by one or more first blockchains are not received within the time range corresponding to the time threshold, continuing to perform the step of aggregating the received plurality of local model privacy parameters sent by the plurality of first blockchains to obtain the global model privacy parameters, and notifying the one or more first blockchains whose local model privacy parameters and loss values were not received to train the constructed target model according to their own first data sets, so as to update the target model.
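The time-threshold collection of claim 9 can be sketched as follows; `receive_fn` is a hypothetical non-blocking poll and the chain identifiers are illustrative:

```python
import time

def collect_with_deadline(receive_fn, expected_chains, deadline_s):
    """Claim-9 sketch: gather (chain_id, params, loss) submissions until the
    time threshold expires or every chain has reported. Chains that miss the
    deadline are returned separately so the relay chain can aggregate without
    them and notify them to keep training the target model on their own
    first data sets."""
    received = {}
    start = time.monotonic()
    while (time.monotonic() - start < deadline_s
           and len(received) < len(expected_chains)):
        msg = receive_fn()  # hypothetical non-blocking poll; None if nothing queued
        if msg is not None:
            chain_id, params, loss = msg
            received[chain_id] = (params, loss)
    stragglers = [c for c in expected_chains if c not in received]
    return received, stragglers

# Chains A and B report in time; C misses the 50 ms window and becomes a straggler.
msgs = iter([("A", [1.0], 0.1), ("B", [2.0], 0.2)])
received, stragglers = collect_with_deadline(lambda: next(msgs, None),
                                             ["A", "B", "C"], 0.05)
```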
10. A model training device based on blockchain cross-chain privacy protection, characterized by comprising:
a local model privacy parameter receiving unit, configured to receive a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, where the local model privacy parameters are obtained by the first blockchains after adding disturbances to the local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchains training local models according to a first data set of a constructed model and randomly generated local model initial parameters, and the loss values are obtained by the first blockchains calculating the local models;
a local model privacy parameter aggregation unit, configured to aggregate the received plurality of local model privacy parameters sent by the plurality of first blockchains to obtain global model privacy parameters when consensus verification of the local model privacy parameters and the loss values passes;
a global loss value calculation unit, configured to calculate a global loss value according to the received loss values sent by the plurality of first blockchains;
an iterative training unit, configured to judge whether the global loss value converges; if so, to notify the plurality of first blockchains to stop training and take the local model of any one of the plurality of first blockchains as a target model; if not, to send the global model privacy parameters to the first blockchains, so that the first blockchains update the local model initial parameters according to the global model privacy parameters and repeatedly perform the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211238647.9A CN115329385B (en) | 2022-10-11 | 2022-10-11 | Model training method and device based on block chain cross-chain privacy protection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115329385A CN115329385A (en) | 2022-11-11 |
CN115329385B true CN115329385B (en) | 2022-12-16 |
Family
ID=83914315
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211238647.9A Expired - Fee Related CN115329385B (en) | 2022-10-11 | 2022-10-11 | Model training method and device based on block chain cross-chain privacy protection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115329385B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113536382A (en) * | 2021-08-09 | 2021-10-22 | 北京理工大学 | Block chain-based medical data sharing privacy protection method by using federal learning |
CN113992360A (en) * | 2021-10-01 | 2022-01-28 | 浙商银行股份有限公司 | Block chain cross-chain-based federated learning method and equipment |
CN114398538A (en) * | 2021-12-08 | 2022-04-26 | 西安电子科技大学 | Cross-domain recommendation method and system for privacy protection, storage medium and computer equipment |
CN114528392A (en) * | 2022-04-24 | 2022-05-24 | 北京理工大学 | Block chain-based collaborative question-answering model construction method, device and equipment |
CN114861211A (en) * | 2022-06-06 | 2022-08-05 | 广东工业大学 | Meta-universe scene-oriented data privacy protection method, system and storage medium |
CN115037477A (en) * | 2022-05-30 | 2022-09-09 | 南通大学 | Block chain-based federated learning privacy protection method |
Non-Patent Citations (3)
Title |
---|
A study of a federated learning framework based on the interstellar file system and blockchain: Private Blockchain Federated Learning; Peng Zhang et al.; 2022 3rd International Conference on Computer Vision, Image and Deep Learning & International Conference on Computer Engineering and Applications (CVIDL & ICCEA); 20220522; full text * |
The Algorithmic Foundations of Differential Privacy; Cynthia Dwork et al.; Foundations and Trends in Theoretical Computer Science; 20140831; full text * |
Research on Key Technologies of Federated Learning Based on Blockchain; Jin Ming; China Master's Theses Full-text Database, Information Science and Technology; 20220615; full text * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110084377B (en) | Method and device for constructing decision tree | |
US10963817B2 (en) | Training tree-based machine-learning modeling algorithms for predicting outputs and generating explanatory data | |
US20180048669A1 (en) | Comprehensive risk assessment in a heterogeneous dynamic network | |
CN109947740B (en) | Performance optimization method and device of block chain system | |
CN110866546B (en) | Method and device for evaluating consensus node | |
WO2022237194A1 (en) | Abnormality detection method and apparatus for accounts in federal learning system, and electronic device | |
CN113469373A (en) | Model training method, system, equipment and storage medium based on federal learning | |
US11948077B2 (en) | Network fabric analysis | |
CN111768305B (en) | Backwash money identification method and backwash money identification device | |
US11386507B2 (en) | Tensor-based predictions from analysis of time-varying graphs | |
CN104255011B (en) | Cloud computing secure data stores | |
US20230208882A1 (en) | Policy - aware vulnerability mapping and attack planning | |
US10282461B2 (en) | Structure-based entity analysis | |
KR20230031889A (en) | Anomaly detection in network topology | |
Shitharth et al. | Federated learning optimization: A computational blockchain process with offloading analysis to enhance security | |
CN113609345A (en) | Target object association method and device, computing equipment and storage medium | |
CN113988221A (en) | Insurance user classification model establishing method, classification method, device and equipment | |
US20160342899A1 (en) | Collaborative filtering in directed graph | |
Bhattacharya et al. | Normal approximation and fourth moment theorems for monochromatic triangles | |
CN115329385B (en) | Model training method and device based on block chain cross-chain privacy protection | |
CN107800640A (en) | A kind of method for detection and the processing for flowing rule | |
CN112434323A (en) | Model parameter obtaining method and device, computer equipment and storage medium | |
CN114528392B (en) | Block chain-based collaborative question-answering model construction method, device and equipment | |
Vega et al. | Sharing hardware resources in heterogeneous computer-supported collaboration scenarios | |
CN111815442B (en) | Link prediction method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20221216 |