CN115329385A - Model training method and device based on blockchain cross-chain privacy protection - Google Patents


Info

Publication number
CN115329385A
Authority
CN
China
Prior art keywords
local model
parameters
model
privacy
local
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211238647.9A
Other languages
Chinese (zh)
Other versions
CN115329385B (en)
Inventor
袁展译
孙福辉
成雨蓉
王晓燕
张志威
袁野
王国仁
Current Assignee
People's Court Information Technology Service Center
Beijing Institute of Technology BIT
Original Assignee
People's Court Information Technology Service Center
Beijing Institute of Technology BIT
Priority date
Filing date
Publication date
Application filed by People's Court Information Technology Service Center and Beijing Institute of Technology BIT
Priority to CN202211238647.9A
Publication of CN115329385A
Application granted
Publication of CN115329385B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present disclosure relates to the field of blockchain technology, and in particular to a method and an apparatus for model training based on blockchain cross-chain privacy protection. In the method, each first blockchain trains a local model to obtain local model parameters and a loss value, adds a perturbation to the local model parameters on the first blockchain to obtain local model privacy parameters, and sends the local model privacy parameters and the loss value to a relay chain; the relay chain aggregates the received local model privacy parameters to update the global model parameters, calculates a global loss value from the loss value of each first blockchain, and iterates training according to whether the global loss value has converged. The method preserves the robustness of model training while protecting the privacy of the data held by the three parties (court, procuratorate, and judicial administration), addressing the prior-art problems that the data of the three parties' blockchains risk leakage and that model training is computationally expensive.

Description

Model training method and device based on blockchain cross-chain privacy protection
Technical Field
The present disclosure relates to the field of blockchain technology, and in particular to a method and an apparatus for model training based on blockchain cross-chain privacy protection.
Background
In the judicial setting, an important part of the work of legal professionals is to provide reliable, high-quality legal advisory services to the public. However, because legal professionals are in short supply, ensuring that non-professionals can obtain adequate, high-quality advisory services is a matter of intense concern in the art.
At present, a judicial question-answering system can address this problem. The blockchain-based collaborative question-answering model construction method disclosed in prior patent CN114528392B trains a collaborative question-answering model for the three parties (court, procuratorate, and judicial administration) through a relay chain without exporting the data of any of the three parties' blockchains. However, during parameter transmission among the three parties, the parameters are not privacy-protected: if an attacker intercepts the model parameters of the three parties' blockchains, the underlying data can be reverse-engineered from the intercepted parameters, so the data of the three parties' blockchains are at risk of leakage. Moreover, each of the three parties' blockchains aggregates the model parameters separately, so the computational cost of model training is high.
A model training method based on blockchain cross-chain privacy protection is therefore urgently needed, to address the prior-art problems that the data of the three parties' blockchains risk leakage and that model training is computationally expensive.
Disclosure of Invention
In order to solve the problems that the data of the three parties' (court, procuratorate, and judicial administration) blockchains risk leakage and that model training is computationally expensive, embodiments of the present invention provide a model training method and apparatus based on blockchain cross-chain privacy protection, in which the parameter data uploaded to the relay chain are protected by local differential privacy through smart contracts on each blockchain, ensuring the robustness of model training while protecting the privacy of the three parties' data.
In order to solve the above technical problems, the specific technical solutions are as follows:
In one aspect, embodiments herein provide a method for model training based on blockchain cross-chain privacy protection, performed by a first blockchain, comprising:
training a local model according to a first data set of the constructed model and randomly generated local model initial parameters to obtain local model parameters, and calculating a loss value of the local model;
on condition that consensus verification of the local model parameters and the loss value passes, adding a perturbation to the local model parameters according to set differential privacy parameters and a random algorithm to obtain local model privacy parameters;
sending the local model privacy parameters and the loss value to a relay chain, so that the relay chain, on condition that consensus verification of the local model privacy parameters and the loss values passes, aggregates the received local model privacy parameters sent by a plurality of first blockchains to obtain global model privacy parameters, calculates a global loss value according to the received loss values sent by the plurality of first blockchains, and judges whether the global loss value has converged; if so, the relay chain notifies the plurality of first blockchains to stop training and takes the local model of any one of the plurality of first blockchains as the target model; if not, it sends the global model privacy parameters to the plurality of first blockchains;
and, on condition that consensus verification of the received global model privacy parameters passes, updating the local model initial parameters according to the global model privacy parameters, and repeating the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Further, the loss value of the local model is calculated as:

L_i(\omega) = \frac{1}{|D_i|} \sum_{j=1}^{|D_i|} l_j(\omega, D_i)

where L_i(\omega) denotes the loss value of the i-th first blockchain, \omega denotes the local model parameters, D_i denotes the first data set of the i-th first blockchain, |D_i| denotes the number of data samples in the first data set, and l_j(\omega, D_i) denotes the loss of the model parameters \omega on the j-th data sample (x, y).
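As a minimal sketch, the averaged loss above can be computed as follows; the toy linear model and the squared-error per-sample loss l_j are illustrative assumptions, since the patent does not fix a concrete model or loss function:

```python
def local_loss(omega, dataset):
    """L_i(omega) = (1/|D_i|) * sum_j l_j(omega, D_i) over samples (x, y)."""
    total = 0.0
    for x, y in dataset:
        prediction = omega * x          # toy one-parameter linear model (assumption)
        total += (prediction - y) ** 2  # l_j: squared error (assumption)
    return total / len(dataset)
```

For example, `local_loss(2.0, [(1.0, 2.0), (2.0, 4.0)])` evaluates to `0.0`, since the parameter fits both samples exactly.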
Further, adding a perturbation to the local model parameters according to the set differential privacy parameters and the random algorithm to obtain the local model privacy parameters further comprises:

verifying that the random algorithm A satisfies the constraint

\Pr[A(D) = O] \le e^{\varepsilon} \cdot \Pr[A(D') = O]

where A denotes the random algorithm; D denotes the local model parameter data set; D' denotes any local model parameter data set adjacent to D, i.e., differing from D in exactly one record; O denotes the perturbed output of the random algorithm A on an input local model parameter data set; \Pr[A(D) = O] denotes the probability that the random algorithm A outputs O after adding a perturbation to the input local model parameter data set; and \varepsilon denotes the set differential privacy parameter. If the constraint is not satisfied, the random algorithm A is adjusted or replaced until it satisfies the constraint; if the random algorithm A satisfies the constraint, the random algorithm A is used to add a perturbation to the local model parameter data set to obtain the local model privacy parameters.
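One standard random algorithm A satisfying the \varepsilon-differential-privacy constraint above is the Laplace mechanism, which adds noise of scale sensitivity/\varepsilon to each parameter. The sketch below is an illustration only: the patent does not name a specific mechanism, and the sensitivity value and seed here are hypothetical.

```python
import math
import random

def laplace_sample(scale, rng):
    # Inverse-CDF sampling of a Laplace(0, scale) variate
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_parameters(params, epsilon, sensitivity=1.0, seed=42):
    """Return local model privacy parameters: each parameter plus
    Laplace noise of scale sensitivity/epsilon (eps-DP under the
    assumed sensitivity)."""
    rng = random.Random(seed)
    scale = sensitivity / epsilon
    return [w + laplace_sample(scale, rng) for w in params]
```

Smaller \varepsilon gives larger noise and stronger privacy; the fixed seed is only there to make the sketch reproducible.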
Further, the local model initial parameters are updated according to the global model privacy parameters as follows:

\omega_k^{t} = w^{t-1} - r \nabla L_k(\omega)

where \omega_k^{t} denotes the round-t updated local model initial parameters of the k-th first blockchain, w^{t-1} denotes the round-(t-1) global model privacy parameters, r denotes the learning rate, \nabla denotes the gradient operator, L_k(\omega) denotes the loss value of the k-th first blockchain, and \omega denotes the round-(t-1) local model parameters.
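The update rule above is an ordinary element-wise gradient step from the previous round's global parameters, and can be sketched as:

```python
def local_update(global_params, gradient, learning_rate):
    """omega_k^t = w^(t-1) - r * grad L_k(omega), applied element-wise."""
    return [w - learning_rate * g for w, g in zip(global_params, gradient)]
```

For example, `local_update([1.0, 2.0], [0.5, 1.0], 0.1)` yields approximately `[0.95, 1.9]`.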
In another aspect, embodiments herein further provide a model training apparatus based on blockchain cross-chain privacy protection, comprising:
a local model training unit, configured to train a local model according to a first data set of the constructed model and randomly generated local model initial parameters to obtain local model parameters, and to calculate a loss value of the local model;
a local model parameter privacy unit, configured to, on condition that consensus verification of the local model parameters and the loss value passes, add a perturbation to the local model parameters according to set differential privacy parameters and a random algorithm to obtain local model privacy parameters;
a local model privacy parameter sending unit, configured to send the local model privacy parameters and the loss value to a relay chain, so that the relay chain, on condition that consensus verification of the local model privacy parameters and the loss values passes, aggregates the received local model privacy parameters sent by a plurality of first blockchains to obtain global model privacy parameters, calculates a global loss value according to the received loss values sent by the plurality of first blockchains, and judges whether the global loss value has converged; if so, the relay chain notifies the plurality of first blockchains to stop training and takes the local model of any one of the plurality of first blockchains as the target model; if not, it sends the global model privacy parameters to the plurality of first blockchains;
and an iterative training unit, configured to, on condition that consensus verification of the received global model privacy parameters passes, update the local model initial parameters according to the global model privacy parameters, and repeat the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Based on the same inventive concept, embodiments of the present disclosure further provide a model training method based on blockchain cross-chain privacy protection, performed by a relay chain, the method comprising:
receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains, wherein the local model privacy parameters are obtained by each first blockchain by adding a perturbation to its local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchain by training a local model according to a first data set of the constructed model and randomly generated local model initial parameters, and the loss value is calculated by the first blockchain on the local model;
on condition that consensus verification of the local model privacy parameters and the loss values passes, aggregating the received local model privacy parameters sent by the plurality of first blockchains to obtain global model privacy parameters;
calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
judging whether the global loss value has converged, and if so, notifying the plurality of first blockchains to stop training and taking the local model of any one of the plurality of first blockchains as the target model;
if not, sending the global model privacy parameters to the plurality of first blockchains, so that the plurality of first blockchains update the local model initial parameters according to the global model privacy parameters and repeat the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Further, the global loss value is calculated from the received loss values sent by the plurality of first blockchains as:

L^{*} = \sum_{i=1}^{|K|} \frac{|D_i|}{|D|} L_i(\omega)

where L^{*} denotes the global loss value, |D| = \sum_i |D_i| denotes the total number of data samples across the plurality of first blockchains, L_i(\omega) denotes the loss value of the i-th first blockchain, and \omega denotes the local model parameters of the i-th first blockchain.
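A minimal sketch of this sample-count-weighted global loss (the weighting by |D_i|/|D| is the standard federated-learning convention consistent with the symbols defined above):

```python
def global_loss(local_losses, sample_counts):
    """L* = sum_i (|D_i| / |D|) * L_i(omega), with |D| the total
    number of samples across all first blockchains."""
    total = sum(sample_counts)
    return sum((n / total) * loss for n, loss in zip(sample_counts, local_losses))
```

Chains holding more data contribute proportionally more to the global loss.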
Further, the received local model privacy parameters sent by the plurality of first blockchains are aggregated into global model privacy parameters according to the formula:

w^{t} = \frac{1}{|K|} \sum_{k=1}^{|K|} \omega_k^{t}

where w^{t} denotes the round-t global model privacy parameters, |K| denotes the total number of first blockchains, and \omega_k^{t} denotes the round-t local model privacy parameters of the k-th first blockchain.
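The aggregation formula above is a plain element-wise average over the |K| chains, as the following sketch shows:

```python
def aggregate(local_privacy_params):
    """w^t = (1/|K|) * sum_k omega_k^t, element-wise over the K chains.
    Input: one parameter vector per first blockchain."""
    k = len(local_privacy_params)
    return [sum(column) / k for column in zip(*local_privacy_params)]
```

For example, `aggregate([[1.0, 2.0], [3.0, 4.0]])` returns `[2.0, 3.0]`.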
Further, receiving the plurality of local model privacy parameters and loss values sent by the plurality of first blockchains further comprises:
receiving the local model privacy parameters and loss values sent by the plurality of first blockchains within a set time threshold;
if the local model privacy parameters and loss value sent by one or more first blockchains are not received within the time range corresponding to the time threshold, proceeding with the step of aggregating the received local model privacy parameters sent by the remaining first blockchains to obtain the global model privacy parameters, and notifying the one or more first blockchains whose local model privacy parameters and loss values were not received to train the constructed target model on their own first data sets, so as to update the target model.
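A sketch of this timeout behaviour, assuming the relay keeps a map from chain identifiers to the parameter vectors that arrived within the time threshold (the chain names are illustrative):

```python
def aggregate_received(received, expected_chains):
    """Aggregate only the parameter vectors that arrived in time;
    stragglers are reported so they can later fine-tune the finished
    target model on their own first data sets."""
    arrived = [cid for cid in expected_chains if cid in received]
    stragglers = [cid for cid in expected_chains if cid not in received]
    params = [received[cid] for cid in arrived]
    aggregated = [sum(col) / len(params) for col in zip(*params)] if params else []
    return aggregated, stragglers
```

Training thus proceeds with the chains that responded, instead of blocking on the slowest party.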
In another aspect, embodiments herein further provide a model training apparatus based on blockchain cross-chain privacy protection, comprising:
a local model privacy parameter receiving unit, configured to receive a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains, wherein the local model privacy parameters are obtained by each first blockchain by adding a perturbation to its local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchain by training a local model according to a first data set of the constructed model and randomly generated local model initial parameters, and the loss value is calculated by the first blockchain on the local model;
a local model privacy parameter aggregation unit, configured to, on condition that consensus verification of the local model privacy parameters and the loss values passes, aggregate the received local model privacy parameters sent by the plurality of first blockchains to obtain global model privacy parameters;
a global loss value calculation unit, configured to calculate a global loss value according to the received loss values sent by the plurality of first blockchains;
and an iterative training unit, configured to judge whether the global loss value has converged; if so, to notify the plurality of first blockchains to stop training and take the local model of any one of the plurality of first blockchains as the target model; if not, to send the global model privacy parameters to the plurality of first blockchains, so that the plurality of first blockchains update the local model initial parameters according to the global model privacy parameters and repeat the step of training the local model according to the constructed first data set and the updated local model initial parameters.
The plurality of first blockchains described in the embodiments herein may respectively correspond to a court department, a procuratorate department, and a judicial administration department. With the embodiments herein, each first blockchain trains a local model using its own first data set and randomly generated local model initial parameters to obtain local model parameters and calculate the loss value of the local model; each first blockchain then adds a perturbation to its local model parameters to obtain local model privacy parameters, and sends the local model privacy parameters and the loss value to the relay chain. Perturbing the local model parameters strengthens their privacy, so that even if an attacker intercepts the local model privacy parameters, the data of the first data set cannot be reverse-engineered from them, thereby ensuring the privacy security of the first blockchain data. The relay chain then aggregates the received local model privacy parameters sent by the first blockchains into global privacy parameters; compared with each first blockchain performing its own parameter aggregation, this reduces the computational cost of model training to some extent. The relay chain then calculates a global loss value from the loss values of the first blockchains and judges whether the global loss value has converged; if not, it sends the global privacy parameters back to the first blockchains for further training, so that the first blockchains train a collaborative model without exporting their source data until the global loss value converges. At that point the local model trained by each first blockchain satisfies the collaborative requirements of all the first blockchains, and the local model of any first blockchain is taken as the final target model, which can satisfy the business requirements of the institutions (namely, the court, procuratorate, and judicial administration) corresponding to the first blockchains. The embodiments thus ensure the robustness of model training while protecting the privacy and security of the three parties' data, addressing the prior-art problems that the data of the three parties' blockchains risk leakage and that model training is computationally expensive.
Drawings
In order to illustrate the embodiments or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a system for implementing a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 4 illustrates a process of performing collaborative training using a set time threshold according to an embodiment herein;
FIG. 5 is a schematic structural diagram of a model training apparatus based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a model training apparatus based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present disclosure.
[ description of reference ]:
101. a first blockchain;
102. a relay chain;
501. a local model training unit;
502. a local model parameter privacy unit;
503. a local model privacy parameter sending unit;
504. an iterative training unit;
601. a local model privacy parameter receiving unit;
602. a local model privacy parameter aggregation unit;
603. a global loss value calculation unit;
604. an iterative training unit;
702. a computer device;
704. a processing device;
706. a storage resource;
708. a drive mechanism;
710. an input/output module;
712. an input device;
714. an output device;
716. a presentation device;
718. a graphical user interface;
720. a network interface;
722. a communication link;
724. a communication bus.
Detailed Description
The technical solutions in the embodiments herein will be described clearly and completely below with reference to the drawings in the embodiments herein. It is apparent that the described embodiments are only a part of the embodiments herein, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments herein without creative effort shall fall within the scope of protection herein.
It should be noted that the terms "first," "second," and the like in the description and claims herein and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments herein described are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or device.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than here.
Fig. 1 is a schematic diagram of an implementation system of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure. The implementation system may include a plurality of first blockchains 101 and a relay chain 102. The first blockchains 101 and the relay chain 102 communicate with each other via a network, which may include a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, or a combination thereof, and which connects websites, user devices (e.g., computing devices), and backend systems. The plurality of first blockchains 101 may respectively correspond to a court chain, a procuratorate chain, and a judicial administration chain, and each first blockchain 101 constructs its own model from its own stored data. The relay chain 102 is responsible for cooperative training among the plurality of first blockchains 101. A first blockchain 101 may build the target model through one or more servers on which a data processing system oriented to judicial data is deployed. Alternatively, the servers may be nodes of a cloud computing system (not shown), or each server may be a separate cloud computing system comprising multiple computers interconnected by a network and operating as a distributed processing system. A server may run any suitable computing system that enables it to act as a node in the blockchain network of the first blockchain 101.
In addition, it should be noted that Fig. 1 shows only one application environment provided by the present disclosure; in practical applications, other application environments may also be involved. For example, training a collaborative model for multiple systems (e.g., a traffic management system, a vehicle management system, and a traffic police management system) may also be implemented on the plurality of first blockchains 101 and the relay chain 102 shown in Fig. 1, and the number of first blockchains used to train the collaborative model may be adjusted according to the number of such systems, which is not limited in this specification.
Specifically, an embodiment herein provides a model training method based on blockchain cross-chain privacy protection, which may be performed by any one of the first blockchains used to construct the collaborative model; the model parameters trained by the plurality of first blockchains are transmitted and updated through a blockchain cross-chain system while the privacy security of the data in each first blockchain is ensured, so as to construct the target model of each first blockchain. Fig. 2 is a flowchart of a model training method based on blockchain cross-chain privacy protection according to an embodiment of the present disclosure. The figure describes the process of training the model based on the blockchain, but the method may include more or fewer operational steps based on conventional or non-creative effort. The order of steps recited in the embodiments is merely one of many possible orders of execution and does not represent a unique order; in an actual system or device, the method according to the embodiments or the drawings may be executed sequentially or in parallel. Specifically, as shown in Fig. 2, the method may include:
step 201: training a local model according to a first data set of the constructed model and randomly generated initial parameters of the local model to obtain parameters of the local model, and calculating a loss value of the local model;
step 202: adding disturbance into the local model parameters according to the set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss value is passed, so as to obtain the local model privacy parameters;
step 203: sending the local model privacy parameters and the loss values to a relay chain;
in this step, after sending a local model privacy parameter and the loss value to a relay chain, the relay chain aggregates a plurality of received local model privacy parameters sent by a plurality of first blockchains under the condition that the result of consensus verification on the local model privacy parameter and the loss value is passed, so as to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the plurality of first blockchains, determines whether the global loss value is converged, if yes, notifies the plurality of first blockchains to stop training, takes a local model of any one of the plurality of first blockchains as a target model, and if not, sends the global model privacy parameter to the plurality of first blockchains;
step 204: and under the condition that the received result of the consensus verification of the global model privacy parameters is passed, updating the local model initial parameters according to the global model privacy parameters, and repeatedly executing the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Correspondingly, embodiments herein also provide a model training method based on blockchain cross-chain privacy protection, performed by a relay chain, as shown in Fig. 3, comprising:
Step 301: receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains;
In this step, the local model privacy parameters are obtained by each first blockchain by adding a perturbation to its local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchain by training a local model according to a first data set of the constructed model and randomly generated local model initial parameters, and the loss value is calculated by the first blockchain on the local model;
step 302: under the condition that the result of the consensus verification of the local model privacy parameters and the loss values is passed, aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters;
step 303: calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
step 304: judging whether the global loss value is converged, if so, notifying a plurality of first block chains to stop training, and taking a local model of any one of the first block chains as a target model;
step 305: if not, the global model privacy parameters are sent to the first blockchains, so that the first blockchains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
According to the method, each first blockchain trains its local model with its own first data set and randomly generated local model initial parameters to obtain local model parameters and a local loss value; each first blockchain then adds a perturbation to its local model parameters to obtain local model privacy parameters and sends these, together with the loss value, to the relay chain. The relay chain aggregates the local model privacy parameters received from the plurality of first blockchains into a global privacy parameter; compared with having every first blockchain perform the parameter aggregation itself, this reduces the computation of model training to a certain extent. The relay chain then calculates a global loss value from the loss values of the plurality of first blockchains and judges whether it has converged; if not, it sends the global privacy parameter back to the plurality of first blockchains for further training. In this way the plurality of first blockchains train a collaborative model without exporting their source data until the global loss value converges, at which point the local model trained by each first blockchain satisfies the collaboration requirements of the plurality of blockchains, and the local model of any first blockchain is taken as the final target model, which can meet the service requirements of the institutions (for example, the three departments of the legal system) corresponding to the plurality of first blockchains.
The method and the system thus ensure the stability of model training and the privacy and security of the three legal-system departments' data, solving the prior-art problems that the data on the three departments' blockchains may be leaked and that model training is computationally expensive.
In this embodiment, the data in the first data set is stored only on the first blockchain, and when the target model of each first blockchain is constructed, each blockchain acquires the respective stored data to construct the first data set and train the local model.
In this embodiment, when the first blockchain trains the local model on the first data set with the randomly generated local model initial parameters, it obtains the local model parameters and calculates the loss value of the local model. To prevent the local model parameters and the loss value from being tampered with, the nodes of the first blockchain perform consensus verification on them and generate and store a corresponding block; the security of the local model parameters and the loss value is thus ensured by the blockchain's consensus verification technique. Then, to prevent an attacker from inferring the source data of the first blockchain from the local model parameters, the first blockchain adds a perturbation to the local model parameters according to the set differential privacy parameter and the random algorithm, obtaining the local model privacy parameters. The local model parameters correspond to a plurality of neurons, and the values of designated neurons can be changed to obtain the local model privacy parameters.
The first blockchain then sends the local model privacy parameters and the loss value to the relay chain. To prevent them from being tampered with, the nodes of the relay chain perform consensus verification on the local model privacy parameters and the loss value and generate and store a block. The relay chain then aggregates the local model privacy parameters received from the plurality of first blockchains to obtain the global model privacy parameters, which incorporate the local model privacy parameters obtained after each first blockchain trained its own local model, and sends the global model privacy parameters to the plurality of first blockchains. Each first blockchain updates its local model parameters according to the global model privacy parameters and then trains its local model with the updated parameters and its local first data set. In this way, a target model that can meet its own service requirements and the service requirements of the other first blockchains is trained without the other first blockchains exporting their source data.
During the collaborative training, each first blockchain calculates the loss value of its local model once per round of local training, but that loss value only indicates whether that blockchain's own local model has converged. Because the first blockchains train collaboratively, each first blockchain also adds a perturbation to the local model parameters of the new round to obtain the new round's local model privacy parameters and sends them, together with the new round's loss value, to the relay chain. The relay chain again aggregates the received local model privacy parameters of the first blockchains into the global model privacy parameters and computes the global loss value from the first blockchains' loss values. It can be understood that, since the global loss value is derived from the loss values of the first blockchains, judging whether the global loss value converges amounts to judging whether the collaboratively trained model of the first blockchains converges; if so, the model training of every first blockchain is stopped, and the local model trained by any first blockchain may be used as the target model.
According to one embodiment herein, the loss value of the local model is calculated by formula (1):
L_i(ω) = (1/|D_i|) Σ_{j=1}^{|D_i|} l_j(ω, D_i)    (1)

wherein L_i(ω) denotes the loss value of the i-th first blockchain, ω represents the local model parameters, D_i represents the first data set of the i-th first blockchain, |D_i| represents the number of data samples in the first data set, and l_j(ω, D_i) represents the loss value of the model with parameters ω on the j-th data sample (x, y).
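As an illustration, formula (1) simply averages the per-sample losses over the first data set. A minimal sketch in Python, with hypothetical names and squared error standing in for the unspecified per-sample loss l_j:

```python
def local_loss(w, dataset):
    """Formula (1): L_i(w) = (1/|D_i|) * sum_j l_j(w, D_i).
    Each sample is a pair (x, y); squared error stands in for the
    per-sample loss l_j, which the embodiment leaves unspecified."""
    return sum((w * x - y) ** 2 for x, y in dataset) / len(dataset)

# toy first data set D_i of the i-th first blockchain
D_i = [(1, 2.0), (2, 4.0), (3, 6.0)]
print(local_loss(2.0, D_i))   # perfect fit of y = 2x -> 0.0
print(local_loss(1.0, D_i))
```

Any differentiable per-sample loss can be substituted for the squared error without changing the averaging prescribed by formula (1).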
In this embodiment, although adding a perturbation to the local model parameters prevents the source data from being leaked, too large a perturbation reduces the accuracy of the resulting target model. The added perturbation therefore needs to be kept within a reasonable range, which protects the source data while preserving the accuracy of the target model to a certain extent. In view of this, according to an embodiment herein, adding a perturbation to the local model parameters according to the set differential privacy parameter and the random algorithm to obtain the local model privacy parameters further includes,
according to formula (2):
Pr{A(D) = O} ≤ e^ε × Pr{A(D′) = O}    (2)

the random algorithm A satisfies the constraint of the above formula, wherein A represents the random algorithm, D represents a local model parameter data set, D′ represents any local model parameter data set adjacent to D, i.e., D differs from D′ in only one record, O represents the output of the random algorithm A after adding a perturbation to the input local model parameter data set, Pr{A(D) = O} denotes the probability that the random algorithm A outputs O after adding a perturbation to the input local model parameter data set, and ε represents the set differential privacy parameter. If the formula is not satisfied, the random algorithm A is adjusted or replaced until it satisfies the formula; if the random algorithm A satisfies the formula, a perturbation is added to the local model parameter data set using the random algorithm A to obtain the local model privacy parameters.
In the present example, formula (2) above states that the probabilities of the random algorithm A producing the same output on any two adjacent local model parameter data sets should be comparable. The size of the differential privacy parameter affects the accuracy of the final target model; optionally, the differential privacy parameter may be set according to the actual accuracy requirement. It can be understood that differential privacy amounts to designing a random algorithm A such that a data set D and any adjacent data set D′ satisfy Pr{A(D) = O} ≤ e^ε × Pr{A(D′) = O}; processing the data set D with the random algorithm then protects the privacy of the source data (yielding the local model privacy parameters) while keeping the accuracy of the final target model within a set range.
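One standard way to satisfy the constraint of formula (2) is the Laplace mechanism. The sketch below is an illustrative choice, not one mandated by the embodiment; the sensitivity value, the seed, and the function names are assumptions:

```python
import math
import random

def laplace_perturb(params, sensitivity=1.0, eps=0.5, seed=42):
    """Add Laplace(0, sensitivity/eps) noise to every local model parameter.
    The Laplace mechanism satisfies the eps-differential-privacy bound of
    formula (2): Pr{A(D)=O} <= exp(eps) * Pr{A(D')=O} for adjacent D, D'."""
    rng = random.Random(seed)
    b = sensitivity / eps
    noisy = []
    for w in params:
        u = rng.random() - 0.5                      # uniform on [-0.5, 0.5)
        # inverse-CDF sampling of the Laplace distribution
        noisy.append(w - b * math.copysign(1.0, u) * math.log(1 - 2 * abs(u)))
    return noisy

private = laplace_perturb([0.2, -1.3, 0.7])
print(private)   # same shape as the input, each entry shifted by noise
```

A larger ε means a smaller noise scale b and hence a more accurate but less private target model, matching the trade-off discussed above.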
According to one embodiment herein, the formula for updating the local model initial parameters according to the global model privacy parameters is (3),
ω_t^k = w_{t-1} - r·∇L_k(ω_{t-1})    (3)

wherein ω_t^k denotes the updated local model initial parameters of the k-th first blockchain in the t-th round, w_{t-1} denotes the global model privacy parameters of the (t-1)-th round, r denotes the learning rate, ∇ denotes the gradient operator, L_k(ω) denotes the loss value of the k-th first blockchain, and ω_{t-1} denotes the local model parameters of the (t-1)-th round.
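A minimal sketch of the update rule in formula (3), with hypothetical names and a precomputed gradient standing in for ∇L_k(ω_{t-1}):

```python
def update_initial_params(w_global_prev, grad_local, lr=0.05):
    """Formula (3): w_t^k = w_{t-1} - r * grad L_k(w_{t-1}),
    applied element-wise: the previous round's global model privacy
    parameters minus the learning-rate-scaled local gradient."""
    return [wg - lr * g for wg, g in zip(w_global_prev, grad_local)]

w_prev = [0.5, -0.2]   # global model privacy parameters of round t-1
grad = [1.0, -2.0]     # gradient of L_k evaluated at the round t-1 parameters
print(update_initial_params(w_prev, grad))
```

Each first blockchain would compute `grad` on its own first data set, so the update stays local even though the starting point w_{t-1} comes from the relay chain.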
According to one embodiment herein, the formula for calculating a global loss value based on the loss values received from the plurality of first blockchains is (4),
L* = Σ_{i=1}^{K} (|D_i|/n)·L_i(ω)    (4)

wherein L* represents the global loss value, n represents the total number of data samples of the plurality of first blockchains, i.e., n = Σ_i |D_i|, L_i(ω) denotes the loss value of the i-th first blockchain, and ω represents the local model parameters of the i-th first blockchain.
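Formula (4) is a sample-count-weighted average of the local loss values. A minimal sketch (hypothetical names):

```python
def global_loss(local_losses, sample_counts):
    """Formula (4): L* = sum_i (|D_i| / n) * L_i(w), where n is the total
    number of data samples across all first blockchains."""
    n = sum(sample_counts)
    return sum(c / n * l for c, l in zip(sample_counts, local_losses))

# two first blockchains: 100 and 300 samples, local losses 0.4 and 0.1
print(global_loss([0.4, 0.1], [100, 300]))
```

Weighting by |D_i| keeps a blockchain with few samples from dominating the convergence test performed on the relay chain.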
According to one embodiment herein, the received plurality of local model privacy parameters sent by the first blockchain are aggregated to obtain a global model privacy parameter according to the formula,
w_t = (1/|K|) Σ_{k=1}^{|K|} ω_t^k    (5)

wherein w_t denotes the global model privacy parameters of the t-th round, |K| represents the total number of first blockchains, and ω_t^k denotes the local model privacy parameters of the k-th first blockchain in the t-th round.
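Formula (5) is a plain element-wise average over the |K| received parameter vectors; a minimal sketch (hypothetical names):

```python
def aggregate(privacy_params):
    """Formula (5): w_t = (1/|K|) * sum_k w_t^k, the element-wise mean of
    the local model privacy parameter vectors of the K first blockchains."""
    k = len(privacy_params)
    return [sum(column) / k for column in zip(*privacy_params)]

# three first blockchains, two parameters each
print(aggregate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]))   # -> [3.0, 4.0]
```

Because only the perturbed parameters are averaged, the relay chain never sees the unperturbed local model parameters of any first blockchain.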
In this embodiment, the collaborative model is trained concurrently by the first blockchains: in each iteration, every first blockchain starts its next round of local training as soon as it receives the global model privacy parameters from the relay chain. However, because collaborative training requires cross-chain data transmission, if, after some first blockchain finishes its n-th round of training, the relay chain fails to receive that chain's local model privacy parameters and loss value in time (for example, because of a network failure during transmission), the relay chain cannot aggregate the global model privacy parameters, while the first blockchains that have completed the n-th round keep waiting for the relay chain to send them. This seriously affects the efficiency of collaborative training and occupies the resources of the first blockchains that have completed the n-th round. In view of this, according to one embodiment herein, as shown in fig. 4, receiving the plurality of local model privacy parameters and loss values sent by the plurality of first blockchains further includes,
step 401: receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains according to a set time threshold;
step 402: if the local model privacy parameters and loss values sent by one or more first blockchains are not received within the time range corresponding to the time threshold, continue with the step of aggregating the received local model privacy parameters sent by the plurality of first blockchains to obtain the global model privacy parameters, and notify the one or more first blockchains whose local model privacy parameters and loss values were not received to train the constructed target model on their own first data sets, thereby updating the target model.
In this embodiment, a time threshold may be set according to actual service needs. If, within the time range corresponding to the threshold, the local model privacy parameters and loss value of one or more first blockchains have not been received, the relay chain does not keep waiting for them; instead, it aggregates the local model privacy parameters it has received into the global model privacy parameters and performs the subsequent steps. That is, the data of the one or more first blockchains whose local model privacy parameters and loss values were not received are excluded from the subsequent collaborative training, which avoids occupying the resources of the other first blockchains.
After the training is finished, the one or more first blockchains whose local model privacy parameters and loss values were not received are notified to train the constructed target model on their own first data sets, thereby updating the target model. The updated target model can then meet the service requirements of every first blockchain.
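The timeout rule above can be sketched as follows; the structure of a submission and the threshold value are assumptions for illustration:

```python
def collect_within_threshold(submissions, threshold):
    """Relay-chain side of the timeout rule: keep only the submissions whose
    arrival time is within the threshold; the late chains are reported so
    they can fine-tune the finished target model on their own data later.
    submissions maps chain_id -> (arrival_time, privacy_params, loss)."""
    on_time = {cid: (params, loss)
               for cid, (t, params, loss) in submissions.items()
               if t <= threshold}
    late = sorted(set(submissions) - set(on_time))
    return on_time, late

subs = {"chain_a": (1.2, [0.5], 0.30),
        "chain_b": (9.7, [0.4], 0.28),   # delayed, e.g. by a network fault
        "chain_c": (2.1, [0.6], 0.35)}
on_time, late = collect_within_threshold(subs, threshold=5.0)
print(sorted(on_time), late)   # -> ['chain_a', 'chain_c'] ['chain_b']
```

The `on_time` parameters feed the aggregation of formula (5) unchanged; the `late` list records which chains must later retrain the target model locally.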
Based on the same inventive concept, the embodiment herein further provides a model training apparatus based on block chain cross-chain privacy protection, as shown in fig. 5, including,
a local model training unit 501, configured to train a local model according to a first data set of a constructed model and randomly generated local model initial parameters, to obtain local model parameters, and calculate a loss value of the local model;
a local model parameter privacy unit 502, configured to add a disturbance to the local model parameter according to a set differential privacy parameter and a random algorithm to obtain a local model privacy parameter when a consensus verification result of the local model parameter and the loss value passes;
a local model privacy parameter sending unit 503, configured to send the local model privacy parameter and the loss value to a relay chain, so that the relay chain aggregates the received multiple local model privacy parameters sent by the multiple first blockchains when a result of performing consensus verification on the local model privacy parameter and the loss value is passed, to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the multiple first blockchains, determines whether the global loss value is converged, if yes, notifies the multiple first blockchains to stop training, takes a local model of any one of the multiple first blockchains as a target model, and if not, sends the global model privacy parameter to the multiple first blockchains;
an iterative training unit 504, configured to, when a result of the consensus verification on the received global model privacy parameters is that the result is passed, update the local model initial parameters according to the global model privacy parameters, and repeatedly perform the step of training the local model according to the constructed first data set and the updated local model initial parameters.
Correspondingly, embodiments herein further provide a model training apparatus based on block chain cross-chain privacy protection, as shown in fig. 6, including,
a local model privacy parameter receiving unit 601, configured to receive a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, where the local model privacy parameters are obtained by the first blockchain after adding a disturbance to the local model parameters according to a set difference privacy parameter and a random algorithm, the local model parameters are obtained by the first blockchain training a local model according to a first data set of a constructed model and a randomly generated local model initial parameter, and the loss values are obtained by the first blockchain calculating the local model;
a local model privacy parameter aggregation unit 602, configured to aggregate the received multiple local model privacy parameters sent by the multiple first blockchains to obtain a global model privacy parameter when a result of performing consensus verification on the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass;
a global loss value calculation unit 603, configured to calculate a global loss value according to the loss values received from the plurality of first blockchains;
an iterative training unit 604, configured to determine whether the global loss value converges, and if so, notify the first blockchains to stop training, where a local model of any one of the first blockchains is used as a target model; if not, the global model privacy parameters are sent to the first block chains, so that the first block chains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
The beneficial effects obtained by the above apparatus or system are consistent with those obtained by the above method and are not described in detail again here.
Fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present invention, where an apparatus in the present invention may be the computer device in the embodiment, and execute the method of the present invention. Computer device 702 may include one or more processing devices 704, such as one or more Central Processing Units (CPUs), each of which may implement one or more hardware threads. The computer device 702 may also include any storage resources 706 for storing any kind of information, such as code, settings, data, etc. For example, and without limitation, the storage resources 706 may include any one or more of the following in combination: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, etc. More generally, any storage resource may use any technology to store information. Further, any storage resource may provide volatile or non-volatile reservation of information. Further, any storage resources may represent fixed or removable components of computer device 702. In one case, when the processing device 704 executes associated instructions that are stored in any storage resource or combination of storage resources, the computer device 702 can perform any of the operations of the associated instructions. The computer device 702 also includes one or more drive mechanisms 708, such as a hard disk drive mechanism, an optical disk drive mechanism, or the like, for interacting with any storage resource.
Computer device 702 can also include an input/output module 710 (I/O) for receiving various inputs (via input device 712) and for providing various outputs (via output device 714). One particular output mechanism may include a presentation device 716 and an associated Graphical User Interface (GUI) 718. In other embodiments, input/output module 710 (I/O), input device 712, and output device 714 may also not be included, as only one computer device in a network. Computer device 702 can also include one or more network interfaces 720 for exchanging data with other devices via one or more communication links 722. One or more communication buses 724 couple the above-described components together.
Communication link 722 may be implemented in any manner, such as over a local area network, a wide area network (e.g., the Internet), a point-to-point connection, etc., or any combination thereof. Communication link 722 may include any combination of hardwired links, wireless links, routers, gateway functions, name servers, etc., as dictated by any protocol or combination of protocols.
It should be noted that, when the nodes on the first blockchain or the relay chain of the embodiments use the computer device 702 described in this embodiment to implement the method described herein, the presentation device 716 and the associated Graphical User Interface (GUI) 718 may be omitted, e.g., a minimal computer system comprising only the processing device 704, the storage resources 706, and the network interface 720.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the foregoing method.
An embodiment of the present invention further provides a computer program product, where the computer program product includes a computer program, and when the computer program is executed by a processor, the computer program implements the method described above.
It should be understood that, in various embodiments herein, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments herein.
It should also be understood that, in the embodiments herein, the term "and/or" is only one kind of association relation describing an associated object, and means that there may be three kinds of relations. For example, a and/or B, may represent: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided herein, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purposes of the embodiments herein.
In addition, functional units in the embodiments herein may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present invention may be implemented in a form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The principles and embodiments of the present disclosure are explained in detail by using specific embodiments, and the above description of the embodiments is only used to help understanding the method and its core idea; meanwhile, for a person skilled in the art, according to the idea of the present disclosure, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present disclosure should not be construed as a limitation to the present disclosure.

Claims (10)

1. A method for model training based on block chain cross-chain privacy protection, performed by a first block chain, the method comprising,
training a local model according to a first data set of the constructed model and randomly generated initial parameters of the local model to obtain parameters of the local model, and calculating a loss value of the local model;
adding disturbance into the local model parameters according to the set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain the local model privacy parameters;
sending the local model privacy parameters and the loss values to a relay chain so that the relay chain aggregates a plurality of received local model privacy parameters sent by a plurality of first blockchains under the condition that the result of consensus verification on the local model privacy parameters and the loss values is passed to obtain global model privacy parameters, calculating global loss values according to the received loss values sent by the plurality of first blockchains, judging whether the global loss values are converged, if yes, notifying the plurality of first blockchains to stop training, taking a local model of any one first blockchain in the plurality of first blockchains as a target model, and if not, sending the global model privacy parameters to the plurality of first blockchains;
and under the condition that the received consensus verification result of the global model privacy parameters is passed, updating the local model initial parameters according to the global model privacy parameters, and repeatedly executing the step of training the local model according to the constructed first data set and the updated local model initial parameters.
2. The method of claim 1, wherein the loss value of the local model is calculated by the formula:
L_i(ω) = (1/|D_i|) Σ_{j=1}^{|D_i|} l_j(ω, D_i)

wherein L_i(ω) denotes the loss value of the i-th first blockchain, ω represents the local model parameters, D_i represents the first data set of the i-th first blockchain, |D_i| represents the number of data samples in the first data set, and l_j(ω, D_i) represents the loss value of the local model with parameters ω on the j-th data sample (x, y).
3. The method of claim 1, wherein obtaining local model privacy parameters by adding perturbation to the local model parameters according to set differential privacy parameters and a stochastic algorithm further comprises,
according to the formula:
Pr{A(D) = O} ≤ e^ε × Pr{A(D′) = O}

the random algorithm A satisfies the constraint of the above formula, wherein A represents the random algorithm, D represents a local model parameter data set, D′ represents any local model parameter data set adjacent to D, i.e., D differs from D′ in only one record, O represents the output of the random algorithm A after adding a perturbation to the input local model parameter data set, Pr{A(D) = O} denotes the probability that the random algorithm A outputs O after adding a perturbation to the input local model parameter data set, and ε represents the set differential privacy parameter; if the formula is not satisfied, the random algorithm A is adjusted or replaced until it satisfies the formula; if the random algorithm A satisfies the formula, a perturbation is added to the local model parameter data set using the random algorithm A to obtain the local model privacy parameters.
4. The method according to claim 1, wherein the local model initial parameters are updated according to the global model privacy parameters by the formula,
ω_k^t = w^(t−1) − r∇L_k(ω)
wherein ω_k^t denotes the updated local model initial parameters of the k-th first blockchain in the t-th round, w^(t−1) denotes the global model privacy parameters of the (t−1)-th round received by the k-th first blockchain, r denotes the learning rate, ∇ denotes the gradient operator, L_k(ω) denotes the loss value of the k-th first blockchain, and ω denotes the local model parameters of the (t−1)-th round.
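The update in claim 4 is a single gradient-descent step taken from the previous round's global privacy parameters. A minimal sketch (names and NumPy usage are illustrative; the gradient is assumed to come from the local training step):

```python
import numpy as np

def update_local_init_params(global_privacy_params, grad_local_loss, learning_rate):
    """Claim-4 style update: omega_k^t = w^(t-1) - r * grad L_k(omega)."""
    return (np.asarray(global_privacy_params, dtype=float)
            - learning_rate * np.asarray(grad_local_loss, dtype=float))

# w^(t-1) = [1.0, 2.0], gradient [0.5, -0.5], learning rate r = 0.1
omega_t = update_local_init_params([1.0, 2.0], [0.5, -0.5], learning_rate=0.1)
# omega_t -> [0.95, 2.05]
```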
5. A model training device based on block chain cross-chain privacy protection is characterized by comprising,
the local model training unit is used for training a local model according to a first data set for constructing the model and randomly generated local model initial parameters to obtain local model parameters and calculating a loss value of the local model;
the local model parameter privacy unit is used for adding disturbance into the local model parameters according to set differential privacy parameters and a random algorithm under the condition that the consensus verification result of the local model parameters and the loss values is passed, so as to obtain local model privacy parameters;
a local model privacy parameter sending unit, configured to send the local model privacy parameters and the loss values to a relay chain, so that the relay chain aggregates received multiple local model privacy parameters sent by multiple first blockchains when a result of performing consensus verification on the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass, to obtain a global model privacy parameter, calculates a global loss value according to the received loss values sent by the multiple first blockchains, determines whether the global loss value converges, notifies the multiple first blockchains to stop training if the global loss value converges, takes a local model of any one first blockchain in the multiple first blockchains as a target model, and sends the global model privacy parameters to the multiple first blockchains if the global loss value does not converge;
and the iterative training unit is used for updating the initial parameters of the local model according to the global model privacy parameters and repeatedly executing the step of training the local model according to the constructed first data set and the updated initial parameters of the local model when the result of the consensus verification of the received global model privacy parameters is passed.
6. A block chain cross-chain privacy protection-based model training method is performed by a relay chain, and comprises the following steps,
receiving a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, wherein the local model privacy parameters are obtained by the first blockchains after disturbance is added to the local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchains through training of local models according to a first data set of a constructed model and randomly generated local model initial parameters, and the loss values are obtained by the first blockchains through calculation of the local models;
under the condition that the result of the consensus verification of the local model privacy parameters and the loss values is passed, aggregating the received multiple local model privacy parameters sent by the multiple first blockchains to obtain global model privacy parameters;
calculating a global loss value according to the received loss values sent by the plurality of first blockchains;
judging whether the global loss value is converged, if so, notifying a plurality of first block chains to stop training, and taking a local model of any one of the first block chains as a target model;
if not, the global model privacy parameters are sent to the first blockchains, so that the first blockchains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
7. The method of claim 6, wherein the global loss value is calculated according to the received loss values sent by the plurality of first blockchains by the formula,
L* = Σ_{i=1}^{|K|} (|D_i| / |D|) · L_i(ω)
wherein L* represents the global loss value, |D| represents the total number of data samples across the plurality of first blockchains, |D_i| represents the number of data samples of the i-th first blockchain, |K| represents the total number of first blockchains, L_i(ω) represents the loss value of the i-th first blockchain, and ω represents the local model parameters of the i-th first blockchain.
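The sample-weighted average above is straightforward to compute on the relay chain. A minimal sketch (function and argument names are illustrative):

```python
def global_loss(sample_counts, local_losses):
    """Claim-7 style global loss: L* = sum_i (|D_i| / |D|) * L_i(omega),
    where |D| is the total sample count across all first blockchains."""
    total = sum(sample_counts)
    return sum(n * l for n, l in zip(sample_counts, local_losses)) / total

# Two chains with 10 and 30 samples and losses 0.2 and 0.4:
# L* = (10 * 0.2 + 30 * 0.4) / 40 = 0.35
L_star = global_loss([10, 30], [0.2, 0.4])
```

Weighting by sample count keeps chains with larger data sets from being drowned out by small ones.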
8. The method of claim 6, wherein the received plurality of local model privacy parameters sent by the plurality of first blockchains are aggregated to obtain the global model privacy parameters by the formula,
w^t = (1 / |K|) Σ_{k=1}^{|K|} ω_k^t
wherein w^t denotes the global model privacy parameters of the t-th round, |K| represents the total number of first blockchains, and ω_k^t denotes the local model privacy parameters of the k-th first blockchain in the t-th round.
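Since every chain's privacy parameters enter with equal weight, the aggregation is a plain element-wise average. A minimal sketch (names are illustrative):

```python
def aggregate_privacy_params(per_chain_params):
    """Claim-8 style aggregation: w^t = (1/|K|) * sum_k omega_k^t,
    averaging each parameter position across the |K| first blockchains."""
    k = len(per_chain_params)
    dim = len(per_chain_params[0])
    return [sum(params[i] for params in per_chain_params) / k for i in range(dim)]

# Two chains with parameters [1.0, 2.0] and [3.0, 4.0] average to [2.0, 3.0].
w_t = aggregate_privacy_params([[1.0, 2.0], [3.0, 4.0]])
```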
9. The method of claim 6, wherein receiving the plurality of local model privacy parameters and loss values sent by the plurality of first blockchains further comprises,
receiving a plurality of local model privacy parameters and loss values sent by a plurality of first blockchains according to a set time threshold;
if the local model privacy parameters and the loss values sent by one or more first blockchains are not received within the time range corresponding to the time threshold, continuing to perform the step of aggregating the received local model privacy parameters sent by the plurality of first blockchains to obtain the global model privacy parameters, and notifying the one or more first blockchains whose local model privacy parameters and loss values were not received to train the constructed target model according to their own first data sets, so as to update the target model.
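The straggler handling in claim 9 amounts to collecting updates until a deadline and then proceeding with whatever arrived. A hedged sketch (the non-blocking polling callback `try_receive` and all names are hypothetical, not specified by the claim):

```python
import time

def collect_updates(expected_chains, try_receive, timeout_s):
    """Poll for (chain_id, privacy_params, loss) messages until timeout_s
    elapses or every expected chain has reported; chains that miss the
    deadline are returned as stragglers so aggregation can proceed without
    them and they can later fine-tune the target model on their own data."""
    updates = {}
    deadline = time.monotonic() + timeout_s
    while len(updates) < len(expected_chains) and time.monotonic() < deadline:
        msg = try_receive()  # assumed non-blocking: returns a tuple or None
        if msg is not None:
            chain_id, params, loss = msg
            updates[chain_id] = (params, loss)
    stragglers = [c for c in expected_chains if c not in updates]
    return updates, stragglers

# Simulated inbox in which chain "C" never reports within the window.
_inbox = iter([("A", [1.0], 0.1), ("B", [2.0], 0.2)])
updates, stragglers = collect_updates(["A", "B", "C"],
                                      lambda: next(_inbox, None),
                                      timeout_s=0.05)
```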
10. A model training device based on block chain cross-chain privacy protection is characterized by comprising,
a local model privacy parameter receiving unit, configured to receive a plurality of local model privacy parameters and a plurality of loss values sent by a plurality of first blockchains, where the local model privacy parameters are obtained by the first blockchains after adding disturbances to the local model parameters according to set differential privacy parameters and a random algorithm, the local model parameters are obtained by the first blockchains training local models according to a first data set of a constructed model and randomly generated local model initial parameters, and the loss values are obtained by the first blockchains calculating the local models;
a local model privacy parameter aggregation unit, configured to, in a case that a result of performing consensus verification on the local model privacy parameters and the loss values is that the local model privacy parameters and the loss values pass, aggregate the received multiple local model privacy parameters sent by the multiple first blockchains, and obtain global model privacy parameters;
a global loss value calculation unit, configured to calculate a global loss value according to the received loss values sent by the plurality of first blockchains;
the iterative training unit is used for judging whether the global loss value is converged, if so, informing the plurality of first block chains to stop training, and taking a local model of any one of the plurality of first block chains as a target model; if not, the global model privacy parameters are sent to the first blockchains, so that the first blockchains update the initial parameters of the local model according to the global model privacy parameters, and the step of training the local model according to the constructed first data set and the updated initial parameters of the local model is repeatedly executed.
CN202211238647.9A 2022-10-11 2022-10-11 Model training method and device based on block chain cross-chain privacy protection Active CN115329385B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211238647.9A CN115329385B (en) 2022-10-11 2022-10-11 Model training method and device based on block chain cross-chain privacy protection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211238647.9A CN115329385B (en) 2022-10-11 2022-10-11 Model training method and device based on block chain cross-chain privacy protection

Publications (2)

Publication Number Publication Date
CN115329385A true CN115329385A (en) 2022-11-11
CN115329385B CN115329385B (en) 2022-12-16

Family

ID=83914315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211238647.9A Active CN115329385B (en) 2022-10-11 2022-10-11 Model training method and device based on block chain cross-chain privacy protection

Country Status (1)

Country Link
CN (1) CN115329385B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113536382A (en) * 2021-08-09 2021-10-22 北京理工大学 Block chain-based medical data sharing privacy protection method by using federal learning
CN113992360A (en) * 2021-10-01 2022-01-28 浙商银行股份有限公司 Block chain cross-chain-based federated learning method and equipment
CN114398538A (en) * 2021-12-08 2022-04-26 西安电子科技大学 Cross-domain recommendation method and system for privacy protection, storage medium and computer equipment
CN114528392A (en) * 2022-04-24 2022-05-24 北京理工大学 Block chain-based collaborative question-answering model construction method, device and equipment
CN114861211A (en) * 2022-06-06 2022-08-05 广东工业大学 Meta-universe scene-oriented data privacy protection method, system and storage medium
CN115037477A (en) * 2022-05-30 2022-09-09 南通大学 Block chain-based federated learning privacy protection method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CYNTHIA DWORK等: "The Algorithmic Foundations of Differential Privacy", 《FOUNDATIONS AND TRENDS IN THEORETICAL COMPUTER SCIENCE》 *
PENG ZHANG等: "A study of a federated learning framework based on the interstellar file system and blockchain: Private Blockchain Federated Learning", 《2022 3RD INTERNATIONAL CONFERENCE ON COMPUTER VISION, IMAGE AND DEEP LEARNING & INTERNATIONAL CONFERENCE ON COMPUTER ENGINEERING AND APPLICATIONS (CVIDL & ICCEA)》 *
金明: "Research on Key Technologies of Blockchain-Based Federated Learning", 《China Masters' Theses Full-text Database, Information Science and Technology》 *

Also Published As

Publication number Publication date
CN115329385B (en) 2022-12-16

Similar Documents

Publication Publication Date Title
US10601854B2 (en) Comprehensive risk assessment in a heterogeneous dynamic network
Raskutti et al. Learning directed acyclic graph models based on sparsest permutations
US20230039182A1 (en) Method, apparatus, computer device, storage medium, and program product for processing data
AU2015201161B2 (en) Event correlation
CN110024340B (en) Near-uniform load balancing in visibility networks via use of prediction
CN112425137A (en) System and method for modeling and simulating IoT system
CN110168523A (en) Change monitoring to inquire across figure
CN110866546B (en) Method and device for evaluating consensus node
CN104580349A (en) Secure cloud management agent
WO2022237194A1 (en) Abnormality detection method and apparatus for accounts in federal learning system, and electronic device
CN104255011B (en) Cloud computing secure data stores
US10282461B2 (en) Structure-based entity analysis
US11531538B2 (en) Meta-indexing, search, compliance, and test framework for software development using smart contracts
US20230208882A1 (en) Policy - aware vulnerability mapping and attack planning
WO2022126975A1 (en) Client information verification method and apparatus, and computer device and storage medium
Zhao et al. Competitive dynamics on complex networks
US11948077B2 (en) Network fabric analysis
KR20230031889A (en) Anomaly detection in network topology
CN114610475A (en) Training method of intelligent resource arrangement model
CN116112175A (en) Service processing method, device and medium of digital twin network based on block chain
Yang Random-term-absent addition-min fuzzy relation inequalities and their lexicographic minimum solutions
CN115329385B (en) Model training method and device based on block chain cross-chain privacy protection
CN107800640A (en) A kind of method for detection and the processing for flowing rule
US20160342899A1 (en) Collaborative filtering in directed graph
CN112434323A (en) Model parameter obtaining method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant