CN115392475A - Block chain network-based federal learning method, verification node, device and medium - Google Patents


Info

Publication number
CN115392475A
Authority
CN
China
Prior art keywords
block
global model
node
link
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210713131.9A
Other languages
Chinese (zh)
Inventor
彭喆 (Zhe Peng)
徐建良 (Jianliang Xu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hong Kong Baptist University HKBU
Original Assignee
Hong Kong Baptist University HKBU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hong Kong Baptist University HKBU filed Critical Hong Kong Baptist University HKBU
Priority to CN202210713131.9A
Publication of CN115392475A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 - Information retrieval of structured data, e.g. relational data
    • G06F 16/27 - Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 - Protecting data
    • G06F 21/64 - Protecting data integrity, e.g. using checksums, certificates or signatures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Security & Cryptography (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application is applicable to the technical field of blockchains, and provides a blockchain network-based federated learning method, a verification node, a device and a medium, where the blockchain network includes a plurality of participating nodes, a verification node and a storage node. The federated learning method is executed by the verification node: first, local models sent by the multiple participating nodes are obtained; then a global model is generated from the multiple local models; finally, the storage node where the global model is to be stored is determined at least according to the link relationship between non-adjacent blocks, and the global model is sent to that storage node. In the embodiments of the application, the global model generated by each round of training is stored by establishing link relationships between non-adjacent blocks, so that a user can conveniently perform skip-style searches, effectively improving the query efficiency of the global model.

Description

Block chain network-based federal learning method, verification node, device and medium
Technical Field
The application belongs to the technical field of block chains, and particularly relates to a block chain network-based federal learning method, a verification node, equipment and a medium.
Background
Federated learning, as a distributed machine learning framework, has been applied to various scenarios such as smart medicine. Although federated learning can jointly train a deep learning model on a distributed data set, existing methods remain vulnerable to attack and struggle to meet the system security requirements of practical applications. In the federated learning framework, a central server (e.g., a cloud server) is responsible for the distribution, aggregation and updating of deep learning models. Such centralized servers become targets for attackers seeking to produce erroneous or biased models.
The blockchain, as a cryptographically secured distributed ledger, provides customized services externally through the smart contracts it carries, and has shown notable application potential in tamper resistance, fraud reduction and contract fulfillment. Since verifiable federated learning must provide model aggregation and distribution services for each node participating in training, and must ensure verifiability and non-repudiation during training, the blockchain offers an opportunity to realize trustworthy federated learning.
The blockchain network ensures the tamper resistance of federated learning data, but there is no effective management mechanism for the federated learning data stored on chain, so the query efficiency of that data is low.
Disclosure of Invention
The embodiment of the application provides a block chain network-based federal learning method, a verification node, equipment and a medium, which can solve the problem of low efficiency of inquiring federal learning data on a block chain.
A first aspect of an embodiment of the present application provides a federated learning method based on a blockchain network, where the blockchain network includes: a plurality of participating nodes, a verification node and a storage node. The federated learning method is executed by the verification node and comprises the following steps:
obtaining local models sent by a plurality of participating nodes, wherein each local model is obtained by training an initial model through training data of the corresponding participating node;
generating a global model from the plurality of local models;
and determining a storage node to be stored by the global model at least according to the link relation between non-adjacent blocks, and sending the global model to the storage node.
Optionally, the link relationship includes a forward link and a backward link, and the determining, according to at least the link relationship between non-adjacent blocks, a storage node where the global model is to be stored, and sending the global model to the storage node includes:
establishing forward links among a plurality of groups of non-adjacent blocks and establishing backward links among a plurality of groups of non-adjacent blocks;
and determining a storage node to be stored by the global model at least according to the forward link and the backward link, and sending the global model to the storage node.
Optionally, the establishing of the forward link between the multiple sets of non-adjacent blocks includes:
generating a pre-signature of any non-adjacent block to be generated in the current block by using a chameleon hash function;
establishing a forward link between the current block and the non-adjacent block to be generated based on the pre-signature and a preset forward link parameter;
forward links between sets of non-adjacent blocks are established.
Optionally, the establishing of the backward link between the multiple groups of non-adjacent blocks includes:
establishing backward links between the current block and the generated non-adjacent blocks based on the hash value of any generated non-adjacent block and preset backward link parameters;
establishing backward links between groups of non-adjacent blocks.
Optionally, the blocks in the block chain include anchor blocks and container blocks, and one anchor block corresponds to a plurality of container blocks, the forward link includes anchor links and anchor block links, and the establishing of the forward link between the multiple groups of non-adjacent blocks includes:
establishing anchor point links among the anchor point blocks, and establishing anchor point block links between any anchor point block and the container block corresponding to the anchor point block.
Optionally, the obtaining the local models sent by the multiple participating nodes includes:
verifying a local model generated by training of any participating node according to a preset public key of the participating node;
and acquiring local models of a plurality of participant nodes which pass verification, wherein the number of the local models which pass verification is not lower than a preset local model number threshold value.
Optionally, the method further includes:
and determining a storage node to be stored by the global model according to the link relation between adjacent blocks, and sending the global model to the storage node.
A second aspect of an embodiment of the present application provides a verification node of a federal learning method based on a blockchain network, where the blockchain network includes: a plurality of participating nodes, a verifying node and a storage node, wherein the verifying node comprises:
the local model collection module is used for acquiring local models sent by a plurality of participating nodes, wherein each local model is obtained by training an initial model through training data of the corresponding participating node;
the global model generation module is used for generating a global model according to the plurality of local models;
and the storage module is used for determining a storage node to be stored by the global model according to the link relation between adjacent or non-adjacent blocks and sending the global model to the storage node.
A third aspect of an embodiment of the present application provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the federal learning method based on a blockchain network as set forth in the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the federal learning method based on a blockchain network as described in the first aspect.
A fifth aspect of embodiments of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the federal learning method based on a blockchain network according to the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that:
the application discloses a block chain network-based federal learning method, a verification node, equipment and a medium, wherein the block chain network comprises the following components: the method comprises the steps of firstly obtaining local models sent by a plurality of participating nodes, then generating a global model according to the local models, finally determining storage nodes to be stored of the global model at least according to the link relation between non-adjacent blocks, and sending the global model to the storage nodes. According to the embodiment of the application, the global model generated by each training is stored by establishing the link relation between the non-adjacent blocks, so that a user can conveniently perform jump type search, and the query efficiency of the global model is effectively improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a block chain network-based federal learning system architecture diagram according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a federal learning method based on a blockchain network according to an embodiment of the present application;
FIG. 3 is a block chain verifiable data structure according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a block chain network-based federal learning method according to a second embodiment of the present application;
fig. 5 is a schematic flowchart of a federal learning method based on a blockchain network according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of a verification node of a block chain network-based federal learning method according to a fourth embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to a fifth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
In the prior art, a block chain network ensures tamper resistance of federal learning data, but an effective management mechanism is lacked for the federal learning data stored on a chain, so that the inquiry efficiency of the federal learning data is low.
In view of this, the present application provides a blockchain network-based federated learning method, a verification node, a device and a medium, which determine the storage node where the global model is to be stored at least according to the link relationship between non-adjacent blocks and send the global model to that storage node, thereby solving the above problem and achieving efficient querying of the global model.
The federated learning method, verification node, device and medium based on a blockchain network of the present application can be applied to smart-medical scenarios: each medical institution uploads the model obtained by local training to the blockchain system; the blockchain system then aggregates these models to generate a global model, determines the storage node where the global model is to be stored according to the link relationship between non-adjacent blocks, and sends the global model to that storage node. The method of this example enables efficient querying of the global model generated from the local data of each medical institution.
Fig. 1 shows a blockchain network-based federated learning system architecture diagram in an embodiment of the present application, where the blockchain network includes: a plurality of participating nodes, a verification node (the committee), storage nodes, nodes that successfully generate blocks (miners), and users; the federated learning method disclosed in the embodiments of the present application may be performed by the verification node. The committee is composed of trusted nodes and, as a whole, replaces the centralized server of traditional federated learning; the trusted nodes are selected from the miners in the blockchain and, in addition to their duties as miners, also take on the duties of trusted nodes. In the federated learning framework, the global model is iteratively updated by collecting the local models of each participating node. The contracts referred to in the following embodiments all refer to smart contracts supported in the blockchain network.
In a possible implementation mode, the participating nodes are only responsible for model training and do not participate in building of the blockchain network, so that the calculation amount and storage overhead required by the nodes for maintaining the blockchain can be reduced, and the cost of changing and upgrading the existing federal learning system is reduced.
Since there are untrusted nodes in the blockchain network, nodes with a high trust level need to be selected as trusted nodes. Trust in the committee comes from the block-generation process of the blockchain system: miners who successfully produce blocks become eligible for selection into the committee. First, credibility is defined for the miners from two aspects:
[Credibility formula: rendered as an image in the original; credibility is defined as a function of t and c.]
where t is the time at which the miner last became a trusted node, and c is the number of tokens the miner holds in the blockchain system. Under this definition, a trusted node should possess two traits: liveness and loyalty. Liveness indicates that the selected trusted nodes are able to mine in the blockchain system at all times, and thus can keep the committee online and active. Loyalty reflects that trusted nodes holding more blockchain tokens have contributed more to the blockchain system as a whole, thus keeping the committee stable. The committee then consists of a fixed number of the most credible miners. In the blockchain system, whenever a new block is generated, if the credibility of the miner W that generated the block exceeds the lowest credibility among the miners in the current committee, miner W replaces that least credible miner, thereby updating the committee. This effectively improves the efficiency of model verification and aggregation in the blockchain system.
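The committee-update rule above can be sketched as follows. The patent's exact credibility formula is lost to an image in this copy, so the `credibility` function below is a stand-in that rewards recent activity (t) and token holdings (c); all names and parameters are illustrative assumptions.

```python
from dataclasses import dataclass

COMMITTEE_SIZE = 4  # fixed committee size (illustrative)

@dataclass
class Miner:
    name: str
    t: float  # time of last selection as a trusted node (liveness)
    c: int    # tokens held in the blockchain system (loyalty)

def credibility(m: Miner, now: float) -> float:
    # Placeholder score: the patent's formula is an image in the source;
    # this stand-in simply rewards recent activity and token holdings.
    return m.c / (1.0 + (now - m.t))

def update_committee(committee, miner, now):
    """If the block producer's credibility exceeds the committee minimum,
    it replaces the least credible member (the update rule in the text)."""
    least = min(committee, key=lambda m: credibility(m, now))
    if credibility(miner, now) > credibility(least, now):
        committee = [m for m in committee if m is not least] + [miner]
    return committee
```

Under this sketch the committee size stays constant: a newly successful miner only displaces the current minimum, matching the replacement rule described in the text.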
In order to explain the technical solution of the present application, the following description is given by way of specific examples.
Referring to fig. 2, a flowchart of a federal learning method based on a blockchain network according to an embodiment of the present application is shown. As shown in fig. 2, the federal learning method is performed by a verification node, and may include the following steps:
step S201, obtaining a plurality of local models sent by the participating nodes, where each local model is obtained by training an initial model with training data of the corresponding participating node.
In this embodiment of the present application, the obtaining a local model sent by a plurality of participating nodes includes:
and verifying the local model generated by the training of any participating node according to a preset public key of the participating node.
And acquiring local models of a plurality of participant nodes which pass verification, wherein the number of the local models which pass verification is not lower than a preset local model number threshold value.
In one possible implementation, it is considered that during model training, participating nodes may join and exit dynamically, which can interrupt the federated learning process or reduce the learning effect. Therefore, in this embodiment, each node participating in training may be assigned a distinct public/private key pair for signing and verifying the local models it generates. For each training task, a parameter θ is predefined: the minimum number of correct local models that the verification node must collect when generating the global model in each round of training. At the beginning of the training task, the public keys of all nodes and the threshold θ are collected into a "policy" file. This file serves as a trust anchor; it is published on the blockchain along with the learning task and obtained securely by the verification node. In each training round, the participating nodes send their local models and signatures to the verification node for verification through a learning contract.
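A minimal sketch of the threshold check described above. A real deployment would verify public-key signatures against the policy file's public keys; this illustration uses HMAC keys as a stand-in, and `THETA` and the node names are assumed for the example.

```python
import hashlib
import hmac

THETA = 3  # θ: assumed minimum number of valid local models per round

# The "policy" file: per-participant verification keys plus the threshold θ.
# HMAC keys here are illustrative stand-ins for the participants' public keys.
policy = {"keys": {f"node{i}": f"secret{i}".encode() for i in range(4)},
          "theta": THETA}

def sign(key: bytes, model: bytes) -> str:
    return hmac.new(key, model, hashlib.sha256).hexdigest()

def collect_valid_models(submissions):
    """submissions: iterable of (node_id, model_bytes, signature).
    Keeps only correctly signed models and enforces the θ threshold."""
    valid = [m for nid, m, sig in submissions
             if nid in policy["keys"]
             and hmac.compare_digest(sig, sign(policy["keys"][nid], m))]
    if len(valid) < policy["theta"]:
        raise RuntimeError("not enough verified local models this round")
    return valid
```

If fewer than θ submissions verify, the round aborts rather than aggregating an under-supported global model.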
Step S202, a global model is generated according to the plurality of local models.
The verification node verifies the collected local models through the learning contract according to the policy file, then aggregates them and updates the global model. Thereafter, a signature for the global model update is generated by a "verify" contract using a federated signature mechanism, and the model and signature are recorded on the blockchain. To enhance the security and efficiency of the system, the public/private key pair owned by the verification node is updated after each round of training.
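The patent does not fix a particular aggregation rule for step S202; a FedAvg-style coordinate-wise average, sketched here purely for illustration, is one common choice.

```python
def aggregate(local_models):
    """Coordinate-wise average of equal-length parameter vectors
    (FedAvg-style; an assumed rule, not specified by the patent)."""
    n = len(local_models)
    return [sum(m[d] for m in local_models) / n
            for d in range(len(local_models[0]))]
```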
Step S203, determining a storage node to be stored by the global model at least according to the link relation between non-adjacent blocks, and sending the global model to the storage node.
In this embodiment of the present application, the link relationship includes a forward link and a backward link, and the determining, according to at least the link relationship between non-adjacent blocks, a storage node where the global model is to be stored, and sending the global model to the storage node includes:
Forward links are established between multiple groups of non-adjacent blocks, and backward links are established between multiple groups of non-adjacent blocks. A forward link is a signature generated for a block. Illustratively, when block B_i is generated, a link to block B_{i+1} is established; that is, when B_i is generated, a signature for B_{i+1} is generated and stored in B_i. At that moment, however, block B_{i+1} does not yet exist, so the signature cannot be generated in the traditional way (computing a SHA-256 hash of the block and then signing it with a private key). Conversely, once block B_{i+1} has been generated, a traditionally generated signature cannot be stored back into B_i, because the blockchain is tamper resistant: once B_i is generated, the information stored in it cannot be changed, whether by adding, modifying or deleting data. A forward link can index a later block. For example, if the model information a user wants is stored in block B_{i+4}, then B_{i+4} can be found directly through the forward link stored in B_i, and the required model can be found by checking block B_{i+4}. Establishing such links accelerates data search, in both the forward and backward directions.
And determining a storage node to be stored by the global model at least according to the forward link and the backward link, and sending the global model to the storage node.
In an embodiment of the present application, the establishing a forward link between multiple sets of non-adjacent blocks includes:
and generating a pre-signature of any non-adjacent block to be generated in the current block by using a chameleon hash function.
And establishing a forward link between the current block and the non-adjacent block to be generated based on the pre-signature and a preset forward link parameter.
Forward links between sets of non-adjacent blocks are established.
In an embodiment of the present application, the establishing of backward links between multiple sets of non-adjacent blocks includes:
and establishing a backward link between the current block and the generated non-adjacent block based on the hash value of any generated non-adjacent block and a preset backward link parameter.
Establishing backward links between groups of non-adjacent blocks.
In this embodiment, the blocks in the block chain include anchor blocks and container blocks, and one anchor block corresponds to a plurality of container blocks, the forward link includes anchor links and anchor block links, and the establishing of the forward link between the non-adjacent blocks includes:
establishing anchor point links among the anchor point blocks, and establishing anchor point block links between any anchor point block and the container block corresponding to the anchor point block.
In this embodiment of the present application, the method further includes: determining the storage node where the global model is to be stored according to the link relationship between adjacent blocks, and sending the global model to the storage node.
In view of the problem that data query efficiency is low due to the lack of an effective management mechanism for on-chain data, and that a user may suffer replay attacks when querying blockchain data, this embodiment provides a blockchain verifiable data structure (DSC), shown in fig. 3, which enables fast bidirectional queries over on-chain data and supports updating of the authorization key to resist replay attacks. The blockchain verifiable data structure mainly comprises four elements: anchor blocks, container blocks, backward links, and forward links.
The verifiable data structure of the block chain can establish forward link and backward link according to a certain rule according to the relation between data stored in different blocks. Therefore, when the on-chain data query is carried out, the efficiency can be greatly improved, and the efficient audit of the model is supported.
The backward link is constructed using the hash value of the previous block, as in a conventional blockchain system. The forward link is formed by a signature generated for a block. An anchor block stores the public key periodically updated by the verification node, the hash value of the previous block (backward link), and signatures generated over subsequent blocks that have not yet been generated (forward links). A container block stores the verifiable information of each round of training (including the signature of the global model and the signatures of the local models) and the hash value of the previous block (backward link). Anchor blocks delimit updates of the verification node's key and are mainly responsible for storing the updated verification key and the forward links, so that some blocks can be skipped and search time reduced. The information associated with a model and its corresponding verifiable signature are stored in container blocks. An anchor block can be viewed as a marker location, while container blocks are used to store models. Setting up anchor blocks and container blocks effectively reduces the complexity of the system implementation and improves data management efficiency.
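The four DSC elements above might be modeled as follows; the field names are illustrative, inferred from the description rather than prescribed by the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ContainerBlock:
    index: int
    prev_hash: str                # backward link: hash of the previous block
    global_model_sig: str         # signature of this round's global model
    local_model_sigs: List[str]   # signatures of the collected local models

@dataclass
class AnchorBlock:
    index: int
    prev_hash: str                # backward link: hash of the previous block
    verify_key: str               # periodically updated verification-node key
    forward_links: Dict[int, str]                 # target index -> pre-signature
    backward_links: Dict[int, str] = field(default_factory=dict)  # skip links
```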
For backward links: in anchor block B_i, the backward link with jump level l points to anchor block B_{i-j}, where j = (g_b)^(l-1) · (μ + 1). The number of backward links π_b constructed in anchor block B_i is:
π_b = 1 + max{ l : 1 ≤ l ≤ l_b, i mod ((g_b)^(l-1) · (μ + 1)) = 0 }
where g_b and l_b are the jump base and the number of jump levels of the backward links, respectively, i is the index of the current block, and μ is the number of iterations of model training.
Illustratively, for backward links with jump base 2 (and μ = 4): at jump level 1, block B_i has a link pointing to block B_{i-5}; at jump level 2, B_i has a link pointing to B_{i-10}; at jump level 3, B_i has a link pointing to B_{i-20}. The number of jump levels of block B_i depends on the value of i: when i = 5, there are 2 levels (levels 0 and 1); when i = 10, there are 3 levels (levels 0, 1, 2); when i = 15, there are 2 levels (levels 0 and 1); when i = 20, there are 4 levels (levels 0, 1, 2, 3).
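The level counts in this example can be checked mechanically. The sketch below assumes, per the worked numbers (g_b = 2, μ = 4), that a backward skip level exists for B_i exactly when its jump distance divides i.

```python
def backward_targets(i: int, g_b: int = 2, mu: int = 4, l_b: int = 8):
    """Block indices reached by B_i's backward skip links: level l points to
    B_{i-j} with j = g_b**(l-1) * (mu + 1); per the worked example, a level
    exists only when j divides i (and the target index is non-negative)."""
    targets = []
    for l in range(1, l_b + 1):
        j = g_b ** (l - 1) * (mu + 1)
        if j <= i and i % j == 0:
            targets.append(i - j)
    return targets
```

Counting the ordinary level-0 link to the previous block as well, this reproduces the 2/3/2/4 level counts for i = 5, 10, 15, 20.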
For forward links, since subsequent model updates have not yet been generated when an anchor block is created in the blockchain, the system can generate the signature of the corresponding block (the forward link) in advance using a chameleon hash; when the real signature is later needed, the corresponding security parameter r' is stored in the anchor block to ensure consistency of the signature. A chameleon hash function CH allows the owner of a trapdoor key to quickly find two pairs (m, r) and (m', r') such that CH(m, r) = CH(m', r') with m ≠ m', where m and m' are two different messages and r and r' are random numbers. In this embodiment, since the subsequent container block has not yet been generated, the anchor block first produces a preset hash value using the chameleon hash function (i.e., m and r are generated randomly and the chameleon hash value is computed), treats it as the hash value of a subsequent container block, and then computes a signature over it to form the forward link, which is stored in the anchor block. Once the subsequent container block pointed to by the forward link is generated (its content now being m'), the trapdoor of the chameleon hash function is used to compute the corresponding r' (a random number) such that the chameleon hash value of the container block equals the hash value used to generate the signature in the anchor block.
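A toy Krawczyk–Rabin-style chameleon hash illustrates the collision-finding step described above. The tiny parameters (p = 23, q = 11) are for demonstration only, and the patent does not prescribe this particular construction.

```python
# Toy chameleon hash over a small prime-order subgroup:
# CH(m, r) = g^m * h^r mod p, with trapdoor x such that h = g^x mod p.
# Parameters are illustrative; a real system uses cryptographic sizes.
p, q, g = 23, 11, 4   # 4 generates the order-11 subgroup mod 23
x = 7                 # trapdoor, held by the verification node
h = pow(g, x, p)

def CH(m: int, r: int) -> int:
    return (pow(g, m, p) * pow(h, r, p)) % p

def find_collision(m: int, r: int, m_new: int) -> int:
    """Using the trapdoor x, find r' with CH(m_new, r') == CH(m, r).
    (Uses pow(x, -1, q), available from Python 3.8.)"""
    return (r + (m - m_new) * pow(x, -1, q)) % q
```

With the trapdoor, the anchor block's pre-committed hash of (m, r) can be matched later by the real container-block content m' via the computed r', exactly the consistency step the text describes.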
To improve the efficiency of the forward search, the present embodiment establishes two types of forward links, namely anchor-anchor links and anchor-container links. For the first type, to support secure verification-node updates and the corresponding key updates, an anchor-anchor link is established between two adjacent anchor blocks and occupies jump level l_f of each anchor block. For the second type, to improve retrieval efficiency, an exponential number of anchor-container links are established between one anchor block and the corresponding container blocks. In anchor block B_i, the forward link with jump level l, where 0 ≤ l ≤ l_f − 1, points to container block B_{i+j}, where j = (g_f)^l. Since an anchor block only links to its own container blocks, the largest jump satisfies (g_f)^(l_f − 1) ≤ n_c, so the maximum value of l_f is ⌊log_{g_f}(n_c)⌋ + 1, where n_c is the number of container blocks corresponding to one anchor block. Thus, within an anchor block, the number of forward links established is π_f = l_f + 1, where g_f and l_f are respectively the jump base and the number of jump levels of the forward links.
Illustratively, suppose there are blocks 1, 2, 3, …, 21 in order, and the model the user wants to find is stored in block 14. Since bidirectional links are established between blocks in this embodiment, the search can start from block 1: block 6 is examined next; since 14 is greater than 6, block 11 is examined, then block 16; since 14 is smaller than 16, the search proceeds backward to block 15 and then to block 14, where the required global model is found. Alternatively, the search may start from block 21 and proceed through blocks 16, 15 and 14. Current blockchain techniques only support sequential traversal, i.e., examining the contents of each block in order starting from block 1 until the target is found, so this embodiment effectively improves search efficiency. By establishing links in both directions between blocks, bidirectional queries are realized; and because the links are established with exponential spans and can skip over blocks, potentially irrelevant blocks need not be examined during a search, which effectively improves the efficiency of querying data.
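The traversal in this example can be modeled as follows (a simplified sketch using forward links of fixed span 5 and single-step backward links, matching the worked example rather than the full exponential scheme; the function name is illustrative):

```python
def search(target, n_blocks, span=5, start=1):
    """Traverse from the head of the chain using fixed-span forward links
    and single-step backward links. Returns the sequence of blocks examined."""
    path = [start]
    cur = start
    # follow forward links while still below the target
    while cur < target and cur + span <= n_blocks:
        cur += span
        path.append(cur)
    # overshoot: walk back one block at a time via backward links
    while cur > target:
        cur -= 1
        path.append(cur)
    # no forward link available near the chain tip: step forward
    while cur < target:
        cur += 1
        path.append(cur)
    return path
```

search(14, 21) visits [1, 6, 11, 16, 15, 14]: six blocks, instead of the fourteen a sequential traversal starting from block 1 would examine.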
In one possible implementation, some blocks are not provided with forward links, in consideration of computational complexity: if a forward link were established for every block, the computational cost and the block-storage cost would be too high. Meanwhile, since blocks can be searched one by one through backward links, block skipping is mainly realized through forward links; both nearer and farther blocks are taken into account, and the skip lengths are established on an exponential scale.
Compared with the existing sequential block-traversal search method, the verifiable blockchain data structure of the present application achieves O(log n)-level retrieval in both the forward and backward directions on the blockchain, effectively improves the query efficiency of model verification information, securely supports the periodic updating of verification-node keys, can provide a model-time reference for the user, and prevents replay attacks when the global model is sent to the user.
In one possible embodiment, the user may choose, according to the needs of the specific application, to subscribe to long-term updates of the global model or to make a one-time purchase of a particular version of the global model. The user's publication of model training requirements and acquisition of the trained model are realized through a "transaction" contract on the blockchain. The "transaction" contract specifies the proportions in which the tokens paid by the user are allocated to the verification nodes and the participating nodes.
In one possible implementation, for fairness among nodes, the reward is distributed equally among the verification-node members, while the participating nodes share their reward in proportion to the sizes of the local data sets used for training. In addition, all verification-node members and participating nodes must place a certain amount of deposit in the blockchain when joining the system. For situations such as malicious behavior or task-execution timeouts, the system punishes the offending node by not returning its deposit.
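The distribution rule above can be sketched as follows (a hypothetical helper; the verifier/participant split ratio is an assumed parameter of the "transaction" contract, not a value fixed by this text):

```python
def split_rewards(total_tokens, verifier_share, verifiers, dataset_sizes):
    """Split the user's payment: `verifier_share` of the total is divided
    equally among verification-node members; the remainder is divided among
    participating nodes in proportion to the local data-set size each node
    used for training."""
    v_pool = total_tokens * verifier_share
    p_pool = total_tokens - v_pool
    per_verifier = v_pool / len(verifiers)
    total_data = sum(dataset_sizes.values())
    per_participant = {node: p_pool * size / total_data
                       for node, size in dataset_sizes.items()}
    return per_verifier, per_participant
```

For example, with 100 tokens, an assumed 40% verifier share, 4 verifiers, and local data sets of 100 and 300 samples, each verifier receives 10 tokens and the two participants receive 15 and 45 tokens respectively.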
The embodiment of the present application discloses a federated learning method based on a blockchain network, which stores the global model generated in each round of training by establishing link relationships between non-adjacent blocks, facilitates skip-style searching by the user, and effectively improves the query efficiency of the global model.
Referring to fig. 4, a flowchart of a federal learning method based on a blockchain network according to the second embodiment of the present application is shown. As shown in fig. 4, the federal learning method is performed by a participating node, and may include the following steps:
step S401, training a local initial model by using corresponding training data to obtain a local model corresponding to the local initial model.
Step S402, the local model is sent to a plurality of verification nodes, so that the verification nodes generate a global model according to the obtained local models corresponding to different participation nodes.
Step S403, determining a storage node to be stored by the global model at least according to the link relation between non-adjacent blocks, and sending the global model to the storage node.
The present embodiment and the first embodiment can refer to each other, and details of the present embodiment are not repeated herein.
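The aggregation performed by the verification nodes in step S402 is not pinned to a specific rule by the text; a common choice is a FedAvg-style weighted average of the local model parameters, weighted by local data-set size. A minimal sketch, assuming models are plain parameter vectors (the function name and the choice of FedAvg are illustrative assumptions):

```python
def aggregate(local_models, dataset_sizes):
    """FedAvg-style global model: the weighted average of the participating
    nodes' parameter vectors, weighted by the size of each node's local
    training set. A generic sketch -- the embodiments do not fix a concrete
    aggregation rule."""
    total = sum(dataset_sizes)
    dim = len(local_models[0])
    return [sum(model[j] * n for model, n in zip(local_models, dataset_sizes)) / total
            for j in range(dim)]
```

For instance, aggregate([[1.0, 2.0], [3.0, 4.0]], [1, 3]) yields [2.5, 3.5]: the second node's model dominates because it trained on three times as much data.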
Referring to fig. 5, a flowchart of a federal learning method based on a blockchain network according to a third embodiment of the present application is shown. As shown in fig. 5, the federal learning method is executed by a storage node, and may include the following steps:
step S501, local models sent by a plurality of participating nodes are obtained, wherein each local model is obtained by training an initial model through training data of a corresponding participating node.
Step S502, the local model is sent to a plurality of verification nodes, so that the verification nodes generate a global model according to the obtained local models corresponding to different participation nodes.
Step S503, acquiring the global model generated by the verification node, and storing the global model at least according to the link relation between the non-adjacent blocks.
The present embodiment and the first embodiment can refer to each other, and the description of the present embodiment is omitted here.
Referring to fig. 6, a structural diagram of a verification node of a block chain network-based federal learning method according to the fourth embodiment of the present application is shown, and for convenience of description, only the parts related to the embodiment of the present application are shown.
The blockchain network comprises: the system comprises a plurality of participating nodes, a verification node and a storage node, wherein the verification node specifically comprises the following modules:
The local model collection module 601 is configured to obtain the local models sent by multiple participating nodes, where each local model is obtained by training an initial model with the training data of the corresponding participating node.
A global model generation module 602, configured to generate a global model according to the plurality of local models.
The storage module 603 is configured to determine a storage node where the global model is to be stored according to a link relationship between adjacent or non-adjacent blocks, and send the global model to the storage node.
In this embodiment, the local model collection module 601 may specifically include the following sub-modules:
the verification submodule is used for verifying the local model generated by the training of any one participant node according to a preset public key of the participant node;
and the local model obtaining submodule is used for obtaining a plurality of local models of which the participating nodes pass the verification, wherein the number of the local models passing the verification is not lower than a preset local model number threshold value.
In this embodiment, the link relationship includes a forward link and a backward link, and the storage module 603 specifically includes the following sub-modules:
and the forward and backward link establishing submodule is used for establishing forward links among the plurality of groups of non-adjacent blocks and establishing backward links among the plurality of groups of non-adjacent blocks.
And the storage submodule is used for determining a storage node to be stored by the global model at least according to the forward link and the backward link and sending the global model to the storage node.
In this embodiment of the present application, the forward link and backward link establishing sub-module may specifically include the following units:
and the pre-signature generating unit is used for generating a pre-signature of any non-adjacent block to be generated in the current block by using a chameleon hash function.
A forward link establishing unit, configured to establish a forward link between the current block and the to-be-generated non-adjacent block based on the pre-signature and a preset forward link parameter; forward links between sets of non-adjacent blocks are established.
In this embodiment of the present application, the forward and backward link establishing sub-module may specifically include the following units:
the backward link establishing unit is used for establishing a backward link between the current block and the generated non-adjacent block based on the hash value of any generated non-adjacent block and a preset backward link parameter; establishing backward links between groups of non-adjacent blocks.
In this embodiment of the present application, the blocks in the blockchain include anchor blocks and container blocks, one anchor block corresponds to a plurality of container blocks, and the forward links include anchor-anchor links and anchor-container links. The forward and backward link establishing sub-module specifically includes the following units:
The block linking unit is configured to establish anchor-anchor links between anchor blocks, and to establish anchor-container links between any anchor block and the container blocks corresponding to that anchor block.
In this embodiment of the application, the storage module 603 is further configured to determine, according to a link relationship between adjacent blocks, a storage node where the global model is to be stored, and send the global model to the storage node.
The verification node of the federated learning method based on a blockchain network provided in the embodiment of the present application may be applied to the first method embodiment, and for details, reference is made to the description of the foregoing method embodiment, and of course, corresponding virtual devices corresponding to the participating node and the storage node also exist, which are not described herein again.
Fig. 7 is a schematic structural diagram of a terminal device according to a fifth embodiment of the present application. As shown in fig. 7, the terminal device 700 of this embodiment includes: at least one processor 710 (only one shown in fig. 7), a memory 720, and a computer program 721 stored in the memory 720 and operable on the at least one processor 710, the steps in the above embodiment of the block chain network-based federal learning method being implemented when the computer program 721 is executed by the processor 710.
The terminal device 700 may be a desktop computer, a notebook, a palm computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, a processor 710 and a memory 720. Those skilled in the art will appreciate that fig. 7 is merely an example of the terminal device 700 and does not constitute a limitation of the terminal device 700, which may include more or fewer components than those shown, a combination of some of the components, or different components, such as an input-output device, a network access device, etc.
The processor 710 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The memory 720 may in some embodiments be an internal storage unit of the terminal device 700, such as a hard disk or a memory of the terminal device 700. The memory 720 may also be an external storage device of the terminal device 700 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 700. Further, the memory 720 may also include both an internal storage unit and an external storage device of the terminal device 700. The memory 720 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 720 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, etc. It should be noted that the contents of the computer-readable medium may be subject to appropriate additions or subtractions according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals.
An embodiment of the present application further provides a computer program product; when the computer program product runs on a terminal device, the terminal device implements the steps in the above method embodiments.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A federated learning method based on a blockchain network, the blockchain network comprising a plurality of participating nodes, a verification node and a storage node, the federated learning method being executed by the verification node, characterized by comprising the following steps:
obtaining local models sent by a plurality of participating nodes, wherein each local model is obtained by training an initial model through training data of the corresponding participating node;
generating a global model from the plurality of local models;
and determining a storage node to be stored by the global model at least according to the link relation between non-adjacent blocks, and sending the global model to the storage node.
2. The federal learning method for a blockchain network as claimed in claim 1, wherein the link relationship includes a forward link and a backward link, and the determining the storage node where the global model is to be stored according to at least the link relationship between non-adjacent blocks and sending the global model to the storage node comprises:
establishing forward links among a plurality of groups of non-adjacent blocks and establishing backward links among a plurality of groups of non-adjacent blocks;
and determining a storage node to be stored by the global model at least according to the forward link and the backward link, and sending the global model to the storage node.
3. The block chain network-based federal learning method of claim 2 wherein said establishing forward links between sets of non-adjacent blocks comprises:
generating a pre-signature of any non-adjacent block to be generated in the current block by using a chameleon hash function;
establishing a forward link between the current block and the non-adjacent block to be generated based on the pre-signature and a preset forward link parameter;
forward links between sets of non-adjacent blocks are established.
4. The method for federal learning based on a blockchain network as in claim 2, wherein said establishing backward links between groups of non-adjacent blocks comprises:
establishing backward links between the current block and the generated non-adjacent blocks based on the hash value of any generated non-adjacent block and preset backward link parameters;
establishing backward links between sets of non-adjacent blocks.
5. The block chain network-based federated learning method of claim 2, wherein the blocks in a block chain include anchor blocks and container blocks, and one anchor block corresponds to a plurality of container blocks, and the forward links include anchor links and anchor block links, and the establishing of the forward links between sets of non-adjacent blocks comprises:
establishing anchor links between anchor blocks, and establishing anchor block links between any anchor block and the container block corresponding to the anchor block.
6. The blockchain network based federated learning method of claim 1, wherein the obtaining the local model sent by the plurality of participating nodes comprises:
verifying a local model generated by training of any participating node according to a preset public key of the participating node;
and acquiring local models of a plurality of participant nodes which pass verification, wherein the number of the local models which pass verification is not lower than a preset local model number threshold value.
7. The blockchain network based federated learning method of any one of claims 1-6, further comprising:
and determining a storage node to be stored by the global model according to the link relation between adjacent blocks, and sending the global model to the storage node.
8. A validation node of a federated learning method based on a blockchain network, the blockchain network comprising: a plurality of participating nodes, a verifying node and a storage node, wherein the verifying node comprises:
the local module collection module is used for acquiring local models sent by a plurality of participating nodes, wherein each local model is obtained by training an initial model through training data of the corresponding participating node;
the global model generation module is used for generating a global model according to the plurality of local models;
and the storage module is used for determining a storage node to be stored by the global model according to the link relation between adjacent or non-adjacent blocks and sending the global model to the storage node.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202210713131.9A 2022-06-22 2022-06-22 Block chain network-based federal learning method, verification node, device and medium Pending CN115392475A (en)

Publication: CN115392475A (2022-11-25)
