CN114579957A - Trusted sandbox-based federated learning model training method and device, and electronic device - Google Patents

Trusted sandbox-based federated learning model training method and device, and electronic device

Info

Publication number
CN114579957A
CN114579957A (application CN202210068424.6A)
Authority
CN
China
Prior art keywords
training
sandbox
local
trusted
mec server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210068424.6A
Other languages
Chinese (zh)
Inventor
邱雪松
任殷林
沈韬
郭少勇
冯艳
亓峰
阮琳娜
张克勤
杨国铭
柏粉花
张驰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Provincial Academy Of Science And Technology
Beijing University of Posts and Telecommunications
Kunming University of Science and Technology
Original Assignee
Yunnan Provincial Academy Of Science And Technology
Beijing University of Posts and Telecommunications
Kunming University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Provincial Academy Of Science And Technology, Beijing University of Posts and Telecommunications, Kunming University of Science and Technology filed Critical Yunnan Provincial Academy Of Science And Technology
Priority to CN202210068424.6A
Publication of CN114579957A
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60: Protecting data
    • G06F 21/64: Protecting data integrity, e.g. using checksums, certificates or signatures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50: Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/52: Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems, during program execution, e.g. stack integrity; preventing unwanted data erasure; buffer overflow
    • G06F 21/53: Monitoring users, programs or devices to maintain the integrity of platforms during program execution by executing in a restricted environment, e.g. sandbox or secure virtual machine
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60: Protecting data
    • G06F 21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218: Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning
    • G06N 20/20: Ensemble learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a trusted sandbox-based federated learning model training method and device, and an electronic device. The method comprises: when it is determined that an MEC server belongs to a created trusted sandbox, sending first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration; determining a plurality of target local devices for federated learning based on a node selection request broadcast by the blockchain platform; receiving a global model and a training request sent by the blockchain platform, and controlling the plurality of target local devices to train the global model based on the training request to obtain local models; and aggregating the local models to obtain a new global model. The method, device, and electronic device can solve the privacy protection problem of model training data on a blockchain platform in the prior art.

Description

Trusted sandbox-based federated learning model training method and device, and electronic device
Technical Field
The invention relates to the technical field of data privacy protection, and in particular to a trusted sandbox-based federated learning model training method and device, and an electronic device.
Background
Privacy-preserving data sharing has become an important prerequisite for future networks. As an emerging trusted data sharing approach, the integration of federated learning and blockchain has attracted wide attention. Federated learning is a framework or system that enables multiple participants to collaboratively build and use machine learning models without revealing their original private data, while still achieving good performance. However, traditional federated learning achieves data privacy protection through model migration and ignores the trust issues of data, models, and results during transmission and computation. Meanwhile, the blockchain is a distributed ledger technology with multiple backups; it is essentially a distributed database without a central administrator, in which data trust is conveyed through a consensus mechanism during data sharing. Therefore, combining blockchain with federated learning has become one of the feasible approaches to solving the above problems in a mobile edge computing environment. However, when blockchain and federated learning are combined in the prior art, the privacy of the data shared on the blockchain platform still lacks protection.
Disclosure of Invention
The invention provides a trusted sandbox-based federated learning model training method and device, and an electronic device, which are used to solve the privacy protection problem of model training data on a blockchain platform when blockchain and federated learning are combined in the prior art.
The invention provides a trusted sandbox-based federated learning model training method, comprising:
when it is determined that the MEC server belongs to a created trusted sandbox, sending first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
determining a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform;
receiving a global model and a training request sent by the blockchain platform, distributing the global model and the training request to the plurality of target local devices, and controlling the target local devices to train the global model based on the training request to obtain local models;
acquiring the local models uploaded by the target local devices, and aggregating the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
The trusted sandbox-based federated learning model training method provided by the invention further comprises:
sending the first device information and the second device information to the blockchain platform for registration, and, after the blockchain platform reaches consensus on the first device information and the second device information, acquiring an incentive mechanism from the blockchain platform and executing the incentive mechanism on the target local devices.
According to the trusted sandbox-based federated learning model training method provided by the invention, the global model is generated by the blockchain platform based on a smart contract.
According to the trusted sandbox-based federated learning model training method provided by the invention, the trusted sandbox is created randomly for the MEC server by selecting idle blockchain node resources based on a state channel.
The trusted sandbox-based federated learning model training method provided by the invention further comprises:
acquiring training process data corresponding to the local models, and evaluating the training process of the target local devices based on the training process data.
According to the trusted sandbox-based federated learning model training method provided by the invention, the determining a plurality of target local devices for federated learning based on the node selection request comprises:
selecting target local devices for federated learning based on the node selection request in combination with a deep reinforcement learning algorithm.
The invention also provides a trusted sandbox-based federated learning model training device, comprising:
a first sending module, configured to, when it is determined that the MEC server belongs to a created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
a node selection module, configured to determine a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform;
a training module, configured to receive a global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models;
an aggregation module, configured to acquire the local models uploaded by the plurality of target local devices and aggregate the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the steps of any one of the above trusted sandbox based federated learning model training methods.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program that, when executed by a processor, performs the steps of the trusted sandbox based federated learning model training method as described in any one of the above.
The invention also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of any of the above-described trusted sandbox based federated learning model training methods.
According to the trusted sandbox-based federated learning model training method and device and the electronic device provided by the invention, the state channel is used to endow the blockchain with computing capability, and the trusted sandbox is created to instantiate the federated learning task in an untrusted edge computing environment, so that private data such as the global model and the new global model are shared within a trusted environment and the rights and interests of users are protected.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a trusted sandbox-based federated learning model training method provided in the present invention;
FIG. 2 is a flow chart of a registration phase in the trusted sandbox based federated learning model training method provided in the present invention;
FIG. 3 is a flowchart of a task initialization phase in the trusted sandbox-based federated learning model training method provided in the present invention;
FIG. 4 is a flow chart of the training and recording phases in the trusted sandbox based federated learning model training method provided in the present invention;
FIG. 5 is a timing diagram corresponding to the trusted sandbox-based federated learning model training method provided in the present invention;
FIG. 6 is a system architecture diagram corresponding to the trusted sandbox based federated learning model training method provided in the present invention;
FIG. 7 is a schematic structural diagram of a trusted sandbox-based federated learning model training device provided in the present invention;
FIG. 8 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The trusted sandbox-based federated learning model training method, apparatus, and electronic device of the present invention are described below with reference to fig. 1 to 8.
As shown in fig. 1, the trusted sandbox-based federated learning model training method provided by the present invention is applied to an MEC (Mobile Edge Computing) server and comprises the following steps:
and step 110, under the condition that the MEC server is determined to belong to the created trusted sandbox, sending first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices in communication connection with the MEC server to a block chain platform for registration.
It should be understood that a trusted sandbox is a trustworthy sandbox. A sandbox is a security mechanism in the field of computer security that provides an isolated environment for running programs.
The first device information corresponding to the MEC server and the second device information corresponding to the local devices under the trusted sandbox are sent to the blockchain platform for registration to establish the identities of the MEC server and the local devices; after the blockchain nodes jointly confirm these identities, an incentive mechanism is executed to encourage each device to participate in training. The first device information contains device data and resource information of the MEC server; the second device information contains device data and resource information of the local devices.
As shown in fig. 2, in the resource and data registration phase, the first device information is sent to the blockchain platform to register the MEC server under the trusted sandbox, and the second device information is sent to the blockchain platform to register the local devices under the trusted sandbox.
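By way of an illustrative sketch only, the registration payload of this phase could be assembled roughly as follows in Python; the BlockchainClient class, the register call, and all field names are assumptions introduced here for illustration and are not prescribed by the embodiments.

from dataclasses import dataclass, asdict
from typing import List

@dataclass
class DeviceInfo:
    """Device data and resource information to be registered on the blockchain platform."""
    device_id: str
    cpu_cores: int
    memory_gb: float
    bandwidth_mbps: float
    data_descriptor: str  # identifier of the locally held raw data, not the data itself

class BlockchainClient:
    """Hypothetical client wrapping the blockchain platform's registration interface."""
    def register(self, payload: dict) -> bool:
        # A real deployment would submit a signed transaction and wait for consensus.
        print(f"registering: {payload}")
        return True

def register_sandbox_members(chain: BlockchainClient,
                             mec_info: DeviceInfo,
                             local_infos: List[DeviceInfo]) -> bool:
    """Send the first device information (MEC server) and second device information (local devices)."""
    payload = {
        "mec_server": asdict(mec_info),
        "local_devices": [asdict(d) for d in local_infos],
    }
    return chain.register(payload)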
Further, the local device may be a personal computer, a vehicle, or a mobile phone.
Step 120: determine a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform.
It will be appreciated that the target local devices for federated learning are determined from the plurality of local devices communicatively connected to the MEC server.
As shown in fig. 3, in the federated learning task initialization phase, the blockchain platform broadcasts a node selection request to the MEC server after receiving a task request. The MEC server then selects, through a node selection algorithm, the target local devices suited to the current federated learning task.
The MEC server creates a trusted sandbox through a state channel and returns the IDs (identifiers) of the corresponding MEC server and local devices in the trusted sandbox to the blockchain platform; the federated learning task is created successfully once the blockchain nodes reach consensus.
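A minimal sketch of this initialization handshake, under the assumption of a simple dictionary-based sandbox record and a stub consensus callback (none of these names come from the embodiments), might look as follows:

import uuid
from typing import Dict, List

def create_trusted_sandbox(mec_server_id: str,
                           target_device_ids: List[str]) -> Dict:
    """Assemble the sandbox record that the MEC server returns to the blockchain platform."""
    return {
        "sandbox_id": str(uuid.uuid4()),   # created through the state channel
        "mec_server_id": mec_server_id,
        "member_ids": target_device_ids,   # IDs of the selected local devices
    }

def initialize_task(chain_consensus, mec_server_id: str,
                    target_device_ids: List[str]) -> Dict:
    """The federated learning task is created only after the blockchain nodes reach consensus."""
    sandbox = create_trusted_sandbox(mec_server_id, target_device_ids)
    if not chain_consensus(sandbox):
        raise RuntimeError("blockchain nodes rejected the sandbox record")
    return sandbox

# Usage with a stub consensus callback that always approves.
task_sandbox = initialize_task(lambda record: True, "mec-01", ["dev1", "dev2"])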
Step 130: receive the global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models.
Step 140: acquire the local models uploaded by the target local devices and aggregate the local models to obtain a new global model.
The trusted sandbox is created for the MEC server based on a blockchain state channel.
It should be understood that, under the trusted sandbox, one MEC server may be communicatively connected with one or more local devices. During training, after receiving the global model to be trained from the blockchain platform, the MEC server under the trusted sandbox distributes the global model to the corresponding local devices and broadcasts a training request; after receiving the training request, each local device trains the received global model to obtain its own local model.
Each local device transmits its local model back to the MEC server, and the MEC server aggregates the local models to obtain the new global model, i.e., the federated learning model.
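As a minimal sketch only, assuming a linear model trained with gradient descent and a FedAvg-style weighted average (the embodiments do not name a specific aggregation rule), the distribute-train-aggregate loop could look as follows:

import numpy as np
from typing import List, Tuple

def local_train(global_weights: np.ndarray,
                features: np.ndarray,
                labels: np.ndarray,
                lr: float = 0.01,
                epochs: int = 5) -> np.ndarray:
    """Train a linear model on one local device, starting from the global weights."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = features @ w
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def aggregate(local_models: List[Tuple[np.ndarray, int]]) -> np.ndarray:
    """Aggregate local models into a new global model (weighted mean by sample count)."""
    total = sum(n for _, n in local_models)
    return sum(w * (n / total) for w, n in local_models)

# Usage: two target local devices train on their private data; the MEC server aggregates.
rng = np.random.default_rng(0)
global_w = np.zeros(3)
device_data = [(rng.normal(size=(50, 3)), rng.normal(size=50)),
               (rng.normal(size=(80, 3)), rng.normal(size=80))]
locals_ = [(local_train(global_w, X, y), len(y)) for X, y in device_data]
new_global_w = aggregate(locals_)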
It should be understood that the state channel is a blockchain state channel. The basic components of a state channel are as follows: 1. part of the blockchain state is locked by multi-signature and a smart contract, so that the participants must fully agree with each other to update it; 2. the participants update the state themselves by generating and signing transactions, which are ultimately uploaded to the blockchain rather than computed directly on the chain, and each new update supersedes the previous one; 3. eventually, the participants return the state to the blockchain, which closes the state channel and locks the state again (usually in a different configuration than at the beginning).
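The three components just listed can be illustrated with the following simplified Python sketch; the signatures are mocked with hashes and the StateChannel class is an assumption made purely for illustration, not part of any blockchain client library:

import hashlib
import json
from dataclasses import dataclass, field
from typing import Dict, List

def mock_sign(participant: str, state: Dict) -> str:
    """Stand-in for a real cryptographic signature over the channel state."""
    blob = participant + json.dumps(state, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

@dataclass
class StateChannel:
    participants: List[str]
    state: Dict                      # part of the blockchain state locked in the channel
    version: int = 0
    signatures: Dict[str, str] = field(default_factory=dict)

    def update(self, new_state: Dict) -> None:
        """Off-chain update: every participant must sign, superseding the previous update."""
        self.version += 1
        self.state = new_state
        self.signatures = {p: mock_sign(p, new_state) for p in self.participants}

    def settle(self) -> Dict:
        """Return the latest fully signed state to the blockchain and close the channel."""
        assert len(self.signatures) == len(self.participants), "unanimous agreement required"
        return {"final_state": self.state, "version": self.version}

# Usage: a sandbox channel between the MEC server and the blockchain platform.
channel = StateChannel(["mec_server", "blockchain_platform"], {"global_model_hash": None})
channel.update({"global_model_hash": "0xabc"})   # off-chain training-round update
on_chain_record = channel.settle()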
It should be noted that the MEC server and the local devices can receive the global model to be trained from the blockchain platform only after registration on the blockchain platform is complete.
As shown in fig. 4, in the training and recording phase, the blockchain platform first initializes a smart contract to obtain the global model and sends the global model to the trusted sandbox. The trusted sandbox issues the model to the specific local devices and broadcasts a training request to each selected node, i.e., each selected local device.
After the local devices finish training, the trusted sandbox aggregates the local models into a new global model and executes the incentive strategy and the reputation evaluation operation. Finally, the update result is confirmed through blockchain node consensus, and the MEC server of the trusted sandbox stores the obtained new global model on the blockchain platform.
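The training and recording phase described above could be orchestrated by the trusted sandbox roughly as follows; every callable here is a stand-in for components described elsewhere in this specification, and all names are illustrative assumptions:

from typing import Callable, Dict, List

def run_training_round(fetch_global_model: Callable[[], Dict],
                       target_devices: List[str],
                       train_on_device: Callable[[str, Dict], Dict],
                       aggregate: Callable[[List[Dict]], Dict],
                       apply_incentive: Callable[[List[str]], None],
                       evaluate_reputation: Callable[[List[str]], None],
                       commit_to_chain: Callable[[Dict], None]) -> Dict:
    """One federated learning round inside the trusted sandbox (MEC server side)."""
    global_model = fetch_global_model()                    # initialized by the smart contract
    local_models = [train_on_device(d, global_model)       # broadcast training request,
                    for d in target_devices]               # each device trains locally
    new_global_model = aggregate(local_models)             # sandbox-side aggregation
    apply_incentive(target_devices)                        # execute incentive strategy
    evaluate_reputation(target_devices)                    # reputation evaluation
    commit_to_chain(new_global_model)                      # stored after node consensus
    return new_global_model

# Minimal usage with stub callables.
result = run_training_round(
    fetch_global_model=lambda: {"weights": [0.0]},
    target_devices=["device_a", "device_b"],
    train_on_device=lambda d, m: {"weights": [0.1]},
    aggregate=lambda ms: {"weights": [sum(m["weights"][0] for m in ms) / len(ms)]},
    apply_incentive=lambda ds: None,
    evaluate_reputation=lambda ds: None,
    commit_to_chain=lambda m: None,
)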
Further, fig. 5 shows a sequence diagram corresponding to the trusted sandbox-based federated learning model training method provided by the present invention.
In some embodiments, the trusted sandbox based federated learning model training method further comprises:
and sending the first equipment information and the second equipment information to a block chain platform for registration, acquiring an excitation mechanism of the block chain platform after the first equipment information and the second equipment information are identified by the block chain platform, and executing the excitation mechanism on the target local equipment.
It should be understood that the incentive mechanism may include economic incentives and a token distribution system. Its main role is to motivate each node in the network layer to maintain the blockchain network: nodes that maintain the blockchain system receive economic returns under the constraints of the system, while nodes that intentionally damage the blockchain system are penalized.
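As one possible, purely illustrative realization of such an incentive mechanism (the concrete reward and penalty rules below are assumptions, not part of the embodiments), tokens could be distributed in proportion to contribution while penalized nodes forfeit their reward:

from typing import Dict, List

def distribute_tokens(contributions: Dict[str, int],
                      reward_pool: float,
                      penalized: List[str]) -> Dict[str, float]:
    """Split a token reward pool among participating nodes in proportion to contribution;
    nodes flagged as malicious forfeit their reward (illustrative penalty rule)."""
    honest = {n: c for n, c in contributions.items() if n not in penalized}
    total = sum(honest.values()) or 1
    rewards = {n: reward_pool * c / total for n, c in honest.items()}
    rewards.update({n: 0.0 for n in penalized})
    return rewards

# Usage: three local devices, one flagged for tampering with its local model.
print(distribute_tokens({"dev1": 50, "dev2": 80, "dev3": 30},
                        reward_pool=100.0, penalized=["dev3"]))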
In some embodiments, the global model is generated by the blockchain platform based on a smart contract.
It should be understood that a smart contract is a computer protocol intended to propagate, verify, or execute a contract in an informational manner. Smart contracts allow trusted transactions, which are traceable and irreversible, to be conducted without a third party. The global model is the business model that processes the industrial data.
Further, the trusted sandbox is created randomly for the MEC server by selecting idle blockchain node resources based on a state channel.
It should be understood that an idle blockchain node resource may be an idle MEC server. When an MEC server determines that it is idle, a trusted sandbox is created based on that MEC server and the local devices communicatively connected to it.
In some embodiments, the trusted sandbox based federated learning model training method further comprises:
and acquiring training process data corresponding to the local model, and evaluating the training process of the target local equipment based on the training process data.
It should be understood that, after the training process of a target local device has been evaluated, the evaluation result may determine whether that local device is selected again for the next round of training. For example, when the evaluation result of a local device's training process is poor, a different local device is selected for the next round; when the evaluation result is good, the same local device may be selected again.
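A minimal sketch of such an evaluation, assuming an exponential moving average of per-round scores and a fixed selection threshold (both assumptions introduced here, not prescribed by the embodiments), might be:

def update_reputation(previous: float, round_score: float, alpha: float = 0.3) -> float:
    """Blend the latest training-process score into the device's reputation
    (exponential moving average; the weighting rule is an assumption)."""
    return (1 - alpha) * previous + alpha * round_score

def select_for_next_round(reputations: dict, threshold: float = 0.6) -> list:
    """Keep devices whose reputation stays above a threshold for the next round."""
    return [device for device, score in reputations.items() if score >= threshold]

# Usage: a device with a poor round score drops below the threshold and is not reselected.
reps = {"dev1": update_reputation(0.8, 0.9), "dev2": update_reputation(0.7, 0.2)}
print(select_for_next_round(reps))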
In some embodiments, the determining a plurality of target local devices for federal learning based on the node selection request includes:
and selecting target local equipment for federal learning based on the node selection request and by combining a deep reinforcement learning algorithm.
It should be understood that the deep reinforcement learning algorithm may be A3C (Asynchronous Advantage Actor-Critic), Q-learning, DQN (Deep Q-Network), or DPG (Deterministic Policy Gradient).
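A full A3C or DQN implementation requires a neural network; as a minimal stand-in, the simplified tabular Q-learning sketch below only illustrates the idea of learning which local devices to select from observed round rewards, with the state, action, and reward definitions all being assumptions made here for illustration:

import random
from collections import defaultdict
from typing import Dict, List

def q_learning_node_selection(devices: List[str],
                              reward_fn,
                              episodes: int = 200,
                              alpha: float = 0.1,
                              gamma: float = 0.9,
                              epsilon: float = 0.2) -> Dict[str, float]:
    """Learn a value for selecting each local device from simulated round rewards."""
    q: Dict[str, float] = defaultdict(float)
    for _ in range(episodes):
        # epsilon-greedy choice of which device to include in the next round
        if random.random() < epsilon:
            device = random.choice(devices)
        else:
            device = max(devices, key=lambda d: q[d])
        reward = reward_fn(device)          # e.g. accuracy gain minus latency cost
        best_next = max(q[d] for d in devices)
        q[device] += alpha * (reward + gamma * best_next - q[device])
    return dict(q)

# Usage: devices with better data and compute yield higher simulated rewards.
true_quality = {"dev1": 0.9, "dev2": 0.4, "dev3": 0.7}
q_values = q_learning_node_selection(
    list(true_quality), reward_fn=lambda d: random.gauss(true_quality[d], 0.1))
selected = sorted(q_values, key=q_values.get, reverse=True)[:2]  # pick top-2 target devices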
In some embodiments, as shown in FIG. 6, sandboxing and state channels are introduced to build a new data privacy sharing paradigm through blockchain and federated learning. In this paradigm, a trusted sandbox is randomly created using the state channel to instantiate the federated learning task in the untrusted edge computing environment.
The raw data is passed into the trusted sandbox and destroyed after computation. At the same time, the result is transferred to the requester in a trusted manner, and the trusted sandbox is revoked, releasing its resources. This process supports trusted computing and sharing of data while the process itself remains secure and invisible to outsiders, and is called "sandbox computing" or "trusted sandbox computing".
The blockchain nodes are deployed on a plurality of MEC servers with strong computing, storage, and communication capabilities to construct the blockchain platform. The blockchain platform supports users' federated learning task requests, trusted management of network resources, and trusted management of data. Meanwhile, each MEC server or local device carries a federated learning module used to instantiate the federated learning task. The architecture consists of the following parts:
a block chain platform: the blockchain platform is displayed on a plurality of MEC servers with strong computing, storing and communication capabilities. The block chain platform has the functions of network resource trust management, data trust identification service, service request and distribution, network resource scheduling, service quality evaluation and the like. The method supports the user to request the Federal learning task to randomly create a sandbox, and instantiates the Federal learning task by using a state channel in an untrusted edge computing environment. And meanwhile, recording and evaluating the credit information of the local equipment on a federal learning node selection platform.
MEC server: the MEC server hosts a deployed federated learning module and a blockchain node, and also manages the local devices in its domain. Each MEC server selects an optimal group of local devices to instantiate the intelligent-engine-based federated learning task. The global model is then distributed to each local device, and after training the local models are aggregated into a new global model.
Local devices: these consist of different types of mobile and fixed devices (e.g., vehicles or mobile phones), each carrying a federated learning module for local model training together with its raw data. The local devices communicate with the MEC server through base stations. Each local device registers its computing, communication, and storage resources on the blockchain platform. Meanwhile, to protect user data privacy while participating in cooperative work, the identifier of the raw data is also registered on the blockchain platform and can be verified at any time. Distributed model training is performed on the different local devices through federated learning: instead of uploading raw data to the MEC server and the blockchain platform, the devices upload models.
In other embodiments, a sandbox-computing-based private data sharing architecture is presented. Based on a consortium blockchain, trust can be established between data providers in different Internet of Things domains. In addition, through the state channel and the trusted sandbox, the entire process of the federated learning task can be supervised more effectively. In this way, the architecture supports private information sharing in various Internet of Things fields, such as the Internet of Vehicles, smart healthcare, smart home, wearable devices, and robotics.
Consider medical data such as electronic medical records and medical images. On the one hand, such data helps realize precision healthcare; on the other hand, it is often scattered across institutions and, because of its sensitivity, difficult to share. For example, a smart watch collects personal real-time heartbeat data, and researchers try to combine it with electronic medical records to train a model that predicts stroke months in advance. However, the smart watch manufacturer and the hospitals holding the medical records refuse to disclose the data for privacy reasons. The private data sharing architecture proposed herein can ensure that sensitive patient data stays in the hands of local institutions or individual consumers and is not leaked to model trainers during federated machine learning, while the credibility of the federated learning training process is guaranteed.
The Internet of Vehicles is one of the application scenarios of smart cities and is composed of intelligent vehicles with data collection, computing, and communication capabilities. It can realize functions such as navigation, autonomous driving, traffic control, intelligent parking, real-time warning of road conditions (congestion ahead, bumpy roads, or icing), and coordinated lane changing between vehicles. Although current edge-computing-based Internet of Vehicles frameworks are relatively mature, vehicle data still involves private data such as the driver's driving route and personal information. In Internet of Vehicles scenarios, the private data sharing architecture provided by the present invention can be adopted: the whole federated learning process is supervised by technologies such as the trusted sandbox and the blockchain, so that a vehicle can train on its local device and upload only updated model parameters, effectively protecting the security of local private data.
In summary, the trusted sandbox-based federated learning model training method provided by the present invention includes: when it is determined that the MEC server belongs to the created trusted sandbox, sending first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to the blockchain platform for registration; determining a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform; receiving the global model and a training request sent by the blockchain platform, distributing them to the plurality of target local devices, and controlling the target local devices to train the global model based on the training request to obtain local models; acquiring the local models uploaded by the target local devices and aggregating them to obtain a new global model; wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
According to the trusted sandbox-based federated learning model training method, the state channel is used to endow the blockchain with computing capability, and the trusted sandbox is created to instantiate the federated learning task in an untrusted edge computing environment, so that private data such as the global model and the new global model are shared within a trusted environment and the rights and interests of users are protected.
The trusted sandbox-based federated learning model training device provided by the present invention is described below, and the trusted sandbox-based federated learning model training device described below and the trusted sandbox-based federated learning model training method described above may be referred to each other.
As shown in fig. 7, the trusted sandbox-based federated learning model training device 700 provided by the present invention includes: a first sending module 710, a node selection module 720, a training module 730, and an aggregation module 740.
The first sending module 710 is configured to, when it is determined that the MEC server belongs to the created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to the blockchain platform for registration.
The node selection module 720 is configured to determine a plurality of target local devices for federal learning based on the node selection request broadcast by the blockchain platform.
The training module 730 is configured to receive a global model and a training request sent by the blockchain platform, allocate the global model and the training request to the plurality of target local devices to control the plurality of target local devices, and train the global model based on the training request to obtain a local model.
The aggregation module 740 is configured to obtain the local models uploaded by the target local devices, and aggregate the local models to obtain a new global model.
The trusted sandbox is created for the MEC server based on a blockchain state channel.
In some embodiments, the trusted sandbox-based federated learning model training apparatus 700 further comprises an incentive module.
The incentive module is configured to send the first device information and the second device information to the blockchain platform for registration, and, after the blockchain platform reaches consensus on the first device information and the second device information, to acquire an incentive mechanism from the blockchain platform and execute the incentive mechanism on the target local devices.
In some embodiments, the global model is generated by the blockchain platform based on a smart contract.
Further, the trusted sandbox is created randomly for the MEC server by selecting idle blockchain node resources based on a state channel.
In some embodiments, the trusted sandbox-based federated learning model training apparatus 700 further comprises a second sending module.
The second sending module is used for obtaining training process data corresponding to the local model and evaluating the training process of the target local device based on the training process data.
In some embodiments, the node selection module 720 is further configured to select target local devices for federated learning based on the node selection request in combination with a deep reinforcement learning algorithm.
The electronic device, the computer program product, and the storage medium according to the present invention are described below, and the electronic device, the computer program product, and the storage medium described below may be referred to in correspondence with the above-described trusted sandbox-based federated learning model training method.
Fig. 8 illustrates a physical structure diagram of an electronic device, and as shown in fig. 8, the electronic device may include: a processor (processor)810, a communication Interface 820, a memory 830 and a communication bus 840, wherein the processor 810, the communication Interface 820 and the memory 830 communicate with each other via the communication bus 840. The processor 810 may invoke logic instructions in the memory 830 to perform a trusted sandbox based federated learning model training method comprising:
Step 110: when it is determined that the MEC server belongs to the created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
Step 120: determine a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform;
Step 130: receive a global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models;
Step 140: acquire the local models uploaded by the target local devices and aggregate the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
In addition, the logic instructions in the memory 830 may be implemented in software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium; when the computer program is executed by a processor, it can perform the trusted sandbox-based federated learning model training method provided above, the method including:
Step 110: when it is determined that the MEC server belongs to the created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
Step 120: determine a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform;
Step 130: receive a global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models;
Step 140: acquire the local models uploaded by the target local devices and aggregate the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the trusted sandbox-based federated learning model training method provided above, the method including:
Step 110: when it is determined that the MEC server belongs to the created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
Step 120: determine a plurality of target local devices for federated learning based on the node selection request broadcast by the blockchain platform;
Step 130: receive a global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models;
Step 140: acquire the local models uploaded by the target local devices and aggregate the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A trusted sandbox-based federated learning model training method, characterized by comprising:
when it is determined that an MEC server belongs to a created trusted sandbox, sending first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
determining a plurality of target local devices for federated learning based on a node selection request broadcast by the blockchain platform;
receiving a global model and a training request sent by the blockchain platform, distributing the global model and the training request to the plurality of target local devices, and controlling the target local devices to train the global model based on the training request to obtain local models;
acquiring the local models uploaded by the target local devices, and aggregating the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
2. The trusted sandbox-based federated learning model training method of claim 1, further comprising:
sending the first device information and the second device information to the blockchain platform for registration, and, after the blockchain platform reaches consensus on the first device information and the second device information, acquiring an incentive mechanism from the blockchain platform and executing the incentive mechanism on the target local devices.
3. The trusted sandbox-based federated learning model training method of claim 1, wherein the global model is generated by the blockchain platform based on a smart contract.
4. The trusted sandbox-based federated learning model training method of claim 1, wherein the trusted sandbox is created randomly for the MEC server by selecting idle blockchain node resources based on a state channel.
5. The trusted sandbox-based federated learning model training method of claim 1, further comprising:
acquiring training process data corresponding to the local models, and evaluating the training process of the target local devices based on the training process data.
6. The trusted sandbox-based federated learning model training method of any one of claims 1 to 5, wherein the determining a plurality of target local devices for federated learning based on the node selection request comprises:
selecting target local devices for federated learning based on the node selection request in combination with a deep reinforcement learning algorithm.
7. A trusted sandbox-based federated learning model training device, characterized by comprising:
a first sending module, configured to, when it is determined that an MEC server belongs to a created trusted sandbox, send first device information corresponding to the MEC server and second device information corresponding to a plurality of local devices communicatively connected to the MEC server to a blockchain platform for registration;
a node selection module, configured to determine a plurality of target local devices for federated learning based on a node selection request broadcast by the blockchain platform;
a training module, configured to receive a global model and a training request sent by the blockchain platform, distribute the global model and the training request to the plurality of target local devices, and control the target local devices to train the global model based on the training request to obtain local models;
an aggregation module, configured to acquire the local models uploaded by the target local devices and aggregate the local models to obtain a new global model;
wherein the trusted sandbox is created for the MEC server based on a blockchain state channel.
8. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the steps of the trusted sandbox based federated learning model training method of any one of claims 1 to 6.
9. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the steps of the trusted sandbox based federated learning model training method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the trusted sandbox based federated learning model training method of any one of claims 1 to 6.
CN202210068424.6A 2022-01-20 2022-01-20 Trusted sandbox-based federated learning model training method and device, and electronic device Pending CN114579957A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210068424.6A CN114579957A (en) Trusted sandbox-based federated learning model training method and device, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210068424.6A CN114579957A (en) Trusted sandbox-based federated learning model training method and device, and electronic device

Publications (1)

Publication Number Publication Date
CN114579957A true CN114579957A (en) 2022-06-03

Family

ID=81771143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210068424.6A Pending CN114579957A (en) 2022-01-20 2022-01-20 Credible sandbox-based federated learning model training method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114579957A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116628682A (en) * 2023-07-24 2023-08-22 中电科大数据研究院有限公司 Data contract type opening method based on data sandbox and related equipment
CN116628682B (en) * 2023-07-24 2023-11-14 中电科大数据研究院有限公司 Data contract type opening method based on data sandbox and related equipment
CN116720594A (en) * 2023-08-09 2023-09-08 中国科学技术大学 Decentralized hierarchical federal learning method
CN116720594B (en) * 2023-08-09 2023-11-28 中国科学技术大学 Decentralized hierarchical federal learning method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination