WO2021184345A1 - Privacy-preserving machine learning implementation method, apparatus, device and storage medium - Google Patents
Privacy-preserving machine learning implementation method, apparatus, device and storage medium
- Publication number
- WO2021184345A1 (application PCT/CN2020/080390, CN2020080390W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- machine learning
- native
- node
- cryptographic
- plaintext
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Definitions
- This specification relates to the field of machine learning technology, and in particular to a method, apparatus, device and storage medium for implementing privacy-preserving machine learning.
- Machine learning has been applied in many fields.
- Machine learning requires the use of a large amount of sample data.
- Two or more data holders will often collaborate in machine learning modeling.
- For any data holder, there is a risk that its private data will be leaked or used improperly. Therefore, protecting private data well is very important for machine learning.
- In order to solve this problem, various frameworks based on encrypted machine learning (hereinafter referred to as privacy-preserving machine learning frameworks) have emerged, such as TF-Encrypted and PySyft.
- These privacy-preserving machine learning frameworks reuse the easy-to-use application programming interfaces (APIs) of plaintext machine learning frameworks (such as TensorFlow and PyTorch), while training on and predicting from encrypted data through a variety of cryptographic algorithms, so that users can use them without expertise in cryptography, distributed systems, or high-performance computing.
- The purpose of the embodiments of this specification is to provide a way to reduce the implementation cost of privacy-preserving machine learning.
- the implementation of this specification provides a method for implementing privacy machine learning, including:
- the plaintext machine learning model is generated based on a plaintext machine learning framework
- the dynamic optimizer is derived from the optimizer base class of the plaintext machine learning framework, and the cryptographic operator and the dynamic optimizer are registered in the plaintext machine learning framework.
- the implementation of this specification also provides a device for implementing privacy machine learning, including:
- the judgment module is used to confirm, when the computational logic of the plaintext machine learning model is executed, whether the execution time of the dynamic optimizer in the plaintext machine learning model has arrived;
- the execution module is configured to execute the dynamic optimizer when the execution time of the dynamic optimizer arrives, so as to replace the native operator in the calculation logic with a cryptographic operator, and execute the cryptographic operator;
- the plaintext machine learning model is generated based on a plaintext machine learning framework
- the dynamic optimizer is derived from the optimizer base class of the plaintext machine learning framework, and the cryptographic operator and the dynamic optimizer are registered in the plaintext machine learning framework.
- the embodiments of this specification also provide an electronic device, including a memory, a processor, and a computer program stored on the memory, where the computer program, when run by the processor, executes the aforementioned privacy-preserving machine learning implementation method.
- the implementation of this specification also provides a computer storage medium on which a computer program is stored, and when the computer program is executed by a processor, the aforementioned privacy machine learning implementation method is implemented.
- the native operators in the computational logic are replaced with cryptographic operators, and then, by executing the cryptographic operators, privacy-preserving machine learning can be realized by reusing the existing plaintext machine learning model, thus avoiding the prior-art cost of learning the APIs specific to a privacy-preserving machine learning framework and rebuilding the model on that framework.
- Figure 1 is a flowchart of a privacy machine learning implementation method of some embodiments of this specification
- FIG. 2 is a schematic diagram of a structure in which multiple data holders jointly perform machine learning in an exemplary embodiment of this specification
- Fig. 3 is a flowchart of a method for implementing privacy machine learning according to an embodiment of this specification
- Figure 4 is a native static graph of the plaintext machine learning model before executing the dynamic optimizer in an exemplary embodiment of this specification;
- Figure 5 is a cryptographic static graph of the plaintext machine learning model after executing the dynamic optimizer in an exemplary embodiment of this specification
- Fig. 6 is a structural block diagram of a privacy machine learning implementation device according to some embodiments of this specification.
- FIG. 7 is a structural block diagram of an electronic device in some embodiments of this specification.
- the privacy machine learning implementation method may include the following steps:
- the plaintext machine learning model is generated based on a plaintext machine learning framework
- the dynamic optimizer is derived from the optimizer base class of the plaintext machine learning framework, and the cryptographic operator and the dynamic optimizer are registered in the plaintext machine learning framework.
- in the embodiments of this specification, a dynamic optimizer is imported into the plaintext machine learning model; the dynamic optimizer is derived from the optimizer base class of the plaintext machine learning framework, the cryptographic operator and the dynamic optimizer are registered in the plaintext machine learning framework, and the plaintext machine learning model is generated based on the plaintext machine learning framework.
- since the embodiments of this specification replace the native operators with cryptographic operators only when the computational logic of the plaintext machine learning model is about to be executed, this dynamic replacement method has high flexibility. Moreover, it enables subsequent function expansion (for example, adding some business judgment conditions) without complicating the static graph, which helps users find problems and is user-friendly.
- the plaintext machine learning framework can be any existing plaintext machine learning framework, such as TensorFlow, PyTorch, or Caffe. Therefore, this specification does not limit the specific plaintext machine learning framework used to generate the plaintext machine learning model, and it can be selected according to actual needs.
- the cryptographic operator can be implemented by a developer in advance in a static language (such as C or C++) to improve efficiency, and registered in the plaintext machine learning framework after implementation.
- the cryptographic operators generally also include cryptographic gradient operators.
- these cryptographic operators should correspond one-to-one with the native operators in the plaintext machine learning model generated based on the plaintext machine learning framework, so as to facilitate subsequent corresponding replacements.
- a cryptographic operator is any operator that can provide privacy protection for the input data of all parties in a scenario where two or more data holders jointly (or collaboratively) perform machine learning modeling.
- the cryptographic operator may be a secure multi-party computation (MPC) operator, a homomorphic encryption (HE) operator, a zero-knowledge proof (ZKP) operator, or the like.
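As a rough illustration of the kind of protection an MPC operator provides, the sketch below uses additive secret sharing so that individual inputs stay hidden while a joint result is computed. It is a minimal toy, not the protocol of this specification or of any particular framework; the modulus and share count are arbitrary illustrative choices.

```python
import random

P = 2**61 - 1  # a large prime modulus (illustrative choice)

def share(x, n=3):
    """Split secret x into n additive shares that sum to x mod P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares into the original secret."""
    return sum(shares) % P

# Each holder contributes a private value; parties add their local
# shares, and only the combined result is ever reconstructed.
a_shares = share(123)
b_shares = share(456)
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 579
```

No single share reveals anything about the input, which is why replacing a plaintext operator with such an MPC operator protects each party's data.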
- this specification does not limit the specific cryptographic operators used, which can be selected according to actual needs.
- the dynamic optimizer can also be derived and implemented by the developer in advance from the optimizer base class of the plaintext machine learning framework, so that the optimizer base class of the plaintext machine learning framework can be reused.
- a developer can derive and implement a dynamic optimizer based on the TensorFlow optimizer base class GraphOptimizationPass, thereby reusing the TensorFlow optimizer base class.
- the dynamic optimizer can be registered in the plaintext machine learning framework to facilitate the use of the plaintext machine learning model generated based on the plaintext machine learning framework.
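The registration described above can be pictured with a minimal sketch, assuming a toy framework that keeps an operator registry and an optimizer base class. All names here are illustrative assumptions, not the actual TensorFlow API (whose pass registry lives in C++):

```python
# Toy registries standing in for the plaintext framework's internals.
CRYPTO_OPS = {}   # native op type -> cryptographic op type
OPTIMIZERS = []   # dynamic optimizers registered with the framework

def register_crypto_op(native_type, crypto_type):
    """Register a cryptographic operator for a native operator type."""
    CRYPTO_OPS[native_type] = crypto_type

class OptimizerBase:
    """Stand-in for the framework's optimizer base class."""
    def run(self, graph):
        raise NotImplementedError

def register_optimizer(cls):
    """Register an optimizer derived from OptimizerBase."""
    OPTIMIZERS.append(cls())
    return cls

@register_optimizer
class DynamicOptimizer(OptimizerBase):
    """Derived optimizer: swaps registered native ops for crypto ops."""
    def run(self, graph):
        for node in graph:
            if node["op"] in CRYPTO_OPS:
                node["op"] = CRYPTO_OPS[node["op"]]
        return graph

register_crypto_op("MatMul", "MpcMatMul")

graph = [{"name": "MatMul1", "op": "MatMul"},
         {"name": "Add1", "op": "Add"}]
for opt in OPTIMIZERS:
    graph = opt.run(graph)
print([n["op"] for n in graph])  # ['MpcMatMul', 'Add']
```

Because the optimizer derives from the framework's own base class, the plaintext model needs no code changes; only the registration step is new.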
- the execution timing of the dynamic optimizer includes any one of the following: before placement of the native static graph in the plaintext machine learning model; after placement of the native static graph; after optimization of the native static graph; or after splitting of the native static graph.
- placement refers to assigning nodes in the native static graph to designated devices (such as CPU, GPU, etc.) for processing.
- Splitting refers to splitting the native static graph into multiple static graphs to facilitate concurrent execution.
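The execution timings above can be pictured as registration phases for graph passes. The sketch below is a toy modeled loosely on phase-based pass registries; the phase and function names are illustrative assumptions:

```python
from enum import Enum

class Phase(Enum):
    PRE_PLACEMENT = 0      # before placing the native static graph
    POST_PLACEMENT = 1     # after placing the native static graph
    POST_OPTIMIZATION = 2  # after optimizing the native static graph
    POST_PARTITIONING = 3  # after splitting the native static graph

PASSES = {phase: [] for phase in Phase}

def register_pass(phase):
    """Attach a graph pass to one of the four execution timings."""
    def decorator(fn):
        PASSES[phase].append(fn)
        return fn
    return decorator

@register_pass(Phase.POST_PARTITIONING)
def dynamic_optimizer_pass(graph):
    # The dynamic optimizer would rewrite the graph here.
    return graph

def run_phase(phase, graph):
    """Run every pass registered for the given timing."""
    for p in PASSES[phase]:
        graph = p(graph)
    return graph
```

The framework invokes `run_phase` at each stage of graph preparation, so whichever timing the dynamic optimizer registered under determines when the replacement happens.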
- executing the computational logic of the plaintext machine learning model refers to processing training samples based on the computational logic of the plaintext machine learning model to perform machine learning modeling.
- the training samples are generally private data from at least two data holders.
- three data holders (data holder 1, data holder 2, and data holder 3) cooperate to perform machine learning modeling.
- the terminal of each data holder is equipped with a plaintext machine learning model, and a dynamic optimizer is imported into the plaintext machine learning model.
- the native operators of the plaintext machine learning model can be replaced with MPC operators (here the MPC operator is taken as an example of a cryptographic operator), and the MPC operators are then executed.
- the data of any one of the three data holders will not be known by the other two parties, thus improving the value obtained from the data while keeping each party's own private data secure.
- through training, each data holder can obtain a data prediction model, and subsequent data predictions can be made based on the data prediction model.
- the general principle of replacing the native operators in the plaintext machine learning model with cryptographic operators is: all native operators that affect data privacy protection need to be replaced with corresponding cryptographic operators, to ensure the privacy and security of the input data; native operators that do not affect data privacy protection should, as far as possible, not be replaced, so as to improve the reuse rate of the plaintext machine learning model and thereby help reduce the implementation cost of privacy-preserving machine learning.
- for a plaintext machine learning model that uses a computational graph (also called a directed acyclic graph) to represent its processing logic (for example, a plaintext machine learning model generated based on TensorFlow), each native operator corresponds to a node in the graph; by replacing each native operator with a corresponding cryptographic operator where necessary, the native static graph in the plaintext machine learning model is replaced with a cryptographic static graph.
- for a plaintext machine learning model that does not use a computational graph, the native operators are replaced directly with cryptographic operators, which naturally is not a replacement performed on a computational graph.
- the cryptographic operators and cryptographic static graphs mentioned in the embodiments of this specification refer to operators and static graphs under privacy protection, respectively.
- replacing the native static graph of the plaintext machine learning model with a cryptographic static graph may include the following steps: obtaining a native node from the native static graph; confirming whether the native node needs to be replaced with a corresponding cryptographic node; and if so, replacing it with the corresponding cryptographic node.
- the native node is the native operator.
- the native static graph here can be a native static subgraph.
- whether a native node needs to be replaced with a corresponding cryptographic node can be determined by confirming whether the data stream corresponding to the native node contains model private data.
- if the data stream corresponding to the native node contains model private data, the node needs to be replaced to protect the private data; otherwise, it may be left unreplaced to improve the reuse rate of the plaintext machine learning model.
- variable 1 and variable 2 are private data held by each party, and constant 1 and constant 2 are public constant data.
- the MatMul 1 function is the native node that the data flow starting from variable 1 and variable 2 must pass through. Therefore, the MatMul 1 function needs to be replaced. For the same reason, the MatMul 3 function also needs to be replaced.
- the MatMul 2 function is the native node that the data flow starting from constant 1 and constant 2 must pass through. However, because constant 1 and constant 2 are not private data and need not be protected, the MatMul 2 function does not need to be replaced.
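The decision rule in the example above amounts to a reachability check over the graph: a node needs replacement exactly when private data flows into it. The dictionary-based graph encoding below is an illustrative assumption, not the framework's real data structure:

```python
# FIG. 4 example: var1/var2 are private, const1/const2 are public.
graph = {
    "var1":    {"op": "Variable", "inputs": []},
    "var2":    {"op": "Variable", "inputs": []},
    "const1":  {"op": "Const",    "inputs": []},
    "const2":  {"op": "Const",    "inputs": []},
    "MatMul1": {"op": "MatMul",   "inputs": ["var1", "var2"]},
    "MatMul2": {"op": "MatMul",   "inputs": ["const1", "const2"]},
    "MatMul3": {"op": "MatMul",   "inputs": ["MatMul1", "MatMul2"]},
}
PRIVATE = {"var1", "var2"}

def carries_private(name, graph, memo=None):
    """True if any private data flows into this node."""
    if memo is None:
        memo = {}
    if name in memo:
        return memo[name]
    if name in PRIVATE:
        memo[name] = True
        return True
    memo[name] = any(carries_private(i, graph, memo)
                     for i in graph[name]["inputs"])
    return memo[name]

needs_replacement = [n for n, node in graph.items()
                     if node["op"] == "MatMul" and carries_private(n, graph)]
print(needs_replacement)  # ['MatMul1', 'MatMul3']
```

MatMul 1 and MatMul 3 are flagged because private variables reach them; MatMul 2 sees only public constants and is left alone, matching the text above.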
- replacing the native node with a corresponding cryptographic node may include:
- the native node and its input edges and output edges are correspondingly replaced with the cryptographic node, new input edges, and new output edges, thereby completing the replacement of the native operator.
- the replacement operation on the native static graph at runtime can be completed through continuous iteration, thereby generating a cryptographic static graph; the cryptographic static graph can then be executed to realize privacy-preserving machine learning.
- the native static graph shown in FIG. 4 can be transformed into the cryptographic static graph shown in FIG. 5 (it can be seen that, in FIG. 5, MpcMatMul 1 and MpcMatMul 3 are MPC nodes).
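Putting the pieces together, the following self-contained sketch iterates over the native static graph of FIG. 4 and performs the node-and-edge replacement described above, yielding the cryptographic graph of FIG. 5. The dictionary-based graph representation and the "Mpc" naming are illustrative assumptions taken from the figures:

```python
# FIG. 4: var1/var2 are private inputs, const1/const2 are public.
graph = {
    "var1":    {"op": "Variable", "inputs": []},
    "var2":    {"op": "Variable", "inputs": []},
    "const1":  {"op": "Const",    "inputs": []},
    "const2":  {"op": "Const",    "inputs": []},
    "MatMul1": {"op": "MatMul",   "inputs": ["var1", "var2"]},
    "MatMul2": {"op": "MatMul",   "inputs": ["const1", "const2"]},
    "MatMul3": {"op": "MatMul",   "inputs": ["MatMul1", "MatMul2"]},
}
PRIVATE = {"var1", "var2"}

def carries_private(name):
    """True if any private data flows into this node."""
    if name in PRIVATE:
        return True
    return any(carries_private(i) for i in graph[name]["inputs"])

def replace_node(name):
    """Replace a native node and its edges with a cryptographic node."""
    node = graph.pop(name)
    new_name = "Mpc" + name
    # New node with a cryptographic op type; input edges are recreated.
    graph[new_name] = {"op": "Mpc" + node["op"],
                       "inputs": list(node["inputs"])}
    # New output edges: every consumer now reads from the new node.
    for other in graph.values():
        other["inputs"] = [new_name if i == name else i
                           for i in other["inputs"]]

# Iterate over a snapshot of names, replacing only where needed.
for name in [n for n, d in graph.items() if d["op"] == "MatMul"]:
    if carries_private(name):
        replace_node(name)

print(sorted(graph))
```

After the loop, MpcMatMul 1 and MpcMatMul 3 have taken the place of MatMul 1 and MatMul 3 (with rewired edges), while MatMul 2 is reused unchanged, as in FIG. 5.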
- the privacy machine learning implementation device may include:
- the judging module 61 may be used to confirm, when the computational logic of the plaintext machine learning model is executed, whether the execution time of the dynamic optimizer in the plaintext machine learning model has arrived;
- the execution module 62 may be configured to execute the dynamic optimizer when its execution time arrives, so as to replace the native operators in the computational logic with cryptographic operators and execute the cryptographic operators;
- the plaintext machine learning model is generated based on a plaintext machine learning framework
- the dynamic optimizer is derived from the optimizer base class of the plaintext machine learning framework, and the cryptographic operator and the dynamic optimizer are registered in the plaintext machine learning framework.
- the replacement of the native operator in the calculation logic with a cryptographic operator includes:
- the native static graph of the plaintext machine learning model is replaced with a cryptographic static graph.
- replacing the native static graph of the plaintext machine learning model with a cryptographic static graph includes:
- the confirmation whether it is necessary to replace the native node with a corresponding cryptographic node includes:
- the replacement of the native node with a corresponding cryptographic node includes:
- acquiring the attribute information, edge information, and corresponding cryptographic node name of the native node;
- the edge information includes an input edge and an output edge;
- the execution timing of the dynamic optimizer includes any one of the following:
- an electronic device includes a memory, a processor, and a computer program stored on the memory, where the computer program, when run by the processor, executes the above-mentioned privacy-preserving machine learning implementation method.
- These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including the instruction device.
- the device implements the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.
- These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
- the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
- Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
- the information can be computer-readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
- the implementation of this specification can be provided as a method, a system or a computer program product. Therefore, the implementation of this specification may adopt the form of a complete hardware implementation, a complete software implementation, or an implementation combining software and hardware. Moreover, the implementation of this specification may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
- program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
- the embodiments of this specification can also be practiced in distributed computing environments. In these distributed computing environments, tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules can be located in local and remote computer storage media including storage devices.
Claims (14)
- A privacy-preserving machine learning implementation method, comprising: when executing the computational logic of a plaintext machine learning model, confirming whether the execution time of a dynamic optimizer in the plaintext machine learning model has arrived; and when the execution time of the dynamic optimizer arrives, executing the dynamic optimizer to replace native operators in the computational logic with cryptographic operators, and executing the cryptographic operators; wherein the plaintext machine learning model is generated based on a plaintext machine learning framework, the dynamic optimizer is derived from an optimizer base class of the plaintext machine learning framework, and the cryptographic operators and the dynamic optimizer are registered in the plaintext machine learning framework.
- The privacy-preserving machine learning implementation method according to claim 1, wherein replacing the native operators in the computational logic with cryptographic operators comprises: replacing a native static graph of the plaintext machine learning model with a cryptographic static graph.
- The privacy-preserving machine learning implementation method according to claim 2, wherein replacing the native static graph of the plaintext machine learning model with a cryptographic static graph comprises: obtaining a native node from the native static graph of the plaintext machine learning model; confirming whether the native node needs to be replaced with a corresponding cryptographic node; and if so, replacing the native node with the corresponding cryptographic node.
- The privacy-preserving machine learning implementation method according to claim 3, wherein confirming whether the native node needs to be replaced with a corresponding cryptographic node comprises: confirming whether the data stream corresponding to the native node contains model private data.
- The privacy-preserving machine learning implementation method according to claim 3, wherein replacing the native node with the corresponding cryptographic node comprises: acquiring the attribute information, edge information, and corresponding cryptographic node name of the native node; creating a cryptographic operator, a new input edge, and a new output edge according to the attribute information, the edge information, and the corresponding cryptographic node name; and replacing the native node, its input edge, and its output edge with the cryptographic node, the new input edge, and the new output edge, respectively.
- The privacy-preserving machine learning implementation method according to claim 2, wherein the execution time of the dynamic optimizer includes any one of the following: before placing the native static graph in the plaintext machine learning model; after placing the native static graph in the plaintext machine learning model; after optimizing the native static graph in the plaintext machine learning model; or after splitting the native static graph in the plaintext machine learning model.
- A privacy-preserving machine learning implementation apparatus, comprising: a judgment module configured to confirm, when executing the computational logic of a plaintext machine learning model, whether the execution time of a dynamic optimizer in the plaintext machine learning model has arrived; and an execution module configured to execute the dynamic optimizer when the execution time of the dynamic optimizer arrives, so as to replace native operators in the computational logic with cryptographic operators and execute the cryptographic operators; wherein the plaintext machine learning model is generated based on a plaintext machine learning framework, the dynamic optimizer is derived from an optimizer base class of the plaintext machine learning framework, and the cryptographic operators and the dynamic optimizer are registered in the plaintext machine learning framework.
- The privacy-preserving machine learning implementation apparatus according to claim 7, wherein replacing the native operators in the computational logic with cryptographic operators comprises: replacing a native static graph of the plaintext machine learning model with a cryptographic static graph.
- The privacy-preserving machine learning implementation apparatus according to claim 8, wherein replacing the native static graph of the plaintext machine learning model with a cryptographic static graph comprises: obtaining a native node from the native static graph of the plaintext machine learning model; confirming whether the native node needs to be replaced with a corresponding cryptographic node; and if so, replacing the native node with the corresponding cryptographic node.
- The privacy-preserving machine learning implementation apparatus according to claim 9, wherein confirming whether the native node needs to be replaced with a corresponding cryptographic node comprises: confirming whether the data stream corresponding to the native node contains model private data.
- The privacy-preserving machine learning implementation apparatus according to claim 9, wherein replacing the native node with the corresponding cryptographic node comprises: acquiring the attribute information, edge information, and corresponding cryptographic node name of the native node, the edge information including an input edge and an output edge; creating a cryptographic operator, a new input edge, and a new output edge according to the attribute information, the edge information, and the corresponding cryptographic node name; and replacing the native operator, the input edge, and the output edge with the cryptographic operator, the new input edge, and the new output edge, respectively.
- The privacy-preserving machine learning implementation apparatus according to claim 8, wherein the execution time of the dynamic optimizer includes any one of the following: before placing the native static graph in the plaintext machine learning model; after placing the native static graph in the plaintext machine learning model; after optimizing the native static graph in the plaintext machine learning model; or after splitting the native static graph in the plaintext machine learning model.
- An electronic device, comprising a memory, a processor, and a computer program stored on the memory, wherein the computer program, when run by the processor, performs the method according to any one of claims 1-6.
- A computer storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/080390 WO2021184345A1 (zh) | 2020-03-20 | 2020-03-20 | Privacy-preserving machine learning implementation method, apparatus, device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/080390 WO2021184345A1 (zh) | 2020-03-20 | 2020-03-20 | Privacy-preserving machine learning implementation method, apparatus, device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021184345A1 true WO2021184345A1 (zh) | 2021-09-23 |
Family
ID=77769966
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/080390 WO2021184345A1 (zh) | 2020-03-20 | 2020-03-20 | 隐私机器学习实现方法、装置、设备及存储介质 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021184345A1 (zh) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN109255234A (zh) * | 2018-08-15 | 2019-01-22 | 腾讯科技(深圳)有限公司 | Processing method, apparatus, medium and electronic device for a machine learning model
- CN110414187A (zh) * | 2019-07-03 | 2019-11-05 | 北京百度网讯科技有限公司 | System and method for automating secure model delivery
- CN110619220A (zh) * | 2019-08-09 | 2019-12-27 | 北京小米移动软件有限公司 | Method and apparatus for encrypting a neural network model, and storage medium
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN109255234A (zh) * | 2018-08-15 | 2019-01-22 | 腾讯科技(深圳)有限公司 | Processing method, apparatus, medium and electronic device for a machine learning model
- CN110414187A (zh) * | 2019-07-03 | 2019-11-05 | 北京百度网讯科技有限公司 | System and method for automating secure model delivery
- CN110619220A (zh) * | 2019-08-09 | 2019-12-27 | 北京小米移动软件有限公司 | Method and apparatus for encrypting a neural network model, and storage medium
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- CN111428880A (zh) | Privacy-preserving machine learning implementation method, apparatus, device and storage medium | |
- CN111415013B (zh) | Privacy-preserving machine learning model generation and training method, apparatus, and electronic device | |
US9569288B2 (en) | Application pattern discovery | |
US11176469B2 (en) | Model training methods, apparatuses, and systems | |
US9563697B1 (en) | Calculating differences between datasets having differing numbers of partitions | |
US10313430B2 (en) | Distributed method and apparatus for processing streaming data | |
- WO2021203260A1 (zh) | Node matching method, apparatus, device and system | |
US20230222356A1 (en) | Federated learning | |
- CN112200713A (zh) | Service data processing method, apparatus and device in federated learning | |
- WO2021184346A1 (zh) | Privacy-preserving machine learning model generation and training method, apparatus, and electronic device | |
- CN110633959A (zh) | Graph-structure-based approval task creation method, apparatus, device and medium | |
- CN108924185A (zh) | Interface generation method and apparatus | |
- WO2021184345A1 (zh) | Privacy-preserving machine learning implementation method, apparatus, device and storage medium | |
- WO2020211075A1 (zh) | Decentralized multi-party secure data processing method, apparatus and storage medium | |
Guo et al. | A deadlock prevention approach for a class of timed Petri nets using elementary siphons | |
- CN115118411B (zh) | Off-chain multi-party trusted computing method, apparatus, device and storage medium | |
- CN106874341A (zh) | Database synchronization method | |
Kissmann et al. | What’s in it for my BDD? On causal graphs and variable orders in planning | |
- CN110401925A (zh) | Communication message generation method and apparatus | |
US9727311B2 (en) | Generating a service definition including a common service action | |
- CN115567596A (zh) | Cloud service resource deployment method, apparatus, device and storage medium | |
- WO2016110204A1 (zh) | Processing of processing objects, and plug-in generation method and apparatus | |
- CN113204502A (zh) | Heterogeneous accelerated computing optimization method, apparatus, device and readable storage medium | |
- CN111694870A (zh) | Big data model execution engine system and implementation method | |
- CN117056994B (zh) | Data processing system for big data modeling |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20925130; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20925130; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 290323) |