CN111859267B - Operation method of privacy protection machine learning activation function based on BGW protocol - Google Patents


Info

Publication number
CN111859267B
CN111859267B (application CN202010571112.8A)
Authority
CN
China
Prior art keywords
function
machine learning
sigmoid
value
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010571112.8A
Other languages
Chinese (zh)
Other versions
CN111859267A (en
Inventor
韩伟力 (Han Weili)
汤定一 (Tang Dingyi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fudan University
Original Assignee
Fudan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fudan University filed Critical Fudan University
Priority to CN202010571112.8A priority Critical patent/CN111859267B/en
Publication of CN111859267A publication Critical patent/CN111859267A/en
Application granted granted Critical
Publication of CN111859267B publication Critical patent/CN111859267B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/15 Correlation function computation including computation of convolution operations
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent


Abstract

The invention belongs to the field of cyberspace security and specifically relates to an operation method for privacy-preserving machine learning activation functions based on the BGW protocol. The invention combines a secure multiparty computation protocol with machine learning activation functions, aiming to provide secure and efficient activation functions, and realizes them on the basis of the BGW protocol. The secure-multiparty-computation-friendly activation function may be a secure ReLU function or a secure Sigmoid function. The invention can be used to implement the relevant activation functions in a machine learning model or framework based on a secure multiparty computation protocol without revealing intermediate process information.

Description

Operation method of privacy protection machine learning activation function based on BGW protocol
Technical Field
The invention belongs to the technical field of network space security, and particularly relates to an operation method of a privacy protection machine learning activation function based on a BGW protocol.
Background
Machine learning has been widely applied in various real-world scenarios. For example, Internet companies collect massive amounts of user behavior data to train more accurate recommendation models; hospitals gather health data to generate diagnostic models; financial enterprises use historical transaction records to train more accurate fraud-detection models.
In machine learning, data volume plays an important role in model accuracy. However, data distributed among multiple data sources or individuals cannot simply be consolidated: privacy regulations such as the GDPR, enterprises' concern for maintaining competitive advantage, and questions of data ownership all prevent data from being openly shared. Privacy-preserving machine learning based on secure multiparty computation allows different parties to train their respective models on their joint data without revealing any information other than the final model.
All machine learning algorithms can be represented as dataflow graphs. A dataflow graph is made up of nodes and edges: each node represents either an operator or a matrix of input/output data, and nodes are connected by directed edges that represent the data flow of the training process. Operators define the operations that need to be performed in forward and backward propagation and are abstracted into underlying computing components that can be used when needed. An operation defined on a node is performed if and only if all nodes pointing to that node have been computed. In this design, forward propagation and backward propagation are defined on the same dataflow graph, which ensures their consistency.
Nodes are classified into three categories: input nodes, weight nodes and operator nodes.
The input nodes represent an input matrix, i.e. training data for training the model.
The weight nodes are the weight matrices of the model. They are the parameters of model training, requiring initialization before training and an update after each round of training.
The operator nodes are responsible for operations including matrix addition and matrix multiplication. Each operator node needs to define Forward and Grad methods that represent the operations that the operator needs to perform during Forward and backward propagation, respectively.
Each node maintains two matrices: forward and grad. The forward matrix holds the forward-propagation result, computed for each operator in turn in topological order; the grad matrix holds the back-propagation result, computed for each operator in turn in reverse topological order.
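As an illustration, the dataflow-graph evaluation described above can be sketched in plain (non-secret-shared) Python; the class and method names below are hypothetical stand-ins, not part of the invention:

```python
# Minimal sketch of the dataflow-graph abstraction: operator nodes define
# forward/backward methods and are evaluated in (reverse) topological order.
class Node:
    def __init__(self, inputs=()):
        self.inputs = list(inputs)
        self.forward = None   # forward-propagation result
        self.grad = 0.0       # back-propagation result

class Mul(Node):              # an example operator node
    def fwd(self):
        a, b = self.inputs
        self.forward = a.forward * b.forward
    def bwd(self):
        a, b = self.inputs
        a.grad += self.grad * b.forward
        b.grad += self.grad * a.forward

def run(topo_order, out_node):
    for n in topo_order:               # forward pass: topological order
        if hasattr(n, "fwd"):
            n.fwd()
    out_node.grad = 1.0
    for n in reversed(topo_order):     # backward pass: reverse topological order
        if hasattr(n, "bwd"):
            n.bwd()
```

Running the forward pass before the backward pass over the same graph is exactly what guarantees the consistency of the two phases noted above.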
Signed integer representation in finite field
A signed integer a is encoded by a function fld: ℤ → ℤ_q into the integer field ℤ_q, where q > 2^k; the encoding fld(a) = a mod q represents signed integers in a form similar to two's complement.
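A minimal sketch of this encoding, with an example modulus q assumed for illustration (the invention's actual parameters are not stated here):

```python
# Two's-complement-style encoding of signed k-bit integers into Z_q, q > 2^k.
Q = 2**61 - 1   # example prime modulus, assumed for illustration only
K = 32          # example significant bit width

def fld(a, q=Q):
    """Encode a signed integer into the integer field Z_q."""
    return a % q                       # a negative a maps to q + a

def fld_inv(x, k=K, q=Q):
    """Decode a field element back to a signed k-bit integer."""
    return x if x < 2**(k - 1) else x - q
```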
Function LTZ
Octavian Catrina et al. proposed in 2010 an optimized algorithm for comparison operations on secret-shared integers.
[s]←LTZ([a],k)
The LTZ protocol compares a secret-shared signed integer [a] with 0, where k denotes the number of significant bits of the signed integer. If [a] < 0 then [s] takes the value 1, otherwise 0. Throughout this document, bracketed values denote secret-shared numbers.
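A plaintext reference model of the LTZ semantics (the actual protocol of Catrina et al. runs on secret shares without revealing a; this sketch only pins down the input-output behavior):

```python
# Plaintext model: [s] <- LTZ([a], k) returns 1 iff the encoded signed
# integer a is negative, 0 otherwise (q > 2^k is the field modulus).
def ltz(a_enc, k, q):
    a = a_enc if a_enc < 2**(k - 1) else a_enc - q   # decode the sign
    return 1 if a < 0 else 0
```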
Disclosure of Invention
The invention aims to provide an operation method for privacy-preserving machine learning activation functions based on the BGW protocol. The invention can be used to implement the relevant activation functions in a machine learning model or framework based on a secure multiparty computation protocol without revealing intermediate process information. A machine learning framework based on the BGW protocol can thereby match existing plaintext-trained frameworks: when the same model is trained, the accuracy obtained on the test set is substantially equal.
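The BGW protocol referenced throughout is built on Shamir secret sharing. The following toy sketch (field size and threshold chosen only for illustration, not the invention's actual parameters) shows how a value is split into shares and reconstructed:

```python
import random

Q = 2**61 - 1   # example prime field modulus, assumed for illustration

def share(secret, n=3, t=1, q=Q):
    """Shamir-share `secret` among n parties via a random degree-t polynomial."""
    coeffs = [secret] + [random.randrange(q) for _ in range(t)]
    return [(i, sum(c * pow(i, e, q) for e, c in enumerate(coeffs)) % q)
            for i in range(1, n + 1)]

def reconstruct(shares, q=Q):
    """Lagrange interpolation at x = 0 recovers the secret from t+1 shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % q
                den = den * (xi - xj) % q
        secret = (secret + yi * num * pow(den, q - 2, q)) % q
    return secret
```

Additions of shares are local, while multiplications require a round of interaction, which is why the round counts below are dominated by comparisons and multiplications.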
The technical scheme of the invention is specifically described as follows.
The invention provides an operation method for a privacy-preserving machine learning activation function based on the BGW protocol, wherein the activation function is a secure ReLU function, and the ReLU operator in the secure ReLU function defines the operations to be executed in the forward propagation and backward propagation stages;
Forward propagation of the ReLU operator computes y = x·(x > 0), where x denotes the input matrix and y the output matrix, by the following two steps:
In the first step, the LTZ function judges whether the secret-shared matrix [a.forward] is non-negative: [u] = 1 - LTZ([a.forward], k). [u] is a secret-shared number: if [a.forward] ≥ 0 then [u] is 1, otherwise [u] is 0; k denotes the number of significant bits of the signed integers, a denotes the input node, and forward denotes the node's result in forward propagation;
In the second step, [a.forward]·[u] is calculated, which yields the value of the function ReLU(a) = (a > 0 ? a : 0);
In forward propagation, the ReLU operator requires 4 rounds of interaction: the LTZ function in the first step requires 3 rounds, and the multiplication in the second step requires 1 round;
The back-propagation computation x' = y'·(x > 0) of the ReLU operator is converted into computing [a.grad] = [c.grad]·([a.forward] > 0), where c denotes the output node and grad denotes the node's result in back propagation. Since the sign of [a.forward] has already been determined during forward propagation and the result is cached, only 1 round of interaction is needed.
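A plaintext reference of the two-step secure ReLU above (the real execution is on secret shares; `ltz` here is a plaintext stand-in for the LTZ protocol):

```python
# Step 1: u = 1 - LTZ(a.forward) costs 3 rounds; step 2: the multiplication
# a.forward * u costs 1 round, for 4 rounds of interaction in total.
def relu_forward(a_forward, ltz):
    u = 1 - ltz(a_forward)       # u = 1 iff a.forward >= 0
    return a_forward * u, u      # ReLU value and the cached sign bit

def relu_backward(c_grad, u):
    # reuses the sign bit cached by the forward pass: only 1 round needed
    return c_grad * u
```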
The invention also provides an operation method for a privacy-preserving machine learning activation function based on the BGW protocol, wherein the activation function is a secure Sigmoid function, and the Sigmoid operator in the secure Sigmoid function defines the operations to be executed in the forward propagation and backward propagation stages; wherein:
The forward propagation computation of the Sigmoid operator, y = 1/(1 + e^(-x)), uses a piecewise function to simulate it, dividing the function into three parts:
① if the input value [a.forward] lies between -1/2 and 1/2, the output value is [a.forward] + 1/2;
② if [a.forward] is greater than 1/2, the output value is 1;
③ if the input value [a.forward] is smaller than -1/2, the output value is 0;
The calculation is performed by the following two steps:
In the first step, the LTZ function compares the secret-shared matrix [a.forward] with -1/2 and 1/2, computing in parallel [u1] = 1 - LTZ([a.forward] + 1/2, k) and [u2] = 1 - LTZ([a.forward] - 1/2, k), where [u1] indicates whether [a.forward] is greater than -1/2, k denotes the number of significant bits of the signed integers, and a denotes the input node; if [a.forward] > -1/2 then the value of [u1] is 1, otherwise the value of [u1] is 0; similarly, [u2] indicates whether [a.forward] is greater than 1/2: if [a.forward] > 1/2 then the value of [u2] is 1, otherwise the value of [u2] is 0;
In the second step, [c.forward] = [u2] + ([u1] - [u2])·([a.forward] + 1/2) yields the value of the simulated Sigmoid function. The whole Sigmoid operation requires 4 rounds of interaction: the parallel LTZ functions of the first step require 3 rounds, and the parallel multiplication of the second step requires 1 round;
The back-propagation computation [x'] = [y']·[x]·(1 - [x]) of the Sigmoid operator is converted into computing [a.grad] = [c.grad]·[a.forward]·(1 - [a.forward]), where c denotes the output node. Since three secret-shared numbers must be multiplied, [a.forward]·(1 - [a.forward]) is computed in parallel in advance during forward propagation, so the back-propagation stage needs only 1 round of interaction without increasing the number of forward-propagation interaction rounds.
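A plaintext reference of the piecewise Sigmoid forward pass above; the closed-form combination of [u1] and [u2] is written out as one reading consistent with the single 1-round multiplication stated for the second step:

```python
# u1 = (a > -1/2), u2 = (a > 1/2); one multiplication combines the segments:
#   u2 = 1          -> 1
#   u1 = 1, u2 = 0  -> a + 1/2
#   u1 = 0          -> 0
def sigmoid_forward(a_forward, ltz):
    u1 = 1 - ltz(a_forward + 0.5)
    u2 = 1 - ltz(a_forward - 0.5)
    return u2 + (u1 - u2) * (a_forward + 0.5)
```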
Compared with the prior art, the invention has the following beneficial effects:
The invention provides an operation method for the activation functions used in training machine learning models on the BGW protocol. It enables secure multiparty computation based on the BGW protocol to support activation functions including ReLU and Sigmoid, meets the requirements of privacy-preserving machine learning built on secure multiparty computation, and allows different parties to train their respective models on their joint data without leaking any information beyond the final model.
According to our experimental results, using the activation functions provided by the invention in a BGW-protocol-based machine learning framework (e.g., logistic regression using the Sigmoid function, or a BP neural network using the ReLU function) yields test-set accuracy substantially equal to that of plaintext training of the same model.
In addition, the embodiment results show that logistic regression based on the Sigmoid operator designed in the invention has low interaction complexity and outperforms existing work in a wide-area-network setting.
Drawings
Fig. 1 is a general design diagram of an activation function in a BGW protocol-based privacy preserving machine learning model.
Fig. 2 is a forward propagation phase protocol of the ReLU operator.
Fig. 3 is a forward propagation phase protocol of the Sigmoid operator.
Fig. 4 is a comparison of logistic regression operating speeds in iterations per second; marked entries represent estimated iteration counts.
Detailed Description
The following describes embodiments of the invention in detail. The embodiments are implemented on the premise of the technical solution of the invention, and detailed implementations and specific operation procedures are given, but the scope of protection of the invention is not limited to the following embodiments.
In the embodiment of the invention, the overall design of the secure activation functions in the BGW-protocol-based privacy-preserving machine learning model is shown in Fig. 1. As can be seen from Fig. 1, the secure activation function design consists of two parts: the ReLU operator design and the Sigmoid operator design in the BGW-protocol-based machine learning framework.
1. Operator ReLU design specification in machine learning framework based on BGW protocol
The ReLU operator defines the operations that need to be performed in the forward and backward propagation phases. In the machine learning framework, forward propagation of the ReLU operator computes y = x·(x > 0); in the BGW-protocol-based machine learning framework it is instead computed as shown in Fig. 2, in two steps:
First, the LTZ function judges whether the secret-shared matrix [a.forward] is non-negative: [u] = 1 - LTZ([a.forward], k). [u] is a secret-shared number: if [a.forward] ≥ 0 then [u] is 1, otherwise [u] is 0; k denotes the number of significant bits of the signed integers.
Second, [a.forward]·[u] is calculated, yielding the value of the function ReLU(a) = (a > 0 ? a : 0).
The ReLU operator requires 4 rounds of interaction in forward propagation, with LTZ functions requiring 3 rounds of interaction in the first step and 1 round of interaction for multiplication in the second step.
In the machine learning framework, back propagation of the ReLU operator computes x' = y'·(x > 0); in the BGW-based machine learning framework this becomes [a.grad] = [c.grad]·([a.forward] > 0). Since the sign of [a.forward] has already been determined during forward propagation and the result is cached, only 1 round of interaction is needed.
2. Operator Sigmoid design description in machine learning framework based on BGW protocol
The Sigmoid operator defines the operations that need to be performed by the forward and backward propagation phases.
In the machine learning framework, forward propagation of the Sigmoid operator computes y = 1/(1 + e^(-x)). In secure multiparty computation the cost of evaluating continuous functions is large, so we divide the function into 3 segments by piecewise simulation; the designed protocol is shown in Fig. 3. Two calculation steps are required. First, the LTZ function compares the secret-shared matrix [a.forward] with -1/2 and 1/2, and the two expressions of lines 1 and 2 are computed in parallel, where [u1] indicates whether [a.forward] is greater than -1/2: if [a.forward] > -1/2 then the value of [u1] is 1, otherwise it is 0. Similarly, [u2] indicates whether [a.forward] is greater than 1/2: if [a.forward] > 1/2 then the value of [u2] is 1, otherwise it is 0. Second, [c.forward] = [u2] + ([u1] - [u2])·([a.forward] + 1/2) yields the value of the simulated Sigmoid function. The entire Sigmoid operation requires 4 rounds of interaction: the parallel LTZ functions of the first step require 3 rounds, and the parallel multiplication of the second step requires 1 round.
In the machine learning framework, back propagation of the Sigmoid operator computes [x'] = [y']·[x]·(1 - [x]); in the BGW-protocol-based framework this becomes [a.grad] = [c.grad]·[a.forward]·(1 - [a.forward]). Since three secret-shared numbers must be multiplied, [a.forward]·(1 - [a.forward]) is computed in parallel in advance during forward propagation, so the back-propagation stage needs only 1 round of interaction without increasing the number of forward-propagation interaction rounds.
3. Experimental comparison of the logistic regression speed of the Sigmoid operator designed in this patent against existing work
The embodiment shows through experiments that the Sigmoid operator designed in this patent is faster in logistic regression than existing work (SecureML and ABY3). SecureML was proposed by Mohassel and Zhang in 2017 and is based on additive secret sharing. ABY3 was proposed by Mohassel et al. in 2018 and combines arithmetic sharing, Boolean sharing, and Yao's garbled circuits.
All experiments were performed on an Intel i7 computer with 32 GB RAM running Linux. We configured a wide-area-network (WAN) environment with a 40 ms RTT delay through the Linux tc facility.
Experiments used the MNIST handwritten-digit dataset. The training set contains 60000 pictures labeled 0-9, and the test set contains 10000 labeled pictures. Each picture is 28 x 28 in size and contains 784 features represented by gray-scale values of 0-255.
A logistic regression model is used, and the metric is iterations per second. The logistic regression parameters are trained by stochastic gradient descent with the update w = w - α·x^T·(f(x·w) - y), where w is the parameter matrix to be trained, of size 784 x 1; α is the learning rate, set to 0.001; x is a mini-batch of training data of size 128 x 784, with 1 mini-batch trained per iteration and each mini-batch containing 128 training samples; y is the label matrix for the mini-batch samples, of size 128 x 1.
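The update rule above can be sketched in plaintext NumPy (a reference for the arithmetic only; the invention performs these operations on secret shares):

```python
import numpy as np

def piecewise_sigmoid(z):
    # the 3-segment approximation: 0 below -1/2, z + 1/2 in between, 1 above 1/2
    return np.clip(z + 0.5, 0.0, 1.0)

def sgd_step(w, x, y, alpha=0.001):
    """One mini-batch update: w <- w - alpha * x^T (f(x w) - y)."""
    pred = piecewise_sigmoid(x @ w)        # shape (batch, 1)
    return w - alpha * (x.T @ (pred - y))
```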
The comparison with SecureML and ABY3 is made under the wide-area-network setup; the results are shown in Fig. 4. Exactly the same training procedure is used: all implementations use the same training data, fed to the model in the same order, and their iterations per second are compared. In Fig. 4, the entry for this patent represents the logistic regression based on the Sigmoid operator designed herein, while SecureML and ABY3 represent their respective logistic regression implementations. When the feature dimension is 10 and the batch size is 128, this patent achieves 4.56 iterations per second, higher than SecureML's 1.4 and ABY3's 3.91. When the feature dimension is 10 and the batch size is 256, this patent's solution achieves 4.08 iterations per second, higher than SecureML's 0.94 and ABY3's 3.9. The efficiency stems from the designed Sigmoid operator requiring only 4 rounds of interaction, fewer than ABY3's 7, together with lower computational complexity. The experimental results show that the Sigmoid design of this patent outperforms existing work in a wide-area-network setting.

Claims (1)

1. An operation method for a privacy-preserving machine learning activation function based on the BGW protocol, characterized in that the activation function is a secure Sigmoid function, and the Sigmoid operator in the secure Sigmoid function defines the operations to be executed in the forward propagation and backward propagation stages; the secure Sigmoid function is used in a logistic regression model, whose training set comprises 60000 pictures labeled 0-9 and whose test set comprises 10000 labeled pictures, each picture being 28 x 28 in size and containing 784 features represented by gray-scale values of 0-255; the logistic regression parameters are trained by stochastic gradient descent with the update w = w - α·x^T·(f(x·w) - y), where w is the parameter matrix to be trained, of size 784 x 1; α is the learning rate, set to 0.001; x is a mini-batch of training data of size 128 x 784, with 1 mini-batch trained per iteration and each mini-batch containing 128 training samples; wherein:
The forward propagation computation of the Sigmoid operator, y = 1/(1 + e^(-x)), uses a piecewise function to simulate it, dividing the function into three parts:
① if the input value [a.forward] lies between -1/2 and 1/2, the output value is [a.forward] + 1/2;
② if [a.forward] is greater than 1/2, the output value is 1;
③ if the input value [a.forward] is smaller than -1/2, the output value is 0;
The calculation is performed by the following two steps:
First, the LTZ function compares the secret-shared matrix [a.forward] with -1/2 and 1/2, computing in parallel [u1] = 1 - LTZ([a.forward] + 1/2, k) and [u2] = 1 - LTZ([a.forward] - 1/2, k), where [u1] indicates whether [a.forward] is greater than -1/2, k denotes the number of significant bits of the signed integers, and a denotes the input node; if [a.forward] > -1/2 then the value of [u1] is 1, otherwise the value of [u1] is 0; similarly, [u2] indicates whether [a.forward] is greater than 1/2: if [a.forward] > 1/2 then the value of [u2] is 1, otherwise the value of [u2] is 0;
Second, [c.forward] = [u2] + ([u1] - [u2])·([a.forward] + 1/2) yields the value of the simulated Sigmoid function, namely [c.forward];
the whole Sigmoid operation requires 4 rounds of interaction, wherein the first step of parallel LTZ function requires 3 rounds of interaction, and the second step of parallel multiplication requires 1 round of interaction;
The back-propagation computation [x'] = [y']·[x]·(1 - [x]) of the Sigmoid operator is converted into computing [a.grad] = [c.grad]·[a.forward]·(1 - [a.forward]), where c denotes the output node. Since three secret-shared numbers must be multiplied, [a.forward]·(1 - [a.forward]) is computed in parallel in advance during forward propagation, so the back-propagation stage needs only 1 round of interaction without increasing the number of forward-propagation interaction rounds.
CN202010571112.8A 2020-06-22 2020-06-22 Operation method of privacy protection machine learning activation function based on BGW protocol Active CN111859267B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010571112.8A CN111859267B (en) 2020-06-22 2020-06-22 Operation method of privacy protection machine learning activation function based on BGW protocol


Publications (2)

Publication Number Publication Date
CN111859267A CN111859267A (en) 2020-10-30
CN111859267B 2024-04-26

Family

ID=72987459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010571112.8A Active CN111859267B (en) 2020-06-22 2020-06-22 Operation method of privacy protection machine learning activation function based on BGW protocol

Country Status (1)

Country Link
CN (1) CN111859267B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113672985B (en) * 2021-08-25 2023-11-14 支付宝(杭州)信息技术有限公司 Machine learning algorithm script compiling method and compiler for privacy protection
CN117114059B (en) * 2023-05-16 2024-07-05 华为云计算技术有限公司 Method and device for calculating activation function in neural network and computing equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108804715A (en) * 2018-07-09 2018-11-13 北京邮电大学 Merge multitask coordinated recognition methods and the system of audiovisual perception
CN109508784A (en) * 2018-12-28 2019-03-22 四川那智科技有限公司 A kind of design method of neural network activation primitive
CN110537191A (en) * 2017-03-22 2019-12-03 维萨国际服务协会 Secret protection machine learning
US10600006B1 (en) * 2019-01-11 2020-03-24 Alibaba Group Holding Limited Logistic regression modeling scheme using secrete sharing
CN111242290A (en) * 2020-01-20 2020-06-05 福州大学 Lightweight privacy protection generation countermeasure network system
CN111260081A (en) * 2020-02-14 2020-06-09 广州大学 Non-interactive privacy protection multi-party machine learning method


Also Published As

Publication number Publication date
CN111859267A (en) 2020-10-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant