CN117195972A - Ciphertext neural network construction method and system based on CKKS - Google Patents

Info

Publication number
CN117195972A
CN117195972A
Authority
CN
China
Prior art keywords
neural network
ciphertext
data
public key
key
Prior art date
Legal status
Pending
Application number
CN202311175758.4A
Other languages
Chinese (zh)
Inventor
刘义铭
孟金桃
赵越
焦三秀
蔡乐才
高祥
成奎
张超洋
Current Assignee
CETC 30 Research Institute
Original Assignee
CETC 30 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 30 Research Institute filed Critical CETC 30 Research Institute
Priority to CN202311175758.4A
Publication of CN117195972A

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a ciphertext neural network construction method and system based on CKKS, comprising the following steps: S1, constructing a deep neural network model at the cloud server side and completing training on open plaintext data; S2, generating a public key and a private key based on CKKS homomorphic encryption; S3, the user side stores the public key and the private key, encrypts the local private data with the public key, and uploads the encrypted data and the public key to the cloud server side; S4, the cloud server side encrypts the trained deep neural network model with the public key, converting it into a ciphertext neural network model; the encrypted data are fed into the ciphertext neural network model for inference prediction, and the ciphertext inference result is returned to the user side; S5, the user side decrypts with the held private key to obtain the plaintext model prediction output for the local data. The application realizes end-to-end ciphertext input and ciphertext output and provides core technical support for making private data usable but invisible.

Description

Ciphertext neural network construction method and system based on CKKS
Technical Field
The application relates to the field of security protection of private data in deep learning, and in particular to a ciphertext neural network construction method and system based on CKKS.
Background
In recent years, with the rapid development and wide application of deep learning (DL) technology, deep learning has profoundly affected industry and daily life. Training deep learning models requires massive amounts of data; however, enterprises and users are reluctant to share data because their data often contain private information such as faces, fingerprints, and medical records, creating a large number of data-island problems. Meanwhile, data privacy protection has become a topic of global concern. For example, the Data Security Law issued by China prescribes that any organization or individual must collect data in a lawful and legitimate manner, and data must not be stolen or obtained by other illegal means. In deep learning, the private data of a user may be leaked while the user uploads local plaintext data to the cloud server side and while the cloud server side computes on the uploaded data. If the deep learning model is not protected for privacy, user privacy information is easily revealed. How to realize secure data sharing while protecting user data privacy and complying with privacy regulations is a problem to be solved urgently.
In existing research, deep learning privacy protection technology mainly comprises differential privacy, secure multi-party computation, and homomorphic encryption. The implementation of differential privacy relies on randomness, which affects deep learning model performance. Secure multi-party computation suffers from high resource overhead, low processing speed, and poor portability. Homomorphic encryption allows direct operations on ciphertext, and various algebraic operations can be implemented without involving the key. With homomorphic encryption, a data owner can send data to a cloud server without worrying about private data leakage, and end-to-end full encryption becomes feasible. In recent years, some research has applied homomorphic encryption technology to the encryption of deep neural networks. In 2016, Microsoft Research first adopted homomorphic encryption to encrypt neural networks and proposed the encrypted network CryptoNets.
The core idea of that scheme is to simplify the neural network and then use homomorphic encryption, protecting the privacy of the data while improving processing efficiency. In 2018, the Massachusetts Institute of Technology proposed Gazelle, a scalable, low-latency secure neural network inference system, whose homomorphic encryption library provides fast algorithms for homomorphic computation, such as SIMD (Single Instruction Multiple Data) addition, SIMD multiplication, and ciphertext slot permutation, to implement complete neural network inference. In 2019, the National Institute of Information and Communications Technology applied asynchronous stochastic gradient descent to a neural network combined with additively homomorphic encryption, solving, at tolerable overhead, the problem that local data can be stolen by honest-but-curious servers. In 2021, the school of electrical and computer engineering of a Korean national university improved the residue number system (RNS) variant of the approximate homomorphic encryption scheme (RNS-variant Cheon-Kim-Kim-Song, RNS-CKKS), which supports arithmetic operations on encrypted real or complex data and employs homomorphic modular reduction to improve message precision in bootstrapping. In 2022, Nanjing University of Aeronautics and Astronautics proposed a privacy-enhanced federated averaging (PE-FedAvg) scheme to strengthen the security of model parameters, using the CKKS homomorphic encryption scheme to encrypt the model parameters and thereby protect the privacy of parameter information. In the same year, an Indonesian national institute together with an institute of information science proposed combining the discrete wavelet transform with the CKKS homomorphic encryption scheme to accelerate and improve the efficiency of encrypted watermarking, where CKKS encryption uses approximate arithmetic and predetermined precision to speed up encrypted computation.
As the above survey shows, fully homomorphic encryption requires multi-layer circuit bootstrapping, which incurs huge computational overhead when the user data set is large or the deep learning network is deep. Therefore, current homomorphic-encryption-based deep learning privacy protection schemes mainly adopt partially homomorphic or approximately homomorphic encryption. The currently proposed encrypted network CryptoNets is often implemented on a BFV homomorphic encryption open-source library; however, BFV only supports integer computation, and complex preprocessing of data and parameters is required when it is applied to deep learning. The CKKS homomorphic encryption scheme, proposed by Seoul National University in 2017, supports addition, subtraction, and multiplication of floating-point vectors in ciphertext space while preserving homomorphism, and is a feasible encryption scheme for floating-point operations in deep learning models, although its computational overhead remains large.
Disclosure of Invention
Aiming at the problems in the prior art, a ciphertext neural network construction method and system based on CKKS are provided, which support end-to-end ciphertext input and ciphertext output and provide core technical support for making private data usable but invisible.
The first aspect of the application provides a ciphertext neural network construction method based on CKKS, which comprises the following steps:
s1, constructing or calling a deep neural network model at a cloud server end, and completing training by adopting open plaintext data;
s2, generating a public key and a private key based on CKKS homomorphic encryption;
s3, the user side stores the public key and the private key, encrypts the local private data through the public key, and uploads the encrypted data and the public key to the cloud server side;
s4, encrypting the trained deep neural network model by the cloud server through a public key, and converting the encrypted deep neural network model into a ciphertext neural network model; inputting the encrypted data into a ciphertext neural network model for reasoning and predicting, and returning a ciphertext reasoning result to the user side;
s5, the user side decrypts by using the held private key to obtain model plaintext prediction output of the local data.
Further, the substep of step S1 includes:
s11, constructing or calling a deep neural network;
s12, acquiring a public plaintext data set, and preprocessing characteristic data;
s13, approximating a nonlinear activation function in the deep neural network to a polynomial function through a Taylor expansion;
s14, minimizing the loss function between training-sample predicted labels and true labels, and introducing a genetic algorithm to find the optimal power of the approximating polynomial;
s15, training the deep neural network model by using the preprocessed data.
Further, the substep of step S14 includes:
step S141, adopting the negative of the mean squared error loss function as the fitness function;
step S142, setting the population size S, the crossover probability, and the mutation probability, and then generating an initial seed population;
step S143, applying selection, crossover, and mutation operators to the population, putting the new chromosomes obtained by crossover and mutation into the population of the previous evolution round, sorting the chromosomes from large to small by their current fitness values, and retaining the top S chromosomes as the next-generation population;
step S144, repeating step S143 until the predetermined number of iterations is completed, and outputting the MSE loss function value and the corresponding optimal power.
Further, the fitness function is:

f(N) = −(1/M)·Σ_{i=1}^{M} (y_i − ŷ_i)²

where N represents the number of Taylor expansion terms, M the number of training samples, y_i the true value of the i-th sample, ŷ_i the predicted value of the i-th sample, (y_i − ŷ_i) the error between the true and predicted values of the i-th sample, and f(N) the fitness value of the chromosome whose Taylor expansion term number is N.
Further, the preprocessing in step S12 includes normalization and denoising.
Further, the substep of step S2 includes:
s21, establishing a secure key management center and completing the parameter setting of the CKKS homomorphic encryption scheme;
s22, generating a public key and a private key, and issuing the public key and the private key to the user side.
The second aspect of the present application provides a CKKS-based ciphertext neural network system, comprising:
the key management center, which generates a public key and a private key based on the CKKS homomorphic encryption scheme;
the cloud server side, which completes deep neural network model training on the disclosed plaintext data and converts the trained model into a ciphertext neural network via public-key encryption; and which receives the ciphertext private data sent by the user side, completes inference through the ciphertext neural network, and returns the ciphertext inference result;
the user side acquires a public key and a private key issued by the key management center, encrypts local private data through the public key, and uploads the encrypted data and the public key to the cloud server side; and decrypting the received ciphertext reasoning result through the private key to obtain model plaintext prediction output of the local data.
Further, the nonlinear activation function in the deep neural network is approximated by a polynomial function through a Taylor expansion, and the optimal power of the approximating polynomial is then determined using a genetic algorithm, after which model training is completed.
Compared with the prior art, the beneficial effects of the above technical scheme are as follows. The user side provides only the encrypted private data and the public key to the cloud server side; the cloud server side returns the ciphertext model inference result to the user, and only a user holding the private key can decrypt it to obtain the plaintext inference result. The cloud side never sees the user's plaintext private data or plaintext inference results, so the computing resources of the cloud server side can be used effectively while the security of the user's private data is guaranteed. Moreover, as a general ciphertext deep neural network construction method, this general method can turn a typical deep learning model into a ciphertext model version, realizing end-to-end ciphertext input and ciphertext output and providing core technical support for making private data usable but invisible.
Drawings
Fig. 1 is a schematic diagram of a method for constructing a ciphertext neural network based on CKKS according to the present application.
Fig. 2 is a schematic diagram of a homomorphic encryption flow in an embodiment of the application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar modules, or to modules having like or similar functions, throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application. On the contrary, the embodiments of the application include all alternatives, modifications, and equivalents as may be included within the spirit and scope of the appended claims.
Example 1
Because the CKKS homomorphic encryption scheme supports only a limited number of homomorphic multiplication operations, while typical activation functions in deep learning models often include nonlinear operations such as exponentials and trigonometric functions, the CKKS scheme cannot be applied directly to deep learning models. Moreover, the expansion power for the nonlinear activation function in existing neural networks has to be chosen manually based on experience, which may miss the optimal solution for model convergence and thus degrade model performance. How to use the CKKS scheme to guarantee the security of user data during deep learning training and inference, while preserving deep learning model performance, therefore remains an urgent problem. Based on the above, the embodiment of the application provides a ciphertext neural network construction method based on CKKS, with the following specific scheme:
and S1, training a deep neural network model at a cloud server end.
To address the fact that the CKKS scheme does not support unlimited homomorphic multiplication, nonlinear activation functions involving exponential operations, trigonometric functions, and the like are approximated as polynomials using Taylor expansion in this embodiment. Further, to reduce the model accuracy degradation caused by polynomial approximation, this embodiment introduces a genetic algorithm that finds the optimal power of the approximating polynomial by minimizing the error between training-sample predicted labels and true labels. Specifically:
step S11, designing or calling a proper deep neural network aiming at a specific engineering problem.
And step S12, acquiring a public plaintext data set and preprocessing the characteristic data. In one embodiment, the preprocessing includes normalization and denoising processes.
Step S13, approximating the nonlinear activation function in the deep neural network by a polynomial function through a Taylor expansion, minimizing the loss function between training-sample predicted labels and true labels, and introducing a genetic algorithm to find the optimal power of the approximating polynomial.
A deep neural network generally consists of an input layer, multiple hidden layers, and an output layer, with model parameters optimized by stochastic gradient descent. The hidden layers perform feature extraction on input data using nonlinear activation functions, and mainstream activation functions mostly involve exponential operations, such as the Sigmoid and Tanh functions. The CKKS homomorphic encryption scheme can only process a finite number of homomorphic multiplications and cannot be applied directly to the deep neural network. Therefore, in this embodiment, the nonlinear activation function in the deep neural network is approximated by a polynomial function through a Taylor expansion, solving the problem that homomorphic encryption does not accommodate exponential nonlinear operations.
To reduce the model precision degradation caused by polynomial approximation, this embodiment balances model performance against execution efficiency by minimizing the loss function between training-sample predicted labels and true labels and introducing a genetic algorithm to find the optimal power of the approximating polynomial. The Tanh function is chosen here as an example activation function. The Tanh function can be regarded as a scaled and shifted Logistic function, with value range (−1, 1).
approximating the Tanh function to a polynomial by means of a taylor expansion, where g represents the activation function, n represents the power of the taylor expansion, a, to satisfy homomorphism n Is a coefficient of a polynomial.
g(x)=Tan h(x)=a 0 +a 1 x+a 2 x 2 +…+a n x n
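As a concrete illustration of this approximation, the sketch below evaluates a low-order polynomial of Tanh using only additions and multiplications — the operations CKKS supports. The coefficients are the standard Taylor series of tanh at 0; the function and variable names are illustrative and not taken from the patent.

```python
import math

# Low-order Taylor coefficients of tanh(x) around 0:
# tanh(x) = x - x^3/3 + 2x^5/15 - 17x^7/315 + ...
TANH_TAYLOR = [0.0, 1.0, 0.0, -1.0 / 3.0, 0.0, 2.0 / 15.0, 0.0, -17.0 / 315.0]

def poly_tanh(x, coeffs=TANH_TAYLOR):
    """Evaluate the polynomial approximation using only + and * (Horner form)."""
    acc = 0.0
    for a in reversed(coeffs):
        acc = acc * x + a
    return acc

# Near zero, the degree-7 approximation tracks tanh closely.
for x in (-0.5, -0.1, 0.0, 0.1, 0.5):
    assert abs(poly_tanh(x) - math.tanh(x)) < 1e-3
```

The approximation error grows with |x|, which is why the patent searches for the polynomial power that best trades accuracy against homomorphic multiplication depth.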
The specific process of the genetic algorithm is as follows:
(1) The number of Taylor expansion terms is typically set between 3 and 12. In this embodiment, for convenience in encrypting the model parameters, 8-bit binary codes are used, so the number of Taylor expansion terms ranges over 1 to 255 (i.e., 00000001 to 11111111), and each 8-bit binary number in this range constitutes a chromosome of the solution space.
(2) Defining the fitness function. This embodiment uses the negative of the mean squared error (MSE) loss as the fitness function, which makes it convenient to compute fitness values and locate the optimal power. The fitness function f is:

f(N) = −(1/M)·Σ_{i=1}^{M} (y_i − ŷ_i)²

where N represents the number of Taylor expansion terms, M the number of training samples, y_i the true value of the i-th sample, ŷ_i the predicted value of the i-th sample, (y_i − ŷ_i) the error between the true and predicted values of the i-th sample, and f(N) the fitness value of the chromosome whose Taylor expansion term number is N.
(3) Setting the population size S = 10, the crossover probability p_1 = 0.4, and the mutation probability p_2 = 0.01, and randomly generating the initial seed population χ = {x_i | 0 ≤ i ≤ S}.
(4) Applying selection, crossover, and mutation operators to the population, and selecting the next-generation population according to fitness values. Specifically: put the new chromosomes obtained after crossover and mutation into the population of the previous evolution round, sort the chromosomes from large to small by their current fitness values, retain the top 10 chromosomes by fitness to form the next-generation population, and eliminate the lower-ranked chromosomes.
(5) Presetting the iteration rounds of model training, set to 300 in this embodiment; repeating step (4) until the preset number of iterations is completed, and outputting the MSE loss function value with the best model performance and the corresponding optimal power.
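The genetic search in steps (1)–(5) can be sketched as follows. This is a hedged toy version: the fitness is taken directly as the negative MSE between tanh and its degree-N Taylor polynomial on sample points (rather than a trained model's predictions), and the 1–9 power range, population size, probabilities, and operator details are illustrative choices, not the patent's exact implementation.

```python
import math
import random

random.seed(0)

SAMPLES = [i / 10.0 for i in range(-10, 11)]
# Taylor coefficients of tanh at 0, up to x^9.
COEFFS = [0, 1, 0, -1 / 3, 0, 2 / 15, 0, -17 / 315, 0, 62 / 2835]

def taylor_tanh(x, n_terms):
    return sum(a * x ** k for k, a in enumerate(COEFFS[:n_terms + 1]))

def fitness(n_terms):
    # Negative MSE: larger is better, matching step (2).
    mse = sum((math.tanh(x) - taylor_tanh(x, n_terms)) ** 2
              for x in SAMPLES) / len(SAMPLES)
    return -mse

S, P_CROSS, P_MUT, ROUNDS = 10, 0.4, 0.01, 50
pop = [random.randint(1, 9) for _ in range(S)]  # chromosomes = candidate powers

for _ in range(ROUNDS):
    children = []
    for a, b in zip(pop[::2], pop[1::2]):
        if random.random() < P_CROSS:
            a, b = (a + b) // 2, max(a, b)   # toy "crossover" on integer powers
        if random.random() < P_MUT:
            a = random.randint(1, 9)         # mutation: resample the power
        children += [a, b]
    # Elitist selection as in step (4): keep the S fittest of parents + children.
    pop = sorted(pop + children, key=fitness, reverse=True)[:S]

best = pop[0]
```

Because selection is elitist, `best` never loses fitness across rounds; on these samples, fitness increases monotonically with the number of Taylor terms.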
And step S14, based on the acquired optimal power, the loss function and the preprocessed data, the deep neural network model training is completed.
S2, generating a public key and a private key based on CKKS homomorphic encryption.
In this embodiment, the parameters of the CKKS homomorphic encryption scheme are designed and a secure key management center is established. The flow of the CKKS homomorphic encryption scheme is shown in fig. 2, and the specific encryption scheme and parameters are set as follows:
1) CKKS.Setup(1^λ): Set the security parameter λ = 128 bits and initialize the CKKS scheme with polynomial modulus degree N = 8192, where N/2 is the number of plaintext slots. Choose an integer p > 0 and a base modulus q_0; let L denote the number of homomorphic multiplications, and set the ciphertext modulus q_l = q_0·p^l for l = 1, 2, …, L. Set the rescaling factor Δ = 2^40; the larger Δ is, the fewer homomorphic multiplications can be performed. To facilitate homomorphic operation, the optimal Taylor power n is bound to the number of homomorphic multiplications: assuming the simulated optimal Taylor power is n = 5, the number of homomorphic multiplications is set to 5. So that the scheme resists existing lattice attacks under the RLWE hardness assumption, a key distribution χ_k, an error distribution χ_e, and a random distribution χ_r used for homomorphic encryption are selected over the underlying ring.
2) CKKS.KeyGen → (pk, sk, evk)

Randomly sample s ← χ_k, e ← χ_e, e′ ← χ_e, and uniform ring elements a and a′. Set the private key sk = (1, s) and the public key pk = (b, a), where b = −a·s + e (mod q_L). Set the evaluation key evk = (b′, a′), where b′ = −a′·s + e′ + p·s² (mod p·q_L).
3) CKKS.Encrypt(m, pk) → c

Randomly sample r ← χ_r and e_0, e_1 ← χ_e. For a plaintext m ∈ R, output the ciphertext c = r·pk + (m + e_0, e_1) (mod q_L), i.e., c = (b·r + m + e_0, a·r + e_1).
4) CKKS.Decrypt(c, sk) → m

Given the private key sk and a ciphertext c = (c_0, c_1) at level l, decrypt as m′ = ⟨c, sk⟩ (mod q_l) = c_0 + c_1·s (mod q_l).
Correctness:

m′ = ⟨c, sk⟩ (mod q_L)
= ⟨(b·r + m + e_0, a·r + e_1), (1, s)⟩ (mod q_L)
= ((−a·s + e)·r + m + e_0) + (a·r·s + e_1·s) (mod q_L)
= m + e·r + e_0 + e_1·s (mod q_L)
≈ m

Since e·r + e_0 + e_1·s is small compared with q_L, the error is negligible; for shallower moduli, the error can be made even smaller by rescaling.
5) CKKS.Add(c, c′) → c_add

Given input ciphertexts c and c′, the result of ciphertext addition is c_add = c + c′ (mod q_L).
6) CKKS.Mult(c, c′) → c_mult

Given input ciphertexts c = (c_0, c_1) and c′ = (c_0′, c_1′), consider the product of the two decrypted values:

Dec(c)·Dec(c′) (mod q_L) = (c_0 + c_1·s)·(c_0′ + c_1′·s) (mod q_L)
= c_0·c_0′ + (c_0·c_1′ + c_0′·c_1)·s + c_1·c_1′·s² (mod q_L)
= d_0 + d_1·s + d_2·s² (mod q_L)

where d_0 = c_0·c_0′ (mod q_L), d_1 = c_0·c_1′ + c_0′·c_1 (mod q_L), and d_2 = c_1·c_1′ (mod q_L).
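The expansion above can be sanity-checked with plain modular integers standing in for ring elements. This is purely illustrative — a real CKKS ciphertext is a pair of polynomials in R²_{q_l}, not a pair of integers — and all values are hypothetical.

```python
import random

random.seed(1)

# Toy check of the multiplication identity: the product of two decryptions
# c0 + c1*s and c0' + c1'*s expands into d0 + d1*s + d2*s^2, the degree-2
# "ciphertext" that relinearization with evk later reduces to two components.
q = 2 ** 16        # toy modulus
s = 12345          # toy secret-key component

def dec(c):
    c0, c1 = c
    return (c0 + c1 * s) % q

c  = (random.randrange(q), random.randrange(q))
cp = (random.randrange(q), random.randrange(q))

d0 = (c[0] * cp[0]) % q
d1 = (c[0] * cp[1] + cp[0] * c[1]) % q
d2 = (c[1] * cp[1]) % q

lhs = (dec(c) * dec(cp)) % q
rhs = (d0 + d1 * s + d2 * s * s) % q
assert lhs == rhs
```

The identity is exact modulo q, which is why decrypting only (d_0, d_1) would lose the d_2·s² term and the evaluation key is needed.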
Because the ciphertext grows in size, and in order to keep it as a pair of components, suppose the ciphertext retained only (d_0, d_1); the decryption result would then be d_0 + d_1·s (mod q_L), missing the quadratic term in s. Therefore this embodiment uses the evaluation (auxiliary) key evk and outputs c_mult = (d_0, d_1) + ⌊p^(−1)·d_2·evk⌉ (mod q_L), where ⌊·⌉ denotes rounding p^(−1)·d_2·evk to the nearest ring element. Indeed:

⟨⌊p^(−1)·d_2·evk⌉, sk⟩ (mod q_L)
= p^(−1)·d_2·b′ + (p^(−1)·d_2·a′)·s (mod q_L)
= p^(−1)·d_2·(−a′·s + e′ + p·s²) + p^(−1)·d_2·a′·s (mod q_L)
= d_2·s² + p^(−1)·d_2·e′ (mod q_L)
≈ d_2·s² (mod q_L)

Since p is a large number, the term p^(−1)·d_2·e′ can be ignored, and the final result approximates the quadratic term in s.
7) CKKS.RS_{l→l′}(c)

Given a level-l ciphertext c and a new modulus q_l′ < q_l, output the new level-l′ ciphertext c′ = ⌊(q_l′/q_l)·c⌉, where for one level q_l′/q_l is the inverse of the prime p_l. After rescaling, the ciphertext modulus is reduced by a factor of p_l, and at the same time the approximation error is reduced by a factor of p_l. The primes p_1, …, p_L are taken close to the scaling factor Δ of the codec, and q_0 is chosen large enough for the integer part of the plaintext, whose upper bound is denoted G. After each rescaling, the scale is reduced by a similar factor and remains approximately Δ at each layer, thereby maintaining the desired precision.
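The scale bookkeeping that rescaling maintains can be illustrated with plain integers. The sketch below only mimics the encode–multiply–rescale arithmetic; there is no encryption here, and the names are hypothetical.

```python
# CKKS encodes a real value x as round(x * delta). Multiplying two encodings
# squares the scale to delta^2; rescaling divides by ~delta to bring the
# result back to a single scale, consuming one level, as in CKKS.RS above.
delta = 2 ** 20                    # toy rescaling factor

def encode(x):
    return round(x * delta)

a, b = encode(1.5), encode(2.25)
prod = a * b                       # scale is now delta**2
rescaled = round(prod / delta)     # back to scale delta (one level consumed)

assert abs(rescaled / delta - 1.5 * 2.25) < 1e-5
```

Keeping the scale near Δ after every multiplication is what preserves the fixed precision of the encoded reals across the circuit's depth.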
Further, a public key and a private key are generated based on the CKKS encryption scheme and issued to the user side.
And S3, encrypting and uploading the local privacy data by the user.
The user side stores the public key and the private key obtained from the secure key management center, encrypts the local private data using the public key, and uploads the encrypted data and the public key to the cloud server side.
S4, the cloud server performs ciphertext data reasoning.
According to actual engineering task requirements, the cloud server side may first perform model compression, such as pruning, on the trained deep neural network model; it then encrypts the trained model weights and other parameters with the received public key, processing them into an encrypted version to obtain the ciphertext neural network model. The received ciphertext private data of the user are fed through the ciphertext neural network model for inference prediction, and the ciphertext inference result is output and returned to the user side.
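The structure of such ciphertext inference can be sketched in plain Python, using only additions and multiplications plus the polynomial activation — the operation set CKKS evaluates homomorphically. Plain floats stand in for ciphertexts here, and the layer sizes, weights, and function names are hypothetical.

```python
def poly_act(x):
    # Degree-5 Taylor approximation of tanh: only + and * are used.
    return x - x ** 3 / 3 + 2 * x ** 5 / 15

def dense(inputs, weights, bias):
    # weights: one row of coefficients per output neuron.
    return [sum(w * v for w, v in zip(row, inputs)) + b
            for row, b in zip(weights, bias)]

def forward(x, layers):
    # Every step is an add or a multiply, so the same structure could be
    # evaluated on CKKS ciphertexts with encrypted weights.
    for weights, bias in layers:
        x = [poly_act(v) for v in dense(x, weights, bias)]
    return x

layers = [
    ([[0.5, -0.2], [0.1, 0.3]], [0.0, 0.1]),  # hidden layer, 2 -> 2
    ([[0.7, 0.7]], [0.0]),                    # output layer, 2 -> 1
]
out = forward([0.2, -0.4], layers)
```

The multiplication depth of this pass (one per layer matmul plus the activation's polynomial degree) is what the Setup step's homomorphic-multiplication budget must cover.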
S5, decrypting by the user side.
The user decrypts the received ciphertext inference result with the held private key of the CKKS scheme to obtain the plaintext model prediction output, making the user's private data usable but invisible and effectively guaranteeing the security of the private data throughout deep learning use and inference.
The application approximates the nonlinear activation functions contained in the deep neural network, such as exponential operations and trigonometric functions, by polynomial functions using Taylor expansion. Further, the optimal power of the approximating polynomial is found by introducing a genetic algorithm. Further, the trained neural network model parameters are encrypted with the public key of the CKKS scheme uploaded by the user, so that the trained deep neural network model in the plaintext domain is converted into an encrypted version, realizing end-to-end ciphertext input and ciphertext output. As a general ciphertext deep neural network construction method, this general model framework can turn various deep learning models into ciphertext model versions, realizing end-to-end ciphertext input and ciphertext output and providing core technical support for making private data usable but invisible.
Example 2
Referring to fig. 1, the present embodiment provides a ciphertext neural network system based on CKKS. The whole system comprises the following three entities: the user side, the cloud server side, and the key management center. The user is responsible for sending the encrypted private data and the CKKS public key to the cloud server side, and for decrypting with the CKKS homomorphic-encryption private key; the cloud server side is responsible for providing storage space and computing power for plaintext training and ciphertext testing; the key management center performs key generation and distribution based on the CKKS scheme. Specifically:
the key management center generates a public key and a private key based on the CKKS homomorphic encryption scheme;
the cloud server end completes the training of the deep neural network model based on open plaintext data and converts the deep neural network model into a ciphertext neural network through public key encryption; receiving ciphertext privacy data sent by a user side, completing reasoning through a ciphertext neural network and returning a reasoning result;
the user side acquires a public key and a private key of the key management center, encrypts local private data through the public key and uploads the encrypted data and the public key to the cloud server side; and decrypting the received ciphertext reasoning result through the private key, thereby obtaining model plaintext prediction output of the local data.
In the cloud server, when the deep neural network is trained, the nonlinear activation function of the model is approximated to a polynomial function through a Taylor expansion, the optimal power of an approximation polynomial is determined by using a genetic algorithm, and then model training is completed.
It should be noted that, in the description of the embodiments of the present application, unless explicitly specified and limited otherwise, the terms "disposed," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; may be directly connected or indirectly connected through an intermediate medium. The specific meaning of the above terms in the present application will be understood in detail by those skilled in the art; the accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.

Claims (8)

1. The method for constructing the ciphertext neural network based on CKKS is characterized by comprising the following steps of:
s1, constructing a deep neural network model at a cloud server end, and completing training by adopting open plaintext data;
s2, generating a public key and a private key based on CKKS homomorphic encryption;
s3, the user side stores the public key and the private key, encrypts the local private data through the public key, and uploads the encrypted data and the public key to the cloud server side;
s4, encrypting the trained deep neural network model by the cloud server through a public key, and converting the encrypted deep neural network model into a ciphertext neural network model; inputting the encrypted data into a ciphertext neural network model for reasoning and predicting, and returning a ciphertext reasoning result to the user side;
s5, the user end decrypts the ciphertext reasoning result by using the held private key to obtain plaintext model prediction output of the local data.
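For illustration only, the data flow of steps S1 to S5 can be sketched as a toy protocol simulation. The "encryption" below is a non-cryptographic placeholder (the ciphertext merely wraps the value), and all names and the one-parameter "model" are hypothetical; a real deployment would substitute CKKS key generation, encryption and homomorphic evaluation from a homomorphic-encryption library.

```python
# Toy, NON-cryptographic mock of the workflow in claim 1 (steps S1-S5).
from dataclasses import dataclass


@dataclass(frozen=True)
class Ciphertext:
    payload: float  # in a real CKKS scheme this would be an opaque ring element


def keygen():
    # S2: the key management center would run CKKS KeyGen here.
    return "pk", "sk"  # placeholder public/secret keys


def encrypt(pk, x):
    # S3: the user side encrypts local data under the public key.
    return Ciphertext(x)


def decrypt(sk, ct):
    # S5: the user side decrypts with its held private key.
    return ct.payload


def server_infer(pk, ct):
    # S4: the cloud evaluates a tiny "model" y = 2x + 1 on the ciphertext,
    # using only additions and multiplications -- the operations CKKS supports.
    return Ciphertext(2 * ct.payload + 1)


pk, sk = keygen()
ct = encrypt(pk, 3.0)          # client side
ct_out = server_infer(pk, ct)  # cloud side; with real CKKS it never sees 3.0
print(decrypt(sk, ct_out))     # → 7.0
```

The point of the sketch is the division of roles: only ciphertexts and the public key ever leave the user side, matching the end-to-end ciphertext-in/ciphertext-out property claimed in the abstract.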
2. The CKKS-based ciphertext neural network construction method of claim 1, wherein the sub-steps of step S1 comprise:
S11, constructing or invoking a deep neural network;
S12, acquiring a public plaintext data set and preprocessing the feature data;
S13, approximating the nonlinear activation functions in the deep neural network by polynomial functions through Taylor expansion;
S14, designing a loss function that minimizes the error between the predicted labels and the true labels of the training samples, and introducing a genetic algorithm to find the optimal power of the approximating polynomial;
S15, training the deep neural network model with the preprocessed data.
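As a sketch of step S13 (the patent does not fix a particular activation function; sigmoid is used here only as an example), a nonlinear activation can be replaced by its truncated Taylor polynomial about 0, which a CKKS ciphertext can then evaluate using additions and multiplications alone:

```python
import math


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def sigmoid_taylor(x):
    # Taylor expansion of sigmoid about 0, truncated to degree 5:
    #   sigma(x) ≈ 1/2 + x/4 - x^3/48 + x^5/480
    return 0.5 + x / 4 - x**3 / 48 + x**5 / 480


# The approximation is tight near the expansion point and degrades as |x| grows,
# which is why inputs are typically normalized (see step S12) before inference.
for x in (0.0, 0.5, 1.0, 2.0):
    print(x, sigmoid(x), sigmoid_taylor(x))
```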
3. The CKKS-based ciphertext neural network construction method of claim 2, wherein the sub-steps of step S14 comprise:
step S141, adopting the negative of the mean square error (MSE) loss function as the fitness function;
step S142, setting the population size S, the crossover probability and the mutation probability, and then generating an initialized seed population;
step S143, applying the selection, crossover and mutation operators to the population, adding the new chromosomes obtained by crossover and mutation to the population of the previous evolution round, sorting the chromosomes in descending order of their current fitness values, and retaining the first S chromosomes as the next-generation population;
step S144, repeating step S143 until the predetermined number of iterations is completed, and outputting the MSE loss function value and the corresponding optimal power.
4. The CKKS-based ciphertext neural network construction method of claim 3, wherein the fitness function is:
f(N) = −(1/n) · Σ_{i=1}^{n} (y_i − ŷ_i)²
wherein N represents the number of Taylor expansion terms, n represents the number of samples, y_i represents the true value of the i-th sample, ŷ_i represents the predicted value of the i-th sample, (y_i − ŷ_i) represents the error between the true and predicted values of the i-th sample, and f(N) represents the fitness function value of the chromosome whose Taylor expansion term number is N.
5. The CKKS-based ciphertext neural network construction method of claim 2, wherein the preprocessing in step S12 comprises normalization and denoising.
6. The CKKS-based ciphertext neural network construction method of claim 1, wherein the sub-steps of step S2 comprise:
S21, establishing a secure key management center and completing the parameter setting of the CKKS homomorphic encryption scheme;
S22, generating the public key and the private key, and issuing them to the user side.
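A sketch of what the parameter setting in S21 typically covers for CKKS. The concrete values below are common illustrative choices, not parameters prescribed by the patent, and the class itself is a hypothetical stand-in for the key management center's configuration record:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CKKSParams:
    # Illustrative CKKS parameters (not taken from the patent):
    poly_modulus_degree: int = 8192                 # ring dimension, a power of two
    coeff_mod_bit_sizes: tuple = (60, 40, 40, 60)   # modulus chain, in bits
    scale_bits: int = 40                            # encoding scale = 2**scale_bits

    def __post_init__(self):
        # The scheme requires a power-of-two ring dimension.
        assert self.poly_modulus_degree & (self.poly_modulus_degree - 1) == 0

    @property
    def multiplicative_depth(self):
        # Each ciphertext multiplication consumes one level of the chain;
        # the first and last primes are conventionally reserved.
        return len(self.coeff_mod_bit_sizes) - 2


params = CKKSParams()
print("levels available for polynomial evaluation:", params.multiplicative_depth)
```

The multiplicative depth is what bounds the degree of the approximating polynomial from claim 2, which is why the parameter setting and the polynomial-power search interact.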
7. A CKKS-based ciphertext neural network system, characterized by comprising:
a key management center, which generates a public key and a private key based on the CKKS homomorphic encryption scheme;
a cloud server side, which completes training of a deep neural network model on public plaintext data and converts the trained model into a ciphertext neural network through public key encryption; and which receives the ciphertext private data sent by the user side, completes inference through the ciphertext neural network and returns the ciphertext inference result;
a user side, which obtains the public key and the private key issued by the key management center, encrypts the local private data with the public key, and uploads the encrypted data and the public key to the cloud server side; and which decrypts the received ciphertext inference result with the private key to obtain the plaintext model prediction output for the local private data.
8. The CKKS-based ciphertext neural network system of claim 7, wherein the nonlinear activation functions in the deep neural network are approximated by polynomial functions through Taylor expansion, a genetic algorithm is further utilized to determine the optimal power of the approximating polynomial, and model training is then completed.
CN202311175758.4A 2023-09-12 2023-09-12 Ciphertext neural network construction method and system based on CKS Pending CN117195972A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311175758.4A CN117195972A (en) 2023-09-12 2023-09-12 Ciphertext neural network construction method and system based on CKS

Publications (1)

Publication Number Publication Date
CN117195972A true CN117195972A (en) 2023-12-08

Family

ID=88984612

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311175758.4A Pending CN117195972A (en) 2023-09-12 2023-09-12 Ciphertext neural network construction method and system based on CKS

Country Status (1)

Country Link
CN (1) CN117195972A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117592089A (en) * 2024-01-18 2024-02-23 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium
CN117592089B (en) * 2024-01-18 2024-05-07 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium
CN117688620A (en) * 2024-01-29 2024-03-12 江苏悉宁科技有限公司 Certificate verification optimization method and system based on big data information security
CN117688620B (en) * 2024-01-29 2024-04-23 江苏悉宁科技有限公司 Certificate verification optimization method and system based on big data information security


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination