CN108898028B - Neural network model encryption protection system and method related to iteration and random encryption - Google Patents


Info

Publication number
CN108898028B
Authority
CN
China
Prior art keywords
module
neural network
encryption
data
network model
Prior art date
Legal status
Active
Application number
CN201810735833.0A
Other languages
Chinese (zh)
Other versions
CN108898028A (en)
Inventor
尹愚
Current Assignee
Chengdu Daxiang Fractal Intelligent Technology Co ltd
Original Assignee
Chengdu Daxiang Fractal Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Daxiang Fractal Intelligent Technology Co ltd filed Critical Chengdu Daxiang Fractal Intelligent Technology Co ltd
Priority to CN201810735833.0A priority Critical patent/CN108898028B/en
Publication of CN108898028A publication Critical patent/CN108898028A/en
Application granted granted Critical
Publication of CN108898028B publication Critical patent/CN108898028B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0861Generation of secret information including derivation or calculation of cryptographic keys or passwords

Abstract

The invention belongs to the field of artificial neural network protection mechanisms, and specifically relates to a neural network model encryption protection system and method involving iteration and random encryption. The system comprises a data input module, an encryption module, an encrypted data input module, an artificial neural network model module and a data output module. The encryption module comprises a structure conversion module and an iterative processing module; the iterative processing module comprises a password generation module, a password embedding module and a single-layer convolutional neural network model module; the password generation module comprises a fixed matrix generation module and a random matrix generation module. With this method, a protective password can be embedded into an artificial neural network model without significantly increasing the amount of computation and while preserving the network's performance. After the model is released, any copying, secondary development or modification cannot affect the protective password, and damage to the protective password degrades the model's performance or prevents it from producing valid output.

Description

Neural network model encryption protection system and method related to iteration and random encryption
Technical Field
The invention belongs to the field of artificial neural network protection mechanisms, and particularly relates to a neural network model encryption protection system and method relating to iteration and random encryption.
Background
Deep learning is the main technical approach of current artificial intelligence applications. An artificial neural network model trained with deep learning techniques embodies the accumulated work of its initial developers. However, when an artificial neural network model is published and applied, its network structure and node weights are completely exposed. After release and/or third-party application, the model is easily copied, re-developed or modified, harming the interests of its initial developers. In the prior art, protection schemes for artificial neural network models mainly comprise whole-network encryption, training data encryption, and homomorphic encryption training.
In the whole-network encryption scheme, the trained network model is encrypted before release and cannot be used without a key. However, this is merely a secondary encapsulation of the network model: once the model is decrypted with the key, its core information, such as structure and node weights, can still be obtained by analysis and then copied, propagated, re-developed or modified, so the rights and interests of the model's initial developers are not protected.
the training data encryption scheme is used for escaping the training data, network training is carried out on the mapped data, and the mapping scheme is used for subsequent use of the network model so as to protect the core content of the network model. The encryption technology requires to destroy the internal statistical rules of data to avoid breaking encryption by statistical analysis, and artificial neural network training is to complete data classification and prediction based on important statistical features of training data, and the essence of the artificial neural network training is statistical learning, so the artificial neural network training is contradictory to the essence of a training data encryption scheme, for example, the MD5 algorithm is adopted to carry out modern advanced encryption on the data, each value of the data to be trained cannot generate a unique mapping value, the internal statistical characteristics of the data can be destroyed, and the artificial neural network training is not suitable. While simple mapping encryption can maintain the inherent statistical characteristics of data, the encryption mode is easy to analyze by a large amount of deep learning training data, so that the encryption protection fails;
the homomorphic encryption training scheme allows encryption information to be modified in a specific mode without reading and understanding the encryption information, network training is carried out on homomorphic encrypted data, core content of a network model can be protected, and internal statistical structures of homomorphic encrypted training data can be kept to make up for weaknesses of the training data encryption scheme. However, the scheme can lead to a great increase of the calculated amount, and because various homomorphic encryption algorithms have different degrees of computational incompleteness, certain mathematical operations cannot be directly realized, so that a great amount of used artificial neural network training methods cannot be realized, and the performance of the artificial neural network is reduced.
Disclosure of Invention
Aiming at the defects of the existing artificial neural network protection mechanism, the invention provides a neural network model encryption protection system and method relating to iteration and random encryption.
The specific scheme is as follows:
a neural network model encryption protection system related to iteration and random encryption is characterized in that: the device comprises a data input module, an encryption module, an encrypted data input module, an artificial neural network model module and a data output module; the data input module is in signal connection with the encryption module, the encryption module is in signal connection with the encrypted data input module, the encrypted data input module is in signal connection with the artificial neural network model module, and the artificial neural network model module is in signal connection with the data output module.
Further, the data input module is used for providing the original data to the encryption module; the encryption module is used for encrypting the original data provided by the data input module and outputting encrypted data; the encrypted data input module is used for receiving the encrypted data output by the encryption module and transmitting the encrypted data to the artificial neural network model module; the artificial neural network model module is used for receiving the encrypted data and calculating based on the encrypted data; and the data output module is used for outputting and processing the result calculated by the artificial neural network model module.
Further, in the network training stage, the artificial neural network model module realizes network training through forward network calculation and backward error propagation calculation; in the use stage, the artificial neural network model module obtains results through forward network calculation.
Further, in the network training stage, the data output module calculates a loss function on the output of the artificial neural network model module, which is used to train the network through the gradient back propagation algorithm; in the use stage, the data output module uses the output of the artificial neural network model module to perform the actual decision function.
Furthermore, the encryption module uses an N-digit key as a control quantity to encrypt all original data provided by the data input module; the length of the key is determined by the encryption mode and encryption requirements; each of the N digits of the key is one of the Arabic numerals 0-9.
Furthermore, the encryption module comprises a structure conversion module and an iteration processing module; the structure conversion module is used for converting the original data into a two-dimensional structure; the iteration processing module receives the original data after the structure conversion as input data of first iteration processing, takes the output of the first iteration processing as input data of second iteration processing, and so on, and generates encrypted data after multiple times of iteration processing; wherein the number on any preset bit in the key is associated with the number of iterations.
Further, the iterative processing module comprises a password generation module, a password embedding module and a single-layer convolution neural network model module; aiming at each iteration, the password generating module generates a password with the same structure as the input data, the password embedding module is used for embedding the password into the input data in a superposition mode, and the single-layer convolution neural network model module is used for carrying out network calculation on the input data subjected to password embedding and outputting a calculation result; and the calculation result of the non-final iteration processing is used as input data of the next iteration processing and received by the password generation module, and the network calculation result of the final iteration processing is encrypted data.
Further, the password generation module comprises a fixed matrix generation module and a random matrix generation module; aiming at each iteration, a fixed matrix generating module generates a fixed matrix with the same structure as the input data through a fixed matrix generating function, wherein the fixed matrix generating function has a plurality of adjustable parameters, and numbers on any preset bits in a secret key are associated with the adjustable parameters; the random matrix generation module randomly generates a random matrix with the same structure as the input data through a random matrix generation function, wherein numbers on other preset bits in the secret key are associated with the mean value and the variance of the random matrix generation function; the Hadamard product of the fixed matrix and the random matrix respectively generated corresponding to each iteration is the password to be embedded into the input data of the iteration.
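As a minimal illustration of this password-generation step, the sketch below (Python/NumPy; the function and parameter choices are hypothetical stand-ins for the key-selected generating functions, not taken from the patent) builds a fixed matrix from a generating function, a random matrix from a normal distribution, and returns their Hadamard (elementwise) product:

```python
import numpy as np

def generate_password(shape, fixed_fn, mean, std, rng):
    """One iteration's password: the Hadamard product of a fixed matrix
    and a random matrix with the same structure as the input data.
    fixed_fn, mean and std stand in for values selected by key digits."""
    # Fixed matrix: apply the generating function to a coordinate grid.
    grid = np.indices(shape).sum(axis=0).astype(float)
    fixed = fixed_fn(grid)
    # Random matrix: normal distribution whose mean/variance would be
    # controlled by other key digits.
    random = rng.normal(loc=mean, scale=std, size=shape)
    return fixed * random  # Hadamard product

rng = np.random.default_rng(0)
pw = generate_password((4, 4), np.sin, mean=0.0, std=0.1, rng=rng)
```

Because the random matrix is drawn from a seeded generator, the same key-derived parameters reproduce the same password on both the encrypting and training sides.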
Furthermore, in the network training stage, while the artificial neural network model module realizes network training through forward network calculation and reverse error propagation calculation, the single-layer convolutional neural network model module also realizes network parameter training through forward network calculation and reverse error propagation calculation;
further, when the structure conversion module converts the original data into a two-dimensional structure, when the original data is in a one-dimensional structure, the original data is regarded as a form of a two-dimensional data matrix with the number of rows or columns being 1; when the original data is a structure larger than two dimensions, the original data is subjected to dimension reduction and converted into a two-dimensional data matrix form, and the original data is remapped into the original structure after the encryption step is completed.
Further, the fixed matrix generating function corresponding to each iteration is selected from one of a linear function, a logarithmic function, an exponential function, a trigonometric function, an inverse trigonometric function, or another composite function.
Further, the random matrix generating function corresponding to each iteration process is selected from one of a normal distribution function, an F distribution function, a chi-square distribution function, a T distribution function, or other joint distribution function.
Further, in the N-digit key, each digit of the selectable arabic numbers 0-9 is mapped to an executable value, and the executable value determines the number of iterations, a plurality of adjustable parameters of the fixed matrix generating function corresponding to each iteration, and a mean and a variance of the random matrix generating function corresponding to each iteration.
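The digit-to-executable-value mapping can be pictured with a small lookup table. The tables below are purely hypothetical (the actual C(p,q) values of FIG. 3 are not given in the text): the first key position selects the iteration count, and the remaining positions select parameter values.

```python
# Illustrative digit-to-executable-value lookup for an N-digit key.
KEY = "30517"  # example 5-digit key, N = 5 (hypothetical)

# C[p][q]: position 0 maps its digit q to an iteration count (1-10);
# each remaining position maps q to a parameter value 0.0-0.9.
C = [[q + 1 for q in range(10)]] + \
    [[q / 10 for q in range(10)] for _ in range(len(KEY) - 1)]

executable = [C[p][int(d)] for p, d in enumerate(KEY)]
iterations = executable[0]  # digit '3' in position 1 selects 4 iterations
```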
A neural network model encryption protection method related to iteration and random encryption is characterized by comprising the following steps:
s1, providing original data;
s2, encrypting the original data to generate encrypted data;
s3, inputting the encrypted data into an artificial neural network model, and calculating the encrypted data by the artificial neural network model to obtain a result;
and S4, outputting the calculated result.
Wherein, step S2 specifically includes:
step S21, providing a group of keys with N digits as a control quantity, wherein the keys specifically limit the encryption processing of the original data; wherein, the length of the key is limited by the encryption mode and the encryption requirement in the encryption processing; each digit of the N digits of the key is selected from one of Arabic numerals 0-9;
s22, converting original data into a two-dimensional structure;
s23, taking the original data after the structure conversion as input data of first iteration processing, taking the output of the first iteration processing as input data of second iteration processing, and so on, and generating encrypted data after multiple times of iteration processing; wherein the number on any preset bit in the key is associated with the number of iterations.
For each iteration, step S23 specifically includes:
s231, generating a fixed matrix with the same structure as the input data by adopting a fixed matrix generating function, wherein the fixed matrix generating function is provided with a plurality of adjustable parameters, and numbers on any preset bits in the secret key are associated with the adjustable parameters; randomly generating a random matrix with the same structure as the input data by adopting a random matrix generating function, wherein numbers on other preset digits in the secret key are associated with the mean value and the variance of the random matrix generating function; the Hadamard product of the fixed matrix and the random matrix respectively generated corresponding to each iteration is the password to be embedded into the input data of the iteration.
Step S232, embedding the password into the input data in a superposition mode;
s233, performing network calculation on the input data embedded by the password by adopting a single-layer convolutional neural network, and outputting a calculation result; and the calculation result of the final iteration processing is the encrypted data.
Further, in step S22, when the original data is a one-dimensional structure, it is regarded as a form of a two-dimensional data matrix with the number of rows or columns being 1; when the original data is a structure larger than two dimensions, the original data is subjected to dimension reduction and converted into a two-dimensional data matrix form, and the original data is remapped into the original structure after the encryption step is completed.
Further, in step S231, the fixed matrix generating function corresponding to each iteration is selected from one of a linear function, a logarithmic function, an exponential function, a trigonometric function, an inverse trigonometric function, or another composite function.
Further, in step S231, the random matrix generation function corresponding to each iteration process is selected from one of a normal distribution function, an F distribution function, a chi-square distribution function, a T distribution function, or other joint distribution function.
Further, in the N-digit key, each digit of the selectable arabic numbers 0-9 is mapped to an executable value, and the executable value determines the number of iterations, a plurality of adjustable parameters of the fixed matrix generating function corresponding to each iteration, and a mean and a variance of the random matrix generating function corresponding to each iteration.
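Steps S21-S233 can be sketched end-to-end as follows (Python/NumPy). This is a simplified illustration under stated assumptions: a sine function stands in for the key-selected fixed matrix generating function, a normal distribution for the random matrix generating function, a fixed averaging kernel for the trained single-layer convolutional network, and the key-digit mapping is invented for the example.

```python
import numpy as np

def convolve2d_same(x, k):
    """Minimal 'same'-padded 2-D convolution, stride 1, odd-sized kernel."""
    kh, kw = k.shape
    pad = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for r in range(x.shape[0]):
        for c in range(x.shape[1]):
            out[r, c] = np.sum(pad[r:r + kh, c:c + kw] * k)
    return out

def encrypt(raw, key, conv_kernel, rng):
    """S22: reshape to 2-D; S23: iterate. Per iteration: S231 build the
    password as the Hadamard product of a fixed and a random matrix,
    S232 superpose it on the data, S233 run the single-layer network."""
    data = np.atleast_2d(np.asarray(raw, dtype=float))       # S22
    n_iter = int(key[0]) + 1                                 # digit 1 -> iteration count
    for i in range(n_iter):
        param = int(key[1 + i % (len(key) - 1)]) / 10.0      # key-derived parameter
        fixed = np.sin(np.indices(data.shape).sum(axis=0) + param)  # fixed matrix
        random = rng.normal(loc=param, scale=0.1, size=data.shape)  # random matrix
        data = data + fixed * random                         # S231 + S232
        data = convolve2d_same(data, conv_kernel)            # S233
    return data

rng = np.random.default_rng(7)
kernel = np.full((3, 3), 1.0 / 9.0)   # stand-in for the trained CNN layer
encrypted = encrypt([[1.0, 2.0], [3.0, 4.0]], "21", kernel, rng)
```

In the real system the convolution weights are learned jointly with the main network, so the same key and the same trained single-layer network are both needed to reproduce valid encrypted inputs.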
The invention has the advantages that:
the invention provides a neural network model encryption protection system and method related to iteration and random encryption, which embed a protective password into input data used for training a neural network through an encryption step on the basis of not influencing the structure and the performance of the artificial neural network, and embed statistical characteristics attached to the protective password into the artificial neural network and a single-layer convolutional neural network related to an encryption mechanism in the training of the artificial neural network. Therefore, the trained artificial neural network cannot reasonably process input data which is not embedded by a password, input data which is embedded by an incorrect password, and input data which is embedded by an incorrect encryption mechanism. Compared with the prior art, the method can embed the protective password into the artificial neural network model under the condition that the calculated amount is not obviously increased and the performance of the artificial neural network is kept, so that after the artificial neural network model is released, any copying, secondary development or modification cannot influence the protective password, and the damage to the protective password can cause the performance of the artificial neural network model to be reduced or effective output cannot be made, so that the rights and interests of developers of the artificial neural network model are protected, and the technical controllability on the use and the release of the artificial neural network model is realized.
Drawings
Fig. 1 is a system structural diagram of a neural network model encryption protection system involving iterative and random encryption according to an embodiment of the present invention;
Fig. 2 illustrates the encryption principle of the encryption module of a neural network model encryption protection system relating to iterative and random encryption according to an embodiment of the present invention; wherein CNN represents a single-layer convolutional neural network;
FIG. 3 is a mapping relationship between a number and an executable value for each bit in a secret key of an encryption module of a neural network model encryption protection system involving iterative and random encryption according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for protecting encryption of a neural network model involving iterative and random encryption according to an embodiment of the present invention; where CNN represents a single layer convolutional neural network.
In the drawings:
100-encryption protection system, 1-input module, 2-encryption module, 3-encrypted data input module, 4-artificial neural network model module, 5-data output module, 21-structure conversion module, 22-iteration processing module, 221-password generation module, 222-password embedding module, 223-single-layer convolution neural network model module, 2211-fixed matrix generation module and 2212-random matrix generation module.
Detailed Description
The core idea of the deep learning artificial neural network is as follows: and adjusting the artificial neural network weight by utilizing a gradient back propagation algorithm to realize induction convergence with statistical properties on certain characteristics contained in the known input data set so as to achieve the aim of identifying and judging the unknown input data set.
The goal of artificial neural network training is to let the network iteratively explore and summarize the statistics of known input data sets, which contain certain non-explicit combinations of features that allow the data to be distinguished. Therefore, some feature engineering operations may be applied to the input data before network training so that the trained artificial neural network performs better.
In the encryption step of the neural network model encryption protection system, the input data is encrypted by the encryption module, and the password embedded in the input data is essentially controllable noise. Because the data in the random matrix always carries the statistical distribution characteristics imposed by the random matrix generating function, and embedding the password by superposition does not disturb the statistical characteristics of either the input data or the random matrix, the encryption mode of the invention does not destroy the inherent statistical characteristics related to the identification features of the input data, while adding extra, specially defined statistical characteristics. In the network training stage, these additional statistical characteristics are learned by the artificial neural network and by the single-layer neural network in the encryption module, and are embedded in places that cannot be directly detected, such as the weights of the artificial neural network and the single-layer neural network. If subsequent input data lacks the additional statistical characteristics, the trained artificial neural network model makes wrong judgments, and the trained single-layer neural network model encrypts the data incorrectly, further impairing the judgment of the artificial neural network model.
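The claim that superposed, controllable noise leaves the discriminative statistics of the data intact can be checked numerically. The quick sketch below (Python/NumPy, purely illustrative and not from the patent) shows that the correlation between two features survives the superposition of a small-variance "password" almost unchanged:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two correlated "features" standing in for raw input data.
x = rng.normal(size=5000)
y = 0.8 * x + 0.2 * rng.normal(size=5000)
corr_before = np.corrcoef(x, y)[0, 1]

# Superpose controllable noise (the "password"): the statistical
# relationship the network would learn survives nearly unchanged.
x_enc = x + rng.normal(loc=0.0, scale=0.1, size=5000)
y_enc = y + rng.normal(loc=0.0, scale=0.1, size=5000)
corr_after = np.corrcoef(x_enc, y_enc)[0, 1]
```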
The additional statistical characteristics that the encryption protection system adds to the input data only slightly increase the statistical features the artificial neural network and the single-layer convolutional neural network must sort and summarize, and both test data and theory indicate that these networks learn them easily. Because the back propagation algorithm adjusts all weight parameters of the artificial neural network and the single-layer convolutional neural network during learning, what is learned is fully integrated into both networks and cannot simply be separated out.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
The system structure of a neural network model encryption protection system 100 involving iteration and random encryption is shown in FIG. 1 of the specification, and the encryption principle of its encryption module 2 is shown in FIG. 2. The encryption protection system 100 comprises a data input module 1, an encryption module 2, an encrypted data input module 3, an artificial neural network model module 4 and a data output module 5. The data input module 1 is in signal connection with the encryption module 2, the encryption module 2 with the encrypted data input module 3, the encrypted data input module 3 with the artificial neural network model module 4, and the artificial neural network model module 4 with the data output module 5.
Further, the data input module 1 provides the original data D0 to the encryption module 2; the encryption module 2 encrypts the original data D0 provided by the data input module 1 and outputs encrypted data D'; the encrypted data input module 3 receives the encrypted data D' output by the encryption module 2 and transmits it to the artificial neural network model module 4; the artificial neural network model module 4 receives the encrypted data D' and performs calculations based on it; and the data output module 5 outputs and processes the results calculated by the artificial neural network model module 4.
Further, in the network training stage, the artificial neural network model module 4 realizes network training through forward network calculation and backward error propagation calculation; in the use stage, the artificial neural network model module 4 obtains results through forward network calculation.
Further, in the network training stage, the data output module 5 calculates a loss function on the output of the artificial neural network model module 4, which is used to train the network through the gradient back propagation algorithm; in the use stage, the data output module 5 uses the output of the artificial neural network model module 4 to perform the actual decision function.
The encryption module 2 uses an N-digit key S as a control quantity to encrypt all original data D0 provided by the data input module 1, thereby realizing data encryption; the length of the key S is determined by the encryption mode and encryption requirements; each of the N digits of the key S is one of the Arabic numerals 0-9.
Further, the encryption module 2 includes a structure conversion module 21 and an iterative processing module 22; the structure conversion module 21 converts the original data D0 into a two-dimensional structure; the iterative processing module 22 receives the structure-converted original data D0 as input data D1 of the first iteration T1, takes the output of the first iteration T1 as input data D2 of the second iteration T2, and so on, generating the encrypted data D' after multiple iterations T1-Tn; wherein the number on any preset digit of the key S is associated with the number of iterations. For example, the number on digit 1 of the key S may be associated with the iteration count.
Further, the iterative processing module 22 includes a password generation module 221, a password embedding module 222 and a single-layer convolutional neural network model module 223. For each iteration T1-Tn, the password generation module 221 generates a password s1-sn with the same structure as the input data D1-Dn; the password embedding module 222 embeds the password s1-sn into the input data D1-Dn by superposition; and the single-layer convolutional neural network model module 223 performs network calculation on the password-embedded input data and outputs the result. The result of each non-final iteration T1-T(n-1) is received by the password generation module as input data for the next iteration, and the network calculation result of the final iteration Tn is the encrypted data D'.
Further, the password generation module 221 includes a fixed matrix generation module 2211 and a random matrix generation module 2212. For each iteration T1-Tn, the fixed matrix generation module 2211 generates, through a fixed matrix generating function M1-Mn, a fixed matrix m1-mn with the same structure as the input data D1-Dn, wherein the fixed matrix generating function M1-Mn has several adjustable parameters and the numbers on certain preset digits of the key S are associated with those parameters; the random matrix generation module 2212 randomly generates, through a random matrix generating function N1-Nn, a random matrix n1-nn with the same structure as the input data D1-Dn, wherein the numbers on other preset digits of the key S are associated with the mean and variance of the random matrix generating function N1-Nn.
For example, the numbers on digits 2-3 of the key S may be associated with the adjustable parameters of the fixed matrix generating function M1 of the first iteration T1, and the numbers on digits 4-5 with the mean and variance of the random matrix generating function N1 of the first iteration T1, and so on.
The Hadamard product of the fixed matrix m1-mn and the random matrix n1-nn generated for each iteration T1-Tn is the password s1-sn to be embedded into that iteration's input data. Further, in the network training stage, while the artificial neural network model module 4 realizes network training through forward network calculation and backward error propagation calculation, the single-layer convolutional neural network model module 223 also trains its network parameters through forward network calculation and backward error propagation calculation.
further, when the structure conversion module 21 converts the original data D0 into a two-dimensional structure: if the original data D0 has a one-dimensional structure, it is treated as a two-dimensional data matrix with 1 row or 1 column; if the original data D0 has a structure of more than two dimensions, it is reduced in dimension and converted into a two-dimensional data matrix, and remapped to its original structure after the encryption step is completed.
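The structure conversion can be sketched as a reversible reshape. The convention of flattening all trailing axes is an assumption, as the specification only requires that higher-dimensional data be reduced to a two-dimensional matrix and remapped to its original structure after encryption:

```python
import numpy as np

def to_2d(data):
    """Convert original data to a 2-D matrix, remembering the original shape."""
    arr = np.asarray(data)
    if arr.ndim == 1:          # 1-D data: treat as a matrix with one row
        return arr.reshape(1, -1), arr.shape
    if arr.ndim == 2:
        return arr, arr.shape
    # >2-D data: reduce to 2-D by flattening all trailing axes
    return arr.reshape(arr.shape[0], -1), arr.shape

def from_2d(matrix, original_shape):
    """Remap the (encrypted) 2-D matrix back to the original structure."""
    return matrix.reshape(original_shape)

cube = np.arange(24).reshape(2, 3, 4)   # e.g. a 3-D input
flat, shape = to_2d(cube)
```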
Further, for each iteration process T1-Tn, the fixed matrix generating function M1-Mn is selected from a linear function, a logarithmic function, an exponential function, a trigonometric function, an inverse trigonometric function, or another composite function.
Further, for each iteration process T1-Tn, the random matrix generating function N1-Nn is selected from a normal distribution function, an F distribution function, a chi-square distribution function, a t distribution function, or another joint distribution function.
Furthermore, in the N-digit key S, each optional Arabic numeral 0-9 is mapped to an executable value; the executable values determine the number of iterations, the adjustable parameters of the fixed matrix generating function M1-Mn corresponding to each iteration process T1-Tn, and the mean and variance of the random matrix generating function N1-Nn corresponding to each iteration process T1-Tn.
As shown in Fig. 3 of the specification, the optional Arabic numerals 0-9 at each position of the key S are mapped to executable values C(p,q), where p ranges from 1 to N and q ranges from 0 to 9; C(p,q) denotes the executable value obtained by mapping the numeral q at the p-th position of the key S.
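The digit-to-executable-value mapping of Fig. 3 can be sketched as a lookup table. The table entries below are invented placeholders, since the actual values C(p,q) are implementation-specific:

```python
# Hypothetical mapping table C[p][q]: executable value for numeral q at
# position p of the key S. Position 1 might control the iteration count;
# later positions feed the fixed-matrix parameters and the random-matrix
# mean/variance (the assignment of roles to positions is preset).
N_DIGITS = 6
C = {
    p: {q: 0.1 * p + 0.01 * q for q in range(10)}  # placeholder values
    for p in range(1, N_DIGITS + 1)
}

def executable_values(key: str):
    """Map each numeral of the key S to its executable value C(p, q)."""
    assert len(key) == N_DIGITS and key.isdigit()
    return [C[p][int(q)] for p, q in enumerate(key, start=1)]

vals = executable_values("305214")
```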
Example 2
Referring to the method flow diagram in Fig. 4 of the specification, the neural network model encryption protection method involving iterative and random encryption comprises the following steps:
step S1, providing original data D0;
step S2, encrypting the original data D0 to generate encrypted data D';
step S3, inputting the encrypted data D' into an artificial neural network model, which performs calculation on the encrypted data D' to obtain a result;
step S4, outputting the calculated result.
Wherein, step S2 specifically includes:
step S21, providing a group of keys S with N digits as the control quantity; the key S specifically defines the encryption processing applied to the original data D0. The length of the key S is limited by the encryption mode and the encryption requirements of the encryption processing, and each of the N digits of the key S is selected from the Arabic numerals 0-9;
step S22, converting the original data D0 into a two-dimensional structure;
step S23, taking the structure-converted original data D0 as the input data D1 of the first iteration process T1, taking the output of the first iteration process T1 as the input data D2 of the second iteration process T2, and so on; after multiple iteration processes T1-Tn, the encrypted data D' is generated. The numeral at a preset position of the key S is associated with the number of iteration processes; for example, the numeral at position 1 of the key S may be associated with the number of iterations.
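The chaining of iterations in step S23 can be sketched as a simple loop; here `iterate_once` is a hypothetical stand-in for the per-iteration steps S231-S233, and the trivial demo step exists only to make the chaining concrete:

```python
import numpy as np

def encrypt(data_2d, num_iterations, iterate_once):
    """Step S23: feed the structure-converted data through n chained
    iterations; each iteration's output is the next iteration's input."""
    d = data_2d
    for t in range(1, num_iterations + 1):
        d = iterate_once(d, t)   # steps S231-S233 for iteration T_t
    return d                      # encrypted data D'

# Demo with a trivial stand-in step (the real step embeds a password and
# applies a single-layer convolution, per steps S231-S233):
demo_step = lambda d, t: d + t
d_prime = encrypt(np.zeros((2, 2)), num_iterations=3, iterate_once=demo_step)
```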
Wherein, for each iteration process T1-Tn, step S23 specifically includes:
step S231, using a fixed matrix generating function M1-Mn to generate a fixed matrix m1-mn with the same structure as the input data D1-Dn, where the fixed matrix generating function M1-Mn has several adjustable parameters and the numerals at certain preset positions of the key S are associated with these adjustable parameters; and using a random matrix generating function N1-Nn to randomly generate a random matrix n1-nn with the same structure as the input data D1-Dn, where the numerals at other preset positions of the key S are associated with the mean and variance of the random matrix generating function N1-Nn;
for example, the numerals at positions 2-3 of the key S are associated with the adjustable parameters of the fixed matrix generating function M1 of the first iteration T1, the numerals at positions 4-5 of the key S are associated with the mean and variance of the random matrix generating function N1 of the first iteration T1, and so on.
The Hadamard product of the fixed matrix m1-mn and the random matrix n1-nn generated for each iteration process T1-Tn is the password s1-sn to be embedded into the input data of that iteration.
Step S232, embedding the password s1-sn into the input data D1-Dn by superposition;
step S233, performing network calculation on the password-embedded input data with a single-layer convolutional neural network, and outputting the calculation result; the calculation result of each non-final iteration process T1-Tn-1 serves as the input data of the next iteration, while the network calculation result of the final iteration process Tn is the encrypted data D'.
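One iteration (steps S232-S233, given a password already generated in step S231) can be sketched as follows. The 3x3 kernel, the zero 'same' padding, and the identity-kernel demonstration are assumptions, since the patent only specifies a single-layer convolutional network:

```python
import numpy as np

def conv2d_same(x, kernel):
    """Single-layer 2-D convolution with zero padding ('same' output size)."""
    kh, kw = kernel.shape
    pad = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kernel)
    return out

def iterate_once(d, password, kernel):
    """One iteration T_i: embed the password by superposition (step S232),
    then run the single-layer convolution network (step S233)."""
    embedded = d + password          # superposition embedding
    return conv2d_same(embedded, kernel)

d = np.ones((4, 4))
s = np.full((4, 4), 0.5)             # password from step S231
k = np.zeros((3, 3)); k[1, 1] = 1.0  # identity kernel, for illustration only
out = iterate_once(d, s, k)
```

With a trained (non-identity) kernel the output is a non-trivially transformed version of the input, which is what makes the encrypted data D' usable by the downstream artificial neural network yet opaque without the key.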
Further, in step S22, when the original data D0 has a one-dimensional structure, it is treated as a two-dimensional data matrix with 1 row or 1 column; when the original data D0 has a structure of more than two dimensions, it is reduced in dimension and converted into a two-dimensional data matrix, and remapped to its original structure after the encryption step is completed.
Further, in step S231, for each iteration process T1-Tn, the fixed matrix generating function M1-Mn is selected from a linear function, a logarithmic function, an exponential function, a trigonometric function, an inverse trigonometric function, or another composite function.
Further, in step S231, for each iteration process T1-Tn, the random matrix generating function N1-Nn is selected from a normal distribution function, an F distribution function, a chi-square distribution function, a t distribution function, or another joint distribution function.
Furthermore, in the N-digit key S, each optional Arabic numeral 0-9 is mapped to an executable value; the executable values determine the number of iterations, the adjustable parameters of the fixed matrix generating function M1-Mn corresponding to each iteration process T1-Tn, and the mean and variance of the random matrix generating function N1-Nn corresponding to each iteration process T1-Tn.
As shown in Fig. 3 of the specification, the optional Arabic numerals 0-9 at each position of the key S are mapped to executable values C(p,q), where p ranges from 1 to N and q ranges from 0 to 9; C(p,q) denotes the executable value obtained by mapping the numeral q at the p-th position of the key S.

Claims (9)

1. A neural network model encryption protection system related to iteration and random encryption is characterized in that: the device comprises a data input module (1), an encryption module (2), an encrypted data input module (3), an artificial neural network model module (4) and a data output module (5); the data input module (1) is in signal connection with the encryption module (2), the encryption module (2) is in signal connection with the encrypted data input module (3), the encrypted data input module (3) is in signal connection with the artificial neural network model module (4), and the artificial neural network model module (4) is in signal connection with the data output module (5);
wherein the encryption module (2) comprises a structure conversion module (21) and an iteration processing module (22); the iterative processing module (22) comprises a password generating module (221), a password embedding module (222) and a single-layer convolutional neural network model module (223); the password generation module (221) comprises a fixed matrix generation module (2211) and a random matrix generation module (2212);
the structure conversion module (21) is used for converting the original data into a two-dimensional structure; the iteration processing module (22) receives the original data after the structure conversion as input data of first iteration processing, takes the output of the first iteration processing as input data of second iteration processing, and so on, and generates encrypted data after multiple times of iteration processing;
for each iteration process, the password generating module (221) generates passwords with the same structure as the input data, the password embedding module (222) is used for embedding the passwords into the input data in a superposition mode, and the single-layer convolutional neural network model module (223) is used for performing network calculation on the input data subjected to password embedding and outputting the calculation result; the calculation result of the non-final iteration processing is used as input data of the next iteration processing and received by the password generation module (221), and the network calculation result of the final iteration processing is encrypted data.
2. The neural network model encryption protection system related to iterative and random encryption according to claim 1, characterized in that: the data input module (1) is used for providing original data to the encryption module (2); the encryption module (2) is used for encrypting the original data provided by the data input module (1) and outputting encrypted data; the encrypted data input module (3) is used for receiving the encrypted data output by the encryption module (2) and transmitting the encrypted data to the artificial neural network model module (4); the artificial neural network model module (4) is used for receiving the encrypted data and calculating based on the encrypted data; the data output module (5) is used for outputting and processing the result calculated by the artificial neural network model module (4);
further, in a network training stage, the artificial neural network model module (4) realizes network training through forward network calculation and reverse error propagation calculation; in the using stage, the artificial neural network model module (4) calculates a result through a forward network;
further, in a network training stage, the data output module (5) calculates a loss function on the output of the artificial neural network model module (4), and the loss function is used to realize network training of the artificial neural network model module (4) through a gradient back propagation algorithm; in the using stage, the data output module (5) uses the output of the artificial neural network model module (4) to make the actual functional judgment.
3. The neural network model encryption protection system related to iterative and random encryption according to claim 1, characterized in that: the encryption module (2) uses a group of keys with N digits as the control quantity to encrypt all original data provided by the data input module (1); the length of the key is limited by the encryption mode and encryption requirements of the encryption processing; each of the N digits of the key is selected from the Arabic numerals 0-9.
4. The neural network model encryption protection system related to iterative and random encryption as claimed in claim 3, wherein the number at any preset bit in the key is associated with the number of iterative processes.
5. The neural network model encryption protection system related to iterative and random encryption as claimed in claim 3, wherein for each iterative process, the fixed matrix generation module (2211) generates a fixed matrix having the same structure as the input data through a fixed matrix generation function, wherein the fixed matrix generation function has a plurality of adjustable parameters, and numbers at any preset positions in the secret key are associated with the plurality of adjustable parameters; the random matrix generation module (2212) randomly generates a random matrix with the same structure as the input data through a random matrix generation function, wherein numbers on other preset bits in the secret key are associated with the mean value and the variance of the random matrix generation function; the Hadamard product of the fixed matrix and the random matrix respectively generated corresponding to each iteration processing is the password which needs to be embedded into the input data of the iteration processing.
6. The encryption protection system for neural network models involving iterative and random encryption as claimed in claim 1, wherein in the network training phase, the single layer convolutional neural network model module (223) implements its network parameter training by forward network computation and inverse error propagation computation while the artificial neural network model module (4) implements network training by forward network computation and inverse error propagation computation.
7. The system of claim 5, wherein the optional Arabic numerals 0-9 at each position of the N-digit key are respectively mapped to executable values, and the executable values respectively determine the number of iterations, the adjustable parameters of the fixed matrix generating function corresponding to each iteration, and the mean and variance of the random matrix generating function corresponding to each iteration.
8. A neural network model encryption protection method related to iteration and random encryption is characterized by comprising the following steps:
s1, providing original data;
s2, encrypting the original data to generate encrypted data;
s3, inputting the encrypted data into an artificial neural network model, and calculating the encrypted data by the artificial neural network model to obtain a result;
s4, outputting a result obtained by calculation;
wherein, step S2 specifically includes:
step S21, providing a group of keys with N digits as a control quantity, wherein the keys specifically limit the encryption processing of the original data; wherein, the length of the key is limited by the encryption mode and the encryption requirement in the encryption processing; each digit of the N digits of the key is selected from one of Arabic numerals 0-9;
s22, converting original data into a two-dimensional structure;
s23, taking the original data after the structure conversion as input data of first iteration processing, taking the output of the first iteration processing as input data of second iteration processing, and so on, and generating encrypted data after multiple times of iteration processing; wherein, the number on any preset bit in the key is associated with the number of iterative processing;
for each iteration, step S23 specifically includes:
s231, generating a fixed matrix with the same structure as the input data by adopting a fixed matrix generating function, wherein the fixed matrix generating function is provided with a plurality of adjustable parameters, and numbers on any preset bits in the secret key are associated with the adjustable parameters; randomly generating a random matrix with the same structure as the input data by adopting a random matrix generating function, wherein numbers on other preset digits in the secret key are associated with the mean value and the variance of the random matrix generating function; hadamard products of the fixed matrix and the random matrix respectively generated corresponding to each iteration processing are passwords which need to be embedded into the input data of the iteration processing;
step S232, embedding the password into the input data in a superposition mode;
s233, performing network calculation on the input data embedded by the password by adopting a single-layer convolutional neural network, and outputting a calculation result; and the calculation result of the final iteration processing is the encrypted data.
9. The method of claim 8, wherein the optional Arabic numerals 0-9 at each position of the N-digit key are respectively mapped to executable values, and the executable values determine the number of iterations, the adjustable parameters of the fixed matrix generating function corresponding to each iteration, and the mean and variance of the random matrix generating function corresponding to each iteration.
CN201810735833.0A 2018-07-06 2018-07-06 Neural network model encryption protection system and method related to iteration and random encryption Active CN108898028B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810735833.0A CN108898028B (en) 2018-07-06 2018-07-06 Neural network model encryption protection system and method related to iteration and random encryption


Publications (2)

Publication Number Publication Date
CN108898028A (en) 2018-11-27
CN108898028B (en) 2020-07-03

Family

ID=64348224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810735833.0A Active CN108898028B (en) 2018-07-06 2018-07-06 Neural network model encryption protection system and method related to iteration and random encryption

Country Status (1)

Country Link
CN (1) CN108898028B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045227B (en) * 2019-03-23 2019-12-17 广西电网有限责任公司电力科学研究院 power distribution network fault diagnosis method based on random matrix and deep learning
CN110457951B (en) * 2019-08-19 2021-04-16 南京大学 Artificial noise-free deep learning model protection method
CN110674941B (en) * 2019-09-25 2023-04-18 南开大学 Data encryption transmission method and system based on neural network
US11495145B2 (en) 2019-09-27 2022-11-08 Wipro Limited Method and system for selectively encrypting dataset
CN110795726A (en) * 2019-10-23 2020-02-14 成都索贝数码科技股份有限公司 Password protection method and system based on artificial neural network
CN113468544B (en) * 2020-03-30 2024-04-05 杭州海康威视数字技术股份有限公司 Training method and device for application model
CN111488602A (en) * 2020-04-16 2020-08-04 支付宝(杭州)信息技术有限公司 Data object privacy protection method and device and electronic equipment
CN111581671B (en) * 2020-05-11 2021-05-25 笵成科技南京有限公司 Digital passport protection method combining deep neural network and block chain
CN112395635B (en) * 2021-01-18 2021-05-04 北京灵汐科技有限公司 Image processing method, device, secret key generating method, device, training method and device, and computer readable medium
WO2022152153A1 (en) * 2021-01-18 2022-07-21 北京灵汐科技有限公司 Image processing method and device, key generation method and device, training method, and computer readable medium
CN112395636B (en) * 2021-01-19 2021-07-30 国网江西省电力有限公司信息通信分公司 Power grid data encryption model training method, system, storage medium and equipment
CN113014570B (en) * 2021-02-22 2022-08-02 西安理工大学 Communication data encryption and decryption method based on convolutional neural network

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1881874A (en) * 2006-04-26 2006-12-20 集美大学 Public key cipher encrypting and decrypting method based on nerval network chaotic attractor

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9740817B1 (en) * 2002-10-18 2017-08-22 Dennis Sunga Fernandez Apparatus for biological sensing and alerting of pharmaco-genomic mutation
CN102123026A (en) * 2011-04-12 2011-07-13 南开大学 Chaos and hyperchaos based two-level video streaming media encryption method
CN104104496B (en) * 2014-07-08 2018-02-23 华侨大学 A kind of one-way Hash function building method based on chaos dynamic Theory
US9946970B2 (en) * 2014-11-07 2018-04-17 Microsoft Technology Licensing, Llc Neural networks for encrypted data
CN107204844A (en) * 2017-04-21 2017-09-26 中山大学 A kind of encrypted multimedia and decryption method based on combination cellular automaton



Similar Documents

Publication Publication Date Title
CN108898028B (en) Neural network model encryption protection system and method related to iteration and random encryption
CN108920981B (en) Neural network model encryption protection system and method related to data iterative encryption
CN108629193B (en) Encryption protection system and method for artificial neural network model
CN108830092B (en) Neural network model encryption protection system and method related to data random encryption
US11816226B2 (en) Secure data processing transactions
US20230171086A1 (en) Encrypting and decrypting information
US11032251B2 (en) AI-powered cyber data concealment and targeted mission execution
CN108804931B (en) Neural network model encryption protection system and method related to domain transformation data encryption
CN112765652B (en) Method, device and equipment for determining leaf node classification weight
Hu et al. Research on plaintext restoration of AES based on neural network
WO2021010896A1 (en) Method and system for distributed data management
CN110807484B (en) Privacy protection VGG-based dense image recognition method and system
CN117094008A (en) Neural network model encryption method, neural network model decryption device, neural network model encryption equipment and neural network model decryption medium
CN108900294B (en) Encryption protection system and method for neural network model related to specified frequency band encryption
Ibarrondo et al. Banners: Binarized neural networks with replicated secret sharing
Huang et al. Secure XOR-CIM engine: Compute-in-memory sram architecture with embedded xor encryption
WO2022241307A1 (en) Image steganography utilizing adversarial perturbations
US11829468B2 (en) Neural network confidentiality
CN116861485A (en) Student information privacy protection method based on deep learning fusion
Khavalko et al. Application of neural network technologies for information protection in real time
Hu et al. Research on encrypted face recognition algorithm based on new combined chaotic map and neural network
CN114244517A (en) Data encryption and signature method and device, computer equipment and storage medium
CN114338944A (en) Ciphertext domain image classification method based on deep learning
Yuan et al. Secure integrated circuit design via hybrid cloud
Zhao et al. PPCNN: An efficient privacy‐preserving CNN training and inference framework

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant