CN108898028A - Neural network model encryption protection system and method involving iteration and random encryption - Google Patents
- Publication number: CN108898028A (application CN201810735833.0A)
- Authority
- CN
- China
- Prior art keywords
- module
- data
- network model
- encryption
- iterative processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/602—Providing cryptographic facilities or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/08—Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
- H04L9/0861—Generation of secret information including derivation or calculation of cryptographic keys or passwords
Abstract
The invention belongs to the field of artificial neural network protection mechanisms, and in particular relates to a neural network model encryption protection system and method involving iteration and random encryption. The system includes a data input module, an encryption module, an encrypted-data input module, an artificial neural network model module and a data output module. The encryption module includes a structure conversion module and an iterative processing module; the iterative processing module includes a password generation module, a password embedding module and a single-layer convolutional neural network model module; the password generation module includes a fixed-matrix generation module and a random-matrix generation module. Without significantly increasing the amount of computation or degrading the performance of the artificial neural network, the invention embeds a protective password into the artificial neural network model, so that after the model is published, no copying, secondary development or modification can remove the protective password, while destroying the protective password degrades the model's performance or prevents it from producing valid output.
Description
Technical field
The invention belongs to the field of artificial neural network protection mechanisms, and in particular relates to a neural network model encryption protection system and method involving iteration and random encryption.
Background technique
Deep learning is the main technical approach of current artificial intelligence applications. An artificial neural network model trained with deep learning techniques embodies the accumulated labor and insight of its original developer. However, during publication and application of the model, its network structure and node weights are fully exposed. Once the model is published and/or delivered to a third party, it is easy to copy, redevelop or modify, damaging the rights and interests of the original developer. Existing protection schemes for artificial neural network models mainly include whole-network encryption, training data encryption and homomorphic encryption training.
Whole-network encryption encrypts the trained network model before publication, so that the model cannot be used without the key. However, this is merely a secondary packaging of the model: once the model is decrypted with the key, its core information such as network structure and node weights can still be extracted by analysis, and the model can be copied, distributed, redeveloped or modified, so the equity of the model's original developer cannot be protected;
Training data encryption maps the training data through an escape transformation, trains the network on the mapped data, and reserves the mapping scheme for subsequent use of the network model, thereby protecting the model's core content. Such encryption must destroy the internal statistical regularities of the data to resist cryptanalysis by statistical methods, yet artificial neural network training accomplishes data classification and prediction precisely on the basis of the important statistical features of the training data; it is essentially a form of statistical learning and therefore fundamentally conflicts with training data encryption. For example, if strong modern encryption such as the MD5 algorithm is applied to the data, the mapping of each training data value destroys the data's inherent statistical properties, making the data unsuitable for neural network training. A simple mapping encryption, on the other hand, can preserve the inherent statistical properties of the data, but the encryption scheme can then easily be inferred from the large volume of training data used in deep learning, so the protection fails;
Homomorphic encryption training allows encrypted information to be manipulated in specific ways without being understood. Training the network on homomorphically encrypted data protects the core content of the network model, and the encrypted training data still retain their internal statistical structure, remedying the weakness of training data encryption. However, this scheme greatly increases the amount of computation, and because the various homomorphic encryption algorithms are computationally incomplete to different degrees, certain mathematical operations cannot be realized directly. As a result, many artificial neural network training methods already in wide use cannot be implemented, and network performance declines.
Summary of the invention
In view of the above drawbacks of existing artificial neural network protection mechanisms, the present invention provides a neural network model encryption protection system and method involving iteration and random encryption.
The concrete scheme is as follows:
A neural network model encryption protection system involving iteration and random encryption, characterized in that it includes a data input module, an encryption module, an encrypted-data input module, an artificial neural network model module and a data output module; the data input module is signal-connected to the encryption module, the encryption module is signal-connected to the encrypted-data input module, the encrypted-data input module is signal-connected to the artificial neural network model module, and the artificial neural network model module is signal-connected to the data output module.
Further, the data input module provides original data to the encryption module; the encryption module encrypts the original data provided by the data input module and outputs encrypted data; the encrypted-data input module receives the encrypted data output by the encryption module and transmits it to the artificial neural network model module; the artificial neural network model module receives the encrypted data and performs calculations on it; and the data output module post-processes the results calculated by the artificial neural network model module.
Further, in the network training stage, the artificial neural network model module realizes network training through forward network calculation and backward error propagation; in the service stage, the artificial neural network model module obtains results through forward network calculation alone.
Further, in the network training stage, the data output module computes the loss function on the output of the artificial neural network model module, which the module uses to realize network training via the gradient back-propagation algorithm; in the service stage, the data output module uses the output of the artificial neural network model module to judge the actual function.
Further, the encryption module uses a group key of N digits as the control quantity to encrypt all original data provided by the data input module, thereby realizing data encryption; the length of the key is constrained by the encryption mode and the security requirements; each of the N digits of the key is an Arabic numeral selected from 0-9.
Further, the encryption module includes a structure conversion module and an iterative processing module; the structure conversion module converts the original data into a two-dimensional structure; the iterative processing module receives the structure-converted original data as the input of the first iteration, takes the output of the first iteration as the input of the second iteration, and so on, generating the encrypted data after multiple iterations; the digit at a preset position of the key determines the number of iterations.
Further, the iterative processing module includes a password generation module, a password embedding module and a single-layer convolutional neural network model module. For each iteration, the password generation module generates a password with the same structure as the input data, the password embedding module embeds the password into the input data by superposition, and the single-layer convolutional neural network model module performs the network calculation on the password-embedded input data and outputs the result. The calculated result of every iteration except the last is received by the password generation module as the input data of the next iteration, and the network calculation result of the final iteration is the encrypted data.
Further, the password generation module includes a fixed-matrix generation module and a random-matrix generation module. For each iteration, the fixed-matrix generation module uses a fixed-matrix generating function to generate a fixed matrix with the same structure as the input data; the fixed-matrix generating function has several adjustable parameters, which are determined by the digits at certain preset positions of the key. The random-matrix generation module uses a random-matrix generating function to randomly generate a random matrix with the same structure as the input data; the mean and variance of the random-matrix generating function are determined by the digits at other preset positions of the key. For each iteration, the Hadamard product of the fixed matrix and the random matrix so generated is the password to be embedded into the input data of the current iteration.
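As an illustration of the password generation just described, the following sketch builds one iteration's password; the concrete generating functions and parameter values here (a sine fixed matrix, Gaussian noise) are assumptions, since the patent only names families of admissible functions:

```python
import math
import random

def fixed_matrix(rows, cols, a, b):
    # Fixed matrix from a deterministic generating function; a sine
    # function is one of the trigonometric choices the text allows,
    # with a and b standing in for the key-controlled adjustable parameters.
    return [[math.sin(a * (i + j) + b) for j in range(cols)]
            for i in range(rows)]

def random_matrix(rows, cols, mean, var, rng):
    # Random matrix drawn from a normal distribution; its mean and
    # variance would come from other key digits.
    sd = math.sqrt(var)
    return [[rng.gauss(mean, sd) for _ in range(cols)]
            for _ in range(rows)]

def make_password(rows, cols, a, b, mean, var, rng):
    m = fixed_matrix(rows, cols, a, b)
    n = random_matrix(rows, cols, mean, var, rng)
    # The Hadamard (element-wise) product of the two matrices is the
    # password embedded into the current iteration's input data.
    return [[m[i][j] * n[i][j] for j in range(cols)] for i in range(rows)]

pwd = make_password(2, 3, a=0.5, b=0.1, mean=0.0, var=0.01,
                    rng=random.Random(0))
```

Because the fixed matrix is deterministic while the random matrix is freshly drawn, the password differs on every iteration yet remains statistically controlled, matching the "controllable noise" characterization later in the description.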
Further, in the network training stage, while the artificial neural network model module realizes network training through forward network calculation and backward error propagation, the single-layer convolutional neural network model module likewise trains its own network parameters through forward network calculation and backward error propagation;
Further, when the structure conversion module converts the original data into a two-dimensional structure: one-dimensional original data is treated as a two-dimensional data matrix with one row or one column; original data of more than two dimensions is reduced to the form of a two-dimensional data matrix and mapped back to its original structure after the encryption step is completed.
Further, the fixed-matrix generating function of each iteration is selected from linear, logarithmic, exponential, trigonometric and inverse trigonometric functions and other composite functions.
Further, the random-matrix generating function of each iteration is selected from the normal distribution, F distribution, chi-square distribution and T distribution functions and other joint distribution functions.
Further, in the N-digit key, each selectable Arabic numeral 0-9 at each position maps to an executable value; these executable values respectively determine the number of iterations, the adjustable parameters of the fixed-matrix generating function of each iteration, and the mean and variance of the random-matrix generating function of each iteration.
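One possible digit-to-value wiring can be sketched as follows; the positions and formulas below are purely illustrative assumptions, since the actual mapping from key digits to executable values is left open by the text:

```python
def parse_key(key):
    # Each digit of the key maps to an executable value. By assumption
    # here: digit 0 sets the iteration count, digits 1-2 the fixed-matrix
    # parameters, and digits 3-4 the noise mean and variance.
    d = [int(c) for c in key]
    return {
        "iterations": d[0] + 1,                        # 1..10 rounds
        "fixed_params": (d[1] / 10.0 + 0.1, d[2] / 10.0),
        "noise_mean": d[3] / 10.0,
        "noise_var": d[4] / 100.0 + 0.01,
    }

cfg = parse_key("52437")
```

Under this wiring, key "52437" would select 6 iterations, fixed-matrix parameters (0.3, 0.4), noise mean 0.3 and noise variance 0.08; a longer key could assign separate digits to each iteration's functions, as the embodiment's positions 2-3 and 4-5 suggest.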
A neural network model encryption protection method involving iteration and random encryption, characterized in that the method includes the following steps:
Step S1. Provide the original data;
Step S2. Encrypt the original data to generate the encrypted data;
Step S3. Input the encrypted data into the artificial neural network model, which performs calculations on the encrypted data and obtains a result;
Step S4. Output the calculated result.
Step S2 specifically includes:
Step S21. Provide a group key of N digits as the control quantity; this key specifically defines the encryption of the original data. The length of the key is constrained by the encryption mode and the security requirements, and each of its N digits is an Arabic numeral selected from 0-9;
Step S22. Convert the original data into a two-dimensional structure;
Step S23. Use the structure-converted original data as the input of the first iteration, the output of the first iteration as the input of the second iteration, and so on, generating the encrypted data after multiple iterations; the digit at a preset position of the key determines the number of iterations.
For each iteration, step S23 specifically includes:
Step S231. Use the fixed-matrix generating function to generate a fixed matrix with the same structure as the input data, where the fixed-matrix generating function has several adjustable parameters determined by the digits at certain preset positions of the key; use the random-matrix generating function to randomly generate a random matrix with the same structure as the input data, where the mean and variance of the random-matrix generating function are determined by the digits at other preset positions of the key. The Hadamard product of the fixed matrix and the random matrix generated for the current iteration is the password to be embedded into the input data of the current iteration.
Step S232. Embed the password into the input data by superposition;
Step S233. Perform the network calculation on the password-embedded input data with the single-layer convolutional neural network, and output the calculated result. The calculated result of every iteration except the last serves as the input data of the next iteration, and the network calculation result of the final iteration is the encrypted data.
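The loop of steps S231-S233 can be sketched end to end as follows. This is a minimal reading under assumed choices: the key wiring, the sine fixed matrix and the Gaussian noise are assumptions, and a tanh nonlinearity stands in for the trained single-layer convolutional neural network of step S233:

```python
import math
import random

def encrypt(data, key, rng):
    # Assumed key wiring: digit 0 -> iteration count, digit 1 ->
    # fixed-matrix parameter, digits 2-3 -> noise mean and std. dev.
    rounds = int(key[0]) + 1
    a = int(key[1]) / 10.0 + 0.1
    mean, sd = int(key[2]) / 10.0, int(key[3]) / 100.0 + 0.01
    rows, cols = len(data), len(data[0])
    x = [row[:] for row in data]
    for _ in range(rounds):
        for i in range(rows):
            for j in range(cols):
                fixed = math.sin(a * (i + j))   # fixed-matrix entry (S231)
                noise = rng.gauss(mean, sd)     # random-matrix entry (S231)
                x[i][j] += fixed * noise        # superposed password (S232)
        # S233: the patent applies a trained single-layer CNN here; an
        # element-wise tanh stands in for that network in this sketch.
        x = [[math.tanh(v) for v in row] for row in x]
    return x

cipher = encrypt([[0.1, 0.2], [0.3, 0.4]], "3157", random.Random(42))
```

Each round's output feeds the next round, so the encrypted data depends jointly on every key digit, the random draws, and the stand-in network, mirroring the iteration structure of step S23.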
Further, in step S22, one-dimensional original data is treated as a two-dimensional data matrix with one row or one column, and original data of more than two dimensions is reduced to the form of a two-dimensional data matrix and mapped back to its original structure after the encryption step is completed.
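The dimension handling of step S22 can be illustrated as below; the plane-by-plane flattening of higher-dimensional data is one possible scheme, chosen here for simplicity, as the text does not fix one:

```python
def to_2d(data):
    # 1-D input becomes a one-row matrix; 3-D input is flattened
    # plane-by-plane into 2-D, remembering the shape so the result
    # can be mapped back after encryption.
    if data and not isinstance(data[0], list):
        return [list(data)], ("1d",)
    if data and data[0] and isinstance(data[0][0], list):
        flat = [row for plane in data for row in plane]
        return flat, ("3d", len(data))
    return [list(r) for r in data], ("2d",)

def from_2d(mat, shape):
    # Inverse mapping applied once the encryption step is complete.
    if shape[0] == "1d":
        return mat[0]
    if shape[0] == "3d":
        planes = shape[1]
        per = len(mat) // planes
        return [mat[k * per:(k + 1) * per] for k in range(planes)]
    return mat

m, s = to_2d([1, 2, 3])
```

The round trip `from_2d(to_2d(x))` recovers the original structure, which is the property step S22 relies on when remapping the encrypted result.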
Further, in step S231, the fixed-matrix generating function of each iteration is selected from linear, logarithmic, exponential, trigonometric and inverse trigonometric functions and other composite functions, and the random-matrix generating function of each iteration is selected from the normal distribution, F distribution, chi-square distribution and T distribution functions and other joint distribution functions.
Further, in the N-digit key, each selectable Arabic numeral 0-9 at each position maps to an executable value; these executable values respectively determine the number of iterations, the adjustable parameters of the fixed-matrix generating function of each iteration, and the mean and variance of the random-matrix generating function of each iteration.
The advantages of the invention are as follows:
The present invention provides a neural network model encryption protection system and method involving iteration and random encryption. Without affecting the structure or performance of the artificial neural network, the system and method embed a protective password, via the encryption step, into the input data used to train the neural network. During training of the artificial neural network, the statistical features carried by the protective password are embedded into the artificial neural network and into the single-layer convolutional neural network involved in the encryption mechanism. As a result, the trained artificial neural network cannot properly process input data without an embedded password, input data embedded with an incorrect password, or input data whose password was embedded by an incorrect encryption mechanism. Compared with the prior art, the present invention can embed a protective password into an artificial neural network model without significantly increasing the amount of computation or degrading network performance, so that after the model is published, no copying, secondary development or modification can remove the protective password, and destroying it degrades the model's performance or prevents it from producing valid output. The developer's equity in the model is thus protected, and technical control over the use and publication of the artificial neural network model is realized.
Description of the drawings
Fig. 1 is the system structure diagram of a neural network model encryption protection system involving iteration and random encryption provided by an embodiment of the present invention;
Fig. 2 is the encryption principle of the encryption module of the system, where CNN denotes the single-layer convolutional neural network;
Fig. 3 is the mapping from each digit of the encryption module's key to an executable value;
Fig. 4 is the method flow diagram of a neural network model encryption protection method involving iteration and random encryption provided by an embodiment of the present invention, where CNN denotes the single-layer convolutional neural network.
In the drawings: 100 - encryption protection system; 1 - data input module; 2 - encryption module; 3 - encrypted-data input module; 4 - artificial neural network model module; 5 - data output module; 21 - structure conversion module; 22 - iterative processing module; 221 - password generation module; 222 - password embedding module; 223 - single-layer convolutional neural network model module; 2211 - fixed-matrix generation module; 2212 - random-matrix generation module.
Specific embodiment
The core idea of deep learning artificial neural networks is to use the gradient back-propagation algorithm to adjust the network weights so that certain features contained in a known input data set are summarized into a statistically convergent conclusion, for the purpose of recognizing and judging unknown input data sets.
The training objective of an artificial neural network is to let the network, in an iterative manner, explore and accumulate statistics on the non-explicit feature combinations of a known input data set that allow the data set to be distinguished. Before the data enters network training, some feature engineering operations can therefore be performed to give the trained artificial neural network better performance.
In the encryption step of the neural network model encryption protection system disclosed by the invention, the input data is encrypted by the encryption module, and the password embedded into the input data is essentially a controllable noise. Because the data in the random matrix always obey the statistical distribution defined by the random-matrix generating function, and the password is embedded by superposition, the encryption does not affect the statistical features of the input data or of the random matrix. The encryption mode provided by the invention therefore does not destroy the statistical properties of the input data that are relevant to the recognition features, but adds an additional, specially defined statistical characteristic. In the training stage of the neural network, this additional statistical characteristic is learned by the artificial neural network and by the single-layer network in the encryption module, and is embedded in positions that cannot be probed directly, such as the weights of the artificial neural network and the single-layer network. If subsequent input data lacks this additional statistical property, the trained artificial neural network model produces erroneous judgments, and the trained single-layer artificial neural network model produces erroneous encryption, which further corrupts the judgment of the artificial neural network model.
The additional statistical characteristic that the disclosed encryption protection system adds to the input data only slightly increases the statistical features that the artificial neural network and the single-layer convolutional neural network must organize and summarize, but tests and theory show that this additional statistical feature is very easily learned by both networks. Moreover, because of the back-propagation algorithm, this learning consists entirely of adjustments to the weight parameters of the artificial neural network and the single-layer convolutional neural network; it is therefore completely blended into both networks, a part that cannot simply be split out.
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below with reference to embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
Embodiment 1
The system structure diagram of a neural network model encryption protection system 100 involving iteration and random encryption is shown in Fig. 1, and the encryption principle of its encryption module 2 is shown in Fig. 2. The encryption protection system 100 includes a data input module 1, an encryption module 2, an encrypted-data input module 3, an artificial neural network model module 4 and a data output module 5. The data input module 1 is signal-connected to the encryption module 2, the encryption module 2 is signal-connected to the encrypted-data input module 3, the encrypted-data input module 3 is signal-connected to the artificial neural network model module 4, and the artificial neural network model module 4 is signal-connected to the data output module 5.
Further, the data input module 1 provides original data D0 to the encryption module 2; the encryption module 2 encrypts the original data D0 provided by the data input module 1 and outputs encrypted data D'; the encrypted-data input module 3 receives the encrypted data D' output by the encryption module 2 and transmits D' to the artificial neural network model module 4; the artificial neural network model module 4 receives D' and performs calculations on it; and the data output module 5 post-processes the results calculated by the artificial neural network model module 4.
Further, in the network training stage, the artificial neural network model module 4 realizes network training through forward network calculation and backward error propagation; in the service stage, it obtains results through forward network calculation alone. Further, in the network training stage, the data output module 5 computes the loss function on the output of the artificial neural network model module 4, which the module uses to realize network training via the gradient back-propagation algorithm; in the service stage, the data output module 5 uses the output of the artificial neural network model module 4 to judge the actual function.
The encryption module 2 uses a group key S of N digits as the control quantity to encrypt all original data D0 provided by the data input module 1, thereby realizing data encryption; the length of key S is constrained by the encryption mode and the encryption requirements, and each of its N digits is an Arabic numeral selected from 0-9.
Further, the encryption module 2 includes a structure conversion module 21 and an iterative processing module 22; the structure conversion module 21 converts the original data D0 into a two-dimensional structure; the iterative processing module 22 receives the structure-converted original data D0 as the input data D1 of the first iteration T1, takes the output of the first iteration T1 as the input data D2 of the second iteration T2, and so on, generating the encrypted data D' after the iterations T1-Tn. The digit at a preset position of key S determines the number of iterations; for example, the digit at position 1 of key S may be chosen to determine the number of iterations.
Further, the iterative processing module 22 includes a password generation module 221, a password embedding module 222 and a single-layer convolutional neural network model module 223. For each iteration T1-Tn, the password generation module 221 generates a password s1-sn with the same structure as the input data D1-Dn, the password embedding module 222 embeds the password s1-sn into the input data D1-Dn by superposition, and the single-layer convolutional neural network model module 223 performs the network calculation on the password-embedded input data and outputs the result. The calculated result of each iteration T1-Tn-1 except the last is received by the password generation module as the input data of the next iteration, and the network calculation result of the final iteration Tn is the encrypted data D'.
Further, the password generation module 221 includes a fixed-matrix generation module 2211 and a random-matrix generation module 2212. For each iteration T1-Tn, the fixed-matrix generation module 2211 uses a fixed-matrix generating function M1-Mn to generate a fixed matrix m1-mn with the same structure as the input data D1-Dn, where the fixed-matrix generating function M1-Mn has several adjustable parameters determined by the digits at certain preset positions of key S; the random-matrix generation module 2212 uses a random-matrix generating function N1-Nn to randomly generate a random matrix n1-nn with the same structure as the input data D1-Dn, where the digits at other preset positions of key S determine the mean and variance of N1-Nn.
For example, the digits at positions 2-3 of key S may determine the adjustable parameters of the fixed-matrix generating function M1 of the first iteration T1, and the digits at positions 4-5 of key S may determine the mean and variance of the random-matrix generating function N1 of the first iteration T1, and so on.
For each iteration T1-Tn, the Hadamard product of the fixed matrix m1-mn and the random matrix n1-nn so generated is the password s1-sn to be embedded into the input data of the current iteration. Further, in the network training stage, while the artificial neural network model module 4 realizes network training through forward network calculation and backward error propagation, the single-layer convolutional neural network model module 223 likewise trains its own network parameters through forward network calculation and backward error propagation;
Further, when the structure conversion module 21 converts the raw data D0 into a two-dimensional structure: if the raw data D0 is one-dimensional, it is treated as a two-dimensional data matrix whose row count or column count is 1; if the raw data D0 has more than two dimensions, it is reduced to the form of a two-dimensional data matrix and, after the encryption step is complete, mapped back to its original structure.
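The structure conversion and its inverse can be sketched as follows; reducing higher-dimensional data to two dimensions by flattening all but the first axis is one possible choice, not mandated by the text:

```python
import numpy as np

def to_2d(data):
    """Convert raw data D0 of any dimensionality to a 2-D matrix, returning
    the original shape so the result can be mapped back after encryption."""
    arr = np.asarray(data)
    if arr.ndim == 1:
        return arr.reshape(1, -1), arr.shape          # row matrix (row count 1)
    if arr.ndim == 2:
        return arr, arr.shape                          # already two-dimensional
    return arr.reshape(arr.shape[0], -1), arr.shape    # flatten higher dimensions

def from_2d(mat, original_shape):
    """Map the encrypted 2-D matrix back to the original structure."""
    return mat.reshape(original_shape)
```

For example, a (2, 3, 4) tensor becomes a (2, 12) matrix for encryption and is restored afterwards.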
Further, the fixed-matrix generating function M1-Mn corresponding to each iteration T1-Tn is selected from one of linear functions, logarithmic functions, exponential functions, trigonometric functions, inverse trigonometric functions, or other composite functions.
Further, the random-matrix generating function N1-Nn corresponding to each iteration T1-Tn is selected from one of normal distribution functions, F distribution functions, chi-square distribution functions, t distribution functions, or other joint distribution functions.
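One password generation can be sketched as follows, choosing a linear fixed-matrix generating function and a normal distribution for the random matrix (each one of the options listed above), with parameters as they might be drawn from key digits; the position-dependent base matrix is an illustrative assumption:

```python
import numpy as np

def make_password(shape, fixed_params, mean, var, rng):
    """Generate the per-iteration password s as the Hadamard product of a
    fixed matrix m and a random matrix n, both matching the input-data shape."""
    a, b = fixed_params
    idx = np.indices(shape).sum(axis=0)        # simple position-dependent base
    m = a * idx + b                            # linear fixed-matrix function M
    n = rng.normal(mean, np.sqrt(var), shape)  # normal random-matrix function N
    return m * n                               # element-wise (Hadamard) product

rng = np.random.default_rng(0)
s = make_password((4, 4), (2, 7), 0.4, 0.2, rng)
```

The password therefore has the same structure as the input data of its iteration, as the text requires.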
Further, in the N-digit key S, each digit (an optional Arabic numeral 0-9) is mapped to an executable numerical value; these executable values respectively determine the number of iterations, the multiple adjustable parameters of the fixed-matrix generating function M1-Mn corresponding to each iteration T1-Tn, and the mean and variance of the random-matrix generating function N1-Nn corresponding to each iteration T1-Tn.
As shown in Figure 3 of the description, each optional Arabic numeral 0-9 in the key S is mapped to an executable numerical value C(p, q), where p ranges from 1 to N and q from 0 to 9; C(p, q) denotes the executable value obtained by mapping the digit q at the p-th position of the key S.
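A minimal sketch of such a mapping table; the actual executable values are not specified in the text, so a simple assumed formula fills the table:

```python
# Illustrative table C[p][q]: the executable value obtained when digit q
# appears at (1-based) position p of an N-digit key S.
N = 10  # assumed key length
C = [[(p + 1) * 0.1 + q * 0.01 for q in range(10)] for p in range(N)]

def executable_value(S: str, p: int) -> float:
    """Return C(p, q) for the digit q at 1-based position p of key S."""
    q = int(S[p - 1])
    return C[p - 1][q]
```

Because the value depends on both the digit and its position, the same digit contributes differently at different key positions.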
Embodiment 2
Referring to Figure 4 of the description, which shows a flow diagram of the neural network model encryption protection method involving iteration and random encryption, the method includes the following steps:
Step S1: provide raw data D0;
Step S2: encrypt the raw data D0 to generate encrypted data D';
Step S3: input the encrypted data D' to the artificial neural network model, which performs calculation on D' and obtains a result;
Step S4: output the calculated result.
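Steps S1-S4 can be sketched end to end; `encrypt_fn` and `model_fn` are hypothetical stand-ins for the step-S2 encryption and the artificial neural network model:

```python
def protected_inference(D0, encrypt_fn, model_fn):
    """Run the four-step pipeline: the model only ever sees encrypted data."""
    D_prime = encrypt_fn(D0)     # Step S2: encrypt raw data -> D'
    result = model_fn(D_prime)   # Step S3: model computes on encrypted data
    return result                # Step S4: output the result

# Trivial placeholders just to exercise the flow:
out = protected_inference([1.0, 2.0], lambda d: [x * 2 for x in d], sum)
```

The point of the arrangement is that the raw data D0 never reaches the model module directly.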
Wherein, step S2 is specifically included:
Step S21: provide a key S of N digits as the control quantity, dedicated to encrypting the raw data D0; the length of the key S is determined by the encryption mode and the security requirements; each of the N digits of the key S is selected from the Arabic numerals 0-9;
Step S22: convert the raw data D0 into a two-dimensional structure;
Step S23: take the structure-converted raw data D0 as the input data D1 of the first iteration T1, and the output of the first iteration T1 as the input data D2 of the second iteration T2, and so on; the encrypted data D' is generated after the iterations T1-Tn. The digit at a preset position of the key S is associated with the number of iterations; for example, the digit at position 1 of the key S may be associated with the number of iterations.
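The iteration chain of step S23 can be sketched as follows; `step` is a placeholder for one full iteration (password embedding plus single-layer convolution), and taking the iteration count from the first key digit follows the example above:

```python
import numpy as np

def encrypt(D0, key, step):
    """Chain the iterations T1..Tn: the output of each iteration becomes the
    input of the next; the final output is the encrypted data D'."""
    n_iterations = max(1, int(key[0]))   # digit at key position 1 (per the example)
    D = np.asarray(D0, dtype=float)
    for t in range(n_iterations):
        D = step(D, t)                   # T_{t+1}: one full iterative processing
    return D                             # encrypted data D'

# Exercising the chain with a trivial placeholder step:
D_prime = encrypt(np.ones((2, 2)), "3274159082", lambda D, t: D + t)
```

A real `step` would embed the password s for that iteration and apply the single-layer convolutional network, as steps S231-S233 below describe.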
Wherein, for each iteration T1-Tn, step S23 specifically includes:
Step S231: use the fixed-matrix generating function M1-Mn to generate a fixed matrix m1-mn with the same structure as the input data D1-Dn; the fixed-matrix generating function M1-Mn has multiple adjustable parameters, and the digits at certain preset positions of the key S are associated with those adjustable parameters. Use the random-matrix generating function N1-Nn to randomly generate a random matrix n1-nn with the same structure as the input data D1-Dn; the digits at other preset positions of the key S are associated with the mean and variance of the random-matrix generating function N1-Nn.
For example, the digits at positions 2-3 of the key S may be associated with the multiple adjustable parameters of the fixed-matrix generating function M1 of the first iteration T1, the digits at positions 4-5 of the key S with the mean and variance of the random-matrix generating function N1 of the first iteration T1, and so on.
For each iteration T1-Tn, the Hadamard product of the fixed matrix m1-mn and the random matrix n1-nn generated for that iteration is the password s1-sn to be embedded in the input data of the current iteration.
Step S232: embed the password s1-sn into the input data D1-Dn by superposition;
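A sketch of step S232, reading "superposition" as element-wise addition (an assumption; the text does not fix the operation beyond superposition):

```python
import numpy as np

def embed_password(D, s):
    """Embed the per-iteration password s into the input data D by
    superposition (element-wise addition, assumed here)."""
    D = np.asarray(D, dtype=float)
    return D + s

D1 = np.zeros((2, 2))
s1 = np.full((2, 2), 0.5)
D_embedded = embed_password(D1, s1)
```

Because the password has the same structure as the input data, the superposition needs no broadcasting or reshaping.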
Step S233: use the single-layer convolutional neural network to perform network calculation on the password-embedded input data, and output the calculation result; the calculation result of each non-final iteration T1-Tn-1 serves as the input data of the next iteration, and the network calculation result of the final iteration Tn is the encrypted data D'.
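The single-layer convolution of step S233 can be sketched directly in numpy; the kernel weights stand in for the trainable parameters of the single-layer convolutional neural network, and "same" zero padding is an assumption made so the output keeps the structure of the input data:

```python
import numpy as np

def single_layer_conv(X, kernel):
    """One 'same'-padded 2-D convolution standing in for the single-layer
    convolutional neural network of step S233."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    Xp = np.pad(X, ((ph, ph), (pw, pw)))           # zero padding keeps the shape
    out = np.zeros_like(X, dtype=float)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            out[i, j] = np.sum(Xp[i:i + kh, j:j + kw] * kernel)
    return out

# With an identity kernel the convolution passes the data through unchanged:
Y = single_layer_conv(np.ones((3, 3)),
                      np.array([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]]))
```

In the described system the kernel would be learned jointly with the downstream model during the training stage, rather than fixed as here.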
Further, in step S22, when the raw data D0 is one-dimensional, it is treated as a two-dimensional data matrix whose row count or column count is 1; when the raw data D0 has more than two dimensions, it is reduced to the form of a two-dimensional data matrix and, after the encryption step is complete, mapped back to its original structure.
Further, in step S231, the fixed-matrix generating function M1-Mn corresponding to each iteration T1-Tn is selected from one of linear functions, logarithmic functions, exponential functions, trigonometric functions, inverse trigonometric functions, or other composite functions.
Further, in step S231, the random-matrix generating function N1-Nn corresponding to each iteration T1-Tn is selected from one of normal distribution functions, F distribution functions, chi-square distribution functions, t distribution functions, or other joint distribution functions.
Further, in the N-digit key S, each digit (an optional Arabic numeral 0-9) is mapped to an executable numerical value; these executable values respectively determine the number of iterations, the multiple adjustable parameters of the fixed-matrix generating function M1-Mn corresponding to each iteration T1-Tn, and the mean and variance of the random-matrix generating function N1-Nn corresponding to each iteration T1-Tn.
As shown in Figure 3 of the description, each optional Arabic numeral 0-9 in the key S is mapped to an executable numerical value C(p, q), where p ranges from 1 to N and q from 0 to 9; C(p, q) denotes the executable value obtained by mapping the digit q at the p-th position of the key S.
Claims (10)
1. A neural network model encryption protection system involving iteration and random encryption, characterized in that: it comprises a data input module (1), an encryption module (2), an encrypted data input module (3), an artificial neural network model module (4), and a data output module (5); the data input module (1) is signal-connected with the encryption module (2), the encryption module (2) is signal-connected with the encrypted data input module (3), the encrypted data input module (3) is signal-connected with the artificial neural network model module (4), and the artificial neural network model module (4) is signal-connected with the data output module (5);
wherein the encryption module (2) comprises a structure conversion module (21) and an iterative processing module (22); the iterative processing module (22) comprises a secret generation module (221), a password embedding module (222), and a single-layer convolutional neural network model module (223); the secret generation module (221) comprises a fixed matrix generation module (2211) and a random matrix generation module (2212).
2. The neural network model encryption protection system involving iteration and random encryption according to claim 1, characterized in that: the data input module (1) is used to provide raw data to the encryption module (2); the encryption module (2) is used to encrypt the raw data provided by the data input module (1) and output encrypted data; the encrypted data input module (3) is used to receive the encrypted data output by the encryption module (2) and transmit the encrypted data to the artificial neural network model module (4); the artificial neural network model module (4) is used to receive the encrypted data and perform calculation based on the encrypted data; the data output module (5) is used to perform output processing on the result calculated by the artificial neural network model module (4);
further, in the network-training stage, the artificial neural network model module (4) realizes network training through forward network calculation and back-propagation of errors; in the service stage, the artificial neural network model module (4) obtains results through forward network calculation;
further, in the network-training stage, the data output module (5) performs a loss-function calculation on the output of the artificial neural network model module (4), which the artificial neural network model module (4) uses to realize network training via the gradient back-propagation algorithm; in the service stage, the data output module (5) uses the output of the artificial neural network model module (4) for the actual functional judgment.
3. The neural network model encryption protection system involving iteration and random encryption according to claim 1, characterized in that: the encryption module (2) uses a key of N digits as the control quantity to encrypt all raw data provided by the data input module (1), thereby realizing data encryption; the length of the key is determined by the encryption mode and the security requirements; each of the N digits of the key is selected from the Arabic numerals 0-9.
4. The neural network model encryption protection system involving iteration and random encryption according to claim 3, characterized in that: the structure conversion module (21) is used to convert the raw data into a two-dimensional structure; the iterative processing module (22) receives the structure-converted raw data as the input data of the first iteration, and takes the output of the first iteration as the input data of the second iteration, and so on; the encrypted data is generated after multiple iterations; wherein the digit at a preset position of the key is associated with the number of iterations.
5. The neural network model encryption protection system involving iteration and random encryption according to claim 4, characterized in that: for each iteration, the secret generation module (221) generates a password with the same structure as the input data; the password embedding module (222) is used to embed each password into the input data by superposition; the single-layer convolutional neural network model module (223) is used to perform network calculation on the password-embedded input data and output the calculation result; wherein the calculation result of each non-final iteration is received by the secret generation module (221) as the input data of the next iteration, and the network calculation result of the final iteration is the encrypted data.
6. The neural network model encryption protection system involving iteration and random encryption according to claim 5, characterized in that: for each iteration, the fixed matrix generation module (2211) generates, through a fixed-matrix generating function, a fixed matrix with the same structure as the input data, wherein the fixed-matrix generating function has multiple adjustable parameters and the digits at certain preset positions of the key are associated with those adjustable parameters; the random matrix generation module (2212) randomly generates, through a random-matrix generating function, a random matrix with the same structure as the input data, wherein the digits at other preset positions of the key are associated with the mean and variance of the random-matrix generating function; for each iteration, the Hadamard product of the fixed matrix and the random matrix generated for that iteration is the password to be embedded in the input data of the current iteration.
7. The neural network model encryption protection system involving iteration and random encryption according to claim 5, characterized in that: in the network-training stage, while the artificial neural network model module (4) realizes network training through forward network calculation and back-propagation of errors, the single-layer convolutional neural network model module (223) likewise trains its own network parameters through forward network calculation and back-propagation of errors.
8. The neural network model encryption protection system involving iteration and random encryption according to claim 6, characterized in that: in the N-digit key, each digit (an optional Arabic numeral 0-9) is mapped to an executable numerical value; these executable values respectively determine the number of iterations, the multiple adjustable parameters of the fixed-matrix generating function of each iteration, and the mean and variance of the random-matrix generating function of each iteration.
9. A neural network model encryption protection method involving iteration and random encryption, characterized in that the method comprises the following steps:
Step S1: provide raw data;
Step S2: encrypt the raw data to generate encrypted data;
Step S3: input the encrypted data to the artificial neural network model, which performs calculation on the encrypted data and obtains a result;
Step S4: output the calculated result;
wherein step S2 specifically comprises:
Step S21: provide a key of N digits as the control quantity, dedicated to encrypting the raw data; the length of the key is determined by the encryption mode and the security requirements; each of the N digits of the key is selected from the Arabic numerals 0-9;
Step S22: convert the raw data into a two-dimensional structure;
Step S23: take the structure-converted raw data as the input data of the first iteration, and the output of the first iteration as the input data of the second iteration, and so on; generate the encrypted data after multiple iterations; wherein the digit at a preset position of the key is associated with the number of iterations;
wherein, for each iteration, step S23 specifically comprises:
Step S231: use a fixed-matrix generating function to generate a fixed matrix with the same structure as the input data, wherein the fixed-matrix generating function has multiple adjustable parameters and the digits at certain preset positions of the key are associated with those adjustable parameters; use a random-matrix generating function to randomly generate a random matrix with the same structure as the input data, wherein the digits at other preset positions of the key are associated with the mean and variance of the random-matrix generating function; for each iteration, the Hadamard product of the fixed matrix and the random matrix generated for that iteration is the password to be embedded in the input data of the current iteration;
Step S232: embed the password into the input data by superposition;
Step S233: use a single-layer convolutional neural network to perform network calculation on the password-embedded input data and output the calculation result; wherein the calculation result of each non-final iteration serves as the input data of the next iteration, and the network calculation result of the final iteration is the encrypted data.
10. The neural network model encryption protection method involving iteration and random encryption according to claim 9, characterized in that: in the N-digit key, each digit (an optional Arabic numeral 0-9) is mapped to an executable numerical value; these executable values respectively determine the number of iterations, the multiple adjustable parameters of the fixed-matrix generating function of each iteration, and the mean and variance of the random-matrix generating function of each iteration.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810735833.0A CN108898028B (en) | 2018-07-06 | 2018-07-06 | Neural network model encryption protection system and method related to iteration and random encryption |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108898028A true CN108898028A (en) | 2018-11-27 |
CN108898028B CN108898028B (en) | 2020-07-03 |
Family
ID=64348224
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810735833.0A Active CN108898028B (en) | 2018-07-06 | 2018-07-06 | Neural network model encryption protection system and method related to iteration and random encryption |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108898028B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1881874A (en) * | 2006-04-26 | 2006-12-20 | 集美大学 | Public key cipher encrypting and decrypting method based on nerval network chaotic attractor |
CN102123026A (en) * | 2011-04-12 | 2011-07-13 | 南开大学 | Chaos and hyperchaos based two-level video streaming media encryption method |
CN104104496A (en) * | 2014-07-08 | 2014-10-15 | 华侨大学 | One-way Harsh function construction method based on chaotic dynamics theory |
US20160350648A1 (en) * | 2014-11-07 | 2016-12-01 | Microsoft Technology Licensing, Llc. | Neural networks for encrypted data |
US9740817B1 (en) * | 2002-10-18 | 2017-08-22 | Dennis Sunga Fernandez | Apparatus for biological sensing and alerting of pharmaco-genomic mutation |
CN107204844A (en) * | 2017-04-21 | 2017-09-26 | 中山大学 | A kind of encrypted multimedia and decryption method based on combination cellular automaton |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110045227A (en) * | 2019-03-23 | 2019-07-23 | 广西电网有限责任公司电力科学研究院 | A kind of Fault Diagnosis Method for Distribution Networks based on random matrix and deep learning |
CN110045227B (en) * | 2019-03-23 | 2019-12-17 | 广西电网有限责任公司电力科学研究院 | power distribution network fault diagnosis method based on random matrix and deep learning |
CN110457951A (en) * | 2019-08-19 | 2019-11-15 | 南京大学 | A kind of deep learning model protection method of prosthetic noise |
CN110674941A (en) * | 2019-09-25 | 2020-01-10 | 南开大学 | Data encryption transmission method and system based on neural network |
CN110674941B (en) * | 2019-09-25 | 2023-04-18 | 南开大学 | Data encryption transmission method and system based on neural network |
US11495145B2 (en) | 2019-09-27 | 2022-11-08 | Wipro Limited | Method and system for selectively encrypting dataset |
CN110795726A (en) * | 2019-10-23 | 2020-02-14 | 成都索贝数码科技股份有限公司 | Password protection method and system based on artificial neural network |
CN113468544A (en) * | 2020-03-30 | 2021-10-01 | 杭州海康威视数字技术股份有限公司 | Training method and device of application model |
CN113468544B (en) * | 2020-03-30 | 2024-04-05 | 杭州海康威视数字技术股份有限公司 | Training method and device for application model |
CN111488602A (en) * | 2020-04-16 | 2020-08-04 | 支付宝(杭州)信息技术有限公司 | Data object privacy protection method and device and electronic equipment |
CN111581671A (en) * | 2020-05-11 | 2020-08-25 | 笵成科技南京有限公司 | Digital passport protection method combining deep neural network and block chain |
CN112395635B (en) * | 2021-01-18 | 2021-05-04 | 北京灵汐科技有限公司 | Image processing method, device, secret key generating method, device, training method and device, and computer readable medium |
CN112395635A (en) * | 2021-01-18 | 2021-02-23 | 北京灵汐科技有限公司 | Image processing method, device, secret key generating method, device, training method and device, and computer readable medium |
WO2022152153A1 (en) * | 2021-01-18 | 2022-07-21 | 北京灵汐科技有限公司 | Image processing method and device, key generation method and device, training method, and computer readable medium |
CN112395636A (en) * | 2021-01-19 | 2021-02-23 | 国网江西省电力有限公司信息通信分公司 | Power grid data encryption model training method, system, storage medium and equipment |
CN112395636B (en) * | 2021-01-19 | 2021-07-30 | 国网江西省电力有限公司信息通信分公司 | Power grid data encryption model training method, system, storage medium and equipment |
CN113014570A (en) * | 2021-02-22 | 2021-06-22 | 西安理工大学 | Communication data encryption and decryption method based on convolutional neural network |
CN113014570B (en) * | 2021-02-22 | 2022-08-02 | 西安理工大学 | Communication data encryption and decryption method based on convolutional neural network |
Also Published As
Publication number | Publication date |
---|---|
CN108898028B (en) | 2020-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108898028A (en) | It is related to the neural network model encryption protection system and method for iteration and accidental enciphering | |
CN108920981A (en) | It is related to the neural network model encryption protection system and method for data iterative cryptographic | |
CN108629193A (en) | A kind of encryption protection system and method for artificial nerve network model | |
Zhou et al. | PassBio: Privacy-preserving user-centric biometric authentication | |
CN108830092A (en) | It is related to the neural network model encryption protection system and method for data accidental enciphering | |
TWI719635B (en) | Safe feature engineering method and device | |
Shihab | A backpropagation neural network for computer network security | |
CN113298268B (en) | Vertical federal learning method and device based on anti-noise injection | |
Lytvyn et al. | Information encryption based on the synthesis of a neural network and AES algorithm | |
CN112597519B (en) | Non-key decryption method based on convolutional neural network in OFDM encryption system | |
Zapechnikov | Privacy-preserving machine learning as a tool for secure personalized information services | |
CN108804931B (en) | Neural network model encryption protection system and method related to domain transformation data encryption | |
CN113221153A (en) | Graph neural network training method and device, computing equipment and storage medium | |
Meng et al. | Fedmonn: meta operation neural network for secure federated aggregation | |
Li | Combination of blockchain and AI for music intellectual property protection | |
CN108900294A (en) | It is related to the neural network model encryption protection system and method for designated frequency band encryption | |
CN111783109A (en) | Data query method, system and storage medium | |
CN111784337A (en) | Authority verification method and system | |
CN114003744A (en) | Image retrieval method and system based on convolutional neural network and vector homomorphic encryption | |
Khavalko et al. | Application of neural network technologies for information protection in real time | |
Peleshchak et al. | Two-stage AES encryption method based on stochastic error of a neural network | |
CN115865302A (en) | Multi-party matrix multiplication method with privacy protection attribute | |
CN111698284A (en) | Block chain-based computer encryption system and method | |
Kim et al. | Deep neural networks based key concealment scheme | |
Noaman et al. | Data security based on neural networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||