CN110071798A - Equivalent key acquisition method, apparatus and computer-readable storage medium - Google Patents

Equivalent key acquisition method, apparatus and computer-readable storage medium

Info

Publication number
CN110071798A
Authority
CN
China
Prior art keywords
neural network
data set
training
equivalent key
network model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910216928.6A
Other languages
Chinese (zh)
Other versions
CN110071798B (en)
Inventor
何文奇
盘水新
彭翔
韩本年
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Original Assignee
Shenzhen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201910216928.6A
Publication of CN110071798A
Application granted
Publication of CN110071798B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B10/00 - Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/80 - Optical aspects relating to the use of optical transmission for specific applications, not provided for in groups H04B10/03 - H04B10/70, e.g. optical power feeding or optical transmission through water
    • H04B10/85 - Protection from unauthorised access, e.g. eavesdrop protection
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08 - Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0861 - Generation of secret information including derivation or calculation of cryptographic keys or passwords
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L2209/00 - Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
    • H04L2209/12 - Details relating to cryptographic hardware or logic circuitry

Abstract

Embodiments of the present invention disclose an equivalent key acquisition method, an apparatus and a computer-readable storage medium. A preset training data set is obtained, the training data set comprising combinations of multiple plaintext images and their corresponding ciphertext sequences; a neural network is trained on the training data set to obtain a trained neural network model; and the neural network model is determined as an equivalent key, the equivalent key being used for security analysis of an optical encryption system. Through the implementation of the invention, a series of known ciphertext-plaintext pairs is fed into a neural network structure for training, so that the mapping between the ciphertext and the plaintext of the optical encryption system is learned and used as an equivalent key. This improves the efficiency and accuracy of security analysis, and since it does not matter whether the encryption system applies additional random scrambling or other secondary encryption to the ciphertext sequence, the range of applicable scenarios is extended.

Description

Equivalent key acquisition method, apparatus and computer-readable storage medium
Technical field
The present invention relates to the field of optical cryptography, and in particular to an equivalent key acquisition method, an apparatus and a computer-readable storage medium.
Background technique
Optical image encryption is a novel encryption technology distinct from traditional mathematical encryption. Compared with traditional mathematical techniques, optical image encryption offers the advantages of multiple dimensions, large capacity, high robustness and inherently parallel data processing, so the technology has attracted wide attention and sustained development in recent years.
However, as with any cryptosystem, its security should be the first concern: rigorous cryptanalysis must be carried out to confirm whether the system is reliable. At present, security analysis of an optical encryption system in the related art usually depends on the geometric parameters and internal structure of the system, obtaining the system key through plaintext attacks. The attack efficiency and accuracy are low, and the approach applies only when the encryption system performs no additional random scrambling or other secondary encryption on the ciphertext sequence, so the applicable scenarios are rather limited.
Summary of the invention
The main purpose of the embodiments of the present invention is to provide an equivalent key acquisition method, an apparatus and a computer-readable storage medium, so as to at least solve the problems in the related art that security analysis of an optical encryption system depends on the geometric parameters and structure of the system, that the analysis efficiency and accuracy are low, and that the applicable scenarios are limited.
To achieve the above object, a first aspect of the embodiments of the present invention provides an equivalent key acquisition method, the method comprising:
obtaining a preset training data set, the training data set comprising combinations of multiple plaintext images and their corresponding ciphertext sequences;
training a neural network based on the training data set to obtain a trained neural network model; and
determining the neural network model as an equivalent key, the equivalent key being used for security analysis of an optical encryption system.
To achieve the above object, a second aspect of the embodiments of the present invention provides an equivalent key acquisition apparatus, the apparatus comprising:
an acquisition module for obtaining a preset training data set, the training data set comprising combinations of multiple plaintext images and their corresponding ciphertext sequences;
a training module for training a neural network based on the training data set to obtain a trained neural network model; and
a determination module for determining the neural network model as an equivalent key, the equivalent key being used for security analysis of an optical encryption system.
To achieve the above object, a third aspect of the embodiments of the present invention provides an electronic device, the device comprising: a processor, a memory and a communication bus;
the communication bus is used to implement connection and communication between the processor and the memory;
the processor is used to execute one or more programs stored in the memory, so as to implement the steps of any one of the above equivalent key acquisition methods.
To achieve the above object, a fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors so as to implement the steps of any one of the above equivalent key acquisition methods.
According to the equivalent key acquisition method, apparatus and computer-readable storage medium provided by the embodiments of the present invention, a preset training data set is obtained, the training data set comprising combinations of multiple plaintext images and their corresponding ciphertext sequences; a neural network is trained on the training data set to obtain a trained neural network model; and the neural network model is determined as an equivalent key used for security analysis of an optical encryption system. Through the implementation of the invention, a series of known ciphertext-plaintext pairs is fed into a neural network structure for training, so that the mapping between the ciphertext and the plaintext of the optical encryption system is learned and used as an equivalent key. This improves the efficiency and accuracy of security analysis, and since it does not matter whether the encryption system applies additional random scrambling or other secondary encryption to the ciphertext sequence, the range of applicable scenarios is extended.
Other features of the invention and their corresponding effects are described in later parts of the specification, and it should be understood that at least some of these effects will become apparent from the description.
Detailed description of the invention
To explain the technical solutions in the embodiments of the present invention or the prior art more clearly, the drawings needed for the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the basic procedure of the equivalent key acquisition method provided by the first embodiment of the present invention;
Fig. 2 is a schematic diagram of optical encryption provided by the first embodiment of the present invention;
Fig. 3 is a training schematic diagram of the deep neural network model provided by the first embodiment of the present invention;
Fig. 4 is a schematic diagram of dropout regularization of a neural network provided by the first embodiment of the present invention;
Fig. 5 is a schematic structural diagram of the equivalent key acquisition apparatus provided by the second embodiment of the present invention;
Fig. 6 is a schematic structural diagram of the electronic device provided by the third embodiment of the present invention.
Specific embodiment
To make the purpose, features and advantages of the invention clearer and easier to understand, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
First embodiment:
To solve the technical problems in the related art that security analysis of an optical encryption system depends on the geometric parameters and structure of the system, that the analysis efficiency and accuracy are low, and that the applicable scenarios are limited, this embodiment proposes an equivalent key acquisition method. Fig. 1 is a schematic flowchart of the basic procedure of the equivalent key acquisition method provided by this embodiment; the method includes the following steps:
Step 101: obtain a preset training data set; the training data set includes combinations of multiple plaintext images and their corresponding ciphertext sequences.
Specifically, the present invention follows Kerckhoffs' principle, the generally accepted basic norm of cryptanalysis, and adopts a chosen-plaintext attack scheme: the attacker is assumed to be able to select a series of given plaintexts and to learn their corresponding ciphertexts. It should be noted that in this embodiment the neural network is trained under a supervised learning framework, so a training data set needs to be constructed and the neural network is trained on the multiple training samples it contains. Each training sample of this embodiment is a 'plaintext-ciphertext' pair, i.e. each specific plaintext image corresponds to one ciphertext sequence.
Optionally, obtaining the preset training data set includes: loading random phase masks on a spatial light modulator and encoding the light source passing through the spatial light modulator; encrypting each of the multiple plaintext images to be encrypted with the encoded light to obtain the ciphertext sequence corresponding to each plaintext image; and forming the training data set from the combinations of all plaintext images and their corresponding ciphertext sequences.
Specifically, when two parties communicate, the first party sends a piece of encrypted private information to the second party, the two sharing a key Key, which is a one-dimensional sequence {S_i} (i = 1, ..., N) of N numbers. Before encryption, the first party uses the N elements of {S_i} as seeds to generate N random phase masks. The relationship between {S_i} and the random phase masks is specific and known only to the two communicating parties; the phase values are uniformly distributed on [0, 2π], i.e. each random phase mask corresponds to one key component of {S_i}. It should be noted that, as a preferred implementation of this embodiment, the optical encryption system may be a computational ghost imaging optical encryption system. Fig. 2 is the schematic diagram of optical encryption provided by this embodiment: a beam emitted by a laser (Laser) passes through a spatial light modulator (Spatial Light Modulator, SLM), on which the N random phase masks (Random Phase Mask, RPM) are loaded in turn to encode the light. After Fresnel diffraction over a distance Z, the encoded beam illuminates the plaintext (Plaintext) image T(x, y) to be encrypted and is then detected by a single-pixel bucket detector (Bucket Detector, BD), which records the corresponding intensity value. Repeating this operation N times for the N different phase masks records N intensity values B_i, which form a one-dimensional sequence used as the ciphertext (Ciphertext). Generally, the larger N is, the better the reconstruction quality.
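The encryption procedure above can be sketched in a few lines of Python. This is an illustrative simplification, not the patented system: the Fresnel diffraction of the phase-modulated beam is replaced by pseudo-random intensity patterns seeded from the key components S_i, and the bucket detector is modeled as a plain sum; the function name `encrypt_cgi` and all sizes are assumptions made for the sketch.

```python
import random

def encrypt_cgi(plaintext, seeds):
    """Ghost-imaging-style encryption sketch: each seed S_i deterministically
    generates an illumination pattern, and the 'bucket detector' value B_i is
    the total intensity transmitted by the plaintext under that pattern."""
    n_pixels = len(plaintext)
    ciphertext = []
    for s in seeds:
        rng = random.Random(s)  # pattern derived from one key component S_i
        pattern = [rng.random() for _ in range(n_pixels)]
        b = sum(p * t for p, t in zip(pattern, plaintext))  # single-pixel measurement
        ciphertext.append(b)
    return ciphertext

# Toy 4x4 plaintext flattened to 16 values, N = 32 measurements.
plaintext = [1.0 if i % 5 == 0 else 0.0 for i in range(16)]
seeds = list(range(32))
cipher = encrypt_cgi(plaintext, seeds)
assert len(cipher) == len(seeds)                # one intensity value per mask
assert cipher == encrypt_cgi(plaintext, seeds)  # deterministic given shared seeds
```

Note how the ciphertext length equals N, matching the one-dimensional sequence of N intensity values described above.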
Step 102: train a neural network based on the training data set to obtain a trained neural network model.
Specifically, in this embodiment the neural network model is trained on the constructed training data set under a specific training environment using a chosen optimization algorithm, where the learning rate and the number of training iterations are set according to actual needs and are not uniquely limited here. During training, the ciphertext sequences in the training data set are the inputs of the neural network model, and the corresponding plaintext images serve as the output constraints. By iterating over all 'plaintext-ciphertext' pairs of the training data set, the training parameters of the neural network model are determined, yielding the trained neural network model.
Optionally, the neural network is any one of a deep neural network (DNN), a convolutional neural network (CNN) and a recurrent neural network (RNN).
Specifically, in practical applications different types of neural networks can be selected according to different usage scenarios to train the neural network model for the security analysis of the optical encryption system. Since a DNN handles one-dimensional data well, a DNN is preferably selected to train the neural network model in this embodiment.
Optionally, when the neural network is a deep neural network, the deep neural network includes one input layer, three hidden layers, one reshaping layer and one output layer, and the neurons of adjacent layers are fully connected.
Fig. 3 is the training schematic diagram of the deep neural network model provided by this embodiment. It should be noted that in this embodiment the number of individual neurons in the input layer (Input Layer), the three hidden layers (Hidden Layer), the output layer (Output Layer) and the reshaping layer (Reshaping Layer) can be determined by the length of a single ciphertext sequence. For example, if the plaintext image is 28*28 pixels, the corresponding ciphertext sequence is 1*784, and the number of neurons per layer is set to 784. In one training pass, a 1*784 ciphertext sequence is first fed into the input layer, the sequence then passes in turn through the input layer, the three hidden layers and the output layer, and the 1*784 sequence is finally adjusted by the reshaping layer into a 28*28-pixel output image.
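A minimal forward pass through such a fully connected network can be sketched as follows, scaled down to an 8x8 image (ciphertext length 64) so it runs quickly in plain Python; the layer widths, the random weight initialization and the function names are illustrative assumptions, not the embodiment's actual configuration.

```python
import math
import random

rng = random.Random(0)

def dense_layer(size_in, size_out):
    """One fully connected layer: weight matrix W_k and bias vector b_k
    (the training parameters of the network)."""
    w = [[rng.gauss(0.0, 0.1) for _ in range(size_out)] for _ in range(size_in)]
    b = [0.0] * size_out
    return w, b

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, layers):
    """Apply y_{k+1} = sigmoid(x . W_k + b_k) layer by layer."""
    for w, b in layers:
        x = [sigmoid(sum(xi * w[i][j] for i, xi in enumerate(x)) + b[j])
             for j in range(len(b))]
    return x

# Scaled-down sketch: an 8x8 plaintext gives a ciphertext of length 64,
# so every dense layer is 64 neurons wide here (784 in the 28*28 case).
n = 64
network = [dense_layer(n, n) for _ in range(5)]      # input, 3 hidden, output
ciphertext = [rng.random() for _ in range(n)]
out = forward(ciphertext, network)
image = [out[r * 8:(r + 1) * 8] for r in range(8)]   # reshaping layer: 1x64 -> 8x8
assert len(image) == 8 and all(len(row) == 8 for row in image)
assert all(0.0 < v < 1.0 for v in out)               # sigmoid output range
```

The final slicing step plays the role of the reshaping layer, converting the one-dimensional output back into an image.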
Optionally, training the neural network based on the training data set to obtain the trained neural network model includes: inputting a ciphertext sequence into the M-th deep neural network model and, after it passes in turn through the input layer, the three hidden layers, the output layer and the reshaping layer, obtaining the actual output plaintext image of the M-th training iteration, M being a positive integer greater than or equal to 1; comparing the actual output plaintext image with the original plaintext image in the training data set using a preset loss function; when the comparison result meets a preset convergence condition, determining the M-th deep neural network model as the trained deep neural network model; and when the comparison result does not meet the preset convergence condition, continuing with the (M+1)-th training iteration until the convergence condition is met.
Specifically, still referring to Fig. 3, after the training data are input into the aforementioned deep neural network and the deep neural network model is trained, the output image is compared with the original plaintext image, and the loss function (Loss Function) is used to judge whether the output image meets the requirements. If not, the 'plaintext-ciphertext' pairs in the training data set are used to continue iteratively optimizing the parameters of the deep neural network model, until the model can convert any ciphertext in the training set into an output image that meets the requirements, i.e. an output image sufficiently similar to the corresponding plaintext image. Following this procedure, the samples in the training set are trained in turn; completing the training of all samples (i.e. all 'plaintext-ciphertext' pairs) in the training set constitutes one full training pass. In this embodiment the whole process is repeated and iteratively optimized until the set number of iterations is reached or the preset criterion of the loss function (the convergence condition) is met, which completes the training of the deep neural network model.
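The iterate-until-converged loop above can be illustrated with a deliberately tiny stand-in model, a single parameter w in place of the full network; the data, learning rate and tolerance are invented for the sketch, but its structure (compute the MAE loss over all pairs, stop when the convergence condition is met, otherwise update the parameters and repeat) mirrors the procedure described.

```python
def train_until_converged(pairs, lr=0.05, tol=1e-3, max_epochs=200):
    """Iterate over all 'plaintext-ciphertext' pairs until the MAE loss
    meets the convergence condition or the iteration budget is spent.
    A single parameter w (model: y = w * x) stands in for the network."""
    w = 0.0
    loss = float("inf")
    for _ in range(max_epochs):
        loss = sum(abs(y - w * x) for x, y in pairs) / len(pairs)  # MAE over the set
        if loss < tol:                       # preset convergence condition
            break
        for x, y in pairs:                   # one full pass over the training set
            err = w * x - y
            w -= lr * x * (1.0 if err > 0 else -1.0)  # subgradient of |w*x - y|
    return w, loss

pairs = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # invented data, true mapping y = 2x
w, loss = train_until_converged(pairs)
assert abs(w - 2.0) < 0.2   # parameter close to the true mapping
assert loss < 1.0           # loss far below its initial value of 4.0
```

With a fixed step size the parameter settles into a small band around the true mapping, which is why real training pairs the MAE loss with an adaptive optimizer.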
Optionally, the loss function is the mean absolute error function, expressed as:
Δ = (1/n) · Σ_{i=1}^{n} |y'_i - y_i|
where y'_i denotes the i-th element of the true value corresponding to the original plaintext image, y_i denotes the i-th element of the output value corresponding to the actual output plaintext image, and Δ denotes the mean absolute error between the output value and the true value.
Specifically, the mean absolute error (Mean Absolute Error, MAE) function is selected as the loss function in this embodiment because, requiring no squaring operation, it is more robust than the mean squared error (Mean Squared Error, MSE) function.
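The robustness argument can be seen numerically: with one outlier pixel in the output, squaring amplifies the outlier's contribution to MSE far more than its contribution to MAE. The values below are invented for illustration.

```python
def mae(y_true, y_pred):
    """Mean absolute error: delta = (1/n) * sum(|y'_i - y_i|)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean squared error, for comparison."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0.0, 0.0, 0.0, 0.0]
y_pred = [0.1, 0.1, 0.1, 4.0]   # one outlier pixel
assert abs(mae(y_true, y_pred) - 1.075) < 1e-9
assert mse(y_true, y_pred) > 3 * mae(y_true, y_pred)  # squaring amplifies the outlier
```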
It should be noted that in the deep neural network model each neuron applies an activation function to its input information, which is then transmitted through weighted connections to produce the corresponding nonlinear response output. In this embodiment the sigmoid function can be selected as the activation function, expressed as y_{k+1} = sigmoid(B_i · W_k + b_k), where k denotes the layer index, B_i denotes the input of a neuron in layer k, y_{k+1} denotes the output response of that neuron, W_k denotes the weight of that neuron, and b_k denotes its bias; the weights and biases are collectively called the training parameters of the neural network.
It should also be noted that in this embodiment the loss function of the deep neural network model is used to judge whether the output image meets the requirements; if not, the network parameters need to be optimized. The present invention uses the adaptive moment estimation (Adaptive Moment Estimation, Adam) algorithm, a stochastic-gradient-descent optimization algorithm, to compute the gradient of the loss function over the entire training data set, using the first-order and second-order moment estimates of the gradient to adaptively adjust the learning rate of each network parameter. Since different parameters receive different adaptive learning rates, the computation is greatly reduced and the training convergence of the deep neural network is accelerated.
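A single-parameter sketch of the Adam update rule described above (first- and second-moment estimates of the gradient with bias correction, assuming the commonly used default hyperparameters β1 = 0.9, β2 = 0.999); the toy objective (x - 3)^2 is invented for illustration.

```python
import math

def adam_minimize(grad, x, steps, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One-parameter Adam: adapt the step from first/second moment
    estimates of the gradient, with bias correction."""
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g      # first moment estimate (mean of gradients)
        v = b2 * v + (1 - b2) * g * g  # second moment estimate (uncentered variance)
        m_hat = m / (1 - b1 ** t)      # bias-corrected moments
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Toy objective (x - 3)^2 with gradient 2(x - 3): Adam walks x toward 3.
x_final = adam_minimize(lambda x: 2.0 * (x - 3.0), x=0.0, steps=1000)
assert abs(x_final - 3.0) < 0.5
```

Because the step is normalized by the second-moment estimate, the effective step size adapts per parameter rather than being fixed by a single global learning rate.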
Optionally, training the neural network based on the training data set includes: using the dropout regularization method to randomly drop preset neurons in each layer of the standard neural network, reducing the number of neuron connections; and training the neural network with the dropped neurons based on the training data set.
Fig. 4 is the schematic diagram of dropout regularization of a neural network provided by this embodiment. Specifically, during the training of the neural network model, an overly large training set and too many training iterations may cause the neural network system to overfit. To avoid this, the present invention further applies the dropout regularization (Dropout Regularization) technique: for each layer of the standard neural network (Standard Neural Net), some units are dropped at random. Its main function is to temporarily and randomly discard the connections of a portion of the neurons during training and learning, reducing the neuron scale and thereby preventing overfitting.
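Inverted dropout, one common way to implement the regularization described above, can be sketched as follows; the drop probability and layer size are illustrative assumptions.

```python
import random

def dropout(activations, p, rng, training=True):
    """Inverted dropout: during training each unit is kept with probability
    1 - p and its output is scaled by 1/(1 - p); at inference time the
    layer is passed through unchanged."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(42)
acts = [1.0] * 1000
dropped = dropout(acts, p=0.5, rng=rng)
zeros = sum(1 for a in dropped if a == 0.0)
assert 400 < zeros < 600                                      # about half dropped
assert dropout(acts, p=0.5, rng=rng, training=False) == acts  # inference untouched
```

The 1/(1 - p) scaling keeps the expected activation of each unit unchanged, so no rescaling is needed at test time.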
Optionally, after the trained neural network model is obtained, the method further includes: obtaining a preset test data set, the test data set including combinations of multiple plaintext images and their corresponding ciphertext sequences; inputting the ciphertext sequences of the test data set into the trained neural network model to obtain the test output plaintext images; computing the correlation between the test output plaintext images and the plaintext images in the test data set; and, when the correlation is greater than a preset correlation threshold, determining the trained neural network model as a valid neural network model.
Specifically, in this embodiment, after the neural network model is trained, a test data set is also used to verify the validity of the neural network model: the ciphertext sequences of the test data set are input into the trained neural network model, and its validity is determined by comparing the correlation between the test plaintext images it outputs and the original plaintext images. When the correlation between the test data and the original data is greater than a preset threshold, the trained neural network model is determined to be a valid, correct model, and the valid neural network model is determined as the equivalent key; otherwise, the construction of the trained neural network model is in error, and the neural network model must be rebuilt.
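This validity check can be sketched with the Pearson correlation coefficient as the correlation measure; the patent does not specify which measure is used, so both the measure and the threshold value below are assumptions, and the images are invented toy data.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two flattened images."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

original = [0.0, 1.0, 0.0, 1.0, 1.0, 0.0]     # flattened plaintext (invented)
good_output = [0.1, 0.9, 0.0, 1.0, 0.8, 0.2]  # close reconstruction
bad_output = [0.5, 0.4, 0.6, 0.5, 0.4, 0.6]   # poor reconstruction

THRESHOLD = 0.8                               # hypothetical correlation threshold
assert abs(pearson(original, original) - 1.0) < 1e-12
assert pearson(original, good_output) > THRESHOLD  # model accepted as valid
assert pearson(original, bad_output) < THRESHOLD   # model rejected, rebuild needed
```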
Step 103: determine the neural network model as the equivalent key; the equivalent key is used for security analysis of the optical encryption system.
Specifically, the neural network model in this embodiment can be regarded as the equivalent key of the optical encryption system; an equivalent key is a key that can be used to recover the plaintext from the ciphertext. It should be noted that the equivalent key acquisition method of the present invention requires no knowledge of the geometric parameters and internal structure of the computational optical encryption system, regardless of whether the encryption system has applied additional random scrambling or other common secondary encryption to the ciphertext sequence. Moreover, the attack method remains effective if the ciphertext is down-sampled during training (reducing training time and improving attack efficiency), and also remains effective if a small amount of noise is introduced when cracking subsequent ciphertexts.
According to the equivalent key acquisition method provided by the embodiments of the present invention, a preset training data set is obtained, the training data set comprising combinations of multiple plaintext images and their corresponding ciphertext sequences; a neural network is trained on the training data set to obtain a trained neural network model; the neural network model is determined as an equivalent key; and the equivalent key is used for security analysis of an optical encryption system. Through the implementation of the invention, a series of known ciphertext-plaintext pairs is fed into a neural network structure for training, so that the mapping between the ciphertext and the plaintext of the optical encryption system is obtained and used as an equivalent key. This improves the efficiency and accuracy of security analysis, and since it does not matter whether the encryption system applies additional random scrambling or other secondary encryption to the ciphertext sequence, the range of applicable scenarios is extended.
Second embodiment:
To solve the technical problems in the related art that security analysis of an optical encryption system depends on the geometric parameters and structure of the system, that the analysis efficiency and accuracy are low, and that the applicable scenarios are limited, this embodiment provides an equivalent key acquisition apparatus. Referring to Fig. 5, the equivalent key acquisition apparatus of this embodiment includes:
an acquisition module 501 for obtaining a preset training data set, the training data set including combinations of multiple plaintext images and their corresponding ciphertext sequences;
a training module 502 for training a neural network based on the training data set to obtain a trained neural network model; and
a determination module 503 for determining the neural network model as an equivalent key, the equivalent key being used for security analysis of an optical encryption system.
Specifically, the training data set includes multiple training samples; each training sample of this embodiment is a 'plaintext-ciphertext' pair, each specific plaintext image corresponding to one ciphertext sequence. During training, the ciphertext sequences in the training data set are the inputs of the neural network model, and the corresponding plaintext images serve as the output constraints; by iterating over all 'plaintext-ciphertext' pairs of the training data set, the training parameters of the neural network model are determined, yielding the trained neural network model. The neural network model in this embodiment can be regarded as the equivalent key of the optical encryption system; an equivalent key is a key that can be used to recover the plaintext from the ciphertext. It should be noted that, as a preferred implementation of this embodiment, the optical encryption system may be a computational ghost imaging optical encryption system.
In some implementations of this embodiment, the acquisition module 501 is specifically used to load random phase masks on a spatial light modulator and encode the light source passing through the spatial light modulator; encrypt each of the multiple plaintext images to be encrypted with the encoded light to obtain the ciphertext sequence corresponding to each plaintext image; and form the training data set from the combinations of all plaintext images and their corresponding ciphertext sequences.
In some implementations of this embodiment, the equivalent key acquisition apparatus further includes a test module for obtaining a preset test data set, the test data set including combinations of multiple plaintext images and their corresponding ciphertext sequences; inputting the ciphertext sequences of the test data set into the trained neural network model to obtain the test output plaintext images; computing the correlation between the test output plaintext images and the plaintext images in the test data set; and, when the correlation is greater than a preset correlation threshold, determining the trained neural network model as a valid neural network model.
In some implementations of this embodiment, the neural network is any one of a deep neural network (DNN), a convolutional neural network (CNN) and a recurrent neural network (RNN).
Further, when the neural network is a deep neural network, the deep neural network includes one input layer, three hidden layers, one reshaping layer and one output layer, and the neurons of adjacent layers are fully connected.
In some implementations of this embodiment, the training module 502 is specifically used to input a ciphertext sequence into the M-th deep neural network model and, after it passes in turn through the input layer, the three hidden layers, the output layer and the reshaping layer, obtain the actual output plaintext image of the M-th training iteration, M being a positive integer greater than or equal to 1; compare the actual output plaintext image with the original plaintext image in the training data set using a preset loss function; when the comparison result meets a preset convergence condition, determine the M-th deep neural network model as the trained deep neural network model; and when the comparison result does not meet the preset convergence condition, continue with the (M+1)-th training iteration until the convergence condition is met.
Further, in some implementations of this embodiment, the loss function is the mean absolute error function, expressed as:

Δ = (1/n) * Σ_{i=1}^{n} |y'_i - y_i|

where y'_i denotes the i-th element of the true values corresponding to the original plaintext image, y_i denotes the i-th element of the output values corresponding to the actual output plaintext image, and Δ denotes the mean absolute error between the output values and the true values.
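The mean absolute error function can be written directly in NumPy; this sketch simply mirrors the formula above and is not part of the patent text.

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """Delta = (1/n) * sum_i |y'_i - y_i|: the average absolute difference
    between the original plaintext image (true values) and the network's
    actual output plaintext image (output values)."""
    y_true = np.asarray(y_true, dtype=float).ravel()
    y_pred = np.asarray(y_pred, dtype=float).ravel()
    return float(np.mean(np.abs(y_true - y_pred)))
```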
In some implementations of this embodiment, the training module 502 is specifically configured to: randomly drop a preset number of neurons in each layer of the standard neural network using dropout regularization, reducing the number of neuron connections; and train the neural network, after the neurons have been dropped, on the training data set to obtain the trained neural network model.
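Dropout regularization as described above can be sketched as follows. The inverted-dropout scaling shown here (dividing surviving activations by the keep probability so the expected activation is unchanged) is a common implementation convention and an assumption on our part, not something the patent specifies.

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(activations, rate=0.5, training=True):
    """Randomly drop a preset fraction of neurons during training, which
    reduces the number of active connections; at inference time the
    activations pass through unchanged."""
    if not training or rate == 0.0:
        return activations
    keep = rng.random(activations.shape) >= rate   # mask of surviving neurons
    return activations * keep / (1.0 - rate)       # inverted-dropout scaling

x = np.ones(1000)
y = dropout(x, rate=0.5)   # roughly half the activations are zeroed
```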
It should be noted that the equivalent key acquisition method in the preceding embodiment can be implemented on the basis of the equivalent key acquisition apparatus provided in this embodiment. A person of ordinary skill in the art will clearly understand that, for convenience and brevity of description, the specific working process of the equivalent key acquisition apparatus described in this embodiment may refer to the corresponding process in the preceding method embodiment, and is not repeated here.
With the equivalent key acquisition apparatus provided in this embodiment, a preset training data set is obtained, the training data set comprising combinations of a plurality of plaintext images and their corresponding ciphertext sequences; a neural network is trained on the training data set to obtain a trained neural network model; and the neural network model is determined to be the equivalent key, which is used for security analysis of an optical encryption system. Through this series of implementations of the present invention, known ciphertext-plaintext pairs are fed into a neural network structure for training, so that the mapping relationship between the ciphertext and the plaintext of the optical encryption system is obtained and used as an equivalent key. This improves the efficiency and accuracy of the security analysis, and works regardless of whether the encryption system applies additional random scrambling or other secondary encryption operations to the ciphertext sequences, thereby extending the applicable scenarios.
Third embodiment:
This embodiment provides an electronic device, shown in FIG. 6, comprising a processor 601, a memory 602, and a communication bus 603, wherein: the communication bus 603 is used to realize the connection and communication between the processor 601 and the memory 602; and the processor 601 is used to execute one or more computer programs stored in the memory 602, to realize at least one step of the equivalent key acquisition method in the first embodiment above.
This embodiment also provides a computer-readable storage medium, which includes volatile or non-volatile, removable or non-removable media implemented in any method or technology for storing information (such as computer-readable instructions, data structures, computer program modules, or other data). Computer-readable storage media include, but are not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory or other memory technologies, CD-ROM (Compact Disc Read-Only Memory), digital versatile discs (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
The computer-readable storage medium in this embodiment may be used to store one or more computer programs, and the stored one or more computer programs may be executed by a processor to realize at least one step of the method in the first embodiment above.
This embodiment also provides a computer program, which may be distributed on a computer-readable medium and executed by a computing device to realize at least one step of the method in the first embodiment above; and, in some cases, at least one of the steps shown or described may be executed in an order different from that described in the above embodiment.
This embodiment also provides a computer program product, including a computer-readable device on which a computer program as shown above is stored. In this embodiment, the computer-readable device may include the computer-readable storage medium shown above.
It will be appreciated by those skilled in the art that all or some of the steps of the method disclosed above, and the functional modules/units in the system and the apparatus, may be implemented as software (which may be implemented with computer program code executable by a computing device), firmware, hardware, or an appropriate combination thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be executed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, a digital signal processor, or a microprocessor, or may be implemented as hardware, or as an integrated circuit, such as an application-specific integrated circuit.
In addition, as is known to a person of ordinary skill in the art, communication media typically contain computer-readable instructions, data structures, computer program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media. Therefore, the present invention is not limited to any specific combination of hardware and software.
The above content is a further detailed description of the embodiments of the present invention in combination with specific implementations, and the specific implementation of the present invention shall not be deemed to be limited to these descriptions. For a person of ordinary skill in the art to which the present invention belongs, a number of simple deductions or substitutions may also be made without departing from the inventive concept, and all of them shall be deemed to fall within the protection scope of the present invention.

Claims (10)

1. An equivalent key acquisition method, characterized by comprising:
obtaining a preset training data set, the training data set comprising combinations of a plurality of plaintext images and their corresponding ciphertext sequences;
training a neural network on the training data set to obtain a trained neural network model; and
determining the neural network model to be an equivalent key, the equivalent key being used for security analysis of an optical encryption system.
2. The equivalent key acquisition method according to claim 1, wherein obtaining the preset training data set comprises:
loading a random phase mask onto a spatial light modulator to encode a light source passing through the spatial light modulator;
encrypting each of a plurality of plaintext images to be encrypted with the encoded light source, to obtain the ciphertext sequence corresponding to each plaintext image; and
constructing the combinations of all the plaintext images and their corresponding ciphertext sequences into the training data set.
3. The equivalent key acquisition method according to claim 1, further comprising, after obtaining the trained neural network model:
obtaining a preset test data set, the test data set comprising combinations of a plurality of plaintext images and their corresponding ciphertext sequences;
inputting the ciphertext sequences in the test data set into the trained neural network model to obtain test-output plaintext images;
computing the correlation between the test-output plaintext images and the plaintext images in the test data set; and
when the correlation is greater than a preset correlation threshold, determining the trained neural network model to be a valid neural network model;
wherein determining the neural network model to be an equivalent key comprises:
determining the valid neural network model to be the equivalent key.
4. The equivalent key acquisition method according to claim 1, wherein, when the neural network is a deep neural network, the deep neural network includes one input layer, three hidden layers, one adjustment layer, and one output layer, with the neurons between adjacent layers fully connected.
5. The equivalent key acquisition method according to claim 4, wherein training a neural network on the training data set to obtain a trained neural network model comprises:
inputting the ciphertext sequences into the M-th deep neural network model and, after training through the input layer, the three hidden layers, the output layer, and the adjustment layer in sequence, obtaining the actual output plaintext image of the M-th training iteration;
comparing, using a preset loss function, the actual output plaintext image with the original plaintext image in the training data set;
when the comparison result satisfies a preset convergence condition, determining the M-th deep neural network model to be the trained deep neural network model; and
when the comparison result does not satisfy the preset convergence condition, continuing with the (M+1)-th training iteration until the convergence condition is satisfied.
6. The equivalent key acquisition method according to claim 5, wherein the loss function is the mean absolute error function, expressed as:

Δ = (1/n) * Σ_{i=1}^{n} |y'_i - y_i|

where y'_i denotes the i-th element of the true values corresponding to the original plaintext image, y_i denotes the i-th element of the output values corresponding to the actual output plaintext image, and Δ denotes the mean absolute error between the output values and the true values.
7. The equivalent key acquisition method according to claim 1, wherein training a neural network on the training data set comprises:
randomly dropping a preset number of neurons in each layer of the standard neural network using dropout regularization, reducing the number of neuron connections; and
training the neural network, after the neurons have been dropped, on the training data set.
8. An equivalent key acquisition apparatus, characterized by comprising:
an obtaining module, for obtaining a preset training data set, the training data set comprising combinations of a plurality of plaintext images and their corresponding ciphertext sequences;
a training module, for training a neural network on the training data set to obtain a trained neural network model; and
a determining module, for determining the neural network model to be an equivalent key, the equivalent key being used for security analysis of an optical encryption system.
9. An electronic device, characterized by comprising a processor, a memory, and a communication bus, wherein:
the communication bus is used to realize the connection and communication between the processor and the memory; and
the processor is used to execute one or more programs stored in the memory, to realize the steps of the equivalent key acquisition method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores one or more programs, the one or more programs being executable by one or more processors, to realize the steps of the equivalent key acquisition method according to any one of claims 1 to 7.
CN201910216928.6A 2019-03-21 2019-03-21 Equivalent key obtaining method and device and computer readable storage medium Active CN110071798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910216928.6A CN110071798B (en) 2019-03-21 2019-03-21 Equivalent key obtaining method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910216928.6A CN110071798B (en) 2019-03-21 2019-03-21 Equivalent key obtaining method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110071798A true CN110071798A (en) 2019-07-30
CN110071798B CN110071798B (en) 2022-03-04

Family

ID=67366428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910216928.6A Active CN110071798B (en) 2019-03-21 2019-03-21 Equivalent key obtaining method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110071798B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110674941A (en) * 2019-09-25 2020-01-10 南开大学 Data encryption transmission method and system based on neural network
CN111259427A (en) * 2020-01-21 2020-06-09 北京安德医智科技有限公司 Image processing method and device based on neural network and storage medium
CN111709867A (en) * 2020-06-10 2020-09-25 四川大学 Novel full convolution network-based equal modulus vector decomposition image encryption analysis method
CN112733173A (en) * 2021-01-18 2021-04-30 北京灵汐科技有限公司 Image processing method, device, secret key generating method, device, training method and device, and computer readable medium
CN112802145A (en) * 2021-01-27 2021-05-14 四川大学 Color calculation ghost imaging method based on deep learning
CN113726979A (en) * 2021-07-31 2021-11-30 浪潮电子信息产业股份有限公司 Picture encryption method, decryption method, encryption system and related devices
CN116032636A (en) * 2023-01-06 2023-04-28 南京通力峰达软件科技有限公司 Internet of vehicles data encryption method and system based on neural network

Citations (4)

Publication number Priority date Publication date Assignee Title
US20160352520A1 (en) * 2013-10-29 2016-12-01 Jory Schwach Encryption using biometric image-based key
CN106411510A (en) * 2016-10-28 2017-02-15 深圳大学 Method and apparatus for obtaining equivalent key of random phase coding-based optical encryption system
CN107659398A (en) * 2017-09-28 2018-02-02 四川长虹电器股份有限公司 Suitable for Android symmetric encryption method
CN108921282A (en) * 2018-05-16 2018-11-30 深圳大学 A kind of construction method and device of deep neural network model

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US20160352520A1 (en) * 2013-10-29 2016-12-01 Jory Schwach Encryption using biometric image-based key
CN106411510A (en) * 2016-10-28 2017-02-15 深圳大学 Method and apparatus for obtaining equivalent key of random phase coding-based optical encryption system
CN107659398A (en) * 2017-09-28 2018-02-02 四川长虹电器股份有限公司 Suitable for Android symmetric encryption method
CN108921282A (en) * 2018-05-16 2018-11-30 深圳大学 A kind of construction method and device of deep neural network model

Non-Patent Citations (2)

Title
WEN Jie: "A comparison of the roles of MSE and MAE in optimizing machine learning performance", Information and Computer (Theory Edition) *
JIN Shengjian: "Deep Learning: Design Examples Based on MATLAB", 30 April 2018 *

Cited By (11)

Publication number Priority date Publication date Assignee Title
CN110674941A (en) * 2019-09-25 2020-01-10 南开大学 Data encryption transmission method and system based on neural network
CN111259427A (en) * 2020-01-21 2020-06-09 北京安德医智科技有限公司 Image processing method and device based on neural network and storage medium
CN111259427B (en) * 2020-01-21 2020-11-06 北京安德医智科技有限公司 Image processing method and device based on neural network and storage medium
CN111709867A (en) * 2020-06-10 2020-09-25 四川大学 Novel full convolution network-based equal modulus vector decomposition image encryption analysis method
CN111709867B (en) * 2020-06-10 2022-11-25 四川大学 Novel full convolution network-based equal-modulus vector decomposition image encryption analysis method
CN112733173A (en) * 2021-01-18 2021-04-30 北京灵汐科技有限公司 Image processing method, device, secret key generating method, device, training method and device, and computer readable medium
CN112802145A (en) * 2021-01-27 2021-05-14 四川大学 Color calculation ghost imaging method based on deep learning
CN113726979A (en) * 2021-07-31 2021-11-30 浪潮电子信息产业股份有限公司 Picture encryption method, decryption method, encryption system and related devices
CN113726979B (en) * 2021-07-31 2024-04-26 浪潮电子信息产业股份有限公司 Picture encryption method, picture decryption method, picture encryption system and related devices
CN116032636A (en) * 2023-01-06 2023-04-28 南京通力峰达软件科技有限公司 Internet of vehicles data encryption method and system based on neural network
CN116032636B (en) * 2023-01-06 2023-10-20 南京通力峰达软件科技有限公司 Internet of vehicles data encryption method based on neural network

Also Published As

Publication number Publication date
CN110071798B (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN110071798A (en) A kind of equivalent key acquisition methods, device and computer readable storage medium
CN109214973B (en) Method for generating countermeasure security carrier aiming at steganalysis neural network
US20160012330A1 (en) Neural network and method of neural network training
CN112862001B (en) Privacy protection method and system for decentralizing data modeling under federal learning
US12033233B2 (en) Image steganography utilizing adversarial perturbations
CN115860112B (en) Model inversion method-based countermeasure sample defense method and equipment
CN113285797B (en) Multi-image encryption method for optical rotation domain based on compressed sensing and deep learning
CN111681154A (en) Color image steganography distortion function design method based on generation countermeasure network
Fang et al. Gifd: A generative gradient inversion method with feature domain optimization
Gu et al. Federated deep learning with bayesian privacy
CN118070107B (en) Deep learning-oriented network anomaly detection method, device, storage medium and equipment
Yaras et al. Randomized histogram matching: A simple augmentation for unsupervised domain adaptation in overhead imagery
CN117350373B (en) Personalized federal aggregation algorithm based on local self-attention mechanism
CN116743934B (en) Equal resolution image hiding and encrypting method based on deep learning and ghost imaging
CN113792632A (en) Finger vein identification method, system and storage medium based on multi-party cooperation
CN111951954A (en) Body health state detection method and device, readable storage medium and terminal equipment
CN116341004B (en) Longitudinal federal learning privacy leakage detection method based on feature embedding analysis
Liu et al. Structure aware visual cryptography
CN111275603B (en) Security image steganography method based on style conversion and electronic device
CN115374863A (en) Sample generation method, sample generation device, storage medium and equipment
Hu et al. Research on encrypted face recognition algorithm based on new combined chaotic map and neural network
Škorić On the entropy of keys derived from laser speckle; statistical properties of Gabor-transformed speckle
Hu et al. The recovery scheme of computer-generated holography encryption–hiding images based on deep learning
Zhou et al. Optical encryption using a sparse-data-driven framework
Hualong et al. Research on double encryption of ghost imaging by SegNet deep neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant