CN115633129A - Stack type sparse self-encoder and GAN chaotic sequence image encryption method and device - Google Patents


Info

Publication number
CN115633129A
Authority
CN
China
Prior art keywords
network
encoder
gan
layer
image
Prior art date
Legal status
Granted
Application number
CN202211252929.4A
Other languages
Chinese (zh)
Other versions
CN115633129B (en)
Inventor
何斌
杨振坤
李刚
陆萍
朱忠攀
程斌
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University
Priority to CN202211252929.4A
Publication of CN115633129A
Application granted
Publication of CN115633129B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 1/32144 Display, printing, storage or transmission of additional information embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • H04N 1/32149 Methods relating to embedding, encoding, decoding, detection or retrieval operations
    • H04N 1/32267 Methods relating to embedding, encoding, decoding, detection or retrieval operations combined with processing of the image
    • H04N 1/32272 Encryption or ciphering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/602 Providing cryptographic facilities or services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L 9/001 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols using chaotic signals


Abstract

The invention discloses a chaotic sequence image encryption method and device based on a stack sparse autoencoder and a generative adversarial network (GAN), and relates to the technical field of image encryption. The method comprises the following steps: acquiring an original image to be encrypted; acquiring a chaotic sequence of the original image; inputting the chaotic sequence into a trained deep learning network encryption model based on a stack sparse autoencoder and a GAN; and obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model. In the invention, the stack sparse autoencoder is used to learn the complex intrinsic features of the image: it has both a multilayer network structure and the sparsity constraint of a sparse autoencoder, so the model can learn more effective image features in a complex environment. Combining the stack sparse autoencoder with the GAN in the encryption model eliminates the periodicity of the chaotic sequence and further improves the security of the encryption model.

Description

Stack type sparse self-encoder and GAN chaotic sequence image encryption method and device
Technical Field
The invention relates to the technical field of image encryption, and in particular to a chaotic sequence image encryption method and device based on a stack type sparse self-encoder and a GAN.
Background
Images contain much confidential and important information, so ensuring image security is of great practical significance. Chaotic encryption provides an effective means of protecting digital images, and researchers have designed many digital image encryption methods based on chaotic encryption technology in recent years. However, images encrypted by chaotic methods still exhibit certain periodic characteristics, and existing image encryption methods therefore suffer from low security and confidentiality.
Disclosure of Invention
The invention provides an image encryption method that aims to solve the problems of low security and confidentiality in existing image encryption methods.
In order to solve the technical problems, the invention provides the following technical scheme:
in one aspect, the present invention provides a stack-type sparse self-encoder and GAN chaotic sequence image encryption method, where the method is implemented by an electronic device, and the method includes:
s1, obtaining an original image to be encrypted.
And S2, acquiring a chaotic sequence of the original image.
And S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stack sparse autoencoder and a generative adversarial network (GAN).
And S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the acquiring the chaotic sequence of the original image in S2 includes:
and S21, carrying out blocking processing on the original image to obtain original image sub-blocks.
And S22, generating a real number sequence for the original image sub-blocks based on a Logistic chaotic system.
And S23, converting the real number sequence based on a threshold value method to obtain a chaotic sequence of the original image.
Optionally, the deep learning network encryption model based on the stack sparse self-encoder and GAN in S3 includes a generation network and a countermeasure network.
The generating network is constructed based on the stack type sparse self-encoder and is used for generating the ciphertext image.
And the countermeasure network is used for judging whether the input is a chaotic sequence or a ciphertext image.
Optionally, the generating network comprises an encoder and a decoder;
the encoder comprises an encoder full connection layer, a first convolution layer, a second convolution layer and a third convolution layer.
The encoder full connection layer, the first convolution layer and the second convolution layer are respectively connected with the GELU type activation function.
The third convolution layer connects Sigmoid-type activation functions.
The output of the encoder full connection layer and the output of the third convolution layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer and a decoder full connection layer.
The first deconvolution layer, the second deconvolution layer and the third deconvolution layer are respectively connected with the GELU type activation function;
the decoder full connection layer connects a Sigmoid type activation function.
Optionally, the countermeasure network includes a countermeasure network first convolutional layer, a countermeasure network second convolutional layer, a countermeasure network third convolutional layer, and a pooling layer.
The first convolution layer of the countermeasure network, the second convolution layer of the countermeasure network and the third convolution layer of the countermeasure network are respectively connected with the GELU type activation function.
Optionally, the training process of the deep learning network encryption model based on the stack sparse autoencoder and the generative adversarial network GAN in S3 includes:
s31, optimizing network parameters of the stack sparse self-encoder to obtain the optimal weight and threshold of the stack sparse self-encoder network.
S32: and constructing a deep learning network encryption model based on the stack sparse autoencoder and the GAN based on the optimal weight and threshold.
S33: and training a deep learning network encryption model based on the stack sparse autoencoder and the GAN by using a gradient descent algorithm.
Optionally, the training of the deep learning network encryption model based on the stack sparse autoencoder and GAN by using the gradient descent algorithm in S33 includes:
training a generation network and a countermeasure network of the deep learning network encryption model based on the stack sparse autoencoder and the GAN by using a gradient descent algorithm.
Wherein training the countermeasure network comprises:
the countermeasure network is trained using a cross entropy loss function.
Training the generated network includes:
and inputting the ciphertext image generated by the generated network into the anti-network to obtain an output result of the anti-network, corresponding the output result of the anti-network with the label of the chaotic sequence and calculating an error, and performing error back propagation on the calculated error by using an AdamW gradient descent algorithm to continuously correct the network parameters of the generated network.
Optionally, the obtaining of the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model in S4 includes:
and inputting the chaotic sequence into a deep learning network encryption model, and generating a ciphertext image of an original image to be encrypted by a generation network of the deep learning network encryption model through forward propagation of the gradient.
On the other hand, the invention provides a stack type sparse self-encoder and GAN chaotic sequence image encryption device, which is applied to the above stack type sparse self-encoder and GAN chaotic sequence image encryption method. The device comprises:
and the acquisition module is used for acquiring the original image to be encrypted.
And the chaotic sequence generating module is used for acquiring the chaotic sequence of the original image.
And the input module is used for inputting the chaotic sequence into a trained deep learning network encryption model based on a stack sparse autoencoder and a generative adversarial network (GAN).
And the output module is used for obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the chaotic sequence generating module is further configured to:
and S21, carrying out blocking processing on the original image to obtain original image sub-blocks.
And S22, generating a real number sequence for the original image sub-blocks based on a Logistic chaotic system.
And S23, converting the real number sequence based on a threshold value method to obtain the chaos sequence of the original image.
Optionally, the deep learning network encryption model based on the stack sparse autoencoder and GAN includes a generation network and a countermeasure network.
The generating network is constructed based on the stack type sparse self-encoder and is used for generating the ciphertext image.
And the countermeasure network is used for judging whether the input is a chaotic sequence or a ciphertext image.
Optionally, the generating network comprises an encoder and a decoder;
the encoder comprises an encoder full-connection layer, a first convolution layer, a second convolution layer and a third convolution layer.
The encoder full connection layer, the first convolution layer and the second convolution layer are respectively connected with the GELU type activation function.
The third convolution layer connects Sigmoid-type activation functions.
The output of the encoder full connection layer and the output of the third convolution layer are connected through a residual error structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer and a decoder full connection layer.
The first deconvolution layer, the second deconvolution layer and the third deconvolution layer are respectively connected with the GELU type activation function;
the decoder full connection layer connects a Sigmoid type activation function.
Optionally, the countermeasure network includes a countermeasure network first convolutional layer, a countermeasure network second convolutional layer, a countermeasure network third convolutional layer and a pooling layer.
The first convolution layer of the countermeasure network, the second convolution layer of the countermeasure network and the third convolution layer of the countermeasure network are respectively connected with the GELU type activation function.
Optionally, the input module is further configured to:
s31, optimizing network parameters of the stack sparse self-encoder to obtain the optimal weight and threshold of the stack sparse self-encoder network.
S32: and constructing a deep learning network encryption model based on the stack sparse autoencoder and the GAN based on the optimal weight and threshold.
S33: and training a deep learning network encryption model based on the stack sparse self-encoder and the GAN by using a gradient descent algorithm.
Optionally, the input module is further configured to:
and training a generation network and a countermeasure network of the deep learning network encryption model based on the stack sparse autoencoder and the GAN by using a gradient descent algorithm.
Wherein training the countermeasure network comprises:
the countermeasure network is trained using a cross entropy loss function.
Training the generated network includes:
and inputting the ciphertext image generated by the generated network into the countermeasure network to obtain an output result of the countermeasure network, corresponding the output result of the countermeasure network with the label of the chaotic sequence and calculating an error, and performing error back propagation on the calculated error by using an AdamW gradient descent algorithm to continuously correct the network parameters of the generated network.
Optionally, the output module is further configured to:
and inputting the chaotic sequence into a deep learning network encryption model, and generating a ciphertext image of an original image to be encrypted by a generation network of the deep learning network encryption model through forward propagation of the gradient.
In one aspect, an electronic device is provided, which includes a processor and a memory, where the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the stacked sparse self-encoder and the GAN chaotic sequence image encryption method.
In one aspect, a computer-readable storage medium is provided, in which at least one instruction is stored, and the at least one instruction is loaded and executed by a processor to implement the above stack type sparse self-encoder and GAN chaotic sequence image encryption method.
The technical scheme provided by the embodiment of the invention has the beneficial effects that at least:
in the scheme, a stack-type sparse self-encoder and a GAN are combined to generate an antagonistic network, a chaotic sequence image encryption method based on the stack-type sparse self-encoder and the GAN is designed, a multilayer network structure of the stack-type sparse self-encoder provides strong nonlinear feature mapping capability for the model, and meanwhile, the stack-type sparse self-encoder has the constraint capability of the sparse encoder, so that the model can learn more effective image features in a complex environment. The stack sparse autoencoder and the GAN generate an encryption model combining a countermeasure network, so that the periodicity of a chaotic sequence can be eliminated, and the safety and the confidentiality of the encryption model are further improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required for the description of the embodiments are briefly introduced below. It is obvious that the drawings described below are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a stacked sparse self-encoder and a method for encrypting GAN chaotic sequence images according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a deep learning network encryption model for generating a countermeasure network based on a stack sparse autoencoder and a GAN according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a generation network provided in an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a countermeasure network according to an embodiment of the present invention;
fig. 5 is a block diagram of a chaotic sequence image encryption apparatus for a stacked sparse autoencoder and a GAN according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, an embodiment of the present invention provides a stack type sparse self-encoder and GAN chaotic sequence image encryption method, which may be implemented by an electronic device. The processing flow of the method may include the following steps:
s1, obtaining an original image to be encrypted.
And S2, acquiring a chaotic sequence of the original image.
Optionally, the step S2 may include the following steps S21 to S23:
and S21, carrying out blocking processing on the original image to obtain original image sub-blocks.
In a feasible implementation, the plaintext image (original image) is first partitioned into plaintext image sub-blocks. The obtained plaintext image sub-blocks have no overlapping parts, and this partitioning facilitates the later processing by the chaotic system.
And S22, generating a real number sequence for the original image sub-blocks based on a Logistic chaotic system.
In one possible implementation, a real number sequence is generated for each plaintext image sub-block by using the Logistic chaotic system, whose iterative formula is given by formula (1):

x_{i+1} = μ·x_i·(1 − x_i),  i = 0, 1, …, n  (1)

where x_i ∈ [0, 1] and μ ∈ (0, 4).
And S23, converting the real number sequence based on a threshold value method to obtain the chaos sequence of the original image.
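As a minimal Python sketch of steps S21-S23, the Logistic iteration of formula (1) can be combined with a simple threshold binarization; the threshold 0.5 and the key values x0 and μ below are illustrative assumptions, since the patent does not fix them:

```python
import numpy as np

def logistic_sequence(x0: float, mu: float, n: int) -> np.ndarray:
    """Iterate the Logistic map x_{i+1} = mu * x_i * (1 - x_i) n times."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = mu * x * (1.0 - x)
        xs[i] = x
    return xs

def binarize(xs: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Threshold method: map each real value to a bit (threshold 0.5 is an assumption)."""
    return (xs >= threshold).astype(np.uint8)

# x0 and mu act as the secret key; mu near 4 gives chaotic behaviour.
real_seq = logistic_sequence(x0=0.3567, mu=3.99, n=16)
chaotic_bits = binarize(real_seq)
```

In the method above, one such bit sequence would be produced per plaintext image sub-block.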
And S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stack sparse autoencoder and a generative adversarial network (GAN).
Optionally, the deep learning network encryption model based on the stack sparse self-encoder and GAN in S3 includes a generation network and a countermeasure network.
In a possible embodiment, the deep learning network encryption model based on the stack sparse autoencoder and GAN (Generative Adversarial Network) is composed of a generation network and a countermeasure network; a schematic structural diagram of the model is shown in fig. 2.
The generation network is constructed based on a stack-type sparse self-encoder and is used for generating the ciphertext image.
Optionally, the generating network comprises an encoder and a decoder;
the encoder comprises an encoder full-connection layer, a first convolution layer, a second convolution layer and a third convolution layer.
The encoder full connection layer, the first convolution layer and the second convolution layer are respectively connected with a GELU (Gaussian Error Linear Unit) type activation function.
The third convolution layer connects Sigmoid-type activation functions.
The output of the encoder full connection layer and the output of the third convolution layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer and a decoder full connection layer.
The first deconvolution layer, the second deconvolution layer and the third deconvolution layer are respectively connected with the GELU type activation function;
the decoder full connection layer connects Sigmoid type activation functions.
In a possible implementation, a random vector is passed through the generation network based on the stack sparse autoencoder to generate a ciphertext image.
Further, a schematic structural diagram of the generation network is shown in fig. 3. The generation network is designed based on a stack sparse autoencoder and is composed of an encoder and a decoder. The encoder of the generation network includes 1 fully connected layer and 3 1×1 convolution layers; the fully connected layer and the first 2 1×1 convolution layers are each followed by a GELU activation function, and the output of the fully connected layer is connected with the output of the 3rd 1×1 convolution layer through a residual structure. The decoder of the generation network comprises 3 1×1 deconvolution layers and 1 fully connected layer; each deconvolution layer is followed by a GELU activation function, and the fully connected layer is followed by a Sigmoid activation function.
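The data flow described above can be illustrated with a NumPy shape sketch. The channel count, sequence length, weight scales, and the exact placement of the residual addition relative to the Sigmoid are assumptions, since the patent text gives only the layer order; a 1×1 convolution over a (channels, length) map reduces to a per-position channel mix:

```python
import numpy as np

rng = np.random.default_rng(0)

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def conv1x1(x, w):
    # A 1x1 convolution over a (channels, length) map mixes channels per position.
    return w @ x

L = 64   # sequence length (assumed)
C = 8    # channel count (assumed)
z = rng.random(L)  # input chaotic sequence / random vector

# --- Encoder: FC -> 3 1x1 convs, residual from the FC output to the 3rd conv ---
W_fc_in = rng.standard_normal((C * L, L)) * 0.1
h = gelu(W_fc_in @ z).reshape(C, L)          # FC layer + GELU, reshaped to (C, L)
W1, W2, W3 = (rng.standard_normal((C, C)) * 0.1 for _ in range(3))
c = gelu(conv1x1(h, W1))                     # 1st 1x1 conv + GELU
c = gelu(conv1x1(c, W2))                     # 2nd 1x1 conv + GELU
code = sigmoid(conv1x1(c, W3) + h)           # 3rd 1x1 conv + residual add + Sigmoid

# --- Decoder: 3 1x1 "deconvs" (same channel mixing here) -> FC + Sigmoid ---
D1, D2, D3 = (rng.standard_normal((C, C)) * 0.1 for _ in range(3))
d = gelu(conv1x1(code, D1))
d = gelu(conv1x1(d, D2))
d = gelu(conv1x1(d, D3))
W_fc_out = rng.standard_normal((L, C * L)) * 0.1
cipher = sigmoid(W_fc_out @ d.reshape(-1))   # ciphertext "image" as a length-L vector
```

A real implementation would learn these weights by training; the sketch only verifies that the encoder-decoder shapes and the residual connection are mutually consistent.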
Optionally, the countermeasure network is used for judging whether the input is a chaotic sequence or a ciphertext image.
Optionally, the countermeasure network includes a countermeasure network first convolutional layer, a countermeasure network second convolutional layer, a countermeasure network third convolutional layer, and a pooling layer.
The first convolution layer of the countermeasure network, the second convolution layer of the countermeasure network and the third convolution layer of the countermeasure network are respectively connected with the GELU type activation function.
In a possible implementation manner, the countermeasure network is a discriminator used to judge whether the input is a chaotic sequence generated by the chaotic system or a ciphertext image generated by the generation network. The schematic structure of the countermeasure network is shown in fig. 4; the countermeasure network includes 3 1×1 convolution layers and one pooling layer, and each convolution layer is followed by a GELU activation function.
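A similar sketch of the countermeasure network (discriminator) follows; the channel count, the pooling choice (global average), and the final squashing of the pooled score to a probability are assumptions not stated in the patent:

```python
import numpy as np

rng = np.random.default_rng(1)

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

C, L = 8, 64  # channels and length (assumed, matching the generator sketch)
K1, K2, K3 = (rng.standard_normal((C, C)) * 0.1 for _ in range(3))

def discriminate(x):
    """3 1x1 conv layers (each followed by GELU), then a pooling layer."""
    h = gelu(K1 @ x)
    h = gelu(K2 @ h)
    h = gelu(K3 @ h)
    pooled = h.mean()                  # global average pooling -> a single score
    return 1 / (1 + np.exp(-pooled))   # squashed to a [0, 1] "real input" probability

score = discriminate(rng.random((C, L)))
```

An all-zero input passes through GELU unchanged (GELU(0) = 0), so the untrained sketch outputs exactly 0.5 for it, i.e. maximal uncertainty.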
Further, the generation network and the countermeasure network compete with each other, so that the ciphertext image generated by the generation network and the chaotic sequence generated by the chaotic system are closer and closer.
Optionally, the training process of the deep learning network encryption model based on the stack sparse autoencoder and the generative adversarial network GAN in S3 includes:
s31, optimizing network parameters of the stack sparse self-encoder to obtain the optimal weight and threshold of the stack sparse self-encoder network.
S32: and constructing a deep learning network encryption model based on the stack sparse autoencoder and the GAN based on the optimal weight and threshold.
S33: and training a deep learning network encryption model based on the stack sparse self-encoder and the GAN by using a gradient descent algorithm.
Optionally, the training, by using a gradient descent algorithm, of the deep learning network encryption model based on the stack sparse self-encoder and the GAN in S33 includes:
training a generation network and a countermeasure network of the deep learning network encryption model based on the stack sparse autoencoder and the GAN by using a gradient descent algorithm.
Wherein training the countermeasure network comprises:
the impedance network is trained using a cross entropy loss function.
In a feasible implementation, training the deep learning network encryption model begins with training the countermeasure network. The countermeasure network is trained with a cross entropy loss function, and the training target is that the countermeasure network can distinguish the chaotic sequence generated by the chaotic system from the ciphertext image generated by the generation network. The label of a ciphertext image generated by the generation network is set to 0, and the label of a chaotic sequence generated by the chaotic system is set to 1. The chaotic sequence generated by the chaotic system and the ciphertext image generated by the generation network are respectively input into the countermeasure network, the result output by the countermeasure network is compared with the corresponding label, the corresponding error is calculated, and error back propagation is performed on the calculated error by using the AdamW gradient descent algorithm to continuously correct the network parameters of the countermeasure network, so that the countermeasure network becomes increasingly able to distinguish the chaotic sequence generated by the chaotic system from the ciphertext image generated by the generation network.
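The cross entropy objective with these labels (1 for chaotic sequences, 0 for generated ciphertext images) can be sketched numerically; the probability values below are placeholders standing in for the countermeasure network's outputs:

```python
import numpy as np

def bce(y_true, y_pred, eps=1e-12):
    """Binary cross entropy loss, averaged over the batch."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return float(-np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)))

# Labels: 1 = chaotic sequence from the chaotic system,
#         0 = ciphertext image from the generation network.
labels = np.array([1.0, 1.0, 0.0, 0.0])
good_d = np.array([0.99, 0.98, 0.02, 0.01])  # a discriminator that separates well
poor_d = np.array([0.55, 0.48, 0.52, 0.50])  # a near-chance discriminator

loss_good = bce(labels, good_d)
loss_poor = bce(labels, poor_d)
```

A well-separating discriminator drives the loss toward 0, while a near-chance discriminator stays close to log 2 ≈ 0.693, which is the fixed point the generation network's training pushes the game toward.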
Training the generated network includes:
and inputting the ciphertext image generated by the generated network into the anti-network to obtain an output result of the anti-network, corresponding the output result of the anti-network with the label of the chaotic sequence and calculating an error, and performing error back propagation on the calculated error by using an AdamW gradient descent algorithm to continuously correct the network parameters of the generated network.
In a feasible implementation, the generation network is then trained with the countermeasure network fixed; the training target is to make the ciphertext image generated by the generation network approach the chaotic sequence generated by the chaotic system more and more closely. The generated ciphertext image is input into the countermeasure network, the output result of the countermeasure network is made to correspond to the label 1 of the chaotic sequence, the corresponding error is calculated, and error back propagation is performed on the calculated error by using the AdamW gradient descent algorithm to continuously correct the network parameters of the generation network, so that the ciphertext image generated by the generation network gets closer and closer to the chaotic sequence generated by the chaotic system, thereby realizing the encryption of the plaintext image.
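AdamW, used above for error back propagation, combines Adam's bias-corrected moment estimates with decoupled weight decay. A single-parameter sketch with common default hyperparameters (assumed here; the patent does not specify them) looks like:

```python
import numpy as np

def adamw_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update: Adam moment estimates plus decoupled weight decay."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment (uncentered var) estimate
    m_hat = m / (1 - beta1**t)               # bias correction for step t
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * theta)
    return theta, m, v

# Push a parameter down a constant positive gradient for 100 steps.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 101):
    theta, m, v = adamw_step(theta, 2.0, m, v, t)
```

With a constant gradient the bias-corrected ratio m_hat / sqrt(v_hat) stays at ±1, so each step moves the parameter by roughly the learning rate plus the decoupled decay term; after 100 steps theta has dropped by about 0.1.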
And S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the step S4 may be:
and inputting the chaos sequence into a deep learning network encryption model, and generating a ciphertext image of an original image to be encrypted by a generation network of the deep learning network encryption model through forward propagation of gradient.
In a feasible implementation, the Logistic chaotic system is used to generate a chaotic sequence for the plaintext image, the chaotic sequence is input into the trained deep learning network encryption model, and through forward propagation the output of the generation network is the desired ciphertext image.
In the embodiment of the invention, a stack sparse autoencoder is combined with a generative adversarial network (GAN) to design a chaotic sequence image encryption method based on the stack sparse autoencoder and the GAN. The multilayer network structure of the stack sparse autoencoder provides the model with a strong nonlinear feature mapping capability, and the sparsity constraint of the sparse autoencoder enables the model to learn more effective image features in a complex environment. Combining the stack sparse autoencoder with the GAN in the encryption model eliminates the periodicity of the chaotic sequence and further improves the security and confidentiality of the encryption model.
As shown in fig. 5, an embodiment of the present invention provides a stacked sparse autoencoder and GAN chaotic sequence image encryption apparatus 500, where the apparatus 500 is applied to implement a stacked sparse autoencoder and GAN chaotic sequence image encryption method, and the apparatus 500 includes:
an obtaining module 510, configured to obtain an original image to be encrypted.
And a chaotic sequence generating module 520, configured to obtain a chaotic sequence of the original image.
An input module 530, configured to input the chaotic sequence into a trained deep learning network encryption model based on a stack sparse autoencoder and a generative adversarial network (GAN).
And the output module 540 is configured to obtain a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the chaotic sequence generating module 520 is further configured to:
and S21, carrying out blocking processing on the original image to obtain original image sub-blocks.
And S22, generating a real number sequence for the original image sub-blocks based on a Logistic chaotic system.
And S23, converting the real number sequence based on a threshold value method to obtain a chaotic sequence of the original image.
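Steps S22 and S23 can be sketched in a few lines. The control parameter mu, the seed x0, and the threshold 0.5 below are illustrative assumptions; the patent does not specify these values.

```python
# Sketch of S22-S23: iterate the Logistic map to get a real-valued
# sequence, then binarize it with a threshold. mu, x0 and the 0.5
# threshold are illustrative assumptions, not values from the patent.

def logistic_real_sequence(length, x0=0.6, mu=3.99):
    """S22: iterate the Logistic map x_{n+1} = mu * x_n * (1 - x_n)."""
    seq, x = [], x0
    for _ in range(length):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def threshold_binarize(real_seq, threshold=0.5):
    """S23: threshold method mapping each real value to a binary symbol."""
    return [1 if v >= threshold else 0 for v in real_seq]

real_seq = logistic_real_sequence(16)
chaotic_bits = threshold_binarize(real_seq)
```

For mu close to 4 the map behaves chaotically, so small changes in x0 yield very different binary sequences, which is the property the encryption scheme relies on.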
Optionally, the deep learning network encryption model based on the stacked sparse autoencoder and GAN includes a generation network and an adversarial network.
The generation network is constructed based on the stacked sparse autoencoder and is used for generating the ciphertext image.
The adversarial network is used for judging whether its input is a chaotic sequence or a ciphertext image.
Optionally, the generation network comprises an encoder and a decoder.
The encoder comprises an encoder fully connected layer, a first convolutional layer, a second convolutional layer, and a third convolutional layer.
The encoder fully connected layer, the first convolutional layer, and the second convolutional layer are each followed by a GELU activation function.
The third convolutional layer is followed by a Sigmoid activation function.
The output of the encoder fully connected layer and the output of the third convolutional layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder fully connected layer.
The first deconvolution layer, the second deconvolution layer, and the third deconvolution layer are each followed by a GELU activation function.
The decoder fully connected layer is followed by a Sigmoid activation function.
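The activation functions and the residual structure named above can be made concrete in a minimal scalar sketch. The exact-erf form of GELU and the element-wise sum for the residual connection are assumptions; the patent only names the function types and states that the two outputs are connected through a residual structure.

```python
import math

def gelu(x):
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    (The patent names GELU but not which formulation; this is the
    exact erf-based form.)"""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sigmoid(x):
    """Sigmoid, as used after the third convolutional layer and the
    decoder fully connected layer."""
    return 1.0 / (1.0 + math.exp(-x))

def residual_connect(fc_out, conv3_out):
    """Residual structure joining the encoder fully connected layer's
    output and the third convolutional layer's output. Element-wise
    addition is an assumption about how the connection is realized."""
    return [a + b for a, b in zip(fc_out, conv3_out)]
```

GELU is smooth and near-identity for large positive inputs, while the final Sigmoid bounds the encoder output to (0, 1), a convenient range for pixel-valued ciphertext.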
Optionally, the adversarial network includes an adversarial network first convolutional layer, an adversarial network second convolutional layer, an adversarial network third convolutional layer, and a pooling layer.
The adversarial network first, second, and third convolutional layers are each followed by a GELU activation function.
Optionally, the input module 530 is further configured to:
S31, optimizing the network parameters of the stacked sparse autoencoder to obtain the optimal weights and thresholds of the stacked sparse autoencoder network.
S32, constructing the deep learning network encryption model based on the stacked sparse autoencoder and GAN from the optimal weights and thresholds.
S33, training the deep learning network encryption model based on the stacked sparse autoencoder and GAN using a gradient descent algorithm.
Optionally, the input module 530 is further configured to:
training the generation network and the adversarial network of the deep learning network encryption model based on the stacked sparse autoencoder and GAN using a gradient descent algorithm.
Wherein training the adversarial network comprises:
training the adversarial network using a cross entropy loss function.
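The cross entropy loss for the adversarial network can be sketched as a binary cross entropy over its outputs. The label assignment (1 for chaotic-sequence inputs, 0 for ciphertext images) is an illustrative assumption; the patent only states that a cross entropy loss is used.

```python
import math

def binary_cross_entropy(predictions, labels, eps=1e-12):
    """Mean binary cross entropy: -[y*log(p) + (1-y)*log(1-p)].
    predictions are the adversarial network's outputs in (0, 1);
    labels are 1 for chaotic-sequence inputs and 0 for ciphertext
    images (this labeling is an assumption for illustration)."""
    total = 0.0
    for p, y in zip(predictions, labels):
        p = min(max(p, eps), 1.0 - eps)  # clamp for numerical stability
        total += -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
    return total / len(predictions)
```

A discriminator that is confidently correct gets a loss near zero, while an undecided one (outputs near 0.5) pays about ln 2 per sample.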
Training the generation network comprises:
inputting the ciphertext image generated by the generation network into the adversarial network to obtain the output of the adversarial network, matching that output against the label of the chaotic sequence to compute an error, and back-propagating the computed error using the AdamW gradient descent algorithm to continuously correct the network parameters of the generation network.
Optionally, the output module 540 is further configured to:
and inputting the chaotic sequence into a deep learning network encryption model, and generating a ciphertext image of an original image to be encrypted by a generation network of the deep learning network encryption model through forward propagation of the gradient.
In the embodiment of the invention, a stacked sparse autoencoder is combined with a generative adversarial network (GAN) to design a chaotic sequence image encryption method based on the stacked sparse autoencoder and GAN. The multilayer network structure of the stacked sparse autoencoder gives the model strong nonlinear feature mapping capability, while its sparsity constraint enables the model to learn more effective image features in complex environments. Combining the stacked sparse autoencoder with the GAN in the encryption model eliminates the periodicity of the chaotic sequence, further improving the security and confidentiality of the encryption model.
Fig. 6 is a schematic structural diagram of an electronic device 600 according to an embodiment of the present invention. The electronic device 600 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 601 and one or more memories 602, where the memory 602 stores at least one instruction that is loaded and executed by the processor 601 to implement the following stacked sparse autoencoder and GAN chaotic sequence image encryption method:
S1, obtaining an original image to be encrypted.
S2, obtaining a chaotic sequence of the original image.
S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse autoencoder and a generative adversarial network (GAN).
S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
In an exemplary embodiment, a computer-readable storage medium is also provided, such as a memory including instructions executable by a processor in a terminal to perform the above stacked sparse autoencoder and GAN chaotic sequence image encryption method. For example, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A stacked sparse autoencoder and GAN chaotic sequence image encryption method, characterized by comprising the following steps:
S1, acquiring an original image to be encrypted;
S2, acquiring a chaotic sequence of the original image;
S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse autoencoder and a generative adversarial network (GAN); and
S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
2. The method according to claim 1, wherein the acquiring of the chaotic sequence of the original image in S2 comprises:
S21, partitioning the original image into blocks to obtain original image sub-blocks;
S22, generating a real number sequence for the original image sub-blocks using a Logistic chaotic system; and
S23, converting the real number sequence using a threshold method to obtain the chaotic sequence of the original image.
3. The method according to claim 1, wherein the deep learning network encryption model based on the stacked sparse autoencoder and GAN in S3 comprises a generation network and an adversarial network;
the generation network is constructed based on the stacked sparse autoencoder and is used for generating a ciphertext image; and
the adversarial network is used for judging whether its input is a chaotic sequence or a ciphertext image.
4. The method according to claim 3, wherein the generation network comprises an encoder and a decoder;
the encoder comprises an encoder fully connected layer, a first convolutional layer, a second convolutional layer, and a third convolutional layer;
the encoder fully connected layer, the first convolutional layer, and the second convolutional layer are each followed by a GELU activation function;
the third convolutional layer is followed by a Sigmoid activation function;
the output of the encoder fully connected layer and the output of the third convolutional layer are connected through a residual structure;
the decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder fully connected layer;
the first deconvolution layer, the second deconvolution layer, and the third deconvolution layer are each followed by a GELU activation function; and
the decoder fully connected layer is followed by a Sigmoid activation function.
5. The method according to claim 3, wherein the adversarial network comprises an adversarial network first convolutional layer, an adversarial network second convolutional layer, an adversarial network third convolutional layer, and a pooling layer; and
the adversarial network first, second, and third convolutional layers are each followed by a GELU activation function.
6. The method according to claim 1, wherein the training process of the deep learning network encryption model based on the stacked sparse autoencoder and generative adversarial network (GAN) in S3 comprises:
S31, optimizing the network parameters of the stacked sparse autoencoder to obtain the optimal weights and thresholds of the stacked sparse autoencoder network;
S32, constructing the deep learning network encryption model based on the stacked sparse autoencoder and GAN from the optimal weights and thresholds; and
S33, training the deep learning network encryption model based on the stacked sparse autoencoder and GAN using a gradient descent algorithm.
7. The method according to claim 6, wherein the training of the deep learning network encryption model based on the stacked sparse autoencoder and GAN using a gradient descent algorithm in S33 comprises:
training the generation network and the adversarial network of the deep learning network encryption model based on the stacked sparse autoencoder and GAN using a gradient descent algorithm;
wherein training the adversarial network comprises:
training the adversarial network using a cross entropy loss function; and
training the generation network comprises:
inputting a ciphertext image generated by the generation network into the adversarial network to obtain an output of the adversarial network, matching the output of the adversarial network against the label of the chaotic sequence to compute an error, and back-propagating the computed error using the AdamW gradient descent algorithm to continuously correct the network parameters of the generation network.
8. The method according to claim 1, wherein obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model in S4 comprises:
inputting the chaotic sequence into the deep learning network encryption model, and generating, by the generation network of the deep learning network encryption model through forward propagation, the ciphertext image of the original image to be encrypted.
9. A stacked sparse autoencoder and GAN chaotic sequence image encryption apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire an original image to be encrypted;
a chaotic sequence generating module, configured to acquire a chaotic sequence of the original image;
an input module, configured to input the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse autoencoder and a generative adversarial network (GAN); and
an output module, configured to obtain a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
10. The apparatus according to claim 9, wherein the deep learning network encryption model based on the stacked sparse autoencoder and GAN comprises a generation network and an adversarial network;
the generation network is constructed based on the stacked sparse autoencoder and is used for generating a ciphertext image; and
the adversarial network is used for judging whether its input is a chaotic sequence or a ciphertext image.
CN202211252929.4A 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device Active CN115633129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211252929.4A CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211252929.4A CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Publications (2)

Publication Number Publication Date
CN115633129A true CN115633129A (en) 2023-01-20
CN115633129B CN115633129B (en) 2024-03-05

Family

ID=84903790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211252929.4A Active CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Country Status (1)

Country Link
CN (1) CN115633129B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101706946A (en) * 2009-11-26 2010-05-12 大连大学 Digital image encryption method based on DNA sequence and multi-chaotic mapping
CN111105338A (en) * 2019-12-20 2020-05-05 长春理工大学 Image encryption method based on GAN and chaotic system
CN111582312A (en) * 2020-04-09 2020-08-25 华南理工大学 Secure biological hash code generation method for resisting relation attack based on periodic function
US20210385081A1 (en) * 2020-06-03 2021-12-09 University Of Electronic Science And Technology Of China Key Generation Method Based On Deep Learning Generative Adversarial Network
CN114915398A (en) * 2022-04-29 2022-08-16 南京信息工程大学 Variational self-encoder-based chaotic encryption method, application, computer equipment and storage medium


Also Published As

Publication number Publication date
CN115633129B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
US11829882B2 (en) System and method for addressing overfitting in a neural network
CN110490128B (en) Handwriting recognition method based on encryption neural network
US20210385081A1 (en) Key Generation Method Based On Deep Learning Generative Adversarial Network
CN112787971B (en) Construction method of side channel attack model, password attack equipment and computer storage medium
KR102533984B1 (en) Advanced system and method for detecting video forgery
CN113282707A (en) Data prediction method and device based on Transformer model, server and storage medium
Lopes et al. Constrained robust model predicted control of discrete‐time Markov jump linear systems
CN115795406A (en) Reversible face anonymization processing system
Drummond et al. Reduced-order neural network synthesis with robustness guarantees
CN115633129A (en) Stack type sparse self-encoder and GAN chaotic sequence image encryption method and device
Anumasa et al. Delay differential neural networks
Din et al. Text steganalysis using evolution algorithm approach
CN114842242A (en) Robust countermeasure sample generation method based on generative model
CN112199482A (en) Dialog generation method, device, equipment and readable storage medium
CN111033532B (en) Training method and system for generating countermeasure network, electronic device and storage medium
KR102477705B1 (en) Method and apparatus for detecting attack in CAN BUS
CN117973488B (en) Large language model training and reasoning method and system with privacy protection
CN116049840B (en) Data protection method, device, related equipment and system
CN116090021A (en) Authentication method and device of access equipment and communication equipment
CN117830935A (en) Crowd counting model training method and device based on self-adaptive region selection module
Sahoo et al. A Novel Lossless Image Cryptosystem for Binary Images Using Feed-Forward Back-Propagation Neural Networks
CN117081722A (en) Method and device for constructing side channel attack model, and side channel attack method and device
CN116232563A (en) Original image Gaussian sampling method, system, electronic equipment and storage medium
CN115545154A (en) Convolutional neural network model intellectual property protection method based on PUF
CN116976462A (en) Data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant