CN115633129B - Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device - Google Patents


Info

Publication number
CN115633129B
Authority
CN
China
Prior art keywords
network
encoder
sparse self
gan
chaotic sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211252929.4A
Other languages
Chinese (zh)
Other versions
CN115633129A (en)
Inventor
何斌
杨振坤
李刚
陆萍
朱忠攀
程斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202211252929.4A priority Critical patent/CN115633129B/en
Publication of CN115633129A publication Critical patent/CN115633129A/en
Application granted granted Critical
Publication of CN115633129B publication Critical patent/CN115633129B/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • H04N1/32149Methods relating to embedding, encoding, decoding, detection or retrieval operations
    • H04N1/32267Methods relating to embedding, encoding, decoding, detection or retrieval operations combined with processing of the image
    • H04N1/32272Encryption or ciphering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/001Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols using chaotic signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Bioethics (AREA)
  • Biophysics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computer Hardware Design (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a chaotic sequence image encryption method and device based on a stacked sparse autoencoder and a generative adversarial network (GAN), and relates to the technical field of image encryption. The method comprises the following steps: acquiring an original image to be encrypted; acquiring a chaotic sequence of the original image; inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse autoencoder and a generative adversarial network (GAN); and obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model. According to the invention, the stacked sparse autoencoder learns complex internal features of the image; its multi-layer network structure and the constraint capability of sparse coding allow the model to learn more effective image features in complex environments. Combining the stacked sparse autoencoder with the GAN yields an encryption model that eliminates the periodicity of the chaotic sequence and further improves the security of the encryption model.

Description

Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device
Technical Field
The present invention relates to the technical field of image encryption, and in particular to a chaotic sequence image encryption method and device based on a stacked sparse autoencoder and a GAN.
Background
Images contain a great deal of confidential and important information, so ensuring image security has important practical significance. Chaotic encryption provides an effective means of protecting digital images, and in recent years researchers have designed many digital image encryption methods based on chaotic encryption technology. However, chaotically encrypted images still exhibit certain periodic characteristics, and existing image encryption methods therefore suffer from low security and confidentiality.
Disclosure of Invention
The invention provides an image encryption method that addresses the low security and confidentiality of existing image encryption methods.
To solve the above technical problem, the invention provides the following technical scheme:
In one aspect, the present invention provides a chaotic sequence image encryption method based on a stacked sparse autoencoder and a GAN, the method being implemented by an electronic device and comprising:
S1, acquiring an original image to be encrypted.
S2, acquiring a chaotic sequence of the original image.
S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse autoencoder and a generative adversarial network (GAN).
S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, acquiring the chaotic sequence of the original image in S2 includes:
S21, performing block processing on the original image to obtain original image sub-blocks.
S22, generating a real-number sequence for the original image sub-blocks based on the Logistic chaotic system.
S23, converting the real-number sequence by a threshold method to obtain the chaotic sequence of the original image.
Optionally, the deep learning network encryption model based on the stacked sparse autoencoder and the GAN in S3 includes a generator network and an adversarial network.
The generator network is constructed based on a stacked sparse autoencoder and is used for generating the ciphertext image.
The adversarial network is used for judging whether its input is a chaotic sequence or a ciphertext image.
Optionally, the generator network comprises an encoder and a decoder.
The encoder comprises an encoder fully connected layer, a first convolution layer, a second convolution layer, and a third convolution layer.
The encoder fully connected layer, the first convolution layer, and the second convolution layer are each followed by a GELU activation function.
The third convolution layer is followed by a Sigmoid activation function.
The output of the encoder fully connected layer and the output of the third convolution layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder fully connected layer.
The first deconvolution layer, the second deconvolution layer, and the third deconvolution layer are each followed by a GELU activation function.
The decoder fully connected layer is followed by a Sigmoid activation function.
Optionally, the adversarial network comprises an adversarial-network first convolution layer, an adversarial-network second convolution layer, an adversarial-network third convolution layer, and a pooling layer.
The first, second, and third convolution layers of the adversarial network are each followed by a GELU activation function.
Optionally, the training process of the deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN) in S3 includes:
S31, optimizing the network parameters of the stacked sparse autoencoder to obtain the optimal weights and thresholds of the stacked sparse autoencoder network.
S32, constructing the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using the optimal weights and thresholds.
S33, training the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Optionally, training the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm in S33 includes:
training the generator network and the adversarial network of the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Training the adversarial network comprises:
training the adversarial network using a cross-entropy loss function.
Training the generator network comprises:
inputting the ciphertext image produced by the generator network into the adversarial network to obtain the adversarial network's output, comparing this output with the label of the chaotic sequence, calculating the error, and back-propagating the error using the AdamW gradient descent algorithm to continuously correct the network parameters of the generator network.
Optionally, obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model in S4 includes:
inputting the chaotic sequence into the deep learning network encryption model; through forward propagation, the generator network of the model produces the ciphertext image of the original image to be encrypted.
In another aspect, the invention provides a chaotic sequence image encryption device based on a stacked sparse autoencoder and a GAN, applied to implement the above chaotic sequence image encryption method, the device comprising:
an acquisition module for acquiring the original image to be encrypted;
a chaotic sequence generation module for acquiring the chaotic sequence of the original image;
an input module for inputting the chaotic sequence into the trained deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN); and
an output module for obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the chaotic sequence generation module is further configured to:
S21, perform block processing on the original image to obtain original image sub-blocks.
S22, generate a real-number sequence for the original image sub-blocks based on the Logistic chaotic system.
S23, convert the real-number sequence by a threshold method to obtain the chaotic sequence of the original image.
Optionally, the deep learning network encryption model based on the stacked sparse autoencoder and the GAN includes a generator network and an adversarial network.
The generator network is constructed based on a stacked sparse autoencoder and is used for generating the ciphertext image.
The adversarial network is used for judging whether its input is a chaotic sequence or a ciphertext image.
Optionally, the generator network comprises an encoder and a decoder.
The encoder comprises an encoder fully connected layer, a first convolution layer, a second convolution layer, and a third convolution layer.
The encoder fully connected layer, the first convolution layer, and the second convolution layer are each followed by a GELU activation function.
The third convolution layer is followed by a Sigmoid activation function.
The output of the encoder fully connected layer and the output of the third convolution layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder fully connected layer.
The first deconvolution layer, the second deconvolution layer, and the third deconvolution layer are each followed by a GELU activation function.
The decoder fully connected layer is followed by a Sigmoid activation function.
Optionally, the adversarial network comprises an adversarial-network first convolution layer, an adversarial-network second convolution layer, an adversarial-network third convolution layer, and a pooling layer.
The first, second, and third convolution layers of the adversarial network are each followed by a GELU activation function.
Optionally, the input module is further configured to:
S31, optimize the network parameters of the stacked sparse autoencoder to obtain the optimal weights and thresholds of the stacked sparse autoencoder network.
S32, construct the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using the optimal weights and thresholds.
S33, train the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Optionally, the input module is further configured to:
train the generator network and the adversarial network of the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Training the adversarial network comprises:
training the adversarial network using a cross-entropy loss function.
Training the generator network comprises:
inputting the ciphertext image produced by the generator network into the adversarial network to obtain the adversarial network's output, comparing this output with the label of the chaotic sequence, calculating the error, and back-propagating the error using the AdamW gradient descent algorithm to continuously correct the network parameters of the generator network.
Optionally, the output module is further configured to:
input the chaotic sequence into the deep learning network encryption model; through forward propagation, the generator network of the model produces the ciphertext image of the original image to be encrypted.
In one aspect, an electronic device is provided, comprising a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to implement the above chaotic sequence image encryption method of the stacked sparse autoencoder and the GAN.
In one aspect, a computer-readable storage medium is provided, storing at least one instruction that is loaded and executed by a processor to implement the above chaotic sequence image encryption method of the stacked sparse autoencoder and the GAN.
The technical scheme provided by the embodiments of the invention has at least the following beneficial effects:
in this scheme, the stacked sparse autoencoder is combined with a generative adversarial network (GAN) to design a chaotic sequence image encryption method. The multi-layer network structure of the stacked sparse autoencoder provides the model with strong nonlinear feature-mapping capability, while its sparsity constraints allow the model to learn more effective image features in complex environments. Combining the stacked sparse autoencoder with the GAN yields an encryption model that eliminates the periodicity of the chaotic sequence and thereby further improves the security and confidentiality of the encryption.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the invention; other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of the chaotic sequence image encryption method of the stacked sparse autoencoder and GAN provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN) according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the generator network according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the adversarial network according to an embodiment of the present invention;
Fig. 5 is a block diagram of the chaotic sequence image encryption device of the stacked sparse autoencoder and GAN according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the technical problems to be solved, the technical solutions, and the advantages clearer, a detailed description is given below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, an embodiment of the invention provides a chaotic sequence image encryption method based on a stacked sparse autoencoder and a GAN, which can be implemented by an electronic device. As shown in the flow chart of fig. 1, the processing flow of the method can comprise the following steps:
s1, acquiring an original image to be encrypted.
S2, acquiring a chaotic sequence of the original image.
Optionally, step S2 may include the following steps S21 to S23:
S21, performing block processing on the original image to obtain original image sub-blocks.
In a feasible implementation, the plaintext image (original image) is first partitioned into plaintext image sub-blocks with no overlap between them; this partitioning facilitates the subsequent chaotic-system processing.
S22, generating a real-number sequence for the original image sub-blocks based on the Logistic chaotic system.
In a possible implementation, a real-number sequence is generated for each plaintext image sub-block using the Logistic chaotic system, whose iterative formula is given by (1):
x_{i+1} = μ·x_i·(1 − x_i),  i = 0, 1, …, n  (1)
where x_i ∈ [0,1] and μ ∈ (0,4).
S23, converting the real-number sequence by a threshold method to obtain the chaotic sequence of the original image.
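Steps S21 to S23 can be sketched as follows. The block size, the initial value x0, the parameter μ, and the 0.5 threshold are illustrative assumptions, since the patent does not fix these values:

```python
def partition(image, b):
    """S21: split a 2D image (list of rows) into non-overlapping b x b sub-blocks."""
    return [[row[c:c + b] for row in image[r:r + b]]
            for r in range(0, len(image), b)
            for c in range(0, len(image[0]), b)]

def logistic_sequence(n, x0=0.3, mu=3.99):
    """S22: iterate x_{i+1} = mu * x_i * (1 - x_i), formula (1), to get a real-number
    sequence in [0, 1]. x0 and mu are assumed example values."""
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1 - x)
        seq.append(x)
    return seq

def threshold_binarize(seq, t=0.5):
    """S23: threshold method -- map each real value to a binary chaotic bit."""
    return [1 if x >= t else 0 for x in seq]

# Example: one binary chaotic sequence per 4x4 sub-block of an 8x8 image.
img = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
blocks = partition(img, 4)
chaotic = [threshold_binarize(logistic_sequence(16, x0=0.3 + 0.01 * i))
           for i, _ in enumerate(blocks)]
```

In practice the initial value and μ act as part of the secret key, which is why the chaotic regime of the Logistic map matters for security.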
S3, inputting the chaotic sequence into the trained deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN).
Optionally, the deep learning network encryption model based on the stacked sparse autoencoder and the GAN in S3 includes a generator network and an adversarial network.
In a possible implementation, the deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN, Generative Adversarial Network) is composed of a generator network and an adversarial network; its structure is shown in fig. 2.
The generator network is constructed based on a stacked sparse autoencoder and is used for generating the ciphertext image.
Optionally, the generator network comprises an encoder and a decoder.
The encoder comprises an encoder fully connected layer, a first convolution layer, a second convolution layer, and a third convolution layer.
The encoder fully connected layer, the first convolution layer, and the second convolution layer are each followed by a GELU (Gaussian Error Linear Unit) activation function.
The third convolution layer is followed by a Sigmoid activation function.
The output of the encoder fully connected layer and the output of the third convolution layer are connected through a residual structure.
The decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder fully connected layer.
The first deconvolution layer, the second deconvolution layer, and the third deconvolution layer are each followed by a GELU activation function.
The decoder fully connected layer is followed by a Sigmoid activation function.
In one possible implementation, a random vector is passed through the generator network, built on a stacked sparse autoencoder, to generate the ciphertext image.
Further, as shown in fig. 3, the generator network is designed based on a stacked sparse autoencoder and consists of an encoder and a decoder. The encoder comprises 1 fully connected layer and 3 1×1 convolution layers; the fully connected layer and the first 2 1×1 convolution layers are each followed by a GELU activation function; the 3rd 1×1 convolution layer is followed by 1 Sigmoid activation function; and the output of the fully connected layer is connected to the output of the 3rd 1×1 convolution layer through 1 residual structure. The decoder comprises 3 1×1 deconvolution layers and 1 fully connected layer; each deconvolution layer is followed by 1 GELU activation function, and the fully connected layer is followed by 1 Sigmoid activation function.
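The generator of fig. 3 might be realized along the following lines in PyTorch. This is a hedged sketch: the patent fixes only the layer types (fully connected layer, three 1×1 convolutions, GELU/Sigmoid activations, a residual link, then three 1×1 deconvolutions and a fully connected layer), so the sequence length `seq_len`, the channel width `ch`, and the exact point where the residual is added are assumptions:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Sketch of the encoder-decoder generator of fig. 3; widths are assumed."""
    def __init__(self, seq_len=64, ch=16):
        super().__init__()
        # Encoder: FC + GELU, then 3 1x1 convs (first two GELU, third Sigmoid).
        self.enc_fc = nn.Sequential(nn.Linear(seq_len, seq_len), nn.GELU())
        self.enc_conv = nn.Sequential(
            nn.Conv1d(1, ch, kernel_size=1), nn.GELU(),
            nn.Conv1d(ch, ch, kernel_size=1), nn.GELU(),
            nn.Conv1d(ch, 1, kernel_size=1), nn.Sigmoid())
        # Decoder: 3 1x1 deconvs (each GELU), then FC + Sigmoid.
        self.dec_deconv = nn.Sequential(
            nn.ConvTranspose1d(1, ch, kernel_size=1), nn.GELU(),
            nn.ConvTranspose1d(ch, ch, kernel_size=1), nn.GELU(),
            nn.ConvTranspose1d(ch, 1, kernel_size=1), nn.GELU())
        self.dec_fc = nn.Sequential(nn.Linear(seq_len, seq_len), nn.Sigmoid())

    def forward(self, x):                  # x: (batch, seq_len)
        h = self.enc_fc(x)                 # output of the FC layer
        z = self.enc_conv(h.unsqueeze(1))  # three 1x1 convolutions
        z = z + h.unsqueeze(1)             # residual: FC output + 3rd conv output
        y = self.dec_deconv(z)
        return self.dec_fc(y.squeeze(1))   # ciphertext values in (0, 1)

g = Generator()
out = g(torch.rand(2, 64))                 # random vector -> ciphertext sketch
```

The final Sigmoid keeps outputs in (0, 1), matching the binary chaotic sequences the generator is trained to imitate.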
Optionally, the adversarial network is used for judging whether the input is a chaotic sequence or a ciphertext image.
Optionally, the adversarial network comprises an adversarial-network first convolution layer, an adversarial-network second convolution layer, an adversarial-network third convolution layer, and a pooling layer.
The first, second, and third convolution layers of the adversarial network are each followed by a GELU activation function.
In one possible embodiment, the adversarial network acts as a discriminator that determines whether its input is a chaotic sequence generated by the chaotic system or a ciphertext image produced by the generator network. The architecture of the adversarial network is shown in fig. 4: it comprises 3 1×1 convolution layers and a pooling layer, with each convolution layer followed by 1 GELU activation function.
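A corresponding sketch of the discriminator of fig. 4, with three 1×1 convolutions each followed by GELU and then a pooling layer; the channel width, the choice of global average pooling, and the final sigmoid score are assumptions not fixed by the patent:

```python
import torch
import torch.nn as nn

class Adversary(nn.Module):
    """Sketch of the adversarial network: 3 1x1 convs with GELU, then pooling."""
    def __init__(self, ch=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, ch, kernel_size=1), nn.GELU(),
            nn.Conv1d(ch, ch, kernel_size=1), nn.GELU(),
            nn.Conv1d(ch, 1, kernel_size=1), nn.GELU(),
            nn.AdaptiveAvgPool1d(1))           # pooling layer -> one value per sample

    def forward(self, x):                      # x: (batch, seq_len)
        score = self.features(x.unsqueeze(1))  # (batch, 1, 1)
        # Assumed scalar head: probability that the input is a real chaotic sequence.
        return torch.sigmoid(score.view(-1))

d = Adversary()
p = d(torch.rand(4, 64))
```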
Further, the generator network and the adversarial network compete with each other, so that the ciphertext images produced by the generator network become increasingly similar to the chaotic sequences generated by the chaotic system.
Optionally, the training process of the deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN) in S3 includes:
S31, optimizing the network parameters of the stacked sparse autoencoder to obtain the optimal weights and thresholds of the stacked sparse autoencoder network.
S32, constructing the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using the optimal weights and thresholds.
S33, training the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Optionally, training the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm in S33 includes:
training the generator network and the adversarial network of the deep learning network encryption model based on the stacked sparse autoencoder and the GAN using a gradient descent algorithm.
Training the adversarial network comprises:
training the adversarial network using a cross-entropy loss function.
In a possible implementation, the deep learning network encryption model is trained as follows. First, the adversarial network is trained with a cross-entropy loss function; the training goal is for the adversarial network to distinguish chaotic sequences generated by the chaotic system from ciphertext images produced by the generator network. The label of a generator-produced ciphertext image is set to 0 and the label of a chaotic sequence generated by the chaotic system is set to 1; both are input to the adversarial network, the network's output is compared with the labels, the corresponding error is calculated, and the error is back-propagated with the AdamW gradient descent algorithm to continuously correct the network parameters of the adversarial network, so that it becomes increasingly able to distinguish chaotic sequences from generated ciphertext images.
Training the generator network comprises:
inputting the ciphertext image produced by the generator network into the adversarial network to obtain the adversarial network's output, comparing this output with the label of the chaotic sequence, calculating the error, and back-propagating the error using the AdamW gradient descent algorithm to continuously correct the network parameters of the generator network.
In a feasible implementation, the generator network is trained with the adversarial network fixed; the goal of generator training is that the ciphertext images produced by the generator network approach the chaotic sequences generated by the chaotic system ever more closely. The generated ciphertext image is input into the adversarial network, the adversarial network's output is compared with the label 1 of the corresponding chaotic sequence, the corresponding error is calculated, and the error is back-propagated with the AdamW gradient descent algorithm to continuously correct the network parameters of the generator network, so that the generated ciphertext images increasingly resemble the chaotic sequences, thereby encrypting the plaintext image.
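The two training phases described above follow a standard GAN loop with a cross-entropy loss and AdamW. The sketch below uses toy single-layer stand-ins for the generator and adversarial network of figs. 3-4; the sequence length, batch size, learning rate, and iteration count are placeholders:

```python
import torch
import torch.nn as nn

# Toy stand-ins for the generator and adversarial network of figs. 3-4.
G = nn.Sequential(nn.Linear(64, 64), nn.Sigmoid())
D = nn.Sequential(nn.Linear(64, 1), nn.Sigmoid())
opt_g = torch.optim.AdamW(G.parameters(), lr=1e-3)
opt_d = torch.optim.AdamW(D.parameters(), lr=1e-3)
bce = nn.BCELoss()  # cross-entropy loss over the binary real/generated labels

for _ in range(10):
    chaotic = (torch.rand(8, 64) > 0.5).float()  # label 1: chaotic sequences
    noise = torch.rand(8, 64)

    # Phase 1: train the adversarial network to tell the two inputs apart.
    opt_d.zero_grad()
    d_loss = bce(D(chaotic), torch.ones(8, 1)) + \
             bce(D(G(noise).detach()), torch.zeros(8, 1))  # label 0: generated
    d_loss.backward()
    opt_d.step()

    # Phase 2: train the generator with the adversarial network fixed,
    # matching the generator output against the chaotic-sequence label 1.
    opt_g.zero_grad()
    g_loss = bce(D(G(noise)), torch.ones(8, 1))
    g_loss.backward()
    opt_g.step()
```

`detach()` in phase 1 keeps the generator fixed while the adversarial network updates, mirroring the alternating scheme the patent describes.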
S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, step S4 may be:
inputting the chaotic sequence into the deep learning network encryption model; through forward propagation, the generator network of the model produces the ciphertext image of the original image to be encrypted.
In a feasible implementation, a chaotic sequence is generated from the plaintext image using the Logistic chaotic system and input into the trained deep learning network encryption model; after forward propagation, the output of the generator network is the desired ciphertext image.
In the embodiment of the invention, the stacked sparse autoencoder is combined with a generative adversarial network (GAN) to design a chaotic sequence image encryption method. The multi-layer network structure of the stacked sparse autoencoder provides the model with strong nonlinear feature-mapping capability, while its sparsity constraints allow the model to learn more effective image features in complex environments. Combining the stacked sparse autoencoder with the GAN yields an encryption model that eliminates the periodicity of the chaotic sequence and thereby further improves the security and confidentiality of the encryption.
As shown in fig. 5, an embodiment of the present invention provides a chaotic sequence image encryption apparatus 500 based on a stacked sparse autoencoder and a GAN, where the apparatus 500 is applied to implement the chaotic sequence image encryption method of the stacked sparse autoencoder and GAN, and the apparatus 500 includes:
an obtaining module 510, configured to obtain the original image to be encrypted;
a chaotic sequence generating module 520, configured to acquire the chaotic sequence of the original image;
an input module 530, configured to input the chaotic sequence into the trained deep learning network encryption model based on the stacked sparse autoencoder and the generative adversarial network (GAN); and
an output module 540, configured to obtain the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
Optionally, the chaotic sequence generating module 520 is further configured to:
s21, performing block processing on the original image to obtain an original image sub-block.
S22, generating a real number sequence for the original image sub-block based on the Logistic chaotic system.
S23, converting the real number sequence based on a threshold method to obtain a chaotic sequence of the original image.
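Steps S22 and S23 can be sketched as follows. This is a minimal illustration, not the patented implementation: the map parameter mu, the initial value x0, the burn-in length, and the 0.5 threshold are all assumed for the example; the patent does not fix them.

```python
# Hypothetical parameters: iterate the Logistic map x_{n+1} = mu * x_n * (1 - x_n)
# to get a real number sequence for one image sub-block, then binarise it with
# a fixed threshold (the "threshold method").
def logistic_sequence(x0, length, mu=3.9999, burn_in=1000):
    """Iterate the Logistic map; discard a transient so the orbit is well mixed."""
    x = x0
    for _ in range(burn_in):
        x = mu * x * (1.0 - x)
    seq = []
    for _ in range(length):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def threshold_binarise(seq, threshold=0.5):
    """Threshold method: map each real value to a bit of the chaotic sequence."""
    return [1 if v >= threshold else 0 for v in seq]

real_seq = logistic_sequence(x0=0.3567, length=64 * 64)  # one 64x64 sub-block
chaotic_bits = threshold_binarise(real_seq)
```

For mu near 4 and x0 in (0, 1), the orbit stays in (0, 1) and is highly sensitive to x0, which is what makes the binarised sequence usable as key material.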
Optionally, the deep learning network encryption model based on the stacked sparse self-encoder and GAN includes a generation network and a countermeasure network.
The generation network is constructed based on a stacked sparse self-encoder and is used for generating the ciphertext image.
The countermeasure network is used for judging whether its input is a chaotic sequence or a ciphertext image.
Optionally, the generating network comprises an encoder and a decoder;
the encoder comprises an encoder full connection layer, a first convolution layer, a second convolution layer and a third convolution layer.
The encoder full connection layer, the first convolution layer and the second convolution layer are respectively connected with the GELU type activation function.
The third convolution layer connects Sigmoid-type activation functions.
The output of the encoder full-connection layer and the output of the third convolution layer are connected through a residual structure.
The decoder includes a first deconvolution layer, a second deconvolution layer, a third deconvolution layer, and a decoder full connection layer.
The first deconvolution layer, the second deconvolution layer and the third deconvolution layer are respectively connected with a GELU type activation function;
the decoder full connection layer connects Sigmoid type activation functions.
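The generation network described above can be sketched in PyTorch as follows. The layer types, activations, and the residual connection between the encoder full-connection layer and the third convolution layer follow the text; every size (sequence length, channel counts, 32x32 output image) is an illustrative assumption, since the patent does not specify dimensions.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Sketch of the generation network (encoder + decoder). All layer sizes
    are assumptions; only layer types and activations come from the text."""
    def __init__(self, seq_len=256):
        super().__init__()
        # Encoder: full-connection layer + three convolution layers
        self.fc_enc = nn.Linear(seq_len, 8 * 8 * 8)   # encoder full-connection layer, + GELU
        self.conv1 = nn.Conv2d(8, 16, 3, padding=1)   # first convolution layer, + GELU
        self.conv2 = nn.Conv2d(16, 16, 3, padding=1)  # second convolution layer, + GELU
        self.conv3 = nn.Conv2d(16, 8, 3, padding=1)   # third convolution layer, + Sigmoid
        self.gelu = nn.GELU()
        # Decoder: three deconvolution layers + full-connection layer
        self.deconv1 = nn.ConvTranspose2d(8, 8, 4, stride=2, padding=1)  # 8x8 -> 16x16, + GELU
        self.deconv2 = nn.ConvTranspose2d(8, 8, 4, stride=2, padding=1)  # 16x16 -> 32x32, + GELU
        self.deconv3 = nn.ConvTranspose2d(8, 4, 3, stride=1, padding=1)  # 32x32 -> 32x32, + GELU
        self.fc_dec = nn.Linear(4 * 32 * 32, 32 * 32)  # decoder full-connection layer, + Sigmoid

    def forward(self, z):
        h = self.gelu(self.fc_enc(z))
        skip = h.view(-1, 8, 8, 8)                # residual branch from the FC output
        x = self.gelu(self.conv1(skip))
        x = self.gelu(self.conv2(x))
        x = torch.sigmoid(self.conv3(x)) + skip   # residual structure: FC output + conv3 output
        x = self.gelu(self.deconv1(x))
        x = self.gelu(self.deconv2(x))
        x = self.gelu(self.deconv3(x))
        x = torch.sigmoid(self.fc_dec(x.flatten(1)))
        return x.view(-1, 1, 32, 32)              # ciphertext image

g = Generator()
cipher = g(torch.randn(2, 256))  # batch of two chaotic-sequence inputs
```

The final Sigmoid keeps ciphertext pixel values in [0, 1], matching an image in normalised form.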
Optionally, the countermeasure network includes a countermeasure network first convolution layer, a countermeasure network second convolution layer, a countermeasure network third convolution layer, and a pooling layer.
The first convolution layer of the countermeasure network, the second convolution layer of the countermeasure network and the third convolution layer of the countermeasure network are respectively connected with the GELU type activation functions.
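A corresponding sketch of the countermeasure network: three convolution layers with GELU activations plus a pooling layer, ending in a single score. The channel counts, the adaptive-average-pooling choice, and the final linear scoring layer are assumptions added for the example.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Sketch of the countermeasure network: three convolution layers (each
    followed by GELU) and a pooling layer. Channel counts are assumptions."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3, padding=1)   # countermeasure network first convolution layer
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)  # second convolution layer
        self.conv3 = nn.Conv2d(32, 32, 3, padding=1)  # third convolution layer
        self.gelu = nn.GELU()
        self.pool = nn.AdaptiveAvgPool2d(1)           # pooling layer
        self.fc = nn.Linear(32, 1)

    def forward(self, x):
        x = self.gelu(self.conv1(x))
        x = self.gelu(self.conv2(x))
        x = self.gelu(self.conv3(x))
        x = self.pool(x).flatten(1)
        # Probability that the input is a chaotic sequence rather than a ciphertext image
        return torch.sigmoid(self.fc(x))

d = Discriminator()
score = d(torch.rand(2, 1, 32, 32))  # batch of two 32x32 single-channel inputs
```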
Optionally, the input module 530 is further configured to:
S31, optimizing the network parameters of the stacked sparse self-encoder to obtain the optimal weights and thresholds of the stacked sparse self-encoder network.
S32, constructing the deep learning network encryption model based on the stacked sparse self-encoder and GAN from the optimal weights and thresholds.
S33, training the deep learning network encryption model based on the stacked sparse self-encoder and GAN by using a gradient descent algorithm.
Optionally, the input module 530 is further configured to:
the generation network and the countermeasure network of the deep learning network encryption model based on the stack sparse self-encoder and the GAN are trained by using a gradient descent algorithm.
Wherein training the countermeasure network comprises:
the antagonism network is trained using a cross entropy loss function.
Training the generation network includes:
Inputting the ciphertext image generated by the generation network into the countermeasure network to obtain the output of the countermeasure network; comparing the output of the countermeasure network with the label of the chaotic sequence to calculate an error; and back-propagating the calculated error with the AdamW gradient descent algorithm to iteratively correct the network parameters of the generation network.
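The adversarial training procedure above can be sketched as one update step. Cross-entropy (BCE) loss and the AdamW optimizer come from the text; the tiny stand-in networks, learning rate, batch size, and the label convention (chaotic sequence = 1, ciphertext = 0) are assumptions for the example.

```python
import torch
import torch.nn as nn

# Stand-in generation and countermeasure networks (MLPs, purely illustrative).
G = nn.Sequential(nn.Linear(64, 64), nn.GELU(), nn.Linear(64, 64), nn.Sigmoid())
D = nn.Sequential(nn.Linear(64, 32), nn.GELU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.AdamW(G.parameters(), lr=1e-3)
opt_d = torch.optim.AdamW(D.parameters(), lr=1e-3)
bce = nn.BCELoss()  # cross entropy loss function

chaotic = torch.rand(8, 64)  # stand-in batch of chaotic sequences

# Countermeasure-network step: chaotic sequences labelled 1, ciphertexts labelled 0.
cipher = G(chaotic).detach()  # detach so only D is updated here
d_loss = bce(D(chaotic), torch.ones(8, 1)) + bce(D(cipher), torch.zeros(8, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generation-network step: the ciphertext should receive the chaotic-sequence
# label, so the error is back-propagated through D into G and only G is stepped.
g_loss = bce(D(G(chaotic)), torch.ones(8, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Repeating these two steps alternately is the usual GAN schedule; the patent's contribution is what the two networks consume and produce, not the schedule itself.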
Optionally, the output module 540 is further configured to:
Inputting the chaotic sequence into the deep learning network encryption model; the generation network of the deep learning network encryption model produces, through forward propagation, the ciphertext image of the original image to be encrypted.
Fig. 6 is a schematic structural diagram of an electronic device 600 according to an embodiment of the present invention. The electronic device 600 may vary considerably in configuration and performance, and may include one or more processors (central processing units, CPU) 601 and one or more memories 602. The memory 602 stores at least one instruction, and the at least one instruction is loaded and executed by the processor 601 to implement the following stacked sparse self-encoder and GAN chaotic sequence image encryption method:
s1, acquiring an original image to be encrypted.
S2, acquiring a chaotic sequence of the original image.
S3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse self-encoder and a generative adversarial network GAN.
S4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model.
In an exemplary embodiment, a computer-readable storage medium is also provided, such as a memory comprising instructions executable by a processor in a terminal to perform the above-described stacked sparse self-encoder and GAN chaotic sequence image encryption method. For example, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description covers only preferred embodiments of the invention and is not intended to limit the invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall be included within the scope of protection of the invention.

Claims (9)

1. A method for encrypting a chaotic sequence image of a stacked sparse self-encoder and GAN, the method comprising:
s1, acquiring an original image to be encrypted;
s2, acquiring a chaotic sequence of the original image;
s3, inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse self-encoder and a generative adversarial network GAN;
s4, obtaining a ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model;
the training process based on the stack sparse self-encoder and generating the deep learning network encryption model against the network GAN in S3 includes:
s31, optimizing network parameters of the stack sparse self-encoder to obtain optimal weight and threshold of the stack sparse self-encoder network;
s32: based on the optimal weight and the threshold, constructing a deep learning network encryption model based on a stack sparse self-encoder and a GAN;
s33: and training the deep learning network encryption model based on the stack sparse self-encoder and the GAN by using a gradient descent algorithm.
2. The method of claim 1, wherein the acquiring the chaotic sequence of the original image in S2 comprises:
s21, performing blocking processing on the original image to obtain an original image sub-block;
s22, generating a real number sequence for the original image sub-block based on the Logistic chaotic system;
s23, converting the real number sequence based on a threshold method to obtain a chaotic sequence of the original image.
3. The method of claim 1, wherein the deep learning network encryption model based on the stacked sparse self-encoder and GAN in S3 comprises a generation network and a countermeasure network;
the generation network is constructed based on a stacked sparse self-encoder and is used for generating a ciphertext image;
the countermeasure network is used for judging whether its input is a chaotic sequence or a ciphertext image.
4. A method according to claim 3, wherein the generation network comprises an encoder and a decoder;
the encoder comprises an encoder full-connection layer, a first convolution layer, a second convolution layer and a third convolution layer;
the encoder full-connection layer, the first convolution layer and the second convolution layer are respectively connected with a GELU type activation function;
the third convolution layer is connected with a Sigmoid type activation function;
the output of the encoder full-connection layer is connected with the output of the third convolution layer through a residual structure;
the decoder comprises a first deconvolution layer, a second deconvolution layer, a third deconvolution layer and a decoder full-connection layer;
the first deconvolution layer, the second deconvolution layer and the third deconvolution layer are respectively connected with a GELU type activation function;
the decoder full connection layer connects Sigmoid type activation functions.
5. The method of claim 3, wherein the countermeasure network comprises a countermeasure network first convolution layer, a countermeasure network second convolution layer, a countermeasure network third convolution layer, and a pooling layer;
the first convolution layer of the countermeasure network, the second convolution layer of the countermeasure network and the third convolution layer of the countermeasure network are respectively connected with a GELU type activation function.
6. The method of claim 1, wherein training the stack sparse self encoder and GAN based deep learning network encryption model using a gradient descent algorithm in S33 comprises:
training the generation network and the countermeasure network of the deep learning network encryption model based on the stacked sparse self-encoder and GAN by using a gradient descent algorithm;
wherein training the countermeasure network comprises:
training the countermeasure network using a cross entropy loss function;
training the generation network includes:
inputting the ciphertext image generated by the generation network into the countermeasure network to obtain the output of the countermeasure network; comparing the output of the countermeasure network with the label of the chaotic sequence to calculate an error; and back-propagating the calculated error with the AdamW gradient descent algorithm to iteratively correct the network parameters of the generation network.
7. The method according to claim 1, wherein the obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and a deep learning network encryption model in S4 comprises:
and inputting the chaotic sequence into the deep learning network encryption model, and generating a ciphertext image of the original image to be encrypted by a generation network of the deep learning network encryption model through forward propagation of gradients.
8. A chaotic sequence image encryption device of a stacked sparse self-encoder and GAN, the device comprising:
the acquisition module is used for acquiring an original image to be encrypted;
the chaotic sequence generation module is used for acquiring the chaotic sequence of the original image;
the input module is used for inputting the chaotic sequence into a trained deep learning network encryption model based on a stacked sparse self-encoder and a generative adversarial network GAN;
the output module is used for obtaining the ciphertext image of the original image to be encrypted according to the chaotic sequence and the deep learning network encryption model;
the training process based on the stack sparse self-encoder and generating the deep learning network encryption model against the network GAN comprises the following steps:
s31, optimizing network parameters of the stack sparse self-encoder to obtain optimal weight and threshold of the stack sparse self-encoder network;
s32: based on the optimal weight and the threshold, constructing a deep learning network encryption model based on a stack sparse self-encoder and a GAN;
s33: and training the deep learning network encryption model based on the stack sparse self-encoder and the GAN by using a gradient descent algorithm.
9. The apparatus of claim 8, wherein the deep learning network encryption model based on the stacked sparse self-encoder and GAN comprises a generation network and a countermeasure network;
the generation network is constructed based on a stacked sparse self-encoder and is used for generating a ciphertext image;
the countermeasure network is used for judging whether its input is a chaotic sequence or a ciphertext image.
CN202211252929.4A 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device Active CN115633129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211252929.4A CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211252929.4A CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Publications (2)

Publication Number Publication Date
CN115633129A CN115633129A (en) 2023-01-20
CN115633129B true CN115633129B (en) 2024-03-05

Family

ID=84903790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211252929.4A Active CN115633129B (en) 2022-10-13 2022-10-13 Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device

Country Status (1)

Country Link
CN (1) CN115633129B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101706946A (en) * 2009-11-26 2010-05-12 大连大学 Digital image encryption method based on DNA sequence and multi-chaotic mapping
CN111105338A (en) * 2019-12-20 2020-05-05 长春理工大学 Image encryption method based on GAN and chaotic system
CN111582312A (en) * 2020-04-09 2020-08-25 华南理工大学 Secure biological hash code generation method for resisting relation attack based on periodic function
CN114915398A (en) * 2022-04-29 2022-08-16 南京信息工程大学 Variational self-encoder-based chaotic encryption method, application, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111654368B (en) * 2020-06-03 2021-10-08 电子科技大学 Key generation method for generating countermeasure network based on deep learning


Also Published As

Publication number Publication date
CN115633129A (en) 2023-01-20

Similar Documents

Publication Publication Date Title
US11829882B2 (en) System and method for addressing overfitting in a neural network
Liu et al. Flexible data integrity checking with original data recovery in IoT-enabled maritime transportation systems
CN111625816A (en) Intrusion detection method and device
CN115659408B (en) Method, system and storage medium for sharing sensitive data of power system
CN112787971A (en) Construction method of side channel attack model, password attack equipment and computer storage medium
CN110071798A (en) A kind of equivalent key acquisition methods, device and computer readable storage medium
Costilla-Enriquez et al. Attack power system state estimation by implicitly learning the underlying models
CN111860790A (en) Method and system for improving precision of depth residual error pulse neural network to optimize image classification
CN114513337B (en) Privacy protection link prediction method and system based on mail data
CN112364388A (en) Sensor data authentication method and device based on block chain
CN116311451A (en) Multi-mode fusion human face living body detection model generation method and device and electronic equipment
CN116049691A (en) Model conversion method, device, electronic equipment and storage medium
CN109274504B (en) Multi-user big data storage sharing method and system based on cloud platform
CN114003961A (en) Deep neural network reasoning method with privacy protection
CN115633129B (en) Stacked sparse self-encoder and GAN chaotic sequence image encryption method and device
CN117408991A (en) Image anomaly detection method and device based on reconstruction resistance and storage medium
Anumasa et al. Delay differential neural networks
CN112329025A (en) Power terminal bypass safety analysis method and power terminal bypass safety analysis system
CN112199482A (en) Dialog generation method, device, equipment and readable storage medium
CN114218617B (en) Cross entropy loss function safety calculation method and system
CN118174846B (en) Fault injection method, device, equipment, medium and program for SM9 algorithm
CN117235665B (en) Self-adaptive privacy data synthesis method, device, computer equipment and storage medium
KR102153923B1 (en) Wireless communication control device performing signal transmission processing based on analog cooperative beamforming through communication nodes distributed in three-dimensional space and operating method thereof
CN117081722A (en) Method and device for constructing side channel attack model, and side channel attack method and device
Chen et al. Mp-badnet: a backdoor-attack detection and identification protocol among multi-participants in private deep neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant