WO2021047482A1 - Method and system for performing steganographic technique - Google Patents


Info

Publication number
WO2021047482A1
Authority
WO
WIPO (PCT)
Prior art keywords
steganographic
neural network
convolutional neural
network model
target
Application number
PCT/CN2020/113840
Other languages
French (fr)
Inventor
Yongliang Liu
Wuzhen SHI
Original Assignee
Alibaba Group Holding Limited
Application filed by Alibaba Group Holding Limited filed Critical Alibaba Group Holding Limited
Publication of WO2021047482A1 publication Critical patent/WO2021047482A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10Protecting distributed programs or content, e.g. vending or licensing of copyrighted material ; Digital rights management [DRM]
    • G06F21/106Enforcing content protection by specific content processing
    • G06F21/1063Personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10Protecting distributed programs or content, e.g. vending or licensing of copyrighted material ; Digital rights management [DRM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • H04N19/467Embedding additional information in the video signal during the compression process characterised by the embedded information being invisible, e.g. watermarking

Definitions

  • the present application relates to a method and a system for performing a steganographic technique.
  • a steganographic technique refers to a technique whereby information to be embedded (e.g., information to be transferred or digital copyright information) is written in a hidden and disguised manner into a carrier object to obtain a corresponding steganographic object including the embedded information.
  • the generative adversarial network-based steganographic techniques typically generate steganographic objects using a steganographic convolutional neural network model, determine whether an input object includes embedded information using a binary classification convolutional neural network model, and extract embedded information from a steganographic object using a steganographic information extraction convolutional neural network model.
  • the core concept of the generative adversarial network-based steganographic techniques is the use of an adversarial training strategy.
  • input and output objects of the steganographic convolutional neural network model serve as input objects for the binary classification convolutional neural network model, and the concealment qualities of steganographic objects generated by the steganographic convolutional neural network model are examined using the binary classification convolutional neural network model. The two convolutional neural network models then contest each other in adversarial games, during which the performance of both models gradually improves. Steganography is then performed using the stable-performing steganographic convolutional neural network model.
  • in this way, the generative adversarial network-based steganographic techniques can increase the concealment of embedded information in the resulting steganographic objects.
  • the generative adversarial network-based steganographic techniques do have some limitations.
  • the adversarial information that the generative adversarial network-based steganographic techniques use is insufficiently accurate.
  • the input data for the binary classification network that conducts adversarial training for the steganographic convolutional neural network model is the input and output data of the steganographic convolutional neural network model, i.e., the carrier object and the corresponding steganographic object.
  • the input data and the output data do not have much of a relationship to each other.
  • the structure of the steganographic convolutional neural network model in the above technique is relatively simple because a basic neural network is used. Embedded information in a steganographic object obtained using the steganographic convolutional neural network model can easily be detected and destroyed.
  • FIG. 1 is a diagram of an embodiment of an application scenario for performing a steganographic technique.
  • FIG. 2 is a diagram of a conventional generative adversarial network-based steganographic technique.
  • FIG. 3 is a flowchart of an embodiment of a process for performing a steganographic technique.
  • FIG. 4 is a diagram of an example of a cycle generative adversarial network.
  • FIG. 5 is a diagram of an example of a target steganographic convolutional neural network model.
  • FIG. 6A is a diagram of examples of original carrier objects.
  • FIG. 6B is a diagram of examples of target steganographic objects obtained by a conventional process for performing a steganographic technique.
  • FIG. 6C is a diagram of examples of target steganographic objects obtained by the embodiment of the process 300 for performing a steganographic technique of FIG. 3.
  • FIG. 7 is a flowchart of an embodiment of a process for detecting information.
  • FIG. 8 is a diagram of an embodiment of a device for performing a steganographic technique.
  • FIG. 9 is a diagram of an embodiment of a device for detecting information.
  • FIG. 10 is a functional diagram illustrating a programmed computer system for performing a steganographic technique in accordance with some embodiments.
  • the invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor.
  • these implementations, or any other form that the invention may take, may be referred to as techniques.
  • the order of the steps of disclosed processes may be altered within the scope of the invention.
  • a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task.
  • the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
  • FIG. 1 is a diagram of an embodiment of an application scenario for performing a steganographic technique.
  • a client 120 first establishes a connection with a server 110.
  • the connected client 120 sends a carrier object and to-be-embedded information to the server 110.
  • After receiving the carrier object and the to-be-embedded information, the server inputs the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object.
  • the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects.
  • the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • the server obtains the target steganographic object in which the to-be-embedded information is embedded using the carrier object and the target residual object, and sends the target steganographic object including the to-be-embedded information to the client.
  • the client receives the target steganographic object.
  • an image can represent the carrier object of FIG. 1.
  • the carrier object is an object other than an image, such as a video object or an audio object.
  • a carrier image is used as an example to explain the performing of the steganographic technique or process for concealing information.
  • the client can be a mobile terminal device such as a mobile phone or a tablet, or the client can be a common computer device.
  • the application scenario 100 for performing a steganographic technique can be applied solely to a client or solely to a server. For example, after acquiring the carrier object and the to-be-embedded information, the client performs processing directly through a corresponding application installed on the client, and then acquires the target steganographic object.
  • the server after acquiring the carrier object and the to-be-embedded information, directly stores the acquired target steganographic object locally or remotely without having to send the acquired target steganographic object to the client.
  • the above application scenario is merely an embodiment of the application scenario 100 for performing a steganographic technique.
  • the above embodiments of the application scenario 100 for performing a steganographic technique are described to make it easier to understand.
  • FIG. 2 is a diagram of a conventional generative adversarial network-based steganographic technique.
  • the generative adversarial network 200 includes: a steganographic convolutional neural network model 230 that generates a corresponding steganographic object 240 based on a carrier object 210 and to-be-embedded information 220; a binary classification convolutional neural network model 250 that uses the carrier object 210 and the steganographic object 240 as input data and determines whether the type of the input data includes embedded information; and an information extraction convolutional neural network model 260 that generates target steganographic information 280 based on the steganographic object 240.
  • the steganographic object 240 is generated by the steganographic convolutional neural network model 230, and the binary classification convolutional neural network model 250 identifies whether its input data includes embedded information.
  • the binary classification convolutional neural network model 250 outputs a discrimination result 270, thus increasing the robustness of the convolutional neural network model in the generative adversarial network 200 through a process of continual adversarial training.
  • the steganographic convolutional neural network model 230 uses the carrier object 210 and the to-be-embedded information 220 as input.
  • a first layer of the steganographic convolutional neural network model 230 is a fully connected layer, which processes the carrier object 210 and the to-be-embedded information 220.
  • the steganographic convolutional neural network model 230 reshapes the fully connected layer output to the shape 4 × 4 × I × 8.
  • I is the width of the carrier object.
  • This is followed by four processing modules, each comprising a deconvolution unit (DeConv), a batch normalization unit (BN), and a rectified linear unit (ReLU).
  • the final layer can be an activation layer.
  • the activation layer is a hyperbolic tangent (tanh) activation layer.
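  • As an illustration of the conventional generator just described, the following is a minimal PyTorch sketch; the carrier width I of 64, the message length of 100, and the deconvolution channel counts are assumptions not specified in the text:

```python
# Illustrative sketch (assumed dimensions) of the conventional steganographic
# generator: a fully connected layer whose output is reshaped to
# 4 x 4 x (I * 8), four DeConv + BN + ReLU modules, and a final tanh layer.
import torch
import torch.nn as nn

I = 64     # carrier width (assumption)
MSG = 100  # message length in bits (assumption)

def up(cin, cout):
    return nn.Sequential(
        nn.ConvTranspose2d(cin, cout, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(cout),
        nn.ReLU(),
    )

class ConventionalGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(I * I * 3 + MSG, 4 * 4 * I * 8)
        self.blocks = nn.Sequential(up(I * 8, 256), up(256, 128), up(128, 64), up(64, 3))

    def forward(self, carrier, msg):
        x = self.fc(torch.cat([carrier.flatten(1), msg], dim=1))
        x = x.view(-1, I * 8, 4, 4)        # reshape to 4 x 4 x (I * 8)
        return torch.tanh(self.blocks(x))  # steganographic object, same size as carrier

g = ConventionalGenerator()
stego = g(torch.rand(2, 3, I, I), torch.rand(2, MSG))  # -> shape (2, 3, 64, 64)
```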
  • the information extraction convolutional neural network model 260 can also be a convolutional neural network model, which includes four convolutional layers and one fully connected layer.
  • the final layer of the information extraction convolutional neural network model 260 is a tanh activation layer. All of the remaining layers use leaky rectified linear units (LReLU) as activation functions, and each layer includes a batch normalization module.
  • the network architecture of the binary classification convolutional neural network model 250 is similar to the network architecture of the steganographic convolutional neural network model 230. The only difference between the two models is that the activation function used by the final activation layer of the binary classification convolutional neural network model 250 is a sigmoid activation function.
  • the input data, i.e., the adversarial information, of the binary classification convolutional neural network model 250 used to conduct adversarial training in conventional generative adversarial network-based steganographic techniques is insufficiently accurate.
  • Accurate adversarial information should be another steganographic object corresponding to the carrier object 210 and the to-be-embedded information 220.
  • the structure of the convolutional neural network models in the above steganographic technique is relatively simple, and the steganographic objects 240 obtained thereby have relatively poor concealment qualities and are easily destroyed.
  • FIG. 3 is a flowchart of an embodiment of a process for performing a steganographic technique.
  • the process 300 is implemented by the server 110 of FIG. 1 and comprises:
  • the server acquires a carrier object and to-be-embedded information.
  • the carrier object is a carrier for carrying embedded information.
  • the carrier object is an image, audio, video, or other such object.
  • An example of the carrier image is used for descriptive convenience in process 300.
  • the to-be-embedded information can refer to information that is to be transferred covertly or information used to provide digital copyright protection.
  • the to-be-embedded information is a piece of text information.
  • the to-be-embedded information is image information, audio information, or video information, such as a company logo, a contract document scan, an audio or video file, or other such information.
  • the acquiring of the carrier object and the to-be-embedded information is performed by a computing device, such as a server computing device or a client computing device. The computing device receives the carrier object and the to-be-embedded information from a user or another computing device, or receives a data request message requesting a target steganographic object and acquires the carrier object and the to-be-embedded information from the data request message.
  • the server inputs the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object.
  • the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects.
  • the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • the target residual object can refer to an object formed from the residual between the carrier object and the anticipated target steganographic object.
  • the target residual object is the object obtained after inputting the carrier object and the to-be-embedded information into the target steganographic convolutional neural network model.
  • the residual can refer to a difference between an actual observed value and an estimated value.
  • the estimated value is also called a fitted value, and the observed value is also called an expected value.
  • the residual is the difference between the expected data and the output data obtained after input data has been input into a corresponding convolutional neural network model and has undergone processing by the convolutional neural network model.
  • the residual can also be regarded as the observed value of error.
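  • In notation assumed here for illustration, with carrier object C and anticipated target steganographic object C′, the residual relation described above can be written as:

```latex
\[ R = C' - C, \qquad C' = C + R \]
```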
  • process 300 can include: introducing a reverse steganographic convolutional neural network model; obtaining, through training, a target steganographic convolutional neural network model for steganographic processing using a cycle generative adversarial network including the reverse steganographic convolutional neural network model; and performing steganographic processing using the target steganographic convolutional neural network model.
  • the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on the steganographic objects obtained from the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • FIG. 4 is a diagram of an example of a cycle generative adversarial network.
  • the obtaining of the target steganographic convolutional neural network model can include: acquiring an original steganographic convolutional neural network model; adversarially training the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge, and regarding the converged original steganographic convolutional neural network model as the target steganographic convolutional neural network model.
  • the adversarially training of the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge includes: acquiring a first discriminative convolutional neural network model and a second discriminative convolutional network model, determining, using the first discriminative convolutional neural network model, whether a received object includes embedded information and determining, using the second discriminative convolutional neural network model, whether a received object is an original carrier object or the reverse carrier object; then adversarially training the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge.
  • the adversarially training of the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge includes: acquiring an original training carrier object and acquiring original training to-be-embedded information; inputting the original training carrier object and the original training to-be-embedded information into the original steganographic convolutional neural network model to obtain an original training steganographic object; inputting the original training steganographic object into the reverse steganographic convolutional neural network model to obtain a reverse training carrier object; training the first discriminative convolutional neural network model using the original training steganographic object and the original training carrier object; training the second discriminative convolutional neural network model using the original training carrier object and the reverse training carrier object; and, during the training of the first and second discriminative convolutional neural network models, adjusting, using the loss functions of the convolutional neural network models, the parameters of each convolutional neural network model to cause the original steganographic convolutional neural network model to converge.
  • the original training carrier object and the original training to-be-embedded information can be the original training sample data for training each convolutional neural network model in the cycle generative adversarial network.
  • the original training steganographic object can correspond to the original training carrier object in which the original training to-be-embedded information is embedded.
  • the reverse training carrier object can correspond to the object obtained after eliminating the original training to-be-embedded information from the original training steganographic object.
  • the training of the first discriminative convolutional neural network model using the original training steganographic object and the original training carrier object refers to regarding the original training steganographic object or the original training carrier object as input data for the first discriminative convolutional neural network model to obtain a corresponding first discrimination result.
  • the first discrimination result indicates whether the input data for the first discriminative convolutional neural network model includes embedded information.
  • the training of the second discriminative convolutional neural network model using the original training carrier object and the reverse training carrier object refers to regarding the original training carrier object or the reverse training carrier object as input data for the second discriminative convolutional neural network model to obtain a corresponding second discrimination result.
  • the second discrimination result indicates whether the input data for the second discriminative convolutional neural network model is the original training carrier object in which the information was never embedded or the reverse training carrier object from which the embedded information was eliminated.
  • process 300 includes the following: a cycle generative adversarial network is designed, and, after original training sample data is obtained in the cycle generative adversarial network, various convolutional neural network models having different functions are used to generate training data for each other. The various convolutional neural network models then contest with each other in adversarial games. Moreover, in the adversarial game process, the parameters of each convolutional neural network model are adjusted using their respective loss functions to cause each convolutional neural network model in the cycle generative adversarial network to converge.
  • an image is used as an example of a carrier object.
  • original training carrier object o_img1 and original to-be-embedded information o_msg1 are used.
  • the original training carrier object o_img1 and the original to-be-embedded information o_msg1 can be input into an original steganographic convolutional neural network model to obtain an original training steganographic object o_s_img1.
  • the original training steganographic object o_s_img1 is input into a reverse steganographic convolutional neural network model to obtain a reverse training carrier object i_img1.
  • the original training carrier object o_img1 or the original training steganographic object o_s_img1 is input into a first discriminative convolutional neural network model to obtain a first discrimination result est_result1.
  • the original training carrier object o_img1 or the reverse training carrier object i_img1 is input into a second discriminative convolutional neural network model to obtain a second discrimination result est_result2.
  • the original training steganographic object o_s_img1 can also be input into a steganographic information extraction convolutional neural network model to extract the target embedded information from the original training steganographic object o_s_img1.
  • the loss function of each convolutional neural network model is used to adjust its respective parameters until all of the convolutional neural network models achieve convergence. Each converged convolutional neural network model is then used in subsequent steganographic processing.
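  • The following is a minimal, self-contained PyTorch sketch of one round of the adversarial training just described. The tiny stand-in networks, tensor shapes, loss terms, and optimizer settings are illustrative assumptions, not the patent's architectures:

```python
# Minimal self-contained sketch of one adversarial training round for the
# cycle generative adversarial network described above. The tiny stand-in
# networks, shapes, loss terms, and optimizers are assumptions; the patent's
# models are the convolutional architectures described elsewhere in the text.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):
    """Stand-in for S, I, H, D1, or D2 (the real models are CNNs)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
    def forward(self, *xs):
        return torch.sigmoid(self.fc(torch.cat(xs, dim=-1)))

B, IMG, MSG = 8, 64, 16                    # batch size, carrier dim, message bits
S   = TinyNet(IMG + MSG, IMG)              # original steganographic model
Inv = TinyNet(IMG, IMG)                    # reverse steganographic model
H   = TinyNet(IMG, MSG)                    # steganographic information extraction model
D1  = TinyNet(IMG, 1)                      # first discriminator: embedded info present?
D2  = TinyNet(IMG, 1)                      # second discriminator: original or reverse carrier?

opt_g = torch.optim.Adam([*S.parameters(), *Inv.parameters(), *H.parameters()], lr=1e-4)
opt_d = torch.optim.Adam([*D1.parameters(), *D2.parameters()], lr=1e-4)

C = torch.rand(B, IMG)                     # original training carrier object (o_img1)
M = torch.randint(0, 2, (B, MSG)).float()  # original to-be-embedded information (o_msg1)

for step in range(200):
    Cp  = S(C, M)                          # original training steganographic object (o_s_img1)
    Cpp = Inv(Cp)                          # reverse training carrier object (i_img1)

    # Train D1 and D2 (label 1 = original carrier, 0 = generated object).
    d_loss = (F.binary_cross_entropy(D1(C), torch.ones(B, 1))
              + F.binary_cross_entropy(D1(Cp.detach()), torch.zeros(B, 1))
              + F.binary_cross_entropy(D2(C), torch.ones(B, 1))
              + F.binary_cross_entropy(D2(Cpp.detach()), torch.zeros(B, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train S, I, and H: rebuild the message and the carrier, and produce
    # objects that both discriminators classify as original carriers.
    Cp  = S(C, M)
    Cpp = Inv(Cp)
    Mp  = H(Cp)
    g_loss = (F.mse_loss(Mp, M)
              + F.mse_loss(Cpp, C)
              + F.binary_cross_entropy(D1(Cp), torch.ones(B, 1))
              + F.binary_cross_entropy(D2(Cpp), torch.ones(B, 1)))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```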
  • the original steganographic convolutional neural network model in the cycle generative adversarial network can be a residual learning convolutional neural network model.
  • the residual learning convolutional neural network model can correspond to a deep neural network model that includes at least one residual block.
  • Output data of the residual learning convolutional neural network model can be the residual of input data and real expected data. Adding the input data of the residual learning convolutional neural network model to the residual that the residual learning convolutional neural network model outputs provides the data closest to the expected data.
  • the residual learning convolutional neural network model can learn to form corresponding residual functions, and the residual functions can be more easily optimized than traditional model functions which have no relationship to the input data.
  • the model network layers can then be deepened to form a network model that can attain better performance, e.g., better concealment.
  • a residual block typically refers to a combination of multiple convolutional layers including shortcut connections.
  • a shortcut connection can also be called a short-circuit connection.
  • the shortcut connection can evolve from shortcut connections in recurrent neural networks (RNN) and gating algorithms and can relate to a technique for alleviating the vanishing gradient problem in deep architectures.
  • the vanishing gradient problem occurs when the network parameters and hyperparameters are not properly set.
  • the residual learning convolutional neural network model can correspond to a convolutional neural network model that includes at least one residual block and at least one fully connected layer.
  • the residual block can include a convolution module (Conv), a batch normalization module (BN), and a leaky rectified linear unit (LReLU).
  • the at least one fully connected layer can include the same number of channels as the carrier object.
  • the size of the convolution kernel in the at least one residual block and the at least one fully connected layer can be 5 × 5, with a stride of 1 and padding of 2.
  • the number of channels of each layer in the residual block can be 128.
  • In some embodiments, each of the relevant numerical values above can be set to other values as needed.
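  • A minimal PyTorch sketch of such a residual block follows; the use of two convolutional layers per block and the exact wiring are assumptions, since the text does not specify the block depth:

```python
# Residual block sketch: Conv + BN + LReLU with a shortcut connection,
# 5 x 5 kernels, stride 1, padding 2, and 128 channels per layer.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels=128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=5, stride=1, padding=2),
            nn.BatchNorm2d(channels),
            nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, kernel_size=5, stride=1, padding=2),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.LeakyReLU(0.2)

    def forward(self, x):
        return self.act(self.body(x) + x)  # shortcut (short-circuit) connection
```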
  • the reverse steganographic convolutional neural network model can include an architecture similar to that of the original steganographic network model.
  • the reverse steganographic convolutional neural network model can correspond to a lightweight, end-to-end network model.
  • the steganographic information extraction convolutional neural network model is used for end-to-end extraction of embedded information from steganographic objects.
  • the steganographic information extraction convolutional neural network model can include four structurally-identical convolutional layers (Conv + BN + LReLU) , one fully connected layer, and one tanh activation layer.
  • the four structurally-identical convolutional layers have 32, 64, 128, and 256 channels, respectively.
  • the fully connected layer can be used to generate one piece of vector data having the same shape as the original steganographic information.
  • the tanh activation layer converts the real numbers in the vector data to the [-1, 1] range.
  • data output by the steganographic information extraction network model is then binarized (with 0 regarded as a positive number) to obtain the rebuilt target steganographic information corresponding to the original steganographic information.
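  • A short sketch of the extraction head just described (the function name is hypothetical):

```python
# tanh maps the fully connected output into [-1, 1]; binarization with 0
# regarded as a positive number rebuilds the target steganographic information.
import torch

def rebuild_message(fc_output: torch.Tensor) -> torch.Tensor:
    v = torch.tanh(fc_output)  # real numbers converted to the [-1, 1] range
    return (v >= 0).float()    # bits in {0, 1}
```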
  • the first discriminative convolutional neural network model and the second discriminative convolutional neural network model have the same architecture, which is composed of four structurally-identical convolutional layers (Conv + BN +LReLU) , one fully connected layer, and one sigmoid activation layer.
  • the four structurally-identical convolutional layers have 32, 64, 128, and 256 channels, respectively.
  • the loss functions of the two network models are sigmoid cross-entropy
  • the fully connected layer output of the two network models is one numerical value.
  • the sigmoid activation layer can convert the output from the fully connected layer to the [0, 1] interval, which expresses the probability that the input data of the two network models is the original carrier object.
  • sigmoid cross-entropy is one way to measure the difference between the predicted values and the actual values of neural networks.
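  • For reference, the standard sigmoid cross-entropy between a predicted logit x and a label y ∈ {0, 1} can be written as follows (the patent does not spell out the expression):

```latex
\[
  \mathcal{L}(x, y) = -\bigl[\, y \log \sigma(x) + (1 - y) \log\bigl(1 - \sigma(x)\bigr) \bigr],
  \qquad \sigma(x) = \frac{1}{1 + e^{-x}}
\]
```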
  • the following is a description of how the loss function of each convolutional neural network model is used to adjust the parameters of each convolutional neural network model in the process of training the convolutional neural network models in a cycle generative adversarial network and how the original steganographic convolutional neural network model is caused to converge.
  • S represents the original steganographic convolutional neural network model
  • I represents the reverse steganographic convolutional neural network model
  • H represents the steganographic information extraction convolutional neural network model
  • D1 represents the first discriminative convolutional neural network model
  • D2 represents the second discriminative convolutional neural network model
  • q_s, q_i, q_h, q_d1, and q_d2 represent the parameter information of the original steganographic convolutional neural network model S, the reverse steganographic convolutional neural network model I, the steganographic information extraction convolutional neural network model H, the first discriminative convolutional neural network model D1, and the second discriminative convolutional neural network model D2, respectively.
  • the input/output relationships between the various models in the cycle generative adversarial network can be described by the formulas below:
  • expression 1 expresses the input/output relationship of the original steganographic convolutional neural network model: C′ = S(C, M)
  • expression 2 expresses the input/output relationship of the reverse steganographic convolutional neural network model: C″ = I(C′)
  • expression 3 expresses the input/output relationship of the first discriminative convolutional neural network model: D1(X), where the input X is the original training carrier object C or the original training steganographic object C′
  • expression 4 expresses the input/output relationship of the second discriminative convolutional neural network model: D2(X), where the input X is the original training carrier object C or the reverse training carrier object C″
  • C, C′, C″, and M represent the original training carrier object, the original training steganographic object, the reverse training carrier object, and the original training steganographic information, respectively.
  • S(·, ·), I(·), H(·), D1(·), and D2(·) correspond to the output data of convolutional neural network models S, I, H, D1, and D2, respectively.
  • the loss function of the steganographic information extraction network model can be defined as the Euclidean distance between the original steganographic information M and the target steganographic information M’.
  • the loss function can correspond to a function that maps the values of random events or of random variables relating thereto to non-negative real numbers to express the “risk” or “loss” of the random event.
  • loss functions are typically used to intuitively express the difference between a convolutional neural network model's original data and the target data obtained by processing.
  • the loss function of the steganographic information extraction convolutional neural network model can be described as this Euclidean distance between M and M′ (a conventional formulation is sketched after these loss-function definitions).
  • the loss function of the first discriminative convolutional neural network model can be defined as sigmoid cross-entropy. In some embodiments, the loss function of the first discriminative convolutional neural network model is a sigmoid cross-entropy computed over the original training carrier object C and the original training steganographic object C′.
  • the loss function of the second discriminative convolutional neural network model can be defined as sigmoid cross-entropy. In some embodiments, the loss function of the second discriminative convolutional neural network model is a sigmoid cross-entropy computed over the original training carrier object C and the reverse training carrier object C″.
  • the purpose of setting up the reverse steganographic convolutional neural network model is to recover the original training carrier object C from the original training steganographic object C′ and to enable the recovered object to also deceive the second discriminative convolutional neural network model.
  • the loss function of the reverse steganographic convolutional neural network model can therefore include both a term for the quality of the reverse training carrier object C″ and a term for the adversarial loss incurred by the reverse training carrier object C″ against the second discriminative convolutional neural network model.
  • the loss function of the reverse steganographic convolutional neural network model can be described as a weighted combination of these two terms, wherein a weight l balances the two terms.
  • Output from the original steganographic convolutional neural network model S can be input data for the reverse steganographic convolutional neural network model I, the steganographic information extraction convolutional neural network model H, and the first discriminative convolutional neural network model D1. Therefore, the loss function of the original steganographic convolutional neural network model can be described as a weighted combination of the corresponding terms (see the sketch below).
  • l_s, l_h, l_i, and l_d1 can be used to define the weights between the various terms.
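  • The patent's exact loss expressions are not reproduced in this text. The following is one conventional formulation consistent with the definitions above; the adversarial terms are standard generative-adversarial choices and, like the symbols L_H, L_D1, L_D2, L_I, and L_S, are assumptions:

```latex
\begin{align*}
  L_H     &= \lVert H(C') - M \rVert_2
            && \text{extraction loss (Euclidean distance)} \\
  L_{D_1} &= -\log D_1(C) - \log\bigl(1 - D_1(C')\bigr)
            && \text{sigmoid cross-entropy for } D_1 \\
  L_{D_2} &= -\log D_2(C) - \log\bigl(1 - D_2(C'')\bigr)
            && \text{sigmoid cross-entropy for } D_2 \\
  L_I     &= \lVert C'' - C \rVert_2 - l \log D_2(C'')
            && \text{reverse-model quality plus adversarial term} \\
  L_S     &= l_s \lVert C' - C \rVert_2 + l_h L_H + l_i L_I - l_{d1} \log D_1(C')
            && \text{weighted combination for the original model}
\end{align*}
```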
  • each convolutional neural network model in the cycle generative adversarial network can be caused to converge and the converged original steganographic convolutional neural network model can be regarded as the target steganographic convolutional neural network model used for steganographic processing.
  • the converged target steganographic convolutional neural network model has been obtained using the cycle generative adversarial network.
  • the target steganographic convolutional neural network model can be used to subject carrier objects and to-be-embedded information to steganographic processing.
  • the process 300 of FIG. 3 uses the target steganographic convolutional neural network model obtained through the above operations to process carrier objects and to-be-embedded information to acquire a target residual object corresponding to the carrier object and to-be-embedded information.
  • Process 300 then obtains a target steganographic object based on the target residual object and the carrier object.
  • the obtaining of the target steganographic object is included in subsequent processing described below.
  • the description here is limited to how the target steganographic convolutional neural network model is used to obtain a target residual object corresponding to the carrier object and the to-be-embedded information.
  • the target steganographic convolutional neural network model structure and its processing are described below with reference to FIG. 5.
  • FIG. 5 is a diagram of an example of a target steganographic convolutional neural network model.
  • the target steganographic convolutional neural network model can be a residual learning convolutional neural network model, which includes at least one residual block and one fully connected layer.
  • the residual learning convolutional neural network model has already been described above.
  • the inputting of the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object can include: inputting the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object.
  • the inputting of the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object includes: subjecting the carrier object and the to-be-embedded information to fusion processing to obtain object fused information; and inputting the object fused information into the residual learning convolutional neural network model to obtain the target residual object.
  • the object fused information refers to a type of data content information that is formed by fusing together different data objects using intrinsic features of the data and that simultaneously includes the different data objects.
  • the different data objects refer to objects expressed in binary form.
  • the different data objects refer to image, audio, video, text and other such objects.
  • the object fused information refers to the data content information that is formed by fusing the carrier object with the to-be-embedded information and that includes both the carrier object and the to-be-embedded information.
  • the subjecting of the carrier object and the to-be-embedded information to fusion processing to obtain the object fused information includes: subjecting the to-be-embedded information to fusion preprocessing to obtain preprocessed to-be-embedded information; and obtaining the object fused information based on the preprocessed to-be-embedded information and the carrier object.
  • spatial dimensions of the preprocessed to-be-embedded information are the same as the spatial dimensions of the carrier object.
  • the preprocessed to-be-embedded information refers to the to-be-embedded information after it has undergone preprocessing that makes its shape or spatial dimensions the same as the shape or spatial dimensions of the carrier object; this preprocessing occurs before the carrier object and the to-be-embedded information are subjected to fusion processing.
  • the subjecting of the to-be-embedded information to fusion preprocessing to obtain the preprocessed to-be-embedded information includes: subjecting the to-be-embedded information to replication processing based on the spatial dimension information of the carrier object to obtain preprocessed to-be-embedded information having the same spatial dimensions as the carrier object.
  • the subjecting of the to-be-embedded information to replication processing based on the spatial dimension information of the carrier object includes: replicating the binary series corresponding to the to-be-embedded information to obtain preprocessed to-be-embedded information having the same spatial dimensions as the carrier object.
  • the specific number of replications is based on spatial dimension information of the carrier object.
  • the obtaining of the object fused information based on the preprocessed to-be-embedded information and the carrier object includes: subjecting the preprocessed to-be-embedded information and the carrier object to additive processing to obtain the object fused information.
  • the subjecting of the preprocessed to-be-embedded information and the carrier object to additive processing includes: subjecting binary series data corresponding to the preprocessed to-be-embedded information and carrier object matrix data to a type of additive processing according to their spatial positions.
  • the carrier image is img1 having a shape of h × w × c (as an example, h is height information, w is width information, and c is the number of channels; h, w, and c are all greater than 0).
  • the to-be-embedded information is msg1 having a corresponding binary shape of 1 × l (as an example, l is greater than 0).
  • the process whereby the target steganographic convolutional neural network model acquires the target residual object res_img1 corresponding to the carrier image img1 and the to-be-embedded information msg1 is as follows: first, subjecting the to-be-embedded information msg1 to fusion preprocessing to acquire the preprocessed to-be-embedded information msg1-1, i.e., replicating the binary message space corresponding to the to-be-embedded information msg1 h × w times, based on the spatial dimensions h × w of the carrier object img1, to acquire the preprocessed to-be-embedded information msg1-1 having the same spatial dimensions as the carrier object img1, i.e., a shape of h × w × l; next, connecting and fusing the carrier object img1 and the preprocessed to-be-embedded information msg1-1 to obtain the object fused information; and then inputting the object fused information into the target steganographic convolutional neural network model to obtain the target residual object res_img1 (see the sketch after this example).
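  • A NumPy sketch of the fusion preprocessing just described; the concrete dimensions are assumptions for illustration:

```python
# The binary message msg1 (shape 1 x l) is replicated h x w times into
# msg1_1 (shape h x w x l) and then connected with the carrier image img1
# (shape h x w x c) along the channel axis.
import numpy as np

h, w, c, l = 32, 32, 3, 16
img1 = np.random.rand(h, w, c)               # carrier image
msg1 = np.random.randint(0, 2, size=(1, l))  # to-be-embedded information

msg1_1 = np.tile(msg1.reshape(1, 1, l), (h, w, 1))  # preprocessed info, h x w x l
fused = np.concatenate([img1, msg1_1], axis=-1)     # object fused information, h x w x (c + l)
```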
  • process 300 of FIG. 3 can also be performed on a carrier object other than an image, e.g., on an audio, video, or other such object, to acquire a target residual object corresponding to the audio, video, or other such object.
  • the inputting of the object fused information into the residual learning convolutional neural network model to obtain the target residual object includes: processing the object fused information to obtain a to-be-processed residual object using a residual block in the target steganographic convolutional neural network model; and processing the to-be-processed residual object to obtain the target residual object using a fully connected layer in the target steganographic convolutional neural network model.
  • the server obtains a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object.
  • the above operations introduced how a reverse steganographic convolutional neural network model and an original steganographic convolutional neural network model, both of which are in a cycle generative adversarial network, are adversarially trained to acquire a target steganographic convolutional neural network model for steganographic processing, and how the target steganographic convolutional neural network model is used to process the carrier object and to-be-embedded information obtained in operation 310 to obtain a target residual object corresponding to the carrier object and the to-be-embedded information.
  • a target steganographic object in which the to-be-embedded information is embedded is obtained based on the carrier object and the target residual object.
  • the obtaining of the target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object includes: subjecting the carrier object and the target residual object to additive processing to obtain the target steganographic object.
  • the carrier object and the target residual object res_img1 undergo additive processing to obtain the target steganographic object des_img1 in which the to-be-embedded information msg1 is embedded.
  • the subjecting of the carrier object and the target residual object to additive processing includes adding the matrix data corresponding to the two objects.
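  • The additive step above, sketched with NumPy; the stand-in residual values are assumptions:

```python
# The target steganographic object is the element-wise sum of the carrier
# object's matrix data and the target residual object's matrix data.
import numpy as np

h, w, c = 32, 32, 3
img1 = np.random.rand(h, w, c)             # carrier object
res_img1 = 0.01 * np.random.rand(h, w, c)  # target residual object (stand-in)
des_img1 = img1 + res_img1                 # target steganographic object
```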
  • the to-be-embedded information can undergo encryption processing prior to being embedded in the carrier object.
  • the encryption processing can further increase security during information transfer. Therefore, the inputting of the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object includes: subjecting the to-be-embedded information to encryption processing to obtain encrypted to-be-embedded information; and inputting the carrier object and the encrypted to-be-embedded information into the target steganographic convolutional neural network model to obtain the target residual object.
  • the subjecting of the to-be-embedded information to encryption processing can include encrypting the to-be-embedded information using an encryption algorithm corresponding to the file format of the to-be-embedded information.
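  • The patent does not name a specific encryption algorithm. As one illustrative choice for text-format to-be-embedded information, the following sketch uses Fernet symmetric encryption from the third-party "cryptography" package:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # key shared in advance with the detecting side
cipher = Fernet(key)
encrypted_msg = cipher.encrypt(b"to-be-embedded information")
# ... encrypted_msg is then embedded via the target steganographic model ...
decrypted = Fernet(key).decrypt(encrypted_msg)  # decryption at detection time
```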
  • a target steganographic object that corresponds to the carrier object and the to-be-embedded information, and in which the to-be-embedded information has been embedded, can be obtained.
  • Table 1 presents the quantitative comparison results for the steganographic technique provided by process of FIG. 3 against a conventional generative adversarial network steganographic technique.
  • the Table 1 data correspond to comparison results obtained from test data for a carrier object in an image format.
  • Pixel depth refers to the number of bits used to store each pixel in an image. Pixel depth can be used to measure image resolution.
  • the decoding success rate refers to the decoding success rate of the two processes vis-à-vis untrained sample data.
  • the detection rate refers to the recognition rate of discriminators in the two processes vis-à-vis steganographic objects.
  • PSNR refers to peak signal-to-noise ratio, a standard measure of image quality.
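  • The standard definition of PSNR, stated here for reference (not quoted from the patent), for images x and y of size h × w with maximum pixel value MAX:

```latex
\[
  \mathrm{PSNR} = 10 \log_{10} \frac{\mathrm{MAX}^2}{\mathrm{MSE}},
  \qquad
  \mathrm{MSE} = \frac{1}{hw} \sum_{i=1}^{h} \sum_{j=1}^{w} \bigl(x_{ij} - y_{ij}\bigr)^2
\]
```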
  • FIG. 6A is a diagram of examples of original carrier objects.
  • FIG. 6B is a diagram of examples of target steganographic objects obtained by a conventional process for performing a steganographic technique.
  • FIG. 6C is a diagram of examples of target steganographic objects obtained by the embodiment of the process for performing a steganographic technique of FIG. 3. A comparison of FIG. 6B and FIG. 6C shows that the target steganographic objects obtained by the process 300 for performing a steganographic technique of FIG. 3 have better picture quality and image visual effects.
  • the structure of the target steganographic convolutional neural network model used in the process 300 of FIG. 3, and the steganographic processing performed with it, are more complex than the structure of conventional steganographic convolutional neural network models and more hidden than the steganographic processing performed using the conventional steganographic convolutional neural network models.
  • the embedded information in the target steganographic object that a target steganographic convolutional neural network model ultimately obtains has better concealment qualities and is not easily destroyed.
  • the steganographic process inputs the acquired carrier object and to-be-embedded information into the target steganographic convolutional neural network model, acquires a target residual object, and then obtains a target steganographic object based on the carrier object and the target residual object.
  • the present application obtains the target steganographic convolutional neural network model by introducing a reverse steganographic convolutional neural network model and regarding reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects for training.
  • the target steganographic convolutional neural network model first acquires a target residual object corresponding to the carrier object and the to-be-embedded information, and then obtains the final target steganographic object based on the carrier object and the target residual object.
  • the security of embedded information in target steganographic objects can thus be greatly increased.
  • a steganographic process is provided above.
  • a process for detecting information corresponding to the above steganographic process is described. Some of the operations therein were already described in the process for performing a steganographic technique.
  • FIG. 7 is a flowchart of an embodiment of a process for detecting information.
  • the process 700 is implemented by the server 110 of FIG. 1 and comprises:
  • the server acquires an object under scrutiny in which information was steganographically embedded.
  • the server inputs the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain target embedded information embedded in the object under scrutiny.
  • the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model.
  • the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects.
  • the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • After obtaining, through operation 720, the target embedded information in the object under scrutiny, process 700 further comprises: subjecting the target embedded information to decryption processing to obtain decrypted target embedded information.
  • the subjecting of the target embedded information to decryption processing includes subjecting the target embedded information to decryption processing using a decryption algorithm corresponding to the encryption algorithm whereby the target embedded information was encrypted.
  • process 700 further comprises: performing verification processing for the target object based on the target embedded information.
  • the target object is an entity object awaiting permission verification.
  • the entity object awaiting permission verification could be a user awaiting verification, a vehicle awaiting verification, or another such object.
  • process 700 is applied to different contexts, such as code verification, permission checking, and copyright protection.
  • the application scenario is a vehicle gate management system
  • a pass in which permission information has been embedded using the process 300 of FIG. 3 can be issued in advance.
  • the vehicle gate management system acquires an image of the pass carried by the vehicle through a connected image sensor such as a camera.
  • the vehicle gate management system acquires the permission information steganographically embedded in the image through process 700 of FIG. 7 and performs a corresponding permission check for the vehicle.
  • If the verification succeeds, the vehicle is permitted to pass.
  • process 700 can be used in other manners based on different application scenarios.
  • FIG. 8 is a diagram of an embodiment of a device for performing a steganographic technique.
  • the device 800 is configured to perform the process 300 of FIG. 3 and comprises: an information acquiring unit 810, a target residual object acquiring unit 820, and a target steganographic object acquiring unit 830.
  • the information acquiring unit 810 is configured to acquire a carrier object and to-be-embedded information.
  • the target residual object acquiring unit 820 is configured to input the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to acquire a target residual object.
  • the target steganographic convolutional neural network model corresponds to a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects.
  • the reverse steganographic convolutional neural network model is configured to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • the target steganographic object acquiring unit 830 is configured to obtain a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object.
  • the units described above can be implemented as software components executing on one or more general purpose processors, as hardware such as programmable logic devices and/or Application Specific Integrated Circuits designed to perform certain functions or a combination thereof.
  • the units can be embodied by a form of software products which can be stored in a nonvolatile storage medium (such as optical disk, flash storage device, mobile hard disk, etc. ) , including a number of instructions for making a computer device (such as personal computers, servers, network equipment, etc. ) implement the methods described in the embodiments of the present invention.
  • the units may be implemented on a single device or distributed across multiple devices. The functions of the units may be merged into one another or further split into multiple sub-units.
  • FIG. 9 is a diagram of an embodiment of a device for detecting information.
  • the device 900 is configured to perform the process 700 of FIG. 7 and comprises: an object under scrutiny acquiring unit 910 and a target embedded information acquiring unit 920.
  • the object under scrutiny acquiring unit 910 is configured to acquire an object under scrutiny in which information was steganographically embedded.
  • the target embedded information acquiring unit 920 is configured to input the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain the target embedded information embedded in the object under scrutiny.
  • the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model.
  • the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects.
  • the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained from the target steganographic convolutional neural network model.
  • the reverse carrier objects are objects from which embedded information has been eliminated.
  • the present application further provides a commodity object corresponding to the steganographic technique provided by process 300 of FIG. 3.
  • the commodity object includes the commodity object itself and a target steganographic object attached to the commodity object itself.
  • the target steganographic object is a steganographic object obtained using the steganographic technique provided by process 300 of FIG. 3.
  • the commodity object is a book, a bottled product, a vehicle, or other such commodity object.
  • the target steganographic object attached to the commodity object itself can be a steganographic object in which copyright information is embedded, attached on or inside a product package; a steganographic object in which virtual currency information is embedded; or a steganographic object in which other information requiring hidden transfer is embedded.
  • in some embodiments, the target steganographic object is a steganographic object in which virtual currency information is embedded.
  • a target steganographic image on the commodity object itself can be recognized with an application provided by the merchant using the process 700 of FIG. 7 for detecting information, and the virtual currency information in the target steganographic object can thus be acquired.
  • the virtual currency information can be electronic currency information, such as product points or virtual exchange currency, which is used to provide value-added service to the commodity object.
  • FIG. 10 is a functional diagram illustrating a programmed computer system for performing a steganographic technique in accordance with some embodiments.
  • Computer system 1000, which includes various subsystems as described below, includes at least one microprocessor subsystem (also referred to as a processor or a central processing unit (CPU)) 1002.
  • processor 1002 can be implemented by a single-chip processor or by multiple processors.
  • processor 1002 is a general purpose digital processor that controls the operation of the computer system 1000. Using instructions retrieved from memory 1010, the processor 1002 controls the reception and manipulation of input data, and the output and display of data on output devices (e.g., display 1018) .
  • Processor 1002 is coupled bi-directionally with memory 1010, which can include a first primary storage, typically a random access memory (RAM) , and a second primary storage area, typically a read-only memory (ROM) .
  • primary storage can be used as a general storage area and as scratch-pad memory, and can also be used to store input data and processed data.
  • Primary storage can also store programming instructions and data, in the form of data objects and text objects, in addition to other data and instructions for processes operating on processor 1002.
  • primary storage typically includes basic operating instructions, program code, data and objects used by the processor 1002 to perform its functions (e.g., programmed instructions) .
  • memory 1010 can include any suitable computer-readable storage media, described below, depending on whether, for example, data access needs to be bi-directional or uni-directional.
  • processor 1002 can also directly and very rapidly retrieve and store frequently needed data in a cache memory (not shown) .
  • a removable mass storage device 1012 provides additional data storage capacity for the computer system 1000, and is coupled either bi-directionally (read/write) or uni-directionally (read only) to processor 1002.
  • storage 1012 can also include computer-readable media such as magnetic tape, flash memory, PC-CARDS, portable mass storage devices, holographic storage devices, and other storage devices.
  • a fixed mass storage 1020 can also, for example, provide additional data storage capacity. The most common example of mass storage 1020 is a hard disk drive.
  • Mass storages 1012, 1020 generally store additional programming instructions, data, and the like that typically are not in active use by the processor 1002. It will be appreciated that the information retained within mass storages 1012 and 1020 can be incorporated, if needed, in standard fashion as part of memory 1010 (e.g., RAM) as virtual memory.
  • bus 1014 can also be used to provide access to other subsystems and devices. As shown, these can include a display monitor 1018, a network interface 1016, a keyboard 1004, and a pointing device 1006, as well as an auxiliary input/output device interface, a sound card, speakers, and other subsystems as needed.
  • the pointing device 1006 can be a mouse, stylus, track ball, or tablet, and is useful for interacting with a graphical user interface.
  • the network interface 1016 allows processor 1002 to be coupled to another computer, computer network, or telecommunications network using a network connection as shown.
  • the processor 1002 can receive information (e.g., data objects or program instructions) from another network or output information to another network in the course of performing method/process steps.
  • Information, often represented as a sequence of instructions to be executed on a processor, can be received from and outputted to another network.
  • An interface card or similar device and appropriate software implemented by (e.g., executed/performed on) processor 1002 can be used to connect the computer system 1000 to an external network and transfer data according to standard protocols.
  • various process embodiments disclosed herein can be executed on processor 1002, or can be performed across a network such as the Internet, intranet networks, or local area networks, in conjunction with a remote processor that shares a portion of the processing.
  • Additional mass storage devices can also be connected to processor 1002 through network interface 1016.
  • auxiliary I/O device interface (not shown) can be used in conjunction with computer system 1000.
  • the auxiliary I/O device interface can include general and customized interfaces that allow the processor 1002 to send and, more typically, receive data from other devices such as microphones, touch-sensitive displays, transducer card readers, tape readers, voice or handwriting recognizers, biometrics readers, cameras, portable mass storage devices, and other computers.
  • the computer system shown in FIG. 10 is but an example of a computer system suitable for use with the various embodiments disclosed herein.
  • Other computer systems suitable for such use can include additional or fewer subsystems.
  • bus 1014 is illustrative of any interconnection scheme serving to link the subsystems.
  • Other computer architectures having different configurations of subsystems can also be utilized.

Abstract

Performing a steganographic technique is disclosed including acquiring a carrier object and to-be-embedded information, inputting the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object, obtaining a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object, and outputting the target steganographic object.

Description

METHOD AND SYSTEM FOR PERFORMING A STEGANOGRAPHIC TECHNIQUE
CROSS REFERENCE TO OTHER APPLICATIONS
This application claims priority to People’s Republic of China Patent Application No. 201910858185.2, entitled STEGANOGRAPHIC METHOD AND INFORMATION DETECTION METHOD AND MEANS, filed September 11, 2019, which is incorporated herein by reference for all purposes.
FIELD OF THE INVENTION
The present application relates to a method and a system for performing a steganographic technique.
BACKGROUND OF THE INVENTION
As digital media technology and Internet technology develop, the concealment of transmitted information through steganographic techniques is garnering more attention when information is transferred or when an image, a video, or another such object is provided with digital copyright protection. A steganographic technique refers to a technique whereby information to be embedded (e.g., information to be transferred or digital copyright information) is written in a hidden and disguised manner into a carrier object to obtain a corresponding steganographic object including the embedded information.
Recently, generative adversarial network-based steganographic techniques have gradually come into use. The generative adversarial network-based steganographic techniques typically generate steganographic objects using a steganographic convolutional neural network model, determine whether an input object includes embedded information using a binary classification convolutional neural network model, and extract embedded information from a steganographic object using a steganographic information extraction convolutional neural network model. The core concept of the generative adversarial network-based steganographic techniques is an adversarial training strategy. In other words, input and output objects of the steganographic convolutional neural network model serve as input objects for the binary classification convolutional neural network model, and the concealment qualities of steganographic objects generated by the steganographic convolutional neural network model are examined using the binary classification convolutional neural network model. The two convolutional neural network models then compete with each other in adversarial games, during which the performance of the two convolutional neural network models gradually improves. Steganography is then performed using the stable-performing steganographic convolutional neural network model.
Although the generative adversarial network-based steganographic techniques can increase the concealment of embedded information in the steganographic objects obtained from them, the generative adversarial network-based steganographic techniques do have some limitations. First, the adversarial information that the generative adversarial network-based steganographic techniques use is insufficiently accurate. In the above technique, the input data for the binary classification network that conducts adversarial training for the steganographic convolutional neural network model is the input and output data of the steganographic convolutional neural network model, i.e., the carrier object and the corresponding steganographic object. The input data and the output data bear little relationship to each other. Second, the structure of the steganographic convolutional neural network model in the above technique is relatively simple because a basic neural network is used. Embedded information in a steganographic object obtained using the steganographic convolutional neural network model can easily be detected and destroyed.
BRIEF DESCRIPTION OF THE DRAWINGS
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
FIG. 1 is a diagram of an embodiment of an application scenario for performing a steganographic technique.
FIG. 2 is a diagram of a conventional generative adversarial network-based steganographic technique.
FIG. 3 is a flowchart of an embodiment of a process for performing a steganographic technique.
FIG. 4 is a diagram of an example of a cycle generative adversarial network.
FIG. 5 is a diagram of an example of a target steganographic convolutional neural network model.
FIG. 6A is a diagram of examples of original carrier objects.
FIG. 6B is a diagram of examples of target steganographic objects obtained by a conventional process for performing a steganographic technique.
FIG. 6C is a diagram of examples of target steganographic objects obtained by the embodiment of the process 300 for performing a steganographic technique of FIG. 3.
FIG. 7 is a flowchart of an embodiment of a process for detecting information.
FIG. 8 is a diagram of an embodiment of a device for performing a steganographic technique.
FIG. 9 is a diagram of an embodiment of a device for detecting information.
FIG. 10 is a functional diagram illustrating a programmed computer system for performing a steganographic technique in accordance with some embodiments.
DETAILED DESCRIPTION
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
In some embodiments, the present application is applied to a client-server interaction scenario. FIG. 1 is a diagram of an embodiment of an application scenario for performing a steganographic technique. A client 120 first establishes a connection with a server 110. The connected client 120 sends a carrier object and to-be-embedded information to the server 110. After receiving the carrier object and the to-be-embedded information, the server inputs the carrier object and the to-be-embedded information into a target steganographic  convolutional neural network model to obtain a target residual object. In some embodiments, the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects. In some embodiments, the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated. Next, the server obtains the target steganographic object in which the to-be-embedded information is embedded using the carrier object and the target residual object, and sends the target steganographic object including the to-be-embedded information to the client. The client then receives the target steganographic object.
Please note that an image can represent the carrier object of FIG. 1. In some embodiments, the carrier object is an object other than an image, such as a video object or an audio object. For the sake of convenience, a carrier image is used as an example to explain the performing of the steganographic technique or process for concealing information. The client can be a mobile terminal device such as a mobile phone or a tablet, or the client can be a common computer device. In addition, in some embodiments, the application scenario 100 for performing a steganographic technique can be applied solely to a client or solely to a server. For example, after acquiring the carrier object and the to-be-embedded information, the client performs processing directly through a corresponding application installed on the client, and then acquires the target steganographic object. In another example, the server, after acquiring the carrier object and the to-be-embedded information, directly stores the acquired target steganographic object locally or remotely without having to send the acquired target steganographic object to the client. The above application scenario is merely one embodiment of the application scenario 100 for performing a steganographic technique and is described to make the technique easier to understand.
An embodiment of a process for performing a steganographic technique is explained in light of FIGS. 2 through 6 below.
Before introducing the process for performing a steganographic technique, a  conventional process for performing a generative adversarial network-based steganographic technique is first described.
FIG. 2 is a diagram of a conventional generative adversarial network-based steganographic technique. The generative adversarial network 200 includes: a steganographic convolutional neural network model 230 that generates a corresponding steganographic object 240 based on a carrier object 210 and to-be-embedded information 220; a binary classification convolutional neural network model 250 that uses the carrier object 210 and the steganographic object 240 as input data and determines whether the type of the input data includes embedded information; and an information extraction convolutional neural network model 260 that generates target steganographic information 280 based on the steganographic object 240. In the generative adversarial network 200, the steganographic object 240 is generated by the steganographic convolutional neural network model 230, and the binary classification convolutional neural network model 250 identifies whether its input data includes embedded information. In other words, the binary classification convolutional neural network model 250 outputs a discrimination result 270, thus increasing the robustness of the convolutional neural network model in the generative adversarial network 200 through a process of continual adversarial training.
The steganographic convolutional neural network model 230 uses the carrier object 210 and the to-be-embedded information 220 as input. The first layer of the steganographic convolutional neural network model 230 is a fully connected layer, which processes the carrier object 210 and the to-be-embedded information 220. Next, the steganographic convolutional neural network model 230 reshapes the fully connected layer output to 4 × 4 × I × 8, where I is the width of the carrier object. This is followed by four processing modules, each including a deconvolution unit (DeConv), a batch normalization unit (BN), and a rectified linear unit (ReLU). The final layer is a hyperbolic tangent (tanh) activation layer.
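For illustration only, the conventional generator just described might be sketched in PyTorch roughly as follows; the class name, the interpretation of 4 × 4 × I × 8 as 4 × 4 feature maps with I × 8 channels, the deconvolution kernel sizes, and the single-channel output are assumptions rather than the patent's exact network:

```python
import torch
import torch.nn as nn

class ConventionalStegoGenerator(nn.Module):
    """FC layer -> reshape to 4 x 4 x (width * 8) -> four DeConv+BN+ReLU -> tanh."""
    def __init__(self, carrier_dim: int, msg_len: int, width: int):
        super().__init__()
        # Fully connected layer fuses the flattened carrier and the message.
        self.fc = nn.Linear(carrier_dim + msg_len, 4 * 4 * width * 8)
        self.width = width
        blocks, ch = [], width * 8
        for _ in range(4):  # four DeConv + BN + ReLU modules
            blocks += [
                nn.ConvTranspose2d(ch, ch // 2, kernel_size=4, stride=2, padding=1),
                nn.BatchNorm2d(ch // 2),
                nn.ReLU(inplace=True),
            ]
            ch //= 2
        self.deconv = nn.Sequential(*blocks)
        # Single-channel output through a tanh activation (assumed layout).
        self.out = nn.Sequential(nn.Conv2d(ch, 1, kernel_size=3, padding=1), nn.Tanh())

    def forward(self, carrier_flat, message):
        x = self.fc(torch.cat([carrier_flat, message], dim=1))
        x = x.view(-1, self.width * 8, 4, 4)
        return self.out(self.deconv(x))
```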
The information extraction convolutional neural network model 260 is also a convolutional neural network model, which includes four convolutional layers and one fully connected layer. The final layer of the information extraction convolutional neural network model 260 is a tanh activation layer. All of the remaining layers use leaky rectified linear units (LReLU) as activation functions, and each layer includes a batch normalization module.
The network architecture of the binary classification convolutional neural network model 250 is similar to the network architecture of the steganographic convolutional neural network model 230. The only difference between the two models is that the activation function used by the final activation layer of the binary classification convolutional neural network model 250 is a sigmoid activation function.
From the above description, the input data, i.e., the adversarial information, of the binary classification convolutional neural network model 250 used to conduct adversarial training in conventional generative adversarial network-based steganographic techniques is insufficiently accurate. Accurate adversarial information would be another steganographic object corresponding to the carrier object 210 and the to-be-embedded information 220. However, it is typically difficult to obtain such an ideal steganographic object for adversarial training. Moreover, the structure of the convolutional neural network models in the above steganographic technique is relatively simple, so the steganographic objects 240 obtained thereby have relatively poor concealment qualities, and the information embedded in them is easily destroyed.
FIG. 3 is a flowchart of an embodiment of a process for performing a steganographic technique. In some embodiments, the process 300 is implemented by the server 110 of FIG. 1 and comprises:
In 310, the server acquires a carrier object and to-be-embedded information.
In some embodiments, the carrier object is a carrier for carrying embedded information. In some embodiments, the carrier object is an image, audio, video, or other such object. An example of the carrier image is used for descriptive convenience in process 300.
The to-be-embedded information can refer to information that is to be transferred covertly or information used to provide digital copyright protection. In some embodiments, the to-be-embedded information is a piece of text information. In some embodiments, the to-be-embedded information is image information, audio information, or video information, such as a company logo, a contract document scan, an audio or video file, or other such information.
In some embodiments, the acquiring of the carrier object and the to-be-embedded information is performed by a computing device, such as a server computing device or a client computing device, which either receives the carrier object and the to-be-embedded information directly from a user or another computing device, or receives a data request message for a target steganographic object and acquires the carrier object and the to-be-embedded information from the data request message.
In 320, the server inputs the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object. In some embodiments, the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects. In some embodiments, the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated.
The target residual object can refer to an object formed from the residual between the carrier object and the anticipated target steganographic object. In some embodiments, the target residual object is the object obtained after inputting the carrier object and the to-be-embedded information into the target steganographic convolutional neural network model.
In data statistics, the residual can refer to a difference between an actual observed value and an estimated value. In some embodiments, the estimated value is also called a fitted value, and the observed value is also called an expected value. In some embodiments, the residual is the difference between the expected data and the output data obtained after input data has been input into a corresponding convolutional neural network model and has undergone processing by the convolutional neural network model. The residual can also be regarded as the observed value of error.
To increase the difficulty of finding and destroying steganographic information in the target steganographic object that is ultimately obtained, process 300 can include: introducing a reverse steganographic convolutional neural network model; obtaining, through training, a target steganographic convolutional neural network model for steganographic processing using a cycle generative adversarial network including the reverse steganographic convolutional neural network model; and performing steganographic processing using the target steganographic convolutional neural network model. In some embodiments, the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on the steganographic objects obtained from the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated.
FIG. 4 is a diagram of an example of a cycle generative adversarial network. The obtaining of the target steganographic convolutional neural network model can include: acquiring an original steganographic convolutional neural network model; adversarially training the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge, and regarding the converged original steganographic convolutional neural network model as the target steganographic convolutional neural network model.
In some embodiments, the adversarially training of the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge includes: acquiring a first discriminative convolutional neural network model and a second discriminative convolutional network model, determining, using the first discriminative convolutional neural network model, whether a received object includes embedded information and determining, using the second discriminative convolutional neural network model, whether a received object is an original carrier object or the reverse carrier object; then adversarially training the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge.
As shown in FIG. 4, the adversarially training of the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge includes: acquiring an original training carrier object and acquiring original training to-be-embedded information; inputting the original training carrier object and the original training to-be-embedded information into the original steganographic convolutional neural network model to obtain an original training steganographic object; inputting the original training steganographic object into the reverse steganographic convolutional neural network model to obtain a reverse training carrier object; training the first discriminative convolutional neural network model using the original training steganographic object and the original training carrier object; training the second discriminative convolutional neural network model using the original training carrier object and the reverse training carrier object; and, during the training of these convolutional neural network models, adjusting the parameters of the convolutional neural network models using their loss functions to cause the original steganographic convolutional neural network model to converge.
The original training carrier object and the original training to-be-embedded information can be the original training sample data for training each convolutional neural network model in the cycle generative adversarial network. The original training steganographic object can correspond to the original training carrier object in which the original training to-be-embedded information is embedded. The reverse training carrier object can correspond to the object obtained after eliminating the original training to-be-embedded information from the original training steganographic object.
The training of the first discriminative convolutional neural network model using the original training steganographic object and the original training carrier object refers to regarding the original training steganographic object or the original training carrier object as input data for the first discriminative convolutional neural network model to obtain a corresponding first discrimination result. In some embodiments, the first discrimination result indicates whether the input data for the first discriminative convolutional neural network model includes embedded information.
The training of the second discriminative convolutional neural network model using the original training carrier object and the reverse training carrier object refers to regarding the original training carrier object or the reverse training carrier object as input data for the second discriminative convolutional neural network model to obtain a corresponding second discrimination result. In some embodiments, the second discrimination result indicates whether the input data for the second discriminative convolutional neural network model is the original training carrier object in which the information was never embedded or the reverse training carrier object from which the embedded information was eliminated.
In other words, referring back to FIG. 3, process 300 includes the following: a cycle generative adversarial network is designed, and, after original training sample data is obtained in the cycle generative adversarial network, various convolutional neural network models having different functions are used to generate training data for each other. The various convolutional neural network models then compete with each other in adversarial games. Moreover, in the adversarial game process, the parameters of each convolutional neural network model are adjusted using their respective loss functions to cause each convolutional neural network model in the cycle generative adversarial network to converge.
For illustration purposes, an image is used as an example of a carrier object. As an example, original training carrier object o_img1 and original to-be-embedded information o_msg1 are used. First, the original training carrier object o_img1 and the original to-be-embedded information o_msg1 can be input into an original steganographic convolutional neural network model to obtain an original training steganographic object o_s_img1. Then, the original training steganographic object o_s_img1 is input into a reverse steganographic convolutional neural network model to obtain a reverse training carrier object i_img1. Next, the original training carrier object o_img1 or the original training steganographic object o_s_img1 is input into a first discriminative convolutional neural network model to obtain a first discrimination result est_result1, and the original training carrier object o_img1 or the reverse training carrier object i_img1 is input into a second discriminative convolutional neural network model to obtain a second discrimination result est_result2. In addition, the original training steganographic object o_s_img1 can also be input into a steganographic information extraction convolutional neural network model to extract the target embedded information from the original training steganographic object o_s_img1. In the above training process, the loss function of each convolutional neural network model is used to adjust its respective parameters until all the convolutional neural network models achieve convergence. Then, each converged convolutional neural network model is used in subsequent steganographic processing.
Please note that the original steganographic convolutional neural network model in the cycle generative adversarial network can be a residual learning convolutional neural network model.
The residual learning convolutional neural network model can correspond to a deep neural network model that includes at least one residual block. Output data of the residual learning convolutional neural network model can be the residual of the input data and the real expected data. Adding the input data of the residual learning convolutional neural network model to the residual that the residual learning convolutional neural network model outputs provides the data closest to the expected data. By establishing an association with the input data of each layer, the residual learning convolutional neural network model can learn to form corresponding residual functions, and the residual functions can be more easily optimized than traditional model functions, which have no relationship to the input data. The model network layers can then be deepened to form a network model that can attain better performance, for example, better concealment.
A residual block typically refers to a combination of multiple convolutional layers including shortcut connections. A shortcut connection can also be called a short-circuit connection. The shortcut connection can evolve from shortcut connections in recurrent neural networks (RNN) and gating algorithms and can relate to a technique for alleviating the vanishing gradient problem in deep architectures. In some embodiments, the vanishing gradient problem occurs when the network parameters and hyperparameters are not properly set.
In process 300, the residual learning convolutional neural network model can correspond to a convolutional neural network model that includes at least one residual block and at least one fully connected layer. The residual block can include a convolution module (Conv), a batch normalization module (BN), and a leaky rectified linear unit (LReLU). The size of the convolution kernel in the at least one residual block and the at least one fully connected layer can be 5 × 5, with a stride of 1 and padding of 2. The number of channels of each layer in the residual block can be 128, and the fully connected layer can have the same number of channels as the original carrier object. In some embodiments, each of the relevant numerical values above can be set to other different values as needed.
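As a non-authoritative sketch, the residual block and residual learning model described above might look as follows in PyTorch; the number of residual blocks, the LReLU slope, the two-convolution block layout, and the use of a final 5 × 5 convolution in place of the channel-matched fully connected layer are assumptions:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Conv(5x5, stride 1, padding 2) + BN + LReLU with a shortcut connection."""
    def __init__(self, channels: int = 128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=5, stride=1, padding=2),
            nn.BatchNorm2d(channels),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels, channels, kernel_size=5, stride=1, padding=2),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.LeakyReLU(0.2, inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + x)  # shortcut connection

class TargetStegoModel(nn.Module):
    """Maps fused (carrier + message) input to a target residual object."""
    def __init__(self, in_channels: int, carrier_channels: int, num_blocks: int = 3):
        super().__init__()
        self.head = nn.Conv2d(in_channels, 128, kernel_size=5, padding=2)
        self.blocks = nn.Sequential(*[ResidualBlock(128) for _ in range(num_blocks)])
        # Final layer restores the carrier's channel count (assumed convolutional).
        self.tail = nn.Conv2d(128, carrier_channels, kernel_size=5, padding=2)

    def forward(self, fused):
        return self.tail(self.blocks(self.head(fused)))
```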
The reverse steganographic convolutional neural network model can include an architecture similar to that of the original steganographic network model. The reverse steganographic convolutional neural network model can correspond to a lightweight, end-to-end network model.
In some embodiments, the steganographic information extraction convolutional neural network model is used for end-to-end extraction of embedded information from steganographic objects. The steganographic information extraction convolutional neural network model can include four structurally-identical convolutional layers (Conv + BN + LReLU) , one fully connected layer, and one tanh activation layer. In some embodiments, the four structurally-identical convolutional layers have 32, 64, 128, and 256 channels, respectively. The fully connected layer can be used to generate one piece of vector data having the same shape as the original steganographic information. In some embodiments, the tanh activation layer converts the real numbers in the vector data to the [-1, 1] range. Lastly, in some embodiments, the tanh activation layer binarizes data output by the steganographic information extraction network model (with 0 regarded as a positive number) to obtain rebuilt target steganographic information corresponding to the original steganographic information.
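A minimal sketch of such an extraction model, assuming stride-2 3 × 3 convolutions and a 64 × 64 input (neither is specified above), might be:

```python
import torch
import torch.nn as nn

class StegoExtractor(nn.Module):
    """Four Conv+BN+LReLU layers (32/64/128/256 channels), FC, then tanh."""
    def __init__(self, in_channels: int, msg_len: int, img_size: int = 64):
        super().__init__()
        layers, ch = [], in_channels
        for out_ch in (32, 64, 128, 256):
            layers += [
                nn.Conv2d(ch, out_ch, kernel_size=3, stride=2, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.LeakyReLU(0.2, inplace=True),
            ]
            ch = out_ch
        self.features = nn.Sequential(*layers)
        feat = img_size // 16  # four stride-2 convolutions shrink 64 -> 4
        # Fully connected layer restores the message shape.
        self.fc = nn.Linear(256 * feat * feat, msg_len)

    def forward(self, stego):
        x = self.features(stego).flatten(1)
        return torch.tanh(self.fc(x))  # real values in [-1, 1]

def binarize(msg_scores: torch.Tensor) -> torch.Tensor:
    # 0 is regarded as a positive number, per the description above.
    return (msg_scores >= 0).to(torch.int8)
```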
In some embodiments, the first discriminative convolutional neural network model and the second discriminative convolutional neural network model have the same architecture, which is composed of four structurally-identical convolutional layers (Conv + BN + LReLU), one fully connected layer, and one sigmoid activation layer. In some embodiments, the four structurally-identical convolutional layers have 32, 64, 128, and 256 channels, respectively. In some embodiments, the loss functions of the two network models are sigmoid cross-entropy, and the fully connected layer output of the two network models is one numerical value. The sigmoid activation layer can convert the output from the fully connected layer to the [0, 1] interval range, which expresses the probability that the input data of the two network models is the original carrier object. In some embodiments, sigmoid cross-entropy is one way to measure the difference between predicted values and actual values of neural networks.
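Under the same caveats (assumed stride-2 3 × 3 convolutions and input size), the shared discriminator architecture might be sketched as:

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Four Conv+BN+LReLU layers (32/64/128/256 channels), FC to one value, sigmoid."""
    def __init__(self, in_channels: int, img_size: int = 64):
        super().__init__()
        layers, ch = [], in_channels
        for out_ch in (32, 64, 128, 256):
            layers += [
                nn.Conv2d(ch, out_ch, kernel_size=3, stride=2, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.LeakyReLU(0.2, inplace=True),
            ]
            ch = out_ch
        self.features = nn.Sequential(*layers)
        feat = img_size // 16
        self.fc = nn.Linear(256 * feat * feat, 1)

    def forward(self, x):
        # Probability, in [0, 1], that the input is an original carrier object.
        return torch.sigmoid(self.fc(self.features(x).flatten(1)))
```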
The following is a description of how the loss function of each convolutional neural network model is used to adjust the parameters of each convolutional neural network model in the process of training the convolutional neural network models in a cycle generative adversarial network and how the original steganographic convolutional neural network model is caused to converge.
For the sake of convenience, assume that S represents the original steganographic convolutional neural network model, I represents the reverse steganographic convolutional neural network model, H represents the steganographic information extraction convolutional neural network model, D_1 represents the first discriminative convolutional neural network model, D_2 represents the second discriminative convolutional neural network model, and q_s, q_i, q_h, q_d1, and q_d2 represent the parameter information of the original steganographic convolutional neural network model S, the reverse steganographic convolutional neural network model I, the steganographic information extraction convolutional neural network model H, the first discriminative convolutional neural network model D_1, and the second discriminative convolutional neural network model D_2, respectively. The input/output relationships between the various models in the cycle generative adversarial network can be described by the formulas below:
Expression 1: H(q_h, C') = H(q_h, S(q_s, C, M))
Expression 2: I(q_i, C') = I(q_i, S(q_s, C, M))
Expression 3: D_1(q_d1, C, C') = D_1(q_d1, C, S(q_s, C, M))
Expression 4: D_2(q_d2, C, C'') = D_2(q_d2, C, I(q_i, S(q_s, C, M)))
In some embodiments, expression 1 expresses the input/output relationship of the steganographic information extraction convolutional neural network model, expression 2 expresses that of the reverse steganographic convolutional neural network model, expression 3 expresses that of the first discriminative convolutional neural network model, and expression 4 expresses that of the second discriminative convolutional neural network model, in each case with the original training steganographic object produced by the original steganographic convolutional neural network model S serving as input. C, C', C'', and M represent the original training carrier object, the original training steganographic object, the reverse training carrier object, and the original training steganographic information, respectively. S(·,·), I(·,·), H(·,·), D_1(·,·), and D_2(·,·) correspond to the output data of convolutional neural network models S, I, H, D_1, and D_2, respectively.
The loss function of the steganographic information extraction network model can be defined as the Euclidean distance between the original steganographic information M and the target steganographic information M'. A loss function can correspond to a function that maps the values of random events, or of random variables relating thereto, to non-negative real numbers to express the “risk” or “loss” of the random event. In the computer field, loss functions are typically used to intuitively express the difference between a convolutional neural network model's original data and the target data obtained by processing.
The loss function of the steganographic information extraction convolutional neural network model can be described as:
L_H(q_s, q_h, M, C) = d(M, M') = d(M, H(q_h, C')), wherein d(·,·) represents the Euclidean distance.
The loss function of the first discriminative convolutional neural network model can be defined as sigmoid cross-entropy. In some embodiments, the loss function of the first discriminative convolutional neural network model is described as:
L_D1(q_d1, x) = -[y log D_1(q_d1, x) + (1 - y) log(1 - D_1(q_d1, x))],
wherein if x is the original training steganographic object C', y = 0, and if x is not the original training steganographic object, y = 1.
The loss function of the second discriminative convolutional neural network model can be defined as sigmoid cross-entropy. In some embodiments, the loss function of the second discriminative convolutional neural network model is described as:
L_D2(q_d2, x) = -[y log D_2(q_d2, x) + (1 - y) log(1 - D_2(q_d2, x))],
wherein if x is the reverse training carrier object C'', y = 0, and if x is the original training carrier object C, y = 1.
Please note that the reverse steganographic convolutional neural network model is set up to recover the original training carrier object C from the original training steganographic object C' and to enable the recovered object to deceive the second discriminative convolutional neural network model. Accordingly, the loss function of the reverse steganographic convolutional neural network model can include both the quality of the reverse training carrier object C'' and the adversarial loss caused by the reverse training carrier object C''. The loss function of the reverse steganographic convolutional neural network model can be described as:
L_I(q_s, q_i, C, M) = d(C, C'') + l·L_D2(q_d2, C''),
wherein l is for balancing the weights between the two terms.
Output from the original steganographic convolutional neural network model S can be input data for the reverse steganographic convolutional network model I, the steganographic information extraction convolutional neural network model H, and the first discriminative convolutional neural network model D_1. Therefore, the loss function of the original steganographic convolutional neural network model can be described as:
L_S(q_s) = l_s·d(C, C') + l_h·L_H(q_s, q_h, M, C) + l_i·L_I(q_s, q_i, C, M) + l_d1·L_D1(q_d1, C'),
wherein l_s, l_h, l_i, and l_d1 can be used to define weights between the various terms.
During the training of each convolutional neural network model in the cycle generative adversarial network, the difference values with which parameters in each network model are to be updated can be calculated using the loss function of each network model and the interrelationships between the various network models, then the parameter information of each network model can be updated based on the difference values, and thus the performance of each network model in the cycle generative adversarial network can be optimized. Furthermore, each convolutional neural network model in the cycle generative adversarial network can be caused to converge and the converged original steganographic convolutional neural network model can be regarded as the target steganographic convolutional neural network model used for steganographic processing.
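The parameter updates just described might look roughly like the following PyTorch training step; the BCE/MSE stand-ins for the patent's exact loss terms, the optimizer split, and the callable signatures (S takes the carrier and message, I takes a steganographic object) are all assumptions for illustration:

```python
import torch
import torch.nn.functional as F

def train_step(S, I, H, D1, D2, opt_gen, opt_disc, carrier, message, weights):
    """One adversarial update; opt_gen holds the parameters of S, I, and H,
    and opt_disc holds the parameters of D1 and D2 (assumed arrangement)."""
    l_s, l_h, l_i, l_d1 = weights
    stego = S(carrier, message)      # C' = S(q_s, C, M)
    reverse = I(stego)               # C'' = I(q_i, C')
    # Discriminator update: D1 separates C from C', D2 separates C from C''.
    opt_disc.zero_grad()
    real1, fake1 = D1(carrier), D1(stego.detach())
    real2, fake2 = D2(carrier), D2(reverse.detach())
    d_loss = (F.binary_cross_entropy(real1, torch.ones_like(real1))
              + F.binary_cross_entropy(fake1, torch.zeros_like(fake1))
              + F.binary_cross_entropy(real2, torch.ones_like(real2))
              + F.binary_cross_entropy(fake2, torch.zeros_like(fake2)))
    d_loss.backward()
    opt_disc.step()
    # Generator-side update: a weighted sum standing in for L_S above.
    opt_gen.zero_grad()
    loss_s = F.mse_loss(stego, carrier)       # distortion of C' vs. C
    loss_h = F.mse_loss(H(stego), message)    # L_H: d(M, M')
    loss_i = F.mse_loss(reverse, carrier)     # quality of C'' vs. C
    fool = D1(stego)
    loss_d1 = F.binary_cross_entropy(fool, torch.ones_like(fool))
    g_loss = l_s * loss_s + l_h * loss_h + l_i * loss_i + l_d1 * loss_d1
    g_loss.backward()
    opt_gen.step()
    return g_loss.item(), d_loss.item()
```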
At this point, the converged target steganographic convolutional neural network model has been obtained using the cycle generative adversarial network. Next, the target steganographic convolutional neural network model can be used to subject carrier objects and to-be-embedded information to steganographic processing.
To address issues of the simple structure of the convolutional neural network models in conventional generative adversarial network-based steganographic techniques and the embedded information in steganographic objects generated by the conventional generative adversarial network-based steganographic techniques being easily destroyed, the process 300 of FIG. 3 uses the target steganographic convolutional neural network model obtained through the above operations to process carrier objects and to-be-embedded information to acquire a target residual object corresponding to the carrier object and to-be-embedded information. Process 300 then obtains a target steganographic object based on the target residual object and the carrier object. The obtaining of the target steganographic object is included in subsequent processing to be described. The description here is limited to how the target steganographic convolutional neural network model is used to obtain a target residual object corresponding to the carrier object and to-be-embedded information.
The structure of the target steganographic convolutional neural network model and its processing are described below in light of FIG. 5.
FIG. 5 is a diagram of an example of a target steganographic convolutional neural network model. The target steganographic convolutional neural network model can be a residual learning convolutional neural network model, which includes at least one residual block and one fully connected layer. The residual learning convolutional neural network model has already been described above.
The inputting of the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object can include: inputting the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object.
The inputting of the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object includes: subjecting the carrier object and the to-be-embedded information to fusion processing to obtain object fused information; and inputting the object fused information into the residual learning convolutional neural network model to obtain the target residual object.
In some embodiments, the object fused information refers to a type of data content information that is formed by fusing together different data objects using intrinsic features of the data and that simultaneously includes the different data objects. In some embodiments, the different data objects refer to objects expressed in binary form. In some embodiments, the different data objects refer to image, audio, video, text and other such objects. In some embodiments, the object fused information refers to the data content information that is formed by fusing the carrier object with the to-be-embedded information and that includes both the carrier object and the to-be-embedded information.
In some embodiments, the subjecting of the carrier object and the to-be-embedded information to fusion processing to obtain the object fused information includes: subjecting the to-be-embedded information to fusion preprocessing to obtain preprocessed to-be-embedded  information; and obtaining the object fused information based on the preprocessed to-be-embedded information and the carrier object. In some embodiments, spatial dimensions of the preprocessed to-be-embedded information are the same as the spatial dimensions of the carrier object.
In some embodiments, the preprocessed to-be-embedded information refers to the to-be-embedded information after it has undergone relevant preprocessing, performed prior to subjecting the carrier object and the to-be-embedded information to fusion processing, that makes the shapes or spatial dimensions of the to-be-embedded information the same as the shapes or spatial dimensions of the carrier object.
In some embodiments, the subjecting of the to-be-embedded information to fusion preprocessing to obtain the preprocessed to-be-embedded information includes: subjecting the to-be-embedded information to replication processing based on the spatial dimension information of the carrier object to obtain preprocessed to-be-embedded information having the same spatial dimensions as the carrier object.
In some embodiments, the subjecting of the to-be-embedded information to replication processing based on the spatial dimension information of the carrier object includes: replicating the binary series corresponding to the to-be-embedded information to obtain preprocessed to-be-embedded information having the same spatial dimensions as the carrier object. In some embodiments, the specific number of replications is based on spatial dimension information of the carrier object. An explanation of how the to-be-embedded information is serialized into a corresponding binary series will not be further provided for conciseness.
In some embodiments, the obtaining of the object fused information based on the preprocessed to-be-embedded information and the carrier object includes: subjecting the preprocessed to-be-embedded information and the carrier object to additive processing to obtain the object fused information.
In some embodiments, the subjecting of the preprocessed to-be-embedded information and the carrier object to additive processing includes: subjecting binary series data  corresponding to the preprocessed to-be-embedded information and carrier object matrix data to a type of additive processing according to their spatial positions.
For example, the carrier image is img1 having a shape of h × w × c (as an example, h is height information, w is width information, and c is the number of channels; h, w, and c are all greater than 0). The to-be-embedded information is msg1, whose corresponding binary shape is 1 × l (as an example, l is greater than 0). Thus, the process whereby the target steganographic convolutional neural network model acquires the target residual object res_img1 corresponding to the carrier image img1 and the to-be-embedded information msg1 is: first, subjecting the to-be-embedded information msg1 to fusion preprocessing to acquire the preprocessed to-be-embedded information msg1-1, i.e., replicating the binary message series corresponding to the to-be-embedded information msg1 h × w times based on the spatial dimensions h × w of the carrier object img1 to acquire the preprocessed to-be-embedded information msg1-1 having the same spatial dimensions as the carrier object img1, i.e., a shape of h × w × l; next, connecting and fusing the carrier object img1 and the preprocessed to-be-embedded information msg1-1 along the channel dimension of the carrier object img1 to acquire the object fused information img_msg_1 having a size of h × w × (l + c); and furthermore, inputting the object fused information img_msg_1 into the target steganographic convolutional neural network model to acquire the target residual object res_img1 corresponding to the carrier image img1 and the to-be-embedded information msg1. Please note that a carrier object corresponding to an image is used here as an example to describe how a target residual object is acquired. In some embodiments, process 300 of FIG. 3 is performed on a carrier object other than an image, e.g., on an audio, video, or other such object, to acquire a target residual object corresponding to the audio, video, or other such object.
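A minimal sketch of this fusion preprocessing, assuming a torch carrier tensor of shape (c, h, w) and a 1 × l binary message tensor (the function name and dtype handling are illustrative):

```python
import torch

def fuse(carrier: torch.Tensor, message: torch.Tensor) -> torch.Tensor:
    """Replicate the 1 x l binary series h x w times and concatenate it with
    the carrier along the channel dimension."""
    c, h, w = carrier.shape
    l = message.numel()
    # Each message bit becomes one full h x w plane (same spatial dims as carrier).
    msg_planes = message.reshape(l, 1, 1).expand(l, h, w)
    # Object fused information img_msg_1 with shape (c + l, h, w).
    return torch.cat([carrier, msg_planes.to(carrier.dtype)], dim=0)

# Usage: fused = fuse(img1, msg1); res_img1 = model(fused.unsqueeze(0))
```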
In addition, please note that the inputting of the object fused information into the residual learning convolutional neural network model to obtain the target residual object includes: processing the object fused information to obtain a to-be-processed residual object using a residual block in the target steganographic convolutional neural network model; and processing the to-be-processed residual object to obtain a target residual object using a fully connected layer in the target steganographic convolutional neural network model.
Referring back to FIG. 3, after operation 320 is performed, in 330, the server obtains a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object.
The above operations introduced how a reverse steganographic convolutional neural network model and an original steganographic convolutional neural network model, both of which are in a cycle generative adversarial network, are adversarially trained to acquire a target steganographic convolutional neural network model for steganographic processing, and how the target steganographic convolutional neural network model is used to process the carrier object and to-be-embedded information obtained in operation 310 to obtain a target residual object corresponding to the carrier object and the to-be-embedded information.
After the target residual object corresponding to the carrier object and the to-be-embedded information is acquired, a target steganographic object in which the to-be-embedded information is embedded is obtained based on the carrier object and the target residual object.
The obtaining of the target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object includes: subjecting the carrier object and the target residual object to additive processing to obtain the target steganographic object.
For example, in the above obtaining operation, after the target steganographic convolutional neural network model is used to obtain the target residual object res_img1 corresponding to the carrier object img1 and the to-be-embedded information msg1, the carrier object img1 and the target residual object res_img1 undergo additive processing to obtain the target steganographic object des_img1 in which the to-be-embedded information msg1 is embedded.
Please note that the subjecting of the carrier object and the target residual object to additive processing includes adding the matrix data corresponding to the two objects.
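As a minimal sketch, assuming torch tensors with pixel values normalized to [0, 1] (the clamp to the valid pixel range is an assumption, not stated above):

```python
import torch

def compose_stego(carrier: torch.Tensor, residual: torch.Tensor) -> torch.Tensor:
    # des_img1 = img1 + res_img1: element-wise addition of the two objects'
    # matrix data, clamped to the assumed valid pixel range.
    return (carrier + residual).clamp(0.0, 1.0)
```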
In addition, to further increase information security, the to-be-embedded information can undergo encryption processing prior to being embedded in the carrier object. The encryption processing can further increase security during information transfer. Therefore, the inputting of the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object includes: subjecting the to-be-embedded information to encryption processing to obtain encrypted to-be-embedded information; and inputting the carrier object and the encrypted to-be-embedded information into the target steganographic convolutional neural network model to obtain the target residual object. In addition, the subjecting of the to-be-embedded information to encryption processing can include encrypting the to-be-embedded information using an encryption algorithm corresponding to the file format of the to-be-embedded information.
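The patent does not name a cipher, so as a hedged illustration only, symmetric Fernet encryption from the Python `cryptography` package could be used to encrypt the to-be-embedded information before embedding:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # shared secret between embedder and extractor
cipher = Fernet(key)
encrypted_msg = cipher.encrypt(b"to-be-embedded information")
# ... embed encrypted_msg via the target steganographic model; after extraction:
recovered = Fernet(key).decrypt(encrypted_msg)
```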
At this point, a target steganographic object that corresponds to the carrier object and the to-be-embedded information, and in which the to-be-embedded information has been embedded, can be obtained.
Table 1 presents the quantitative comparison results for the steganographic technique provided by the process 300 of FIG. 3 against a conventional generative adversarial network steganographic technique. The Table 1 data correspond to comparison results obtained from test data for a carrier object in an image format.
[Table 1, reproduced as an image in the original filing, compares the two techniques on pixel depth (bpp), decoding success rate, detection rate, and peak signal-to-noise ratio (PSNR).]
Table 1
Pixel depth (bpp, bits per pixel) refers to the number of bits used to store each pixel in an image. Pixel depth can be used to measure image resolution.
The decoding success rate refers to the decoding success rate of the two processes on sample data not seen during training.
The detection rate refers to the rate at which the discriminators in the two processes recognize steganographic objects.
The peak signal-to-noise ratio (PSNR) is an objective measurement for determining image quality.
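For 8-bit images, PSNR is conventionally computed from the mean squared error (MSE) between the carrier image and the steganographic image:

$$\mathrm{PSNR} = 10\log_{10}\left(\frac{255^2}{\mathrm{MSE}}\right)\ \mathrm{dB}$$

Higher PSNR indicates that the steganographic object deviates less from the original carrier.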
FIG. 6A is a diagram of examples of original carrier objects. FIG. 6B is a diagram of examples of target steganographic objects obtained by a conventional process for performing a steganographic technique. FIG. 6C is a diagram of examples of target steganographic objects obtained by the embodiment of the process for performing a steganographic technique of FIG. 3. A comparison of FIG. 6B and FIG. 6C shows that the target steganographic objects obtained by the process 300 for performing a steganographic technique of FIG. 3 have better picture quality and image visual effects.
From the above description, the target steganographic convolutional neural network model used in the process 300 of FIG. 3 has a more complex structure than conventional steganographic convolutional neural network models, and the steganographic processing performed by the process 300 is more hidden than steganographic processing that uses those conventional models. The embedded information in the target steganographic object ultimately obtained by the target steganographic convolutional neural network model therefore has better concealment qualities and is not easily destroyed.
In summary, the steganographic process inputs the acquired carrier object and to-be-embedded information into the target steganographic convolutional neural network model to acquire a target residual object, and then obtains a target steganographic object based on the carrier object and the target residual object. To address the insufficiently accurate adversarial information in conventional generative adversarial network-based steganographic techniques, and the fact that embedded information in steganographic objects obtained thereby is easily destroyed, the present application introduces a reverse steganographic convolutional neural network model and regards the reverse carrier objects output by that model as sample objects for training the target steganographic convolutional neural network model. At the same time, in generating steganographic objects, the target steganographic convolutional neural network model is first used to acquire a target residual object corresponding to the carrier object and the to-be-embedded information, and the final target steganographic object is then obtained based on the carrier object and the target residual object. The security of embedded information in target steganographic objects can thus be greatly increased.
A steganographic process is provided above. A corresponding process for detecting information is described next. Some of the operations therein were already described in the process for performing a steganographic technique.
FIG. 7 is a flowchart of an embodiment of a process for detecting information. In some embodiments, the process 700 is implemented by the server 110 of FIG. 1 and comprises:
In 710, the server acquires an object under scrutiny in which information was steganographically embedded.
In 720, the server inputs the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain target embedded information embedded in the object under scrutiny. In some embodiments, the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model. In some embodiments, the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic  convolutional neural network model as sample objects. In some embodiments, the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated.
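A hedged sketch of operation 720 follows, assuming the steganographic information extraction convolutional neural network model is available as a trained PyTorch module; the file name, input shape, and output format are illustrative assumptions, not part of the embodiment.

import torch

# "extractor.pt" is a hypothetical file holding the trained extraction model.
extractor = torch.load("extractor.pt", weights_only=False)
extractor.eval()

with torch.no_grad():
    # Object under scrutiny as a (batch, channels, height, width) tensor;
    # a random tensor stands in for a real image here.
    object_under_scrutiny = torch.rand(1, 3, 256, 256)
    target_embedded_info = extractor(object_under_scrutiny)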
In addition, in the event that the embedded information in an object under scrutiny is encrypted information, process 700, after obtaining the target embedded information in the object under scrutiny through operation 720, further comprises: subjecting the target embedded information to decryption processing to obtain decrypted target embedded information. The subjecting of the target embedded information to decryption processing includes using a decryption algorithm corresponding to the encryption algorithm with which the target embedded information was encrypted.
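Continuing the encryption sketch given earlier, decryption applies the matching algorithm with the shared key; this is illustrative only, and assumes the extracted payload is a Fernet token in bytes form.

from cryptography.fernet import Fernet

def decrypt_embedded_info(target_embedded_info: bytes, key: bytes) -> bytes:
    """Apply the decryption algorithm corresponding to the encryption used
    at embedding time (Fernet in the earlier sketch)."""
    return Fernet(key).decrypt(target_embedded_info)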
In addition, in the event that the target embedded information is information that provides permission verification for a target object, then, after obtaining the target embedded information in the object under scrutiny through operation 720, process 700 further comprises: performing verification processing for the target object based on the target embedded information.
In some embodiments, the target object is an entity object awaiting permission verification. The entity object awaiting permission verification could be a user awaiting verification, a vehicle awaiting verification, or other such object.
In some embodiments, process 700 is applied to different contexts, such as code verification, permission checking, and copyright protection. For example, in the event that the application scenario is a vehicle gate management system, a pass in which permission information has been embedded using the process 300 of FIG. 3 can be issued in advance. When the vehicle pass undergoes verification, the vehicle gate management system acquires an image of the pass carried by the vehicle through a connected image sensor such as a camera. Next, the vehicle gate management system acquires the permission information steganographically embedded in the image through process 700 of FIG. 7 and performs a corresponding permission check for the vehicle. In the event that the verification succeeds, the vehicle is permitted to pass. Of course, the above example is merely one application scenario of process 700. In some embodiments, process 700 can be used in other manners based on different application scenarios.
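The gate scenario can be summarized as the following sketch; every name here (camera, capture, extractor, permitted_ids) is hypothetical glue around processes 300 and 700 rather than part of the embodiments.

def verify_vehicle_pass(camera, extractor, permitted_ids: set) -> bool:
    """Capture the pass image, recover the steganographically embedded
    permission information (process 700), and run the permission check."""
    pass_image = camera.capture()             # image of the pass on the vehicle
    permission_info = extractor(pass_image)   # process 700: recover embedded info
    return permission_info in permitted_ids   # open the gate if True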
FIG. 8 is a diagram of an embodiment of a device for performing a steganographic technique. In some embodiments, the device 800 is configured to perform the process 300 of FIG. 3 and comprises: an information acquiring unit 810, a target residual object acquiring unit 820, and a target steganographic object acquiring unit 830.
In some embodiments, the information acquiring unit 810 is configured to acquire a carrier object and to-be-embedded information.
In some embodiments, the target residual object acquiring unit 820 is configured to input the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to acquire a target residual object. In some embodiments, the target steganographic convolutional neural network model corresponds to a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects. In some embodiments, the reverse steganographic convolutional neural network model is configured to obtain reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated.
In some embodiments, the target steganographic object acquiring unit 830 is configured to obtain a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object.
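Structurally, device 800 can be pictured as the following Python sketch, with one method per unit; the model object and the additive step are placeholders consistent with the description above, not a definitive implementation.

class SteganographyDevice:
    """Sketch of device 800: units 810, 820, and 830 as methods."""

    def __init__(self, target_model):
        self.target_model = target_model  # target steganographic CNN model

    def acquire_information(self, carrier, message):
        # Unit 810: acquire the carrier object and to-be-embedded information.
        return carrier, message

    def acquire_target_residual(self, carrier, message):
        # Unit 820: run the target steganographic CNN model.
        return self.target_model(carrier, message)

    def acquire_target_steganographic_object(self, carrier, residual):
        # Unit 830: additive processing of carrier and residual.
        return carrier + residual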
The units described above can be implemented as software components executing on one or more general purpose processors, as hardware such as programmable logic devices and/or Application Specific Integrated Circuits designed to perform certain functions, or a combination thereof. In some embodiments, the units can be embodied by a form of software products which can be stored in a nonvolatile storage medium (such as an optical disk, flash storage device, mobile hard disk, etc.), including a number of instructions for making a computer device (such as a personal computer, server, or network equipment) implement the methods described in the embodiments of the present invention. The units may be implemented on a single device or distributed across multiple devices. The functions of the units may be merged into one another or further split into multiple sub-units.
FIG. 9 is a diagram of an embodiment of a device for detecting information. In some embodiments, the device 900 is configured to perform the process 700 of FIG. 7 and comprises: an object under scrutiny acquiring unit 910 and a target embedded information acquiring unit 920.
In some embodiments, the object under scrutiny acquiring unit 910 is configured to acquire an object under scrutiny in which information was steganographically embedded.
In some embodiments, the target embedded information acquiring unit 920 is configured to input the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain the target embedded information embedded in the object under scrutiny. In some embodiments, the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model. In some embodiments, the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects. In some embodiments, the reverse steganographic convolutional neural network model is used to obtain reverse carrier objects based on steganographic objects obtained from the target steganographic convolutional neural network model. In some embodiments, the reverse carrier objects are objects from which embedded information has been eliminated.
In some embodiments, the present application further provides a commodity object corresponding to the steganographic technique provided by process 300 of FIG. 3. The commodity object comprises the commodity object itself and a target steganographic object attached to it. In some embodiments, the target steganographic object is a steganographic object obtained using the steganographic technique provided by process 300 of FIG. 3.
In some embodiments, the commodity object is a book, a bottled product, a vehicle, or other such commodity object. The target steganographic object attached to the commodity object itself can be a steganographic object in which copyright information is embedded that is attached on or inside a product package, a steganographic object in which virtual currency information is embedded, or a steganographic object in which other information in need of hidden transfer is embedded. In an example where the target steganographic object is a steganographic object in which virtual currency information is embedded, when a user purchases a commodity object, a target steganographic image on the commodity object itself can be recognized with an application provided by the merchant using the process 700 of FIG. 7 for detecting information, and the virtual currency information in the target steganographic image can thus be acquired. The virtual currency information can be electronic currency information, such as product points or virtual exchange currency, which is used to provide value-added services for the commodity object.
Please note that, with regard to how to acquire a target steganographic image and how to recognize embedded information in a target steganographic image, please refer to the descriptions in the relevant sections of process 300 of FIG. 3 and process 700 of FIG. 7.
FIG. 10 is a functional diagram illustrating a programmed computer system for performing a steganographic technique in accordance with some embodiments. As will be apparent, other computer system architectures and configurations can be used to perform a steganographic technique. Computer system 1000, which includes various subsystems as described below, includes at least one microprocessor subsystem (also referred to as a processor or a central processing unit (CPU) ) 1002. For example, processor 1002 can be implemented by a single-chip processor or by multiple processors. In some embodiments, processor 1002 is a general purpose digital processor that controls the operation of the computer system 1000. Using instructions retrieved from memory 1010, the processor 1002 controls the reception and manipulation of input data, and the output and display of data on output devices (e.g., display 1018) .
Processor 1002 is coupled bi-directionally with memory 1010, which can include a first primary storage, typically a random access memory (RAM) , and a second primary storage  area, typically a read-only memory (ROM) . As is well known in the art, primary storage can be used as a general storage area and as scratch-pad memory, and can also be used to store input data and processed data. Primary storage can also store programming instructions and data, in the form of data objects and text objects, in addition to other data and instructions for processes operating on processor 1002. Also as is well known in the art, primary storage typically includes basic operating instructions, program code, data and objects used by the processor 1002 to perform its functions (e.g., programmed instructions) . For example, memory 1010 can include any suitable computer-readable storage media, described below, depending on whether, for example, data access needs to be bi-directional or uni-directional. For example, processor 1002 can also directly and very rapidly retrieve and store frequently needed data in a cache memory (not shown) .
A removable mass storage device 1012 provides additional data storage capacity for the computer system 1000, and is coupled either bi-directionally (read/write) or uni-directionally (read only) to processor 1002. For example, storage 1012 can also include computer-readable media such as magnetic tape, flash memory, PC-CARDS, portable mass storage devices, holographic storage devices, and other storage devices. A fixed mass storage 1020 can also, for example, provide additional data storage capacity. The most common example of mass storage 1020 is a hard disk drive.  Mass storages  1012, 1020 generally store additional programming instructions, data, and the like that typically are not in active use by the processor 1002. It will be appreciated that the information retained within  mass storages  1012 and 1020 can be incorporated, if needed, in standard fashion as part of memory 1010 (e.g., RAM) as virtual memory.
In addition to providing processor 1002 access to storage subsystems, bus 1014 can also be used to provide access to other subsystems and devices. As shown, these can include a display monitor 1018, a network interface 1016, a keyboard 1004, and a pointing device 1006, as well as an auxiliary input/output device interface, a sound card, speakers, and other subsystems as needed. For example, the pointing device 1006 can be a mouse, stylus, track ball, or tablet, and is useful for interacting with a graphical user interface.
The network interface 1016 allows processor 1002 to be coupled to another computer, computer network, or telecommunications network using a network connection as shown. For example, through the network interface 1016, the processor 1002 can receive information (e.g., data objects or program instructions) from another network or output information to another network in the course of performing method/process steps. Information, often represented as a sequence of instructions to be executed on a processor, can be received from and outputted to another network. An interface card or similar device and appropriate software implemented by (e.g., executed/performed on) processor 1002 can be used to connect the computer system 1000 to an external network and transfer data according to standard protocols. For example, various process embodiments disclosed herein can be executed on processor 1002, or can be performed across a network such as the Internet, intranet networks, or local area networks, in conjunction with a remote processor that shares a portion of the processing. Additional mass storage devices (not shown) can also be connected to processor 1002 through network interface 1016.
An auxiliary I/O device interface (not shown) can be used in conjunction with computer system 1000. The auxiliary I/O device interface can include general and customized interfaces that allow the processor 1002 to send and, more typically, receive data from other devices such as microphones, touch-sensitive displays, transducer card readers, tape readers, voice or handwriting recognizers, biometrics readers, cameras, portable mass storage devices, and other computers.
The computer system shown in FIG. 10 is but an example of a computer system suitable for use with the various embodiments disclosed herein. Other computer systems suitable for such use can include additional or fewer subsystems. In addition, bus 1014 is illustrative of any interconnection scheme serving to link the subsystems. Other computer architectures having different configurations of subsystems can also be utilized.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
WHAT IS CLAIMED IS:

Claims (19)

  1. A method, comprising:
    acquiring a carrier object and to-be-embedded information;
    inputting the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object, wherein the target steganographic convolutional neural network model corresponds to a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects, wherein the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained from the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated;
    obtaining a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object; and
    outputting the target steganographic object.
  2. The method as described in claim 1, wherein:
    the target steganographic convolutional neural network model is a residual learning convolutional neural network model; and
    the inputting of the carrier object and the to-be-embedded information into the target steganographic convolutional neural network model to obtain the target residual object comprises:
    inputting the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object.
  3. The method as described in claim 2, wherein the inputting of the carrier object and the to-be-embedded information into the residual learning convolutional neural network model to obtain the target residual object comprises:
    subjecting the carrier object and the to-be-embedded information to fusion processing to obtain object fused information; and
    inputting the object fused information into the residual learning convolutional neural network model to obtain the target residual object.
  4. The method as described in claim 3, wherein the subjecting of the carrier object and the to-be-embedded information to fusion processing to obtain the object fused information comprises:
    subjecting the to-be-embedded information to fusion preprocessing to obtain preprocessed to-be-embedded information, wherein spatial dimensions of the preprocessed to-be-embedded information are the same as spatial dimensions of the carrier object; and
    acquiring the object fused information based on the preprocessed to-be-embedded information and the carrier object.
  5. The method as described in claim 4, wherein the subjecting of the to-be-embedded information to fusion preprocessing to obtain the preprocessed to-be-embedded information comprises:
    subjecting the to-be-embedded information to replication processing based on spatial dimension information of the carrier object to obtain the preprocessed to-be-embedded information having the same spatial dimensions as the carrier object.
  6. The method as described in claim 4, wherein the acquiring of the object fused information based on the preprocessed to-be-embedded information and the carrier object comprises:
    subjecting the preprocessed to-be-embedded information and the carrier object to additive processing to obtain the object fused information.
  7. The method as described in claim 3, wherein:
    the residual learning convolutional neural network model comprises at least one residual block and at least one fully connected layer;
    a number of channels of the at least one fully connected layer is the same as a number of channels of the carrier object; and
    the inputting of the object fused information into the residual learning convolutional neural network model to obtain the target residual object comprises:
    processing, using the at least one residual block, the object fused information to obtain a to-be-processed residual object; and
    processing, using the at least one fully connected layer, the to-be-processed residual object to obtain the target residual object.
  8. The method as described in claim 1, wherein the obtaining of the target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object comprises:
    subjecting the carrier object and the target residual object to additive processing to obtain the target steganographic object.
  9. The method as described in claim 1, wherein the target steganographic convolutional neural network model is obtained by:
    acquiring an original steganographic convolutional neural network model;
    adversarially training the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge; and
    regarding the converged original steganographic convolutional neural network model as the target steganographic convolutional neural network model.
  10. The method as described in claim 9, wherein the adversarially training of the reverse steganographic convolutional neural network model and the original steganographic convolutional neural network model to cause the original steganographic convolutional neural network model to converge comprises:
    acquiring a first discriminative convolutional neural network model and a second discriminative convolutional neural network model, wherein the first discriminative convolutional neural network model is configured to determine whether a received object includes embedded information and the second discriminative convolutional neural network model is configured to determine whether a received object is an original carrier object or the reverse carrier object; and
    adversarially training the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge.
  11. The method as described in claim 10, wherein the adversarially training of the original steganographic convolutional neural network model, the reverse steganographic convolutional neural network model, the first discriminative convolutional neural network model, and the  second discriminative convolutional neural network model to cause the original steganographic convolutional neural network model to converge comprises:
    acquiring an original training carrier object and original training to-be-embedded information;
    inputting the original training carrier object and the original training to-be-embedded information into the original steganographic convolutional neural network model to obtain an original training steganographic object;
    inputting the original training steganographic object into the reverse steganographic convolutional neural network model to obtain a reverse training carrier object;
    training, using the original training steganographic object and the original training carrier object, the first discriminative convolutional neural network model;
    training, using the original training carrier object and the reverse training carrier object, the second discriminative convolutional neural network model; and
    during the training of the first and second discriminative convolutional neural network models, adjusting, using loss functions of the first and second discriminative convolutional neural network models, parameters of the first and second discriminative convolutional neural network models to cause the original steganographic convolutional neural network model to converge.
  12. The method as described in claim 1, wherein the inputting of the carrier object and the to-be-embedded information into the target steganographic convolutional neural network model to obtain the target residual object comprises:
    subjecting the to-be-embedded information to encryption processing to obtain encrypted to-be-embedded information; and
    inputting the carrier object and the encrypted to-be-embedded information into the target steganographic convolutional neural network model to obtain the target residual object.
  13. A method, comprising:
    acquiring an object under scrutiny in which information was steganographically embedded; and
    inputting the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain target embedded information embedded in the object under scrutiny, wherein the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic  convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model, wherein the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects, wherein the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated.
  14. The method as described in claim 13, wherein:
    the embedded information in the object under scrutiny corresponds to information that has undergone encryption processing; and
    the method further comprises:
    subjecting the target embedded information to decryption processing to obtain decrypted target embedded information.
  15. The method as described in claim 13, wherein:
    the target embedded information corresponds to information relating to permission verification for a target object; and
    the method further comprises:
    subjecting the target object to verification processing based on the target embedded information.
  16. A system, comprising:
    a processor; and
    a memory coupled with the processor, wherein the memory is configured to provide the processor with instructions which when executed cause the processor to:
    acquire a carrier object and to-be-embedded information;
    input the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object, wherein the target steganographic convolutional neural network model corresponds to a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects, wherein the  reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained from the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated;
    obtain a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object; and
    output the target steganographic object.
  17. A system, comprising:
    a processor; and
    a memory coupled with the processor, wherein the memory is configured to provide the processor with instructions which when executed cause the processor to:
    acquire an object under scrutiny in which information was steganographically embedded; and
    input the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain target embedded information embedded in the object under scrutiny, wherein the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model, wherein the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects, wherein the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated.
  18. A computer program product being embodied in a tangible non-transitory computer readable storage medium and comprising computer instructions for:
    acquiring a carrier object and to-be-embedded information;
    inputting the carrier object and the to-be-embedded information into a target steganographic convolutional neural network model to obtain a target residual object, wherein  the target steganographic convolutional neural network model corresponds to a model obtained from training using reverse carrier objects output by a reverse steganographic convolutional neural network model as sample objects, wherein the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained from the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated;
    obtaining a target steganographic object in which the to-be-embedded information is embedded based on the carrier object and the target residual object; and
    outputting the target steganographic object.
  19. A computer program product being embodied in a tangible non-transitory computer readable storage medium and comprising computer instructions for:
    acquiring an object under scrutiny in which information was steganographically embedded; and
    inputting the object under scrutiny into a steganographic information extraction convolutional neural network model to obtain target embedded information embedded in the object under scrutiny, wherein the steganographic information extraction convolutional neural network model is a model obtained through adversarially training a target steganographic convolutional neural network model used to generate steganographic objects and a reverse steganographic convolutional neural network model, wherein the target steganographic convolutional neural network model is a model obtained from training using reverse carrier objects output by the reverse steganographic convolutional neural network model as sample objects, wherein the reverse steganographic convolutional neural network model is used to obtain the reverse carrier objects based on steganographic objects obtained with the target steganographic convolutional neural network model, and wherein the reverse carrier objects are objects from which embedded information has been eliminated.
PCT/CN2020/113840 2019-09-11 2020-09-07 Method and system for performing steganographic technique WO2021047482A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910858185.2A CN112487365B (en) 2019-09-11 2019-09-11 Information steganography method and information detection method and device
CN201910858185.2 2019-09-11

Publications (1)

Publication Number Publication Date
WO2021047482A1 true WO2021047482A1 (en) 2021-03-18

Family

ID=74866835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/113840 WO2021047482A1 (en) 2019-09-11 2020-09-07 Method and system for performing steganographic technique

Country Status (2)

Country Link
CN (1) CN112487365B (en)
WO (1) WO2021047482A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113076549B (en) * 2021-04-08 2023-05-12 上海电力大学 Novel U-Net structure generator-based countermeasures network image steganography method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920206B (en) * 2017-03-16 2020-04-14 广州大学 Steganalysis method based on antagonistic neural network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180068429A1 (en) * 2015-04-15 2018-03-08 Institute Of Automation Chinese Academy Of Sciences Image Steganalysis Based on Deep Learning
CN108346125A (en) * 2018-03-15 2018-07-31 中山大学 A kind of spatial domain picture steganography method and system based on generation confrontation network
CN108764270A (en) * 2018-04-03 2018-11-06 上海大学 A kind of Information Hiding & Detecting method integrated using convolutional neural networks
CN108648135A (en) * 2018-06-01 2018-10-12 深圳大学 Hide model training and application method, device and computer readable storage medium
CN108961137A (en) * 2018-07-12 2018-12-07 中山大学 A kind of image latent writing analysis method and system based on convolutional neural networks

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113077377A (en) * 2021-05-13 2021-07-06 海南大学 Color image steganography method based on generation countermeasure network
WO2022241307A1 (en) * 2021-05-14 2022-11-17 Cornell University Image steganography utilizing adversarial perturbations
CN114900586A (en) * 2022-04-28 2022-08-12 中国人民武装警察部队工程大学 Information steganography method and device based on DCGAN
CN114900586B (en) * 2022-04-28 2024-04-16 中国人民武装警察部队工程大学 Information steganography method and device based on DCGAN
US12033233B2 (en) 2022-05-16 2024-07-09 Cornell University Image steganography utilizing adversarial perturbations
CN116016798A (en) * 2023-01-06 2023-04-25 中国人民解放军国防科技大学 Passive information hiding method and system based on feature mapping
CN116016798B (en) * 2023-01-06 2024-05-17 中国人民解放军国防科技大学 Passive information hiding method and system based on feature mapping
CN116664599A (en) * 2023-06-01 2023-08-29 云南大学 Image steganalysis method based on steganography area prediction
CN116664599B (en) * 2023-06-01 2024-06-11 云南大学 Image steganalysis method based on steganography area prediction

Also Published As

Publication number Publication date
CN112487365B (en) 2024-01-30
CN112487365A (en) 2021-03-12

Similar Documents

Publication Publication Date Title
WO2021047482A1 (en) Method and system for performing steganographic technique
US20230119080A1 (en) Face verification method and apparatus
WO2020207189A1 (en) Method and device for identity authentication, storage medium, and computer device
CN108509915B (en) Method and device for generating face recognition model
US20220044009A1 (en) Face verifying method and apparatus
CN108664782B (en) Face verification method and device
US11444774B2 (en) Method and system for biometric verification
CN110222573B (en) Face recognition method, device, computer equipment and storage medium
KR102137329B1 (en) Face Recognition System for Extracting Feature Vector Using Face Recognition Model Based on Deep Learning
WO2022033220A1 (en) Face liveness detection method, system and apparatus, computer device, and storage medium
CN111507386B (en) Method and system for detecting encryption communication of storage file and network data stream
KR102161359B1 (en) Apparatus for Extracting Face Image Based on Deep Learning
EP4085369A1 (en) Forgery detection of face image
KR20210069388A (en) Face Recognition System For Real Image Judgment Using Face Recognition Model Based on Deep Learning
KR102184493B1 (en) System for Face Recognition Based On AI
US20190347472A1 (en) Method and system for image identification
US11348364B2 (en) Method and system for neural fingerprint enhancement for fingerprint recognition
KR102184490B1 (en) Edge Device for Face Recognition
US20220198711A1 (en) Learned forensic source system for identification of image capture device models and forensic similarity of digital images
KR102308122B1 (en) Server And System for Face Recognition Using The Certification Result
CN116452886A (en) Image recognition method, device, equipment and storage medium
CN113518061B (en) Data transmission method, equipment, device, system and medium in face recognition
KR102312152B1 (en) Face Recognition Server For Reflecting Space-Time Environment and Face Recognition System Having The Same
Shibel et al. Deep learning detection of facial biometric presentation attack
Pasqualino et al. A multi camera unsupervised domain adaptation pipeline for object detection in cultural sites through adversarial learning and self-training

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20863510

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20863510

Country of ref document: EP

Kind code of ref document: A1