CN110362683A - Information steganography method based on a recurrent neural network, device and storage medium - Google Patents

Information steganography method based on a recurrent neural network, device and storage medium Download PDF

Info

Publication number
CN110362683A
CN110362683A (application CN201910559075.6A)
Authority
CN
China
Prior art keywords
word
steganography
connection
text
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910559075.6A
Other languages
Chinese (zh)
Inventor
王仕豪 (Wang Shihao)
李千目 (Li Qianmu)
龙华秋 (Long Huaqiu)
容振邦 (Rong Zhenbang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuyi University
Original Assignee
Wuyi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuyi University
Priority to CN201910559075.6A
Publication of CN110362683A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Machine Translation (AREA)

Abstract

The invention discloses an information steganography method based on a recurrent neural network, comprising the following steps: collect text and feed it into a recurrent neural network for classification, obtaining the connection word sets and combined probabilities of a number of words, which effectively reflect the relationship between each word and the connection words in its set; encode the connection words according to each word's connection word set and combined probabilities, obtaining a trained information steganography model capable of performing steganographic processing on the information in a sentence; collect the text to be hidden, convert it into a binary code at the transmitting end, and input the code into the information steganography model, so that the resulting steganographic text code effectively encrypts the important information in the sentence and prevents the code from being cracked; at the receiving end, convert the steganographic text code back into steganographic text and input it into the information steganography model for decoding. The decoding accuracy is high, ensuring that the important information of the ciphertext is conveyed accurately.

Description

Information steganography method based on a recurrent neural network, device and storage medium
Technical field
The present invention relates to the field of information security, and in particular to an information steganography method, device and storage medium based on a recurrent neural network.
Background art
With the continuous development of information security technology, users pay increasing attention to the transmission of private information. Existing information steganography techniques usually encode the secret information into a batch of carriers and then select different carriers for transmission, thereby conveying the hidden message. These techniques are flawed, however: for carriers such as text, which have a high information coding density and low redundancy, the generated ciphertext is of poor quality, so that the original carrier becomes hard to read, or the important information is not effectively encrypted, leaving the ciphertext vulnerable to attack, tampering and even cracking.
Summary of the invention
To solve the above problems, the object of the present invention is to provide an information steganography method, device and storage medium based on a recurrent neural network that can perform steganographic processing on text information, produce ciphertext of high concealment quality, effectively encrypt the important information, and prevent the ciphertext from being cracked; at the same time, the ciphertext can be decoded with high accuracy, ensuring that the important information of the text is conveyed accurately.
The technical solution adopted by the present invention is as follows. In a first aspect, an embodiment of the present invention proposes an information steganography method based on a recurrent neural network, comprising the following steps:
collecting text and feeding it into a recurrent neural network for classification, to obtain the connection word sets and combined probabilities of a number of words, each connection word set containing several connection words, each connection word being adjacent to its word and arranged immediately after it;
encoding the connection words according to the word's connection word set and combined probabilities, to obtain a trained information steganography model;
collecting the text to be hidden, converting it into a binary code at the transmitting end, and inputting the binary code into the information steganography model, to obtain a steganographic text code;
converting the steganographic text code into steganographic text at the receiving end, and inputting the steganographic text into the information steganography model, to obtain the decoded text.
Further, collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words comprises the following steps:
collecting text, extracting the first word of each sentence with the recurrent neural network, and selecting a number of first words by frequency to form a keyword list;
mapping the text into a semantic space with the recurrent neural network, to obtain a number of words carrying semantic-space representations;
inputting the words into an LSTM hidden layer for a preliminary extraction of weights, to obtain the initial weights of the words;
connecting a softmax activation layer after the LSTM hidden layer, and inputting the words and their initial weights into the softmax activation layer for classification, to obtain the connection word sets and combined probabilities of a number of words.
Further, collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words further comprises the following step:
computing the total loss of the combined probabilities of the words with a loss function, and minimizing the total loss by backward adjustment of the recurrent neural network, to obtain the optimal combined probabilities of the words.
Further, encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model comprises the following step:
constructing the connection Huffman tree of the word from its connection word set and combined probabilities, to obtain the codes of the connection words, whereupon the training of the information steganography model is complete.
Further, encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model further comprises the following step:
constructing a connection candidate pool from the connection Huffman tree of the word, the connection candidate pool containing the connection words and the codes of the connection words.
Further, collecting the text to be hidden, converting it into a binary code at the transmitting end and inputting the binary code into the information steganography model comprises the following steps:
collecting the text to be hidden, extracting any first word of the keyword list with the recurrent neural network, and matching the connection Huffman tree corresponding to that first word, to obtain a first connection Huffman tree;
converting, at the transmitting end, the text to be hidden into a binary code according to a binary coding table, and inputting the binary code into the first connection Huffman tree for code matching, to obtain a first connection word and an unmatched first binary code;
matching the connection Huffman tree corresponding to the first connection word and, if the match succeeds, obtaining a second connection Huffman tree; if the match fails, extracting another arbitrary first word of the keyword list and matching a third connection Huffman tree from that other first word;
inputting the unmatched first binary code into the second or the third connection Huffman tree for code matching, to obtain a second connection word, and continuing to match the corresponding connection Huffman trees from the second connection word until the first binary code is fully matched, to obtain a number of third connection words;
combining the matched first connection word, second connection word and third connection words, to obtain the steganographic text;
converting the steganographic text according to the binary coding table, to obtain the steganographic text code.
Further, converting the steganographic text code into steganographic text at the receiving end and inputting the steganographic text into the information steganography model to obtain the decoded text comprises the following steps:
converting, at the receiving end, the steganographic text code into steganographic text according to the binary coding table;
extracting the words of the steganographic text with the recurrent neural network, to obtain the sequence words arranged in order;
matching the corresponding connection Huffman trees in the order of the sequence words, to obtain the sequence code;
converting the sequence code according to the binary coding table, to obtain the decoded text.
In a second aspect, an embodiment of the present invention further proposes an information steganography device based on a recurrent neural network, comprising at least one control processor and a memory communicatively connected to the at least one control processor; the memory stores instructions executable by the at least one control processor, and the instructions, when executed by the at least one control processor, enable the at least one control processor to perform an information steganography method based on a recurrent neural network as described in any of the above.
In a third aspect, an embodiment of the present invention further proposes a computer-readable storage medium storing computer-executable instructions for causing a computer to perform an information steganography method based on a recurrent neural network as described in any of the above.
The technical solution provided in the embodiments of the present invention has at least the following beneficial effects. Feeding text into a recurrent neural network for classification yields the connection word sets and combined probabilities of a number of words, and the combined probabilities effectively reflect the relationship between each word and the connection words in its set, which facilitates the steganographic processing of text. Encoding the connection words according to each word's connection word set and combined probabilities yields a trained information steganography model that can perform steganographic processing on the information in a sentence. The transmitting end converts the text to be hidden into a binary code, which effectively improves the efficiency of processing that text, and inputs the binary code into the information steganography model to obtain the steganographic text code, i.e. the ciphertext; the steganographic text code effectively encrypts the important information in the sentence and prevents the code from being cracked. The receiving end converts the steganographic text code back into steganographic text and inputs it into the information steganography model for decoding; the decoding accuracy is high, ensuring that the important information of the ciphertext is conveyed accurately.
Description of the drawings
The invention is further described below by way of example with reference to the accompanying drawings:
Fig. 1 is an overall flow chart of one embodiment of the information steganography method based on a recurrent neural network according to the invention;
Fig. 2 is a flow chart, for one embodiment of the method, of collecting text and feeding it into the recurrent neural network for classification;
Fig. 3 is a flow chart, for one embodiment of the method, of the receiving end converting the steganographic text code into steganographic text and inputting it into the information steganography model.
Detailed description of the embodiments
Existing information steganography techniques usually encode the secret information into a batch of carriers and then select different carriers for transmission, thereby conveying the hidden message. These techniques are flawed, however: for carriers such as text, which have a high information coding density and low redundancy, the generated ciphertext is of poor quality, so that the original carrier becomes hard to read, or the important information is not effectively encrypted, leaving the text vulnerable to attack, tampering and even cracking.
On this basis, the present invention provides an information steganography method, device and storage medium based on a recurrent neural network that can perform steganographic processing on text information, produce ciphertext of high concealment quality, effectively encrypt the important information, and prevent the ciphertext from being cracked; at the same time, the ciphertext can be decoded with high accuracy, ensuring that the important information of the text is conveyed accurately.
The embodiments of the present invention are further elaborated below with reference to the accompanying drawings.
Referring to Fig. 1, an embodiment of the present invention provides an information steganography method based on a recurrent neural network, comprising the following steps:
Step S100: collect text, feed it into a recurrent neural network for classification, and obtain the connection word sets and combined probabilities of a number of words, each connection word set containing several connection words, each connection word being adjacent to its word and arranged immediately after it;
Step S200: encode the connection words according to each word's connection word set and combined probabilities, obtaining a trained information steganography model;
Step S300: collect the text to be hidden, convert it into a binary code at the transmitting end, and input the binary code into the information steganography model, obtaining the steganographic text code;
Step S400: convert the steganographic text code into steganographic text at the receiving end, and input the steganographic text into the information steganography model, obtaining the decoded text.
In this embodiment, the recurrent neural network of step S100 is an artificial neural network with a tree-like hierarchical structure whose nodes process the input information recursively in the order of their connections. It is a deep learning algorithm with a flexible topology and shared weights, well suited to machine learning tasks involving structural relations, and it trains effectively. Feeding the text into the recurrent neural network for classification yields the connection word sets and combined probabilities of a number of words; the combined probabilities effectively reflect the relationship between each word and the connection words in its set, which facilitates the steganographic processing of the text. Step S200 encodes the connection words according to each word's connection word set and combined probabilities and obtains the trained information steganography model, realizing the steganographic processing of text information.
In step S300 the transmitting end converts the text to be hidden into a binary code, which effectively improves the efficiency of processing that text; the conversion can be realized with the ASCII binary coding table, ASCII being the most widely used information-interchange standard today. The binary code is input into the information steganography model to obtain the steganographic text code, i.e. the ciphertext; the steganographic text code effectively encrypts the important information in the sentence and prevents the code from being cracked.
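As an illustration of this conversion step, the following minimal Python sketch (the function names are ours, not the patent's) maps a secret text to a bit string via 8-bit ASCII and back:

import math  # not needed here; only the built-ins below are used

def text_to_bits(text: str) -> str:
    """Encode a secret text as a bit string, one 8-bit ASCII byte per character."""
    return "".join(format(b, "08b") for b in text.encode("ascii"))

def bits_to_text(bits: str) -> str:
    """Inverse mapping: regroup the bits into bytes and decode them as ASCII."""
    chars = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return bytes(int(c, 2) for c in chars).decode("ascii")

assert bits_to_text(text_to_bits("secret")) == "secret"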
In step S400 the receiving end converts the steganographic text code back into steganographic text and inputs the steganographic text into the information steganography model for decoding; the accuracy of the result is high, ensuring that the important information of the text is conveyed accurately.
Further, referring to Fig. 2, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words comprises the following steps:
Step S110: collect text, extract the first word of each sentence with the recurrent neural network, and select a number of first words by frequency to form a keyword list;
Step S120: map the text into a semantic space with the recurrent neural network, obtaining a number of words carrying semantic-space representations;
Step S130: input the words into an LSTM hidden layer for a preliminary extraction of weights, obtaining the initial weights of the words;
Step S140: connect a softmax activation layer after the LSTM hidden layer, and input the words and their initial weights into the softmax activation layer for classification, obtaining the connection word sets and combined probabilities of a number of words.
In this embodiment, the text of step S110 consists of several sentences. The recurrent neural network extracts the first word of each sentence to obtain the set of first words, computes how frequently each first word occurs in the set, and selects the most frequent first words to form the keyword list. Preferably, 500 first words are chosen to form the keyword list; the keyword list then effectively reflects the common sentence-opening words, and opening with a first word makes the connection between sentences smoother.
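By way of illustration only (the patent gives no code), the keyword-list construction can be sketched in Python as counting sentence-opening words and keeping the most frequent ones, with the preferred cut-off of 500:

from collections import Counter

def build_keyword_list(sentences, k=500):
    """sentences: list of token lists; returns the k most frequent first words."""
    first_words = Counter(s[0] for s in sentences if s)
    return [w for w, _ in first_words.most_common(k)]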
Step S120 feeds the text into the recurrent neural network, each sentence being treated as a sequential signal in which the i-th word Words_i of the sentence is the signal at time step i. The first layer of the recurrent neural network is an embedding layer, which extracts the words of the text and maps them into a semantic space, giving each word a representation in a dense d-dimensional semantic space, i.e. Words_i ∈ R^d, where Words_i, the i-th word of sentence S, contains d elements and R^d is the d-dimensional dense semantic space. Each sentence S can then be represented by a matrix S ∈ R^{L×d}, L being the length of the sentence, that is:

S = [Words_1, Words_2, …, Words_L]^T
In step S130 the words are input into the LSTM hidden layer, which comprises several hidden layers and several LSTM network layers, one LSTM network layer being connected after each hidden layer. The hidden layers are RNN network layers and have short-term memory, while the LSTM network layers have long-term memory; the LSTM hidden layer therefore has the characteristic of long-term memory, can store the important information in the network, and increases the degree of association between pieces of information.
Since each hidden layer contains multiple neural-network units, suppose the number of units of the j-th hidden layer U_j is n; the output units obtained by passing the words through U_j can then be written as:

U_j = (u_1^j, u_2^j, …, u_n^j)

For the first hidden layer, at time step t each output unit u_i^1 is computed as a weighted sum of the elements of Words_t, that is:

u_i^1 = Σ_{k=1..d} w_{i,k}^1 · Words_{t,k} + b_i^1

where w_{i,k}^1 and b_i^1 are the weight and bias obtained by training. At time step t, the output value obtained by passing these output units through the first LSTM network layer is:

o_t^1 = LSTM(u_1^1, u_2^1, …, u_n^1)

Thus, at time step t, after a sentence has passed through the first hidden layer and the first LSTM network layer, the total output value of the sentence is:

O^1 = (o_1^1, o_2^1, …, o_t^1)

Adjacent LSTM network layers and hidden layers are connected by transmission matrices; suppose the transmission matrix between the (L−1)-th LSTM network layer and the L-th hidden layer is w_l, so that the L-th hidden layer receives w_l · o^{L−1}. At time step t, each output unit u_i^L of the L-th hidden layer is computed as a weighted sum of the output values of the (L−1)-th LSTM network layer, that is:

u_i^L = Σ_k w_{i,k}^L · o_{t,k}^{L−1} + b_i^L

where w_{i,k}^L and b_i^L are the weight and bias obtained by training. At time step t, the output value obtained by passing these output units through the L-th LSTM network layer is:

o_t^L = LSTM(u_1^L, u_2^L, …, u_n^L)

so that, at time step t, after sentence S has passed through the L-th hidden layer and the L-th LSTM network layer, the total output value of sentence S is:

O^L = (o_1^L, o_2^L, …, o_t^L)

Hence, supposing the LSTM hidden layer comprises NL hidden layers and NL LSTM network layers, after the text has passed through the NL hidden layers and NL LSTM network layers, the initial weight y_t^{NL} of each word and the set of initial weights Y = (y_1^{NL}, y_2^{NL}, …) are obtained.
Step S140 connects a softmax activation layer after the LSTM hidden layer, i.e. after the last LSTM network layer. The LSTM hidden layer feeds the words and their initial weights into the softmax activation layer, which processes the initial weights and obtains the combining weight of each word:

p_i = exp(w_p · y_i) / Σ_{j=1..N} exp(w_p · y_j)

where w_p is the prediction weight and N is the number of words in the text.
According to the position distribution of the words in the text, the softmax activation layer processes the words of the LSTM hidden layer and obtains a number of adjacent phrases, an adjacent phrase consisting of a word and the connection word arranged immediately after it. The adjacent phrases are then classified by word, giving the connection word set of each word, each connection word set containing several connection words; from the connection word set, the combined probability of the word with each of its connection words is computed, normalized so that:

Σ_{c ∈ C(w)} p(c | w) = 1

i.e. the combined probabilities of a word with all the connection words in its set sum to 1. The combined probability effectively reflects the relationship between the word and its connection words, which facilitates the steganographic processing of the text.
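A minimal sketch of this classification step, assuming the LSTM stack has already produced one score y_i per candidate connection word and that w_p is the trained prediction weight (the function name is ours):

import math

def connection_probabilities(scores, w_p=1.0):
    """Softmax over the candidate connection words; the outputs sum to 1."""
    exps = [math.exp(w_p * y) for y in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = connection_probabilities([2.1, 0.3, -1.0])
assert abs(sum(probs) - 1.0) < 1e-9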
Further, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words further comprises the following step:
Step S150: compute the total loss of the combined probabilities of the words with a loss function, and minimize the total loss by backward adjustment of the recurrent neural network, obtaining the optimal combined probabilities of the words.
In this embodiment, step S150 defines the total loss of the recurrent neural network as the sum of the negative logarithms of the combined probabilities over the connection word sets of all words, i.e. the sum, over all words and their connection words, of the negative log combined probability:

Loss = − Σ_w Σ_{c ∈ C(w)} log p(c | w)

During training, the backward adjustment is realized with the back-propagation algorithm: the parameters of the recurrent neural network are updated so that the total loss reaches its minimum, optimizing the network, yielding the optimal combined probabilities of the connection word sets of the words and reducing their error.
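Under the stated definition, the objective can be sketched as follows (a plain-Python illustration; `combined_prob` is our hypothetical mapping from each observed (word, connection word) pair to its model probability):

import math

def total_loss(observed_pairs, combined_prob):
    """Sum of negative log combined probabilities over all observed pairs."""
    return -sum(math.log(combined_prob[pair]) for pair in observed_pairs)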
Further, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model comprises the following step:
Step S210: construct the connection Huffman tree of the word from its connection word set and combined probabilities, obtaining the codes of the connection words, whereupon the training of the information steganography model is complete.
In this embodiment, step S210 sorts the connection words in the word's connection word set by the magnitude of their combined probabilities and constructs the word's connection Huffman tree from the sorted connection words and probabilities. The connection Huffman tree encodes the combined probabilities of the connection words with a variable-length code table: connection words with larger combined probabilities receive shorter codes, and those with smaller combined probabilities receive longer ones, so that the average code length, i.e. the expected length of a connection word's code string, is reduced, achieving lossless data compression. Once the training of the information steganography model is complete, the model can perform steganographic processing on all the words of a text and obtain steganographic text with a high degree of concealment.
In the information steganography model, each word has a corresponding connection Huffman tree, and the connection Huffman tree stores the connection words and the codes of the connection words.
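The construction of one word's connection Huffman tree can be sketched with Python's heapq (a minimal illustration under the assumption that the connection words and their combined probabilities are already given; in a real tree the words sit at the leaves and the bits label the branches):

import heapq
import itertools

def huffman_codes(conn_probs):
    """Map each connection word to a prefix-free code; higher probability, shorter code."""
    tiebreak = itertools.count()  # keeps heapq from ever comparing the dict payloads
    heap = [(p, next(tiebreak), {w: ""}) for w, p in conn_probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least probable subtree takes bit 0
        p1, _, c1 = heapq.heappop(heap)  # next least probable takes bit 1
        merged = {w: "0" + c for w, c in c0.items()}
        merged.update({w: "1" + c for w, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

codes = huffman_codes({"morning": 0.5, "evening": 0.3, "afternoon": 0.2})
# -> {'morning': '0', 'afternoon': '10', 'evening': '11'}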
Further, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model further comprises the following step:
Step S220: construct a connection candidate pool from the connection Huffman tree of the word, the connection candidate pool containing the connection words and the codes of the connection words.
In this embodiment, step S220 builds the connection candidate pool from the connection words and codes in the word's connection Huffman tree. The resulting connection candidate pool corresponds to the connection Huffman tree: both contain the connection words and their codes, i.e. each word has a corresponding connection Huffman tree and connection candidate pool, so that the text to be encoded can be matched in several ways and a suitable one chosen according to actual needs, improving the speed of steganography. When the connection Huffman tree of a word in the sentence to be encoded is deep, the connection candidate pool can be chosen for matching the connection words, which accelerates both the matching of connection words and the steganographic processing of the text.
When using the information steganography model, the user can choose the size of the connection candidate pool according to the capacity of the actual carrier, the connection words of the chosen pool being extracted in order of combined probability; choosing the pool size lightens the load on the carrier and speeds up encoding. If, after the pool size has been chosen, matching a word's connection word against the connection candidate pool fails, the word's connection Huffman tree is matched instead, so that the matching of the connection word's code is still realized. The carrier may be a terminal such as a computer or mobile phone.
Further, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which collecting the text to be hidden, converting it into a binary code at the transmitting end and inputting the binary code into the information steganography model comprises the following steps:
Step S310: collect the text to be hidden, extract any first word of the keyword list with the recurrent neural network, and match the connection Huffman tree corresponding to that first word, obtaining the first connection Huffman tree;
Step S320: at the transmitting end, convert the text to be hidden into a binary code according to the binary coding table, and input the binary code into the first connection Huffman tree for code matching, obtaining the first connection word and the unmatched first binary code;
Step S330: match the connection Huffman tree corresponding to the first connection word; if the match succeeds, obtain the second connection Huffman tree; if the match fails, extract another arbitrary first word of the keyword list and match the third connection Huffman tree from that other first word;
Step S340: input the unmatched first binary code into the second or the third connection Huffman tree for code matching, obtaining the second connection word, and continue matching the corresponding connection Huffman trees from the second connection word until the first binary code is fully matched, obtaining a number of third connection words;
Step S350: combine the matched first connection word, second connection word and third connection words, obtaining the steganographic text;
Step S360: convert the steganographic text according to the binary coding table, obtaining the steganographic text code.
In this embodiment, step S310 extracts any first word of the keyword list with the recurrent neural network, takes that first word as the beginning of the steganographic text, and matches the corresponding first connection Huffman tree of the first word; the first connection Huffman tree stores all the connection words of the first word and their corresponding codes.
In step S320 the transmitting end converts the text to be hidden into a binary code according to the binary coding table. While the binary code is being matched against the codes in the first connection Huffman tree, the tree is traversed from the root node, matching the leading digits of the binary code. The first connection Huffman tree contains several paths, each path containing several leaf nodes, and each leaf node corresponds to one binary digit, e.g. 0 or 1; the first connection Huffman tree therefore has a unique path matching the first n digits of the binary code. When the binary code matches a path, the code of the first connection word is read from the corresponding leaf nodes along the path, yielding the first connection word. The binary coding table may be the ASCII binary coding table.
In steps S330 and S340, the connection Huffman tree corresponding to the first connection word is matched. If the match succeeds, the second connection Huffman tree is obtained, which stores all the connection words of the first connection word and their corresponding codes; the unmatched first binary code is input into the second connection Huffman tree for code matching, yielding the second connection word. If the match fails, the recurrent neural network extracts another arbitrary first word of the keyword list and matches the third connection Huffman tree from that other first word; the third connection Huffman tree stores all the connection words of the other first word and their corresponding codes, and the unmatched first binary code is input into the third connection Huffman tree for code matching, yielding the second connection word. Steps S330 and S340 are then repeated, continuing to match the connection Huffman tree corresponding to each newly matched connection word, until the first binary code is fully matched, yielding a number of third connection words.
Step S350 combines the matched first connection word, second connection word and third connection words, joining them in the order in which the matching was completed, and obtains the steganographic text. Step S360 converts the steganographic text according to the binary coding table, obtains the steganographic text code, and transmits the steganographic text code to the receiving end. The receiving end must decode the steganographic text code with the same steganographic scheme; otherwise the information of the decoded text is entirely inconsistent with the transmitting end's text to be encoded. The steganographic text code therefore effectively encrypts the important information of the text and prevents the text from being cracked or tampered with.
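The embedding loop of steps S310-S360 can be sketched as follows (our own illustrative code, ignoring the corner case in which a restart keyword is itself a coded connection word; `trees` maps each word to its code table from huffman_codes() above, and each table is assumed to hold at least two connection words):

def embed(bits, trees, keywords):
    """Consume `bits` by walking from word to word through the Huffman code tables."""
    stego = [keywords[0]]                 # any keyword may open the steganographic text
    i = 0
    while i < len(bits):
        table = trees.get(stego[-1], {})
        match = next(((w, c) for w, c in table.items()
                      if bits.startswith(c, i)), None)
        if match is None:                 # matching failed: restart from a keyword
            if stego[-1] == keywords[0]:
                raise ValueError("bits cannot be consumed; pad the bit string")
            stego.append(keywords[0])
            continue
        stego.append(match[0])            # the connection word carries len(code) bits
        i += len(match[1])
    return stego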
Further, referring to Fig. 3, another embodiment of the invention provides an information steganography method based on a recurrent neural network, in which converting the steganographic text code into steganographic text at the receiving end and inputting the steganographic text into the information steganography model to obtain the decoded text comprises the following steps:
Step S410: at the receiving end, convert the steganographic text code into steganographic text according to the binary coding table;
Step S420: extract the words of the steganographic text with the recurrent neural network, obtaining the sequence words arranged in order;
Step S430: match the corresponding connection Huffman trees in the order of the sequence words, obtaining the sequence code;
Step S440: convert the sequence code according to the binary coding table, obtaining the decoded text.
In this embodiment, step S420 extracts the sequence words of the steganographic text with the recurrent neural network, which makes it convenient to match the corresponding connection Huffman trees and improves decoding speed. Step S430 matches the corresponding connection Huffman trees in the order of the sequence words: once the connection Huffman tree of the first sequence word has been matched, the second sequence word is input into the connection Huffman tree of the first sequence word and its code is matched; then the connection Huffman tree of the second sequence word is matched, the third sequence word is input into it, and the code of the third sequence word is matched; these operations continue until all sequence words have been coded, and all the codes are joined in the order of matching to obtain the sequence code. Step S440 converts the sequence code according to the binary coding table and obtains the decoded text. When the transmitting end and the receiving end use the same information steganography model, the text information is conveyed accurately with little error.
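The receiver's side mirrors the loop above; a companion sketch (same assumptions as embed()) recovers the bit string by reading off, for each adjacent pair of sequence words, the code of the second word in the first word's table:

def extract(stego, trees, keywords):
    """Recover the hidden bit string from the sequence words of the steganographic text."""
    bits = []
    for prev, word in zip(stego, stego[1:]):
        code = trees.get(prev, {}).get(word)
        if code is not None:
            bits.append(code)             # a coded connection word contributes its bits
        elif word not in keywords:
            raise ValueError(f"{word!r} is neither coded nor a restart keyword")
    return "".join(bits)

# With the same model on both ends: bits_to_text(extract(embed(...), trees, keywords))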
In addition, referring to Figs. 1-3, another embodiment of the invention provides an information steganography method based on a recurrent neural network, comprising the following steps:
Step S110: collect text, extract the first word of each sentence with the recurrent neural network, and select a number of first words by frequency to form a keyword list;
Step S120: map the text into a semantic space with the recurrent neural network, obtaining a number of words carrying semantic-space representations;
Step S130: input the words into the LSTM hidden layer for a preliminary extraction of weights, obtaining the initial weights of the words;
Step S140: connect a softmax activation layer after the LSTM hidden layer, and input the words and their initial weights into the softmax activation layer for classification, obtaining the connection word sets and combined probabilities of a number of words;
Step S150: compute the total loss of the combined probabilities of the words with a loss function, and minimize the total loss by backward adjustment of the recurrent neural network, obtaining the optimal combined probabilities of the words;
Step S210: construct the connection Huffman tree of each word from its connection word set and combined probabilities, obtaining the codes of the connection words;
Step S220: construct the connection candidate pool from the word's connection Huffman tree, the connection candidate pool containing the connection words and their codes, whereupon the training of the information steganography model is complete;
Step S310: collect the text to be hidden, extract any first word of the keyword list with the recurrent neural network, and match the connection Huffman tree corresponding to that first word, obtaining the first connection Huffman tree;
Step S320: at the transmitting end, convert the text to be hidden into a binary code according to the binary coding table, and input the binary code into the first connection Huffman tree for code matching, obtaining the first connection word and the unmatched first binary code;
Step S330: match the connection Huffman tree corresponding to the first connection word; if the match succeeds, obtain the second connection Huffman tree; if the match fails, extract another arbitrary first word of the keyword list and match the third connection Huffman tree from that other first word;
Step S340: input the unmatched first binary code into the second or the third connection Huffman tree for code matching, obtaining the second connection word, and continue matching the corresponding connection Huffman trees until the first binary code is fully matched, obtaining a number of third connection words;
Step S350: combine the matched first connection word, second connection word and third connection words, obtaining the steganographic text;
Step S360: convert the steganographic text according to the binary coding table, obtaining the steganographic text code;
Step S410: at the receiving end, convert the steganographic text code into steganographic text according to the binary coding table;
Step S420: extract the words of the steganographic text with the recurrent neural network, obtaining the sequence words arranged in order;
Step S430: match the corresponding connection Huffman trees in the order of the sequence words, obtaining the sequence code;
Step S440: convert the sequence code according to the binary coding table, obtaining the decoded text.
In this embodiment, feeding the text into the recurrent neural network for classification yields the connection word sets and combined probabilities of a number of words; the combined probabilities effectively reflect the relationship between each word and the connection words in its set, which facilitates the steganographic processing of the text. Encoding the connection words according to each word's connection word set and combined probabilities yields the trained information steganography model, which can perform steganographic processing on the information in a sentence. The transmitting end converts the text to be hidden into a binary code according to the binary coding table, which effectively improves the efficiency of processing that text, and inputs the binary code into the information steganography model to obtain the steganographic text code, i.e. the ciphertext; the steganographic text code effectively encrypts the important information in the sentence and prevents the code from being cracked. The receiving end converts the steganographic text code into steganographic text and inputs the steganographic text into the information steganography model for decoding; the decoding accuracy is high, ensuring that the important information of the ciphertext is conveyed accurately.
In addition, another embodiment of the invention provides an information steganography device based on a recurrent neural network, comprising at least one control processor and a memory communicatively connected to the at least one control processor; the memory stores instructions executable by the at least one control processor, and the instructions, when executed by the at least one control processor, enable the at least one control processor to perform an information steganography method based on a recurrent neural network as described in any of the above.
In this embodiment, the steganography device comprises one or more control processors and a memory, which may be connected by a bus or in other ways.
As a non-transitory computer-readable storage medium, the memory can be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the steganography method in the embodiments of the present invention. By running the non-transitory software programs, instructions and modules stored in the memory, the control processor performs the various functional applications and data processing of the steganography device, i.e. realizes the steganography method of the above method embodiments.
The memory may comprise a program storage area and a data storage area, the program storage area storing an operating system and the application programs required by at least one function, and the data storage area storing data created by the use of the steganography device, etc. In addition, the memory may comprise high-speed random-access memory and may also comprise non-transitory memory, for example at least one magnetic-disk storage device, flash-memory device or other non-transitory solid-state storage device. In some embodiments, the memory optionally comprises memories arranged remotely from the control processor, and these remote memories can be connected to the steganography device through a network; examples of such networks include, but are not limited to, the Internet, intranets, local-area networks, mobile communication networks and combinations thereof.
The one or more modules are stored in the memory and, when executed by the one or more control processors, perform the steganography method of the above method embodiments, for example the functions of steps S100 to S400, S110 to S150, S210 to S220, S310 to S360 and S410 to S440 of the steganography method described above.
An embodiment of the invention also provides a computer-readable storage medium storing computer-executable instructions, which are executed by one or more control processors, for example by one control processor, and can cause the one or more control processors to perform the steganography method of the above method embodiments, for example the functions of steps S100 to S400, S110 to S150, S210 to S220, S310 to S360 and S410 to S440 described above.
The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, i.e. they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Through the above description of the embodiments, those skilled in the art can clearly understand that the embodiments can be realized by software plus a general hardware platform. Those skilled in the art will appreciate that all or part of the processes of the above method embodiments can be completed by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random-access memory (RAM), or the like.
The above is a description of the preferred embodiments of the invention, but the invention is not limited to those embodiments. Those skilled in the art can also make various equivalent variations or substitutions without departing from the spirit of the invention, and such equivalent variations or substitutions are all included within the scope defined by the claims of the present application.

Claims (9)

1. An information steganography method based on a recurrent neural network, characterized by comprising the following steps:
collecting text and feeding it into a recurrent neural network for classification, to obtain the connection word sets and combined probabilities of a number of words, each connection word set containing several connection words, each connection word being adjacent to its word and arranged immediately after it;
encoding the connection words according to the word's connection word set and combined probabilities, to obtain a trained information steganography model;
collecting the text to be hidden, converting it into a binary code at the transmitting end, and inputting the binary code into the information steganography model, to obtain a steganographic text code;
converting the steganographic text code into steganographic text at the receiving end, and inputting the steganographic text into the information steganography model, to obtain the decoded text.
2. The information steganography method based on a recurrent neural network according to claim 1, characterized in that collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words comprises the following steps:
collecting text, extracting the first word of each sentence with the recurrent neural network, and selecting a number of first words by frequency to form a keyword list;
mapping the text into a semantic space with the recurrent neural network, to obtain a number of words carrying semantic-space representations;
inputting the words into an LSTM hidden layer for a preliminary extraction of weights, to obtain the initial weights of the words;
connecting a softmax activation layer after the LSTM hidden layer, and inputting the words and their initial weights into the softmax activation layer for classification, to obtain the connection word sets and combined probabilities of a number of words.
3. The information steganography method based on a recurrent neural network according to claim 2, characterized in that collecting text and feeding it into the recurrent neural network for classification to obtain the connection word sets and combined probabilities of a number of words further comprises the following step:
computing the total loss of the combined probabilities of the words with a loss function, and minimizing the total loss by backward adjustment of the recurrent neural network, to obtain the optimal combined probabilities of the words.
4. The information steganography method based on a recurrent neural network according to claim 2, characterized in that encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model comprises the following step:
constructing the connection Huffman tree of the word from its connection word set and combined probabilities, to obtain the codes of the connection words, whereupon the training of the information steganography model is complete.
5. The information steganography method based on a recurrent neural network according to claim 4, characterized in that encoding the connection words according to the word's connection word set and combined probabilities to obtain the trained information steganography model further comprises the following step:
constructing a connection candidate pool from the connection Huffman tree of the word, the connection candidate pool containing the connection words and the codes of the connection words.
6. The information steganography method based on a recurrent neural network according to claim 4, characterized in that collecting the text to be hidden, converting it into a binary code at the transmitting end and inputting the binary code into the information steganography model comprises the following steps:
collecting the text to be hidden, extracting any first word of the keyword list with the recurrent neural network, and matching the connection Huffman tree corresponding to that first word, to obtain a first connection Huffman tree;
converting, at the transmitting end, the text to be hidden into a binary code according to a binary coding table, and inputting the binary code into the first connection Huffman tree for code matching, to obtain a first connection word and an unmatched first binary code;
matching the connection Huffman tree corresponding to the first connection word and, if the match succeeds, obtaining a second connection Huffman tree; if the match fails, extracting another arbitrary first word of the keyword list and matching a third connection Huffman tree from that other first word;
inputting the unmatched first binary code into the second or the third connection Huffman tree for code matching, to obtain a second connection word, and continuing to match the corresponding connection Huffman trees from the second connection word until the first binary code is fully matched, to obtain a number of third connection words;
combining the matched first connection word, second connection word and third connection words, to obtain the steganographic text;
converting the steganographic text according to the binary coding table, to obtain the steganographic text code.
7. The information steganography method based on a recurrent neural network according to claim 4, characterized in that converting the steganographic text code into steganographic text at the receiving end and inputting the steganographic text into the information steganography model to obtain the decoded text comprises the following steps:
converting, at the receiving end, the steganographic text code into steganographic text according to the binary coding table;
extracting the words of the steganographic text with the recurrent neural network, to obtain the sequence words arranged in order;
matching the corresponding connection Huffman trees in the order of the sequence words, to obtain the sequence code;
converting the sequence code according to the binary coding table, to obtain the decoded text.
8. An information steganography device based on a recurrent neural network, characterized by comprising at least one control processor and a memory communicatively connected to the at least one control processor; the memory stores instructions executable by the at least one control processor, and the instructions are executed by the at least one control processor so that the at least one control processor is able to perform the information steganography method based on a recurrent neural network according to any one of claims 1-7.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer-executable instructions for causing a computer to perform the information steganography method based on a recurrent neural network according to any one of claims 1-7.
CN201910559075.6A 2019-06-26 2019-06-26 Information steganography method based on a recurrent neural network, device and storage medium Pending CN110362683A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910559075.6A CN110362683A (en) 2019-06-26 2019-06-26 Information steganography method based on a recurrent neural network, device and storage medium

Publications (1)

Publication Number Publication Date
CN110362683A (en) 2019-10-22

Family

ID=68216998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910559075.6A Pending CN110362683A (en) Information steganography method based on a recurrent neural network, device and storage medium

Country Status (1)

Country Link
CN (1) CN110362683A (en)

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHONG-LIANG YANG et al.: "RNN-Stega: Linguistic Steganography Based on Recurrent Neural Networks", IEEE Transactions on Information Forensics and Security *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111222583A (en) * 2020-01-15 2020-06-02 北京中科研究院 Image steganalysis method based on confrontation training and key path extraction
CN111447188A (en) * 2020-03-20 2020-07-24 青岛大学 Carrier-free text steganography method based on language steganography feature space
CN111476702A (en) * 2020-04-07 2020-07-31 兰州交通大学 Image steganography detection method and system based on nonlinear mixed kernel feature mapping
CN112581346A (en) * 2020-12-24 2021-03-30 深圳大学 Binary image steganography method based on countermeasure network
CN112581346B (en) * 2020-12-24 2023-11-17 深圳大学 Binary image steganography method based on countermeasure network
CN112579781A (en) * 2020-12-28 2021-03-30 平安银行股份有限公司 Text classification method and device, electronic equipment and medium
CN112579781B (en) * 2020-12-28 2023-09-15 平安银行股份有限公司 Text classification method, device, electronic equipment and medium
CN116108466A (en) * 2022-12-28 2023-05-12 南京邮电大学盐城大数据研究院有限公司 Encryption method based on statistical language model
CN116108466B (en) * 2022-12-28 2023-10-13 南京邮电大学盐城大数据研究院有限公司 Encryption method based on statistical language model
CN117131202A (en) * 2023-08-14 2023-11-28 湖北大学 Text steganography method based on knowledge graph, related method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191022