CN115049541B - Reversible gray scale method, system and device based on neural network and image steganography - Google Patents
- Publication number
- CN115049541B (application CN202210834416.8A)
- Authority
- CN
- China
- Prior art keywords
- code stream
- image
- reversible
- color
- gray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T3/04 — Context-preserving transformations, e.g. by using an importance map (G06T3/00, Geometric image transformations in the plane of the image)
- G06T1/0021 — Image watermarking (G06T1/00, General purpose image data processing)
- G06T9/002 — Image coding using neural networks (G06T9/00, Image coding)
- G06T2207/10024 — Color image (G06T2207/10, Image acquisition modality)
- G06T2207/20084 — Artificial neural networks [ANN] (G06T2207/20, Special algorithmic details)
Abstract
The invention discloses a reversible gray scale method, system and device based on a neural network and image steganography, comprising: carrying out reversible conversion on an original color image to obtain a gray component Y and color components U and V; performing neural network coding and arithmetic coding on the color components to obtain a feature code stream and a super prior code stream; steganographically writing the feature code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A; reading the feature code stream and the super prior code stream in the reversible gray image A, and taking the read gray image A as the gray component Y_R of the color image to be reconstructed; performing neural network decoding and arithmetic decoding on the feature code stream and the super prior code stream to convert them into the color components U_R and V_R of the color image to be reconstructed; and combining the gray component and the color components of the color image to be reconstructed and performing reversible conversion to obtain a reconstructed color image I_R. The invention realizes reversible gray scale conversion through the combination of a neural network and image steganography.
Description
Technical Field
The invention relates to the field of reversible gray scale, in particular to a reversible gray scale method, a system and a device based on a neural network and image steganography.
Background
Methods that generate gray scale images from color images have important applications in many fields, such as printing, engraving, monochrome display and image processing. Conventional gray image generation methods focus on perceptual properties such as contrast and texture features. There is also a class of methods called reversible gray scale, whose aim is not only to generate a gray scale image, but also to covertly encode the color information of the color image into the generated gray scale image, so that the original color image can be recovered as faithfully as possible when necessary.
In 2018, Xia et al. proposed a reversible gray scale method in ACM Transactions on Graphics that models the image decoloring and coloring process as a closed loop through an encoder-decoder network. The method embeds the color information into the generated gray scale image, so that the decoder can reconstruct colors consistent with the original more accurately.
In 2020, Ye et al. proposed a dual feature set network in IEEE Access that uses dense residual representations, integrates the abilities of local residual learning and local feature fusion, and suppresses the redundant features generated by the dual path modules with an attention mechanism, thereby obtaining a gray scale image with better consistency and reconstructing the color image.
In 2021, Liu et al. proposed a JPEG-robust reversible gray scale system in IEEE Transactions on Visualization and Computer Graphics that introduces adversarial training and a JPEG simulator on top of a codec network, making the generated gray scale image robust to JPEG compression and reducing the encoding texture of the generated image. Zhao et al. proposed a new reversible gray scale method in IEEE Transactions on Image Processing that forward-maps color images to gray scale images and latent variables through an invertible neural network, and then converts gray scale images and a set of Gaussian-distributed random variables back into color images close to the original through the reverse mapping.
For a reversible gray scale method, the most critical performance measure is the degree of similarity of the generated gray scale image and the reconstructed color image to the target gray scale image and the original color image, respectively. The prior art uses end-to-end structural frameworks whose performance is imperfect and leaves significant room for improvement, mainly due to two disadvantages of these techniques: ① they fail to truly and efficiently eliminate the redundancy of the color information, so the amount of content to be encoded is large; ② in the process of encoding the color information, the information loss of the gray image is large.
Disclosure of Invention
The invention aims to provide a neural network and image steganography-based reversible gray scale method, system and device, and aims to solve the problem of reversible gray scale of an image.
The invention provides a reversible gray scale method based on a neural network and image steganography, which comprises the following steps:
S1, carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
S2, performing neural network coding and arithmetic coding on the color components U and V to obtain a feature code stream and a super prior code stream;
S3, steganographically writing the feature code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A;
S4, reading the feature code stream and the super prior code stream in the reversible gray image A, and taking the read gray image A as the gray component Y_R of the color image to be reconstructed;
S5, performing neural network decoding and arithmetic decoding on the feature code stream and the super prior code stream to convert them into the color components U_R and V_R of the color image to be reconstructed;
S6, combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R.
The invention also provides a reversible gray scale system based on the neural network and the image steganography, which comprises:
a conversion module, used for performing reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
a coding module, used for performing neural network coding and arithmetic coding on the color components U and V to obtain a feature code stream and a super prior code stream;
a steganography module, used for steganographically writing the feature code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A;
a reading module, used for reading the feature code stream and the super prior code stream in the reversible gray image A and taking the read gray image A as the gray component Y_R of the color image to be reconstructed;
a decoding module, used for performing neural network decoding and arithmetic decoding on the feature code stream and the super prior code stream to convert them into the color components U_R and V_R of the color image to be reconstructed;
and a reconstruction module, used for combining the gray component and the color components of the color image to be reconstructed and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R.
The embodiment of the invention also provides a reversible gray scale device based on the neural network and the image steganography, which comprises: a memory, a processor and a computer program stored on the memory and executable on the processor, which when executed by the processor, performs the steps of the method described above.
The embodiment of the invention also provides a computer readable storage medium, on which a program for implementing information transmission is stored; when executed by a processor, the program realizes the steps of the method described above.
In the first aspect, the invention designs a neural network to encode the color components of a color image: it extracts the key features of the color components, models their probability distribution, and encodes the features into binary code streams through arithmetic coding. Compared with the prior art, this strategy eliminates the redundancy of the color information more effectively and reduces its loss, solving the problem of poor quality of the generated gray image caused by too much information needing to be hidden in the intermediate process.
In the second aspect, the invention writes (or reads) the color information to the gray component by an image steganography technique, embedding the color information with only slight modification of the gray component, thereby generating the target gray image. Compared with the prior art, the gray image generated in this way has a more ideal visual effect and smaller gray information loss.
In the third aspect, the invention uses a reversible component conversion to decompose a color image into a gray component and color components that are processed orthogonally, and combines a neural network with image steganography to remarkably improve the comprehensive performance of both generating the gray image and reconstructing the color image.
The foregoing description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be more clearly understood and implemented in accordance with the contents of the specification, and that the above and other objects, features and advantages of the present invention may become more apparent, preferred embodiments are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a neural network and image steganography-based reversible gray scale method in accordance with an embodiment of the present invention;
FIG. 2 is a schematic diagram of a neural network and image steganography-based reversible gray scale method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the neural network in the reversible gray scale method based on a neural network and image steganography according to an embodiment of the present invention;
FIG. 4 is a modified pixel schematic diagram of a neural network and image steganography-based reversible gray scale method in accordance with an embodiment of the present invention;
FIG. 5 is a schematic diagram of a neural network and image steganography-based reversible gray scale system in accordance with an embodiment of the present invention;
fig. 6 is a schematic diagram of a neural network and image steganography-based reversible gray scale apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be clearly and completely described in connection with the embodiments, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Method embodiment
According to an embodiment of the present invention, a reversible gray scale method based on a neural network and image steganography is provided, and fig. 1 is a flowchart of the reversible gray scale method based on the neural network and image steganography according to an embodiment of the present invention, as shown in fig. 1, specifically including:
S1, carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
S2, performing neural network coding and arithmetic coding on the color components U and V to obtain a feature code stream and a super prior code stream;
S3, steganographically writing the feature code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A;
S4, reading the feature code stream and the super prior code stream in the reversible gray image A, and taking the read gray image A as the gray component Y_R of the color image to be reconstructed;
S5, performing neural network decoding and arithmetic decoding on the feature code stream and the super prior code stream to convert them into the color components U_R and V_R of the color image to be reconstructed;
S6, combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R.
Fig. 2 is a schematic diagram of a framework of a neural network and image steganography-based reversible gray scale method according to an embodiment of the present invention, as shown in fig. 2,
Gray image generation process of color image:
Step a: carrying out reversible RGB2YUV conversion on an original color image I, decomposing the image to obtain a gray component Y and color components U and V;
Step b: the color components U, V are analyzed and encoded through a neural network to obtain a feature ŷ and a super prior ẑ; a prior probability model p(ŷ|ẑ) of the feature and an independent probability model p(ẑ) of the super prior are established; then, combined with arithmetic coding, the feature ŷ and the super prior ẑ are respectively converted into a feature code stream s_y and a super prior code stream s_z;
Step c: feature code stream using image steganographyAnd super a priori code stream/>Writing the gray component Y in a hidden manner to generate a reversible gray image A;
Gray image reconstruction color image process:
step d: reading out characteristic code stream from gray image A And super a priori code stream/>The read gray image A is used as a gray component Y R of the color image to be reconstructed;
step e: based on independent probability models Super prior code stream/>Arithmetic decoding is super prior/>Then decoding through a neural network to obtain the characteristic/>To obtain the prior probability model/>And will feature code stream/>Arithmetic decoding is characterized by/>The feature is then paired by a neural networkSynthesizing to obtain a color component U R,VR of the color image to be reconstructed;
Step f: combining the gray component Y R and the color component U R,VR, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I R;
Preferably, the reversible RGB2YUV conversion formula of step a is:
Y = ⌊(R + 2G + B) / 4⌋, U = R - G, V = B - G
wherein Y represents the gray component value and U, V represent the color component values; R, G, B represent the pixel values of the original color image; ⌊·⌋ represents rounding down.
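The step a conversion can be checked numerically. Below is a minimal sketch, assuming the standard reversible color transform (as used in JPEG 2000): Y = ⌊(R + 2G + B)/4⌋, U = R - G, V = B - G; the floor makes the transform exactly invertible on integers. Function names are illustrative.

```python
def rgb2yuv(r, g, b):
    # Forward reversible color transform: integer gray component plus
    # two color difference components. >> 2 is floor division by 4,
    # which is also correct for negative values in Python.
    y = (r + 2 * g + b) >> 2
    u = r - g
    v = b - g
    return y, u, v

def yuv2rgb(y, u, v):
    # Inverse: the floor term cancels the forward floor exactly,
    # so the round trip is lossless for every integer pixel value.
    g = y - ((u + v) >> 2)
    r = u + g
    b = v + g
    return r, g, b

# Round trip is exact for all 8-bit pixel values tested.
assert all(
    yuv2rgb(*rgb2yuv(r, g, b)) == (r, g, b)
    for r in (0, 17, 255) for g in (0, 128, 255) for b in (0, 99, 255)
)
```

Note that U and V may be negative; the arithmetic right shift in Python implements the required floor semantics, which is what makes the inverse exact.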
Preferably, the content of the neural network is as follows:
Fig. 3 is a schematic view of a neural network based on a neural network and image steganography reversible gray scale method according to an embodiment of the present invention, as shown in fig. 3,
The neural network comprises four parts: the system comprises a feature analysis network, a feature synthesis network, a super priori coding network and a super priori decoding network.
The feature analysis network extracts the main feature x of the color components; after rounding quantization this is denoted the feature ŷ. The feature ŷ allows the feature synthesis network to effectively reconstruct the color components while keeping the information entropy as small as possible. The feature analysis network consists of convolution layers and GDN (generalized divisive normalization) nonlinear layers; assuming the input color components have dimension H×W×2, the feature analysis network outputs a feature tensor whose spatial resolution is reduced by the stride of its convolution layers.
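As a concrete illustration of the GDN nonlinearity mentioned above, the sketch below implements the per-channel GDN formula y_i = x_i / sqrt(β_i + Σ_j γ_ij x_j²) for a single spatial position, together with a fixed-point approximation of its inverse in the spirit of IGDN. The parameter values are illustrative assumptions, not trained weights.

```python
import numpy as np

def gdn(x, beta, gamma):
    # GDN across channels for one spatial position:
    # y_i = x_i / sqrt(beta_i + sum_j gamma_ij * x_j^2)
    return x / np.sqrt(beta + gamma @ (x ** 2))

def igdn(y, beta, gamma, iters=50):
    # Approximate inverse by fixed-point iteration; the actual IGDN layer
    # learns its own parameters rather than inverting GDN analytically.
    x = np.array(y, dtype=float)
    for _ in range(iters):
        x = y * np.sqrt(beta + gamma @ (x ** 2))
    return x

beta = np.ones(4)         # illustrative parameters, not trained weights
gamma = 0.1 * np.eye(4)   # weak channel coupling keeps the inversion stable
x = np.array([0.3, -0.7, 1.2, 0.0])
y = gdn(x, beta, gamma)
assert np.allclose(igdn(y, beta, gamma), x, atol=1e-6)
```

The fixed-point iteration converges quickly here because the coupling matrix γ is small; a trained GDN layer in the real network would use learned β and γ per channel.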
The super prior coding network further encodes and quantizes the feature x, computing a super prior ẑ with information entropy as small as possible; ẑ is input to the super prior decoding network so that the probability model of the feature can be accurately modeled. The super prior coding network consists of convolution layers and ReLU nonlinear activation layers, and its output super prior is a further spatially downsampled tensor.
The super prior decoding network decodes the super prior ẑ; the decoded variables are the normal distribution probability model parameters (μ, σ) of the feature x, from which the prior probability model p(ŷ|ẑ) is built. The super prior decoding network consists of transposed convolution layers and ReLU nonlinear activation layers, and outputs the probability model parameters (μ, σ). The prior probability model p(ŷ|ẑ) takes the form of a Gaussian convolved with a unit uniform distribution (the Gaussian density integrated over each unit quantization bin):
p(ŷ|ẑ) = ∏_i ( N(μ_i, σ_i²) * U(-1/2, 1/2) )(ŷ_i)
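Under such a Gaussian prior integrated over unit quantization bins (a standard hyperprior formulation, assumed here), the ideal arithmetic-coded length of the quantized feature is the sum of -log2 of each integer symbol's probability mass. A small sketch with illustrative function names:

```python
import math

def gaussian_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def symbol_probability(y_hat, mu, sigma):
    # Probability mass of the integer symbol y_hat under a Gaussian
    # convolved with U(-1/2, 1/2), i.e. integrated over a unit bin.
    return gaussian_cdf(y_hat + 0.5, mu, sigma) - gaussian_cdf(y_hat - 0.5, mu, sigma)

def rate_bits(symbols, mus, sigmas):
    # Ideal arithmetic-coding length in bits for the quantized features;
    # the floor on the probability guards against log2(0) for far outliers.
    return sum(-math.log2(max(symbol_probability(s, m, sg), 1e-12))
               for s, m, sg in zip(symbols, mus, sigmas))

# A symbol near its predicted mean costs few bits; an outlier costs many,
# which is why accurate (mu, sigma) prediction shortens the code stream.
cheap = rate_bits([0], [0.0], [1.0])
costly = rate_bits([5], [0.0], [1.0])
assert cheap < costly
```

This is the quantity the training objective penalizes as the information entropy of the feature and super prior.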
The feature synthesis network synthesizes, from the input feature ŷ, reconstructed color components as close to the original as possible. It consists of transposed convolution layers and IGDN (inverse generalized divisive normalization) nonlinear layers, and the output reconstructed components have dimension H×W×2.
Preferably, the image steganography method is implemented as follows:
The pixels of the carrier gray scale image are listed by column scan as P_1, P_2, P_3, …, P_m; the feature code stream s_y and the super prior code stream s_z are combined into a binary code stream O, denoted b_1, b_2, b_3, …, b_n. Every 3 pixels and every 3 binary bits are taken in turn as a group; the 3 pixels of the i-th group are denoted P_{i_1}, P_{i_2}, P_{i_3} and the 3 binary bits b_{i_1}, b_{i_2}, b_{i_3}.
In step c, the steganography process implements embedding 3-bit binary code information into 3 pixels of each group, and specifically implements the following steps:
First, a 3-bit prediction code is calculated from the values of the 3 pixels, where B_{i_1}, B_{i_2}, B_{i_3} denote the 3-bit prediction code of the i-th group; each prediction bit is an exclusive-or (⊕) combination of the low-order bits of the pixel values, with P_{i_n}^{(j)} denoting the j-th lowest bit of the binary value of the n-th pixel P_{i_n} of the i-th group.
If the prediction code B_{i_1}, B_{i_2}, B_{i_3} is equal to the binary code b_{i_1}, b_{i_2}, b_{i_3}, no modification of the pixels is required; otherwise, the pixels P_{i_1}, P_{i_2}, P_{i_3} are modified until the prediction code equals the binary code, completing the steganography of the binary code.
Fig. 4 is a schematic diagram of a modified pixel of the neural network and image steganography-based reversible gray scale method according to an embodiment of the present invention, as shown in fig. 4,
For example, when B_{i_1} ≠ b_{i_1}, B_{i_2} = b_{i_2}, B_{i_3} = b_{i_3}: if P_{i_2} mod 2 = 0, then P_{i_2} = P_{i_2} - 1; otherwise P_{i_2} = P_{i_2} + 1.
In addition, for the last bits of the code stream that do not fill a group of 3, steganography is realized by directly replacing the lowest binary bit of the corresponding pixels in sequence.
In step d, the reading process is:
The prediction codes of every 3 pixels P_{i_1}, P_{i_2}, P_{i_3} are calculated in order from P_1 to P_m according to the prediction method above; for the last pixels that do not fill a group of 3, the lowest binary bit of each pixel is read directly in place of the corresponding prediction code. The binary code stream O hidden in the gray image is thus read out, and after further decomposition the feature code stream s_y and the super prior code stream s_z are obtained.
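The fallback path described above, direct replacement of the lowest binary bit for the trailing bits that do not fill a group of 3, can be sketched as follows. This illustrates only LSB replacement; the 3-pixels/3-bits prediction-code embedding of the main path is not reproduced here.

```python
def embed_lsb(pixels, bits):
    # Hide each bit in the least-significant bit of one pixel
    # (plain LSB replacement, the fallback for trailing bits only).
    assert len(bits) <= len(pixels)
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b   # clear the LSB, set it to the payload bit
    return out

def read_lsb(pixels, n_bits):
    # Recover the hidden bits by reading each pixel's least-significant bit.
    return [p & 1 for p in pixels[:n_bits]]

payload = [1, 0, 1]
carrier = [128, 129, 130, 131]
stego = embed_lsb(carrier, payload)
assert read_lsb(stego, 3) == payload
# Each carrier pixel changes by at most 1 gray level.
assert all(abs(a - b) <= 1 for a, b in zip(stego, carrier))
```

The prediction-code scheme of the main path achieves the same per-pixel distortion bound while often leaving pixels untouched when the prediction already matches the payload.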
Preferably, the reversible YUV2RGB conversion formula in the step f is:
wherein R, G, B represent pixel values of a color image; y represents a gray component value, U, V represents a color component value; Representing a rounding down.
Specifically, the embodiment is implemented in Python; the neural network is built with the PyTorch deep learning framework and trained with the Adam optimizer. The training set consists of 20000 images randomly extracted from the public Pascal VOC2012 data set and cropped to a resolution of 512×512; the test set is the Kodak Photo CD image data set. The initial learning rate is 1×10⁻⁴; after 2×10⁶ iterations it is decayed to 1×10⁻⁵ and training continues for another 5×10⁵ iterations. The training process constrains the optimization direction of the neural network through a loss function that jointly reduces the conversion loss of the color components and the code stream lengths of the feature ŷ and the super prior ẑ. The loss function used for training is:
L = MSE(U, U_R) + MSE(V, V_R) + λ ( H(ŷ) + H(ẑ) )
wherein the first two terms are the mean square error between the color components U, V before conversion and the color components U_R, V_R after conversion, and H(ŷ), H(ẑ) are the information entropies (expected code lengths) of the feature ŷ and the super prior ẑ, weighted by a trade-off factor λ.
In summary, the present embodiment finally realizes: the original color image generates a corresponding reversible gray scale image from which a color image substantially identical to the original can then be reconstructed.
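The whole a-to-f round trip can be mirrored in a toy sketch, with the neural/arithmetic codec replaced by a run-length placeholder and the prediction-code steganography replaced by plain LSB replacement (both substitutions are assumptions purely for illustration; the invention uses the hyperprior network and prediction-code embedding described above). It shows the color information surviving the gray image exactly while each gray pixel changes by at most one level:

```python
def rct_forward(r, g, b):
    # Assumed reversible color transform (see the step a formula above).
    y = (r + 2 * g + b) >> 2
    return y, r - g, b - g

def rct_inverse(y, u, v):
    g = y - ((u + v) >> 2)
    return u + g, g, v + g

def rle_encode(comp):
    # Toy stand-in for the neural/arithmetic codec: run-length code one
    # color component into bits (9-bit signed value, 8-bit run length).
    bits, i = [], 0
    while i < len(comp):
        j = i
        while j < len(comp) and comp[j] == comp[i] and j - i < 255:
            j += 1
        bits += [int(c) for c in format(comp[i] & 0x1FF, "09b")]
        bits += [int(c) for c in format(j - i, "08b")]
        i = j
    return bits

def rle_decode(bits, n):
    comp, i = [], 0
    while len(comp) < n:
        v = int("".join(map(str, bits[i:i + 9])), 2)
        v = v - 512 if v >= 256 else v          # undo the 9-bit signed wrap
        run = int("".join(map(str, bits[i + 9:i + 17])), 2)
        comp += [v] * run
        i += 17
    return comp

def embed(y_plane, bits):
    # LSB placeholder for the prediction-code steganography of step c.
    assert len(bits) <= len(y_plane), "payload must fit in the carrier"
    return [(y & ~1) | b for y, b in zip(y_plane, bits)] + list(y_plane[len(bits):])

def extract(y_plane, n_bits):
    return [y & 1 for y in y_plane[:n_bits]]

# A flat-colored 8x8 "image": heavy color redundancy, so a tiny code stream.
pixels = [(90, 60, 30)] * 64
ys, us, vs = map(list, zip(*(rct_forward(*p) for p in pixels)))
stream = rle_encode(us) + rle_encode(vs)
gray = embed(ys, stream)                    # reversible gray image A

bits = extract(gray, len(stream))           # read the hidden code stream
half = len(stream) // 2                     # each flat component is one run here
u_r = rle_decode(bits[:half], 64)
v_r = rle_decode(bits[half:], 64)
recon = [rct_inverse(y, u, v) for y, u, v in zip(gray, u_r, v_r)]

assert (u_r, v_r) == (us, vs)               # color information is exact
assert all(abs(a - b) <= 1 for p, q in zip(recon, pixels) for a, b in zip(p, q))
```

The small residual error in the reconstructed RGB comes only from the one-level steganographic modification of the gray carrier, mirroring the behavior of the full system.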
Compared with the prior art, the technical scheme provided by the invention has at least the following advantages:
In the first aspect, the invention designs a neural network to encode the color components of a color image: it extracts the key features of the color components, models their probability distribution, and encodes the features into binary code streams through arithmetic coding. Compared with the prior art, this strategy eliminates the redundancy of the color information more effectively and reduces its loss, solving the problem of poor quality of the generated gray image caused by too much information needing to be hidden in the intermediate process.
In the second aspect, the invention writes (or reads) the color information to the gray component by an image steganography technique, embedding the color information with only slight modification of the gray component, thereby generating the target gray image. Compared with the prior art, the gray image generated in this way has a more ideal visual effect and smaller gray information loss.
In the third aspect, the invention uses a reversible component conversion to decompose a color image into a gray component and color components that are processed orthogonally, and combines a neural network with image steganography to remarkably improve the comprehensive performance of both generating the gray image and reconstructing the color image.
System embodiment
According to an embodiment of the present invention, a reversible gray scale system based on neural network and image steganography is provided, and fig. 5 is a schematic diagram of the reversible gray scale system based on neural network and image steganography according to an embodiment of the present invention, as shown in fig. 5, specifically including:
a conversion module, used for performing reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
a coding module, used for performing neural network coding and arithmetic coding on the color components U and V to obtain a feature code stream and a super prior code stream;
a steganography module, used for steganographically writing the feature code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A;
a reading module, used for reading the feature code stream and the super prior code stream in the reversible gray image A and taking the read gray image A as the gray component Y_R of the color image to be reconstructed;
a decoding module, used for performing neural network decoding and arithmetic decoding on the feature code stream and the super prior code stream to convert them into the color components U_R and V_R of the color image to be reconstructed;
and a reconstruction module, used for combining the gray component and the color components of the color image to be reconstructed and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R.
The conversion module is specifically used for:
Carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V, wherein the reversible RGB2YUV conversion formula is:
Y = ⌊(R + 2G + B) / 4⌋, U = R - G, V = B - G
wherein Y represents the gray component value and U, V represent the color component values; R, G, B represent the pixel values of the original color image; ⌊·⌋ represents rounding down;
the coding module is specifically used for:
The color components U and V are analyzed and coded through a neural network to obtain a feature and a super prior; a prior probability model of the feature and an independent probability model of the super prior are established; then, combined with arithmetic coding, the feature and the super prior are respectively converted into a feature code stream and a super prior code stream;
The steganography module is specifically used for:
The pixels of the gray component Y are listed by column scan as P_1, P_2, P_3, …, P_m, and the feature code stream and the super prior code stream are combined into a binary code stream denoted b_1, b_2, b_3, …, b_n;
The 3-bit prediction code is calculated from the 3 pixel values, where B_{i_1}, B_{i_2}, B_{i_3} denote the 3-bit prediction code of the i-th group; each prediction bit is an exclusive-or (⊕) combination of the low-order bits of the pixel values, with P_{i_n}^{(j)} denoting the j-th lowest bit of the binary value of the n-th pixel P_{i_n} of the i-th group;
If the binary code b_{i_1}, b_{i_2}, b_{i_3} is equal to the prediction code B_{i_1}, B_{i_2}, B_{i_3}, no modification of the pixels is needed; otherwise, the pixels P_{i_1}, P_{i_2}, P_{i_3} are modified until the binary code equals the prediction code, completing the steganography of the binary code. For the last bits of the code stream that do not fill a group of 3, the lowest binary bit of the corresponding pixels is directly replaced to realize steganography. In this way, each group of 3 pixels P_{i_1}, P_{i_2}, P_{i_3} hides a 3-bit binary code b_{i_1}, b_{i_2}, b_{i_3}, generating a reversible gray image A;
The decoding module is specifically used for:
According to the independent probability model, the super prior code stream is arithmetically decoded into the super prior; the probability model parameters of the features are then obtained through neural network decoding, yielding the prior probability model; the feature code stream is arithmetically decoded into the features, and the features are then synthesized by the neural network to obtain the color components U_R and V_R of the color image to be reconstructed;
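The decoding order described above (super prior first, then features) can be sketched structurally as follows; `arith_decode`, `hyper_decoder_net`, `synthesis_net` and `hyper_prior_model` are placeholder names (assumptions), not the patent's trained networks or entropy coder.

```python
# Structural sketch of the decoding module's data flow. The callables are
# assumed placeholders; only the dependency order is taken from the text.

def decode_colors(hyper_stream, feature_stream,
                  arith_decode, hyper_decoder_net, synthesis_net,
                  hyper_prior_model):
    # 1. Super prior code stream -> super prior, under its own
    #    independent probability model.
    z = arith_decode(hyper_stream, hyper_prior_model)
    # 2. Neural decoding of the super prior yields the probability
    #    model parameters of the features (the prior model).
    feature_model = hyper_decoder_net(z)
    # 3. Feature code stream -> features, under that prior model.
    y = arith_decode(feature_stream, feature_model)
    # 4. The synthesis network maps features to the color components.
    u_r, v_r = synthesis_net(y)
    return u_r, v_r
```

The point of the sketch is that the feature code stream cannot be decoded until the super prior has been decoded, since the feature probability model depends on it.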
The reconstruction module is specifically used for:
Combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R;
the reversible YUV2RGB conversion formula is:
wherein R, G and B represent the pixel values of the color image, Y represents the gray component value, U and V represent the color component values, and ⌊·⌋ represents rounding down.
The embodiment of the present invention is a system embodiment corresponding to the above method embodiment, and specific operations of each module may be understood by referring to the description of the method embodiment, which is not repeated herein.
Device embodiment 1
The embodiment of the invention provides a reversible gray scale device based on a neural network and image steganography, as shown in fig. 5, comprising: a memory 50, a processor 52, and a computer program stored on the memory 50 and executable on the processor 52; when executed by the processor 52, the computer program implements the steps of the method embodiments described above.
Device example two
The embodiment of the present invention provides a computer-readable storage medium on which a program for implementing information transfer is stored; when executed by a processor, the program implements the steps of the method embodiment described above.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; and these modifications or substitutions may be made to the technical solutions of the embodiments of the present invention without departing from the spirit of the corresponding technical solutions.
Claims (8)
1. A reversible gray scale method based on neural network and image steganography is characterized by comprising the following steps,
S1, carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
S2, performing neural network coding and arithmetic coding on the color components U and V to obtain a characteristic code stream and a super prior code stream;
S3, steganographically writing the characteristic code stream and the super prior code stream into the gray component Y according to image steganography to generate a reversible gray image A;
S4, reading the characteristic code stream and the super prior code stream in the reversible gray scale image A, and taking the read gray scale image A as the gray component Y_R of the color image to be reconstructed;
S5, performing neural network decoding and arithmetic decoding on the characteristic code stream and the super prior code stream to convert them into color components U_R and V_R of the color image to be reconstructed;
S6, combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R;
The step S2 specifically comprises the following steps:
The color components U and V are analyzed and coded through a neural network to obtain characteristics and a super prior; an independent probability model of the characteristics and an independent probability model of the super prior are established; the characteristics and the super prior are then converted into a characteristic code stream and a super prior code stream, respectively, in combination with arithmetic coding;
the step S3 specifically comprises the following steps:
The pixels of the gray component Y are represented by column scanning as P_1, P_2, P_3, …, P_m, and the characteristic code stream and the super prior code stream are combined into a binary code stream represented as b_1, b_2, b_3, …, b_n;
sequentially taking every 3 pixels and every 3 binary codes as a group, wherein the 3 pixels of the i-th group are denoted P_i_1, P_i_2, P_i_3 and the 3 binary codes are denoted b_i_1, b_i_2, b_i_3;
In step c, the steganography process embeds 3 bits of binary code information into the 3 pixels of each group, specifically as follows:
first, a 3-bit prediction code is calculated from the values of the 3 pixels:
wherein B_i_1, B_i_2, B_i_3 denote the 3-bit prediction code of the i-th group, P_i_n^(j) denotes the lowest j-th bit of the binary value of the n-th pixel P_i_n of the i-th group, and ⊕ denotes the exclusive-or operation;
if the prediction code B_i_1, B_i_2, B_i_3 is equal to the binary code b_i_1, b_i_2, b_i_3, no modification of the pixels is required; otherwise, the pixels P_i_1, P_i_2, P_i_3 are modified until the prediction code and the binary code are equal, completing the steganography of the binary code;
any remaining code-stream bits that do not fill a group of 3 directly replace the lowest binary bits of the corresponding pixels in sequence to realize their steganography; the 3-bit binary code b_i_1, b_i_2, b_i_3 is thus hidden in the 3 pixels P_i_1, P_i_2, P_i_3, generating the reversible gray scale image A.
2. The method according to claim 1, wherein S1 specifically comprises:
Carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V, wherein the reversible RGB2YUV conversion formula is as follows:
wherein Y represents the gray component value, U and V represent the color component values, R, G and B represent the pixel values of the original color image, and ⌊·⌋ represents rounding down.
3. The method according to claim 1, wherein S5 specifically comprises:
arithmetically decoding the super prior code stream into the super prior according to the independent probability model; then decoding through a neural network to obtain the probability model parameters of the features, thereby obtaining the prior probability model; arithmetically decoding the feature code stream into the features; and then synthesizing the features through the neural network to obtain the color components U_R and V_R of the color image to be reconstructed.
4. A method according to claim 3, wherein S6 comprises:
combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R;
the reversible YUV2RGB conversion formula is:
wherein R, G and B represent the pixel values of the color image, Y represents the gray component value, U and V represent the color component values, and ⌊·⌋ represents rounding down.
5. A reversible gray scale system based on neural network and image steganography is characterized by comprising,
a conversion module, configured to perform reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V;
a coding module, configured to perform neural network coding and arithmetic coding on the color components U and V to obtain a characteristic code stream and a super prior code stream;
a steganography module, configured to steganographically write the characteristic code stream and the super prior code stream into the gray component Y by image steganography to generate a reversible gray scale image A;
a reading module, configured to read the characteristic code stream and the super prior code stream in the reversible gray scale image A and to take the read gray scale image A as the gray component Y_R of the color image to be reconstructed;
a decoding module, configured to perform neural network decoding and arithmetic decoding on the characteristic code stream and the super prior code stream to convert them into color components U_R and V_R of the color image to be reconstructed; and
a reconstruction module, configured to combine the gray component and the color components of the color image to be reconstructed and to perform reversible YUV2RGB conversion to obtain a reconstructed color image I_R;
The coding module is specifically used for:
The color components U and V are analyzed and coded through a neural network to obtain characteristics and a super prior; an independent probability model of the characteristics and an independent probability model of the super prior are established; the characteristics and the super prior are then converted into a characteristic code stream and a super prior code stream, respectively, in combination with arithmetic coding;
The steganography module is specifically used for:
The pixels of the gray component Y are represented by column scanning as P_1, P_2, P_3, …, P_m, and the characteristic code stream and the super prior code stream are combined into a binary code stream represented as b_1, b_2, b_3, …, b_n;
sequentially taking every 3 pixels and every 3 binary codes as a group, wherein the 3 pixels of the i-th group are denoted P_i_1, P_i_2, P_i_3 and the 3 binary codes are denoted b_i_1, b_i_2, b_i_3;
In step c, the steganography process embeds 3 bits of binary code information into the 3 pixels of each group, specifically as follows:
first, a 3-bit prediction code is calculated from the values of the 3 pixels:
wherein B_i_1, B_i_2, B_i_3 denote the 3-bit prediction code of the i-th group, P_i_n^(j) denotes the lowest j-th bit of the binary value of the n-th pixel P_i_n of the i-th group, and ⊕ denotes the exclusive-or operation;
if the prediction code B_i_1, B_i_2, B_i_3 is equal to the binary code b_i_1, b_i_2, b_i_3, no modification of the pixels is required; otherwise, the pixels P_i_1, P_i_2, P_i_3 are modified until the prediction code and the binary code are equal, completing the steganography of the binary code;
any remaining code-stream bits that do not fill a group of 3 directly replace the lowest binary bits of the corresponding pixels in sequence to realize their steganography; the 3-bit binary code b_i_1, b_i_2, b_i_3 is thus hidden in the 3 pixels P_i_1, P_i_2, P_i_3, generating the reversible gray scale image A.
6. The system of claim 5, wherein the conversion module is specifically configured to:
carrying out reversible RGB2YUV conversion on an original color image to obtain a gray component Y and color components U and V, wherein the reversible RGB2YUV conversion formula is as follows:
wherein Y represents the gray component value, U and V represent the color component values, R, G and B represent the pixel values of the original color image, and ⌊·⌋ represents rounding down;
The decoding module is specifically used for:
According to the independent probability model, the super prior code stream is arithmetically decoded into the super prior; the probability model parameters of the features are then obtained through neural network decoding, yielding the prior probability model; the feature code stream is arithmetically decoded into the features, and the features are then synthesized by the neural network to obtain the color components U_R and V_R of the color image to be reconstructed;
The reconstruction module is specifically used for:
combining the gray component and the color components of the color image to be reconstructed, and performing reversible YUV2RGB conversion to obtain a reconstructed color image I_R;
the reversible YUV2RGB conversion formula is:
wherein R, G and B represent the pixel values of the color image, Y represents the gray component value, U and V represent the color component values, and ⌊·⌋ represents rounding down.
7. A neural network and image steganography-based reversible gray scale device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the neural network and image steganography-based reversible gray scale method of any one of claims 1 to 4.
8. A computer-readable storage medium, wherein a program for implementing information transfer is stored on the computer-readable storage medium, and the program when executed by a processor implements the steps of the neural network and image steganography-based reversible gray scale method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210834416.8A CN115049541B (en) | 2022-07-14 | 2022-07-14 | Reversible gray scale method, system and device based on neural network and image steganography |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115049541A CN115049541A (en) | 2022-09-13 |
CN115049541B true CN115049541B (en) | 2024-05-07 |
Family
ID=83165878
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7253836B1 (en) * | 1998-06-30 | 2007-08-07 | Nikon Corporation | Digital camera, storage medium for image signal processing, carrier wave and electronic camera |
CN111696026A (en) * | 2020-05-06 | 2020-09-22 | 华南理工大学 | Reversible gray scale map algorithm and computing device based on L0 regular term |
CN111882476A (en) * | 2020-07-17 | 2020-11-03 | 广州大学 | Image steganography method for automatically learning embedded cost based on deep reinforcement learning |
CN112801922A (en) * | 2021-04-01 | 2021-05-14 | 暨南大学 | Color image-gray image-color image conversion method |
CN113259676A (en) * | 2020-02-10 | 2021-08-13 | 北京大学 | Image compression method and device based on deep learning |
WO2021164176A1 (en) * | 2020-02-20 | 2021-08-26 | 北京大学 | End-to-end video compression method and system based on deep learning, and storage medium |
Non-Patent Citations (7)
Title |
---|
Remote Sensing Image Colorization Based on Multiscale SEnet GAN; Min Wu et al.; 2019 12th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI); 2020-01-23; full text *
Generating High-Fidelity Images with Disentangled Adversarial VAEs and Structure-Aware Loss; Habibeh Naderi et al.; 2020 International Joint Conference on Neural Networks (IJCNN); 2020-09-28; full text *
Invertible Grayscale; Menghan Xia et al.; ACM Transactions on Graphics; 2018-11-30; vol. 37, no. 6; pp. 246:1-246:10 *
Reversible grayscale method based on neural network coding and image steganography; Lin Huanran; China Master's Theses Full-text Database, Information Science and Technology Series; 2024-02-15; I138-134 *
Reversible grayscale method based on color coding and image steganography; Lin Huanran et al.; Application Research of Computers; 2023-11-17; vol. 41, no. 5; full text *
Research on reversible watermarking in the Gray-code encryption domain for piracy tracing; Shi Hui, Feng Bin, Wang Xianghai, Li Mingchu, Song Chuanming; Journal of Image and Graphics; 2018-11-16; no. 11; pp. 23-39 *
Large-capacity reversible data hiding algorithm using interpolation and sorting; Xiong Xiangguang, Cao Yongfeng, Ou Weihua, Liu Bin, Wei Li, Li Weian; Journal of Computer-Aided Design & Computer Graphics; 2018-10-15; no. 10; pp. 173-184 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||