CN103854248B - Watermark embedding method and device based on content decomposition and perceptual layering - Google Patents


Publication number
CN103854248B
CN103854248B (application CN201210518446.4A)
Authority
CN
China
Prior art keywords
watermark
information
wavelet
jnd
entropy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210518446.4A
Other languages
Chinese (zh)
Other versions
CN103854248A (en)
Inventor
Yang Cheng (杨成)
Zhang Yana (张亚娜)
Li Wei (李伟)
Yang Qiu (杨秋)
Pi Wansu (皮婉素)
Li Chen (李晨)
Nie Yanlong (聂艳龙)
Current Assignee
Communication University of China
Original Assignee
Communication University of China
Priority date
Filing date
Publication date
Application filed by Communication University of China
Priority to CN201210518446.4A
Publication of CN103854248A
Application granted
Publication of CN103854248B


Abstract

The present invention provides a watermark embedding method based on content decomposition. A host image is subjected to primal sketch decomposition to obtain primitive information and texture information. A wavelet transform is applied to the primitive information and to the texture information separately, yielding the wavelet coefficients of each part. The watermark strength parameter value (PER or ENT) is then computed for the primitive-part wavelet coefficients and for the texture-part wavelet coefficients, the two values are merged into a final watermark strength parameter value, and the watermark is embedded according to that final value.

Description

Watermark embedding method and device based on content decomposition and perceptual layering
Technical field
The present invention relates to a watermark embedding method and device based on content decomposition and perceptual layering.
Background art
Digital watermarking, a branch of information hiding, has inherent advantages as an effective technical means for the copyright protection of digital products. Digital watermarking hides specific copyright information in original digital video content (such as television programmes) in a way that remains invisible (that is, has low perceptibility and does not affect the quality of the original work), so that the original work retains its definition and fidelity, passes through conventional programme distribution and signal transmission, and can still be viewed normally. Only when necessary is the watermark extracted to provide proof of copyright. During the distribution and transmission of digital video content such as television programmes, the content is processed together with the hidden digital watermark (for example, compressed or corrupted by noise) and may suffer malicious attacks by pirates, so that the watermark is damaged to some degree. A digital watermark therefore needs to combine low perceptibility with strong robustness.
In the prior art, taking digital video watermarking as an example, copyright protection proceeds as follows. First, a frame picture Y is extracted from the original digital video sequence, and the luminance values of its pixels are recorded as the array I. I is transformed from the spatial domain to the frequency domain, for example by the discrete cosine transform (DCT) or the discrete wavelet transform (DWT). Next, a watermark strength parameter is calculated to determine the embedding strength. When a perceptual masking model is used for embedding, the parameter calculated is the just noticeable difference (JND), also written as the perception threshold PER. When an entropy masking model is used instead, the entropy ENT is calculated. Both the JND value and the entropy ENT serve as watermark embedding parameters; that is, the embedding strength depends on the JND value or on ENT. Finally, the watermark is embedded in the frequency-domain coefficients of the luminance array I according to the JND value or the entropy ENT.
In digital watermark embedding, wavelet decomposition is a common method of spatial-to-frequency-domain transformation. Fig. 1 schematically shows the wavelet coefficient array obtained by applying a one-level wavelet transform to the luminance array I of picture Y. As the figure shows, the perceived directions s of the wavelet coefficients comprise four subbands: LL, LH, HL and HH. In the watermark embedding method described above, a multi-level wavelet decomposition may be performed as needed.
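The one-level decomposition into the LL, LH, HL and HH subbands shown in Fig. 1 can be sketched as follows. The Haar wavelet is used here purely for brevity; the text does not commit to a particular wavelet basis, and the function name is an invention of this sketch.

```python
def haar_dwt2(img):
    """One-level 2-D Haar wavelet transform of an even-sized grayscale array,
    returning the four subbands LL, LH, HL and HH referenced in the text."""
    h, w = len(img), len(img[0])
    LL = [[0.0] * (w // 2) for _ in range(h // 2)]
    LH = [[0.0] * (w // 2) for _ in range(h // 2)]
    HL = [[0.0] * (w // 2) for _ in range(h // 2)]
    HH = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            LL[i // 2][j // 2] = ((a + b) + (c + d)) / 2.0  # low-pass in both directions
            LH[i // 2][j // 2] = ((a + b) - (c + d)) / 2.0  # low-pass rows, high-pass columns
            HL[i // 2][j // 2] = ((a - b) + (c - d)) / 2.0  # high-pass rows, low-pass columns
            HH[i // 2][j // 2] = ((a - b) - (c - d)) / 2.0  # high-pass in both directions
    return LL, LH, HL, HH
```

A multi-level decomposition, as mentioned above, would reapply the same step to the LL subband.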
Existing watermark embedding methods balance the low perceptibility and strong robustness of the watermark to some extent. However, existing perceptual masking models and entropy masking models are mostly based on global pixel statistics, so the JND values they produce do not accurately reflect human visual perception. For example, some regions of an image or video have high information entropy, yet the human eye pays little attention to their content; such entropy values easily mislead the watermark embedding and degrade both robustness and perceptual performance.
Summary of the invention
One aspect of the present invention proposes an improved watermark embedding method. Before the wavelet transform is applied to a digital image, i.e. to its luminance array I, the array I is decomposed (content decomposition) into primitive information I^{WS} and texture information I^{\overline{WS}} satisfying I = I^{WS} + I^{\overline{WS}}. The watermark strength parameter (for example, PER or ENT) is computed separately for I^{WS} and for I^{\overline{WS}}, and the two parameter values are then merged.
One embodiment of the invention provides a watermark embedding method based on content decomposition and perceptual layering, comprising: decomposing a digital image to obtain primitive information and texture information; applying a wavelet transform to the primitive information and to the texture information separately, to obtain the wavelet coefficients of each part; calculating, with a perceptual masking model, the JND values of the primitive-part wavelet coefficients and of the texture-part wavelet coefficients, namely JND^{WS} and JND^{\overline{WS}}; and merging them into the final perception threshold as follows:

PER_l^s = m × JND^{WS} + n × JND^{\overline{WS}}

where l is the perception level, s is the perceived direction, and m, n are constants. The watermark is then embedded in the digital image according to the computed PER_l^s.
In the above embodiment, the values of m and n are preferably m = 0.6–0.9 for l = 1 and m = 0.01–0.1 for l = 2, 3, and n = 0.6–0.8 for l = 1 and n = 0.9–1 for l = 2, 3, so that the embedded watermark has good low perceptibility and robustness.
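The merging of the two JND maps into the final threshold can be sketched as follows; the level-dependent m and n values are illustrative midpoints of the preferred ranges stated for a 1- to 3-level transform, and the function name is an invention of this sketch.

```python
def merge_jnd(jnd_ws, jnd_tex, level):
    """Merge primitive-part and texture-part JND maps into the final
    perception threshold PER_l^s = m * JND^WS + n * JND^WS-bar.

    m, n below are midpoints of the quoted preferred ranges
    (m = 0.6-0.9, n = 0.6-0.8 for level 1; m = 0.01-0.1, n = 0.9-1
    for levels 2 and 3); any values inside those ranges would do."""
    if level == 1:
        m, n = 0.75, 0.7
    else:  # levels 2 and 3
        m, n = 0.05, 0.95
    return [[m * a + n * b for a, b in zip(row_ws, row_tex)]
            for row_ws, row_tex in zip(jnd_ws, jnd_tex)]
```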
Another embodiment of the invention provides a watermark embedding method based on content decomposition and perceptual layering, comprising: decomposing a digital image to obtain primitive information and texture information; applying a wavelet transform to each part separately to obtain their wavelet coefficients; calculating, with an entropy masking model, the entropies of the primitive-part and texture-part wavelet coefficients, ENT^{WS} and ENT^{\overline{WS}}; and merging them into the final entropy as follows:

ENT_l^s = m × ENT^{WS} + n × ENT^{\overline{WS}}

where l is the perception level, s is the perceived direction, and m, n are constants with n ≥ 0.9 and 0 < m ≤ 0.1. The watermark is then embedded in the digital image according to the computed entropy ENT_l^s.
A further embodiment of the invention provides a watermark embedding method, comprising: calculating, with a perceptual masking model, the perception threshold PER_l^s of the wavelet coefficients of a digital image; calculating, with an entropy masking model, the entropy ENT_l^s of those wavelet coefficients; and computing the entropy-modulated watermark strength parameter JND by the following formula:

JND_l^s = ρ × sinh(ENT_l^s / max(ENT_l^s)) × PER_l^s

where l is the perception level, s is the perceived direction, and ρ is a constant. The watermark is embedded in the digital image according to the computed JND_l^s.
In this embodiment, when 0 < ρ < 1.1 the embedded watermark has good low perceptibility; when ρ > 1.15 it has strong robustness; and when 1.1 ≤ ρ ≤ 1.15 it offers a good compromise between low perceptibility and robustness.
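The entropy-modulated strength parameter can be sketched as follows, with ρ = 1.125 chosen from the 1.1 ≤ ρ ≤ 1.15 compromise range; the function name is an invention of this sketch.

```python
import math

def entropy_modulated_jnd(per, ent, rho=1.125):
    """Compute JND_l^s = rho * sinh(ENT_l^s / max(ENT_l^s)) * PER_l^s
    element-wise over one subband, given the PER and ENT maps."""
    ent_max = max(max(row) for row in ent)
    return [[rho * math.sinh(e / ent_max) * p for e, p in zip(row_e, row_p)]
            for row_e, row_p in zip(ent, per)]
```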
Another embodiment of the invention provides a watermark embedding device, comprising: a data receiving unit (210) for receiving from outside an image into which a watermark is to be embedded; a decomposition unit (220) for decomposing that image into a primitive part and a texture part; a transform unit (230) for transforming the primitive part and the texture part separately from the spatial domain to the frequency domain; a watermark strength parameter determination unit (240) for determining the watermark strength parameters of the frequency-domain primitive part and texture part and merging the two into a final watermark strength parameter value; a watermark determining unit (250) for computing the watermark to be embedded; and a watermark embedding unit (260) for embedding the watermark computed by the watermark determining unit (250) into the image received by the data receiving unit (210), according to the final watermark strength parameter value determined by the watermark strength parameter determination unit (240).
Another embodiment of the invention provides a watermark embedding device, comprising: a data receiving unit (210) for receiving from outside an image into which a watermark is to be embedded; a transform unit (230) for applying a wavelet transform to that image, transforming it from the spatial domain to the frequency domain; and a watermark strength parameter determination unit (240) for (a) calculating, with a perceptual masking model, the perception threshold PER_l^s of the wavelet coefficients of the image, (b) calculating, with an entropy masking model, the entropy ENT_l^s of those coefficients, and (c) computing the entropy-modulated watermark strength parameter JND by the following formula:

JND_l^s = ρ × sinh(ENT_l^s / max(ENT_l^s)) × PER_l^s

where l is the perception level, s is the perceived direction, and ρ is a constant whose value preferably lies in one of the three ranges 0 < ρ < 1.1, ρ > 1.15, and 1.1 ≤ ρ ≤ 1.15.
The device further comprises a watermark determining unit (250) for computing the watermark to be embedded, and a watermark embedding unit (260) for embedding the watermark into the image according to the computed JND_l^s.
Beneficial effects
As described above, the present invention considers primitive information and texture information separately when computing the watermark strength parameter value, so that the embedded watermark has lower perceptibility and stronger robustness.
In addition, in the further method of the invention described above, both the perceptual masking model and the entropy masking model are taken into account, so that the embedded watermark likewise has lower perceptibility and stronger robustness.
Brief description of the drawings
Fig. 1 schematically shows the wavelet coefficient array obtained by applying a one-level wavelet transform to an original digital image;
Fig. 2 is a block diagram of a watermark embedding device according to an embodiment of the invention;
Fig. 3 is a schematic diagram of watermarking an original video using a perceptual masking model, according to an embodiment of the invention;
Fig. 4 is a flowchart of determining the PER value of one frame picture of an original digital video using a perceptual masking model, according to an embodiment of the invention;
Fig. 5 is a schematic diagram of watermarking an original video using an entropy masking model, according to another embodiment of the invention;
Fig. 6 is a flowchart of determining the ENT value of an original digital image using the entropy masking model of the other embodiment described above.
Detailed description of embodiments
Some specific embodiments are described below with reference to the accompanying drawings. These embodiments are not intended to limit the present invention; it should be understood that the invention covers all modifications, equivalents and substitutions falling within its concept and technical scope. Detailed descriptions of prior art considered unhelpful for understanding the present invention are omitted.
Fig. 2 shows a block diagram of a watermark embedding device 200 according to an embodiment of the invention.
As shown in Fig. 2 the watermark embedding device 200 according to the embodiment includes:Data receipt unit 210, for receiving The data of watermark are embedded into, the data that be embedded into watermark can be the numerical datas of a width picture, or have The video data Y of multiframe picture, in the case where input data is video data Y, extracts a frame therein from video data Y The brightness data of the two field picture is designated as I by image as the data of watermark to be embedded in;Resolving cell 220, for data receiver The brightness data that what unit 210 was received will be embedded into a two field picture of watermark is decomposed, for example, carrying out primal sketch point Solution, to resolve into primitive part IWSAnd texture partAnd meetConverter unit 230, for that will divide respectively The primitive part I that solution unit 220 is decomposedWSAnd texture partFrequency domain is transformed from a spatial domain to, for example, carrying out wavelet transformation, is obtained ArriveWithWhereinWithIt is respectively primitive part IWSAnd texture partWavelet coefficient;Watermark strength parameter is true Order unit 240, for determining primitive part wavelet coefficient respectivelyWith texture part wavelet coefficientWatermark strength parameter value (PER or ENT), and the two watermark strength parameter values are merged, to obtain final watermark strength parameter value;Watermark determines single Unit 250, the water that the watermark received in data or reception user directly input will be embedded in for being calculated according to the instruction of user Print;Watermark embedder unit 260, for the final watermark strength parameter value determined according to watermark strength parameter determination unit 240 The watermark that watermark determining unit 250 is calculated or received is embedded into the number that data receipt unit 210 receives watermark to be embedded into In.
Fig. 3 is a schematic diagram of how, in one embodiment, the watermark embedding device 200 of Fig. 2 determines the JND values of a frame picture extracted from digital video data according to a perceptual masking model and embeds the watermark in that frame accordingly; the perceptual components therein are the wavelet coefficients, at the various levels, of the primitive and texture information. Fig. 4 is a flowchart of the method of embedding a watermark in one frame picture of a digital video according to this embodiment.
Referring to Fig. 4, in step S110 the data receiving unit 210 receives digital video data from outside, extracts the luminance signal of one frame picture Y from the digital video, and records it as the array I, the set of luminance values of all pixels of the extracted frame. In step S120 the decomposition unit 220 decomposes the extracted picture, i.e. the array I (for example, by primal sketch decomposition), to obtain primitive information I^{WS} and texture information I^{\overline{WS}} satisfying I = I^{WS} + I^{\overline{WS}}. In step S130 the transform unit 230 applies a wavelet transform to I^{WS} and to I^{\overline{WS}} separately, obtaining the primitive-part and texture-part wavelet coefficients; preferably, a 1- to 3-level wavelet transform is applied to both. In step S140 the watermark strength parameter determination unit 240 uses a perceptual masking model to calculate the JND values of the primitive-information wavelet-domain coefficients, JND^{WS}, and of the texture-information wavelet-domain coefficients, JND^{\overline{WS}}. In step S150 the watermark strength parameter determination unit 240 merges JND^{WS} and JND^{\overline{WS}} into the total JND value, or perception threshold:

PER_l^s = m × JND^{WS} + n × JND^{\overline{WS}}    (1)
Here the label WS denotes primitive information, and an overbar on a variable denotes the complementary part: if I^{WS} denotes the primitive information, then I^{\overline{WS}} denotes the complementary texture information. l denotes the level of the wavelet transform, i.e. the perception level; s denotes the perceived direction, i.e. one of the subbands LL, LH, HL and HH of the wavelet coefficients; and m, n are constants. Since the calculation of JND^{WS} and JND^{\overline{WS}} depends on l and s, PER_l^s carries the indices s and l.
When a 1- to 3-level wavelet transform is applied to the primitive and texture information, experiments show that the embedded watermark has good low perceptibility and robustness when m and n lie in the following ranges: m = 0.6–0.9 for l = 1 and m = 0.01–0.1 for l = 2, 3; n = 0.6–0.8 for l = 1 and n = 0.9–1 for l = 2, 3.
In step S160 the watermark embedding unit 260 embeds the watermark, according to the computed watermark strength parameter PER_l^s, into the frame picture Y that the data receiving unit 210 extracted from the original digital video received from outside. If picture Y is a grayscale picture, embedding the watermark in Y means embedding it in the luminance array I of Y. If picture Y is a colour picture, embedding the watermark in Y comprises first embedding it in the luminance array I of Y and then recombining the watermarked array I with the chroma data of Y.
The watermark embedded in picture Y may be computed according to an instruction of the user, or entered directly by the user — for example, copyright information.
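Step S160 states only that the embedding strength follows the computed threshold. One common realization, given here as an assumption rather than the patent's exact scheme, additively embeds a ±1 watermark sequence into the wavelet coefficients, scaled per coefficient by the threshold; the function name is an invention of this sketch.

```python
def embed_bits(coeffs, jnd, bits):
    """Additive embedding c' = c + jnd * b over one subband: each wavelet
    coefficient is shifted by its own threshold times the next watermark
    bit (+1 or -1), so stronger masking permits a stronger shift."""
    out = []
    k = 0
    for row_c, row_j in zip(coeffs, jnd):
        new_row = []
        for c, j in zip(row_c, row_j):
            new_row.append(c + j * bits[k % len(bits)])
            k += 1
        out.append(new_row)
    return out
```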
Fig. 5 is a schematic diagram of how, in a further embodiment, the watermark embedding device 200 of Fig. 2 determines the entropy ENT of a frame image of a digital video according to an entropy masking model and embeds the watermark accordingly. Fig. 6 is a flowchart of embedding a watermark in one frame image of a digital video according to this embodiment.
Fig. 6 differs from Fig. 4 in that, in step S140, the watermark strength parameter determination unit 240 of this embodiment uses an entropy masking model rather than a perceptual masking model, calculating the entropy ENT^{WS} of the primitive-information wavelet-domain coefficients and the entropy ENT^{\overline{WS}} of the texture-information wavelet-domain coefficients, and then merging the two into the total entropy value:

ENT_l^s = m × ENT^{WS} + n × ENT^{\overline{WS}}    (2)

where m and n are constants with n ≥ 0.9 and 0 < m ≤ 0.1.
Then, in step S160, the watermark embedding unit 260 embeds the watermark, according to the computed watermark strength parameter ENT_l^s, into the frame picture Y that the data receiving unit 210 extracted from the original digital video received from outside.
In another embodiment of the invention, the watermark strength parameter determination unit 240 calculates the watermark strength parameter PER_l^s of one frame of digital picture Y using a perceptual masking model, and the watermark strength parameter ENT_l^s of the same frame using an entropy masking model. It then computes the entropy-modulated watermark strength parameter JND by the following formula:

JND_l^s = ρ × sinh(ENT_l^s / max(ENT_l^s)) × PER_l^s    (3)

Experiments show that when 0 < ρ < 1.1 the embedded watermark has good low perceptibility; when ρ > 1.15 it has strong robustness; and when 1.1 ≤ ρ ≤ 1.15 it offers a good compromise between low perceptibility and robustness.
In this embodiment, the watermark strength parameter PER_l^s of image Y calculated with the perceptual masking model may be obtained by the method described with reference to Fig. 4, using formula (1). Likewise, the watermark strength parameter ENT_l^s of image Y calculated with the entropy masking model may be obtained by the method described with reference to Fig. 6, using formula (2).
In a variant of this embodiment, calculating the watermark strength parameter PER_l^s of image Y with the perceptual masking model comprises applying the wavelet transform to the frame image as a whole (that is, to its luminance array I) and then computing the threshold from the whole-image wavelet coefficients; and calculating the watermark strength parameter ENT_l^s of image Y with the entropy masking model likewise comprises applying the wavelet transform to the whole image (that is, to its luminance array I) and then computing the entropy from the whole-image wavelet coefficients.
Perceptual masking model sample calculation
The following example introduces a specific method of calculating, in step S140 of the method above, the JND values of the primitive wavelet coefficients and of the texture wavelet coefficients using the perceptual masking model; those skilled in the art will understand that the invention is not limited to this specific method.
Taking into account the frequency masking characteristic Frequency(l, s), the luminance masking characteristic Luminance(l, x, y) and the texture masking characteristic Texture(l, x, y), the JND value is calculated as follows:
where (x, y) denotes the two-dimensional coordinate within the wavelet coefficient array.
When a 3-level wavelet transform is applied to the original video image Y, the frequency masking characteristic Frequency(l, s) can be expressed as:
The luminance masking characteristic Luminance(l, x, y) is expressed as:

Luminance(l, x, y) = 1 + L(l, x, y)    (6)

where L(l, x, y) is a local luminance term computed from the wavelet coefficients.
The texture masking characteristic Texture(l, x, y) is expressed as:
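Masking factors of this kind are commonly combined multiplicatively, JND(l, s, x, y) = Frequency(l, s) × Luminance(l, x, y) × Texture(l, x, y); that the patent's combination formula takes exactly this form is an assumption here. The sketch below shows only the combination step, with the factor values supplied as precomputed inputs and the function name invented for the sketch.

```python
def jnd_map(freq_factor, lum, tex):
    """Combine the three masking characteristics per subband:
    freq_factor is the scalar Frequency(l, s) for this level and direction;
    lum and tex are the per-coefficient Luminance and Texture maps."""
    return [[freq_factor * l_val * t_val for l_val, t_val in zip(row_l, row_t)]
            for row_l, row_t in zip(lum, tex)]
```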
Entropy masking model sample calculation
The following example introduces a specific method of calculating, in step S140 of the method above, the entropies of the primitive wavelet coefficients and of the texture wavelet coefficients using the entropy masking model.
Fig. 5 illustrates calculating the embedding-strength parameter, i.e. the entropy ENT, with the entropy masking model, determining the watermark embedding strength from the computed ENT values, and then embedding the watermark. The entropy is computed by traversing the coefficients with a fixed window: the information entropy of each coefficient is defined over the n × n region centred on that coefficient (n is generally odd, so that the region has a well-defined centre). For coefficients within (n−1)/2 pixels of the image border, the information entropy is defined to be 0. The entropy of the remaining coefficients is calculated by the following formula:
ENT_l^s(x, y) = −Σ_z p(z) log₂ p(z)

which is the Shannon entropy of the level-l, direction-s wavelet coefficients, where p(z) is the probability of occurrence of coefficient value z within the n × n neighbourhood of (x, y).
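The windowed entropy just described can be sketched as follows: each coefficient's Shannon entropy is computed over its n × n neighbourhood, and coefficients within (n−1)/2 of the border are assigned entropy 0, as the text prescribes. The function name is an invention of this sketch.

```python
import math

def window_entropy(coeffs, n=3):
    """Per-coefficient Shannon entropy ENT(x, y) = -sum p(z) * log2 p(z),
    where p(z) is the empirical frequency of coefficient value z inside
    the n x n window (n odd) centred on (x, y); border entries stay 0."""
    h, w = len(coeffs), len(coeffs[0])
    r = (n - 1) // 2
    ent = [[0.0] * w for _ in range(h)]
    for i in range(r, h - r):
        for j in range(r, w - r):
            window = [coeffs[x][y]
                      for x in range(i - r, i + r + 1)
                      for y in range(j - r, j + r + 1)]
            counts = {}
            for v in window:
                counts[v] = counts.get(v, 0) + 1
            total = float(n * n)
            ent[i][j] = -sum((c / total) * math.log2(c / total)
                             for c in counts.values())
    return ent
```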
Although the present invention has been described with reference to one or more embodiments, it will be apparent to those skilled in the art that the invention may be modified or varied in various forms without departing from the spirit and scope of the invention as set forth in the appended claims.

Claims (13)

1. A watermark embedding method based on content decomposition and perceptual layering, comprising:
performing a primal-sketch-based decomposition of the luminance information I of a digital image, to obtain primitive information I^{WS} and texture information I^{\overline{WS}} satisfying I = I^{WS} + I^{\overline{WS}};
applying a wavelet transform to the primitive information and to the texture information separately, to obtain the wavelet coefficients of the primitive information and of the texture information;
calculating, with a perceptual masking model, the JND values JND^{WS} of the primitive-information wavelet-domain coefficients and JND^{\overline{WS}} of the texture-information wavelet-domain coefficients;
merging JND^{WS} and JND^{\overline{WS}} into the final perception threshold as follows:
PER_l^s = m × JND^{WS} + n × JND^{\overline{WS}}
wherein l is the perception level, s is the perceived direction, and m, n are constants; and
embedding the watermark in the luminance information of the digital image according to the computed PER_l^s.
2. The watermark embedding method according to claim 1, wherein applying a wavelet transform to the primitive information and the texture information comprises applying a 1- to 3-level wavelet transform to each, and the values of m and n are preferably:
m = 0.6–0.9 for l = 1, 0.01–0.1 for l = 2, 3; n = 0.6–0.8 for l = 1, 0.9–1 for l = 2, 3.
3. A watermark embedding method based on content decomposition and perceptual layering, comprising:
performing a primal-sketch-based decomposition of the luminance information I of a digital image, to obtain primitive information I^{WS} and texture information I^{\overline{WS}} satisfying I = I^{WS} + I^{\overline{WS}};
applying a wavelet transform to the primitive information and to the texture information separately, to obtain the wavelet coefficients of the primitive information and of the texture information;
calculating, with an entropy masking model, the entropies ENT^{WS} of the primitive-information wavelet-domain coefficients and ENT^{\overline{WS}} of the texture-information wavelet-domain coefficients;
merging ENT^{WS} and ENT^{\overline{WS}} into the final entropy as follows:
ENT_l^s = m × ENT^{WS} + n × ENT^{\overline{WS}}
wherein l is the perception level, s is the perceived direction, and m, n are constants with n ≥ 0.9 and 0 < m ≤ 0.1; and
embedding the watermark in the luminance information of the digital image according to the computed entropy ENT_l^s.
4. A watermark embedding method, comprising:
receiving from outside a digital image into which a watermark is to be embedded;
applying a wavelet transform to the luminance information of the digital image, to transform it from the spatial domain to the frequency domain;
calculating, with a perceptual masking model, the wavelet-domain perception threshold PER_l^s of the luminance information of the digital image;
calculating, with an entropy masking model, the wavelet-domain entropy ENT_l^s of the luminance information of the digital image;
calculating the entropy-modulated watermark strength parameter JND by the following formula:
JND_l^s = ρ × sinh(ENT_l^s / max(ENT_l^s)) × PER_l^s
wherein l is the perception level, s is the perceived direction, and ρ is a constant; and
embedding the watermark in the luminance information of the digital image according to the computed JND_l^s;
wherein calculating the wavelet-domain perception threshold PER_l^s of the luminance information of the digital image with the perceptual masking model comprises:
performing a primal-sketch-based decomposition of the luminance information of the digital image, to obtain primitive information and texture information;
applying a wavelet transform to the primitive information and to the texture information separately, to obtain the wavelet coefficients of the primitive information and of the texture;
calculating, with the perceptual masking model, the JND values JND^{WS} of the primitive-information wavelet-domain coefficients and JND^{\overline{WS}} of the texture-information wavelet-domain coefficients; and
merging JND^{WS} and JND^{\overline{WS}} into the final perception threshold as follows:
PER_l^s = m × JND^{WS} + n × JND^{\overline{WS}}
wherein m, n are constants.
5. The watermark embedding method according to claim 4, wherein the value of ρ preferably lies in one of the following three ranges:
0 < ρ < 1.1, ρ > 1.15, and 1.1 ≤ ρ ≤ 1.15.
6. The watermark embedding method according to claim 4, wherein calculating the wavelet-domain perception threshold PER_l^s of the luminance information of the digital image with the perceptual masking model comprises:
performing a primal-sketch-based decomposition of the luminance information of the digital image, to obtain primitive information and texture information;
applying a wavelet transform to the primitive information and to the texture information separately, to obtain the wavelet coefficients of the primitive information and of the texture;
calculating, with the perceptual masking model, the JND values JND^{WS} of the primitive-information wavelet-domain coefficients and JND^{\overline{WS}} of the texture-information wavelet-domain coefficients; and
merging JND^{WS} and JND^{\overline{WS}} into the final perception threshold as follows:
PER_l^s = m × JND^{WS} + n × JND^{\overline{WS}}
wherein l is the perception level, s is the perceived direction, and m, n are constants.
7. The watermark embedding method according to claim 6, wherein applying a wavelet transform to the primitive information and the texture information comprises applying a 1- to 3-level wavelet transform to each, and the values of m and n are preferably:
m = 0.6–0.9 for l = 1, 0.01–0.1 for l = 2, 3; n = 0.6–0.8 for l = 1, 0.9–1 for l = 2, 3.
8. A watermark embedding method, including:
receiving from outside a digital image into which a watermark is to be embedded;
performing a wavelet transform on the luminance information of the digital image to be watermarked, to transform it from the spatial domain to the frequency domain;
calculating the wavelet-domain perceptual threshold $PER_l^s$ of the luminance information of the digital image using a perceptual masking model;
calculating the wavelet-domain entropy $ENT_l^s$ of the luminance information of the digital image using an entropy masking model;
calculating the entropy-modulated watermark strength parameter (the JND value) using the following formula:

$$JND_l^s = \rho \times \sinh\!\left(\frac{ENT_l^s}{\max(ENT_l^s)}\right) \times PER_l^s$$

where $l$ is the perceptual level, $s$ is the perceptual orientation, and $\rho$ is a constant;
embedding the watermark data into the luminance information of the digital image according to the calculated $JND_l^s$;
wherein calculating the entropy $ENT_l^s$ of the wavelet coefficients of the digital image using the entropy masking model includes:
decomposing the luminance information of the digital image based on the primal sketch, to obtain cell information and texture information;
calculating, respectively using the entropy masking model, the wavelet-domain entropy $ENT_{WS}$ of the cell information and the wavelet-domain entropy $\overline{ENT}_{WS}$ of the texture information;
merging $ENT_{WS}$ and $\overline{ENT}_{WS}$ as follows to obtain the final ENT value:

$$ENT_l^s = m \times ENT_{WS} + n \times \overline{ENT}_{WS}$$

where $l$ is the perceptual level, $s$ is the perceptual orientation, $m$ and $n$ are constants, $n \geq 0.9$, and $0 < m \leq 0.1$.
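The sinh modulation in claim 8 can be sketched per subband as follows. The value $\rho = 1.12$ is a hypothetical pick from the preferred range $1.1 \leq \rho \leq 1.15$ claimed later, and the maximum is taken over the whole subband, as $\max(ENT_l^s)$ suggests.

```python
import numpy as np

def entropy_modulated_jnd(per, ent, rho=1.12):
    """Compute JND_l^s = rho * sinh(ENT_l^s / max(ENT_l^s)) * PER_l^s
    for one subband. per and ent are same-shaped arrays of perceptual
    thresholds and local entropies for that subband."""
    ent = np.asarray(ent, dtype=float)
    per = np.asarray(per, dtype=float)
    return rho * np.sinh(ent / ent.max()) * per
```

Since the normalized entropy lies in (0, 1], sinh stays near-linear but gently boosts the embedding strength in high-entropy (busy) regions, where distortion is least visible.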
9. A watermark embedding device, including:
a data receiving unit (210) for receiving from outside an image into which a watermark is to be embedded;
a decomposition unit (220) for decomposing the luminance information $I$ of the image to be watermarked into a primitive part $I_{WS}$ and a texture part $\overline{I}_{WS}$ satisfying $I = I_{WS} + \overline{I}_{WS}$;
a transform unit (230) for transforming the primitive part and the texture part from the spatial domain to the frequency domain respectively;
a watermark strength parameter determination unit (240) for determining the watermark strength parameters of the primitive part and of the texture part transformed to the frequency domain respectively, and merging the two parameters to obtain the final watermark strength parameter value;
a watermark determining unit (250) for calculating the watermark to be embedded;
a watermark embedding unit (260) for embedding the watermark calculated by the watermark determining unit (250) into the luminance information of the to-be-watermarked image received by the data receiving unit (210), according to the final watermark strength parameter value determined by the watermark strength parameter determination unit (240);
wherein said watermark strength parameter is an entropy (ENT) value or a JND value.
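Unit (220) above splits the luminance into a primitive part and a texture part that sum back to $I$. A toy stand-in for that split, shown purely as an assumed illustration (a real primal-sketch decomposition via sketch pursuit is far more involved), uses a gradient-magnitude mask:

```python
import numpy as np

def toy_primal_sketch_split(img, thresh=0.2):
    """Toy stand-in for the primal-sketch decomposition: pixels whose
    gradient magnitude exceeds `thresh` form the primitive (structure)
    part I_WS, the remaining pixels form the texture part, and the two
    parts sum back to the original image."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)
    mask = np.hypot(gx, gy) > thresh
    primitive = np.where(mask, img, 0.0)
    texture = np.where(mask, 0.0, img)
    return primitive, texture
```

Whatever decomposition is used, the additivity constraint $I = I_{WS} + \overline{I}_{WS}$ guarantees the two branches can be processed independently and recombined without loss.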
10. The watermark embedding device according to claim 9, wherein the transform unit (230) performs a wavelet transform on the primitive part and on the texture part respectively.
11. The watermark embedding device according to claim 10, wherein the watermark strength parameter determination unit (240):
calculates, respectively using a perceptual masking model, the watermark strength parameter values of the wavelet coefficients of the cell information and of the texture information, i.e. $JND_{WS}$ and $\overline{JND}_{WS}$;
merges $JND_{WS}$ and $\overline{JND}_{WS}$ as follows to obtain the final perceptual threshold:

$$PER_l^s = m \times JND_{WS} + n \times \overline{JND}_{WS}$$

where $l$ is the perceptual level, $s$ is the perceptual orientation, $m$ and $n$ are constants, and the values of $m$ and $n$ are preferably:

$$m = \begin{cases} 0.6 \sim 0.9, & l = 1 \\ 0.01 \sim 0.1, & l = 2, 3 \end{cases} \qquad n = \begin{cases} 0.6 \sim 0.8, & l = 1 \\ 0.9 \sim 1, & l = 2, 3 \end{cases}$$
12. The watermark embedding device according to claim 10, wherein the watermark strength parameter determination unit (240):
calculates, respectively using an entropy masking model, the wavelet-domain entropy $ENT_{WS}$ of the cell information and the wavelet-domain entropy $\overline{ENT}_{WS}$ of the texture information;
merges $ENT_{WS}$ and $\overline{ENT}_{WS}$ as follows to obtain the final entropy:

$$ENT_l^s = m \times ENT_{WS} + n \times \overline{ENT}_{WS}$$

where $l$ is the perceptual level, $s$ is the perceptual orientation, $m$ and $n$ are constants, $n \geq 0.9$, and $0 < m \leq 0.1$.
13. A watermark embedding device, including:
a data receiving unit (210) for receiving from outside an image into which a watermark is to be embedded;
a transform unit (230) for performing a wavelet transform on the luminance information of the image to be watermarked, to transform it from the spatial domain to the frequency domain;
a watermark strength parameter determination unit (240) for:
calculating the perceptual threshold $PER_l^s$ of the wavelet coefficients of the image using a perceptual masking model;
calculating the entropy $ENT_l^s$ of the wavelet coefficients of the image using an entropy masking model;
calculating the entropy-modulated watermark strength parameter (the JND value) using the following formula:

$$JND_l^s = \rho \times \sinh\!\left(\frac{ENT_l^s}{\max(ENT_l^s)}\right) \times PER_l^s$$

where $l$ is the perceptual level, $s$ is the perceptual orientation, and $\rho$ is a constant whose value is preferably in one of the following three ranges: $0 < \rho < 1.1$, $\rho > 1.15$, or $1.1 \leq \rho \leq 1.15$;
a watermark determining unit (250) for calculating the watermark to be embedded;
a watermark embedding unit (260) for embedding the watermark data into the luminance information of the image according to the calculated $JND_l^s$;
wherein calculating the wavelet-domain perceptual threshold $PER_l^s$ of the luminance information of the image using the perceptual masking model includes:
decomposing the luminance information of the image based on the primal sketch, to obtain cell information and texture information;
performing a wavelet transform on the cell information and on the texture information respectively, to obtain the wavelet coefficients of the cell information and the wavelet coefficients of the texture information;
calculating, respectively using the perceptual masking model, the JND values $JND_{WS}$ of the wavelet-domain coefficients of the cell information and $\overline{JND}_{WS}$ of the wavelet-domain coefficients of the texture information;
merging $JND_{WS}$ and $\overline{JND}_{WS}$ as follows to obtain the final perceptual threshold:

$$PER_l^s = m \times JND_{WS} + n \times \overline{JND}_{WS}$$

where $m$ and $n$ are constants.
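The claims say the watermark is embedded "according to" the computed JND map without spelling out the embedding rule in this excerpt. A common JND-scaled additive rule, shown here purely as an assumed sketch, is:

```python
import numpy as np

def embed_subband(coeffs, jnd, bits):
    """Additively embed a binary watermark into one subband's wavelet
    coefficients, scaled per coefficient by the JND strength map.
    (Assumed rule for illustration; not the patent's exact formula.)"""
    w = 2.0 * np.asarray(bits, dtype=float) - 1.0   # {0, 1} -> {-1, +1}
    return np.asarray(coeffs, dtype=float) + np.asarray(jnd, dtype=float) * w
```

Scaling each perturbation by the local JND keeps every modification at or below the just-noticeable-distortion level, which is the point of computing the perceptual threshold in the first place.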
CN201210518446.4A 2012-12-05 2012-12-05 Watermark embedding method and device based on Context resolution and perception layering Active CN103854248B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210518446.4A CN103854248B (en) 2012-12-05 2012-12-05 Watermark embedding method and device based on Context resolution and perception layering


Publications (2)

Publication Number Publication Date
CN103854248A CN103854248A (en) 2014-06-11
CN103854248B true CN103854248B (en) 2017-06-20

Family

ID=50861868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210518446.4A Active CN103854248B (en) 2012-12-05 2012-12-05 Watermark embedding method and device based on Context resolution and perception layering

Country Status (1)

Country Link
CN (1) CN103854248B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106612467A (en) * 2015-10-21 2017-05-03 上海文广互动电视有限公司 A video content protection method and apparatus based on watermarks
CN105611110A (en) * 2015-12-25 2016-05-25 努比亚技术有限公司 Image processing method and apparatus
CN108280797B (en) * 2018-01-26 2021-08-31 江西理工大学 Image digital watermarking algorithm system based on texture complexity and JND model
CN115797141B (en) * 2023-01-18 2023-04-14 北京锐马视讯科技有限公司 Method, system, equipment and storage medium for multiplexing and extracting watermark
CN117173003B (en) * 2023-11-02 2024-01-05 南京师范大学 BIM data zero watermark method, device and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101661605A (en) * 2008-08-26 2010-03-03 浙江大学 Embedding and positioning tampering methods of digital watermark and device thereof
CN102419979A (en) * 2011-11-23 2012-04-18 北京邮电大学 Audio semi-fragile watermarking algorithm for realizing precise positioning of altered area

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8280098B2 (en) * 2005-05-19 2012-10-02 Uti Limited Partnership Digital watermarking CMOS sensor
KR100873947B1 (en) * 2007-05-30 2008-12-12 주식회사 엠엠칩스 Method and System of inserting watermark for H.264/AVC video stream


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An improved quantization based watermarking scheme using local entropy in wavelet domain; Mehdi Boroumand, et al.; 2009 IEEE International Conference on Signal and Image Processing Applications; 2009-12-31; abstract, and page 268, column 2, paragraph 3 to page 270, column 1, paragraph 5 *
Improved Wavelet-Based Watermarking Through Pixel-Wise Masking; Mauro Barni, et al.; IEEE Transactions on Image Processing; 2001-05-31; vol. 10, no. 5; abstract, and page 783, column 2, paragraph 3 to page 786, column 1, paragraph 3 *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant