CN103854248A - Watermark embedding method and device based on content analysis and perception hierarchy - Google Patents
Watermark embedding method and device based on content analysis and perception hierarchy Download PDFInfo
- Publication number
- CN103854248A CN103854248A CN201210518446.4A CN201210518446A CN103854248A CN 103854248 A CN103854248 A CN 103854248A CN 201210518446 A CN201210518446 A CN 201210518446A CN 103854248 A CN103854248 A CN 103854248A
- Authority
- CN
- China
- Prior art keywords
- watermark
- wavelet coefficient
- information
- entropy
- jnd
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Images
Landscapes
- Editing Of Facsimile Originals (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Image Processing (AREA)
Abstract
The invention provides a watermark embedding method based on content analysis. According to the method, basic element information and texture information are obtained by conducting initial schematic diagram decomposition on digital works; the wavelet coefficient of the basic element information and the wavelet coefficient of the texture information are obtained by conducting wavelet transform on the basic element information and the texture information respectively; the watermark intensity parameter values (PER or ENT) of the wavelet coefficient of the basic element information and the wavelet coefficient of texture information are calculated respectively, the two watermark intensity parameter values are combined so that a final watermark intensity parameter value can be obtained, and parameter embedding is conducted according to the final watermark intensity parameter value.
Description
Technical field
The present invention relates to watermark embedding method and the device of a kind of content-based parsing and perception layering.
Background technology
Digital watermark technology is as a branch of Information Hiding Techniques, has inherent advantage providing for copyright protection of digital product aspect effect technique means.Utilize digital watermark technology that specific copyright information is for example hidden into, in raw digital video content (TV programme), guaranteeing under its sightless prerequisite (, meet low perceptibility, do not affect the quality of original works), make original works high definition, high-fidelity, issue and signal transmission through conventional TV programme, normal program viewing still can be provided.Only where necessary, be extracted to provide corresponding copyright to prove.In the issue and transmitting procedure of the digital video content such as TV programme, digital video content is processed (for example together with wherein hiding digital watermarking, compression, add and make an uproar etc.) maybe can be subject to bootlegger's rogue attacks, make watermark be subject to destruction to a certain extent.Therefore need to design the digital watermarking that there is low perceptibility and have simultaneously strong robustness.
In the prior art, take digital video watermarking as example, adopt with the following method digital video is carried out to copyright protection.First, extract the frame picture Y in raw digital video sequence, the monochrome information array of each pixel of this frame picture is designated as to I.I is transformed from a spatial domain to frequency domain, for example, can carry out the conversion from spatial domain to frequency domain by cosine transform (DCT) or wavelet transformation (DWT).Then, calculate watermark strength parameter, for the intensity of determining that watermark embeds.In the time utilizing perceptual masking model to carry out the embedding of watermark, the watermark strength parameter that calculate is minimum discernable distortion (Just Noticeable Difference), is called for short JND value, is also expressed as threshold of perception current PER.In addition, while utilizing entropy masking model to determine the embedding of watermark, calculate entropy ENT.JND value and entropy ENT are watermark embedding parameters, that is to say, watermark embed strength depends on JND value or entropy ENT.Finally according to JND value or entropy ENT, embed watermark in the frequency coefficient of brightness set I.
In digital watermarking embedded technology, wavelet decomposition is a kind of method of conventional spatial domain, frequency domain conversion.Fig. 1 schematically shows the brightness set I of picture Y is carried out to the wavelet coefficient array after one-level wavelet transformation, and as we can see from the figure, the perceived direction s of wavelet coefficient comprises 4 part: LL, LH, HL and HH.In above-mentioned data waterprint embedded method, can do as required multilevel wavelet and decompose.
Existing watermark embedding method has been taken into account low perceptibility and the strong robustness of watermark to a certain extent.But, existing perceptual masking model or entropy masking model use the global statistics characteristic based on pixel mostly, the JND value of calculating like this can not accurate response human eye visually-perceptible, as although some region in image/video has high information entropy, but human eye is very low to the attention rate of its content, such measure information value, easily misleads to the embedding of watermark, and causes the decline of robust performance and perceptual performance.
Summary of the invention
One aspect of the present invention proposes a kind of improved watermark embedding method, and to digital picture, its set I does before wavelet transformation, and pair set I decomposes, and, carries out Context resolution that is, obtains primitive information I
wSand texture information
and meet
try to achieve respectively primitive information I
wSwatermark strength parameter (for example, PER or ENT) and texture information
watermark strength parameter (for example, PER or ENT), two watermark strength parameter values are merged.
One embodiment of the present of invention provide the watermark embedding method of a kind of content-based parsing and perception level, comprising: digital picture is decomposed, to obtain primitive information and texture information; Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture; Utilize perceptual masking model to calculate respectively the JND value of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, that is, and JND
wSwith
to JND
wSwith
merge as follows to obtain final threshold of perception current:
Wherein, l is perception level, and s is perceived direction, and m, n are constant; And according to what calculate
watermark is embedded in described digital picture.
In the above-described embodiments, thus the value of m, n be preferably following value make embed watermark there is good low perceptibility and robustness
An alternative embodiment of the invention provides the watermark embedding method of a kind of content-based parsing and perception level, comprises, digital picture is decomposed, to obtain primitive information and texture information; Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture; Utilize entropy masking model to calculate respectively the entropy of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, ENT
wSwith
to ENT
wSwith
merge as follows to obtain final entropy:
Wherein, l is perception level, and s is perceived direction, and m, n are constant, n>=0.9,0<m≤0.1; And according to the entropy calculating
watermark is embedded in described digital picture.
Another embodiment of the present invention provides a kind of watermark embedding method, comprising: adopt perceptual masking model to calculate the threshold of perception current of the wavelet coefficient of digital picture
utilize entropy masking model to calculate the entropy of the wavelet coefficient of described digital picture
use following formula to calculate through entropy
the watermark strength parameter JND value of modulation:
Wherein, l is perception level, and s is perceived direction, and ρ is constant; According to what calculate
watermark is embedded in described digital picture.
In this embodiment, in the time of 0< ρ <1.1, embedded watermark has good low perceptibility; In the time of ρ >1.15, embedded watermark has stronger robustness; In the time of 1.1≤ρ≤1.15, embedded watermark has low perceptibility and the robustness of good compromise.
Another embodiment of the present invention provides a kind of watermark flush mounting, comprising: data receiver unit (210), and for receive the image that will be embedded into watermark from outside; Resolving cell (220), becomes primitive part and texture part by the described picture breakdown that will be embedded into watermark; Converter unit (230), for respectively by described primitive part and texture part from spatial transform to frequency domain; Watermark strength parameter determining unit (240), for determining and transform to the primitive part of frequency domain and the watermark strength parameter of texture part respectively, and merges these two parameters, to obtain final watermark strength parameter value; Watermark determining unit (250), for calculating the watermark that will embed; Watermark embedded unit (260), the watermark of watermark determining unit (250) being calculated for the final watermark strength parameter value of determining according to described watermark strength parameter determining unit (240) is embedded into the image of wanting embed watermark that data receiver unit (210) receives.
Another embodiment of the present invention provides a kind of watermark flush mounting, comprising: data receiver unit (210), and for receive the image that will be embedded into watermark from outside; Converter unit (230), for the described image that will be embedded into watermark is done to wavelet transformation, with from spatial transform to frequency domain; Watermark strength parameter determining unit (240), adopts perceptual masking model to calculate the threshold of perception current of the wavelet coefficient of described image for (a)
(b) utilize entropy masking model to calculate the entropy of the wavelet coefficient of described image
(c) use following formula to calculate through entropy
the watermark strength parameter JND value of modulation:
Wherein, l is perception level, and s is perceived direction, and ρ is constant; Wherein the numerical optimization of ρ is one of following three scopes 0< ρ <1.1, ρ >1.15 and 1.1≤ρ≤1.15.
This device also comprises watermark determining unit (250), for calculating the watermark that will embed; Watermark embedded unit (260), described in calculating
watermark is embedded in described image.
Beneficial effect
As mentioned above, the present invention separately considers primitive information and texture information in the time calculating watermark strength parameter value, makes the watermark embedding have lower perceptibility and stronger robustness.
In above-mentioned other method of the present invention, owing to having taken into account sensor model and entropy masking model, make the watermark embedding there is lower perceptibility and stronger robustness in addition.
Accompanying drawing explanation
Fig. 1 schematically shows original digital image is carried out to the wavelet coefficient array after one-level wavelet transformation;
Fig. 2 is the block scheme of watermark flush mounting according to an embodiment of the invention;
Fig. 3 adopts perceptual masking model original video to be added to the schematic diagram of watermark according to an embodiment of the invention;
Fig. 4 adopts perceptual masking model to determine the process flow diagram of the PER value of raw digital video one frame picture according to an embodiment of the invention;
Fig. 5 adopts entropy masking model original video to be added to the schematic diagram of watermark in accordance with another embodiment of the present invention;
Fig. 6 is the process flow diagram that the employing entropy masking model of above-mentioned another embodiment according to the present invention is determined the ENT value of original digital image.
Embodiment
Some specific embodiments are described below with reference to accompanying drawings.But these embodiment the present invention includes for limiting the present invention, should be understood that the various modifications that drop in concept of the present invention and technical scope, are equal to and describe and replace.Omit the detailed description that those are regarded as not having understanding the present invention helpful prior art related to the present invention.
Fig. 2 shows the block scheme of watermark flush mounting 200 according to an embodiment of the invention.
As shown in Figure 2, comprise according to the watermark flush mounting 200 of this embodiment: data receiver unit 210, for receiving the data that will be embedded into watermark, these data that will be embedded into watermark can be the numerical datas of a width picture, also can be the video data Y with multiframe picture, in the situation that input data are video data Y, from video data Y, an extraction two field picture wherein, as the data of wanting embed watermark, is designated as I by the brightness data of this two field picture; Resolving cell 220, decomposes for the brightness data of a two field picture that will be embedded into watermark that data receiving element 210 is received, for example, carries out primal sketch decomposition, to resolve into primitive part I
wSand texture part
and meet
converter unit 230, for the primitive part I respectively resolving cell 220 being decomposed
wSand texture part
transform from a spatial domain to frequency domain, for example, carry out wavelet transformation, obtain
with
wherein
with
respectively primitive part I
wSand texture part
wavelet coefficient; Watermark strength parameter determining unit 240, for determining respectively primitive part wavelet coefficient
with texture part wavelet coefficient
watermark strength parameter value (PER or ENT), and these two watermark strength parameter values are merged, to obtain final watermark strength parameter value; Watermark determining unit 250, for embedding the watermark of described reception data or receive the directly watermark of input of user according to user's command calculations; Watermark embedded unit 260, the watermark of watermark determining unit 250 being calculated or being received for the final watermark strength parameter value of determining according to watermark strength parameter determining unit 240 is embedded into data receiver unit 210 and receives the data that will be embedded into watermark.
Fig. 3 shows according to one embodiment of the invention and utilizes the watermark flush mounting 200 shown in Fig. 2 to determine JND value the schematic diagram of embed watermark in this frame picture thus of the frame picture extracting according to perceptual masking model in digital of digital video data, and perception component wherein refers to the wavelet coefficients at different levels of primitive and texture.Fig. 4 is according to the process flow diagram of this embodiment method of embed watermark in digital video one frame picture.
With reference to Fig. 4, at step S110, data receiver unit 210 is from outside receiving digital video data, and from this digital video, extracts the luminance signal of a frame picture Y, is designated as set I, and set I is the set of the brightness of each pixel of extracted a frame picture.At step S120, the picture of resolving cell 220 to this extraction, that is, set I decomposes (for example, carrying out primal sketch decomposition), to obtain primitive information Is and texture information
it meets
at step S130, converter unit 230 is respectively to primitive information I
wSand texture information
carry out wavelet transformation, obtain primitive information I
wSwavelet coefficient
and texture information
wavelet coefficient
here, preferably to primitive information I
wSand texture information
carry out the wavelet transformation of 1 to 3 grade.At step S140, watermark strength parameter determining unit 240 utilizes perceptual masking model to calculate respectively the JND value JND of the wavelet domain coefficients of primitive information
wSjND value with the wavelet domain coefficients of texture information
at step S150, the JND of the wavelet domain coefficients of watermark strength parameter determining unit 240 to primitive information
wSwith the wavelet domain coefficients of texture information
merge and obtain following total JND value, or threshold of perception current
Wherein, subscript WS represents primitive information, and "-" on variable represents complementary relationship, for example, and I
wSrepresent primitive information,
represent complementary with it texture information; L represents the progression of wavelet transformation, that is, perception level, s is perceived direction, that is, s represents any of LL, LH, HL and HH in wavelet coefficient; M, n are constant.In addition, due to JND
wSwith
calculating all relevant with l and s, therefore,
there is upper subscript s and l.
In the situation that primitive information and texture information are carried out to 1 to 3 grade of wavelet transformation, through overtesting, m, n embedded watermark in the time of following numerical range has good low perceptibility and robustness:
At step S160, watermark embedded unit 260 is according to the watermark strength parameter calculating
value is embedded into watermark in a described frame picture Y who extracts the raw digital video receiving from outside data receiver unit 210.In the situation that picture Y is gray scale picture, in this picture Y, embed watermark is embed watermark in the brightness set I of picture Y; In the situation that picture Y is colour picture, in this picture Y, embed watermark comprises, then first embed watermark in the brightness set I of picture Y merges the chroma data of the set I of embed watermark and picture Y.
In addition, described in be embedded into the watermark in picture Y, can be the watermark that watermark embedded unit 260 obtains according to user's command calculations, can be also by the directly watermark of input of user, for example, copyright information.
Fig. 5 shows the watermark flush mounting 200 that utilizes according to a further embodiment of the invention shown in Fig. 2 and determines the entropy ENT of the two field picture in digital video the schematic diagram of embed watermark thus according to entropy masking model.Fig. 6 is according to the process flow diagram of this embodiment embed watermark in a two field picture of digital video.
The difference of Fig. 6 and Fig. 4 is, in step S140, the present embodiment profit watermark strength parameter determining unit 240 use entropy masking models rather than perceptual masking model calculate respectively the entropy ENT of the wavelet domain coefficients of primitive information
wSentropy with the wavelet domain coefficients of texture information
then the ENT to wavelet domain coefficients
wSwith the wavelet domain coefficients of texture information
merge obtain following total
value,
Wherein, m, n are constant, n >=0.9,0<m≤0.1.
Then at step S160, watermark embedded unit 260 is according to the watermark strength parameter calculating
value is embedded into watermark in a described frame picture Y who extracts the raw digital video receiving from outside data receiver unit 210.
In another embodiment of the present invention, watermark strength parameter determining unit 240 adopts perceptual masking model to calculate the watermark strength parameter of a frame of digital picture Y
and utilize entropy masking model to calculate the watermark strength parameter of a frame of digital picture Y
then, use following formula to calculate through entropy
the watermark strength parameter JND value of modulation:
Test shows, in the time of 0< ρ <1.1, embedded watermark has good low perceptibility; In the time of ρ >1.15, embedded watermark has stronger robustness; In the time of 1.1≤ρ≤1.15, embedded watermark has low perceptibility and the robustness of good compromise.
In this embodiment, adopt the watermark strength parameter of perceptual masking model computed image Y
can be to utilize the method described in Fig. 4 to utilize formula (1) to carry out.Meanwhile, utilize the watermark strength parameter of entropy masking model computed image Y
can be to utilize the method described in Fig. 6 to utilize formula (2) to carry out.
In a modification of this embodiment, adopt the watermark strength parameter of perceptual masking model computed image Y
comprise by the entirety of a two field picture being done to wavelet transformation (that is, I does wavelet transformation to its brightness set), then the wavelet coefficient of computed image entirety
and utilize the watermark strength parameter of entropy masking model computed image Y
comprise by the entirety of described image being done to wavelet transformation (that is, I does wavelet transformation to its brightness set), then the wavelet coefficient of computed image entirety
Perceptual masking model sample calculation
In example below, introducing in the step S140 in said method utilizes perceptual masking model to calculate the circular of the JND value of primitive wavelet coefficient and texture wavelet coefficient, those skilled in the art can understand, and the invention is not restricted to following circular.
Consider frequency masking characteristic Frequency (l, s), brightness masking characteristics Luminance (l, x, y) and texture masking Texture (l, x, y) and calculate as follows JND value:
Wherein, the two-dimensional coordinate in (x, y) statement wavelet coefficient array.
In the situation that raw video image Y is done to 3 grades of wavelet transformations, frequency masking characteristic Frequency (l, s) can be expressed as:
Brightness masking characteristics Lumin α nce (l, x, y) is expressed as:
Lumiance(l,x,y)=1+L(l,x,y) (6)
Texture masking Texture (l, x, y) is expressed as:
Entropy masking model sample calculation
The circular of entropy while introducing the JND value of utilizing entropy masking model calculating primitive wavelet coefficient and texture wavelet coefficient in the step S140 in said method in example below.
Fig. 5 is that to utilize entropy masking model to calculate embedment strength parameter be entropy ENT, and according to the schematic diagram of the definite watermark embed strength of this ENT value calculating and then embed watermark.The calculating of entropy adopts the method for stationary window traversal statistics, the information entropy value representation of each coefficient value is centered by this coefficient value, calculate the region that its n × n(calculates for assurance and can have a clear and definite central point, n is generally odd number) information entropy in size area, for (the n-1)/2 circle pixel of image border, stipulate that its information entropy is 0.The entropy of rest of pixels is calculated by following formula:
Wherein,
represent the wavelet coefficient of l level s direction
shannon entropy, p (z) is wavelet coefficient probability of occurrence,
define
n × n adjacent coefficient.
Although described the present invention with reference to one or more embodiment, those of skill in the art should be understood that can be with various forms correction or change the present invention in the case of the spirit and scope of the present invention described in not departing from appended claims.
Claims (17)
1. a watermark embedding method for content-based parsing and perception level, comprising:
Digital picture is decomposed, to obtain primitive information and texture information;
Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture;
Utilize perceptual masking model to calculate respectively the JND value of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, that is, and JND
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant;
2. watermark embedding method according to claim 1, wherein, carries out wavelet transformation to primitive information and texture information and comprises primitive information and texture information are carried out to 1 to 3 grade of wavelet transformation, and the value of m, n is preferably:
3. a watermark embedding method for content-based parsing and perception level, comprising:
Digital picture is decomposed, to obtain primitive information and texture information;
Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture;
Utilize entropy masking model to calculate respectively the entropy of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, ENT
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant, n >=0.9,0<m≤0.1;
4. according to the watermark embedding method described in claim 1 or 3,
Wherein, digital picture is decomposed to obtain primitive information and texture information comprises, described digital picture is carried out to primal sketch decomposition.
5. a watermark embedding method, comprising:
Adopt perceptual masking model to calculate the threshold of perception current of the wavelet coefficient of digital picture
Utilize entropy masking model to calculate the entropy of the wavelet coefficient of described digital picture
Use following formula to calculate the watermark strength parameter JND value through entropy modulation:
Wherein, l is perception level, and s is perceived direction, and ρ is constant;
6. watermark embedding method according to claim 5, wherein, the numerical optimization of ρ is one of following three scopes
0< ρ <1.1, ρ >1.15 and 1.1≤ρ≤1.15.
7. watermark embedding method according to claim 5, described employing perceptual masking model calculates the threshold of perception current of the wavelet coefficient of described digital picture
comprise:
Described digital picture is decomposed into primitive information and texture information;
Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture;
Utilize perceptual masking model to calculate respectively the JND value of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, that is, and JND
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant.
8. watermark embedding method according to claim 7, wherein, carries out wavelet transformation to primitive information and texture information and comprises primitive information and texture information are carried out to 1 to 3 grade of wavelet transformation, and the value of m, n is preferably:
9. watermark embedding method according to claim 5, described employing perceptual masking model calculates the threshold of perception current of the wavelet coefficient of described digital picture
comprise:
Described digital picture entirety is done to wavelet transformation;
10. watermark embedding method according to claim 5, the described entropy that utilizes entropy masking model to calculate the wavelet coefficient of described digital picture
comprise:
Described digital picture is decomposed into primitive information and texture information;
Respectively primitive information and texture information are carried out to wavelet transformation, to obtain the wavelet coefficient of primitive information and the wavelet coefficient of texture;
Utilize entropy masking model to calculate respectively the entropy of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, ENT
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant, n >=0.9,0<m≤0.1.
11. according to the watermark embedding method described in claim 7 or 10,
Wherein, described digital picture is decomposed into primitive information and texture information comprises, described digital picture is carried out to primal sketch decomposition.
12. watermark embedding methods according to claim 5, described employing entropy masking model calculates the entropy of the wavelet coefficient of described digital picture
comprise:
Described digital picture entirety is done to wavelet transformation;
13. 1 kinds of watermark flush mountings, comprising:
Data receiver unit (210), for receiving the image that will be embedded into watermark from outside;
Resolving cell (220), becomes primitive part and texture part by the described picture breakdown that will be embedded into watermark;
Converter unit (230), for respectively by described primitive part and texture part from spatial transform to frequency domain;
Watermark strength parameter determining unit (240), for determining and transform to the primitive part of frequency domain and the watermark strength parameter of texture part respectively, and merges these two parameters, to obtain final watermark strength parameter value;
Watermark determining unit (250), for calculating the watermark that will embed;
Watermark embedded unit (260), the watermark of watermark determining unit (250) being calculated for the final watermark strength parameter value of determining according to described watermark strength parameter determining unit (240) is embedded into data receiver unit (210) and receives the image of wanting embed watermark.
14. watermark flush mountings according to claim 13, wherein
Described converter unit (230) does respectively wavelet transformation to described primitive part and texture part.
15. watermark flush mountings according to claim 14, wherein said watermark strength parameter determining unit (240)
Utilize perceptual masking model to calculate respectively the strong parameter value of watermark of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, that is, and JND
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant, and the value of m, n is preferably:
16. watermark flush mountings according to claim 14, wherein said watermark strength parameter determining unit (240)
Utilize entropy masking model to calculate respectively the entropy of the wavelet coefficient of primitive information and the wavelet coefficient of texture information, ENT
wSwith
Wherein, l is perception level, and s is perceived direction, and m, n are constant, n >=0.9,0<m≤0.1.
17. 1 kinds of watermark flush mountings, comprising:
Data receiver unit (210), for receiving the image that will be embedded into watermark from outside;
Converter unit (230), for the described image that will be embedded into watermark is done to wavelet transformation, with from spatial transform to frequency domain;
Watermark strength parameter determining unit (240), for
Adopt perceptual masking model to calculate the threshold of perception current of the wavelet coefficient of described image
Utilize entropy masking model to calculate the entropy of the wavelet coefficient of described image
Use following formula to calculate the watermark strength parameter JND value through entropy modulation:
Wherein, l is perception level, and s is perceived direction, and ρ is constant; Wherein the numerical optimization of ρ is one of following three scopes
0< ρ <1.1, ρ >1.15 and 1.1≤ρ≤1.15,
Watermark determining unit (250), for calculating the watermark that will embed;
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210518446.4A CN103854248B (en) | 2012-12-05 | 2012-12-05 | Watermark embedding method and device based on Context resolution and perception layering |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210518446.4A CN103854248B (en) | 2012-12-05 | 2012-12-05 | Watermark embedding method and device based on Context resolution and perception layering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103854248A true CN103854248A (en) | 2014-06-11 |
CN103854248B CN103854248B (en) | 2017-06-20 |
Family
ID=50861868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210518446.4A Active CN103854248B (en) | 2012-12-05 | 2012-12-05 | Watermark embedding method and device based on Context resolution and perception layering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103854248B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105611110A (en) * | 2015-12-25 | 2016-05-25 | 努比亚技术有限公司 | Image processing method and apparatus |
CN106612467A (en) * | 2015-10-21 | 2017-05-03 | 上海文广互动电视有限公司 | A video content protection method and apparatus based on watermarks |
CN108280797A (en) * | 2018-01-26 | 2018-07-13 | 江西理工大学 | A kind of Arithmetic on Digital Watermarking of Image system based on Texture complication and JND model |
CN115797141A (en) * | 2023-01-18 | 2023-03-14 | 北京锐马视讯科技有限公司 | Method, system, equipment and storage medium for extracting watermark multiplexing |
CN117173003A (en) * | 2023-11-02 | 2023-12-05 | 南京师范大学 | BIM data zero watermark method, device and medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008147142A2 (en) * | 2007-05-30 | 2008-12-04 | Mmchips Co., Ltd. | Method and system of inserting watermark for h.264/avc video stream |
US20090141931A1 (en) * | 2005-05-19 | 2009-06-04 | Orly Yadid-Pecht | Digital Watermarking CMOS Sensor |
CN101661605A (en) * | 2008-08-26 | 2010-03-03 | 浙江大学 | Embedding and positioning tampering methods of digital watermark and device thereof |
CN102419979A (en) * | 2011-11-23 | 2012-04-18 | 北京邮电大学 | Audio semi-fragile watermarking algorithm for realizing precise positioning of altered area |
-
2012
- 2012-12-05 CN CN201210518446.4A patent/CN103854248B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090141931A1 (en) * | 2005-05-19 | 2009-06-04 | Orly Yadid-Pecht | Digital Watermarking CMOS Sensor |
WO2008147142A2 (en) * | 2007-05-30 | 2008-12-04 | Mmchips Co., Ltd. | Method and system of inserting watermark for h.264/avc video stream |
CN101661605A (en) * | 2008-08-26 | 2010-03-03 | 浙江大学 | Embedding and positioning tampering methods of digital watermark and device thereof |
CN102419979A (en) * | 2011-11-23 | 2012-04-18 | 北京邮电大学 | Audio semi-fragile watermarking algorithm for realizing precise positioning of altered area |
Non-Patent Citations (2)
Title |
---|
MAURO BARNI, ET AL: "Improved Wavelet-Based Watermarking Through Pixel-Wise Masking", 《IEEE TRANSACTIONS ON IMAGE PROCESSING》 * |
MEHDI BOROUMAND, ET AL: "An improved quantization based watermarking scheme using local entropy in wavelet domain", 《2009 IEEE INTERNATIONAL CONFERENCE ON SIGNAL AND IMAGE PROCESSING APPLICATIONS》 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106612467A (en) * | 2015-10-21 | 2017-05-03 | 上海文广互动电视有限公司 | A video content protection method and apparatus based on watermarks |
CN105611110A (en) * | 2015-12-25 | 2016-05-25 | 努比亚技术有限公司 | Image processing method and apparatus |
CN108280797A (en) * | 2018-01-26 | 2018-07-13 | 江西理工大学 | A kind of Arithmetic on Digital Watermarking of Image system based on Texture complication and JND model |
CN108280797B (en) * | 2018-01-26 | 2021-08-31 | 江西理工大学 | Image digital watermarking algorithm system based on texture complexity and JND model |
CN115797141A (en) * | 2023-01-18 | 2023-03-14 | 北京锐马视讯科技有限公司 | Method, system, equipment and storage medium for extracting watermark multiplexing |
CN115797141B (en) * | 2023-01-18 | 2023-04-14 | 北京锐马视讯科技有限公司 | Method, system, equipment and storage medium for multiplexing and extracting watermark |
CN117173003A (en) * | 2023-11-02 | 2023-12-05 | 南京师范大学 | BIM data zero watermark method, device and medium |
CN117173003B (en) * | 2023-11-02 | 2024-01-05 | 南京师范大学 | BIM data zero watermark method, device and medium |
Also Published As
Publication number | Publication date |
---|---|
CN103854248B (en) | 2017-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Chou et al. | A perceptually tuned watermarking scheme for color images | |
CN101662686B (en) | DCT domain video watermark treatment method based on texture features | |
CN101605272B (en) | Method for evaluating objective quality of partial reference type image | |
CN103854248A (en) | Watermark embedding method and device based on content analysis and perception hierarchy | |
Zhang et al. | Spread spectrum image watermarking based on perceptual quality metric | |
Liu | Wavelet-based watermarking for color images through visual masking | |
CN101504760A (en) | Digital image concealed information detecting and positioning method | |
CN104766269A (en) | Spread transform dither modulation watermarking method based on JND brightness model | |
Nguyen et al. | Perceptual watermarking using a new Just-Noticeable-Difference model | |
CN113628090B (en) | Anti-interference message steganography and extraction method, system, computer equipment and terminal | |
EP2154649A1 (en) | An adaptive watermarking system and method | |
Carosi et al. | Toward a simplified perceptual quality metric for watermarking applications | |
Abodena et al. | Hybrid technique for robust image watermarking using discrete time fourier transform | |
Okagaki et al. | Robustness evaluation of digital watermarking based on discrete wavelet transform | |
CN114363719B (en) | Content self-adaptive robustness video watermarking method based on Bessel-Fourier moment | |
Yang et al. | Information entropy used in digital watermarking | |
Zhou et al. | Reduced reference stereoscopic image quality assessment using digital watermarking | |
CN103559677B (en) | Based on the adapting to image watermark embedding method of wavelet transformation and visual characteristic | |
Yan et al. | Block-based adaptive image watermarking scheme using just noticeable difference | |
KR101321896B1 (en) | A watermarking method based on occlusion for copyright protection of stereoscopic image | |
Ramaiya et al. | Digital security using watermarking techniques via discrete wavelet transform | |
CN1937769B (en) | Water print method and device for digital video signal | |
Drebuzhan et al. | The Steganographic Method of Introducing Additional Information, Resistant to Transcoding and Conversion | |
Liu | Human visual system based watermarking for color images | |
Liu et al. | An improved spatial spread-spectrum video watermarking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |