CN112788342A - Watermark information embedding method and device

Info

Publication number: CN112788342A
Application number: CN201911093370.3A
Authority: CN (China)
Prior art keywords: sequence, target, gray, bit sequence, bit
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112788342B
Inventors: 刘永亮, 曾晶华
Assignee (original and current): Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd
Priority application: CN201911093370.3A
Related PCT application: PCT/CN2020/126360 (WO2021093648A1)
Publication of application: CN112788342A; granted publication: CN112788342B

Classifications

    • H04N19/184: Coding, decoding, compressing or decompressing digital video signals using adaptive coding, characterised by the coding unit being bits, e.g. of the compressed video stream
    • G06T1/00: General purpose image data processing
    • H04N19/20: Coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/467: Embedding additional information in the video signal during the compression process, the embedded information being invisible, e.g. watermarking

Abstract

The application discloses a watermark information embedding method and apparatus. The method comprises: obtaining a carrier object; obtaining watermark information to be embedded; obtaining a target preamble sequence, the target preamble sequence satisfying the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination; and embedding the watermark information to be embedded and the target preamble sequence into the carrier object. With this method, watermark information that is completely consistent with the embedded watermark information can be extracted, which solves the problem in the prior art that the uniqueness of the watermark information cannot be ensured because the embedded watermark information does not completely match the extracted watermark information.

Description

Watermark information embedding method and device
Technical Field
The application relates to the technical field of computers, in particular to a watermark information embedding method. The application also relates to a watermark information embedding device and an electronic device. The application also relates to a watermark information extraction method, a watermark information extraction device and an electronic device.
Background
With the rapid development of communication and multimedia technology, using digital watermarking to address the security, copyright protection and authentication of digital media has become a research hotspot. Convenient network distribution has made illegal distribution increasingly rampant, seriously infringing the basic interests of copyright owners. To prevent digital film and television works from being illegally distributed and arbitrarily tampered with, digital watermarking is widely applied to carrier objects such as multimedia information, documents and software as an effective means of copyright protection and anti-counterfeiting tracing.
In digital watermarking, watermark information is embedded into the carrier object to be protected without affecting its normal use, and methods such as encryption prevent the content from being copied or altered at will. For example, when piracy or a copyright dispute occurs, the watermark information can be extracted from the disputed work as evidence of copyright ownership, thereby safeguarding the owner's rights and interests. In addition, video watermarking can determine from the embedded identification information whether the carrier object has been altered, so that the modifier can be tracked and the basic rights and interests of the copyright owner protected.
The signal processing that video images undergo includes channel noise, filtering, digital/analog and analog/digital conversion, resampling, image cropping, image shifting, image scaling, image compression coding, and the like. Robustness means that the watermark information remains intact and can be accurately detected after various unintentional or intentional signal processing operations. Existing watermark embedding and extraction algorithms for video sequences can effectively resist signal processing such as image compression, noise addition and filtering, but they lack robustness against geometric attacks. That is, after an attacker applies geometric transformations such as rotation and scaling that do not affect the visual effect of the target image, most watermark detectors can no longer detect the embedded watermark information. For example, with a watermark embedding method based on image blocking, once such geometric transformations destroy the synchronization between image blocks, the same blocking used when embedding the watermark information can no longer be reproduced, which leads to errors in extracting the watermark information.
Because the shape of an image's gray level histogram is invariant under geometric transformations such as rotation and scaling, watermark embedding methods based on the image gray level histogram resist geometric attacks well. A histogram-based watermark algorithm searches around the gray mean of the image to determine the embedding position of the watermark information; in this process multiple candidate watermark sequences are detected and matched against the original watermark information to decide whether the original watermark information is embedded in the detected image.
However, the above histogram-based watermark embedding method brings a very large false positive probability when extracting the watermark information. While determining the embedding position, many watermark sequences are highly similar to their shifted versions, so even if the detected watermark information is a shifted copy, matching it against the original watermark information still yields high reliability; likewise, a completely wrong watermark sequence (not the original watermark information) matched against the many detected candidates may also be extracted from the carrier image with high reliability. Consequently, the uniqueness of the watermark information cannot be ensured. For a video frame sequence, where the embedding position must also be determined and many detected candidates matched against the original watermark information, the uniqueness of the watermark information likewise cannot be achieved.
Disclosure of Invention
The embodiment of the application provides a watermark information embedding method, which aims to solve the problem that the uniqueness of watermark information cannot be ensured in the existing watermark information embedding method. Further embodiments of the present application provide a watermark information embedding apparatus and an electronic device. The application also provides a watermark information extraction method, a watermark information extraction device and an electronic device.
The embodiment of the application provides a watermark information embedding method, which comprises the following steps:
obtaining a carrier object; obtaining watermark information to be embedded; obtaining a target preamble sequence; and embedding the watermark information to be embedded and the target preamble sequence into the carrier object; wherein the target preamble sequence satisfies the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination.
Optionally, the target preamble sequence is a binary bit sequence, and the embedding the watermark information to be embedded and the target preamble sequence into the carrier object includes: adding the target preamble sequence to the front end or the rear end of the watermark information to be embedded to obtain a target bit sequence; and embedding the target bit sequence into the carrier object.
Optionally, the carrier object includes a carrier image, and the embedding the target bit sequence into the carrier object includes: obtaining a gray level histogram of the carrier image; according to the target bit sequence, adjusting the shape of the gray level histogram to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the carrier image according to the adjusted gray level histogram to obtain a target image embedded with the target bit sequence.
Optionally, the adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: obtaining a target gray level interval of the gray level histogram; dividing the target gray scale interval according to the bit number of the target bit sequence to obtain gray scale subintervals corresponding to the bit number, wherein each gray scale subinterval comprises at least two adjacent gray scale levels, and the gray scale levels are used for representing the number of pixel points with the same gray scale value; obtaining quantity relation information of pixel points in a gray scale subinterval corresponding to the bit value; and adjusting the number of pixel points contained in at least two adjacent gray levels of the gray sub-interval according to the target bit sequence and the number relation information of the pixel points in the gray sub-interval corresponding to the bit value to obtain an adjusted gray histogram.
Optionally, the obtaining of the target gray level interval of the gray level histogram includes: calculating the gray average value of the carrier image; and calculating a target gray level interval of the gray level histogram according to the gray level mean value based on the representation range of the gray level histogram and the bit number of the target bit sequence.
Optionally, the obtaining information of the number relationship between the pixel points in the gray scale subinterval corresponding to the bit value includes: when the bit value is 1, obtaining a first comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in a gray sub-interval corresponding to the bit value and the preset embedding strength; when the bit value is 0, obtaining a second comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray sub-interval corresponding to the bit value and the preset embedding strength;
the adjusting the number of the pixel points included in at least two adjacent gray levels of the gray sub-interval according to the target bit sequence and the information of the number relationship of the pixel points in the gray sub-interval corresponding to the bit value to obtain an adjusted gray histogram includes:
when the bit value to be embedded of the target bit sequence is 1: if the ratio of the numbers of pixel points contained in the two adjacent gray levels of the corresponding gray subinterval and the preset embedding strength conform to the first comparison relation, the pixel points are not adjusted; if they do not conform to the first comparison relation, a first number of pixel points are selected from the pixel points contained in the gray level with the larger gray value and moved to the gray level with the smaller gray value;
when the bit value to be embedded of the target bit sequence is 0: if the ratio of the numbers of pixel points contained in the two adjacent gray levels of the corresponding gray subinterval and the preset embedding strength conform to the second comparison relation, the pixel points are not adjusted; if they do not conform to the second comparison relation, a second number of pixel points are selected from the pixel points contained in the gray level with the smaller gray value and moved to the gray level with the larger gray value;
and the first quantity and the second quantity are obtained by calculation according to the preset embedding intensity and the quantity of pixel points contained in two adjacent gray levels in the gray subinterval.
Optionally, the selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value includes: selecting, in a random manner, a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value;
and the selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value includes: selecting, in a random manner, a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value.
Optionally, before obtaining the gray level histogram of the carrier image, the method further includes: carrying out Gaussian filtering processing on the carrier image to obtain a low-frequency signal part of the carrier image; the obtaining of the gray level histogram of the carrier image includes: a grey level histogram of a low frequency signal portion of the carrier image is obtained.
Optionally, the obtaining a gray level histogram of a low-frequency signal portion of the carrier image includes: carrying out blocking processing on a low-frequency signal part of the carrier image to obtain a blocking image; calculating the gray average value of the block images; and counting to obtain a gray level histogram of the block image according to the gray level mean value of the block image.
Optionally, the method further includes: calculating to obtain the mean square error of the block image; the adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: selecting gray level histograms of a preset number of block images according to the sequence of the mean square deviations of the block images from large to small; and adjusting the shape of the gray level histogram of the block images of the preset number according to the target bit sequence to obtain the adjusted gray level histogram of the block images.
Optionally, the carrier object includes a sequence of video frames, and the embedding the target bit sequence into the carrier object includes: obtaining a first sub-bit sequence representing a bit value "0" in the target bit sequence and obtaining a second sub-bit sequence representing a bit value "1" in the target bit sequence, the first sub-bit sequence being distinct from the second sub-bit sequence; obtaining a target video image in the video frame sequence, wherein the target video image refers to a video image to be embedded into the target bit sequence; obtaining a target bit value to be embedded into the target video image; embedding the first or second sub-bit sequence representing the target bit value into the target video image.
Optionally, the first sub-bit sequence satisfies the following condition: after the first sub-bit sequence is subjected to shift processing, the discrimination between the obtained shifted bit sequence and the first sub-bit sequence is greater than a preset discrimination; and the second sub-bit sequence satisfies the following condition: after the second sub-bit sequence is shifted, the discrimination between the shifted bit sequence and the second sub-bit sequence is greater than a preset discrimination.
Optionally, the first sub-bit sequence includes a first sub-preamble sequence and first sub-watermark information, and the first sub-preamble sequence satisfies the following condition: after the first sub-preamble sequence is shifted, the discrimination between the shifted bit sequence and the first sub-preamble sequence is greater than a preset discrimination; the second sub-bit sequence includes a second sub-preamble sequence and second sub-watermark information, and the second sub-preamble sequence satisfies the following condition: after the second sub-preamble sequence is shifted, the discrimination between the shifted bit sequence and the second sub-preamble sequence is greater than a preset discrimination; wherein the first sub-watermark information is different from the second sub-watermark information.
Optionally, the first sub-preamble sequence is the same as the second sub-preamble sequence.
Optionally, the first sub-preamble sequence and the target preamble sequence are the same bit sequence, and the second sub-preamble sequence and the target preamble sequence are the same bit sequence.
Optionally, the obtaining a target video image in the video frame sequence includes: in a manner of embedding one bit value in one frame of the video frame sequence, obtaining from the video frame sequence, according to the number of bit values contained in the target bit sequence, the target video images into which the target bit sequence is to be embedded.
Optionally, the embedding the first sub-bit sequence or the second sub-bit sequence representing the target bit value into the target video image includes: obtaining a gray level histogram of the target video image; adjusting the shape of a gray level histogram of the target video image according to the first sub-bit sequence or the second sub-bit sequence for representing the target bit value to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the target video image according to the adjusted gray level histogram to obtain the target video image embedded with the first sub-bit sequence or the second sub-bit sequence.
Optionally, the carrier object includes a sequence of video frames, and the embedding the target bit sequence into the carrier object includes: repeatedly embedding the target bit sequence into the video frame sequence according to a predetermined embedding number.
The embodiment of the present application further provides a watermark information extraction method, including:
obtaining an object to be detected; obtaining a reference bit sequence comprising a preamble sequence and original watermark information, the preamble sequence satisfying the following condition: after the preamble sequence is shifted, the discrimination between the shifted sequence and the preamble sequence is greater than a preset discrimination; extracting a target bit sequence from the object to be detected; and matching the target bit sequence with the reference bit sequence to determine whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
Optionally, the preamble sequence is set at the front end of the original watermark information, and the extracting a target bit sequence from the object to be detected includes: taking the start position of the embedding range of the target bit sequence as the initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
Optionally, the preamble sequence is set at the rear end of the original watermark information, and the extracting a target bit sequence from the object to be detected includes: taking the end position of the embedding range of the target bit sequence as the initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
Optionally, the object to be detected includes an image to be detected, and the extracting a target watermark sequence from the object to be detected includes: obtaining a gray level histogram of the image to be detected; and extracting a target watermark sequence from the object to be detected based on the gray histogram.
Optionally, the object to be detected includes a video frame sequence, and the extracting the target watermark sequence from the object to be detected includes: obtaining a first reference sub-bit sequence for representing a bit value "0" in the target bit sequence and obtaining a second reference sub-bit sequence for representing a bit value "1" in the target bit sequence, the first reference sub-bit sequence being distinct from the second reference sub-bit sequence; extracting a plurality of target sub-bit sequences from video images of the sequence of video frames; and comparing the target sub-bit sequence with the first reference sub-bit sequence and the second reference sub-bit sequence respectively to determine that the bit value embedded in the video image of the video frame sequence is '0' or '1'.
Optionally, the extracting a target watermark sequence from the object to be detected based on the gray level histogram includes: calculating the gray average value of the image to be detected; calculating a target gray level interval of the gray level histogram according to the gray level mean value; dividing the target gray scale interval according to the number of the bit values of the watermark sequence embedded into the image to be detected, so as to obtain gray scale subintervals corresponding to the number of the bit values of the embedded watermark sequence, wherein each gray scale subinterval comprises at least two adjacent gray scale levels, and the gray scale levels are used for representing the number of pixel points with the same gray scale value; obtaining predetermined bit value extraction data; and respectively extracting the bit values embedded into the image to be detected according to the number of pixel points in at least two adjacent gray levels of the gray subintervals and the bit value extraction data to obtain the target bit sequence.
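As a rough illustration of this extraction rule, the following Python sketch (hypothetical helper; the simple a-versus-b comparison stands in for the predetermined bit value extraction data, which is not fixed here) reads bits back from the pixel counts of each gray subinterval:

    import numpy as np

    def extract_bits(image_to_detect, subintervals):
        """Read one embedded bit from each gray subinterval (bin1, bin2) of the
        gray level histogram: a and b are the pixel counts of the two adjacent
        gray levels, and the bit is decided from their relative sizes."""
        hist = np.bincount(image_to_detect.ravel(), minlength=256)
        bits = []
        for bin1, bin2 in subintervals:
            a, b = int(hist[bin1]), int(hist[bin2])
            bits.append(1 if a >= b else 0)   # assumed decision rule
        return bits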
Another embodiment of the present application further provides a watermark information embedding apparatus, including:
a carrier object obtaining unit for obtaining a carrier object;
a to-be-embedded watermark information obtaining unit, configured to obtain to-be-embedded watermark information;
a target preamble sequence obtaining unit, configured to obtain a target preamble sequence, where the target preamble sequence satisfies the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination;
and an information embedding unit, configured to embed the watermark information to be embedded and the target preamble sequence into the carrier object.
Another embodiment of the present application further provides an electronic device, including: a processor and a memory for storing a watermark information embedding program, which, when read and executed by the processor, performs the following operations: obtaining a carrier object; obtaining watermark information to be embedded; obtaining a target preamble sequence, the target preamble sequence satisfying the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination; and embedding the watermark information to be embedded and the target preamble sequence into the carrier object.
Another embodiment of the present application further provides a watermark information extraction apparatus, including:
the object to be detected obtaining unit is used for obtaining an object to be detected;
a reference bit sequence obtaining unit, configured to obtain a preset reference bit sequence containing a preamble sequence and original watermark information, where the preamble sequence satisfies the following condition: after the preamble sequence is shifted, the discrimination between the shifted sequence and the preamble sequence is greater than a preset discrimination;
a target bit sequence extraction unit, configured to extract a target bit sequence from the object to be detected;
and the information matching unit is used for matching the target bit sequence with the reference bit sequence and determining whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
Another embodiment of the present application further provides an electronic device, including: a processor and a memory for storing a watermark information extraction program, which, when read and executed by the processor, performs the following operations: obtaining an object to be detected; obtaining a preset reference bit sequence containing a preamble sequence and original watermark information, wherein the preamble sequence satisfies the following condition: after the preamble sequence is shifted, the discrimination between the shifted sequence and the preamble sequence is greater than a preset discrimination; extracting a target bit sequence from the object to be detected; and matching the target bit sequence with the reference bit sequence to determine whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
Another embodiment of the present application further provides a watermark information embedding method, including: obtaining a carrier object; obtaining at least two pieces of watermark information to be embedded; adding different target preamble sequences to the at least two pieces of watermark information to be embedded to obtain at least two target embedding sequences; and embedding the at least two target embedding sequences into the carrier object; wherein each target preamble sequence satisfies the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination.
Optionally, the adding different target preamble sequences to the at least two pieces of watermark information to be embedded includes: adding, to each of the at least two pieces of watermark information to be embedded, the target preamble sequence corresponding to that piece of watermark information to be embedded.
Compared with the prior art, the method has the following advantages:
according to the watermark information embedding method, when watermark information is embedded into a carrier object, a target guide code sequence is simultaneously embedded, and the target guide code sequence meets the following conditions: after the target amble sequence is shifted, the discrimination between the shifted sequence and the target amble sequence is greater than a predetermined discrimination. Because the distinction degree of the target guide code sequence after the displacement and before the displacement is larger, when the watermark information is extracted by the watermark information extraction end, the guide code sequence can be accurately matched, the embedding interval of the watermark information can be accurately positioned, and the watermark information can be accurately positioned. By using the method, the watermark information which is completely consistent with the embedded watermark information can be extracted, and the problem that the uniqueness of the watermark information cannot be ensured because the embedded watermark information is not completely matched with the extracted watermark information in the prior art is solved.
Drawings
Fig. 1 is a flowchart of a watermark information embedding method provided in a first embodiment of the present application;
fig. 1-a is a schematic diagram of a grayscale histogram of a carrier image after embedding watermark information according to a first embodiment of the present application;
fig. 1-B is a schematic diagram of embedding the watermark information to be embedded and the target preamble sequence into a carrier object according to the first embodiment of the present application;
fig. 1-C is a schematic diagram illustrating adjustment of a pixel point corresponding to embedding watermark information according to a first embodiment of the present application;
fig. 2 is a flowchart of a watermark information extraction method provided in a second embodiment of the present application;
fig. 3 is a block diagram of a watermark information embedding apparatus according to a third embodiment of the present application;
fig. 4 is a schematic logical structure diagram of an electronic device according to a fourth embodiment of the present application;
fig. 5 is a block diagram of a unit of a watermark information extraction apparatus according to a fifth embodiment of the present application;
fig. 6 is a schematic logical structure diagram of an electronic device according to a sixth embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be implemented in many ways other than those described herein, and those skilled in the art can make similar extensions without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
For the scenario of embedding and extracting watermark information, in order to accurately locate the watermark information during extraction and to extract watermark information that is completely consistent with the embedded watermark information, the application provides a watermark information embedding method, together with a watermark information embedding apparatus and an electronic device corresponding to the method, and further provides a watermark information extraction method, together with a watermark information extraction apparatus and an electronic device corresponding to that method. The following embodiments describe the methods, apparatuses and electronic devices in detail.
A first embodiment of the present application provides a watermark information embedding method, which is described below with reference to fig. 1. The following description refers to embodiments for the purpose of illustrating the principles of the methods, and is not intended to be limiting in actual use.
As shown in fig. 1, in step S101, a carrier object is obtained.
Digital watermarking is a content-based, non-cryptographic computer information hiding technique: without affecting the use value of the carrier object, watermark information that is hard to detect or modify but can be identified and recognized by a designated party is embedded directly into the carrier object, or the structure of a specific region of the carrier object is modified. The watermark information hidden in the carrier object can be used to identify content creators and purchasers, to transmit secret information, or to determine whether the carrier object has been tampered with; it is an effective way to protect information security and to achieve anti-counterfeiting traceability and copyright protection. The carrier object includes multimedia information, documents, software and the like. In this embodiment, the carrier object is mainly a carrier image or a video frame sequence.
As shown in fig. 1, in step S102, watermark information to be embedded is obtained.
Watermark information is a kind of protection information embedded in a carrier object using a computer algorithm, mainly a sequence of numbers or bits, e.g. watermark information 1010110101 comprising ten bit values. The watermark information to be embedded refers to the preset watermark information for embedding in the carrier image or the video frame sequence.
As shown in fig. 1, in step S103, a target preamble sequence is obtained.
The target preamble sequence satisfies the following condition: after the target preamble sequence is shifted, the discrimination between the shifted sequence and the target preamble sequence is greater than a preset discrimination; that is, after the target preamble sequence is shifted forward or backward along its ordering direction, the resulting shifted sequence and the target preamble sequence differ strongly. A discrimination greater than the preset discrimination indicates that the target preamble sequence is strongly distinguishable before and after shifting; for example, for two bit sequences of the same length that are strongly distinguishable, when the two sequences are aligned bit by bit and compared, the number of positions with the same bit value is smaller than a predetermined threshold. In this embodiment the target preamble sequence is preferably a binary bit sequence. For example, shifting the bit sequence 1100110010 forward by one bit (shifting the whole sequence to the left) yields 100110010 followed by a random last bit of 0 or 1; aligning the shifted sequence with the unshifted sequence and comparing them shows a large discrimination between the two. Shifting forward by two bits yields 00110010 followed by two random bits, and the discrimination between the shifted and unshifted sequences is again large. Shifting backward by one bit (shifting the whole sequence to the right) makes the last nine bits 110011001, with the first bit random; if that first bit is 0, the shifted sequence is 0110011001, and its discrimination from the unshifted sequence is also large. The bit sequence 1100110010 can therefore serve as a target preamble sequence.
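As a rough illustration of this condition (a sketch with illustrative names, not part of the application), the following Python snippet counts how many bit positions of a candidate preamble sequence still agree with a shifted copy of itself; shifted-in positions are random and are excluded from the count, and a small count means a large discrimination:

    def shift_agreement(seq, shift):
        """Count positions where `seq` agrees with a copy of itself shifted by
        `shift` places (positive = forward/left shift, negative = backward/right).
        Positions filled by random shifted-in bits are ignored."""
        n = len(seq)
        if shift >= 0:
            pairs = zip(seq[shift:], seq[:n - shift])
        else:
            pairs = zip(seq[:n + shift], seq[-shift:])
        return sum(1 for a, b in pairs if a == b)

    for s in (1, 2, -1):
        print(s, shift_agreement("1100110010", s))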
In existing watermark information embedding methods, the embedding position of the watermark information must be searched for exhaustively at the watermark information extraction stage. For example, when embedding and extracting watermark information for a single image with a gray-histogram-based watermark algorithm, suppose the carrier image after embedding the watermark information is I_W and the carrier image after an attack is I'_W. When determining the watermark information embedding range, the gray mean computed from I'_W differs from the gray mean of I_W. A search interval is therefore set around the computed gray mean (in the same form as the target gray interval B defined below), the search interval is traversed, several candidate watermark sequences are extracted, and each is compared with the input original watermark information. However, in determining the embedding position, a large number of watermark sequences are highly similar before and after shifting, so even if the detected watermark information is a shifted copy, aligning it by bit position and matching it against the original watermark information can still yield high reliability; likewise, a completely wrong watermark sequence (not the original watermark information) matched against the many detected candidates can also be extracted from the carrier image with high reliability. This is the false positive problem: even when the input watermark information was never embedded, watermark information highly similar to the original can still be extracted. The above method therefore cannot guarantee the uniqueness of the watermark information.
For example, with two gray levels per gray subinterval, a watermark sequence containing 10 bit values, 1010110101, is embedded in the embedding range of the gray histogram, as shown in Fig. 1-A (a schematic diagram of the gray level histogram of the carrier image after the watermark information is embedded). In Fig. 1-A, the gray level with gray value 12 is the left endpoint (the starting embedding position of the watermark information) for embedding the original watermark information. When extracting the watermark information, this left endpoint must be searched for according to the mean value of the carrier image, the length of the watermark information and the size of the grouping, and the watermark information is then extracted in order starting from that endpoint. However, when the search reaches the gray level with gray value 15, a bit sequence opposite to the original watermark information may be extracted; and when it reaches the gray level with gray value 16, the bit sequence obtained by shifting the original watermark information to the left may be extracted, whose first nine bits are 010110101 and whose last bit is a random value 0 or 1. By comparison, both the bit sequence opposite to the original watermark information and the left-shifted bit sequence have a high matching degree with the original watermark information, and therefore both obtain high reliability. Because the original watermark information is very similar to its shifted copies, during the search for the left endpoint a shifted bit sequence matched against the original watermark information can also yield watermark information with high reliability, so the uniqueness of the watermark information cannot be guaranteed.
For the above reasons, in the embodiment of the present application, before the watermark information to be embedded is embedded into the carrier object, a target preamble sequence is obtained; the preamble sequence is used for locating the watermark information to be embedded. For example, the watermark information to be embedded is 1010110101 and the target preamble sequence is 1100110010.
As shown in fig. 1, in step S104, the watermark information to be embedded and the target preamble sequence are embedded into the carrier object.
After the watermark information to be embedded and the target preamble sequence are obtained in the above steps, this step embeds them into the carrier object, as shown in Fig. 1-B, to obtain a target object embedded with the watermark information to be embedded and the target preamble sequence. The process specifically includes: adding the target preamble sequence to the front end or the rear end of the watermark information to be embedded to obtain a target bit sequence; and embedding the target bit sequence into the carrier object. In this process the target preamble sequence can serve as a unique identification code of the watermark information to be embedded. For example, adding the target preamble sequence 1100110010 to the front end of the watermark information to be embedded 1010110101 gives the target bit sequence 11001100101010110101, which is embedded into the carrier object in a predetermined watermark information embedding manner.
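Under this convention, forming the target bit sequence is a simple concatenation (illustrative Python, using the example values above):

    preamble = "1100110010"              # target preamble sequence
    watermark = "1010110101"             # watermark information to be embedded
    target_bits = preamble + watermark   # preamble added to the front end
    assert target_bits == "11001100101010110101"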
In this embodiment, the carrier object may refer to a single frame of carrier image or a sequence of video frames.
When the carrier object is a carrier image, the process of embedding the target bit sequence into the carrier object may include the following steps:
and A, obtaining a gray level histogram of the carrier image.
In this embodiment, the process of obtaining the gray level histogram of the carrier image specifically includes:
First, the carrier image is Gaussian-filtered using the following formula to obtain the low-frequency signal part of the carrier image: I_Low(x, y) = G(x, y, σ) * I(x, y), where I denotes the carrier image, I_Low denotes the low-frequency signal part obtained by Gaussian filtering of the carrier image, and G denotes a Gaussian low-pass filter. After the carrier image is Gaussian-filtered, the watermark information has good robustness against filtering, noise addition and other attacks; that is, embedding the watermark information in the low-frequency signal part of the carrier image makes the watermark information more robust. For a two-dimensional carrier image, the Gaussian low-pass filter is defined as

    G(x, y, σ) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²)),

where σ is the standard deviation of the Gaussian distribution.
Second, the gray level histogram of the low-frequency signal part of the carrier image is obtained, i.e. the gray level histogram of I_Low is computed statistically. The gray level histogram serves as a statistical characteristic of the image: it depends only on the number of pixel points contained in each gray level of the image and is unrelated to the specific positions of those pixel points. The shape of the gray level histogram represents the proportion of the number of pixel points in each gray level to the total number of pixel points in the image. When the image is geometrically attacked, its size and spatial position change accordingly, but the proportion of the number of pixel points in each gray level to the total number of pixel points does not change; that is, the shape of the gray value histogram remains stable under geometric attacks, so it can be applied in a watermark algorithm that resists geometric attacks.
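A minimal sketch of this preprocessing using NumPy and SciPy (the value of σ and the 256-level gray range are assumptions, not values fixed by this embodiment):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def low_frequency_histogram(carrier_image, sigma=2.0):
        """Gaussian-filter the carrier image to keep its low-frequency part,
        then count how many pixel points fall into each gray level."""
        low = gaussian_filter(carrier_image.astype(np.float64), sigma=sigma)
        low = np.clip(np.round(low), 0, 255).astype(np.uint8)
        hist = np.bincount(low.ravel(), minlength=256)   # gray level histogram
        return low, hist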
And B, adjusting the shape of the gray level histogram according to the target bit sequence to obtain the adjusted gray level histogram.
In this embodiment, the process specifically includes the following steps:
and B-1, obtaining a target gray level interval of the gray level histogram, wherein the target gray level interval is used for expressing the embedding range of the target bit sequence. The process of obtaining the target gray level interval of the gray level histogram specifically comprises the following steps: calculating to obtain the gray average value of the carrier image
Figure RE-GDA0002388292730000122
Then, the target gray level interval B of the gray level histogram is obtained by adopting the following formula:
Figure RE-GDA0002388292730000123
wherein the positive decimal number λ satisfies the following condition: b cannot exceed the representation range of the gradation histogram, and B satisfies the number of bits of the target bit sequence.
B-2, dividing the target gray scale interval according to the bit number of the target bit sequence to obtain gray scale subintervals corresponding to the bit number, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for expressing the number of pixel points with the same gray value;
and B-3, obtaining the quantity relation information of the pixel points in the gray subintervals corresponding to the bit values. In this embodiment, the process of obtaining the number relation information of the pixel points in the gray scale subinterval corresponding to the bit value specifically includes: when the bit value is 1, obtaining a first comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in a gray sub-interval corresponding to the bit value and the preset embedding strength; and when the bit value is 0, obtaining a second comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value and the preset embedding strength. For example, the information of the number relationship between the pixel points in the gray scale subinterval corresponding to the bit value is:
Figure RE-GDA0002388292730000124
w (i) represents a bit value, a and b represent the number of pixels included in two adjacent gray levels in one gray subinterval, a represents the number of pixels included in a gray bin1 with a smaller gray value, b represents the number of pixels included in a gray bin2 with a larger gray value, and T represents a predetermined embedding strength.
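The following sketch puts steps B-1 to B-3 together (the interval formula follows the reconstruction above; two adjacent gray levels per subinterval matches the example of Fig. 1-A, and λ = 0.6 is only an illustrative value):

    import numpy as np

    def embedding_subintervals(low_freq_image, n_bits, lam=0.6):
        """Compute the gray mean A, the target gray interval
        B = [(1 - lam) * A, (1 + lam) * A], and split B into n_bits gray
        subintervals of two adjacent gray levels each."""
        mean = low_freq_image.mean()                      # gray mean of the image
        left = int(np.floor((1 - lam) * mean))
        right = int(np.ceil((1 + lam) * mean))
        assert right <= 255 and right - left >= 2 * n_bits, \
            "lam must keep B inside the histogram range and wide enough for n_bits"
        return [(left + 2 * i, left + 2 * i + 1) for i in range(n_bits)]

    def relation_holds(a, b, bit, T):
        """Quantity relation of step B-3: bit 1 requires a/b >= T,
        bit 0 requires b/a >= T (a: pixels in bin1, b: pixels in bin2)."""
        return a >= T * b if bit == 1 else b >= T * a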
And B-4, adjusting the number of pixel points corresponding to at least two adjacent gray levels of the gray subintervals according to the target bit sequence and the number relation information of the pixel points in the gray subintervals corresponding to the bit values to obtain an adjusted gray histogram.
In this embodiment, the pixel adjustment process specifically includes: when the bit value to be embedded of the target bit sequence is 1, if the ratio of the numbers of pixel points contained in the two adjacent gray levels of the corresponding gray subinterval and the preset embedding strength conform to the first comparison relation, the pixel points are not adjusted; if they do not conform to the first comparison relation, a first number of pixel points are selected from the pixel points contained in the gray level with the larger gray value and moved to the gray level with the smaller gray value;
when the bit value to be embedded of the target bit sequence is 0, if the ratio of the numbers of pixel points contained in the two adjacent gray levels of the corresponding gray subinterval and the preset embedding strength conform to the second comparison relation, the pixel points are not adjusted; if they do not conform to the second comparison relation, a second number of pixel points are selected from the pixel points contained in the gray level with the smaller gray value and moved to the gray level with the larger gray value.
For example, when the bit value to be embedded of the target bit sequence is 1: if a/b ≥ T, the pixel points need not be adjusted; if a/b < T, I1 pixel points are selected from the pixel points contained in the gray level bin2 and moved into the gray level bin1. When the bit value to be embedded of the target bit sequence is 0: if b/a ≥ T, the pixel points need not be adjusted; if b/a < T, I0 pixel points are selected from the pixel points contained in the gray level bin1 and moved into the gray level bin2.
Wherein the first number and the second number are obtained by calculation according to the embedding intensity and the number of pixel points included in two adjacent gray levels in the gray sub-region, for example:
Figure RE-GDA0002388292730000131
in this embodiment, the selecting a first number of pixels from the pixels included in the gray level with the larger gray value to move to the gray level with the smaller gray level may be: selecting a first number of pixel points from the pixel points contained in the gray level with a larger gray value in a random mode to move to the gray level with a smaller gray level; the selecting of a second number of pixel points from the pixel points included in the gray level with the smaller gray value to move to the gray level with the larger gray value may be: and selecting a second number of pixel points from the pixel points contained in the gray level with the lower gray value in a random mode to move to the gray level with the higher gray value. For example, selecting I from the pixels contained in the gray bin2 in a random manner1Moving each pixel point to the gray level bin1, and selecting I from the pixel points contained in the gray level bin1 in a random manner0The individual pixel points are shifted into gray level bin 2.
As shown in Fig. 1-C, which illustrates the specific adjustment for embedding one bit under the four relative magnitude relationships of a and b, where I1 and I0 are the minimum numbers of pixel points to be adjusted: (a) 1 < a/b < T; (b) a/b > T > 1; (c) 1 < b/a < T; (d) b/a > T > 1.
Selecting I1 pixel points from the pixel points contained in the gray level bin2 and moving them into the gray level bin1, or selecting I0 pixel points from the pixel points contained in the gray level bin1 and moving them into the gray level bin2, is specifically implemented as follows:

    f1'(i) = f1(i) + M,  1 ≤ i ≤ I0,
    f2'(j) = f2(j) − M,  1 ≤ j ≤ I1,

where M is the width of a gray level, f1(i) is the i-th pixel selected from bin1, and f2(j) is the j-th pixel selected from bin2.
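A sketch of steps B-3/B-4 and the pixel-modification rule above for one bit and one subinterval (the I1/I0 and ±M expressions follow the reconstructions above; the random pixel selection is the randomized variant described earlier):

    import numpy as np

    def embed_bit(image, bin1, bin2, bit, T, rng=None):
        """Embed one bit by enforcing a/b >= T (bit 1) or b/a >= T (bit 0),
        where a and b are the pixel counts of the adjacent gray levels
        bin1 < bin2; selected pixels are moved by +/-M (M = bin2 - bin1)."""
        rng = rng or np.random.default_rng()
        M = bin2 - bin1
        idx1 = np.argwhere(image == bin1)      # pixel points in bin1
        idx2 = np.argwhere(image == bin2)      # pixel points in bin2
        a, b = len(idx1), len(idx2)
        if bit == 1 and a < T * b:
            i1 = int(np.ceil((T * b - a) / (1 + T)))   # minimum pixels to move
            chosen = idx2[rng.choice(b, size=i1, replace=False)]
            image[tuple(chosen.T)] -= M                # bin2 -> bin1
        elif bit == 0 and b < T * a:
            i0 = int(np.ceil((T * a - b) / (1 + T)))
            chosen = idx1[rng.choice(a, size=i0, replace=False)]
            image[tuple(chosen.T)] += M                # bin1 -> bin2
        return image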
It should be noted that the gray level histogram of the low-frequency signal part of the carrier image may also be obtained as follows: the low-frequency signal part of the carrier image is divided into blocks, for example 8 × 8 blocks; the gray mean of each block image is calculated; the gray level histogram of each block image is obtained statistically according to its gray mean; and the mean square deviation of each block image is calculated, where the mean square deviation can represent the smoothness of the block image. Adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram may then mean: for a gray subinterval whose number of pixel points needs to be adjusted, a preset number of block images with larger mean square deviations are selected in descending order of the mean square deviations of the block images, and the shapes of the gray level histograms of those block images are adjusted according to the target bit sequence to obtain the adjusted gray level histograms of the block images. This allows the non-smooth regions of the carrier image to be selected in advance for modification, which improves the quality of the carrier image.
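A sketch of this block-wise variant (the 8 × 8 block size is taken from the example above; the number of blocks kept is an assumption):

    import numpy as np

    def select_textured_blocks(low_freq_image, block=8, keep=64):
        """Split the low-frequency image into block x block tiles and return the
        positions of the `keep` tiles with the largest mean square deviation
        (the least smooth ones), whose histograms will then be adjusted."""
        h, w = low_freq_image.shape
        scored = []
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                tile = low_freq_image[y:y + block, x:x + block]
                scored.append(((y, x), float(tile.var())))   # variance of the tile
        scored.sort(key=lambda item: item[1], reverse=True)
        return [pos for pos, _ in scored[:keep]]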
And C, adjusting the gray value of a pixel point in the carrier image according to the adjusted gray histogram to obtain a target image embedded with the target bit sequence.
When the carrier object is a sequence of video frames, the process of embedding the target bit sequence into the carrier object may include the following:
a1, obtaining a first sub-bit sequence for representing a bit value "0" in the target bit sequence, and obtaining a second sub-bit sequence for representing a bit value "1" in the target bit sequence, the first sub-bit sequence being distinct from the second sub-bit sequence.
In this embodiment, the first sub-bit sequence and the second sub-bit sequence may themselves be set as target preamble sequences; the purpose is that, when detecting a video image of the video frame sequence into which the first sub-bit sequence or the second sub-bit sequence has been embedded, the embedded first sub-bit sequence or second sub-bit sequence can be recovered through the correlation characteristic of a target preamble sequence. For example, the first sub-bit sequence may satisfy the following condition: after the first sub-bit sequence is shifted forward or backward along its ordering direction, the discrimination between the shifted sequence and the first sub-bit sequence is greater than a preset discrimination; and the second sub-bit sequence may satisfy the corresponding condition: after the second sub-bit sequence is shifted forward or backward along its ordering direction, the discrimination between the shifted sequence and the second sub-bit sequence is greater than the preset discrimination.
In this embodiment, the first sub-bit sequence and the second sub-bit sequence may be further configured as follows:
The first sub-bit sequence includes a first sub-amble sequence and first sub-watermark information, and the first sub-amble sequence may satisfy the following condition: after the first sub-amble sequence is shifted forward or backward along its sorting direction, the discrimination between the obtained shifted sequence and the first sub-amble sequence is greater than the preset discrimination. The second sub-bit sequence includes a second sub-amble sequence and second sub-watermark information, and the second sub-amble sequence may satisfy the following condition: after the second sub-amble sequence is shifted forward or backward along its sorting direction, the discrimination between the obtained shifted sequence and the second sub-amble sequence is greater than the preset discrimination. The first sub-watermark information is different from the second sub-watermark information. The first sub-amble sequence and the second sub-amble sequence may be the same bit sequence, and both may be the same bit sequence as the target amble sequence; for example, bit value "0" may be represented by 11001100101100110010 and bit value "1" by 11001100100011001101. With the first sub-bit sequence and the second sub-bit sequence set in this way, embedding them into a video image of the video frame sequence follows the same procedure as embedding the watermark information to be embedded together with the target guide code sequence into a carrier image, so that, for a single video frame carrying the first or second sub-bit sequence, the watermark information embedded in each carrier object (the video image and the video frame sequence) can be located according to a uniform guide code sequence.
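The shift condition above can be checked numerically. The sketch below interprets the degree of discrimination as the Hamming distance between a bit sequence and its cyclic shifts, which is one possible reading rather than the patent's own definition, and the threshold value is likewise an assumption. It is applied, purely for illustration, to the 10-bit prefix shared by the two example sequences quoted above.

```python
def shift_discrimination(bits, min_distance=4):
    """Sketch: smallest Hamming distance between `bits` and any non-zero cyclic
    shift of itself, and whether it reaches `min_distance`.

    Cyclic shifting, Hamming distance, and the threshold are assumptions.
    """
    n = len(bits)
    worst = n
    for shift in range(1, n):
        shifted = bits[shift:] + bits[:shift]
        distance = sum(a != b for a, b in zip(bits, shifted))
        worst = min(worst, distance)
    return worst, worst >= min_distance

# Illustrative check on the 10-bit prefix shared by the two example sequences.
print(shift_discrimination("1100110010"))
```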
B1, obtaining a target video image in the video frame sequence, where the target video image refers to a video image into which the target bit sequence is to be embedded. In this embodiment, one bit value is embedded in one frame of video image of the video frame sequence, so the target video images to be embedded with the target bit sequence are obtained from the video frame sequence according to the number of bit values contained in the target bit sequence. For example, if the target bit sequence is 11001100101010110101, the 20 bit values contained in the target bit sequence need to be embedded into 20 frames of video images of the video frame sequence respectively, and these 20 frames of video images are the target video images for embedding the target bit sequence.
C1, obtaining the target bit value to be embedded into the target video image. For example, if the target video image currently to be embedded is the 9th frame among the 20 target video images, the target bit value to be embedded is 1.
D1, embedding the first sub-bit sequence or the second sub-bit sequence representing the target bit value into the target video image. For example, the bit value "1" corresponds to the second sub-bit sequence, so the second sub-bit sequence is embedded into the 9th frame image. In this embodiment, the embedding process includes the following steps: obtaining a gray level histogram of the target video image; adjusting the shape of the gray level histogram of the target video image according to the first sub-bit sequence or the second sub-bit sequence representing the target bit value to obtain an adjusted gray level histogram; and adjusting the gray values of pixel points in the target video image according to the adjusted gray level histogram to obtain the target video image embedded with the first sub-bit sequence or the second sub-bit sequence. For details of the embedding process, refer to the process of embedding the target bit sequence into the carrier image described above, which is not repeated here.
In this embodiment, in order to resist a frame rate conversion attack (for example, the video frame sequence is converted from 50 frames to 25 frames, so that watermark information embedded in the video frame sequence is lost along with the dropped video images), the embedding may be repeated multiple times along the time axis during the process of embedding the target bit sequence into the video frame sequence; that is, the target bit sequence is repeatedly embedded into the video frame sequence according to a predetermined embedding number. For example, after the target bit sequence 11001100101010110101 has been embedded into one group of video images of the video frame sequence, the embedding process described above is repeated several times on the subsequent video images of the sequence.
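A minimal sketch of this time-axis repetition is given below: each bit of the target bit sequence is assigned to one frame, and the whole sequence is repeated a fixed number of times. The function name, the `repeats` parameter, and the simple consecutive-frame layout are assumptions for illustration only.

```python
def assign_bits_to_frames(target_bits, num_frames, repeats=3):
    """Sketch: map each bit of the target bit sequence onto one video frame and
    repeat the whole sequence `repeats` times to resist frame rate conversion.
    """
    schedule = []   # list of (frame_index, bit) pairs
    frame = 0
    for _ in range(repeats):
        for bit in target_bits:
            if frame >= num_frames:
                return schedule
            schedule.append((frame, bit))
            frame += 1
    return schedule

# The 20-bit example sequence embedded three times into a 100-frame clip.
plan = assign_bits_to_frames("11001100101010110101", num_frames=100, repeats=3)
```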
The table below shows watermark information detection results: the numerical values are obtained by matching the reverse watermark sequence of the input original watermark information against the extracted watermark information during watermark information detection. Compared with a watermark information embedding method that does not use the target guide code sequence, the watermark information embedding method provided in this embodiment, which uses the target guide code sequence, yields a significantly lower false detection rate.
[Table of watermark information detection results — provided as image RE-GDA0002388292730000171 in the original publication and not reproduced here.]
Therefore, the watermark information embedding method adopting the target guide code sequence has a low false detection rate in the watermark information detection process and can be used to judge whether the input watermark information is the original watermark information embedded in the carrier object.
In this embodiment, the target amble sequence may correspond to a plurality of pieces of watermark information; that is, when a plurality of pieces of watermark information are embedded in the same carrier object or embedded in a plurality of carrier objects respectively, the same target amble sequence may be added to the front end or the back end of each piece of watermark information in order to locate it. For example, multiple pieces of watermark information of the same category may correspond to one target amble sequence, or a target amble sequence may be matched to the usage requirements and applicable scenarios of the watermark information (for example, the encryption level that the watermark information provides).
In this embodiment, the above method may also be used to embed multiple pieces of watermark information into one carrier object at the same time. The process specifically includes the following steps: obtaining a carrier object, which may be a carrier image or a video frame sequence; obtaining at least two pieces of watermark information to be embedded, which may be of different categories or of the same category; adding different target amble sequences to the at least two pieces of watermark information to be embedded to obtain at least two target embedding sequences, for example, adding to each piece of watermark information a predetermined target amble sequence corresponding to its category, where, after a target amble sequence is shifted, the discrimination between the obtained shifted sequence and the target amble sequence is greater than a predetermined discrimination; and embedding the at least two target embedding sequences into the carrier object. Adding different target amble sequences to the at least two pieces of watermark information to be embedded may specifically be: for each piece of watermark information to be embedded among the at least two pieces, adding the predetermined target amble sequence corresponding to that piece of watermark information. In this way, multiple pieces of watermark information each carrying positioning information (a target guide code sequence) can be embedded in one carrier object, and correspondingly, the multiple pieces of watermark information can be accurately extracted in the watermark extraction process, achieving an effect of multiple encryption.
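A small sketch of how such target embedding sequences might be assembled is shown below; the mapping from a watermark category to its guide code sequence, and all names and bit strings used here, are hypothetical.

```python
def build_target_embedding_sequences(watermarks, amble_by_category):
    """Sketch: prepend the category-specific guide code (amble) sequence to each
    piece of watermark information to form the target embedding sequences.

    `watermarks` is a list of (category, watermark_bits) pairs and
    `amble_by_category` maps a category to its guide code bit string; both are
    illustrative assumptions.
    """
    sequences = []
    for category, wm_bits in watermarks:
        amble = amble_by_category[category]
        sequences.append(amble + wm_bits)   # amble added at the front end
    return sequences

# Example: two watermarks of different categories embedded in one carrier object.
seqs = build_target_embedding_sequences(
    [("tracking", "0110100110"), ("copyright", "1010010011")],
    {"tracking": "1100110010", "copyright": "0011010110"},
)
```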
In the watermark information embedding method provided in this embodiment, when watermark information is embedded into a carrier object, a target amble sequence is embedded at the same time, where the target amble sequence satisfies the following condition: after the target amble sequence is shifted, the discrimination between the shifted sequence and the target amble sequence is greater than a predetermined discrimination. Because the target guide code sequence after shifting differs substantially from the sequence before shifting, the watermark information extraction end can accurately match the guide code sequence when extracting the watermark information, accurately locate the embedding interval of the watermark information, and thus accurately locate the watermark information. With this method, watermark information that is completely consistent with the embedded watermark information can be extracted, which solves the prior-art problem that the uniqueness of the watermark information cannot be ensured because the embedded watermark information does not completely match the extracted watermark information.
In correspondence with the watermark embedding method provided in the first embodiment, a second embodiment of the present application provides a watermark information extraction method, which is described below with reference to fig. 2.
As shown in fig. 2, in step S201, an object to be detected is obtained. The object to be detected can be an image to be detected or a video frame sequence to be detected.
As shown in fig. 2, in step S202, a reference bit sequence containing an amble sequence and original watermark information is obtained, where the amble sequence satisfies the following condition: after the pilot sequence is shifted, the discrimination between the shifted sequence and the pilot sequence is greater than a preset discrimination. The amble sequence may be set at a front end or a back end of the original watermark information.
The extraction of the watermark sequence embedded in the object to be detected specifically includes the following: obtaining a gray level histogram of the image to be detected; calculating the gray level mean value of the image to be detected; calculating a target gray level interval of the gray level histogram according to the gray level mean value; dividing the target gray level interval according to the number of bits of the preset embedded watermark sequence to obtain gray scale subintervals corresponding to that number of bits, where each gray scale subinterval includes at least two adjacent gray levels; obtaining an extraction strategy corresponding to a bit value; and extracting the watermark sequence of the image to be detected according to the numbers of pixel points in the at least two adjacent gray levels of each gray scale subinterval and the extraction strategy corresponding to the bit value, to obtain the watermark sequence embedded in the image to be detected.
As shown in fig. 2, in step S203, a target bit sequence is extracted from the object to be detected.
In this embodiment, the manner of extracting the target bit sequence depends on the preset arrangement of the original watermark information and the guide code sequence. When the guide code sequence is set at the front end of the original watermark information, the target bit sequence is extracted in a forward manner: the starting position of the embedding range of the target bit sequence is taken as the initial detection point, and the target bit sequence is extracted from the object to be detected in order. When the guide code sequence is set at the back end of the original watermark information, the target bit sequence is extracted in a reverse manner: the end position of the embedding range of the target bit sequence is taken as the initial detection point, and the target bit sequence is extracted from the object to be detected in order.
In this embodiment, when the object to be detected is an image to be detected, the extracting of the target bit sequence from the object to be detected specifically includes the following steps:
firstly, a gray level histogram of an image to be detected is obtained.
Secondly, the target bit sequence is extracted from the object to be detected based on the gray level histogram. For example: the gray level mean value of the image to be detected is calculated; a target gray level interval of the gray level histogram is calculated according to the gray level mean value; the target gray level interval is divided according to the number of bit values of the watermark sequence embedded into the image to be detected, to obtain gray scale subintervals corresponding to that number of bit values, where each gray scale subinterval includes at least two adjacent gray levels and a gray level represents the number of pixel points having the same gray value; predetermined bit value extraction data is obtained; and the bit values embedded into the image to be detected are extracted according to the numbers of pixel points in the at least two adjacent gray levels of each gray scale subinterval and the bit value extraction data, so as to obtain the target bit sequence. This process is the reverse of the process of embedding watermark information into a carrier image in the first embodiment of the present application, and its implementation is similar to existing watermark information extraction processes, which may be consulted for specific details; it is not described again here.
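The following sketch mirrors the embedding sketch given under the first embodiment: each bit is read back from a pair of adjacent gray levels by comparing their pixel counts against an assumed embedding strength. The subinterval layout around the gray mean and the decision rule are illustrative assumptions standing in for the "bit value extraction data" of the text; with the same assumed parameters, and provided the gray level mean is essentially unchanged by the embedding, it is intended to invert the embedding sketch above.

```python
import numpy as np

def extract_bits_from_image(image, num_bits, interval_width=2, strength=2.0):
    """Sketch of histogram-based extraction: read one bit from each pair of
    adjacent gray levels by comparing their pixel counts.
    """
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    mean = int(image.mean())
    start = max(0, mean - num_bits * interval_width // 2)

    bits = []
    for i in range(num_bits):
        lo = start + i * interval_width   # lower gray level of the subinterval
        hi = lo + 1                       # adjacent higher gray level
        # Decision rule (an assumption): the lower level dominating by the
        # embedding strength is read as "1", otherwise as "0".
        bits.append("1" if hist[lo] >= strength * hist[hi] else "0")
    return "".join(bits)
```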
When the object to be detected is a video frame sequence, the extracting of the target bit sequence from the object to be detected specifically includes the following contents:
first, a first reference sub-bit sequence for representing a bit value "0" in the target bit sequence is obtained, and a second reference sub-bit sequence for representing a bit value "1" in the target bit sequence is obtained, the first reference sub-bit sequence being distinct from the second reference sub-bit sequence. For the related contents of the first reference sub-bit sequence and the second reference sub-bit sequence, please refer to the first sub-bit sequence and the second sub-bit sequence in the first embodiment, which is not described herein again.
Next, a plurality of target sub-bit sequences are extracted from the video images of the video frame sequence. Each video image in which original watermark information or a guide code sequence is embedded corresponds to one target sub-bit sequence. For example, watermark information is extracted from each video frame of the video frame sequence to be detected, the information extracted from the frames is tallied using a maximum election (majority voting) method, and finally a number of target sub-bit sequences equal to the number of bit values in the target bit sequence is obtained.
Finally, each target sub-bit sequence is compared with the first reference sub-bit sequence and the second reference sub-bit sequence respectively to determine whether the bit value embedded in the corresponding video image of the video frame sequence is "0" or "1"; this continues until all bit values embedded in the video frame sequence have been extracted, and the extracted bit values form the target bit sequence.
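One way to perform this comparison is sketched below: each extracted sub-bit sequence is assigned the bit value of whichever reference sub-bit sequence it is closer to. The use of Hamming distance as the closeness measure, and the example data, are assumptions for illustration.

```python
def decode_frame_bits(extracted_sub_sequences, ref_zero, ref_one):
    """Sketch: decide the bit carried by each frame by comparing its extracted
    sub-bit sequence with the two reference sub-bit sequences.
    """
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    bits = []
    for sub in extracted_sub_sequences:
        bits.append("0" if hamming(sub, ref_zero) <= hamming(sub, ref_one) else "1")
    return "".join(bits)

# Example with three slightly noisy per-frame extractions.
extracted = [
    "11001100101100110010",   # matches the "0" reference exactly
    "11001100100011001100",   # one bit away from the "1" reference
    "11001100101100110011",   # one bit away from the "0" reference
]
print(decode_frame_bits(extracted,
                        ref_zero="11001100101100110010",
                        ref_one="11001100100011001101"))   # prints "010"
```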
It should be noted that the order in which step S202 and step S203 are performed is not limited; that is, the reference bit sequence including the amble sequence and the original watermark information may also be obtained after the target bit sequence has been extracted from the object to be detected.
As shown in fig. 2, in step S204, the target bit sequence is matched with the reference bit sequence, and it is determined whether the target bit sequence is the reference bit sequence embedded in the object to be detected. For example, depending on whether the amble sequence in the reference bit sequence is set at the front end or the back end of the original watermark information, the target bit sequence is matched with the reference bit sequence by forward matching or by reverse matching (consistent with the forward or reverse extraction). Because the discrimination between the amble sequence and any sequence obtained by shifting it forward or backward along its sorting direction is greater than the preset discrimination, precise matching between the reference bit sequence and the extracted target bit sequence can be achieved.
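A forward-matching sketch is given below: the amble is located inside the extracted target bit sequence and the bits that follow it are compared with the original watermark information. The tolerance parameter and the sliding-window search are illustrative assumptions; reverse matching would scan from the end of the sequence instead.

```python
def match_reference(target_bits, amble, original_wm, max_errors=0):
    """Sketch of forward matching: locate the amble in the extracted target bit
    sequence, then compare the following bits with the original watermark.
    """
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    total = len(amble) + len(original_wm)
    for offset in range(len(target_bits) - total + 1):
        window = target_bits[offset:offset + len(amble)]
        if hamming(window, amble) <= max_errors:        # amble located here
            extracted_wm = target_bits[offset + len(amble):offset + total]
            return extracted_wm == original_wm, offset
    return False, -1
```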
The watermark information extraction method provided by this embodiment can realize accurate matching of the guide code sequence and can accurately position the embedding interval of the watermark information when extracting the watermark information, thereby realizing accurate positioning of the watermark information. By using the method, the watermark information which is completely consistent with the embedded watermark information can be extracted, and the problem that the uniqueness of the watermark information cannot be ensured because the embedded watermark information is not completely matched with the extracted watermark information in the prior art is avoided.
A third embodiment of the present application also provides a watermark information embedding apparatus. Since the apparatus embodiment is basically similar to the method embodiment, the description is relatively simple; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The following description of the apparatus embodiment is only illustrative.
Fig. 3 is a block diagram of the units of the apparatus provided in this embodiment. As shown in fig. 3, the apparatus provided in this embodiment includes:
a carrier object obtaining unit 301 for obtaining a carrier object;
a to-be-embedded watermark information obtaining unit 302 configured to obtain to-be-embedded watermark information;
a target amble sequence obtaining unit 303, configured to obtain a target amble sequence, where the target amble sequence satisfies the following condition: after the pilot sequence is subjected to shift processing, the obtained discrimination between the shifted sequence and the pilot sequence is greater than a preset discrimination;
an information embedding unit 304, configured to embed the watermark information to be embedded and the target amble sequence into the carrier object.
The target guide code sequence is a binary bit sequence, and the embedding the watermark information to be embedded and the target guide code sequence into the carrier object includes: adding the target guide code sequence to the front end or the rear end of the watermark information to be embedded to obtain a target bit sequence; embedding the target bit sequence into the carrier object.
The carrier object comprises a carrier image, and the embedding the target bit sequence into the carrier object comprises: obtaining a gray level histogram of the carrier image; according to the target bit sequence, adjusting the shape of the gray level histogram to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the carrier image according to the adjusted gray level histogram to obtain a target image embedded with the target bit sequence.
The adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: obtaining a target gray level interval of the gray level histogram; dividing the target gray scale interval according to the bit number of the target bit sequence to obtain gray scale subintervals corresponding to the bit number, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value; obtaining quantity relation information of pixel points in a gray scale subinterval corresponding to the bit value; and adjusting the number of pixel points contained in at least two adjacent gray levels of the gray subintervals according to the target bit sequence and the number relation information of the pixel points in the gray subintervals corresponding to the bit values to obtain an adjusted gray histogram.
The obtaining of the target gray level interval of the gray level histogram includes: calculating a gray level mean value of the carrier image; and calculating a target gray level interval of the gray level histogram according to the gray level mean value based on the representation range of the gray level histogram and the bit number of the target bit sequence.
The obtaining of the quantity relation information of the pixel points in the gray scale subinterval corresponding to the bit value includes: when the bit value is 1, obtaining a first comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in a gray sub-interval corresponding to the bit value and the preset embedding strength; when the bit value is 0, obtaining a second comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value and the preset embedding strength;
the adjusting the number of the pixel points included in at least two adjacent gray levels of the gray sub-interval according to the target bit sequence and the information of the number relationship of the pixel points in the gray sub-interval corresponding to the bit value to obtain an adjusted gray histogram includes:
the bit value to be embedded of the target bit sequence is 1, and if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength accords with the first comparison relation, the pixel points are not adjusted; if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength does not accord with the first comparison relation, selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving the pixel points to the gray level with the smaller gray value;
the bit value to be embedded of the target bit sequence is 0, and if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength accords with the second comparison relation, the pixel points are not adjusted; if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength does not accord with the second comparison relation, selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value to move to the gray level with the larger gray value;
and the first quantity and the second quantity are obtained by calculation according to the preset embedding intensity and the quantity of pixel points contained in two adjacent gray levels in the gray subinterval.
The selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value to move to the gray level with the smaller gray value comprises the following steps: selecting, in a random manner, a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value;
the selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value to move to the gray level with the larger gray value comprises the following steps: selecting, in a random manner, a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value.
Before obtaining the gray level histogram of the carrier image, the method further comprises the following steps: carrying out Gaussian filtering processing on the carrier image to obtain a low-frequency signal part of the carrier image; the obtaining of the gray level histogram of the carrier image includes: a grey level histogram of a low frequency signal portion of the carrier image is obtained.
The obtaining of the gray level histogram of the low-frequency signal part of the carrier image includes: carrying out blocking processing on a low-frequency signal part of the carrier image to obtain a blocking image; calculating the gray average value of the block image; and counting to obtain a gray level histogram of the block image according to the gray level mean value of the block image.
Further comprising: calculating to obtain the mean square error of the block image; the adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: selecting gray level histograms of a preset number of block images according to the sequence of the mean square deviations of the block images from large to small; and adjusting the shape of the gray level histogram of the block images of the preset number according to the target bit sequence to obtain the adjusted gray level histogram of the block images.
The carrier object comprising a sequence of video frames, said embedding the target bit sequence into the carrier object comprising: obtaining a first sub-bit sequence representing a bit value "0" in the target bit sequence and obtaining a second sub-bit sequence representing a bit value "1" in the target bit sequence, the first sub-bit sequence being distinct from the second sub-bit sequence; obtaining a target video image in the video frame sequence, wherein the target video image refers to a video image to be embedded into the target bit sequence; obtaining a target bit value to be embedded into the target video image; embedding the first or second sub-bit sequence representing the target bit value into the target video image.
The first sub-bit sequence satisfies the following condition: after the first sub-bit sequence is subjected to shift processing, the discrimination between the obtained shifted bit sequence and the first sub-bit sequence is greater than a preset discrimination; and the second sub-bit sequence satisfies the following condition: after the second sub-bit sequence is shifted, the discrimination between the obtained shifted bit sequence and the second sub-bit sequence is greater than a preset discrimination.
The first sub-bit sequence comprises a first sub-amble sequence and first sub-watermark information, and the first sub-amble sequence satisfies the following condition: after the first sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the first sub-amble sequence is greater than a preset discrimination; and the second sub-bit sequence comprises a second sub-amble sequence and second sub-watermark information, and the second sub-amble sequence satisfies the following condition: after the second sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the second sub-amble sequence is greater than a preset discrimination; wherein the first sub-watermark information is different from the second sub-watermark information.
The first sub-amble sequence is identical to the second sub-amble sequence.
The first sub-amble sequence and the target amble sequence are identical bit sequences, and the second sub-amble sequence and the target amble sequence are identical bit sequences.
The obtaining a target video image in the sequence of video frames comprises: and obtaining a target video image to be embedded into the target bit sequence from the video frame sequence according to the number of bit values contained in the target bit sequence in a mode of embedding a bit value in one frame of video image of the video frame sequence.
The embedding the first sub-bit sequence or the second sub-bit sequence for representing the target bit value into the target video image comprises: obtaining a gray level histogram of the target video image; adjusting the shape of a gray level histogram of the target video image according to the first sub-bit sequence or the second sub-bit sequence for representing the target bit value to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the target video image according to the adjusted gray level histogram to obtain the target video image embedded with the first sub-bit sequence or the second sub-bit sequence.
The carrier object comprising a sequence of video frames, said embedding the target bit sequence into the carrier object comprising: repeatedly embedding the target bit sequence into the video frame sequence according to a predetermined embedding number.
The embodiments described above provide a watermark information embedding method and a watermark information embedding apparatus. In addition, a fourth embodiment of the present application provides an electronic device. Since this embodiment is basically similar to the method embodiment, the description is relatively simple; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The following description of the electronic device embodiment is only illustrative. The electronic device embodiment is as follows:
please refer to fig. 4 for understanding the present embodiment, fig. 4 is a schematic view of an electronic device provided in the present embodiment.
As shown in fig. 4, the electronic apparatus includes: a processor 401; a memory 402;
the memory 402 is used for storing a watermark information embedding program, and when the program is read and executed by the processor, the program performs the following operations:
obtaining a carrier object;
acquiring watermark information to be embedded;
obtaining a target amble sequence, the target amble sequence satisfying the following conditions: after the pilot sequence is shifted, the discrimination degree between the shifted sequence and the pilot sequence is greater than the preset discrimination degree;
and embedding the watermark information to be embedded and the target guide code sequence into the carrier object.
The target guide code sequence is a binary bit sequence, and the embedding the watermark information to be embedded and the target guide code sequence into the carrier object includes: adding the target guide code sequence to the front end or the rear end of the watermark information to be embedded to obtain a target bit sequence; embedding the target bit sequence into the carrier object.
The carrier object comprises a carrier image, and the embedding the target bit sequence into the carrier object comprises: obtaining a gray level histogram of the carrier image; according to the target bit sequence, adjusting the shape of the gray level histogram to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the carrier image according to the adjusted gray level histogram to obtain a target image embedded with the target bit sequence.
The adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: obtaining a target gray level interval of the gray level histogram; dividing the target gray scale interval according to the bit number of the target bit sequence to obtain gray scale subintervals corresponding to the bit number, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value; obtaining quantity relation information of pixel points in a gray scale subinterval corresponding to the bit value; and adjusting the number of pixel points contained in at least two adjacent gray levels of the gray subintervals according to the target bit sequence and the number relation information of the pixel points in the gray subintervals corresponding to the bit values to obtain an adjusted gray histogram.
The obtaining of the target gray level interval of the gray level histogram includes: calculating a gray level mean value of the carrier image; and calculating a target gray level interval of the gray level histogram according to the gray level mean value based on the representation range of the gray level histogram and the bit number of the target bit sequence.
The obtaining of the quantity relation information of the pixel points in the gray scale subinterval corresponding to the bit value includes: when the bit value is 1, obtaining a first comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in a gray sub-interval corresponding to the bit value and the preset embedding strength; when the bit value is 0, obtaining a second comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value and the preset embedding strength;
the adjusting the number of the pixel points included in at least two adjacent gray levels of the gray sub-interval according to the target bit sequence and the information of the number relationship of the pixel points in the gray sub-interval corresponding to the bit value to obtain an adjusted gray histogram includes:
the bit value to be embedded of the target bit sequence is 1, and if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength accords with the first comparison relation, the pixel points are not adjusted; if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength does not accord with the first comparison relation, selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving the pixel points to the gray level with the smaller gray value;
the bit value to be embedded of the target bit sequence is 0, and if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength accords with the second comparison relation, the pixel points are not adjusted; if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength does not accord with the second comparison relation, selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value to move to the gray level with the larger gray value;
and the first quantity and the second quantity are obtained by calculation according to the preset embedding intensity and the quantity of pixel points contained in two adjacent gray levels in the gray subinterval.
The selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value to move to the gray level with the smaller gray value comprises the following steps: selecting, in a random manner, a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value;
the selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value to move to the gray level with the larger gray value comprises the following steps: selecting, in a random manner, a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value.
Before obtaining the gray level histogram of the carrier image, the method further comprises the following steps: carrying out Gaussian filtering processing on the carrier image to obtain a low-frequency signal part of the carrier image; the obtaining of the gray level histogram of the carrier image includes: a grey level histogram of a low frequency signal portion of the carrier image is obtained.
The obtaining of the gray level histogram of the low-frequency signal part of the carrier image includes: carrying out blocking processing on a low-frequency signal part of the carrier image to obtain a blocking image; calculating the gray average value of the block image; and counting to obtain a gray level histogram of the block image according to the gray level mean value of the block image.
Further comprising: calculating to obtain the mean square error of the block image; the adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes: selecting gray level histograms of a preset number of block images according to the sequence of the mean square deviations of the block images from large to small; and adjusting the shape of the gray level histogram of the block images of the preset number according to the target bit sequence to obtain the adjusted gray level histogram of the block images.
The carrier object comprising a sequence of video frames, said embedding the target bit sequence into the carrier object comprising: obtaining a first sub-bit sequence representing a bit value "0" in the target bit sequence and obtaining a second sub-bit sequence representing a bit value "1" in the target bit sequence, the first sub-bit sequence being distinct from the second sub-bit sequence; obtaining a target video image in the video frame sequence, wherein the target video image refers to a video image to be embedded into the target bit sequence; obtaining a target bit value to be embedded into the target video image; embedding the first or second sub-bit sequence representing the target bit value into the target video image.
The first sub-bit sequence satisfies the following condition: after the first sub-bit sequence is subjected to shift processing, the discrimination between the obtained shifted bit sequence and the first sub-bit sequence is greater than a preset discrimination; and the second sub-bit sequence satisfies the following condition: after the second sub-bit sequence is shifted, the discrimination between the obtained shifted bit sequence and the second sub-bit sequence is greater than a preset discrimination.
The first sub-bit sequence comprises a first sub-amble sequence and first sub-watermark information, and the first sub-amble sequence satisfies the following condition: after the first sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the first sub-amble sequence is greater than a preset discrimination; and the second sub-bit sequence comprises a second sub-amble sequence and second sub-watermark information, and the second sub-amble sequence satisfies the following condition: after the second sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the second sub-amble sequence is greater than a preset discrimination; wherein the first sub-watermark information is different from the second sub-watermark information.
The first sub-amble sequence is identical to the second sub-amble sequence.
The first sub-amble sequence and the target amble sequence are identical bit sequences, and the second sub-amble sequence and the target amble sequence are identical bit sequences.
The obtaining a target video image in the sequence of video frames comprises: and obtaining a target video image to be embedded into the target bit sequence from the video frame sequence according to the number of bit values contained in the target bit sequence in a mode of embedding a bit value in one frame of video image of the video frame sequence.
The embedding the first sub-bit sequence or the second sub-bit sequence for representing the target bit value into the target video image comprises: obtaining a gray level histogram of the target video image; adjusting the shape of a gray level histogram of the target video image according to the first sub-bit sequence or the second sub-bit sequence for representing the target bit value to obtain an adjusted gray level histogram; and adjusting the gray value of a pixel point in the target video image according to the adjusted gray level histogram to obtain the target video image embedded with the first sub-bit sequence or the second sub-bit sequence.
The carrier object comprising a sequence of video frames, said embedding the target bit sequence into the carrier object comprising: repeatedly embedding the target bit sequence into the video frame sequence according to a predetermined embedding number.
The foregoing second embodiment provides a watermark information extraction method; correspondingly, a fifth embodiment of the present application provides a watermark information extraction apparatus. Since the apparatus embodiment is basically similar to the method embodiment, the description is relatively simple; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The following description of the apparatus embodiment is only illustrative.
Fig. 5 is a block diagram of the units of the apparatus provided in this embodiment. As shown in fig. 5, the apparatus provided in this embodiment includes:
an object to be detected obtaining unit 501, configured to obtain an object to be detected;
a reference bit sequence obtaining unit 502, configured to obtain a preset reference bit sequence including a leader sequence and original watermark information, where the leader sequence satisfies the following conditions: after the pilot sequence is subjected to shift processing, the discrimination between the shifted sequence and the pilot sequence is greater than the preset discrimination;
a target bit sequence extracting unit 503, configured to extract a target bit sequence from the object to be detected;
an information matching unit 504, configured to match the target bit sequence with the reference bit sequence, and determine whether the target bit sequence is a reference bit sequence embedded in the object to be detected.
The method for extracting the target bit sequence from the object to be detected includes: and taking the initial position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
The method for extracting the target bit sequence from the object to be detected includes: and taking the tail position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
The object to be detected comprises an image to be detected, and the step of extracting a target watermark sequence from the object to be detected comprises the following steps: obtaining a gray level histogram of the image to be detected; and extracting a target watermark sequence from the object to be detected based on the gray level histogram.
The object to be detected comprises a video frame sequence, and the extracting of the target watermark sequence from the object to be detected comprises: obtaining a first reference sub-bit sequence for representing a bit value "0" in the target bit sequence and obtaining a second reference sub-bit sequence for representing a bit value "1" in the target bit sequence, the first reference sub-bit sequence being distinct from the second reference sub-bit sequence; extracting a plurality of target sub-bit sequences from video images of the sequence of video frames; and comparing the target sub-bit sequence with the first reference sub-bit sequence and the second reference sub-bit sequence respectively to determine that the bit value embedded in the video image of the video frame sequence is '0' or '1'.
The extracting a target watermark sequence from the object to be detected based on the gray level histogram includes: calculating the gray average value of the image to be detected; calculating a target gray level interval of the gray level histogram according to the gray level mean value; dividing the target gray scale interval according to the number of the bit values of the watermark sequence embedded into the image to be detected, so as to obtain gray scale subintervals corresponding to the number of the bit values of the embedded watermark sequence, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value; obtaining predetermined bit value extraction data; and respectively extracting bit values embedded into the image to be detected according to the number of pixel points in at least two adjacent gray levels of the gray subintervals and the bit value extraction data to obtain the target bit sequence.
The above embodiments provide a watermark information extraction method and a watermark information extraction apparatus. In addition, a sixth embodiment of the present application provides an electronic device. Since this embodiment is basically similar to the method embodiment, the description is relatively simple; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The following description of the electronic device embodiment is only illustrative. The electronic device embodiment is as follows:
please refer to fig. 6 for understanding the present embodiment, fig. 6 is a schematic view of an electronic device provided in the present embodiment.
As shown in fig. 6, the electronic apparatus includes: a processor 601; a memory 602;
the memory 602 is configured to store a watermark information extraction program, and when the program is read and executed by the processor, the program performs the following operations:
obtaining an object to be detected;
obtaining a reference bit sequence comprising a pilot sequence and original watermark information, the pilot sequence satisfying the following conditions: after the pilot sequence is shifted, the discrimination between the shifted sequence and the pilot sequence is greater than the preset discrimination;
extracting a target bit sequence from the object to be detected;
and matching the target bit sequence with the reference bit sequence, and determining whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
The method for extracting the target bit sequence from the object to be detected includes: and taking the initial position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
The method for extracting the target bit sequence from the object to be detected includes: and taking the tail position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
The object to be detected comprises an image to be detected, and the step of extracting a target watermark sequence from the object to be detected comprises the following steps: obtaining a gray level histogram of the image to be detected; and extracting a target watermark sequence from the object to be detected based on the gray level histogram.
The object to be detected comprises a video frame sequence, and the extracting of the target watermark sequence from the object to be detected comprises: obtaining a first reference sub-bit sequence for representing a bit value "0" in the target bit sequence and obtaining a second reference sub-bit sequence for representing a bit value "1" in the target bit sequence, the first reference sub-bit sequence being distinct from the second reference sub-bit sequence; extracting a plurality of target sub-bit sequences from video images of the sequence of video frames; and comparing the target sub-bit sequence with the first reference sub-bit sequence and the second reference sub-bit sequence respectively to determine that the bit value embedded in the video image of the video frame sequence is '0' or '1'.
The extracting a target watermark sequence from the object to be detected based on the gray level histogram includes: calculating the gray average value of the image to be detected; calculating a target gray level interval of the gray level histogram according to the gray level mean value; dividing the target gray scale interval according to the number of the bit values of the watermark sequence embedded into the image to be detected, so as to obtain gray scale subintervals corresponding to the number of the bit values of the embedded watermark sequence, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value; obtaining predetermined bit value extraction data; and respectively extracting bit values embedded into the image to be detected according to the number of pixel points in at least two adjacent gray levels of the gray subintervals and the bit value extraction data to obtain the target bit sequence.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
1. Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer readable media do not include transitory computer readable media (transitory media), such as modulated data signals and carrier waves.
2. As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Although the present application has been described with reference to preferred embodiments, they are not intended to limit the present application. Any person skilled in the art may make variations and modifications without departing from the spirit and scope of the present application; therefore, the protection scope of the present application shall be subject to the scope defined by the claims.

Claims (30)

1. A watermark information embedding method, comprising:
obtaining a carrier object;
acquiring watermark information to be embedded;
obtaining a target amble sequence;
embedding the watermark information to be embedded and the target amble sequence into the carrier object;
wherein the target amble sequence satisfies the following condition: after the target amble sequence is shifted, the discrimination between the shifted sequence and the target amble sequence is greater than a predetermined discrimination.
2. The method according to claim 1, wherein the target amble sequence is a binary bit sequence, and said embedding the watermark information to be embedded and the target amble sequence into the carrier object comprises:
adding the target guide code sequence to the front end or the rear end of the watermark information to be embedded to obtain a target bit sequence;
embedding the target bit sequence into the carrier object.
3. The method of claim 2, wherein the carrier object comprises a carrier image, and wherein embedding the target bit sequence into the carrier object comprises:
obtaining a gray level histogram of the carrier image;
according to the target bit sequence, adjusting the shape of the gray level histogram to obtain an adjusted gray level histogram;
and adjusting the gray value of a pixel point in the carrier image according to the adjusted gray level histogram to obtain a target image embedded with the target bit sequence.
4. The method of claim 3, wherein the adjusting the shape of the histogram of gray scales according to the target bit sequence to obtain an adjusted histogram of gray scales comprises:
obtaining a target gray level interval of the gray level histogram;
dividing the target gray scale interval according to the bit number of the target bit sequence to obtain gray scale subintervals corresponding to the bit number, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value;
obtaining quantity relation information of pixel points in a gray scale subinterval corresponding to the bit value;
and adjusting the number of pixel points contained in at least two adjacent gray levels of the gray subintervals according to the target bit sequence and the number relation information of the pixel points in the gray subintervals corresponding to the bit values to obtain an adjusted gray histogram.
5. The method of claim 4, wherein obtaining the target gray level interval of the gray level histogram comprises:
calculating the gray average value of the carrier image;
and calculating a target gray level interval of the gray level histogram according to the gray level mean value based on the representation range of the gray level histogram and the bit number of the target bit sequence.
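Illustration only: one plausible realisation of claim 5 centres the interval on the mean gray value, allots two adjacent gray levels per bit, and clips the result to the 0-255 representation range. None of these specific choices are stated in the claim; they are assumptions made so the computation can be shown concretely.

```python
import numpy as np

def target_gray_interval(image, n_bits, levels_per_bit=2, gray_min=0, gray_max=255):
    """Return (start, end) of a gray-level interval long enough to hold `n_bits`
    subintervals of `levels_per_bit` adjacent levels, centred on the mean gray value
    and clipped to the representable range."""
    width = n_bits * levels_per_bit
    mean_gray = int(round(float(image.mean())))
    start = max(gray_min, min(mean_gray - width // 2, gray_max - width + 1))
    return start, start + width - 1

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(target_gray_interval(image, n_bits=15))
```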
6. The method of claim 4, wherein the obtaining information of the number relationship between the pixel points in the gray subintervals corresponding to the bit values comprises:
when the bit value is 1, obtaining a first comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in a gray sub-interval corresponding to the bit value and the preset embedding strength; and
when the bit value is 0, obtaining a second comparison relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value and the preset embedding strength;
the adjusting the number of the pixel points included in at least two adjacent gray levels of the gray sub-interval according to the target bit sequence and the number relation information of the pixel points in the gray sub-interval corresponding to the bit value to obtain an adjusted gray histogram includes:
when the bit value to be embedded of the target bit sequence is 1, if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength conforms to the first comparison relation, the pixel points are not adjusted; if the relation does not conform to the first comparison relation, selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value;
when the bit value to be embedded of the target bit sequence is 0, if the relation between the ratio of the number of pixel points contained in two adjacent gray levels in the gray subinterval corresponding to the bit value to be embedded and the preset embedding strength conforms to the second comparison relation, the pixel points are not adjusted; if the relation does not conform to the second comparison relation, selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value;
and the first quantity and the second quantity are obtained by calculation according to the preset embedding intensity and the quantity of pixel points contained in two adjacent gray levels in the gray subinterval.
7. The method of claim 6, wherein said selecting a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value comprises: selecting, in a random manner, a first number of pixel points from the pixel points contained in the gray level with the larger gray value and moving them to the gray level with the smaller gray value;
the selecting a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value comprises: selecting, in a random manner, a second number of pixel points from the pixel points contained in the gray level with the smaller gray value and moving them to the gray level with the larger gray value.
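Illustration only: the sketch below gives one concrete reading of claims 6 and 7. Each bit owns a pair of adjacent gray levels; a '1' is assumed to require the lower level to hold at least `strength` times as many pixels as the higher level, a '0' the opposite, and pixels are moved at random between the two levels only when the required relation does not already hold. This ratio convention and the formula for how many pixels to move are assumptions, not text from the patent.

```python
import numpy as np

def embed_bit_in_level_pair(image, level_low, level_high, bit, strength=2.0, rng=None):
    """Adjust the pixel counts of two adjacent gray levels so their ratio encodes `bit`.

    Assumed convention: bit 1  <=>  count(level_low)  >= strength * count(level_high)
                        bit 0  <=>  count(level_high) >= strength * count(level_low)
    """
    rng = rng or np.random.default_rng()
    flat = image.reshape(-1)            # view: edits write back into the image
    n_low = int(np.sum(flat == level_low))
    n_high = int(np.sum(flat == level_high))

    if bit == 1 and n_low < strength * n_high:
        # Move pixels from the higher gray level down to the lower one (claims 6-7).
        need = int(np.ceil((strength * n_high - n_low) / (1.0 + strength)))
        idx = rng.choice(np.flatnonzero(flat == level_high), size=min(need, n_high), replace=False)
        flat[idx] = level_low
    elif bit == 0 and n_high < strength * n_low:
        # Move pixels from the lower gray level up to the higher one.
        need = int(np.ceil((strength * n_low - n_high) / (1.0 + strength)))
        idx = rng.choice(np.flatnonzero(flat == level_low), size=min(need, n_low), replace=False)
        flat[idx] = level_high
    return image

rng = np.random.default_rng(3)
img = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)
embed_bit_in_level_pair(img, level_low=120, level_high=121, bit=1, rng=rng)
```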
8. The method of claim 2, further comprising, prior to obtaining the gray-level histogram of the carrier image: carrying out Gaussian filtering processing on the carrier image to obtain a low-frequency signal part of the carrier image;
the obtaining of the gray level histogram of the carrier image includes:
a grey level histogram of a low frequency signal portion of the carrier image is obtained.
9. The method of claim 8, wherein obtaining a grayscale histogram of a low frequency signal portion of the carrier image comprises:
carrying out blocking processing on a low-frequency signal part of the carrier image to obtain a blocking image;
calculating the gray average value of the block image;
and counting to obtain a gray level histogram of the block image according to the gray level mean value of the block image.
10. The method of claim 9, further comprising: calculating to obtain the mean square error of the block image;
the adjusting the shape of the gray level histogram according to the target bit sequence to obtain an adjusted gray level histogram includes:
selecting gray level histograms of a preset number of block images in descending order of the mean square deviations of the block images;
and adjusting the shape of the gray level histogram of the block images of the preset number according to the target bit sequence to obtain the adjusted gray level histogram of the block images.
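Illustration only: claims 8-10 restrict the histogram work to the low-frequency part of the carrier and to the blocks with the largest mean square deviation. The sketch below uses scipy's Gaussian filter for the low-pass step and plain per-block variance as the mean square deviation; block size, sigma and the number of blocks kept are illustrative parameters, not values taken from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def select_blocks_by_mean_square_deviation(image, block_size=32, n_blocks=16, sigma=2.0):
    """Low-pass the carrier (claim 8), split the low-frequency part into blocks
    (claim 9) and keep the blocks with the largest mean square deviation (claim 10)."""
    low_freq = gaussian_filter(image.astype(np.float64), sigma=sigma)
    h, w = low_freq.shape
    blocks, positions = [], []
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            blocks.append(low_freq[y:y + block_size, x:x + block_size])
            positions.append((y, x))
    deviations = np.array([b.var() for b in blocks])
    keep = np.argsort(deviations)[::-1][:n_blocks]   # largest deviations first
    return [(positions[i], blocks[i]) for i in keep]

rng = np.random.default_rng(4)
frame = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
selected = select_blocks_by_mean_square_deviation(frame, block_size=32, n_blocks=4)
print([pos for pos, _ in selected])
```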
11. The method of claim 2, wherein the carrier object comprises a sequence of video frames, and wherein embedding the target bit sequence into the carrier object comprises:
obtaining a first sub-bit sequence representing a bit value "0" in the target bit sequence and obtaining a second sub-bit sequence representing a bit value "1" in the target bit sequence, the first sub-bit sequence being distinct from the second sub-bit sequence;
obtaining a target video image in the video frame sequence, wherein the target video image refers to a video image into which the target bit sequence is to be embedded;
obtaining a target bit value to be embedded into the target video image;
embedding the first or second sub-bit sequence representing the target bit value into the target video image.
12. The method of claim 11, wherein the first sub-bit sequence satisfies the following condition: after the first sub-bit sequence is subjected to shift processing, the discrimination between the obtained shifted bit sequence and the first sub-bit sequence is greater than a preset discrimination; and the second sub-bit sequence satisfies the following condition: after the second sub-bit sequence is shifted, the discrimination between the shifted bit sequence and the second sub-bit sequence is greater than a preset discrimination.
13. The method of claim 11, wherein the first sub-bit sequence comprises a first sub-amble sequence and a first sub-watermark, and wherein the first sub-amble sequence satisfies the following condition: after the first sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the first sub-amble sequence is greater than a preset discrimination; and the second sub-bit sequence comprises a second sub-amble sequence and second sub-watermark information, and the second sub-amble sequence satisfies the following condition: after the second sub-amble sequence is shifted, the discrimination between the shifted bit sequence and the second sub-amble sequence is greater than a preset discrimination; wherein the first sub-watermark information is different from the second sub-watermark information.
14. The method of claim 13, wherein the first sub-amble sequence is identical to the second sub-amble sequence.
15. The method of claim 13, wherein the first sub-amble sequence and the target amble sequence are a same bit sequence, and wherein the second sub-amble sequence and the target amble sequence are a same bit sequence.
16. The method of claim 11, wherein obtaining the target video image in the sequence of video frames comprises:
and obtaining, from the video frame sequence, a target video image into which the target bit sequence is to be embedded, according to the number of bit values contained in the target bit sequence, in a manner of embedding one bit value in one frame of video image of the video frame sequence.
17. The method of any of claims 11-15, wherein said embedding the first sub-bit sequence or the second sub-bit sequence representing the target bit value into the target video image comprises:
obtaining a gray level histogram of the target video image;
adjusting the shape of a gray level histogram of the target video image according to the first sub-bit sequence or the second sub-bit sequence for representing the target bit value to obtain an adjusted gray level histogram;
and adjusting the gray value of a pixel point in the target video image according to the adjusted gray level histogram to obtain the target video image embedded with the first sub-bit sequence or the second sub-bit sequence.
18. The method of claim 2, wherein the carrier object comprises a sequence of video frames, and wherein embedding the target bit sequence into the carrier object comprises:
repeatedly embedding the target bit sequence into the video frame sequence according to a predetermined embedding number.
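Illustration only: for claims 11-18, the sketch below maps each payload bit to one of two fixed sub-bit sequences, assigns one bit per frame, and optionally repeats the whole target bit sequence; the per-frame embedding of the chosen sub-bit sequence would reuse the histogram mechanism of claims 3-7 and is not repeated here. The concrete sub-sequences and repetition count are placeholders.

```python
import numpy as np

# Placeholder sub-bit sequences representing payload bits "0" and "1" (claim 11).
# Any pair of sequences that remain distinguishable after shifts would do (claim 12).
SUB_SEQ_0 = np.array([0, 0, 1, 0, 1, 1, 0])
SUB_SEQ_1 = np.array([1, 1, 0, 1, 0, 0, 1])

def plan_video_embedding(target_bits, n_frames, repeats=1):
    """Assign one payload bit per frame (claim 16), repeating the whole target bit
    sequence `repeats` times (claim 18), and return, per frame, the sub-bit sequence
    that the per-frame embedder should embed."""
    schedule = []
    bits = list(target_bits) * repeats
    for frame_index, bit in enumerate(bits):
        if frame_index >= n_frames:
            break  # not enough frames for further repetitions
        schedule.append((frame_index, SUB_SEQ_1 if bit == 1 else SUB_SEQ_0))
    return schedule

plan = plan_video_embedding(target_bits=[1, 0, 1], n_frames=10, repeats=2)
print([(i, seq.tolist()) for i, seq in plan])
```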
19. A watermark information extraction method, comprising:
obtaining an object to be detected;
obtaining a reference bit sequence containing a pilot sequence and original watermark information, wherein the pilot sequence satisfies the following condition: after the pilot sequence is shifted, the discrimination between the shifted sequence and the pilot sequence is greater than a preset discrimination;
extracting a target bit sequence from the object to be detected;
and matching the target bit sequence with the reference bit sequence, and determining whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
20. The method of claim 19, wherein the amble sequence is set at a front end of the original watermark information, and the extracting a target bit sequence from the object to be detected comprises:
and taking the initial position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
21. The method of claim 19, wherein the amble sequence is set at a rear end of the original watermark information, and the extracting a target bit sequence from the object to be detected comprises:
and taking the tail position of the embedding range of the target bit sequence as an initial detection point for extracting the target bit sequence, and extracting the target bit sequence from the object to be detected.
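Illustration only: a minimal reading of the matching step in claims 19-21 compares the extracted bit sequence with the reference bit sequence (pilot plus original watermark) and accepts when the bit error rate is small. The bit-error-rate rule and its threshold are assumptions; the claims only require some matching rule and a start- or end-anchored detection point.

```python
import numpy as np

def matches_reference(extracted_bits, reference_bits, max_bit_error_rate=0.2):
    """Decide whether the bit sequence extracted from the suspect object is the
    reference sequence (guide code + original watermark) that was embedded."""
    extracted = np.asarray(extracted_bits)
    reference = np.asarray(reference_bits)
    if extracted.shape != reference.shape:
        return False
    bit_error_rate = float(np.mean(extracted != reference))
    return bit_error_rate <= max_bit_error_rate

print(matches_reference([1, 0, 1, 1, 0], [1, 0, 1, 0, 0]))  # 1 error in 5 bits -> True
```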
22. The method according to any one of claims 19 to 21, wherein the object to be detected comprises an image to be detected, and the extracting the target watermark sequence from the object to be detected comprises:
obtaining a gray level histogram of the image to be detected;
and extracting a target watermark sequence from the object to be detected based on the gray level histogram.
23. The method according to any one of claims 19 to 21, wherein the object to be detected comprises a sequence of video frames, and the extracting the target watermark sequence from the object to be detected comprises:
obtaining a first reference sub-bit sequence for representing a bit value "0" in the target bit sequence and obtaining a second reference sub-bit sequence for representing a bit value "1" in the target bit sequence, the first reference sub-bit sequence being distinct from the second reference sub-bit sequence;
extracting a plurality of target sub-bit sequences from video images of the sequence of video frames;
and comparing the target sub-bit sequence with the first reference sub-bit sequence and the second reference sub-bit sequence respectively to determine that the bit value embedded in the video image of the video frame sequence is '0' or '1'.
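Illustration only: for the per-frame decision in claim 23, one straightforward rule (assumed here, not quoted from the patent) is to output whichever payload bit whose reference sub-bit sequence is closer to the extracted sub-sequence in Hamming distance.

```python
import numpy as np

def decide_frame_bit(extracted_sub_bits, ref_sub_0, ref_sub_1):
    """Compare the sub-bit sequence extracted from one video frame with both
    reference sub-bit sequences and return 0 or 1, whichever is closer."""
    extracted = np.asarray(extracted_sub_bits)
    d0 = int(np.sum(extracted != np.asarray(ref_sub_0)))
    d1 = int(np.sum(extracted != np.asarray(ref_sub_1)))
    return 0 if d0 <= d1 else 1

print(decide_frame_bit([1, 1, 0, 1, 0, 1, 1],
                       [0, 0, 1, 0, 1, 1, 0],
                       [1, 1, 0, 1, 0, 0, 1]))  # closer to the "1" reference -> 1
```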
24. The method according to claim 22, wherein the extracting a target watermark sequence from the object to be detected based on the gray histogram comprises:
calculating the gray average value of the image to be detected;
calculating a target gray level interval of the gray level histogram according to the gray level mean value;
dividing the target gray scale interval according to the number of the bit values of the watermark sequence embedded into the image to be detected, so as to obtain gray scale subintervals corresponding to the number of the bit values of the embedded watermark sequence, wherein each gray scale subinterval comprises at least two adjacent gray levels, and the gray levels are used for representing the number of pixel points with the same gray value;
obtaining predetermined bit value extraction data;
and respectively extracting the bit values embedded into the image to be detected according to the number of pixel points in at least two adjacent gray levels of the gray subintervals and the bit value extraction data to obtain the target bit sequence.
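Illustration only: claim 24 mirrors the embedding side. The detector recomputes the gray mean and target interval of the suspect image, splits the interval into per-bit pairs of adjacent gray levels, and reads each bit from the relation between the two counts. In the sketch below the "bit value extraction data" is taken to be a simple ratio threshold; that choice, like the two-levels-per-bit layout, is an assumption matching the embedding sketch after claim 7.

```python
import numpy as np

def extract_bits(image, interval_start, n_bits, levels_per_bit=2, threshold=1.0):
    """Read one bit from each pair of adjacent gray levels inside the target interval.
    Assumed rule: bit = 1 if the lower level of the pair holds at least `threshold`
    times as many pixels as the higher level, otherwise bit = 0."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    bits = []
    for k in range(n_bits):
        low = interval_start + k * levels_per_bit
        high = low + 1
        bits.append(1 if hist[low] >= threshold * hist[high] else 0)
    return np.array(bits)

rng = np.random.default_rng(5)
suspect = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(extract_bits(suspect, interval_start=112, n_bits=15).tolist())
```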
25. A watermark information embedding apparatus, comprising:
a carrier object obtaining unit for obtaining a carrier object;
a to-be-embedded watermark information obtaining unit, configured to obtain to-be-embedded watermark information;
a target amble sequence obtaining unit, configured to obtain a target amble sequence, where the target amble sequence satisfies the following condition: after the target amble sequence is subjected to shift processing, the discrimination between the obtained shifted sequence and the target amble sequence is greater than a preset discrimination;
and the information embedding unit is used for embedding the watermark information to be embedded and the target amble sequence into the carrier object.
26. An electronic device, comprising:
a processor;
a memory for storing a watermark information embedding program, which when read and executed by the processor performs the following operations:
obtaining a carrier object;
acquiring watermark information to be embedded;
obtaining a target amble sequence, the target amble sequence satisfying the following condition: after the target amble sequence is subjected to shift processing, the discrimination between the obtained shifted sequence and the target amble sequence is greater than a preset discrimination;
and embedding the watermark information to be embedded and the target amble sequence into the carrier object.
27. An apparatus for extracting watermark information, comprising:
the object to be detected obtaining unit is used for obtaining an object to be detected;
a reference bit sequence obtaining unit, configured to obtain a preset reference bit sequence including a pilot sequence and original watermark information, where the pilot sequence satisfies the following condition: after the pilot sequence is shifted, the discrimination between the shifted sequence and the pilot sequence is greater than the preset discrimination;
a target bit sequence extraction unit, configured to extract a target bit sequence from the object to be detected;
and the information matching unit is used for matching the target bit sequence with the reference bit sequence and determining whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
28. An electronic device, comprising:
a processor;
a memory for storing a watermark information extraction program, which when read and executed by the processor performs the following operations:
obtaining an object to be detected;
obtaining a preset reference bit sequence containing a pilot sequence and original watermark information, wherein the pilot sequence satisfies the following condition: after the pilot sequence is shifted, the discrimination between the shifted sequence and the pilot sequence is greater than a preset discrimination;
extracting a target bit sequence from the object to be detected;
and matching the target bit sequence with the reference bit sequence, and determining whether the target bit sequence is the reference bit sequence embedded in the object to be detected.
29. A watermark information embedding method, comprising:
obtaining a carrier object;
obtaining at least two pieces of watermark information to be embedded;
adding different target guide code sequences to the at least two pieces of watermark information to be embedded to obtain at least two target embedded sequences;
embedding the at least two target embedding sequences into the carrier object;
wherein each target guide code sequence satisfies the following condition: after the target guide code sequence is shifted, the discrimination between the shifted sequence and the target guide code sequence is greater than a predetermined discrimination.
30. The method according to claim 29, wherein said adding different target guide code sequences for said at least two pieces of watermark information to be embedded comprises:
and respectively adding a target guide code sequence corresponding to the watermark information to be embedded to each piece of watermark information to be embedded in the at least two pieces of watermark information to be embedded.
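Illustration only: for claims 29-30, the sketch below pairs each of several watermarks with its own guide code sequence before embedding, so the payloads can be told apart at extraction time. The pairing rule (zip watermark i with pilot i) and the example bit values are placeholders.

```python
import numpy as np

def build_multi_watermark_sequences(watermarks, pilots):
    """Give each watermark its own guide code sequence (claims 29-30) and return the
    target embedded sequences, one per watermark."""
    assert len(watermarks) == len(pilots)
    return [np.concatenate([np.asarray(p), np.asarray(w)])
            for w, p in zip(watermarks, pilots)]

sequences = build_multi_watermark_sequences(
    watermarks=[[1, 0, 1, 1], [0, 1, 1, 0]],
    pilots=[[1, 1, 1, 0, 0], [1, 0, 1, 0, 1]])
print([s.tolist() for s in sequences])
```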
CN201911093370.3A 2019-11-11 2019-11-11 Watermark information embedding method and device Active CN112788342B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911093370.3A CN112788342B (en) 2019-11-11 2019-11-11 Watermark information embedding method and device
PCT/CN2020/126360 WO2021093648A1 (en) 2019-11-11 2020-11-04 Watermark information embedding method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911093370.3A CN112788342B (en) 2019-11-11 2019-11-11 Watermark information embedding method and device

Publications (2)

Publication Number Publication Date
CN112788342A true CN112788342A (en) 2021-05-11
CN112788342B CN112788342B (en) 2022-07-08

Family

ID=75749597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911093370.3A Active CN112788342B (en) 2019-11-11 2019-11-11 Watermark information embedding method and device

Country Status (2)

Country Link
CN (1) CN112788342B (en)
WO (1) WO2021093648A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116263931A (en) * 2021-12-15 2023-06-16 深圳先进技术研究院 Self-adaptive color image reversible information hiding method and system
CN114913422B (en) * 2022-05-23 2024-03-15 桂林理工大学 Karst mountain area soil erosion and water loss field monitoring device
CN115834792B (en) * 2023-02-22 2023-05-12 湖南洛普洛科技有限公司 Video data processing method and system based on artificial intelligence
CN116703686B (en) * 2023-08-01 2023-12-22 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
CN117061768B (en) * 2023-10-12 2024-01-30 腾讯科技(深圳)有限公司 Video watermark processing method, video watermark processing device, electronic equipment and storage medium

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040204943A1 (en) * 1999-07-13 2004-10-14 Microsoft Corporation Stealthy audio watermarking
US20020071593A1 (en) * 2000-10-31 2002-06-13 Hirofumi Muratani Digital watermark embedding apparatus, digital watermark detecting apparatus, digital watermark embedding method, digital watermark detecting method and computer program product
JP2003092677A (en) * 2001-07-11 2003-03-28 Canon Inc Data processing method and apparatus, its program and storage medium
US20030152225A1 (en) * 2002-02-13 2003-08-14 Sanyo Electric Co., Ltd. Digital watermarking system using scrambling method
DE10216261A1 (en) * 2002-04-12 2003-11-06 Fraunhofer Ges Forschung Method and device for embedding watermark information and method and device for extracting embedded watermark information
US20040101160A1 (en) * 2002-11-08 2004-05-27 Sanyo Electric Co., Ltd. Multilayered digital watermarking system
US20070121997A1 (en) * 2005-11-30 2007-05-31 Microsoft Corporation Digital fingerprinting using synchronization marks and watermarks
CN101005615A (en) * 2006-01-18 2007-07-25 华中科技大学 Embedding and detecting method and system for image data watermark information
JP2008294916A (en) * 2007-05-28 2008-12-04 Mitsubishi Electric Corp Digital watermark detecting device, digital watermark detecting method, digital watermark detecting program, and digital watermark processing system
CN102842309A (en) * 2008-03-14 2012-12-26 弗劳恩霍夫应用研究促进协会 Embedder for embedding watermark into information representation, detector for detecting watermark in information representation, method and computer program and information signal
US20120243727A1 (en) * 2009-10-29 2012-09-27 Lg Electronics Inc. Device and method for embedding watermarks in contents and for detecting embedded watermarks
US20120163583A1 (en) * 2010-12-28 2012-06-28 Fujitsu Limited Digital watermark embedding device, computer-readable recording medium, and digital watermark detecting device and computer-readable recording medium
CN102122385A (en) * 2011-02-28 2011-07-13 北京工业大学 Digital watermark method capable of simultaneously resisting various attacks
US20130136296A1 (en) * 2011-11-29 2013-05-30 Fujitsu Limited Digital watermark embedding apparatus and method, and digital watermark detecting apparatus and method
US20180077456A1 (en) * 2015-06-22 2018-03-15 Sony Corporation Receiving device, transmitting device, and data processing method
US20190007753A1 (en) * 2015-10-07 2019-01-03 Lg Electronics Inc. Broadcast signal transmission/reception device and method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Vishakha Kelkar, et al.: "Reversible watermarking in medical images using histogram shifting method with improved security and embedding capacity", 2016 IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT) *
Xiaoming Yao, et al.: "A Similarity Attack to Correlation-Based Public Watermarking Detection", 2009 Third International Conference on Multimedia and Ubiquitous Engineering *
Weng Shaowei: "Research on High-Capacity Reversible Watermarking for Digital Images", China Doctoral Dissertations Full-text Database (Electronic Journal) *
Hu Qin: "Research on Reversible Image Watermarking Algorithm Based on Texture Analysis", China Master's Theses Full-text Database (Electronic Journal) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114332092A (en) * 2022-03-16 2022-04-12 北京中科慧眼科技有限公司 Defect image detection method, system, intelligent terminal and medium

Also Published As

Publication number Publication date
CN112788342B (en) 2022-07-08
WO2021093648A1 (en) 2021-05-20

Similar Documents

Publication Publication Date Title
CN112788342B (en) Watermark information embedding method and device
AU2010201199B2 (en) Desynchronized fingerprinting method and system for digital multimedia data
Kumar et al. Information hiding with adaptive steganography based on novel fuzzy edge identification
Fridrich Visual hash for oblivious watermarking
US7792377B2 (en) Method of image authentication and restoration
Bahrami et al. A new robust video watermarking algorithm based on SURF features and block classification
JP2005020742A (en) Method and apparatus for video copy detection
EP1714244A1 (en) Watermark detection
Kaur et al. Copy-move forgery detection using DCT and SIFT
Lian et al. Content-based video copy detection–a survey
CN116385250B (en) Track data double watermarking method based on robust watermarking and fragile watermarking
Liu et al. Robust hybrid image watermarking scheme based on KAZE features and IWT-SVD
Keyvanpour et al. A secure method in digital video watermarking with transform domain algorithms
CN116805069B (en) Track data zero watermark generation method, track data zero watermark detection device and storage medium
Renklier et al. A novel Frei‐Chen based fragile watermarking method for authentication of an image
Nikbakht et al. Targeted watermark removal of a SVD-based image watermarking scheme
Salman et al. Proposed copyright protection systems for 3D video based on key frames
Ishizuka et al. A zero-watermarking-like steganography and potential applications
CN113496449A (en) Data processing method and device, electronic equipment and storage equipment
Dattatherya et al. A generalized image authentication based on statistical moments of color histogram
Shivani et al. A dual watermarking scheme for ownership verification and pixel level authentication
Lin et al. A study on detecting image hiding by feature analysis
Gunjan et al. Detection attack analysis using partial watermark in DCT domain
Zhao et al. An overview on passive image forensics technology for automatic computer forgery
Sen et al. An algorithm for digital watermarking of still images for copyright protection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant