CN108961138B - Watermark data embedding and extracting method - Google Patents


Publication number
CN108961138B
Authority
CN
China
Prior art keywords: image, matrix, watermark data, noise matrix, noise
Prior art date
Legal status
Active
Application number
CN201810487176.2A
Other languages
Chinese (zh)
Other versions
CN108961138A
Inventor
金元真
李在焄
李哲宪
李桓雄
林在星
崔钟允
Current Assignee
Haroo Co
Snapshots Label Co ltd
Original Assignee
Haroo Co
Snapshots Label Co ltd
Priority date
Filing date
Publication date
Application filed by Haroo Co and Snapshots Label Co Ltd
Priority to CN202310298120.3A (published as CN116485620A)
Publication of CN108961138A
Application granted
Publication of CN108961138B


Classifications

    • G06T 1/0021 Image watermarking
    • G06T 1/005 Robust watermarking, e.g. average attack or collusion attack resistant
    • G06F 18/00 Pattern recognition
    • H04N 1/32309 Methods relating to embedding, encoding, decoding, detection or retrieval operations in colour image data
    • G06T 2201/0051 Embedding of the watermark in the spatial domain
    • G06T 2201/0052 Embedding of the watermark in the frequency domain
    • G06T 2201/0065 Extraction of an embedded watermark; Reliable detection

Abstract

The method is robust to geometric distortion and low-quality capture, so that the probability of successfully extracting the watermark data from the original image is high, while the extraction success probability is severely degraded in the case of unauthorized copying and the like. A data embedding method according to an embodiment of the invention comprises the following steps: converting a noise matrix image using the watermark data; and adjusting the original image using the converted noise matrix image.

Description

Watermark data embedding and extracting method
Technical Field
The present invention relates to a method of embedding and extracting watermark data. More particularly, it relates to a method of embedding watermark data into an image so that it is not recognizable by human vision but can be extracted by image processing on a computer device, and to a method of extracting watermark data embedded in this way from the image.
Background
Watermark techniques may be used to insert watermark data or a watermark texture into an original image so that it is not visible in the original version but is exposed upon unauthorized copying. Watermarking technology is widely applied for purposes such as genuine-product authentication and legal authentication. However, while existing watermark techniques allow data invisible to a person to be extracted by image processing on a computer device, no watermark technique has yet been provided in a form in which the data becomes unextractable after unauthorized copying or the like.
Prior art literature
Patent literature
Patent document 1: U.S. Patent Publication No. 2017-013741
Patent document 2: U.S. Patent Publication No. 2017-0024844
Patent document 3: U.S. Patent Publication No. 2017-0024845
Patent document 4: U.S. Patent Publication No. 2016-0267218
Disclosure of Invention
An object of the present invention is to provide a watermark data embedding and extraction method that is robust to geometric distortion and low-quality capture, so that the watermark data is highly likely to be successfully extracted from the original image, while extraction is severely degraded in the case of unauthorized copying and the like.
Another technical problem to be solved by the present invention is to provide a method for embedding and extracting watermark data that minimizes the quality degradation of the original image caused by embedding the watermark data and does not reveal whether watermark data is embedded.
Another technical problem to be solved by the present invention is to provide a method for embedding and extracting watermark data in which the size of the embeddable watermark data increases with the size of the original image.
A further technical problem to be solved by the present invention is to provide a method for embedding and extracting watermark data that minimizes the possibility of leaking clues about the embedding and extraction method.
Another technical problem to be solved by the present invention is to provide a method for embedding and extracting watermark data of an image that works without problems even when the printing medium changes.
A further technical problem to be solved by the present invention is to provide an embedding and extraction method that is applicable irrespective of the size of the image.
The technical problems of the present invention are not limited to the above-mentioned technical problems, and other technical problems not mentioned can be properly understood by those skilled in the art from the following description.
Technical solution
In order to achieve the above object, a data embedding method according to an embodiment of the present invention embeds watermark data in an original image and includes: a step of converting a noise matrix image using the watermark data; and a step of adjusting the original image using the converted noise matrix image, wherein the noise matrix image is represented, in the frequency domain, by a two-dimensional amplitude spectrum having amplitude peaks at specified coordinates and a random phase.
In one embodiment, the noise matrix image is represented in the frequency domain by a plurality of amplitude peaks and a random phase, the amplitude peaks being formed at the specified coordinates, respectively. In this case, the number of the amplitude peaks is a preset number.
In an embodiment, the noise matrix image has an amplitude spectrum in which the amplitude value of the amplitude peak is a first value and the amplitude values of all frequency-domain regions other than the amplitude peak are a second value different from the first value.
In one embodiment, the random phase is generated using a fixed key value.
In one embodiment, the step of adjusting the original image using the converted noise matrix image includes: a step of connecting a plurality of the converted noise matrix images in at least one of the horizontal direction and the vertical direction based on the size of the original image.
In one embodiment, the step of adjusting the original image using the converted noise matrix image includes: a step of adjusting a first pixel of the original image on the condition that the R channel value, the G channel value, and the B channel value of the first pixel all exceed a threshold value or all fall below the threshold value. Here, the step of adjusting the first pixel of the original image includes: a step of performing a first adjustment on the first pixel when a second pixel of the converted noise matrix image is white, and performing a second adjustment on the first pixel according to a rule different from the first adjustment when the second pixel is black, wherein the first pixel and the second pixel correspond to each other by position within the image.
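The pixel-adjustment rule above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the threshold, the adjustment amount `delta`, and the choice of brightening for the first adjustment and darkening for the second are assumptions.

```python
def adjust_pixel(pixel, noise_is_white, delta=8, threshold=128):
    """Adjust one RGB pixel of the original image according to the
    corresponding noise-matrix pixel. The pixel is adjusted only when all
    three channels exceed, or all fall below, the threshold; delta and
    threshold are illustrative values (the patent does not specify them here)."""
    channels = tuple(int(c) for c in pixel)
    if all(c > threshold for c in channels) or all(c < threshold for c in channels):
        if noise_is_white:
            return [min(c + delta, 255) for c in channels]  # first adjustment
        return [max(c - delta, 0) for c in channels]        # second adjustment
    return list(channels)  # mixed channels: leave the pixel unchanged
```

For example, `adjust_pixel([200, 200, 200], True)` yields `[208, 208, 208]`, while a mixed pixel such as `[200, 100, 50]` is left unchanged.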
In order to achieve the above object, a computer program according to another embodiment of the present invention is combined with a computer device to embed watermark data into an original image, and is stored on a recording medium in order to perform the steps of: converting a noise matrix image using the watermark data; and adjusting the original image using the converted noise matrix image. In this case, the noise matrix image is represented, in the frequency domain, by a two-dimensional amplitude spectrum having amplitude peaks at specified coordinates and a random phase.
In order to achieve the above object, a data extraction method according to a further embodiment of the present invention extracts watermark data from a captured image, which is an image of the image embedded with the watermark data, and includes: a step of generating a second result matrix indicating whether or not the noise matrix is included in the captured image, by performing frequency-domain processing that compares the captured image with the noise matrix; a step of generating a second reference image by converting the second result matrix into the spatial domain; a step of determining a reference point serving as an extraction position of the watermark data in the captured image by using the second reference image; and a step of extracting the watermark data from the reference point of the captured image by referring to the noise matrix image.
In an embodiment, the method further includes, before the step of generating the second result matrix, a step of generating a modified image from the captured image; the step of generating the second result matrix includes a step of generating the second result matrix, which indicates whether or not the noise matrix is included in the modified image, by performing processing that compares the modified image with the noise matrix; the step of determining the reference point as an extraction position of the watermark data includes a step of determining the reference point in the modified image; and the step of extracting the watermark data includes a step of extracting the watermark data from the reference point of the modified image. Here, the step of generating the modified image from the captured image includes: a step of calculating a scale and a rotation angle of the noise matrix image with respect to the captured image by performing a log-polar coordinate conversion on the noise matrix image and the captured image; and a step of generating the modified image from the captured image by reflecting the scale and the rotation angle.
In one embodiment, the method further includes, before the step of generating the second result matrix, a step of generating a modified image from the captured image and a step of monochromatizing the modified image so as to maximize the expression of the noise inherent in the modified image; the step of generating the second result matrix includes a step of generating the second result matrix, which indicates whether or not the noise matrix is included in the monochromatized modified image, by performing frequency-domain processing that compares the monochromatized modified image with the noise matrix; the step of determining an extraction position of the watermark data includes a step of determining a reference point serving as an extraction position of the watermark data in the monochromatized modified image; and the step of extracting the watermark data includes a step of extracting the watermark data from the monochromatized modified image.
In one embodiment, the noise matrix image is represented, in the frequency domain, by a two-dimensional amplitude spectrum having amplitude peaks at specified coordinates and a random phase.
In order to achieve the above object, a computer program according to an embodiment of the present invention extracts watermark data from a captured image, which is an image of the image embedded with the watermark data, and is stored on a recording medium in order to execute: a step of generating a second result matrix indicating whether or not the noise matrix is included in the captured image, by performing frequency-domain processing that compares the captured image with the noise matrix; a step of generating a second reference image by converting the second result matrix into the spatial domain; a step of determining a reference point serving as an extraction position of the watermark data in the captured image by using the second reference image; and a step of extracting the watermark data from the reference point of the captured image by referring to the noise matrix image.
Drawings
Fig. 1 is a conceptual diagram for explaining a situation in which an embedding method of watermark data according to an embodiment of the present invention and an extracting method of watermark data according to an embodiment of the present invention are used.
Fig. 2 is a diagram for comparing an image embedded with watermark data and an original image of an embedding method of watermark data according to various embodiments of the present invention.
Fig. 3 is a flowchart of a method of embedding watermark data according to an embodiment of the invention.
Fig. 4 is a conceptual diagram facilitating an understanding of the watermark data embedding method of fig. 3.
Fig. 5 is a diagram for explaining a noise matrix referred to in various embodiments of the present invention.
Fig. 6 and 7 are diagrams for explaining an example of a noise matrix that can be referred to in various embodiments of the present invention.
Fig. 8 and 9 are diagrams for explaining another example of a noise matrix that can be referred to in the various embodiments of the present invention.
Fig. 10 is a diagram of noise matrix images of mutually different sizes that can be referred to in various embodiments of the present invention.
Fig. 11 is a flowchart detailing a portion of the operations of the watermark data embedding method of fig. 3.
Fig. 12 is a diagram illustrating the result of modularizing the noise matrix using a portion of the operations of fig. 11.
Fig. 13 is a flowchart detailing another portion of the operations of the watermark data embedding method of fig. 3.
Fig. 14 and 16 are diagrams illustrating an operation of converting a noise matrix to represent watermark data using the result of the operation of fig. 13.
Fig. 15 is a diagram for explaining a noise matrix conversion standard that can be referred to in various embodiments of the present invention.
Fig. 17 is a diagram for explaining block traversal of watermark data embedding or extraction that can be referred to in various embodiments of the present invention.
Fig. 18 is a diagram illustrating the result of conversion of a noise matrix to represent watermark data using the result of a portion of the operations of fig. 13.
Fig. 19 is a flowchart of a watermark data extraction method according to still another embodiment of the invention.
Fig. 20 is a diagram for explaining the reason why the photographed image needs to be preprocessed in the watermark data extraction method according to an embodiment of the present invention.
Fig. 21 is a detailed flowchart of the image preprocessing operations of the watermark data extraction method of fig. 19.
Fig. 22 is a diagram for explaining the preprocessing result of a photographed image.
Fig. 23 is a detailed flowchart of the image processing operations for determining the watermark extraction reference point in the watermark data extraction method of fig. 19.
Fig. 24 is a detailed flowchart of the data extraction operations of the watermark data extraction method of fig. 19.
Fig. 25 is a diagram for explaining a method of extracting mutually different watermark data based on the photographing angle of an image according to another embodiment of the present invention.
Fig. 26 is a block configuration diagram of a watermark data embedding device according to still another embodiment of the present invention.
Fig. 27 is a block configuration diagram of a watermark data extraction apparatus according to still another embodiment of the invention.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the methods of accomplishing them, will become apparent by referring to the embodiments described in detail below together with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and can be implemented in various different forms; these embodiments are provided only to make the disclosure of the present invention complete and to fully inform a person skilled in the art of the scope of the invention, which is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.
Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms defined in commonly used dictionaries are not to be interpreted ideally or excessively unless expressly so defined. The terminology used in the description is for the purpose of describing embodiments only and is not intended to limit the invention. In this specification, singular forms also include the plural unless the context specifically indicates otherwise.
Hereinafter, various embodiments of the present invention will be described with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a conceptual diagram for explaining a situation in which an embedding method of watermark data according to an embodiment of the present invention and an extracting method of watermark data according to an embodiment of the present invention are used. First, a result image 30, formed by embedding the watermark data 10 in the original image 20, may be distributed through a variety of channels. The result image 30 may be, for example, a business card or a label attached to the surface of a product. The result image 30 may also be printed on the exterior surface of a container such as a box.
As shown in fig. 1, it is difficult to visually recognize whether watermark data is embedded in the result image 30. Unlike conventional techniques such as bar codes and two-dimensional codes (QR codes), the watermark data embedding method according to the present embodiment has the advantage that it is difficult to recognize with the naked eye whether watermark data is embedded, so watermark data can be loaded without damaging the original image. The watermark data according to one embodiment has a size of 144 bits, of which 48 bits are practically usable data; as a result, the watermark data can represent 2^48 distinct cases. Thus, when the present technique is applied to a label attached to the surface of a product, each product can be given a unique ID even though the labels appear identical to the naked eye, and a unique number can be embedded into the label using watermark data.
Looking at the result image 30, it is difficult to recognize with the naked eye whether watermark data is embedded, but if the user terminal 40 performing the watermark data extraction method according to the embodiments of the present invention is used, the watermark data 10 can be extracted from the result image 30. For example, the watermark data 10 may be extracted by capturing the result image 30 with a camera provided in the user terminal 40 and performing watermark data extraction logic on the captured result.
The watermark data 10 may have, for example, the format of table 1.
TABLE 1
4 bits              48 bits        92 bits
Control parameter   Subject data   Error correction bits
The control parameter of table 1 may indicate the type of the subject data. For example, the user terminal 40 extracts the watermark data 10, recognizes the 48-bit subject data therein, and can transmit a content request containing the subject data and the control parameter to a content server. The content server may store a content database (DB) that manages, by type, the content assigned to each unique ID. The content server may provide the user terminal 40 with the type of content corresponding to the control parameter among the content of the ID matched with the subject data.
The 92 error correction bits may include an error correction code to compensate for the possibility that the watermark data 10 is extracted with errors. For example, the error correction code may be a Bose-Chaudhuri-Hocquenghem (BCH) code.
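The 144-bit layout of Table 1 can be sketched as a simple pack/unpack pair. This illustrates the bit layout only: the 92 error-correction bits are taken as a precomputed list, since the patent's BCH parameters are not given here.

```python
def pack_watermark(control, subject, ecc_bits):
    """Pack the Table 1 layout: 4-bit control parameter, 48-bit subject
    data, then 92 error-correction bits (a BCH code in practice; here a
    precomputed bit list stands in). Returns a list of 144 bits."""
    assert 0 <= control < 2 ** 4 and 0 <= subject < 2 ** 48 and len(ecc_bits) == 92
    bits = [(control >> (3 - i)) & 1 for i in range(4)]
    bits += [(subject >> (47 - i)) & 1 for i in range(48)]
    return bits + list(ecc_bits)

def unpack_watermark(bits):
    """Split a 144-bit watermark back into its three fields."""
    assert len(bits) == 144
    control = sum(b << (3 - i) for i, b in enumerate(bits[:4]))
    subject = sum(b << (47 - i) for i, b in enumerate(bits[4:52]))
    return control, subject, bits[52:]
```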
Fig. 2 is a diagram for comparing an image embedded with watermark data and an original image of an embedding method of watermark data according to various embodiments of the present invention. As shown in fig. 2, the resulting image 30 is not significantly different from the original image 20 as seen by the naked eye. However, the resulting image 30a converted to a single color is quite noisy compared to the single color version 20a of the original image. This is because watermark data according to an embodiment of the present invention is embedded in the original image in the form of noise. In this connection, further details will be described with reference to the drawings.
Embedding watermark data
Hereinafter, referring to fig. 3, a watermark data embedding method according to an embodiment of the present invention will be described as a whole. The embedding method of watermark data according to the present embodiment may be performed by means of, for example, a computer device.
The computer device loads watermark data (step S100) and loads a noise matrix image (step S102). The original image into which the watermark data is to be embedded is also loaded (step S106). The noise matrix (noise base) can be understood as a data container for the watermark data. The noise matrix may be loaded from an image file, or may be loaded in the form of data representing a two-dimensional matrix in the frequency domain. In an embodiment, it can also be loaded in the form of an amplitude spectrum and a phase spectrum of the frequency domain.
The noise matrix is converted using the loaded watermark data (step S104), and the original image is adjusted using the converted noise matrix, so that an image embedded with the watermark data is generated and output (steps S108 and S110). According to an embodiment, the output may be applied to the exterior surface of an object printed by means of a three-dimensional (3D) printer.
Referring to fig. 4 helps in understanding the embedding method of watermark data according to the present embodiment. The noise matrix image 50 is shown as a two-dimensional image. Also illustrated is the noise matrix image 60 obtained by converting the noise matrix image 50 using the watermark data 10. The original image 20 is adjusted using the converted noise matrix image 60 (step S108) to generate the result image 30 embedded with the watermark data.
As described with reference to figs. 3 to 4, the noise matrix plays a central role in the embedding method according to the present embodiment. Accordingly, the noise matrix should be formed so that it is not easily leaked through analysis of the result image 30, yet is robust to geometric distortion, so that the success rate of watermark data extraction can be improved. Referring to fig. 5, the noise matrix used in various embodiments of the present invention will be described.
The noise matrix is represented in the form of a two-dimensional image 50 in the spatial domain 53. As is well known, the spatial domain is transformed into the frequency domain by the Fourier transform. In the frequency domain, the noise matrix is represented by a two-dimensional amplitude spectrum 51 having amplitude peaks (magnitude peaks) at specified coordinates and a random phase spectrum 52. The random phase spectrum 52 is formed using random numbers generated from a specific key.
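The construction above can be sketched in a few lines of NumPy. This is a minimal sketch under stated assumptions: the image size, the peak coordinates, and the key are placeholders (in the actual scheme the peak coordinates are kept secret), and taking the real part of the inverse FFT is a practical shortcut to obtain a real-valued spatial image.

```python
import numpy as np

def make_noise_base(size=256, peak_coords=((8, 24), (24, 8), (20, 20)), key=1234):
    """Sketch: build a spatial-domain noise-matrix image whose frequency-domain
    representation has amplitude peaks only at the given coordinates and a
    random phase generated from a fixed key."""
    rng = np.random.default_rng(key)                 # fixed key -> reproducible phase
    amplitude = np.zeros((size, size))
    for (u, v) in peak_coords:
        amplitude[u, v] = 255.0
        amplitude[(-u) % size, (-v) % size] = 255.0  # mirror peak for (near) symmetry
    phase = rng.uniform(-np.pi, np.pi, (size, size))
    spectrum = amplitude * np.exp(1j * phase)
    noise = np.fft.ifft2(spectrum).real              # back to the spatial domain
    lo, hi = noise.min(), noise.max()
    return ((noise - lo) / (hi - lo) * 255.0).astype(np.uint8)  # 0..255 grayscale
```

Because the phase is derived from a fixed key, the same key always reproduces the same noise image, which matches the "random phase generated using a fixed key value" embodiment above.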
In one embodiment, the noise matrix has a two-dimensional amplitude spectrum 51 in which amplitude values other than 0 appear only in the regions belonging to the amplitude peaks, and the amplitude value is 0 in all regions other than the amplitude peaks. In fig. 5, the two-dimensional amplitude spectrum 51 is illustrated on a logarithmic scale; the center of the spectrum image corresponds to low frequencies, and moving toward the periphery corresponds to higher frequencies. The brighter the color at a point of the spectrum image, the higher the amplitude value of the frequency at that point's coordinates.
In one embodiment, the noise matrix has a two-dimensional amplitude spectrum 51 in which the maximum amplitude value appears only in the regions belonging to the amplitude peaks, and the minimum amplitude value appears in all regions other than the amplitude peaks. In this embodiment, the difference between the amplitude value at the amplitude peaks and the amplitude value elsewhere is maximized, thereby improving the success rate, at extraction time, of detecting whether a trace of watermark data embedded according to the embedding method of the present invention is present.
In one embodiment, the noise matrix has a two-dimensional amplitude spectrum 51 with a plurality of amplitude peaks. In this case, the number of amplitude peaks is a preset number.
Owing to the characteristics of the frequency domain, the noise matrix generated in the frequency domain maintains the characteristics of a synchronization signal even if a part of the image is damaged. Therefore, when identifying watermark data using a smartphone camera or the like, the noise matrix generated in the frequency domain can serve as a cue for resolving geometric distortion. That is, in the watermark data extraction process, the similarity between the synchronization signal and the captured image provides clues about geometric distortions such as scale, rotation, and location.
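As a rough illustration of using the peaks as a synchronization cue, one can measure how strongly the known peak coordinates stand out in a captured image's spectrum. This is a toy detector, not the patent's extraction procedure; the coordinates and the median-based normalization are assumptions.

```python
import numpy as np

def peak_response(image, peak_coords):
    """Return the mean spectral magnitude at the known peak coordinates,
    normalized by the median magnitude of the whole spectrum. A large value
    suggests the noise matrix (synchronization signal) is present."""
    mag = np.abs(np.fft.fft2(np.asarray(image, dtype=float)))
    mag[0, 0] = 0.0                                   # ignore the DC term
    typical = np.median(mag) + 1e-12
    return float(np.mean([mag[u, v] for (u, v) in peak_coords]) / typical)
```

An image that actually contains energy at the peak frequencies scores far higher than an unrelated image, which is the basis for deciding whether the noise matrix is included in a capture.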
As shown in figs. 6 and 7, an amplitude peak may cover a plurality of points on the plane formed by a horizontal frequency axis 51i, representing a plurality of unit frequencies according to the frequency discretization, and a vertical frequency axis 51j, likewise representing a plurality of unit frequencies. In figs. 6 and 7, eight amplitude peaks 51a to 51h are illustrated. Referring to fig. 7, the amplitude peak 51c corresponds to 3 unit frequencies 51c-1, 51c-2, 51c-3 on the horizontal frequency axis 51i, and to 3 unit frequencies 51c-4, 51c-5, 51c-6 on the vertical frequency axis 51j.
In contrast, as shown in figs. 8 and 9, each of the amplitude peaks 51k to 51r is represented by a first frequency on the horizontal axis and a second frequency on the vertical axis, where the first frequency and the second frequency each correspond to one unit frequency of the frequency discretization. As shown in fig. 9, the amplitude peak 51k corresponds to 1 unit frequency 51k-1 on the horizontal frequency axis 51i and to 1 unit frequency 51k-2 on the vertical frequency axis 51j. Fig. 9 also illustrates the amplitude spectrum values represented as a two-dimensional matrix, in which only one element has a value other than 0 and all other elements are 0. This corresponds to the illustrated embodiment in which the noise matrix has the maximum amplitude value (255 in fig. 9) only in the region belonging to the amplitude peak and the minimum amplitude value (0 in fig. 9) in all other regions.
In the present embodiment, the coordinates of the amplitude peaks on the amplitude spectrum used to generate the noise matrix are not leaked to the outside. Only the noise matrix converted according to the watermark data is reflected in the original image and distributed externally, and even if the distributed image is processed, the original noise matrix master as shown in fig. 10 cannot be obtained. This is because the conversion according to the watermark data is performed after the noise matrix master is modularized. Therefore, the watermark data embedding method according to the present embodiment can prevent unauthorized theft, since extraction of watermark data is possible only when the watermark was embedded based on the specified noise matrix master. This is described in detail later.
The image size of the noise matrix may be smaller than the original image size. In that case, if the noise matrix were reflected only in a specific region of the original image, the user would have to align the photographing angle precisely with that region, and the watermark data extraction success rate would drop. Therefore, as shown in fig. 10, the converted noise matrix image is connected in at least one of the horizontal and vertical directions based on the size of the original image, so that the noise matrix covers the widest possible area of the original image; thus, even if a user photographs only a part of the original image, watermark data extraction can succeed. Fig. 10 illustrates a 256×256 noise matrix 50-1, noise matrices 50-2 connected 2 across and 2 down, and noise matrices 50-3 connected 4 across and 4 down.
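Connecting copies of the noise matrix image to cover the original can be sketched with a tile-and-crop, a minimal illustration assuming the noise matrix is a 2-D grayscale array.

```python
import numpy as np

def tile_noise(noise, orig_h, orig_w):
    """Repeat the converted noise-matrix image horizontally and vertically
    until it covers the original image's size, then crop to that size."""
    nh, nw = noise.shape
    reps_y = -(-orig_h // nh)      # ceiling division
    reps_x = -(-orig_w // nw)
    return np.tile(noise, (reps_y, reps_x))[:orig_h, :orig_w]
```

Because the pattern repeats with the noise matrix's period, any photographed crop larger than one tile still contains a full copy of the noise matrix.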
As the size of the noise matrix increases, the size of the watermark data that can be embedded in it also increases. The data embedding method according to the present embodiment is therefore scalable in terms of embeddable data size.
A method of converting a noise matrix master using watermark data (step S104) will be described in further detail below with reference to fig. 11.
The noise matrix image is divided into block units of n×n pixels (e.g., 4×4) (step S104-1). Preferably, each block is square, with an even number of pixels in both the horizontal and vertical directions. For example, a block may be of 4×4 size; for ease of understanding, 4×4 blocks are assumed in the following description.
Next, the average (M) of the pixel values belonging to each block of the noise matrix image is calculated (step S104-2). When the average (M) is equal to or greater than a threshold value (for example, the midpoint between the pixel value for white and the pixel value for black) (step S104-3), all pixels of the block are unified to a first color (for example, black) (step S104-4). When the average (M) is smaller than the threshold value (step S104-3), all pixels of the block are unified to a second color (for example, white) (step S104-5). Fig. 12 illustrates the noise matrix image 55 resulting from the modularization operation (steps S104-1 to S104-7) described with reference to fig. 11.
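The per-block averaging and unification (steps S104-2 to S104-5) can be sketched as follows, assuming an 8-bit grayscale noise matrix image and the example colour assignment from the text (black for means at or above the threshold):

```python
import numpy as np

def modularize(noise_img: np.ndarray, n: int = 4, threshold: int = 128) -> np.ndarray:
    """Unify each n-by-n block to a single colour by comparing the
    block mean against a threshold (first colour = black here,
    matching the example in the text)."""
    out = np.empty_like(noise_img)
    for y in range(0, noise_img.shape[0], n):
        for x in range(0, noise_img.shape[1], n):
            mean = noise_img[y:y + n, x:x + n].mean()
            out[y:y + n, x:x + n] = 0 if mean >= threshold else 255
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8) * 4  # toy gradient image
mod = modularize(img)
```

After this pass every 4×4 block is uniformly black or white, which is the precondition for the 2-bit embedding described next.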
There are three reasons for modularizing the noise matrix into 4×4 blocks before inserting the data. First, modularization preserves invisibility. If 144 bits were inserted into the black-and-white noise matrix in a fixed manner, a regular pattern would arise; modularization converts the colors of the noise matrix so that no such pattern is generated. Such a pattern would significantly harm invisibility once the data-carrying noise is inserted into the original image.

Second, modularization improves the recognition rate in the watermark data extraction process. If data were inserted in 1-pixel units, the accuracy and speed of recognition with a smartphone camera, as used in the present technique, could drop sharply.

Third, modularization preserves the characteristics of the synchronization signal inserted into the noise matrix. If data were inserted without modularization, the characteristics of the noise matrix itself, that is, of the synchronization signal, would be lost, and the result could be a black-and-white image entirely different from the noise matrix. Because modularization keeps the characteristics of the noise matrix within each block, the synchronization signal in the noise matrix is not damaged even after data is inserted across the whole noise matrix.
Next, an operation of converting a modularized noise matrix image using watermark data will be described with reference to fig. 13 to 16.
The watermark data is loaded (step S104-8), and each block of the noise matrix is traversed while 2 bits of the watermark data, starting from the start offset, are embedded per block. If the current block is white (step S104-9), some pixels of the block are converted to black (step S104-10). Conversely, if the current block is black (step S104-9), some pixels of the block are converted to white (step S104-11).

How some pixels of a block are converted is described in further detail below. In fig. 14, the 2 bits to be embedded are currently '01', and the current block 55a, the first block at the upper-left end of the modularized noise matrix image 55, is white. Fig. 15 is a noise matrix conversion reference that may be consulted in various embodiments of the invention. According to this reference, when the current block is white and the 2 bits to be embedded are '01', the (2×2) pixel group at the upper-right end of the current block is converted to black. The result of converting the upper-right pixel group 55a-1 of the current block 55a to black according to this reference is illustrated in fig. 16.
As can be understood from figs. 15 and 16, a block whose pixels all share the same value consists of 4 pixel groups, and one of the 4 pixel groups is converted to a pixel value different from that of the block as a whole. For example, in a white block, exactly one pixel group is converted to black. Since, for each block, the choice of which pixel group receives the different color yields 4 distinguishable cases, and 4 cases can be expressed with 2 bits, each block can carry 2 bits.
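A sketch of the 2-bit embedding for one block follows. The mapping from bit pairs to pixel-group positions is a hypothetical stand-in for the conversion reference of fig. 15, which is not reproduced here:

```python
import numpy as np

# Hypothetical mapping from a 2-bit value to the (row, col) origin of
# the 2x2 pixel group to flip; the real mapping is defined by fig. 15.
QUADRANT = {0b00: (0, 0), 0b01: (0, 2), 0b10: (2, 0), 0b11: (2, 2)}

def embed_2bits(block: np.ndarray, bits: int) -> np.ndarray:
    """Flip one 2x2 pixel group of a uniform 4x4 block to the opposite
    colour, so each block carries one of 4 cases, i.e. 2 bits."""
    out = block.copy()
    gy, gx = QUADRANT[bits]
    out[gy:gy + 2, gx:gx + 2] = 255 - int(block[0, 0])  # opposite colour
    return out

white_block = np.full((4, 4), 255, dtype=np.uint8)
stego = embed_2bits(white_block, 0b01)  # '01': upper-right group flips
```

Any fixed one-to-one mapping between the four bit pairs and the four groups works, as long as embedding and extraction agree on it.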
The 2-bit-per-block watermark data embedding operation proceeds as each block of the noise matrix image is traversed, until the watermark data is fully embedded (step S104-13). The traversal of the noise matrix image follows a predefined order; fig. 17 illustrates an example. Embedding 144 bits of watermark data requires 72 of the 4×4 blocks illustrated in fig. 17. Considering that each block can carry 2 bits, it can be understood why 72 blocks 58 are required.

A 256×256 noise matrix image contains a total of 4096 blocks, so after 144 bits of watermark data are embedded using 72 blocks 58, blocks remain. To increase the watermark data extraction success rate, the embedding of the watermark data using 72 blocks 58 is repeated in the horizontal and vertical directions. Fig. 18 illustrates the result 60 of converting the modularized noise matrix image 55 by reflecting the watermark data.
Returning to fig. 3, the original image is adjusted using the converted noise matrix image, thereby generating an image embedded with the watermark data (step S108). A first pixel of the original image is adjusted only if its R-channel, G-channel, and B-channel values are all above a threshold value or all below a threshold value. Adjusting the current pixel of the original image then comprises performing a first adjustment on the first pixel when the corresponding second pixel of the converted noise matrix image is white, and performing a second adjustment, following a rule different from the first adjustment, when the second pixel is black. For example, the first adjustment may decrease the R-channel value of the first pixel, increase its G-channel value, and decrease its B-channel value, while the second adjustment may increase the R-channel value, decrease the G-channel value, and increase the B-channel value.
The adjustment uses the values of three constants α, β, γ. These constants are the ratios used during monochromatization in the watermark data extraction process to maximize the noise of the image. Multiple sets of values for α, β, γ may be defined depending on the image output medium, which may be, for example, paper, plastic, or metal. For instance, when the image output medium is paper, α, β, γ may be set to α1, β1, γ1, and when the medium is plastic, to α2, β2, γ2.

With (R, G, B = pixel values of the R, G, and B channels before the update; R', G', B' = pixel values after the update), the adjustment rules are as follows.
First case: when the noise pixel value is 0 and the R, G, and B channels are all greater than 250, the pixel values of the original image become R' = R(1-α), G' = G(1+β), B' = B(1-γ).

Second case: when the noise pixel value is 0 and the R, G, and B channels are all less than 250, the pixel values of the original image become R' = R(1-α), G' = G(1+β), B' = B(1-γ).

Third case: when the noise pixel value is 255 and the R, G, and B channels are all greater than 250, the pixel values of the original image become R' = R(1+α), G' = G(1-β), B' = B(1+γ).

Fourth case: when the noise pixel value is 255 and the R, G, and B channels are all less than 250, the pixel values of the original image become R' = R(1+α), G' = G(1-β), B' = B(1+γ).
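The four cases collapse into a small helper, since the listed formulas depend only on the noise pixel value once the α, β, γ set has been chosen (a sketch; the per-case constant selection described next is assumed to happen elsewhere):

```python
def adjust_pixel(r, g, b, noise, alpha, beta, gamma):
    """Apply the four-case adjustment rule above. `noise` is the noise
    matrix pixel value (0 or 255); the alpha/beta/gamma set is assumed
    to have been chosen for this pixel and the output medium."""
    if noise == 0:
        # cases 1 and 2: R down, G up, B down
        return r * (1 - alpha), g * (1 + beta), b * (1 - gamma)
    # cases 3 and 4 (noise == 255): R up, G down, B up
    return r * (1 + alpha), g * (1 - beta), b * (1 + gamma)
```

Note that the two channel-threshold branches share a formula in each pair of cases; what differs between them, per the paragraph below, is the set of constants applied.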
The three constants α, β, γ used to insert the noise into the original image vary with the original image. The present technique is designed to be difficult to perceive with the naked eye; to achieve this, the α, β, γ values take multiple forms depending on the pixel values of the original image into which the noise is inserted.
Extraction of watermark data
Hereinafter, a watermark data extraction method according to another embodiment of the present invention will be described with reference to figs. 19 to 24. The extraction method according to the present embodiment applies only to watermark data embedded by the embedding method described above. First, the description refers to fig. 19.
The extraction method of watermark data according to the present embodiment starts by receiving the provision of a captured image (step S200).
If, as in conventional data watermarking techniques, the watermark were extracted on a personal computer from the image file into which it was inserted, running the embedding process exactly in reverse would suffice. The present invention, however, identifies the watermark using the camera of a mobile terminal such as a smartphone, so many image preprocessing steps are required before the watermark can be extracted. Referring to fig. 20, the user of the smartphone 40 can photograph the medium 30' printed with the watermark-embedded image without paying special attention, and still extract the watermark data; the extraction method according to the present embodiment was developed so that extraction succeeds even in this case.

The preprocessing maintains a high recognition rate despite variables such as the position, size, rotation, brightness, and possible damage of the object in the camera frame whose watermark presence must be judged. During preprocessing, the synchronization signal inserted in the embedding process is used to screen for the presence of the watermark, and the image is converted into the form best suited for extraction.
Pre-patching may be performed before the preprocessing proper. For example, when a smartphone camera is used to determine whether a watermark exists, the preview of the smartphone screen is 1920×1080. All pixel values of the captured image are first converted to RGB form, since the various embodiments of the present invention operate on an RGB basis.

Then, because an object photographed by the user is very likely to appear at the midpoint of the screen, the center area is extracted as a 1024×1024 image in order to reduce unnecessary computation.

Next, the extracted square image is reduced to 256×256 size.

Then, the 256×256 reduced image is sharpened by a single deblurring (Deblur) pass.
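The pre-patching steps above (center crop, reduction to 256×256) can be sketched as follows; the deblurring pass is omitted, and 4×4 block averaging is used as a stand-in for whatever reduction filter an implementation would choose:

```python
import numpy as np

def pre_patch(frame: np.ndarray) -> np.ndarray:
    """Crop the 1024x1024 centre of a (grayscale) camera frame and
    reduce it to 256x256 by averaging each 4x4 block. The deblurring
    step described in the text is not included in this sketch."""
    h, w = frame.shape[:2]
    y0, x0 = (h - 1024) // 2, (w - 1024) // 2
    crop = frame[y0:y0 + 1024, x0:x0 + 1024]
    # 4x downscale: regroup into 4x4 cells and take each cell's mean
    return crop.reshape(256, 4, 256, 4).mean(axis=(1, 3))

frame = np.zeros((1080, 1920), dtype=np.uint8)  # a blank preview frame
small = pre_patch(frame)
```

A production pipeline would apply the same crop per RGB channel and follow it with the deblurring pass.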
The pre-patching described above may be performed on the provided captured image (step S200).
Next, the noise matrix image is loaded (step S205), and the first similarity, the rotation angle, and the scale are determined by a frequency-domain comparison between the pre-patched captured image and the noise matrix (step S210). This operation is described further with reference to fig. 21.
First, the captured image is converted into the frequency domain (step S210-1), and an image (MS2) of its 2D magnitude spectrum is generated (step S210-2). An image (MS1) of the 2D magnitude spectrum of the noise matrix, against which the captured image is compared, is also prepared (step S210-3); MS1 is either generated from the stored noise matrix image or simply queried from stored data.

Next, a log-polar (LogPolar) conversion is performed on the image (MS1) of the 2D magnitude spectrum of the noise matrix and the image (MS2) of the 2D magnitude spectrum of the captured image (step S210-5). The log-polar conversion yields a representation invariant to two-dimensional rotation and scaling, so it provides a reliable means of compensating the rotation and scale distortions of the captured image.
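A minimal log-polar resampling of a magnitude-spectrum image might look like the following (nearest-neighbour sampling for brevity; a real implementation, e.g. OpenCV's warpPolar, would interpolate):

```python
import numpy as np

def log_polar(img: np.ndarray, n_rho: int = 256, n_theta: int = 256) -> np.ndarray:
    """Resample a square image onto a (log-radius, angle) grid: after
    this mapping, rotation of the input becomes a shift along the
    angle axis and scaling a shift along the log-radius axis."""
    h, w = img.shape
    cy, cx = h / 2.0, w / 2.0
    rho = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = np.clip((cy + rho[:, None] * np.sin(theta)).astype(int), 0, h - 1)
    xs = np.clip((cx + rho[:, None] * np.cos(theta)).astype(int), 0, w - 1)
    return img[ys, xs]

lp = log_polar(np.ones((256, 256)))
```

Because rotation and scale become translations in this grid, the correlation step that follows can recover both as peak coordinates.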
Then, the images produced by the log-polar conversion are converted into the frequency domain (step S210-5), the conversion results are multiplied together (step S210-6), and the product is divided by the amplitude matrix of the frequency-domain conversion result (F_LP2) of the captured image, thereby generating a first result matrix (step S210-7). The first result matrix can be understood as indicating to what extent, and at what scale, the noise matrix is contained within the captured image; in other words, it indicates the similarity between the captured image and the noise matrix image.
Next, a first reference image is generated by converting the first result matrix into the spatial domain (step S210-8), and the first similarity is calculated from it (step S210-9). The first similarity is the maximum among the pixel values of the first reference image; the rotation angle is the X coordinate of the pixel holding that maximum, and the scale is its Y coordinate (step S210-9).
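The multiply-then-divide-by-amplitude processing of steps S210-6 and S210-7, followed by the peak search, is in essence a normalized cross-power-spectrum (phase) correlation. A sketch on plain translated images (rather than log-polar spectra) shows the peak-location principle:

```python
import numpy as np

def phase_correlate(a: np.ndarray, b: np.ndarray):
    """Multiply one spectrum by the conjugate of the other, normalise
    by the magnitude, transform back, and take the peak. The peak's
    value is the similarity and its coordinates the offset."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12          # avoid division by zero
    ref = np.real(np.fft.ifft2(cross))      # the "reference image"
    peak = np.unravel_index(np.argmax(ref), ref.shape)
    return float(ref.max()), peak           # similarity, (y, x)

a = np.random.default_rng(0).random((64, 64))
b = np.roll(a, (5, 9), axis=(0, 1))         # shift a by (5, 9)
sim, (dy, dx) = phase_correlate(b, a)
```

Applied to log-polar spectra instead of raw images, the same peak coordinates yield the rotation angle and the scale, as the surrounding text describes.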
Referring again to fig. 19: if the first similarity does not reach the standard value, the image is regarded as one in which no watermark data is embedded, and the process returns to the initial operation (step S220). If the first similarity is equal to or greater than the standard value, the captured image is preprocessed using the obtained rotation angle and scale (step S230). The result can be understood with reference to fig. 22: the preprocessing normalizes the orientation of the captured image so that it can be compared with the noise matrix.
Referring again to fig. 19, the adjusted captured image is processed in the frequency domain to determine the watermark extraction reference point. First, the preprocessed captured image (the modified image) and the noise matrix image are processed in the frequency domain, a second result matrix indicating the processing result is converted into the spatial domain, and a second reference image is generated (step S240).

The generation of the second reference image (step S240) is described with reference to fig. 23. It is similar to the generation of the first reference image, except that no log-polar conversion is performed: the modified image is converted into the frequency domain (f2) (step S240-1), the frequency-domain matrix (f1) of the noise matrix is queried or generated (step S240-2), a matrix R' is generated by multiplying f1 and f2 (step S240-3), R' is divided by the magnitude matrix of f2 to form the second result matrix (step S240-4), and the second reference image is generated by converting the second result matrix into the spatial domain (step S240-5). The second result matrix can be understood as indicating to what extent the noise matrix is contained in the modified image.

By analyzing the second reference image, the similarity between the modified image and the noise matrix, and the position of the noise matrix within the modified image, can be obtained. The largest pixel value of the second reference image is the second similarity, and the coordinates of the pixel holding it are the reference point, i.e., the watermark data extraction position (step S250). If the second similarity does not reach the standard, the watermark data is regarded as absent, and the process returns to the initial extraction operation (step S260).
Next, the watermark data is extracted relative to the watermark data extraction reference point (step S270), as described with reference to fig. 24. First, the modified image is monochromatized (step S270-1) using the same three constants α, β, γ that were used to insert the noise into the original image; as a result, the representation of the noise within the modified image is maximized.

The monochromatized modified image is then averaged in 2×2 pixel units, the average is compared with a standard value, and all pixels of each 2×2 unit are unified to white or black (step S270-2). Next, with reference to the noise matrix image modularized in 4×4 pixel units, the watermark data is extracted while the monochromatized modified image is traversed block by 4×4 block (step S270-3). Extracting 2 bits per block is the reverse of the embedding process described with reference to figs. 14 to 16, and the blocks are traversed in the same order as during embedding.
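The reverse of the per-block embedding can be sketched as follows, using a hypothetical bit-to-group mapping (the actual mapping is given by the conversion reference of fig. 15):

```python
import numpy as np

# Hypothetical (row, col) origins of the four 2x2 pixel groups, indexed
# by the 2-bit value they encode; must match the mapping used to embed.
QUADRANT = [(0, 0), (0, 2), (2, 0), (2, 2)]

def extract_2bits(block: np.ndarray) -> int:
    """Recover 2 bits from a 4x4 block: the block colour is the
    majority colour, and the position of the minority 2x2 group
    encodes the bits (inverse of the embedding step)."""
    majority = 255 if block.mean() >= 128 else 0
    for bits, (gy, gx) in enumerate(QUADRANT):
        if block[gy, gx] != majority:
            return bits
    return 0  # no flipped group found

blk = np.full((4, 4), 255, dtype=np.uint8)
blk[0:2, 2:4] = 0  # a white block carrying the value '01'
```

Scanning the blocks in the embedding traversal order and concatenating the recovered bit pairs reconstructs the watermark data.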
The entire watermark data is generated by concatenating (concatenating) the extracted data.
The methods according to the embodiments of the present invention described so far may be performed by executing a computer program embodied in computer-readable code. The computer program may be transmitted from a first computer device to a second computer device via a network such as the Internet and installed on the second computer device, making it usable there. The first and second computer devices each include server apparatuses, physical servers belonging to a server pool for cloud services, and stationary computer devices such as desktop computers.

The computer program may be stored in a recording medium such as a DVD-ROM or a flash memory device.
As shown in fig. 25, the watermark data embedding and extraction methods according to the embodiments of the present invention may be implemented in a form in which the noise differs depending on the viewing direction.
Hereinafter, the structure and operation of a data embedding apparatus according to another embodiment of the present invention will be described with reference to fig. 26. As shown in fig. 26, the data embedding apparatus 300 according to the present embodiment includes a memory (RAM, Random Access Memory) 330, a processor 320 that executes the data embedding software operation loaded in the memory 330, and a memory 340. In an embodiment, the data embedding apparatus 300 may further include an image sensor 310, a network interface 370, a system bus 350, and a printer interface 360.
The memory 340 stores an execution file (binary file) 342 of the data embedding software, which embodies in software form the data embedding method described with reference to figs. 1 to 18. The execution file 342 may be loaded into the memory 330 via the bus 350; the data embedding software operation 332 loaded in the memory 330 is illustrated in fig. 26.

The memory 340 may also store noise matrix data 341. In one embodiment, the memory 340 stores the noise matrix data 341 in encrypted form, preventing leakage of the noise matrix master to the outside. In another embodiment, the noise matrix data is included as a resource within the data embedding software execution file, likewise preventing leakage of the noise matrix master to the outside.
The data embedding apparatus 300 receives the watermark data from an external device via the network interface 370, receives it directly from the user via an input unit (not shown) such as a keyboard or mouse, or reads watermark data (not shown) stored in the memory 340. The data embedding apparatus 300 acquires the original image, into which the watermark data is to be embedded, from the image sensor 310 photographing the target object, receives it from an external device via the network interface 370, or reads an original image (not shown) stored in the memory 340.

The data embedding apparatus 300 may transmit the result image computed from the watermark data and the original image to an external device via the network interface 370, display it on a display device (not shown), or print it through a printing device connected via the printer interface 360.
The data embedding apparatus 300 according to the present embodiment may be, for example, a computer device or a printing device. When the data embedding apparatus 300 is a printing device, it may include structural elements for performing the printing function in place of the printer interface 360.
Hereinafter, the structure and operation of a data decoding apparatus according to another embodiment of the present invention will be described with reference to fig. 27. In this specification, decoding of watermark data and extraction of watermark data refer to substantially the same operation. As shown in fig. 27, the data decoding apparatus 400 according to the present embodiment includes a memory (RAM) 430, a processor 420 that executes the data decoding software operation loaded in the memory 430, and a memory 440. In an embodiment, the decoding apparatus 400 may further include an image sensor 410, a network interface 470, a system bus 450, and a watermark data parsing process 460.

The memory 440 stores an execution file (binary file) 442 of the data decoding software, which embodies in software form the data extraction method described with reference to figs. 19 to 24. The execution file 442 may be loaded into the memory 430 via the bus 450; the data decoding software operation 432 loaded in the memory 430 is illustrated in fig. 27.
The memory 440 may also store noise matrix data 441. In one embodiment, the noise matrix data 441 is stored in encrypted form or included as a resource within the data decoding software execution file, to prevent leakage of the noise matrix master to the outside.
The data decoding apparatus 400 may acquire the data of the captured image from the image sensor 410 photographing a target object printed with the watermark-embedded image, receive the captured image from an external device through the network interface 470, or read it from the memory 440. The received captured-image data is loaded into the memory 430 through the system bus 450 and referenced while the data decoding software operation 432 is performed.

The data decoding apparatus 400 may display the watermark extracted as a result of executing the data decoding software operation 432 on a display (not shown), transmit it to an external device through the network interface 470, or store it in the memory 440.
In an embodiment, the data decoding apparatus 400 may include a watermark data parsing process 460 for performing processing that uses the extracted watermark data. For example, a content request containing the extracted watermark data (see table 1) and control parameters (see table 1) may be transmitted to a content server (not shown) via the network interface 470, and a display device (not shown) of the data decoding apparatus 400 may display the content received from the content server in response to the request.
The data decoding apparatus 400 according to the present embodiment may be a portable terminal provided with, for example, a camera.
While the embodiments of the present invention have been described above with reference to the drawings, those skilled in the art will understand that the present invention may be embodied in other specific forms without changing its technical spirit or essential features. Accordingly, the embodiments described above are to be understood as illustrative in all respects and not restrictive.

Claims (20)

1. A data embedding method for embedding watermark data in an original image, characterized in that,
comprising the following steps:
a step of forming a noise matrix image in block units having m×m pixels;
unifying pixel values of all pixels belonging to each block into one of a first color or a second color;
converting the noise matrix image by traversing each block and modifying pixel values of some pixels of each block to reflect the watermark data; and
and adjusting the original image by using the converted noise matrix image.
2. The method for embedding data according to claim 1, wherein,
The noise matrix image is represented in the frequency domain with a plurality of amplitude peaks and a random phase,
the plurality of amplitude peaks being respectively formed at specified coordinates.
3. The method for embedding data according to claim 2, wherein,
the plurality of amplitude peaks consists of a specified number of amplitude peaks.
4. The method for embedding data according to claim 2, wherein,
the frequency domain is established by performing a discrete fourier transform on the noise matrix image,
each of the amplitude peaks included in the plurality of amplitude peaks is represented by a first frequency of a horizontal axis and a second frequency of a vertical axis, and the first frequency and the second frequency correspond to one unit frequency according to a frequency dispersion.
5. The method for embedding data according to claim 2, wherein,
the amplitude value of the amplitude peak is a first value,
the noise matrix image has a magnitude spectrum, and magnitude values of all regions of the magnitude spectrum in the frequency domain except for the magnitude peak have a second value different from the first value.
6. The method for embedding data as claimed in claim 5, wherein,
The first value is a maximum amplitude value, and the second value is a minimum amplitude value.
7. The method for embedding data according to claim 2, wherein,
the random phase is generated using a fixed key value.
8. The method for embedding data according to claim 1, wherein,
the step of adjusting the original image using the converted noise matrix image includes: a step of connecting a plurality of the converted noise matrix images in at least one of a horizontal direction and a vertical direction based on the size of the original image.
9. The method for embedding data according to claim 1, wherein,
the step of adjusting the original image using the converted noise matrix image includes: a step of adjusting the first pixel for the case that the R channel value, the G channel value and the B channel value of the first pixel of the original image are all over a threshold value or are all below a threshold value,
the step of adjusting the current pixel of the original image comprises: a step of performing a first adjustment on the first pixel in the case where the second pixel of the converted noise matrix image is white, performing a second adjustment on the first pixel in the case where the second pixel is black, with a rule different from the first adjustment,
The first pixel and the second pixel correspond to each other with an intra-image position reference.
10. The method for embedding data according to claim 9, wherein,
the first adjustment decreases the R-channel value of the first pixel, increases the G-channel value, decreases the B-channel value,
the second adjustment increases the R-channel value of the first pixel, decreases the G-channel value, and increases the B-channel value.
11. A data extraction method that extracts watermark data from a captured image in which the watermark data is embedded, comprising:
a step of generating a second result matrix indicating whether or not the noise matrix is included in the captured image by performing a frequency domain processing for comparing the captured image with the noise matrix;
a step of generating a second reference image by converting the second result matrix into a spatial domain;
a step of determining a reference point as an extraction position of the watermark data in the captured image by using the second reference image; and
the step of extracting the watermark data from the reference point of the captured image by referring to the noise matrix image includes:
A step of forming the noise matrix image in block units having m×m pixels;
unifying pixel values of all pixels belonging to each block into one of a first color or a second color;
extracting pixel values of some pixels of each block by traversing each block while reflecting the watermark data; and
the watermark data is constructed by concatenating the extracted pixel values.
12. The method for extracting data according to claim 11, wherein,
before the step of generating the second result matrix, the method further comprises the step of generating a modified image from the captured image,
the step of generating the second result matrix includes a step of generating the second result matrix indicating whether the noise matrix is included in the modified image by performing a process of comparing the modified image with the noise matrix in a frequency domain,
the step of determining a reference point as an extraction position of the watermark data includes the step of determining the reference point in the modified image,
the step of extracting the watermark data includes the step of extracting the watermark data from the reference point of the modified image,
the step of generating a modified image from the captured image includes:
calculating a scale and a rotation angle of the noise matrix image with respect to the captured image by performing a log-polar conversion on the noise matrix image and the captured image; and
and generating the modified image from the captured image by reflecting the scale and the rotation angle.
13. The method for extracting data according to claim 12, wherein,
the step of calculating the scale and rotation angle of the noise matrix image with respect to the captured image includes:
converting the photographed image into a frequency domain;
generating a 2D-magnitude spectrum image of the captured image converted into the frequency domain;
inquiring or generating the 2D amplitude spectrum image of the noise matrix;
generating a result image by performing a log-polar conversion on the 2D magnitude spectrum image of the noise matrix;
generating a result image by performing a log-polar conversion on the 2D magnitude spectrum image of the captured image;
generating a frequency matrix by converting a result image of the 2D-magnitude spectrum image of the noise matrix into a frequency domain;
generating a frequency matrix by converting a result image of the 2D-magnitude spectrum image of the photographed image into a frequency domain;
A step of generating a first result matrix by multiplying a frequency matrix of the 2D spectrum image of the noise matrix and a frequency matrix of the 2D spectrum image of the captured image, and then dividing the multiplied result matrix by an amplitude matrix of the frequency matrix of the 2D spectrum image of the captured image;
a step of generating a first reference image by converting the first result matrix into a spatial domain; and
and calculating a scale and a rotation angle of the noise matrix image with respect to the captured image by using the first reference image.
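The procedure of claims 13 and 14 is, in effect, a log-polar (Fourier-Mellin style) correlation of the two magnitude spectra. A minimal NumPy sketch follows; the grid sizes, the epsilon guard, the complex conjugate in the correlation, and the log base are assumptions filled in around the claim language:

```python
import numpy as np

def log_polar(mag, n_theta=180, n_rho=180):
    # Resample a centered 2D magnitude spectrum onto a (log-rho, theta)
    # grid: a rotation of the input becomes a shift along the theta axis
    # and a scale change becomes a shift along the log-rho axis.
    h, w = mag.shape
    cy, cx = h / 2.0, w / 2.0
    max_rho = min(cy, cx)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = max_rho ** (np.arange(n_rho) / n_rho)      # log-spaced radii
    r, t = np.meshgrid(rhos, thetas, indexing="ij")   # rows: rho, cols: theta
    ys = np.clip(np.round(cy + r * np.sin(t)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + r * np.cos(t)).astype(int), 0, w - 1)
    return mag[ys, xs]

def rotation_and_scale(noise_img, captured_img):
    # Magnitude spectra are translation-invariant, so correlating them in
    # log-polar coordinates recovers rotation (X) and scale (Y).
    m1 = np.abs(np.fft.fftshift(np.fft.fft2(noise_img)))
    m2 = np.abs(np.fft.fftshift(np.fft.fft2(captured_img)))
    F1 = np.fft.fft2(log_polar(m1))
    F2 = np.fft.fft2(log_polar(m2))
    # Multiply the frequency matrices and divide by the amplitude matrix
    # of the captured image's spectrum (conjugate assumed for correlation).
    result = F1 * np.conj(F2) / (np.abs(F2) + 1e-12)
    ref = np.real(np.fft.ifft2(result))               # first reference image
    ri, ti = np.unravel_index(np.argmax(ref), ref.shape)
    n_rho, n_theta = ref.shape
    if ri > n_rho // 2:
        ri -= n_rho                                   # wrap negative shifts
    angle = 180.0 * ti / n_theta                      # X coordinate -> rotation
    scale = (min(captured_img.shape) / 2.0) ** (ri / n_rho)  # Y -> scale
    return angle, scale, ref
```

When the captured image is the noise matrix image itself, the peak lands at the origin, giving a rotation of 0 degrees and a scale of 1.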
14. The method for extracting data according to claim 13, wherein,
the step of calculating the scale and rotation angle of the noise matrix image with respect to the captured image using the first reference image includes:
a step of selecting a reference pixel having the maximum value among a plurality of pixel values of the first reference image;
a step of determining an X-coordinate of the reference pixel as the rotation angle; and
a step of determining a Y-coordinate of the reference pixel as the scale.
15. The method for extracting data according to claim 13, wherein,
the step of calculating the scale and rotation angle of the noise matrix image with respect to the captured image using the first reference image includes:
a step of determining the maximum value among a plurality of pixel values of the first reference image as a first similarity; and
a step of stopping the data extraction when the first similarity is lower than a threshold value.
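The early-exit test of claim 15 is a one-liner; the threshold value below is an illustrative assumption, since the claim does not fix one:

```python
import numpy as np

def first_similarity_check(reference_image, threshold=0.05):
    # The peak value of the first reference image serves as a similarity
    # score; extraction is aborted when the peak falls below the
    # threshold (the threshold value here is an illustrative assumption).
    similarity = float(np.max(reference_image))
    if similarity < threshold:
        return None          # abort the data extraction step
    return similarity
```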
16. The method for extracting data according to claim 13, wherein,
the step of converting the captured image into a frequency domain includes:
a step of maximizing the expression of the noise included in the captured image by converting the captured image to monochrome; and
a step of converting the monochrome captured image into a frequency domain.
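The monochrome-then-transform step of claim 16 can be sketched as below; the luma weights are a common convention, assumed here since the claim does not specify how the monochromatization is performed:

```python
import numpy as np

def to_frequency_domain(rgb_image):
    # Collapse the color channels with a luma-style weighting (an assumed
    # monochromatization) before the 2D Fourier transform; a single
    # combined channel expresses the embedded noise more strongly than
    # any one color plane.
    gray = (0.299 * rgb_image[..., 0]
            + 0.587 * rgb_image[..., 1]
            + 0.114 * rgb_image[..., 2])
    return np.fft.fft2(gray)
```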
17. The method for extracting data according to claim 11, wherein,
the method further comprises, before the step of generating the second result matrix, a step of generating a modified image from the captured image and converting the modified image to monochrome to maximize the expression of the noise inherent in the modified image,
the step of generating the two-dimensional second result matrix includes a step of generating the second result matrix indicating whether the monochrome modified image includes the noise matrix by performing a process of comparing a frequency domain of the monochrome modified image with the noise matrix,
the step of determining the extraction position of the watermark data includes a step of determining a reference point as the extraction position of the watermark data on the monochrome modified image,
the step of extracting the watermark data includes a step of extracting the watermark data from the monochrome modified image.
18. The method for extracting data according to claim 17, wherein,
the step of generating the two-dimensional second result matrix indicating whether the monochrome modified image includes the noise matrix includes:
a step of generating or retrieving a two-dimensional first matrix representing the noise matrix in the frequency domain;
a step of generating a two-dimensional second matrix representing the monochrome modified image in the frequency domain; and
a step of generating the second result matrix by multiplying the first matrix by the second matrix and dividing the product by an amplitude matrix of the modified image in the frequency domain.
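The comparison of claim 18 can be sketched as a normalized frequency-domain correlation; the complex conjugate and the epsilon guard are assumed details the claim leaves open:

```python
import numpy as np

def second_result_matrix(noise_matrix, mono_modified):
    # Compare the monochrome modified image with the noise matrix in the
    # frequency domain: multiply the two frequency matrices and divide by
    # the amplitude matrix of the image's spectrum. The inverse transform
    # then peaks where the noise pattern sits inside the image.
    F_noise = np.fft.fft2(noise_matrix)        # two-dimensional first matrix
    F_image = np.fft.fft2(mono_modified)       # two-dimensional second matrix
    product = F_noise * np.conj(F_image)
    result = product / (np.abs(F_image) + 1e-12)
    return np.real(np.fft.ifft2(result))
```

If the image is the noise pattern circularly shifted by (dy, dx), the peak of the result encodes that shift (negated, modulo the image size), which is what makes the reference point recoverable.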
19. The method for extracting data according to claim 17, wherein,
the step of extracting the watermark data from the monochrome modified image includes:
a step of partitioning the monochrome modified image into blocks each formed of m×m pixels, the partitioning being performed with reference to the reference point;
a step of unifying, for each block of the monochrome modified image, all pixels belonging to a partial block formed of n×n pixels into one of a first color and a second color, where n=m/2;
a step of traversing each block of the monochrome modified image with reference to a noise matrix partitioned into blocks formed of m×m pixels, and extracting pixel values of some pixels of each block; and
a step of forming the watermark data by connecting the extracted pixel values.
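The block traversal of claim 19 can be sketched as below. Reading each block's top-left pixel is an assumption; the claim only says "some pixels of each block", and the color-unification step is omitted for brevity:

```python
import numpy as np

def extract_bits(mono_image, ref_y, ref_x, blocks_y, blocks_x, m=8):
    # Walk the monochrome image in m x m blocks starting at the reference
    # point and read one representative pixel per block as a watermark
    # bit, concatenating the bits into the watermark data.
    bits = []
    for by in range(blocks_y):
        for bx in range(blocks_x):
            y = ref_y + by * m
            x = ref_x + bx * m
            bits.append(1 if mono_image[y, x] else 0)
    return bits
```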
20. The data extraction method according to claim 11, wherein the noise matrix image is represented by a two-dimensional magnitude spectrum having amplitude peaks at specific coordinates in a frequency domain and a random phase.
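A noise matrix of the kind claim 20 describes can be sketched directly from its definition; the peak amplitude, the seed, and keeping the real part of the inverse transform are illustrative assumptions:

```python
import numpy as np

def make_noise_matrix(size, peak_coords, peak_amplitude=1000.0, seed=0):
    # Build a flat magnitude spectrum with amplitude peaks at the chosen
    # frequency coordinates and combine it with a random phase. The
    # random phase is not Hermitian-symmetric, so the real part of the
    # inverse transform is kept as the spatial noise pattern.
    rng = np.random.default_rng(seed)
    magnitude = np.ones((size, size))
    for (u, v) in peak_coords:
        magnitude[u, v] = peak_amplitude
    phase = rng.uniform(-np.pi, np.pi, (size, size))
    spectrum = magnitude * np.exp(1j * phase)
    return np.real(np.fft.ifft2(spectrum))
```

The result looks like noise in the spatial domain, but its spectrum retains strong peaks at the chosen coordinates, which is the signature the extraction side looks for.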
CN201810487176.2A 2017-05-19 2018-05-21 Watermark data embedding and extracting method Active CN108961138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310298120.3A CN116485620A (en) 2017-05-19 2018-05-21 Watermark data embedding and extracting method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0062500 2017-05-19
KR1020170062500A KR101877372B1 (en) 2017-05-19 2017-05-19 Method for embedding and extraction of watermarking data

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310298120.3A Division CN116485620A (en) 2017-05-19 2018-05-21 Watermark data embedding and extracting method

Publications (2)

Publication Number Publication Date
CN108961138A CN108961138A (en) 2018-12-07
CN108961138B true CN108961138B (en) 2023-05-26

Family

ID=62913648

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810487176.2A Active CN108961138B (en) 2017-05-19 2018-05-21 Watermark data embedding and extracting method
CN202310298120.3A Pending CN116485620A (en) 2017-05-19 2018-05-21 Watermark data embedding and extracting method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310298120.3A Pending CN116485620A (en) 2017-05-19 2018-05-21 Watermark data embedding and extracting method

Country Status (2)

Country Link
KR (1) KR101877372B1 (en)
CN (2) CN108961138B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102035796B1 (en) * 2018-07-26 2019-10-24 주식회사 딥핑소스 Method, system and non-transitory computer-readable recording medium for processing data to be anonymized
KR102137229B1 (en) * 2018-11-05 2020-07-23 엘엔피랩 주식회사 Electronic device for extracting fingerprinting signal from image and operating method thereof
CN109685708B (en) * 2018-12-26 2024-02-06 珠海奔图电子有限公司 Image processing method and device, electronic equipment and computer readable storage medium
US10621378B1 (en) * 2019-10-24 2020-04-14 Deeping Source Inc. Method for learning and testing user learning network to be used for recognizing obfuscated data created by concealing original data to protect personal information and learning device and testing device using the same
KR102347288B1 (en) 2020-03-23 2022-01-05 스냅태그 주식회사 Method for embedding and extraction of watermarking data
KR102435876B1 (en) 2020-03-23 2022-08-24 스냅태그 주식회사 Method for embedding and extraction of watermarking data
KR102417767B1 (en) 2020-06-12 2022-07-06 스냅태그 주식회사 Method for embedding watermarking data on video content and extracting the same therefrom
KR102579261B1 (en) 2021-01-06 2023-09-15 스냅태그 주식회사 Method for embedding and extraction of watermarking data
KR102292595B1 (en) * 2021-05-14 2021-08-24 (주)대원씨앤씨 Apparatus for recognizing watermark using image integrating QR code and watermark and method thereof
CN114760390A (en) * 2022-04-01 2022-07-15 南通启锦智能科技有限公司 Image encryption transmission method and decryption method


Family Cites Families (15)

Publication number Priority date Publication date Assignee Title
FR2792797B1 (en) * 1999-04-23 2001-07-13 Thomson Csf IMAGE TATTOO PROCESS
US6282300B1 (en) * 2000-01-21 2001-08-28 Signafy, Inc. Rotation, scale, and translation resilient public watermarking for images using a log-polar fourier transform
JP2002084413A (en) * 2000-06-28 2002-03-22 Oki Electric Ind Co Ltd Image processing method, image processing unit, image coder, and image decoder
US6940993B2 (en) * 2000-12-13 2005-09-06 Eastman Kodak Company System and method for embedding a watermark signal that contains message data in a digital image
US7058201B2 (en) * 2001-03-28 2006-06-06 Lg Electronics Inc. Method of embedding watermark into digital image
KR100374665B1 (en) * 2001-04-13 2003-03-04 주식회사 마크애니 Method of inserting/detecting digital watermarks and apparatus for using thereof
US7607016B2 (en) * 2001-04-20 2009-10-20 Digimarc Corporation Including a metric in a digital watermark for media authentication
KR100378222B1 (en) * 2001-04-21 2003-03-29 주식회사 마크애니 Method of inserting/detecting digital watermarks and apparatus for using thereof
JP2003143388A (en) * 2001-10-31 2003-05-16 Toppan Printing Co Ltd Image processing method and apparatus, and printed matter
TW545061B (en) * 2002-01-25 2003-08-01 Univ Nat Central Method for embedding and extracting of watermark in image compression system, and the embedded hardware structure
KR100457249B1 (en) * 2002-10-14 2004-11-16 (주)디지탈이노텍 Geometirc transform resistant image watermarking scheme with side information
JP4193665B2 (en) * 2003-03-05 2008-12-10 株式会社日立製作所 Digital watermarking method for binary images
US7587062B2 (en) * 2003-05-09 2009-09-08 Avid Technology, Inc. Watermarking
JP2006186626A (en) * 2004-12-27 2006-07-13 Sanyo Electric Co Ltd Device and method for embedding watermark and for extracting it
US9418396B2 (en) * 2015-01-15 2016-08-16 Gopro, Inc. Watermarking digital images to increase bit depth

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002232698A (en) * 2000-11-30 2002-08-16 Kowa Co Method and device for embedding and extracting electronic watermark
WO2005006641A1 (en) * 2003-06-13 2005-01-20 International Business Machines Corporation Applying image watermarks to previously halftoned images
CN101082981A (en) * 2007-05-22 2007-12-05 中山大学 Watermark embeding and extracting method of binary image
CN102495833A (en) * 2011-11-30 2012-06-13 哈尔滨工业大学 Document watermark copyright information protection device based on Opentype vector outline fonts
CN103295184A (en) * 2013-05-13 2013-09-11 西安电子科技大学 Frequency domain and airspace combined halftone watermarking method

Non-Patent Citations (2)

Title
Research on zero-watermarking algorithms for digital images; Cheng Sha; China Master's Theses Full-text Database; 2012-09-15 (No. 09); full text *
Research on the application of digital watermarking in certificate anti-counterfeiting; Cao Kuikui; China Master's Theses Full-text Database; 2012-10-15 (No. 10); full text *

Also Published As

Publication number Publication date
CN116485620A (en) 2023-07-25
CN108961138A (en) 2018-12-07
KR101877372B1 (en) 2018-07-13

Similar Documents

Publication Publication Date Title
CN108961138B (en) Watermark data embedding and extracting method
Fang et al. Screen-shooting resilient watermarking
US11070700B2 (en) Method of extracting watermark data in an image with reference to a noise-based image located in the image based on a comparison process performed in the frequency domain
Piva An overview on image forensics
US9684941B2 (en) Determining pose for use with digital watermarking, fingerprinting and augmented reality
KR101960290B1 (en) Method for embedding and extraction of watermarking data
Mahmood et al. A survey on block based copy move image forgery detection techniques
US9159112B2 (en) Digital watermarking using saturation patterns
CN110766594A (en) Information hiding method and device, detection method and device and anti-counterfeiting tracing method
EP2518685A1 (en) Method and apparatus for embedding a watermark image in a host image
CN108596817B (en) Watermark data embedding and extracting method
CN113469869B (en) Image management method and device
CN112070648A (en) Watermark embedding method, watermark extracting method, watermark embedding device, watermark extracting device and electronic equipment
KR102564641B1 (en) Method for embedding and extraction of watermarking data
CN113763224A (en) Image processing method and device
CN109657487B (en) Image processing method, image verification method and device
KR20230011460A (en) Method for embedding and extraction of watermarking data
CN105427234A (en) Sharing method and sharing system for WIFI password
Thongkor et al. Robust image watermarking for camera-captured image using image registration technique
JP4958868B2 (en) Document feature extraction apparatus and method
CN116502660B (en) Digital resource application method and system based on two-dimension code
KR20220099304A (en) Method for embedding and extraction of watermarking data
Dobre et al. Active image authentication and forgery localization for mobile devices
Gourrame et al. Fourier Image Watermarking: Print-Cam Application. Electronics 2022, 11, 266
CN113160357A (en) Information auditing method, system and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230510

Address after: Gyeonggi Do, South Korea

Applicant after: Snapshots Label Co.,Ltd.

Applicant after: HAROO Co.

Address before: Seoul, South Korea

Applicant before: DAEWON CANDC

Applicant before: HAROO Co.

GR01 Patent grant