CN109447957B - Image copy-paste detection method based on keypoint transfer matching


Info

Publication number
CN109447957B
Authority
CN
China
Prior art keywords
point
image
matching
harris
laplace
Prior art date
Legal status
Active
Application number
CN201811198057.1A
Other languages
Chinese (zh)
Other versions
CN109447957A (en)
Inventor
蔺聪
林韩辉
Current Assignee
Guangdong University of Business Studies
Original Assignee
Guangdong University of Business Studies
Priority date
Filing date
Publication date
Application filed by Guangdong University of Business Studies filed Critical Guangdong University of Business Studies
Priority to CN201811198057.1A
Publication of CN109447957A
Application granted
Publication of CN109447957B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/24: Aligning, centring, orientation detection or correction of the image
    • G06V 10/245: Aligning, centring, orientation detection or correction of the image by locating a pattern; special marks for positioning
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10004: Still image; photographic image
    • G06T 2207/20164: Salient point detection; corner detection


Abstract

The invention discloses an image copy-paste detection method based on keypoint transfer matching. The Harris-Laplace keypoints used in the method are invariant to scale and rotation, and the LIOP feature offers better scale and rotation invariance than alternative features. Transfer matching, in turn, resolves the one-to-many situation in which part of a tampered region escapes detection for lack of a matching relation, improving tamper localization. The method effectively detects copy-paste operations in an image and is highly robust: it still localizes the tampered region accurately even when that region has undergone translation, scaling, rotation, JPEG compression, added white Gaussian noise, and the like.

Description

Image copy-paste detection method based on keypoint transfer matching
Technical Field
The invention relates to the technical field of digital image forensics, and in particular to an image copy-paste detection method based on keypoint transfer matching.
Background
With the development of the internet and smartphone technology, huge numbers of digital images circulate on the network. Because digital media editing tools have advanced as well, these images are very easy to tamper with, so that seeing is no longer necessarily believing. The demand from enterprises, public institutions and individuals for services that authenticate digital image data has therefore grown steadily, and digital image forensics technology has emerged to meet it.
Digital image forensics covers several directions, including image region copy-paste detection and image splicing detection. Region copy-paste means copying part of an image and pasting it onto another part of the same image. The copying process often involves scaling, rotation, noise addition and similar processing, so the traces cannot be distinguished by eye. Our goal is to detect copy-paste operations and mark the tampered regions without access to the original image.
Existing detection techniques are generally divided into block-based and keypoint-based methods. Block-based methods are now rarely used because of their long computation time and low robustness; keypoint-based methods are the current mainstream.
Disclosure of Invention
To overcome the limitations of existing detection techniques, the invention provides a novel image copy-paste detection method based on keypoint transfer matching. The method effectively detects copy-paste operations in an image and accurately localizes the tampered region even when that region has undergone translation, scaling, rotation, JPEG compression, added white Gaussian noise, and the like.
To solve the above technical problem, the technical scheme of the invention is as follows:
An image copy-paste detection method based on keypoint transfer matching comprises the following steps:
S1: for the image to be detected, extracting Harris-Laplace keypoints from the image;
S2: constructing the LIOP descriptor of the Harris-Laplace keypoints;
S3: storing the Harris-Laplace keypoints of S1 in a Kd-Tree and then obtaining an initial matching relation of the Harris-Laplace keypoints with the g2NN matching strategy;
S4: dividing the image to be detected into a number of meaningful image blocks and counting the number N_H of matched Harris-Laplace keypoints between any two image blocks;
S5: propagating the matching relation of the Harris-Laplace keypoints among the image blocks that have matching relations, to obtain new matching relations of the Harris-Laplace keypoints; the transfer rule is

(K_1, K_2), (K_1, K_3) ⇒ (K_2, K_3)

where K_1, K_2, K_3 are all Harris-Laplace keypoints and (K_1, K_2), (K_1, K_3) are known matching relations;
S6: after transfer matching, updating N_H and judging accordingly: if N_H is greater than O, the two image blocks are considered matched, the two image blocks and their corresponding matched Harris-Laplace keypoints are kept, and S7 is executed; if N_H is not greater than O, the corresponding matching point pairs are discarded and the image blocks are considered unmatched; O is a preset value;
S7: randomly selecting three non-collinear matching point pairs from any two image blocks with a matching relation and estimating the affine transformation matrix between the two blocks; multiplying the start point of each matching pair by the affine matrix and comparing the result with the pair's end point; if the error between them is less than β, the pair is defined as a qualifying matching pair; after iterating a preset number of times N_it, selecting as the final affine transformation matrix the one with the largest number of qualifying pairs and, among those, the smallest sum of errors; β and N_it are preset values;
S8: applying the final affine transformation matrix to the image to be detected to obtain the transformed image; computing the correlation coefficient between the transformed image and the image to be detected at corresponding positions and binarizing it; if the binarized correlation coefficient of a point is greater than ζ, the point is defined as suspicious and set to 1 at the corresponding position of the binary map; otherwise it is set to 0; applying morphological operations to the final binary map to obtain the final detection result; ζ is a preset value.
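To make the S4 and S6 bookkeeping concrete, a small sketch follows; it is an illustration rather than the patent's code, and block_of, which maps a keypoint index to its image-block label, is an assumed input produced by the block division of S4.

# Hedged sketch of the N_H bookkeeping of S4/S6: count matched keypoint
# pairs per pair of image blocks and keep block pairs with N_H > O.
# `block_of` (keypoint index -> block label) is an assumed input.
from collections import Counter

def matched_block_pairs(matches, block_of, O=3):
    n_h = Counter()
    for k1, k2 in matches:
        b1, b2 = block_of[k1], block_of[k2]
        if b1 != b2:  # only matches crossing two different blocks count
            n_h[frozenset((b1, b2))] += 1
    return {pair for pair, count in n_h.items() if count > O}

# Four cross-block matches between blocks 'A' and 'B' give N_H = 4 > 3.
print(matched_block_pairs([(0, 2), (0, 3), (1, 2), (1, 3)],
                          {0: 'A', 1: 'A', 2: 'B', 3: 'B'}))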
In a preferred embodiment, the S1 includes the following steps:
S1.1: constructing the Harris scale space over a number of scales to obtain the corners at each scale, forming an initial point set;
S1.2: selecting the qualified corner points in the initial point set as the Harris-Laplace keypoints.
In a preferred embodiment, S1.1 includes the following:
In the corner extraction, the autocorrelation matrix M at a point x is obtained:

M = σ_D² · g(σ_I) ⊛ [ I_x²(x, σ_D)     I_x I_y(x, σ_D)
                      I_x I_y(x, σ_D)   I_y²(x, σ_D) ]

where σ_I is the integration scale, σ_D the differentiation scale, g(σ_I) a Gaussian convolution kernel, and I_x, I_y the partial derivatives in the x and y directions;

M can further be written as

M = [ A  C
      C  B ]

where A = σ_D² · g(σ_I) ⊛ I_x², B = σ_D² · g(σ_I) ⊛ I_y², and C = σ_D² · g(σ_I) ⊛ I_x I_y.
Formation rule of the initial point set:

R = det(M) − α·trace²(M)

First, the corner response R of point x is compared with the corner responses of the 8 surrounding points and must exceed all of them; second, R must be greater than a given threshold T_R (here T_R = 20). If both conditions hold, the point x is judged to be a corner; otherwise it is not. α is a preset value in the range [0.04, 0.06]; here α = 0.04. det(M) is the determinant of the matrix M, computed as det(M) = AB − C²; trace(M) is the trace of the matrix M, computed as trace(M) = A + B.
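For concreteness, a minimal sketch of this response computation follows; it is an illustration under stated assumptions, not the patent's reference code: the image is a grayscale float array, scipy's Gaussian derivative filters stand in for g(σ_I) and the partial derivatives, and the useful magnitude of T_R depends on how the intensities are scaled.

import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def harris_response(img, sigma_i=1.0, sigma_d=0.7, alpha=0.04):
    # Partial derivatives I_x, I_y taken at the differentiation scale.
    Ix = gaussian_filter(img, sigma_d, order=(0, 1))
    Iy = gaussian_filter(img, sigma_d, order=(1, 0))
    # Entries A, B, C of M: products smoothed by the integration kernel
    # g(sigma_I) and weighted by sigma_D^2, as in the formula above.
    w = sigma_d ** 2
    A = w * gaussian_filter(Ix * Ix, sigma_i)
    B = w * gaussian_filter(Iy * Iy, sigma_i)
    C = w * gaussian_filter(Ix * Iy, sigma_i)
    # R = det(M) - alpha * trace(M)^2, det(M) = AB - C^2, trace(M) = A + B.
    return A * B - C * C - alpha * (A + B) ** 2

def initial_points(R, t_r=20.0):
    # Keep points that beat their 8 neighbors and exceed the threshold T_R.
    local_max = R == maximum_filter(R, size=3)
    return np.argwhere(local_max & (R > t_r))  # (row, col) coordinates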
in a preferred embodiment, the conditions in S1.2 include the following:
For a point x in the initial point set, verify whether it is a maximum of the scale space; if so, retain the point, otherwise discard it. Writing the σ-normalized Laplacian-of-Gaussian response as LoG(x, σ_n) = σ_n² |L_xx(x, σ_n) + L_yy(x, σ_n)|, the point detected at scale σ_n is retained only if

LoG(x, σ_n) > LoG(x, σ_{n−1}) and LoG(x, σ_n) > LoG(x, σ_{n+1}).
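A sketch of this verification under that reconstruction follows; it recomputes the full response at each scale for clarity, where a real implementation would cache the per-scale responses.

import numpy as np
from scipy.ndimage import gaussian_filter

def log_response(img, sigma):
    # sigma^2-normalized Laplacian of Gaussian |L_xx + L_yy| at one scale.
    Lxx = gaussian_filter(img, sigma, order=(0, 2))
    Lyy = gaussian_filter(img, sigma, order=(2, 0))
    return (sigma ** 2) * np.abs(Lxx + Lyy)

def is_scale_maximum(img, row, col, scales, n):
    # The corner found at scales[n] survives only if its LoG response
    # exceeds the responses at the two neighboring scales.
    here = log_response(img, scales[n])[row, col]
    return (here > log_response(img, scales[n - 1])[row, col] and
            here > log_response(img, scales[n + 1])[row, col])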
in a preferred embodiment, the S2 includes the following steps:
S2.1: dividing the neighborhood of the Harris-Laplace keypoint into a number of small blocks;
S2.2: for a point y in a small block, taking N points around y and computing the LIOP feature description of y, expressed by the following formula:

LIOP(y) = Φ(γ(P(y)))

where P(y) = (I(y_1), I(y_2), ..., I(y_N)) ∈ P^N and I(y_i) is the pixel value of the i-th neighboring point y_i of y; γ(·) gives the permutation that orders the N-dimensional vector, of which there are N! possibilities; Φ(·) encodes each possibility as an N!-dimensional vector: if the permutation corresponds to the Q-th possibility, the Q-th position of the vector is 1 and the remaining positions are 0, Q being a positive integer between 1 and N!;
S2.3: multiplying the LIOP feature descriptions of all the small blocks of S2.1 by the corresponding weight functions to construct the LIOP descriptor.
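The permutation coding Φ(γ(·)) can be made concrete with a small sketch for a single pixel; neighbor sampling and the weighting of S2.3 are left out, and the ascending-intensity ordering used for γ is one standard reading of the LIOP construction, not a quotation of the patent.

import math
from itertools import permutations

def liop_code(neighbor_values):
    n = len(neighbor_values)
    # gamma: the permutation that sorts the N neighboring intensities.
    perm = tuple(sorted(range(n), key=lambda i: neighbor_values[i]))
    # Phi: one-hot encoding of this permutation among all N! possibilities.
    q = list(permutations(range(n))).index(perm)
    code = [0] * math.factorial(n)
    code[q] = 1
    return code

# With N = 4 neighbor intensities, the code is a one-hot vector of length 24.
print(liop_code([0.31, 0.12, 0.90, 0.47]))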
In a preferred embodiment, the g2NN matching strategy of S3 includes the following:
searching the Kd-Tree for the R Harris-Laplace keypoints whose features are closest to the feature of a specified Harris-Laplace keypoint, R being a positive integer; defining the distance between the specified keypoint's feature and that of its nearest neighbor as d_1, the distance to the feature of the second-nearest neighbor as d_2, and so on up to d_R; then computing d_i/d_{i+1} for i in [1, R−1]. If for some i = j we have d_j/d_{j+1} < T_d, the keypoint corresponding to d_j and the keypoints before it are all considered matches of the specified keypoint, forming the initial matching relation between Harris-Laplace keypoints; j ranges over [1, R−1] and T_d is a preset value.
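A compact sketch of this search over scipy's Kd-Tree follows, assuming one LIOP descriptor per row; since the query descriptor is itself in the tree, the first returned neighbor is skipped.

import numpy as np
from scipy.spatial import cKDTree

def g2nn_matches(descriptors, r=10, t_d=0.5):
    tree = cKDTree(descriptors)
    matches = []
    for i, desc in enumerate(descriptors):
        dist, idx = tree.query(desc, k=r + 1)   # nearest hit is desc itself
        dist, idx = dist[1:], idx[1:]           # d_1 ... d_R and their indices
        for j in range(len(dist) - 1):          # ratios d_i / d_{i+1}
            if (np.isfinite(dist[j + 1]) and dist[j + 1] > 0
                    and dist[j] / dist[j + 1] < t_d):
                # d_1 ... d_{j+1} (0-based idx[0..j]) all become matches.
                matches.extend((i, int(idx[m])) for m in range(j + 1))
                break
    return matches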
In a preferred embodiment, said S5 includes the following contents:
If several image blocks have matching relations, matches are propagated according to the existing relations using the transfer rule

(K_1, K_2), (K_1, K_3) ⇒ (K_2, K_3).

As shown in FIG. 5, regions Ω_1 and Ω_2 share the matching pair (a_1, b_1), regions Ω_1 and Ω_3 share (a_1, c_1), and regions Ω_2 and Ω_3 share (b_3, c_3), so the three regions contain mutually matched point pairs. The keypoint pairs among the three regions are therefore transfer-matched: applying the transfer rule to the known matches (a_1, b_1) and (a_1, c_1) yields the new match (b_1, c_1), indicated by a dotted line in the figure. Other new matches are obtained in the same way.
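Applied exhaustively, the transfer rule matches every pair of keypoints joined by a chain of known matches; in other words it closes each connected component of the match graph. A union-find sketch of that closure, illustrative rather than the patent's code:

def transfer_matches(matches):
    # Union-find over keypoints: exhaustive application of the rule
    # (K1,K2),(K1,K3) => (K2,K3) matches all keypoints in a component.
    parent = {}
    def find(k):
        parent.setdefault(k, k)
        while parent[k] != k:
            parent[k] = parent[parent[k]]  # path halving
            k = parent[k]
        return k
    for k1, k2 in matches:
        parent[find(k1)] = find(k2)
    groups = {}
    for k in list(parent):
        groups.setdefault(find(k), []).append(k)
    return [(a, b) for g in groups.values()
            for i, a in enumerate(g) for b in g[i + 1:]]

# The known matches (a1,b1) and (a1,c1) also yield the new match (b1,c1).
print(transfer_matches([("a1", "b1"), ("a1", "c1")]))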
In a preferred embodiment, said S7 includes the following contents:
For a matching point pair, define X = (p, q) and X̂ = (p̂, q̂). The affine transformation relation is expressed by the following formula:

[ p̂ ]   [ a  b  t_x ] [ p ]
[ q̂ ] = [ c  d  t_y ] [ q ]
[ 1 ]    [ 0  0   1  ] [ 1 ]

where a, b, c, d, t_x, t_y are undetermined coefficients and T is the affine transformation matrix; substituting three non-collinear matching pairs into the formula yields T.

The error is expressed by the following formula:

T = argmin_T Σ_i ‖X̂_i − T·X_i‖²

where (X_i, X̂_i) is a matching point pair, written in homogeneous coordinates; the formula selects the affine transformation matrix T corresponding to the minimum total error.
In the preferred embodiment, each iteration produces an affine transformation matrix T; the start point of each matching pair is transformed by T, the result is compared with the pair's end point, and the error is computed. If the error is less than β, the pair is retained. These steps are iterated many times, and the matrix with the most retained pairs and the smallest sum of errors is finally taken as the desired matrix.
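A minimal random-sampling sketch of this estimation follows, assuming matches arrive as ((p, q), (p̂, q̂)) coordinate pairs; β = 0.05 and N_it = 2000 follow the embodiment, and the scale of β follows whatever units the coordinates use. Keeping both the inlier count and the error sum mirrors the selection criterion stated in S7.

import numpy as np

def fit_affine(src3, dst3):
    # Solve (p_hat, q_hat) = A (p, q) + t from three correspondences.
    S = np.hstack([np.asarray(src3, float), np.ones((3, 1))])   # 3 x 3
    P = np.linalg.solve(S, np.asarray(dst3, float))             # 3 x 2
    T = np.eye(3)
    T[:2, :2] = P[:2].T    # [[a, b], [c, d]]
    T[:2, 2] = P[2]        # [t_x, t_y]
    return T

def ransac_affine(matches, beta=0.05, n_it=2000, seed=0):
    rng = np.random.default_rng(seed)
    src = np.array([m[0] for m in matches], float)
    dst = np.array([m[1] for m in matches], float)
    src_h = np.hstack([src, np.ones((len(src), 1))])
    best_count, best_err, best_T = 0, np.inf, None
    for _ in range(n_it):
        pick = rng.choice(len(matches), 3, replace=False)
        S = np.hstack([src[pick], np.ones((3, 1))])
        if abs(np.linalg.det(S)) < 1e-9:
            continue                       # collinear sample, resample
        T = fit_affine(src[pick], dst[pick])
        err = np.linalg.norm(src_h @ T[:2].T - dst, axis=1)
        inl = err < beta                   # qualifying matching pairs
        count, total = int(inl.sum()), float(err[inl].sum())
        if count > best_count or (count == best_count and total < best_err):
            best_count, best_err, best_T = count, total, T
    return best_T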
In a preferred embodiment, the correlation coefficient of S8 is expressed by the following formula:
c(x) = Σ_{μ∈Ω(x)} (I(μ) − Ī)(F(μ) − F̄) / √( Σ_{μ∈Ω(x)} (I(μ) − Ī)² · Σ_{μ∈Ω(x)} (F(μ) − F̄)² )

where c(x) denotes the correlation coefficient, Ω(x) is a region centered on x, I(μ) is the pixel value at the position of point μ in the original image to be detected, F(μ) is the value obtained by multiplying the position of point μ in the original image by the corresponding affine transformation matrix, and Ī and F̄ are the average pixel values of I and F over the region centered on x.
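A windowed sketch of this map, assuming I and F are aligned grayscale float arrays and Ω(x) is a w × w window (the embodiment uses 5 × 5); the ζ thresholding of S8 is included for completeness.

import numpy as np
from scipy.ndimage import uniform_filter

def correlation_map(I, F, w=5):
    # Windowed means, variances and covariance over Omega(x).
    mi, mf = uniform_filter(I, w), uniform_filter(F, w)
    cov = uniform_filter(I * F, w) - mi * mf
    vi = uniform_filter(I * I, w) - mi ** 2
    vf = uniform_filter(F * F, w) - mf ** 2
    return cov / np.sqrt(np.clip(vi * vf, 1e-12, None))

def suspicious_mask(I, F, zeta=0.5, w=5):
    # Points with c(x) > zeta are set to 1 in the binary map.
    return (correlation_map(I, F, w) > zeta).astype(np.uint8)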
In a preferred embodiment, O = 3.
In a preferred embodiment, β = 0.05.
In a preferred embodiment, N_it = 2000.
In a preferred embodiment, ζ = 0.5.
In a preferred embodiment, T_R = 20.
In a preferred embodiment, α = 0.04.
In a preferred embodiment, N = 4.
In a preferred embodiment, R = 10.
In a preferred embodiment, T_d = 0.5.
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
the method extracts Harris-Laplace key points and LIOP characteristic vectors, and compared with the traditional method, the robustness of the method is better; meanwhile, due to the adoption of the transmitted matching, the matching effect of key points is improved, and convenience is provided for estimating affine transformation. Experimental results show that the method has good effects in the aspects of geometric transformation such as translation, scale and rotation, JPEG compression, Gaussian white noise addition and other post-processing.
Drawings
FIG. 1 is a flow chart of an embodiment.
Fig. 2 is an image to be detected in the embodiment.
Fig. 3 shows the corresponding actual tampered area in the embodiment.
FIG. 4 is a diagram illustrating the actual detection effect of the embodiment.
FIG. 5 is an example diagram of transfer matching in the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
As shown in FIG. 1, an image copy-paste detection method based on keypoint transfer matching comprises the following steps:
S1: for the image to be detected, Harris-Laplace keypoints are extracted as follows:
S1.1: constructing the Harris scale space over a number of scales to obtain the corners at each scale, forming an initial point set;
First, the autocorrelation matrix M at a point x is obtained:

M = σ_D² · g(σ_I) ⊛ [ I_x²(x, σ_D)     I_x I_y(x, σ_D)
                      I_x I_y(x, σ_D)   I_y²(x, σ_D) ]

where σ_I is the integration scale, σ_D the differentiation scale, g(σ_I) a Gaussian convolution kernel, and I_x, I_y the partial derivatives in the x and y directions;

M can further be written as

M = [ A  C
      C  B ]

where A = σ_D² · g(σ_I) ⊛ I_x², B = σ_D² · g(σ_I) ⊛ I_y², and C = σ_D² · g(σ_I) ⊛ I_x I_y.
Formation rule of the initial point set:

R = det(M) − α·trace²(M)

First, the corner response R of point x is compared with the corner responses of the 8 surrounding points and must exceed all of them; second, R must be greater than 20. If both conditions hold, the point x is judged to be a corner; otherwise it is not. α is a preset value; here α = 0.04. det(M) is the determinant of the matrix M, computed as det(M) = AB − C²; trace(M) is the trace of the matrix M, computed as trace(M) = A + B.
S1.2: selecting the qualified corner points in the initial point set to form the Harris-Laplace keypoints, the condition being that the point is a maximum of the scale space:

with LoG(x, σ_n) = σ_n² |L_xx(x, σ_n) + L_yy(x, σ_n)|, the point detected at scale σ_n is kept only if LoG(x, σ_n) > LoG(x, σ_{n−1}) and LoG(x, σ_n) > LoG(x, σ_{n+1});

if the condition holds, the point is a Harris-Laplace keypoint; otherwise the point is discarded.
S2: for Harris-Laplace key points, a LIOP descriptor is constructed,
S2.1: dividing the neighborhood of the Harris-Laplace keypoint into a number of small blocks;
S2.2: for a point y in a small block, taking N = 4 points around y to obtain the LIOP feature description of y, expressed by the following formula:

LIOP(y) = Φ(γ(P(y)))

where P(y) = (I(y_1), I(y_2), I(y_3), I(y_4)) ∈ P^4 and I(y_i) is the pixel value of the i-th neighboring point y_i of y; γ gives the permutation that orders the 4-dimensional vector, of which there are 4! possibilities; Φ encodes each of these possibilities as a 4!-dimensional vector: if the permutation corresponds to the Q-th possibility, the Q-th position of the vector is 1 and the remaining positions are 0, Q being a positive integer between 1 and 4!;
S2.3: multiplying the LIOP feature descriptions of all the small blocks of S2.1 by the corresponding weight functions to construct the LIOP descriptor.
S3: searching the Kd-Tree for the 10 Harris-Laplace keypoints whose features are closest to the feature of a specified Harris-Laplace keypoint; defining the distance between the specified keypoint's feature and that of its nearest neighbor as d_1, the distance to the feature of the second-nearest neighbor as d_2, and so on up to d_10; then computing d_i/d_{i+1} for i = 1, 2, ..., 9. If for some i = j we have d_j/d_{j+1} < 0.5, the keypoint corresponding to d_j and the keypoints before it are all considered matches of the specified keypoint, forming the initial matching relation between Harris-Laplace keypoints; j ranges over [1, 9].
S4: dividing the image to be detected into a number of meaningful image blocks and counting the number N_H of matched Harris-Laplace keypoints between any two image blocks.
S5: propagating the matching relation of the Harris-Laplace keypoints among the image blocks that have matching relations, to obtain new matching relations of the Harris-Laplace keypoints; the transfer rule is

(K_1, K_2), (K_1, K_3) ⇒ (K_2, K_3)

where K_1, K_2, K_3 are all Harris-Laplace keypoints and (K_1, K_2), (K_1, K_3) are known matching relations;
S6: after transfer matching, updating N_H and judging accordingly: if N_H is greater than 3, the two image blocks are considered matched, the two image blocks and their corresponding matched Harris-Laplace keypoints are kept, and S7 is executed; if N_H is not greater than 3, the corresponding matching point pairs are discarded and the image blocks are considered unmatched;
S7: randomly taking three non-collinear matching point pairs from any two image blocks with a matching relation and estimating the affine transformation matrix between the two blocks; multiplying the start point of each matching pair by the affine matrix and comparing the result with the pair's end point; if the error between them is less than the preset value 0.05, the pair is defined as a qualifying matching pair; after 2000 iterations, selecting as the final affine transformation matrix the one with the largest number of qualifying pairs and the smallest sum of their errors;
For a matching point pair, define X = (p, q) and X̂ = (p̂, q̂). The affine transformation relation is expressed by the following formula:

[ p̂ ]   [ a  b  t_x ] [ p ]
[ q̂ ] = [ c  d  t_y ] [ q ]
[ 1 ]    [ 0  0   1  ] [ 1 ]

where a, b, c, d, t_x, t_y are undetermined coefficients and T is the affine transformation matrix; substituting three non-collinear matching pairs into the formula yields T.

The error is expressed by the following formula:

T = argmin_T Σ_i ‖X̂_i − T·X_i‖²

where (X_i, X̂_i) is a matching point pair, written in homogeneous coordinates; the formula selects the affine transformation matrix T corresponding to the minimum total error.
S8: applying the final affine transformation matrix to the image to be detected to obtain the transformed image; computing the correlation coefficient between the transformed image and the image to be detected at corresponding positions and binarizing it; if the binarized correlation coefficient of a point is greater than 0.5, the point is defined as suspicious and set to 1 at the corresponding position of the binary map; otherwise it is set to 0; applying morphological operations to the final binary map to obtain the final detection result;
wherein the correlation coefficient is expressed by the following formula:
c(x) = Σ_{μ∈Ω(x)} (I(μ) − Ī)(F(μ) − F̄) / √( Σ_{μ∈Ω(x)} (I(μ) − Ī)² · Σ_{μ∈Ω(x)} (F(μ) − F̄)² )

where c(x) denotes the correlation coefficient, Ω(x) is a 5 × 5 region centered on x, I(μ) is the pixel value at the position of point μ in the original image to be detected, F(μ) is the value obtained by multiplying the position of point μ in the original image by the corresponding affine transformation matrix, and Ī and F̄ are the average pixel values of I and F over the 5 × 5 region centered on x.
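The text does not spell out which morphological operation closes S8; a plausible minimal clean-up, assuming an opening to drop isolated pixels followed by a closing to fill small holes, would be:

import numpy as np
from scipy.ndimage import binary_opening, binary_closing

def postprocess(mask, size=3):
    # Assumed clean-up: opening removes speckle, closing fills small gaps.
    se = np.ones((size, size), bool)
    return binary_closing(binary_opening(mask, se), se).astype(np.uint8)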
The effect of the embodiment is as follows: FIG. 2 is the image to be detected, FIG. 3 the corresponding actual tampered area, and FIG. 4 the actual detection result. As FIGS. 2-4 show, the embodiment identifies the actual tampered area precisely.
The terms describing positional relationships in the drawings are for illustrative purposes only and are not to be construed as limiting the patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. An image copy-paste detection method based on keypoint transfer matching, characterized by comprising the following steps:
S1: for the image to be detected, extracting Harris-Laplace keypoints from the image;
S2: constructing the LIOP descriptor of the Harris-Laplace keypoints;
S3: storing the Harris-Laplace keypoints of S1 in a Kd-Tree and then obtaining an initial matching relation of the Harris-Laplace keypoints with the g2NN matching strategy;
S4: dividing the image to be detected into a number of image blocks and counting the number N_H of matched Harris-Laplace keypoints between any two image blocks;
S5: propagating the matching relation of the Harris-Laplace keypoints among the image blocks that have matching relations, to obtain new matching relations of the Harris-Laplace keypoints; the transfer rule is

(K_1, K_2), (K_1, K_3) ⇒ (K_2, K_3)

where K_1, K_2, K_3 are all Harris-Laplace keypoints and (K_1, K_2), (K_1, K_3) are known matching relations;
S6: after transfer matching, updating N_H and judging accordingly: if N_H is greater than O, the two image blocks are considered matched, the two image blocks and their corresponding matched Harris-Laplace keypoints are kept, and S7 is executed; if N_H is not greater than O, the corresponding matching point pairs are discarded and the image blocks are considered unmatched; O is a preset value;
S7: randomly selecting three non-collinear matching point pairs from any two image blocks with a matching relation and estimating the affine transformation matrix between the two blocks; multiplying the start point of each matching pair by the affine matrix and comparing the result with the pair's end point; if the error between them is less than β, the pair is defined as a qualifying matching pair; after iterating a preset number of times N_it, selecting as the final affine transformation matrix the one with the largest number of qualifying pairs and, among those, the smallest sum of errors; β and N_it are preset values;
S8: applying the final affine transformation matrix to the image to be detected to obtain the transformed image; computing the correlation coefficient between the transformed image and the image to be detected at corresponding positions and binarizing it; if the binarized correlation coefficient of a point is greater than ζ, the point is defined as suspicious and set to 1 at the corresponding position of the binary map; otherwise it is set to 0; applying morphological operations to the final binary map to obtain the final detection result; ζ is a preset value.
2. The image copy-paste detection method according to claim 1, wherein said S1 includes the following steps:
s1.1: constructing a plurality of scale spaces of Harris to obtain angular points of each scale, thereby forming an initial point set;
s1.2: and selecting qualified corner points in the initial point set to become Harris-Laplace key points.
3. The image copy-paste detection method according to claim 2, wherein S1.1 comprises the following contents:
In the corner extraction, the autocorrelation matrix M at a point x is first obtained:

M = σ_D² · g(σ_I) ⊛ [ I_x²(x, σ_D)     I_x I_y(x, σ_D)
                      I_x I_y(x, σ_D)   I_y²(x, σ_D) ]

where σ_I is the integration scale, σ_D the differentiation scale, g(σ_I) a Gaussian convolution kernel, and I_x, I_y the partial derivatives in the x and y directions;

M can further be written as

M = [ A  C
      C  B ]

where A = σ_D² · g(σ_I) ⊛ I_x², B = σ_D² · g(σ_I) ⊛ I_y², and C = σ_D² · g(σ_I) ⊛ I_x I_y;
Formation rule of initial point set:
R = det(M) − α·trace²(M)

where R is the corner response function; whether x is a corner is decided by the value of R, and two conditions must be satisfied simultaneously:

first, the corner response of the point x is compared with the corner responses of the 8 surrounding points and must be greater than all of them;

second, R must be greater than a given threshold T_R.

If both conditions are satisfied, the point x is judged to be a corner; if not, the point x is not a corner; α is a preset value with range [0.04, 0.06];

where det(M) is the determinant of the matrix M, computed as det(M) = AB − C²; and trace(M) is the trace of the matrix M, computed as trace(M) = A + B.
4. The image copy-paste detection method of claim 3, wherein the conditions in S1.2 include the following:
verifying whether a point x in the initial point set is a maximum of the scale space; if so, the point is retained, otherwise it is discarded.
5. The image copy-paste detection method according to any one of claims 1 to 4, wherein said S2 comprises the following steps:
S2.1: dividing the neighborhood of the Harris-Laplace keypoint into a number of small blocks;
S2.2: for a point y in a small block, taking N points around y and computing the LIOP feature description of y, expressed by the following formula:

LIOP(y) = Φ(γ(P(y)))

where P(y) = (I(y_1), I(y_2), ..., I(y_N)) ∈ P^N and I(y_i) is the pixel value of the i-th neighboring point y_i of y; γ(·) gives the permutation that orders the N-dimensional vector, of which there are N! possibilities; Φ(·) encodes each possibility as an N!-dimensional vector: if the permutation corresponds to the Q-th possibility, the Q-th position of the vector is 1 and the remaining positions are 0, Q being a positive integer between 1 and N!;
S2.3: multiplying the LIOP feature descriptions of all the small blocks of S2.1 by the corresponding weight functions to construct the LIOP descriptor.
6. The image copy-paste detection method according to claim 5, wherein the g2NN matching strategy of S3 comprises the following:
searching the Kd-Tree for the R Harris-Laplace keypoints whose features are closest to the feature of a specified Harris-Laplace keypoint, R being a positive integer; defining the distance between the specified keypoint's feature and that of its nearest neighbor as d_1, the distance to the feature of the second-nearest neighbor as d_2, and so on up to d_R; then computing d_i/d_{i+1} for i in [1, R−1]; if for some i = j we have d_j/d_{j+1} < T_d, the keypoint corresponding to d_j and the keypoints before it are all considered matches of the specified keypoint, forming the initial matching relation between Harris-Laplace keypoints; j ranges over [1, R−1] and T_d is a preset value.
7. The image copy-paste detection method according to claim 6, wherein said S7 includes the following contents:
For a matching point pair, define X = (p, q) and X̂ = (p̂, q̂). The affine transformation relation is expressed by the following formula:

[ p̂ ]   [ a  b  t_x ] [ p ]
[ q̂ ] = [ c  d  t_y ] [ q ]
[ 1 ]    [ 0  0   1  ] [ 1 ]

where a, b, c, d, t_x, t_y are undetermined coefficients and T is the affine transformation matrix; substituting three non-collinear matching pairs into the formula yields T;

the error is expressed by the following formula:

T = argmin_T Σ_i ‖X̂_i − T·X_i‖²

where (X_i, X̂_i) is a matching point pair, written in homogeneous coordinates; the formula represents the affine transformation T corresponding to the minimum sum of errors.
8. The image copy-paste detection method according to claim 1, 2, 3, 4, 6 or 7, wherein the correlation coefficient of S8 is expressed by the following formula:

c(x) = Σ_{μ∈Ω(x)} (I(μ) − Ī)(F(μ) − F̄) / √( Σ_{μ∈Ω(x)} (I(μ) − Ī)² · Σ_{μ∈Ω(x)} (F(μ) − F̄)² )

where c(x) denotes the correlation coefficient, Ω(x) is a region centered on x, I(μ) is the pixel value at the position of point μ in the original image to be detected, F(μ) is the value obtained by multiplying the position of point μ in the original image by the corresponding affine transformation matrix, and Ī and F̄ are the average pixel values of I and F over the region centered on x.
9. The image copy-paste detection method according to claim 8, wherein O = 3.
10. The image copy-paste detection method according to claim 1, 2, 3, 4, 6, 7 or 9, wherein ζ = 0.5.
CN201811198057.1A (filed 2018-10-15) Image copy-paste detection method based on keypoint transfer matching, granted as CN109447957B, Active

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201811198057.1A | 2018-10-15 | 2018-10-15 | Image copy-paste detection method based on keypoint transfer matching

Publications (2)

Publication Number | Publication Date
CN109447957A (en) | 2019-03-08
CN109447957B (en) | 2020-11-10

Family

ID=65545417

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201811198057.1A (Active) | Image copy-paste detection method based on keypoint transfer matching | 2018-10-15 | 2018-10-15

Country Status (1)

Country Link
CN (1) CN109447957B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111008955B * | 2019-11-06 | 2023-05-26 | Chongqing University of Posts and Telecommunications | Fast copy-paste tamper detection method based on multi-scale image block matching

Citations (2)

Publication number | Priority date | Publication date | Assignee | Title
CN106056122A * | 2016-05-26 | 2016-10-26 | Sun Yat-sen University | KAZE feature point-based image region copy-paste tampering detection method
CN108335290A * | 2018-01-23 | 2018-07-27 | Sun Yat-sen University | Image region copy-move tampering detection method based on LIOP features and block matching

Family Cites Families (1)

Publication number | Priority date | Publication date | Assignee | Title
KR20140102038A (en) | 2013-02-13 | 2014-08-21 | Samsung Electronics Co., Ltd. | Video matching device and video matching method


Non-Patent Citations (1)

Title
Flip-resistant image copy-paste tampering detection algorithm; 张艳华 et al.; Radio Communications Technology; 2015-12-31; vol. 41, no. 3; pp. 34-37 *

Also Published As

Publication number Publication date
CN109447957A (en) 2019-03-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant