CN110599478A - Image area copying and pasting tampering detection method - Google Patents

Image area copying and pasting tampering detection method

Info

Publication number
CN110599478A
Authority
CN
China
Prior art keywords
feature
calculating
point
points
liop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910871558.XA
Other languages
Chinese (zh)
Other versions
CN110599478B (en)
Inventor
卢伟 (Lu Wei)
吕启越 (Lyu Qiyue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN201910871558.XA priority Critical patent/CN110599478B/en
Publication of CN110599478A publication Critical patent/CN110599478A/en
Application granted granted Critical
Publication of CN110599478B publication Critical patent/CN110599478B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Abstract

The invention provides a method for detecting copy-paste tampering of an image region, comprising the following steps: extracting feature points from an image to be detected; computing the LIOP feature of each feature point from the intensity order of its local image block; processing the feature points with the Bowyer-Watson algorithm for Delaunay triangulation and computing a LIOP descriptor for each triangle; performing triangle matching and computing the neighborhoods of the matched triangles; keeping the feature points that lie in these triangle neighborhoods to form a feature point set; generating feature point matching pairs and clustering them into several classes; computing an affine matrix for each class; and computing a correlation-coefficient map between each region before and after transformation, according to the matching pairs and the corresponding affine matrix, to locate the tampered region. The proposed detection method is robust to rotation, scaling, JPEG compression, noise addition, and similar operations; it is fast and therefore more practical; the secondary matching yields more feature point matching pairs, which supports a more accurate affine transformation matrix, and the clustering of matching pairs can handle multiple-copy tampering.

Description

Image area copying and pasting tampering detection method
Technical Field
The invention relates to the field of digital image forensics, and in particular to a method for detecting copy-paste tampering of image regions.
Background
In recent years, with the rapid development of multimedia and computer network technology and the widespread use of smartphones and computers, people can easily edit and modify images with image editing software such as Photoshop and Meitu. If a digital image is maliciously tampered with and spread, it can harm social stability and harmony. Moreover, in settings such as scientific research, news publishing, and judicial proceedings, the integrity, reliability, and authenticity of images must be guaranteed.
Image region copy-paste tampering detection is an important branch of passive forensics in digital image forensics. It detects whether a copy-paste operation exists in an image, that is, whether one or more regions of the image have been copied and pasted to other areas of the same image; such tampering can hide important objects in the image. Before pasting, the copied region is often subjected to one or more operations such as scaling, rotation, compression, or noise addition, making a copy-pasted image difficult to judge by eye.
Existing image region copy-paste detection techniques are mainly block-based. Because of the overlapping block structure they handle noise addition well, but they have poor robustness to rotation, scaling, and similar transformations, and their computational cost is high.
Disclosure of Invention
The invention provides a method for detecting the copying and pasting tampering of an image area, aiming at overcoming the technical defects of poor robustness and high calculation cost of the existing block-based image area copying and pasting detection method.
In order to solve the technical problems, the technical scheme of the invention is as follows:
an image area copy-paste tamper detection method includes the following steps:
s1: preprocessing an image to be detected to obtain feature points and the corresponding local image blocks from which features are to be extracted;
s2: dividing the local image block into sub-regions according to the pixel intensity, and calculating the local intensity sequence characteristics of each pixel point to obtain LIOP characteristics of corresponding characteristic points;
s3: processing the feature points by utilizing a Delaunay triangulation Bowyer-Watson algorithm to generate a Delaunay triangulation network, and respectively calculating the average value of LIOP descriptors of three vertexes of each triangle as the feature vector of the corresponding triangle;
s4: performing triangle matching according to the feature vector of each triangle and calculating a matched triangle neighborhood;
s5: judging whether the corresponding feature points are located in the triangular neighborhood or not; if yes, go to step S6; otherwise, discarding the feature point and the corresponding LIOP feature;
s6: reserving the feature point and the corresponding LIOP feature to form a feature point set;
s7: performing feature point matching according to the feature point set to generate feature point matching pairs, and clustering the feature point matching pairs to obtain a plurality of categories;
s8: calculating an affine matrix of each category;
s9: calculating a correlation coefficient graph before and after the corresponding region is transformed according to the feature point matching pairs and the corresponding affine matrixes, and positioning the tampered region;
s10: judging whether all classes are calculated, if so, combining tampered areas of all classes as a detection result; otherwise, step S9 is executed.
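A minimal sketch of step S3's triangle feature vectors, assuming the triangulation has already been produced (for example by the Bowyer-Watson algorithm); the toy descriptors and triangle index list below are illustrative, not from the patent:

```python
import numpy as np

def triangle_descriptors(descriptors, triangles):
    """Average the LIOP descriptors of a triangle's three vertices (step S3).

    descriptors: (num_points, dim) array, one LIOP descriptor per feature point.
    triangles:   (num_triangles, 3) array of vertex indices, e.g. as produced
                 by a Delaunay triangulation.
    Returns a (num_triangles, dim) array of triangle feature vectors.
    """
    return descriptors[triangles].mean(axis=1)

# Toy example: 4 points with 2-D "descriptors", 2 triangles.
desc = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0], [3.0, 3.0]])
tris = np.array([[0, 1, 2], [1, 2, 3]])
print(triangle_descriptors(desc, tris))  # → [[1. 1.] [2. 2.]]
```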
Wherein, the step S1 specifically includes the following steps:
s11: carrying out Gaussian filtering on the images to be detected under the conditions of the same kernel size and different standard deviations to obtain different images;
s12: carrying out difference operations on adjacent filtered images to obtain a series of difference images, and marking the local extreme points in each difference image as feature points;
s13: removing noise by Gaussian filtering according to the obtained characteristic points, and normalizing the adjacent detection area into a circular area with a fixed diameter;
s14: removing the noise generated by the difference operation and the normalization processing by Gaussian smoothing with a different standard deviation, obtaining the local image block from which features are to be extracted.
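Steps S11-S14 follow a difference-of-Gaussians (DoG) keypoint pipeline. Below is a hedged NumPy sketch of S11-S12; the sigma values, kernel size, and contrast threshold are assumptions, since the patent does not fix them:

```python
import numpy as np

def gaussian_kernel(sigma, size=9):
    x = np.arange(size) - size // 2
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma, size=9):
    # Separable Gaussian filtering: same kernel size, varying sigma (S11).
    k = gaussian_kernel(sigma, size)
    pad = size // 2
    p = np.pad(img, pad, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, p)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)

def dog_keypoints(img, sigmas=(1.0, 1.6, 2.56), thresh=0.03):
    # Adjacent blurred images are differenced (S12); local extrema of each
    # difference image are marked as feature points.
    blurred = [blur(img, s) for s in sigmas]
    points = []
    for d in (blurred[i + 1] - blurred[i] for i in range(len(blurred) - 1)):
        for y in range(1, d.shape[0] - 1):
            for x in range(1, d.shape[1] - 1):
                win = d[y - 1:y + 2, x - 1:x + 2]
                v = d[y, x]
                if abs(v) > thresh and (v == win.max() or v == win.min()):
                    points.append((x, y))
    return points

img = np.zeros((32, 32))
img[14:19, 14:19] = 1.0          # a bright square blob around (16, 16)
pts = dog_keypoints(img)
print(len(pts) > 0)              # → True
```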
Wherein, the step S2 specifically includes the following steps:
s21: the local image block is divided into B sub-regions in ascending order of pixel intensity, each sub-region containing the same number of pixels;
s22: pixel sampling is carried out on each pixel point neighborhood of each subregion, and the local intensity sequence characteristic of each pixel point is calculated;
s23: adding all local intensity sequence characteristics to obtain a LIOP descriptor of the sub-region;
s24: and arranging LIOP descriptors of all the sub-areas in sequence to obtain LIOP characteristics of the corresponding characteristic points.
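The equal-population intensity-order division of step S21 can be sketched as follows (a hedged illustration; the embodiment later uses B = 6, a smaller B is shown here for brevity):

```python
import numpy as np

def intensity_order_bins(patch_values, B=6):
    """Assign each pixel of a local patch to one of B sub-regions by its rank
    in ascending intensity order (step S21), so every sub-region receives
    (approximately) the same number of pixels."""
    order = np.argsort(patch_values, kind='stable')     # ascending intensity
    bins = np.empty(len(patch_values), dtype=int)
    # The pixel at sorted position i goes to bin i * B // n.
    bins[order] = np.arange(len(patch_values)) * B // len(patch_values)
    return bins

vals = np.array([5.0, 1.0, 9.0, 3.0, 7.0, 2.0])
print(intensity_order_bins(vals, B=3))  # → [1 0 2 1 2 0]
```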
Wherein the local intensity sequence characteristics are defined as:
for the set of N-dimensional integer vectors P^N = {P = (p_1, p_2, ..., p_N) | p_j ∈ R} and the set Π^N of all permutations of the integers 1, 2, ..., N, define a mapping γ: P^N → Π^N that takes P ∈ P^N to π ∈ Π^N by sorting the N elements of P in descending order; the resulting permutation is π = (i_1, i_2, ..., i_N), where i_1 is the rank of p_1 in the descending order, and so on. Mathematically, γ is defined as:
γ(P) = π, P ∈ P^N, π ∈ Π^N
where π = (i_1, i_2, ..., i_N), P = (p_1, p_2, ..., p_N), and i_j is the rank of p_j in descending order, j = 1, 2, ..., N. The mapping γ divides the set P^N into N! partitions, and the N-dimensional vectors in each partition map to one fixed permutation, i.e. partitions and permutations are in one-to-one correspondence. All partitions of P^N can therefore be encoded: an index table is built over all permutations, and a feature mapping function φ is defined from it, which maps a permutation π to the vector V_{N!}^{Ind(π)}, an N!-dimensional feature vector. Mathematically, φ is defined as:
φ(π) = V_{N!}^{Ind(π)}
where Ind(π) is the index of the permutation π in the index table, and V_{N!}^{i} denotes the N!-dimensional vector whose i-th component is 1 and whose other components are 0.
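The mappings γ and φ can be sketched directly from these definitions (the ordering of the index table below is one arbitrary but fixed choice; the definitions only require that it be consistent):

```python
from itertools import permutations
import numpy as np

def gamma(P):
    """γ: map a vector to its descending-order rank permutation.
    i_j is the rank (1-based) of p_j when P is sorted in descending order."""
    order = sorted(range(len(P)), key=lambda j: -P[j])
    ranks = [0] * len(P)
    for rank, j in enumerate(order, start=1):
        ranks[j] = rank
    return tuple(ranks)

def make_index_table(N):
    # The index table over all N! permutations described in the text.
    return {perm: i for i, perm in enumerate(permutations(range(1, N + 1)))}

def phi(pi, index_table):
    """φ: one-hot N!-dimensional vector selected by the permutation's index."""
    v = np.zeros(len(index_table))
    v[index_table[pi]] = 1.0
    return v

table = make_index_table(3)
pi = gamma((0.2, 0.9, 0.5))        # descending order: p2, p3, p1
print(pi, phi(pi, table))          # → (3, 1, 2) [0. 0. 0. 0. 1. 0.]
```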
the LIOP characteristics are extracted by the following steps:
Let O be the center of the circular normalized patch, and sample neighborhood pixels around each pixel in the sub-region. For any pixel μ, establish a local rectangular coordinate system with μ as the origin, the direction of Oμ as the positive Y axis, and the positive X axis obtained by rotating the positive Y axis 90° clockwise. Draw a circle of radius r centered at μ; this is the neighborhood circle of pixel μ. Take the intersection of the neighborhood circle with the positive Y axis as the first sampling point and sample N points uniformly on the circle in the counter-clockwise direction, then compute the LIOP descriptor of pixel μ from the N sampled points. Let P(μ) be the N-dimensional vector formed by the N sampled values around μ; the local intensity order feature of μ is expressed as:
LIOP(μ) = φ(γ(P(μ)))
where P(μ) = (I(μ_1), I(μ_2), ..., I(μ_N)) ∈ P^N and μ_i is the i-th sampling point taken in order around μ. Summing the local intensity order features of all pixels in a sub-region gives the LIOP descriptor des_i of that sub-region, and the LIOP feature of the local image block is the concatenation of the sub-region descriptors in order, expressed as:
LIOP descriptor = (des_1, des_2, ..., des_B)
where des_i is computed over the i-th sub-region bin_i, B is the number of sub-regions, and the LIOP feature vector of a feature point has N! × B dimensions.
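Putting S21-S24 together, a compact LIOP sketch over a square patch might look as follows. This is a hedged simplification: small B and N, nearest-pixel sampling, and a fixed sampling axis instead of the rotation-invariant O-to-μ axis described above:

```python
import numpy as np
from itertools import permutations

def liop_descriptor(patch, B=2, N=3, r=1.0):
    """Minimal LIOP sketch over a square patch. Returns the N! * B
    descriptor: per-bin sums of one-hot permutation codes (S21-S24)."""
    table = {p: i for i, p in enumerate(permutations(range(1, N + 1)))}
    H, W = patch.shape
    ys, xs = np.mgrid[1:H - 1, 1:W - 1]
    pix = np.stack([ys.ravel(), xs.ravel()], axis=1)
    vals = patch[pix[:, 0], pix[:, 1]]
    # Intensity-order partition of the interior pixels into B bins (S21).
    order = np.argsort(vals, kind='stable')
    bins = np.empty(len(vals), dtype=int)
    bins[order] = np.arange(len(vals)) * B // len(vals)
    desc = np.zeros((B, len(table)))
    angles = 2 * np.pi * np.arange(N) / N
    for (y, x), b in zip(pix, bins):
        # N samples on a circle of radius r around the pixel (S22),
        # rounded to the nearest pixel for brevity.
        sy = np.clip(np.round(y + r * np.sin(angles)).astype(int), 0, H - 1)
        sx = np.clip(np.round(x + r * np.cos(angles)).astype(int), 0, W - 1)
        P = patch[sy, sx]
        ranks = np.empty(N, dtype=int)
        ranks[np.argsort(-P, kind='stable')] = np.arange(1, N + 1)
        desc[b, table[tuple(ranks)]] += 1   # accumulate one-hot codes (S23)
    return desc.ravel()                     # concatenate over bins (S24)

d = liop_descriptor(np.random.default_rng(0).random((8, 8)))
print(d.shape)  # → (12,)  i.e. 3! * 2 dimensions
```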
Wherein, the step S4 specifically includes the following steps:
s41: calculating the Euclidean distance between each triangle feature vector and all the others, and sorting the distances from smallest to largest;
s42: calculating the ratio of the nearest-neighbor distance d_1 to the second-nearest distance d_2; if the ratio is less than 0.6, matching the two features at distance d_1, then taking d_2 as the nearest neighbor and computing its ratio to the next distance d_3, comparing with the threshold in the same way, until the ratio exceeds 0.6;
s43: deleting repeated matching pairs of triangles and matching pairs with the distance between the centers of inscribed circles of the triangles smaller than 20;
s44: and calculating a triangle neighborhood near each matched triangle, and constructing a region set formed by adjacent square regions.
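The nearest-neighbor ratio test of S41-S42 can be sketched as below (the 0.6 threshold is the one stated for triangle features; the deduplication and inscribed-circle distance filter of S43 are omitted for brevity):

```python
import numpy as np

def ratio_test_matches(features, ratio=0.6, min_dist=1e-9):
    """For each feature, walk its sorted distance list and accept neighbors
    while the ratio d_k / d_{k+1} stays below the threshold (steps S41-S42)."""
    matches = []
    for i, f in enumerate(features):
        d = np.linalg.norm(features - f, axis=1)
        order = np.argsort(d)[1:]           # skip self (distance 0)
        for k in range(len(order) - 1):
            d1, d2 = d[order[k]], d[order[k + 1]]
            if d2 > min_dist and d1 / d2 < ratio:
                matches.append((i, int(order[k])))
            else:
                break                       # stop once the ratio exceeds it
    return matches

# Two well-separated pairs of similar features.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 0.0], [12.0, 0.0]])
print(ratio_test_matches(feats))  # → [(0, 1), (1, 0), (2, 3), (3, 2)]
```

Note that symmetric duplicates such as (0, 1) and (1, 0) are exactly what step S43 removes.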
In step S44, the process of constructing the square neighborhood of a triangle is specifically: select a triangle t_i, compute the center o_i and radius r_i of its inscribed circle, and take the square region centered at o_i with side length m × r_i as the square neighborhood of the triangle.
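The inscribed-circle computation behind the square neighborhood can be sketched as follows; the factor m is left unspecified by the patent, so m = 4 below is purely an illustrative assumption:

```python
import numpy as np

def incircle(a, b, c):
    """Incircle center and radius of triangle abc (used in steps S43-S44)."""
    a, b, c = map(np.asarray, (a, b, c))
    la = np.linalg.norm(b - c)   # side length opposite vertex a
    lb = np.linalg.norm(a - c)
    lc = np.linalg.norm(a - b)
    s = la + lb + lc
    center = (la * a + lb * b + lc * c) / s
    u, v = b - a, c - a
    area = abs(u[0] * v[1] - u[1] * v[0]) / 2
    return center, 2 * area / s

def square_neighborhood(tri, m=4.0):
    """Square of side m * r_i centered at the incircle center o_i;
    m = 4 is a hedged, illustrative choice."""
    o, r = incircle(*tri)
    half = m * r / 2
    return (o[0] - half, o[1] - half, o[0] + half, o[1] + half)

o, r = incircle((0, 0), (4, 0), (0, 3))   # 3-4-5 right triangle
print(o, r)  # incircle of a 3-4-5 triangle: center [1. 1.], radius 1.0
```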
Wherein the step S7 includes the steps of:
s71: calculating Euclidean distances between each feature point and all other feature points according to the feature point set, and sorting the Euclidean distances from small to large;
s72: calculating the ratio of the nearest-neighbor distance d_1 to the second-nearest distance d_2; if the ratio is less than 0.9, matching the two feature points at distance d_1, then taking d_2 as the nearest neighbor and computing its ratio to the next distance d_3, comparing with the threshold in the same way, until the ratio exceeds 0.9;
s73: deleting the repeated feature point matching and the matching pair with the coordinate distance smaller than 20 to complete the matching of the feature points;
s74: the characteristic point matching pair is subjected to unified angle adjustment, and the method specifically comprises the following steps:
for any matching pair {(x_1i, y_1i), (x_2i, y_2i)}, its difference vector is defined as:
d_i = (dx_i, dy_i) = (x_1i − x_2i, y_1i − y_2i)
The angle a_i of a matching pair is defined as the angle between the vector d_i and the positive x axis, in the range [−π, π]. For a matching pair {(x_1i, y_1i), (x_2i, y_2i)}, if its angle a_i < 0, the pair is reversed, i.e. {(x_1i, y_1i), (x_2i, y_2i)} becomes {(x_2i, y_2i), (x_1i, y_1i)};
S75: forming the feature vector of each matching pair from the two point coordinates and their differences, recorded as (x_1, y_1, x_2, y_2, x_1 − x_2, y_1 − y_2), and taking the feature vectors of all matching pairs as the basis for classification, performing DBSCAN clustering to obtain K classes.
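Steps S74-S75 can be sketched together: orient each pair by the sign of its difference-vector angle, build the 6-dimensional feature vectors, and cluster them. The tiny DBSCAN below is illustrative; eps and min_pts are assumptions (the patent does not fix them), and a library implementation such as sklearn.cluster.DBSCAN would normally be used:

```python
import numpy as np

def orient_pairs(pairs):
    # Step S74: if the difference vector's angle is negative, swap endpoints.
    out = []
    for p1, p2 in pairs:
        if np.arctan2(p1[1] - p2[1], p1[0] - p2[0]) < 0:
            p1, p2 = p2, p1
        out.append((p1, p2))
    return out

def pair_features(pairs):
    # Step S75: feature vector (x1, y1, x2, y2, x1-x2, y1-y2) per pair.
    return np.array([[x1, y1, x2, y2, x1 - x2, y1 - y2]
                     for (x1, y1), (x2, y2) in pairs], dtype=float)

def dbscan(X, eps, min_pts):
    # Minimal DBSCAN for illustration only.
    n = len(X)
    labels = np.full(n, -1)                  # -1 = noise / unvisited
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        seeds = list(np.flatnonzero(dist[i] <= eps))
        if len(seeds) < min_pts:
            continue                         # not a core point
        labels[i] = cluster
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
                nbrs = np.flatnonzero(dist[j] <= eps)
                if len(nbrs) >= min_pts:     # j is a core point: expand
                    seeds.extend(nbrs)
        cluster += 1
    return labels

# Three pairs sharing one displacement, two sharing another.
pairs = [((0, 0), (10, 0)), ((1, 0), (11, 0)), ((2, 1), (12, 1)),
         ((50, 50), (50, 70)), ((51, 50), (51, 70))]
labels = dbscan(pair_features(orient_pairs(pairs)), eps=3.0, min_pts=2)
print(labels)  # → [0 0 0 1 1]
```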
In step S8, the calculation process of the affine matrix of each category specifically includes:
for two matched feature points p_1 = (x_1, y_1) and p_1' = (x_2, y_2), the affine transformation relationship is expressed as:
(x_2, y_2)^T = [[a, b], [c, d]] (x_1, y_1)^T + (t_x, t_y)^T
where a, b, c, d, t_x, t_y are the coefficients to be determined; substituting three non-collinear matching pairs into the above formula yields the corresponding affine transformation matrix T. For the N_k points {p_i} of class k and their corresponding matching points {p_i'}, the affine matrix is computed as follows:
randomly select three non-collinear matching pairs and compute an affine matrix T_j, then transform the remaining matching points by T_j and compute the total mean-square error, with the formula:
S_MSE = Σ_i ||p_i' − T_j(p_i)||²
Iterate this step N_iter times, then select the T_j with the minimum S_MSE as the affine matrix T_k of the class.
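The class-wise affine estimation (randomly sample three non-collinear pairs, score by S_MSE, keep the best) can be sketched as follows; n_iter = 100 and the toy point set are assumed values:

```python
import numpy as np

def fit_affine(src, dst):
    """Solve the 6 affine coefficients from >= 3 point pairs (least squares)."""
    A, b = [], []
    for (x1, y1), (x2, y2) in zip(src, dst):
        A.append([x1, y1, 0, 0, 1, 0]); b.append(x2)
        A.append([0, 0, x1, y1, 0, 1]); b.append(y2)
    coef, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    a, bb, c, d, tx, ty = coef
    return np.array([[a, bb], [c, d]]), np.array([tx, ty])

def ransac_affine(src, dst, n_iter=100, rng=None):
    """Pick 3 non-collinear pairs at random, fit T_j, and keep the T_j with
    minimum total squared error S_MSE over n_iter iterations."""
    if rng is None:
        rng = np.random.default_rng(0)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best, best_err = None, np.inf
    for _ in range(n_iter):
        idx = rng.choice(len(src), 3, replace=False)
        a, b, c = src[idx]
        if abs((b - a)[0] * (c - a)[1] - (b - a)[1] * (c - a)[0]) < 1e-9:
            continue                        # collinear sample, skip
        M, t = fit_affine(src[idx], dst[idx])
        err = np.sum((src @ M.T + t - dst) ** 2)
        if err < best_err:
            best, best_err = (M, t), err
    return best

src = np.array([[0, 0], [1, 0], [0, 1], [2, 3], [5, 1], [3, 4]], float)
M_true, t_true = np.array([[2.0, 0.0], [0.0, 2.0]]), np.array([1.0, -1.0])
dst = src @ M_true.T + t_true               # a known 2x scaling + shift
M_est, t_est = ransac_affine(src, dst)
print(np.allclose(src @ M_est.T + t_est, dst))  # → True
```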
In step S9, the specific process of locating the tampered area is:
creating a correlation-coefficient image C_map of the same size as the image under test with all initial pixel values set to 0, and storing a copy of the image as F; then, according to the affine matrix set {T_1, T_2, ..., T_K}, performing the following operations for each class: according to the coordinates of the points {p_i} of class k, set a rectangular region I covering all its feature points; apply the affine transformation to each coordinate in the region to obtain the transformed coordinates, keep the pixel values of F at the transformed coordinates, recorded as region M, and compute the correlation coefficient c(p) between corresponding positions p pixel by pixel, with the formula:
c(p) = Σ_{u∈Ω(p)} (I(u) − Ī)(M(u) − M̄) / sqrt(Σ_{u∈Ω(p)} (I(u) − Ī)² · Σ_{u∈Ω(p)} (M(u) − M̄)²)
where Ω(p) is the 7 × 7 region centered at p, I(u) and M(u) are the pixel values at the corresponding positions, and Ī and M̄ are the mean pixel values over the 7 × 7 region. The computed correlation coefficient c(p) is written into the correlation-coefficient map C_map at both the original coordinates and the transformed coordinates; when writing, the current value is compared with the value already at that position and only the larger value is kept. After all classes have been processed as above, the resulting correlation-coefficient map C_map is binarized with threshold 0.60: if the correlation coefficient exceeds 0.60, the point at that position is considered suspicious and the corresponding position in the binary image is set to 1, otherwise to 0. Finally, morphological operations are applied to the binary image to filter out scattered points and delete small regions, producing the final detection result image, in which the white regions are the tampered regions.
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
The image region copy-paste tampering detection method provided by the invention selects LIOP as the image feature extraction algorithm, which is robust to rotation, scaling, JPEG compression, noise addition, and similar operations. Because only a limited number of non-overlapping triangles need similarity matching during the matching stage, the method is fast compared with traditional block-based detection algorithms, and thus more practical. The invention uses a secondary matching to obtain more feature point matching pairs, which supports a more accurate affine transformation matrix, and the clustering of the matching pairs can handle multiple-copy tampering.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention;
FIG. 2 is an image to be detected in example 2;
FIG. 3 is a schematic view of an actual copy-and-paste area in example 2;
fig. 4 is a diagram showing the actual effect of detection in example 2.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and their descriptions may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
Example 1
As shown in fig. 1, an image area copy-paste tamper detection method includes the steps of:
s1: preprocessing an image to be detected to obtain feature points and the corresponding local image blocks from which features are to be extracted;
s2: dividing the local image block into sub-regions according to the pixel intensity, and calculating the local intensity sequence characteristics of each pixel point to obtain LIOP characteristics of corresponding characteristic points;
s3: processing the feature points by utilizing a Delaunay triangulation Bowyer-Watson algorithm to generate a Delaunay triangulation network, and respectively calculating the average value of LIOP descriptors of three vertexes of each triangle as the feature vector of the corresponding triangle;
s4: performing triangle matching according to the feature vector of each triangle and calculating a matched triangle neighborhood;
s5: judging whether the corresponding feature points are located in the triangular neighborhood or not; if yes, go to step S6; otherwise, discarding the feature point and the corresponding LIOP feature;
s6: reserving the feature point and the corresponding LIOP feature to form a feature point set;
s7: performing feature point matching according to the feature point set to generate feature point matching pairs, and clustering the feature point matching pairs to obtain a plurality of categories;
s8: calculating an affine matrix of each category;
s9: calculating a correlation coefficient graph before and after the corresponding region is transformed according to the feature point matching pairs and the corresponding affine matrixes, and positioning the tampered region;
s10: judging whether all classes are calculated, if so, combining tampered areas of all classes as a detection result; otherwise, step S9 is executed.
In the specific implementation process, the tampering detection method provided by the invention selects LIOP as the image feature extraction algorithm, which is robust to rotation, scaling, JPEG compression, noise addition, and similar operations. Because only a limited number of non-overlapping triangles need similarity matching during the matching stage, the method is fast compared with traditional block-based detection algorithms, and thus more practical. The invention uses a secondary matching to obtain more feature point matching pairs, which supports a more accurate affine transformation matrix, and the clustering of the matching pairs can handle multiple-copy tampering.
Example 2
More specifically, on the basis of embodiment 1, the step S1 specifically includes the following steps:
s11: carrying out Gaussian filtering on the images to be detected under the conditions of the same kernel size and different standard deviations to obtain different images;
s12: carrying out difference operations on adjacent filtered images to obtain a series of difference images, and marking the local extreme points in each difference image as feature points;
s13: removing noise by Gaussian filtering according to the obtained characteristic points, and normalizing the adjacent detection area into a circular area with a fixed diameter;
s14: removing the noise generated by the difference operation and the normalization processing by Gaussian smoothing with a different standard deviation, obtaining the local image block from which features are to be extracted.
More specifically, the step S2 specifically includes the following steps:
s21: the local image block is divided into B (B = 6) sub-regions in ascending order of pixel intensity, each sub-region containing the same number of pixels;
s22: pixel sampling is carried out on each pixel point neighborhood of each subregion, and the local intensity sequence characteristic of each pixel point is calculated;
s23: adding all local intensity sequence characteristics to obtain a LIOP descriptor of the sub-region;
s24: and arranging LIOP descriptors of all the sub-areas in sequence to obtain LIOP characteristics of the corresponding characteristic points.
More specifically, the local intensity sequence characteristic is defined as:
for the set of N-dimensional integer vectors P^N = {P = (p_1, p_2, ..., p_N) | p_j ∈ R} and the set Π^N of all permutations of the integers 1, 2, ..., N, define a mapping γ: P^N → Π^N that takes P ∈ P^N to π ∈ Π^N by sorting the N elements of P in descending order; the resulting permutation is π = (i_1, i_2, ..., i_N), where i_1 is the rank of p_1 in the descending order, and so on. Mathematically, γ is defined as:
γ(P) = π, P ∈ P^N, π ∈ Π^N
where π = (i_1, i_2, ..., i_N), P = (p_1, p_2, ..., p_N), and i_j is the rank of p_j in descending order, j = 1, 2, ..., N. The mapping γ divides the set P^N into N! partitions, and the N-dimensional vectors in each partition map to one fixed permutation, i.e. partitions and permutations are in one-to-one correspondence. All partitions of P^N can therefore be encoded: an index table is built over all permutations, and a feature mapping function φ is defined from it, which maps a permutation π to the vector V_{N!}^{Ind(π)}, an N!-dimensional feature vector. Mathematically, φ is defined as:
φ(π) = V_{N!}^{Ind(π)}
where Ind(π) is the index of the permutation π in the index table, and V_{N!}^{i} denotes the N!-dimensional vector whose i-th component is 1 and whose other components are 0.
more specifically, the extracting step of the LIOP feature specifically includes:
Let O be the center of the circular normalized patch, and sample neighborhood pixels around each pixel in the sub-region. For any pixel μ, establish a local rectangular coordinate system with μ as the origin, the direction of Oμ as the positive Y axis, and the positive X axis obtained by rotating the positive Y axis 90° clockwise. Draw a circle of radius r centered at μ; this is the neighborhood circle of pixel μ. Take the intersection of the neighborhood circle with the positive Y axis as the first sampling point and sample N points uniformly on the circle in the counter-clockwise direction, then compute the LIOP descriptor of pixel μ from the N sampled points. Let P(μ) be the N-dimensional vector formed by the N sampled values around μ; the local intensity order feature of μ is expressed as:
LIOP(μ) = φ(γ(P(μ)))
where P(μ) = (I(μ_1), I(μ_2), ..., I(μ_N)) ∈ P^N and μ_i is the i-th sampling point taken in order around μ. Summing the local intensity order features of all pixels in a sub-region gives the LIOP descriptor des_i of that sub-region, and the LIOP feature of the local image block is the concatenation of the sub-region descriptors in order, expressed as:
LIOP descriptor = (des_1, des_2, ..., des_B)
where des_i is computed over the i-th sub-region bin_i, B is the number of sub-regions, and the LIOP feature vector of a feature point has N! × B dimensions.
Wherein, the step S4 specifically includes the following steps:
s41: calculating the Euclidean distance between each triangle feature vector and all the others, and sorting the distances from smallest to largest;
s42: calculating the ratio of the nearest-neighbor distance d_1 to the second-nearest distance d_2; if the ratio is less than 0.6, matching the two features at distance d_1, then taking d_2 as the nearest neighbor and computing its ratio to the next distance d_3, comparing with the threshold in the same way, until the ratio exceeds 0.6;
s43: deleting repeated matching pairs of triangles and matching pairs with the distance between the centers of inscribed circles of the triangles smaller than 20;
s44: and calculating a triangle neighborhood near each matched triangle, and constructing a region set formed by adjacent square regions.
More specifically, in step S44, the process of constructing the square neighborhood of a triangle is specifically: select a triangle t_i, compute the center o_i and radius r_i of its inscribed circle, and take the square region centered at o_i with side length m × r_i as the square neighborhood of the triangle.
Wherein the step S7 includes the steps of:
s71: calculating Euclidean distances between each feature point and all other feature points according to the feature point set, and sorting the Euclidean distances from small to large;
s72: calculating the ratio of the nearest-neighbor distance d_1 to the second-nearest distance d_2; if the ratio is less than 0.9, matching the two feature points at distance d_1, then taking d_2 as the nearest neighbor and computing its ratio to the next distance d_3, comparing with the threshold in the same way, until the ratio exceeds 0.9;
s73: deleting the repeated feature point matching and the matching pair with the coordinate distance smaller than 20 to complete the matching of the feature points;
s74: the characteristic point matching pair is subjected to unified angle adjustment, and the method specifically comprises the following steps:
for any matching pair {(x_1i, y_1i), (x_2i, y_2i)}, its difference vector is defined as:
d_i = (dx_i, dy_i) = (x_1i − x_2i, y_1i − y_2i)
The angle a_i of a matching pair is defined as the angle between the vector d_i and the positive x axis, in the range [−π, π]. For a matching pair {(x_1i, y_1i), (x_2i, y_2i)}, if its angle a_i < 0, the pair is reversed, i.e. {(x_1i, y_1i), (x_2i, y_2i)} becomes {(x_2i, y_2i), (x_1i, y_1i)};
S75: forming the feature vector of each matching pair from the two point coordinates and their differences, recorded as (x_1, y_1, x_2, y_2, x_1 − x_2, y_1 − y_2), and taking the feature vectors of all matching pairs as the basis for classification, performing DBSCAN clustering to obtain K classes.
More specifically, in step S8, the calculation process of the affine matrix of each class is as follows:
for two matched feature points p1 = (x1, y1) and p1' = (x2, y2), the affine transformation relationship is expressed as:
x2 = a·x1 + b·y1 + tx
y2 = c·x1 + d·y1 + ty
where a, b, c, d, tx and ty are the undetermined coefficients; three non-collinear point matching pairs are sufficient to solve the above equations for the corresponding affine transformation matrix T. For the nk points (p1, p2, ..., pnk) of class k and their corresponding matching points (p1', p2', ..., pnk'), the affine matrix is calculated as follows:
three non-collinear point matching pairs are randomly selected and an affine matrix Tj is calculated from them; the remaining matching points are then transformed according to Tj and the total mean square error is calculated as:
SMSE = (1/nk) Σi ||pi' − Tj(pi)||²
This step is iterated Niter times, and the Tj with the minimum SMSE is selected as the affine matrix Tk of the class.
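The random-sampling estimation of the class affine matrix described above can be sketched as follows. This is a hedged sketch: the helper names and the demo values are invented, and the linear system assumes the six-parameter affine relation stated in step S8.

```python
import numpy as np

def solve_affine(src, dst):
    """Solve the six affine parameters from exactly three non-collinear
    point pairs: dst = A @ src + t, with A a 2x2 matrix and t a 2-vector."""
    # Linear system  M @ [a, b, tx, c, d, ty] = rhs
    M = np.zeros((6, 6))
    rhs = np.zeros(6)
    for i, ((x1, y1), (x2, y2)) in enumerate(zip(src, dst)):
        M[2 * i]     = [x1, y1, 1, 0, 0, 0]
        M[2 * i + 1] = [0, 0, 0, x1, y1, 1]
        rhs[2 * i], rhs[2 * i + 1] = x2, y2
    a, b, tx, c, d, ty = np.linalg.solve(M, rhs)
    return np.array([[a, b], [c, d]]), np.array([tx, ty])

def estimate_affine(src, dst, n_iter=100, rng=None):
    """RANSAC-style estimation (step S8 sketch): repeatedly fit an affine
    map to three random pairs and keep the one with minimum mean squared
    error over all matching pairs of the class."""
    rng = np.random.default_rng(rng)
    best, best_err = None, np.inf
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=3, replace=False)
        try:
            A, t = solve_affine(src[idx], dst[idx])
        except np.linalg.LinAlgError:     # collinear sample -> singular system
            continue
        err = np.mean(np.sum((src @ A.T + t - dst) ** 2, axis=1))
        if err < best_err:
            best, best_err = (A, t), err
    return best, best_err

# demo with a known affine map (values are illustrative)
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(12, 2))
A_true, t_true = np.array([[1.1, 0.2], [-0.1, 0.9]]), np.array([5.0, -3.0])
dst = src @ A_true.T + t_true
(A, t), err = estimate_affine(src, dst, n_iter=20, rng=1)
```

With noiseless data any three non-collinear pairs recover the map exactly; the iteration only matters when some matches in the class are outliers.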
More specifically, in step S9, the specific process of locating the tampered area is:
a correlation coefficient map Cmap of the same size as the image to be detected is created with all initial pixel values set to 0, and a copy of the image is stored as F; according to the affine matrix set {T1, T2, ..., Tk}, the following operations are performed for each class:
according to the coordinate values of the points of class k, a rectangular region I covering all its feature points is set; each coordinate in the region is affine-transformed to obtain the transformed coordinates, region I keeps its pixel values, the pixel values of F at the transformed coordinates are recorded as region M, and the correlation coefficient c(p) is calculated pixel by pixel between corresponding positions p, with the formula:
c(p) = Σu∈Ω(p) (I(u) − avgI(p))(M(u) − avgM(p)) / sqrt( Σu∈Ω(p) (I(u) − avgI(p))² · Σu∈Ω(p) (M(u) − avgM(p))² )
where Ω(p) is the 7 × 7 region centered on p, I(u) and M(u) are the pixel values at the corresponding positions, and avgI(p) and avgM(p) are the average pixel values over the 7 × 7 region. The calculated correlation coefficient c(p) is written into the correlation coefficient map Cmap at both the original coordinates and the transformed coordinates; when writing, the current value is compared with the value already at that position and only the larger value is kept. After all classes have been processed in this way, the resulting correlation coefficient map Cmap is binarized with a threshold of 0.60: if the correlation coefficient is greater than 0.60, the point at that position is considered suspicious and the corresponding position of the binary image is set to 1, otherwise it is set to 0. Finally, morphological operations are applied to the binary image to filter out scattered points and delete small regions, producing the detection result image, in which the white area is the tampered area.
In a specific implementation, the method of the present invention is applied to detect the image shown in fig. 2; the detected white area is the copy-paste region, as shown in fig. 3. After detection, the actual detection result shown in fig. 4 is obtained, and the copy-paste region is evidently located accurately.
It should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art based on the foregoing description; it is neither necessary nor possible to list all embodiments exhaustively here. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the claims of the present invention.

Claims (10)

1. An image area copying and pasting tamper detection method is characterized by comprising the following steps:
S1: preprocessing an image to be detected to obtain feature points, thereby obtaining local image blocks for feature extraction;
S2: dividing each local image block into sub-regions according to pixel intensity, and calculating the local intensity sequence feature of each pixel point to obtain the LIOP feature of the corresponding feature point;
S3: processing the feature points with the Delaunay triangulation Bowyer-Watson algorithm to generate a Delaunay triangulation network, and calculating the average of the LIOP descriptors of the three vertices of each triangle as the feature vector of that triangle;
S4: performing triangle matching according to the feature vector of each triangle and calculating the neighborhoods of the matched triangles;
S5: judging whether each corresponding feature point is located in a triangle neighborhood; if yes, executing step S6; otherwise, discarding the feature point and its LIOP feature;
S6: retaining the feature point and its LIOP feature to form a feature point set;
S7: performing feature point matching according to the feature point set to generate feature point matching pairs, and clustering the matching pairs to obtain a plurality of classes;
S8: calculating the affine matrix of each class;
S9: calculating the correlation coefficient map of the corresponding region before and after transformation according to the feature point matching pairs and the corresponding affine matrix, and locating the tampered region;
S10: judging whether all classes have been calculated; if yes, merging the tampered regions of all classes as the detection result; otherwise, executing step S9.
2. The image area copying and pasting tamper detection method according to claim 1, wherein the step S1 specifically includes the steps of:
S11: performing Gaussian filtering on the image to be detected with the same kernel size and different standard deviations to obtain different filtered images;
S12: performing difference operations on adjacent filtered images to obtain a series of difference images, and marking the extreme points in each difference image as feature points;
S13: removing noise from the obtained feature points by Gaussian filtering, and normalizing the neighborhood detection area into a circular area with a fixed diameter;
S14: removing the noise introduced by the difference operations during the normalization processing by Gaussian smoothing with a different standard deviation, obtaining the local image blocks for feature extraction.
3. The image area copying and pasting tamper detection method according to claim 2, wherein the step S2 specifically includes the steps of:
S21: dividing the local image block into B sub-regions in ascending order of pixel intensity, each sub-region containing the same number of pixels;
S22: sampling pixels in the neighborhood of each pixel point of each sub-region, and calculating the local intensity sequence feature of each pixel point;
S23: adding all local intensity sequence features to obtain the LIOP descriptor of the sub-region;
S24: arranging the LIOP descriptors of all sub-regions in order to obtain the LIOP feature of the corresponding feature point.
4. The image area copying and pasting tamper detection method according to claim 3, wherein the local intensity sequence feature is defined as follows:
For the set of N-dimensional integer vectors PN = {P = (p1, p2, ..., pN) | pj ∈ R} and the set ΠN of all permutations of the integers 1, 2, ..., N, a mapping γ: PN → ΠN from a set element P ∈ PN to a permutation π ∈ ΠN is defined by sorting the N elements of P in descending order; the corresponding permutation is π = (i1, i2, ..., iN), where i1 is the rank of p1 in the descending order, and so on. Mathematically, γ is defined as:
γ(P) = π, P ∈ PN, π ∈ ΠN
where π = (i1, i2, ..., iN), P = (p1, p2, ..., pN), and ij is the rank of pj in the descending order, j = 1, 2, ..., N. The mapping γ divides the set PN into N! partitions, and the N-dimensional vectors in each partition are mapped to one fixed permutation, i.e., partitions and permutations are in one-to-one correspondence; all partitions of PN can thus be encoded by building an index table of all permutations. A feature mapping function φ is defined from the index table, mapping a permutation π to V(Ind(π)), an N!-dimensional feature vector; mathematically, the mapping function φ is defined as:
φ(π) = V(Ind(π)), π ∈ ΠN
where Ind(π) is the number of the permutation π in the index table, and V(Ind(π)) is the N!-dimensional vector whose Ind(π)-th element is 1 and all other elements are 0.
5. The image area copying and pasting tamper detection method according to claim 3, wherein the LIOP feature extraction step specifically comprises:
taking O as the center of the circular sub-region, neighborhood pixels are sampled around each pixel in the sub-region: for any pixel point μ, a local rectangular coordinate system is established with μ as the origin, the direction Oμ as the positive Y axis, and the positive X axis obtained by rotating the positive Y axis 90° clockwise; a circle of radius r is then drawn with μ as its center, as the neighborhood circle of pixel point μ; the intersection of the neighborhood circle with the positive Y axis is taken as the first sampling point, and N points are sampled uniformly on the circle counterclockwise; the LIOP descriptor of the pixel point μ is calculated from the N sampled points. Defining P(μ) as the N-dimensional vector formed by the N sampling points of pixel point μ in the sub-region, the local intensity sequence feature of point μ is expressed as:
φ(γ(P(μ)))
where P(μ) = (I(μ1), I(μ2), ..., I(μN)) ∈ PN, and μi is the i-th sampling point sampled in order around point μ. The local intensity sequence features of all pixel points within a sub-region are added to obtain the LIOP descriptor desi of that sub-region; the LIOP feature of the local image block to be extracted is the arrangement, in order, of the LIOP descriptors of all sub-regions, expressed as:
LIOP descriptor = (des1, des2, ..., desB)
where bini is the i-th sub-region and B is the number of sub-regions; the LIOP feature vector of a feature point is thus of dimension N! × B.
6. The image area copying and pasting tamper detection method according to claim 3, wherein the step S4 specifically comprises the steps of:
S41: calculating the Euclidean distances between the feature vector of each triangle and the feature vectors of all other triangles, and sorting the distances from small to large;
S42: calculating the ratio between the nearest-neighbor distance d1 and the next-nearest-neighbor distance d2; if the ratio is less than 0.6, the two triangle features at distance d1 are matched, d2 is then taken as the nearest-neighbor distance and its ratio to the next distance d3 is calculated and compared with the threshold in the same way, until the ratio is greater than 0.6;
S43: deleting repeated triangle matching pairs and matching pairs in which the distance between the centers of the inscribed circles of the two triangles is smaller than 20;
S44: calculating a triangle neighborhood near each matched triangle, and constructing a region set formed by the adjacent square regions.
7. The image area copying and pasting tamper detection method according to claim 6, wherein in the step S44, the process of constructing the square neighborhood of a triangle specifically comprises: selecting a triangle ti, calculating the center oi and the radius ri of its inscribed circle, and selecting, centered at oi, a square region with side length m × ri as the square neighborhood of the triangle.
8. The image area copying and pasting tamper detection method according to claim 6, wherein the step S7 comprises the steps of:
S71: calculating the Euclidean distances between each feature point and all other feature points according to the feature point set, and sorting the distances from small to large;
S72: calculating the ratio between the nearest-neighbor distance d1 and the next-nearest-neighbor distance d2; if the ratio is less than 0.9, the two feature points at distance d1 are matched, d2 is then taken as the nearest-neighbor distance and its ratio to the next distance d3 is calculated and compared with the threshold in the same way, until the ratio is greater than 0.9;
S73: deleting repeated feature point matches and matching pairs whose coordinate distance is smaller than 20, completing the feature point matching;
S74: adjusting the feature point matching pairs to a unified angle, specifically:
for any matching pair {(x1i, y1i), (x2i, y2i)}, its difference vector is defined as:
di = (dxi, dyi) = (x1i − x2i, y1i − y2i)
The angle ai of a matching pair is defined as the angle between the vector di and the positive x-axis direction, with values in [−π, π]. For a matching pair {(x1i, y1i), (x2i, y2i)}, if its angle ai < 0, the matching pair is reversed, i.e., {(x1i, y1i), (x2i, y2i)} is replaced by {(x2i, y2i), (x1i, y1i)};
S75: forming the feature vector of each feature point matching pair from the coordinates of its two points and their coordinate differences, recorded as (x1, y1, x2, y2, x1 − x2, y1 − y2); taking the feature vectors of all matching pairs as the clustering basis, DBSCAN clustering is performed to obtain K classes.
9. The image area copying and pasting tamper detection method according to claim 8, wherein in the step S8, the calculation process of the affine matrix of each class is as follows:
for two matched feature points p1 = (x1, y1) and p1' = (x2, y2), the affine transformation relationship is expressed as:
x2 = a·x1 + b·y1 + tx
y2 = c·x1 + d·y1 + ty
where a, b, c, d, tx and ty are the undetermined coefficients; three non-collinear point matching pairs are sufficient to solve the above equations for the corresponding affine transformation matrix T. For the nk points (p1, p2, ..., pnk) of class k and their corresponding matching points (p1', p2', ..., pnk'), the affine matrix is calculated as follows:
three non-collinear point matching pairs are randomly selected and an affine matrix Tj is calculated from them; the remaining matching points are then transformed according to Tj and the total mean square error is calculated as:
SMSE = (1/nk) Σi ||pi' − Tj(pi)||²
This step is iterated Niter times, and the Tj with the minimum SMSE is selected as the affine matrix Tk of the class.
10. The image area copying and pasting tamper detection method according to claim 9, wherein in the step S9, the specific process of locating the tampered area is:
a correlation coefficient map Cmap of the same size as the image to be detected is created with all initial pixel values set to 0, and a copy of the image is stored as F; according to the affine matrix set {T1, T2, ..., Tk}, the following operations are performed for each class:
according to the coordinate values of the points of class k, a rectangular region I covering all its feature points is set; each coordinate in the region is affine-transformed to obtain the transformed coordinates, region I keeps its pixel values, the pixel values of F at the transformed coordinates are recorded as region M, and the correlation coefficient c(p) is calculated pixel by pixel between corresponding positions p, with the formula:
c(p) = Σu∈Ω(p) (I(u) − avgI(p))(M(u) − avgM(p)) / sqrt( Σu∈Ω(p) (I(u) − avgI(p))² · Σu∈Ω(p) (M(u) − avgM(p))² )
where Ω(p) is the 7 × 7 region centered on p, I(u) and M(u) are the pixel values at the corresponding positions, and avgI(p) and avgM(p) are the average pixel values over the 7 × 7 region. The calculated correlation coefficient c(p) is written into the correlation coefficient map Cmap at both the original coordinates and the transformed coordinates; when writing, the current value is compared with the value already at that position and only the larger value is kept. After all classes have been processed in this way, the resulting correlation coefficient map Cmap is binarized with a threshold of 0.60: if the correlation coefficient is greater than 0.60, the point at that position is considered suspicious and the corresponding position of the binary image is set to 1, otherwise it is set to 0. Finally, morphological operations are applied to the binary image to filter out scattered points and delete small regions, producing the detection result image, in which the white area is the tampered area.
CN201910871558.XA 2019-09-16 2019-09-16 Image area copying and pasting tampering detection method Active CN110599478B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910871558.XA CN110599478B (en) 2019-09-16 2019-09-16 Image area copying and pasting tampering detection method

Publications (2)

Publication Number Publication Date
CN110599478A true CN110599478A (en) 2019-12-20
CN110599478B CN110599478B (en) 2023-02-03

Family

ID=68859781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910871558.XA Active CN110599478B (en) 2019-09-16 2019-09-16 Image area copying and pasting tampering detection method

Country Status (1)

Country Link
CN (1) CN110599478B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111767956A (en) * 2020-06-30 2020-10-13 苏州科达科技股份有限公司 Image tampering detection method, electronic device, and storage medium
CN112116585A (en) * 2020-09-28 2020-12-22 苏州科达科技股份有限公司 Image removal tampering blind detection method, system, device and storage medium
CN112802140A (en) * 2021-03-03 2021-05-14 中天恒星(上海)科技有限公司 Image coding system for preventing and identifying image tampering

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012058902A1 (en) * 2010-11-02 2012-05-10 中兴通讯股份有限公司 Method and apparatus for combining panoramic image
US20170091588A1 (en) * 2015-09-02 2017-03-30 Sam Houston State University Exposing inpainting image forgery under combination attacks with hybrid large feature mining
CN106600598A (en) * 2016-12-22 2017-04-26 辽宁师范大学 Color image tampering detection method based on local grid matching
KR101755980B1 (en) * 2016-01-29 2017-07-10 세종대학교산학협력단 Copy-Move Forgery Detection method and apparatus based on scale space representation
US20180096224A1 (en) * 2016-10-05 2018-04-05 Ecole Polytechnique Federale De Lausanne (Epfl) Method, System, and Device for Learned Invariant Feature Transform for Computer Images
WO2018098891A1 (en) * 2016-11-30 2018-06-07 成都通甲优博科技有限责任公司 Stereo matching method and system
CN108335290A (en) * 2018-01-23 2018-07-27 中山大学 A kind of image zone duplicating and altering detecting method based on LIOP features and Block- matching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WEI LU ET AL.: "Copy-move Forgery Detection based on Digital Audio", 《2017 IEEE SECOND INTERNATIONAL CONFERENCE ON DATA SCIENCE IN CYBERSPACE (DSC)》 *
WEI LU ET AL.: "Fast Copy-Move Detection of Digital Audio", 《ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE》 *

Also Published As

Publication number Publication date
CN110599478B (en) 2023-02-03

Similar Documents

Publication Publication Date Title
Meena et al. A copy-move image forgery detection technique based on tetrolet transform
Bi et al. Multi-level dense descriptor and hierarchical feature matching for copy–move forgery detection
Meena et al. A copy-move image forgery detection technique based on Gaussian-Hermite moments
Davarzani et al. Copy-move forgery detection using multiresolution local binary patterns
CN110599478B (en) Image area copying and pasting tampering detection method
Raju et al. Copy-move forgery detection using binary discriminant features
Matkan et al. Road extraction from lidar data using support vector machine classification
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
Prakash et al. Detection of copy-move forgery using AKAZE and SIFT keypoint extraction
Thajeel et al. State of the art of copy-move forgery detection techniques: a review
AlSawadi et al. Copy-move image forgery detection using local binary pattern and neighborhood clustering
Liu et al. Copy move forgery detection based on keypoint and patch match
Flenner et al. Resampling forgery detection using deep learning and a-contrario analysis
Thajeel et al. A Novel Approach for Detection of Copy Move Forgery using Completed Robust Local Binary Pattern.
Tahaoglu et al. Improved copy move forgery detection method via L* a* b* color space and enhanced localization technique
CN111754441B (en) Image copying, pasting and forging passive detection method
Fadl et al. A proposed accelerated image copy-move forgery detection
Trung et al. Blind inpainting forgery detection
Stucker et al. Supervised outlier detection in large-scale MVS point clouds for 3D city modeling applications
Nawaz et al. Single and multiple regions duplication detections in digital images with applications in image forensic
Emam et al. A robust detection algorithm for image Copy-Move forgery in smooth regions
Al-Qershi et al. Copy-move forgery detection using on locality sensitive hashing and k-means clustering
Thajeel et al. Detection copy-move forgery in image via quaternion polar harmonic transforms
Sunitha et al. Copy-move tampering detection using keypoint based hybrid feature extraction and improved transformation model
Isaac et al. A key point based copy-move forgery detection using HOG features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant