CN109447957A - Image copy-move forgery detection method based on keypoint match propagation - Google Patents
Image copy-move forgery detection method based on keypoint match propagation Download PDF Info
- Publication number
- CN109447957A CN109447957A CN201811198057.1A CN201811198057A CN109447957A CN 109447957 A CN109447957 A CN 109447957A CN 201811198057 A CN201811198057 A CN 201811198057A CN 109447957 A CN109447957 A CN 109447957A
- Authority
- CN
- China
- Prior art keywords
- point
- harris
- image
- laplace
- key point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses an image copy-move forgery detection method based on keypoint match propagation. The Harris-Laplace keypoints used by the invention are scale- and rotation-invariant, and the LIOP features used offer better scale and rotation invariance than other features. Meanwhile, match propagation solves the problem that, in one-to-many situations, part of the tampered region cannot be detected because of missing match relationships, improving the effect of tampering detection. The invention can effectively detect copy-move behavior in an image with high robustness: even when the tampered region undergoes translation, scaling, rotation, JPEG compression, or the addition of white Gaussian noise, the tampered region can still be located accurately.
Description
Technical field
The present invention relates to the field of digital image forensics, and more particularly to an image copy-move forgery detection method based on keypoint match propagation.
Background technique
With the development of the Internet and smartphone technology, a large number of digital images have appeared on the network. Because of the development of digital media editing tools, these images are very easy to tamper with, so that "seeing is no longer believing". Enterprises, institutions, and individuals therefore have an increasingly strong demand for services that verify the authenticity of digital image documents, and digital image forensics has emerged to meet it.
Digital image forensics can be divided into several directions, such as copy-move detection and image splicing detection. Copy-move tampering refers to copying one part of an image and pasting it onto another part of the same image. The copying process often involves operations such as scaling, rotation, and noise addition, so that the trace of duplication cannot be found visually. Our goal is, without the original image, to detect whether a copy-move forgery exists and to mark the tampered region.
Existing detection techniques generally fall into two categories: block-based and keypoint-based. Block-based methods are now rarely used because of drawbacks such as long computation time and low robustness; the current mainstream methods are keypoint-based.
Summary of the invention
To overcome the limitations of existing detection techniques, the invention proposes a new image copy-move forgery detection method based on keypoint match propagation. The proposed method can effectively detect copy-move behavior in an image: even when the tampered region undergoes translation, scaling, rotation, JPEG compression, or the addition of white Gaussian noise, the proposed method can still locate the tampered region accurately.
To solve the above technical problems, the technical scheme of the invention is as follows:
An image copy-move forgery detection method based on keypoint match propagation, comprising the following steps:
S1: for the image to be detected, extract Harris-Laplace keypoints from the image;
S2: for each Harris-Laplace keypoint, construct its LIOP descriptor;
S3: store the Harris-Laplace keypoints of S1 in a Kd-Tree, then obtain the initial match relationships between Harris-Laplace keypoints using the g2NN matching strategy;
S4: divide the image to be detected into several meaningful image blocks, and count the number N_H of Harris-Laplace keypoint matches between any two image blocks;
S5: among the image blocks that have match relationships with one another, propagate the match relationships of the Harris-Laplace keypoints to obtain new match relationships; the propagation rule is (K1,K2) ∧ (K1,K3) ⇒ (K2,K3), where K1, K2, K3 are Harris-Laplace keypoints and (K1,K2), (K1,K3) are known match relationships;
S6: after match propagation, update N_H and judge according to N_H: if N_H is greater than O, the two image blocks are considered matched; retain the two image blocks and their corresponding matched Harris-Laplace keypoints and execute S7; if N_H is not greater than O, discard the corresponding matched point pairs and consider the image blocks unmatched; O is a manually preset value;
S7: for any two image blocks that have a match relationship, arbitrarily take three non-collinear matched point pairs and estimate the affine transformation matrix between the two blocks; multiply the starting point of each matched pair by the affine transformation matrix and compare the result with the end point of the pair; if the error between the two is less than β, the pair is defined as a pair that satisfies the condition; after N_it iterations, select as the final affine transformation matrix the one with the most satisfying pairs and the smallest sum of corresponding errors; β and N_it are manually preset values;
S8: apply the final affine transformation matrix to the image to be detected to obtain the transformed image; compute the correlation coefficient between the transformed image and the image to be detected at each position and binarize it; if the binarized correlation coefficient of a point is greater than ζ, the point is defined as suspicious and the corresponding position of the binary map is set to 1; if the binarized correlation coefficient of the point is not greater than ζ, the corresponding position of the binary map is set to 0; perform morphological operations on the final binary map to obtain the final detection result map; ζ is a manually preset value.
In a preferred solution, S1 comprises the following flow:
S1.1: construct several Harris scale spaces and obtain the corners at each scale to form an initial point set;
S1.2: select qualifying corners from the initial point set; these become the Harris-Laplace keypoints.
In a preferred solution, S1.1 comprises the following contents:
To extract corners, we first obtain the autocorrelation matrix M corresponding to a point x:
M(x, σI, σD) = σD² · g(σI) ∗ [ Ix²(x, σD)  IxIy(x, σD) ; IxIy(x, σD)  Iy²(x, σD) ]
where σI is the integration scale, σD is the differentiation scale, g(σI) is the Gaussian convolution kernel, Ix is the partial derivative in the x direction, and Iy is the partial derivative in the y direction;
M can also be written as M = [ A C ; C B ];
The formation rule of the initial point set:
R = det(M) − α·trace²(M)
where R is the corner response function; whether x is a corner is judged by the value of R, which must satisfy two conditions simultaneously: first, the corner response of point x must be greater than the corner responses of the 8 points around x; second, R must be greater than a given threshold T_R (here T_R = 20). If both conditions hold, the point x is judged to be a corner; otherwise it is not a corner. α is a manually preset value with range [0.04, 0.06]; here we take α = 0.04. det(M) is the determinant of matrix M, computed as det(M) = AB − C²; trace(M) is the trace of matrix M, computed as trace(M) = A + B.
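The corner test above (a response R = det(M) − α·trace²(M) that must exceed T_R and exceed all 8 neighbours) can be sketched as follows. This is an illustrative reimplementation, not the patented code; the function names and the list-of-lists response map are our own.

```python
def corner_response(A, B, C, alpha=0.04):
    """R = det(M) - alpha * trace(M)^2 for M = [[A, C], [C, B]]."""
    det_m = A * B - C * C
    trace_m = A + B
    return det_m - alpha * trace_m ** 2

def is_corner(R, x, y, T_R=20.0):
    """A point is a corner if its response exceeds T_R and is strictly
    greater than the responses of its 8 neighbours."""
    r = R[y][x]
    if r <= T_R:
        return False
    neighbours = [R[y + dy][x + dx]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                  if not (dx == 0 and dy == 0)]
    return all(r > n for n in neighbours)
```

For example, A = B = 10, C = 0 gives R = 100 − 0.04·400 = 84, which passes the threshold; whether the point survives then depends only on its 8-neighbour comparison.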
In a preferred solution, the condition in S1.2 comprises the following contents:
For each point x in the initial point set, verify whether its response attains a maximum over the scale space; if so, retain the point, otherwise discard it.
In a preferred solution, S2 comprises the following flow:
S2.1: divide the neighborhood around each Harris-Laplace keypoint into several small patches;
S2.2: for a point y in a patch, take N points around y and compute the LIOP feature of y, expressed by the following formula:
LIOP(y) = Φ(γ(P(y)))
where P(y) = (I(y1), I(y2), ···, I(yN)) ∈ P^N, and I(yi) denotes the pixel value of the i-th neighbor yi of point y; γ maps the N-dimensional vector to its corresponding permutation, out of the N! possibilities; Φ(γ) maps each permutation to an index Q, a positive integer between 1 and N!, and produces an N!-dimensional vector whose Q-th position is 1 and whose remaining positions are all 0;
S2.3: multiply the LIOP feature of every patch in S2.1 by the corresponding weight function to construct the LIOP descriptor.
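The mapping Φ(γ(P(y))) above — determine which of the N! permutations orders the N neighbour intensities, then emit a one-hot N!-dimensional vector — can be sketched as follows. This is an illustrative sketch under our own naming; the patch weighting of S2.3 is omitted.

```python
from itertools import permutations
from math import factorial

def liop_vector(intensities):
    """One-hot N!-dimensional vector whose hot position Q is the index of
    the permutation that sorts the neighbour intensities ascending."""
    n = len(intensities)
    # Permutation of neighbour indices induced by the intensity order.
    order = tuple(sorted(range(n), key=lambda i: intensities[i]))
    # Enumerate all n! permutations and look up the index Q of ours.
    index_of = {p: q for q, p in enumerate(permutations(range(n)))}
    vec = [0] * factorial(n)
    vec[index_of[order]] = 1
    return vec
```

With N = 3 neighbours of intensities (3, 1, 2), the ascending order visits indices (1, 2, 0), the fourth permutation in lexicographic order, so the 6-dimensional vector has its single 1 at position 3 (0-based).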
In a preferred solution, the g2NN matching strategy of S3 comprises the following contents:
In the Kd-Tree, find the R Harris-Laplace keypoints whose features are closest to the feature of a specified Harris-Laplace keypoint, R being a positive integer. Define the distance between the feature of the specified keypoint and that of its nearest-neighbor keypoint as d1, the distance between the feature of the specified keypoint and that of its second-nearest neighbor as d2, and so on up to dR. Then compute the ratios d(i)/d(i+1) (i = 1, 2, ..., R−1). If for some i = j we have d(j)/d(j+1) < T_d, then the keypoints corresponding to d(j) and all those before it are considered matched with the specified keypoint, forming the initial match relationships between Harris-Laplace keypoints; the range of j is [1, R−1]; T_d is a manually preset value.
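The generalized 2NN test above can be sketched as follows: walk the ratios d(i)/d(i+1) of the ascending neighbour distances and accept the first j neighbours once a ratio drops below T_d. This is an illustrative sketch; the distances here come from a plain sorted list rather than an actual Kd-Tree query.

```python
def g2nn_matches(distances, t_d=0.5):
    """Given ascending nearest-neighbour distances d1..dR, return the
    number of neighbours accepted as matches: the first j for which
    d(j)/d(j+1) < t_d, or 0 if no such gap exists."""
    for j in range(len(distances) - 1):
        if distances[j + 1] > 0 and distances[j] / distances[j + 1] < t_d:
            return j + 1
    return 0
```

For instance, distances (1.0, 1.1, 5.0, 5.2) show a clear gap after the second neighbour (1.1/5.0 = 0.22 < 0.5), so the first two neighbours are accepted; a flat sequence such as (1.0, 1.1, 1.2) yields no match.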
In a preferred solution, S5 comprises the following contents:
If match relationships exist among multiple image blocks, propagation is performed according to the existing match relationships; the propagation rule is (K1,K2) ∧ (K1,K3) ⇒ (K2,K3). As shown in Fig. 5, a matched pair (a1,b1) exists between regions Ω1 and Ω2, a matched pair (a1,c1) exists between regions Ω1 and Ω3, and a matched pair (b3,c3) exists between regions Ω2 and Ω3; thus mutually matched point pairs exist among regions Ω1, Ω2, and Ω3, so the keypoint matches among these three regions are propagated. According to the existing match relationships (a1,b1) and (a1,c1), the propagation rule yields the new match relationship (b1,c1), represented by a dashed line in the figure; other new match relationships can be obtained similarly.
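The propagation rule (K1,K2) ∧ (K1,K3) ⇒ (K2,K3) amounts to closing the match set under shared endpoints. A minimal sketch, with our own representation of matches as unordered pairs:

```python
def propagate_matches(pairs):
    """Close a set of undirected match pairs under the rule
    (K1,K2) and (K1,K3) imply (K2,K3)."""
    matches = {frozenset(p) for p in pairs}
    changed = True
    while changed:
        changed = False
        for a in list(matches):
            for b in list(matches):
                shared = a & b
                # Two distinct pairs sharing exactly one keypoint K1
                # imply a match between the two remaining keypoints.
                if a != b and len(shared) == 1:
                    new = frozenset((a | b) - shared)
                    if len(new) == 2 and new not in matches:
                        matches.add(new)
                        changed = True
    return {tuple(sorted(m)) for m in matches}
```

Starting from the Fig. 5 example, the known matches (a1,b1) and (a1,c1) propagate to the new match (b1,c1).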
In a preferred solution, S7 comprises the following contents:
For a matched point pair defined as X = (p, q) and X̂ = (p̂, q̂), the affine transformation relationship is expressed by the following formula:
[p̂; q̂; 1] = T · [p; q; 1], with T = [a b tx; c d ty; 0 0 1]
where a, b, c, d, tx, ty are undetermined coefficients and T is the affine transformation matrix; T can be solved by substituting three non-collinear matched point pairs into the formula above.
The error is expressed by the following formula:
T* = argmin_T Σi ‖X̂i − T·Xi‖²
where X̂i and Xi form a matched point pair; the formula above denotes the affine transformation matrix T corresponding to the minimum error.
In this preferred embodiment, each computation yields an affine transformation matrix T; the starting point of each matched pair is transformed by T and the result is compared with the end point of the pair to compute the error. If the error is less than β, the point pair is retained. The above steps are iterated several times; the matrix with the most qualifying point pairs and the smallest sum of errors is the required matrix.
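Solving T from three non-collinear pairs and re-applying it to a point can be sketched as follows. This is an illustrative sketch: the 6×6 linear solve uses a small Gauss-Jordan elimination so it stays dependency-free, and the surrounding iteration (random sampling of triples, N_it repetitions, inlier counting against β) is omitted.

```python
def gauss_solve(A, b):
    """Solve the linear system A x = b by Gauss-Jordan elimination
    with partial pivoting (A non-singular)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col and M[col][col]:
                f = M[r][col] / M[col][col]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def solve_affine(src, dst):
    """Fit p' = a*p + b*q + tx, q' = c*p + d*q + ty from three
    non-collinear point pairs; returns (a, b, tx, c, d, ty)."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0]); rhs.append(xp)
        A.append([0, 0, 0, x, y, 1]); rhs.append(yp)
    return gauss_solve(A, rhs)

def transform(params, pt):
    """Apply the estimated affine transformation to a point."""
    a, b, tx, c, d, ty = params
    x, y = pt
    return (a * x + b * y + tx, c * x + d * y + ty)
```

For a pure translation by (2, 3), the three pairs (0,0)→(2,3), (1,0)→(3,3), (0,1)→(2,4) recover a = d = 1, b = c = 0, (tx, ty) = (2, 3), and any further starting point can then be checked against its matched end point.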
In a preferred solution, the correlation coefficient of S8 is expressed by the following formula:
c(x) = Σ_{μ∈Ω(x)} (I(μ) − Ī)(F(μ) − F̄) / sqrt( Σ_{μ∈Ω(x)} (I(μ) − Ī)² · Σ_{μ∈Ω(x)} (F(μ) − F̄)² )
where c(x) denotes the correlation coefficient, Ω(x) is the region centered at x, I(μ) is the pixel value at position μ in the original image to be detected, F(μ) is the value obtained after applying the corresponding affine transformation to the pixel at position μ in the original image to be detected, and Ī and F̄ are the average pixel values of the region centered at x.
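The coefficient c(x) above is the normalized cross-correlation between the two windows centred at x. A dependency-free sketch over flat lists of window pixels (an illustration under our own naming, not the patented code):

```python
from math import sqrt

def correlation(window_i, window_f):
    """Normalized cross-correlation between two equally sized pixel
    windows (the region Omega(x)); returns a value in [-1, 1]."""
    n = len(window_i)
    mi = sum(window_i) / n   # mean of the original window, I-bar
    mf = sum(window_f) / n   # mean of the transformed window, F-bar
    num = sum((i - mi) * (f - mf) for i, f in zip(window_i, window_f))
    den = sqrt(sum((i - mi) ** 2 for i in window_i) *
               sum((f - mf) ** 2 for f in window_f))
    return num / den if den else 0.0
```

Perfectly co-varying windows score 1.0 (a suspicious point once c(x) > ζ), while anti-correlated windows score −1.0; the zero-denominator guard handles flat windows.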
In a preferred solution, O = 3.
In a preferred solution, β = 0.05.
In a preferred solution, N_it = 2000.
In a preferred solution, ζ = 0.5.
In a preferred solution, T_R = 20.
In a preferred solution, α = 0.04.
In a preferred solution, N = 4.
In a preferred solution, R = 10.
In a preferred solution, T_d = 0.5.
Compared with the prior art, the beneficial effects of the technical solution of the invention are:
The method extracts Harris-Laplace keypoints and LIOP feature vectors, giving the invention better robustness than conventional methods; meanwhile, the use of match propagation improves the matching of keypoints and facilitates the estimation of the affine transformation. Experimental results show that the invention achieves good results under geometric transformations such as translation, scaling, and rotation, and under post-processing such as JPEG compression and the addition of white Gaussian noise.
Description of the drawings
Fig. 1 is the flow chart of the embodiment.
Fig. 2 is the image to be detected in the embodiment.
Fig. 3 is the corresponding actual tampered region in the embodiment.
Fig. 4 is the actual detection result of the embodiment.
Fig. 5 is an example of match propagation in the invention.
Specific embodiment
The attached figures are only for illustrative purposes and should not be understood as limiting this patent;
To better illustrate this embodiment, certain components in the figures are omitted, enlarged, or reduced, and do not represent the size of the actual product;
For those skilled in the art, it is understandable that certain well-known structures and their descriptions may be omitted from the figures.
The technical solution of the present invention is further described below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, an image copy-move forgery detection method based on keypoint match propagation comprises the following steps:
S1: for the image to be detected, extract Harris-Laplace keypoints from the image.
S1.1: construct several Harris scale spaces and obtain the corners at each scale to form an initial point set;
First, obtain the autocorrelation matrix M corresponding to a point x:
M(x, σI, σD) = σD² · g(σI) ∗ [ Ix²(x, σD)  IxIy(x, σD) ; IxIy(x, σD)  Iy²(x, σD) ]
where σI is the integration scale, σD is the differentiation scale, g(σI) is the Gaussian convolution kernel, Ix is the partial derivative in the x direction, and Iy is the partial derivative in the y direction;
M can also be written as M = [ A C ; C B ];
The formation rule of the initial point set:
R = det(M) − α·trace²(M)
where R is the corner response function; whether x is a corner is judged by the value of R, which must satisfy two conditions simultaneously: first, the corner response R of point x must be greater than the corner responses of the 8 points around x; second, R must be greater than 20. If both conditions hold, the point x is judged to be a corner; otherwise it is not a corner. α is a manually preset value; here α = 0.04. det(M) is the determinant of matrix M, computed as det(M) = AB − C²; trace(M) is the trace of matrix M, computed as trace(M) = A + B.
S1.2: select qualifying corners from the initial point set; these become the Harris-Laplace keypoints. The condition is as follows:
a candidate point becomes a Harris-Laplace keypoint if its response attains a maximum over the scale space; otherwise the point is discarded.
S2: for each Harris-Laplace keypoint, construct its LIOP descriptor.
S2.1: divide the neighborhood around each Harris-Laplace keypoint into several small patches;
S2.2: for a point y in a patch, take N = 4 points around it and compute the LIOP feature of y, expressed by the following formula:
LIOP(y) = Φ(γ(P(y)))
where P(y) = (I(y1), I(y2), ···, I(yN)) ∈ P^N, and I(yi) denotes the pixel value of the i-th neighbor yi of point y; γ maps the 4-dimensional vector to its corresponding permutation, out of the 4! possibilities; Φ(γ) maps each permutation to an index Q, a positive integer between 1 and 4!, and produces a 4!-dimensional vector whose Q-th position is 1 and whose remaining positions are all 0;
S2.3: multiply the LIOP feature of every patch in S2.1 by the corresponding weight function to construct the LIOP descriptor.
S3: in the Kd-Tree, find the 10 Harris-Laplace keypoints whose features are closest to the feature of a specified Harris-Laplace keypoint. Define the distance between the feature of the specified keypoint and that of its nearest-neighbor keypoint as d1, the distance between the feature of the specified keypoint and that of its second-nearest neighbor as d2, and so on up to d10; then compute the ratios d(i)/d(i+1) (i = 1, 2, ..., 9). If for some i = j we have d(j)/d(j+1) < 0.5, then the keypoints corresponding to d(j) and all those before it are considered matched with the specified keypoint, forming the initial match relationships between Harris-Laplace keypoints; the range of j is [1, 9].
S4: divide the image to be detected into several meaningful image blocks, and count the number N_H of Harris-Laplace keypoint matches between any two image blocks;
S5: among the image blocks that have match relationships with one another, propagate the match relationships of the Harris-Laplace keypoints to obtain new match relationships; the propagation rule is (K1,K2) ∧ (K1,K3) ⇒ (K2,K3), where K1, K2, K3 are Harris-Laplace keypoints and (K1,K2), (K1,K3) are known match relationships;
S6: after match propagation, update N_H and judge according to N_H: if N_H is greater than 3, the two image blocks are considered matched; retain the two image blocks and their corresponding matched Harris-Laplace keypoints and execute S7; if N_H is not greater than 3, discard the corresponding matched point pairs and consider the image blocks unmatched;
S7: for any two image blocks that have a match relationship, arbitrarily take three non-collinear matched point pairs and estimate the affine transformation matrix between the two blocks; multiply the starting point of each matched pair by the affine transformation matrix and compare the result with the end point of the pair; if the error between the two is less than the preset value 0.05, the pair is defined as a qualifying matched pair; after 2000 iterations, select as the final affine transformation matrix the one with the most qualifying pairs and the smallest sum of corresponding errors;
For a matched point pair defined as X = (p, q) and X̂ = (p̂, q̂), the affine transformation relationship is expressed by the following formula:
[p̂; q̂; 1] = T · [p; q; 1], with T = [a b tx; c d ty; 0 0 1]
where a, b, c, d, tx, ty are undetermined coefficients and T is the affine transformation matrix; T can be solved by substituting three non-collinear matched point pairs into the formula above;
The error is expressed by the following formula:
T* = argmin_T Σi ‖X̂i − T·Xi‖²
where X̂i and Xi form a matched point pair; the formula above denotes the affine transformation matrix T corresponding to the minimum error;
S8: apply the final affine transformation matrix to the image to be detected to obtain the transformed image; compute the correlation coefficient between the transformed image and the image to be detected at each position and binarize it: if the binarized correlation coefficient of a point is greater than 0.5, the point is defined as suspicious and the corresponding position of the binary map is set to 1; if it is not greater than 0.5, the corresponding position of the binary map is set to 0; perform morphological operations on the final binary map to obtain the final detection result map;
The correlation coefficient is expressed by the following formula:
c(x) = Σ_{μ∈Ω(x)} (I(μ) − Ī)(F(μ) − F̄) / sqrt( Σ_{μ∈Ω(x)} (I(μ) − Ī)² · Σ_{μ∈Ω(x)} (F(μ) − F̄)² )
where c(x) denotes the correlation coefficient, Ω(x) is the 5 × 5 region centered at x, I(μ) is the pixel value at position μ in the original image to be detected, F(μ) is the value obtained after applying the corresponding affine transformation to the pixel at position μ in the original image to be detected, and Ī and F̄ are the average pixel values of the 5 × 5 region centered at x.
The effect of the present embodiment is as follows: Fig. 2 is the image to be detected, Fig. 3 is the corresponding actual tampered region, and Fig. 4 is the actual detection result. Combining Fig. 2, Fig. 3, and Fig. 4, it can be seen that the actual tampered region is accurately marked by the present embodiment.
The terms describing positional relationships in the drawings are only for illustration and should not be understood as limiting this patent;
Obviously, the above embodiment of the present invention is merely an example given to clearly illustrate the invention and is not a limitation on the embodiments of the invention. For those of ordinary skill in the art, other variations or changes in different forms can also be made on the basis of the above description. It is neither necessary nor possible to exhaust all embodiments here. Any modifications, equivalent replacements, and improvements made within the spirit and principle of the invention shall be included within the protection scope of the claims of the invention.
Claims (10)
1. An image copy-move forgery detection method based on keypoint match propagation, characterized by comprising the following steps:
S1: for the image to be detected, extract Harris-Laplace keypoints from the image;
S2: for each Harris-Laplace keypoint, construct its LIOP descriptor;
S3: store the Harris-Laplace keypoints of S1 in a Kd-Tree, then obtain the initial match relationships between Harris-Laplace keypoints using the g2NN matching strategy;
S4: divide the image to be detected into several image blocks, and count the number N_H of Harris-Laplace keypoint matches between any two image blocks;
S5: among the image blocks that have match relationships with one another, propagate the match relationships of the Harris-Laplace keypoints to obtain new match relationships; the propagation rule is (K1,K2) ∧ (K1,K3) ⇒ (K2,K3), where K1, K2, K3 are Harris-Laplace keypoints and (K1,K2), (K1,K3) are known match relationships;
S6: after match propagation, update N_H and judge according to N_H: if N_H is greater than O, the two image blocks are considered matched; retain the two image blocks and their corresponding matched Harris-Laplace keypoints and execute S7; if N_H is not greater than O, discard the corresponding matched point pairs and consider the image blocks unmatched; O is a manually preset value;
S7: for any two image blocks that have a match relationship, arbitrarily take three non-collinear matched point pairs and estimate the affine transformation matrix between the two blocks; multiply the starting point of each matched pair by the affine transformation matrix and compare the result with the end point of the pair; if the error between the two is less than β, the pair is defined as a pair that satisfies the condition; after N_it iterations, select as the final affine transformation matrix the one with the most satisfying pairs and the smallest sum of corresponding errors; β and N_it are manually preset values;
S8: apply the final affine transformation matrix to the image to be detected to obtain the transformed image; compute the correlation coefficient between the transformed image and the image to be detected at each position and binarize it; if the binarized correlation coefficient of a point is greater than ζ, the point is defined as suspicious and the corresponding position of the binary map is set to 1; if the binarized correlation coefficient of the point is not greater than ζ, the corresponding position of the binary map is set to 0; perform morphological operations on the final binary map to obtain the final detection result map; ζ is a manually preset value.
2. The image copy-move forgery detection method according to claim 1, characterized in that S1 comprises the following flow:
S1.1: construct several Harris scale spaces and obtain the corners at each scale to form an initial point set;
S1.2: select qualifying corners from the initial point set; these become the Harris-Laplace keypoints.
3. The image copy-move forgery detection method according to claim 2, characterized in that S1.1 comprises the following contents:
To extract corners, first obtain the autocorrelation matrix M corresponding to a point x:
M(x, σI, σD) = σD² · g(σI) ∗ [ Ix²(x, σD)  IxIy(x, σD) ; IxIy(x, σD)  Iy²(x, σD) ]
where σI is the integration scale, σD is the differentiation scale, g(σI) is the Gaussian convolution kernel, Ix is the partial derivative in the x direction, and Iy is the partial derivative in the y direction;
M can further be written as M = [ A C ; C B ];
The formation rule of the initial point set:
R = det(M) − α·trace²(M)
where R is the corner response function; whether x is a corner is judged by the value of R, which must satisfy two conditions simultaneously:
first, the corner response of point x must be greater than the corner responses of the 8 points around x;
second, R must be greater than a given threshold T_R;
if both conditions hold, the point x is judged to be a corner; otherwise point x is not a corner; α is a manually preset value with range [0.04, 0.06];
det(M) is the determinant of matrix M, computed as det(M) = AB − C²; trace(M) is the trace of matrix M, computed as trace(M) = A + B.
4. The image copy-move forgery detection method according to claim 3, characterized in that the condition in S1.2 comprises the following contents:
for each point x in the initial point set, verify whether its response attains a maximum over the scale space; if so, retain the point, otherwise discard it.
5. The image copy-move forgery detection method according to any one of claims 1 to 4, characterized in that S2 comprises the following flow:
S2.1: divide the neighborhood around each Harris-Laplace keypoint into several small patches;
S2.2: for a point y in a patch, take N points around y and compute the LIOP feature of y, expressed by the following formula:
LIOP(y) = Φ(γ(P(y)))
where P(y) = (I(y1), I(y2), ···, I(yN)) ∈ P^N, and I(yi) denotes the pixel value of the i-th neighbor yi of point y; γ maps the N-dimensional vector to its corresponding permutation, out of the N! possibilities; Φ(γ) maps each permutation to an index Q, a positive integer between 1 and N!, and produces an N!-dimensional vector whose Q-th position is 1 and whose remaining positions are all 0;
S2.3: multiply the LIOP feature of every patch in S2.1 by the corresponding weight function to construct the LIOP descriptor.
6. The image copy-paste detection method according to claim 5, wherein the g2NN matching strategy of S3 comprises the following:
In a Kd-Tree, find the R Harris-Laplace key points whose features are closest to the feature of a specified Harris-Laplace key point, where R is a positive integer; define the distance between the features of the specified Harris-Laplace key point and its nearest-neighbour key point as d1, the distance between the features of the specified key point and its second-nearest-neighbour key point as d2, and so on up to dR; then compute the ratios d(i)/d(i+1) (i = 1, 2, ..., R−1);
If d(j)/d(j+1) < Td at i = j, the Harris-Laplace key point corresponding to d(j) and all those before it are considered matched with the specified Harris-Laplace key point, forming the initial matching relations between Harris-Laplace key points; the range of j is [1, R−1]; Td is a manually preset value.
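The g2NN ratio test above can be sketched with scipy's `cKDTree`; this follows the claim's wording (the first ratio below Td fixes the match cut-off), and the descriptor array, function name, and default R, Td values are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def g2nn_matches(descriptors, query_idx, R=10, Td=0.5):
    """Generalized 2NN: take the R nearest descriptor distances
    d1..dR, compute d(i)/d(i+1), and report the neighbours up to the
    first index j where the ratio drops below Td."""
    tree = cKDTree(descriptors)
    # k = R + 1 because the nearest neighbour is the query point itself
    dists, idxs = tree.query(descriptors[query_idx], k=min(R + 1, len(descriptors)))
    dists, idxs = dists[1:], idxs[1:]     # drop the self-match
    for j in range(len(dists) - 1):
        if dists[j + 1] > 0 and dists[j] / dists[j + 1] < Td:
            return [int(i) for i in idxs[: j + 1]]  # matched key points
    return []
```

A query point whose nearest descriptor lies much closer than the rest therefore yields exactly that neighbour as its initial match.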
7. The image copy-paste detection method according to claim 6, wherein S7 comprises the following:
For a pair of matched points, defined as X = (p, q) and X̂ = (p̂, q̂), the affine transformation relation is expressed by the following formulas:
p̂ = a·p + b·q + tx
q̂ = c·p + d·q + ty
In the above formulas, a, b, c, d, tx, ty are undetermined coefficients, and T is the affine transformation matrix; T can be obtained by substituting three pairs of non-collinear matched points into the above formulas;
The error is expressed by the following formula:
E(T) = Σi ‖X̂i − T·Xi‖²
where X̂i and Xi are a pair of matched points; the formula above selects the affine transformation T for which the sum of errors is minimal.
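The affine estimation in S7 can be sketched as a least-squares solve. The claim substitutes exactly three non-collinear pairs; this illustrative version accepts three or more pairs and minimizes the summed error with numpy's `lstsq`, so with exactly three non-collinear pairs it reproduces the exact solution.

```python
import numpy as np

def estimate_affine(src, dst):
    """Solve p_hat = a*p + b*q + tx, q_hat = c*p + d*q + ty by least
    squares from >= 3 non-collinear matched point pairs; returns the
    2x3 affine matrix T = [[a, b, tx], [c, d, ty]]."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])  # rows [p, q, 1]
    # Two independent least-squares systems share the same design matrix
    x_coef, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)  # a, b, tx
    y_coef, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)  # c, d, ty
    return np.vstack([x_coef, y_coef])
```

For instance, a pure translation by (2, 3) recovers T = [[1, 0, 2], [0, 1, 3]].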
8. The image copy-paste detection method according to any one of claims 1, 2, 3, 4, 6 or 7, wherein the correlation coefficient of S8 is expressed by the following formula:
c(x) = Σμ∈Ω(x) (I(μ) − Ī)(F(μ) − F̄) / √( Σμ∈Ω(x) (I(μ) − Ī)² · Σμ∈Ω(x) (F(μ) − F̄)² )
In the formula, c(x) denotes the correlation coefficient; Ω(x) is the region centred on x; I(μ) is the pixel value at position μ in the original image under detection; F(μ) is the value obtained after the pixel at position μ in the original image under detection is multiplied by the corresponding affine transformation; and Ī and F̄ are the average pixel values of the region centred on x.
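The correlation coefficient of S8 can be sketched as a normalized cross-correlation over the region Ω(x); the two patches (original and affinely warped) are assumed to have been extracted already, and the function name is illustrative.

```python
import numpy as np

def correlation(I_patch, F_patch):
    """Normalized cross-correlation c(x) between the patch Omega(x) of
    the original image and the affinely warped patch, both centred on
    x; means are subtracted per patch as in the claim's formula."""
    I = np.asarray(I_patch, dtype=float).ravel()
    F = np.asarray(F_patch, dtype=float).ravel()
    dI, dF = I - I.mean(), F - F.mean()   # I(mu) - Ibar, F(mu) - Fbar
    denom = np.sqrt((dI ** 2).sum() * (dF ** 2).sum())
    return (dI * dF).sum() / denom if denom > 0 else 0.0
```

Patches related by a positive linear intensity change give c(x) = 1, and intensity-reversed patches give c(x) = −1.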
9. The image copy-paste detection method according to claim 8, wherein O = 3.
10. The image copy-paste detection method according to any one of claims 1, 2, 3, 4, 6, 7 or 9, wherein ζ = 0.5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811198057.1A CN109447957B (en) | 2018-10-15 | 2018-10-15 | Image copying and pasting detection method based on key point transmission matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109447957A true CN109447957A (en) | 2019-03-08 |
CN109447957B CN109447957B (en) | 2020-11-10 |
Family
ID=65545417
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811198057.1A Active CN109447957B (en) | 2018-10-15 | 2018-10-15 | Image copying and pasting detection method based on key point transmission matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109447957B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111008955A (en) * | 2019-11-06 | 2020-04-14 | 重庆邮电大学 | Multi-scale image block matching rapid copying pasting tampering detection method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140226906A1 (en) * | 2013-02-13 | 2014-08-14 | Samsung Electronics Co., Ltd. | Image matching method and apparatus |
CN106056122A (en) * | 2016-05-26 | 2016-10-26 | 中山大学 | KAZE feature point-based image region copying and pasting tampering detection method |
CN108335290A (en) * | 2018-01-23 | 2018-07-27 | 中山大学 | A kind of image zone duplicating and altering detecting method based on LIOP features and Block- matching |
Non-Patent Citations (1)
Title |
---|
Zhang Yanhua et al.: "Anti-flip image copy-paste tampering detection algorithm", Radio Communications Technology * |
Also Published As
Publication number | Publication date |
---|---|
CN109447957B (en) | 2020-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Lin et al. | Dynamic spatial propagation network for depth completion | |
Beardsley et al. | 3D model acquisition from extended image sequences | |
CN101833765B (en) | Characteristic matching method based on bilateral matching and trilateral restraining | |
CN101512601B (en) | Method for determining a depth map from images, device for determining a depth map | |
CN103345736B (en) | A kind of virtual viewpoint rendering method | |
CN104616247B | An aerial photography map stitching method based on super-pixel SIFT
CN103034982B (en) | Image super-resolution rebuilding method based on variable focal length video sequence | |
Pritchett et al. | Matching and reconstruction from widely separated views | |
CN101556692A (en) | Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points | |
CN106971404A | A robust SURF registration method for UAV color remote sensing images
CN106056122B | KAZE feature point-based image region copying and pasting tampering detection method
CN111028284A (en) | Binocular vision stereo matching method and device based on homonymous mark points | |
CN113763269B (en) | Stereo matching method for binocular images | |
CN106530336B (en) | Stereo matching method based on color information and graph cut theory | |
CN108109121A | A fast face deblurring method based on convolutional neural networks
CN103679193A (en) | FREAK-based high-speed high-density packaging component rapid location method | |
CN102117474B (en) | Digital picture watermark embedding and detecting method and device | |
CN108038488A (en) | The robustness image hash method mixed based on SIFT and LBP | |
Li et al. | Self-supervised coarse-to-fine monocular depth estimation using a lightweight attention module | |
CN111402429B (en) | Scale reduction and three-dimensional reconstruction method, system, storage medium and equipment | |
CN109447957A | Image copying and pasting detection method based on key point transmission matching
CN112329662B (en) | Multi-view saliency estimation method based on unsupervised learning | |
CN116383470B (en) | Image searching method with privacy protection function | |
Maybank | A probabilistic definition of salient regions for image matching | |
CN109741389A | A region-based local stereo matching method
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||