CN102609891B - Texture-characteristic-based method for passively and blindly obtaining evidence of digital image - Google Patents
- Publication number: CN102609891B
- Application number: CN201210007510.2A
- Authority
- CN
- China
- Prior art keywords
- texture
- formula
- image
- block
- characteristic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a texture-feature-based method for passive blind forensics of digital images. Image content is represented by six texture feature quantities: the texture mean, texture standard deviation, texture smoothness, texture third moment, texture uniformity, and texture entropy. Compared with existing methods, which represent image content indirectly through transform-domain feature quantities, the method represents the image directly by its texture features and therefore expresses the image information more fully; it also reduces the dimensionality of the feature vector. Because the selected texture features exhibit hierarchy, scale, and translation invariance, the method improves both the detection rate and the robustness of image-tampering forensics, and it is applicable to fields such as verifying the authenticity of image content.
Description
Technical field
The invention belongs to the technical field of signal and information processing, and specifically relates to a passive blind forensic method, based on texture features, for copy-paste tampering in images.
Background technology
The arrival of the digital age, while bringing people rich visual experiences, has also enabled some lawbreakers to spread forged and tampered images through media such as the Internet, causing harm to individuals and society; the digital age is thus a "double-edged sword". Copy-paste tampering is the most common means of digital image forgery, and it generally falls into two broad classes: copy-paste tampering within the same image, and copy-paste tampering between different images.
Existing forensic methods for copy-paste tampering within the same image include the following:
A passive blind forensic method for digital image copy-paste tampering based on principal component analysis (PCA), proposed in a 2004 computer science technical report from Dartmouth College in the U.S., represents image content with transform-domain feature quantities and suffers from poor robustness.
A passive blind forensic method for copy-paste tampering based on discrete cosine transform (DCT) coefficients, proposed in the proceedings of a 2005 IEEE international conference on information processing (Vol. 53, No. 2, pp. 758-767), achieves forensics of image copy-paste tampering but has a large computational cost.
A passive blind forensic method for copied forged image regions based on wavelets and singular value decomposition (SVD), proposed in the proceedings of a 2007 IEEE international conference on multimedia information and forensic technology (No. 3, pp. 1750-1753), further reduces the computational cost but still cannot quickly process the larger images encountered in practice.
A passive blind forensic method for digital image copy-paste tampering based on principal transfer vectors, described in the Chinese Journal of Computers (2007, Vol. 30, No. 11, pp. 1998-2007), can only handle tampered images of a single form.
A forensic method for image region copy-move tampering based on gray-level co-occurrence matrices, proposed in the Journal of Computer Applications (2011, Vol. 31, No. 6, pp. 1621-1630), has a large computational cost and poor robustness.
Summary of the invention
The object of the invention is to propose a passive blind image forensic method based on texture features that overcomes the above defects of the prior art and improves the detection rate and robustness of tampered-image forensics, so that the method can be applied to fields such as verifying the authenticity of image content.
The texture-feature-based passive blind digital image forensic method of the invention is characterized as follows:
First, texture feature extraction is performed:
Step 1: apply a grayscale test to the suspect image to be detected; if it is not a grayscale image, convert it to one;
Step 2: crop out the texture part of the converted image to be detected;
Step 3: partition the cropped grayscale texture part into overlapping blocks;
Step 4: for each texture block, compute the texture mean, texture standard deviation, texture smoothness, texture third moment, texture uniformity, and texture entropy according to the following formulas, and use them as the characteristic statistics of each sub-block to represent its content:
1) Texture mean formula:
ave = (1/s²) · Σ_{x=1..s} Σ_{y=1..s} n(x, y)
where ave is the texture mean, s × s is the size of the texture block, and n(x, y) is the gray value at point (x, y);
2) Texture standard deviation formula:
var = [ (1/s²) · Σ_{x=1..s} Σ_{y=1..s} (n(x, y) − ave)² ]^(1/2)
where var is the texture standard deviation;
3) Texture smoothness formula:
P = 1 − 1 / (1 + var² / (L − 1)²)
where P is the texture smoothness and L is the number of texture gray levels;
4) Third moment formula:
S = E[(n(x, y) − ave)³]
where S is the texture third moment and E denotes taking the statistical mean;
5) Texture uniformity formula:
K = Σ_i p_i²
where K is the texture uniformity and p_i is the normalized gray-level histogram probability;
6) Texture entropy formula:
E = −Σ_{x,y} p_{x,y} · log₂ p_{x,y}
where E is the texture entropy and p_{x,y} is the distribution probability of point (x, y);
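The six statistics above can be sketched in NumPy. This is a minimal sketch, not the patent's implementation: the histogram bin count and the normalization of the smoothness term by (L − 1)² are assumptions, since the text gives only the variable names.

```python
import numpy as np

def texture_features(block, L=256):
    """Six texture statistics for one s x s gray block; names follow
    the patent's symbols (ave, var, P, S, K, E).  The smoothness
    normalization by (L-1)^2 is an assumption."""
    n = block.astype(np.float64)
    s2 = n.size
    ave = n.mean()                                  # 1) texture mean
    var = n.std()                                   # 2) standard deviation
    P = 1.0 - 1.0 / (1.0 + (var / (L - 1)) ** 2)    # 3) smoothness
    S = np.mean((n - ave) ** 3)                     # 4) third moment
    hist, _ = np.histogram(n, bins=L, range=(0, L))
    p = hist / s2                                   # gray-level probabilities
    K = np.sum(p ** 2)                              # 5) uniformity
    nz = p[p > 0]
    E = -np.sum(nz * np.log2(nz))                   # 6) entropy
    return np.array([ave, var, P, S, K, E])
```

For a perfectly flat block the sketch gives zero deviation, smoothness, third moment, and entropy, and uniformity 1, matching the intuition behind the formulas.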
Then texture feature similarity matching is performed, comprising the following steps:
Step 5: normalize the feature quantities obtained for each sub-block;
Step 6: let the normalized feature vector of texture sub-block i be (ave_i, var_i, P_i, S_i, K_i, E_i), where i = 1, 2, …, (M − b + 1) × (N − b + 1); arrange the sub-block feature vectors in order to obtain an N_w × 6 feature matrix, and sort its rows to obtain the sorted feature matrix T, where T_i denotes a row of T, i = 1, 2, …, N_w; here M × N is the size of the texture region of the image under test, b is the size of the detection window, and N_w is the number of sub-blocks in the image under test;
Step 7: traverse the feature matrix T and compute the offsets between the coordinate values of adjacent rows, obtaining an offset matrix;
Step 8: perform similarity matching on the rows of the offset matrix according to the following similarity decision rule:
D(S_i, S_j) < δ
where D denotes the Euclidean distance between two vectors, S_i and S_j are row vectors of the offset matrix, and δ is the decision threshold;
Step 9: mark the positions in the image texture part corresponding to row vectors below the decision threshold as 1, and mark the positions corresponding to row vectors that do not satisfy the threshold as 0;
Step 10: eliminate the non-connected regions produced during forensics by first applying a morphological opening and then a morphological closing.
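Step 10's cleanup (opening followed by closing) can be sketched with SciPy's morphology operations; the 3 × 3 structuring element is an assumption, as the patent does not specify one.

```python
import numpy as np
from scipy.ndimage import binary_opening, binary_closing

def clean_mask(mask, size=3):
    """Morphological cleanup of the 0/1 match mask: opening removes
    small isolated marks, closing then fills small holes inside the
    matched regions.  The structuring-element size is an assumption."""
    st = np.ones((size, size), dtype=bool)
    opened = binary_opening(mask.astype(bool), structure=st)
    return binary_closing(opened, structure=st)
```

On a mask containing a 10 × 10 matched region plus one stray pixel, the stray pixel is removed while the region survives intact.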
The method of the invention chooses the six texture statistics of each texture block (texture mean, texture standard deviation, texture smoothness, third moment, texture uniformity, and texture entropy) as the feature vector of each image sub-block, whereas existing methods must choose at least seven feature quantities, so the invention reduces the dimensionality of the feature vector. Because the invention directly chooses six texture feature quantities of the image to represent its content, rather than the indirect transform-domain feature quantities selected by existing methods, it expresses the image information more fully. Because the selected texture features exhibit hierarchy, scale, and translation invariance, they strengthen the robustness of the method, which can be applied to fields such as verifying the authenticity of image content.
Description of the drawings
Fig. 1 is a schematic flow diagram of the texture-feature-based passive blind image forensic method of the invention.
Embodiment
Embodiment 1:
This embodiment applies the passive blind digital image forensic technique to verify the authenticity of the content of a suspect image. Fig. 1 shows the flow of the texture-feature-based passive blind image forensic method of the invention. With reference to Fig. 1, the specific operation of the method in this embodiment is described below:
First carry out texture feature extraction:
Step 1, decision step A: apply a grayscale test to the suspect image P to be detected; if it is not a grayscale image, convert it to one;
Step 2, cropping step B: crop out the texture part of the tested image, discarding any edge portions that, after cropping, are smaller than half the block window;
Step 3, blocking step C: partition the cropped grayscale texture part into overlapping blocks. Concretely, slide a b × b window over the image one pixel at a time, from left to right and top to bottom; if the texture part of the tested image has size M × N, this yields (M − b + 1) × (N − b + 1) sub-blocks;
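The overlapping blocking of step C can be sketched as follows; representing the blocks as a list of array views is an illustrative choice, not part of the patent.

```python
import numpy as np

def overlapping_blocks(img, b):
    """Slide a b x b window one pixel at a time over an M x N texture
    region, yielding (M - b + 1) * (N - b + 1) overlapping sub-blocks."""
    M, N = img.shape
    blocks = []
    for x in range(M - b + 1):          # top to bottom
        for y in range(N - b + 1):      # left to right
            blocks.append(img[x:x + b, y:y + b])
    return blocks
```

For a 10 × 12 region and b = 4 the sketch produces 7 × 9 = 63 blocks of shape (4, 4), matching the count in the text.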
Step 4, characteristic statistic calculation step D: for each texture block, compute the texture mean, texture standard deviation, texture smoothness, texture third moment, texture uniformity, and texture entropy according to the following texture formulas, and use them as the characteristic statistics of each sub-block to represent its content:
1) Texture mean formula:
ave = (1/s²) · Σ_{x=1..s} Σ_{y=1..s} n(x, y)
where ave is the texture mean, s × s is the size of the texture block, and n(x, y) is the gray value at point (x, y);
2) Texture standard deviation formula:
var = [ (1/s²) · Σ_{x=1..s} Σ_{y=1..s} (n(x, y) − ave)² ]^(1/2)
where var is the texture standard deviation;
3) Texture smoothness formula:
P = 1 − 1 / (1 + var² / (L − 1)²)
where P is the texture smoothness and L is the number of texture gray levels;
4) Third moment formula:
S = E[(n(x, y) − ave)³]
where S is the texture third moment and E denotes taking the statistical mean;
5) Texture uniformity formula:
K = Σ_i p_i²
where K is the texture uniformity and p_i is the normalized gray-level histogram probability;
6) Texture entropy formula:
E = −Σ_{x,y} p_{x,y} · log₂ p_{x,y}
where E is the texture entropy and p_{x,y} is the distribution probability of point (x, y);
Then texture feature similarity matching is performed, comprising the following steps:
Step 5, normalization step E: normalize the feature quantities obtained for each sub-block;
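The patent says only that the feature quantities are "normalized"; a column-wise min-max scaling is one common choice and is assumed in this sketch, whose function name is hypothetical.

```python
import numpy as np

def normalize_features(F):
    """Scale each feature column of the N_w x 6 matrix to [0, 1].
    Min-max scaling per feature is an assumption; the patent does not
    name the normalization scheme."""
    F = np.asarray(F, dtype=np.float64)
    lo = F.min(axis=0)
    rng = F.max(axis=0) - lo
    rng[rng == 0] = 1.0          # constant columns map to 0
    return (F - lo) / rng
```

A constant feature column maps to zeros, so it contributes nothing to the later Euclidean-distance comparisons.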
Step 6, sorting step F: let the normalized feature vector of texture sub-block i be (ave_i, var_i, P_i, S_i, K_i, E_i), where i = 1, 2, …, (M − b + 1) × (N − b + 1); arrange the sub-block feature vectors row by row to obtain an N_w × 6 feature matrix, then sort this matrix by rows to obtain the feature matrix T, where T_i denotes a row of T, i = 1, 2, …, N_w; here M × N is the size of the texture region of the image under test, b is the size of the detection window, and N_w is the number of sub-blocks in the image under test;
Step 7, offset calculation step G: traverse the feature matrix T and compute the offsets between the coordinate values of adjacent rows, obtaining an offset matrix;
Step 8, similarity matching step H: perform similarity matching on the rows of the offset matrix according to the following similarity decision rule; in this example, τ is set to 0.02.
Similarity decision rule:
D(S_i, S_j) < τ
where D denotes the Euclidean distance between two vectors, S_i and S_j are row vectors of the offset matrix, and τ is the decision threshold;
Step 9, region marking step I: mark the positions in the image texture part corresponding to row vectors below the decision threshold as 1, and mark the positions corresponding to row vectors that do not satisfy the threshold as 0;
Step 10, non-connected region processing step: eliminate the non-connected regions produced during forensics by first applying a morphological opening and then a morphological closing.
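Steps F through H above can be sketched as below. The row-wise (lexicographic) sort, the offsets between the coordinates of adjacent sorted blocks, and the Euclidean-distance test against the threshold τ follow the description; the exact pairing of offset rows is underspecified in the text, so the all-pairs comparison here is one plausible reading, not a verbatim reimplementation.

```python
import numpy as np

def offset_matches(T, coords, tau=0.02):
    """Sort the N_w x 6 feature matrix T lexicographically by rows,
    form the offset matrix from the coordinates of adjacent sorted
    blocks, and report offset-row pairs within Euclidean distance tau.
    `coords` holds each block's (x, y) origin."""
    order = np.lexsort(T.T[::-1])                  # lexicographic row order
    c = np.asarray(coords, dtype=np.float64)[order]
    S = np.diff(c, axis=0)                         # offsets of adjacent rows
    pairs = []
    for i in range(len(S)):
        for j in range(i + 1, len(S)):
            if np.linalg.norm(S[i] - S[j]) < tau:  # similarity decision rule
                pairs.append((i, j))
    return S, pairs
```

When several near-identical blocks share one displacement (as a copy-paste region does), all of their offset rows coincide and every pair is flagged.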
Whereas existing methods all represent image content through indirect transform-domain feature quantities, the texture-feature-based passive blind digital image forensic method of the invention represents the image directly by its texture features and can therefore express the image information more fully.
The method of the invention chooses the six texture statistics of each texture block (texture mean, texture standard deviation, texture smoothness, third moment, texture uniformity, and texture entropy) as the feature vector of each image sub-block, whereas existing methods must choose at least seven feature quantities; the invention thus reduces the dimensionality of the feature vector.
Because the selected texture features exhibit hierarchy, scale, and translation invariance, they strengthen the robustness of the method, which can be applied to fields such as verifying the authenticity of image content.
Claims (1)
1. A texture-feature-based passive blind digital image forensic method, characterized in that:
First, texture feature extraction is performed:
Step 1: apply a grayscale test to the suspect image to be detected; if it is not a grayscale image, convert it to one;
Step 2: crop out the texture part of the converted image to be detected;
Step 3: partition the cropped grayscale texture part into overlapping blocks;
Step 4: for each texture block, compute the texture mean, texture standard deviation, texture smoothness, texture third moment, texture uniformity, and texture entropy according to the following formulas, and use them as the characteristic statistics of each sub-block to represent its content:
1) Texture mean formula:
ave = (1/s²) · Σ_{x=1..s} Σ_{y=1..s} n(x, y)
where ave is the texture mean, s × s is the size of the texture block, and n(x, y) is the gray value at point (x, y);
2) Texture standard deviation formula:
var = [ (1/s²) · Σ_{x=1..s} Σ_{y=1..s} (n(x, y) − ave)² ]^(1/2)
where var is the texture standard deviation;
3) Texture smoothness formula:
P = 1 − 1 / (1 + var² / (L − 1)²)
where P is the texture smoothness and L is the number of texture gray levels;
4) Third moment formula:
S = E[(n(x, y) − ave)³]
where S is the texture third moment and E denotes taking the statistical mean;
5) Texture uniformity formula:
K = Σ_i p_i²
where K is the texture uniformity and p_i is the normalized gray-level histogram probability;
6) Texture entropy formula:
E = −Σ_{x,y} p_{x,y} · log₂ p_{x,y}
where E is the texture entropy and p_{x,y} is the distribution probability of point (x, y);
Then texture feature similarity matching is performed, comprising the following steps:
Step 5: normalize the feature quantities obtained for each sub-block;
Step 6: let the normalized feature vector of texture sub-block i be (ave_i, var_i, P_i, S_i, K_i, E_i), where i = 1, 2, …, (M − b + 1) × (N − b + 1); arrange the sub-block feature vectors in order to obtain an N_w × 6 feature matrix, and sort its rows to obtain the sorted feature matrix T, where T_i denotes a row of T, i = 1, 2, …, N_w; here M × N is the size of the texture region of the image under test, b is the size of the detection window, and N_w is the number of sub-blocks in the image under test;
Step 7: traverse the feature matrix T and compute the offsets between the coordinate values of adjacent rows, obtaining an offset matrix;
Step 8: perform similarity matching on the rows of the offset matrix according to the following similarity decision rule:
D(S_i, S_j) < δ
where D denotes the Euclidean distance between two vectors, S_i and S_j are row vectors of the offset matrix, and δ is the decision threshold;
Step 9: mark the positions in the image texture part corresponding to row vectors below the decision threshold as 1, and mark the positions corresponding to row vectors that do not satisfy the threshold as 0;
Step 10: eliminate the non-connected regions produced during forensics by first applying a morphological opening and then a morphological closing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210007510.2A CN102609891B (en) | 2012-01-12 | 2012-01-12 | Texture-characteristic-based method for passively and blindly obtaining evidence of digital image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102609891A CN102609891A (en) | 2012-07-25 |
CN102609891B (en) | 2014-01-15
Family
ID=46527239
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210007510.2A Expired - Fee Related CN102609891B (en) | 2012-01-12 | 2012-01-12 | Texture-characteristic-based method for passively and blindly obtaining evidence of digital image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102609891B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104282008B (en) * | 2013-07-01 | 2017-07-28 | 株式会社日立制作所 | The method and apparatus that Texture Segmentation is carried out to image |
CN105654089A (en) * | 2014-08-20 | 2016-06-08 | 江南大学 | Image re-sampling detection based on Markov process and Gabor filtering |
CN106295478A (en) * | 2015-06-04 | 2017-01-04 | 深圳市中兴微电子技术有限公司 | A kind of image characteristic extracting method and device |
CN110555792B (en) * | 2019-08-16 | 2022-05-17 | 广东外语外贸大学南国商学院 | Image tampering blind detection method based on normalized histogram comprehensive feature vector |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102184537A (en) * | 2011-04-22 | 2011-09-14 | 西安理工大学 | Image region tamper detection method based on wavelet transform and principal component analysis |
CN102289671A (en) * | 2011-09-02 | 2011-12-21 | 北京新媒传信科技有限公司 | Method and device for extracting texture feature of image |
Non-Patent Citations (2)
Title |
---|
A Novel Passive Forensic Algorithm for Digital Image Copy-Paste Tampering; Xu Caichen et al.; Chinese Journal of Scientific Instrument; December 2011; Vol. 32, No. 12; pp. 29-33 *
Legal Events
Code | Title |
---|---|
C06 / PB01 | Publication |
C10 / SE01 | Entry into force of request for substantive examination |
C14 / GR01 | Patent grant |
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 2014-01-15; Termination date: 2017-01-12