CN104732548A - Print file identification method based on texture synthesis - Google Patents

Print file identification method based on texture synthesis

Info

Publication number
CN104732548A
CN104732548A (application number CN201510161664.0A)
Authority
CN
China
Prior art keywords
texture
region
print file
repaired
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510161664.0A
Other languages
Chinese (zh)
Inventor
陈庆虎
熊海亚
周前进
鄢煜尘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201510161664.0A priority Critical patent/CN104732548A/en
Publication of CN104732548A publication Critical patent/CN104732548A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a print file identification method based on texture synthesis, aimed at identification when two printed documents share few or even no identical characters. Each character image of a printed document is acquired with a whole-image high-magnification scanning system, and the incomplete character images are converted into complete texture images by image inpainting, which removes the interference of differing character shapes; features extracted from all the synthesized images of one printer can therefore be fused, giving a more accurate description of that printer. Four statistical features of the gray-level co-occurrence matrix of each synthesized image are computed, namely contrast, consistency, correlation and entropy; the mean feature over all images of a document serves as the feature description of its printer, and the feature distance between two documents serves as the basis for match classification. Because the method does not require identical characters to appear in the documents being compared, it widens the application range of print file identification and improves its accuracy.

Description

Print file identification method based on texture synthesis
Technical field
The present invention relates to source-printer identification of printed documents, and proposes a print file identification method based on texture synthesis, whose purpose is one-to-one identification of two printed documents.
Background art
Because the manufacturing tolerances of every printer and the wear it accumulates in use are unique, the documents it prints are likewise not identical to those of any other printer; this is the objective basis on which printed documents can be attributed to their source machine. Print file identification is the technique of determining whether two printed documents come from the same printer.
With the spread of computers and printers, printed documents have gradually replaced handwritten ones as the main medium of recorded information; at the same time, crimes involving tampered or forged printed documents are increasing, so more and more researchers are working on print file identification. Current methods only work when the two printed documents contain the same characters, because when different characters are matched against each other the influence of their morphological structure overrides the printer's own signature and no correct conclusion can be drawn. Moreover, to obtain acceptable accuracy, the shared characters must be repeated often enough, since more repetitions give a more stable estimate of the printer's characteristics. Both conditions are hard to satisfy in practice, and especially so for Chinese printed documents. The present invention therefore proposes a print file identification method based on texture synthesis, to solve the identification problem when no identical characters are available for matching or when the identical characters are too few.
Summary of the invention
To solve the print file identification problem when the printed documents share few identical characters, the present invention uses texture synthesis to repair the blank parts of the single-character images of a printed document, obtaining complete texture images and thereby removing the influence of the morphological differences between characters. The texture of a document image carries the characteristics of its printer, and these texture features are used to decide whether two printed documents come from the same printer.
The present invention adopts the following technical scheme.
A print file identification method based on texture synthesis comprises the following steps:
(1) acquire the single-character images of a printed document with a whole-image high-magnification scanning system;
(2) repair the blank areas of the incomplete single-character images into complete texture images by texture synthesis;
(3) compute the gray-level co-occurrence matrix of each repaired texture image and extract four statistical features: contrast, consistency, correlation and entropy;
(4) normalize and fuse the extracted features to obtain the discriminative features of the printed document, taking the mean feature over all images of a document as the feature of its printer;
(5) classify with a classifier to decide whether the two printed documents come from the same printer.
In step (5), the feature distance between the two printed documents is computed; when the distance is smaller than an empirical threshold, the two documents are judged to come from the same printer.
The empirical threshold is obtained by Bayesian estimation: printed-document samples from 100 printers are collected, the distance distribution between documents of the same printer and the distance distribution between documents of different printers are computed, and the threshold T is chosen so that the sum of the Type I and Type II error rates is minimal.
The texture synthesis method includes structure-based inpainting methods and texture-based inpainting methods, that is, any method capable of repairing an incomplete texture into a complete texture.
The adopted texture synthesis method is the exemplar-based (sample-block) inpainting method, and the region of the image to be repaired must first be marked:
First, binarize the original image with the Otsu algorithm to make a preliminary separation of texture region and background region;
Second, remove the noise in the background region by median filtering; because median filtering cannot remove the relatively dense noise along the edge of the texture region, apply dilation and erosion operations to keep the texture region pure;
Finally, set the background region obtained after dilation and erosion to pure white at the corresponding positions of the original image; the resulting white area is the region to be repaired.
The texture synthesis method fills the region to be repaired with the most similar matching block found in the source region. Let Ω be the region to be repaired, Φ the known (source) region, and δΩ the boundary of the region to be repaired. The main steps are as follows:
(2.1) Priority computation, which determines the repair order. The priority combines two terms, the confidence C(p) of a pixel and the data term D(p) of edge strength:
P(p) = C(p) \cdot D(p)   (1)
where
C(p) = \frac{\sum_{q \in \Psi_p \cap \Phi} C(q)}{\mathrm{Area}(\Psi_p)}, \quad D(p) = \frac{\left|\nabla I_p^{\perp} \cdot n_p\right|}{\alpha} + \epsilon   (2)
Here Ψ_p is the target block to be repaired centered at point p; Ψ_p ∩ Ω and Ψ_p ∩ Φ are the parts of Ψ_p lying in Ω and Φ respectively; ∇I_p^⊥ is the isophote vector at p; n_p is the unit normal vector of δΩ at p; α is a normalization parameter and ε is a small disturbance constant. A larger C(p) means that more high-confidence pixels surround p, and a larger D(p) means that p lies where an isophote of the known region meets the boundary of the region to be repaired; points with high P(p) are repaired first.
(2.2) Take the point with the highest priority as p, and search the known region Φ for the block that best matches the known part Ψ_p ∩ Φ of the target block Ψ_p, i.e. the block with the minimum matching distance, denoted Ψ_{q̂}.
(2.3) Copy the part of Ψ_{q̂} corresponding to Ψ_p ∩ Ω into Ψ_p, and update the boundary δΩ of the region to be repaired.
(2.4) Repeat the above three steps until the whole region to be repaired is filled, completing the texture repair.
In step (3), the four statistical features, contrast, consistency, correlation and entropy, are computed as:
Contrast: \mathrm{CON} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1}(i-j)^2 P(i,j)
Consistency: \mathrm{Homo} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P(i,j)\log_2 P(i,j)
Correlation: \mathrm{Cor} = \frac{\sum_{i=0}^{K-1}\sum_{j=0}^{K-1} ij\,P(i,j) - \mu_x\mu_y}{\sigma_x\sigma_y}
where
\mu_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} i\,P(i,j), \quad \mu_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} j\,P(i,j)
\sigma_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (i-\mu_x)^2 P(i,j), \quad \sigma_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (j-\mu_y)^2 P(i,j)
Entropy: \mathrm{Ee} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P^2(i,j).
Compared with the prior art, the print file identification method based on texture synthesis proposed by the present invention does not require identical characters to exist in the compared printed documents, which widens the application range of print file identification and improves its accuracy.
Brief description of the drawings
Fig. 1 shows an example of character images acquired from a printed document;
Fig. 2 illustrates texture synthesis: (a) original image; (b) image with the region to be repaired marked; (c) complete texture image;
Fig. 3 is a schematic diagram of the Criminisi algorithm.
Embodiment
The invention is further described below with reference to the drawings and an embodiment.
1. Whole-image high-magnification acquisition and character segmentation
The printed document is scanned at 150x magnification by a whole-image high-magnification scanning system, and character segmentation then yields single-character images rich in texture information, as shown in Fig. 1.
2. Texture synthesis
If features were extracted directly from a single character image, the feature values would mix two kinds of information: the texture of the image and the morphological structure of the character. In printer identification, the texture of the image is the important evidence, but because of the interference of character shape, features of different character images cannot be used directly for match classification. To remove this interference, the present invention repairs the character images of the printed document by texture synthesis. Among the many texture synthesis methods, the exemplar-based method proposed by Criminisi is adopted unchanged because of its good repair quality.
For the exemplar-based inpainting method, the region to be repaired must first be marked: binarize the original image with the Otsu algorithm to make a preliminary separation of texture and background; remove the noise in the background by median filtering; because median filtering cannot remove the relatively dense noise along the edge of the texture region, apply dilation and erosion to keep the texture region pure; finally, set the resulting background region to pure white at the corresponding positions of the original image, and the white area obtained is the region to be repaired, as shown in Fig. 2(b). The repair order is then decided by a priority that combines a confidence term and a data term along the repair boundary; the block of the known region that best matches each target block is searched for and copied into the target block, yielding a complete texture image, as shown in Fig. 2(c).
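For illustration only (not part of the claimed method), the following minimal Python/OpenCV sketch shows one way the region-marking step could be implemented; the filter aperture, kernel size and image polarity (dark toner on a light background) are assumptions, not values specified in this description.

```python
import cv2
import numpy as np

def mark_region_to_repair(gray):
    """Return a mask (255 = blank area to be repaired) for one character image.

    gray: 2-D uint8 array; dark toner texture on a light background is assumed.
    Filter and kernel sizes are illustrative, not prescribed by the description.
    """
    # 1. Otsu binarization: preliminary split into texture (toner) and background
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # 2. Median filtering removes isolated noise in the background
    denoised = cv2.medianBlur(binary, 5)

    # 3. Dilation/erosion (closing) cleans the denser noise along the texture edge
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    background = cv2.morphologyEx(denoised, cv2.MORPH_CLOSE, kernel)

    # 4. The cleaned background (white) is the region to be repaired
    return (background == 255).astype(np.uint8) * 255
```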
3. Feature extraction
The classical gray-level co-occurrence matrix (GLCM) algorithm is applied directly to the repaired texture images. Two directions are used, horizontal and vertical, corresponding to the scanning direction and the paper-feed direction of the printer, with 20 step lengths for each direction, giving the gray-level co-occurrence matrix P of the image; the matrix has size K x K and its elements are denoted P(i, j). Four commonly used GLCM statistics are then computed as the image features:
(1) contrast: \mathrm{CON} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1}(i-j)^2 P(i,j)
(2) consistency: \mathrm{Homo} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P(i,j)\log_2 P(i,j)
(3) correlation: \mathrm{Cor} = \frac{\sum_{i=0}^{K-1}\sum_{j=0}^{K-1} ij\,P(i,j) - \mu_x\mu_y}{\sigma_x\sigma_y}
where
\mu_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} i\,P(i,j), \quad \mu_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} j\,P(i,j)
\sigma_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (i-\mu_x)^2 P(i,j), \quad \sigma_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (j-\mu_y)^2 P(i,j)
(4) entropy: \mathrm{Ee} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P^2(i,j)
Each image therefore yields a feature of 2 x 20 x 4 = 160 dimensions, denoted x_l, l = 1, 2, ..., 160.
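For illustration only, a Python sketch of this feature extraction under stated assumptions: the skimage function graycomatrix is used, the image is assumed to be pre-quantized to K gray levels, and the 160 values are grouped by statistic (first the 40 contrast values, then consistency, correlation and entropy) to match the normalization in the next section. The statistics follow the formulas as written above; note that the sum of P squared is conventionally called energy and the sum of P log2 P is the negative of the usual entropy.

```python
import numpy as np
from skimage.feature import graycomatrix

def glcm_features(image, levels=16):
    """160-dim GLCM feature of one repaired texture image.

    image: 2-D uint8 array already quantized to values 0..levels-1 (K = levels).
    Two directions (horizontal, vertical) x 20 step lengths x 4 statistics = 160.
    """
    glcm = graycomatrix(image, distances=list(range(1, 21)),
                        angles=[0.0, np.pi / 2], levels=levels, normed=True)
    i = np.arange(levels).reshape(-1, 1)
    j = np.arange(levels).reshape(1, -1)

    stats = {"con": [], "homo": [], "cor": [], "ee": []}
    for a in range(glcm.shape[3]):          # direction index
        for d in range(glcm.shape[2]):      # step-length index
            P = glcm[:, :, d, a]
            mu_x, mu_y = (i * P).sum(), (j * P).sum()
            sig_x = ((i - mu_x) ** 2 * P).sum()
            sig_y = ((j - mu_y) ** 2 * P).sum()
            logP = np.log2(P, out=np.zeros_like(P), where=P > 0)
            stats["con"].append(((i - j) ** 2 * P).sum())                  # contrast
            stats["homo"].append((P * logP).sum())                         # "consistency" as defined above
            stats["cor"].append(((i * j * P).sum() - mu_x * mu_y)
                                / (sig_x * sig_y + 1e-12))                 # correlation
            stats["ee"].append((P ** 2).sum())                             # "entropy" as defined above
    # group by statistic: contrast(40), consistency(40), correlation(40), entropy(40)
    return np.concatenate([stats[k] for k in ("con", "homo", "cor", "ee")])
```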
4. Feature fusion
Because the value ranges of the four GLCM statistics differ, each is first normalized separately. Taking the contrast features as an example, the normalization is
y_l = \frac{x_l - x_{\min}}{x_{\max} - x_{\min}}, \quad l = 1, 2, \ldots, 40, \qquad x_{\max} = \max(x_l), \; x_{\min} = \min(x_l)   (1)
The other three statistics are processed in the same way, giving the normalized features y_l, l = 1, 2, ..., 160. Denote the normalized feature of the m-th of the M images of one printer by y_m^l, l = 1, 2, ..., 160, m = 1, 2, ..., M. The synthesized complete textures are no longer affected by character shape, so all characters of one printed document can be treated as the same character, and the source-printer feature of the document can therefore be described by the mean feature over all M images:
\mathrm{Fea}^l = \frac{1}{M}\sum_{m=1}^{M} y_m^l, \quad l = 1, 2, \ldots, 160   (2)
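For illustration only, a minimal Python sketch of the normalization and fusion of formulas (1) and (2), assuming the 160-dimensional per-image features are ordered by statistic as in the previous sketch; whether the min/max are taken per image or over the whole sample set is not stated, so per image is assumed here.

```python
import numpy as np

def normalize_features(x):
    """Formula (1): min-max normalize each 40-value statistic group separately."""
    y = np.empty_like(x, dtype=float)
    for g in range(4):                       # contrast, consistency, correlation, entropy
        block = x[g * 40:(g + 1) * 40]
        lo, hi = block.min(), block.max()
        y[g * 40:(g + 1) * 40] = (block - lo) / (hi - lo + 1e-12)
    return y

def printer_feature(image_features):
    """Formula (2): the printer feature is the mean of the M normalized image features."""
    return np.stack([normalize_features(x) for x in image_features]).mean(axis=0)
```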
5. Classifier
Formula (2) gives the feature description of one printed document. The matching distance between the source printers of two printed documents is defined as the normalized city-block distance between their features:
\mathrm{Dist} = \frac{\sum_{l=1}^{160}\left|\mathrm{Fea}_1^l - \mathrm{Fea}_2^l\right|}{\sum_{l=1}^{160}\mathrm{Fea}_1^l \cdot \sum_{l=1}^{160}\mathrm{Fea}_2^l}   (3)
The distance is then classified by thresholding: when it is smaller than the threshold T, the two printed documents are judged to come from the same printer; when it is larger than T, they are judged to come from different printers. The threshold is obtained by Bayesian estimation: printed-document samples from 100 printers are collected, the distance distribution between documents of the same printer and the distance distribution between documents of different printers are computed, and the threshold T is chosen so that the sum of the Type I and Type II error rates is minimal.
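For illustration only, a Python sketch of the matching distance of formula (3) and of the threshold selection; a simple grid search over candidate thresholds stands in for the Bayesian estimation described above, and the variable names are illustrative.

```python
import numpy as np

def feature_distance(fea1, fea2):
    """Formula (3): normalized city-block distance between two 160-dim document features."""
    return np.abs(fea1 - fea2).sum() / (fea1.sum() * fea2.sum() + 1e-12)

def estimate_threshold(same_dists, diff_dists):
    """Choose T minimizing the sum of Type I and Type II error rates on labelled samples.

    same_dists: distances between documents known to come from the same printer.
    diff_dists: distances between documents known to come from different printers.
    """
    same_dists, diff_dists = np.asarray(same_dists), np.asarray(diff_dists)
    candidates = np.sort(np.concatenate([same_dists, diff_dists]))
    best_T, best_err = candidates[0], np.inf
    for T in candidates:
        frr = (same_dists >= T).mean()   # false rejection: same printer judged different
        far = (diff_dists < T).mean()    # false acceptance: different printers judged same
        if frr + far < best_err:
            best_T, best_err = T, frr + far
    return best_T

def same_printer(fea1, fea2, T):
    """Decision rule: same printer if the distance is below the threshold T."""
    return feature_distance(fea1, fea2) < T
```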
6. Experiment
A database of 30 printers with 80 Chinese-character images per printer was built; half of the images were used as samples and the other half for testing. Criminisi's exemplar-based texture synthesis was used to obtain the complete texture of each character image, the GLCM features were computed, the distance between every two printers was obtained, and the distance was compared with the threshold T for classification. The results were evaluated with three measures: accuracy, false rejection rate (FRR) and false acceptance rate (FAR), as shown in Table 1.
Table 1. One-to-one identification results
Texture synthesis method | Accuracy | Error rate | FRR | FAR
Criminisi | 98% | 2% | 2.22% | 0
The Criminisi texture synthesis method is described in detail below. It fills the region to be repaired with the most similar matching block found in the source region. As shown in Fig. 3, let Ω be the region to be repaired, Φ the known (source) region, and δΩ the boundary of the region to be repaired. The main steps of the algorithm are as follows:
(1) Priority computation, which determines the repair order. The priority combines two terms, the confidence C(p) of a pixel and the data term D(p) of edge strength:
P(p) = C(p) \cdot D(p)   (1)
where
C(p) = \frac{\sum_{q \in \Psi_p \cap \Phi} C(q)}{\mathrm{Area}(\Psi_p)}, \quad D(p) = \frac{\left|\nabla I_p^{\perp} \cdot n_p\right|}{\alpha} + \epsilon   (2)
Ψ_p is the target block to be repaired centered at point p; Ψ_p ∩ Ω and Ψ_p ∩ Φ are the parts of Ψ_p lying in Ω and Φ respectively; ∇I_p^⊥ is the isophote vector at p and n_p is the unit normal vector of δΩ at p; α is a normalization parameter and ε is a small disturbance constant. A larger C(p) means that more high-confidence pixels surround p, and a larger D(p) means that p lies where an isophote of the known region meets the boundary of the region to be repaired; points with high P(p) are repaired first.
(2) Take the point with the highest priority as p, and search the known region Φ for the block that best matches the known part Ψ_p ∩ Φ of the target block Ψ_p, i.e. the block with the minimum matching distance, denoted Ψ_{q̂}.
(3) Copy the part of Ψ_{q̂} corresponding to Ψ_p ∩ Ω into Ψ_p, and update the boundary δΩ of the region to be repaired.
The three steps above are repeated until the whole region to be repaired is filled, completing the texture repair.
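For illustration only, a Python sketch of the priority computation of formulas (1) and (2), assuming a grayscale image and a boolean mask of the region to be repaired; the patch size, α and ε values are placeholders, the isophote at p is approximated by the local image gradient, and a full implementation would also include the best-match search, the copy step and the confidence update.

```python
import numpy as np

def priorities(gray, mask, confidence, patch=9, alpha=255.0, eps=1e-3):
    """P(p) = C(p) * D(p) for every pixel p on the fill front delta-Omega.

    gray: grayscale image as a float array (unknown pixels may hold any value).
    mask: boolean array, True inside the region to be repaired (Omega).
    confidence: float array, initially 1 in the known region and 0 inside Omega.
    """
    h, w = mask.shape
    half = patch // 2
    known = ~mask
    # Fill front: hole pixels with at least one known 4-neighbour
    front = mask & (np.roll(known, 1, 0) | np.roll(known, -1, 0) |
                    np.roll(known, 1, 1) | np.roll(known, -1, 1))

    # Isophote (image gradient rotated by 90 degrees) and boundary normal n_p
    gy, gx = np.gradient(gray)
    iso = np.stack([-gy, gx], axis=-1)
    my, mx = np.gradient(mask.astype(float))
    n = np.stack([mx, my], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-12

    P = np.zeros(mask.shape, dtype=float)
    for y, x in zip(*np.nonzero(front)):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        C = confidence[y0:y1, x0:x1].sum() / ((y1 - y0) * (x1 - x0))  # confidence term C(p)
        D = abs(np.dot(iso[y, x], n[y, x])) / alpha + eps             # data term D(p)
        P[y, x] = C * D
    return P
```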

Claims (7)

1. A print file identification method based on texture synthesis, characterized in that it comprises the following steps:
(1) acquiring the single-character images of a printed document with a whole-image high-magnification scanning system;
(2) repairing the blank areas of the incomplete single-character images into complete texture images by texture synthesis;
(3) computing the gray-level co-occurrence matrix of each repaired texture image and extracting four statistical features: contrast, consistency, correlation and entropy;
(4) normalizing and fusing the extracted features to obtain the discriminative features of the printed document, taking the mean feature over all images of a document as the feature of its printer;
(5) classifying with a classifier to decide whether the two printed documents come from the same printer.
2. The print file identification method based on texture synthesis according to claim 1, characterized in that in step (5) the feature distance between the two printed documents is computed, and when the distance is smaller than an empirical threshold the two documents are judged to come from the same printer.
3. The print file identification method based on texture synthesis according to claim 2, characterized in that the empirical threshold is obtained by Bayesian estimation: printed-document samples from 100 printers are collected, the distance distribution between documents of the same printer and the distance distribution between documents of different printers are computed, and the threshold T is chosen so that the sum of the Type I and Type II error rates is minimal.
4. The print file identification method based on texture synthesis according to claim 1, characterized in that the texture synthesis method includes structure-based inpainting methods and texture-based inpainting methods, that is, any method capable of repairing an incomplete texture into a complete texture.
5. The print file identification method based on texture synthesis according to claim 4, characterized in that the texture synthesis method is the exemplar-based (sample-block) inpainting method and the region of the image to be repaired is first marked:
First, binarize the original image with the Otsu algorithm to make a preliminary separation of texture region and background region;
Second, remove the noise in the background region by median filtering; because median filtering cannot remove the relatively dense noise along the edge of the texture region, apply dilation and erosion operations to keep the texture region pure;
Finally, set the background region obtained after dilation and erosion to pure white at the corresponding positions of the original image; the resulting white area is the region to be repaired.
6. The print file identification method based on texture synthesis according to claim 4, characterized in that the texture synthesis method fills the region to be repaired with the most similar matching block found in the source region; let Ω be the region to be repaired, Φ the known region, and δΩ the boundary of the region to be repaired; the main steps are as follows:
(2.1) priority computation, which determines the repair order; the priority combines two terms, the confidence C(p) of a pixel and the data term D(p) of edge strength:
P(p) = C(p) \cdot D(p)   (1)
where
C(p) = \frac{\sum_{q \in \Psi_p \cap \Phi} C(q)}{\mathrm{Area}(\Psi_p)}, \quad D(p) = \frac{\left|\nabla I_p^{\perp} \cdot n_p\right|}{\alpha} + \epsilon   (2)
in which Ψ_p is the target block to be repaired centered at point p, Ψ_p ∩ Ω and Ψ_p ∩ Φ are the parts of Ψ_p lying in Ω and Φ respectively, ∇I_p^⊥ is the isophote vector at p, n_p is the unit normal vector, α is a normalization parameter and ε is a disturbance constant; a larger C(p) means that more high-confidence pixels surround p, and a larger D(p) means that p lies where an isophote of the known region meets the boundary of the region to be repaired; points with high P(p) are repaired first;
(2.2) take the point with the highest priority as p and search the known region Φ for the block that best matches the known part Ψ_p ∩ Φ of the target block Ψ_p, i.e. the block with the minimum matching distance, denoted Ψ_{q̂};
(2.3) copy the part of Ψ_{q̂} corresponding to Ψ_p ∩ Ω into Ψ_p and update the boundary δΩ of the region to be repaired;
(2.4) repeat the above three steps until the whole region to be repaired is filled, completing the texture repair.
7. The print file identification method based on texture synthesis according to claim 1, characterized in that in step (3) the four statistical features, contrast, consistency, correlation and entropy, are computed as:
Contrast: \mathrm{CON} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1}(i-j)^2 P(i,j)
Consistency: \mathrm{Homo} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P(i,j)\log_2 P(i,j)
Correlation: \mathrm{Cor} = \frac{\sum_{i=0}^{K-1}\sum_{j=0}^{K-1} ij\,P(i,j) - \mu_x\mu_y}{\sigma_x\sigma_y}
where
\mu_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} i\,P(i,j), \quad \mu_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} j\,P(i,j)
\sigma_x = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (i-\mu_x)^2 P(i,j), \quad \sigma_y = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} (j-\mu_y)^2 P(i,j)
Entropy: \mathrm{Ee} = \sum_{i=0}^{K-1}\sum_{j=0}^{K-1} P^2(i,j).
CN201510161664.0A 2015-04-07 2015-04-07 Print file identification method based on texture synthesis Pending CN104732548A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510161664.0A CN104732548A (en) 2015-04-07 2015-04-07 Print file identification method based on texture synthesis

Publications (1)

Publication Number Publication Date
CN104732548A true CN104732548A (en) 2015-06-24

Family

ID=53456416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510161664.0A Pending CN104732548A (en) 2015-04-07 2015-04-07 Print file identification method based on texture synthesis

Country Status (1)

Country Link
CN (1) CN104732548A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1570978A (en) * 2004-05-09 2005-01-26 北京航空航天大学 Grain synthesizing method based on multiple master drawings
CN1731449A (en) * 2005-07-14 2006-02-08 北京航空航天大学 A method of image restoration

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A. Criminisi et al.: "Object Removal by Exemplar-Based Inpainting", IEEE Transactions on Image Processing *
朱文燕: "Analysis of feature performance in content-based image retrieval", China Master's Theses Full-text Database, Information Science and Technology *
罗霄 et al.: "Printing toner texture synthesis and evaluation based on non-parametric search", Science Technology and Engineering *
邓伟: "Research on printed document examination based on printing toner texture analysis", Electronic Measurement Technology *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107067360A (en) * 2016-10-28 2017-08-18 上海大学 Robust steganography method based on textures synthesis
CN107067360B (en) * 2016-10-28 2019-11-19 上海大学 Robust steganography method based on textures synthesis
CN106599910A (en) * 2016-12-02 2017-04-26 武汉珞珈博研科技有限责任公司 Printing file discriminating method based on texture recombination
CN106599910B (en) * 2016-12-02 2019-06-25 武汉珞珈博研科技有限责任公司 Mimeograph documents discrimination method based on texture recombination
CN107273898A (en) * 2017-07-10 2017-10-20 武汉珞珈博研科技有限责任公司 Mimeograph documents discrimination method based on Texture features region segmentation
CN107480728A (en) * 2017-08-28 2017-12-15 南京大学 A kind of discrimination method of the mimeograph documents based on Fourier's residual values
CN107480728B (en) * 2017-08-28 2019-02-26 南京大学 A kind of discrimination method of the mimeograph documents based on Fourier's residual values

Similar Documents

Publication Publication Date Title
US6252988B1 (en) Method and apparatus for character recognition using stop words
Busch et al. Texture for script identification
CN100440250C (en) Recognition method of printed mongolian character
France et al. A new approach to automated pollen analysis
US8494273B2 (en) Adaptive optical character recognition on a document with distorted characters
RU2445699C1 (en) Method to process data of optical character recognition (ocr), where output data includes character images with affected visibility
CN110298376B (en) Bank bill image classification method based on improved B-CNN
CN110020692B (en) Handwriting separation and positioning method based on print template
Tamilselvi et al. A Novel Text Recognition Scheme using Classification Assisted Digital Image Processing Strategy
Wen et al. A new optical music recognition system based on combined neural network
CN104732548A (en) Print file identification method based on texture synthesis
Nag et al. New cold feature based handwriting analysis for enthnicity/nationality identification
CN106599910B (en) Mimeograph documents discrimination method based on texture recombination
CN109087234A (en) Watermark embedding method and device in a kind of text image
CN107480728B (en) A kind of discrimination method of the mimeograph documents based on Fourier's residual values
Munir et al. Automatic character extraction from handwritten scanned documents to build large scale database
Deng et al. Printer identification based on distance transform
CN104700106B (en) A kind of mimeograph documents discrimination method based on information excavating and information fusion
CN115457044A (en) Pavement crack segmentation method based on class activation mapping
JP4492258B2 (en) Character and figure recognition and inspection methods
Ajao et al. Yoruba handwriting word recognition quality evaluation of preprocessing attributes using information theory approach
Gaceb et al. A new mixed binarization method used in a real time application of automatic business document and postal mail sorting.
EP2225700A1 (en) A method for processing optical character recognition (ocr) output data, wherein the output data comprises double printed character images
JP2008219800A (en) Writing extraction method, writing extracting device, and writing extracting program
Sotoodeh et al. Staff detection and removal using derivation and connected component analysis

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150624

WD01 Invention patent application deemed withdrawn after publication