WO2007105890A1 - Recognizing the denomination of a note using wavelet transform - Google Patents

Recognizing the denomination of a note using wavelet transform

Info

Publication number
WO2007105890A1
Authority
WO
WIPO (PCT)
Prior art keywords
note
image
wavelet transform
feature vector
wavelet
Prior art date
Application number
PCT/KR2007/001194
Other languages
French (fr)
Inventor
Eui Sun Choi
Original Assignee
Nautilus Hyosung Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nautilus Hyosung Inc. filed Critical Nautilus Hyosung Inc.
Publication of WO2007105890A1 publication Critical patent/WO2007105890A1/en

Links

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/20Testing patterns thereon
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/52Scale-space analysis, e.g. wavelet analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20064Wavelet transform [DWT]

Definitions

  • The feature vector is a vector value configured by comparing the magnitudes of the absolute values of the extracted wavelet coefficients with a predetermined reference value. For example, obtaining the feature vector of the d2 region 724 proceeds as follows. First, the d2 region 724 is divided into m x n cells in the horizontal and vertical directions as shown in Fig. 8 (S350). For each of the m x n cells, the magnitudes of the absolute values of the wavelet coefficients are compared with a reference value (S360), and the number of wavelet coefficients whose magnitudes are equal to or larger than the reference value is extracted, thereby configuring a first vector (S370).
  • The d2 region 724 is divided into m x n (11 x 4) cells as shown in Fig. 9, and the number of coefficients whose magnitudes are larger than the reference value 7 is counted among the coefficients in the first cell 910. That is, the number of such coefficients in cell 910, which comprises the coefficients {5, 7, 9, 10, 11, 9, ..., 9, 9, 8, 7, 7, 6}, is 35, and the number 35 becomes the feature vector 912 of that cell.
  • The normal analysis is one of the statistical feature extraction methods, and means a process of transforming a feature vector into a vector space that maximizes the ratio shown in Formula 1. That is, when feature vectors are transformed through the normal analysis, the variance is decreased for notes of the same denomination, and the relative distance is increased for notes of different denominations. Therefore, correct and reliable feature extraction can be achieved.
  • Table 1 below shows classification ratios for the methods where the normal analysis is applied and not applied, with the wavelet decomposition value J set to 1, the number of cells in a row (m) set to 2, and the number of cells in a column (n) set to 2.
  • The classification ratio is 97.71 % in the method where the normal analysis is not applied, and 99.91 % in the method where it is applied. It is understood from these results that the accuracy of the method where the normal analysis is applied is increased as compared with the method where it is not applied.
  • The present invention has been described in connection with a method of recognizing a typical note.
  • The present invention is not limited to the recognition of such a typical note but can be applied to recognition of general types of notes having certain printed patterns or watermark images, such as securities, a variety of paper money guaranties, and checks. Therefore, the scope of the present invention should not be defined by the embodiment described above but by the appended claims and equivalents thereof.
  • the denomination of a note can be stably recognized regardless of whether the note has been worn out and contaminated due to the use thereof for a long time.
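The cell division and coefficient counting of steps S350 to S370 above can be sketched as follows. This is a minimal illustration with a hypothetical helper name; following S360, a coefficient is counted when its magnitude is equal to or larger than the reference value, and an even division of the region into cells is assumed.

```python
def feature_vector(region, m, n, ref):
    """Divide a sub-band region into m x n cells (m cells per row, n per
    column; even division assumed) and count, per cell, the wavelet
    coefficients whose absolute value is at least the reference value."""
    rows, cols = len(region), len(region[0])
    cell_h, cell_w = rows // n, cols // m
    vec = []
    for i in range(n):
        for j in range(m):
            count = sum(1
                        for r in range(i * cell_h, (i + 1) * cell_h)
                        for c in range(j * cell_w, (j + 1) * cell_w)
                        if abs(region[r][c]) >= ref)
            vec.append(count)
    return vec

# Two cells across, one down, reference value 7:
feature_vector([[5, 8, 9, 3],
                [7, 2, 10, 6]], m=2, n=1, ref=7)   # -> [2, 2]
```

Concatenating the per-cell counts over all sub-bands yields the feature vector that the normal analysis then compresses.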

Abstract

The present invention relates to a method of recognizing the denomination of a note using wavelet transform, and more particularly, to a method of recognizing a note using a feature vector that is configured by preprocessing an inputted note image, such as detecting edges in the note image, performing wavelet transform for the preprocessed input image by a predetermined number of times, and comparing the magnitudes of the absolute values of wavelet coefficients with a reference value, in order to more correctly recognize the denomination of the note without being affected by entire or partial contamination of the note.

Description

Description
RECOGNIZING THE DENOMINATION OF A NOTE USING
WAVELET TRANSFORM
Technical Field
[1] The present invention relates to a method of recognizing the denomination of a note using wavelet transform, and more particularly, to a method of recognizing a note using a feature vector that is configured by preprocessing an inputted note image, such as detecting edges in the note image, performing wavelet transform for the preprocessed input image by a predetermined number of times, and comparing the magnitudes of the absolute values of wavelet coefficients with a reference value, in order to more correctly recognize the denomination of the note without being affected by entire or partial contamination of the note.
[2]
Background Art
[3] In a conventional method of identifying a note, one-dimensional data
(one-dimensional data array or image) are obtained from a note using a plurality of single element type transmissive and reflective optical sensors, and an image pattern of the note is examined and compared with patterns of other notes in terms of specific regions that are considered to have a lot of distinguishable factors (template method). That is, the denomination of a note is identified by comparing image patterns with one another in terms of a set of specific regions (a template) or by measuring the size of the note.
[4] However, in the method of comparing and examining surface patterns of specific regions, if the type of a note to be identified is changed or a note is replaced by a new type of note, the contents and position of an existing template should be changed. Further, the contents and position of an existing template should be also changed if the position or orientation of a light emitting device or a light receiving device is adjusted. Accordingly, the design of an apparatus employing the method should be frequently changed. Furthermore, since the type of a note is identified only through the surface pattern scheme in the conventional method, it is impossible to determine the position of a portion to be examined so that optimized surface patterns can be obtained for all kinds of notes. This makes it difficult to distinguish all kinds of notes quite precisely.
[5] In order to solve the problem, there has been proposed a method of recognizing the denomination of a note, wherein the denomination of a note is recognized by scanning the note to obtain an image of the note, performing wavelet transform for the scanned note image, dividing each of wavelet sub-band regions into a certain number of cells, comparing the magnitude of the absolute value of a wavelet coefficient in each cell with a reference value, and extracting the number of wavelet coefficients of which magnitude is larger than the reference value as a feature vector. However, due to changes in luminance information of the finally scanned image, which are produced by entire or partial contamination of an actual note or inconsistent output of a sensor for creating the note image, it is difficult to guarantee the reliability or robustness of the proposed method of recognizing the denomination of a note using wavelet transform. Accordingly, in the present invention, the changes in luminance information of the note image are decreased by appropriately preprocessing the inputted note image, such as extracting edges in the note image, before the step of extracting wavelet features. Thus, it is possible to extract wavelet features that are more robust against noise.
[6] In addition, as for setting the reference value that is compared with the magnitudes of the absolute values of the wavelet coefficients in the process of extracting the wavelet features, an allowable range as a determination criterion is required to be set quite narrowly in order to correctly recognize the denomination of a note. However, if the allowable range is too narrow, it is difficult to recognize the denomination of a note due to entire contamination of the note. On the contrary, in a case where the allowable range is too wide, if shape patterns of different types of notes are similar, the denomination of a note cannot be correctly recognized, e.g., the denomination of a note is wrongly recognized. In the present invention, an optimized reference value can be set through appropriate preprocessing, i.e., extracting edges in a note image.
[7]
Disclosure of Invention
Technical Problem
[8] The present invention is conceived to solve the aforementioned problems. The present invention enables correct recognition of a note even though the note to be identified is entirely or partially contaminated or replaced by a new type of note, and makes it easy to identify all kinds of notes quite precisely, using a feature vector that is configured by preprocessing an inputted note image, such as detecting edges in the note image, performing wavelet transform for the preprocessed input image by a predetermined number of times, and comparing the magnitudes of the absolute values of wavelet coefficients with a reference value.
[9]
Technical Solution
[10] To achieve the object described above, a method of recognizing the denomination of a note using wavelet transform according to the present invention comprises the steps of performing a preprocessing process of detecting edges in an original image created by scanning the note; performing the wavelet transform for the preprocessed original image by a predetermined number of times; obtaining wavelet coefficients for each of the sub-bands created through the wavelet transform, and dividing each of the sub-bands into m x n cells in horizontal and vertical directions; comparing a magnitude of an absolute value of each of the wavelet coefficients with a reference value for each of the m x n cells, and extracting the number of wavelet coefficients with the magnitudes larger than or equal to the reference value; completing the extracting step for each of the sub-bands, and configuring a feature vector with a vertical structure; applying normal analysis to the feature vector to enhance identification ability and to decrease the size of the feature vector; and recognizing the denomination of the note by measuring similarity between the feature vector and a predetermined determination criterion vector.
[11]
Advantageous Effects
[12] In order to achieve more reliable recognition of the denomination of a note, which is more robust against noise such as changes in luminance of a note image due to entire or partial contamination of the note, inconsistent output of a sensor and the like, the present invention enables recognition of the denomination of a note using a feature vector that is configured by detecting edges in a note image, performing wavelet transform for the note image by a predetermined number of times, and comparing the magnitudes of the absolute values of wavelet coefficients with a reference value. Thus, the present invention enables correct recognition of a note even though the note to be identified is entirely or partially contaminated or replaced by a new type of note, and makes it easy to identify all kinds of notes quite precisely.
[13] Further, a note can be identified regardless of whether the note has been contaminated or worn out due to the use thereof for a long time, and the denomination of the note can be stably recognized by excluding an error factor such as changes in the performance of an object included in a note identifying apparatus.
[14]
Brief Description of the Drawings
[15] Fig. 1 shows a typical wavelet transform process.
[16] Fig. 2 shows results of a general two-dimensional wavelet transform performed on an image.
[17] Fig. 3 is a flowchart illustrating a method of recognizing the denomination of a note using wavelet transform according to the present invention.
[19] Fig. 4 shows an example of an original image created according to the present invention.
[19] Fig. 5 shows 3 x 3 Sobel masks according to the present invention.
[20] Fig. 6 shows results of detecting edges from the original image according to the present invention.
[21] Fig. 7 shows results of performing wavelet transform for the original image twice according to the present invention.
[22] Fig. 8 shows an example in which a d2 sub-band is divided into m x n cells according to the present invention.
[23] Fig. 9 exemplarily shows a process of extracting a feature vector of the d2 sub-band according to the present invention.
[24] Fig. 10 exemplarily shows a process of extracting a feature vector by performing wavelet transform twice according to the present invention.
[25] Fig. 11 shows a process of extracting a feature vector according to the present invention.
[26] Explanation of Reference Numerals for Main Portions in Drawings
[27] S320: Preprocessing process S330: Wavelet transform
[28] S350: Divide sub-band into cells
[29] S360: Compare coefficients with reference value by cell
[30] S370: Configure feature vector
[31] S380: Apply normal analysis
[32]
Best Mode for Carrying Out the Invention
[33] Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. It should be noted that like elements are designated by like reference numerals throughout the drawings.
[34] The outline of general wavelet transform will be described before describing the present invention.
[35] The wavelet transform is a process of reconfiguring a signal into very simple basic functions. That is, the wavelet transform can be considered as a method of decomposing data, a function, or an operator into different frequency components and examining each of the components associated with a resolution corresponding to each scale. The fundamental principle of the wavelet transform is similar to that of the Fourier analysis. The use of the wavelet transform for signal processing can restore a weak signal mixed with noise. However, compared with the Fourier analysis in which filters with the same size are used for all frequency bandwidths, the wavelet transform is different in that a narrow window is used for a high frequency bandwidth and a wide window is used for a low frequency bandwidth.
[36] The wavelet transform has been proved particularly to be useful for X-ray or magnetic resonance image processing in the medical field. An image processed in such a method can be processed clearly without a blur in its details. In addition, since the wavelet transform exactly reflects the fact that a person first recognizes an overall outline of a thing and then gradually concentrates on detailed portions, the wavelet transform is suitable for image processing.
[37] Basic operations of the wavelet transform are applied to discrete signals having n samples. A pair of filters are applied to a signal to separate the signal into a low frequency bandwidth and a high frequency bandwidth. Since each of the bandwidths is sub-sampled with a factor of 2, it contains n/2 samples.
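The filter-pair-and-subsample operation described above can be sketched for the Haar case. This is a minimal illustration with a hypothetical helper name; the patent itself gives no code, and the averaging/differencing shown is one standard formulation of the Haar filters.

```python
def haar_step(signal):
    """One level of the 1D Haar wavelet transform.

    The low pass filter takes pairwise averages and the high pass filter
    pairwise half-differences; both outputs are sub-sampled by 2, so an
    n-sample signal yields two bands of n/2 samples each.
    """
    low = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    high = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return low, high

low, high = haar_step([9, 7, 3, 5])
# low -> [8.0, 4.0], high -> [1.0, -1.0]; applying haar_step to the low
# band again would give the next, coarser level.
```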
[38] An example of the wavelet transform will be explained with reference to Fig. 1.
When signals are extracted from an image using low pass filters (LPF) and high pass filters (HPF) in the spatial X- and Y-axis directions and then subjected to wavelet transform, four sub-bands LL 110, LH 120, HL 130 and HH 140 are created for each frequency bandwidth.
[39] At this time, the sub-bands LL, LH, HL, and HH are differentiated according to the filters applied to the image as shown in Fig. 1. The LL sub-band 110 comprises coefficients in which high frequency components are excluded from the image by applying low pass filters to the original image in the horizontal and vertical directions. The HH sub-band 140 for which high pass filters are applied to the original image in the horizontal and vertical directions contains only high frequency components contrary to the LL sub-band 110.
[40] The HL sub-band 130 for which a high pass filter is applied in the vertical direction contains error components of a frequency in the vertical direction, and the LH sub- band 120 for which a high pass filter is applied in the horizontal direction contains error components of a frequency in the horizontal direction. The LH sub-band and the HL sub-band have an effect obtained by employing edge detection for horizontal and vertical components from the original image, and the HH sub-band has an effect obtained by employing edge detection for diagonal components.
[41] Then, the wavelet transform is recursively performed over a plurality of steps, not limited to a first transform, and thus, the respective steps have different resolutions (multi-resolutions) and frequency characteristics (scalability). A specific example of the two-dimensional wavelet transform will be explained with reference to Fig. 2. Four sub-bands shown in Fig. 2 (b) (hereinafter referred to as LL region 220, HL region 222, LH region 224, and HH region 226) are created by performing wavelet transform once. If the LL region 220 is subjected once more to wavelet transform, four sub-bands of LL region 230, LH2 region 234, HL2 region 232, and HH2 region 236 are created, and thus, the original image is decomposed into a total of seven sub-bands (LL 230, LH1 224, HL1 222, HH1 226, LH2 234, HL2 232, and HH2 236).
[42] The reason why the LL regions 220, 230, 240 and 250 are decomposed is that each
LL region contains important information on the image. The decomposition is repeated as many times as a certain decomposition value until desired information is obtained. The LL regions 220, 230, 240 and 250 are recursively decomposed, thereby obtaining a new processing target image.
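The two-dimensional transform described above can be sketched for the Haar case: filter and sub-sample along one direction, then along the other, which yields the four sub-bands. This is a minimal illustration with a hypothetical helper name; an even-sized image is assumed.

```python
import numpy as np

def haar2d_step(img):
    """One level of a 2D Haar transform (even-sized image assumed):
    filter and sub-sample horizontally, then vertically, yielding the
    four sub-bands LL, LH, HL and HH."""
    img = np.asarray(img, dtype=float)
    h_lo = (img[:, 0::2] + img[:, 1::2]) / 2   # horizontal low pass
    h_hi = (img[:, 0::2] - img[:, 1::2]) / 2   # horizontal high pass
    ll = (h_lo[0::2] + h_lo[1::2]) / 2         # low pass in both directions
    hl = (h_lo[0::2] - h_lo[1::2]) / 2         # high pass vertically
    lh = (h_hi[0::2] + h_hi[1::2]) / 2         # high pass horizontally
    hh = (h_hi[0::2] - h_hi[1::2]) / 2         # high pass in both directions
    return ll, lh, hl, hh

ll, lh, hl, hh = haar2d_step([[4, 2], [2, 0]])
# ll = [[2.]], lh = [[1.]], hl = [[1.]], hh = [[0.]]
```

Applying the same step again to `ll` produces the second-level sub-bands, exactly as in the recursive decomposition of Fig. 2.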
[43] The wavelet transform is repeated as many times as the decomposition value to reduce the width of a low frequency bandwidth, and thus, a doubled spatial resolution is obtained. The decomposition value provided as a criterion for determining the number of recursive wavelet transforms is determined as an appropriate value in consideration of a loss of information and the size of a feature vector.
[44] In the present invention, high speed Haar wavelet transform is used to extract features from a note image. The method of recognizing the denomination of a note according to the present invention allows any kind of wavelet transform to be used. The high speed Haar wavelet transform is advantageous in expressing a wide area of continuous colors, easy to implement, and speedy in transforming. In addition, the high speed Haar wavelet transform satisfies conditions of uniqueness, completeness, invariance, sensitivity and abstraction that are characteristics of a system for expressing an image.
[45] Fig. 3 is a flowchart illustrating a method of recognizing the denomination of a note using wavelet transform according to the present invention.
[46] In order to recognize the denomination of a note according to the present invention, a note desired to be recognized is first scanned (S310) so that an original image 410 is created as shown in Fig. 4. The original image 410 is then preprocessed (S320). The preprocessing means detecting edges in the note image, which is performed by means of a relatively simple filter mask. The edge detection in the note image enhances the reliability of the feature vector and makes it easy to remove noise in the note image and to detect an outer boundary or the like, thereby enabling smooth processing of the note image. Considering the entire processing time required for recognition of the denomination of a note, the edge detection as the preprocessing can be implemented in hardware using a digital signal processing (DSP) chip or the like. In this case, since the preprocessing is performed almost concurrently with obtaining (scanning) the note image, wavelet features can be extracted at a high speed.
[47] The edge means a border line placed between regions with different strengths of colors, and the edge detection is to detect such a border line existing in an image. The reason why the edge detection is important in image recognition is that most of important information of an image exists on the border line between different regions. In addition, if an image is expressed as edges, it is advantageous in that a large amount of data to be processed in an upper step of image recognition can be reduced while maintaining information on the shape of an object in the image, and the image recognition can be easily combined with another image recognition algorithm.
[48] Since pixel values are abruptly changed around the edges, a differential operation can be used for the edge detection. Well-known algorithms include a method of using Sobel transform or Robert transform having a first differential form, and a method of using Laplacian transform having a second differential form. Here, in order to achieve the object of the present invention, an arbitrary type of edge detection operator is allowed to be used for preprocessing such as edge detection in the present invention.
[49] Since other differential methods calculate a gradient in the middle of two pixels, the present invention employs the Sobel transform using a 3x3 mask for calculating a gradient to supplement the differential methods. The Sobel transform extracts edges in the horizontal or vertical directions and geometrically adds the results. Thus, the Sobel transform exhibits excellent performance in detecting edges in a diagonal direction, as well as in detecting edges in the horizontal and vertical directions.
[50] Fig. 5 shows 3x3 Sobel masks in which x and y denote the x-direction (vertical direction) and the y-direction (horizontal direction), respectively. In the Sobel transform, when an original image is inputted, each pixel value of the original image is first multiplied by the corresponding weighting factor of the 3x3 Sobel mask. Second, the products of the pixel values and the weighting factors of the Sobel mask are summed. Third, the absolute value of the sum obtained in the second step is taken. Fourth, the aforementioned steps are repeated to obtain absolute values by moving one pixel at a time in the y-direction. Finally, the process from the first to third steps is repeated to obtain absolute values by moving one pixel at a time in the x-direction. The absolute value can be expressed as G = |Gx| + |Gy|, where Gx denotes the value taken from the Sobel mask in the x-direction, and Gy denotes the value taken from the Sobel mask in the y-direction.
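The masking procedure above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the mask orientation follows the common Sobel convention and may be transposed relative to Fig. 5, and border pixels are simply left at zero.

```python
import numpy as np

def sobel_edges(img):
    """Edge magnitude G = |Gx| + |Gy| computed with 3x3 Sobel masks.

    Each interior pixel's 3x3 neighbourhood is multiplied elementwise by
    the mask, the products are summed, and the absolute values of the two
    directional sums are added.
    """
    gx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gy = gx.T
    img = np.asarray(img, dtype=float)
    out = np.zeros_like(img)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            win = img[r - 1:r + 2, c - 1:c + 2]
            out[r, c] = abs((win * gx).sum()) + abs((win * gy).sum())
    return out

# A vertical step edge (grey level 0 -> 10) produces a strong response
# along the boundary columns and zero response in flat regions.
edges = sobel_edges([[0, 0, 10, 10]] * 4)
```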
[51] Edges are detected by referring to changes in the absolute values calculated in the x- and y-directions. That is, a region where the magnitude of the absolute value is abruptly increased is detected as an edge. Since the edge is basically a region where discontinuity of a grey level exists in an image, the edge is obtained by calculating differences in grey level among neighboring pixels around a certain pixel.
[52] When the edges of the original image are detected as described above, an image in which changes of luminance are more alleviated as compared with the original grey scale image is created as shown in Fig. 6, and it is easy to remove an unnecessary background portion outside the rectangular region of the pattern of the note.
[53] Next, the preprocessed original image is wavelet transformed as many times as the decomposition value (S330). This step applies conventional wavelet transform to the recognition of a note. If the preprocessed original image 610 is wavelet transformed, LL, LH, HL and HH regions are created. Next, the LL region is recursively wavelet transformed as many times as the certain decomposition value, so that further LH, HL and HH regions are created.
[54] The recursive wavelet transforms of the LL region are performed as many times as the decomposition value. In order to determine an appropriate decomposition value in case of the note recognition, reference data that are stored as representative feature vectors by denomination are compared with vectors of a given note image so as to determine whether such feature vectors can be clearly identified. That is, correspondence between two vectors is calculated through one-to-one comparison of the reference data with a given feature vector, and the LL region is then recursively wavelet transformed several times until each note can be identified according to the results of the calculation.
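The one-to-one comparison with stored reference data described above can be sketched as a nearest-neighbour match. The distance metric is an assumption, since the patent speaks only of measuring similarity against representative feature vectors per denomination; the denominations and vectors below are made up for illustration.

```python
def classify(feature, references):
    """Return the denomination whose stored reference feature vector lies
    closest (Euclidean distance) to the given feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(references, key=lambda denom: dist(feature, references[denom]))

# Hypothetical reference data, one representative vector per denomination:
references = {"1000 won": [12, 3, 7], "5000 won": [2, 14, 9]}
classify([11, 4, 6], references)   # -> "1000 won"
```

If no reference vector lies clearly closest, the decomposition value can be increased and the comparison repeated, as the paragraph above describes.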
[55] For example, if the wavelet transform is repeated twice as shown in Fig. 7, the preprocessed original image 610 is decomposed into LL region 720, LH region 724, HL region 722, and HH region 726 through the first wavelet transform, as shown in Fig. 7 (b). If the LL region 720 is decomposed once more through the second wavelet transform, LL region 730, LH1 region 724, HL1 region 722, HH1 region 726, LH2 region 734, HL2 region 732 and HH2 region 736 are created, as shown in Fig. 7 (c). Hereinafter, description will be made in connection with an example in which the decomposition value is set to 'J level'.
[56] An example in which the wavelet transform is applied to the recognition of a note up to J level as the decomposition value will be described in detail. When the preprocessed original image 610 is wavelet transformed as many times as the decomposition value, it is decomposed into 3J + 1 regions. Through the decomposition, 3J regions (d1 to d3J) can be obtained, excluding the LL region 730 that is created in the last transform. For example, if the J level is set to two, as shown in Fig. 7 (c), the preprocessed original image 610 is decomposed into seven (3 x 2 + 1) regions, and six regions, d1 region 722, d2 region 724, d3 region 726, d4 region 732, d5 region 734, and d6 region 736, can be obtained.
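The recursive decomposition into 3J + 1 regions can be sketched with a one-level 2-D Haar transform (claim 5 mentions a high-speed Haar wavelet method) applied J times to the LL band. The normalization, the d1/d2/d3 ordering, and the LH/HL naming below are illustrative assumptions (conventions vary between texts), and the image dimensions are assumed divisible by 2^J.

```python
import numpy as np

def haar2d(image):
    """One level of the 2-D Haar wavelet transform: returns (LL, LH, HL, HH)."""
    a = image[0::2, 0::2]; b = image[0::2, 1::2]
    c = image[1::2, 0::2]; d = image[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # low-pass in both directions (approximation)
    lh = (a + b - c - d) / 4.0   # horizontal detail components
    hl = (a - b + c - d) / 4.0   # vertical detail components
    hh = (a - b - c + d) / 4.0   # diagonal detail components
    return ll, lh, hl, hh

def decompose(image, j):
    """Recursively transform the LL band J times; return the 3J detail bands
    (d1..d3J in the text's ordering) plus the final LL band, i.e. 3J + 1
    regions in total."""
    bands = []
    ll = image
    for _ in range(j):
        ll, lh, hl, hh = haar2d(ll)
        bands.extend([hl, lh, hh])   # d1, d2, d3 at level 1; d4, d5, d6 at level 2
    return bands, ll
```

With j = 2 an 8x8 image yields six detail bands plus a 2x2 LL band, matching the seven (3 x 2 + 1) regions of the example above.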
[57] When the wavelet transforms are completed as many times as the value of J level, each of the regions d1 to d6 (when J is two) is configured with wavelet coefficients having the features of each band, using differences in information between neighboring regions (S340). As for the wavelet coefficients, the LL region 730, to which low-pass filters are applied in the horizontal and vertical directions, is configured with approximate coefficient values of the image, and d5 region 734 and d2 region 724, i.e., the LH regions to which a high-pass filter is applied in the horizontal direction, are respectively configured with coefficient values of the horizontal components of the image.
[58] In addition, d4 region 732 and d1 region 722, i.e., the HL regions to which a high-pass filter is applied in the vertical direction, are respectively configured with coefficient values of the vertical components of the image, and d6 region 736 and d3 region 726, i.e., the HH regions to which high-pass filters are applied in both the horizontal and vertical directions, are respectively configured with coefficient values of the diagonal components of the image.
[59] Next, a feature vector is configured from each of the 3J regions (e.g., regions d1 to d6 if J level is two) created by the wavelet transform. At this time, not all of the 3J regions need to participate in configuring the feature vector. Similarly to determining the decomposition value of the wavelet transform, it is possible to determine the positions and number of regions that maximize the performance of recognizing the denomination of a note. In this case, an optimal combination of the regions can be obtained by examining the recognition performance of all combinations of the regions (full search, binary tree search, and the like).
[60] The feature vector is a vector value configured by comparing the magnitudes of the absolute values of the extracted wavelet coefficients with a predetermined reference value. As an example, the process of obtaining the feature vector of d2 region 724 will be described. First, d2 region 724 is divided into m x n cells in the horizontal and vertical directions as shown in Fig. 8 (S350). For each of the m x n cells, the magnitudes of the absolute values of the wavelet coefficients are compared with a reference value (S360) to extract the number of wavelet coefficients whose magnitudes are equal to or larger than the reference value, thereby configuring a first vector (S370).
[61] For example, in a case where the reference value is 7, d2 region 724 is divided into m x n (11 x 4) cells as shown in Fig. 9, and the number of coefficients whose magnitudes are larger than the reference value 7 is counted among the coefficients in the first cell m1n1 910. That is, the number of such coefficients in cell m1n1 910, comprising coefficients {5, 7, 9, 10, 11, 9, ..., 9, 9, 8, 7, 7, 6}, is 35, and the number 35 becomes the feature vector element 912 of cell m1n1. Subsequently, the number of coefficients whose magnitudes are larger than the reference value 7 is counted among the coefficients in the second cell m2n1 920, and the counted number 27 becomes the feature vector element 922 of cell m2n1. When the extraction process is completed up to cell mmnn 930, a first vector 940 having m x n elements is created. Likewise, if the process of creating a feature vector is repeated for all six (J = 2) sub-band regions (d1 to d6) shown in Fig. 10, 6 x m x n feature vector elements are created.
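The cell-wise counting of steps S350 to S370 can be sketched as below. The comparison uses "equal to or larger than" as stated in paragraph [60]; the function names and the equal-sized cell split are illustrative assumptions.

```python
import numpy as np

def cell_counts(band, m, n, threshold):
    """Divide a sub-band into m x n cells and count, per cell, the wavelet
    coefficients whose absolute value is >= threshold (steps S350-S370)."""
    h, w = band.shape
    counts = []
    for i in range(m):                      # m cells along the vertical axis
        for j in range(n):                  # n cells along the horizontal axis
            cell = band[i * h // m:(i + 1) * h // m,
                        j * w // n:(j + 1) * w // n]
            counts.append(int(np.sum(np.abs(cell) >= threshold)))
    return counts                           # m*n elements of the feature vector

def feature_vector(bands, m, n, threshold):
    """Concatenate the counts of all sub-bands into one 3J*m*n-element vector."""
    vec = []
    for band in bands:
        vec.extend(cell_counts(band, m, n, threshold))
    return np.array(vec)
```

Each cell contributes one count, so a region divided into m x n cells contributes m x n elements, and all 3J regions together give the 3Jmn-element vector described next.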
[62] Fig. 10 shows the values obtained when J is 2. Expressed as a formula, a feature vector having a vertical structure of 3Jmn bits in total, i.e. 3 x wavelet transform level (J) x number of cells in a row (m) x number of cells in a column (n), is extracted as shown in Fig. 11.

[63] Based on the characteristics of the wavelet transform, each of the aforementioned 3J + 1 wavelet sub-band regions is another expression of a specific frequency resolution of the entire image. Thus, dividing a sub-band region into m x n cells implies dividing the entire image into m x n portions. Accordingly, by extracting a feature vector on a per-cell basis, information on the position of a feature point in the entire image can be expressed in the feature vector.
[64] Next, the similarity between the configured feature vector and a predetermined determination criterion vector is measured (S390), thereby recognizing the denomination of the note (S400). When measuring similarity, e.g. by conventional distance comparison, the size of the feature vector generally becomes an important variable for processing speed because of the computational complexity involved. In the present invention, in order to enhance the discriminability of the feature vector and to decrease its size before evaluation against the determination criterion, a normal analysis can be applied to the feature vector configured as described above (S380). If the normal analysis is applied, unnecessary components of the feature vector are removed and its size is effectively reduced. Therefore, when determining similarity against the determination criterion, a considerable amount of processing time and the memory space required to store the feature vector can be saved.
[65] The normal analysis is one of the statistical feature extraction methods, and refers to a process of transforming a feature vector into a vector space Φ that maximizes the ratio shown in formula 1. That is, when feature vectors are transformed through the normal analysis, the variance is decreased among notes of the same denomination, and the relative distance is increased between notes of different denominations. Therefore, correct and reliable feature extraction can be achieved.
[66]
[67] [formula 1]
[68]

$$\text{Fisher ratio} = \frac{\sigma_B^2}{\sigma_W^2} = \frac{\text{between-class variance}}{\text{within-class variance}} = \frac{\Phi^{T} \Sigma_B \Phi}{\Phi^{T} \Sigma_W \Phi}$$
[69] In addition, if the feature vector is transformed into a sub-space of the vector space Φ in formula 1, the dimension of the finally transformed feature vector is decreased, and thus the size of the feature vector is decreased.
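The patent gives only the ratio of formula 1. A common way to realize such a "normal analysis" is a Fisher-style discriminant projection obtained from the generalized eigenproblem on the between-class and within-class scatter matrices; the sketch below makes that assumption, and the scatter-matrix construction and the small regularization term are illustrative, not taken from the disclosure.

```python
import numpy as np

def fisher_projection(features, labels, dim):
    """Find a projection maximising the ratio of formula 1
    (between-class variance over within-class variance); keep 'dim' axes."""
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    mean = features.mean(axis=0)
    d = features.shape[1]
    s_b = np.zeros((d, d))   # between-class scatter (numerator of the ratio)
    s_w = np.zeros((d, d))   # within-class scatter (denominator of the ratio)
    for c in np.unique(labels):
        x = features[labels == c]
        mc = x.mean(axis=0)
        s_b += len(x) * np.outer(mc - mean, mc - mean)
        s_w += (x - mc).T @ (x - mc)
    s_w += 1e-6 * np.eye(d)              # regularize in case S_w is singular
    # Generalized eigenproblem: directions with the largest Fisher ratio.
    vals, vecs = np.linalg.eig(np.linalg.solve(s_w, s_b))
    order = np.argsort(vals.real)[::-1]  # largest Fisher ratio first
    return vecs.real[:, order[:dim]]
```

Projecting feature vectors onto the returned sub-space reduces their dimension while keeping same-denomination vectors close together and different-denomination vectors far apart, as paragraph [65] describes.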
[70] Table 1 below shows the classification ratios with and without the normal analysis applied. For a classification test of 10,000 Won, 5,000 Won and 1,000 Won notes, the wavelet decomposition value J is set to 1, the number of cells in a row (m) is set to 2, and the number of cells in a column (n) is set to 2. As a result of classifying the notes, the classification ratio is 97.71 % when the normal analysis is not applied, whereas it is 99.91 % when the normal analysis is applied to the feature vector using formula 1. These results show that applying the normal analysis increases the accuracy as compared with not applying it.
[71] [72] Table 1

  Method                         Classification ratio
  Without normal analysis        97.71 %
  With normal analysis           99.91 %
[73] [74] Next, in order to compare the feature vector configured as described above with the predetermined determination criterion vector, the final similarity is determined using a shortest distance distribution obtained by a conventional method such as the shortest distance technique. That is, after calculating the distance between the feature vectors by applying a normalized Euclidean distance, if the value produced by applying the shortest distance technique to the calculated distance is smaller than a predetermined reference value, the denomination is recognized as the same.
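The decision rule described above, a normalized Euclidean distance to each denomination's reference vector with acceptance only when the shortest distance is below a reference value, can be sketched as follows. The dictionary interface and parameter names are illustrative assumptions.

```python
import numpy as np

def classify(feature, references, stds, threshold):
    """Nearest-reference decision with a normalized Euclidean distance.
    'references' maps denomination -> representative feature vector; 'stds'
    gives the per-component standard deviations used for normalization.
    Returns the denomination, or None when the shortest distance exceeds
    the reject threshold."""
    best, best_dist = None, float("inf")
    for denom, ref in references.items():
        diff = (np.asarray(feature) - np.asarray(ref)) / np.asarray(stds)
        dist = float(np.sqrt(np.sum(diff ** 2)))   # normalized Euclidean distance
        if dist < best_dist:
            best, best_dist = denom, dist
    return best if best_dist < threshold else None
```

Returning None for distances above the reference value models the reject case, so a heavily degraded or unknown note is not forced into the nearest denomination.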
[75] The present invention has been described in connection with a method of recognizing a typical note. However, the present invention is not limited to the recognition of such a typical note but can be applied to the recognition of general types of notes having certain printed patterns or watermark images, such as securities, various paper money guaranties, and checks. Therefore, the scope of the present invention should not be defined by the embodiment described above but by the appended claims and equivalents thereof.
[76]
Industrial Applicability [77] According to the present invention described above, there is provided more reliable note recognition that is robust against noise, such as changes in the luminance of a note image due to entire or partial contamination of the note, inconsistent sensor output, or the like. Therefore, even when a note to be recognized is replaced by a new type of note, the denominations of all kinds of notes can easily be identified quite precisely.
[78] In addition, the denomination of a note can be stably recognized even when the note has been worn out or contaminated through long use.
[79]

Claims

[1] A method of recognizing a denomination of a note using wavelet transform, comprising the steps of:
performing a preprocessing process of detecting edges in an original image created by scanning the note;
performing the wavelet transform for the preprocessed original image a predetermined number of times;
obtaining wavelet coefficients for each of the sub-bands created through the wavelet transform, and dividing each of the sub-bands into m x n cells in horizontal and vertical directions;
comparing a magnitude of an absolute value of each of the wavelet coefficients with a reference value for each of the m x n cells, and extracting the number of wavelet coefficients with magnitudes larger than or equal to the reference value;
completing the extracting step for each of the sub-bands, and configuring a feature vector with a vertical structure;
applying normal analysis to the feature vector to enhance identification ability and to decrease the size of the feature vector; and
recognizing the denomination of the note by measuring similarity between the feature vector and a predetermined determination criterion vector.
[2] The method as claimed in claim 1, wherein the preprocessing step comprises removing noise by detecting edges in the original image, and obtaining an outer boundary of the original image to enable smooth note image processing by facilitating division of the original image into a note image portion to be actually processed and an unnecessary background portion, and wherein an edge image relatively less sensitive to a change in luminance of the note image is received as an input in the preprocessing step.
[3] The method as claimed in claim 2, wherein the edges are detected through Sobel transform using a 3 x 3 mask.
[4] The method as claimed in claim 2, wherein the edge detection is implemented in hardware using a digital signal processing (DSP) chip.
[5] The method as claimed in claim 1, wherein the wavelet transform comprises a high speed Haar wavelet transform method.
[6] The method as claimed in claim 1, wherein the wavelet coefficients have the features of each of the sub-bands by using a difference in information between neighboring sub-bands for each of the sub-band images.
[7] The method as claimed in claim 1, wherein the step of configuring the feature vector comprises configuring the feature vector in a vertical structure of total 3Jmn bits that is 3 x wavelet transform level (J) x number of cells in a row (m) x number of cells in a column (n).
[8] The method as claimed in claim 1, wherein the step of measuring the similarity comprises determining final similarity using shortest distance distribution obtained through a conventional shortest distance method.
PCT/KR2007/001194 2006-03-13 2007-03-12 Recognizing the denomination of a note using wavelet transform WO2007105890A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2006-0023053 2006-03-13
KR1020060023053A KR101001691B1 (en) 2006-03-13 2006-03-13 Recognizing the Denomination of a Note Using Wavelet transform

Publications (1)

Publication Number Publication Date
WO2007105890A1 true WO2007105890A1 (en) 2007-09-20

Family

ID=38509681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/001194 WO2007105890A1 (en) 2006-03-13 2007-03-12 Recognizing the denomination of a note using wavelet transform

Country Status (2)

Country Link
KR (1) KR101001691B1 (en)
WO (1) WO2007105890A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101537809B1 (en) * 2014-03-20 2015-07-23 기산전자 주식회사 Method for determining fitness of banknote
KR102511132B1 (en) * 2018-04-14 2023-03-17 기산전자 주식회사 Paper-money counting apparatus and method
CN113538809B (en) * 2021-06-11 2023-08-04 深圳怡化电脑科技有限公司 Data processing method and device based on self-service equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5530772A (en) * 1994-07-04 1996-06-25 At&T Global Information Solutions Company Apparatus and method for testing bank notes for genuineness using Fourier transform analysis
JPH0935000A (en) * 1995-07-19 1997-02-07 Sony Corp Method and device for recognizing handwritten character
KR19990024302A (en) * 1997-08-13 1999-04-06 구자홍 Banknote Recognition Method and Device
US6234294B1 (en) * 1998-10-29 2001-05-22 De La Rue International Ltd Method and system for recognition of currency by denomination

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6157746A (en) 1997-02-12 2000-12-05 Sarnoff Corporation Apparatus and method for encoding wavelet trees generated by a wavelet-based coding method


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010089157A1 (en) 2009-02-05 2010-08-12 Bundesdruckerei Gmbh Method and device for verifying document using a wavelet transformation
WO2013030186A1 (en) 2011-09-01 2013-03-07 Bundesdruckerei Gmbh Apparatus for identifying documents
DE102011082028A1 (en) 2011-09-01 2013-03-07 Bundesdruckerei Gmbh Device for document recognition
US9715635B2 (en) 2011-09-01 2017-07-25 Bundesdruckerei Gmbh Apparatus for identifying documents
CN103035061A (en) * 2012-09-29 2013-04-10 广州广电运通金融电子股份有限公司 Anti-counterfeit characteristic generation method of valuable file and identification method and device thereof
CN103035061B (en) * 2012-09-29 2014-12-31 广州广电运通金融电子股份有限公司 Anti-counterfeit characteristic generation method of valuable file and identification method and device thereof
US9499006B2 (en) 2012-09-29 2016-11-22 Grg Banking Equipment Co., Ltd. Anti-counterfeiting feature generation method for valuable document and authentication method and device therefor
CN103198352A (en) * 2013-03-04 2013-07-10 上海古鳌电子科技股份有限公司 Cash-counting machine with automatic correction function
EP2830003A3 (en) * 2013-07-26 2015-05-20 Fujitsu Limited Image processing apparatus and method
US9405956B2 (en) 2013-07-26 2016-08-02 Fujitsu Limited Image processing apparatus and method
US9483710B1 (en) 2013-07-26 2016-11-01 Fujitsu Limited Image processing apparatus and method
CN110838127A (en) * 2019-10-30 2020-02-25 合肥工业大学 Feature image edge detection method for intelligent automobile

Also Published As

Publication number Publication date
KR101001691B1 (en) 2010-12-15
KR20070093210A (en) 2007-09-18

Similar Documents

Publication Publication Date Title
WO2007105890A1 (en) Recognizing the denomination of a note using wavelet transform
JP5314007B2 (en) Authentication of security documents, especially banknotes
US20070154078A1 (en) Processing images of media items before validation
WO2007105892A1 (en) Recognizing the denomination of a note using wavelet transform
KR20110002043A (en) Scale robust feature-based identifiers for image identification
WO2007105891A1 (en) Recognizing the denomination of a note using wavelet transform
Pham et al. Deep learning-based fake-banknote detection for the visually impaired people using visible-light images captured by smartphone cameras
Youn et al. Efficient multi-currency classification of CIS banknotes
Tessfaw et al. Ethiopian banknote recognition and fake detection using support vector machine
KR101811361B1 (en) Authenticatioin of security documents, in particular of banknotes
CN111310628A (en) Paper currency forming mode inspection and identification method based on paper currency printing pattern characteristics
CN112313718A (en) Image-based novelty detection of material samples
CN106599923B (en) Method and device for detecting seal anti-counterfeiting features
CN107134048A (en) A kind of bill anti-counterfeit discrimination method of Intelligent Recognition watermark feature
Huynh-Kha et al. A robust algorithm of forgery detection in copy-move and spliced images
CN107170108B (en) A kind of splicing paper money detection method and system
Vaishnavi et al. Recognizing image splicing forgeries using histogram features
Costa et al. Recognition of banknotes in multiple perspectives using selective feature matching and shape analysis
KR101232684B1 (en) Method for detecting counterfeits of banknotes using Bayesian approach
Nayak et al. Neural network approach for Indian currency recognition
TWI378406B (en) Method for performing color analysis operation on image corresponding to monetary banknote
CN209625317U (en) Paper product identification device and paper discriminating apparatus and cash inspecting machine
Gui et al. Blind median filtering detection based on histogram features
JP3756250B2 (en) Authenticity discrimination device for paper sheets
CN110969757A (en) Multi-country banknote type rapid identification technology

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07715590

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07715590

Country of ref document: EP

Kind code of ref document: A1