CN111435559A - Financial terminal, detection method of paper money watermark image and memory - Google Patents

Financial terminal, detection method of paper money watermark image and memory

Info

Publication number
CN111435559A
Authority
CN
China
Prior art keywords
image
watermark
detected
sample
detection method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910024604.2A
Other languages
Chinese (zh)
Inventor
杜杨君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yihua Computer Co Ltd
Shenzhen Yihua Time Technology Co Ltd
Shenzhen Yihua Financial Intelligent Research Institute
Original Assignee
Shenzhen Yihua Computer Co Ltd
Shenzhen Yihua Time Technology Co Ltd
Shenzhen Yihua Financial Intelligent Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yihua Computer Co Ltd, Shenzhen Yihua Time Technology Co Ltd, Shenzhen Yihua Financial Intelligent Research Institute filed Critical Shenzhen Yihua Computer Co Ltd
Priority to CN201910024604.2A priority Critical patent/CN111435559A/en
Publication of CN111435559A publication Critical patent/CN111435559A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/003Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements
    • G07D7/0034Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements using watermarks
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/20Testing patterns thereon
    • G07D7/2008Testing patterns thereon using pre-processing, e.g. de-blurring, averaging, normalisation or rotation
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/20Testing patterns thereon
    • G07D7/2016Testing patterns thereon using feature extraction, e.g. segmentation, edge detection or Hough-transformation

Abstract

The invention provides a financial terminal, a detection method of a paper money watermark image and a memory, wherein the detection method of the paper money watermark image comprises the following steps: acquiring an image to be detected of the paper money to be detected on the target surface; filtering the image to be detected to obtain a filtered image; carrying out binarization processing on the filtered image to obtain a binarized image; traversing the binary image by using a sliding window, wherein the region with the highest matching degree between the binary image and the sliding window is a watermark image region; and intercepting an image corresponding to the watermark image area in the binary image to obtain a watermark image. The detection method provided by the invention can accurately position the watermark image and can accurately extract the watermark pattern.

Description

Financial terminal, detection method of paper money watermark image and memory
Technical Field
The invention relates to the technical field of financial terminals, in particular to a financial terminal, a detection method of a paper money watermark image and a memory.
Background
The banknote detector module of ATM equipment has the capability of identifying the watermark of a banknote: it collects an infrared transmission image generated by irradiating the banknote with infrared light, the watermark pattern is visible in this infrared transmission image, and the watermark characteristics of the banknote can be identified by analyzing the watermark pattern.
At present, the conventional identification method only identifies whether the watermark pattern exists; it does not identify whether the watermark pattern is correct, so the discriminating power of watermark identification is weak. In addition, the watermarks of different currencies, versions and denominations differ greatly. New and old banknotes, dirt, creases and the like also strongly affect the watermark: the watermark may be blurred or crossed by interference stripes, and, owing to the nature of the watermark, its contrast with the background is small, so that under these influences the watermark pattern is difficult or even impossible to extract. Therefore, watermark discrimination is usually simply abandoned for old banknotes, or for denominations whose paper is poor and whose watermark is unstable.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a method for detecting a banknote watermark image, which can accurately position the watermark image and accurately extract the watermark pattern.
The specific technical scheme provided by the invention is as follows: the detection method of the banknote watermark image comprises the following steps:
acquiring an image to be detected of the paper money to be detected on the target surface;
filtering the image to be detected to obtain a filtered image;
carrying out binarization processing on the filtered image to obtain a binarized image;
traversing the binary image by using a sliding window, wherein the region with the highest matching degree between the binary image and the sliding window is a watermark image region;
and intercepting an image corresponding to the watermark image area in the binary image to obtain a watermark image.
Further, the detection method further comprises:
and respectively matching the watermark image with a plurality of watermark characteristic templates, wherein if the watermark image is matched with at least one watermark characteristic template in the plurality of watermark characteristic templates, the watermark image is correct.
Further, matching the watermark image with a plurality of watermark feature templates specifically includes:
extracting the characteristics of the watermark image;
respectively calculating the similarity between the features of the watermark image and each watermark feature template in the plurality of watermark feature templates;
and judging whether the similarity exceeds a preset threshold value, and if the similarity exceeds the preset threshold value, matching the watermark image with the watermark characteristic template corresponding to the similarity.
Further, the plurality of watermark feature templates are obtained by:
selecting a plurality of sample watermark images;
and performing cluster analysis on the plurality of sample watermark images to obtain the plurality of watermark characteristic templates.
Further, the sliding window comprises a frame and sample watermark image sampling points located in the frame, and the sliding window is obtained through the following steps:
determining a frame of the sliding window according to the size of the sample watermark image;
and sampling the sample watermark image positioned in the frame to obtain a sample watermark image sampling point positioned in the frame.
Further, traversing the binarized image by using the sliding window, wherein the region with the highest matching degree between the binarized image and the sliding window is a watermark image region specifically comprises:
when the sliding window is located at each sliding position, taking the number of sample watermark image sampling points whose color is the same as that of the corresponding position in the binarized image as the matching degree of the sliding position;
and taking the area of the binary image corresponding to the sliding position with the maximum matching degree as a watermark image area.
Further, acquiring the image to be detected of the banknote to be detected with the target surface facing upward specifically comprises:
acquiring an image of an area where watermarks are positioned in paper money to be detected;
judging whether the image orientation of the area where the watermark is located is a target orientation;
if the image orientation of the area where the watermark is located is the target orientation, obtaining an image to be detected of the paper money to be detected on the target orientation;
and if the image orientation of the area where the watermark is located is not the target orientation, performing orientation conversion on the image of the area where the watermark is located, and obtaining the to-be-detected image of the paper money to be detected on the target orientation according to the image subjected to orientation conversion.
Further, performing the orientation conversion on the image of the area where the watermark is located comprises performing left-right mirror flipping and/or rotation on the image of the area where the watermark is located.
The invention also provides a memory storing a computer program executable to implement the detection method as defined in any one of the above.
The invention also provides a financial terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the detection method as defined in any one of the above when executing the computer program.
According to the detection method provided by the invention, after the image to be detected of the banknote to be detected with the target surface facing upward is obtained, the image is binarized to obtain a binarized image; the binarized image is then traversed with a sliding window, and the region with the highest matching degree between the binarized image and the sliding window is the watermark image region, from which the watermark image is cut out. The detection method provided by the invention can accurately locate the watermark image and accurately extract the watermark pattern.
Drawings
The technical solution and other advantages of the present invention will become apparent from the following detailed description of specific embodiments of the present invention, which is to be read in connection with the accompanying drawings.
FIG. 1 is a flow chart of a method of detecting a banknote watermark image;
FIG. 2 is a schematic diagram of the division of the image to be detected into cells and binarization partitions;
FIG. 3 is a schematic diagram of a binarized image obtained after binarization processing of an image to be detected;
FIG. 4 is a schematic diagram of an image after filtering and binarization are sequentially performed on an image to be detected;
FIG. 5 is a schematic diagram of an image to be detected after sequentially performing binarization and denoising on the image;
FIG. 6 is a schematic diagram of an image to be detected after filtering, binarization and denoising are sequentially performed on the image;
FIG. 7 is a schematic view of a sliding window;
FIG. 8 is a schematic diagram of a sample watermark image equally divided into an 8 × 8 grid;
fig. 9 is a schematic structural diagram of a financial terminal.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the specific embodiments set forth herein. Rather, these embodiments are provided to explain the principles of the invention and its practical application to thereby enable others skilled in the art to understand the invention for various embodiments and with various modifications as are suited to the particular use contemplated. In the drawings, like reference numerals will be used to refer to like elements throughout.
Referring to fig. 1, the method for detecting a watermark image of a banknote according to this embodiment includes:
and step S1, acquiring an image to be detected of the paper money to be detected on the target surface.
Watermark images of different currencies, denominations and versions differ in size, so when the banknote watermark image is detected, the size of the image to be detected of the banknote to be detected with the target surface facing upward is determined according to the currency, denomination and version. Since the detection method is the same in principle for watermark images of different currencies, denominations and versions, only banknotes of the same currency, the same denomination and the same version are described in this embodiment.
Specifically, step S1 includes:
s11, acquiring an image of an area where watermarks are located in the paper money to be detected;
step S12, judging whether the image orientation of the area where the watermark is located is a target orientation, and if the image orientation of the area where the watermark is located is the target orientation, obtaining an image to be detected of the paper money to be detected on the target orientation; if the image orientation of the area where the watermark is located is not the target orientation, the process proceeds to step S13;
and step S13, performing orientation conversion on the image of the area where the watermark is located, and obtaining the image to be detected of the banknote to be detected with the target face upward from the converted image.
In step S11, the area of the acquired image of the region where the watermark is located in the banknote to be detected should not be too large, since a larger area introduces more interference; nor should it be too small, since too small an area may result in an incomplete watermark image. Therefore, the area of the acquired image of the region where the watermark is located is larger than the actual area of the watermark image; in this embodiment, the distance between the boundary of the acquired region and the actual boundary of the watermark image is 2 mm. Of course, this distance may be set according to actual needs; the value given here is only an example and not a limitation.
The image of the banknote to be detected in this embodiment refers to an infrared transmission image generated by irradiating the banknote to be detected with infrared light. The infrared transmission image may be generated by irradiation from the front side of the banknote to be detected or from the back side; the image generated by front irradiation includes both the case where the watermark faces left and the case where it faces right, and the image generated by back irradiation likewise includes both cases. For the convenience of unified processing, orientation conversion is required in this embodiment to obtain the image to be detected of the banknote to be detected with the target face upward.
In step S13, when the orientation of the image of the area where the watermark is located is not the target orientation, the image needs to undergo orientation conversion, and the image to be detected of the banknote to be detected with the target face upward is obtained from the converted image. The orientation conversion consists of left-right mirror flipping and/or rotation of the image of the area where the watermark is located.
In this embodiment, the target orientation is, for example, front illumination with the watermark facing left. When the image of the area where the watermark is located is illuminated from the front and the watermark faces right, the image needs to be rotated by 180 degrees to obtain the image to be detected of the banknote to be detected with the target face upward. When the image is illuminated from the back and the watermark faces right, the image needs to be flipped as a left-right mirror image. When the image is illuminated from the back and the watermark faces left, the image needs to be flipped as a left-right mirror image and then rotated by 180 degrees to obtain the image to be detected of the banknote to be detected with the target face upward.
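As a concrete illustration, the orientation conversion just described might be implemented as in the following sketch (Python, OpenCV). The function name and boolean parameters are hypothetical; the three cases follow the text above under the assumed target orientation of front illumination with the watermark facing left.

```python
import cv2

def to_target_orientation(img, illuminated_from_front, watermark_faces_left):
    """Convert the watermark-region image to the target orientation
    (assumed here: front illumination, watermark facing left)."""
    if illuminated_from_front:
        if watermark_faces_left:
            return img                          # already the target orientation
        return cv2.rotate(img, cv2.ROTATE_180)  # front, facing right: rotate 180 degrees
    mirrored = cv2.flip(img, 1)                 # back illumination: mirror left-right
    if watermark_faces_left:
        # back illumination, watermark facing left: mirror flip, then rotate 180 degrees
        return cv2.rotate(mirrored, cv2.ROTATE_180)
    return mirrored                             # back illumination, facing right: mirror only
```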
And step S2, filtering the image to be detected to obtain a filtered image.
In step S2, the filtering of the image to be detected is mainly intended to smooth the image and remove high-frequency noise, so as to obtain a clearer and more coherent image. The filtering method can be mean filtering, median filtering, Gaussian filtering or the like. In this embodiment, a Gaussian filtering method is used to filter the image to be detected; of course, this is only shown as an example and is not a limitation.
The Gaussian kernel used for Gaussian filtering is given in the original filing as an equation image and is not reproduced here.
Because filtering takes a relatively long time, the filtering of the image to be detected can be applied selectively in the actual detection process.
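As a minimal sketch of this step (the kernel is not restated here, so a 5 × 5 kernel with an automatically derived sigma is assumed purely for illustration), the filtering could be written as:

```python
import cv2

def filter_image(image_to_detect):
    """Gaussian smoothing of the grayscale infrared image to be detected.
    Kernel size (5, 5) and sigma=0 (derived from the kernel size) are
    assumptions for illustration only."""
    return cv2.GaussianBlur(image_to_detect, (5, 5), 0)
```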
And step S3, carrying out binarization processing on the filtered image to obtain a binarized image.
In step S3, the binarization processing is performed on the image to be detected in order to distinguish the foreground image and the background image of the image to be detected, so as to highlight the contour of the watermark image. The binarization method in the embodiment comprises full-area binarization, partitioned binarization and partitioned overlapped binarization. Of course, other binarization methods may be selected in this embodiment, and this is not limited here. In this embodiment, the mean value in the partition may be selected as the binarization threshold, the binarization threshold may be selected according to a percentage threshold method (P-Tile method), or the binarization threshold may be selected according to an Otsu threshold method.
Since the partition overlap binarization has a good effect, the binarization processing process in this embodiment is described in detail below by taking the binarization method as the partition overlap binarization and the binarization threshold as the mean value in the partition as an example.
First, the image to be detected is evenly divided into N × N cells. Since the size of the image to be detected is larger than the actual size of the watermark image, the cells on the outermost ring of the image to be detected are assumed by default to contain no watermark image, so the cells on the outermost ring are not binarized. The smaller each cell is, the larger the total number of cells and the longer the binarization time. The binarization time is also related to the size of the binarization partition: the smaller the partition, the longer the binarization time. In practice, the size of each cell and the size of the binarization partition can be determined by weighing the binarization effect against the binarization time.
Referring to fig. 2, take as an example cells of 2 × 2 mm (8 × 8 pixels) and binarization partitions of 3 × 3 cells; that is, the image to be detected is divided into 7 × 7 cells, and the cells in the 1st row, 1st column, 7th row and 7th column are not binarized. First, the cell 11 in the 2nd row and 2nd column is binarized: cell 11 and the 8 cells adjacent to it (shown by the dotted frame in fig. 2) are taken as the first partition, the mean of the gray values of the pixels contained in the 9 cells of the first partition is taken as the binarization threshold of the first partition, the sum of the gray values of the pixels contained in cell 11 is taken as the gray value of cell 11, and the gray value of cell 11 is compared with the binarization threshold of the first partition. If the gray value of cell 11 is greater than the binarization threshold of the first partition, the gray values of all pixels in cell 11 are set to 255, i.e. the color of cell 11 is set to white; if the gray value of cell 11 is smaller than the binarization threshold of the first partition, the gray values of all pixels in cell 11 are set to 0, i.e. the color of cell 11 is set to black. The other cells (12/13/14/15, 21/22/23/24/25, 31/32/33/34/35, 41/42/43/44/45 and 51/52/53/54/55) are binarized through their binarization partitions in the same way, finally giving a binarized image whose size is 5 × 5 cells.
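A minimal sketch of this partition-overlap binarization, assuming an 8 × 8-pixel cell, a 3 × 3-cell partition and the in-partition mean as threshold. For consistency of units, the cell statistic here is the cell's mean gray value rather than the summed gray value mentioned above, and the skipped outer ring is simply left white — both are assumptions made for illustration.

```python
import numpy as np

def partition_overlap_binarize(gray, cell=8, part=3):
    """Partition-overlap binarization with the in-partition mean as threshold.
    gray: 2-D uint8 array whose height and width are multiples of `cell`."""
    rows, cols = gray.shape[0] // cell, gray.shape[1] // cell
    out = np.full_like(gray, 255)           # outermost ring stays white (not binarized)
    half = part // 2
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            # part x part block of cells centred on cell (r, c)
            block = gray[(r - half) * cell:(r + half + 1) * cell,
                         (c - half) * cell:(c + half + 1) * cell]
            threshold = block.mean()        # binarization threshold of this partition
            cell_px = gray[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
            if cell_px.mean() < threshold:  # darker than the partition: cell becomes black
                out[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell] = 0
    return out
```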
After step S3, the detection method in this embodiment further includes:
and step S31, denoising the binary image.
In step S31, the binarized image is denoised to remove small black blocky noise; preferably, black blocky noise of size 2 × 2 pixels or smaller is removed.
Specifically, the binarized image is denoised by a horizontal/vertical/horizontal scanning method. The binarized image is first scanned horizontally: if the number of consecutive pixels with a gray value of 0 in a row is less than 3, that is, the number of consecutive black pixels in that row is less than 3, the gray values of those pixels are set to 255. For example, if a row comprises 9 pixels and only the pixels in its 2nd and 3rd columns are black, this run of black pixels is shorter than 3, so the gray values of these two pixels are set to 255. After the horizontal scan is finished, a vertical scan is carried out: if the number of consecutive pixels with a gray value of 0 in a column is less than 3, that is, the number of consecutive black pixels in that column is less than 3, the gray values of those pixels are set to 255. After the vertical scan is finished, a horizontal scan is carried out again in the same way. After the three horizontal/vertical/horizontal scans, black blocky noise of size 2 × 2 pixels or smaller in the binarized image is removed. Of course, the number of scans can be set according to actual needs, other scanning orders can be used, and other denoising methods can also be adopted.
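A minimal sketch of this run-length denoising, assuming a 0/255 binarized image and the minimum run length of 3 used above:

```python
import numpy as np

def remove_short_black_runs(binary, min_run=3):
    """Set to white every horizontal run of black pixels shorter than min_run."""
    img = binary.copy()
    for row in img:
        start = None
        for x in range(len(row) + 1):
            black = x < len(row) and row[x] == 0
            if black and start is None:
                start = x
            elif not black and start is not None:
                if x - start < min_run:
                    row[start:x] = 255      # run too short: treat as noise
                start = None
    return img

def denoise(binary):
    """Horizontal / vertical / horizontal scanning, as described above."""
    img = remove_short_black_runs(binary)        # horizontal pass
    img = remove_short_black_runs(img.T).T       # vertical pass (work on the transpose)
    return remove_short_black_runs(img)          # horizontal pass again
```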
Referring to fig. 3-6, fig. 3 shows a binarized image obtained after binarization processing is performed on an image to be detected, fig. 4 shows an image obtained after filtering and binarizing the image to be detected in sequence, fig. 5 shows an image obtained after binarizing and denoising the image to be detected in sequence, and fig. 6 shows an image obtained after filtering, binarizing and denoising the image to be detected in sequence.
And step S4, traversing the binary image by using a sliding window, wherein the region with the highest matching degree between the binary image and the sliding window is a watermark image region.
In step S4, a watermark image is obtained by a sliding window traversal method, where the sliding window is a sample watermark image, and in the traversal process of the binarized image, an area of the binarized image that matches the sliding window with the highest degree is a watermark image area, that is, an area where the watermark image is located.
Referring to fig. 7, specifically, the sliding window includes a frame and sample watermark image sampling points located in the frame, and the sliding window in step S4 is obtained through the following steps:
step S300, determining a frame of a sliding window according to the size of the sample watermark image;
step S301, sampling the sample watermark image in the frame to obtain a plurality of sample watermark image sampling points in the frame.
In step S300, once the currency, denomination and version are determined, the sample watermark image can be determined. The sample watermark image is obtained by performing steps S1 to S3 on a sample banknote image; since the sample banknote image is generally a banknote image of higher quality, a clean sample watermark image can be obtained in this way, and the frame of the sliding window is then determined according to the size of the sample watermark image.
In step S301, in order to obtain the outline of the sample watermark image, the sample watermark image located in the frame needs to be sampled, and a plurality of sample watermark image sampling points located in the frame are obtained. The sampling refers to point tracing of a sample watermark image, and recording coordinates and colors of each sampling point.
After step S301, the coordinates of the plurality of sample watermark image sampling points need to be re-calibrated: the minimum X coordinate and the minimum Y coordinate among the sampling points are taken as the coordinate origin, and the coordinates of the origin are subtracted from the coordinates of each sampling point to obtain the calibrated coordinates of the sampling points. For example, if there are M sample watermark image sampling points with coordinates (xi, yi), 1 ≤ i ≤ M, the calibrated coordinates are (xi − min(x1, x2, ..., xM), yi − min(y1, y2, ..., yM)), 1 ≤ i ≤ M. In addition, the RECT parameters of the border of the sliding window also need to be recorded: the border of the sliding window is a rectangular border, its RECT parameters comprise the coordinates of the upper-left corner of the rectangle together with its width and height, and the size and position of the border of the sliding window can be determined from the RECT parameters. The sliding window obtained through steps S300 to S301 is stored for later recall.
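A sketch of how the sliding window could be built from a binarized sample watermark image. The patent does not fix the sampling density, so the stride parameter below is an assumption; the coordinate re-calibration and the RECT parameters follow the text above.

```python
import numpy as np

def build_sliding_window(sample_watermark, stride=2):
    """Return the RECT of the window border and the list of sample points
    (x, y, color) taken from a binarized sample watermark image."""
    h, w = sample_watermark.shape
    ys, xs = np.mgrid[0:h:stride, 0:w:stride]
    xs, ys = xs.ravel(), ys.ravel()
    colors = sample_watermark[ys, xs]        # 0 (black) or 255 (white)
    xs, ys = xs - xs.min(), ys - ys.min()    # re-calibrate to the minimum x and y
    rect = (0, 0, w, h)                      # upper-left corner, width, height
    points = list(zip(xs.tolist(), ys.tolist(), colors.tolist()))
    return rect, points
```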
After obtaining the sliding window, step S4 specifically includes:
and step S31, when the sliding window is located at each sliding position, recording the number of sample watermark image sampling points with the same color as the corresponding binary image, and taking the number of sample watermark image sampling points with the same color as the corresponding binary image as the matching degree of the sliding position.
In step S41, the sliding window traverses the binarized image starting from the top-left corner of the binarized image and sliding from left to right and from top to bottom. The offset between two adjacent slides is 1, so the number of positions the sliding window needs to traverse is I × J, where I = W0 − W1 + 1, J = H0 − H1 + 1, W0 represents the width of the binarized image, H0 represents the height of the binarized image, W1 represents the width of the sliding window, and H1 represents the height of the sliding window. Of course, the offset between two adjacent slides may also be increased to reduce the number of traversal positions; for example, if the offset between two adjacent slides is 2, the number of positions the sliding window needs to traverse is I × J, where I = (W0 − W1 + 1)/2 and J = (H0 − H1 + 1)/2.
When the sliding window is located at a given sliding position, the number of sample watermark image sampling points whose color is the same as that of the corresponding position in the binarized image is recorded. For example, if the sampling point with coordinates (xi, yi) is black and the pixel of the binarized image at the position corresponding to that sampling point is also black, the colors are considered the same. In this way the number of same-color sampling points is obtained for each sliding position, and this number is taken as the matching degree of that sliding position.
And step S42, taking the region of the binarized image corresponding to the sliding position with the highest matching degree as the watermark image region.
In step S42, the greater the number of sample watermark image sampling points with the same color, the higher the matching degree between that region of the binarized image and the sliding window, and the region with the highest matching degree is the region where the watermark image is located. Because each sliding position can be obtained from the RECT parameters of the sliding window and the sliding offset, once the number of same-color sampling points has been obtained for every position, the sliding position corresponding to the maximum number of same-color sampling points, and hence the position of the watermark image, is obtained.
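Putting the two sub-steps of step S4 together, a sketch of the traversal could look as follows; it reuses the points produced by the sliding-window sketch above, and the offset of 1 between adjacent slides matches the example in the text.

```python
import numpy as np

def locate_watermark(binary, rect, points, offset=1):
    """Slide the window over the binarized image and return the top-left
    corner of the region with the highest matching degree."""
    h0, w0 = binary.shape
    _, _, w1, h1 = rect
    xs = np.array([p[0] for p in points])
    ys = np.array([p[1] for p in points])
    colors = np.array([p[2] for p in points])
    best_score, best_pos = -1, (0, 0)
    for top in range(0, h0 - h1 + 1, offset):
        for left in range(0, w0 - w1 + 1, offset):
            # matching degree: number of sample points with the same color
            score = int(np.count_nonzero(binary[top + ys, left + xs] == colors))
            if score > best_score:
                best_score, best_pos = score, (left, top)
    return best_pos, best_score
```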
And step S5, intercepting the image corresponding to the watermark image area in the binary image to obtain the watermark image.
In step S5, after the region where the watermark image is located has been determined, the image at the corresponding position in the binarized image is cut out according to the position of the watermark image and the size of the frame of the sliding window, giving the watermark image. The detection method of this embodiment can accurately extract the watermark pattern regardless of whether the watermark image of the banknote to be detected is clear and whether the banknote has creases or stains; moreover, the detection algorithm of this embodiment is simple, fast, does not depend on an operating-system library, and can be used directly on embedded devices.
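A minimal sketch of the cropping in step S5, using the position returned by the traversal sketch above:

```python
def crop_watermark(binary, top_left, rect):
    """Cut the watermark image region out of the binarized image."""
    left, top = top_left
    _, _, w1, h1 = rect        # width and height of the sliding-window frame
    return binary[top:top + h1, left:left + w1]
```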
In this embodiment, after obtaining the position of the watermark image and the watermark image, the detection method further includes:
and S6, matching the watermark image with a plurality of watermark characteristic templates respectively, wherein if the watermark image is matched with any one of the watermark characteristic templates, the watermark image is correct.
The watermark feature template in step S6 is obtained by:
and S600, selecting K sample watermark images, wherein K represents the number of the selected sample watermark images, and K is more than or equal to 1000.
Step S601, performing cluster analysis on the K sample watermark images to obtain a plurality of watermark characteristic templates.
In step S601, each of the K sample watermark images is equally divided into an n × n grid; for example, each sample watermark image is equally divided into an 8 × 8 grid (as shown in fig. 8). Each sample watermark image then corresponds to an n × n matrix whose entries are the numbers of black pixels contained in the corresponding grid cells, so that K matrices of dimension n × n are obtained.
The cluster analysis of the K sample watermark images uses the K-Means clustering algorithm; for example, the number of clusters is 20, the number of iterations is 100, and the accuracy is 0.01. In the first iteration, 20 sample watermark images are randomly selected from the K sample watermark images as the centroids of the 20 clusters, and the distance from each sample watermark image to each of the 20 centroids is calculated, where the distance refers to the distance between the n × n matrices; the cluster at the smallest distance is the class to which the sample watermark image belongs, so after one iteration the class of every sample watermark image is obtained. Then the center of each class is taken as the new centroid of the 20 clusters, and the iteration is repeated until the number of iterations reaches 100 or the accuracy of 0.01 is reached, finally giving 20 watermark feature templates.
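A sketch of the template construction under the parameters above (8 × 8 grid, 20 clusters, 100 iterations, accuracy 0.01). OpenCV's kmeans is used here purely for illustration; the patent does not prescribe a particular implementation, and random initialisation with KMEANS_RANDOM_CENTERS only approximates the centroid initialisation described in the text.

```python
import cv2
import numpy as np

def black_count_matrix(watermark, n=8):
    """n x n matrix of black-pixel counts for one binarized watermark image."""
    h, w = watermark.shape
    feat = np.zeros((n, n), dtype=np.float32)
    for r in range(n):
        for c in range(n):
            cell = watermark[r * h // n:(r + 1) * h // n,
                             c * w // n:(c + 1) * w // n]
            feat[r, c] = np.count_nonzero(cell == 0)
    return feat

def build_templates(sample_watermarks, n=8, clusters=20):
    """Cluster the sample feature matrices; the cluster centres are the templates."""
    data = np.stack([black_count_matrix(img, n).ravel() for img in sample_watermarks])
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 0.01)
    _, _, centres = cv2.kmeans(data, clusters, None, criteria, 1,
                               cv2.KMEANS_RANDOM_CENTERS)
    return centres.reshape(clusters, n, n)
```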
In step S6, it is known whether the watermark image is correct by matching the watermark image with a plurality of watermark feature templates, respectively, and the type of the watermark image can also be obtained. The K sample watermark images in this embodiment may be sample watermark images of different currencies, different denominations, and different versions, and the currency, the denomination, and the version of the watermark image may be obtained after step S6. The K sample watermark images may also be sample watermark images of the same currency, the same currency value, and different versions, and which version the watermark image belongs to can be obtained after step S6.
Specifically, in step S6 the watermark image is matched with the plurality of watermark feature templates by calculating the similarity between the n × n matrix of the watermark image and the n × n matrix of each watermark feature template; the higher the similarity, the more normal the watermark image. The similarity calculation method may be a cosine similarity calculation or a Euclidean distance calculation. In the cosine similarity calculation, the closer the result is to 1, the higher the similarity. In the Euclidean distance calculation, the smaller the distance between the two n × n matrices, the higher the similarity.
In this embodiment, a cosine similarity calculation method is adopted to calculate the similarity between the watermark image and the watermark feature template, and the cosine similarity calculation method is as follows:
cos(θm) = (A · Bm) / (‖A‖ · ‖Bm‖)
where A denotes the n × n matrix of the watermark image, Bm denotes the n × n matrix of the m-th watermark feature template, A · Bm denotes the element-wise inner product of the two matrices, ‖·‖ denotes the corresponding Euclidean norm, and cos(θm) represents the similarity between the watermark image and the m-th watermark feature template. After cos(θm) is calculated, it is judged whether cos(θm) is greater than the similarity threshold; if at least one of the m similarities is greater than the similarity threshold, the watermark image is correct.
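A corresponding sketch of this matching step, reusing the feature matrices and templates from the sketches above; the similarity threshold of 0.9 is an assumption, as its value is not stated here.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two n x n feature matrices."""
    a, b = a.ravel().astype(np.float64), b.ravel().astype(np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_watermark(watermark_matrix, templates, threshold=0.9):
    """Return (is_correct, index of best template); the watermark image is
    judged correct if at least one similarity exceeds the threshold."""
    sims = [cosine_similarity(watermark_matrix, t) for t in templates]
    best = int(np.argmax(sims))
    return sims[best] > threshold, best
```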
It can be seen that the detection method of this embodiment can not only detect whether the watermark pattern exists, but also detect whether the watermark pattern is correct, and in addition, can classify the watermark image.
Referring to fig. 9, the present embodiment also provides a financial terminal including a memory 10, a processor 20, and a computer program stored in the memory 10 and executable on the processor 20, the processor 20 implementing the detection method as described above when executing the computer program.
The financial terminal in this embodiment may be a financial device such as an ATM or a banknote validator. The financial terminal may include, but is not limited to, the memory 10 and the processor 20. Those skilled in the art will appreciate that fig. 9 is merely an example of a financial terminal and is not intended to be limiting; the terminal may include more or fewer components than those shown, some components may be combined, or different components may be used. For example, a financial terminal may also include input and output devices, network access devices, buses, and so on.
The memory 10 may be an internal storage unit of the financial terminal, such as a hard disk or internal memory of the financial terminal. The memory 10 may also be an external storage device of the financial terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card provided on the financial terminal.
The memory 10 may also include both an internal storage unit and an external storage device of the financial terminal. The memory 10 is used to store the computer program and other programs and data required by the financial terminal. The memory 10 may also be used to temporarily store data that has been output or is to be output.
The processor 20 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is merely used as an example, and in practical applications, the foregoing function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the service processing system is divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The foregoing is directed to embodiments of the present application and it is noted that numerous modifications and adaptations may be made by those skilled in the art without departing from the principles of the present application and are intended to be within the scope of the present application.

Claims (10)

1. A detection method for watermark images of paper money is characterized by comprising the following steps:
acquiring an image to be detected of the paper money to be detected on the target surface;
filtering the image to be detected to obtain a filtered image;
carrying out binarization processing on the filtered image to obtain a binarized image;
traversing the binary image by using a sliding window, wherein the region with the highest matching degree between the binary image and the sliding window is a watermark image region;
and intercepting an image corresponding to the watermark image area in the binary image to obtain a watermark image.
2. The detection method according to claim 1, further comprising:
and respectively matching the watermark image with a plurality of watermark characteristic templates, wherein if the watermark image is matched with any one of the watermark characteristic templates, the watermark image is correct.
3. The detection method according to claim 2, wherein the matching the watermark image with the plurality of watermark feature templates respectively specifically comprises:
extracting the characteristics of the watermark image;
respectively calculating the similarity between the features of the watermark image and each watermark feature template in the plurality of watermark feature templates;
and judging whether the similarity exceeds a preset threshold value, and if the similarity exceeds the preset threshold value, matching the watermark image with the watermark characteristic template corresponding to the similarity.
4. The detection method according to claim 2, wherein the plurality of watermark feature templates are obtained by:
selecting a plurality of sample watermark images;
and performing cluster analysis on the plurality of sample watermark images to obtain the plurality of watermark characteristic templates.
5. The detection method according to claim 1, wherein the sliding window comprises a frame and sample watermark image sampling points located in the frame, and the sliding window is obtained by:
determining a frame of the sliding window according to the size of the sample watermark image;
and sampling the sample watermark image positioned in the frame to obtain a plurality of sample watermark image sampling points positioned in the frame.
6. The detection method according to claim 5, wherein traversing the binarized image by using the sliding window, and the region with the highest matching degree between the binarized image and the sliding window being a watermark image region specifically comprises:
when the sliding window is located at each sliding position, taking the number of sample watermark image sampling points whose color is the same as that of the corresponding position in the binarized image as the matching degree of the sliding position;
and taking the area of the binary image corresponding to the sliding position with the maximum matching degree as a watermark image area.
7. The detection method according to claim 1, wherein acquiring the image to be detected of the banknote to be detected with the target surface facing upward specifically comprises:
acquiring an image of an area where watermarks are positioned in paper money to be detected;
judging whether the image orientation of the area where the watermark is located is a target orientation;
if the image orientation of the area where the watermark is located is the target orientation, obtaining an image to be detected of the paper money to be detected on the target orientation;
and if the image orientation of the area where the watermark is located is not the target orientation, performing orientation conversion on the image of the area where the watermark is located, and obtaining the to-be-detected image of the paper money to be detected on the target orientation according to the image subjected to orientation conversion.
8. The detection method according to claim 6, wherein performing the orientation conversion on the image of the area where the watermark is located comprises performing left-right mirror flipping and/or rotation on the image of the area where the watermark is located.
9. A memory, characterized in that the memory stores a computer program executable to implement the detection method according to any one of claims 1-8.
10. A financial terminal comprising a memory, a processor and a computer program stored in the memory and operable on the processor, the processor when executing the computer program implementing the detection method of any one of claims 1 to 8.
CN201910024604.2A 2019-01-10 2019-01-10 Financial terminal, detection method of paper money watermark image and memory Pending CN111435559A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910024604.2A CN111435559A (en) 2019-01-10 2019-01-10 Financial terminal, detection method of paper money watermark image and memory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910024604.2A CN111435559A (en) 2019-01-10 2019-01-10 Financial terminal, detection method of paper money watermark image and memory

Publications (1)

Publication Number Publication Date
CN111435559A true CN111435559A (en) 2020-07-21

Family

ID=71579895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910024604.2A Pending CN111435559A (en) 2019-01-10 2019-01-10 Financial terminal, detection method of paper money watermark image and memory

Country Status (1)

Country Link
CN (1) CN111435559A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE20307116U1 (en) * 2003-05-08 2003-08-07 Esser Werner Banknote inspecting system has watermark illuminated and optical enlargement of security thread
CN105046252A (en) * 2014-11-21 2015-11-11 华中科技大学 Method for recognizing Renminbi (Chinese currency yuan) crown codes
CN107134047A (en) * 2017-05-11 2017-09-05 深圳怡化电脑股份有限公司 White watermark detection method and device
CN107527418A (en) * 2017-07-11 2017-12-29 深圳怡化电脑股份有限公司 Black watermark positioning method, apparatus, terminal device and readable storage medium
CN107610321A (en) * 2017-10-10 2018-01-19 深圳怡化电脑股份有限公司 A kind of identification note true and false method, apparatus, equipment and storage medium
CN108320373A (en) * 2017-01-17 2018-07-24 深圳怡化电脑股份有限公司 A kind of method and device of the detection of guiding against false of paper currency mark

Similar Documents

Publication Publication Date Title
Debiasi et al. PRNU-based detection of morphed face images
WO2017197884A1 (en) Banknote management method and system
JP5616958B2 (en) Method for banknote detector device and banknote detector device
JP2020525947A (en) Manipulated image detection
US20140037159A1 (en) Apparatus and method for analyzing lesions in medical image
US20070154078A1 (en) Processing images of media items before validation
CN107103683B (en) Paper money identification method and device, electronic equipment and storage medium
CN106952393B (en) Paper money identification method and device, electronic equipment and storage medium
CN108921831B (en) Stained coin identification method based on image processing technology
JP2014057306A (en) Document image binarization and segmentation using image phase congruency
CN106920318B (en) Method and device for identifying paper money
RU2745098C2 (en) Method of determinig authorship of a painting
CN110378351B (en) Seal identification method and device
CN112313718A (en) Image-based novelty detection of material samples
CN106599923B (en) Method and device for detecting seal anti-counterfeiting features
CN106204616B (en) Method and device for identifying currency value of Iran paper money
CN113920434A (en) Image reproduction detection method, device and medium based on target
Chakraborty et al. Review of various image processing techniques for currency note authentication
CN106447908B (en) Paper money counterfeit distinguishing method and device
CN111435559A (en) Financial terminal, detection method of paper money watermark image and memory
CN116665321A (en) Parking lot vehicle management method based on edge nano-tube technology
CN113033562A (en) Image processing method, device, equipment and storage medium
CN112435226B (en) Fine-grained image stitching region detection method
KR101232684B1 (en) Method for detecting counterfeits of banknotes using Bayesian approach
CN112308141B (en) Scanning bill classification method, system and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200721