CN105740753A - Fingerprint identification method and fingerprint identification system - Google Patents


Info

Publication number
CN105740753A
Authority
CN
China
Prior art keywords
fingerprint
point
image
feature point
point set
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410770480.XA
Other languages
Chinese (zh)
Inventor
姜波
黄忠伟
隋歆钰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BYD Co Ltd
Original Assignee
BYD Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BYD Co Ltd filed Critical BYD Co Ltd
Priority to CN201410770480.XA priority Critical patent/CN105740753A/en
Publication of CN105740753A publication Critical patent/CN105740753A/en
Pending legal-status Critical Current


Abstract

The invention discloses a fingerprint identification method. The method comprises a pre-processing step in which a fingerprint image is segmented and filtered, the pre-processing step comprising a segmentation sub-step and a filtering sub-step. In the segmentation sub-step, the Harris corner energy of each pixel of the fingerprint image is calculated, pixels whose Harris corner energy is smaller than a preset threshold are removed to obtain a binary map, and dilation and erosion are performed on the binary map. In the filtering sub-step, the direction field, the frequency field and the curvature field of the binary map are calculated respectively, and a Log-Gabor filter is used to filter the processed binary map. The method reduces the computational load of fingerprint identification. The invention further provides a fingerprint identification system.

Description

Fingerprint identification method and fingerprint recognition system
Technical field
The present invention relates to biometric identification technology, and in particular to a fingerprint identification method and a fingerprint identification system.
Background technology
Fingerprint identification technology occupies an important position in the field of biometric recognition, because fingerprints are unique, stable, reliable, easy to collect and inexpensive. Automatic fingerprint identification systems are therefore among the most widely used identity authentication technologies, but many key problems remain to be solved. The difficulties facing fingerprint identification must be tackled from two aspects, pre-processing and matching: (1) how to enhance the quality of the fingerprint image collected by the acquisition device; and (2) how to select an effective recognition algorithm that resolves the conflict between the accuracy and the efficiency of fingerprint recognition.
Among existing fingerprint enhancement algorithms, methods based on Gabor filtering achieve relatively good results and are considered the most popular approach at present. A Gabor filter has optimal joint space-frequency resolution and can make good use of the direction and frequency information characteristic of fingerprints. However, the Gabor filter also has intrinsic defects: for the even-symmetric Gabor filter, it is impossible to construct a transfer function that has an arbitrary bandwidth and no DC component, and the filter cannot obtain a spectrum coverage as wide as possible while also achieving optimal spatial localization.
Fingerprint matching determines whether two given fingerprint images come from the same finger by matching the feature sets extracted from them. It is the key and final step of a fingerprint identification system, and the performance of the matching algorithm largely determines the performance of the whole system. There are many kinds of fingerprint matching algorithms; typical approaches include image-based matching, ridge pattern matching, point pattern matching and graph-based matching.
Summary of the invention
The present invention aims to solve at least one of the technical problems existing in the prior art. To this end, the present invention provides a fingerprint identification method and a fingerprint identification system.
The fingerprint identification method of a preferred embodiment of the present invention comprises:
a pre-processing step of segmenting and filtering a fingerprint image;
a feature extraction step of extracting feature points from the pre-processed fingerprint image to form a feature point set, and further removing pseudo feature points; and
an identification step of calibrating the feature point set and performing identification with a matching algorithm to obtain a fingerprint identification result;
wherein the pre-processing step comprises:
a segmentation sub-step of calculating the Harris corner energy of each pixel of the fingerprint image, removing pixels whose Harris corner energy is smaller than a predetermined threshold to obtain a binary map, and dilating and eroding the binary map; and
a filtering sub-step of calculating the direction field, the frequency field and the curvature field of the binary map respectively, and filtering the binary map with a Log-Gabor filter to obtain the pre-processed fingerprint image.
In some embodiments, the pre-processing step further comprises, before the segmentation sub-step:
a normalization sub-step of normalizing the fingerprint image.
In some embodiments, the normalization sub-step comprises:
histogram equalization of the fingerprint image;
selection of a region of interest; and
adaptive block-wise normalization based on local characteristics, in which the fingerprint image is divided into non-overlapping image blocks of a predetermined size and each image block is normalized separately.
In some embodiments, the segmentation sub-step comprises:
calculating the Harris corner energy of each pixel of the fingerprint image;
sorting the Harris corner energies in ascending order, removing the lowest 30%, and then dilating and eroding the binary map; and
binarizing the result to obtain the binary map.
In some embodiments, the direction field is calculated with a gradient-based direction field computation method and is corrected with a low-pass filter.
In some embodiments, the frequency field is calculated with a projection method.
In some embodiments, the curvature field is estimated from the direction field, and ridge curvature is measured with a curvature metric.
In some embodiments, the binary map is divided into non-overlapping image sub-blocks of a predetermined size, a windowed FFT is used to extract the spectrum of each image sub-block, and a Log-Gabor filter is constructed according to the ridge direction and frequency of the image sub-block to filter its spectrum.
In some embodiments, the pre-processing step further comprises, after the filtering sub-step:
a thinning sub-step of binarizing the filtered binary map with an adaptive dynamic threshold method and thinning it with a mathematical-morphology look-up table to obtain the pre-processed fingerprint image.
In some embodiments, the feature points include fingerprint minutiae, the fingerprint minutiae include endpoints and bifurcation points, and they are extracted with an 8-neighborhood coded ridge tracing algorithm.
In some embodiments, the feature points include fingerprint singular points, the fingerprint singular points include core points and delta points, and they are extracted by evaluating the Poincaré index.
In some embodiments, the identification step comprises:
a feature point set calibration sub-step of locating a matched origin pair between the feature point set and a template point set, establishing polar coordinate systems with the matched origins of the feature point set and the template point set as poles respectively, calculating the rotation parameter between the feature point set and the template point set, and calibrating the feature point set; and
a point matching sub-step of attempting to match the feature point set with the template point set and counting the matched feature points.
In some embodiments, the feature points include singular points and minutiae, and the feature point set calibration sub-step comprises:
fingerprint image calibration, in which the ridges and the singular points are calibrated; and
polar coordinate conversion, in which all minutiae are transformed into the polar coordinate system.
The fingerprint identification system of a preferred embodiment of the present invention comprises:
a pre-processing module for segmenting and filtering a fingerprint image;
a feature extraction module for extracting feature points from the pre-processed fingerprint image to form a feature point set, and further removing pseudo feature points; and
an identification module for calibrating the feature point set and performing identification with a matching algorithm to obtain a fingerprint identification result;
wherein the pre-processing module comprises:
a segmentation module for calculating the Harris corner energy of each pixel of the fingerprint image, removing pixels whose Harris corner energy is smaller than a predetermined threshold to obtain a binary map, and dilating and eroding the binary map; and
a filtering sub-module for calculating the direction field, the frequency field and the curvature field of the binary map respectively, and filtering the binary map with a Log-Gabor filter to obtain the pre-processed fingerprint image.
In some embodiments, the pre-processing module further comprises:
a normalization module for normalizing the fingerprint image.
In some embodiments, the pre-processing module further comprises:
a thinning module for binarizing the filtered image with an adaptive dynamic threshold method and thinning it with a mathematical-morphology look-up table to obtain the pre-processed fingerprint image.
In some embodiments, the feature extraction module comprises:
a feature point extraction module for extracting fingerprint minutiae and fingerprint singular points; and
a pseudo-feature removal module for removing pseudo endpoints, pseudo bifurcation points, pseudo ridges (burrs) and the like.
In some embodiments, the identification module comprises:
a feature point set calibration module for locating a matched origin pair between the feature point set and a template point set, establishing polar coordinate systems with the matched origins of the feature point set and the template point set as poles respectively, calculating the rotation parameter between the feature point set and the template point set, and calibrating the feature point set; and
a point matching module for attempting to match the feature point set with the template point set and counting the matched feature points.
In the fingerprint identification method and fingerprint identification system of the present embodiments, the improved segmentation algorithm based on Harris corners does not depend on the results of direction field estimation or frequency field estimation. Therefore, the segmentation procedure of the fingerprint image is performed first and the fingerprint mask is calculated in advance, so that the direction field estimation, the frequency field estimation and the filter enhancement in the pre-processing step are calculated only for the foreground region of the fingerprint, which greatly reduces the computational load of fingerprint image pre-processing.
Additional aspects and advantages of the present invention will be set forth in part in the following description, and in part will become apparent from the following description or be learned by practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and easy to understand from the following description of embodiments in conjunction with the accompanying drawings, in which:
Fig. 1 is a functional block diagram of the fingerprint identification system of a preferred embodiment of the present invention.
Figs. 2A and 2B are the Sobel operators in the X direction and the Y direction, respectively, used by the fingerprint identification method of a preferred embodiment of the present invention.
Figs. 3A-3C are schematic diagrams of a fingerprint image and its corresponding direction field and curvature field in the fingerprint identification method of a preferred embodiment of the present invention.
Fig. 4 shows the 8-neighborhood template used by the fingerprint identification method of a preferred embodiment of the present invention.
Fig. 5 shows common types of fingerprint minutiae in the fingerprint identification method of a preferred embodiment of the present invention.
Fig. 6 shows the closed-curve grid used by the fingerprint identification method of a preferred embodiment of the present invention.
Fig. 7 is a flow chart of removing pseudo fingerprint feature points in the fingerprint identification method of a preferred embodiment of the present invention.
Figs. 8A and 8B are schematic diagrams of ridge sampling in the fingerprint identification method of a preferred embodiment of the present invention.
Figs. 9A-9C are schematic diagrams of the bounding boxes used by the fingerprint identification method of a preferred embodiment of the present invention, in which Fig. 9B shows a fixed-size bounding box and Fig. 9C shows a variable-size bounding box.
Fig. 10 is a flow chart of fingerprint matching in the fingerprint identification method of a preferred embodiment of the present invention.
Fig. 11 is a schematic diagram of the fingerprint image at each step of the fingerprint identification method of a preferred embodiment of the present invention.
Detailed description of the invention
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are illustrative, are intended only to explain the present invention, and should not be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and cannot be construed as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may expressly or implicitly include one or more of such features. In the description of the present invention, "a plurality of" means two or more, unless otherwise clearly defined in the present embodiments.
In the description of the present invention, it should be noted that, unless otherwise expressly specified and limited, the terms "mounted", "connected" and "coupled" should be interpreted broadly, for example, as a fixed connection, a detachable connection or an integral connection; as a mechanical connection, an electrical connection or mutual communication; as a direct connection or an indirect connection via an intermediary; or as an internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances.
The following disclosure provides many different embodiments or examples for realizing different structures of the present invention. To simplify the disclosure of the present invention, components and arrangements of specific examples are described below. Of course, they are merely illustrative and are not intended to limit the present invention. In addition, the present invention may repeat reference numerals and/or reference letters in different examples; this repetition is for the purpose of simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, the present invention provides examples of various specific processes and materials, but those of ordinary skill in the art will appreciate that other processes and/or other materials may also be applied.
Referring to Fig. 1, the fingerprint identification system 100 of a preferred embodiment of the present invention comprises a pre-processing module 10, a feature extraction module 20 and an identification module 30. The pre-processing module 10 comprises a normalization module 11, a segmentation module 12, a filtering module 13 and a thinning module 14. The feature extraction module 20 comprises a feature point extraction module 21 and a pseudo-feature removal module 22. The identification module 30 comprises a feature point set calibration module 31 and a point matching module 32.
The fingerprint identification method of a preferred embodiment of the present invention comprises steps S1-S3.
S1: a pre-processing step of segmenting and filtering a fingerprint image.
S2: a feature extraction step of extracting feature points from the pre-processed fingerprint image to form a feature point set, and further removing pseudo feature points; and
S3: an identification step of calibrating the feature point set and performing identification with a matching algorithm to obtain a fingerprint identification result.
In the present embodiment, the pre-processing step may be performed by the pre-processing module 10; in other words, the pre-processing module 10 is used to segment and filter the fingerprint image. The feature extraction step may be implemented by the feature extraction module 20; in other words, the feature extraction module 20 is used to extract feature points from the pre-processed fingerprint image to form a feature point set and to further remove pseudo feature points. The identification step may be implemented by the identification module 30; in other words, the identification module 30 is used to calibrate the feature point set and perform identification with a matching algorithm to obtain a fingerprint identification result.
The pre-processing step comprises sub-steps S11-S14.
S11: a normalization sub-step of normalizing the fingerprint image.
S12: a segmentation sub-step of calculating the Harris corner energy of each pixel of the fingerprint image, removing pixels whose Harris corner energy is smaller than a predetermined threshold to obtain a binary map, and dilating and eroding the binary map;
S13: a filtering sub-step of calculating the direction field, the frequency field and the curvature field of the binary map respectively, and filtering the binary map with a Log-Gabor filter; and
S14: a thinning sub-step of binarizing the binary map with an adaptive dynamic threshold method and thinning it with a mathematical-morphology look-up table.
In the present embodiment, the normalization sub-step may be implemented by the normalization module 11; in other words, the normalization module 11 is used to normalize the fingerprint image. The segmentation sub-step may be implemented by the segmentation module 12; in other words, the segmentation module 12 is used to calculate the Harris corner energy of each pixel of the fingerprint image, remove pixels whose Harris corner energy is smaller than the predetermined threshold to obtain a binary map, and dilate and erode the binary map. The filtering sub-step may be implemented by the filtering module 13; in other words, the filtering module 13 is used to calculate the direction field, the frequency field and the curvature field of the binary map respectively and to filter the binary map with a Log-Gabor filter. The thinning sub-step may be implemented by the thinning module 14; in other words, the thinning module 14 is used to binarize the image with an adaptive dynamic threshold method and to thin it with a mathematical-morphology look-up table.
In the present embodiment, the purpose of the normalization sub-step is to eliminate the gray-level differences caused by sensor noise and by differences in finger pressure, adjusting the contrast and gray level of different fingerprint images to a fixed gray scale. The pixel-wise normalization of the fingerprint image should not change the definition of ridges and valleys; it reduces the gray-level variation along ridges and valleys, avoids producing a large number of pseudo feature points, and provides a unified image specification for subsequent processing.
In the present embodiment, the normalization sub-step comprises:
S111: histogram equalization of the fingerprint image, which makes the ridges and valleys of the fingerprint more apparent and thus improves the detectability of fingerprint features. Histogram equalization is a method in image processing that adjusts contrast using the image histogram. It is commonly used to increase the local contrast of images, especially when the contrast of the useful image data is fairly close. With this method the brightness is better distributed over the histogram, so local contrast can be enhanced without affecting the overall contrast; histogram equalization achieves this by effectively spreading out the commonly occurring brightness values. The histogram equalization step maps gray level p to gray level q so that the gray levels q are uniformly distributed. After histogram equalization, the ridges and valleys of the fingerprint image become more apparent, improving the detectability of fingerprint features.
S112: selecting the region of interest (ROI) in the fingerprint image, i.e. performing a preliminary segmentation of the region containing fingerprint information.
S113: adaptive block-wise normalization based on local characteristics, in which the fingerprint image is divided into non-overlapping image blocks of size K × L and each image block is normalized separately.
The fingerprint image I is defined as an M × N matrix, where I(i, j) denotes the pixel value at row i and column j. The mean and variance of the fingerprint image are defined as:
$$M(I) = \frac{1}{M \times N} \sum_{i=1}^{M} \sum_{j=1}^{N} I(i,j) \qquad (1)$$
$$VAR(I) = \frac{1}{M \times N} \sum_{i=1}^{M} \sum_{j=1}^{N} \big( I(i,j) - M(I) \big)^2 \qquad (2)$$
The normalization of the fingerprint image is defined as follows:
$$G(i,j) = \begin{cases} M_0 + \sqrt{\dfrac{VAR_0\,(I(i,j)-M)^2}{VAR}}, & I(i,j) > M \\[2mm] M_0 - \sqrt{\dfrac{VAR_0\,(I(i,j)-M)^2}{VAR}}, & \text{otherwise} \end{cases} \qquad (3)$$
where M_0 and VAR_0 are the desired mean and variance respectively, and M and VAR are the mean and variance of the fingerprint image calculated by formulas (1) and (2).
Normalization requires the values of M_0 and VAR_0 to be determined in advance. Because the gray level of each local region of a fingerprint image differs, the present embodiment adopts an adaptive block-wise fingerprint image normalization algorithm.
For the i-th image block, the desired gray level and variance are given by:
$$M_i^d = M_0 - \alpha_1 (M_i - M_0) \qquad (4)$$
$$VAR_i^d = VAR_0 - \alpha_2 (VAR_i - VAR_0) \qquad (5)$$
where M_i and VAR_i are the mean gray level and the variance of the i-th image block, M_0 and VAR_0 are the desired gray level and variance, M_i^d and VAR_i^d are the desired gray level and variance of the i-th image block, and α_1 and α_2 are weighting factors. Applying formula (3) to each block, with the desired gray level and variance computed from the characteristics of that block, yields the adaptively normalized fingerprint image.
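As an illustration of the adaptive block-wise normalization of formulas (1)-(5), the following is a minimal numpy sketch rather than the patent's implementation; the block size, the weighting factors and the desired mean and variance are assumed values chosen only for demonstration.

```python
import numpy as np

def normalize_block(block, m0=128.0, var0=2000.0, a1=0.5, a2=0.5):
    # m0, var0, a1, a2 are illustrative values, not prescribed by the patent
    m, var = block.mean(), block.var()
    m_d = m0 - a1 * (m - m0)                      # desired block mean, formula (4)
    var_d = max(var0 - a2 * (var - var0), 1.0)    # desired block variance, formula (5)
    if var == 0:
        return np.full_like(block, m_d)
    scaled = np.sqrt(var_d * (block - m) ** 2 / var)   # formula (3)
    return np.where(block > m, m_d + scaled, m_d - scaled)

def adaptive_normalize(image, block=16):
    # split the image into non-overlapping blocks and normalize each one (S113)
    out = np.zeros(image.shape, dtype=np.float64)
    h, w = image.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            out[i:i+block, j:j+block] = normalize_block(
                image[i:i+block, j:j+block].astype(np.float64))
    return out
```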
In the present embodiment, the segmentation sub-step comprises:
S121: calculating the Harris corner energy of each pixel of the fingerprint image.
A corner can be regarded as a point where the brightness of a two-dimensional image changes sharply, or as a point of maximum curvature on an edge curve; corners can determine the contour features of a target in a two-dimensional image. The Harris operator is the corner detector proposed by Harris and Stephens in 1988, and is defined as follows:
$$R = \frac{I_x^2 I_y^2 - I_{xy}^2}{I_x^2 + I_y^2} \qquad (6)$$
where I_x and I_y denote the gray-level derivatives of the fingerprint image in the X and Y directions respectively, and I_{xy} denotes the gray-level derivative of the fingerprint image in the X and Y directions simultaneously.
The Harris operator gives the Harris corner energy of each pixel. The values of I_x, I_y and I_{xy} are calculated with the Sobel operators (see Fig. 2) in overlapping 3 × 3 windows, and are then smoothed with a 5 × 5 Gaussian filter to suppress noise.
S122: sorting the Harris corner energies in ascending order, removing the lowest 30%, and then dilating and eroding the binary map, thereby extracting the foreground and part of the background region.
In a fingerprint image, the Harris corner energy of the foreground region is far higher than that of the background and noise regions. In the present embodiment, the Harris corner energy of each pixel is collected and sorted in ascending order; the lowest 30% of the corner energies are filtered out, and the filtered binary map is then dilated and eroded, yielding a binary map in which the foreground and part of the background region are extracted. Because the background typically occupies about 1/3 to 1/2 of a fingerprint image, removing 30% here eliminates part of the background region and can reduce the amount of computation by 20%-30%. Dilation: the binary map D produced by dilating A with B is the set of points (x, y) such that, when the origin of B is moved to (x, y), the intersection of B and A is non-empty. Erosion: the binary map E produced by eroding A with B is the set of points (x, y) such that, when the origin of B is moved to (x, y), B is completely contained in A.
S123: binarizing the result to obtain the binary map.
In the present embodiment, formula (7) is applied twice to binarize each pixel of the binary map obtained above.
$$T(x,y) = \begin{cases} 1, & \mathrm{Sum}(D(x,y)) \ge \mathrm{Threshold} \\ 0, & \mathrm{Sum}(D(x,y)) < \mathrm{Threshold} \end{cases} \qquad (7)$$
where Threshold = c × w², w is the window size (9 × 9 in the present embodiment), and c is the filtering weight, taken as 0.926 the first time and 0.49 the second time, which performs a second smoothing pass. A large window is used to eliminate a large number of isolated Harris corners. Sum(D(x, y)) is the number of valid pixels in the w × w region centered on the current point, i.e. the number of pixels that lie on ridges.
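The following is a minimal sketch of the Harris-energy-based foreground segmentation of steps S121-S123, assuming numpy and scipy are available; the Gaussian smoothing sigma and the number of morphology iterations are illustrative choices, and the double binarization of formula (7) is omitted for brevity.

```python
import numpy as np
from scipy.ndimage import sobel, gaussian_filter, binary_dilation, binary_erosion

def harris_corner_energy(image):
    # Harris corner energy of formula (6); derivatives from Sobel operators,
    # smoothed with a Gaussian (sigma chosen for illustration)
    img = image.astype(np.float64)
    ix = sobel(img, axis=1)
    iy = sobel(img, axis=0)
    ix2 = gaussian_filter(ix * ix, sigma=1.0)
    iy2 = gaussian_filter(iy * iy, sigma=1.0)
    ixy = gaussian_filter(ix * iy, sigma=1.0)
    return (ix2 * iy2 - ixy ** 2) / (ix2 + iy2 + 1e-12)

def segment_foreground(image, drop_ratio=0.30):
    # discard the lowest 30% of corner energies, then dilate and erode (S122)
    energy = harris_corner_energy(image)
    mask = energy > np.quantile(energy, drop_ratio)
    mask = binary_dilation(mask, iterations=2)
    mask = binary_erosion(mask, iterations=2)
    return mask
```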
In the present embodiment, the filtering sub-step comprises:
S131: calculating the direction field with an improved gradient-based direction field method, and then correcting the direction field with a low-pass filter;
S132: calculating the frequency field, using the projection method to compute the ridge frequency;
S133: calculating the curvature field, which is estimated from the direction field, with ridge curvature measured by a curvature metric; and
S134: filtering, in which a windowed FFT is used to extract the spectrum of each image sub-block, and a Log-Gabor filter is constructed according to the ridge direction and frequency of the image sub-block to filter its spectrum.
Fingerprint enhancement improves the quality of the fingerprint image so as to ensure the accuracy and robustness of subsequent feature extraction and matching. An improved Log-Gabor filter is used for fingerprint enhancement; its zero DC component and wide band coverage effectively overcome the limitations of the traditional Gabor filter. The algorithm of the present embodiment designs the Log-Gabor filter according to the ridge direction, frequency and curvature of the fingerprint image, and, owing to the characteristics of the filter, performs the image filtering in the frequency domain.
The two-dimensional Log-Gabor filter is expressed in the polar coordinate system as follows:
$$G(r,\theta) = \exp\!\left( -\frac{[\log(r/f_0)]^2}{2\sigma_r^2} \right) \cdot \exp\!\left( -\frac{(\theta-\theta_0)^2}{2\sigma_\theta^2} \right) \qquad (8)$$
The filter expressed by the above formula can be decomposed into a radial filter and an angular filter:
$$G_r(r) = \exp\!\left( -\frac{[\log(r/f_0)]^2}{2\sigma_r^2} \right) \qquad (9)$$
$$G_\theta(\theta) = \exp\!\left( -\frac{(\theta-\theta_0)^2}{2\sigma_\theta^2} \right) \qquad (10)$$
where r is the radial coordinate, θ is the angular coordinate, f_0 is the center frequency of the filter, θ_0 is the direction angle of the filter, and the parameters σ_r and σ_θ determine the bandwidths of the radial filter and the angular filter respectively, that is, the radial bandwidth and the angular bandwidth of the two-dimensional Log-Gabor filter.
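As an illustration only, the following numpy sketch samples the two-dimensional Log-Gabor of formulas (8)-(10) on a square frequency grid; the handling of the DC sample and the wrapping of the angle difference are implementation assumptions rather than details given in the patent.

```python
import numpy as np

def log_gabor_filter(size, f0, theta0, sigma_r, sigma_theta):
    # f0: center frequency (cycles/pixel), theta0: filter orientation
    fy, fx = np.meshgrid(np.fft.fftfreq(size), np.fft.fftfreq(size), indexing="ij")
    r = np.hypot(fx, fy)
    theta = np.arctan2(fy, fx)
    r[0, 0] = 1.0                                   # avoid log(0); DC is zeroed below
    radial = np.exp(-np.log(r / f0) ** 2 / (2 * sigma_r ** 2))      # formula (9)
    dtheta = np.angle(np.exp(1j * (theta - theta0)))                # wrap to (-pi, pi]
    angular = np.exp(-dtheta ** 2 / (2 * sigma_theta ** 2))         # formula (10)
    g = radial * angular                            # formula (8)
    g[0, 0] = 0.0                                   # zero DC component
    return g
```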
After the basic structure of the Log-Gabor filter has been determined, the relevant image texture information, including the direction, frequency and curvature of the fingerprint ridges, must also be obtained so that a reasonable set of filters can be established to filter the fingerprint image. The direction field, the frequency field and the curvature field of the fingerprint image therefore need to be estimated.
The gradient-based direction field computation method takes good account of the reliability, accuracy and efficiency of direction estimation, so the present embodiment uses a gradient-based direction field computation method, with appropriate modifications, to calculate the fingerprint direction field; a secondary correction is applied to the estimated direction field, which further improves the accuracy and noise resistance of the direction field estimation.
The gradient-based direction field computation proceeds as follows:
S1311: dividing the binary map into non-overlapping image sub-blocks of size B × B, where B is the side length of an image sub-block; for a fingerprint image with a resolution of 500 dpi, the value of B is usually 16;
S1312: for any image sub-block, calculating the gradients d_x(i, j) and d_y(i, j) of each pixel (i, j) in the X direction and the Y direction, using the Sobel operators shown in Fig. 2 as the gradient operators; and
S1313: calculating the direction of the image sub-block centered on pixel (i, j):
$$V_x(i,j) = \sum_{u=i-\frac{B}{2}}^{i+\frac{B}{2}} \sum_{v=j-\frac{B}{2}}^{j+\frac{B}{2}} 2\, d_x(u,v)\, d_y(u,v) \qquad (11)$$
$$V_y(i,j) = \sum_{u=i-\frac{B}{2}}^{i+\frac{B}{2}} \sum_{v=j-\frac{B}{2}}^{j+\frac{B}{2}} \big( d_x^2(u,v) - d_y^2(u,v) \big) \qquad (12)$$
$$\theta(i,j) = \frac{1}{2}\arctan\!\left( \frac{V_x(i,j)}{V_y(i,j)} \right) \qquad (13)$$
where θ(i, j) is the least-squares estimate of the local ridge direction, with values in [0, π).
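For illustration, a minimal numpy sketch of the block-wise direction field of formulas (11)-(13) is given below (it does not include the low-pass correction of formulas (14)-(19)); the use of arctan2 and the final modulo-π mapping are implementation assumptions.

```python
import numpy as np
from scipy.ndimage import sobel

def orientation_field(image, block=16):
    # block-wise ridge direction via formulas (11)-(13)
    img = image.astype(np.float64)
    dx = sobel(img, axis=1)
    dy = sobel(img, axis=0)
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for bi in range(theta.shape[0]):
        for bj in range(theta.shape[1]):
            sl = (slice(bi * block, (bi + 1) * block),
                  slice(bj * block, (bj + 1) * block))
            vx = np.sum(2 * dx[sl] * dy[sl])           # formula (11)
            vy = np.sum(dx[sl] ** 2 - dy[sl] ** 2)     # formula (12)
            theta[bi, bj] = 0.5 * np.arctan2(vx, vy)   # formula (13)
    return np.mod(theta, np.pi)                        # values in [0, pi)
```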
Due to the influence of noise, cracks and other factors, the ridge directions obtained by the above estimation are not always correct. To ensure the accuracy of the direction estimation, the direction field must be corrected further.
The present embodiment corrects the direction field with a low-pass filter, comprising the following steps:
S1314: for any image sub-block (i, j), calculating the direction deviation Δ(k) between it and each adjacent image sub-block as follows:
$$\Delta(k) = \begin{cases} E(k) + \pi, & E(k) \le -\frac{\pi}{2} \\ E(k) - \pi, & E(k) > \frac{\pi}{2} \\ E(k), & \text{otherwise} \end{cases} \qquad (14)$$
$$E(k) = \theta(i-1+m,\, j-1+n) - \theta(i,j) \qquad (15)$$
where k = 3m + n, m, n ∈ {0, 1, 2}, and θ(i, j) denotes the ridge direction of image sub-block (i, j);
S1315: dividing the direction deviations into two classes according to their magnitude, expressed by the sets A and B respectively:
$$A = \left\{ \Delta(k) \,\middle|\, |\Delta(k)| \le \frac{\pi}{4},\; k = 0,1,\ldots,8 \right\} \qquad (16)$$
$$B = \left\{ \Delta(k) \,\middle|\, |\Delta(k)| > \frac{\pi}{4},\; k = 0,1,\ldots,8 \right\} \qquad (17)$$
Let N_A and N_B denote the numbers of elements in A and B, and let S_A and S_B denote the sums of the element values in A and B; the direction field is then corrected as follows:
$$\theta'(i,j) = \begin{cases} T(i,j) + \pi, & T(i,j) \le -\frac{\pi}{2} \\ T(i,j) - \pi, & T(i,j) > \frac{\pi}{2} \\ T(i,j), & \text{otherwise} \end{cases} \qquad (18)$$
$$T(i,j) = \begin{cases} \theta(i,j) + \dfrac{S_A}{N_A}, & N_A \ge N_B \\[2mm] \theta(i,j) + \dfrac{S_B}{N_B}, & \text{otherwise} \end{cases} \qquad (19)$$
where θ(i, j) and θ'(i, j) are the ridge directions of image sub-block (i, j) before and after the correction, respectively.
Since the calculation of the ridge direction is relatively simple and can reach high accuracy, and since the interference of noise and cracks can be further overcome by the direction correction, the projection method, which makes use of the accurate ridge directions, generally achieves good results; the present embodiment therefore uses the projection method to calculate the ridge frequency.
The projection method for calculating the ridge frequency in the present embodiment comprises the following steps:
S1321: setting for each image sub-block a direction window of size W × L, whose center coincides with the center of the image sub-block and whose orientation is determined by the ridge direction of the image sub-block, with the W direction parallel to the ridge direction and the L direction perpendicular to the ridge direction; for a fingerprint image with a resolution of 500 dpi, W × L is usually 16 × 32;
S1322: for any image sub-block (i, j), projecting within the corresponding direction window as follows:
$$X(k) = \frac{1}{W} \sum_{d=0}^{W-1} G(u,v), \qquad k = 0,1,\ldots,L-1 \qquad (20)$$
$$u = i + \left( d - \frac{W}{2} \right)\cos(\theta(i,j)) + \left( k - \frac{L}{2} \right)\sin(\theta(i,j)) \qquad (21)$$
$$v = j + \left( \frac{W}{2} - d \right)\sin(\theta(i,j)) + \left( k - \frac{L}{2} \right)\cos(\theta(i,j)) \qquad (22)$$
where i and j are the horizontal and vertical coordinates of the center pixel of the image sub-block, θ(i, j) is the ridge direction of the image sub-block, G(u, v) is the gray level of pixel (u, v), and X(k) is the one-dimensional signal obtained by the projection;
S1323: low-pass filtering the projected one-dimensional signal to eliminate the influence of noise, with the following formula:
$$X'(k) = \sum_{n=-\frac{N}{2}}^{\frac{N}{2}} h(n)\, X(k-n) \qquad (23)$$
where h(n) is a one-dimensional low-pass filter whose values are identical at every point, and N is the filter length, usually taken as 3;
S1324: further converting the filtered projection signal into a square-wave signal as follows:
$$Y(k) = \begin{cases} 0, & X'(k) < T \\ 1, & \text{otherwise} \end{cases}, \qquad k = 0,1,\ldots,L-1 \qquad (24)$$
where
$$T = \frac{1}{L} \sum_{k=0}^{L-1} X'(k) \qquad (25)$$
S1325: obtaining the average period of the signal from the rising edges of the square-wave signal, with the following formulas:
$$P = \frac{L_R}{N_R - 1} \qquad (26)$$
$$F = \frac{1}{P} \qquad (27)$$
where N_R is the number of rising edges in the square-wave signal, L_R is the signal length between the two farthest rising edges, P is the average period of the square-wave signal, i.e. the average spacing of the fingerprint ridges, and F is the corresponding ridge frequency. The ridge frequency corresponds to the frequency parameter f_0 of the Log-Gabor filter.
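A minimal sketch of the frequency estimation of formulas (23)-(27), assuming the one-dimensional projection X(k) of formula (20) has already been computed for a sub-block; the averaging filter length and the handling of degenerate cases are illustrative assumptions.

```python
import numpy as np

def ridge_frequency(profile):
    # profile: the projected 1-D signal X(k) of a direction window (formula (20))
    x = np.convolve(profile, np.ones(3) / 3, mode="same")   # formula (23), N = 3
    square = (x >= x.mean()).astype(int)                    # formulas (24)-(25)
    rising = np.flatnonzero(np.diff(square) == 1)           # rising-edge positions
    if len(rising) < 2:
        return 0.0                                          # frequency undefined
    period = (rising[-1] - rising[0]) / (len(rising) - 1)   # formula (26)
    return 1.0 / period                                     # formula (27)
```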
In the present embodiment, the curvature field is estimated from the direction field, and the following curvature metric is defined to measure ridge curvature:
$$C(i,j) = \frac{1}{N^2} \sum_{u=i-\frac{N}{2}}^{i+\frac{N}{2}} \sum_{v=j-\frac{N}{2}}^{j+\frac{N}{2}} \big( \phi_x(i,j,u,v) + \phi_y(i,j,u,v) \big) \qquad (28)$$
where
$$\phi_x(i,j,u,v) = \big| \cos(2\theta(i,j)) - \cos(2\theta(u,v)) \big| \qquad (29)$$
$$\phi_y(i,j,u,v) = \big| \sin(2\theta(i,j)) - \sin(2\theta(u,v)) \big| \qquad (30)$$
and θ(i, j) denotes the ridge direction of image sub-block (i, j). Clearly, the larger the ridge curvature, the more sharply the ridge direction changes and the larger the value of the curvature metric C. N is taken as 3, and the curvature threshold T is taken as 0.75. Fig. 3 shows a fingerprint image and its corresponding direction field and curvature field, where the curvature field is displayed as a gray-scale map in which a higher gray level indicates a higher curvature. It can be seen from the figure that the curvature field obtained in the present embodiment reflects the variation of the fingerprint ridge curvature well and can serve as an effective basis for adjusting the angular bandwidth of the filter.
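The curvature metric of formulas (28)-(30) can be sketched as follows on a block-wise direction field; this is an illustration only, and the handling of the image border is an assumption.

```python
import numpy as np

def curvature_field(theta, n=3):
    # theta: block-wise direction field; formulas (28)-(30)
    h, w = theta.shape
    c = np.zeros_like(theta)
    r = n // 2
    cos2, sin2 = np.cos(2 * theta), np.sin(2 * theta)
    for i in range(h):
        for j in range(w):
            u0, u1 = max(0, i - r), min(h, i + r + 1)
            v0, v1 = max(0, j - r), min(w, j + r + 1)
            phi_x = np.abs(cos2[i, j] - cos2[u0:u1, v0:v1])   # formula (29)
            phi_y = np.abs(sin2[i, j] - sin2[u0:u1, v0:v1])   # formula (30)
            c[i, j] = (phi_x + phi_y).sum() / n ** 2          # formula (28)
    return c
```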
After the spectrum of each image sub-block has been obtained with the windowed FFT, the corresponding Log-Gabor filter can be used to filter it in the frequency domain; in this way the spectral information of the fingerprint image, i.e. the spectrum of each image sub-block, is obtained. For a two-dimensional image, a two-dimensional windowed FFT is needed, defined as follows:
$$F(i,j,u,v) = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f(x,y)\, W_F(x-i,\, y-j)\, e^{-j(ux+vy)}\, dx\, dy \qquad (31)$$
where i and j are the horizontal and vertical coordinates of the center pixel of the image sub-block, W_F(x, y) denotes the window function, and f and F denote the original image and the image spectrum respectively.
A circular raised-cosine window is designed as the window function of the windowed FFT, defined as follows:
$$W_F(x,y) = \begin{cases} 1, & \Delta r_1 \le 0 \\[2mm] \dfrac{1 + \cos\!\left( \dfrac{\Delta r_1}{\Delta r_2} \cdot \dfrac{\pi}{2} \right)}{2}, & \text{otherwise} \end{cases}, \qquad (x,y) \in \left[ -\frac{W}{2}, \frac{W}{2} \right] \qquad (32)$$
where
$$\Delta r_1 = \sqrt{x^2 + y^2} - \frac{B}{2} \qquad (33)$$
$$\Delta r_2 = \sqrt{\left( \frac{W}{2} \right)^2 + \left( \frac{W}{2} \right)^2} - \frac{B}{2} \qquad (34)$$
For any image sub-block, the Log-Gabor filter should be modulated to the corresponding direction and frequency according to the ridge direction and frequency of the sub-block, the angular bandwidth of the filter should be determined according to its ridge curvature, and the constructed filter is then used to filter the spectrum of the sub-block. Let F(u, v) and F'(u, v) denote the spectrum before and after filtering respectively; the filtering sub-step is as follows:
$$F'(u,v) = G(u,v)\, F(u,v) \qquad (35)$$
where G(u, v) is the rectangular-coordinate representation of the Log-Gabor filter. The size of the filter is kept consistent with the size of the spectrum.
After the spectrum of each image sub-block has been filtered, an inverse Fourier transform must be applied to each of them, with the following formula:
$$f'(x,y) = \frac{1}{W^2} \sum_{u=0}^{W-1} \sum_{v=0}^{W-1} F'(u,v) \exp\!\left( \frac{2\pi j}{W}(xu + yv) \right) \qquad (36)$$
where W is the side length of the square spectrum, i.e. the window side length of the aforementioned windowed FFT. For each data block obtained by the inverse transform, the present embodiment keeps its real part as the image block converted back to the spatial domain, and the image blocks are then merged in the spatial domain to obtain the complete enhanced image. For the window placement adopted by the windowed FFT, the present embodiment keeps the central B × B pixels of each image block as the enhancement result of the corresponding image sub-block, and stitches the image according to the home position of each image sub-block in the image.
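A minimal sketch of the per-block frequency-domain enhancement of formulas (31), (35) and (36) is given below; it reuses log_gabor_filter() from the earlier sketch, takes a precomputed window such as the raised-cosine window of formula (32), and leaves the block stitching to the caller. It is an illustration under these assumptions, not the patent's implementation.

```python
import numpy as np

def enhance_block(block, window, f0, theta0, sigma_r, sigma_theta):
    # block: W x W spatial sub-block; window: W x W window function W_F
    w = block.shape[0]
    spectrum = np.fft.fft2(block * window)                      # windowed FFT, formula (31)
    g = log_gabor_filter(w, f0, theta0, sigma_r, sigma_theta)   # modulated Log-Gabor
    enhanced = np.fft.ifft2(g * spectrum)                       # formulas (35)-(36)
    return enhanced.real                                        # keep the real part
```

As described above, the caller would keep only the central B × B pixels of the returned block and place them at the home position of the corresponding image sub-block.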
The thinning module comprises adaptive dynamic-threshold binarization and a look-up-table thinning algorithm based on mathematical morphology.
There are two main binarization methods based on image gray-level information: the global fixed-threshold method and the adaptive dynamic-threshold method. The present embodiment adopts the adaptive dynamic-threshold method, which gives better results, with the following steps:
S141: first dividing the image into N × N non-overlapping blocks and calculating the average gray level of each block, with a block size of 7 × 7;
$$f = \begin{cases} 0, & x < T \\ 255, & x \ge T \end{cases} \qquad (37)$$
S142: comparing each pixel in the block with the block's mean gray level; if the pixel is greater than the mean, its gray level is set to 255, otherwise it is set to 0. Here T is the specified threshold and x is the gray level.
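A minimal sketch of the adaptive dynamic-threshold binarization of steps S141-S142 and formula (37); using the block mean directly as the threshold T follows the description above, while the scan order is an assumption.

```python
import numpy as np

def adaptive_binarize(image, block=7):
    # each pixel is compared with the mean gray level of its own block
    out = np.zeros(image.shape, dtype=np.uint8)
    h, w = image.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            patch = image[i:i+block, j:j+block]
            out[i:i+block, j:j+block] = np.where(patch >= patch.mean(), 255, 0)
    return out
```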
Thinning is also called skeletonization: it converts the binarized fingerprint image (in which the ridges are clear but of uneven thickness) into an image whose ridges have uniform line width (only one pixel wide), essentially removing the ridge thickness information that has no practical reference value.
Fingerprint thinning must not change the topological structure of the original image. In general, thinning takes roughly more than half of the whole pre-processing time, so a fast and effective thinning algorithm is the key to pre-processing. The present embodiment adopts a look-up-table thinning algorithm based on mathematical morphology.
According to the principle of thinning, the following criteria apply:
(1) interior points cannot be deleted; (2) isolated points cannot be deleted; (3) endpoints of straight lines cannot be deleted; (4) if P is a boundary point and the number of connected components does not increase after P is removed, then P can be deleted.
According to the above criteria, a table of 256 elements indexed from 0 to 255 is built in advance, each element being either 0 or 1. The table is looked up according to the configuration of the eight neighbors of a point (the black point to be processed); if the corresponding element in the table is 1, the point can be deleted, otherwise it is retained. The thinning process uses this look-up table.
The thinning algorithm proceeds as follows:
(1) the index used for the table look-up is calculated from the 8-neighborhood values in the order of Fig. 4:
$$S = P_1 \times 2^0 + P_2 \times 2^1 + P_3 \times 2^2 + P_4 \times 2^3 + P_6 \times 2^4 + P_7 \times 2^5 + P_8 \times 2^6 + P_9 \times 2^7 \qquad (38)$$
(P_1 to P_9 are each 0 or 1), so the 256 possible indices correspond one-to-one to the values 0 to 255.
(2) an elimination table is made according to the above criteria; it is in fact an array of capacity 256 whose subscripts correspond one-to-one to the pixel index values, an element value of 0 meaning that the central point of that configuration should be retained and a value of 1 meaning that it should be deleted.
(3) the binary image is scanned from top to bottom and from left to right, and each black point is processed as follows: first the left and right neighbors of the black point are examined; if both are black, the point is not processed; otherwise the 8-neighborhood code is calculated as the index and the elimination table is consulted to decide whether the point should be deleted; if the black point is deleted, its right neighbor is skipped and the next point is processed.
(4) the binary image is scanned a second time from left to right and from top to bottom, and each black point is processed similarly: first the upper and lower neighbors of the black point are examined; if both are black, the point is not processed; otherwise the 8-neighborhood code is calculated as the index and the elimination table is consulted to decide whether the point should be deleted; if the black point is deleted, its lower neighbor is skipped and the next point is processed.
(5) if any black point was deleted during this pass, the algorithm jumps back to (3); otherwise the loop ends and the thinning is complete.
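For illustration, a simplified sketch of one scan of the look-up-table thinning is shown below; the 256-entry elimination table built from criteria (1)-(4) is supplied by the caller and is not reproduced here, the neighbor ordering of Fig. 4 is assumed to be the eight neighbors read row by row, and the skipping of the right/lower neighbor after a deletion is omitted.

```python
def neighborhood_index(img, i, j):
    # 8-neighborhood index S of formula (38); P1..P9 assumed to be the eight
    # neighbors of (i, j) read row by row (P5, the center, is excluded)
    p = [img[i-1, j-1], img[i-1, j], img[i-1, j+1], img[i, j-1],
         img[i, j+1], img[i+1, j-1], img[i+1, j], img[i+1, j+1]]
    return sum(int(v > 0) << k for k, v in enumerate(p))

def thin_pass(img, elim_table, horizontal=True):
    # one scan of steps (3)/(4); elim_table[s] == 1 means "delete the point"
    changed = False
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            if img[i, j] == 0:
                continue
            a, b = ((i, j - 1), (i, j + 1)) if horizontal else ((i - 1, j), (i + 1, j))
            if img[a] and img[b]:
                continue                  # both neighbors are black: keep the point
            if elim_table[neighborhood_index(img, i, j)]:
                img[i, j] = 0
                changed = True
    return changed
```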
The improved Harris-corner-based segmentation algorithm of the present embodiment does not depend on the results of direction field estimation or frequency field estimation. Therefore, the image segmentation procedure is performed first and the fingerprint mask is calculated in advance, so that the direction field estimation, the frequency field estimation and the filter enhancement in the pre-processing step are calculated only for the foreground region of the fingerprint, which greatly reduces the computational load of fingerprint image pre-processing.
In the present embodiment, the feature extraction step comprises:
S21: a feature point extraction sub-step of extracting fingerprint minutiae and fingerprint singular points; and
S22: a pseudo-feature removal sub-step of removing pseudo endpoints, pseudo bifurcation points, pseudo ridges (burrs) and the like.
In the present embodiment, the feature point extraction sub-step may be implemented by the feature point extraction module 21; in other words, the feature point extraction module 21 is used to extract fingerprint minutiae and fingerprint singular points. The pseudo-feature removal sub-step may be implemented by the pseudo-feature removal module 22; in other words, the pseudo-feature removal module 22 is used to remove pseudo endpoints, pseudo bifurcation points, pseudo ridges (burrs) and the like.
In the present embodiment, the feature point extraction sub-step comprises:
S211: calculating and extracting endpoints and bifurcation points respectively; and
S212: calculating, in the direction field, the Poincaré index of each candidate singular point and determining the singular point type.
Feature extraction is the core algorithm of the whole fingerprint identification flow. For the feature extraction of a fingerprint image, the task of the feature extraction algorithm is to detect the two categories of feature points, singular points and minutiae, in the fingerprint image, and to determine the type, the position and the local ridge direction and region of each feature point. The accuracy of singular point and minutia extraction determines the performance of the fingerprint identification system, i.e. the recognition rate.
A fingerprint image generally yields between 10 and 100 feature points, and most of the literature holds that at least 12 feature points are needed for matching. The present embodiment adopts an 8-neighborhood coded ridge tracing algorithm to extract the feature points of the image.
Minutiae are abrupt changes of the fingerprint ridges; the common types are shown in Fig. 5. Statistical experiments show that endpoints and bifurcations are the most common minutiae in fingerprints, with occurrence probabilities of 68.2% and 23.8% respectively. The method used in the present embodiment therefore determines minutiae mainly by locating endpoints and bifurcations, and then determines whether two fingerprints match by comparing the mutual relations between the features.
The extraction of fingerprint minutiae proceeds as follows:
(1) endpoint extraction
Method: scan each point; if the sum of the absolute differences between all pairs of adjacent points among its 8 neighbors is 2 × 255, the point is an endpoint.
(2) bifurcation point extraction
Method: scan each point; if the sum of the absolute differences between all pairs of adjacent points among its 8 neighbors is 6 × 255, the point is a bifurcation point.
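A minimal sketch of this neighbor-difference test, assuming a thinned binary image in which ridge pixels have the value 255; the clockwise ordering of the 8 neighbors is an implementation assumption.

```python
def classify_minutia(img, i, j):
    # sum of absolute differences between consecutive 8-neighbors:
    # 2*255 -> endpoint, 6*255 -> bifurcation point
    ring = [img[i-1, j-1], img[i-1, j], img[i-1, j+1], img[i, j+1],
            img[i+1, j+1], img[i+1, j], img[i+1, j-1], img[i, j-1]]
    total = sum(abs(int(ring[k]) - int(ring[(k + 1) % 8])) for k in range(8))
    if total == 2 * 255:
        return "endpoint"
    if total == 6 * 255:
        return "bifurcation"
    return None
```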
(3) extraction of core points and delta points
The Poincaré index is used to judge singular points, and thereby to extract core points and delta points. For a candidate singular point (i, j), the Poincaré index is defined as follows:
$$\mathrm{poincare}(i,j) = \frac{1}{2\pi} \int_0^{2\pi} \frac{\partial}{\partial \theta}\, O'(i + \varepsilon\cos\theta,\; j + \varepsilon\sin\theta)\, d\theta \qquad (39)$$
$$\frac{\partial}{\partial \theta}\, O'(i + \varepsilon\cos\theta,\; j + \varepsilon\sin\theta) = \begin{cases} d\delta, & |d\delta| < \frac{\pi}{2} \\ \pi + d\delta, & d\delta < -\frac{\pi}{2} \\ \pi - d\delta, & \text{otherwise} \end{cases} \qquad (40)$$
where
$$d\delta = \lim_{v \to 0} \frac{O'(i + \varepsilon\cos(\theta+v),\; j + \varepsilon\sin(\theta+v)) - O'(i + \varepsilon\cos\theta,\; j + \varepsilon\sin\theta)}{v} \qquad (41)$$
The Poincaré index integrates the direction field around a closed circle centered on the point, i.e. it sums the direction field differences along the circumference and divides by 2π to obtain the mean direction difference. The larger the direction differences, the larger the Poincaré value; the more sharply the direction field changes around a point, the more likely that point is a singular point.
In the direction field, the Poincaré index of a core point is 1/2, and the Poincaré index of a delta point is -1/2. In a digital image, the Poincaré value is calculated as the sum of direction field differences along a closed curve, replacing the poincare(i, j) formula above.
The extraction of fingerprint singular points proceeds as follows:
(1) calculation of the Poincaré value.
As shown in Fig. 6, in a 5 × 5 grid centered on (i, j), a closed curve D_1, D_2, ..., D_12 is formed clockwise. The Poincaré value of this closed curve is:
$$\mathrm{poincare}(i,j) = \sum_{k=1}^{12} \left| D_k - D_{(k+1) \bmod 12} \right| \qquad (42)$$
In a 3 × 3 grid centered on (i, j), a closed curve d_1, d_2, ..., d_8 is formed clockwise. The Poincaré value of this closed curve is:
$$\mathrm{poincare}(i,j) = \sum_{k=1}^{8} \left| d_k - d_{(k+1) \bmod 8} \right| \qquad (43)$$
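For illustration, a minimal sketch of the discrete Poincaré index over a closed curve of direction samples (for example the d_1, ..., d_8 or D_1, ..., D_12 of Fig. 6) is given below; it uses signed, wrapped differences so that a core yields about 1/2 and a delta about -1/2, which is the conventional form corresponding to formulas (39)-(41), whereas formulas (42)-(43) accumulate absolute differences as a discrete substitute.

```python
import numpy as np

def poincare_index(directions):
    # directions: ridge directions in [0, pi) sampled clockwise on a closed curve
    total = 0.0
    n = len(directions)
    for k in range(n):
        d = directions[(k + 1) % n] - directions[k]
        if d > np.pi / 2:          # wrap the difference into (-pi/2, pi/2]
            d -= np.pi
        elif d <= -np.pi / 2:
            d += np.pi
        total += d
    return total / (2 * np.pi)     # ~0.5 for a core, ~-0.5 for a delta
```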
(2) the final singular point is obtained by an averaging algorithm.
Several adjacent candidate singular points may be obtained, so the final singular point must be obtained by an averaging algorithm. To eliminate the pseudo singular points caused by noise, the closed-curve Poincaré value is calculated clockwise with (i, j) as the center. A candidate singular point is accepted as a real singular point only when the Poincaré values (1/2 or -1/2) of the 3 × 3 grid and the 5 × 5 grid are identical.
In the direction field of the 3 × 3 grid, the sum of the differences between each of the 8 directions (1, 2, 3, ..., 8) and the others is calculated, and the direction with the minimum sum is taken as the direction of the singular point. In the direction field of the 5 × 5 grid, the sum of the differences between each of the 12 directions (1, 2, 3, ..., 12) and the others is calculated, and the direction with the minimum sum is the direction of the singular point. In the process of calculating the singular point direction, the two directions may not be identical, in which case their mean may be taken as the direction of the singular point.
Due to quality problems in fingerprint image acquisition, the image may contain many pseudo features, such as pseudo endpoints formed by ridge breaks, pseudo bifurcation points formed by grease, and pseudo ridges (burrs) formed by dust on the sensor surface. The characteristics of pseudo features are as follows:
(1) pseudo feature points identified from the ridge shape
Analysis of the fingerprint morphology shows that the ridges in a fingerprint are in general not short, so a very short ridge is a pseudo ridge (burr, stub, adhesion bridge, etc.).
(2) pseudo feature points in the fingerprint edge region
Because of the sensor, the region at the edge of the fingerprint may not be captured, and genuine ridge endpoints rarely lie at the edge, so endpoints in the image edge region are pseudo endpoints.
(3) pseudo feature points identified from the direction field
From the viewpoint of the direction field, a ridge varies continuously and so does its direction. If the direction around a feature point (an endpoint or a bifurcation point) changes sharply, the point is a pseudo feature point.
(4) pseudo feature points identified by the ridge-distance threshold
This applies to endpoint features in non-edge regions: the number of feature points within a unit ridge spacing is limited by a threshold. A ridge break forms a pair of closely spaced pseudo endpoints, so closely spaced endpoints are likely to be pseudo endpoints. The minimum distance between two genuine feature points should be the ridge spacing; therefore, if the distance between two feature points is less than the ridge spacing, one of them must be a pseudo feature point, and the same applies to bifurcation points.
The design flow for removing pseudo fingerprint feature points is shown in Fig. 7.
In present embodiment, identification step includes:
S31: feature point set calibration sub-step, coupling initial point pair between location feature point set and template point set, set up polar coordinate system to concentrating at feature point set and template point respectively for limit mating initial point, calculate the rotation parameter between feature point set and template point set alignment features point set;And
S32: Point matching sub-step, attempts coupling to feature point set and template point set, and the feature of statistical match is counted.
In present embodiment, feature point set calibration sub-step can be realized by feature point set calibration module 31, in present embodiment, feature point set calibration module 31 is for feature point set calibration sub-step, coupling initial point pair between location feature point set and template point set, set up polar coordinate system to concentrating at feature point set and template point respectively for limit mating initial point, calculate the rotation parameter between feature point set and template point set alignment features point set.Point matching sub-step can be realized by point matching algorithm module 32, and in other words, point matching algorithm module 32 is for attempting coupling to feature point set and template point set, and the feature of statistical match is counted.
In present embodiment, feature point set calibration sub-step includes:
S31: fingerprint image is calibrated, calibrates crestal line, calibrates singular point;And
S33: polar coordinate are changed, and minutiae point are all transformed under polar coordinate system.
Central point matching algorithm based on polar coordinate conversion:
Matching algorithm includes two stages: 1. feature point set calibration: the coupling initial point pair between location feature point set and template point set, with concentrate at feature point set and template point respectively for limit and set up polar coordinate system, calculate the rotation parameter between feature point set and template point set alignment features point set;2. Feature Points Matching: using point matching algorithm (overcoming the impact of image non-linear deformation and calibration error) that feature point set and template point set are attempted coupling in polar coordinate system after calibration, the feature of statistical match is counted.
1. feature point set calibration:
Minutiae point is transformed in polar coordinate system, will concentrate each select a reference point as the initial point in corresponding polar coordinate system at template details point set and input minutiae point, and calculate other minutiae point polar coordinate relative to reference point.
Order P = ( ( x 1 p , y 1 p , q 1 p , t 1 p ) T , . . . , ( x M p , y M p , q M p , t M p ) T ) , Represent M minutiae point in template image,Represent the N number of minutiae point wherein x in input picturei, yi, qi, tiThe x coordinate of the details respectively extracted, y-coordinate, characteristic point direction and type.
For each point $P_i$ ($1 \le i \le M$) in the template point set and each point $Q_j$ ($1 \le j \le N$) in the feature point set, rotate[i][j] is defined as the rotation angle from the input image to the template image when $P_i$ and $Q_j$ are used as a reference point pair. If $P_i$ and $Q_j$ can be treated as a pair of corresponding points, rotate[i][j] takes a value between 0° and 360°; otherwise, in the present embodiment, rotate[i][j] is set to 400 to indicate that they cannot serve as a pair of corresponding points.
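As a minimal sketch of the data layout assumed at this stage, the Python snippet below stores the minutiae as (x, y, direction, type) tuples and initialises the M x N rotate table with the sentinel value 400; the concrete coordinate values are made up purely for illustration.

```python
NO_MATCH = 400  # sentinel outside the valid 0-360 degree range

template_points = [(31.0, 52.0, 15.0, "ending"),
                   (80.0, 40.0, 120.0, "bifurcation")]   # M template minutiae
input_points = [(35.0, 49.0, 20.0, "ending"),
                (77.0, 45.0, 118.0, "bifurcation")]       # N input minutiae

M, N = len(template_points), len(input_points)
# rotate[i][j] is later set to the rotation angle (0-360 degrees) from the input
# image to the template image when P_i and Q_j are accepted as a reference pair.
rotate = [[NO_MATCH] * N for _ in range(M)]
```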
(1) fingerprint image calibration
Ridge alignment method
Minutiae together with a segment of their associated ridge line are adopted as the fingerprint features. As shown in Figure 8, sampled points on the ridge line represent the ridge feature corresponding to a minutia, and the sampling distance is approximately the average distance between ridge lines. The ridge line corresponding to a bifurcation point is the one closest to the direction of that minutia, and the ridge line corresponding to an endpoint is the ridge line on which the endpoint lies. The feature components describing a sampled point include: the distance between the point and the corresponding minutia, and the angle between the line connecting the point to the corresponding minutia and the direction of that minutia. In the matching calibration process, the corresponding ridge lines are used to calibrate the two feature point sets to be matched.
Let R denote the ridge line corresponding to minutia P and r denote the ridge line corresponding to minutia Q. When matching R and r, formulas (44) and (45) are used to calculate the difference between the two ridge lines.
$$\mathrm{Diff\_dist} = \frac{1}{L}\sum_{i=0}^{L}\left|R(d_i) - r(d_i)\right| \qquad (44)$$
$$\mathrm{Diff\_ang} = \frac{1}{L}\sum_{i=0}^{L}\left|R(\alpha_i) - r(\alpha_i)\right| \qquad (45)$$
In the formulas, $L$ is the number of points recorded on the ridge line, $R(d_i)$ and $r(d_i)$ are respectively the distances from the $i$-th point on ridge lines R and r to the corresponding minutia, and $R(\alpha_i)$ and $r(\alpha_i)$ are respectively the angles between the line connecting the $i$-th point to the corresponding minutia and the direction of that minutia.
If the differences Diff_dist and Diff_ang between the two ridge lines are respectively smaller than certain thresholds Td and Tα, that is, the two ridge lines are similar to a certain extent, then $P_i$ and $Q_j$ can be taken as a corresponding reference point pair. Otherwise, the two ridge lines are considered dissimilar, and $P_i$ and $Q_j$ cannot be taken as a corresponding reference point pair.
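The following minimal Python sketch illustrates this ridge-similarity test according to formulas (44) and (45); the sampled-ridge format (equal-length lists of (distance, angle) pairs) and the default threshold values Td and Ta are assumptions of the sketch, not values given in the embodiment.

```python
def ridge_difference(ridge_R, ridge_r):
    """ridge_R, ridge_r: lists of (d_i, alpha_i) sampled along the two ridges, where
    d_i is the distance from the sample to its minutia and alpha_i the angle between
    the line to that minutia and the minutia direction (formulas (44) and (45))."""
    L = min(len(ridge_R), len(ridge_r))
    if L == 0:
        return float("inf"), float("inf")
    diff_dist = sum(abs(R[0] - r[0]) for R, r in zip(ridge_R[:L], ridge_r[:L])) / L
    diff_ang = sum(abs(R[1] - r[1]) for R, r in zip(ridge_R[:L], ridge_r[:L])) / L
    return diff_dist, diff_ang

def can_be_reference_pair(ridge_R, ridge_r, Td=10.0, Ta=0.3):
    """P_i and Q_j qualify as a reference pair only if both differences fall
    below the thresholds Td and Ta."""
    diff_dist, diff_ang = ridge_difference(ridge_R, ridge_r)
    return diff_dist < Td and diff_ang < Ta
```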
Singular point calibration steps
In the present embodiment, singular points are extracted by the direction field method during the fingerprint feature extraction process, so a calibration method based on singular points is proposed here: singular points are extracted from the input fingerprint and the template fingerprint respectively by the direction field method.
If both the input fingerprint and the template fingerprint contain a singular point, the two singular points can be taken as a reference point pair, and rotate[i][j] takes a value between 0° and 360°; otherwise rotate[i][j] is set to 400 to indicate that $P_i$ and $Q_j$ cannot serve as a reference point pair. With the reference point of each fingerprint image as the origin, the other feature points are transformed into the polar coordinate system relative to it for subsequent matching. The calibration procedure is as follows:
Since singular points have already been extracted during fingerprint feature extraction and their coordinates and directional information are stored in the corresponding memory space, the feature point calibration method of the present embodiment saves the time of searching for feature points one by one and does not require a large temporary memory space to store fingerprint information. After the calibration points are found, fingerprint correction is carried out with the calibration point as the center.
(2) polar coordinate conversion
Since the minutiae are finally all transformed into the polar coordinate system, the present embodiment only calculates the rotation angle between the input image and the template image and does not consider the translation between the two images. To calibrate the input image and the template image in the polar coordinate system, the input minutiae and the template minutiae only need to be transformed into the polar coordinate systems relative to the reference points $P_i$ and $Q_j$ respectively, and then the angle rotate[i][j] is added to the polar angles of all input minutiae. That is, the input minutiae and the template minutiae are transformed into the polar coordinate systems relative to the reference points $P_i$ and $Q_j$ respectively by formula (46).
$$\begin{pmatrix} r_i \\ e_i \\ \theta_i \end{pmatrix} = \begin{pmatrix} \sqrt{(x_i^* - x_r)^2 + (y_i^* - y_r)^2} \\ \tan^{-1}\!\left(\dfrac{y_i^* - y_r}{x_i^* - x_r}\right) \\ \theta_i^* - \theta_r \end{pmatrix} \qquad (46)$$
where $(x_i^*, y_i^*, \theta_i^*)^T$ are the coordinates of the minutia to be converted, $(x_r, y_r, \theta_r)^T$ are the coordinates of the reference minutia, and $(r_i, e_i, \theta_i)^T$ is the expression of the minutia in the polar coordinate system, representing respectively the polar radius, the polar angle and the minutia direction relative to the reference minutia.
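A minimal Python sketch of the conversion in formula (46) follows; using atan2 so that the full angular range is covered, and wrapping angles into 0-360 degrees, are implementation choices of the sketch rather than details stated in the embodiment.

```python
import math

def to_polar(minutia, reference):
    """minutia, reference: (x, y, direction) tuples, with directions in degrees.
    Returns (r, e, theta) as in formula (46): polar radius, polar angle and the
    minutia direction relative to the reference minutia."""
    x, y, theta = minutia
    x_r, y_r, theta_r = reference
    r = math.hypot(x - x_r, y - y_r)                      # polar radius
    e = math.degrees(math.atan2(y - y_r, x - x_r)) % 360.0  # polar angle
    return r, e, (theta - theta_r) % 360.0
```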
2. Feature Points Matching
The template minutiae and the input minutiae in polar coordinates are sorted in order of increasing polar angle and concatenated into strings, expressed as follows, where $r_k^P$ ($r_k^Q$), $e_k^P$ ($e_k^Q$) and $\theta_k^P$ ($\theta_k^Q$) represent the corresponding polar radius, polar angle and minutia direction relative to the reference point.
$$P_i^{(S)} = \left((r_1^P, e_1^P, \theta_1^P)^T, \ldots, (r_m^P, e_m^P, \theta_m^P)^T\right) \qquad (47)$$
$$Q_j^{(S)} = \left((r_1^Q, e_1^Q, \theta_1^Q)^T, \ldots, (r_n^Q, e_n^Q, \theta_n^Q)^T\right) \qquad (48)$$
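As a sketch of how the strings of formulas (47) and (48) can be built, the helper below converts a point set with the to_polar function sketched earlier and sorts by polar angle; the optional rotation argument, which carries the rotate[i][j] correction for the input set, is an assumption of the sketch.

```python
def to_polar_string(points, reference, rotation=0.0):
    """points: list of (x, y, direction) tuples; reference: the chosen reference
    minutia. Returns the minutiae in polar form, sorted by increasing polar angle."""
    polar = [to_polar(p, reference) for p in points]
    adjusted = [(r, (e + rotation) % 360.0, t) for r, e, t in polar]
    return sorted(adjusted, key=lambda p: p[1])   # sort by polar angle e
```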
When a fingerprint is pressed, differences in pressure and other factors inevitably cause non-linear deformation between the two fingerprint images. Even after calibration, the minutiae in the input image cannot coincide exactly with the corresponding points in the template image. In addition, noise in the acquisition process may introduce a certain deviation between the corresponding points of the two images. This requires the minutia matching algorithm to have a certain elasticity, that is, it should be able to tolerate, to some extent, differences in corresponding point positions caused by inaccurately extracted minutia positions or non-linear image deformation. For this purpose the concept of a gauge box is introduced here. As shown in Figure 9A, a gauge box is a box placed on each template feature point; one pair of its opposite sides has constant polar angle and the other pair has constant polar radius. angle_size denotes the polar angle difference across the pair of sides with constant polar radius, and radius_size denotes the polar radius difference across the pair of sides with constant polar angle:
angle_size = angle_high - angle_low  (49)
radius_size = radius_high - radius_low  (50)
The size of a gauge box is represented by angle_size and radius_size. If a template feature point and a calibrated input feature point both lie within the same gauge box, the two feature points may be a pair of matching points. Fixed-size and variable-size gauge boxes are shown in Figure 9; a variable-size gauge box is used here. The size of a variable-size gauge box changes with the polar radius of the minutia, that is, the values of angle_size and radius_size vary with the polar radius. If the polar radius of the template minutia is relatively large, its gauge box has a larger radius_size and a smaller angle_size; if the polar radius of the template minutia is relatively small, its gauge box has a larger angle_size and a smaller radius_size.
Radius_size and the angle_size of the template minutiae point that polar radius is r is calculated with following formula.
radius _ size = r _ small if r _ size < r _ small r _ size if r _ small < r < r _ l arg e r _ l arg e if r _ size > r _ l arg e - - - ( 51 )
r _ size = r _ small + r &alpha; - - - ( 52 )
angle _ size = a _ small if a _ size < a _ small a _ size if a _ small < a < a _ l arg e a _ l arg e if a _ size > a _ l arg e - - - ( 53 )
a _ size = r _ size r - - - ( 54 )
where r is the polar radius of the template minutia; r_small, r_large, a_small and a_large are respectively the lower and upper bounds of radius_size and angle_size, and their values are set in advance; α is a previously given constant.
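A minimal Python sketch of the variable gauge box size according to formulas (51)-(54) is given below; the bound values, the constant alpha and the reading of formula (52) as r_small + r/α are assumptions of the sketch, since the embodiment only states that these parameters are preset.

```python
def clamp(value, low, high):
    return max(low, min(high, value))

def gauge_box_size(r, r_small=5.0, r_large=15.0,
                   a_small=0.1, a_large=1.0, alpha=30.0):
    """Return (radius_size, angle_size) for a template minutia with polar radius r.
    angle_size is the ratio r_size / r, i.e. a radian-like quantity."""
    r_size = r_small + r / alpha                       # formula (52)
    radius_size = clamp(r_size, r_small, r_large)      # formula (51)
    a_size = r_size / r if r > 0 else a_large          # formula (54)
    angle_size = clamp(a_size, a_small, a_large)       # formula (53)
    return radius_size, angle_size
```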
The present embodiment uses variable-size gauge boxes to make the algorithm more robust to non-linear deformation. Non-linear deformation is typically large in a specific region and then spreads outward non-linearly. When the polar radius of a minutia is small, a small deformation can cause a large change of the polar angle while the change of the polar radius remains small; in this case the angle_size of the gauge box should be relatively large and the radius_size relatively small. On the other hand, when the polar radius of a minutia is large, a small change of the polar angle results in a large change of the minutia position, and the deformation of the polar radius can be regarded as the accumulated deformation of the whole region between this minutia and the reference minutia; in this case the angle_size of the gauge box should be relatively small and the radius_size relatively large. The fingerprint matching flow chart is shown in Figure 10.
The size of the gauge box of each template minutia is determined by formulas (51) and (53) above, and the match counter ε is set to 0.
Loop: each template feature point template_point[k] is compared with each input feature point input_point[l], and ε is incremented whenever the pair falls within the corresponding gauge box.
If ε is greater than a preset threshold ε', the two fingerprints are considered to match successfully; otherwise they are considered not to come from the same finger.
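A minimal Python sketch of this matching loop follows, using the gauge_box_size and polar-string helpers sketched earlier; the greedy one-to-one pairing, the angle-wraparound handling, the conversion of the polar-angle gap to a radian-like quantity and the default threshold are assumptions of the sketch.

```python
import math

def angle_gap(a, b):
    """Smallest absolute difference between two angles given in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def match_count(template_string, input_string):
    """Count template/input pairs falling within the template point's gauge box.
    Both arguments are lists of (r, e, theta) from the polar conversion."""
    matched = 0
    used = set()
    for r_t, e_t, _ in template_string:
        radius_size, angle_size = gauge_box_size(r_t)
        for j, (r_i, e_i, _) in enumerate(input_string):
            if j in used:
                continue
            # angle_size is the ratio r_size / r, so the polar-angle gap
            # (in degrees) is converted to radians before the comparison.
            if (abs(r_t - r_i) <= radius_size / 2 and
                    math.radians(angle_gap(e_t, e_i)) <= angle_size / 2):
                matched += 1
                used.add(j)
                break
    return matched

def fingerprints_match(template_string, input_string, epsilon_threshold=12):
    """The fingerprints are declared matching when the counter exceeds epsilon'."""
    return match_count(template_string, input_string) > epsilon_threshold
```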
The preprocessing and feature extraction results obtained with the method of the invention are shown in Figure 11.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "an exemplary embodiment", "an example", "a specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in an appropriate manner in any one or more embodiments or examples.
Although embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes, modifications, replacements and variations can be made to these embodiments without departing from the principles and purpose of the invention; the scope of the present invention is defined by the claims and their equivalents.

Claims (18)

1. A fingerprint identification method, characterized by comprising:
S1: a preprocessing step of segmenting and filtering a fingerprint image;
S2: a feature extraction step of performing feature point extraction on the preprocessed fingerprint image to form a feature point set, and further removing pseudo feature points; and
S3: an identification step of calibrating the feature point set and performing identification by a matching algorithm to obtain a fingerprint identification result;
wherein the preprocessing step comprises:
S12: a segmentation sub-step of calculating the Harris corner energy of each pixel of the fingerprint image, removing pixels whose Harris corner energy is less than a predetermined threshold to obtain a binary image, and dilating and eroding the binary image; and
S13: a filtering sub-step of calculating the direction field, the frequency field and the curvature field of the binary image respectively, and filtering the binary image with a Log-Gabor filter to obtain the preprocessed fingerprint image.
2. The fingerprint identification method according to claim 1, characterized in that, before the segmentation sub-step, the preprocessing step comprises:
S11: a normalization sub-step of normalizing the fingerprint image before segmentation.
3. The fingerprint identification method according to claim 2, characterized in that the normalization sub-step comprises, in sequence:
S111: histogram equalization, in which histogram equalization is performed on the fingerprint image before segmentation;
S112: selecting a region of interest in the fingerprint image after histogram equalization; and
S113: adaptive normalization based on local characteristics, in which the fingerprint image after selection of the region of interest is divided into non-overlapping image blocks of a predetermined size and each image block is normalized separately.
4. The fingerprint identification method according to claim 1, characterized in that the segmentation sub-step comprises:
S121: calculating the Harris corner energy of each pixel of the fingerprint image;
S122: collecting statistics of the Harris corner energies, sorting them in ascending order of magnitude, removing the first 30% of the sorted Harris corner energies to obtain a binary image, and then dilating and eroding the binary image; and
S123: binarizing the obtained binary image.
5. The fingerprint identification method according to claim 1, characterized in that the direction field is calculated using a gradient-based direction field calculation method and corrected by a low-pass filter.
6. The fingerprint identification method according to claim 1, characterized in that the frequency field is calculated by a projection method.
7. The fingerprint identification method according to claim 1, characterized in that the curvature field is calculated from the direction field, with the ridge line curvature measured by a curvature measure.
8. The fingerprint identification method according to claim 1, characterized in that the filtering sub-step divides the binary image into non-overlapping image sub-blocks of a predetermined size, adopts a windowed FFT to extract the spectrum information of each image sub-block, and then, according to the ridge direction and frequency of each image sub-block, constructs the Log-Gabor filter to filter the spectrum of the image sub-block.
9. The fingerprint identification method according to claim 1, characterized in that, after the filtering sub-step, the preprocessing step further comprises:
S14: a thinning sub-step of binarizing the filtered binary image using an adaptive dynamic threshold method and thinning it through a mathematical morphology look-up table to obtain the preprocessed fingerprint image.
10. The fingerprint identification method according to claim 1, characterized in that the feature points include fingerprint feature points, the fingerprint feature points include endpoints and bifurcation points, and they are extracted by an 8-neighborhood coded ridge tracing algorithm.
11. The fingerprint identification method according to claim 1, characterized in that the feature points include fingerprint singular points, the fingerprint singular points include core points and delta points, and they are determined and extracted by the Poincaré formula.
12. The fingerprint identification method according to claim 1, characterized in that the identification step comprises:
S31: a feature point set calibration sub-step of locating a matching origin pair between the feature point set and a template point set, establishing polar coordinate systems with the matching origin points as poles in the feature point set and the template point set respectively, and calculating the rotation parameter between the feature point set and the template point set to calibrate the feature point set; and
S32: a point matching sub-step of attempting matching between the feature point set and the template point set and counting the number of matched feature points.
13. The fingerprint identification method according to claim 12, characterized in that the feature points include singular points and minutiae, and the feature point set calibration sub-step comprises the following steps:
S311: fingerprint image calibration, in which the ridge lines and the singular points are calibrated; and
S312: polar coordinate conversion, in which the minutiae are all transformed into the polar coordinate system.
14. A fingerprint recognition system, characterized by comprising:
a preprocessing module for segmenting and filtering a fingerprint image;
a feature extraction module for performing feature point extraction on the preprocessed fingerprint image to form a feature point set and further removing pseudo feature points; and
an identification module for calibrating the feature point set and performing identification by a matching algorithm to obtain a fingerprint identification result;
wherein the preprocessing module comprises:
a segmentation module for calculating the Harris corner energy of each pixel of the fingerprint image, removing pixels whose Harris corner energy is less than a predetermined threshold to obtain a binary image, and dilating and eroding the binary image; and
a filtering sub-module for calculating the direction field, the frequency field and the curvature field of the binary image respectively, and filtering the binary image with a Log-Gabor filter to obtain the preprocessed fingerprint image.
15. The fingerprint recognition system according to claim 14, characterized in that the preprocessing module further comprises:
a normalization module for normalizing the fingerprint image.
16. The fingerprint recognition system according to claim 14, characterized in that the preprocessing module further comprises:
a thinning module for binarizing the filtered image using an adaptive dynamic threshold method and thinning it through a mathematical morphology look-up table to obtain the preprocessed fingerprint image.
17. The fingerprint recognition system according to claim 14, characterized in that the feature extraction module comprises:
a feature point extraction module for extracting fingerprint feature points and fingerprint singular points; and
a pseudo feature point removal module for removing pseudo endpoints, pseudo bifurcation points and pseudo fingerprint feature points.
18. The fingerprint recognition system according to claim 14, characterized in that the identification module comprises:
a feature point set calibration module for locating a matching origin pair between the feature point set and a template point set, establishing polar coordinate systems with the matching origin points as poles in the feature point set and the template point set respectively, and calculating the rotation parameter between the feature point set and the template point set to calibrate the feature point set; and
a point matching module for attempting matching between the feature point set and the template point set and counting the number of matched feature points.