CN101329727A - Fingerprint identification method combining point with line - Google Patents

Fingerprint identification method combining point with line

Info

Publication number
CN101329727A
CN101329727A, CNA2008100648199A, CN200810064819A
Authority
CN
China
Prior art keywords
point
image
steps
judged result
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008100648199A
Other languages
Chinese (zh)
Other versions
CN100590644C (en)
Inventor
王明江
闫志锋
王进祥
韦秋初
董颖杰
刘钊
刘鹏
和王峰
彭刚
桑坚
张永胜
张国君
肖永生
马晓卫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN200810064819A priority Critical patent/CN100590644C/en
Publication of CN101329727A publication Critical patent/CN101329727A/en
Application granted granted Critical
Publication of CN100590644C publication Critical patent/CN100590644C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a fingerprint identification method combining point with line. It aims to solve the problems that existing point-matching fingerprint identification methods are prone to misjudgement, strongly affected by false feature points and poor in noise immunity, while line-matching methods are computationally heavy and slow. All matching points between a database image Tp and an image to be identified Tq are selected; the matching point pairs are sorted by matching similarity and the N pairs with the highest similarity are chosen; from these N pairs, K pairs are selected as reference points; the differences in x-coordinate, y-coordinate and direction angle are computed and used as the offset between the two images; the image Tq is then translated and rotated accordingly; after this correction, the ridge lines of Tq are matched against those of Tp and the number of matched ridge lines is recorded; if the length-weighted average over all matched ridge lines satisfies the threshold, Tp and Tq are judged to match; otherwise they are judged not to match.

Description

Fingerprint identification method combining point with line
Technical field
The present invention relates to a fingerprint identification method, in particular to a fingerprint identification method combining point with line, and belongs to the field of biometric recognition.
Background technology
Fingerprint recognition is a biometric technology that exploits the uniqueness and stability of fingerprint images: two fingerprint images are compared to decide whether they come from the same finger. Uniqueness means that different people's fingerprints differ, and that the fingerprints of different fingers of the same person also differ; stability means that a person's fingerprint essentially does not change.
Commonly used fingerprint identification methods rely solely on matching the feature points (minutiae) of the fingerprint images: whether two images come from the same finger is decided by the number of minutiae that satisfy the matching conditions. The general steps are acquisition, segmentation, enhancement, thinning, minutia extraction and matching. During matching, the minutiae of the input fingerprint and of the template in the database are first aligned so that the two feature sets lie in the same coordinate system, and are then matched. Such algorithms rely only on the correspondence of isolated feature points to decide whether two images match, without considering the overall trend of the ridge lines in the two images. Because matching is considered only from the point of view of points, a poor-quality image may fail to yield enough feature points and be misjudged; the result is also strongly affected by false feature points, and the noise immunity is poor.
The Chinese patent with publication number CN564186 discloses a fingerprint identification method based on global ridge lines. That method performs point matching on all ridge points in the two images, and every time a point match is determined it must first decide whether a group of points (about 2n+1) match, so the computation is heavy. Moreover, the method does not treat a ridge line as a whole; it merely takes several ridge points before and after the point to be matched along the ridge direction, rather than matching two ridge lines against each other.
Summary of the invention
To solve the problems that existing point-matching fingerprint identification methods are prone to misjudgement, strongly affected by false feature points and poor in noise immunity, while line-matching fingerprint identification methods are computationally heavy and slow, the present invention provides a fingerprint identification method combining point with line. The invention is realized by the following steps:
Step A1: in the database image Tp, take the centre point of Tp as the centre of a circular region of radius R, where R is a real number greater than zero, and search for matching points between this circular region and the corresponding region of the image to be identified Tq; for each feature point in the circular region of Tp, traverse all points in the circular region of Tq and choose the point that matches the feature point; the two points together are recorded as a pair of matching points;
Step A2: sort the matching point pairs obtained in step A1 by matching similarity and select the N pairs with the highest matching similarity, where N is a natural number;
Step A3: from the N pairs of matching points, select K pairs as reference points, where K is a natural number smaller than N;
Step A4: for each pair of reference points, take the differences of the x-coordinates, y-coordinates and direction angles of the two feature points, giving three differences Δx, Δy and Δθ; then compute the mean values of Δx, Δy and Δθ over the K reference pairs, and use these mean values as the offsets of the x-coordinates, y-coordinates and direction angles of all matching points between the database image Tp and the image to be identified Tq;
Step A5: apply a translation and rotation transform to the image to be identified Tq, taking the differences of its x-coordinates, y-coordinates and direction angles with the mean values of Δx, Δy and Δθ obtained in step A4 to complete the image correction and obtain the corrected image to be identified T'q;
Step A6: perform ridge-line matching between the corrected image T'q and the database image Tp, and obtain the number of matched ridge lines;
Step A7: judge whether the number of matched ridge lines obtained in step A6 is greater than or equal to a threshold; if yes, go to step A8; if no, judge that the database image Tp and the image to be identified Tq do not match, replace the database image Tp and return to step A1;
Step A8: compute the length-weighted average over all matched ridge lines and judge whether this average is less than a threshold; if yes, the database image Tp and the image to be identified Tq are considered to match; if no, they are considered not to match, so replace the database image Tp and return to step A1.
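To make the control flow of steps A1 to A8 easier to follow, a minimal Python sketch is given below. It encodes only the decision structure described above; the lower-level routines (pair finding, reference-point selection, image correction, ridge matching, weighted averaging) are passed in as functions, and all names, parameters and the data layout (each reference point as an (x, y, θ) tuple) are illustrative assumptions rather than code from the patent.

```python
def identify(tp, tq, find_pairs, select_refs, correct, match_ridges, weighted_mean,
             n_pairs, min_matched_ridges, vmean_threshold):
    """Decision flow of steps A1-A8 for one database image tp and one query image tq."""
    # A1-A2: candidate pairs from the central circular regions, assumed already
    # sorted by matching similarity; keep the best n_pairs.
    pairs = find_pairs(tp, tq)[:n_pairs]
    # A3: mutually consistent pairs become the reference points (K pairs).
    refs = select_refs(pairs)
    if not refs:
        return False
    # A4: average the x, y and direction-angle differences over the K reference pairs.
    k = len(refs)
    dx = sum(p[0] - q[0] for p, q in refs) / k
    dy = sum(p[1] - q[1] for p, q in refs) / k
    dtheta = sum(p[2] - q[2] for p, q in refs) / k
    # A5: translate and rotate tq by the average offsets to obtain the corrected image.
    tq_corrected = correct(tq, dx, dy, dtheta)
    # A6-A7: count matched ridge lines; too few means "no match" for this tp.
    matched = match_ridges(tp, tq_corrected)
    if len(matched) < min_matched_ridges:
        return False
    # A8: the length-weighted average score of the matched ridges must stay below the threshold.
    return weighted_mean(matched) < vmean_threshold
```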
Beneficial effects: when computing the offset between the two images, the invention performs feature-point matching only within the circular region centred at the image centre, which narrows the search range; compared with feature matching over the whole image, the computation can be reduced by about 60%, improving efficiency. Several groups of matching points are selected as reference points and their differences are averaged, which improves the reliability of the image correction. When ridge-line matching is performed, whether the two images match is determined from the number of matched ridge lines; matching from the point of view of lines effectively reduces the influence of false feature points, gives good matching results even for poor-quality images, and provides strong noise immunity. In addition, during the ridge-line matching operation the ridge lines are sampled every few points, which reduces the computation by about 50%.
Description of drawings
Fig. 1 is the process flow diagram of this method.
Embodiment
Embodiment one: referring to Fig. 1, the present embodiment consists of the following steps:
Step A1: in the database image Tp, take the centre point of Tp as the centre of a circular region of radius R, where R is a real number greater than zero, and search for matching points between this circular region and the corresponding region of the image to be identified Tq; the value of R should be chosen so that the circular region contains three to five feature points; for each feature point in the circular region of Tp, traverse all points in the circular region of Tq and choose the point that matches the feature point; the two points together are recorded as a pair of matching points;
Step A2: sort the matching point pairs obtained in step A1 by matching similarity and select the N pairs with the highest matching similarity, where N is a natural number;
Step A3: from the N pairs of matching points, select K pairs as reference points, where K is a natural number smaller than N;
Step A4: for each pair of reference points, take the differences of the x-coordinates, y-coordinates and direction angles of the two feature points, giving three differences Δx, Δy and Δθ; then compute the mean values of Δx, Δy and Δθ over the K reference pairs, and use these mean values as the offsets of the x-coordinates, y-coordinates and direction angles of all matching points between the database image Tp and the image to be identified Tq;
Step A5: apply a translation and rotation transform to the image to be identified Tq, taking the differences of its x-coordinates, y-coordinates and direction angles with the values of Δx, Δy and Δθ obtained in step A4 to complete the image correction and obtain the corrected image to be identified T'q;
Step A6: perform ridge-line matching between the corrected image T'q and the database image Tp, and obtain the number of matched ridge lines;
Step A7: judge whether the number of matched ridge lines obtained in step A6 is greater than or equal to a threshold; this threshold can be adjusted according to the required matching precision and should generally be greater than 7; if yes, go to step A8; if no, judge that the database image Tp and the image to be identified Tq do not match, replace the database image Tp and return to step A1;
Step A8: compute the length-weighted average over all matched ridge lines and judge whether this average is less than a threshold; the threshold may be four times the fingerprint ridge width plus 7; if yes, the database image Tp and the image to be identified Tq are considered to match; if no, they are considered not to match, so replace the database image Tp and return to step A1.
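As a concrete illustration of steps A4 and A5, the sketch below averages the offsets over the K reference pairs and applies the resulting rigid transform to a single point. The data layout (each point as an (x, y, θ) tuple), the choice of the image centre as rotation centre, and the sign convention (offsets computed as Tp minus Tq, so adding them moves Tq toward Tp) are assumptions for illustration; the patent only states that the image is translated and rotated by the mean differences.

```python
import math

def average_offsets(reference_pairs):
    """Step A4: mean x, y and direction-angle differences over the K reference pairs.
    Each pair is ((x_p, y_p, theta_p), (x_q, y_q, theta_q))."""
    k = len(reference_pairs)
    dx = sum(p[0] - q[0] for p, q in reference_pairs) / k
    dy = sum(p[1] - q[1] for p, q in reference_pairs) / k
    dtheta = sum(p[2] - q[2] for p, q in reference_pairs) / k
    return dx, dy, dtheta

def correct_point(x, y, theta, dx, dy, dtheta, cx, cy):
    """Step A5 for one point of Tq: rotate about the assumed centre (cx, cy) by
    dtheta, translate by (dx, dy), and shift the direction angle by dtheta."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    xr = cx + c * (x - cx) - s * (y - cy)
    yr = cy + s * (x - cx) + c * (y - cy)
    return xr + dx, yr + dy, theta + dtheta

# Example with two hypothetical reference pairs:
pairs = [((110.0, 92.0, 0.30), (104.0, 90.0, 0.22)),
         ((143.0, 61.0, 1.10), (138.0, 58.0, 1.05))]
print(average_offsets(pairs))   # -> (5.5, 2.5, 0.065)
```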
Embodiment two: on the basis of embodiment one, this embodiment further specifies the concrete method of finding matching points described in step A1:
Step B1: compare four feature quantities of two feature points within the circular regions of the database image Tp and the image to be identified Tq: the feature-point type, the distance from the feature point to the circle centre, the frequency at the feature point, and the difference between the feature-point direction angle and the circle-centre direction angle; let P = {P1, P2, ..., PN} and Q = {Q1, Q2, ..., QN} denote respectively the sets of feature points within the circular regions of the database image Tp and the image to be identified Tq;
Step B2: choose a feature point Pi in the set P and traverse all feature points in the set Q; for any feature point Qj in Q, judge the four feature quantities of Pi and Qj in turn; if any one of them does not meet the requirements, Pi and Qj are not a matching pair, so replace Qj and judge again; otherwise Pi and Qj are a matching pair;
The criteria for judging whether the four feature quantities meet the requirements are:
B21: judge whether the feature-point types are identical, i.e. whether Pi and Qj are both end points or both bifurcation points;
B22: judge whether the distances from the feature points to the circle centre meet the requirement: compute the distances di and dj from Pi and Qj to the centre points of their respective images, and judge whether the difference between di and dj is less than a threshold; this threshold is a small value not exceeding twice the fingerprint ridge width; if yes, the requirement is met; if no, it is not;
B23: judge whether the differences between the feature-point direction angles and the circle-centre direction angles meet the requirement: compute the differences (θi − θp) and (θj − θq), and judge whether the difference between (θi − θp) and (θj − θq) is less than a threshold; the threshold is a small value, for example 6 degrees; if yes, the requirement is met; if no, it is not; here θi and θj are the direction angles of Pi and Qj, and θp and θq are the direction angles of the centre points of the database image Tp and the image to be identified Tq, respectively;
B24: judge whether the feature-point frequencies meet the requirement: compute the difference between the frequencies at Pi and Qj and judge whether it is less than a threshold; the threshold is a small value, for example 0.023; if yes, the requirement is met; if no, it is not.
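The four checks of steps B21 to B24 translate directly into code. The sketch below is a minimal version assuming the example thresholds suggested in this embodiment (twice the ridge width, 6 degrees, 0.023); the Minutia container and its field names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Minutia:
    kind: str            # "ending" or "bifurcation" (B21)
    dist: float          # distance to the centre of its own image (B22)
    freq: float          # ridge frequency at the point (B24)
    theta: float         # direction angle of the point (B23)
    centre_theta: float  # direction angle of the image centre point (B23)

def is_matching_pair(p, q, ridge_width, dtheta_max=6.0, dfreq_max=0.023):
    """Steps B21-B24: return True only if all four feature quantities agree."""
    if p.kind != q.kind:                               # B21: same type
        return False
    if abs(p.dist - q.dist) >= 2 * ridge_width:        # B22: distance to the centre
        return False
    rel_p = p.theta - p.centre_theta                   # B23: angle relative to the centre
    rel_q = q.theta - q.centre_theta
    if abs(rel_p - rel_q) >= dtheta_max:
        return False
    if abs(p.freq - q.freq) >= dfreq_max:              # B24: frequency
        return False
    return True
```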
Embodiment three: on the basis of embodiment one, this embodiment further specifies the concrete method of sorting the matching points by similarity described in step A2:
Step C1: for each pair of matching points Pi and Qj, compute the frequency difference Δf, the difference (di − dj) of the distances from Pi and Qj to their respective centre points, and the direction-angle difference (θi − θp) − (θj − θq);
Step C2: among the N matching pairs, find the maximum frequency difference max|Δf|, the maximum distance difference max|di − dj|, and the maximum direction-angle difference max|(θi − θp) − (θj − θq)|;
Step C3: with the quantities obtained in step C2, compute the similarity reference value of each matching pair by the formula |Δf| / max|Δf| + |di − dj| / max|di − dj| + |(θi − θp) − (θj − θq)| / max|(θi − θp) − (θj − θq)|;
Step C4: arrange the similarity reference values obtained in step C3 in ascending order; the smaller the value, the higher the similarity of the matching pair.
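Steps C1 to C4 reduce to normalising each of the three per-pair differences by its maximum and sorting by the sum. A small sketch, assuming each pair is represented by its pre-computed differences (Δf, di − dj, (θi − θp) − (θj − θq)):

```python
def sort_by_similarity(pair_diffs):
    """Steps C1-C4: ascending sort by the similarity reference value
    |Δf|/max|Δf| + |Δd|/max|Δd| + |Δθ|/max|Δθ| (smaller = more similar).
    pair_diffs is a list of (df, dd, dtheta) tuples, one per matching pair."""
    max_df = max(abs(df) for df, dd, dth in pair_diffs) or 1.0    # guard against division by zero
    max_dd = max(abs(dd) for df, dd, dth in pair_diffs) or 1.0
    max_dth = max(abs(dth) for df, dd, dth in pair_diffs) or 1.0

    def reference_value(diffs):
        df, dd, dth = diffs
        return abs(df) / max_df + abs(dd) / max_dd + abs(dth) / max_dth

    return sorted(pair_diffs, key=reference_value)
```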
Embodiment four: on the basis of embodiment one, this embodiment further specifies the concrete method of selecting K pairs of matching points from the N pairs as reference points described in step A3:
Step D1: in the database image Tp and in the image to be identified Tq respectively, compute the distance between any two feature points belonging to the N matching pairs, and judge whether the difference between the corresponding distances is less than a threshold; the threshold is less than twice the fingerprint ridge width; if yes, go to step D3; if no, go to step D2;
Step D2: remove the pair of matching points with the lowest similarity and set N = N − 1; judge whether N equals 1; if yes, execute step D4; if no, return to step D1;
Step D3: judge whether the calculation and comparison have been performed for all feature points of the N matching pairs; if yes, execute step D4; if no, return to step D1;
Step D4: set K = N and use the K pairs of matching points as the reference points for correcting the database image Tp and the image to be identified Tq.
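Steps D1 to D4 amount to a geometric-consistency filter: the pairwise distances among the selected points must agree between the two images, and the least-similar pair is dropped until they do. A compact sketch under that reading (points as (x, y) tuples, pairs pre-sorted best first, threshold taken as twice the ridge width as in this embodiment):

```python
import math

def select_reference_pairs(pairs, ridge_width):
    """Steps D1-D4: return the K reference pairs that are mutually consistent."""
    def consistent(kept):
        # Step D1: distances between every two kept points must agree in both images.
        for a in range(len(kept)):
            for b in range(a + 1, len(kept)):
                (pa, qa), (pb, qb) = kept[a], kept[b]
                d_p = math.dist(pa, pb)   # distance in the database image Tp
                d_q = math.dist(qa, qb)   # distance in the image to be identified Tq
                if abs(d_p - d_q) >= 2 * ridge_width:
                    return False
        return True

    kept = list(pairs)
    # Step D2: drop the lowest-similarity pair (last element) until consistent or N = 1.
    while len(kept) > 1 and not consistent(kept):
        kept.pop()
    return kept   # Step D4: K = len(kept)
```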
Embodiment five: on the basis of embodiment one, this embodiment further specifies the method of ridge-line matching between the corrected image to be identified T'q and the database image Tp in step A6:
Step E1: discretize every valid ridge line in the database image Tp and in the corrected image to be identified T'q, i.e. extract one ridge feature point every L points along the ridge line and record the x-coordinate, y-coordinate, frequency and direction angle of each extracted point, where L is a natural number; a valid ridge line is a ridge line whose length (number of pixels) is greater than a fixed value and which has a feature point as its start or end point, a bifurcation point being regarded as the intersection of several ridge lines;
Step E2: match the ridge feature points extracted in step E1 between the database image Tp and the corrected image T'q, judging by the four feature quantities of the points; for any two feature points, check whether the four feature quantities satisfy the following conditions: 1) the feature-point types are identical; 2) the deviation of the distances from the feature points to the circle centre is less than three times the fingerprint ridge width; 3) the frequency difference is less than 0.03; 4) the direction-angle difference is less than 10 degrees; two feature points satisfying all four conditions form a matching pair, and the two ridge lines on which the two feature points lie are two mutually corresponding ridge lines;
Step E3: let M1 and M2 be the numbers of points extracted in step E2 on two corresponding ridge lines of the database image Tp and the corrected image T'q; their coordinates are {(x1, y1), (x2, y2), ..., (xM1, yM1)} and {(x'1, y'1), (x'2, y'2), ..., (x'M2, y'M2)}, their frequencies {f1, f2, ..., fM1} and {f'1, f'2, ..., f'M2}, and their direction angles {θ1, θ2, ..., θM1} and {θ'1, θ'2, ..., θ'M2}; set M = min{M1, M2};
Step E4: compute [(x1 − x'1)² + (y1 − y'1)² + (x2 − x'2)² + (y2 − y'2)² + ... + (xM − x'M)² + (yM − y'M)²] / M and denote it R1; judge whether this value is less than a threshold, which does not exceed four times the fingerprint ridge width; if yes, go to step E5; if no, the two corresponding ridge lines are considered not to match;
Step E5: compute [|f1 − f'1| + |f2 − f'2| + ... + |fM − f'M|] / M and denote it R2; judge whether this value is less than a threshold, which is a small value, for example 0.023; if yes, go to step E6; if no, the two corresponding ridge lines are considered not to match;
Step E6: compute [|θ1 − θ'1| + |θ2 − θ'2| + ... + |θM − θ'M|] / M and denote it R3; judge whether this value is less than a threshold, which is a small value, for example 6 degrees; if yes, go to step E7; if no, the two corresponding ridge lines are considered not to match;
Step E7: compute the length-weighted average Vmean of the two corresponding ridge lines and judge whether it is less than a threshold; the threshold may be chosen as four times the fingerprint ridge width plus 7; if yes, the two corresponding ridge lines are considered to match; if no, they are considered not to match; then take another ridge line and return to step E1.
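Steps E3 to E6 compare two corresponding ridge lines sample by sample. The sketch below computes R1, R2 and R3 over the first M = min(M1, M2) samples and applies the example thresholds given in this embodiment (four times the ridge width, 0.023, 6 degrees); the per-sample layout (x, y, frequency, direction angle) is an assumption.

```python
def ridge_pair_scores(ridge_p, ridge_q):
    """Steps E3-E6: R1 (mean squared coordinate distance), R2 (mean frequency
    difference) and R3 (mean direction-angle difference) of two corresponding
    ridge lines, each given as a list of (x, y, freq, theta) samples."""
    m = min(len(ridge_p), len(ridge_q))
    r1 = sum((ridge_p[i][0] - ridge_q[i][0]) ** 2 +
             (ridge_p[i][1] - ridge_q[i][1]) ** 2 for i in range(m)) / m
    r2 = sum(abs(ridge_p[i][2] - ridge_q[i][2]) for i in range(m)) / m
    r3 = sum(abs(ridge_p[i][3] - ridge_q[i][3]) for i in range(m)) / m
    return r1, r2, r3, m

def ridges_match(ridge_p, ridge_q, ridge_width, r2_max=0.023, r3_max=6.0):
    """Threshold tests of steps E4-E6 (the length-weighted check of step E7 is
    applied over all matched ridges, see embodiment six)."""
    r1, r2, r3, _ = ridge_pair_scores(ridge_p, ridge_q)
    return r1 < 4 * ridge_width and r2 < r2_max and r3 < r3_max
```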
Embodiment six: on the basis of embodiment five, this embodiment further specifies the computation of the weighted average described in step A8, which consists of the following steps:
Step F1: for each pair of matched ridge lines, compute the value 6 × R1 + 2 × R2 + 2 × R3 and denote it V;
Step F2: for each pair of matched ridge lines, determine the weight of the ridge line according to the value of M as follows: if M ≤ J1, the weight is 1; if J1 < M ≤ J2, the weight is 2; if M > J2, the weight is 3, where J1 and J2 both denote ridge-line lengths;
Step F3: multiply the V value of each ridge line by its weight, add the products, and divide the sum by the total weight of all matched ridge lines; this gives the weighted average Vmean over all matched ridge-line pairs.
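Steps F1 to F3 combine the per-ridge scores into the final decision value. A minimal sketch, reusing the (R1, R2, R3, M) tuples from the previous sketch and treating the length breakpoints J1 and J2 as parameters:

```python
def weighted_mean_score(matched_ridges, j1, j2):
    """Steps F1-F3: length-weighted average V_mean over all matched ridge pairs."""
    total, weight_sum = 0.0, 0
    for r1, r2, r3, m in matched_ridges:
        v = 6 * r1 + 2 * r2 + 2 * r3                    # step F1
        w = 1 if m <= j1 else (2 if m <= j2 else 3)     # step F2: weight by ridge length
        total += v * w
        weight_sum += w
    # Step F3: weighted sum divided by the total weight of all matched ridge lines.
    return total / weight_sum if weight_sum else float("inf")
```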

Claims (6)

1. A fingerprint identification method combining point with line, characterized in that it is realized by the following steps:
Step A1: in the database image Tp, take the centre point of Tp as the centre of a circular region of radius R, where R is a real number greater than zero, and search for matching points between this circular region and the corresponding region of the image to be identified Tq; for each feature point in the circular region of Tp, traverse all points in the circular region of Tq and choose the point that matches the feature point; the two points together are recorded as a pair of matching points;
Step A2: sort the matching point pairs obtained in step A1 by matching similarity and select the N pairs with the highest matching similarity, where N is a natural number;
Step A3: from the N pairs of matching points, select K pairs as reference points, where K is a natural number smaller than N;
Step A4: for each pair of reference points, take the differences of the x-coordinates, y-coordinates and direction angles of the two feature points, giving three differences Δx, Δy and Δθ; then compute the mean values of Δx, Δy and Δθ over the K reference pairs, and use these mean values as the offsets of the x-coordinates, y-coordinates and direction angles of all matching points between the database image Tp and the image to be identified Tq;
Step A5: apply a translation and rotation transform to the image to be identified Tq, taking the differences of its x-coordinates, y-coordinates and direction angles with the mean values of Δx, Δy and Δθ obtained in step A4 to complete the image correction and obtain the corrected image to be identified T'q;
Step A6: perform ridge-line matching between the corrected image T'q and the database image Tp, and obtain the number of matched ridge lines;
Step A7: judge whether the number of matched ridge lines obtained in step A6 is greater than or equal to a threshold; if yes, go to step A8; if no, judge that the database image Tp and the image to be identified Tq do not match, replace the database image Tp and return to step A1;
Step A8: compute the length-weighted average over all matched ridge lines and judge whether this average is less than a threshold; if yes, the database image Tp and the image to be identified Tq are considered to match; if no, they are considered not to match, so replace the database image Tp and return to step A1.
2. The fingerprint identification method combining point with line according to claim 1, characterized in that the method of finding matching points described in step A1 consists of the following steps:
Step B1: compare four feature quantities of two feature points within the circular regions of the database image Tp and the image to be identified Tq: the feature-point type, the distance from the feature point to the circle centre, the frequency at the feature point, and the difference between the feature-point direction angle and the circle-centre direction angle; let P = {P1, P2, ..., PN} and Q = {Q1, Q2, ..., QN} denote respectively the sets of feature points within the circular regions of the database image Tp and the image to be identified Tq;
Step B2: choose a feature point Pi in the set P and traverse all feature points in the set Q; for any feature point Qj in Q, judge the four feature quantities of Pi and Qj in turn; if any one of them does not meet the requirements, Pi and Qj are not a matching pair, so replace Qj and judge again; otherwise Pi and Qj are a matching pair;
The criteria for judging whether the four feature quantities meet the requirements are:
B21: judge whether the feature-point types are identical, i.e. whether Pi and Qj are both end points or both bifurcation points;
B22: judge whether the distances from the feature points to the circle centre meet the requirement: compute the distances di and dj from Pi and Qj to the centre points of their respective images, and judge whether the difference between di and dj is less than a threshold; if yes, the requirement is met; if no, it is not;
B23: judge whether the differences between the feature-point direction angles and the circle-centre direction angles meet the requirement: compute the differences (θi − θp) and (θj − θq), and judge whether the difference between (θi − θp) and (θj − θq) is less than a threshold; if yes, the requirement is met; if no, it is not; here θi and θj are the direction angles of Pi and Qj, and θp and θq are the direction angles of the centre points of the database image Tp and the image to be identified Tq, respectively;
B24: judge whether the feature-point frequencies meet the requirement: compute the difference between the frequencies at Pi and Qj and judge whether it is less than a threshold; if yes, the requirement is met; if no, it is not.
3. The fingerprint identification method combining point with line according to claim 1, characterized in that the method of sorting the feature points by similarity described in step A2 consists of the following steps:
Step C1: for each pair of matching points Pi and Qj, compute the frequency difference Δf, the difference (di − dj) of the distances from Pi and Qj to their respective centre points, and the direction-angle difference (θi − θp) − (θj − θq);
Step C2: among the N matching pairs, find the maximum frequency difference max|Δf|, the maximum distance difference max|di − dj|, and the maximum direction-angle difference max|(θi − θp) − (θj − θq)|;
Step C3: with the quantities obtained in step C2, compute the similarity reference value of each matching pair by the formula |Δf| / max|Δf| + |di − dj| / max|di − dj| + |(θi − θp) − (θj − θq)| / max|(θi − θp) − (θj − θq)|;
Step C4: arrange the similarity reference values obtained in step C3 in ascending order; the smaller the value, the higher the similarity of the matching pair.
4. The fingerprint identification method combining point with line according to claim 1, characterized in that the method of selecting K pairs of matching points from the N pairs as reference points described in step A3 consists of the following steps:
Step D1: in the database image Tp and in the image to be identified Tq respectively, compute the distance between any two feature points belonging to the N matching pairs, and judge whether the difference between the corresponding distances is less than a threshold; if yes, go to step D3; if no, go to step D2;
Step D2: remove the pair of matching points with the lowest similarity and set N = N − 1; judge whether N equals 1; if yes, execute step D4; if no, return to step D1;
Step D3: judge whether the calculation and comparison have been performed for all feature points of the N matching pairs; if yes, execute step D4; if no, return to step D1;
Step D4: set K = N and use the K pairs of matching points as the reference points for correcting the database image Tp and the image to be identified Tq.
5. The fingerprint identification method combining point with line according to claim 1, characterized in that the method of ridge-line matching between the corrected image to be identified T'q and the database image Tp in step A6 consists of the following steps:
Step E1: discretize every valid ridge line in the database image Tp and in the corrected image to be identified T'q, i.e. extract one ridge feature point every L points along the ridge line and record the x-coordinate, y-coordinate, frequency and direction angle of each extracted point, where L is a natural number;
Step E2: match the ridge feature points extracted in step E1 between the database image Tp and the corrected image T'q by comparing four feature quantities of any two feature points: the feature-point type, the distance from the feature point to the circle centre, the frequency at the feature point, and the difference between the feature-point direction angle and the circle-centre direction angle; record all matching pairs that satisfy the conditions; the two ridge lines on which the two feature points of a matching pair lie are two mutually corresponding ridge lines;
Step E3: let M1 and M2 be the numbers of points extracted in step E2 on two corresponding ridge lines of the database image Tp and the corrected image T'q; their coordinates are {(x1, y1), (x2, y2), ..., (xM1, yM1)} and {(x'1, y'1), (x'2, y'2), ..., (x'M2, y'M2)}, their frequencies {f1, f2, ..., fM1} and {f'1, f'2, ..., f'M2}, and their direction angles {θ1, θ2, ..., θM1} and {θ'1, θ'2, ..., θ'M2}; set M = min{M1, M2};
Step E4: compute [(x1 − x'1)² + (y1 − y'1)² + (x2 − x'2)² + (y2 − y'2)² + ... + (xM − x'M)² + (yM − y'M)²] / M and denote it R1; judge whether this value is less than a threshold; if yes, go to step E5; if no, the two corresponding ridge lines are considered not to match;
Step E5: compute [|f1 − f'1| + |f2 − f'2| + ... + |fM − f'M|] / M and denote it R2; judge whether this value is less than a threshold; if yes, go to step E6; if no, the two corresponding ridge lines are considered not to match;
Step E6: compute [|θ1 − θ'1| + |θ2 − θ'2| + ... + |θM − θ'M|] / M and denote it R3; judge whether this value is less than a threshold; if yes, go to step E7; if no, the two corresponding ridge lines are considered not to match;
Step E7: compute the length-weighted average Vmean of the two corresponding ridge lines and judge whether it is less than a threshold; if yes, the two corresponding ridge lines are considered to match; if no, they are considered not to match.
6. The fingerprint identification method combining point with line according to claim 5, characterized in that the computation of the weighted average described in step A8 consists of the following steps:
Step F1: for each pair of matched ridge lines, compute the value 6 × R1 + 2 × R2 + 2 × R3 and denote it V;
Step F2: for each pair of matched ridge lines, determine the weight of the ridge line according to the value of M as follows: if M ≤ J1, the weight is 1; if J1 < M ≤ J2, the weight is 2; if M > J2, the weight is 3, where J1 and J2 both denote ridge-line lengths;
Step F3: multiply the V value of each ridge line by its weight, add the products, and divide the sum by the total weight of all matched ridge lines; this gives the weighted average Vmean over all matched ridge-line pairs.
CN200810064819A 2008-06-27 2008-06-27 Fingerprint identification method combining point with line Expired - Fee Related CN100590644C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810064819A CN100590644C (en) 2008-06-27 2008-06-27 Fingerprint identification method combining point with line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810064819A CN100590644C (en) 2008-06-27 2008-06-27 Fingerprint identification method combining point with line

Publications (2)

Publication Number Publication Date
CN101329727A true CN101329727A (en) 2008-12-24
CN100590644C CN100590644C (en) 2010-02-17

Family

ID=40205529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810064819A Expired - Fee Related CN100590644C (en) 2008-06-27 2008-06-27 Fingerprint identification method combining point with line

Country Status (1)

Country Link
CN (1) CN100590644C (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479328A (en) * 2010-11-26 2012-05-30 神盾股份有限公司 Identity verification device and method based on biological characteristics
CN103294987A (en) * 2012-03-05 2013-09-11 天津华威智信科技发展有限公司 Fingerprint matching method and fingerprint matching implementation mode
CN103530872A (en) * 2013-09-18 2014-01-22 北京理工大学 Mismatching deleting method based on angle constraint
CN104239871A (en) * 2014-09-26 2014-12-24 四川大学 Optimal quadrangle based quick fingerprint matching method
CN105205439A (en) * 2015-02-13 2015-12-30 比亚迪股份有限公司 Method for calculating area of fingerprint overlapping region and electronic device
WO2016112737A1 (en) * 2015-01-13 2016-07-21 深圳市汇顶科技股份有限公司 Fingerprint sensor and correction method thereof
CN106157891A (en) * 2016-08-15 2016-11-23 京东方科技集团股份有限公司 A kind of lines identification display device
CN107958443A (en) * 2017-12-28 2018-04-24 西安电子科技大学 A kind of fingerprint image joining method based on crestal line feature and TPS deformation models
CN108416342A (en) * 2018-05-28 2018-08-17 杭州电子科技大学 A kind of fingerprint identification method of combination minutiae point and filament structure
CN109740633A (en) * 2018-12-10 2019-05-10 厦门市美亚柏科信息股份有限公司 A kind of image similarity calculation method, device, storage medium
CN111985337A (en) * 2020-07-21 2020-11-24 江苏艾科半导体有限公司 Fingerprint identification chip testing method for improving detection efficiency and accuracy
CN112508064A (en) * 2020-11-24 2021-03-16 广州广电运通金融电子股份有限公司 Finger vein identity recognition method and device, computer equipment and storage medium
CN113379692A (en) * 2021-06-01 2021-09-10 中科晶源微电子技术(北京)有限公司 Method and device for calibrating OM and SEM coordinate relation, equipment and storage medium
CN114312666A (en) * 2021-11-22 2022-04-12 江铃汽车股份有限公司 Vehicle control method and device based on face recognition, storage medium and equipment
CN116311395A (en) * 2022-08-18 2023-06-23 荣耀终端有限公司 Fingerprint identification method and device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108764127A (en) * 2018-05-25 2018-11-06 京东方科技集团股份有限公司 Texture Recognition and its device

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479328B (en) * 2010-11-26 2014-11-05 神盾股份有限公司 Identity verification device and method based on biological characteristics
CN102479328A (en) * 2010-11-26 2012-05-30 神盾股份有限公司 Identity verification device and method based on biological characteristics
CN103294987A (en) * 2012-03-05 2013-09-11 天津华威智信科技发展有限公司 Fingerprint matching method and fingerprint matching implementation mode
CN103530872B (en) * 2013-09-18 2016-03-30 北京理工大学 A kind of error hiding delet method based on angle restriction
CN103530872A (en) * 2013-09-18 2014-01-22 北京理工大学 Mismatching deleting method based on angle constraint
CN104239871A (en) * 2014-09-26 2014-12-24 四川大学 Optimal quadrangle based quick fingerprint matching method
CN104239871B (en) * 2014-09-26 2017-06-16 四川大学 A kind of quick finger print matching method based on optimal quadrangle
WO2016112737A1 (en) * 2015-01-13 2016-07-21 深圳市汇顶科技股份有限公司 Fingerprint sensor and correction method thereof
US10643048B2 (en) 2015-01-13 2020-05-05 Shenzhen GOODIX Technology Co., Ltd. Fingerprint sensor and correction method thereof
CN105205439A (en) * 2015-02-13 2015-12-30 比亚迪股份有限公司 Method for calculating area of fingerprint overlapping region and electronic device
WO2016127736A1 (en) * 2015-02-13 2016-08-18 比亚迪股份有限公司 Computing method for area of fingerprint overlapping area and electronic apparatus
US11036977B2 (en) 2016-08-15 2021-06-15 Boe Technology Group Co., Ltd. Identity recognition display device, and array substrate and identity recognition circuit thereof
CN106157891A (en) * 2016-08-15 2016-11-23 京东方科技集团股份有限公司 A kind of lines identification display device
CN107958443A (en) * 2017-12-28 2018-04-24 西安电子科技大学 A kind of fingerprint image joining method based on crestal line feature and TPS deformation models
CN107958443B (en) * 2017-12-28 2021-12-28 西安电子科技大学 Fingerprint image splicing method based on ridge line characteristics and TPS deformation model
CN108416342A (en) * 2018-05-28 2018-08-17 杭州电子科技大学 A kind of fingerprint identification method of combination minutiae point and filament structure
CN108416342B (en) * 2018-05-28 2022-02-18 杭州电子科技大学 Fingerprint identification method combining thin node and thin line structure
CN109740633A (en) * 2018-12-10 2019-05-10 厦门市美亚柏科信息股份有限公司 A kind of image similarity calculation method, device, storage medium
CN109740633B (en) * 2018-12-10 2022-02-22 厦门市美亚柏科信息股份有限公司 Image similarity calculation method and device and storage medium
CN111985337A (en) * 2020-07-21 2020-11-24 江苏艾科半导体有限公司 Fingerprint identification chip testing method for improving detection efficiency and accuracy
CN112508064A (en) * 2020-11-24 2021-03-16 广州广电运通金融电子股份有限公司 Finger vein identity recognition method and device, computer equipment and storage medium
CN113379692A (en) * 2021-06-01 2021-09-10 中科晶源微电子技术(北京)有限公司 Method and device for calibrating OM and SEM coordinate relation, equipment and storage medium
WO2022252277A1 (en) * 2021-06-01 2022-12-08 中科晶源微电子技术(北京)有限公司 Method and apparatus for calibrating om and sem coordinate relationship, device, and storage medium
CN114312666A (en) * 2021-11-22 2022-04-12 江铃汽车股份有限公司 Vehicle control method and device based on face recognition, storage medium and equipment
CN116311395A (en) * 2022-08-18 2023-06-23 荣耀终端有限公司 Fingerprint identification method and device
CN116311395B (en) * 2022-08-18 2023-11-14 荣耀终端有限公司 Fingerprint identification method and device

Also Published As

Publication number Publication date
CN100590644C (en) 2010-02-17

Similar Documents

Publication Publication Date Title
CN100590644C (en) Fingerprint identification method combining point with line
CN100356388C (en) Biocharacteristics fusioned identity distinguishing and identification method
CN100470579C (en) Certainty coding method and system in fingerprint identification
CN108038434B (en) Video facial expression pre-detection method based on multi-example learning
Fronthaler et al. Fingerprint image-quality estimation and its application to multialgorithm verification
CN111339990A (en) Face recognition system and method based on dynamic update of face features
CN109919039B (en) Static gesture recognition method based on palm and finger characteristics
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
AU722613B2 (en) Fingerprint characteristic extraction apparatus as well as fingerprint classification apparatus and fingerprint verification apparatus for use with fingerprint characteristic extraction apparatus
CN111507206B (en) Finger vein identification method based on multi-scale local feature fusion
CN105160303A (en) Fingerprint identification method based on mixed matching
CN105224937A (en) Based on the semantic color pedestrian of the fine granularity heavily recognition methods of human part position constraint
CN108334875A (en) Vena characteristic extracting method based on adaptive multi-thresholding
CN103400109A (en) Free-hand sketch offline identification and reshaping method
CN104091145A (en) Human palm vein feature image acquisition method
CN104182748A (en) A method for extracting automatically character strokes based on splitting and matching
CN106203417A (en) A kind of adhesion character alienable RMB crown word number identification method
CN105654025A (en) Fingerprint identification method, apparatus and electronic equipment thereof
CN102799872A (en) Image processing method based on face image characteristics
CN105975906B (en) A kind of PCA static gesture identification methods based on area features
CN104077769A (en) Error matching point pair removing algorithm in image registration
CN103955950A (en) Image tracking method utilizing key point feature matching
CN104036245A (en) Biometric feature recognition method based on on-line feature point matching
CN110163894B (en) Sub-pixel level target tracking method based on feature matching
CN105956581B (en) A kind of quick human face characteristic point initial method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100217

Termination date: 20100627