CN1794266A - Identity recognition and authentication method based on biometric fusion - Google Patents

Identity recognition and authentication method based on biometric fusion. Download PDF

Info

Publication number
CN1794266A
CN1794266A (application CN 200510136310 / CN200510136310A)
Authority
CN
China
Prior art keywords
user, confidence, output, class, formula
Prior art date
Legal status: Granted
Application number
CN 200510136310
Other languages
Chinese (zh)
Other versions
CN100356388C (en)
Inventor
丁晓青
方驰
舒畅
刘长松
蒋焰
王生进
彭良瑞
Current Assignee: Tsinghua University
Original Assignee: Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CNB2005101363107A (priority patent CN100356388C)
Publication of CN1794266A
Application granted
Publication of CN100356388C
Status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)

Abstract

This invention belongs to the fields of classifier fusion and pattern recognition. Face, iris, on-line signature and off-line handwriting samples of a user are captured by the corresponding acquisition devices and sent to the respective recognition/authentication sub-modules, which extract features and match them against the stored templates. The matching scores are normalized and then either sent to an identification fusion module, which obtains the final identification result by confidence fusion; or mapped into a multi-dimensional space and classified by a classifier in an authentication fusion module to obtain the final authentication result; or passed through identification fusion followed by authentication fusion to obtain the final combined result.

Description

Identity recognition and authentication method based on biometric fusion
Technical field
The invention belongs to the fields of classifier fusion and pattern recognition, while also involving the fields of face recognition, iris recognition and handwriting recognition.
Background art
Identity authentication is a difficult problem in information security. Traditional authentication methods usually rely on passwords, certificates or other specific knowledge to grant a user access to a system. However, passwords are easily forgotten or altered, and certificates are inconvenient to carry and easily lost, so traditional authentication has serious weaknesses and safety hazards. With the growing demands of the security field, automatic identity authentication and recognition based on biometrics has become increasingly widespread and efficient; biometric products have already been adopted in many banks, airports and similar facilities. Commonly used biometrics include physiological identity characteristics such as the face, iris, fingerprint and palmprint, as well as behavioral characteristics such as signature, handwriting and gait. Compared with conventional authentication, the defining property of biometric recognition is that it authenticates and identifies features of the user's own person, so it is forgery-resistant, convenient, and cannot be forgotten or lost.
Each biometric authentication and recognition modality differs in accuracy, user acceptance and cost; each has its own advantages and disadvantages and suits different application scenarios. For the user, identification and authentication by face is the friendliest and least intrusive approach; iris recognition and authentication has proven to be the most reliable, stable and accurate; on-line signature and handwriting recognition systems are also widely accepted because acquisition is convenient and operation is simple. But each system also faces problems: face recognition is very sensitive to factors such as illumination, pose and expression; iris systems place high quality requirements on the captured sample, are not easy to operate during acquisition, and may fail in practice because the captured iris sample is of too poor quality or the user suffers from an eye disease; on-line signature systems are affected by differences between acquisition devices and by how well a user adapts to a given device; and for off-line handwriting recognition, even the same user's signature and handwriting vary considerably across time and across the user's physical condition, to say nothing of the forgery and impersonation such systems face.
These problems can be addressed effectively by fusing multiple biometric recognition and authentication systems. Fusing several classifiers not only prevents the errors produced when a single biometric system fails and lowers the total classification error rate; fusing in a system based on a behavioral characteristic, such as the on-line signature system, also provides a form of liveness detection for the recognition of the other features and prevents certain forgery attacks. At present most research on biometric fusion concentrates on fusing two modalities, or modalities of the same kind, such as iris with face, or fingerprint with palmprint. Research on fusing three or more modalities, and in particular on fusing the physiological identity information of face and iris with the behavioral information of handwritten signatures, is actually rare.
Summary of the invention
The present invention combines the face and iris recognition/authentication systems, which process physiological identity information, with the on-line signature and off-line handwriting recognition/authentication systems, which process behavioral information, and makes the final decision from the fused result, thereby improving the overall identification and authentication accuracy.
The first aspect of the invention is an identification process that contains the following steps in sequence:
Step 1: install on a computer a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and load into the computer the database corresponding to each module;
Step 2: capture the face of the unknown user Z with a camera, the iris of user Z with an iris acquisition device, the on-line signature of user Z with a handwriting pad or touch screen, and the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1; extract features with the corresponding recognition/authentication modules, match them against the biometric templates of the users already in each module's database, and output the matching score of each module;
Step 3: perform biometric identification fusion with the computer of step 1, proceeding through the following steps in sequence:
Step 31: using the existing databases of the modules of step 2, build a sufficiently large training set A of the output scores of the sub-modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: feed each score (a distance or a similarity) output by the sub-modules to be fused in the training set of step 31 into the normalization module preset in the chosen computer and convert it into a confidence, proceeding as follows:
First, convert the output score of each sub-module to be fused into a generalized confidence by the given generalized-confidence estimation formulas:
Assume: the database contains N users, denoted classes $\omega_1, \omega_2, \ldots, \omega_N$ respectively;
there are R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ is the complete set of feature vectors of a training-set user X before matching, where $\vec{x}_i$ is the feature vector of this user extracted by the i-th classifier (i = 1, 2, …, R);
for a classifier whose raw output is a distance, $d_j(\vec{x}_i)$ denotes the minimal matching distance between the feature vector $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database (j = 1, 2, …, N);
for a classifier whose raw output is a similarity, $s_j(\vec{x}_i)$ denotes the maximal matching similarity between the feature vector $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database (j = 1, 2, …, N).
Then the raw output scores of each sub-module can be converted into generalized confidences by the given generalized-confidence estimation formulas. For the i-th classifier, the generalized confidence that user X is identified as the user represented by class $\omega_j$ is denoted $g(\omega_j|\vec{x}_i)$.
For a biometric recognition sub-module whose raw output is a distance, formula (1-1) converts its output score $d_j(\vec{x}_i)$ into a generalized confidence:

$$g(\omega_j|\vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k\neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (1\text{-}1)$$

For a biometric recognition sub-module whose raw output is a similarity, formula (1-2) converts its output score $s_j(\vec{x}_i)$ into a generalized confidence:

$$g(\omega_j|\vec{x}_i) = 1 - \frac{\max_{k\neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (1\text{-}2)$$
Next, after all scores output by a given biometric recognition sub-module have been converted into generalized confidences, convert the generalized confidences into confidences through the following mapping function f(y) from generalized confidence to confidence:
Let T be the range of the generalized confidence obtained from the sub-module, and let user X belong to the sufficiently large training set A prepared for this sub-module (X ∈ A). Put $y = g(\omega_j|\vec{x}_i)$; for any y ∈ T, let [y−δ, y+δ] be a small interval around y. Then:

$$f(y) = \frac{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta],\ X\in\omega_j\})}{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta]\})} \qquad (1\text{-}3)$$

In formula (1-3) the denominator of f(y) is the total number of samples whose generalized confidence falls in the small interval [y−δ, y+δ]; the numerator is the number of samples whose generalized confidence falls in [y−δ, y+δ] and that are correctly recognized.
For any y ∈ T, once f(y) has been computed, a generalized confidence can be converted into a confidence. For the i-th classifier, the confidence that user X is identified as the user represented by class $\omega_j$ is also a posterior probability, denoted $p(\omega_j|\vec{x}_i)$:

$$p(\omega_j|\vec{x}_i) = f(g(\omega_j|\vec{x}_i)).$$
Step 33: for user Z, for all values of i (i = 1, 2, …, R) and a given j (j = 1, 2, …, N), compute $p(\omega_j|\vec{z}_i)$ by the confidence conversion of step 32, then substitute the values into the discriminant function of the identification fusion system for the j-th user, formula (1-4) or formula (1-5), to obtain the fused score with which the fusion system judges user Z to be the j-th user:

$$g_j(Z) = \prod_{i=1}^{R} p(\omega_j|\vec{z}_i) \qquad (1\text{-}4)$$

$$g_j(Z) = \sum_{i=1}^{R} p(\omega_j|\vec{z}_i) \qquad (1\text{-}5)$$
Step 34: compute $g_j(Z)$ for every value of j according to step 33 (using formula (1-4) throughout or formula (1-5) throughout), then arrange the values of $g_j(Z)$ over the different j in descending order:

$$g_{j_1}(Z) > g_{j_2}(Z) > \cdots > g_{j_N}(Z);$$

then take user $j_1$ as the first-choice identification result for user Z, user $j_2$ as the second choice, and so on.
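A sketch of steps 33 and 34 under the same assumptions, fusing an R×N matrix of confidences by the product rule (1-4) or the sum rule (1-5) and ranking the candidate users; the function name is illustrative:

```python
import numpy as np

def fuse_and_rank(conf, rule="product"):
    """conf[i, j] = p(omega_j | z_i): confidence of classifier i that user Z
    is the j-th user. Returns the user indices ranked best-first."""
    g = conf.prod(axis=0) if rule == "product" else conf.sum(axis=0)  # (1-4)/(1-5)
    return np.argsort(-g)      # j_1, j_2, ... in descending order of g_j(Z)
```

The product rule matches the independence assumption for the sub-classifiers stated later in the embodiment; the first entry of the returned ranking is the first-choice result.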
The second aspect of the invention is an authentication process that contains the following steps in sequence:
Step 1: install on a computer a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and load into the computer the database corresponding to each module;
Step 2: capture the face of the unknown user Z with a camera, the iris of user Z with an iris acquisition device, the on-line signature of user Z with a handwriting pad or touch screen, and the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1; extract features with the corresponding recognition/authentication modules, match them against the biometric templates of the users already in each module's database, and output the matching score of each module;
Step 3: perform biometric authentication fusion with the computer of step 1, proceeding through the following steps in sequence:
Step 31: using the existing databases of the modules of step 2, build a sufficiently large training set A of the output scores of the sub-modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: normalize the raw output scores of each biometric sub-module for all users in training set A, so that after normalization the new scores output by each biometric authentication sub-module are all mapped into the interval [0, 1];
Step 33: treat the new scores output by each biometric authentication sub-module after step 32 as vectors in a multi-dimensional space, and compute the best projection direction from that space to its one-dimensional subspace, proceeding as follows:
First, assume:
the database contains N users;
there are R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\vec{X} = (x_1, x_2, \ldots, x_R)^T$ is the R-dimensional vector formed by all the output scores of the biometrics of a training-set user X after matching by each biometric authentication sub-module and normalization ($(x_1, x_2, \ldots, x_R)^T$ denotes the transpose of the row vector $(x_1, x_2, \ldots, x_R)$); component $x_i$ is the normalized output score of the i-th classifier (i = 1, 2, …, R);
all legitimate users (genuine samples) in training set A form class $\omega_0$, and the illegitimate users (impostor samples) form class $\omega_1$;
suppose class $\omega_0$ contains $N_0$ samples, forming the subset $A_0$ of training set A, and class $\omega_1$ contains $N_1$ samples, forming the subset $A_1$ ($A = A_0 \cup A_1$, $N = N_0 + N_1$);
The sets $A_0$ and $A_1$ can both be regarded as sets of column vectors in the R-dimensional space; the best projection direction from the R-dimensional space to its one-dimensional subspace is computed by formula (2-1):

$$\vec{w}^* = S_w^{-1}(\vec{m}_0 - \vec{m}_1) \qquad (2\text{-}1)$$

where $\vec{m}_k = \frac{1}{N_k}\sum_{X\in A_k}\vec{X}$, k = 0, 1, are the means of the genuine and impostor samples respectively;
$S_k = \sum_{X\in A_k}(\vec{X}-\vec{m}_k)(\vec{X}-\vec{m}_k)^T$, k = 0, 1, where $S_0$ and $S_1$ are the within-class scatter matrices of the genuine and impostor samples respectively;
$S_w = P(\omega_0)S_0 + P(\omega_1)S_1$ is the total within-class scatter matrix, where $P(\omega_0)$ and $P(\omega_1)$ are the prior probabilities of the genuine and impostor classes; in the concrete implementation $P(\omega_0) = P(\omega_1) = 0.5$;
$S_w^{-1}$ denotes the inverse of $S_w$;
Step 34: using the best projection direction $\vec{w}^*$ computed in step 33, construct the projection matrix PS and project all elements of training set A into the set Y in the one-dimensional subspace;
First, construct the projection matrix PS: $PS = \vec{w}^*\left((\vec{w}^*)^T\vec{w}^*\right)^{-1}(\vec{w}^*)^T$;
Second, transform each element (column vector) $\vec{X}$ of training set A into the set Y of the one-dimensional subspace by formula (2-2):

$$\vec{y} = PS\,\vec{X} \qquad (2\text{-}2)$$
After all elements of set A have been transformed into set Y by formula (2-2), the samples of the subsets $A_0$, $A_1$ of A have been transformed into the subsets $Y_0$, $Y_1$ of Y respectively; the means of the two classes of samples in $Y_0$, $Y_1$ can then be computed according to formula (2-3),

$$\vec{m}'_k = \frac{1}{N_k}\sum_{\vec{y}\in Y_k}\vec{y}, \qquad k = 0, 1 \qquad (2\text{-}3)$$

and from them the standard deviations of the two classes of samples:

$$\sigma_k = \sqrt{\frac{1}{N_k}\sum_{\vec{y}\in Y_k} d(\vec{y}, \vec{m}'_k)^2}, \qquad k = 0, 1 \qquad (2\text{-}4)$$

where $d(\vec{y}, \vec{m}'_k)$ denotes the Euclidean distance between the vectors $\vec{y}$ and $\vec{m}'_k$; since everything lies in the one-dimensional subspace of the R-dimensional space, $\vec{y}$ and $\vec{m}'_k$ are still R-dimensional vectors, and writing them in component form as $\vec{y} = (y_1, y_2, \ldots, y_R)^T$ and $\vec{m}'_k = (m'_{k,1}, m'_{k,2}, \ldots, m'_{k,R})^T$ gives

$$d(\vec{y}, \vec{m}'_k) = \sqrt{\sum_{i=1}^{R}(y_i - m'_{k,i})^2} \qquad (2\text{-}5)$$
Step 35: for user Z (whose normalized classifier outputs form the vector $\vec{Z} = (z_1, z_2, \ldots, z_R)^T$), make the classification decision according to the following rule, using the preset threshold $\vec{y}_0$ and the $\vec{w}^*$ computed in step 33, and decide whether user Z belongs to class $\omega_0$ or class $\omega_1$:
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* > 0$, then $Z \in \omega_0$;
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* < 0$, then $Z \in \omega_1$.
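Steps 33 and 35 can be sketched as follows, assuming genuine and impostor score matrices `X0` and `X1` of shapes (N0, R) and (N1, R); this is an illustrative reading of formula (2-1) and the decision rule, not the patent's code:

```python
import numpy as np

def fisher_direction(X0, X1, p0=0.5, p1=0.5):
    """Formula (2-1): w* = Sw^{-1} (m0 - m1). Rows of X0 (genuine) and
    X1 (impostor) are normalized R-dimensional score vectors."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = (X0 - m0).T @ (X0 - m0)     # within-class scatter, genuine samples
    S1 = (X1 - m1).T @ (X1 - m1)     # within-class scatter, impostor samples
    Sw = p0 * S0 + p1 * S1           # total within-class scatter matrix
    return np.linalg.solve(Sw, m0 - m1)

def authenticate(z, w, y0):
    """Step 35: accept Z as legitimate iff (z - y0) . w* > 0."""
    return (z - y0) @ w > 0
```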
The third aspect of the invention is a combined identification-and-authentication process that contains the following steps in sequence:
Step 1: install on a computer a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and load into the computer the database corresponding to each module;
Step 2: capture the face of the unknown user Z with a camera, the iris of user Z with an iris acquisition device, the on-line signature of user Z with a handwriting pad or touch screen, and the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1; extract features with the corresponding recognition/authentication modules, match them against the biometric templates of the users already in each module's database, and output the matching score of each module;
Step 3: first perform biometric identification fusion with the computer of step 1, proceeding through the following steps in sequence:
Step 31: using the existing databases of the modules of step 2, build a sufficiently large training set A of the output scores of the sub-modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: feed each score (a distance or a similarity) output by the sub-modules to be fused in the training set of step 31 into the normalization module preset in the chosen computer and convert it into a confidence, proceeding as follows:
First, convert the output score of each sub-module to be fused into a generalized confidence by the given generalized-confidence estimation formulas:
Assume: the database contains N users, denoted classes $\omega_1, \omega_2, \ldots, \omega_N$ respectively;
there are R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ is the complete set of feature vectors of a training-set user X before matching, where $\vec{x}_i$ is the feature vector of this user extracted by the i-th classifier (i = 1, 2, …, R);
for a classifier whose raw output is a distance, $d_j(\vec{x}_i)$ denotes the minimal matching distance between the feature vector $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database (j = 1, 2, …, N);
for a classifier whose raw output is a similarity, $s_j(\vec{x}_i)$ denotes the maximal matching similarity between the feature vector $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database (j = 1, 2, …, N).
Then the raw output scores of each sub-module can be converted into generalized confidences by the given generalized-confidence estimation formulas. For the i-th classifier, the generalized confidence that user X is identified as the user represented by class $\omega_j$ is denoted $g(\omega_j|\vec{x}_i)$.
For a biometric recognition sub-module whose raw output is a distance, formula (3-1) converts its output score $d_j(\vec{x}_i)$ into a generalized confidence:

$$g(\omega_j|\vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k\neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (3\text{-}1)$$

For a biometric recognition sub-module whose raw output is a similarity, formula (3-2) converts its output score $s_j(\vec{x}_i)$ into a generalized confidence:

$$g(\omega_j|\vec{x}_i) = 1 - \frac{\max_{k\neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (3\text{-}2)$$
Next, after all scores output by a given biometric recognition sub-module have been converted into generalized confidences, convert the generalized confidences into confidences through the following mapping function f(y) from generalized confidence to confidence:
Let T be the range of the generalized confidence obtained from the sub-module, and let user X belong to the sufficiently large training set A prepared for this sub-module (X ∈ A). Put $y = g(\omega_j|\vec{x}_i)$; for any y ∈ T, let [y−δ, y+δ] be a small interval around y. Then:

$$f(y) = \frac{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta],\ X\in\omega_j\})}{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta]\})} \qquad (3\text{-}3)$$

In formula (3-3) the denominator of f(y) is the total number of samples whose generalized confidence falls in the small interval [y−δ, y+δ]; the numerator is the number of samples whose generalized confidence falls in [y−δ, y+δ] and that are correctly recognized.
For any y ∈ T, once f(y) has been computed, a generalized confidence can be converted into a confidence. For the i-th classifier, the confidence that user X is identified as the user represented by class $\omega_j$ is also a posterior probability, denoted $p(\omega_j|\vec{x}_i)$:

$$p(\omega_j|\vec{x}_i) = f(g(\omega_j|\vec{x}_i)).$$
Step 33: for user Z, for all values of i (i = 1, 2, …, R) and a given j (j = 1, 2, …, N), compute $p(\omega_j|\vec{z}_i)$ by the confidence conversion of step 32, then substitute the values into the discriminant function of the identification fusion system for the j-th user, formula (3-4) or formula (3-5), to obtain the fused score with which the fusion system judges user Z to be the j-th user:

$$g_j(Z) = \prod_{i=1}^{R} p(\omega_j|\vec{z}_i) \qquad (3\text{-}4)$$

$$g_j(Z) = \sum_{i=1}^{R} p(\omega_j|\vec{z}_i) \qquad (3\text{-}5)$$
Step 34: compute $g_j(Z)$ for every value of j according to step 33 (using formula (3-4) throughout or formula (3-5) throughout), then arrange the values of $g_j(Z)$ over the different j in descending order:

$$g_{j_1}(Z) > g_{j_2}(Z) > \cdots > g_{j_N}(Z);$$

then take user $j_1$ as the first-choice identification result for user Z, user $j_2$ as the second choice, and so on.
Step 4: among the identification results of step 34, select the top K results (K ≤ N); suppose the K selected results identify user Z as the $j_1$-th user, the $j_2$-th user, …, the $j_K$-th user in the database. Then, with the computer of step 1, perform biometric authentication fusion on the result that identifies user Z as the $j_l$-th user in the database (l = 1, 2, …, K), proceeding through the following steps in sequence:
Step 41: after step 32, the raw output scores of every biometric sub-module for all users in training set A have already been mapped into confidences in the interval [0, 1]; now treat these confidences as vectors in a multi-dimensional space, and compute the best projection direction from that space to its one-dimensional subspace, proceeding as follows:
First, assume:
the database contains N users;
there are R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\vec{X} = (x_1, x_2, \ldots, x_R)^T$ is the R-dimensional vector formed by all the outputs of the biometrics of a training-set user X after matching by each biometric authentication sub-module and conversion to confidence ($(x_1, x_2, \ldots, x_R)^T$ denotes the transpose of the row vector $(x_1, x_2, \ldots, x_R)$); component $x_i$ is the confidence output by the i-th classifier (i = 1, 2, …, R);
all samples of the $j_l$-th user in training set A form class $\omega_0$, and all samples of the other users in training set A form class $\omega_1$;
suppose class $\omega_0$ contains $N_0$ samples, forming the subset $A_0$ of training set A, and class $\omega_1$ contains $N_1$ samples, forming the subset $A_1$ ($A = A_0 \cup A_1$, $N = N_0 + N_1$);
The sets $A_0$ and $A_1$ can both be regarded as sets of column vectors in the R-dimensional space; the best projection direction from the R-dimensional space to its one-dimensional subspace is computed by formula (3-6):

$$\vec{w}^* = S_w^{-1}(\vec{m}_0 - \vec{m}_1) \qquad (3\text{-}6)$$

where $\vec{m}_k = \frac{1}{N_k}\sum_{X\in A_k}\vec{X}$, k = 0, 1, are the means of the genuine and impostor samples respectively;
$S_k = \sum_{X\in A_k}(\vec{X}-\vec{m}_k)(\vec{X}-\vec{m}_k)^T$, k = 0, 1, where $S_0$ and $S_1$ are the within-class scatter matrices of the genuine and impostor samples respectively;
$S_w = P(\omega_0)S_0 + P(\omega_1)S_1$ is the total within-class scatter matrix, where $P(\omega_0)$ and $P(\omega_1)$ are the prior probabilities of the genuine and impostor classes; in the concrete implementation $P(\omega_0) = P(\omega_1) = 0.5$;
$S_w^{-1}$ denotes the inverse of $S_w$;
Step 42: using the best projection direction $\vec{w}^*$ computed in step 41, construct the projection matrix PS and project all elements of training set A into the set Y in the one-dimensional subspace;
First, construct the projection matrix PS: $PS = \vec{w}^*\left((\vec{w}^*)^T\vec{w}^*\right)^{-1}(\vec{w}^*)^T$;
Second, transform each element (column vector) $\vec{X}$ of training set A into the set Y of the one-dimensional subspace by formula (3-7):

$$\vec{y} = PS\,\vec{X} \qquad (3\text{-}7)$$
After all elements of set A have been transformed into set Y by formula (3-7), the samples of the subsets $A_0$, $A_1$ of A have been transformed into the subsets $Y_0$, $Y_1$ of Y respectively; the means of the two classes of samples in $Y_0$, $Y_1$ can then be computed according to formula (3-8),

$$\vec{m}'_k = \frac{1}{N_k}\sum_{\vec{y}\in Y_k}\vec{y}, \qquad k = 0, 1 \qquad (3\text{-}8)$$

and from them the standard deviations of the two classes of samples:

$$\sigma_k = \sqrt{\frac{1}{N_k}\sum_{\vec{y}\in Y_k} d(\vec{y}, \vec{m}'_k)^2}, \qquad k = 0, 1$$

where $d(\vec{y}, \vec{m}'_k)$ denotes the Euclidean distance between the vectors $\vec{y}$ and $\vec{m}'_k$; since everything lies in the one-dimensional subspace of the R-dimensional space, $\vec{y}$ and $\vec{m}'_k$ are still R-dimensional vectors, and in component form $\vec{y} = (y_1, y_2, \ldots, y_R)^T$, $\vec{m}'_k = (m'_{k,1}, \ldots, m'_{k,R})^T$, the distance is $d(\vec{y}, \vec{m}'_k) = \sqrt{\sum_{i=1}^{R}(y_i - m'_{k,i})^2}$, as in formula (2-5);
Step 43: for user Z (whose classifier confidence outputs form the vector $\vec{Z} = (z_1, z_2, \ldots, z_R)^T$), make the classification decision according to the following rule, using the preset threshold $\vec{y}_0$ and the $\vec{w}^*$ computed in step 41, and decide whether user Z belongs to class $\omega_0$ or class $\omega_1$:
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* > 0$, then $Z \in \omega_0$;
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* < 0$, then $Z \in \omega_1$;
Step 44: if step 43 decides $Z \in \omega_0$, set the l-th component of a K-dimensional indicator vector $\vec{v}$ to 1; otherwise set it to 0;
Step 5: after the authentication fusion of step 4 has been completed for every $j_l$, judge user Z to be a legitimate user if and only if exactly one component of the vector $\vec{v}$ has the value 1; otherwise judge user Z to be an illegitimate user.
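A compact sketch of steps 4 and 5, where `authenticate_as(Z, j)` is a hypothetical stand-in for the per-candidate Fisher decision of step 43:

```python
def identify_then_authenticate(ranking, Z, K, authenticate_as):
    """Steps 4 and 5: authenticate the top-K identification candidates;
    Z is accepted iff exactly one candidate passes authentication."""
    candidates = list(ranking[:K])
    v = [1 if authenticate_as(Z, j) else 0 for j in candidates]   # step 44
    if sum(v) == 1:                                               # step 5
        return candidates[v.index(1)]     # the unique accepted identity
    return None                           # zero or several passed: reject
```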
Experiments show that after the identification and authentication fusion modules of the multimodal biometric fusion system, whichever concrete fusion scheme is chosen, the total classification error rate after fusion is lower than that of any single biometric system without fusion; this is especially true for biometric authentication systems whose stand-alone authentication rate is not very high.
Description of drawings
Fig. 1: system framework of the multimodal biometric fusion system.
Fig. 2: normalization module. 2a: normalization by confidence conversion; 2b: normalization by the min-max method.
Fig. 3: fusion module. 3a: identification fusion module; 3b: authentication fusion module; 3c: identification-authentication fusion module.
Fig. 4: function curve of the mapping from generalized confidence to confidence (face module).
Fig. 5: classification by the linear classifier based on the Fisher criterion. 5a: classification of the output scores of the face and iris modules; 5b: classification of the output scores of the face, on-line signature and off-line handwriting modules.
Embodiment
In the multimodal biometric fusion system of the present invention, the fusion module is the key component.
First, the fusion module is located at the matching level; in other words, the input of the fusion module is the scores output by the biometric recognition/authentication sub-modules after they complete template matching. This differs both from feature-level fusion, which takes as input the feature vectors output by the different biometric systems, and from decision-level fusion, which takes as input the logical variables output by those systems.
Second, according to the different requirements of identification and authentication, the fusion module is correspondingly divided into an identification fusion module, an authentication fusion module and an identification-authentication fusion module. The identification fusion module must answer the question "who is the user?", i.e. determine the user's identity; the authentication fusion module only needs to make a "yes" or "no" judgement, i.e. decide whether the user is the identity he claims; the identification-authentication fusion module first determines who the user is, and then judges the user's legitimacy on that basis.
Features of the identification fusion module
To decide who the user actually is, we need the confidences of the user under test with respect to all users in the database, and then determine the final identification result from the magnitudes of these confidences. Because for one user the individual features, such as iris, face or signature, can be regarded as mutually independent, the biometric sub-classifiers can also be considered mutually independent. For mutually independent classifiers we use formula (1-4) or formula (1-5) of the Summary as the classification discriminant function of the fusion system. At the final identification stage, the class that maximizes $g_j(Z)$ is selected as the first-choice identification result, the class with the second-largest $g_j(Z)$ as the second choice, and so on.
Suppose the database contains N users, denoted classes $\omega_1, \omega_2, \ldots, \omega_N$ respectively, and suppose there are R biometric recognition/authentication sub-modules (sub-classifiers) to be fused, classifying the R kinds of biometric patterns respectively. For a user X in the training set, let $\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ be the complete set of feature vectors before matching by the sub-modules, where $\vec{x}_i$ is the feature vector this user yields in the i-th classifier. For the i-th classifier, the confidence that user X is identified as the user of class $\omega_j$ is a posterior probability, denoted $p(\omega_j|\vec{x}_i)$; it can also be regarded as the confidence with which the i-th classifier judges user X to be the j-th user, and it reflects the reliability of that judgement. Because fusion in the present invention takes place at the matching level, and the score each biometric sub-module outputs after template matching is either a distance or a similarity, normalizing these scores into confidences takes the following two steps:
1. convert the output of each sub-module into a generalized confidence by the given generalized-confidence estimation formulas;
2. convert the generalized confidence into a confidence by the mapping function f(y) from generalized confidence to confidence.
For a given biometric recognition sub-module (sub-classifier), say the i-th, let $g(\omega_j|\vec{x}_i)$ be the generalized confidence of identifying user X as the user of class $\omega_j$; the corresponding confidence is $p(\omega_j|\vec{x}_i) = f(g(\omega_j|\vec{x}_i))$. For the generalized-confidence formula of step 1 we use formula (1-1) or formula (1-2) of the Summary, rewritten here:

$$g(\omega_j|\vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k\neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N$$

$$g(\omega_j|\vec{x}_i) = 1 - \frac{\max_{k\neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N$$

When the score a biometric sub-module outputs after template matching is a distance, formula (1-1) converts the output distance into a generalized confidence, where $d_j(\vec{x}_i)$ denotes the minimal matching distance between the feature vector $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database; when the score output after matching is a similarity, formula (1-2) converts the output similarity into a generalized confidence, where $s_j(\vec{x}_i)$ denotes the maximal matching similarity between $\vec{x}_i$ and the feature-vector templates of the j-th user (class $\omega_j$) in the database.
For the i-th biometric recognition sub-module, let T be the range of the generalized confidence obtained; put $y = g(\omega_j|\vec{x}_i)$ and take, for any y ∈ T, the small interval [y−δ, y+δ] around y. The mapping function f(y) of step 2 is then estimated by formula (1-3) of the Summary, rewritten here:

$$f(y) = \frac{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta],\ X\in\omega_j\})}{\sum_{j=1}^{N}\mathrm{count}(\{X \mid X\in A,\ g(\omega_j|\vec{x}_i)\in[y-\delta,y+\delta]\})}$$

The function count(·) in this formula counts the number of elements in a set. The denominator is in fact the total number of samples whose generalized confidence falls in the interval [y−δ, y+δ]; the numerator is the number of samples whose generalized confidence falls in [y−δ, y+δ] and that are correctly recognized.
In this way, once the confidences $p(\omega_j|\vec{x}_i) = f(g(\omega_j|\vec{x}_i))$ of the sub-classifiers have been estimated for all i and j, the matching scores of an unknown user Z can be converted into confidences through the expression of the computed mapping function f(y); $g_j(Z)$ is used as the decision function of the fusion system for class j, and the final identification result is obtained after sorting the function values by magnitude.
Features of the authentication fusion module
Because the input of the fusion module is the scores output by the biometric authentication sub-modules, and the distributions of these scores differ from module to module, the scores must first be normalized before fusion. The authentication fusion module can normalize them by the min-max method of formula (1):

$$v = \frac{u - \min(U)}{\max(U) - \min(U)} \qquad (1)$$

where u is a raw score output by a biometric authentication sub-module, U is the set of raw scores output by that sub-module, v is the normalized output score, and max(U), min(U) are the maximum and minimum of the set U respectively. After normalization the new scores output by every biometric authentication sub-module are mapped into the interval [0, 1].
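A minimal sketch of formula (1); taking min and max over each sub-module's own training scores is an assumption about where the set U comes from:

```python
import numpy as np

def minmax_normalize(u, U):
    """Formula (1): map a raw score u into [0, 1] using the set U of raw
    scores output by the same biometric authentication sub-module."""
    return (u - np.min(U)) / (np.max(U) - np.min(U))
```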
Alternatively, normalization need not use formula (1) but can instead be realized through the confidence-conversion process; the conversion from raw output scores to confidences was described under the features of the identification fusion module.
After normalization, the set of output scores of one user's biometrics after passing through the R biometric authentication sub-modules can be regarded as a point mapped into an R-dimensional space. If the outputs of only two (R = 2) biometric authentication sub-modules are fused, the user corresponding to those two biometrics is mapped to a point in a 2-dimensional space whose coordinates are precisely the normalized score values output by the two sub-modules; if the outputs of three (R = 3) sub-modules are fused, the user is mapped to a point in a 3-dimensional space; and so on.
Once the normalized sub-module output scores have been mapped into the R-dimensional space, the task of the authentication fusion module is to partition the points of that space, deciding whether the user a point represents is a legitimate user or an illegitimate one. This is a pattern recognition problem whose mathematical model is as follows:
For the authentication fusion system H, the input $\vec{Z} = (z_1, z_2, \ldots, z_R)^T$ is the R-dimensional vector formed by all the output scores of user Z's biometrics after matching by each biometric authentication sub-module and normalization; component $z_i$ is the normalized output score of the i-th classifier (i = 1, 2, …, R). The value of R is determined by the number of fused biometric authentication subsystems. Let $\omega_0$ denote the legitimate-user class, $\omega_1$ the illegitimate-user class, and L the vector space of legitimate users. The whole decision process of system H can then be expressed by formula (2):
if $\vec{Z} \in L$, then $Z \in \omega_0$; otherwise $Z \in \omega_1$. (2)
When the authentication fusion module judges the user Z represented by a point of the R-dimensional space, four cases can occur:
1) a legitimate user is accepted as legitimate; 2) a legitimate user is rejected as illegitimate; 3) an illegitimate user is accepted as legitimate; 4) an illegitimate user is rejected as illegitimate. Cases 2) and 3) are errors, measured by the False Reject Rate (FRR) and the False Accept Rate (FAR) respectively, defined as:

$$FRR(\mathcal{R}) = 1 - \int_{\mathcal{R}} f(X|\omega_0)\,dX \qquad (3)$$

$$FAR(\mathcal{R}) = \int_{\mathcal{R}} f(X|\omega_1)\,dX \qquad (4)$$

where $\mathcal{R}$ denotes the acceptance region (the part of the space classified as legitimate), and $f(X|\omega_0)$ and $f(X|\omega_1)$ are the conditional probability density functions of legitimate and illegitimate users respectively. The total classification cost, the Total Error Rate (TER), is defined as

$$TER(\mathcal{R}) = C_{FRR} \times FRR(\mathcal{R}) + C_{FAR} \times FAR(\mathcal{R}) \qquad (5)$$

where $C_{FRR}$ and $C_{FAR}$ are the costs of the two kinds of error. The goal of classification then becomes finding a partition $\mathcal{R}_{min}$ of the legitimate-user vector space such that $TER(\mathcal{R}_{min}) = \min_{\mathcal{R}} TER(\mathcal{R})$.
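An empirical counterpart of formulas (3) to (5), assuming boolean acceptance decisions on labeled genuine and impostor test samples; the function name is illustrative:

```python
import numpy as np

def total_error_rate(accept_genuine, accept_impostor, c_frr=1.0, c_far=1.0):
    """Empirical TER: accept_genuine / accept_impostor are boolean arrays of
    the fused authenticator's decisions on genuine and impostor samples."""
    frr = 1.0 - float(np.mean(accept_genuine))   # (3): rejected genuine fraction
    far = float(np.mean(accept_impostor))        # (4): accepted impostor fraction
    return c_frr * frr + c_far * far             # (5): total error rate
```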
The points of the R-dimensional space can be partitioned either with a linear classifier (such as the Fisher linear classifier or a linear support vector machine classifier) or with a nonlinear classifier (such as a support vector machine classifier with an RBF kernel). After the classifier's parameters have been trained on a given training set, it can judge any input R-dimensional vector, and the judgement it outputs is taken as the final result of the authentication fusion system.
Features of the identification-authentication fusion module
Because the identification fusion module of the present invention sorts the values of the classification discriminant function in descending order and takes the class reaching the maximal function value as the first-choice identification result, identification fusion in essence selects from the database the user who most "resembles" the user under test; but this resemblance is relative to the other users in the database, so it cannot be guaranteed that the user under test really is the first-choice user, nor even that the user under test is highly similar to the first-choice user. On the other hand, authenticating the user under test one by one against all users in the fusion system's database would be too time-consuming, especially when the database is large. To solve these problems, the present invention uses the identification-authentication fusion module to identify and then authenticate the user under test, which better guarantees the validity of the identification result at only a slight increase in running time.
The identification-authentication fusion module can basically be regarded as an identification fusion module and an authentication fusion module connected in series: the user under test first passes through the identification fusion module to obtain several identification candidates (first choice, second choice, and so on), and these candidates are then fed one by one into the authentication fusion module for authentication. A candidate is taken as a valid identification result only if it passes authentication and is the unique candidate to pass.
Of course, in the authentication stage of the identification-authentication fusion module, class $\omega_0$ no longer consists of all samples of the legitimate users and class $\omega_1$ of all samples of the illegitimate users; instead, all samples of the candidate user under authentication form class $\omega_0$ and all samples of the other users form class $\omega_1$.
Fig. 1 shows the system framework of the multimodal biometric fusion system. First the face, iris, on-line signature and off-line handwriting of the unknown user are captured by acquisition devices such as a camera, an iris acquisition device, a handwriting pad or touch screen, and a scanner. The outputs of these devices are then fed into the corresponding recognition/authentication sub-modules, which extract features, match them against the biometric templates of the users already in their respective databases, and output the matching scores. The scores output by the recognition/authentication modules are sent to the normalization module for normalization, then input into the unified decision fusion module for identification fusion, authentication fusion or identification-authentication fusion; finally the identification, authentication or identification-authentication result is output.
Identification fusion
Because during on-line signing a signature can only be judged genuine or forged, identification fusion is performed only on the face, iris and off-line handwriting systems. Identification fusion is carried out in the identification fusion module of Fig. 3a.
Suppose a training set of the output scores of the sub-modules to be fused is available. Each score (distance or similarity) output by the modules to be fused in this training set must first be converted into a generalized confidence by the normalization module of Fig. 2a, and the function mapping generalized confidence to confidence is then obtained.
For the i-th sub-module to be fused (i = 1, 2, …, R), suppose its output scores are the distances between the biometric feature vector $\vec{x}_i$ of a user X and the standard templates of that biometric for all users in the database. With $d_j(\vec{x}_i)$ denoting the minimal matching distance between the feature vector $\vec{x}_i$ and the standard templates of the j-th user (class $\omega_j$, j = 1, 2, …, N), the generalized confidence $g(\omega_j|\vec{x}_i)$ of identifying user X as the j-th user can be computed by formula (1-1) of the Summary. After this has been done for every user in the training set, the mapping function f(y) from generalized confidence to confidence can be computed according to formula (1-3) of the Summary. Putting $y = g(\omega_j|\vec{x}_i)$, in concrete practice a series of discrete values $y_1, y_2, \ldots, y_n, \ldots$ is taken; at these discrete points the conversion from generalized confidence to confidence is computed directly by formula (1-3) (giving $f(y_1), f(y_2), \ldots, f(y_n), \ldots$), while for y values (generalized confidences) not at these discrete points the confidence f(y) is computed by mathematical cubic interpolation or curve fitting. The concrete methods of cubic interpolation and fitting are outside the scope of the present invention and are not elaborated here.
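A sketch of the discrete-plus-interpolation evaluation of f(y) described above, using SciPy's cubic interpolation as one possible realization (the patent leaves the interpolation method open); it reuses the hypothetical `estimate_f`, `gen_confs` and `correct` objects from the identification-fusion sketch earlier, and the grid and δ are assumed values:

```python
import numpy as np
from scipy.interpolate import interp1d

# Discrete grid y_1, ..., y_n on which f is evaluated by formula (1-3)
ys = np.linspace(0.0, 1.0, 21)
fs = np.array([estimate_f(gen_confs, correct, y, delta=0.05) for y in ys])

f = interp1d(ys, fs, kind="cubic", fill_value="extrapolate")
confidence = float(f(0.537))   # f(y) for a y lying between the grid points
```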
When the scores output by a module to be fused are the similarities between the biometric feature vector $\vec{x}_i$ of a user X and the standard templates of that biometric for all users in the database, with $s_j(\vec{x}_i)$ denoting the maximal matching similarity between $\vec{x}_i$ and the standard templates of the j-th user (class $\omega_j$, j = 1, 2, …, N), the generalized confidence is computed according to formula (1-2) of the Summary; the remaining operations are the same as in the distance case, and the mapping function f(y) from generalized confidence to confidence is finally obtained in the same way.
Fig. 4 shows the fitted curve of the mapping f(y) from generalized confidence to confidence for the output scores of the face module.
For an unknown user Z, after $p(\omega_j|\vec{z}_i)$ has been computed for all i (i = 1, 2, …, R) and all j (j = 1, 2, …, N), the confidences are integrated and the candidates decided in the identification fusion module of Fig. 3a.
Substituting the computed $p(\omega_j|\vec{z}_i)$ into formula (1-4) or formula (1-5) of the Summary gives the decision function $g_j(Z)$ of the identification fusion system for class j (the j-th user). After $g_j(Z)$ has been obtained for every j, the values can be arranged in descending order, say $g_{j_1}(Z) > g_{j_2}(Z) > \cdots > g_{j_N}(Z)$; user $j_1$ is then taken as the first-choice identification result, $j_2$ as the second choice, and so on.
Authentication fusion
Authentication fusion is performed in the authentication fusion module of Fig. 3b. First the mapping of the authentication sub-module output scores into the R-dimensional space must be completed. The existing legitimate-user and illegitimate-user data are used to build the training set A; the components of any column vector $\vec{X} = (x_1, x_2, \ldots, x_R)^T$ of A are the scores of the user's biometrics output by the corresponding biometric authentication sub-modules after normalization. Normalization is performed in the module of Fig. 2a or the module of Fig. 2b.
Next, various linear or nonlinear classifiers can be used to classify the sample points of training set A in the R-dimensional space.
Suppose that in the R-dimensional space the legitimate users (genuine samples) of training set A form class $\omega_0$ and the illegitimate users (impostor samples) form class $\omega_1$; suppose class $\omega_0$ contains $N_0$ samples, forming the subset $A_0$ of A, and class $\omega_1$ contains $N_1$ samples, forming the subset $A_1$ ($A = A_0 \cup A_1$, $N = N_0 + N_1$).
If a linear classifier is chosen for the classification, the linear classifier based on the Fisher criterion can be selected: by maximizing the Fisher criterion function, the best direction for projecting the samples to be classified from the R-dimensional space to the one-dimensional space is sought, as in formula (2-1) of the Summary, rewritten here:

$$\vec{w}^* = S_w^{-1}(\vec{m}_0 - \vec{m}_1)$$

where $\vec{m}_k = \frac{1}{N_k}\sum_{X\in A_k}\vec{X}$, k = 0, 1, are the means of the genuine and impostor samples respectively;
$S_k = \sum_{\vec{X}\in A_k}(\vec{X}-\vec{m}_k)(\vec{X}-\vec{m}_k)^T$, k = 0, 1, where $S_0$, $S_1$ are the within-class scatter matrices of the genuine and impostor samples;
$S_w = P(\omega_0)S_0 + P(\omega_1)S_1$ is the total within-class scatter matrix, with $P(\omega_0)$, $P(\omega_1)$ the prior probabilities of the genuine and impostor classes;
$S_w^{-1}$ denotes the inverse of $S_w$.
By constructing the projection matrix $PS = \vec{w}^*\left((\vec{w}^*)^T\vec{w}^*\right)^{-1}(\vec{w}^*)^T$ and applying formula (2-2) of the Summary,

$$\vec{y} = PS\,\vec{X},$$

the classification problem in the R-dimensional space is transformed into a classification problem in the one-dimensional subspace of the R-dimensional space. The sample points of the sets $A_0$, $A_1$ are mapped in the one-dimensional subspace to the sets $Y_0$ and $Y_1$ respectively; the means and standard deviations of the two classes of samples in $Y_0$, $Y_1$ are defined by formulas (2-3) and (2-4) of the Summary, with the Euclidean distance in (2-4) computed by formula (2-5) of the Summary.
Since the classification problem has been transformed into a classification problem in the one-dimensional subspace, it only remains to determine a threshold $\vec{y}_0$ and make the classification decision for the normalized vector $\vec{Z}$ of user Z by the following rule:
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* > 0$, then $Z \in \omega_0$;
if $(\vec{Z} - \vec{y}_0)\cdot\vec{w}^* < 0$, then $Z \in \omega_1$.
The threshold $\vec{y}_0$ can be defined in several ways: for example, as the mean of $\vec{m}'_0$ and $\vec{m}'_1$; or as their weighted mean with the class sample counts as weight coefficients; or as their weighted mean with the class standard deviations $\sigma_0$, $\sigma_1$ in the one-dimensional subspace as weight coefficients; or as the value, found by searching the one-dimensional subspace with some step size, that minimizes the classification error rate on the training samples (compare formula (5)).
For the data reported in Tables 2 to 4, we chose the weighted mean of $\vec{m}'_0$ and $\vec{m}'_1$ with the class standard deviations in the one-dimensional subspace as weight coefficients.
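A sketch of this std-weighted threshold on the Fisher projection; the weighting direction (each mean weighted by the other class's deviation, so the threshold sits closer to the tighter class) is an assumption, since the patent does not spell the formula out:

```python
def std_weighted_threshold(m0, m1, s0, s1):
    """Projected threshold: weighted mean of the projected class means
    m0, m1, using the class standard deviations s0, s1 as weights.
    Assumed weighting: each mean is weighted by the opposite deviation."""
    return (s1 * m0 + s0 * m1) / (s0 + s1)
```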
Because in practical applications the prior probabilities with which legitimate and illegitimate users use the authentication system cannot be determined accurately, we take $P(\omega_0) = P(\omega_1) = 0.5$ and, in formula (5), $C_{FRR} = C_{FAR} = 1$; the linear classifier based on the Fisher criterion is used to classify the sample points of training set A in the R-dimensional space, and tests are run on a test set. The total classification error rates (TER) obtained under the various authentication fusion schemes are shown in Tables 2 to 4; Table 1 lists the total classification error rates attainable by the single biometric authentication systems without fusion. Fig. 5a shows the classification by the Fisher linear classifier in the 2-dimensional space when only the face and iris modules are fused for authentication; Fig. 5b shows the classification in the 3-dimensional space when the face, off-line handwriting and on-line signature modules are fused for authentication.
Table 1  TER of the single biometric authentication systems

Authentication system    Training TER (%)    Test TER (%)
Face                     3.04                3.25
Iris                     0.00                0.12
Off-line handwriting     4.40                9.66
On-line signature        7.85                7.40
Table 2  TER obtained by fusing two biometric authentication systems

                          Training TER (%)                 Test TER (%)
Authentication system     fisher   LinearSVM   RbfSVM      fisher   LinearSVM   RbfSVM
F&I                       0.00     0.00        0.00        0.04     0.21        0.16
F&Off                     2.34     0.53        0.45        4.07     0.95        0.78
F&On                      2.47     0.74        0.82        3.33     1.60        1.60
I&Off                     0.12     0.00        0.00        0.00     0.25        0.16
I&On                      0.00     0.00        0.00        0.21     0.29        0.33
Off&On                    2.22     1.89        1.44        2.59     1.97        1.77
Table 3  TER obtained by fusing three biometric authentication systems

                          Training TER (%)                 Test TER (%)
Authentication system     fisher   LinearSVM   RbfSVM      fisher   LinearSVM   RbfSVM
F&I&Off                   0.04     0.00        0.00        0.00     0.16        0.00
F&I&On                    0.00     0.00        0.00        0.08     0.16        0.12
F&Off&On                  0.49     0.16        0.08        0.53     0.53        0.25
I&Off&On                  0.00     0.00        0.00        0.04     0.25        0.08
Table 4  TER obtained by fusing four biometric authentication systems

                          Training TER (%)                 Test TER (%)
Authentication system     fisher   LinearSVM   RbfSVM      fisher   LinearSVM   RbfSVM
All systems               0.00     0.00        0.00        0.00     0.04        0.00
[Note]: F, I, Off and On denote face, iris, off-line handwriting and on-line signature respectively; F&I denotes the fusion system of face and iris; "All systems" denotes the fusion system of face, iris, off-line handwriting and on-line signature.
Tables 1 to 4 show that, after fusion in the authentication fusion sub-module of the multimodal biometric fusion system, the total classification error rate is lower than that of any single, unfused biometric authentication system, no matter which concrete fusion mode is chosen. The gain is largest for the biometric systems whose own authentication rates are not very high, such as the off-line handwriting and on-line signature systems. For the iris system, whose own authentication rate is already very high, fusion keeps the authentication rate undiminished while leaving room for additional liveness detection, thereby further strengthening its resistance to forgery; this, too, is highly significant.
For comparison, Tables 2 to 4 also list the training and test results of another linear classifier, the linear SVM, and of a nonlinear classifier, the SVM with an RBF kernel. Since their concrete methods fall outside the scope of the present invention, they are not elaborated here.
Identification-authentication fusion
Identification-authentication fusion is carried out in the identification-authentication fusion sub-module shown in Fig. 3c. Since this fusion is essentially a combination of the identification process and the authentication process, both of which have been described in detail above, no further elaboration is given.

Claims (8)

1. A biometric-fusion identity identification method, characterized in that the method comprises the following steps in sequence:
Step 1: provide a computer with a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and also load the database corresponding to each of said modules;
Step 2: capture the face of an unknown user Z with a camera, capture the iris of user Z with an iris capture device, capture the on-line signature of user Z with a handwriting tablet or touch screen, and capture the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1, perform feature extraction with the corresponding recognition/authentication module, match the extracted features against the user biometric templates already stored in the respective database, and then output the matching score of each module;
Step 3: perform biometric identification fusion with the computer of step 1, proceeding as follows:
Step 31: use the existing databases of the modules of step 2 to build a sufficiently large training set A of the output scores of the modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: feed each score (a distance or a similarity) output by the modules to be fused in the training set of step 31 into the normalization module preset in the computer and convert it into a confidence, proceeding as follows:
First, convert the output score of each module to be fused into a generalized confidence by the given generalized-confidence estimation formulas:
Let there be N users in the database, denoted as classes $\omega_1, \omega_2, \ldots, \omega_N$;
let there be R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
let $\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ be the set of feature vectors of a user X in the training set before matching, where $\vec{x}_i$ is the feature vector extracted for this user by the i-th classifier $(i = 1, 2, \ldots, R)$;
for a classifier whose raw output is a distance, let $d_j(\vec{x}_i)$ denote the minimum matching distance between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$;
for a classifier whose raw output is a similarity, let $s_j(\vec{x}_i)$ denote the maximum matching similarity between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$.
The raw output scores of each sub-module can then be converted into generalized confidences by the given estimation formulas; for the i-th classifier, the generalized confidence with which user X is identified as the user represented by class $\omega_j$ is denoted $g(\omega_j \mid \vec{x}_i)$.
For a biometric recognition sub-module whose raw output is a distance, its output scores $d_j(\vec{x}_i)$ are converted into generalized confidences by formula (1-1):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k \neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (1\text{-}1)$$
For a biometric recognition sub-module whose raw output is a similarity, its output scores $s_j(\vec{x}_i)$ are converted into generalized confidences by formula (1-2):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{\max_{k \neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (1\text{-}2)$$
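A minimal sketch of formulas (1-1) and (1-2), assuming each classifier's matching scores against all N enrolled users are held in NumPy arrays; the function names are illustrative, not the patent's:

```python
import numpy as np

def generalized_confidence_from_distances(d):
    """Formula (1-1): d is an (N,) array of minimum matching distances
    d_j to each enrolled user; returns g(w_j | x_i) for every j."""
    g = np.empty_like(d, dtype=float)
    for j in range(len(d)):
        others = np.delete(d, j)          # distances d_k with k != j
        g[j] = 1.0 - d[j] / others.min()
    return g

def generalized_confidence_from_similarities(s):
    """Formula (1-2): s is an (N,) array of maximum matching similarities."""
    g = np.empty_like(s, dtype=float)
    for j in range(len(s)):
        others = np.delete(s, j)          # similarities s_k with k != j
        g[j] = 1.0 - others.max() / s[j]
    return g
```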
Second, after all scores output by a given biometric recognition sub-module have been converted into generalized confidences, convert the generalized confidences into confidences by computing the following mapping function f(y) from generalized confidence to confidence:
Let T be the range of the generalized confidence obtained from the sub-module, and let user X belong to the sufficiently large training set A already built for this sub-module (X ∈ A);
let $y = g(\omega_j \mid \vec{x}_i)$, and for any $y \in T$ let $[y - \delta, y + \delta]$ be a small interval around y; then:
$$f(y) = \frac{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta] \ \text{and}\ X \in \omega_j\}\right)}{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta]\}\right)} \qquad (1\text{-}3)$$
In formula (1-3), the denominator of f(y) is the total number of samples whose generalized confidence falls in the small interval $[y-\delta, y+\delta]$; the numerator is the number of samples whose generalized confidence falls in that interval and that are correctly identified.
Once f(y) has been computed for every $y \in T$, generalized confidences can be converted into confidences. For the i-th classifier, the confidence with which user X is identified as the user represented by class $\omega_j$, which is also a posterior probability, is denoted $p(\omega_j \mid \vec{x}_i)$:
$$p(\omega_j \mid \vec{x}_i) = f\!\left(g(\omega_j \mid \vec{x}_i)\right).$$
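A minimal sketch of the empirical mapping of formula (1-3), assuming the training-set generalized confidences and their correct/incorrect flags are given as flat arrays and that δ is a chosen window half-width:

```python
import numpy as np

def confidence_map(g_values, correct, delta=0.05):
    """Estimate f(y) of formula (1-3) from the training set A.

    g_values: generalized confidences of all training claims, flattened
              over users j and samples X;
    correct:  boolean array, True where the claim is correct (X in w_j).
    Returns a function f converting a generalized confidence into a
    confidence, using the window [y - delta, y + delta]."""
    g_values = np.asarray(g_values, dtype=float)
    correct = np.asarray(correct, dtype=bool)

    def f(y):
        in_window = np.abs(g_values - y) <= delta
        total = in_window.sum()                   # denominator of (1-3)
        if total == 0:
            return 0.0                            # no training evidence near y
        return (in_window & correct).sum() / total  # numerator / denominator
    return f
```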
Step 33: for user Z, for every value of i $(i = 1, 2, \ldots, R)$ and a given value of j $(j = 1, 2, \ldots, N)$, compute $p(\omega_j \mid \vec{z}_i)$ by the confidence conversion method of step 32, then substitute these values into the discriminant function of the identification fusion system for the j-th user, formula (1-4) or formula (1-5), to obtain the discrimination score with which the fusion system judges user Z to be the j-th user:
$$g_j(Z) = \prod_{i=1}^{R} p(\omega_j \mid \vec{z}_i) \qquad (1\text{-}4)$$
$$g_j(Z) = \sum_{i=1}^{R} p(\omega_j \mid \vec{z}_i) \qquad (1\text{-}5)$$
Step 34: compute $g_j(Z)$ for all values of j according to step 33 (using formula (1-4) throughout, or formula (1-5) throughout), then arrange the values $g_j(Z)$ obtained for the different j in descending order:
$$g_{j_1}(Z) > g_{j_2}(Z) > \cdots > g_{j_N}(Z),$$
and take user $j_1$ as the first-choice identification result for user Z, user $j_2$ as the second choice, and so on.
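A minimal sketch of steps 33 and 34, assuming the R×N matrix of confidences $p(\omega_j \mid \vec{z}_i)$ has already been computed:

```python
import numpy as np

def fuse_and_rank(p, rule="product"):
    """Identification fusion of steps 33 and 34.

    p: (R, N) array with p[i, j] = p(w_j | z_i) from classifier i.
    rule: "product" applies formula (1-4), "sum" applies formula (1-5).
    Returns user indices j1, j2, ... sorted by descending g_j(Z)."""
    g = p.prod(axis=0) if rule == "product" else p.sum(axis=0)
    return np.argsort(-g)

# usage: ranking = fuse_and_rank(p); first_choice = ranking[0]
```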
2. A biometric-fusion identity authentication method, characterized in that the method comprises the following steps in sequence:
Step 1: provide a computer with a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and also load the database corresponding to each of said modules;
Step 2: capture the face of an unknown user Z with a camera, capture the iris of user Z with an iris capture device, capture the on-line signature of user Z with a handwriting tablet or touch screen, and capture the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1, perform feature extraction with the corresponding recognition/authentication module, match the extracted features against the user biometric templates already stored in the respective database, and then output the matching score of each module;
Step 3: perform biometric authentication fusion with the computer of step 1, proceeding as follows:
Step 31: use the existing databases of the modules of step 2 to build a sufficiently large training set A of the output scores of the modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: normalize the raw output scores of every biometric sub-module for all users in training set A, so that after normalization the scores output by each biometric authentication sub-module are mapped into the interval [0, 1];
Step 33: treat the normalized scores output by the biometric authentication sub-modules in step 32 as vectors in a multidimensional space and compute the optimal projection direction from this space to its one-dimensional subspace, proceeding as follows:
First, let:
there be N users in the database;
there be R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\vec{X} = (x_1, x_2, \ldots, x_R)^T$ be the R-dimensional vector formed by all the scores output for user X in the training set after matching and normalization by the biometric authentication sub-modules, where $(x_1, x_2, \ldots, x_R)^T$ denotes the transpose of the row vector $(x_1, x_2, \ldots, x_R)$ and the component $x_i$ is the normalized output score of the i-th classifier $(i = 1, 2, \ldots, R)$;
all legitimate users (genuine samples) in training set A form class $\omega_0$, and the illegitimate users (impostor samples) form class $\omega_1$;
class $\omega_0$ contain $N_0$ samples, forming subset $A_0$ of training set A, and class $\omega_1$ contain $N_1$ samples, forming subset $A_1$ ($A = A_0 \cup A_1$, $N = N_0 + N_1$);
Subsets $A_0$ and $A_1$ can both be regarded as sets of column vectors in the R-dimensional space; the optimal projection direction from the R-dimensional space to its one-dimensional subspace is computed by formula (2-1):
$$\vec{w}^* = S_w^{-1}(\vec{m}_0 - \vec{m}_1) \qquad (2\text{-}1)$$
where $\vec{m}_k = \frac{1}{N_k}\sum_{X \in A_k} \vec{X}$, $k = 0, 1$, are the means of the genuine and impostor samples respectively;
$S_k = \sum_{X \in A_k} (\vec{X} - \vec{m}_k)(\vec{X} - \vec{m}_k)^T$, $k = 0, 1$, so that $S_0$ and $S_1$ are the within-class scatter matrices of the genuine and impostor samples respectively;
$S_w = P(\omega_0) S_0 + P(\omega_1) S_1$ is the total within-class scatter matrix, where $P(\omega_0)$ and $P(\omega_1)$ are the prior probabilities of the genuine and impostor classes, taken as $P(\omega_0) = P(\omega_1) = 0.5$ in the concrete implementation;
$S_w^{-1}$ denotes the inverse matrix of $S_w$;
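A minimal sketch of formula (2-1), assuming the normalized genuine and impostor score vectors are stacked row-wise in two NumPy arrays:

```python
import numpy as np

def fisher_direction(A0, A1, p0=0.5, p1=0.5):
    """Optimal projection direction w* of formula (2-1).

    A0: (N0, R) genuine score vectors; A1: (N1, R) impostor score vectors.
    Priors p0, p1 are taken as 0.5, as in the concrete implementation."""
    m0, m1 = A0.mean(axis=0), A1.mean(axis=0)
    S0 = (A0 - m0).T @ (A0 - m0)   # within-class scatter of genuine samples
    S1 = (A1 - m1).T @ (A1 - m1)   # within-class scatter of impostor samples
    Sw = p0 * S0 + p1 * S1         # total within-class scatter matrix
    return np.linalg.solve(Sw, m0 - m1)   # w* = Sw^{-1} (m0 - m1)
```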
Step 34: use the optimal projection direction $\vec{w}^*$ computed in step 33 to construct the projection matrix PS and project all elements of training set A into the set Y of the one-dimensional subspace;
First, construct the projection matrix PS:
$$PS = \vec{w}^* \left((\vec{w}^*)^T \vec{w}^*\right)^{-1} (\vec{w}^*)^T;$$
Second, transform every element (column vector) $\vec{X}$ of training set A into the set Y of the one-dimensional subspace by formula (2-2):
$$\vec{y} = \vec{X} \times (PS)^T \qquad (2\text{-}2)$$
After all elements of set A have been transformed into set Y by formula (2-2), the samples of the subsets $A_0$ and $A_1$ of A are transformed into the subsets $Y_0$ and $Y_1$ of Y respectively; the means of the two classes of samples in $Y_0$ and $Y_1$ can then be computed according to formula (2-3):
$$\tilde{\vec{m}}_k = \frac{1}{N_k} \sum_{\vec{y} \in Y_k} \vec{y}, \qquad k = 0, 1 \qquad (2\text{-}3)$$
and from them the standard deviations of the two classes:
$$\sigma_k = \sqrt{\frac{1}{N_k} \sum_{\vec{y} \in Y_k} \lVert \vec{y} - \tilde{\vec{m}}_k \rVert^2}, \qquad k = 0, 1$$
where $\lVert \vec{y} - \tilde{\vec{m}}_k \rVert$ denotes the Euclidean distance between the vectors $\vec{y}$ and $\tilde{\vec{m}}_k$; since everything lies in the one-dimensional subspace of the R-dimensional space, $\vec{y}$ and $\tilde{\vec{m}}_k$ are still R-dimensional vectors, written in component form as $\vec{y} = (y_1, y_2, \ldots, y_R)^T$ and $\tilde{\vec{m}}_k = (\tilde{m}_{k1}, \tilde{m}_{k2}, \ldots, \tilde{m}_{kR})^T$;
Step 35: for user Z, whose normalized classifier outputs form the vector $\vec{Z} = (z_1, z_2, \ldots, z_R)^T$, make a classification decision according to the preset threshold $\vec{y}_0$ and the $\vec{w}^*$ computed in step 33, deciding whether user Z belongs to class $\omega_0$ or class $\omega_1$ by the following rule:
If $(\vec{Z} - \vec{y}_0) \cdot \vec{w}^* > 0$, then $Z \in \omega_0$;
If $(\vec{Z} - \vec{y}_0) \cdot \vec{w}^* < 0$, then $Z \in \omega_1$.
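A minimal sketch of the decision rule of step 35, reusing the direction from the fisher_direction sketch above; the threshold vector y0 is assumed to come from one of the choices of claims 6 to 8:

```python
import numpy as np

def authenticate(z, w_star, y0):
    """Step 35: accept (class w0) when the projection of (z - y0) onto w*
    is positive, reject (class w1) when it is negative.

    z, y0: (R,) normalized score vector of user Z and threshold vector;
    w_star: (R,) optimal projection direction."""
    score = float((z - y0) @ w_star)
    return "genuine (w0)" if score > 0 else "impostor (w1)"
```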
3. A biometric-fusion identity identification and authentication method, characterized in that the method comprises the following steps in sequence:
Step 1: provide a computer with a face recognition/authentication module, an iris recognition/authentication module, an on-line signature recognition/authentication module and an off-line handwriting recognition/authentication module, and also load the database corresponding to each of said modules;
Step 2: capture the face of an unknown user Z with a camera, capture the iris of user Z with an iris capture device, capture the on-line signature of user Z with a handwriting tablet or touch screen, and capture the off-line handwriting of user Z with a scanner; input the corresponding images into the computer of step 1, perform feature extraction with the corresponding recognition/authentication module, match the extracted features against the user biometric templates already stored in the respective database, and then output the matching score of each module;
Step 3: first perform biometric identification fusion with the computer of step 1, proceeding as follows:
Step 31: use the existing databases of the modules of step 2 to build a sufficiently large training set A of the output scores of the modules to be fused, covering face, iris, on-line signature and off-line handwriting;
Step 32: feed each score (a distance or a similarity) output by the modules to be fused in the training set of step 31 into the normalization module preset in the computer and convert it into a confidence, proceeding as follows:
First, convert the output score of each module to be fused into a generalized confidence by the given generalized-confidence estimation formulas:
Let there be N users in the database, denoted as classes $\omega_1, \omega_2, \ldots, \omega_N$;
let there be R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
let $\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ be the set of feature vectors of a user X in the training set before matching, where $\vec{x}_i$ is the feature vector extracted for this user by the i-th classifier $(i = 1, 2, \ldots, R)$;
for a classifier whose raw output is a distance, let $d_j(\vec{x}_i)$ denote the minimum matching distance between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$;
for a classifier whose raw output is a similarity, let $s_j(\vec{x}_i)$ denote the maximum matching similarity between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$.
The raw output scores of each sub-module can then be converted into generalized confidences; for the i-th classifier, the generalized confidence with which user X is identified as the user represented by class $\omega_j$ is denoted $g(\omega_j \mid \vec{x}_i)$.
For a biometric recognition sub-module whose raw output is a distance, its output scores $d_j(\vec{x}_i)$ are converted into generalized confidences by formula (3-1):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k \neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (3\text{-}1)$$
For a biometric recognition sub-module whose raw output is a similarity, its output scores $s_j(\vec{x}_i)$ are converted into generalized confidences by formula (3-2):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{\max_{k \neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (3\text{-}2)$$
Second, after all scores output by a given biometric recognition sub-module have been converted into generalized confidences, convert the generalized confidences into confidences by computing the following mapping function f(y) from generalized confidence to confidence:
Let T be the range of the generalized confidence obtained from the sub-module, and let user X belong to the sufficiently large training set A already built for this sub-module (X ∈ A);
let $y = g(\omega_j \mid \vec{x}_i)$, and for any $y \in T$ let $[y - \delta, y + \delta]$ be a small interval around y; then:
$$f(y) = \frac{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta] \ \text{and}\ X \in \omega_j\}\right)}{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta]\}\right)} \qquad (3\text{-}3)$$
In formula (3-3), the denominator of f(y) is the total number of samples whose generalized confidence falls in the small interval $[y-\delta, y+\delta]$; the numerator is the number of samples whose generalized confidence falls in that interval and that are correctly identified.
Once f(y) has been computed for every $y \in T$, generalized confidences can be converted into confidences. For the i-th classifier, the confidence with which user X is identified as the user represented by class $\omega_j$, which is also a posterior probability, is denoted $p(\omega_j \mid \vec{x}_i)$:
$$p(\omega_j \mid \vec{x}_i) = f\!\left(g(\omega_j \mid \vec{x}_i)\right).$$
Step 33: for user Z, for every value of i $(i = 1, 2, \ldots, R)$ and a given value of j $(j = 1, 2, \ldots, N)$, compute $p(\omega_j \mid \vec{z}_i)$ by the confidence conversion method of step 32, then substitute these values into the discriminant function of the identification fusion system for the j-th user, formula (3-4) or formula (3-5), to obtain the discrimination score with which the fusion system judges user Z to be the j-th user:
$$g_j(Z) = \prod_{i=1}^{R} p(\omega_j \mid \vec{z}_i) \qquad (3\text{-}4)$$
$$g_j(Z) = \sum_{i=1}^{R} p(\omega_j \mid \vec{z}_i) \qquad (3\text{-}5)$$
Step 34: compute $g_j(Z)$ for all values of j according to step 33 (using formula (3-4) throughout, or formula (3-5) throughout), then arrange the values $g_j(Z)$ obtained for the different j in descending order:
$$g_{j_1}(Z) > g_{j_2}(Z) > \cdots > g_{j_N}(Z),$$
and take user $j_1$ as the first-choice identification result for user Z, user $j_2$ as the second choice, and so on.
Step 4: from the identification results of step 34, select the top K results (K ≤ N); suppose the selected top K results identify user Z as the $j_1$-th user, the $j_2$-th user, ..., the $j_K$-th user in the database; then, with the computer of step 1, perform biometric authentication fusion on the identification result that identifies user Z as the $j_l$-th user in the database, for each $l = 1, 2, \ldots, K$, proceeding as follows:
Step 41: after step 32, the raw output scores of every biometric sub-module for all users in training set A have already been mapped into confidences in the interval [0, 1]; now treat these confidences as vectors in a multidimensional space and compute the optimal projection direction from this space to its one-dimensional subspace, proceeding as follows:
First, let:
there be N users in the database;
there be R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
$\vec{X} = (x_1, x_2, \ldots, x_R)^T$ be the R-dimensional vector formed by all the scores of user X in the training set after matching by the biometric authentication sub-modules and conversion into confidences, where the component $x_i$ is the confidence output by the i-th classifier $(i = 1, 2, \ldots, R)$;
all samples of the $j_l$-th user in training set A form class $\omega_0$, and the samples of all other users in training set A form class $\omega_1$;
class $\omega_0$ contain $N_0$ samples, forming subset $A_0$ of training set A, and class $\omega_1$ contain $N_1$ samples, forming subset $A_1$ ($A = A_0 \cup A_1$, $N = N_0 + N_1$);
Subsets $A_0$ and $A_1$ can both be regarded as sets of column vectors in the R-dimensional space; the optimal projection direction from the R-dimensional space to its one-dimensional subspace is computed by formula (3-6):
$$\vec{w}^* = S_w^{-1}(\vec{m}_0 - \vec{m}_1) \qquad (3\text{-}6)$$
where $\vec{m}_k = \frac{1}{N_k}\sum_{X \in A_k} \vec{X}$, $k = 0, 1$, are the means of the genuine and impostor samples respectively;
$S_k = \sum_{X \in A_k} (\vec{X} - \vec{m}_k)(\vec{X} - \vec{m}_k)^T$, $k = 0, 1$, so that $S_0$ and $S_1$ are the within-class scatter matrices of the genuine and impostor samples respectively;
$S_w = P(\omega_0) S_0 + P(\omega_1) S_1$ is the total within-class scatter matrix, where $P(\omega_0)$ and $P(\omega_1)$ are the prior probabilities of the genuine and impostor classes, taken as $P(\omega_0) = P(\omega_1) = 0.5$ in the concrete implementation;
$S_w^{-1}$ denotes the inverse matrix of $S_w$;
Step 42: use the optimal projection direction $\vec{w}^*$ computed in step 41 to construct the projection matrix PS and project all elements of training set A into the set Y of the one-dimensional subspace;
First, construct the projection matrix PS:
$$PS = \vec{w}^* \left((\vec{w}^*)^T \vec{w}^*\right)^{-1} (\vec{w}^*)^T;$$
Second, transform every element (column vector) $\vec{X}$ of training set A into the set Y of the one-dimensional subspace by formula (3-7):
$$\vec{y} = \vec{X} \times (PS)^T \qquad (3\text{-}7)$$
After all elements of set A have been transformed into set Y by formula (3-7), the samples of the subsets $A_0$ and $A_1$ of A are transformed into the subsets $Y_0$ and $Y_1$ of Y respectively; the means of the two classes of samples in $Y_0$ and $Y_1$ can then be computed according to formula (3-8):
$$\tilde{\vec{m}}_k = \frac{1}{N_k} \sum_{\vec{y} \in Y_k} \vec{y}, \qquad k = 0, 1 \qquad (3\text{-}8)$$
and from them the standard deviations of the two classes:
$$\sigma_k = \sqrt{\frac{1}{N_k} \sum_{\vec{y} \in Y_k} \lVert \vec{y} - \tilde{\vec{m}}_k \rVert^2}, \qquad k = 0, 1$$
where $\lVert \vec{y} - \tilde{\vec{m}}_k \rVert$ denotes the Euclidean distance between the vectors $\vec{y}$ and $\tilde{\vec{m}}_k$; since everything lies in the one-dimensional subspace of the R-dimensional space, $\vec{y}$ and $\tilde{\vec{m}}_k$ are still R-dimensional vectors, written in component form as $\vec{y} = (y_1, y_2, \ldots, y_R)^T$ and $\tilde{\vec{m}}_k = (\tilde{m}_{k1}, \tilde{m}_{k2}, \ldots, \tilde{m}_{kR})^T$;
Step 43: for user Z, whose classifier outputs converted into confidences form the vector $\vec{Z} = (z_1, z_2, \ldots, z_R)^T$, make a classification decision according to the preset threshold $\vec{y}_0$ and the $\vec{w}^*$ computed in step 41, deciding whether user Z belongs to class $\omega_0$ or class $\omega_1$ by the following rule:
If $(\vec{Z} - \vec{y}_0) \cdot \vec{w}^* > 0$, then $Z \in \omega_0$;
If $(\vec{Z} - \vec{y}_0) \cdot \vec{w}^* < 0$, then $Z \in \omega_1$;
Step 44: if step 43 judges that $Z \in \omega_0$, set the l-th component of a K-dimensional decision vector to 1; otherwise set it to 0;
Step 5: after the authentication fusion of step 4 has been completed for every $j_l$, judge user Z to be a legitimate user if and only if exactly one component of the decision vector has the value 1; otherwise, judge user Z to be an illegitimate user.
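A minimal end-to-end sketch of steps 4 and 5, assuming per-user Fisher directions and thresholds (steps 41 to 43) have been trained beforehand; the data layout and names are assumptions, not the patent's notation:

```python
import numpy as np

def identify_then_verify(p, verifiers, K):
    """Identification-authentication fusion of steps 4 and 5.

    p: (R, N) confidences p(w_j | z_i) for the unknown user Z;
    verifiers: dict mapping user index j -> (w_star, y0), trained with the
               j-th user's samples as class w0 (steps 41-42);
    K: number of top identification candidates to re-verify.
    Accepts Z if and only if exactly one candidate passes verification."""
    ranking = np.argsort(-p.prod(axis=0))[:K]   # top-K by product rule (3-4)
    passed = []
    for j in ranking:
        w_star, y0 = verifiers[j]
        z = p[:, j]                          # R confidences claiming Z is user j
        if float((z - y0) @ w_star) > 0:     # step 43 decision: Z in w0
            passed.append(j)                 # step 44: component set to 1
    accepted = len(passed) == 1              # step 5: exactly one component is 1
    return accepted, (passed[0] if accepted else None)
```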
4. The biometric-fusion identity authentication method according to claim 2, characterized in that in step 32 the normalization method is determined as follows:
Let u denote a raw score output by a biometric authentication sub-module, U the set of raw scores output by that sub-module, and v the score output after normalization; normalization follows the "min-max approach" of formula (4-1):
$$v = \frac{u - \min(U)}{\max(U) - \min(U)} \qquad (4\text{-}1)$$
where max(U) and min(U) denote the maximum and minimum values in the set U.
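A minimal sketch of formula (4-1):

```python
def min_max_normalize(u, U):
    """Formula (4-1): map a raw score u into [0, 1] using the minimum and
    maximum of the score set U of the same sub-module."""
    return (u - min(U)) / (max(U) - min(U))
```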
5. The biometric-fusion identity authentication method according to claim 2, characterized in that in step 32 the normalization is realized by the confidence conversion process, the conversion from raw output scores to confidences proceeding as follows:
First, convert the output score of each module to be fused into a generalized confidence by the given generalized-confidence estimation formulas:
Let there be N users in the database, denoted as classes $\omega_1, \omega_2, \ldots, \omega_N$;
let there be R biometric recognition/authentication sub-modules (also called classifiers) to be fused;
let $\hat{X} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_R)$ be the set of feature vectors of a user X in the training set before matching, where $\vec{x}_i$ is the feature vector extracted for this user by the i-th classifier $(i = 1, 2, \ldots, R)$;
for a classifier whose raw output is a distance, let $d_j(\vec{x}_i)$ denote the minimum matching distance between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$;
for a classifier whose raw output is a similarity, let $s_j(\vec{x}_i)$ denote the maximum matching similarity between the feature vector $\vec{x}_i$ and the templates of the j-th user in the database, represented by class $\omega_j$ $(j = 1, 2, \ldots, N)$.
The raw output scores of each sub-module can then be converted into generalized confidences; for the i-th classifier, the generalized confidence with which user X is identified as the user represented by class $\omega_j$ is denoted $g(\omega_j \mid \vec{x}_i)$.
For a biometric recognition sub-module whose raw output is a distance, its output scores $d_j(\vec{x}_i)$ are converted into generalized confidences by formula (5-1):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{d_j(\vec{x}_i)}{\min_{k \neq j} d_k(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (5\text{-}1)$$
For a biometric recognition sub-module whose raw output is a similarity, its output scores $s_j(\vec{x}_i)$ are converted into generalized confidences by formula (5-2):
$$g(\omega_j \mid \vec{x}_i) = 1 - \frac{\max_{k \neq j} s_k(\vec{x}_i)}{s_j(\vec{x}_i)}, \qquad k = 1, 2, \ldots, N \qquad (5\text{-}2)$$
Second, after all scores output by a given biometric recognition sub-module have been converted into generalized confidences, convert the generalized confidences into confidences by computing the following mapping function f(y) from generalized confidence to confidence:
Let T be the range of the generalized confidence obtained from the sub-module, and let user X belong to the sufficiently large training set A already built for this sub-module (X ∈ A);
let $y = g(\omega_j \mid \vec{x}_i)$, and for any $y \in T$ let $[y - \delta, y + \delta]$ be a small interval around y; then:
$$f(y) = \frac{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta] \ \text{and}\ X \in \omega_j\}\right)}{\sum_{j=1}^{N} \operatorname{count}\!\left(\{X \mid X \in A \ \text{and}\ g(\omega_j \mid \vec{x}_i) \in [y-\delta, y+\delta]\}\right)} \qquad (5\text{-}3)$$
In formula (5-3), the denominator of f(y) is the total number of samples whose generalized confidence falls in the small interval $[y-\delta, y+\delta]$; the numerator is the number of samples whose generalized confidence falls in that interval and that are correctly identified.
Once f(y) has been computed for every $y \in T$, generalized confidences can be converted into confidences. For the i-th classifier, the confidence with which user X is identified as the user represented by class $\omega_j$, which is also a posterior probability, is denoted $p(\omega_j \mid \vec{x}_i)$:
$$p(\omega_j \mid \vec{x}_i) = f\!\left(g(\omega_j \mid \vec{x}_i)\right).$$
6. The biometric-fusion identity authentication method according to claim 2 or the biometric-fusion identity identification and authentication method according to claim 3, characterized in that in step 35 of claim 2 or step 43 of claim 3 the threshold $\vec{y}_0$ is obtained from the projected class means by the following formula:
$$\vec{y}_0 = \frac{\tilde{\vec{m}}_0 + \tilde{\vec{m}}_1}{2}.$$
7. The biometric-fusion identity authentication method according to claim 2 or the biometric-fusion identity identification and authentication method according to claim 3, characterized in that in step 35 of claim 2 or step 43 of claim 3 the threshold $\vec{y}_0$ is obtained from the projected class means, weighted by the class sample counts, by the following formula:
$$\vec{y}_0 = \frac{N_0\,\tilde{\vec{m}}_0 + N_1\,\tilde{\vec{m}}_1}{N_0 + N_1}.$$
8. The biometric-fusion identity authentication method according to claim 2 or the biometric-fusion identity identification and authentication method according to claim 3, characterized in that in step 35 of claim 2 or step 43 of claim 3 the threshold $\vec{y}_0$ is obtained from the projected class means, weighted by the class standard deviations in the one-dimensional subspace, by the following formula:
$$\vec{y}_0 = \frac{\sigma_0\,\tilde{\vec{m}}_0 + \sigma_1\,\tilde{\vec{m}}_1}{\sigma_0 + \sigma_1}.$$