CN109389094B - Stable iris feature extraction and matching method - Google Patents

Info

Publication number
CN109389094B
CN109389094B (application CN201811240801.XA)
Authority
CN
China
Prior art keywords
iris
feature
characteristic
similarity
image
Prior art date
Legal status
Active
Application number
CN201811240801.XA
Other languages
Chinese (zh)
Other versions
CN109389094A (en)
Inventor
郭慧杰
韩一梁
杨昆
王超楠
杨倩倩
Current Assignee
Beijing Institute of Radio Metrology and Measurement
Original Assignee
Beijing Institute of Radio Metrology and Measurement
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Radio Metrology and Measurement filed Critical Beijing Institute of Radio Metrology and Measurement
Priority to CN201811240801.XA
Publication of CN109389094A
Application granted
Publication of CN109389094B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G06V40/197 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a stable iris feature extraction and matching method, comprising: defining custom principal-feature-preserving and alternating-phase detection operators and constructing an iris feature extraction convolution kernel; at iris registration, extracting features with the convolution kernel, training a strong classifier with sequence image feature similarity, and generating an enhanced iris feature template; at iris recognition, extracting features with the convolution kernel and generating stable sample features by multi-sample weighted feature space projection and vote-based feature fusion; when a new iris pattern feature template is added, adaptively updating the classification threshold according to risk prediction; and matching and classifying the iris feature template against the sample features with the updated threshold. The method improves the accuracy and robustness of an iris recognition system.

Description

Stable iris feature extraction and matching method
Technical Field
The invention relates to an iris mode classification method, in particular to a stable iris feature extraction and matching method.
Background
Iris recognition, with its notable accuracy, stability, security and non-contact operation, has become a key research direction and development trend in biometrics. However, the iris is very small and is easily disturbed by noise such as light spots, eyelids and eyelashes, and in long-range recognition scenarios the iris image deforms easily because users present themselves differently. How to acquire a sufficient number of stable, effective iris features and classify them accurately is therefore a key difficulty of iris recognition. Typical current iris feature extraction and matching methods have the following defects:
1. The extracted iris features are not effectively evaluated and screened, so unstable interference features remain among them, which increases the difficulty of iris classification and reduces recognition accuracy;
2. The correlation between the iris registration template and the recognition sample is not fully exploited: stable iris features are not reinforced and unstable ones are not suppressed, making it hard to raise recognition accuracy;
3. A fixed threshold is used for feature matching, which further aggravates the influence of unstable features, lowers the confidence of correct classification, and hinders improvement of recognition accuracy and robustness.
Therefore, it is desirable to provide a stable iris feature extraction and matching method, in which an enhanced feature template is formed by feature enhancement at an iris registration end, a stable feature sample is extracted by feature fusion at an iris recognition end, a classification threshold is adaptively determined according to risk prediction, and then feature matching and iris classification are performed, thereby effectively improving the accuracy and robustness of iris recognition.
Disclosure of Invention
The invention aims to provide a stable iris feature extraction and matching method, which solves the problem that the accuracy of iris recognition is seriously influenced by unstable features and a fixed threshold value, thereby effectively enhancing the accuracy and robustness of iris recognition.
In order to achieve the purpose, the invention adopts the following technical scheme:
a stable iris feature extraction and matching method is characterized by comprising the following steps:
step S1: self-defining a main feature holding operator and an alternating phase detection operator, and constructing an iris feature extraction convolution kernel.
Step S11: adaptively calculating and detecting the edge characteristic of the iris and inhibiting the smooth characteristic of the iris;
step S12: detecting the phase change of the edge of the iris in the angle direction according to the characteristic of the edge of the iris to generate stable iris characteristics;
step S2: iris registration, in which features are extracted with the iris feature extraction convolution kernel, a strong classifier is trained with sequence image feature similarity, and an enhanced iris feature template is generated.
Step S21: calculating to obtain a shift ratio and anti-iris rotation interference by using an iris image, iterating the feature maps one by one, and constructing a strong feature classifier;
step S22: forming a characteristic mask diagram by using the iterative characteristic diagram to generate an enhanced iris characteristic template;
step S3: iris recognition, in which features are extracted with the iris feature extraction convolution kernel and stable sample features are generated by multi-sample weighted feature space projection and vote-based feature fusion.
Step S31: calculating to obtain a weighted feature space by using the reference sample feature map;
step S32: selecting stable feature points according to the weighted feature space voting to generate a stable iris sample feature map;
step S4: and adding a new iris mode characteristic template, and adaptively updating the classification threshold according to risk prediction.
Step S41: in the iris feature template library, respectively calculating probability distribution of the intra-class iris feature similarity and the inter-class iris feature similarity under different classification thresholds to obtain an initial classification threshold;
step S42: updating the classification threshold value in a self-adaptive manner according to the newly registered iris mode class;
step S5: and matching and classifying the iris feature template and the sample features according to the updated classification threshold.
Preferably, a custom iris principal-feature-preserving operator F_m is used to detect the main edge features of the iris while suppressing its smooth features,

[equation image: definition of the operator F_m]

Let the normalized iris image be I(x, y), x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}, where w and h are the width and height of the image; the iris edge feature image is I_m(x, y),

[equation image: computation of I_m(x, y) from I(x, y) and F_m]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}. A custom iris alternating-phase detection operator F_p detects the phase change of the iris edge in the angular direction and extracts stable iris features,

F_p = [-1 -1 -1 2 2 2 -1 -1 -1].

Further preferably, the iris feature image is I_F(x, y),

[equation image: computation of I_F(x, y) from I_m(x, y) and F_p]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}. Let the iris feature extraction convolution kernel be C_F:

C_F = f(F_m, F_p)

so that

[equation image: expansion of C_F = f(F_m, F_p)]
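As a sketch of this step: the exact coefficients of F_m appear only in an equation image above, so the snippet below applies only the angular alternating-phase operator F_p (whose coefficients the text does give) to an already edge-filtered iris strip. The function name and the circular boundary handling along the angular axis are assumptions, not part of the patent text.

```python
import numpy as np

# Alternating-phase detection operator from the text; coefficients sum to zero,
# so a perfectly smooth (constant) region produces zero response.
FP = np.array([-1, -1, -1, 2, 2, 2, -1, -1, -1], dtype=float)

def angular_phase_response(strip):
    """Apply F_p along each row of a normalized iris strip.

    Rows of the normalized strip run along the angular direction, so a
    circular boundary is the natural choice here (an assumption; the
    patent does not state its boundary handling).
    """
    strip = np.asarray(strip, dtype=float)
    out = np.zeros_like(strip)
    half = len(FP) // 2
    for k, c in enumerate(FP):
        # np.roll implements the circular shift along the angular axis
        out += c * np.roll(strip, half - k, axis=1)
    return out
```

Because the operator's coefficients cancel, any constant strip maps to an all-zero response, which is the "suppress smooth features" behavior the text describes.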
Preferably, at iris registration, n = 5 normalized iris images I_i, i = 0, 1, …, n−1, meeting the quality requirement are acquired, and the I_i are sorted from high to low by their wavelet high-frequency coefficient spectrum weighted energy sum,

E_i = α|W_h|² + β|W_v|² + λ|W_d|²,

(W_a, W_h, W_v, W_d) = WT(I_i),

where WT denotes a wavelet transform, preferably a first-order wavelet decomposition with the two-dimensional discrete (5,3) wavelet basis; W_a, W_h, W_v and W_d are the wavelet low-frequency, horizontal high-frequency, vertical high-frequency and diagonal high-frequency coefficient spectra; E_i is the weighted energy sum; and α, β, λ are weighting coefficients, preferably α = 0.4, β = 0.4, λ = 0.2. Feature extraction is performed on each I_i to obtain the sequence feature images I_Fi, i = 0, 1, …, n−1,

[equation image: feature extraction producing I_Fi and its mask M_i]

where M_i is the corresponding feature mask image; a point with value 1 marks the corresponding position as a stable feature, and a point with value 0 marks it as an unstable feature contaminated by noise or other interference.
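The energy-based quality ranking can be sketched as follows. The patent prefers a (5,3) wavelet basis; for a self-contained sketch this uses a one-level Haar decomposition as a stand-in, with the preferred weights α = 0.4, β = 0.4, λ = 0.2. Function names are assumptions.

```python
import numpy as np

def haar_level1(img):
    """One-level 2-D Haar decomposition (a stand-in for the (5,3) basis)."""
    a = img[0::2, :] + img[1::2, :]       # vertical lowpass
    d = img[0::2, :] - img[1::2, :]       # vertical highpass
    Wa = (a[:, 0::2] + a[:, 1::2]) / 4.0  # low-frequency approximation
    Wh = (a[:, 0::2] - a[:, 1::2]) / 4.0  # horizontal high frequency
    Wv = (d[:, 0::2] + d[:, 1::2]) / 4.0  # vertical high frequency
    Wd = (d[:, 0::2] - d[:, 1::2]) / 4.0  # diagonal high frequency
    return Wa, Wh, Wv, Wd

def weighted_energy(img, alpha=0.4, beta=0.4, lam=0.2):
    """E_i = alpha*|Wh|^2 + beta*|Wv|^2 + lam*|Wd|^2, summed over coefficients."""
    _, Wh, Wv, Wd = haar_level1(np.asarray(img, dtype=float))
    return alpha * np.sum(Wh**2) + beta * np.sum(Wv**2) + lam * np.sum(Wd**2)

def rank_by_quality(images):
    """Sort images by weighted high-frequency energy, highest first."""
    return sorted(images, key=weighted_energy, reverse=True)
```

A richly textured iris strip carries more high-frequency energy than a defocused or flat one, so this ranking pushes the sharpest captures to the front of the sequence.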
Taking I_F0 as the reference feature image, the normalized similarity sim between each of the other feature images and the reference is computed,

[equation image: definition of the normalized similarity sim]

where sim ∈ [0.0, 1.0], ||·||_2 denotes the matrix 2-norm, and shift is the number of positions the feature maps are shifted against each other during comparison; preferably shift = −8, −7, …, 7, 8, where a positive sign denotes a right shift and a negative sign a left shift. Shifted comparison resists iris rotation interference, and the maximum similarity over all shifts is taken as the similarity of the two feature maps.
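The shift-and-compare step can be sketched as follows. The exact similarity formula is given only as an equation image, so this sketch assumes a plausible normalized form, sim = 1 − ||A − B|| / (||A|| + ||B||) with the Frobenius norm, and takes the maximum over the preferred shift range −8 … 8.

```python
import numpy as np

SHIFTS = range(-8, 9)  # preferred angular shift range from the text

def norm_sim(a, b):
    """Normalized similarity in [0, 1] (assumed form; the patent's exact
    formula is only available as an equation image)."""
    denom = np.linalg.norm(a) + np.linalg.norm(b)
    if denom == 0.0:
        return 1.0  # two empty feature maps are trivially identical
    return 1.0 - np.linalg.norm(a - b) / denom

def best_shift_similarity(ref, probe):
    """Maximum similarity over circular angular shifts of the probe map.

    Shifting along the column (angular) axis compensates for in-plane
    iris rotation between captures.
    """
    best = max(SHIFTS, key=lambda s: norm_sim(ref, np.roll(probe, s, axis=1)))
    return norm_sim(ref, np.roll(probe, best, axis=1)), best
```

If the probe is simply a rotated copy of the reference, the maximum over shifts recovers similarity 1.0 at the compensating shift.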
The feature maps are iterated one by one for j = 1 to n−1 to construct a strong feature classifier.

Further preferably,

M_0 = update(M_0, j),

[equation image: definition of the update(·) operation]

where j = 1, 2, …, n−1. When the similarity between the reference feature image and a comparison feature image is not equal to 1.0, the mask marks of the unmatched feature points are corrected by updating M_0, setting the corresponding point elements to 0, until the last comparison of the feature image iteration is completed and the similarity of all feature images equals 1.0, giving the feature mask image M_F:

M_F = update(M_0, n−1)

The generated enhanced iris feature template is I_T = {I_F0; M_F}.
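The mask-update iteration can be sketched as follows; the update(·) operation itself is given only as an equation image, so this sketch takes the direct reading of the surrounding text: zero every mask point where the reference and a comparison map disagree, across all n−1 comparisons. Names are assumptions.

```python
import numpy as np

def update_mask(mask, ref, probe):
    """Zero out mask points where reference and comparison feature maps
    disagree (direct reading of the update step; an assumption)."""
    return mask * (ref == probe).astype(mask.dtype)

def enhanced_template(feature_maps, mask0):
    """Iterate j = 1..n-1, keeping only feature points that remain
    consistent across the whole registration sequence."""
    ref = feature_maps[0]  # I_F0, the reference feature image
    m = mask0.copy()
    for probe in feature_maps[1:]:
        m = update_mask(m, ref, probe)
    return {"IF0": ref, "MF": m}  # enhanced template I_T = {I_F0; M_F}
```

Points that survive every comparison are exactly those stable across the sequence, which is what the enhanced template is meant to encode.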
Preferably, m = 8 normalized iris sample images P_i, i = 0, 1, …, m−1, meeting the quality requirement are captured. The P_i are ranked by quality score from high to low, and feature extraction is then performed on each P_i to obtain the multi-sample feature images P_Fi, i = 0, 1, …, m−1,

[equation image: feature extraction producing P_Fi and its mask L_i]

where L_i is the corresponding feature mask image; a point with value 1 marks the corresponding position as a stable feature, and a point with value 0 marks it as an unstable feature contaminated by noise or other interference.

Taking P_F0 as the reference sample feature map, the feature points of the other sample feature maps are aligned with the reference according to the image similarity res,

[equation image: definition of the alignment similarity res]

where j = 1, 2, …, m−1, ||·||_2 is the matrix 2-norm, and shift is the number of positions used to align the sample feature maps; preferably shift = −8, −7, …, 7, 8, where a positive sign denotes a right shift and a negative sign a left shift. Each L_j is shift-updated in the same way as its P_Fj.

The multi-sample feature maps P_Fi are projected into the weighted feature space W_F.
Further preferably,

[equation image: W_F as the weighted projection of the P_Fi]

where w_i are the feature weighting coefficients and m is the number of samples. Stable feature points are selected by weighted feature voting,

[equation image: voting rule producing L_F from W_F and σ_F]

where σ_F is the stable-feature discrimination threshold, preferably σ_F = 0.73, and L_F is the stable feature mask image; the generated stable iris sample feature map is P_S = {P_F0; L_F}.
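The projection-and-vote step can be sketched as follows. Since the projection formula is given only as an equation image, this sketch assumes W_F is a weighted sum of the aligned binary feature maps, with the weights normalized to sum to 1 and the vote thresholded at the preferred σ_F = 0.73.

```python
import numpy as np

SIGMA_F = 0.73  # stable-feature discrimination threshold from the text

def weighted_vote(feature_maps, weights):
    """Project binary feature maps into a weighted vote map W_F and keep
    points whose weighted agreement reaches sigma_F (assumed form)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the vote at each point lies in [0, 1]
    wf = sum(wi * fm for wi, fm in zip(w, feature_maps))
    lf = (wf >= SIGMA_F).astype(np.uint8)  # stable feature mask L_F
    return wf, lf
```

A point must be marked stable in samples carrying at least 73% of the total weight to survive, so noise that hits only a few low-weight samples is voted out.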
Preferably, for any two iris feature patterns I_A = {P_A; M_A} and I_B = {P_B; M_B}, the similarity ms between them is defined,

[equation image: definition of the similarity ms]

where ms ∈ [0.0, 1.0], P denotes the iris feature pattern, M denotes the feature pattern mask, and w, h are the width and height of the feature pattern, respectively.

In the iris feature template library, the probability distributions of intra-class and inter-class iris feature similarity are computed under different classification thresholds, and the initial similarity discrimination threshold is determined by risk prediction,

[equation image: risk-prediction criterion defining T_c]

where T_c is the initial classification threshold, D(Z_i | same) is the probability density of same-class iris feature similarity at classification threshold Z_i, D(Z_i | different) is the probability density of different-class iris feature similarity at Z_i, and μ is a risk parameter: μ > 1 means the false acceptance rate matters more in the decision, μ < 1 means the false rejection rate matters more, and μ = 1 weighs the two error rates equally; preferably, μ = 10³.
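The risk-prediction criterion is given only as an equation image; a common reading of such criteria is to pick the threshold minimizing the weighted risk μ·FAR(Z) + FRR(Z), sketched below over hypothetical similarity samples. The candidate grid and the empirical FAR/FRR estimates are assumptions.

```python
import numpy as np

def risk_threshold(same_sims, diff_sims, mu=1000.0, candidates=None):
    """Pick T_c minimizing mu*FAR(Z) + FRR(Z) over candidate thresholds.

    FAR(Z): fraction of different-class similarities >= Z (false accepts).
    FRR(Z): fraction of same-class similarities  <  Z (false rejects).
    mu = 10**3 weights false accepts heavily, matching the text's
    preferred risk parameter.
    """
    same = np.asarray(same_sims, dtype=float)
    diff = np.asarray(diff_sims, dtype=float)
    if candidates is None:
        candidates = np.linspace(0.0, 1.0, 101)

    def risk(z):
        far = np.mean(diff >= z)
        frr = np.mean(same < z)
        return mu * far + frr

    return min(candidates, key=risk)
```

With well-separated intra-class and inter-class similarity distributions, the chosen threshold falls in the gap between them, sitting close to the inter-class side because μ penalizes false accepts far more than false rejects.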
As the number of registered iris pattern classes increases, the classification threshold is adaptively updated in real time. The maximum similarity T_c′ between a newly added pattern I_new and all patterns in the feature template library is computed.

Further preferably,

[equation image: computation of T_c′]

where I_Oi denotes the original iris pattern classes in the feature template library. The classification threshold T is updated by comparison with the original classification threshold,

[equation image: update rule for T]
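The update rule itself is given only as an equation image; a plausible reading, used in the sketch below, is that the threshold is raised to T = max(T_c, T_c′) whenever a newly registered pattern is too similar to an existing class, so that no inter-class pair in the library can clear the accept threshold. This max rule is an assumption.

```python
def updated_threshold(t_c, new_vs_library_sims):
    """Adaptively update the classification threshold after registering
    a new iris pattern class.

    t_c: current (initial) classification threshold.
    new_vs_library_sims: similarities between the new pattern and every
    pattern already in the feature template library.
    Assumed rule: T = max(T_c, T_c'), where T_c' is the maximum of those
    similarities.
    """
    t_c_prime = max(new_vs_library_sims)
    return max(t_c, t_c_prime)
```

If the new pattern is comfortably dissimilar from everything in the library, the threshold is left unchanged; only a near-collision between classes forces it upward.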
preferably, an iris feature template I is calculatedT={IF0;MFAnd sample feature PS={PF0;LF-the degree of similarity between the two components,
Figure BDA0001839233530000072
the updated classification threshold is used for discrimination to complete iris feature matching classification,
Figure BDA0001839233530000073
According to this technical scheme, stable features are extracted and matched from three aspects: feature enhancement at iris registration, feature screening at iris recognition, and adaptive feature-similarity matching. An enhanced iris feature template and stable sample features are generated, and the iris feature classification threshold is adaptively updated, overcoming the interference and influence of unstable features and thereby improving the accuracy and robustness of the iris recognition system.
Drawings
Fig. 1 is a flow chart of a stable iris feature extraction and matching method.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A stable iris feature extraction and matching method comprises the following specific steps:
firstly, designing a main characteristic holding and alternating phase detection operator, and carrying out edge detection and characteristic extraction on the normalized iris image.
A custom iris principal-feature-preserving operator F_m detects the main edge features of the iris and suppresses its smooth features,

[equation image: definition of the operator F_m]

Let the normalized iris image be I(x, y), x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}, where w and h denote the width and height of the image; the iris edge feature image is I_m(x, y),

[equation image: computation of I_m(x, y) from I(x, y) and F_m]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}. A custom iris alternating-phase detection operator F_p detects the phase change of the iris edge in the angular direction and extracts stable iris features,

F_p = [-1 -1 -1 2 2 2 -1 -1 -1]

The iris feature image is I_F(x, y),

[equation image: computation of I_F(x, y) from I_m(x, y) and F_p]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}. Let the iris feature extraction convolution kernel be C_F:

C_F = f(F_m, F_p)

so that

[equation image: expansion of C_F = f(F_m, F_p)]
And secondly, constructing a strong feature classifier by using the sequence iris images at an iris registration end to generate an enhanced iris feature template.
At iris registration, n normalized iris images I_i, i = 0, 1, …, n−1, meeting the quality requirement are acquired; specifically, n = 5. The I_i are sorted from high to low by their wavelet high-frequency coefficient spectrum weighted energy sum,

E_i = α|W_h|² + β|W_v|² + λ|W_d|²,

(W_a, W_h, W_v, W_d) = WT(I_i),

where WT denotes a wavelet transform, and W_a, W_h, W_v and W_d are the wavelet low-frequency, horizontal high-frequency, vertical high-frequency and diagonal high-frequency coefficient spectra; specifically, a first-order wavelet decomposition with the two-dimensional discrete (5,3) wavelet basis is used. E_i is the weighted energy sum, and α, β, λ are weighting coefficients; specifically, α = 0.4, β = 0.4, λ = 0.2. Feature extraction is performed on each I_i to obtain the sequence feature images I_Fi, i = 0, 1, …, n−1,

[equation image: feature extraction producing I_Fi and its mask M_i]

where M_i is the corresponding feature mask image; a point with value 1 marks the corresponding position as a stable feature, and a point with value 0 marks it as an unstable feature contaminated by noise or other interference.

Taking I_F0 as the reference feature image, the normalized similarity sim between each of the other feature images and the reference is computed,

[equation image: definition of the normalized similarity sim]

where sim ∈ [0.0, 1.0], ||·||_2 denotes the matrix 2-norm, and shift is the number of positions the feature maps are shifted against each other during comparison, a positive sign denoting a right shift and a negative sign a left shift; specifically, shift = −8, −7, …, 7, 8. Shifted comparison resists iris rotation interference, and the maximum similarity over all shifts is taken as the similarity of the two feature maps.
The feature maps are iterated one by one for j = 1 to n−1 to construct a strong feature classifier,

M_0 = update(M_0, j),

[equation image: definition of the update(·) operation]

where j = 1, 2, …, n−1. When the similarity between the reference feature image and a comparison feature image is not equal to 1.0, the mask marks of the unmatched feature points are corrected by updating M_0, setting the corresponding point elements to 0, until the last comparison of the feature image iteration is completed and the similarity of all feature images equals 1.0, giving the feature mask image M_F:

M_F = update(M_0, n−1)

The generated enhanced iris feature template is I_T = {I_F0; M_F}.
And thirdly, at the iris recognition end, generating a stable iris sample characteristic diagram according to the characteristic mapping and fusion of the multiple samples.
During iris recognition, m normalized iris sample images P_i, i = 0, 1, …, m−1, meeting the quality requirement are captured; specifically, m = 8. The P_i are ranked by quality score from high to low, and feature extraction is then performed on each P_i to obtain the multi-sample feature images P_Fi, i = 0, 1, …, m−1,

[equation image: feature extraction producing P_Fi and its mask L_i]

where L_i is the corresponding feature mask image; a point with value 1 marks the corresponding position as a stable feature, and a point with value 0 marks it as an unstable feature contaminated by noise or other interference.

Taking P_F0 as the reference sample feature map, the feature points of the other sample feature maps are aligned with the reference according to the image similarity res,

[equation image: definition of the alignment similarity res]

where j = 1, 2, …, m−1, ||·||_2 is the matrix 2-norm, and shift is the number of positions used to align the sample feature maps, a positive sign denoting a right shift and a negative sign a left shift; specifically, shift = −8, −7, …, 7, 8. Each L_j is shift-updated in the same way as its P_Fj.
The multi-sample feature maps P_Fi are projected into the weighted feature space W_F,

[equation image: W_F as the weighted projection of the P_Fi]

where w_i are the feature weighting coefficients and m = 8 is the number of samples. Stable feature points are selected by weighted feature voting,

[equation image: voting rule producing L_F from W_F and σ_F]

where σ_F is the stable-feature discrimination threshold; specifically, σ_F = 0.73. L_F is the stable feature mask image, and the generated stable iris sample feature map is P_S = {P_F0; L_F}.
And fourthly, determining an iris feature similarity discrimination threshold through self-adaptive risk prediction.
For any two iris feature patterns I_A = {P_A; M_A} and I_B = {P_B; M_B}, the similarity ms between them is defined,

[equation image: definition of the similarity ms]

where ms ∈ [0.0, 1.0], P denotes the iris feature pattern, M denotes the feature pattern mask, and w, h are the width and height of the feature pattern, respectively.

In the iris feature template library, the probability distributions of intra-class and inter-class iris feature similarity are computed under different classification thresholds, and the initial similarity discrimination threshold is determined by risk prediction,

[equation image: risk-prediction criterion defining T_c]

where T_c is the initial classification threshold, D(Z_i | same) is the probability density of same-class iris feature similarity at classification threshold Z_i, D(Z_i | different) is the probability density of different-class iris feature similarity at Z_i, and μ is a risk parameter: μ > 1 means the false acceptance rate matters more in the decision, μ < 1 means the false rejection rate matters more, and μ = 1 weighs the two error rates equally; specifically, μ = 10³.

As the number of registered iris pattern classes increases, the classification threshold is adaptively updated in real time. The maximum similarity T_c′ between a newly added pattern I_new and all patterns in the feature template library is computed,

[equation image: computation of T_c′]

where I_Oi denotes the original iris pattern classes in the feature template library. The classification threshold T is updated by comparison with the original classification threshold,

[equation image: update rule for T]
and fifthly, carrying out iris feature matching classification according to the updated classification threshold.
The similarity between the iris feature template I_T = {I_F0; M_F} and the sample feature P_S = {P_F0; L_F} is computed,

[equation image: similarity between I_T and P_S]

and the updated classification threshold is used for discrimination to complete the iris feature matching classification,

[equation image: matching decision rule using the updated threshold T]
in summary, according to the technical scheme of the invention, an enhanced feature template is generated by constructing a sequence image feature strong classifier during iris registration, stable sample features are generated by multi-sample weighted feature space projection and voting during iris recognition, and the self-adaptively updated classification threshold is used for discrimination during feature matching, so that the influence of interference features is effectively eliminated, and the accuracy and robustness of an iris recognition system are improved.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. A stable iris feature extraction and matching method is characterized by comprising the following steps:
step S1: self-defining a main feature retention and alternating phase detection operator, and constructing an iris feature extraction convolution kernel, wherein the method comprises the following steps:
step S11: adaptively calculating and detecting the edge characteristic of the iris and inhibiting the smooth characteristic of the iris;
step S12: detecting the phase change of the edge of the iris in the angle direction according to the characteristic of the edge of the iris to generate stable iris characteristics;
step S2: iris registration, extracting features by using an iris feature extraction convolution kernel, training a strong classifier by using sequence image feature similarity, and generating an enhanced iris feature template, wherein the iris registration comprises the following steps:
step S21: calculating to obtain a shift ratio and anti-iris rotation interference by using an iris image, iterating the feature maps one by one, and constructing a strong feature classifier;
step S22: forming a characteristic mask diagram by using the iterative characteristic diagram to generate an enhanced iris characteristic template;
step S3: iris recognition, extracting features by using an iris feature extraction convolution kernel, and generating stable sample features by using multi-sample weighted feature space projection and voting fusion of features, wherein the iris recognition comprises:
step S31: calculating to obtain a weighted feature space by using the reference sample feature map;
step S32: selecting stable feature points according to the weighted feature space voting to generate a stable iris sample feature map;
step S4: adding a new iris mode feature template, and adaptively updating a classification threshold according to risk prediction, wherein the method comprises the following steps:
step S41: in the iris feature template library, respectively calculating probability distribution of the intra-class iris feature similarity and the inter-class iris feature similarity under different classification thresholds to obtain an initial classification threshold;
step S42: updating the classification threshold value in a self-adaptive manner according to the newly registered iris mode class;
step S5: and matching and classifying the iris feature template and the sample features according to the updated classification threshold.
2. The iris feature extraction and matching method according to claim 1, wherein said step S11 specifically includes:
a custom iris principal-feature-preserving operator F_m detects the main edge features of the iris and suppresses its smooth features,

[equation image: definition of the operator F_m]

let the normalized iris image be I(x, y), x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h}, where w and h are the width and height of the image; the iris edge feature image is I_m(x, y),

[equation image: computation of I_m(x, y) from I(x, y) and F_m]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h};

a custom iris alternating-phase detection operator F_p detects the phase change of the iris edge in the angular direction and extracts stable iris features,

F_p = [-1 -1 -1 2 2 2 -1 -1 -1].
3. the iris feature extraction and matching method according to claim 2, wherein said step S12 specifically includes:
the iris feature image is I_F(x, y),

[equation image: computation of I_F(x, y) from I_m(x, y) and F_p]

where x ∈ {1, 2, …, w}, y ∈ {1, 2, …, h};

let the iris feature extraction convolution kernel be C_F:

C_F = f(F_m, F_p)

then

[equation image: expansion of C_F = f(F_m, F_p)]
4. The iris feature extraction and matching method according to claim 1, wherein said step S21 specifically includes:
when iris registration is carried out, n normalized iris images I meeting quality requirements are collectediI is 0,1, …, n-1, and I is ordered from high to low according to the wavelet high frequency coefficient spectrum weighted energy sumi
Figure FDA0002950648700000031
Wherein WT represents a wavelet transform, Wa、Wh、WvAnd WdRespectively wavelet low frequency, horizontal high frequency, vertical high frequency and diagonal high frequency coefficient spectra, EiAlpha, beta and lambda are weighting coefficients;
user-defined iris principal characteristic holding operator FmComprises the following steps:
Figure FDA0002950648700000032
user-defined iris alternating phase detection operator FpIs composed of
Fp=[-1 -1 -1 2 2 2 -1 -1 -1]
Let the iris characteristic convolution kernel be CF
CF=f(Fm,Fp) To IiCarrying out feature extraction to obtain a sequence feature image IFi,i=0,1,…,n-1,
Figure FDA0002950648700000033
MiFor the feature mask image corresponding thereto, a point with a value of 1 indicates that the corresponding position is a stable feature, and a point with a value of 0 indicates that the corresponding position is an unstable feature, and has been contaminated by noise or other interference;
with IF0Calculating the normalization of other characteristic images and the reference characteristic image by taking the normalized characteristic image similarity as a criterionThe degree of similarity sim is calculated by taking the measured values,
Figure FDA0002950648700000034
where sim ∈ [0.0, 1.0], ||·||2 is the 2-norm of a matrix, and shift denotes the number of positions by which the compared feature map is shifted (a positive sign means a right shift, a negative sign a left shift); comparing under shifts resists iris-rotation interference and yields the maximum similarity between the two compared feature maps;
the feature maps are iterated one by one from j = 1 to n-1 to construct a strong feature classifier.
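The sim formula itself appears only as an image in the source, so the sketch below is an assumption: it uses a normalized-agreement measure maximized over circular column shifts, which matches the stated intent of resisting iris rotation (rotation of the eye appears as a horizontal shift in the normalized image).

```python
import numpy as np

def max_shift_similarity(ref, feat, max_shift=8):
    """Maximum normalized similarity of two binary feature maps over
    circular column shifts (positive = right shift, negative = left).

    The agreement-ratio measure is an assumption; the patent defines sim
    via a 2-norm expression shown only as an image. max_shift is an
    illustrative search range, not a value from the patent.
    """
    best = 0.0
    for s in range(-max_shift, max_shift + 1):
        best = max(best, float(np.mean(np.roll(feat, s, axis=1) == ref)))
    return best  # in [0.0, 1.0]; 1.0 means a perfect match at some shift
```

A feature map compared against a rotated copy of itself scores 1.0 at the compensating shift, which is the rotation tolerance the claim describes.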
5. The iris feature extraction and matching method according to claim 4, wherein said step S22 specifically includes:
Figure FDA0002950648700000035
where j = 1, 2, …, n-1;
when the similarity between the reference feature image and a compared feature image is not equal to 1.0, the mask marks of the unmatched feature points are corrected and M0 is updated by setting the corresponding elements to 0, until the iteration over the compared feature images finishes and the similarity of all feature images equals 1.0, giving the feature mask image MF,
MF=update(M0,n-1)
the generated enhanced iris feature template is IT = {IF0; MF}.
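The claim-5 update loop can be sketched as follows. This is a hypothetical implementation: it assumes the feature maps have already been shift-aligned as in claim 4, and demotes to 0 any mask point where a registration map disagrees with the reference IF0.

```python
import numpy as np

def build_enhanced_template(feature_maps, masks):
    """Sketch of the enhanced-template construction (assumed logic).

    feature_maps[0] is the reference I_F0; masks[0] is its mask M_0.
    Points where any other registration feature map disagrees with the
    reference are marked 0 (unstable); what survives every comparison
    becomes the enhanced-template mask M_F.
    """
    ref = feature_maps[0]
    m_f = masks[0].copy()
    for feat in feature_maps[1:]:
        m_f[feat != ref] = 0  # demote unmatched points to 'unstable'
    return {"I_F0": ref, "M_F": m_f}
```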
6. The iris feature extraction and matching method according to claim 1, wherein said step S31 specifically includes:
the self-defined iris principal-feature-preserving operator Fm is:
Figure FDA0002950648700000041
the self-defined iris alternating-phase detection operator Fp is
Fp=[-1 -1 -1 2 2 2 -1 -1 -1]
Let the iris characteristic convolution kernel be CF
CF=f(Fm,Fp)
during iris recognition, m normalized iris sample images Pi (i = 0, 1, …, m-1) meeting the quality requirement are captured and sorted from high to low by quality score; feature extraction is then performed on Pi to obtain the multi-sample feature images PFi, i = 0, 1, …, m-1,
Figure FDA0002950648700000042
Li is the corresponding feature mask image: a point with value 1 indicates that the corresponding position is a stable feature, and a point with value 0 indicates an unstable feature contaminated by noise or other interference;
taking PF0 as the reference sample feature map, the feature points of the other sample feature maps are aligned with the reference sample feature map according to the image similarity res,
Figure FDA0002950648700000051
where j = 1, 2, …, m-1, ||·||2 is the 2-norm of a matrix, and shift denotes the number of positions by which a sample feature map is shifted for alignment (a positive sign means a right shift, a negative sign a left shift); Lj is shifted and updated correspondingly according to the shift applied to PFj;
the multi-sample feature maps PFi are projected onto a weighted feature space WF.
7. The iris feature extraction and matching method according to claim 6, wherein said step S32 specifically includes:
Figure FDA0002950648700000052
where wi are the feature weighting coefficients and m is the number of samples;
selecting stable feature points according to the weighted feature votes,
Figure FDA0002950648700000053
where σF is the stable-feature discrimination threshold and LF is the stable feature mask image; the stable iris sample feature map is generated as PS = {PF0; LF}.
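The weighted-vote selection of claim 7 can be sketched as below. The weighting formula and threshold value are assumptions (the patent's expressions are shown only as images): each aligned binary sample map votes with weight wi, and a point enters the stable mask LF when its normalized weighted vote exceeds σF.

```python
import numpy as np

def stable_feature_mask(sample_maps, weights, sigma_f=0.5):
    """Hypothetical weighted voting over aligned binary sample maps.

    sample_maps: list of equally shaped 0/1 arrays P_Fi.
    weights: per-sample weighting coefficients w_i.
    sigma_f: assumed stable-feature discrimination threshold.
    """
    stack = np.stack(sample_maps).astype(float)
    w = np.asarray(weights, dtype=float)
    vote = np.tensordot(w, stack, axes=1) / w.sum()  # weighted space W_F
    return (vote > sigma_f).astype(np.uint8)          # stable mask L_F
```

With equal weights this reduces to majority voting: only feature points that agree across most samples survive into LF.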
8. The iris feature extraction and matching method according to claim 1, wherein said step S41 specifically includes:
the similarity ms between any two iris feature patterns IA = {PA; MA} and IB = {PB; MB} is defined as,
Figure FDA0002950648700000054
where ms ∈ [0.0, 1.0], P denotes an iris feature pattern, M denotes a feature pattern mask, and w and h are the width and height of the feature pattern respectively;
in the iris feature template library, the probability distributions of intra-class and inter-class iris feature similarity are computed under different classification thresholds, and an initial similarity discrimination threshold is determined by risk prediction,
Figure FDA0002950648700000061
where Tc is the initial classification threshold; D(Zi | same) is the probability density of the feature similarity of same-class irises at classification threshold Zi, and D(Zi | different) is the probability density of the feature similarity of different-class irises at classification threshold Zi; μ is a risk parameter: μ > 1 means the false acceptance rate is weighted more heavily in the decision, μ < 1 means the false rejection rate is weighted more heavily, and μ = 1 means the equal error rate is prioritized;
as the number of registered iris pattern classes increases, the classification threshold is adaptively updated in real time;
the maximum similarity Tc' between the newly added pattern Inew and all patterns in the feature template library is calculated.
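The risk-weighted threshold search of claim 8 can be sketched empirically. This is an assumption-laden illustration: the patent's densities D(Zi | same) and D(Zi | different) are shown only as an image, so empirical error rates over observed similarity scores stand in for them here.

```python
import numpy as np

def initial_threshold(intra_sims, inter_sims, mu=1.0, grid=101):
    """Scan candidate thresholds Z in [0, 1] and pick the one minimizing
    the mu-weighted empirical risk (hypothetical stand-in for the
    patent's density-based rule).

    mu > 1 penalizes false accepts more; mu < 1 penalizes false rejects.
    """
    zs = np.linspace(0.0, 1.0, grid)
    # false accept: an inter-class pair scoring at or above the threshold;
    # false reject: an intra-class pair scoring below it.
    risks = [mu * np.mean(np.asarray(inter_sims) >= z)
             + np.mean(np.asarray(intra_sims) < z) for z in zs]
    return float(zs[int(np.argmin(risks))])
```

When the intra-class and inter-class score distributions are well separated, any threshold in the gap gives zero empirical risk and the scan lands just above the highest impostor score.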
9. The iris feature extraction and matching method according to claim 8, wherein said step S42 specifically includes:
Figure FDA0002950648700000062
where IOi denotes an original iris pattern class in the feature template library;
the classification threshold T is updated by comparing Tc' with the original classification threshold,
Figure FDA0002950648700000063
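The update rule of claim 9 is shown only as an image, so the following one-line sketch rests on an explicit assumption: the conservative rule of keeping the classification threshold at least as high as the best similarity Tc' between the newly registered pattern and the existing library, so that the new registration cannot fall below the decision boundary of an existing class.

```python
def update_threshold(t_c, t_c_new):
    """Hypothetical claim-9 update: T = max(T_c, T_c').

    t_c: current classification threshold.
    t_c_new: maximum similarity of the newly added pattern against all
    patterns already in the template library. The max() rule is an
    assumption, not the patent's (image-only) formula.
    """
    return max(t_c, t_c_new)
```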
10. The iris feature extraction and matching method according to claim 1, wherein said step S5 specifically includes:
the similarity between the iris feature template IT = {IF0; MF} and the sample feature PS = {PF0; LF} is calculated,
Figure FDA0002950648700000064
where PF0 is the reference sample feature map, MF is the feature mask image, LF is the stable feature mask image, and IF0 is the reference feature image;
the updated classification threshold is used for discrimination to complete iris feature matching and classification,
Figure FDA0002950648700000065
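The final matching step can be sketched as below. The ms formula is shown only as an image, so the agreement-ratio measure is an assumption; what the sketch preserves from the claim is the structure: compare template and sample only where both the template mask MF and the stable sample mask LF mark stable points, then test against the updated threshold.

```python
import numpy as np

def match(template, sample, threshold):
    """Hypothetical claim-10 matcher.

    template: {"I_F0": binary map, "M_F": 0/1 mask} from registration.
    sample:   {"P_F0": binary map, "L_F": 0/1 mask} from recognition.
    Returns True when the agreement ratio over jointly stable points
    reaches the updated classification threshold.
    """
    valid = (template["M_F"] & sample["L_F"]).astype(bool)
    if not valid.any():
        return False  # no jointly stable points to compare
    ms = float(np.mean(template["I_F0"][valid] == sample["P_F0"][valid]))
    return ms >= threshold
```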
CN201811240801.XA 2018-10-23 2018-10-23 Stable iris feature extraction and matching method Active CN109389094B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811240801.XA CN109389094B (en) 2018-10-23 2018-10-23 Stable iris feature extraction and matching method

Publications (2)

Publication Number Publication Date
CN109389094A CN109389094A (en) 2019-02-26
CN109389094B true CN109389094B (en) 2021-04-16

Family

ID=65427784

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811240801.XA Active CN109389094B (en) 2018-10-23 2018-10-23 Stable iris feature extraction and matching method

Country Status (1)

Country Link
CN (1) CN109389094B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111539256B (en) * 2020-03-31 2023-12-01 北京万里红科技有限公司 Iris feature extraction method, iris feature extraction device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1760887A (en) * 2004-10-11 2006-04-19 Institute of Automation, Chinese Academy of Sciences Robust iris image feature extraction and recognition method
CN103577840A (en) * 2013-10-30 2014-02-12 Shantou University Item identification method
CN104766059A (en) * 2015-04-01 2015-07-08 Shanghai Jiao Tong University Rapid and accurate human-eye positioning method and gaze estimation method based on human-eye positioning
CN106778499A (en) * 2016-11-24 2017-05-31 Jiangsu University Method for quickly locating the human-eye iris during iris image capture
CN107330395A (en) * 2017-06-27 2017-11-07 China University of Mining and Technology Iris image encryption method based on a convolutional neural network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7689006B2 (en) * 2004-08-20 2010-03-30 The Research Foundation Of State University Of Ny Biometric convolution using multiple biometrics


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Image Compression Based on Compressed Sensing Theory and Wavelet Packet Analysis; Huijie Guo et al.; 2011 Cross Strait Quad-Regional Radio Science and Wireless Technology Conference; 2011-10-10; full text *
A Survey of Iris Recognition Technology; Han Yiliang et al.; 2015 National Defense Radio & Electrical Metrology and Testing Academic Exchange Conference; 2015-12-31; full text *


Similar Documents

Publication Publication Date Title
CN108038476B (en) A kind of facial expression recognition feature extracting method based on edge detection and SIFT
Zhang et al. Unconstrained salient object detection via proposal subset optimization
US7508961B2 (en) Method and system for face detection in digital images
Cheng et al. Total variation and sparsity regularized decomposition model with union dictionary for hyperspectral anomaly detection
CN110414299B (en) Monkey face affinity analysis method based on computer vision
CN107194393B (en) Method and device for detecting temporary license plate
CN106778742B (en) Car logo detection method based on Gabor filter background texture suppression
Cheng et al. Decomposition model with background dictionary learning for hyperspectral target detection
CN108446613A Pedestrian re-identification method based on distance centralization and projection vector learning
CN111259756A (en) Pedestrian re-identification method based on local high-frequency features and mixed metric learning
JP6871658B2 (en) Water area identification methods and equipment based on iterative classification
Huo et al. Learning relationship for very high resolution image change detection
Almaadeed et al. Partial shoeprint retrieval using multiple point-of-interest detectors and SIFT descriptors
CN105913069B (en) A kind of image-recognizing method
CN114782715B (en) Vein recognition method based on statistical information
CN109389094B (en) Stable iris feature extraction and matching method
Rushing et al. Image segmentation using association rule features
CN107564008A (en) Rapid SAR image segmentation method based on crucial pixel fuzzy clustering
CN113378620B (en) Cross-camera pedestrian re-identification method in surveillance video noise environment
CN112183504B (en) Video registration method and device based on non-contact palm vein image
CN115187791B (en) ORB image matching method integrating color and scale features
Shekar et al. Multi-Patches iris based person authentication system using particle swarm optimization and fuzzy C-means clustering
CN112183444B (en) Urban landscape classification optimization method, device, equipment and medium
Yu et al. Robust mean shift tracking based on refined appearance model and online update
CN108764349B (en) Feature fusion recognition method and device based on D-S evidence theory

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant