CN109325455A - Iris localization and feature extraction method and system - Google Patents

Iris localization and feature extraction method and system

Info

Publication number
CN109325455A
CN109325455A (application CN201811139801.0A); granted as CN109325455B
Authority
CN
China
Prior art keywords
iris
indicate
function
region
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811139801.0A
Other languages
Chinese (zh)
Other versions
CN109325455B (en)
Inventor
郭慧杰
韩梁
韩一梁
杨昆
王超楠
杨倩倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Radio Metrology and Measurement
Original Assignee
Beijing Institute of Radio Metrology and Measurement
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Radio Metrology and Measurement
Priority to CN201811139801.0A
Publication of CN109325455A
Application granted
Publication of CN109325455B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/197 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

An iris localization and feature extraction method and system are provided in the embodiments of the present application. The steps of the method comprise: S1, selecting an iris region segmentation threshold based on a pre-constructed local gray-level statistical distribution model; S2, screening the region containing the iris using a dual-threshold coupled classifier constructed from the iris region segmentation threshold, to obtain a pupil region image and an iris region image; S3, determining the effective iris region using an edge detector constructed from the pupil region image and the iris region image; S4, encoding the effective iris pixels of the effective iris region using a local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, to obtain the iris feature code. The scheme overcomes the influence of noise interference and unstable features in low-quality iris images, which helps improve the accuracy and robustness of iris recognition systems.

Description

Iris localization and feature extraction method and system
Technical field
The present application relates to the field of iris biometric recognition, and in particular to a method and system for accurate iris localization and stable feature extraction based on determining the effective iris region in combination with a local wavelet high-frequency energy pyramid transition model.
Background technique
Iris recognition, with its significant advantages of accuracy, stability, security, and contactless operation, has become a research focus and development trend in the field of biometrics. However, because the iris is small and susceptible to noise interference and user posture, the iris region in captured images is often contaminated or deformed. Solving accurate localization and stable feature extraction for low-quality iris images is therefore the key difficulty of iris recognition. Typical current iris feature extraction and matching methods have the following shortcomings:
1. They lack an adaptive iris localization fitting model. For low-quality images, iris localization is strongly affected by noise interference, and inaccurate localization makes stable iris feature extraction even more difficult;
2. They lack robust iris feature measurement operators. For low-quality images, feature extraction is strongly affected by iris deformation, stable iris features are hard to extract, and recognition accuracy drops sharply;
3. To avoid a drop in recognition accuracy, low-quality images are discarded and the current image is re-acquired, which severely reduces the efficiency of the iris recognition system.
Summary of the invention
To solve the above problems, the present application provides an iris localization and feature extraction method that addresses accurate localization and stable feature extraction for low-quality iris images, thereby effectively improving the accuracy and robustness of iris recognition.
According to a first aspect of the embodiments of the present application, an iris localization and feature extraction method is provided, comprising the steps of:
S1, selecting an iris region segmentation threshold based on a pre-constructed local gray-level statistical distribution model;
S2, screening the region containing the iris using a dual-threshold coupled classifier constructed from the iris region segmentation threshold, to obtain a pupil region image and an iris region image;
S3, determining the effective iris region using an edge detector constructed from the pupil region image and the iris region image;
S4, encoding the effective iris pixels of the effective iris region using a local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, to obtain the iris feature code.
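Steps S1 to S4 can be sketched as a toy pipeline on a small 8-bit gray image. Everything below is an illustrative stand-in rather than the patent's actual operators: the thresholds come from simple quantiles rather than the local statistical model, the S3 stand-in merely keeps pupil-mask boundary pixels, and S4 emits one bit per effective pixel against the region mean.

```python
def s1_select_thresholds(img):
    """S1 stand-in: pick low/high segmentation thresholds from the gray distribution."""
    flat = sorted(p for row in img for p in row)
    return flat[len(flat) // 8], flat[-1 - len(flat) // 8]

def s2_screen(img, tl, th):
    """S2 stand-in: dual-threshold screening into a dark (pupil) mask and a bright (spot) mask."""
    pupil = [[int(p <= tl) for p in row] for row in img]
    spot = [[int(p >= th) for p in row] for row in img]
    return pupil, spot

def s3_effective_pixels(pupil):
    """S3 stand-in: keep mask pixels that lie on the mask boundary."""
    h, w = len(pupil), len(pupil[0])
    keep = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if pupil[y][x] and any(
                not (0 <= y + dy < h and 0 <= x + dx < w) or not pupil[y + dy][x + dx]
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
            ):
                keep[y][x] = 1
    return keep

def s4_encode(img, region):
    """S4 stand-in: one bit per effective pixel (above/below the region mean)."""
    vals = [img[y][x] for y in range(len(img)) for x in range(len(img[0])) if region[y][x]]
    m = sum(vals) / len(vals)
    return [int(v >= m) for v in vals]

img = [[10, 12, 200, 11],
       [13, 50, 60, 14],
       [12, 55, 250, 10],
       [11, 13, 12, 15]]
tl, th = s1_select_thresholds(img)
pupil, spot = s2_screen(img, tl, th)
code = s4_encode(img, s3_effective_pixels(pupil))
```

The point of the sketch is only the data flow: thresholds feed the screening, the screened masks feed region determination, and the region feeds the bit-level encoding.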
According to a second aspect of the embodiments of the present application, an iris localization and feature extraction system is provided, comprising:
a threshold selection module, which selects an iris region segmentation threshold based on a pre-constructed local gray-level statistical distribution model;
an image screening module, which screens the region containing the iris using a dual-threshold coupled classifier constructed from the iris region segmentation threshold, to obtain a pupil region image and an iris region image;
an effective region determination module, which determines the effective iris region using an edge detector constructed from the pupil region image and the iris region image;
a feature extraction module, which encodes the effective iris pixels of the effective iris region using a local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, to obtain the iris feature code.
The technical solution described herein adaptively selects the iris region segmentation threshold according to the local gray-level statistical distribution of the specific iris image, then accurately screens and localizes the effective iris region by combining a dual-threshold coupled classifier with carefully designed iris boundary detectors, and finally extracts and encodes stable iris features by constructing a local wavelet high-frequency energy pyramid transition model. The scheme overcomes the influence of noise interference and unstable features in low-quality iris images, which helps improve the accuracy and robustness of iris recognition systems.
Detailed description of the invention
The drawings described herein provide a further understanding of the present application and constitute a part of it; the illustrative embodiments and their description explain the application and do not unduly limit it. In the drawings:
Fig. 1 shows a schematic diagram of the iris localization and feature extraction method described herein.
Specific embodiment
To make the technical solutions and advantages of the embodiments of the present application clearer, exemplary embodiments are described in more detail below with reference to the accompanying drawings. The described embodiments are obviously only a part of the embodiments of the application, not an exhaustive list. It should be noted that, where no conflict arises, the embodiments of the application and the features in the embodiments may be combined with one another.
The core idea of the scheme is to select an iris region segmentation threshold from the iris image, determine the effective iris region through the constructed classifier and edge detectors, and finally extract and encode stable iris features through the constructed local wavelet high-frequency energy pyramid transition model. This method effectively enhances the adaptability of iris localization and the robustness of feature extraction for low-quality images, thereby improving the accuracy and efficiency of iris recognition systems.
The scheme discloses an accurate iris localization and stable feature extraction method that overcomes the influence of noise interference and unstable features in low-quality iris images, helping to improve the accuracy and robustness of iris recognition systems. The scheme is described in detail below through a specific set of examples. The steps of the method are as follows:
Step 1: adaptively select the iris region segmentation threshold by constructing a local gray-level statistical distribution model.
To achieve adaptive iris region segmentation, the segmentation thresholds must first be determined adaptively, including a high gray threshold for spot detection and a low gray threshold for pupil detection.
Let the local gray-level mean statistical operator be
where repmat denotes the two-dimensional expansion function and num is the row/column expansion count. When num = H, the high-gray-threshold local gray-level statistical operator FM(H) is obtained, with statistics step size SH; when num = L, the low-gray-threshold local gray-level statistical operator FM(L) is obtained, with statistics step size SL. Specifically, H = 7, L = 11, SH = 5, and SL = 5 are taken.
Let the iris image be IR of M × N pixel resolution. The high-gray-threshold local gray-level statistical model is
where hist denotes the gray-level distribution statistics function, ⊗ denotes the convolution operation, and SH denotes the sliding step size. The low-gray-threshold local gray-level statistical model is
where hist denotes the gray-level distribution statistics function, ⊗ denotes the convolution operation, and SL denotes the sliding step size.
STH and STL are the high- and low-gray-threshold local gray-level statistical distribution sequences, respectively. The median of the h highest gray values at the tail of the STH distribution is taken as the high gray threshold
TH = median(gs(STH(end-h+1:end)))    (4)
where median denotes the median function and gs denotes the gray-value lookup function; specifically, h = 7 is taken. The median of the l lowest gray values at the head of the STL distribution is taken as the low gray threshold
TL = median(gs(STL(1:l)))    (5)
where median denotes the median function and gs denotes the gray-value lookup function; specifically, l = 9 is taken. This yields the high gray threshold TH and the low gray threshold TL used for iris region segmentation.
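Equations (4) and (5) select each threshold as the median of the extreme tail of the gray statistics. A minimal sketch of that rule, with a synthetic gray-value sequence standing in for the statistical distributions STH/STL, and smaller tail sizes than the patent's h = 7 and l = 9:

```python
from statistics import median

def high_threshold(gray_values, h):
    """TH = median of the h highest observed gray values (eq. 4)."""
    return median(sorted(gray_values)[-h:])

def low_threshold(gray_values, l):
    """TL = median of the l lowest observed gray values (eq. 5)."""
    return median(sorted(gray_values)[:l])

# Synthetic local gray statistics: a dark pupil cluster, mid-gray iris
# texture, and a few bright specular spots.
gray = [12, 15, 11, 90, 100, 110, 95, 250, 245, 252, 14, 105, 13]
TH = high_threshold(gray, h=3)   # median of {245, 250, 252}
TL = low_threshold(gray, l=5)    # median of {11, 12, 13, 14, 15}
```

Because both thresholds are medians of extreme values, a single stray outlier in either tail does not move them far, which is consistent with the stated goal of robustness on low-quality images.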
Step 2: screen out the region containing the iris using the dual-threshold coupled classifier.
The dual-threshold coupled classifier is constructed from the high and low gray thresholds for iris region segmentation:
IS = repmat(isinner(TL, TH), S)    (6)
where repmat denotes the two-dimensional expansion function and S denotes the row/column size of the classifier IS; specifically, S = 21 is taken. isinner denotes the gray-interval relational operator: it returns 0 when the current gray value lies in the open interval (TL, TH), −1 when it lies in [0, TL], and 1 when it lies in [TH, 255].
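The isinner operator in equation (6) is a three-way interval test on a gray value. A minimal sketch of just that interval logic (the repmat tiling into the S × S classifier is omitted):

```python
def isinner(g, t_low, t_high):
    """Gray-interval relational operator from eq. (6):
    -1 for gray in [0, TL], 1 for gray in [TH, 255], 0 strictly inside (TL, TH)."""
    if g <= t_low:
        return -1
    if g >= t_high:
        return 1
    return 0

# Thresholds are illustrative values, not derived from a real image.
TL, TH = 13, 250
labels = [isinner(g, TL, TH) for g in (0, 13, 14, 128, 249, 250, 255)]
```

The three labels let the subsequent convolution counts separate dark pupil pixels (−1), bright specular spots (1), and candidate iris texture (0) in one pass.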
The region containing the pupil is segmented from the iris image using the dual-threshold coupled classifier IS:
where numel denotes the nonzero-element counting function, find denotes the relational matching function, ⊗ denotes the convolution operation, and nh and nl denote the high- and low-gray detection count thresholds, respectively; specifically, nh = 20 and nl = 240 are taken. Ω denotes the pupil region detected in the iris image IR, and IP is the pupil region image block.
The region containing the iris is then expanded from IP:
where size denotes the function returning the two-dimensional size of an image matrix, getcore denotes the function extracting the central pixel block of an image, Φ denotes the region containing the iris segmented from the iris image IR, IB is the iris region image block, and m and n denote the numbers of row and column pixels of IB, respectively; specifically, m = 240 and n = 320 are taken.
Step 3: accurately localize the effective iris region by constructing inner and outer iris boundary detectors.
Inner and outer iris boundary detectors are constructed on the segmented pupil region image block IP and iris region image block IB, respectively, and the iris boundaries are accurately localized to obtain the effective iris region.
The iris region inner boundary detector is constructed as
where rot45 denotes the counterclockwise rotation function and α denotes the detection direction weight vector; specifically, α = [3/8, 1/8, 3/8, 1/8] is taken, and EI is the iris inner boundary detector. The iris region outer boundary detector is constructed as
where rot45 denotes the counterclockwise rotation function and β denotes the detection direction weight vector; specifically, β = [3/8, 1/8, 3/8, 1/8] is taken, and EO is the iris outer boundary detector.
The iris inner boundary is localized as
where sum denotes the summation function, ⊗ denotes the convolution operation, and ep denotes the gray-gradient transition threshold of the inner boundary neighborhood; specifically, ep = 32 is taken. ω is an iris inner boundary pixel and EP is the iris inner boundary. The iris outer boundary is localized as
where sum denotes the summation function, ⊗ denotes the convolution operation, and er denotes the gray-gradient transition threshold of the outer boundary neighborhood; specifically, er = 128 is taken. ψ is an iris outer boundary pixel and ER is the iris outer boundary.
The effective iris region is therefore IC = (EP ∪ ER) ∩ IB.
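The closed form IC = (EP ∪ ER) ∩ IB is a plain union and intersection of pixel sets: boundary pixels from either detector are kept only where they fall inside the iris image block. A toy illustration with invented (row, column) coordinates:

```python
# Invented pixel coordinates for illustration only.
EP = {(3, 4), (3, 5), (4, 4)}                       # inner-boundary pixels
ER = {(0, 9), (1, 9), (3, 5)}                       # outer-boundary pixels
IB = {(r, c) for r in range(4) for c in range(10)}  # iris region image block support
IC = (EP | ER) & IB                                 # effective iris region
```

Here the pixel (4, 4) from EP is discarded because it lies outside the support of IB, while boundary pixels shared by both detectors are counted once.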
Step 4: extract stable features of the effective iris region by constructing the local wavelet high-frequency energy pyramid transition model, and encode them.
The effective iris region IC is transformed into an effective iris pixel block RB by radial and arc-wise sampling. If the radial and arc-wise sampling counts are r and c, respectively, then the row and column pixel resolutions of RB are r and c; specifically, r = 24 and c = 128 are taken.
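The radial and arc-wise sampling that turns the annular region IC into an r × c rectangle is, in effect, a polar unwrapping between the inner and outer boundaries. A minimal nearest-neighbor sketch, assuming for simplicity that the two boundaries are concentric circles (the patent does not require this; the center, radii, and toy image below are invented):

```python
import math

def unwrap(img, cx, cy, r_in, r_out, r_samples, c_samples):
    """Sample the annulus between radii r_in and r_out into an
    r_samples x c_samples block: rows step radially, columns step arc-wise."""
    block = []
    for i in range(r_samples):
        rad = r_in + (r_out - r_in) * i / (r_samples - 1)
        row = []
        for j in range(c_samples):
            ang = 2 * math.pi * j / c_samples
            x = int(round(cx + rad * math.cos(ang)))
            y = int(round(cy + rad * math.sin(ang)))
            row.append(img[y][x])  # nearest-neighbor lookup
        block.append(row)
    return block

# Toy 17x17 image whose value encodes the distance band from the center (8, 8),
# so each unwrapped row should be nearly constant.
img = [[min(int(math.hypot(x - 8, y - 8)), 7) for x in range(17)] for y in range(17)]
rb = unwrap(img, cx=8, cy=8, r_in=2, r_out=6, r_samples=3, c_samples=8)
```

With the patent's r = 24 and c = 128, the same loop would produce the 24 × 128 effective iris pixel block RB.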
RB is partitioned into blocks at the three resolution scales lv × lv, 2lv × 2lv, and 4lv × 4lv (specifically, lv = 6 is taken), and 1-, 2-, and 3-level wavelet decompositions are then performed in the sub-blocks of the respective resolution scales, constructing the local wavelet high-frequency energy pyramid
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, and j denotes the level of the wavelet high-frequency sub-band.
The local wavelet high-frequency energy pyramid transition model is constructed from ET:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, j denotes the level of the wavelet high-frequency sub-band, and isgreater denotes the coefficient energy relational operator: it returns 0 when the energy of the current wavelet high-frequency coefficient is less than the mean energy of that sub-band's coefficients, and 1 otherwise.
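The isgreater step binarizes each high-frequency coefficient's energy against the mean energy of its sub-band. A one-level sketch of that idea on a 4 × 4 block, with a hand-rolled Haar transform standing in for the patent's dbi wavelet basis (purely illustrative):

```python
def haar_highpass_hh(block):
    """One-level 2-D Haar analysis of a 2N x 2N block: return the diagonal
    (HH) detail coefficient of each non-overlapping 2x2 cell."""
    n = len(block) // 2
    hh = []
    for by in range(n):
        row = []
        for bx in range(n):
            a = block[2 * by][2 * bx]
            b = block[2 * by][2 * bx + 1]
            c = block[2 * by + 1][2 * bx]
            d = block[2 * by + 1][2 * bx + 1]
            row.append((a - b - c + d) / 2)  # unnormalized Haar HH coefficient
        hh.append(row)
    return hh

def binarize_energy(coeffs):
    """isgreater analogue: 1 where a coefficient's squared energy reaches
    the sub-band's mean energy, else 0."""
    flat = [v * v for row in coeffs for v in row]
    m = sum(flat) / len(flat)
    return [int(e >= m) for e in flat]

# Smooth cells produce zero detail energy; the diagonal-texture cell dominates.
block = [[8, 8, 1, 9],
         [8, 8, 9, 1],
         [2, 2, 5, 5],
         [2, 2, 5, 5]]
hh = haar_highpass_hh(block)
bits = binarize_energy(hh)
```

Only the textured cell crosses the sub-band mean, so the resulting bit pattern marks where local high-frequency texture is strong, which is the information the feature code retains.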
RB is encoded using EM to obtain the iris feature code array RC:
where map denotes the coordinate mapping function from RC to EM and (x, y) denotes a coordinate pair of RC. Specifically, the row and column size of RC is 3 × 2736, and the size of the feature code template is 1 kB.
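The quoted template size is consistent with the code dimensions: since isgreater emits one bit per entry, a 3 × 2736 code array holds 8208 bits, i.e. 1026 bytes, which matches the stated 1 kB to rounding. A quick check:

```python
rows, cols = 3, 2736      # stated row/column size of the feature code array RC
bits = rows * cols        # one bit per code entry
nbytes = bits // 8        # 1026 bytes, i.e. about 1 kB
```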
To support the implementation of the above method, the scheme further provides an iris localization and feature extraction system comprising a threshold selection module, an image screening module, an effective region determination module, and a feature extraction module. The threshold selection module selects the iris region segmentation threshold using the pre-constructed local gray-level statistical distribution model; the image screening module screens the region containing the iris using the dual-threshold coupled classifier constructed from the segmentation threshold, obtaining the pupil region image and the iris region image; the effective region determination module determines the effective iris region using the edge detectors constructed from the pupil region image and the iris region image; and the feature extraction module encodes the effective iris pixels of the effective iris region using the local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, obtaining the iris feature code.
In the scheme, the threshold selection module comprises a model construction unit and a threshold determination unit. The model construction unit obtains the high-gray-threshold and low-gray-threshold local gray-level statistical models from the configured local gray-level mean statistical operator and the pixel resolution of the iris image; the threshold determination unit uses the high-gray-threshold local gray-level statistical model STH and the low-gray-threshold local gray-level statistical model STL to determine, respectively:
the high gray threshold for spot detection, TH = median(gs(STH(end-h+1:end))), where median denotes the median function, gs denotes the gray-value lookup function, h is the number of highest gray values at the tail of STH, and end is the index of the last element;
the low gray threshold for pupil detection, TL = median(gs(STL(1:l))), where median denotes the median function, gs denotes the gray-value lookup function, and l is the number of lowest gray values at the head of STL.
In the scheme, the image screening module comprises a classifier construction unit, a segmentation unit, and an expansion unit. The classifier construction unit constructs the dual-threshold coupled classifier IS = repmat(isinner(TL, TH), S) from the high and low gray thresholds for iris region segmentation, where repmat denotes the two-dimensional expansion function, S denotes the row/column size of the classifier IS, and isinner denotes the gray-interval relational operator. The segmentation unit segments the region containing the pupil from the iris image using the dual-threshold coupled classifier:
where numel denotes the nonzero-element counting function, find denotes the relational matching function, ⊗ denotes the convolution operation, nh and nl denote the high- and low-gray detection count thresholds, respectively, Ω denotes the pupil region detected in the iris image IR, and IP is the pupil region image block.
The expansion unit expands the region containing the iris using the pupil region image block IP:
where size denotes the function returning the two-dimensional size of an image matrix, getcore denotes the function extracting the central pixel block of an image, Φ denotes the region containing the iris segmented from the iris image IR, IB is the iris region image block, and m and n denote the numbers of row and column pixels of IB, respectively.
In the scheme, the effective region determination module comprises a boundary detector construction unit and a localization unit.
The boundary detector construction unit constructs the inner and outer iris boundary detectors on the pupil region image block IP and the iris region image block IB, respectively:
the iris region inner boundary detector, where rot45 denotes the counterclockwise rotation function, α denotes the detection direction weight vector, and EI is the iris inner boundary detector;
the iris region outer boundary detector, where rot45 denotes the counterclockwise rotation function, β denotes the detection direction weight vector, and EO is the iris outer boundary detector.
The localization unit localizes, based on the inner and outer boundary detectors, the iris inner boundary EP and the iris outer boundary ER, where sum denotes the summation function, ⊗ denotes the convolution operation, ep denotes the gray-gradient transition threshold of the inner boundary neighborhood, ω is an iris inner boundary pixel, er denotes the gray-gradient transition threshold of the outer boundary neighborhood, and ψ is an iris outer boundary pixel.
From the inner and outer iris boundaries, the effective iris region is determined as IC = (EP ∪ ER) ∩ IB.
In the scheme, the feature extraction module comprises a transformation unit, an energy pyramid construction unit, a transition model construction unit, and a coding unit. The transformation unit transforms the effective iris region IC into the effective iris pixel block RB by radial and arc-wise sampling. The energy pyramid construction unit partitions RB into blocks at the three resolution scales lv × lv, 2lv × 2lv, and 4lv × 4lv, then performs 1-, 2-, and 3-level wavelet decompositions in the sub-blocks of the respective resolution scales, constructing the local wavelet high-frequency energy pyramid:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, and j denotes the level of the wavelet high-frequency sub-band.
The transition model construction unit constructs the local wavelet high-frequency energy pyramid transition model from the local wavelet high-frequency energy pyramid:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, j denotes the level of the wavelet high-frequency sub-band, and isgreater denotes the coefficient energy relational operator; the information fusion accuracy is determined using the maximum fusion error.
The coding unit encodes the effective iris pixel block RB using the local wavelet high-frequency energy pyramid transition model to obtain the iris feature code array RC.
The method of the scheme may also be stored in an electronic device. The electronic device comprises a memory and one or more processors connected to the memory by a communication bus; the processors are configured to execute the instructions in the memory, and the storage medium stores instructions for executing each step of the iris localization and feature extraction method.
The method of the scheme may also be stored as a program in a computer-readable storage medium, the program implementing the steps of the iris localization and feature extraction method when executed by a processor.
The present application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are executed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are only embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention is included within the scope of the pending claims of the present invention.

Claims (10)

1. An iris localization and feature extraction method, characterized in that the method comprises the steps of:
S1, selecting an iris region segmentation threshold based on a pre-constructed local gray-level statistical distribution model;
S2, screening the region containing the iris using a dual-threshold coupled classifier constructed from the iris region segmentation threshold, to obtain a pupil region image and an iris region image;
S3, determining the effective iris region using an edge detector constructed from the pupil region image and the iris region image;
S4, encoding the effective iris pixels of the effective iris region using a local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, to obtain the iris feature code.
2. The method according to claim 1, characterized in that step S1 comprises:
obtaining the high-gray-threshold and low-gray-threshold local gray-level statistical models from the configured local gray-level mean statistical operator and the pixel resolution of the iris image, respectively;
using the high-gray-threshold local gray-level statistical model STH and the low-gray-threshold local gray-level statistical model STL to determine, respectively:
the high gray threshold for spot detection, TH = median(gs(STH(end-h+1:end))), where median denotes the median function, gs denotes the gray-value lookup function, h is the number of highest gray values at the tail of STH, and end is the index of the last element;
the low gray threshold for pupil detection, TL = median(gs(STL(1:l))), where median denotes the median function, gs denotes the gray-value lookup function, and l is the number of lowest gray values at the head of STL.
3. The method according to claim 2, characterized in that step S2 comprises:
constructing the dual-threshold coupled classifier IS = repmat(isinner(TL, TH), S) from the high and low gray thresholds for iris region segmentation, where repmat denotes the two-dimensional expansion function, S denotes the row/column size of the classifier IS, and isinner denotes the gray-interval relational operator;
segmenting the region containing the pupil from the iris image using the dual-threshold coupled classifier:
where numel denotes the nonzero-element counting function, find denotes the relational matching function, ⊗ denotes the convolution operation, nh and nl denote the high- and low-gray detection count thresholds, respectively, Ω denotes the pupil region detected in the iris image IR, and IP is the pupil region image block;
expanding the region containing the iris using the pupil region image block IP:
where size denotes the function returning the two-dimensional size of an image matrix, getcore denotes the function extracting the central pixel block of an image, Φ denotes the region containing the iris segmented from the iris image IR, IB is the iris region image block, and m and n denote the numbers of row and column pixels of IB, respectively.
4. The method according to claim 3, wherein the step S3 comprises:
In the pupil region image block IP and the iris region image block IB, constructing the inner and outer iris boundary detectors respectively;
The iris region inner boundary detector, wherein rot45 denotes the counterclockwise rotation function, α denotes the detection-direction weight vector, and EI is the iris inner boundary detector;
The iris region outer boundary detector, wherein rot45 denotes the counterclockwise rotation function, β denotes the detection-direction weight vector, and EO is the iris outer boundary detector;
Based on the iris region inner and outer boundary detectors, locating the iris inner boundary and the iris outer boundary, where sum denotes the summation function, ⊗ denotes the convolution operation, ep denotes the gray-gradient transition threshold of the iris inner-boundary neighborhood, ω is an iris inner-boundary pixel, EP is the iris inner boundary, er denotes the gray-gradient transition threshold of the iris outer-boundary neighborhood, ψ is an iris outer-boundary pixel, and ER is the iris outer boundary;
According to the iris inner and outer boundaries, determining the effective iris region as: IC = (EP ∪ ER) ∩ IB.
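The gray-gradient transition test used to locate the boundaries can be sketched in one dimension. This is a simplification, not the claimed detector: the patent convolves directional detectors over the 2-D image, whereas the hypothetical helper below merely flags positions along a radial gray-level profile where the neighbor-to-neighbor transition exceeds the threshold ep.

```python
def locate_boundary(profile, ep):
    """Return indices i where the absolute gray-level change between
    profile[i-1] and profile[i] meets or exceeds the transition
    threshold ep -- a 1-D stand-in for the neighborhood gray-gradient
    transition test of the boundary detectors."""
    return [i for i in range(1, len(profile))
            if abs(profile[i] - profile[i - 1]) >= ep]
```

On a radial profile crossing pupil (dark), iris (mid-gray), and sclera (bright), the two flagged indices correspond to the inner and outer boundary crossings.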
5. The method according to claim 4, wherein the step S4 comprises:
Transforming the effective iris region IC into an effective iris pixel block RB by radial and arc-wise sampling;
Partitioning RB into blocks at three resolution scales, lv × lv, 2lv × 2lv, and 4lv × 4lv, and then performing 1-, 2-, and 3-level wavelet decomposition in the sub-blocks of the respective resolution scales, to construct a local wavelet high-frequency energy pyramid:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, and j denotes the level of the wavelet high-frequency sub-band;
Constructing a local wavelet high-frequency energy pyramid transition model from the local wavelet high-frequency energy pyramid:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, j denotes the level of the wavelet high-frequency sub-band, and isgreater denotes the coefficient-energy relational operator;
Encoding the effective iris pixel block RB using the local wavelet high-frequency energy pyramid transition model, to obtain the iris feature encoding array RC.
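The energy-pyramid step can be sketched with a one-level 2-D Haar transform in place of the patent's dbi wavelet and multi-level decomposition. All names below (`haar_hf_energy`, `encode_bit`) are hypothetical; the sketch only shows the shape of the computation: square the high-frequency sub-band coefficients of a block, then compare energies between blocks with a greater-than relation (the isgreater operator) to produce code bits.

```python
def haar_hf_energy(block):
    """One-level 2-D Haar decomposition of a square block with even
    side length; returns the summed squared high-frequency coefficients
    (horizontal, vertical, and diagonal detail sub-bands)."""
    n = len(block)
    energy = 0.0
    for r in range(0, n, 2):
        for c in range(0, n, 2):
            a, b = block[r][c], block[r][c + 1]
            d, e = block[r + 1][c], block[r + 1][c + 1]
            lh = (a + b - d - e) / 2.0  # horizontal detail
            hl = (a - b + d - e) / 2.0  # vertical detail
            hh = (a - b - d + e) / 2.0  # diagonal detail
            energy += lh * lh + hl * hl + hh * hh
    return energy

def encode_bit(block_a, block_b):
    """Compare high-frequency energies of two neighboring sub-blocks
    (a stand-in for the pyramid transition's isgreater relation),
    yielding one bit of the feature code."""
    return 1 if haar_hf_energy(block_a) > haar_hf_energy(block_b) else 0
```

A textured block has nonzero high-frequency energy while a flat block has none, so comparing the two yields a 1 bit; iterating such comparisons over the pyramid would build an encoding array analogous to RC.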
6. An iris localization and feature extraction system, wherein the system comprises:
a threshold selection module that selects the iris region segmentation thresholds based on a pre-constructed local gray-level statistical distribution model;
an image screening module that screens the iris image region using a dual-threshold coupled classifier constructed from the iris region segmentation thresholds, to obtain a pupil region image and an iris region image;
an effective region determination module that determines the effective iris region using the boundary detectors constructed from the pupil region image and the iris region image;
a feature extraction module that encodes the effective iris pixels of the effective iris region using a local wavelet high-frequency energy pyramid transition model constructed from the effective iris region, to obtain the iris feature code.
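The four modules of the system claim form a linear pipeline: thresholds feed the screening, screening feeds region determination, and the region feeds the encoder. A minimal composition sketch, with entirely hypothetical interfaces standing in for the claimed modules:

```python
class IrisPipeline:
    """Composes the four stages of claim 6. Each stage is injected as a
    callable; the names and signatures are illustrative assumptions,
    not the patent's interfaces."""

    def __init__(self, select_thresholds, screen, find_region, encode):
        self.select_thresholds = select_thresholds  # threshold selection module
        self.screen = screen                        # image screening module
        self.find_region = find_region              # effective region module
        self.encode = encode                        # feature extraction module

    def run(self, iris_image):
        tl, th = self.select_thresholds(iris_image)
        pupil_img, iris_img = self.screen(iris_image, tl, th)
        effective_region = self.find_region(pupil_img, iris_img)
        return self.encode(effective_region)
```

Wiring stub callables through `run` shows the data flow end to end without committing to any stage's internals.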
7. The system according to claim 6, wherein the threshold selection module comprises:
a model construction unit that obtains a high-gray-threshold local gray-level statistical model and a low-gray-threshold local gray-level statistical model respectively, according to a set local gray-level mean statistical operator and the pixel resolution of the iris image;
a threshold determination unit that uses the high-gray-threshold local gray-level statistical model STH and the low-gray-threshold local gray-level statistical model STL to determine respectively:
the high gray threshold for light-spot detection: TH = median(gs(STH(end-h+1:end))), where median denotes the median function, gs denotes the function returning the gray value, h corresponds to the h highest gray values at the end of STH, and end is the index of the last element;
the low gray threshold for pupil detection: TL = median(gs(STL(1:l))), where median denotes the median function, gs denotes the function returning the gray value, and l corresponds to the first l lowest gray values in STL.
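The two median formulas above can be sketched directly. Assuming (hypothetically) that the local gray-level statistics are already sorted in ascending order, TH is the median of the h largest values and TL the median of the l smallest:

```python
import statistics

def high_low_thresholds(gray_stats_sorted, h, l):
    """Compute (TL, TH) from an ascending-sorted sequence of local
    gray-level statistics, following TH = median of the h highest
    values and TL = median of the l lowest values.  The sorted-input
    assumption and the helper name are illustrative, not the patent's."""
    th = statistics.median(gray_stats_sorted[-h:])  # h highest values
    tl = statistics.median(gray_stats_sorted[:l])   # l lowest values
    return tl, th
```

For example, with sorted statistics [5, 10, 20, 100, 200, 250] and h = l = 2, the sketch yields TL = 7.5 and TH = 225.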
8. The system according to claim 7, wherein the image screening module comprises:
a classifier construction unit that constructs the dual-threshold coupled classifier from the high gray threshold and the low gray threshold of the iris region segmentation: IS = repmat(isinner(TL, TH), S), where repmat denotes the two-dimensional tiling function, S denotes the row and column size of the classifier IS, and isinner denotes the gray-interval relational operator;
a segmentation unit that uses the dual-threshold coupled classifier to segment the region where the pupil lies from the iris image:
where numel denotes the nonzero-element counting function, find denotes the relational matching function, ⊗ denotes the convolution operation, nh and nl denote the high and low gray-scale detection count thresholds respectively, Ω denotes the pupil region detected from the iris image IR, and IP is the pupil region image block;
an expansion unit that expands the iris region using the pupil region image block IP:
where size denotes the function returning the two-dimensional size of an image matrix, getcore denotes the function extracting the central pixel block of an image, Φ denotes the iris region segmented from the iris image IR, IB is the iris region image block, and m and n denote the numbers of rows and columns of IB respectively.
9. The system according to claim 8, wherein the effective region determination module comprises:
a boundary detector construction unit that constructs the inner and outer iris boundary detectors in the pupil region image block IP and the iris region image block IB respectively;
the iris region inner boundary detector, wherein rot45 denotes the counterclockwise rotation function, α denotes the detection-direction weight vector, and EI is the iris inner boundary detector;
the iris region outer boundary detector, wherein rot45 denotes the counterclockwise rotation function, β denotes the detection-direction weight vector, and EO is the iris outer boundary detector;
a localization unit that locates the iris inner boundary and the iris outer boundary based on the iris region inner and outer boundary detectors, where sum denotes the summation function, ⊗ denotes the convolution operation, ep denotes the gray-gradient transition threshold of the iris inner-boundary neighborhood, ω is an iris inner-boundary pixel, EP is the iris inner boundary, er denotes the gray-gradient transition threshold of the iris outer-boundary neighborhood, ψ is an iris outer-boundary pixel, and ER is the iris outer boundary;
According to the iris inner and outer boundaries, the effective iris region is determined as: IC = (EP ∪ ER) ∩ IB.
10. The system according to claim 9, wherein the feature extraction module comprises:
a transformation unit that transforms the effective iris region IC into an effective iris pixel block RB by radial and arc-wise sampling;
an energy pyramid construction unit that partitions RB into blocks at three resolution scales, lv × lv, 2lv × 2lv, and 4lv × 4lv, and then performs 1-, 2-, and 3-level wavelet decomposition in the sub-blocks of the respective resolution scales, to construct the local wavelet high-frequency energy pyramid:
where square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, and j denotes the level of the wavelet high-frequency sub-band;
a transition model construction unit that constructs the local wavelet high-frequency energy pyramid transition model from the local wavelet high-frequency energy pyramid:
where mean denotes the averaging function, square denotes the squaring function, dwt2 denotes the two-dimensional discrete wavelet transform function, dbi denotes the wavelet basis, i denotes the number of wavelet decomposition levels, gethfb denotes the function extracting the wavelet high-frequency sub-band coefficients, j denotes the level of the wavelet high-frequency sub-band, and isgreater denotes the coefficient-energy relational operator; the information fusion accuracy is determined using the maximum value of the fusion error;
a coding unit that encodes the effective iris pixel block RB using the local wavelet high-frequency energy pyramid transition model, to obtain the iris feature encoding array RC.
CN201811139801.0A 2018-09-28 2018-09-28 Iris positioning and feature extraction method and system Active CN109325455B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811139801.0A CN109325455B (en) 2018-09-28 2018-09-28 Iris positioning and feature extraction method and system


Publications (2)

Publication Number Publication Date
CN109325455A true CN109325455A (en) 2019-02-12
CN109325455B CN109325455B (en) 2021-11-30

Family

ID=65265985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811139801.0A Active CN109325455B (en) 2018-09-28 2018-09-28 Iris positioning and feature extraction method and system

Country Status (1)

Country Link
CN (1) CN109325455B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266645A (en) * 2008-01-24 2008-09-17 电子科技大学中山学院 Iris positioning method based on multi-resolutions analysis
US20140161325A1 (en) * 2012-12-10 2014-06-12 Sri International Iris biometric matching system
CN107844736A (en) * 2016-09-19 2018-03-27 北京眼神科技有限公司 iris locating method and device


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUIJIE GUO ET AL.: "Image Compression Based on Compressed Sensing Theory and Wavelet Packet Analysis", 2011 Cross Strait Quad-Regional Radio Science and Wireless Technology Conference *
ZHANG HEPING ET AL.: "Iris Image Segmentation Method Based on Variational Level Set Model", Computer Engineering *
HAN YILIANG ET AL.: "A Survey of Iris Recognition Technology", 2015 National Defense Radio & Electrical Metrology and Test Academic Exchange Conference *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113837993A (en) * 2021-07-29 2021-12-24 天津中科智能识别产业技术研究院有限公司 Lightweight iris image segmentation method and device, electronic equipment and storage medium
CN113837993B (en) * 2021-07-29 2024-01-30 天津中科智能识别产业技术研究院有限公司 Lightweight iris image segmentation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN109325455B (en) 2021-11-30

Similar Documents

Publication Publication Date Title
CN108573276B (en) Change detection method based on high-resolution remote sensing image
Xu et al. Inter/intra-category discriminative features for aerial image classification: A quality-aware selection model
KR101733539B1 (en) Character recognition device and control method thereof
CN106228528B (en) A kind of multi-focus image fusing method based on decision diagram and rarefaction representation
CN103761731A (en) Small infrared aerial target detection method based on non-downsampling contourlet transformation
CN110309781B (en) House damage remote sensing identification method based on multi-scale spectrum texture self-adaptive fusion
CN105760859A (en) Method and device for identifying reticulate pattern face image based on multi-task convolutional neural network
CN107403134B (en) Local gradient trilateral-based image domain multi-scale infrared dim target detection method
CN111681197A (en) Remote sensing image unsupervised change detection method based on Siamese network structure
CN107909560A (en) A kind of multi-focus image fusing method and system based on SiR
CN105069807A (en) Punched workpiece defect detection method based on image processing
CN112288008A (en) Mosaic multispectral image disguised target detection method based on deep learning
CN103971346A (en) SAR (Synthetic Aperture Radar) image spot-inhibiting method based on spare domain noise distribution constraint
CN104036461B (en) A kind of Infrared Complex Background suppressing method based on Federated filter
CN114758288A (en) Power distribution network engineering safety control detection method and device
CN109191416A (en) Image interfusion method based on sparse dictionary study and shearing wave
CN102842047A (en) Infrared small and weak target detection method based on multi-scale sparse dictionary
CN106897999A (en) Apple image fusion method based on Scale invariant features transform
CN114742968B (en) Elevation map generation method based on building elevation point cloud
CN102750675B (en) Non-local means filtering method for speckle noise pollution image
CN114155445A (en) SAR image target detection method based on improved YOLOv3
CN108510531A (en) SAR image registration method based on PCNCC and neighborhood information
CN109325455A (en) A kind of Iris Location and feature extracting method and system
CN106778822B (en) Image straight line detection method based on funnel transformation
CN102184529A (en) Empirical-mode-decomposition-based edge detecting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant