GB2471192A - Iris and Ocular Recognition using Trace Transforms - Google Patents


Info

Publication number
GB2471192A
GB2471192A · GB1009983A · GB201009983A
Authority
GB
United Kingdom
Prior art keywords
eye
iris
module
trace
ocular
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1009983A
Other versions
GB201009983D0 (en)
GB2471192B (en)
Inventor
Rida M Hamza
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/814,232 (US8472681B2)
Application filed by Honeywell International Inc filed Critical Honeywell International Inc
Publication of GB201009983D0
Publication of GB2471192A
Application granted
Publication of GB2471192B
Status: Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06K9/0061
    • G06K9/4633
    • G06K9/6232
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/48 Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507 Summing image-intensity values; Histogram projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993 Evaluation of the quality of the acquired pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/197 Matching; Classification

Abstract

The use of trace transforms (e.g., Figures 1A and 1B) in iris or ocular recognition systems and methods is disclosed. The system acquires (51) an eye image and may provide it to an image quality metrics determination module (53) for a quality evaluation that indicates whether the image goes to an iris recognition module (56), for high quality images, or a trace transform module (57), for lower quality images. The trace transform may also be used as a pre-filtering mechanism to select a small candidate database from bigger datasets. If the quality evaluation is too poor, the image may be rejected subject to rehabilitation or reacquisition. A processed image from the iris recognition module (56) may result in a single best match; a processed image from the trace transform module (57) may instead result in a group of the most probable matches.

Description

AN IRIS AND OCULAR RECOGNITION SYSTEM USING TRACE TRANSFORMS
This application claims the benefit of U.S. Provisional Patent Application No. 61/268,676, filed June 15, 2009, and entitled "Iris and Ocular Recognition Using Radon and Trace Transforms". U.S. Provisional Patent Application No. 61/268,676, filed June 15, 2009, is hereby incorporated by reference.
Background
The present invention pertains to recognition systems and particularly to biometric recognition systems; in particular the invention pertains to iris recognition systems.
Summary
The invention is an iris and ocular recognition system using trace transforms. The system may capture an eye image with an acquisition module and provide it to an image quality metrics determination module which provides a quality evaluation of the image. The quality evaluation may determine whether the image goes to an iris recognition module or a trace transform module. If the quality evaluation reaches a predefined iris quality measure, the image may be processed primarily using the iris recognition module. If the quality evaluation does not reach the predefined iris quality level for the iris recognition module, the image may be processed primarily using the trace transform module. If the quality evaluation is too poor for the trace transform module, the image may be rejected subject to rehabilitation or reacquisition. In a different approach, one may still process the eye image using both modules, with the fusion of the matching outcomes weighted based upon the quality of the iris. In this approach, the trace module may be used to augment the iris recognition module. A processed image from the iris recognition module may be lined up with a single best match. A processed image from the trace transform module may instead be lined up with the most probable matches.
Brief Description of the Drawing
Figures 1a and 1b are diagrams of example irises and their respective trace signatures; Figure 2 is a diagram of an illustrative example of ocular system architecture; Figures 3a, 3b and 3c are diagrams of multilayer ocular recognition images and signatures exhibiting extracted features from eye structure, skin texture and iris patterns; Figure 4 is a diagram illustrating an example of a normalization approach; and Figure 5 is a diagram of an illustrative example of the present adaptive iris matching approach.
Description
The iris of the human eye has been used as a biometric indicator for identification. The pattern of an iris is complex and can contain many distinctive features such as arching ligaments, furrows, ridges, crypts, rings, corona, freckles, a zigzag collaret, and other distinctive features. The iris of every human eye has a unique texture of high complexity, which is essentially stable over a person's life. No two irises are identical in texture or detail, even in the same person. As an internal organ of the eye, the iris is well protected from the external environment, yet it is easily visible even from yards away as a colored disk, behind the clear protective window of the eye's cornea, surrounded by the white tissue of the eye.
One may also note that the proximal eye skin texture and the eye appearance around the iris are unique to the person's identity. Skin texture modeling has been investigated in many applications including computer assisted diagnosis for dermatology, and topical drug efficacy testing. The ability to recognize or classify a skin texture and the holistic view of the eye may be a biometric enabler.
Although the iris stretches and contracts to adjust the size of the pupil in response to light, its detailed texture remains largely unaltered apart from stretching and shrinking. The eye may also stretch and change in form locally. Such distortions can readily be reversed mathematically in analyzing an eye image to extract and encode an ID signature that remains the same over a wide range of pupillary dilations or eye signal variations.
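Such a dilation-invariant encoding is commonly obtained with a Daugman-style rubber-sheet unwrapping of the iris annulus into a fixed-size rectangle. The sketch below is a minimal illustration of that standard construction (nearest-neighbour sampling; the function and parameter names are hypothetical, not the patented method):

```python
import numpy as np

def rubber_sheet(image, pupil_center, pupil_r, iris_r, n_radii=32, n_angles=256):
    """Unwrap the iris annulus (pupil boundary to limbus) into a fixed-size
    rectangular map, so the encoding stays stable under pupil dilation."""
    thetas = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, 1.0, n_radii)
    out = np.zeros((n_radii, n_angles))
    cy, cx = pupil_center
    for i, r in enumerate(radii):
        # Interpolate the sampling radius between pupil and limbic boundaries
        rho = pupil_r + r * (iris_r - pupil_r)
        ys = np.clip((cy + rho * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip((cx + rho * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
        out[i] = image[ys, xs]
    return out
```

Because the map is indexed by normalized radius rather than absolute radius, two captures of the same iris at different dilations land on comparable grids.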
The richness, uniqueness, and immutability of iris texture, as well as its external visibility and the distinctiveness of the surrounding skin and eye appearance, make combined iris/ocular recognition suitable for automated and highly reliable personal identification.
One may introduce the ocular system that augments iris recognition with ocular features extracted from both the eye proximal skin and the holistic eye appearance.
The present approach may be based on using the Trace transform or Radon transform to extract features from the iris map (eye print) as well as from the proximal skin texture around the eye. Other transforms may be used to extract the features.
The Trace transform may be used in image processing for recognizing objects under transformations, i.e., nonlinear or linear transforms. One of the key properties of the Trace transform is that it may be used to construct features invariant to these transformations.
It is known for picking up shape as well as the texture characteristics of the described object. One may here apply the key features of the Trace transform operator to map the features of the iris and the eye to the Trace space. To produce Trace transforms, one may compute a functional along tracing lines of a masked eye image.
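The computation of a functional along tracing lines can be sketched as follows. This is a simplified illustration, not the claimed implementation: the function name and defaults are hypothetical, SciPy is assumed to be available, and choosing `np.sum` as the functional reduces the construction to a Radon-style transform:

```python
import numpy as np
from scipy.ndimage import rotate

def trace_transform(image, mask, functional=np.sum, n_angles=180):
    """Illustrative trace transform: for each rotation angle, evaluate a
    trace functional along every horizontal line of the masked image.
    With functional=np.sum this is a Radon-style line-integral transform;
    other functionals (max, median, ...) yield other trace features."""
    masked = np.where(mask, image, 0.0)
    rows = masked.shape[0]
    out = np.zeros((n_angles, rows))
    for i, angle in enumerate(np.linspace(0.0, 180.0, n_angles, endpoint=False)):
        # Rotating the image and tracing horizontal lines is equivalent to
        # tracing lines of varying orientation over the original image.
        rot = rotate(masked, angle, reshape=False, order=1)
        out[i] = np.apply_along_axis(functional, 1, rot)
    return out
```

Each row of the output is the trace signature for one line orientation; stacking the rows gives the 2-D trace-space representation.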
One may mask the eye image in order to select only the significant features of importance from the eye Trace transform. For the iris tracing, one may mask the pupil to exclude the pupil reflection side effect. As one processes eyes/irises, one may wish to recognize a framed circular eye shape to eliminate any extraneous information from the computation of the Trace transform.
For this purpose, one may define a local coordinate system with its center at the proximal pupil center (other considerations, especially for gazed eye tracing, may pick the center at the limbic middle point regardless of the pupil off-angle, to reduce the side effect of heavy gazing). One may apply normalization processes to both iris and eye feature extraction processes. The iris normalization process may be based on the rubber sheet model. In addition, one may apply a linear transform to convert elliptic shapes of an iris to a circular model prior to applying the Trace transforms, to reduce side effects of the elliptic shape. One may also apply a normalization process around the eye to isolate the actual proximal skin region in a digital eye image. One may extract the edge points of the upper and lower eyelids. The upper and lower edges may then be fit into two separate parabola models to delineate the lid limits. The intersections between these two curves may then be used as the reference points to execute the normalization among different templates. The eye proximal skin may also be fit into a circular mask.
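The eyelid parabola fitting and intersection step can be sketched as follows, assuming the lid edge points have already been extracted; the function name is hypothetical:

```python
import numpy as np

def eyelid_reference_points(upper_pts, lower_pts):
    """Fit the upper and lower eyelid edge points to two parabolas and
    return their intersection points, which serve as reference points for
    normalizing across templates (scale/translation calibration).
    Each input is an (N, 2) array of (x, y) edge points."""
    # Parabola coefficients [a, b, c] for y = a*x^2 + b*x + c, per lid
    pu = np.polyfit(upper_pts[:, 0], upper_pts[:, 1], 2)
    pl = np.polyfit(lower_pts[:, 0], lower_pts[:, 1], 2)
    # Intersections are the real roots of the difference polynomial
    roots = np.roots(pu - pl)
    xs = np.sort(roots[np.isreal(roots)].real)
    return [(x, np.polyval(pu, x)) for x in xs]
```

The two returned corner points (nominally the eye corners) give a template-independent reference frame for scaling and translation.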
In practice, the introduced approach may best be integrated with an iris recognition system, to base the recognition on multiple signatures rather than just the usual iris barcode. One may cast the ocular recognition challenge as one of classifying among two feature sets, extracted from iris patterns and from the proximal eye skin texture and eye appearance, using a Trace space representation. The system should include at least the following components.
1) Iris recognition --this may be a typical iris recognition technique that can be processed using a commercial system or a polar segmentation (viz., POSE) technique (see e.g., U.S. Patent Application No. 11/043,366, filed January 26, 2005). This approach may provide a useful signature only if the data is ideal, with no heavy obscuration or blurring.
2) Iris recognition using Trace space --the signature from this approach may be mostly useful when the above signature is obscured.
3) Proximal skin texture and face appearance recognition --this approach may extract features from the iris proximal skin texture by analyzing a holistic view of the eye appearance as well as the texture patterns of the skin around the eye, and may analyze the local features of its surroundings using the Trace space constructed with Trace transform or Radon transform functions.
Again, one may introduce a standoff ocular recognition technique to augment the current state of the art iris recognition technology. One may cast the ocular recognition challenge as one of classifying among two feature sets extracted from iris patterns and proximal eye skin texture and eye appearance using Trace space representation. Figures 1a and 1b appear to show irises 11 and 12 of two different eyes. Also shown are iris signatures 13 and 14 of irises 11 and 12, respectively, using Trace space. Figures 3a, 3b and 3c show multilayer ocular recognition using extracted features from the eye structure, skin texture and iris patterns.
A merit of the approach is that it may allow users to enroll or match poor quality iris images that would be rejected by current state of the art technology. In addition, this approach may generate a list of possible matches instead of only the best match when the iris-print is too obscured for reliable matching.
The technical rationale and approach may be noted.
One may introduce the ocular system that augments iris recognition with ocular features extracted from both the eye proximal skin and the holistic eye appearance. The architecture of the overall system is presented in Figure 2. This Figure shows an illustrative example of ocular system architecture. Eye image quality may be measured at symbol 25. The measured quality may be evaluated at symbol 26 against IQMs (image quality metrics) to determine whether the quality is sufficient for an ordinary iris recognition approach, such as a polar based iris recognition approach at symbol 28, which may provide an iris map 29 of the iris of an eye 31. If the quality of the image is not adequate for the recognition approach at symbol 28, it may still be sufficient for the present adaptive iris matching approach in symbol 32.
From the eye image, an iris trace transform may be applied at symbol 33 to result in an iris signature 35. An ocular trace transform at symbol 34 may be performed to result in an ocular signature 36.
Various properties may be obtained from the eye image for recognition. Iris recognition --this is an iris recognition technique that may be processed using a POSE technique noted herein. Proximal skin texture and face appearance recognition --this process may extract features from the iris proximal skin texture by analyzing a holistic view of the eye appearance as well as the texture patterns of the skin around the eye and analyze the local features of its surroundings using Trace space.
The eye trace signatures may be obtained by masking the iris and the white region of the eye from the holistic view of the eye. The masking may reduce side effects from iris movements in the sclera region.
One may measure the eye and iris quality metrics to derive the combination of the signatures from the extracted prints. Having a signature print for each module, rather than just the iris-print, may enable the system to reduce the scatter of features across eye signatures, with less variability, and boost the discriminating power among the prints of different subjects.
One may augment the iris recognition by extracting features from the holistic view of the eye appearance and the proximal skin textures of the eye. The holistic eye appearance that extends to the eyebrow and surrounding spatial features of the eye shape may provide additional insight on the identity of the person. One may introduce a new framework for interpreting eye image appearance and its surrounding skin texture using Trace representation that permits a direct optimization on this new domain, which is feasible and leads to an algorithm which is rapid, accurate and robust. The Trace representation may employ the structure domain derived from reliable features using the mathematical Trace transforms, formulated to reduce the within eye variance and thus provide features robust to factors of signal variability.
The Trace transform (e.g., A. Kadyrov and M. Petrou, "The Trace Transform and its Applications," IEEE Trans. PAMI, 23(8), pp. 811-828, 2001), a generalization of the Radon transform, may be used in image processing for recognizing objects under transformations, i.e., nonlinear or linear transforms. One of the key properties of the Trace transform is that it can be used to construct features invariant to these transformations.
It can pick up shape as well as texture characteristics of the described object. To produce Trace transforms, one may compute a functional along tracing lines of a masked eye image. One may mask the eye image in order to select only the significant features of importance from the eye Trace transform. As one processes eyes, one may wish to recognize a framed circular eye shape to eliminate any extraneous information from the computation of the Trace transform. For this purpose, one may define a local coordinate system with its center at the proximal pupil center (there may be other considerations, especially for gazed eyes, to pick the center at the limbic middle point regardless of the pupil off-angle, to reduce the side effect of heavy gazing). The tracing lines may then be defined in the minimum enclosing circular region defined by the eye mask. One may show the results of a typical trace transform of different eye images 15, 16 and 17 in Figures 3a, 3b and 3c, respectively. However, eye images 15 and 16 appear to be of the same eye but with different orientations relative to the image acquisition mechanism used to capture the images. These Figures show Trace representations 18, 19 and 21 of the ocular appearance for eye images 15, 16 and 17, respectively. Similar masking may be applied across the three images to reduce the iris variations. (The Radon function may be used for tracing.) The present transformation is scale, translation, and rotation invariant, since prior to tracing the features one may apply a normalization approach based on the intersection points 41 and 42 (highlighted in Figure 4 with small circles on the left and right) of the two estimated upper and lower eyelid parabolas 43 and 44, respectively.
For normalization, one may use existing POSE library functionalities to extract the edge points of the upper and lower eyelids. The upper and lower edges may then be fit into two separate parabola models to delineate the lids limits. The intersections between these two curves may then be used as the reference points for normalizing different templates. The two intersection points may be used as a reference to normalize across different templates for scaling and translation calibrations. One may illustrate the process in Figure 4. This Figure shows a basis for the normalization process.
In another approach, other Trace functions, besides Radon, which are well suited to model an eye appearance, may be utilized without violating the design being presented herein. By selecting the appropriate masking and formulating the most suitable Trace function, one may ensure that only genuine eye information is used in extracting the unique eye features. Different masking schemes may be applied to exclude sclera and pupil regions from the ocular appearance analysis to reduce the gazing side effect on the eye-print.
Given a probe of an eye-print and the extracted trace model parameters (eye-prints), the aim is to identify the individual in a way which is invariant to the confounding factors described above. If multiple trace representation eye-prints of the same individual exist in the queries, it may be possible to do this using the Mahalanobis distance measure, which may enhance the effect of inter-class variations whilst suppressing the effect of within-class variation (expression, lighting, and pose). Let [p_1, ..., p_N] be the probability mass function generated by the probe p, let q_k be the centroid of the multivariate distribution of class k, and let C be the common within-class covariance matrix for virtually all the training samples. The distance may be defined as

d(p, q_k) = (p - q_k) C^-1 (p - q_k)^T

For the present recognition approach, one may consider other similarity measures. One may examine the Jeffreys divergence (J-divergence, or spectral information divergence) measure,

d(p, q_k) = Σ_x [p(x) - q_k(x)] log(p(x) / q_k(x))

which attempts to quantify "approximate symmetry" and also helps classify the eye-print as a bilaterally symmetrical or asymmetrical match to the query. This similarity measure may generate the most probable matches instead of only the best match.
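The two similarity measures can be sketched directly. This is a minimal illustration with hypothetical function names; `mahalanobis` returns the squared form, which preserves the candidate ranking:

```python
import numpy as np

def mahalanobis(p, q_centroid, cov):
    """Squared Mahalanobis distance between a probe vector and a class
    centroid, using the shared within-class covariance matrix C."""
    d = p - q_centroid
    return float(d @ np.linalg.inv(cov) @ d)

def j_divergence(p, q, eps=1e-12):
    """Jeffreys divergence (symmetrised Kullback-Leibler divergence)
    between two probability mass functions; eps guards against log(0)."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    return float(np.sum((p - q) * np.log(p / q)))
```

The J-divergence is symmetric in its arguments, which is what makes it suitable for ranking "approximately symmetric" matches rather than directed ones.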
A merit of this module is that it may allow enrollment with, or identification of, poor quality eye images that would be rejected by other approaches. Another merit of this approach is that it may potentially improve ocular identification efficiency. One may make use of the eye-prints to rapidly index a subset of individuals before actually executing the iris recognition. This means that the matching speed may be much faster.
There may be great promise for iris technology, with a false reject rate between 0.01 and 0.03 at a false acceptance rate of 10^-3. The false non-match rate (FNMR) versus false match rate (FMR) may be better even on larger datasets. The uniqueness and richness of iris patterns, even when deployed to a large population, may be a requirement that should be met to deploy the technology as a true biometric tool. These appear to be very encouraging results; however, virtually all reported iris recognition performance was measured on mostly frontal iris databases. Limited research appears to have been devoted to non-cooperative subjects. A few developed techniques may address the true nature of iris irregularity in iris segmentation (see e.g., U.S. Patent Application No. 11/043,366, filed January 26, 2005). Apart from estimating these irregularities, the segmentation routine should also detect reflections due to ambient light sources and occlusions due to eyelashes and eyelids.
The present approach may address issues with iris segmentation and feature extraction of poorly captured or occluded iris images, obscuration, motion blur, and illumination artifacts in view of a foundation (i.e., U.S. Patent Application No. 11/043,366, filed January 26, 2005), which appears designed for non-ideal iris images.
Work in texture recognition appears to be mostly for the purpose of facial feature tracking or for dermatology to develop an approach for computer assisted diagnosis of skin disorders. An approach in texture modeling may characterize an image texture with a feature distribution. Skin texture modeling may necessitate a model which should account for the appearance variations with the imaging conditions. A model may use a feature distribution which is a function of the imaging conditions. The modeling may be based on histograms of two sets of features: 1) Image texton --obtained by clustering the output of oriented multiscale filtering; and 2) Symbolic texture primitive --defined as a spatial configuration of indices of filter bank that has a large response relative to the other filters. The recognition approach may be based on a subspace technique using Eigenspace projection. The histograms from training images for texture classes may be used to create an Eigenspace, and then the primitive histograms from a certain class may be projected to points in the Eigenspace, which represent a sampling of the manifold of points for the pattern of the probe texture class. For recognition, the feature histograms from a query skin-print may be projected into the Eigenspace and compared with each point in the training set. The class of the nearest neighbor may appear as the classification result.
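The Eigenspace projection and nearest-neighbour step described above can be sketched as follows. This is a minimal PCA-style illustration via SVD; the function names are hypothetical and the filter-bank texton extraction that produces the histograms is assumed to have happened upstream:

```python
import numpy as np

def fit_eigenspace(train_histograms, n_components=8):
    """Build an Eigenspace from training feature histograms: the principal
    axes are the right singular vectors of the mean-centred data matrix."""
    X = np.asarray(train_histograms, float)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def project(hist, mean, basis):
    """Project one feature histogram into the Eigenspace."""
    return (np.asarray(hist, float) - mean) @ basis.T

def nearest_class(query_hist, mean, basis, train_points, train_labels):
    """Classify a query histogram by its nearest projected training point."""
    q = project(query_hist, mean, basis)
    dists = np.linalg.norm(train_points - q, axis=1)
    return train_labels[int(np.argmin(dists))]
```

The training histograms are projected once into `train_points`; each query then costs one projection plus a nearest-neighbour search.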
Various statistical techniques for face recognition may differ in the type of projection and distance measure used. The present Trace approach appears different from subspace techniques, e.g., work by Honeywell International Inc. on Kernel PCA or Eigen-eye (i.e., a modified version of Eigen-face using principal component analysis) or Fisher-eye (similar to Fisher-face). The present technical approach may be based on the exploitation of the multilayer representation of the iris and the periocular region to develop a true standoff ocular recognition system. The present approach may include at least two main processes, each producing a unique signature suitable for identification (an eye-print and an iris-print). The present approach may use a holistic ocular appearance model that captures the shape of the eyebrow and surrounding spatial features of the eye using Trace transforms. A suitable trace function may be considered to capture the shape variability present in the eye, in addition to the ordinary iris signature, to include the periocular skin region of the eye.
Figure 5 is a diagram of an illustrative example of the present adaptive iris matching approach. An eye image acquisition module 51 may obtain an image of an eye. The image may proceed to an image quality evaluation section 52. In section 52 may be an eye image quality metrics determination module 53 which outputs an evaluation of the eye image. An eye image router module 54 may take the evaluation and determine to which module of an eye recognition section 55 the eye image should be conveyed. If the quality of the image is rather high, then the image may go to an iris recognition module 56 for processing. If the quality of the image is not high or good enough for processing by module 56, then the image may go to a trace transform module 57 for processing. If the image is too poor in quality, such as being quite blurred, the image may be discarded and a new image may be captured. The router module 54 may determine the amount of quality that the image needs to have for the regular iris recognition module 56 and the trace transform module 57, and respectively forward the image to virtually all modules while prioritizing the outcome using the appropriate module (i.e., give more weight to the appropriate module based upon the quality of the image). There may be a rehabilitation module (not shown) to which the image may be sent for possible rehabilitation. If the image cannot be rehabilitated, then the image may be rejected.
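The routing policy of the router module can be sketched as a simple threshold rule. The threshold values and function name below are hypothetical; the actual quality score would come from the image quality metrics determination module:

```python
def route_eye_image(quality, iris_threshold=0.8, trace_threshold=0.4):
    """Illustrative routing policy for an eye image given its quality score
    in [0, 1]: high quality goes to iris recognition, mid quality to the
    trace transform module, and very poor images are rejected (subject to
    rehabilitation or reacquisition)."""
    if quality >= iris_threshold:
        return "iris_recognition"
    if quality >= trace_threshold:
        return "trace_transform"
    return "reject"
```

In the weighted-fusion variant described in the text, the same score would instead set the relative weights of the two modules' matching outcomes rather than select only one path.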
After the image is processed by the iris recognition module 56, the image may proceed on to an iris matcher 58. Matcher 58 may search a database 62 for a best matching iris relative to the iris of the processed image. The best matching iris, along with any available identity information of a person associated with the iris, may be sent to iris results 63 of results module 65. If there is no best matching iris or no adequate best matching iris found in database 62, then the iris of the processed image may be provided along with any identity information to an enroller 66 of an enroller module 68 to be entered in database 62. The matching process may be influenced by weighting the outcome of virtually all of the modules based upon the iris quality measure.
As noted, if the quality of the image does not necessarily reach a predefined quality measure for processing by module 56 and goes to trace transform module 57, the image may go to an iris trace transform sub-module 71 for transformation to an iris signature using trace space. The image may also go to an ocular trace transform sub-module 72 for transformation to an ocular signature using trace space. Module 71 or module 72 singularly, or modules 71 and 72 together, may be used for trace transformation of the image. In either case, the signatures may be provided to a holistic matcher 59 of matcher module 61. Or the signatures from sub-modules 71 and 72 may be provided to separate matchers (not shown). Matcher 59 may search database 62 for the most probable matches of signatures instead of only the best match. The search may be based on a similarity measure or some other useful criterion. If the most probable matches show little promise or are insufficient for pre-designated purposes, the iris trace signature and/or the ocular trace signature may be enrolled, separately or together with identity information, by a holistic enroller 67 of enroller module 68, in database 62. Various configurations of the approach shown in Figure 5 may be implemented.
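The holistic matcher's retrieval of a group of most probable matches, rather than a single best match, can be sketched as a top-k search. Euclidean distance stands in here for whichever similarity measure is chosen, and all names are hypothetical:

```python
import numpy as np

def most_probable_matches(query_sig, database, k=5):
    """Return the k most probable matches for a trace-space signature,
    ranked by ascending distance. `database` maps identity -> signature;
    a J-divergence or Mahalanobis measure could replace the Euclidean
    distance without changing the retrieval structure."""
    ids = list(database.keys())
    sigs = np.array([database[i] for i in ids], float)
    dists = np.linalg.norm(sigs - np.asarray(query_sig, float), axis=1)
    order = np.argsort(dists)[:k]
    return [(ids[i], float(dists[i])) for i in order]
```

Returning a ranked candidate list also supports the pre-filtering use described earlier: the trace signatures narrow a large dataset to a small candidate set before full iris matching is run.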
Relevant applications may include U.S. Patent Application No. 11/675,424, filed February 15, 2007, and entitled "Standoff Iris Recognition System"; U.S. Patent Application No. 11/372,854, filed March 10, 2006, and entitled "Invariant Radial Iris Segmentation"; U.S. Patent Application No. 11/043,366, filed January 26, 2005, and entitled "Iris Recognition System and Method"; U.S. Patent Application No. 11/672,108, filed February 07, 2007, and entitled "Approaches and Apparatus for Eye Detection in a Digital Image"; U.S. Patent Application No. 11/681,614, filed March 02, 2007, and entitled "Iris Recognition System Having Image Quality Metrics"; U.S. Patent Application No. 11/681,751, filed March 02, 2007, and entitled "Indexing and Database Search System"; U.S. Patent Application No. 11/681,662, filed March 02, 2007, and entitled "Expedient Encoding System"; U.S. Patent Application No. 10/979,129, filed November 03, 2004, and entitled "System and Method for Gate Access Control"; U.S. Patent Application No. 10/655,124, filed September 05, 2003, and entitled "System and Method for Dynamic Stand-Off Biometric Verification" (issued as U.S. Patent No. 7,183,895); and U.S. Provisional Patent Application 60/778,770, filed March 03, 2006, and entitled "Stand-Off Iris Detection Tracking and Recognition System"; all of which are hereby incorporated by reference.
In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.
Although the present system has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the specification.
It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims (12)

  1. A standoff ocular recognition system comprising: an eye image acquisition module; an image quality evaluation section connected to the eye image acquisition module; an eye recognition section connected to the image quality evaluation section; and a matcher module connected to the eye recognition section; and wherein the eye recognition section comprises a trace transform module for converting eye images into trace space.
  2. The system of claim 1, wherein: the eye recognition section further comprises an iris recognition module; and the trace transform module comprises: an ocular trace transform sub-module connected to the image quality evaluation section; and an iris trace transform sub-module connected to the image quality evaluation section.
  3. The system of claim 2, wherein: the image quality evaluation section comprises: an eye image quality metrics determination module connected to the eye image acquisition module; and an eye image router module connected to the eye image quality metrics determination module and the eye recognition section; and the system further comprises a results module connected to the matcher module.
  4. The system of claim 3, wherein: the matcher module comprises: a holistic eye matcher connected to the ocular trace transform sub-module; and an iris matcher connected to the iris recognition module; and the results module comprises: a holistic eye results sub-module connected to the holistic eye matcher; and an iris results sub-module connected to the iris matcher.
  5. The system of claim 4, wherein: the holistic eye matcher is connected to a database of trace model parameters associated with individuals; the iris matcher is connected to a database of iris maps associated with individuals; the holistic eye matcher provides to the holistic eye results sub-module most probable matches rather than an only best match; the most probable matches are determined according to a similarity measure between the trace model parameters of eye images provided to the ocular trace transform sub-module and trace model parameters in the database of trace model parameters associated with individuals; and the iris matcher provides an only best match.
  6. The system of claim 3, wherein: the eye image router module provides eye images, deemed not to meet a threshold of quality metrics according to the eye image quality metrics determination module, primarily to the trace transform module; the eye image router module provides eye images, deemed to meet the threshold of quality metrics according to the eye image quality metrics determination module, primarily to the iris recognition module; and the eye image router module can provide eye images approximately simultaneously to the trace transform module and the iris recognition module, with an outcome weighted based on iris quality.
  7. The system of claim 1, wherein in the trace transform module, prior to a trace transformation, a normalization based on intersection points of two estimated eyelid parabolas, one parabola being fit to an edge of an upper eyelid and another parabola being fit to an edge of a lower lid, has the intersection points as reference points for the normalization of various templates.
  8. A method for ocular recognition, comprising: acquiring an eye image; evaluating the eye image relative to quality; converting eye images having poor quality to trace transforms; and matching ocular trace transforms with ocular trace transforms, associated with identities of individuals, in a database; and wherein: an eye image has poor quality if the eye image is evaluated not to meet a predefined threshold of quality metrics; the matching of ocular trace transforms of eye images is based on a similarity measure relative to the ocular trace transforms in the database; and the matching based on the similarity measure generates a group of most probable matches instead of one best match.
  9. The method of claim 8, further comprising: processing eye images having good quality into iris images for iris recognition; and matching the iris images with iris images, associated with identities of individuals, in a database; and wherein: an image has good quality if the eye image is evaluated to meet a predefined threshold of quality metrics; and a matching of iris images generates one best match.
  10. An approach for standoff ocular recognition, comprising: acquisition of an eye image; determination of quality metrics of the eye image; trace transformation of an eye image, having quality metrics below a certain threshold, into ocular signature images; and a matching of the ocular signature images with ocular signature images associated with identities of individuals in a database, which generates a number of most probable matches; and wherein an eye image having a quality below a certain threshold is deemed to provide an iris print too obscured to be a reliable matcher for selecting one best match among iris prints in the database.
  11. A system as hereinbefore described, with reference to and illustrated by the accompanying drawings.
  12. A method as hereinbefore described, with reference to and illustrated by the accompanying drawings.
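The eyelid-parabola normalization of claim 7 may be sketched as follows. This is a minimal illustration assuming that edge points for each lid have already been extracted; the function name and the (x, y) point format are illustrative, not from the patent.

```python
import numpy as np

def eyelid_reference_points(upper_pts, lower_pts):
    """Fit one parabola to upper-eyelid edge points and one to the lower
    lid, then return their intersection points (roughly the eye corners),
    which serve as reference points for normalizing templates.

    upper_pts, lower_pts: arrays of (x, y) edge samples, shape (n, 2).
    """
    # Least-squares quadratic fits to each lid's edge samples.
    up = np.polyfit(upper_pts[:, 0], upper_pts[:, 1], 2)
    lo = np.polyfit(lower_pts[:, 0], lower_pts[:, 1], 2)
    # The intersections solve (up - lo)(x) = 0, a quadratic in x.
    roots = np.roots(up - lo)
    xs = np.sort(roots[np.isreal(roots)].real)
    ys = np.polyval(up, xs)
    return np.column_stack([xs, ys])  # typically the two eye corners
```

Anchoring templates to the two lid intersections gives reference points that survive partial occlusion of the iris, which is precisely the regime in which the trace-transform path is used.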
GB1009983A 2009-06-15 2010-06-15 An iris and ocular recognition system using trace transforms Expired - Fee Related GB2471192B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26867609P 2009-06-15 2009-06-15
US12/814,232 US8472681B2 (en) 2009-06-15 2010-06-11 Iris and ocular recognition system using trace transforms

Publications (3)

Publication Number Publication Date
GB201009983D0 GB201009983D0 (en) 2010-07-21
GB2471192A true GB2471192A (en) 2010-12-22
GB2471192B GB2471192B (en) 2011-08-17

Family

ID=42471665

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1009983A Expired - Fee Related GB2471192B (en) 2009-06-15 2010-06-15 An iris and ocular recognition system using trace transforms

Country Status (1)

Country Link
GB (1) GB2471192B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8369595B1 (en) 2012-08-10 2013-02-05 EyeVerify LLC Texture features for biometric authentication
US8437513B1 (en) 2012-08-10 2013-05-07 EyeVerify LLC Spoof detection for biometric authentication
US8483450B1 (en) 2012-08-10 2013-07-09 EyeVerify LLC Quality metrics for biometric authentication
US9721150B2 (en) 2015-09-11 2017-08-01 EyeVerify Inc. Image enhancement and feature extraction for ocular-vascular and facial recognition
WO2022066817A1 (en) * 2020-09-25 2022-03-31 Sterling Labs Llc Automatic selection of biometric based on quality of acquired image

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Fooprateepsiri R., Duangphasuk S., "A Highly Robust Method for Face Authentication", 2009, First Asian Conf. on Intelligent Information and Database Systems, IEEE, pages 380-385 *
Fooprateepsiri R., Kurutach W., "Face Verification base-on Hausdorff-Shape Context", 2009, Int. Asia Conf. on Informatics in Control, Automation and Robotics, pages 240-244 *
Kadyrov A., Petrou M., "The Trace transform and its applications", 2001, IEEE Trans. on PAMI, vol. 23(8), pages 811-828 *
Kadyrov A., Petrou M., "The trace transform as a tool to invariant feature construction", 1998, Proc. 14th Int. Conf. on Pattern Recognition, IEEE, vol. 2, pages 1037-1039 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9361681B2 (en) 2012-08-10 2016-06-07 EyeVerify LLC Quality metrics for biometric authentication
US9104921B2 (en) 2012-08-10 2015-08-11 EyeVerify, LLC. Spoof detection for biometric authentication
US8483450B1 (en) 2012-08-10 2013-07-09 EyeVerify LLC Quality metrics for biometric authentication
US8675925B2 (en) 2012-08-10 2014-03-18 EyeVerify LLC Spoof detection for biometric authentication
US8369595B1 (en) 2012-08-10 2013-02-05 EyeVerify LLC Texture features for biometric authentication
US8787628B1 (en) 2012-08-10 2014-07-22 EyeVerify LLC Spoof detection for biometric authentication
US8437513B1 (en) 2012-08-10 2013-05-07 EyeVerify LLC Spoof detection for biometric authentication
US9311535B2 (en) 2012-08-10 2016-04-12 Eyeverify, Llc Texture features for biometric authentication
US8724857B2 (en) 2012-08-10 2014-05-13 EyeVerify LLC Quality metrics for biometric authentication
US10108858B2 (en) 2012-08-10 2018-10-23 Eye Verify LLC Texture features for biometric authentication
US10095927B2 (en) 2012-08-10 2018-10-09 Eye Verify LLC Quality metrics for biometric authentication
US9971920B2 (en) 2012-08-10 2018-05-15 EyeVerify LLC Spoof detection for biometric authentication
US9836643B2 (en) 2015-09-11 2017-12-05 EyeVerify Inc. Image and feature quality for ocular-vascular and facial recognition
US9721150B2 (en) 2015-09-11 2017-08-01 EyeVerify Inc. Image enhancement and feature extraction for ocular-vascular and facial recognition
US10311286B2 (en) 2015-09-11 2019-06-04 EyeVerify Inc. Fusing ocular-vascular with facial and/or sub-facial information for biometric systems
WO2022066817A1 (en) * 2020-09-25 2022-03-31 Sterling Labs Llc Automatic selection of biometric based on quality of acquired image

Also Published As

Publication number Publication date
GB201009983D0 (en) 2010-07-21
GB2471192B (en) 2011-08-17

Similar Documents

Publication Publication Date Title
US8472681B2 (en) Iris and ocular recognition system using trace transforms
Ma et al. Local intensity variation analysis for iris recognition
KR20050025927A (en) The pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
Ng et al. A review of iris recognition algorithms
Sequeira et al. Iris liveness detection methods in mobile applications
Barpanda et al. Iris feature extraction through wavelet mel-frequency cepstrum coefficients
Smereka et al. What is a "good" periocular region for recognition?
Choudhary et al. Iris anti-spoofing through score-level fusion of handcrafted and data-driven features
CN109255319A (en) For the recognition of face payment information method for anti-counterfeit of still photo
Nithya et al. Iris recognition techniques: a literature survey
Das et al. A framework for liveness detection for direct attacks in the visible spectrum for multimodal ocular biometrics
GB2471192A (en) Iris and Ocular Recognition using Trace Transforms
Huang et al. Rotation invariant iris feature extraction using Gaussian Markov random fields with non-separable wavelet
Akram et al. Dorsal hand veins based person identification
Das Towards multi-modal sclera and iris biometric recognition with adaptive liveness detection
Taha et al. Iris features extraction and recognition based on the local binary pattern technique
Proença Unconstrained iris recognition in visible wavelengths
Tallapragada et al. Iris recognition based on combined feature of GLCM and wavelet transform
Chandana et al. Face recognition through machine learning of periocular region
Wang et al. Liveness detection of dorsal hand vein based on the analysis of Fourier spectral
Borkar et al. IRIS Recognition System
Bansal et al. Trends in iris recognition algorithms
Maureira et al. Synthetic periocular iris pai from a small set of near-infrared-images
Sallehuddin et al. A survey of iris recognition system
Ambika et al. The eye says it all: Periocular region methodologies

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20230615