WO2013147763A1 - Person identification using ocular biometrics - Google Patents

Person identification using ocular biometrics

Info

Publication number
WO2013147763A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
eye movement
eye
identity
assessing
Prior art date
Application number
PCT/US2012/030912
Other languages
French (fr)
Inventor
Oleg KOMOGORTSEV
Original Assignee
Texas State University - San Marcos
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Texas State University - San Marcos filed Critical Texas State University - San Marcos
Priority to PCT/US2012/030912 priority Critical patent/WO2013147763A1/en
Priority to EP12872652.8A priority patent/EP2831810A4/en
Priority to US13/908,748 priority patent/US9082011B2/en
Publication of WO2013147763A1 publication Critical patent/WO2013147763A1/en
Priority to US14/797,955 priority patent/US9811730B2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris

Definitions

  • the human eye includes several anatomical components that make up the oculomotor plant (OP). These components include the eye globe and its surrounding tissues, ligaments, six extraocular muscles (EOMs) each containing thin and thick filaments, tendon-like components, various tissues and liquids.
  • OP oculomotor plant
  • EOMs extraocular muscles
  • the brain sends a neuronal control signal to three pairs of extraocular muscles, enabling the visual system to collect information from the visual surround.
  • the eye rotates in its socket, exhibiting eye movement such as the following types: fixation, saccade, smooth pursuit, optokinetic reflex, vestibulo-ocular reflex, and vergence.
  • fixation maintaining the eye directed on the stationary object of interest
  • saccades rapid eye rotations between points of fixation with velocities reaching 700°/s
  • smooth pursuit movements that occur when eyes are tracking a smooth moving object.
  • a multi-modal method of assessing the identity of a person includes measuring eye movement of the person and measuring characteristics of an iris and/or periocular information of the person. Based on measured eye movements, estimates may be made of characteristics of an oculomotor plant of the person, complex eye movement patterns representing the brain's control strategies of visual attention, or both. Complex eye movement patterns may include, for example, a scanpath of the person's eyes including a sequence of fixations and saccades. The person's identity may be assessed based on the estimated characteristics of the oculomotor plant, the estimated complex eye movement patterns, and the characteristics of the iris of the person and/or periocular information.
  • a method of assessing a person's identity includes measuring eye movements of the person. Based on measured eye movements, estimates are made of characteristics of an oculomotor plant of the person and complex eye movement patterns of the person's eyes. The person's identity may be assessed based on the estimated characteristics of the oculomotor plant and the estimated complex eye movement patterns that are representative of the brain's control strategies of visual attention.
  • a method of assessing a person's identity includes measuring eye movements of the person while the person is looking at stimulus materials.
  • the person may be reading, looking at various pictures, or looking at a jumping dot of light.
  • Estimates of characteristics of an oculomotor plant are made based on the recorded eye movements.
  • a system for assessing the identity of a person includes a processor, a memory coupled to the processor, and an instrument (e.g. an image sensor such as a web-camera) that can measure eye movement of a person and external ocular characteristics of the person (such as iris characteristics or periocular information). Based on measured eye movements, the system can estimate characteristics of an oculomotor plant of the person, strategies employed by the brain to guide visual attention represented via complex eye movement patterns, or both. The system can assess the person's identity based on the estimated characteristics of the oculomotor plant, brain strategies to guide visual attention via complex eye movement patterns, and the external ocular characteristics of the person.
  • an instrument (e.g. an image sensor such as a web-camera)
  • FIG. 1 illustrates one embodiment of assessing a person's identity using multimodal ocular biometrics based on eye movement tracking and measurement of external characteristics.
  • FIG. 2 illustrates one embodiment of authentication using oculomotor plant characteristics, complex eye movement patterns, iris and periocular information.
  • FIG. 3 is a block diagram illustrating architecture for biometric authentication via oculomotor plant characteristics according to one embodiment.
  • FIG. 4 illustrates raw eye movement signal with classified fixation and saccades and an associated oculomotor plant characteristics biometric template.
  • FIG. 5 is a graph illustrating receiver operating curves for ocular biometric methods in one experiment.
  • FIG. 6 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user.
  • FIG. 7 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user wearing an eye-tracking headgear system.
  • oculomotor plant means the eye globe and its surrounding tissues, ligaments, and extraocular muscles (EOMs), each of which may contain thin and thick filaments, tendon-like components, various tissues and liquids.
  • EOMs extraocular muscles
  • scanpath means a spatial path formed by a sequence of fixations and saccades. Fixations occur when the eye is held in a relatively stable position, allowing heightened visual acuity on an object of interest. Saccades may occur when the eye rotates quickly, for example, between points of fixation, with almost no visual acuity maintained during rotation. Velocities during saccades may reach as high as 700° per second.
  • Brain control strategies are defined as the ability of the brain to guide the eye to gather information from the surrounding world. Strategies may be based on, or include, information on how and where the eye is guided. Brain control strategies can manifest themselves in the spatial and temporal (e.g. location and duration) characteristics of fixations, and in such characteristics of saccades as the main-sequence relationship (relationship between the maximum velocity exhibited during a saccade and its amplitude), the amplitude-duration relationship (relationship between a saccade's duration and its amplitude), the saccade's waveform (relationship between the time it takes to reach peak velocity during a saccade and the total saccade duration), and other characteristics.
  • main-sequence relationship: the relationship between maximum velocity exhibited during a saccade and its amplitude
  • amplitude-duration relationship: the relationship between a saccade's duration and its amplitude
  • saccade's waveform: the relationship between the time it takes to reach peak velocity during a saccade and the total saccade duration
  • CEM complex eye movement
  • assessing a person's identity includes determining that a person being assessed or measured is a particular person or within a set or classification of persons.
  • “Assessing a person's identity” also includes determining that a person being assessed is not a particular person or within a set or classification of persons (for example, scanning eye movements of Person X to determine whether or not Person X is on a list of persons authorized to access a computer system).
  • a person's identity is assessed using one or more characteristics that exist only in a live individual.
  • the assessment may be used, for example, to authenticate the person for access to a system or facility.
  • authentication of a person does not require the person being authenticated to remember any information (for example, to remember a password).
  • a person's identity is assessed using measurements of one or more visible characteristics of the person in combination with estimates of one or more non-visible characteristics of the person.
  • the assessment may be used to authenticate the person for access to a computer system, for example.
  • a method of assessing a person's identity includes making estimates based on eye movements of a person and measuring iris characteristics and/or periocular information of the person. Eye movements may be used to estimate oculomotor plant characteristics, brain control strategies in the form of complex eye movement patterns and scanpaths, or all of these characteristics.
  • FIG. 1 illustrates one embodiment of assessing a person's identity using multimodal ocular biometrics based on eye movement tracking and measurement of external characteristics.
  • Eye movements of a person are tracked. Eye movement data may be collected using, for example, an eye tracking instrument.
  • acquired eye movement data may be used to estimate oculomotor plant characteristics.
  • Dynamic and static characteristics of the oculomotor plant that may be estimated include the eye globe's inertia, dependency of an individual muscle's force on its length and velocity of contraction, resistive properties of the eye globe, muscles and ligaments, characteristics of the neuronal control signal sent by the brain to the EOMs, and the speed of propagation of this signal.
  • Individual properties of the EOMs may vary depending on their roles. For example, the agonist role may be associated with the contracting muscle that pulls the eye globe in the required direction, while the antagonist role may be associated with the lengthening muscle resisting the pull.
  • acquired eye movement data may be used to analyze complex eye movements.
  • the CEM may be representative of the brain's control strategies of guiding visual attention.
  • Complex eye movement patterns may be based, for example, on individual or aggregated scanpath data.
  • Scanpaths may include one or more fixations and one or more saccades by a person's eye.
  • the processed fixation and saccade groups may describe the scanpath of a recording.
  • Individual scanpath metrics may be calculated for each recording based on the properties of its unique scanpath.
  • Basic eye movement metrics may include: fixation count, average fixation duration, average vectorial saccade amplitude, average horizontal saccade amplitude, average vertical saccade amplitude, average vectorial saccade velocity, average vectorial saccade peak velocity, and the velocity waveform indicator (Q), as well as a variety of saccade types such as: undershoot/overshoot, corrected undershoot/overshoot, multi-corrected undershoot/overshoot, dynamic, compound, and express saccades.
  • More complex metrics, resulting from the aggregated scanpath data may include: scanpath length, scanpath area, regions of interest, inflection count, and slope coefficients of the amplitude-duration and main sequence relationships.
  • measurements may be taken of external characteristics of the person.
  • one or more characteristics of the person's iris and/or periocular information are measured.
  • non-ocular external characteristics, such as facial characteristics or fingerprints, may be acquired in addition to, or instead of, external ocular characteristics.
  • the measurements acquired at 106 are used to assess external characteristics of a person.
  • a biometric assessment is performed based on some or all of the estimated oculomotor plant characteristics, complex eye movement patterns, and external ocular characteristics.
  • biometric assessment is based on one or more dynamic characteristics combined with one or more static traits, such as iris patterns or periocular information.
  • Authentication of a person may be carried out based on a combination of two or more of: oculomotor plant characteristics, complex eye movement patterns, and external ocular characteristics.
  • a single instrument is used to acquire all of the eye movement data and external characteristic data (for example, iris patterns and/or periocular information) for a person.
  • external characteristic data (for example, iris patterns and/or periocular information)
  • two or more different instruments may be used to acquire eye movement data or external characteristic data for a person.
  • Methods and systems as described herein may be shoulder-surfing resistant.
  • data presented during authentication procedures as described herein may not reveal any information about a user to an outside observer.
  • methods and systems as described herein may be counterfeit-resistant in that, for example, they can be based on internal non-visible anatomical structures or complex eye movement patterns representative of the brain's strategies to guide visual attention.
  • OPC and CEM biometric information may be used in combination with one another to assess the identity of a person.
  • a user is authenticated by estimating individual oculomotor plant characteristics (OPC) and complex eye movement patterns generated for a specific type of stimulus.
  • OPC oculomotor plant characteristics
  • the presented visual information may be used to evoke eye movements that facilitate extraction of the OPC and CEM.
  • the information presented can be observed by a shoulder-surfer with no negative consequences. As a result, the authentication does not require any feedback from a user except looking at a presented sequence of images or text.
  • FIG. 2 illustrates one embodiment of authentication using OPC, CEM, iris, and periocular information.
  • the OPC, CEM, iris, and periocular information may be captured by a single camera sensor.
  • Identity assessment 200 includes use of image sensor 201 and eye tracking software 203. From image data captured with image sensor 201, eye tracking software 203 may generate raw eye positional signal data, which may be sent to the OPC and CEM modules, and eye images, which may be sent to iris module 205 and periocular module 207. In general, all modules may process the input in the form of a raw eye position signal or eye images, perform feature extraction, generate biometric templates, perform individual trait template matching 206 and multi-trait template matching 208, and produce decision output 210.
  • Feature extraction 204 includes OPC feature extraction 211, CEM feature extraction 213, iris feature extraction 215, and periocular feature extraction 217.
  • Processing of eye images includes iris module image pre-processing 231, periocular module image pre-processing 232, iris module template generation 233,
  • eye positional signal information is acquired.
  • Raw eye movement data produced during a recording is supplied to an eye movement classification module at 212.
  • an eye-tracker sends the recorded eye gaze trace to an eye movement classification algorithm at 212 after visual information employed for the authentication is presented to a user.
  • An eye movement classification algorithm may extract fixations and saccades from the signal. The extracted saccades' trajectories may be supplied to the mathematical model of the oculomotor plant 214 for the purpose of simulating the exact same trajectories.
  • an optimization algorithm modifies the values for the OPC to produce a minimum error between the recorded and the simulated signal. The values that produce the minimum error are supplied to an authentication algorithm at 218.
  • the authentication algorithm may be driven by a Hotelling's T-square test 220. Templates may be accessible from template database 221.
  • the Hotelling's T-square test (or some other appropriate statistical test) may either accept or reject the user from the system.
  • An authentication probability value (which may be derived, for example, by the Hotelling's T-square test) may be propagated to decision fusion module 222.
  • although in this example a Hotelling's T-square test is employed, an authentication algorithm may be driven by other suitable statistical tests.
  • an authentication algorithm uses a Student's t-test (which may be enhanced by voting).
  • Fusion module 222 may accept or reject a person based on one or more similarity scores. In some cases, fusion module 222 accepts or rejects a person based on OPC similarity score 224, CEM similarity score 226, iris similarity score 270, and periocular similarity score 280. Further aspects of implementing authentication based on OPC and the other modalities are set forth below.
  • a Velocity-Threshold (I-VT) classification algorithm (or some other eye movement classification algorithm) may be employed, with threshold selection accomplished via standardized behavior scores. After classification, saccades with amplitudes smaller than 0.5° (microsaccades) may be filtered out to reduce the amount of noise in the recorded data; a sketch of this classification step follows below.
  • I-VT Velocity-Threshold
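  • The following is a minimal sketch of I-VT classification, assuming a one-dimensional gaze signal sampled in degrees; the function name and the 70°/s default threshold (taken from the OPC pipeline description later in this document) are illustrative, not prescribed by the patent:

```python
import numpy as np

def ivt_classify(gaze_deg, timestamps_s, velocity_threshold=70.0):
    """Return (start, end) index pairs of detected saccades; the rest is fixation."""
    velocity = np.abs(np.gradient(gaze_deg, timestamps_s))  # deg/s
    is_saccade = velocity > velocity_threshold
    saccades, start = [], None
    for i, flag in enumerate(is_saccade):
        if flag and start is None:
            start = i                     # saccade onset
        elif not flag and start is not None:
            saccades.append((start, i))   # saccade offset
            start = None
    if start is not None:
        saccades.append((start, len(is_saccade)))
    # Drop microsaccades (amplitude < 0.5 deg) as noise, per the text.
    return [(a, b) for a, b in saccades
            if abs(gaze_deg[b - 1] - gaze_deg[a]) >= 0.5]
```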
  • Oculomotor Plant Mathematical Model. At 214, a linear horizontal homeomorphic model of the oculomotor plant capable of simulating the horizontal and vertical components of eye movement during saccades may be employed.
  • the model may mathematically represent dynamic properties of the OP via a set of linear mechanical components such as springs and damping elements.
  • the following properties may be considered for the two extraocular muscles that are modeled (medial and lateral recti) and the eye globe: active state tension (tension developed as a result of the innervation of an EOM by a neuronal control signal); length-tension relationship (the relationship between the length of an EOM and the force it is capable of exerting); force-velocity relationship (the relationship between the velocity of an EOM extension/contraction and the force it is capable of exerting); passive elasticity (the resisting properties of an EOM not innervated by the neuronal control signal); series elasticity (resistive properties of an EOM while the EOM is innervated by the neuronal control signal); and passive elastic and viscous properties of the eye globe due to the characteristics of the surrounding tissues.
  • the model may take as an input a neuronal control signal, which may be approximated by a pulse-step function.
  • the OPC described above can be separated into two groups, each separately contributing to the horizontal and the vertical components of movement.
  • OPC Estimation Algorithm. At 230, a Nelder-Mead (NM) simplex algorithm (or some other minimization algorithm, such as Trust-Region using the interior-reflective Newton method) may be used in a form that allows simultaneous estimation of all OPC vector parameters. A subset of some OPC may be empirically selected, with the remaining OPC fixed to default values; a sketch of this estimation loop follows below.
  • NM Nelder-Mead
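  • A minimal sketch of this estimation loop, using scipy's Nelder-Mead simplex on a toy two-parameter stand-in for the oculomotor plant mathematical model (the real model and its OPC vector are far richer; all names here are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def simulate_saccade(opc, t):
    # Toy stand-in for the OPMM: exponential approach to the saccade
    # endpoint, with gain and time constant playing the role of OPC.
    gain, tau = opc
    return gain * (1.0 - np.exp(-t / tau))

def estimate_opc(recorded, t, initial_opc):
    # Minimize the absolute difference between recorded and simulated
    # trajectories (see the error function discussion below).
    error = lambda opc: np.sum(np.abs(recorded - simulate_saccade(opc, t)))
    return minimize(error, initial_opc, method="Nelder-Mead").x

t = np.linspace(0.0, 0.05, 50)                 # 50 ms saccade, 1 kHz sampling
recorded = simulate_saccade((10.0, 0.01), t)   # stand-in for a recording
opc_vector = estimate_opc(recorded, t, initial_opc=(5.0, 0.02))
```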
  • a subset of selected OPC comprises: length-tension relationship (the relationship between the length of an extraocular muscle and the force it is capable of exerting); series elasticity (resistive properties of an eye muscle while the muscle is innervated by the neuronal control signal); passive viscosity of the eye globe; force-velocity relationship (the relationship between the velocity of an extraocular muscle extension/contraction and the force it is capable of exerting) in the agonist muscle; force-velocity relationship in the antagonist muscle; agonist and antagonist muscles' tension intercept, which ensures an equilibrium state during an eye fixation at primary eye position (for example, an intercept coefficient in a linear relationship between the force that a muscle applies to the eye and the rotational position of the eye during fixation); the agonist muscle's tension slope (for example, a slope coefficient in a linear relationship between the force that an agonist muscle applies to the eye and the rotational position of the eye during fixation); the antagonist muscle's tension slope (for example, a tension slope coefficient for the antagonist muscle); and the eye globe's inertia.
  • a template including some or all of the OPC described above is passed to a matching module to produce a matching score between a computed template and a template already stored in the database.
  • the person authentication algorithm takes a vector of the OPC optimized for each qualifying saccade.
  • a statistical test is applied to assess all optimized OPC in the vector at the same time.
  • a Hotelling's T-square test is applied. The test may assess data variability in a single individual as well as across multiple individuals.
  • the Hotelling's T-square test is applied to an empirically selected subset of five estimated parameters: series elasticity, passive viscosity of the eye globe, eye globe's inertia, agonist muscle's tension slope, and the antagonist muscle's tension slope.
  • the following null hypothesis (H0) is formulated, assuming datasets i and j are to be compared: "H0: There is no difference between the OPC vectors of subjects i and j".
  • the statistical significance level (p) resulting from the Hotelling's T-square test may be compared to a predetermined threshold (for example, 0.05). In this example, if the resulting p is smaller than the threshold, H0 is rejected, indicating that the datasets in question belonged to different people. Otherwise, H0 is accepted, indicating that the datasets belonged to the same person; a sketch of this test follows below.
  • Two types of errors may be recorded as a result: (1) rejection of H0 when the datasets belonged to the same person; and (2) acceptance of H0 when the datasets were from different people.
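  • A sketch of the two-sample Hotelling's T-square comparison described above, assuming each row of X and Y is one saccade's optimized OPC vector (e.g. the five-parameter subset); scipy provides only the F distribution, so the statistic is computed directly:

```python
import numpy as np
from scipy.stats import f

def hotelling_p_value(X, Y):
    """p-value for H0: the mean OPC vectors of datasets X and Y are equal."""
    n1, p = X.shape
    n2, _ = Y.shape
    diff = X.mean(axis=0) - Y.mean(axis=0)
    pooled = ((n1 - 1) * np.cov(X, rowvar=False) +
              (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(pooled, diff)
    f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2  # T^2 -> F
    return 1.0 - f.cdf(f_stat, p, n1 + n2 - p - 1)

# Accept H0 (same person) when p >= 0.05, per the threshold in the text.
same_person = hotelling_p_value(np.random.randn(40, 5),
                                np.random.randn(40, 5)) >= 0.05
```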
  • biometrics using CEM are described.
  • some aspects of biometrics using CEM in a form of scanpaths are as described in C. Holland and O. V. Komogortsev, Biometric Identification via Eye Movement Scanpaths in Reading, in Proceedings of the IEEE International Joint Conference on Biometrics (IJCB), 2011, pp. 1-8.
  • raw eye movement data produced during a recording is supplied to an eye movement classification module at 212.
  • Classified fixations and saccades forming complex eye movement patterns may be processed by two modules: individual scanpath component module 240 and aggregated scanpath module 241.
  • Individual scanpath component module 240 may process eye movement characteristics belonging to individual fixations and saccades. Characteristics processed by the individual scanpath component module 240 may include the following: Fixation Count - the number of detected fixations. Fixation count is indicative of the number of objects processed by the subject, and was measured simply as the total number of fixations contained within the scanpath.
  • Average Fixation Duration - the sum of the durations of all detected fixations divided by the fixation count. Average fixation duration is indicative of the amount of time a subject spends interpreting an object.
  • Average vectorial saccade amplitude was measured as the sum of vectorial saccade amplitudes over the total number of saccades, where the vectorial amplitude of a saccade was defined as the Euclidean norm of the horizontal and vertical amplitudes, according to the equation: √(x_i² + y_i²)
  • Average Horizontal Saccade Amplitude - the average amplitude of the horizontal component of saccadic movement was considered separately, as horizontal saccades are more indicative of between-word saccades.
  • Average horizontal saccade amplitude was measured as the sum of horizontal saccade amplitudes greater than 0.5° over the total number of horizontal saccades with amplitude greater than 0.5°.
  • Average Vertical Saccade Amplitude - average amplitude of the vertical component of saccadic movement was considered separately as these are more indicative of between-line saccades. Average vertical saccade amplitude was measured as the sum of vertical saccade amplitudes greater than 0.5° over the total number of vertical saccades with amplitude greater than 0.5°.
  • Average Vectorial Saccade Velocity sum of vectorial saccade velocities over the total number of saccades, where the vectorial velocity of a saccade was defined as the Euclidean norm of the horizontal and vertical velocities.
  • Average Vectorial Saccade Peak Velocity - measured as the sum of vectorial saccade peak velocities over the total number of saccades, where the vectorial peak velocity of a saccade was defined as the Euclidean norm of the horizontal and vertical peak velocities.
  • Velocity Waveform Indicator (Q) - the relationship between the time it takes to reach peak velocity during a saccade and the total saccade duration.
  • velocity waveform indicator (Q) the ratio of peak velocity to average velocity of a given saccade. In normal human saccades this value is roughly constant at 1.6, though it is assumed that this is subject to some amount of variation similar to the amplitude-duration and main sequence relationships. A rough estimate of this value may be obtained from the ratio of the average vectorial peak velocity over the average vectorial velocity.
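  • A sketch of the basic per-recording metrics above, assuming per-saccade horizontal/vertical amplitudes, velocities, and peak velocities (in degrees and degrees per second) have already been extracted; all names are illustrative:

```python
import numpy as np

def basic_cem_metrics(fix_durations, amp_h, amp_v, vel_h, vel_v, pk_h, pk_v):
    vec_amp = np.hypot(amp_h, amp_v)    # Euclidean norm of the components
    vec_vel = np.hypot(vel_h, vel_v)
    vec_pk = np.hypot(pk_h, pk_v)
    return {
        "fixation_count": len(fix_durations),
        "avg_fixation_duration": float(np.mean(fix_durations)),
        "avg_vectorial_amplitude": float(np.mean(vec_amp)),
        "avg_vectorial_velocity": float(np.mean(vec_vel)),
        "avg_vectorial_peak_velocity": float(np.mean(vec_pk)),
        # Rough estimate of the waveform indicator Q (~1.6 in normal saccades).
        "Q": float(np.mean(vec_pk) / np.mean(vec_vel)),
    }
```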
  • Amplitude-Duration Relationship - the relationship between a saccade's amplitude and its duration, which may be modeled by the equation: Duration = C × Amplitude + Duration_min
  • a data set may be constructed from the saccade groups such that x-column data contained the larger absolute component (horizontal or vertical) amplitude and y-column data contained the respective saccade duration.
  • the slope coefficient of the amplitude-duration relationship may be obtained from a linear regression of this data set.
  • Main Sequence Relationship - the relationship between a saccade's amplitude and its peak velocity, which may be modeled by the equation: Peak Velocity = Velocity_max × (1 - e^(-Amplitude/C)). This relationship has been shown to be roughly linear for small saccades in the range of 0-10° amplitude. As a result, a linear approximation may be acceptable in the current context, as the saccades produced during reading are often on the order of 0-3° amplitude, with very few over 10° amplitude.
  • a data set may be constructed from the saccade groups such that x-column data contained the absolute component (horizontal or vertical) amplitude and y-column data contained the respective absolute component peak velocity.
  • the slope coefficient of the main sequence relationship may be obtained from a linear regression of this data set.
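  • Both slope coefficients reduce to simple linear regressions; a sketch, assuming arrays of absolute component amplitudes with their matching durations and peak velocities:

```python
import numpy as np

def relationship_slopes(component_amp_deg, durations_s, peak_velocities):
    amp = np.abs(component_amp_deg)
    # Amplitude-duration: Duration = C_ad * Amplitude + Duration_min
    slope_ad, _ = np.polyfit(amp, durations_s, 1)
    # Main sequence, linearly approximated for small (reading-sized) saccades.
    slope_ms, _ = np.polyfit(amp, np.abs(peak_velocities), 1)
    return slope_ad, slope_ms
```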
  • Characteristics processed by the aggregated scanpath module 241 may include the following:
  • Scanpath Length - summated amplitude of all detected saccades is indicative of the efficiency of visual search, and may be considered as a candidate biometric feature under the assumption that visual search is dependent on the subject's familiarity with similar patterns/content.
  • Scanpath length may be measured as the sum of the absolute distances between the vectorial centroids of fixation points, where the vectorial centroid was defined as the Euclidean norm of the horizontal and vertical centroid positions, according to the equation: √(x_c² + y_c²)
  • Scanpath Area - the area defined by a convex hull created by fixation points.
  • Scanpath area may be measured as the area of the convex hull formed by fixation points.
  • Scanpath area is similar to scanpath length in its indication of visual search efficiency, but may be less sensitive to localized searching. That is, a scanpath may have a large length while only covering a small area.
  • Regions of Interest - the total number of spatially unique regions identified after applying a spatial mean shift clustering algorithm to the sequence of fixations that define a scanpath.
  • Regions of interest may be measured as the total number of spatially unique regions identified after applying a spatial mean shift clustering algorithm to the fixation points of the scanpath, using a sigma value of 2° and a convergence resolution of 0.1°.
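  • A sketch using scikit-learn's mean shift implementation (assumed available); its bandwidth parameter stands in for the 2° sigma, and its own convergence handling replaces the 0.1° resolution mentioned above:

```python
import numpy as np
from sklearn.cluster import MeanShift

def regions_of_interest(fixation_points_deg):
    """Count spatially unique regions among (x, y) fixation centroids."""
    labels = MeanShift(bandwidth=2.0).fit(
        np.asarray(fixation_points_deg)).labels_
    return len(np.unique(labels))
```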
  • Inflection Count - the number of eye-gaze direction shifts in a scanpath. Inflections occur when the scanpath changes direction; in reading there are a certain number of "forced" inflections necessary to progress through the text, but general differences in inflection count are indicative of attentional shifts. Inflection count may be measured as the number of saccades in which the horizontal and/or vertical velocity changes sign, as in the sketch below.
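  • One reading of that computation, comparing each saccade's component velocity signs with the preceding saccade's; names are illustrative:

```python
import numpy as np

def inflection_count(sac_vel_h, sac_vel_v):
    """Count direction shifts across consecutive saccades."""
    h, v = np.sign(sac_vel_h), np.sign(sac_vel_v)
    return int(np.sum((h[1:] != h[:-1]) | (v[1:] != v[:-1])))
```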
  • Scanpath fix - an aggregated representation of a scanpath that is defined by fixation points and their coordinates.
  • OPC biometric template 242 and scanpath biometric template 244 may be tested for match/non-match. Characteristics may be compared using Gaussian cumulative distribution function (CDF) 246. In some cases, all characteristics except the scanpath fix are compared via Gaussian cumulative distribution function (CDF) 246.
  • CDF Gaussian cumulative distribution function
  • a Gaussian CDF comparison produces a probability value between 0 and 1, where a value of 0.5 indicates an exact match and a value of 0 or 1 indicates no match. This probability may be converted into a more intuitive similarity score, where a value of 0 indicates no match and a value of 1 indicates an exact match (for example, via similarity = 1 - 2|p - 0.5|; a sketch follows below).
  • a simple acceptance threshold may be used to indicate the level of similarity which constitutes a biometric match.
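  • A sketch of this comparison, assuming an enrolled metric is summarized by a mean and standard deviation, and using the 1 - 2|p - 0.5| conversion suggested above; the 0.7 acceptance threshold is illustrative only:

```python
from scipy.stats import norm

def cdf_similarity(probe_value, enrolled_mean, enrolled_std):
    p = norm.cdf(probe_value, loc=enrolled_mean, scale=enrolled_std)
    return 1.0 - 2.0 * abs(p - 0.5)   # 1 = exact match, 0 = no match

is_match = cdf_similarity(4.1, enrolled_mean=4.0, enrolled_std=0.5) >= 0.7
```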
  • scanpath fix characteristics are compared via pairwise distances between the centroids representing positions of fixations at 248.
  • the Euclidean pairwise distance may be calculated between the centroid positions of fixations.
  • a tally may be made of the total number of fixation points in each set that could be matched to within 1 ° of at least one point in the opposing set.
  • the similarity of scanpaths may be assessed by the proportion of tallied fixation points to the total number of fixation points to produce a similarity score similar to those generated for the various eye movement metrics.
  • the total difference is normalized to produce a similarity score, where a value of 0 indicates no match and a value of 1 indicates an exact match.
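  • A sketch of the scanpath-fix comparison, assuming each recording yields an array of (x, y) fixation centroids in degrees:

```python
import numpy as np

def scanpath_similarity(fix_a, fix_b, tolerance_deg=1.0):
    a, b = np.asarray(fix_a, float), np.asarray(fix_b, float)
    dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    # Tally points in each set matched to within 1 deg of the opposing set.
    matched = (np.sum(dists.min(axis=1) <= tolerance_deg) +
               np.sum(dists.min(axis=0) <= tolerance_deg))
    return matched / (len(a) + len(b))   # 0 = no match, 1 = exact match
```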
  • Iris similarity score 270 may be generated using iris templates 272. In this example, to produce similarity score 270, a Hamming distance calculation is performed at 274.
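  • A sketch of the Hamming distance step, assuming the iris templates are equal-length binary codes; the 0.32 decision threshold is a common figure in the iris literature, not a value from this document:

```python
import numpy as np

def fractional_hamming(code_a, code_b):
    a, b = np.asarray(code_a, bool), np.asarray(code_b, bool)
    return np.count_nonzero(a ^ b) / a.size   # 0 = identical codes

is_match = fractional_hamming(np.zeros(2048, bool),
                              np.zeros(2048, bool)) < 0.32
```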
  • Periocular similarity score 280 may be generated using periocular templates 282. Periocular similarity score 280 may be based on periocular template comparisons at 284.
  • the weighted fusion module produces a combined similarity score via a weighted sum of the similarity scores produced by one or more of the individual metrics. Weights for each individual metric may be derived empirically. Other score-level fusion techniques can be applied, e.g., density-based score fusion, transformation score fusion, classifier-based score fusion, and methods that employ user-specific and evolving classification thresholds. The resulting similarity score may be employed for the match/non-match decision for scanpath authentication, or may serve as an input to decision fusion module 222, which may combine, for example, OPC and CEM biometrics.
  • OPC similarity score 224 and CEM similarity score 226 may be considered for final match/non-match decisions.
  • Match/non-match decisions may be made based on one or more of the following information fusion approaches:
  • the logical fusion method employs individual decisions from the OPC and scanpath modalities in the form of 1 (match) or 0 (non-match) to produce the final match/non-match decision via logical OR (or AND) operations.
  • OR at least one method should indicate a match for the final match decision.
  • AND both methods should indicate a match for the final match decision.
  • in MIN (or MAX) fusion, the smallest (or largest) similarity score may be selected between the OPC and the scanpath modalities. Thresholding may be applied to arrive at the final decision. For example, if the resulting value is larger than a threshold, a match is indicated; otherwise, a non-match is indicated.
  • in weighted fusion, the combined score may be computed as p = w1*A + w2*B + w3*C + w4*D, where:
  • p is the resulting score
  • A, B, C, and D stand for scores derived from the OPC, CEM, iris, and periocular modalities, respectively
  • w1, w2, w3, w4 are the corresponding weights.
  • the resulting score p may be compared with a threshold value. If p is greater than the threshold, a match is indicated; otherwise, a non-match is indicated.
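  • A sketch of the weighted fusion and threshold decision just described; the weights and threshold here are illustrative (the experiment below reports w1 = 0.45, w2 = 0.55 for the two-modality OPC/CEM case):

```python
def fused_decision(opc, cem, iris, periocular,
                   w=(0.25, 0.25, 0.25, 0.25), threshold=0.5):
    p = w[0]*opc + w[1]*cem + w[2]*iris + w[3]*periocular
    return p > threshold   # True = match, False = non-match

decision = fused_decision(opc=0.6, cem=0.7, iris=0.9, periocular=0.4)
```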
  • other score-level fusion techniques can be applied, e.g., density-based score fusion, transformation score fusion, classifier-based score fusion, and methods that employ user-specific and evolving classification thresholds.
  • FIG. 3 is a block diagram illustrating architecture for biometric authentication via oculomotor plant characteristics according to one embodiment.
  • assessment using OPC as described in FIG. 3 may be combined with assessments based on CEM, iris characteristics, periocular information, or some or all of those traits.
  • a biometric authentication is based on a combination of OPC, CEM, iris characteristics, and periocular information.
  • Biometric authentication 300 may engage information during enrollment of a user and, at a later time, authentication of the user.
  • the recorded eye movement signal from an individual is supplied to the Eye movement classification module 302.
  • Eye movement classification module 302 classifies the eye position signal 304 into fixations and saccades.
  • a sequence of classified saccades' trajectories is sent to the oculomotor plant mathematical model (OPMM) 306.
  • OPMM oculomotor plant mathematical model
  • Oculomotor plant mathematical model (OPMM) 306 may generate simulated saccades' trajectories based on the default OPC values that are grouped into a vector with the purpose of matching the simulated trajectories with the recorded ones. Each individual saccade may be matched independently of any other saccade. Both classified and simulated trajectories for each saccade may be sent to error function module 308. Error function module 308 may compute error between the trajectories. The error result may trigger the OPC estimation module 310 to optimize the values inside of the OPC vector minimizing the error between each pair of recorded and simulated saccades.
  • an OPC biometric template 312 representing a user may be generated.
  • the template may include a set of the optimized OPC vectors, with each vector representing a classified saccade.
  • the number of classified saccades may determine the size of the user's OPC biometric template.
  • Eye position data 314 may be provided to eye movement classification module 302.
  • the estimated user biometrics template may be supplied to the person authentication module 316 and information fusion module 318 to authenticate a user.
  • Person authentication module 316 may accept or reject a user based on the recommendation of a given classifier.
  • Information fusion module 318 may aggregate information related to OPC vectors. In some embodiments, information fusion module 318 may work in conjunction with the person authentication module to authenticate a person based on multiple classification methods.
  • the output during the user authentication procedure may be a yes/no answer 320 about the claimed user's identity.
  • An automated eye movement classification algorithm may be used to help establish an invariant representation for the subsequent estimation of the OPC values.
  • the goal of this algorithm is to automatically and reliably identify each saccade's beginning, end, and all trajectory points from a very noisy and jittery eye movement signal (for example, as shown in FIG. 4).
  • the additional goal of the eye movement classification algorithm is to provide additional filtering for saccades to ensure their high quality and a sufficient quantity of data for the estimation of the OPC values.
  • a standardized Velocity-Threshold (I-VT) algorithm is selected due to its speed and robustness.
  • a comparatively high classification threshold of 70° per second may be employed to reduce the impact of trajectory noise at the beginning and the end of each saccade. Additional filtering may include discarding saccades with amplitudes of less than 4°, durations of less than 20 ms, and various trajectory artifacts that do not belong to normal saccades.
  • Oculomotor Plant Mathematical Model simulates accurate saccade trajectories while containing major anatomical components related to the OP.
  • a linear homeomorphic 2D OP mathematical model is selected.
  • the oculomotor plant mathematical model may be, for example, as described in O. V. Komogortsev and U. K. S. Jayarathna, "2D Oculomotor Plant Mathematical Model for eye movement simulation," in IEEE International Conference on BioInformatics and Bioengineering (BIBE), 2008, pp. 1-8.
  • the oculomotor plant mathematical model in this example is capable of simulating saccades with properties resembling normal humans on a 2D plane (e.g. computer monitor) by considering physical properties of the eye globe and four extraocular muscles: medial, lateral, superior, and inferior recti.
  • the following advantages are associated with selection of this oculomotor plant mathematical model: 1) major anatomical components are accounted for and can be estimated; 2) the linear representation simplifies the estimation process of the OPC while producing accurate simulation data within the spatial boundaries of a regular computer monitor; 3) the architecture of the model allows dividing it into two smaller 1D models, one responsible for simulating the horizontal component of movement and the other the vertical.
  • Such an assignment, while producing identical simulation results compared to the full model, may allow a significant reduction in the complexity of the required solution and allow simultaneous simulation of both movement components on a multi-core system.
  • FIG. 4 illustrates raw eye movement signal with classified fixation and saccades 400 and an associated OPC biometric template 402.
  • simulated via OPMM saccade trajectories generated with the OPC vectors that provide the closest matches to the recorded trajectories are shown.
  • a subset of nine OPC is selected as a vector to represent an individual saccade for each component of movement (horizontal and vertical). This subset includes, among other parameters, passive elasticity of the eye globe, pulse height of the agonist neuronal control signal (iteratively varied to match the recorded saccade's onset and offset coordinates), and pulse width of the agonist neuronal control signal.
  • the error function module provides high sensitivity to differences between the recorded and simulated saccade trajectories.
  • the error function is implemented as the absolute difference between the saccades that are recorded by an eye tracker and the saccades that are simulated by the OPMM, for example E = Σ |t_i - s_i|, summed over the n points of a trajectory, where:
  • n is the number of points in a trajectory
  • t_i is a point in a recorded trajectory
  • s_i is the corresponding point in a simulated trajectory.
  • the absolute difference approach may provide an advantage over other estimations such as root mean squared error (RMSE) due to its higher absolute sensitivity to the differences between the saccade trajectories.
  • RMSE root mean squared error
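  • A sketch contrasting the two error measures on a pair of recorded and simulated trajectories (whether the absolute difference is normalized by n is an assumption here):

```python
import numpy as np

def absolute_difference(recorded, simulated):
    return np.sum(np.abs(recorded - simulated)) / len(recorded)

def rmse(recorded, simulated):
    return np.sqrt(np.mean((recorded - simulated) ** 2))
```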
  • Eye Movement Recording Procedure. Eye movement records were generated for participants' readings of various excerpts from Lewis Carroll's "The Hunting of the Snark." This poem was chosen for its difficult and nonsensical content, forcing readers to progress slowly and carefully through the text.
  • Thresholds column contains the thresholds that produce minimum HTER for the corresponding authentication approach.
  • CUE refers to counterfeit-resistant usable eye-based authentication, which may include one of the traits, or two or more traits in combination that are based on the eye movement signal.
  • FIG. 5 is a graph illustrating receiver operating curves (ROC) for ocular biometric methods in the experiment described above.
  • Each of ROC curves 500 corresponds to a different modality and/or fusion approach.
  • Curve 502 represents an authentication based on OPC.
  • Curve 504 represents an authentication based on CEM.
  • Curve 506 represents an authentication based on (OPC) OR (CEM).
  • Curve 508 represents an authentication based on (OPC) AND (CEM).
  • Curve 510 represents an authentication based on MIN (OPC, CEM).
  • Curve 512 represents an authentication based on MAX (OPC, CEM).
  • Curve 514 represents an authentication based on a weighted approach w1*OPC + w2*CEM.
  • Results indicate that OPC biometrics can be performed successfully for a reading task, where the amplitude of saccadic eye movements can be large when compared to a jumping dot stimulus.
  • both the OPC and CEM methods performed with similar accuracy, providing an HTER of 27%. Fusion methods were able to improve the accuracy, achieving the best result of 19% in the case of the best-performing weighted addition (weight w1 was 0.45 while weight w2 was 0.55). Such results may indicate approximately a 30% reduction in the authentication error.
  • multimodal biometric assessment was able to achieve an HTER of 19.5%.
  • a chin rest that was already available from a commercial eye tracking system was employed for the purpose of stabilizing the head to improve the quality of the acquired data.
  • a comfortable chinrest can be constructed from very inexpensive materials as well.
  • Stimulus was displayed on a 19-inch LCD monitor at a refresh rate of 60 Hz.
  • a web camera and other equipment such as described above may provide a user authentication station at a relatively low cost.
  • Eye-tracking software. ITU eye tracking software was employed for eye tracking purposes. The software was modified to present the required stimulus and store an eye image every three seconds in addition to the existing eye tracking capabilities. Eye tracking was done in no-glint mode.
  • Stimulus was displayed on a 19-inch LCD monitor with a refresh rate of 60 Hz. The distance between the screen and subjects' eyes was approximately 540 mm.
  • a complex pattern stimulus was constructed that employed the Rorschach inkblots used in psychological examination, in order to provide relatively clean patterns likely to evoke varied thoughts and emotions in participants.
  • Inkblot images were selected from the original Rorschach psychodiagnostic plates and sized/cropped to fill the screen. Participants were instructed to examine the images carefully, and recordings were performed over two sessions, with 3 rotations of 5 inkblots per session. The resulting sequence of images was 12 seconds long.
  • FIG. 6 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user.
  • System 600 includes user system 602, computing system 604, and network 606.
  • User system 602 is connected to user display device 608, user input devices 610, and image sensor 611.
  • Image sensor may be, for example, a web cam.
  • User display device 608 may be, for example, a computer monitor.
  • Image sensor 611 may sense ocular data for the user, including eye movement and external characteristics such as iris data and periocular information, and provide the information to user system 602.
  • Authentication system 616 may serve content to the user by way of user display device 608.
  • Authentication system 616 may receive eye movement information, ocular measurements, or other information from user system 602. Using the information received from user system 602, authentication system 616 may assess the identity of the user. If the user is authenticated, access to computing system 604 by the user may be enabled.
  • user system 602, computing system 604, and authentication system 616 are shown as discrete elements for illustrative purposes. These elements may, nevertheless, in various embodiments be implemented on a single computing system with one CPU, or distributed among any number of computing systems.
  • FIG.7 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user wearing an eye-tracking headgear system.
  • System 620 may be generally similar to system 600 described above relative to FIG. 6.
  • the user may wear eye tracking device 612.
  • Eye tracking device 612 may include eye tracking sensors for one or both eyes of the user.
  • User system 602 may receive sensor data from eye tracking device 612.
  • Authentication system 616 may receive information from user system 602 for authenticating the user.
  • Computer systems may, in various embodiments, include components such as a CPU with an associated memory medium such as Compact Disc Read-Only Memory (CD-ROM).
  • the memory medium may store program instructions for computer programs.
  • the program instructions may be executable by the CPU.
  • Computer systems may further include a display device such as a monitor, an alphanumeric input device such as a keyboard, and a directional input device such as a mouse.
  • Computing systems may be operable to execute the computer programs to implement computer-implemented systems and methods.
  • a computer system may allow access to users by way of any browser or operating system.
  • Embodiments of a subset or all (and portions or all) of the above may be implemented by program instructions stored in a memory medium or carrier medium and executed by a processor.
  • a memory medium may include any of various types of memory devices or storage devices.
  • the term "memory medium" is intended to include an installation medium, e.g., a Compact Disc Read Only Memory (CD-ROM), floppy disks, or tape device; a computer system memory or random access memory such as Dynamic Random Access Memory (DRAM), Double Data Rate Random Access Memory (DDR RAM), Static Random Access Memory (SRAM), Extended Data Out Random Access Memory (EDO RAM), Rambus Random Access Memory (RAM), etc.; or a non-volatile memory such as a magnetic media, e.g., a hard drive, or optical storage.
  • DRAM Dynamic Random Access Memory
  • DDR RAM Double Data Rate Random Access Memory
  • SRAM Static Random Access Memory
  • EDO RAM Extended Data Out Random Access Memory
  • RAM Rambus Random Access Memory
  • the memory medium may comprise other types of memory as well, or combinations thereof.
  • the memory medium may be located in a first computer in which the programs are executed, or may be located in a second different computer that connects to the first computer over a network, such as the Internet. In the latter instance, the second computer may provide program instructions to the first computer for execution.
  • the term "memory medium" may include two or more memory mediums that may reside in different locations, e.g., in different computers that are connected over a network.
  • a computer system at a respective participant location may include memory medium(s) on which one or more computer programs or software components according to one embodiment may be stored.
  • the memory medium may store one or more programs that are executable to perform the methods described herein.
  • the memory medium may also store operating system software, as well as other software for operation of the computer system.
  • the memory medium may store a software program or programs operable to implement embodiments as described herein.
  • the software program(s) may be implemented in various ways, including, but not limited to, procedure-based techniques, component-based techniques, and/or object-oriented techniques, among others.
  • the software programs may be implemented using ActiveX controls, C++ objects, JavaBeans, Microsoft Foundation Classes (MFC), browser-based applications (e.g., Java applets), traditional programs, or other technologies or methodologies, as desired.
  • a CPU executing code and data from the memory medium may include a means for creating and executing the software program or programs according to the embodiments described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A method of assessing the identity of a person by one or more of: 1) internal non-visible anatomical structure of an eye represented by the Oculomotor Plant Characteristics (OPC), 2) brain performance represented by the Complex Eye Movement patterns (CEM), 3) iris patterns, and 4) periocular information. CEM and OPC are estimated via measured eye movements. Eye movement, iris patterns, and periocular information can be captured by a single camera sensor. The person's identity may be assessed based on the estimated characteristics of the oculomotor plant, complex eye movement patterns, iris patterns, and periocular information.

Description

PERSON IDENTIFICATION USING OCULAR BIOMETRICS
STATEMENT REGARDING FEDERALLY SPONSORED
RESEARCH OR DEVELOPMENT
[ 0001 ] This invention was made with government support under award no.
60NANB10D213 awarded by the National Institute of Standards and Technology. The government has certain rights in the invention.
BACKGROUND
Field
[ 0002 ] This disclosure is generally related to person identification, and more specifically to methods and systems for identifying persons using ocular biometric information.
Description of the Related Art
[ 0003 ] Accurate, non-intrusive, and fraud-resistant identity recognition is an area of increasing concern in today's networked world, with the need for security set against the goal of easy access. Many commonly used methods for identity determination have known problems. For example, password verification has demonstrated many weaknesses in areas of accuracy (the individual typing the password may not actually be its owner), usability (people forget passwords), and security (people write passwords down or create easy-to-hack passwords).
[ 0004 ] The communication between a human and a computer frequently begins with an authentication request. During this initial phase of interaction a user supplies a system with verification of his/her identity, frequently given in the form of a typed password, graphically encoded security phrase, or a biometric token such as an iris scan or fingerprint. In cases when the user is prompted to select the identification key from a sequence of numerical and graphical symbols, there is a danger of accidental or intentional shoulder surfing performed directly or by use of a hidden camera. Moreover, such challenges may become especially pronounced in cases of multi-user environments including shared-workstation use and more contemporary interaction media such as tabletops. Authentication methods requiring remembrance of information such as symbols and photos have reduced usability, due to the fact that long, sophisticated passwords can be easily forgotten and short passwords are easy to break. Even biometric methods such as iris and fingerprint-based authentication may not be completely fraud-proof, since they are based on a human's body characteristics that can be replicated.
[ 0005 ] There are a number of methods employed today for biometric purposes. Some examples include the use of fingerprints, iris, retina scans, face recognition, hand/finger geometry, brain waves, periocular features, ear shape, gait, and voice recognition. Iris-based identification is considered to be one of the most accurate among existing biometric modalities. However, commercial iris-identification systems may be easy to spoof, and they are also inconvenient and intrusive since they usually require a user to stand very still and very close to the image-capturing device.
[ 0006 ] The human eye includes several anatomical components that make up the oculomotor plant (OP). These components include the eye globe and its surrounding tissues, ligaments, six extraocular muscles (EOMs) each containing thin and thick filaments, tendon-like components, various tissues and liquids.
[ 0007 ] The brain sends a neuronal control signal to three pairs of extraocular muscles, enabling the visual system to collect information from the visual surround. As a result of this signal, the eye rotates in its socket, exhibiting eye movement such as the following types: fixation, saccade, smooth pursuit, optokinetic reflex, vestibulo-ocular reflex, and vergence. In a simplified scenario, when a stationary person views a two-dimensional display (e.g., computer screen), three eye movement types are exhibited: fixations (maintaining the eye directed on the stationary object of interest), saccades (rapid eye rotations between points of fixation with velocities reaching 700°/s), and smooth pursuit (movements that occur when eyes are tracking a smooth moving object).
[ 0008 ] Accurate estimation of oculomotor plant characteristics is challenging due to the secluded nature of the corresponding anatomical components: estimation must be performed indirectly, must contend with noise and inaccuracies associated with the eye tracking equipment, and relies on effective classification and filtering of the eye movement signal.
SUMMARY
[ 0009 ] In an embodiment, a multi-modal method of assessing the identity of a person includes measuring eye movement of the person and measuring characteristics of an iris and/or periocular information of the person. Based on measured eye movements, estimates may be made of characteristics of an oculomotor plant of the person, complex eye movement patterns representing the brain's control strategies of visual attention, or both. Complex eye movement patterns may include, for example, a scanpath of the person's eyes including a sequence of fixations and saccades. The person's identity may be assessed based on the estimated characteristics of the oculomotor plant, the estimated complex eye movement patterns, and the characteristics of the iris of the person and/or periocular information. The identity assessment may be used to authenticate the person (for example, to allow the person access to a computer system or access to a facility).
[ 0010 ] In an embodiment, a method of assessing a person's identity includes measuring eye movements of the person. Based on measured eye movements, estimates are made of characteristics of an oculomotor plant of the person and complex eye movement patterns of the person's eyes. The person's identity may be assessed based on the estimated characteristics of the oculomotor plant and the estimated complex eye movement patterns that are representative of the brain's control strategies of visual attention.
[0011 ] In an embodiment, a method of assessing a person's identity includes measuring eye movements of the person while the person is looking at stimulus materials. In various embodiments, for example, the person may be reading, looking at various pictures, or looking at a jumping dot of light. Estimates of characteristics of an oculomotor plant are made based on the recorded eye movements.
[ 0012 ] In an embodiment, a system for assessing the identity of a person includes a processor, a memory coupled to the processor, and an instrument (e.g. an image sensor such as a web-camera) that can measure eye movement of a person and external ocular characteristics of the person (such as iris characteristics or periocular information). Based on measured eye movements, the system can estimate characteristics of an oculomotor plant of the person, strategies employed by the brain to guide visual attention represented via complex eye movement patterns, or both. The system can assess the person's identity based on the estimated characteristics of the oculomotor plant, brain strategies to guide visual attention via complex eye movement patterns, and the external ocular characteristics of the person.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] FIG. 1 illustrates one embodiment of assessing a person's identity using multimodal ocular biometrics based on eye movement tracking and measurement of external characteristics.
[ 0014] FIG. 2 illustrates one embodiment of authentication using oculomotor plant characteristics, complex eye movement patterns, iris and periocular information.
[ 0015] FIG. 3 is a block diagram illustrating architecture for biometric authentication via oculomotor plant characteristics according to one embodiment.
[0016] FIG. 4 illustrates a raw eye movement signal with classified fixations and saccades and an associated oculomotor plant characteristics biometric template.
[ 0017 ] FIG. 5 is a graph illustrating receiver operating curves for ocular biometric methods in one experiment.
[0018] FIG. 6 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user.

[0019] FIG. 7 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user wearing an eye-tracking headgear system.
[0020] While the invention is described herein by way of example for several embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the present invention as defined by the appended claims. The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words "include", "including", and "includes" mean including, but not limited to.
DETAILED DESCRIPTION OF EMBODIMENTS
[0021] As used herein, "oculomotor plant" means the eye globe and its surrounding tissues, ligaments, and extraocular muscles (EOMs), each of which may contain thin and thick filaments, tendon-like components, and various tissues and liquids.
[ 0022 ] As used herein, "scanpath" means a spatial path formed by a sequence of fixations and saccades. Fixations occur when the eye is held in a relatively stable position, allowing heightened visual acuity on an object of interest. Saccades may occur when the eye rotates quickly, for example, between points of fixation, with almost no visual acuity maintained during rotation. Velocities during saccades may reach as high as 700° per second.
[0023] As used herein, "brain control strategies" are defined as the ability of the brain to guide the eye to gather information from the surrounding world. Strategies may be based on, or include, information on how and where the eye is guided. Brain control strategies can manifest themselves in the spatial and temporal (e.g., location and duration) characteristics of fixations and in characteristics of saccades such as the main-sequence relationship (the relationship between the maximum velocity exhibited during a saccade and its amplitude), the amplitude-duration relationship (the relationship between a saccade's duration and its amplitude), the saccade's waveform (the relationship between the time it takes to reach peak velocity during a saccade and the total saccade duration), and other characteristics.
[0024] As used herein, "complex eye movement (CEM) patterns" are defined as eye movement patterns and characteristics that allow inferring the brain's strategies or activity to control visual attention. This information may be inferred from individual and aggregated characteristics of a scanpath. In addition, CEM can include, for example, information about saccades elicited in response to different stimuli. Examples of forms in which CEM information may be manifested include: a simple undershoot or overshoot (e.g., a saccade that misses the target, with no correction made to put the gaze location on the target); a corrected undershoot/overshoot (e.g., a saccade that misses the target, after which the brain corrects the eye position to the target's position); a multi-corrected undershoot/overshoot (similar in definition to the corrected undershoot/overshoot, except that an additional series of corrective saccades brings the resulting fixation position closer to the target); a dynamic overshoot (an oppositely directed post-saccadic eye movement in the form of a backward jerk at the offset of a saccade); a compound saccade (an initial saccade that is subsequently followed by two or more oppositely directed saccades of small amplitude that move the eye-gaze back and forth around the target position); and an express saccade (a sequence of saccades directed toward the target, where the end of the initial saccade is in small spatial and temporal proximity to the sequence of new saccades leading to the target).
[0025] As used herein, "assessing a person's identity" includes determining that a person being assessed or measured is a particular person or within a set or classification of persons. "Assessing a person's identity" also includes determining that a person being assessed is not a particular person or within a set or classification of persons (for example, scanning eye movements of Person X to determine whether or not Person X is on a list of persons authorized to access a computer system).
[ 0026 ] In some embodiments, a person's identity is assessed using one or more characteristics that exist only in a live individual. The assessment may be used, for example, to authenticate the person for access to a system or facility. In certain embodiments, authentication of a person does not require the person being authenticated to remember any information (for example, to remember a password).
[0027] In some embodiments, a person's identity is assessed using measurements of one or more visible characteristics of the person in combination with estimates of one or more non-visible characteristics of the person. The assessment may be used, for example, to authenticate the person for access to a computer system.
[0028] In some embodiments, a method of assessing a person's identity includes making estimates based on eye movements of a person and measuring iris characteristics or periocular information of the person. Eye movements may be used to estimate oculomotor plant characteristics, brain control strategies in the form of complex eye movement patterns and scanpaths, or all of these characteristics. FIG. 1 illustrates one embodiment of assessing a person's identity using multimodal ocular biometrics based on eye movement tracking and measurement of external characteristics. At 100, eye movements of a person are tracked. Eye movement data may be collected using, for example, an eye tracking instrument.
[ 0029] At 102, acquired eye movement data may be used to estimate oculomotor plant characteristics. Dynamic and static characteristics of the oculomotor plant that may be estimated include the eye globe's inertia, dependency of an individual muscle's force on its length and velocity of contraction, resistive properties of the eye globe, muscles and ligaments, characteristics of the neuronal control signal sent by the brain to the EOMs, and the speed of propagation of this signal. Individual properties of the EOMs may vary depending on their roles. For example, the agonist role may be associated with the contracting muscle that pulls the eye globe in the required direction, while the antagonist role may be associated with the lengthening muscle resisting the pull.
[0030] At 104, acquired eye movement data may be used to analyze complex eye movements. The CEM may be representative of the brain's control strategies for guiding visual attention. Complex eye movement patterns may be based, for example, on individual or aggregated scanpath data. Scanpaths may include one or more fixations and one or more saccades by a person's eye. The processed fixation and saccade groups may describe the scanpath of a recording. Individual scanpath metrics may be calculated for each recording based on the properties of its unique scanpath. Basic eye movement metrics may include: fixation count, average fixation duration, average vectorial saccade amplitude, average horizontal saccade amplitude, average vertical saccade amplitude, average vectorial saccade velocity, average vectorial saccade peak velocity, the velocity waveform indicator (Q), and a variety of saccade types such as undershoot/overshoot, corrected undershoot/overshoot, multi-corrected undershoot/overshoot, dynamic, compound, and express saccades. More complex metrics, resulting from the aggregated scanpath data, may include: scanpath length, scanpath area, regions of interest, inflection count, and slope coefficients of the amplitude-duration and main sequence relationships.
[0031] At 106, measurements may be taken of external characteristics of the person. In one embodiment, one or more characteristics of the person's iris and/or periocular information are measured. In certain embodiments, non-ocular external characteristics, such as facial characteristics or fingerprints, may be acquired in addition to, or instead of, external ocular characteristics. At 108, the measurements acquired at 106 are used to assess external characteristics of the person.

[0032] At 110, a biometric assessment is performed based on some or all of the estimated oculomotor plant characteristics, complex eye movement patterns, and external ocular characteristics. In some embodiments, the biometric assessment combines one or more dynamic characteristics with one or more static traits, such as iris patterns or periocular information. Authentication of a person may be carried out based on a combination of two or more of: oculomotor plant characteristics, complex eye movement patterns, and external ocular characteristics.
[0033] In some embodiments, a single instrument is used to acquire all of the eye movement data and external characteristic data (for example, iris patterns and/or periocular information) for a person. In other embodiments, two or more different instruments may be used to acquire eye movement data or external characteristic data for a person.
[0034] Methods and systems as described herein may be shoulder-surfing resistant. For example, data presented during authentication procedures as described herein may not reveal any information about a user to an outside observer. In addition, methods and systems as described herein may be counterfeit-resistant in that, for example, they can be based on internal non-visible anatomical structures or on complex eye movement patterns representative of the brain's strategies to guide visual attention. In some embodiments, OPC and CEM biometric information is used in combination to assess the identity of a person.
[0035] In some embodiments, a user is authenticated by estimating individual oculomotor plant characteristics (OPC) and complex eye movement patterns generated for a specific type of stimulus. The presented visual information may be used to evoke eye movements that facilitate extraction of the OPC and CEM. The information presented can be overseen by a shoulder-surfer with no negative consequences. As a result, the authentication does not require any feedback from a user except looking at a presented sequence of images or text.
[0036] FIG. 2 illustrates one embodiment of authentication using OPC, CEM, iris, and periocular information. The OPC, CEM, iris, and periocular information may be captured by a single camera sensor. Identity assessment 200 includes use of image sensor 201 and eye tracking software 203. From image data captured with image sensor 201, eye tracking software 203 may generate raw eye positional signal data, which may be sent to the OPC and CEM modules, and eye images, which may be sent to iris module 205 and periocular module 207. In general, all modules may process the input in the form of a raw eye position signal or eye images, perform feature extraction, generate biometric templates, perform individual trait template matching 206 and multi-trait template matching 208, and produce decision output 210. Feature extraction 204 includes OPC feature extraction 211, CEM feature extraction 213, iris feature extraction 215, and periocular feature extraction 217. Processing of eye images includes iris module image pre-processing 231, periocular module image pre-processing 232, and iris module template generation 233.
[0037] At 202, eye positional signal information is acquired. Raw eye movement data produced during a recording is supplied to an eye movement classification module at 212. In some embodiments, an eye tracker sends the recorded eye gaze trace to an eye movement classification algorithm at 212 after visual information employed for the authentication is presented to a user. An eye movement classification algorithm may extract fixations and saccades from the signal. The extracted saccades' trajectories may be supplied to the mathematical model of the oculomotor plant 214 for the purpose of simulating the exact same trajectories. At 216, an optimization algorithm modifies the values for the OPC to produce a minimum error between the recorded and the simulated signal. The values that produce the minimum error are supplied to an authentication algorithm at 218. The authentication algorithm may be driven by a Hotelling's T-square test 220. Templates may be accessible from template database 221. The Hotelling's T-square test (or some other appropriate statistical test) may either accept or reject the user from the system. An authentication probability value (which may be derived, for example, by the Hotelling's T-square test) may be propagated to decision fusion module 222. Although in the embodiment shown in FIG. 2 a Hotelling's T-square test is employed, an authentication algorithm may be driven by other suitable statistical tests. In one embodiment, an authentication algorithm uses a Student's t-test (which may be enhanced by voting).
[0038] Fusion module 222 may accept or reject a person based on one or more similarity scores. In some cases, fusion module 222 accepts or rejects a person based on OPC similarity score 224, CEM similarity score 226, iris similarity score 270, and periocular similarity score 280. Further aspects of implementing authentication based on OPC and the other modalities are set forth below.
[0039] Eye Movement Classification: At 212, a Velocity-Threshold (I-VT) classification algorithm (or some other eye movement classification algorithm) may be employed, with threshold selection accomplished via standardized behavior scores. After classification, saccades with amplitudes smaller than 0.5° (microsaccades) may be filtered out to reduce the amount of noise in the recorded data.
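By way of illustration, a minimal Python sketch of an I-VT classifier and microsaccade filter follows. It assumes gaze samples in degrees with timestamps in seconds; the function names, data layout, and example threshold are illustrative assumptions rather than part of any particular embodiment.

    import numpy as np

    def ivt_classify(x, y, t, velocity_threshold):
        # Label each sample as saccadic if its vectorial velocity exceeds
        # the threshold (e.g., 70 deg/s); the remaining samples form fixations.
        vx = np.gradient(x, t)
        vy = np.gradient(y, t)
        speed = np.hypot(vx, vy)           # vectorial velocity, deg/s
        return speed > velocity_threshold  # True = saccade sample

    def drop_microsaccades(saccades, min_amplitude=0.5):
        # Discard saccades whose vectorial amplitude is below 0.5 deg.
        return [s for s in saccades
                if np.hypot(s['x'][-1] - s['x'][0],
                            s['y'][-1] - s['y'][0]) >= min_amplitude]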
[0040] Oculomotor Plant Mathematical Model: At 214, a linear horizontal homeomorphic model of the oculomotor plant capable of simulating the horizontal and vertical components of eye movement during saccades may be employed. The model may mathematically represent dynamic properties of the OP via a set of linear mechanical components such as springs and damping elements. The following properties may be considered for the two extraocular muscles that are modeled (medial and lateral recti) and the eye globe: active state tension - tension developed as a result of the innervation of an EOM by a neuronal control signal; length-tension relationship - the relationship between the length of an EOM and the force it is capable of exerting; force-velocity relationship - the relationship between the velocity of an EOM extension/contraction and the force it is capable of exerting; passive elasticity - the resistive properties of an EOM not innervated by the neuronal control signal; series elasticity - resistive properties of an EOM while the EOM is innervated by the neuronal control signal; and passive elastic and viscous properties of the eye globe due to the characteristics of the surrounding tissues. The model may take as an input a neuronal control signal, which may be approximated by a pulse-step function. The OPC described above can be separated into two groups, each separately contributing to the horizontal and the vertical components of movement.
[0041] OPC Estimation Algorithm: At 230, a Nelder-Mead (NM) simplex algorithm (or some other minimization algorithm, such as Trust-Region using the interior-reflective Newton method) may be used in a form that allows estimation of all OPC vector parameters simultaneously. A subset of OPC may be empirically selected, with the remaining OPC fixed to default values. In an example, the subset of selected OPC comprises: length-tension relationship - the relationship between the length of an extraocular muscle and the force it is capable of exerting; series elasticity - resistive properties of an eye muscle while the muscle is innervated by the neuronal control signal; passive viscosity of the eye globe; force-velocity relationship - the relationship between the velocity of an extraocular muscle extension/contraction and the force it is capable of exerting - in the agonist muscle; force-velocity relationship in the antagonist muscle; the agonist and antagonist muscles' tension intercept, which ensures an equilibrium state during an eye fixation at the primary eye position (for example, an intercept coefficient in a linear relationship between the force that a muscle applies to the eye and the rotational position of the eye during fixation); the agonist muscle's tension slope (for example, a slope coefficient in a linear relationship between the force that an agonist muscle applies to the eye and the rotational position of the eye during fixation); the antagonist muscle's tension slope (for example, a tension slope coefficient for the antagonist muscle); and the eye globe's inertia. Lower and upper boundaries may be imposed to prevent reduction or growth of each individual OPC value to less than 10% or more than 1000% of its default value. Stability degradation of the numerical solution for the differential equations describing the OPMM may be used as an additional indicator for acceptance of the suggested OPC values by the estimation algorithm. In some embodiments, a template including some or all of the OPC described above is passed to a matching module to produce a matching score between a computed template and a template already stored in the database.
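As a rough sketch of how such an estimation loop might look, the following Python fragment uses SciPy's Nelder-Mead implementation. The simulate_saccade function stands in for the oculomotor plant mathematical model and is assumed rather than defined here; the 10% to 1000% boundaries follow the description above.

    import numpy as np
    from scipy.optimize import minimize

    def estimate_opc(recorded, t, simulate_saccade, opc_defaults):
        # Boundaries: each OPC stays within 10%..1000% of its default value.
        lower, upper = 0.1 * opc_defaults, 10.0 * opc_defaults

        def error(opc):
            if np.any(opc < lower) or np.any(opc > upper):
                return np.inf                     # reject out-of-bound vectors
            simulated = simulate_saccade(opc, t)  # hypothetical OPMM hook
            return np.sum(np.abs(recorded - simulated))

        result = minimize(error, opc_defaults, method='Nelder-Mead')
        return result.x                           # optimized OPC vector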
[ 0042 ] Authentication: As an input, the person authentication algorithm takes a vector of the OPC optimized for each qualifying saccade. In some embodiments, a statistical test is applied to assess all optimized OPC in the vector at the same time. In the example shown in FIG. 2, a Hotelling's T-square test is applied. The test may assess data variability in a single individual as well as across multiple individuals. In one embodiment, the Hotelling's T-square test is applied to an empirically selected subset of five estimated parameters: series elasticity, passive viscosity of the eye globe, eye globe's inertia, agonist muscle's tension slope, and the antagonist muscle's tension slope.
[0043] As a part of the authentication procedure, the following null hypothesis (H0) is formulated, assuming datasets i and j are to be compared: "H0: There is no difference between the vectors of OPC of subjects i and j." The statistical significance level (p) resulting from the Hotelling's T-square test may be compared to a predetermined threshold (for example, 0.05). In this example, if the resulting p is smaller than the threshold, H0 is rejected, indicating that the datasets in question belong to different people. Otherwise, H0 is accepted, indicating that the datasets belong to the same person. Two types of errors may be recorded as a result: (1) rejection of H0 when the datasets belong to the same person; and (2) acceptance of H0 when the datasets are from different people.
[0044] In the method described above, variability was accounted for by applying a Hotelling's T-square test. In certain embodiments, oculomotor plant characteristics are numerically evaluated given a recorded eye-gaze trace.
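A compact sketch of the two-sample Hotelling's T-square test, as it might be applied to per-saccade OPC vectors, is given below; the conversion to an F statistic is the standard one, while the variable names and data layout are assumptions.

    import numpy as np
    from scipy.stats import f

    def hotelling_t2_pvalue(X, Y):
        # X, Y: (saccades x parameters) OPC matrices for subjects i and j.
        n1, p = X.shape
        n2 = Y.shape[0]
        diff = X.mean(axis=0) - Y.mean(axis=0)
        # Pooled sample covariance of the two datasets.
        S = ((n1 - 1) * np.cov(X, rowvar=False) +
             (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
        t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S, diff)
        # T^2 relates to an F distribution with (p, n1 + n2 - p - 1) dof.
        f_stat = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
        return f.sf(f_stat, p, n1 + n2 - p - 1)

A p-value below the chosen threshold (for example, 0.05) would reject H0 and indicate different people, mirroring the decision rule described above.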
[0045] Referring to the CEM side of FIG. 2, aspects of biometrics using CEM are described. In some embodiments, some aspects of biometrics using CEM in the form of scanpaths are as described in C. Holland and O. V. Komogortsev, "Biometric Identification via Eye Movement Scanpaths in Reading," in Proceedings of the IEEE International Joint Conference on Biometrics (IJCB), 2011, pp. 1-8. As noted above, raw eye movement data produced during a recording is supplied to an eye movement classification module at 212. Classified fixations and saccades forming complex eye movement patterns may be processed by two modules: individual scanpath component module 240 and aggregated scanpath module 241. Individual scanpath component module 240 may process eye movement characteristics belonging to individual fixations and saccades. Characteristics processed by the individual scanpath component module 240 may include the following:

[0046] Fixation Count - the number of detected fixations. Fixation count is indicative of the number of objects processed by the subject, and was measured simply as the total number of fixations contained within the scanpath.
[ 0047 ] Average Fixation Duration - sum of duration of all fixations detected divided by fixation count. Average fixation duration is indicative of the amount of time a subject spends interpreting an object, and was measured as the sum of fixation durations over the fixation count.
[0048] Average Vectorial Saccade Amplitude - the sum of vectorial saccade amplitudes over the total number of saccades, where the vectorial amplitude of a saccade was defined as the Euclidean norm of the horizontal and vertical amplitudes. There is a noted tendency for saccades to maintain similar amplitudes during reading; average saccade amplitude was considered as a candidate biometric feature under the assumption that differences in amplitude may be apparent between subjects. Average vectorial saccade amplitude was measured according to the equation:

Vectorial Average = (1/n) × Σ (from i = 1 to n) sqrt(x_i² + y_i²)

where n is the total number of saccades and x_i and y_i are the horizontal and vertical amplitudes of the i-th saccade.
[0049] Average Horizontal Saccade Amplitude - the average amplitude of the horizontal component of saccadic movement. Horizontal saccade amplitudes were considered separately as these are more indicative of between-word saccades. Average horizontal saccade amplitude was measured as the sum of horizontal saccade amplitudes greater than 0.5° over the total number of horizontal saccades with amplitude greater than 0.5°.
[0050 ] Average Vertical Saccade Amplitude - average amplitude of the vertical component of saccadic movement. Vertical saccade amplitude was considered separately as these are more indicative of between-line saccades. Average vertical saccade amplitude was measured as the sum of vertical saccade amplitudes greater than 0.5° over the total number of vertical saccades with amplitude greater than 0.5°.
[0051] Average Vectorial Saccade Velocity - the sum of vectorial saccade velocities over the total number of saccades, where the vectorial velocity of a saccade was defined as the Euclidean norm of the horizontal and vertical velocities.
[ 0052 ] Average Vectorial Saccade Peak Velocity - sum of vectorial saccade peak velocities over the total number of saccades. Average vectorial saccade peak velocity was measured as the sum of vectorial saccade peak velocities over the total number of saccades, where the vectorial peak velocity of a saccade was defined as the Euclidean norm of the horizontal and vertical peak velocities.
[0053] Velocity Waveform Indicator (Q) - the relationship between the time it takes to reach peak velocity during a saccade and the total saccade duration. We use the term velocity waveform indicator (Q) to refer to the ratio of peak velocity to average velocity of a given saccade. In normal human saccades this value is roughly constant at 1.6, though it is assumed to be subject to some amount of variation, similar to the amplitude-duration and main sequence relationships. A rough estimate of this value may be obtained from the ratio of the average vectorial peak velocity over the average vectorial velocity.
[ 0054 ] Amplitude-Duration Relationship - the relationship between the amplitude of the saccade and its duration.
[ 0055 ] Coefficient of the Amplitude-Duration Relationship. The amplitude-duration relationship varies from person to person, and describes the tendency for saccade duration to increase linearly with amplitude, according to the equation:
Duration = C × |Amplitude| + Duration_min
[ 0056 ] To calculate the slope coefficient of this relationship, a data set may be constructed from the saccade groups such that x-column data contained the larger absolute component (horizontal or vertical) amplitude and y-column data contained the respective saccade duration.
[ 0057 ] The slope coefficient of the amplitude-duration relationship may be obtained from a linear regression of this data set.
[0058] Main Sequence Relationship - the relationship between the amplitude of the saccade and its peak velocity.
[ 0059] Coefficient of the Main Sequence Relationship. The main sequence relationship varies from person to person, and describes the tendency for saccade peak velocity to increase exponentially with amplitude, according to the equation:
Peak Velocity = Velocity_max × (1 − e^(−|Amplitude| / C))

[0060] This relationship has been shown to be roughly linear for small saccades in the range of 0-10° amplitude. As a result, a linear approximation may be acceptable in the current context, as the saccades produced during reading are often on the order of 0-3° amplitude, with very few over 10° amplitude.
[0061] To calculate the slope coefficient of this relationship, a data set may be constructed from the saccade groups such that x-column data contained the absolute component (horizontal or vertical) amplitude and y-column data contained the respective absolute component peak velocity. The slope coefficient of the main sequence relationship may be obtained from a linear regression of this data set.
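As an illustration of the regression step, a short Python sketch follows; per-saccade amplitude, duration, and peak-velocity arrays are assumed as inputs, and np.polyfit supplies the least-squares slope.

    import numpy as np

    def amplitude_duration_slope(amplitudes, durations):
        # Slope C in: Duration = C * |Amplitude| + Duration_min
        slope, _intercept = np.polyfit(np.abs(amplitudes), durations, 1)
        return slope

    def main_sequence_slope(amplitudes, peak_velocities):
        # Linear approximation of the main sequence for small (0-10 deg) saccades.
        slope, _intercept = np.polyfit(np.abs(amplitudes), peak_velocities, 1)
        return slope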
[0062 ] Characteristics processed by the aggregated scanpath module 241 may include the following:
[0063 ] Scanpath Length - summated amplitude of all detected saccades. Scanpath length is indicative of the efficiency of visual search, and may be considered as a candidate biometric feature under the assumption that visual search is dependent on the subject's familiarity with similar patterns/content. Scanpath length may be measured as the sum of absolute distances between the vectorial centroid of fixation points, where the vectorial centroid was defined as the Euclidean norm of the horizontal and vertical centroid positions, according to the equation:
Scanpath Length = Σ (from i = 1 to n−1) sqrt((x_i+1 − x_i)² + (y_i+1 − y_i)²)

where (x_i, y_i) is the centroid of the i-th fixation and the sum runs over consecutive fixation centroids.
[ 0064 ] Scanpath Area - area that is defined by a convex hull that is created by fixation points. Scanpath area may be measured as the area of the convex hull formed by fixation points. Scanpath area is similar to scanpath length in its indication of visual search efficiency, but may be less sensitive to localized searching. That is, a scanpath may have a large length while only covering a small area.
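A minimal sketch of these two aggregate metrics in Python might look as follows; fixation centroid coordinates in degrees are assumed, and the convex hull is computed with SciPy (for 2-D points, ConvexHull's volume attribute is the enclosed area).

    import numpy as np
    from scipy.spatial import ConvexHull

    def scanpath_length(cx, cy):
        # Sum of Euclidean distances between consecutive fixation centroids.
        return float(np.sum(np.hypot(np.diff(cx), np.diff(cy))))

    def scanpath_area(cx, cy):
        # Area of the convex hull formed by the fixation centroids
        # (requires at least three non-collinear points).
        return ConvexHull(np.column_stack((cx, cy))).volume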
[0065] Regions of Interest - the total number of spatially unique regions identified after applying a spatial mean shift clustering algorithm to the sequence of fixations that define a scanpath.
[0066] Regions of interest may be measured as the total number of spatially unique regions identified after applying a spatial mean shift clustering algorithm to the fixation points of the scanpath, using a sigma value of 2° and convergence resolution of 0.1 °.
[0067] Inflection Count - the number of eye-gaze direction shifts in a scanpath. Inflections occur when the scanpath changes direction. In reading, there is a certain number of "forced" inflections that are necessary to progress through the text, but general differences in inflection count are indicative of attentional shifts. Inflection count may be measured as the number of saccades in which the horizontal and/or vertical velocity changes sign, according to the following algorithm:
1. Inflections = 0
2. i = 2
3. While i <= Saccade Count:
4. If sign(Velocity_i) != sign(Velocity_i-1):
5. Inflections = Inflections + 1
6. End if
7. i = i + 1
8. End while
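A direct translation of this pseudocode into Python, assuming one representative velocity value per saccade in temporal order, might read:

    import numpy as np

    def inflection_count(velocities):
        # Count sign changes between consecutive saccade velocities.
        signs = np.sign(velocities)
        return int(np.sum(signs[1:] != signs[:-1]))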
[0068 ] Scanpath fix - aggregated representation of a scanpath that is defined by fixation points and their coordinates.
[0069] OPC biometric template 242 and scanpath biometric template 244 may be tested for a match/non-match. Characteristics may be compared using a Gaussian cumulative distribution function (CDF) 246. In some cases, all characteristics except the scanpath fix are compared via the Gaussian CDF 246.
[0070] To determine a relative measure of similarity between metrics, a Gaussian cumulative distribution function (CDF) was applied as follows, where x and μ are the metric values being compared and σ is the metric-specific standard deviation:

p = (1/2) × [1 + erf((x − μ) / (σ × sqrt(2)))]
[0071] A Gaussian CDF comparison produces a probability value between 0 and 1, where a value of 0.5 indicates an exact match and a value of 0 or 1 indicates no match. This probability may be converted into a more intuitive similarity score, where a value of 0 indicates no match and a value of 1 indicates an exact match, with the following equation:

Similarity = 1 − |2p − 1|
[0072 ] From the similarity score, a simple acceptance threshold may be used to indicate the level of similarity which constitutes a biometric match.
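Concretely, the comparison and conversion described above might be sketched as follows, using SciPy's Gaussian CDF; the metric-specific standard deviation is assumed to be known, for example from training data.

    from scipy.stats import norm

    def metric_similarity(x, mu, sigma):
        p = norm.cdf(x, loc=mu, scale=sigma)  # 0.5 indicates an exact match
        return 1.0 - abs(2.0 * p - 1.0)       # 1 = exact match, 0 = no match

    # Example: accept as a biometric match if similarity exceeds a threshold.
    # is_match = metric_similarity(x, mu, sigma) > threshold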
[0073] In some embodiments, scanpath fix characteristics are compared via pairwise distances between the centroids representing positions of fixations at 248. In comparing two scanpaths, the Euclidean pairwise distance may be calculated between the centroid positions of fixations. Following this, a tally may be made of the total number of fixation points in each set that can be matched to within 1° of at least one point in the opposing set. The similarity of scanpaths may be assessed as the proportion of tallied fixation points to the total number of fixation points, producing a similarity score similar to those generated for the various eye movement metrics. In some embodiments, the total difference is normalized to produce a similarity score where a value of 0 indicates no match and a value of 1 indicates an exact match.
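A possible sketch of this centroid-matching comparison, with fixation centroids given as (n, 2) arrays in degrees, is shown below; the 1° matching radius follows the description above, while the function name and data layout are assumptions.

    import numpy as np

    def scanpath_fix_similarity(a, b, radius=1.0):
        # Pairwise Euclidean distances between the two centroid sets.
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        # Tally points in each set matched within 1 deg of the opposing set.
        matched = (np.sum(d.min(axis=1) <= radius) +
                   np.sum(d.min(axis=0) <= radius))
        return matched / (len(a) + len(b))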
[0074] Iris similarity score 270 may be generated using iris templates 272. In this example, to produce similarity score 270, a Hamming distance calculation is performed at 274.
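For binary iris codes stored as boolean arrays, the fractional Hamming distance calculation might be sketched as follows; masking of occluded bits, common in iris matching, is omitted here for brevity.

    import numpy as np

    def iris_hamming_distance(code_a, code_b):
        # Fraction of disagreeing bits; 0 = identical codes.
        return np.count_nonzero(code_a ^ code_b) / code_a.size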
[0075] Periocular similarity score 280 may be generated using periocular templates 282. Periocular similarity score 280 may be based on periocular template comparisons at 284.
[0076] At 250, a weighted fusion module produces a combined similarity score via a weighted sum of the similarity scores produced by one or more of the individual metrics. Weights for each individual metric may be determined empirically. Other score-level fusion techniques can be applied, e.g., density-based score fusion, transformation score fusion, classifier-based score fusion, and methods that employ user-specific and evolving classification thresholds. The resulting similarity score may be employed for the match/non-match decision for scanpath authentication, or may serve as an input to decision fusion module 222, which may combine, for example, OPC and CEM biometrics.
[0077] For example, at 222, OPC similarity score 224 and CEM similarity score 226 may be considered for final match/non-match decisions. Match/non-match decisions may be made based on one or more of the following information fusion approaches:
[0078] Logical OR/AND. The logical fusion method employs individual decisions from the OPC and scanpath modalities in the form of 1 (match) or 0 (non-match) to produce the final match/non-match decision via logical OR (or AND) operations. In the case of OR, at least one method must indicate a match for a final match decision. In the case of AND, both methods must indicate a match for a final match decision.
[0079] MIN/MAX. For a MIN (or MAX) method, the smallest (or largest) similarity score may be selected between the OPC and the scanpath modalities. Thresholding may be applied to arrive at the final decision. For example, if the resulting value is larger than a threshold, a match is indicated; otherwise, a non-match is indicated.
[0080] Weighted addition. Weighted summation of two or more similarity scores from the OPC, CEM, iris, and periocular modalities may be performed via the formula p = w1·A + w2·B + w3·C + w4·D. Here p is the resulting score; A, B, C, and D stand for the scores derived from the OPC, CEM, iris, and periocular modalities, respectively; and w1, w2, w3, and w4 are the corresponding weights. The resulting score p may be compared with a threshold value. If p is greater than the threshold, a match is indicated; otherwise, a non-match is indicated.
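A minimal sketch of this weighted-addition rule follows; the weights and threshold are assumed to have been chosen empirically, as described above.

    def fused_match(scores, weights, threshold):
        # scores/weights ordered as (OPC, CEM, iris, periocular).
        p = sum(w * s for w, s in zip(weights, scores))
        return p > threshold  # True = match, False = non-match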
[0081] Other score-level fusion techniques can be applied, e.g., density-based score fusion, transformation score fusion, classifier-based score fusion, and methods that employ user-specific and evolving classification thresholds.
[0082] FIG. 3 is a block diagram illustrating an architecture for biometric authentication via oculomotor plant characteristics according to one embodiment. In certain embodiments, assessment using OPC as described with respect to FIG. 3 may be combined with assessments based on CEM, iris characteristics, periocular information, or some or all of those traits. In one embodiment, a biometric authentication is based on a combination of OPC, CEM, iris characteristics, and periocular information.
[0083] Biometric authentication 300 may involve information gathered during enrollment of a user and, at a later time, during authentication of the user. During enrollment, the recorded eye movement signal from an individual is supplied to eye movement classification module 302. Eye movement classification module 302 classifies the eye position signal 304 into fixations and saccades. A sequence of classified saccade trajectories is sent to the oculomotor plant mathematical model (OPMM) 306.
[ 0084] Oculomotor plant mathematical model (OPMM) 306 may generate simulated saccades' trajectories based on the default OPC values that are grouped into a vector with the purpose of matching the simulated trajectories with the recorded ones. Each individual saccade may be matched independently of any other saccade. Both classified and simulated trajectories for each saccade may be sent to error function module 308. Error function module 308 may compute error between the trajectories. The error result may trigger the OPC estimation module 310 to optimize the values inside of the OPC vector minimizing the error between each pair of recorded and simulated saccades.
[ 0085] When the minimum error is achieved for all classified and simulated saccade pairs, an OPC biometric template 312 representing a user may be generated. The template may include a set of the optimized OPC vectors, with each vector representing a classified saccade. The number of classified saccades may determine the size of the user's OPC biometric template.
[ 0086] During a person's verification, the information flow may be similar to the enrollment procedure. Eye position data 314 may be provided to eye movement classification module 302. In addition, the estimated user biometrics template may be supplied to the person authentication module 316 and information fusion module 318 to authenticate a user. Person authentication module 316 may accept or reject a user based on the recommendation of a given classifier. Information fusion module 318 may aggregate information related to OPC vectors. In some embodiments, information fusion module 318 may work in conjunction with the person authentication module to authenticate a person based on multiple classification methods. The output during user authentication procedure may be a yes/no answer 320 about claimed user's identity.
[ 0087 ] Further description for various modules in this example is provided below.
[0088] Eye Movement Classification. An automated eye movement classification algorithm may be used to help establish an invariant representation for the subsequent estimation of the OPC values. The goal of this algorithm is to automatically and reliably identify each saccade's beginning, end, and all trajectory points from a very noisy and jittery eye movement signal (for example, as shown in FIG. 4). An additional goal of the eye movement classification algorithm is to provide additional filtering of saccades to ensure their high quality and a sufficient quantity of data for the estimation of the OPC values.
[0089] In one embodiment, a standardized Velocity-Threshold (I-VT) algorithm is selected due to its speed and robustness. A comparatively high classification threshold of 70° per second may be employed to reduce the impact of trajectory noise at the beginning and the end of each saccade. Additional filtering may include discarding saccades with amplitudes of less than 4°, durations of less than 20 ms, and various trajectory artifacts that do not belong to normal saccades.
[0090] Oculomotor Plant Mathematical Model. The oculomotor plant mathematical model simulates accurate saccade trajectories while containing major anatomical components related to the OP. In one embodiment, a linear homeomorphic 2D OP mathematical model is selected. The oculomotor plant mathematical model may be, for example, as described in O. V. Komogortsev and U. K. S. Jayarathna, "2D Oculomotor Plant Mathematical Model for eye movement simulation," in IEEE International Conference on BioInformatics and BioEngineering (BIBE), 2008, pp. 1-8. The oculomotor plant mathematical model in this example is capable of simulating saccades with properties resembling those of normal humans on a 2D plane (e.g., a computer monitor) by considering physical properties of the eye globe and four extraocular muscles: the medial, lateral, superior, and inferior recti. The following advantages are associated with the selection of this oculomotor plant mathematical model: 1) major anatomical components are accounted for and can be estimated; 2) the linear representation simplifies the estimation process of the OPC while producing accurate simulation data within the spatial boundaries of a regular computer monitor; 3) the architecture of the model allows dividing it into two smaller 1D models, one responsible for simulating the horizontal component of movement and the other the vertical. Such an assignment, while producing identical simulation results when compared to the full model, may allow a significant reduction in the complexity of the required solution and allow simultaneous simulation of both movement components on a multi-core system.
[0091] Specific OPC that may be accounted for by the OPMM and selected to be part of the user's biometric template are discussed below. FIG. 4 illustrates a raw eye movement signal with classified fixations and saccades 400 and an associated OPC biometric template 402. The middle of FIG. 4 shows saccade trajectories simulated via the OPMM, generated with the OPC vectors that provide the closest matches to the recorded trajectories.
[0092] In this example, a subset of nine OPC is selected as a vector to represent an individual saccade for each component of movement (horizontal and vertical): length-tension relationship (Klt = 1.2 g/°) - the relationship between the length of an extraocular muscle and the force it is capable of exerting; series elasticity (Kse = 2.5 g/°) - resistive properties of an eye muscle while the muscle is innervated by the neuronal control signal; passive viscosity (Bp = 0.06 g·s/°) of the eye globe; force-velocity relationship - the relationship between the velocity of an extraocular muscle extension/contraction and the force it is capable of exerting - in the agonist muscle (BAG = 0.046 g·s/°); force-velocity relationship in the antagonist muscle (BANT = 0.022 g·s/°); the agonist and antagonist muscles' tension intercept (NFIX_C = 14.0 g), which ensures an equilibrium state during an eye fixation at the primary eye position; the agonist muscle's tension slope (NAG_C = 0.8 g); the antagonist muscle's tension slope (NANT_C = 0.5 g); and the eye globe's inertia (J = 0.000043 g·s²/°). All tension characteristics are directly impacted by the neuronal control signal sent by the brain, and therefore partially contain the neuronal control signal information.
[0093] The remaining OPC used to produce the simulated saccades may be fixed to the following default values: agonist muscle neuronal control signal activation constant (11.7) and deactivation constant (2.0); antagonist muscle neuronal control signal activation constant (2.4) and deactivation constant (1.9); pulse height of the antagonist neuronal control signal (0.5 g); pulse width of the agonist neuronal control signal (PWAG = 7 + |A| ms); passive elasticity of the eye globe (Kp = NAG_C − NANT_C); pulse height of the agonist neuronal control signal (iteratively varied to match the recorded saccade's onset and offset coordinates); and pulse width of the antagonist neuronal control signal (PWANT = PWAG + 6).
[0094] The error function module provides high sensitivity to differences between the recorded and simulated saccade trajectories. In some cases, the error function is implemented as the absolute difference between the saccades that are recorded by an eye tracker and the saccades that are simulated by the OPMM:
Error = Σ (from i = 1 to n) |t_i − s_i|
[0095] where n is the number of points in a trajectory, t_i is a point in the recorded trajectory, and s_i is the corresponding point in the simulated trajectory. The absolute difference approach may provide an advantage over other estimations such as root mean squared error (RMSE) due to its higher absolute sensitivity to the differences between the saccade trajectories.

First Example of an Experiment with Multimodal Ocular Authentication in which only CEM & OPC Modalities are employed
[0096] The following describes an experiment including biometric authentication based on oculomotor plant characteristics and complex eye movement patterns.
[0097] Equipment. The data was recorded using the EyeLink II eye tracker at a sampling frequency of 1000 Hz. Stimuli were presented on a 30 inch flat screen monitor positioned at a distance of 685 millimeters from the subject, with screen dimensions of 640 × 400 millimeters and a resolution of 2560 × 1600 pixels. A chin rest was employed to ensure high reliability of the collected data.
[0098] Eye Movement Recording Procedure. Eye movement records were generated for participants' readings of various excerpts from Lewis Carroll's "The Hunting of the Snark." This poem was chosen for its difficult and nonsensical content, forcing readers to progress slowly and carefully through the text.
[0099] For each recording, the participant was given 1 minute to read, and text excerpts were chosen to require roughly 1 minute to complete. Participants were given a different excerpt for each of four recording sessions, and excerpts were selected from "The Hunting of the Snark" to ensure that the difficulty of the material was consistent, that line lengths were consistent, and that learning effects did not impact subsequent readings.
[00100] Participants and Data Quality. Eye movement data was collected for a total of 32 subjects (26 males / 6 females), ages 18 - 40 with an average age of 23 (SD = 5.4). Mean positional accuracy of the recordings averaged between all calibration points was 0.74° (SD = 0.54°). 29 of the subjects performed 4 recordings each, and 3 of the subjects performed 2 recordings each, generating a total of 122 unique eye movement records.
[00101] The first two recordings for each subject were conducted during the same session with a 20 minute break between recordings; the second two recordings were performed a week later, again with a 20 minute break between recordings.

[00102] Performance Evaluation. The performance of the authentication methods was evaluated via the False Acceptance Rate (FAR) and False Rejection Rate (FRR) metrics. The FAR represents the percentage of impostors' records accepted as authentic users, and the FRR indicates the percentage of authentic users' records rejected by the system. To simplify the presentation of the results, the Half Total Error Rate (HTER) was employed, defined as the averaged combination of FAR and FRR: HTER = (FAR + FRR) / 2.
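The error metrics might be computed as in the following sketch, given similarity scores for genuine and impostor comparison pairs at a fixed acceptance threshold:

    import numpy as np

    def far_frr_hter(genuine_scores, impostor_scores, threshold):
        far = np.mean(np.asarray(impostor_scores) >= threshold)  # impostors accepted
        frr = np.mean(np.asarray(genuine_scores) < threshold)    # genuine rejected
        return far, frr, (far + frr) / 2.0                       # HTER is the mean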
[00103] The performance of authentication using oculomotor plant characteristics, scanpaths, or combinations thereof was computed across all possible combinations of eye movement records. For example, considering 3 eye movement records (A, B, and C) produced by unique subjects, similarity scores were produced for the combinations: A + B, A + C, and B + C. For the 122 eye movement records, this resulted in 7381 combinations that were employed for acceptance and rejection tests for both methods.
[00104] For this experiment, in the case of the OPC biometrics, only horizontal components of the recorded saccades with amplitudes > 1° and durations over 4 ms were considered for the authentication. As a result, the average amplitude of the horizontal component prior to filtering was 3.42° (SD = 3.25) and after filtering was 3.79° (SD = 3.26). The magnitude of the vertical components prior to filtering was quite small (M = 1.2°, SD = 3.16); therefore, the vertical component of movement was not considered for derivation of OPC due to its poor signal-to-noise ratio.
[00105] Results. Table I presents the results of the experiment described above. In Table I, authentication results are presented for each biometric modality. The Thresholds column contains the thresholds that produce the minimum HTER for the corresponding authentication approach. CUE refers to counterfeit-resistant usable eye-based authentication, which may include one of the traits, or two or more traits in combination, that are based on the eye movement signal.
Table I.
[00106] FIG. 5 is a graph illustrating receiver operating curves (ROC) for ocular biometric methods in the experiment described above. Each of ROC curves 500 corresponds to a different modality and/or fusion approach. Curve 502 represents an authentication based on OPC. Curve 504 represents an authentication based on CEM. Curve 506 represents an authentication based on (OPC) OR (CEM). Curve 508 represents an authentication based on (OPC) AND (CEM). Curve 510 represents an authentication based on MIN (OPC, CEM). Curve 512 represents an authentication based on MAX (OPC, CEM). Curve 514 represents an authentication based on a weighted approach wl *OPC + w2*CEM.
[00107] The results indicate that OPC biometrics can be performed successfully for a reading task, where the amplitude of saccadic eye movements can be large when compared to a jumping dot stimulus. In this example, both the OPC and CEM methods performed with similar accuracy, providing an HTER of 27%. Fusion methods were able to improve the accuracy, achieving the best result of 19% in the case of the best performing weighted addition (weight w1 was 0.45, while weight w2 was 0.55). Such results may indicate an approximately 30% reduction in the authentication error. In a custom case where the weights for the OPC and scanpath traits are equal, multimodal biometric assessment was able to achieve an HTER of 19.5%.
Second Example of an Experiment with Multimodal Ocular Authentication in which only CEM & OPC & Iris Modalities are employed.
[00108] The following describes an experiment including biometric authentication based on oculomotor plant characteristics, complex eye movement patterns, and iris.
[00109] Equipment. Eye movement recording and iris capture were simultaneously conducted using a PlayStation Eye web camera. The camera operated at a resolution of 640x480 pixels and a frame rate of 75 Hz. The existing IR pass filter was removed from the camera, and a piece of unexposed developed film was inserted as a filter for the visible spectrum of light. An array of IR lights in the form of a Clover Electronics IR010 Infrared Illuminator, together with two separate IR diodes placed on the body of the camera, was employed for better eye tracking. The web camera and the main IR array were each installed on a flexible arm of a Mainstays Halogen Desk Lamp to provide an installation that can be adjusted to a specific user. A chin rest that was already available from a commercial eye tracking system was employed for the purpose of stabilizing the head to improve the quality of the acquired data. In a low cost scenario, a comfortable chin rest can be constructed from very inexpensive materials as well. Stimulus was displayed on a 19 inch LCD monitor at a refresh rate of 60 Hz. A web camera and other equipment such as described above may provide a user authentication station at a relatively low cost.

[00110] Eye-tracking software. ITU eye tracking software was employed for the eye tracking purposes. The software was modified to present the required stimulus and to store an eye image every three seconds in addition to the existing eye tracking capabilities. Eye tracking was done in no-glint mode.
[00111] Stimulus. Stimulus was displayed on a 19 inch LCD monitor with a refresh rate of 60 Hz. The distance between the screen and the subjects' eyes was approximately 540 mm. A complex pattern stimulus was constructed employing the Rorschach inkblots used in psychological examination, in order to provide relatively clean patterns likely to evoke varied thoughts and emotions in participants. Inkblot images were selected from the original Rorschach psychodiagnostic plates and sized/cropped to fill the screen. Participants were instructed to examine the images carefully, and recordings were performed over two sessions, with 3 rotations of 5 inkblots per session. The resulting sequence of images was 12 seconds long.
[00112] Eye movement data and iris data were collected for a total of 28 subjects (18 males, 10 females), ages 18-36 with an average age of 22.4 (SD = 4.6). Each subject participated in two recording sessions with an interval of approximately 15 minutes between the sessions.
[00113] Results. Weighted fusion was employed to combine scores from all three biometric modalities. The weights were selected by dividing the recorded data randomly into training and testing sets. Each set contained 50% of the original recordings. After 20 random divisions, the average results are presented in Table II:
Table II
[00114] FIG. 6 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user. System 600 includes user system 602, computing system 604, and network 606. User system 602 is connected to user display device 608, user input devices 610, and image sensor 611. The image sensor may be, for example, a web camera. User display device 608 may be, for example, a computer monitor.
[00115] Image sensor 611 may sense ocular data for the user, including eye movement and external characteristics such as iris data and periocular information, and provide the information to user system 602. Authentication system 616 may serve content to the user by way of user display device 608. Authentication system 616 may receive eye movement information, ocular measurements, or other information from user system 602. Using the information received from user system 602, authentication system 616 may assess the identity of the user. If the user is authenticated, access to computing system 604 by the user may be enabled.
[00116] In the embodiment shown in FIG. 6, user system 602, computing system 604, and authentication system 616 are shown as discrete elements for illustrative purposes. These elements may, nevertheless, in various embodiments be performed on a single computing system with one CPU, or distributed among any number of computing systems.
[00117] FIG. 7 illustrates one embodiment of a system for allowing remote computing with ocular biometric authentication of a user wearing an eye-tracking headgear system. System 620 may be generally similar to system 600 described above relative to FIG. 6. To carry out authentication, the user may wear eye tracking device 612. Eye tracking device 612 may include eye tracking sensors for one or both eyes of the user. User system 610 may receive sensor data from eye tracking device 612. Authentication system 616 may receive information from user system 610 for authenticating the user.
[00118] Computer systems may, in various embodiments, include components such as a CPU with an associated memory medium such as Compact Disc Read-Only Memory (CD-ROM). The memory medium may store program instructions for computer programs. The program instructions may be executable by the CPU. Computer systems may further include a display device such as a monitor, an alphanumeric input device such as a keyboard, and a directional input device such as a mouse. Computing systems may be operable to execute the computer programs to implement computer-implemented systems and methods. A computer system may allow access to users by way of any browser or operating system.
[00119] Embodiments of a subset or all (and portions or all) of the above may be implemented by program instructions stored in a memory medium or carrier medium and executed by a processor. A memory medium may include any of various types of memory devices or storage devices. The term "memory medium" is intended to include an installation medium, e.g., a Compact Disc Read Only Memory (CD-ROM), floppy disks, or a tape device; a computer system memory or random access memory such as Dynamic Random Access Memory (DRAM), Double Data Rate Random Access Memory (DDR RAM), Static Random Access Memory (SRAM), Extended Data Out Random Access Memory (EDO RAM), or Rambus Random Access Memory (RAM); or a non-volatile memory such as magnetic media, e.g., a hard drive, or optical storage. The memory medium may comprise other types of memory as well, or combinations thereof. In addition, the memory medium may be located in a first computer in which the programs are executed, or may be located in a second different computer that connects to the first computer over a network, such as the Internet. In the latter instance, the second computer may provide program instructions to the first computer for execution. The term "memory medium" may include two or more memory mediums that may reside in different locations, e.g., in different computers that are connected over a network. In some embodiments, a computer system at a respective participant location may include memory medium(s) on which one or more computer programs or software components according to one embodiment may be stored. For example, the memory medium may store one or more programs that are executable to perform the methods described herein. The memory medium may also store operating system software, as well as other software for operation of the computer system.
[ 00120] The memory medium may store a software program or programs operable to implement embodiments as described herein. The software program(s) may be implemented in various ways, including, but not limited to, procedure-based techniques, component-based techniques, and/or object-oriented techniques, among others. For example, the software programs may be implemented using ActiveX controls, C++ objects, JavaBeans, Microsoft Foundation Classes (MFC), browser-based applications (e.g., Java applets), traditional programs, or other technologies or methodologies, as desired. A CPU executing code and data from the memory medium may include a means for creating and executing the software program or programs according to the embodiments described herein.
[00121 ] Further modifications and alternative embodiments of various aspects of the invention may be apparent to those skilled in the art in view of this description. Accordingly, this description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the general manner of carrying out the invention. It is to be understood that the forms of the invention shown and described herein are to be taken as embodiments. Elements and materials may be substituted for those illustrated and described herein, parts and processes may be reversed, and certain features of the invention may be utilized independently, all as would be apparent to one skilled in the art after having the benefit of this description of the invention. Methods may be implemented manually, in software, in hardware, or a combination thereof. The order of any method may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Changes may be made in the elements described herein without departing from the spirit and scope of the invention as described in the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method of assessing a person's identity, comprising:
measuring eye movement of the person;
estimating one or more anatomical characteristics of an oculomotor plant of the person based at least in part on at least a portion of the measured eye movement;
estimating one or more of the brain's control strategies in guiding visual attention, as exhibited in complex eye movement patterns represented in part by spatial paths of the eye computed from the measured eye movement; and
assessing the person's identity based in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person, and based in part on one or more properties related to the complex eye movement patterns.
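By way of illustration only, the following is a minimal sketch of how the flow recited in claim 1 might be organized in software. Everything in it — the function names, the placeholder estimators, the equal-weight fusion, and the 0.8 threshold — is a hypothetical assumption, not a disclosed implementation.

```python
# Hypothetical sketch of the claimed flow: measure eye movement ->
# estimate OPC -> extract CEM -> assess identity. All names, constants,
# and the fusion rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp (s)
    x: float  # horizontal gaze angle (deg)
    y: float  # vertical gaze angle (deg)

def estimate_opc(samples):
    """Stand-in for fitting oculomotor plant characteristics
    (e.g., elasticity, damping) to the recorded saccades."""
    return {"series_elasticity": 2.5, "damping_coefficient": 0.01}

def extract_cem(samples):
    """Stand-in for reducing the raw signal to a scanpath:
    a list of fixations (x, y, duration in seconds)."""
    return [(1.0, 2.0, 0.25), (5.0, 2.1, 0.30)]

def opc_similarity(a, b):
    """Toy similarity: 1 minus mean relative difference over shared keys."""
    diffs = [abs(a[k] - b[k]) / max(abs(b[k]), 1e-9) for k in a]
    return max(0.0, 1.0 - sum(diffs) / len(diffs))

def assess_identity(samples, opc_template, cem_template, threshold=0.8):
    opc_score = opc_similarity(estimate_opc(samples), opc_template)
    # A real CEM comparison aligns scanpaths; exact match is a placeholder.
    cem_score = 1.0 if extract_cem(samples) == cem_template else 0.5
    fused = 0.5 * opc_score + 0.5 * cem_score  # equal weights, for illustration
    return fused >= threshold, fused

if __name__ == "__main__":
    samples = [GazeSample(0.0, 1.0, 2.0), GazeSample(0.25, 5.0, 2.1)]
    template = {"series_elasticity": 2.4, "damping_coefficient": 0.011}
    print(assess_identity(samples, template, [(1.0, 2.0, 0.25), (5.0, 2.1, 0.30)]))
```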
2. The method of claim 1, wherein measuring eye movement of the person comprises eye tracking.
3. The method of claim 1, wherein estimating at least one of the anatomical characteristics of an oculomotor plant of the person comprises creating a two-dimensional mathematical model including at least one of the anatomical characteristics of the oculomotor plant.
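For intuition about what a mathematical model of oculomotor plant characteristics can look like, the sketch below simulates a horizontal saccade with a classical second-order plant model in which the damping ratio and natural frequency stand in for anatomical properties such as muscle viscosity and elasticity. This textbook simplification is assumed for illustration and is not the multi-parameter model referenced in the claims; a two-dimensional model would run one such equation per rotation axis.

```python
import numpy as np

def simulate_saccade(theta_target_deg=10.0, zeta=0.7, omega_n=120.0,
                     dt=1e-4, t_end=0.1):
    """Euler-integrate a second-order plant:
    theta'' + 2*zeta*omega_n*theta' + omega_n**2*theta
        = omega_n**2 * theta_target.
    zeta (damping ratio) and omega_n (natural frequency, rad/s) are
    representative values, not fitted anatomical parameters."""
    n = int(t_end / dt)
    theta = np.zeros(n)  # eye position (deg)
    vel = 0.0            # eye velocity (deg/s)
    for i in range(1, n):
        acc = omega_n**2 * (theta_target_deg - theta[i - 1]) \
              - 2.0 * zeta * omega_n * vel
        vel += acc * dt
        theta[i] = theta[i - 1] + vel * dt
    return theta

positions = simulate_saccade()
print(f"final position: {positions[-1]:.2f} deg")  # settles near 10 deg
```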
4. The method of claim 1, further comprising deriving one or more of the brain's control strategies in guiding visual attention from eye movements of the person in the form of complex eye movement patterns, wherein the assessment of the person's identity is based in part on one or more components of the complex eye movement patterns.
5. The method of claim 1, wherein assessing the person's identity comprises matching biometric templates related to complex eye movement patterns and oculomotor plant characteristics with previously acquired templates for the person.
6. The method of claim 1, wherein assessing the person's identity comprises determining the identity of the person.
7. The method of claim 1, further comprising authenticating the person based on the assessment of the person's identity.
8. The method of claim 1, wherein estimating at least one of the one or more characteristics of an oculomotor plant of the person comprises estimating at least one static characteristic of the oculomotor plant.
9. The method of claim 1, wherein estimating at least one of the one or more characteristics of an oculomotor plant of the person comprises estimating at least one dynamic characteristic of the oculomotor plant.
10. The method of claim 1, further comprising measuring one or more external characteristics of at least one eye of the person, wherein the assessment of the person's identity is based in part on at least one of the measured characteristics.
11. The method of claim 10, wherein measuring one or more external characteristics comprises measuring one or more characteristics of an iris or a periocular region of the eye of the person.
12. The method of claim 10, wherein measuring one or more external characteristics comprises measuring one or more characteristics of an iris and one or more characteristics of a periocular region of the eye of the person.
13. The method of claim 1, wherein at least some measurements of eye movement of the person are performed while the person is reading, looking at images or webpages, or interacting with a computer, and wherein estimating at least one of the one or more characteristics of the oculomotor plant of the person is based on eye movement measured during those activities.
14. The method of claim 1, further comprising accessing one or more oculomotor plant characteristic templates for the person, wherein the assessment of the person's identity is based in part on a comparison of eye movement with at least one of the one or more oculomotor plant characteristic templates.
15. The method of claim 1, wherein at least one of the oculomotor plant characteristic templates is based at least in part on previous measurements of eye movement of the person.
16. The method of claim 1, wherein at least one estimate of the one or more characteristics of the oculomotor plant of the person is based at least in part on an oculomotor plant mathematical model.
17. The method of claim 1, further comprising minimizing an error between one or more recorded signals and one or more simulated signals.
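The error-minimization step of claim 17 can be pictured as fitting a simulated saccade to a recorded one. The sketch below assumes the simplified second-order plant from the earlier example and a root-mean-square error metric — both assumptions, since the claims fix neither a model nor a metric — and recovers the plant parameters with a derivative-free optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def simulate(params, t, target=10.0):
    """Second-order plant (see the earlier sketch); params = (zeta, omega_n)."""
    zeta, omega_n = params
    theta, vel = np.zeros_like(t), 0.0
    dt = t[1] - t[0]
    for i in range(1, len(t)):
        acc = omega_n**2 * (target - theta[i - 1]) - 2.0 * zeta * omega_n * vel
        vel += acc * dt
        theta[i] = theta[i - 1] + vel * dt
    return theta

def fit_opc(recorded, t):
    """Estimate plant parameters by minimizing the RMS error between the
    recorded trajectory and the simulated one (illustrative metric only)."""
    err = lambda p: np.sqrt(np.mean((simulate(p, t) - recorded) ** 2))
    return minimize(err, x0=[0.7, 100.0], method="Nelder-Mead")

t = np.linspace(0.0, 0.1, 200)
recorded = simulate([0.6, 120.0], t)  # stand-in for eye-tracker data
print(fit_opc(recorded, t).x)          # recovers approximately [0.6, 120.0]
```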
18. The method of claim 1, wherein assessing the person's identity comprises applying one or more statistical tests to one or more oculomotor plant characteristics.
19. The method of claim 1, wherein assessing the person's identity comprises determining an OPC similarity score based on a comparison of two OPC biometric templates derived in part from at least some of the eye movements.
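One hypothetical realization of the statistical tests of claim 18 and the OPC similarity score of claim 19: treat each biometric template as per-saccade estimates of each characteristic, run a two-sample test per characteristic, and aggregate the resulting p-values. Welch's t-test and mean-p aggregation are illustrative choices, not recited in the claims.

```python
import numpy as np
from scipy.stats import ttest_ind

def opc_similarity(template: dict, probe: dict) -> float:
    """Compare two OPC templates, each mapping a characteristic name to
    an array of per-saccade estimates, with a Welch two-sample t-test per
    characteristic. Mean p-value serves as the similarity score: near 1
    when the distributions are indistinguishable, near 0 when they differ."""
    p_values = []
    for name in template:
        _, p = ttest_ind(template[name], probe[name], equal_var=False)
        p_values.append(p)
    return float(np.mean(p_values))

rng = np.random.default_rng(0)
enrolled = {"series_elasticity": rng.normal(2.5, 0.2, 30)}
same_person = {"series_elasticity": rng.normal(2.5, 0.2, 30)}
print(f"similarity: {opc_similarity(enrolled, same_person):.2f}")
```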
20. The method of claim 1, further comprising accessing one or more complex eye movement pattern templates for the person, wherein the assessment of the person's identity is based in part on a comparison of eye movement with at least one of the one or more complex eye movement pattern templates.
21. The method of claim 1, wherein assessing the person's identity comprises aggregating two or more complex eye movement patterns.
22. The method of claim 1, wherein assessing the person's identity comprises determining a complex eye movement similarity score based on at least some of the eye movements.
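A toy complex eye movement similarity score, in the spirit of claims 20 through 22, might compare two scanpaths fixation by fixation. The distance metric and the mapping into (0, 1] below are assumptions; published CEM methods use richer features (fixation durations, saccade amplitudes) and sequence alignment.

```python
import math

def cem_similarity(scanpath_a, scanpath_b, scale_deg=5.0):
    """Toy scanpath comparison: mean Euclidean distance between
    positionally corresponding fixations (x, y in degrees), mapped
    into (0, 1] so that identical scanpaths score 1."""
    n = min(len(scanpath_a), len(scanpath_b))
    if n == 0:
        return 0.0
    d = sum(math.dist(scanpath_a[i][:2], scanpath_b[i][:2])
            for i in range(n)) / n
    return 1.0 / (1.0 + d / scale_deg)

a = [(1.0, 2.0), (5.0, 2.1), (9.5, 3.0)]
b = [(1.2, 1.9), (5.4, 2.3), (9.0, 3.2)]
print(f"CEM similarity: {cem_similarity(a, b):.2f}")
```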
23. The method of claim 1, wherein assessing the person's identity comprises determining a match based in part on the estimated one or more oculomotor plant characteristics and based in part on the estimated one or more complex eye movement patterns.
24. The method of claim 1, wherein assessing the person's identity comprises:
determining an OPC similarity score based on at least some of the eye movements;
determining a complex eye movement (CEM) similarity score based on at least some of the eye movements; and
determining a match based in part on a combination of the OPC similarity score and the CEM similarity score.
25. The method of claim 1, wherein assessing the person's identity comprises:
determining an OPC similarity score based on at least some of the eye movements;
determining a complex eye movement (CEM) similarity score based on at least some of the eye movements;
determining an iris similarity score based on one or more iris characteristics of the person;
determining a periocular similarity score based on one or more periocular characteristics of the person; and
determining a match based in part on a combination of the OPC similarity score, the CEM similarity score, the iris similarity score, and the periocular similarity score.
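The score-level fusion of claims 24 and 25 can be sketched as a weighted sum over whichever per-modality similarity scores are available. The weights and acceptance threshold below are illustrative assumptions; a deployed system would tune them on genuine and impostor score distributions.

```python
def fuse_scores(opc, cem, iris=None, periocular=None,
                weights=(0.25, 0.25, 0.25, 0.25), threshold=0.6):
    """Weighted-sum fusion of per-modality similarity scores in [0, 1].
    Missing modalities are dropped and the remaining weights renormalized."""
    scores = [opc, cem, iris, periocular]
    pairs = [(s, w) for s, w in zip(scores, weights) if s is not None]
    total_w = sum(w for _, w in pairs)
    fused = sum(s * w for s, w in pairs) / total_w
    return fused >= threshold, fused

accept, fused = fuse_scores(opc=0.72, cem=0.64, iris=0.91, periocular=0.58)
print(f"accept={accept}, fused score={fused:.2f}")
```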
26. A system, comprising:
an instrument configured to measure eye movement of a person and one or more characteristics of an eye of the person;
a processor; and
a memory coupled to the processor, wherein the memory comprises program instructions executable by the processor to implement:
measuring eye movement of the person with the instrument;
estimating one or more characteristics of an oculomotor plant of the person based at least in part on at least a portion of the measured eye movement;
estimating one or more properties of complex eye movement patterns based at least in part on at least a portion of the measured eye movement; and
assessing the person's identity based in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person, and based in part on at least one of the estimated one or more complex eye movement patterns.
27. The system of claim 26, wherein the instrument is configured to measure one or more characteristics of an iris or one or more characteristics of a periocular region of the eye of the person, and wherein the program instructions are executable by the processor to implement assessing the person's identity based in part on at least one of the iris characteristics or at least one of the periocular characteristics.
28. The system of claim 26, wherein the instrument comprises a camera sensor.
29. The system of claim 26, wherein the instrument comprises a web cam.
30. The system of claim 26, wherein the instrument comprises an eye tracker.
31. A non-transitory, computer-readable storage medium comprising program instructions stored thereon, wherein the program instructions are configured to implement:
measuring eye movement of a person;
estimating one or more characteristics of an oculomotor plant of the person based at least in part on at least a portion of the measured eye movement;
estimating one or more complex eye movement patterns of the eye based at least in part on at least a portion of the measured eye movement; and
assessing the person's identity based in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person, and based in part on at least one of the estimated one or more complex eye movement patterns of the eye.
32. A method of identifying a person, comprising:
measuring eye movement of the person while the person is reading, looking at images or webpages, and/or interacting with a computer; and
estimating one or more characteristics of an oculomotor plant of the person based at least in part on the measured eye movement.
33. A method of identifying a person, comprising:
measuring eye movement of the person;
estimating: one or more characteristics of an oculomotor plant of the person based at least in part on the measured eye movement; or one or more complex eye movement patterns based at least in part on at least a portion of the measured eye movement;
measuring one or more characteristics of an iris of the person and/or periocular information; and
assessing the person's identity based: in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person or at least one of the estimated one or more complex eye movement patterns; and in part on at least one of the characteristics of the iris and/or periocular information of the person.
34. The method of claim 33, wherein the assessment of the person's identity is based:
in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person;
in part on at least one of the estimated one or more complex eye movement patterns; and
in part on at least one of the characteristics of the iris and/or periocular information of the person.
35. A system, comprising:
an instrument configured to measure eye movement of a person and one or more characteristics of an eye of the person;
a processor; and
a memory coupled to the processor, wherein the memory comprises program instructions executable by the processor to implement:
measuring eye movement of the person with the instrument;
estimating: one or more characteristics of an oculomotor plant of the person based at least in part on the measured eye movement; or one or more complex eye movement patterns of the eye based at least in part on at least a portion of the measured eye movement;
measuring one or more characteristics of an iris or periocular information of the person with the instrument; and
assessing the person's identity based: in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person or at least one of the estimated one or more properties of the complex eye movement patterns; and in part on at least one of the characteristics of the iris and/or periocular information of the person.
36. The system of claim 35, wherein the assessment of the person's identity is based:
in part on at least one of the estimated one or more characteristics of the oculomotor plant of the person;
in part on at least one of the estimated one or more properties of the complex eye movement patterns; and
in part on at least one of the characteristics of the iris and/or periocular information of the person.
37. A method of assessing a person's identity, comprising:
measuring eye movement of the person;
creating a mathematical model of one or more anatomical characteristics of an oculomotor plant of the person, wherein the mathematical model is a multidimensional model that includes at least two dimensions; and
assessing the person's identity based in part on the mathematical model.
38. The method of claim 37, wherein assessing the person's identity comprises combining one or more measures based on the mathematical model of the anatomical characteristics with one or more measures based on one or more additional ocular modes of assessing the person's identity.
39. A method of assessing a person's identity, comprising: measuring eye movement of the person; and assessing the person's identity based at least in part on the measured eye movement of the person.
PCT/US2012/030912 2012-03-28 2012-03-28 Person identification using ocular biometrics WO2013147763A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/US2012/030912 WO2013147763A1 (en) 2012-03-28 2012-03-28 Person identification using ocular biometrics
EP12872652.8A EP2831810A4 (en) 2012-03-28 2012-03-28 Person identification using ocular biometrics
US13/908,748 US9082011B2 (en) 2012-03-28 2013-06-03 Person identification using ocular biometrics with liveness detection
US14/797,955 US9811730B2 (en) 2012-03-28 2015-07-13 Person identification using ocular biometrics with liveness detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2012/030912 WO2013147763A1 (en) 2012-03-28 2012-03-28 Person identification using ocular biometrics

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/908,748 Continuation-In-Part US9082011B2 (en) 2012-03-28 2013-06-03 Person identification using ocular biometrics with liveness detection

Publications (1)

Publication Number Publication Date
WO2013147763A1 (en) 2013-10-03

Family

ID=49260829

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/030912 WO2013147763A1 (en) 2012-03-28 2012-03-28 Person identification using ocular biometrics

Country Status (2)

Country Link
EP (1) EP2831810A4 (en)
WO (1) WO2013147763A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL165586A0 (en) * 2004-12-06 2006-01-15 Daphna Palti Wasserman Multivariate dynamic biometrics system
US7986816B1 (en) * 2006-09-27 2011-07-26 University Of Alaska Methods and systems for multiple factor authentication using gaze tracking and iris scanning

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010026632A1 (en) * 2000-03-24 2001-10-04 Seiichiro Tamai Apparatus for identity verification, a system for identity verification, a card for identity verification and a method for identity verification, based on identification by biometrics
US20030091215A1 (en) * 2000-05-16 2003-05-15 Eric Lauper Biometric identification and authentication method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
COREY HOLLAND ET AL.: "Biometric Identification via Eye Movement Scanpaths in Reading", IEEE INTERNATIONAL JOINT CONFERENCE ON BIOMETRICS (IJCB), 13 October 2011 (2011-10-13), pages 1 - 8, XP032081601 *
OLEG V. KOMOGORTSEV ET AL.: "Biometric Authentication via Anatomical Characteristics of the Oculomotor Plant", TECHNICAL REPORT TR2011-07-25, 31 July 2011 (2011-07-31), TEXAS STATE UNIVERSITY, XP032215554 *
See also references of EP2831810A4 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11210380B2 (en) 2013-05-13 2021-12-28 Veridium Ip Limited System and method for authorizing access to access-controlled environments
US20140337948A1 (en) * 2013-05-13 2014-11-13 Hoyos Labs Corp. System and method for determining liveness
US9313200B2 (en) * 2013-05-13 2016-04-12 Hoyos Labs Ip, Ltd. System and method for determining liveness
US20140337949A1 (en) * 2013-05-13 2014-11-13 Hoyos Labs Corp. System and method for generating a biometric identifier
US20160182506A1 (en) * 2013-05-13 2016-06-23 Hoyos Labs Ip Ltd. System and method for generating a biometric identifier
US9294475B2 (en) * 2013-05-13 2016-03-22 Hoyos Labs Ip, Ltd. System and method for generating a biometric identifier
US11170369B2 (en) 2013-05-13 2021-11-09 Veridium Ip Limited Systems and methods for biometric authentication of transactions
CN104808774B (en) * 2014-01-24 2017-12-05 北京奇虎科技有限公司 Judge the apparatus and method of head-wearing type intelligent equipment operation validity
CN104808774A (en) * 2014-01-24 2015-07-29 北京奇虎科技有限公司 Device and method for determining validity of operation of head-wearing intelligent device
WO2015109937A1 (en) * 2014-01-24 2015-07-30 北京奇虎科技有限公司 Head-mounted intelligent device and identity authentication method
CN111008592A (en) * 2014-06-11 2020-04-14 索库里公司 Analyzing facial recognition data and social network data for user authentication
CN111008592B (en) * 2014-06-11 2023-07-28 索库里公司 Analyzing facial recognition data and social network data for user authentication
CN108350417A (en) * 2015-11-10 2018-07-31 国立大学法人京都大学 Use the cell culture processes of the culture medium containing laminin fragment
CN108350417B (en) * 2015-11-10 2023-04-11 国立大学法人京都大学 Cell culture method using culture medium containing laminin fragments
CN110650685A (en) * 2017-03-24 2020-01-03 爱尔西斯有限责任公司 Method for assessing a psychophysiological state of a person
CN110650685B (en) * 2017-03-24 2024-02-20 爱尔西斯有限责任公司 Method for assessing psychophysiological state of human

Also Published As

Publication number Publication date
EP2831810A1 (en) 2015-02-04
EP2831810A4 (en) 2016-04-27

Similar Documents

Publication Publication Date Title
US9811730B2 (en) Person identification using ocular biometrics with liveness detection
US10740465B2 (en) Detection of print-based spoofing attacks
US10966605B2 (en) Health assessment via eye movement biometrics
US20170364732A1 (en) Eye tracking via patterned contact lenses
Bednarik et al. Eye-movements as a biometric
Komogortsev et al. Biometric identification via an oculomotor plant mathematical model
WO2013147763A1 (en) Person identification using ocular biometrics
Komogortsev et al. Attack of mechanical replicas: Liveness detection with eye movements
Rigas et al. Biometric recognition via eye movements: Saccadic vigor and acceleration cues
Kinnunen et al. Towards task-independent person authentication using eye movement signals
Komogortsev et al. Biometric authentication via oculomotor plant characteristics
Holland et al. Complex eye movement pattern biometrics: The effects of environment and stimulus
Galdi et al. Eye movement analysis for human authentication: a critical survey
Makowski et al. DeepEyedentificationLive: Oculomotoric biometric identification and presentation-attack detection using deep neural networks
US20150294149A1 (en) Multivariate Dynamic Biometrics System
Kasprowski et al. First eye movement verification and identification competition at BTAS 2012
US10470690B2 (en) Authentication device using brainwaves, authentication method, authentication system, and program
Deravi et al. Gaze trajectory as a biometric modality
Komogortsev et al. Liveness detection via oculomotor plant characteristics: Attack of mechanical replicas
Rigas et al. Current research in eye movement biometrics: An analysis based on BioEye 2015 competition
Zhang et al. On biometrics with eye movements
Komogortsev et al. CUE: counterfeit-resistant usable eye movement-based authentication via oculomotor plant characteristics and complex eye movement patterns
Komogortsev et al. Biometric authentication via complex oculomotor behavior
Kasprowski Human identification using eye movements
Kasprowski et al. Enhancing eye-movement-based biometric identification method by using voting classifiers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12872652

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2012872652

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2012872652

Country of ref document: EP