US20160283865A1 - System and method for distinguishing human swipe input sequence behavior - Google Patents

System and method for distinguishing human swipe input sequence behavior

Info

Publication number
US20160283865A1
US20160283865A1 (US 2016/0283865 A1); application US 15/178,676
Authority
US
United States
Prior art keywords
input sequence
behavior
measuring points
sequence
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/178,676
Inventor
Neil Costigan
Ingo Deutschmann
Tony Libell
Johan Lindholm
Peder Nordstrom
Peter Parnes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Behaviometrics AB
Original Assignee
Behaviometrics AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201261637572P priority Critical
Priority to US13/866,171 priority patent/US9305260B2/en
Priority to US15/057,241 priority patent/US9542541B1/en
Application filed by Behaviometrics AB filed Critical Behaviometrics AB
Priority to US15/178,676 priority patent/US20160283865A1/en
Publication of US20160283865A1 publication Critical patent/US20160283865A1/en
Abandoned legal-status Critical Current


Classifications

    • G06N99/005
    • G06F21/316 User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F3/03543 Mice or pucks
    • G06F3/04842 Selection of a displayed object
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • G06K9/00355 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06N20/00 Machine learning
    • G06N5/022 Knowledge engineering; Knowledge acquisition
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position

Abstract

Recording, analyzing, and categorizing of user interface input via touchpad, touch screens, or any device that can synthesize gestures from touch and pressure into input events, such as, but not limited to, smart phones, touch pads, and tablets. Humans may generate the input. The analysis of data may include statistical profiling of individual users as well as groups of users. The profiles can be stored in, but are not limited to, data containers such as files, secure storage, smart cards, databases, off device, in the cloud, etc. A profile may be built from user behavior categorized into quantified types of behavior and/or gestures. The profile might be stored anonymized. The analysis may take place in real time or as post-processing. Profiles can be compared against each other by all the types of quantified behaviors or by a select few.

Description

    FIELD OF THE DISCLOSED TECHNOLOGY
  • The present invention describes a method and system that uses behavioral biometric algorithms that gather, filter, analyze and distinguish human swipe input sequence behavior from other human behavior and/or machine behavior.
  • BACKGROUND OF THE DISCLOSED TECHNOLOGY
  • Computing devices, such as mobile handsets, are traditionally designed to deliver performance on restricted hardware. Since the ‘built-in security’ commonly found on purpose-built computers is absent from computing devices with increased mobility (e.g., smart phones), information stored in these devices is much more likely to end up in the wrong hands. Adding smart card readers or expecting users to carry One-Time-Password tokens (OTPs) to use with the computing devices is not realistic. Out-of-band solutions such as SMS or other communications protocols are cumbersome and not user friendly for widespread use. People pick weak passwords that are easy to remember or just easy to type on a handset, and voice biometric solutions are expensive and disrupt the user experience. Multilayered security can be achieved by combining three pillars: (i) something you have (e.g., the phone as a token), (ii) something you know (e.g., your PIN), and (iii) something you are (e.g., your physical or behavioral metrics).
  • SUMMARY OF THE DISCLOSED TECHNOLOGY
  • The principles of the present invention provide for recording, analyzing, and categorizing user interface input via touchpad, touch screens or any electronic device that can receive and sense or synthesize gestures from human touch and pressure into input events. Such electronic devices may include, but are not limited to, smart phones, touch pads and tablets. Humans may generate the input sequence behavior that is converted by the electronic devices into sequence behavior input data.
  • The analysis of the sequence behavior input data may include statistical profiling of individual users as well as groups of users. The statistical profiles can be stored in, but are not limited to, data containers, such as files, secure storage, smart cards, databases, off device (e.g., memory devices), “cloud” storage devices, etc. A statistical profile may be built from user/users behavior categorized into quantified types of behavior and/or gestures. The analysis may take place in real time or non-real time, such as post-processing.
  • Statistical profiles can be compared against each other by one or more types of quantified behaviors. Quantified types of behavior may be, but not limited to, angle, acceleration, sequence, flight, pressure, quotient, and velocity.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated into and constitute a part of the specification, illustrate one or more example embodiments of the invention and, together with the detailed description, serve to explain their principles and implementations.
  • FIG. 1 shows a collection of input points.
  • FIG. 2 shows a single measuring point with a collision detection circle and input points.
  • FIG. 3 shows multiple measuring points with collision detection circles and input points.
  • FIG. 4 shows the acceleration of multiple points.
  • FIG. 5 shows the quotient of two points or shapes.
  • FIG. 6 shows two sequences with multiple shapes.
  • FIG. 7 shows the flight behavior with multiple input points and three shapes.
  • FIG. 8 shows the dominant side on the lower side of two points.
  • FIG. 9 shows the area between two points of a sequence.
  • FIG. 10 shows sample collision shapes.
  • FIG. 11 shows collision shapes resembling a PIN pad.
  • FIG. 12 shows collision shapes resembling a letter pad.
  • FIG. 13 shows collision shapes resembling an alphanumeric PIN pad.
  • FIG. 14 shows an example system with a sensory device on which the behavioral biometric process operates.
  • FIG. 15 shows an example system with a sensory device on which the behavioral biometric process operates.
  • FIG. 16 shows a flow chart of a sample behavioral biometric system.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE DISCLOSED TECHNOLOGY
  • Input from a touch, mouse, or other gesture sensory interface may be gathered in a raw format to be processed to determine different behavioral traits. The raw format may include points in space, such as a plane, with the highest resolution available for a particular system. Each point has one or more coordinate values (e.g., Cartesian coordinates), timing, and any other information the system may provide, such as, but not limited to, pressure. Multiple points or a set of points are called an input sequence. An input sequence can be processed into different behaviors (sequence behavior), which might include the total aggregated distance, time or pressure, acceleration, angle, and others. If an input sequence touches or crosses shapes that are visible on the touch screen, it is called a shape input sequence (see FIGS. 2, 3, 4).
  • FIG. 1 shows an example collection of input points (an input sequence) where each point has information about the touch or event, such as Cartesian coordinates (x, y), time (t) and pressure (pr). Additionally, a floating touch input, where the finger of the user hovers without touching the touch screen, could be used similarly to record the input points, with the hover serving as an additional behavior. It should be understood that a wide variety of input devices that respond to touch and/or motion may be utilized in accordance with the principles of the present invention.
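The patent gives no code; as a non-authoritative sketch (all names invented), a raw input point carrying coordinates, time and pressure, and one simple sequence behavior derived from it, might look like this:

```python
import math
from dataclasses import dataclass

# Hypothetical model of one raw input point: coordinates, timestamp,
# and the pressure value if the hardware reports one.
@dataclass
class InputPoint:
    x: float
    y: float
    t: float        # timestamp in seconds
    pr: float = 0.0  # pressure, if available

# An input sequence is an ordered list of points.
sequence = [InputPoint(10, 20, 0.00, 0.5),
            InputPoint(14, 23, 0.02, 0.6),
            InputPoint(19, 25, 0.04, 0.4)]

# One sequence behavior: the total aggregated distance of the sequence.
total = sum(math.hypot(b.x - a.x, b.y - a.y)
            for a, b in zip(sequence, sequence[1:]))
```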
  • The raw input may be filtered by excluding points, taking the average over two or more points to create new points, or performing another filtering technique. To eliminate errors, filtering may be applied to specific data of some points. To anonymize users, filters may also be applied to specific data of some points, or some points may be omitted.
  • Two or more points may be used to generate and/or calculate behaviors, such as, but not limited to, the velocity between the points.
  • Geometric figures, points, lines, and polygons may be implemented as collision detection to be used to create additional behaviors. These figures, points, lines and polygons may also be movable to create additional behaviors.
  • FIG. 2 shows an example where the point p2 is inside a collision detection circle. The center of the circle s and the point p2 may be used to calculate a behavior angle v.
  • Two or more collision detection shapes may be used to generate and/or calculate behaviors such as, but not limited to, the velocity between the collision detection shapes. It should be understood that a geometric or non-geometric shape other than a circle may be utilized.
  • FIG. 3 displays how three circles s1, s2, s3 may be used for behavior analysis. s1 receives the properties of p1, while s2 receives the properties from both p3 and p4. If several points are inside a geometric shape, one of them or all of them may be used, by taking the average of the points, for example. Any behavior applicable to raw points may also be applicable to a series of geometric shapes.
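A minimal sketch of the circular collision detection described above (function and variable names are invented): each measuring shape is a circle, and each raw point is assigned to the first circle it falls inside.

```python
import math

# A shape is a circle (cx, cy, r); a point "collides" when inside it.
def inside(circle, point):
    cx, cy, r = circle
    px, py = point
    return math.hypot(px - cx, py - cy) <= r

# Assign each point of a sequence to the first circle containing it.
def collect(circles, points):
    hits = {i: [] for i in range(len(circles))}
    for p in points:
        for i, c in enumerate(circles):
            if inside(c, p):
                hits[i].append(p)
                break
    return hits

circles = [(0, 0, 2), (10, 0, 2)]
pts = [(0, 1), (1, 0), (10, 1), (5, 5)]   # last point hits no circle
hits = collect(circles, pts)
```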
  • Examples of Quantified Tests/Behaviors/Behavioral Traits
  • Every test/behavior/behavioral trait may be calculated from any point or shape in a series of touch points or shapes to any other point or shape.
  • Angle Behavior
  • The angle value may be calculated from any point or shape in a series of touch points to any other point or shape. It could be, but is not limited to, the first point inside a geometric shape, such as p2 in FIG. 2, or the last point inside a geometric shape, such as p5 in FIG. 2. These two examples would be the entering and exiting points, used to calculate the entering and exiting angle.
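As a sketch under the FIG. 2 setup (names hypothetical), the behavior angle v between a circle's center s and an entering point such as p2 can be computed with a two-argument arctangent:

```python
import math

# Angle of the vector from the shape's center to a point, in radians.
def behavior_angle(center, point):
    return math.atan2(point[1] - center[1], point[0] - center[0])

s = (0.0, 0.0)
p2 = (1.0, 1.0)            # e.g., the entering point of the shape
v = behavior_angle(s, p2)  # 45 degrees, expressed in radians
```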
  • Velocity Behavior
  • Velocity v is the change in position with respect to time. Velocity can be calculated by dividing the distance d between two points or shapes by the elapsed time:
  • d = √((p_x1 − p_x0)² + (p_y1 − p_y0)²),  v = d / (t_1 − t_0)
  • The distance and time may be aggregated for several points before the velocity is calculated to give an average velocity. The velocity may be divided into its components according to the coordinate system in use. For example, in the Cartesian coordinate system, the velocity of the x and y components would be:
  • d_x = p_x1 − p_x0,  v_x = d_x / (t_1 − t_0);  d_y = p_y1 − p_y0,  v_y = d_y / (t_1 − t_0)
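The velocity formulas above can be sketched directly (point layout `(x, y, t)` is an assumption):

```python
import math

# Velocity between two points, plus its Cartesian components.
def velocity(p0, p1):
    d = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    dt = p1[2] - p0[2]
    return d / dt, (p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt

v, vx, vy = velocity((0, 0, 0.0), (3, 4, 0.5))
# distance 5 over 0.5 s -> v = 10.0, with components vx = 6.0, vy = 8.0
```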
  • Acceleration Behavior
  • FIG. 4 shows a series of touch points with velocity vectors shown between successive touch points. Acceleration is the rate of change of velocity of the touch input. The acceleration can be calculated by dividing the change in velocity between two points by the elapsed time (t = t_2 − t_0):
  • a = (v_1 − v_0) / (t_2 − t_0)
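A one-line sketch of the acceleration formula, with hypothetical sample values:

```python
# Rate of change between two successive velocities v0 and v1 over the
# time span t2 - t0 covered by the three points involved.
def acceleration(v0, v1, t0, t2):
    return (v1 - v0) / (t2 - t0)

a = acceleration(10.0, 14.0, 0.0, 0.5)  # (14 - 10) / 0.5 = 8.0
```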
  • Quotient Behavior
  • FIG. 5 shows a series of touch points that may be used to determine quotient behavior. The quotient behavior is the quotient q between the shortest distance d_s and the aggregated distance d_a of two points or shapes:
  • d_s = √((p_xk − p_x0)² + (p_yk − p_y0)²),  d_a = Σ_{n=0..k−1} √((p_x[n+1] − p_xn)² + (p_y[n+1] − p_yn)²),  q = d_a / d_s
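The quotient formula can be sketched as follows (a straight swipe gives q = 1; a curved one gives q > 1):

```python
import math

# Aggregated path length divided by the straight-line distance
# between the first and last point.
def quotient(points):
    da = sum(math.hypot(b[0] - a[0], b[1] - a[1])
             for a, b in zip(points, points[1:]))
    ds = math.hypot(points[-1][0] - points[0][0],
                    points[-1][1] - points[0][1])
    return da / ds

q = quotient([(0, 0), (3, 4), (6, 0)])
# path length 5 + 5 = 10, straight line 6, so q = 10/6
```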
  • Sequence Behavior
  • FIG. 6 shows a pair of touch input curves or sequences that are used to determine sequence behavior. The sequence behavior is the total aggregated distance, time or pressure for a touch input. A shape input sequence is the total aggregated distance, time or pressure for a touch input that moves over two or more collision detection shapes (see FIGS. 1, 2, 3). The behavior is distinguished by which shapes are touched and is also directional, i.e. the s1 to s2 sequence is not the same as s2 to s1. Continuing with the examples of FIG. 6: seq1 is the distance/time/pressure sequence when a touch input moves over the shapes s1→s2→s4→s5→s3, while seq2 is the sequence when the input moves over s1→s2→s3→s5. The total time, distance or pressure defines the sequence behavior. It is not limited to the total time, distance or pressure; it could also be the mean or median, etc.
  • A start sequence begins at the first touch and lasts until reaching the first shape. The end sequence runs from leaving the last shape to the end of the touch.
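A sketch of deriving the ordered (and therefore directional) shape sequence a touch passes over, assuming circular shapes; names are invented:

```python
import math

# Record each shape (named circle) the touch enters, in order,
# collapsing consecutive points inside the same shape.
def shape_sequence(circles, points):
    seq = []
    for px, py in points:
        for name, (cx, cy, r) in circles.items():
            if math.hypot(px - cx, py - cy) <= r and (not seq or seq[-1] != name):
                seq.append(name)
    return seq

circles = {"s1": (0, 0, 1), "s2": (5, 0, 1), "s3": (10, 0, 1)}
pts = [(0, 0), (2, 0), (5, 0), (7, 0), (10, 0)]
seq = shape_sequence(circles, pts)
# the directional sequence s1 -> s2 -> s3
```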
  • Shapes/Collision Shapes
  • The shapes might resemble an empty PIN pad, or might be denoted by numbers or letters, so that the user is able to remember even complex sequences.
  • Flight Behavior
  • FIG. 7 shows a series of touch points that may be created by a user interfacing with a touch pad or other sensory device to define a flight. Flight is a categorizing behavior that uses two or more shapes to categorize any other basic behavior, like pressure or velocity.
  • It categorizes behavior by how the input sequence enters and exits shapes. For example, the average velocity between the entering and exiting points of two shapes is shown in Table 1
  • TABLE 1
      Shape 1  Shape 2  Velocity
      Enter    Enter    v0
      Enter    Exit     v1
      Exit     Enter    v2
      Exit     Exit     v3
  • TABLE 2
      Shape 1  Shape 2  Shape 3  Time
      Enter    Enter    Enter    t0
      Enter    Exit     Enter    t1
      Exit     Enter    Exit     t2
      Exit     Exit     Exit     t3
      . . .    . . .    . . .    . . .
  • As shown in FIG. 7, p1, p5, and p8 are entering points while p3, p6 and p10 are exiting points. The behavior between points p1 and p5 would be the enter/enter behavior for the shapes s1 and s2, while the behavior between the points p3 and p8 would be the exit/enter behavior for shapes s1 and s3. The behavior between the points p1, p5 and p10 would denote an enter/enter/exit categorization. The behavior categories for the example of FIG. 7 are shown in Table 2. The flight behavior could be, but is not limited to, the average pressure, velocity, time or distance between the shapes in the sequence.
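A toy sketch of flight categorization (layout `(x, y, t)` and all names invented): find the entering point (first inside) and exiting point (last inside) of each shape, then measure a basic behavior, here elapsed time, between the chosen points of two shapes.

```python
import math

# Entering and exiting points of a circular shape for a point sequence.
def enter_exit(circle, points):
    cx, cy, r = circle
    inside = [p for p in points if math.hypot(p[0] - cx, p[1] - cy) <= r]
    return inside[0], inside[-1]   # (entering, exiting)

pts = [(0, 0, 0.0), (1, 0, 0.1), (5, 0, 0.4), (6, 0, 0.5)]
s1, s2 = (0, 0, 1.5), (5.5, 0, 1.0)
e1, x1 = enter_exit(s1, pts)
e2, x2 = enter_exit(s2, pts)
flight_time = e2[2] - e1[2]   # the enter/enter time category, Table 1 style
```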
  • Dominant Side Behavior
  • FIG. 8 shows a touch sequence to illustrate a dominant side. The dominant side behavior checks whether the touch sequence is on one or the other side of the shortest distance between two shapes or points. As shown, the dominant side is to the lower side of the shortest distance between the two points.
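One possible implementation (not specified by the patent): the sign of the 2-D cross product tells which side of the straight line from a to b each point lies on, and the majority sign gives the dominant side.

```python
# Majority side of the sequence relative to the line a -> b.
def dominant_side(a, b, points):
    score = 0
    for px, py in points:
        cross = (b[0] - a[0]) * (py - a[1]) - (b[1] - a[1]) * (px - a[0])
        score += (cross > 0) - (cross < 0)   # +1 left, -1 right of a -> b
    return "left" if score > 0 else "right" if score < 0 else "neither"

# All sample points lie below the horizontal line, i.e. on its right side.
side = dominant_side((0, 0), (10, 0), [(2, -1), (5, -2), (8, -1)])
```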
  • Area Behavior
  • FIG. 9 is an illustration showing a touch sequence that includes two regions that define a certain area. The area behavior is the total area between the actual touch sequence and the shortest distance between two shapes or points. It is described by the enclosing curve. The area might be calculated using an integral over the enclosing curve.
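The integral over the enclosing curve can be approximated discretely; as a sketch, closing the touch curve with the straight segment between its endpoints and applying the shoelace formula gives the enclosed area:

```python
# Shoelace formula over the polygon formed by the touch points;
# the wrap-around edge (last -> first) is the closing straight segment.
def enclosed_area(points):
    s = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

area = enclosed_area([(0, 0), (2, 2), (4, 0)])
# triangle with base 4 and height 2 -> area 4.0
```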
  • Curve Fitting Behavior
  • Fit a Bezier curve to the actual input sequence and use the Bezier constants as behavior.
  • Height/Heat Map Behavior
  • Add a two-dimensional integer array or map using defined dimensions.
  • The size of the integer array can correspond to the resolution of the touch screen, or might be adjusted by a factor to make different resolutions comparable.
  • Each time a measuring point is activated the hit count for that particular point is increased. An algorithm for image recognition can be used to compare the maps. The heat map might also be split into sections, by putting a grid over the area.
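A sketch of the heat map described above (the scaling factor and names are invented): a grid of hit counters, scaled so different screen resolutions stay comparable.

```python
# Build a hit-count grid; "factor" shrinks screen coordinates so maps
# from screens of different resolutions can be compared.
def build_heatmap(points, width, height, factor=0.1):
    w, h = int(width * factor), int(height * factor)
    grid = [[0] * w for _ in range(h)]
    for x, y in points:
        gx = min(int(x * factor), w - 1)
        gy = min(int(y * factor), h - 1)
        grid[gy][gx] += 1   # increment the hit count for that cell
    return grid

grid = build_heatmap([(5, 5), (7, 8), (150, 90)], width=160, height=100)
```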
  • Time/Local Time
  • The local time of the user can be recorded. An algorithm for calculating the probability of the sleeping time can be used as understood in the art.
  • Example System Description
  • A typical usage of the system might happen in the following way: at first, the system simply records the behavior of the user with the behavior monitor 101, using the input sequences and shape sequences that are generated in the normal process of using the touchpad of the device 102, and might also use data from the GPS and clock. Additional behavior might be gathered by displaying collision shapes on the touchscreen of the mobile device. The gathered data might be filtered using the filter 103. A user profile 106 is then generated and stored.
  • Authentication of a User
  • The behavior monitor 101 gathers data from the touchpad as well as system information such as GPS location and clock data 102. This data is filtered using the filter 103 to be used for the quantified tests 104 in the comparator 105; the profile 106 for the user is loaded into the comparator 105. The result analyzer 107 decides if the profile should be updated with new input data depending on the result of the comparator 105, which compares the profile 106 with the gathered user data. The decision maker 108 decides if it was the correct user or not, or delivers a raw score and a status mode (gathering or not gathering of user information) of the system. All these components may be installed on the system or distributed. The comparator 105, the profile 106, the result analyzer 107 and the decision maker 108 can be installed on the mobile device, on a central server, or in a cloud environment. The decision maker 108 can be installed and run by a third party.
  • To protect the privacy of the user a mechanism to start and stop the behavior monitor 101 from gathering the behavior of the user might be provided. This mechanism might be operated by an administrator of the device or by the user himself. The information about the status of the behavior monitor 101 gathering/not gathering is communicated to the decision maker 108. Depending on the status of the behavior monitor 101 the decision maker 108 will deliver a gathering status.
  • The filter can also be used to anonymize the behavior of users, by omitting data of specific points or omitting specific points. The behavior monitor is able to run continuously, so that the system might get a constant data feed.
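The patent names the components 101-108 but prescribes no algorithm; a toy sketch of the comparator/decision-maker flow (tolerance, threshold, and behavior names all invented) might look like:

```python
# Comparator (105): fraction of quantified behaviors whose observed value
# falls within a 20% tolerance of the stored profile (106) value.
def comparator(profile, observed):
    hits = sum(1 for k, v in observed.items()
               if k in profile and abs(v - profile[k]) <= 0.2 * abs(profile[k]))
    return hits / max(len(observed), 1)

# Decision maker (108): accept/reject plus the raw score.
def decision_maker(score, threshold=0.7):
    return ("accept" if score >= threshold else "reject", score)

profile = {"velocity": 10.0, "quotient": 1.2, "pressure": 0.5}
observed = {"velocity": 10.5, "quotient": 1.25, "pressure": 0.9}
result, score = decision_maker(comparator(profile, observed))
# pressure deviates too far, so only 2 of 3 behaviors match
```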
  • FIG. 10 is an illustration of a mobile device with a touch screen enabling a user to swipe, creating an input sequence. FIG. 11 is a block diagram of the mobile device of FIG. 10 showing internal electronic components, including a processing unit, memory, input/output unit, and sense device (e.g., touch screen). The memory unit shown includes software that may be executed by the processing unit to distinguish swipe input sequence behavior as described herein. The processing unit, in response to a user performing a swipe, may execute the software to perform calculations, as previously described, to determine whether the user is the user to which the mobile device belongs. In one embodiment, the memory may store behavioral data that is associated with one or more swipes that the user has previously made during a registration process of the mobile device or during previous use of the mobile device. It should be understood that the drawings of the mobile device are illustrative and that additional and/or different components may be utilized in supporting and performing the processes described herein.
  • The previous description is of a preferred embodiment for implementing the invention, and the scope of the invention should not necessarily be limited by this description. The scope of the present invention is instead defined by the following claims.
  • While the disclosed technology has been taught with specific reference to the above embodiments, a person having ordinary skill in the art will recognize that changes can be made in form and detail without departing from the spirit and the scope of the disclosed technology. The described embodiments are to be considered in all respects only as illustrative and not restrictive. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope. Combinations of any of the methods, systems, and devices described hereinabove are also contemplated and within the scope of the invention.

Claims (8)

1. A method of using behavioral biometric algorithms, the method comprising:
gathering data from a first user;
filtering said gathered data; and
conducting analysis and distinguishing a human swipe input sequence applied to a touchpad or touchscreen, or shape input sequence, and behavioral traits from other human behavior and/or machine behavior where:
a. said input sequence data is determined based on at least one of:
i. raw data in the highest resolution available for the particular system,
ii. raw data in the resolution determined by the current application,
iii. filtered data fitting different behavior traits;
b. said behavioral traits are determined based on at least three of the following:
i. an angle of the swipe when entering or leaving one or more measuring points,
ii. a velocity between one or more measuring points,
iii. an acceleration between one or more measuring points,
iv. a quotient between one or more measuring points,
v. a sequence between multiple measuring points,
vi. a start sequence to a first measuring point,
vii. an end sequence from the last measuring point,
viii. a time of flight between one or more measuring points,
ix. the dominant side between one or more measuring points,
x. an area enclosed between a curve and a line connecting one or more measuring points on said curve,
xi. a curve fitting between one or more measuring points,
xii. a heat map between one or more measuring points,
xiii. the average time of the sample,
xiv. keypress timings; and
c. where said conducting of analysis comprises at least one of the following:
i. determining if said input sequence is from said first user,
ii. determining if said input sequence is from another user, different from said first user,
iii. determining if said input sequence is not human.
2. The method of claim 1, wherein said shape input sequence is inputted using a mouse.
3. The method of claim 1, wherein a raw score of a determination of human characteristics of said swipe input sequence is exhibited, lacking a determination as to whether said swipe input sequence was carried out by a human.
4. The method of claim 1, wherein collision detection on shapes is used to conduct analysis and distinguish human swipe shape input sequence and behavioral traits from other human behavior and/or machine behavior.
5. (canceled)
6. The method of claim 1, wherein said shape input sequence comprises a determination of a shape of inputted data based on a shape of at least two letters or numbers.
7. The method of claim 1, wherein said method is carried out as part of post-processing, after a transaction by a user conducting said swipe input sequence is carried out.
8. The method of claim 1, wherein a filter is used to omit a part of said input sequence data to anonymize user information.
US15/178,676 2012-04-24 2016-06-10 System and method for distinguishing human swipe input sequence behavior Abandoned US20160283865A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US201261637572P true 2012-04-24 2012-04-24
US13/866,171 US9305260B2 (en) 2012-04-24 2013-04-19 System and method for distinguishing human swipe input sequence behavior
US15/057,241 US9542541B1 (en) 2012-04-24 2016-03-01 System and method for distinguishing human swipe input sequence behavior
US15/178,676 US20160283865A1 (en) 2012-04-24 2016-06-10 System and method for distinguishing human swipe input sequence behavior

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/178,676 US20160283865A1 (en) 2012-04-24 2016-06-10 System and method for distinguishing human swipe input sequence behavior

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/057,241 Continuation US9542541B1 (en) 2012-04-24 2016-03-01 System and method for distinguishing human swipe input sequence behavior

Publications (1)

Publication Number Publication Date
US20160283865A1 true US20160283865A1 (en) 2016-09-29

Family

ID=49381063

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/866,171 Active 2034-08-08 US9305260B2 (en) 2012-04-24 2013-04-19 System and method for distinguishing human swipe input sequence behavior
US15/057,241 Active US9542541B1 (en) 2012-04-24 2016-03-01 System and method for distinguishing human swipe input sequence behavior
US15/178,676 Abandoned US20160283865A1 (en) 2012-04-24 2016-06-10 System and method for distinguishing human swipe input sequence behavior

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/866,171 Active 2034-08-08 US9305260B2 (en) 2012-04-24 2013-04-19 System and method for distinguishing human swipe input sequence behavior
US15/057,241 Active US9542541B1 (en) 2012-04-24 2016-03-01 System and method for distinguishing human swipe input sequence behavior

Country Status (1)

Country Link
US (3) US9305260B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10437971B2 (en) 2012-11-06 2019-10-08 Behaviosec Inc. Secure authentication of a user of a device during a session with a connected server

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10728761B2 (en) 2010-11-29 2020-07-28 Biocatch Ltd. Method, device, and system of detecting a lie of a user who inputs data
US10834590B2 (en) 2010-11-29 2020-11-10 Biocatch Ltd. Method, device, and system of differentiating between a cyber-attacker and a legitimate user
US10474815B2 (en) 2010-11-29 2019-11-12 Biocatch Ltd. System, device, and method of detecting malicious automatic script and code injection
US10404729B2 (en) 2010-11-29 2019-09-03 Biocatch Ltd. Device, method, and system of generating fraud-alerts for cyber-attacks
US10586036B2 (en) 2010-11-29 2020-03-10 Biocatch Ltd. System, device, and method of recovery and resetting of user authentication factor
US10298614B2 (en) * 2010-11-29 2019-05-21 Biocatch Ltd. System, device, and method of generating and managing behavioral biometric cookies
US10621585B2 (en) 2010-11-29 2020-04-14 Biocatch Ltd. Contextual mapping of web-pages, and generation of fraud-relatedness score-values
US10685355B2 (en) * 2016-12-04 2020-06-16 Biocatch Ltd. Method, device, and system of detecting mule accounts and accounts used for money laundering
US10897482B2 (en) 2010-11-29 2021-01-19 Biocatch Ltd. Method, device, and system of back-coloring, forward-coloring, and fraud detection
US10262324B2 (en) 2010-11-29 2019-04-16 Biocatch Ltd. System, device, and method of differentiating among users based on user-specific page navigation sequence
US10776476B2 (en) 2010-11-29 2020-09-15 Biocatch Ltd. System, device, and method of visual login
US10747305B2 (en) 2010-11-29 2020-08-18 Biocatch Ltd. Method, system, and device of authenticating identity of a user of an electronic device
US10055560B2 (en) * 2010-11-29 2018-08-21 Biocatch Ltd. Device, method, and system of detecting multiple users accessing the same account
US9305260B2 (en) * 2012-04-24 2016-04-05 Behaviometrics Ab System and method for distinguishing human swipe input sequence behavior
US9236052B2 (en) 2013-06-20 2016-01-12 Bank Of America Corporation Utilizing voice biometrics
US9215321B2 (en) 2013-06-20 2015-12-15 Bank Of America Corporation Utilizing voice biometrics
KR102195314B1 (en) * 2013-08-28 2020-12-24 삼성전자주식회사 An electronic device and operating metod thereof
US9380041B2 (en) 2013-09-30 2016-06-28 Bank Of America Corporation Identification, verification, and authentication scoring
US20150178374A1 (en) * 2013-12-23 2015-06-25 Trusteer Ltd. Method and system of providing user profile detection from an input device
US9531710B2 (en) * 2014-05-09 2016-12-27 Behaviometrics Ab Behavioral authentication system using a biometric fingerprint sensor and user behavior for authentication
US10440019B2 (en) * 2014-05-09 2019-10-08 Behaviometrics Ab Method, computer program, and system for identifying multiple users based on their behavior
US9529987B2 (en) * 2014-05-09 2016-12-27 Behaviometrics Ab Behavioral authentication system using a behavior server for authentication of multiple users based on their behavior
GB2539705B (en) 2015-06-25 2017-10-25 Aimbrain Solutions Ltd Conditional behavioural biometrics
US10069837B2 (en) * 2015-07-09 2018-09-04 Biocatch Ltd. Detection of proxy server
US10345914B2 (en) * 2016-01-26 2019-07-09 Infinity Augmented Reality Israel Ltd. Method and system for generating a synthetic database of postures and gestures
US10489577B2 (en) 2016-08-11 2019-11-26 Onenigma LLC Identifying one or more users based on typing pattern and/or behavior
JP6349366B2 (en) * 2016-10-14 2018-06-27 株式会社 みずほ銀行 Service management system, service management method, and service management program
US10579784B2 (en) 2016-11-02 2020-03-03 Biocatch Ltd. System, device, and method of secure utilization of fingerprints for user authentication
US10397262B2 (en) 2017-07-20 2019-08-27 Biocatch Ltd. Device, system, and method of detecting overlay malware

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7973773B2 (en) * 1995-06-29 2011-07-05 Pryor Timothy R Multipoint, virtual control, and force based touch screen applications
US7908216B1 (en) * 1999-07-22 2011-03-15 Visa International Service Association Internet payment, authentication and loading system using virtual smart card
US9135598B2 (en) * 2001-11-06 2015-09-15 Excel Communications Anonymous reporting system
JP2003271967A (en) * 2002-03-19 2003-09-26 Fujitsu Prime Software Technologies Ltd Program, method and device for authentication of handwritten signature
US7116805B2 (en) * 2003-01-07 2006-10-03 Avago Technologies ECBU IP (Singapore) Pte. Ltd. Fingerprint verification device
WO2008008473A2 (en) * 2006-07-11 2008-01-17 Agent Science Technologies, Inc. Behaviormetrics application system for electronic transaction authorization
US8497836B2 (en) * 2008-05-06 2013-07-30 Cisco Technology, Inc. Identifying user by measuring pressure of button presses on user input device
US9400879B2 (en) * 2008-11-05 2016-07-26 Xerox Corporation Method and system for providing authentication through aggregate analysis of behavioral and time patterns
US8088332B2 (en) * 2009-04-28 2012-01-03 Chem Spectra, Inc. Explosive or drug detection system for shipping containers
US20110304531A1 (en) * 2010-06-10 2011-12-15 Peter Brooks Method and system for interfacing and interaction with location-aware devices
US20120266220A1 (en) * 2010-11-17 2012-10-18 Sequent Software Inc. System and Method for Controlling Access to a Third-Party Application with Passwords Stored in a Secure Element
US10476873B2 (en) * 2010-11-29 2019-11-12 Biocatch Ltd. Device, system, and method of password-less user authentication and password-less detection of user identity
US8938787B2 (en) * 2010-11-29 2015-01-20 Biocatch Ltd. System, device, and method of detecting identity of a user of a mobile electronic device
WO2012128916A2 (en) * 2011-03-24 2012-09-27 AYaH, LLC Method for generating a human likeness score
US8803825B2 (en) * 2011-09-27 2014-08-12 Carefusion 303, Inc. System and method for filtering touch screen inputs
US9305260B2 (en) * 2012-04-24 2016-04-05 Behaviometrics Ab System and method for distinguishing human swipe input sequence behavior
US9063612B2 (en) * 2012-12-10 2015-06-23 Intel Corporation Techniques and apparatus for managing touch interface
IN2013MU01148A (en) * 2013-03-26 2015-04-24 Tata Consultancy Services Ltd

Also Published As

Publication number Publication date
US20130282637A1 (en) 2013-10-24
US9305260B2 (en) 2016-04-05
US9542541B1 (en) 2017-01-10

Similar Documents

Publication Publication Date Title
US10776463B2 (en) Active authentication of users
US10395018B2 (en) System, method, and device of detecting identity of a user and authenticating a user
Bo et al. Silentsense: silent user identification via touch and movement behavioral biometrics
US9195878B2 (en) Method of controlling an electronic device
Alzubaidi et al. Authentication of smartphone users using behavioral biometrics
CN104239761B (en) The identity for sliding behavioural characteristic based on touch screen continues authentication method
JP6409507B2 (en) Method, computer system, and program for associating user activity label with gaze
AU2015314949B2 (en) Classification of touch input as being unintended or intended
Gascon et al. Continuous authentication on mobile devices by analysis of typing motion behavior
US8856543B2 (en) User identification with biokinematic input
CN104778397B (en) Information processor and its method
US9985787B2 (en) Continuous monitoring of fingerprint signature on a mobile touchscreen for identity management
CN107077551B (en) Scalable authentication process selection based on sensor input
US9754149B2 (en) Fingerprint based smart phone user verification
US9665703B2 (en) Device, system, and method of detecting user identity based on inter-page and intra-page navigation patterns
US9788203B2 (en) System and method for implicit authentication
Mahfouz et al. A survey on behavioral biometric authentication on smartphones
US9541995B2 (en) Device, method, and system of detecting user identity based on motor-control loop model
Shahzad et al. Secure unlocking of mobile touch screen devices by simple gestures: You can see it but you can not do it
Tian et al. KinWrite: Handwriting-Based Authentication Using Kinect.
KR101280050B1 (en) Location-based security system for portable electronic device
Zheng et al. You are how you touch: User verification on smartphones via tapping behaviors
US8752146B1 (en) Providing authentication codes which include token codes and biometric factors
Serwadda et al. When kids' toys breach mobile phone security
US8695086B2 (en) System and method for user authentication

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION