US20200128006A1 - Gesture-Based Signature Authentication - Google Patents
Gesture-Based Signature Authentication
- Publication number: US20200128006A1 (U.S. application Ser. No. 16/716,983)
- Authority: US (United States)
- Prior art keywords
- representation
- mobile device
- soc
- gesture
- biometric
- Prior art date: 2009-12-30 (filing date of the earliest application in the continuation chain, U.S. Ser. No. 12/655,380)
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04L63/0861 — Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
- H04L63/083 — Network architectures or network communication protocols for network security for authentication of entities using passwords
- H04L63/10 — Network architectures or network communication protocols for network security for controlling access to devices or network resources
- H04W12/06 — Security arrangements; Authentication
- H04W12/68 — Context-dependent security; Gesture-dependent or behaviour-dependent
- H04W88/02 — Terminal devices
- G06V40/20 — Recognition of biometric, human-related or animal-related patterns in image or video data; Movements or behaviour, e.g. gesture recognition
- G06V40/37 — Writer recognition; Reading and verifying signatures based only on signature signals such as velocity or pressure, e.g. dynamic signature recognition
- G06V40/394 — Dynamic signature recognition; Matching; Classification
- G06V40/70 — Multimodal biometrics, e.g. combining information from different biometric modalities
- G07C9/35 — Individual registration on entry or exit not involving the use of a pass in combination with an identity check by means of a handwritten signature
- G07C9/37 — Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
- G07C2209/14 — Indexing scheme relating to groups G07C9/00-G07C9/38; With a sequence of inputs of different identification information
Abstract
Embodiments of the invention are generally directed to systems, methods, devices, and machine-readable mediums for implementing gesture-based signature authentication. In one embodiment, a method may involve recording a first gesture-based signature and storing the recorded first gesture-based signature. The method then compares the first gesture-based signature with a second gesture-based signature and verifies the first gesture-based signature as authentic when it is substantially similar to the second gesture-based signature.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/293,951, filed Mar. 6, 2019, which is a continuation of U.S. patent application Ser. No. 15/992,404, filed May 30, 2018, now U.S. Pat. No. 10,305,897, issued May 28, 2019, which is a continuation of U.S. patent application Ser. No. 15/397,906, filed Jan. 4, 2017, now U.S. Pat. No. 10,015,166, issued Jul. 3, 2018, which is a continuation of U.S. patent application Ser. No. 15/048,078, filed Feb. 19, 2016, now U.S. Pat. No. 9,560,044, issued Jan. 31, 2017, which is a continuation of U.S. patent application Ser. No. 12/655,380, filed Dec. 30, 2009, now U.S. Pat. No. 9,292,731, issued Mar. 22, 2016, the content of which is hereby incorporated by reference.
- Embodiments of the invention generally relate to the field of integrated circuits and, more particularly, to systems, methods, devices, and machine-readable mediums for gesture-based signature authentication.
- There are a number of software applications that require authentication. For example, many e-commerce, home banking, and network access applications need authentication to provide the user a level of security. In most cases the need for authentication is addressed by requiring the user to enter a text-based password or passphrase. Text-based passwords, and especially passphrases, are difficult to input on a small mobile device, and their use is restricted by the availability of suitable keyboards. Additionally, text-based passwords and passphrases can be easily compromised once a malicious user determines the correct password or passphrase.
- Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
- FIG. 1 illustrates several examples of motion gestures that may be utilized as gesture-based signatures.
- FIG. 2 illustrates an embodiment of the glyph capture buffer, audio capture buffer, and 3D movement buffer in memory.
- FIG. 3 is a flow diagram of an embodiment of a gesture-based signature registration phase.
- FIG. 4 is a flow diagram of an embodiment of a process to authenticate a gesture-based signature.
- FIG. 5 illustrates an embodiment of a computing device that implements a gesture-based authentication process.
- Embodiments are generally directed to systems, methods, and apparatuses for implementing gesture-based signature authentication.
- In many embodiments, a combination of gestures, including glyphs entered on a touch screen (which may be single-touch or multi-touch capable), sounds captured by a microphone, and movements registered by motion sensors, may be used in place of a standard text-based password on a mobile computing device. The term "gesture" will be applied to any combination of glyphs, movements, and sounds that the user enters into the mobile computing device through its input devices. A gesture-based signature can be verified as performed by the authentic user (i.e., authenticated) through a series of calculations. Once the gesture-based signature has been authenticated, it can be substituted for the correct text-based password, which is then input into a password field to gain access to a software application or website. Logic to perform the gesture identification, comparison, and verification processes may be present within the mobile computing device. This approach does not require significant changes to application or website infrastructure.
- FIG. 1 illustrates several examples of motion gestures that may be utilized as gesture-based signatures. Each of the samples involves a user holding the mobile computing device A and making a series of movements in three-dimensional space. In order for these movements to be recognized, mobile device A needs an integrated motion sensor device (e.g., an accelerometer).
- A coordinate space reference at the top of the page illustrates the positive X, Y, and Z directions in three-dimensional space. Based on that reference, we now turn to gesture sample signature A 100, which shows mobile device A first moving in the positive Z direction, then the negative Z direction, then a clockwise circle, and finally the negative X direction. This description is rudimentary for ease of explanation; in a real example, a user's hand is generally not going to travel only back and forth along an axis in three-dimensional space.
- Rather, in a true example of motion capture in three-dimensional space, a buffer is created to store X, Y, Z coordinate triplets. Each coordinate triplet is stored at the end of a fixed time interval since the previous triplet was acquired. For example, mobile device A may store a coordinate triplet of its current location in 3D space every 0.001 seconds, creating a time-series record of the device's location at 0.001-second granularity. In other embodiments, the interval granularity may be shorter or longer.
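- For illustration, a minimal sketch of this fixed-interval sampling in Python; the read_position callable and the default capture duration are assumptions for the example, since the patent does not define a sensor interface:

```python
import time

def capture_3d_movement(read_position, interval_s=0.001, duration_s=2.0):
    """Sample (x, y, z) coordinate triplets at a fixed interval.

    read_position is a hypothetical callable returning the device's current
    location relative to the zeroed reference point at capture start.
    """
    buffer = []
    t0 = time.monotonic()
    next_tick = t0
    while next_tick - t0 < duration_s:
        elapsed = time.monotonic() - t0
        buffer.append((round(elapsed, 3), read_position()))
        next_tick += interval_s
        delay = next_tick - time.monotonic()
        if delay > 0:
            time.sleep(delay)
    return buffer  # [(elapsed_seconds, (x, y, z)), ...]
```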
- In many embodiments, a user presses a button on the mobile device to begin capture of the signature. Once the button is pressed, the 3D coordinates of the device's current location may be zeroed out as a relative reference point. Coordinate capture then begins, and coordinate triplets are continuously stored until the user presses a button again to indicate that the gesture-based signature is complete.
- Gesture sample signature B 102 utilizes mobile device B, which includes a touch screen display. The touch screen display may be utilized to capture a glyph. In some embodiments, a user may use their hand as the input instrument on the touch screen; in other embodiments, the user may use a pen or other pointing device. In different embodiments the glyph may be any combination of movements captured by the touch screen. The combination of movements may resemble a collection of letters or symbols, such as a handwritten signature, but any combination of movements is acceptable, so even seemingly random movements that do not resemble letters or symbols may be captured as the glyph.
- Glyph capture may resemble the 3D space motion capture: a buffer may be created that stores X, Y coordinate pairs, with one pair saved into the buffer every time interval. This creates a time-based record of how the glyph was created on the touch screen.
- Furthermore, the mobile devices each include a microphone (e.g., the audio receiver in a cell phone). A word or phrase may be spoken into the microphone and saved as an audio frequency pattern over time in a buffer in the mobile device.
- Returning to gesture sample signature B 102, the user may enter a glyph on the touch screen and then speak a word (e.g., "football") into the microphone on the mobile device as a combined glyph-and-audio gesture-based signature. Gesture sample signature C 104 illustrates a gesture-based signature utilizing all three types of input methods. For example, the user may enter a glyph on the touch screen, then speak a phrase into the microphone on the mobile device (e.g., "Umbrella"), and then move the mobile device in 3D space.
- In many embodiments, two or more types of gestures may be entered simultaneously as a more advanced requirement for the gesture-based signature. For example, in gesture sample signature C 104, the user might first enter the glyph, but then speak the phrase and move the device in 3D space simultaneously. In many embodiments, a user may press a button on the mobile device to indicate the start of the gesture capture sequence, at which point the glyph capture buffer, audio capture buffer, and 3D movement buffer are each initialized. The three buffers are then utilized simultaneously to capture all three types of gestures (i.e., glyph, voice, and 3D movement) throughout the entire length of the gesture capture sequence. The gesture capture sequence ends as soon as an end-capture button is pressed on the mobile device.
- FIG. 2 illustrates an embodiment of the glyph capture buffer, audio capture buffer, and 3D movement buffer in memory.
- In many embodiments, memory 200 includes each of a glyph capture buffer 202, an audio capture buffer 204, and a 3D movement buffer 206. These buffers are initialized at the start of a gesture capture sequence. In many embodiments, these buffers line up by acquisition times, as shown by the time column to the left of the buffers. For example, if Time 0 represents the start of a gesture capture period and there are n capture intervals, then Time 0+n intervals is the end of the gesture capture period. Each time interval has a saved amount of glyph data (X,Y coordinates), audio data (an instantaneous frequency map of the audible spectrum as captured by the microphone), and 3D location data (X,Y,Z coordinates) in the respective buffers.
- Once the capture period has completed, the buffers may be saved to a gesture-based signature file and stored in a storage location. In many embodiments, the storage location of the gesture-based signature file may be secured through the use of hardware- or software-based security measures (e.g., signed public and private security keys, hash algorithms, hardware measurement techniques, etc.). In other embodiments, the storage location may be a remote server reached over a secure channel, allowing connected comparison and use of the signature.
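- A minimal sketch of that save step, assuming three buffers filled in lockstep (one entry per capture interval); the JSON layout and field names are illustrative, not a format defined by the patent:

```python
import json

def save_signature_file(glyph_buf, audio_buf, move_buf, interval_s, path):
    """Serialize the three time-aligned buffers to a signature file.

    Index i in every buffer corresponds to acquisition time i * interval_s,
    mirroring the shared time column of FIG. 2.
    """
    assert len(glyph_buf) == len(audio_buf) == len(move_buf)
    records = [
        {"t": i * interval_s,
         "glyph_xy": glyph_buf[i],    # (x, y) touch sample, or None
         "audio": audio_buf[i],       # instantaneous frequency snapshot
         "move_xyz": move_buf[i]}     # (x, y, z) device location
        for i in range(len(glyph_buf))
    ]
    with open(path, "w") as f:
        json.dump(records, f)
```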
- In order to harden the password, several variables are taken into account, including the timing and the way in which a glyph is drawn on the screen, the time and speed at which a phrase is spoken in relation to the drawing of the glyph, the frequency of the user's voice, etc. By taking all of these data points into account, even if the spoken phrase and the handwritten signature are compromised, it remains difficult to forge the gesture-based signature. As a basic example of utilizing time as a factor in the comparison of two gesture-based signatures, a relative temporal (i.e., time) element may be introduced for the entire capture period length. In other words, the length of time it takes to fully record two separate attempts at the same gesture-based signature may need to be within a predetermined maximum time discrepancy when comparing the two signatures. So if a first signature very nearly duplicates the coordinates of a second signature, but the first signature takes twice as long to complete, comparison logic may return a "no match."
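- A sketch of that coarse duration gate; the 20% tolerance is an illustrative value, as the patent leaves the predetermined maximum discrepancy unspecified:

```python
def durations_match(registered_len_s, attempt_len_s, tolerance=0.20):
    """Coarse temporal gate: reject an attempt whose total capture time
    strays too far from the registered signature's, even if its
    coordinates nearly duplicate the registered ones."""
    return abs(attempt_len_s - registered_len_s) <= tolerance * registered_len_s
```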
- There are two general phases associated with gesture-based signature authentication. The first phase is the registration phase, which deals with generating a new gesture and safely storing it in the mobile device. The new gesture may require a training period during which the user becomes familiar with the gesture and can substantially repeat the gesture combination desired as the password. Once the user is confident that he or she can repeatedly enter the full gesture without a substantial chance of gesture comparison logic rejecting it due to discrepancies, the gesture signature file may be stored for future use.
- In many embodiments, gesture identification and comparison logic within the mobile device has the user enter the gesture-based signature several times and averages the coordinate, audio, and time data stored in the buffers during each capture period. Additionally, the variability in the user's ability to recreate the gesture-based signature may cause a maximum discrepancy threshold to increase or decrease. The maximum discrepancy threshold may be a per-time-interval discrepancy between the averaged coordinate and audio data and the most recently captured coordinate and audio data. For example, if the observed glyph coordinate data varies widely between several training capture periods, the gesture comparison and verification logic may allow greater variability in the X,Y glyph data when judging the most recently recorded signature against a stored version, as in the sketch below.
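- One plausible reading of this averaging-plus-variability scheme; the scale factor and floor applied to the observed spread are assumptions, not values from the patent:

```python
import statistics

def build_template(training_runs):
    """Average same-length training captures into a template and derive a
    per-interval tolerance from how much the runs disagree.

    training_runs: a list of captures, each a list of (x, y) samples;
    runs are truncated to the shortest common length.
    """
    n = min(len(run) for run in training_runs)
    template, tolerance = [], []
    for i in range(n):
        xs = [run[i][0] for run in training_runs]
        ys = [run[i][1] for run in training_runs]
        template.append((statistics.mean(xs), statistics.mean(ys)))
        # Wider observed spread -> looser threshold for this interval.
        # The 2.0 scale and 1.0 floor are illustrative, not from the patent.
        spread = statistics.pstdev(xs) + statistics.pstdev(ys)
        tolerance.append(max(2.0 * spread, 1.0))
    return template, tolerance
```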
- Glyph and 3D movement gestures are represented by a finite list of coordinates, one or more per acquisition time (more than one for multi-touch devices), together with their corresponding acquisition times. This makes a handwritten glyph-based and/or 3D movement-based signature harder to forge, since the stored time-based coordinate information reveals the way the signature is drawn on the touch screen or in the air, as well as the speed at which each stroke is usually performed.
- In many embodiments, the gesture-based signature file is stored as a list of tuples of the form (input device, input value, acquisition time), for example ((touch screen); (8,33); 0).
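- As a concrete (hypothetical) instance of that form, extending the single touch screen example from the text; the microphone and motion sensor value encodings are assumptions:

```python
# Hypothetical signature-file contents as (input device, input value,
# acquisition time) tuples. Only the first entry's form is given in the
# text; the audio and motion encodings below are illustrative assumptions.
signature_file = [
    ("touch screen",  (8, 33),          0),  # glyph sample (x, y)
    ("touch screen",  (9, 35),          1),
    ("microphone",    (440.0,),         1),  # dominant frequency in Hz
    ("motion sensor", (0.0, 0.1, -0.2), 2),  # device location (x, y, z)
]
```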
- Each registered gesture-based signature may be associated with a given application or website. Therefore, additional information is requested from the user, including the site/application, the username, and the text-based password if the user already possesses one. If the user does not already possess such a password, a text-based password can be automatically generated in this phase. This text-based password information is stored in a safe storage location.
- FIG. 3 is a flow diagram of an embodiment of a gesture-based signature registration phase.
- The process is performed by processing logic, which may comprise hardware (e.g., circuitry), software (e.g., an operating system or application), firmware (e.g., microcode), or any combination of two or more of the listed types of processing logic.
- The process begins with a user starting the signature registration processing logic to register a gesture-based signature (processing block 300). Next, a user enters a gesture-based signature into the processing logic (processing block 302). In many embodiments, block 302 is performed several times until the entry has been sufficiently learned by the processing logic, including the variability levels of each type of gesture the user makes for the signature being registered.
- Processing logic then determines if a username exists for the user at the application or website for which the user wants to implement a gesture-based signature (processing block 304). If the username exists, then processing logic associates the gesture-based signature with the website/application, the username, and the password. The ASCII-based (i.e., text-based) password will correspond to the gesture-based signature. For example, if a user wants to log in to a website that requires a username and password, processing logic first associates a valid text-based username and password that allow the user to gain access to the website. Once the valid text-based username and password have been identified, processing logic may associate this information with a particular gesture-based signature file. Processing logic then stores the gesture-based signature in a secure storage location (processing block 308).
- Returning to block 304, if the username does not exist for the application or website, then processing logic generates an ASCII-based password from the signature (processing block 310) and associates the gesture-based signature with the website/application and username (processing block 312). Finally, processing logic stores the gesture-based signature in a secure storage location (processing block 308).
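- A compact sketch of this registration flow, with the FIG. 3 block numbers noted in comments; the callables are hypothetical seams rather than interfaces defined by the patent:

```python
def register_signature(site, username, existing_password,
                       capture_signature, generate_password, secure_store):
    """Sketch of the FIG. 3 registration flow."""
    # Blocks 300/302: capture the gesture-based signature (in practice
    # repeated until its variability has been sufficiently learned).
    signature = capture_signature()
    if existing_password is not None:
        # Block 304 branch (username/password exist): associate the
        # signature with the website/application, username, and password.
        password = existing_password
    else:
        # Blocks 310/312: derive an ASCII password from the signature, then
        # associate the signature with the website/application and username.
        password = generate_password(signature)
    # Block 308: store everything in a secure storage location.
    secure_store({"site": site, "username": username,
                  "password": password, "signature": signature})
    return password
```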
- The authentication phase deals with the moment when the signature is being compared with the registered one for authentication purposes. Specifically, the process to authenticate a user proceeds as follows. An application or website requires the user to enter his/her username and password. The application/website developer uses a provided API (application programming interface) that will launch a graphical interface asking the user to enter a gesture-based signature.
- The user then enters a gesture-based signature, which is compared to the registered signature for that site/application. Since no two signatures from the same user are likely to be identical, algorithms that take into account variability, as mentioned above, may be used to compare the registered gesture-based signature with the newly entered one. Compared signatures would be required to be substantially similar in regard to time, coordinates, and voice frequencies. Just how similar is signature specific, since the allowed variability may differ depending on how well the user could repeat the signature during the initial training capture periods. In some embodiments, a process described as Dynamic Type Warping may be utilized. Dynamic Type Warping performs point-to-point correspondence for the data at each acquisition time; a smart selection of acquisition points may then be compared.
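- The patent describes Dynamic Type Warping only as point-to-point correspondence at each acquisition time, which closely resembles classic dynamic time warping; the sketch below implements the classic recurrence as one plausible reading:

```python
def dtw_distance(seq_a, seq_b, dist):
    """Dynamic time warping: minimum cumulative point-to-point distance
    over all monotonic alignments of two sample sequences."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],       # stretch seq_b
                                 cost[i][j - 1],       # stretch seq_a
                                 cost[i - 1][j - 1])   # advance both
    return cost[n][m]

# Example: compare two glyph traces of (x, y) samples.
euclid = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
score = dtw_distance([(0, 0), (1, 1), (2, 2)], [(0, 0), (2, 2)], euclid)
```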
- In some embodiments, the gesture-based signature comparison processing logic may be offloaded to a backend server on a network to which the mobile computing device is also coupled. In many embodiments that utilize a backend server for comparing the two signatures, the registered signature may be stored at the backend server.
- Once the newly entered gesture-based signature is matched to a registered signature, the associated text-based password is returned to the external application/website, which then permits the user to log in.
- FIG. 4 is a flow diagram of an embodiment of a process to authenticate a gesture-based signature.
- The process is performed by processing logic, which may comprise hardware (e.g., circuitry), software (e.g., an operating system or application), firmware (e.g., microcode), or any combination of two or more of the listed types of processing logic.
- The process begins by processing logic in the application or web browser calling gesture-based signature comparison logic through the provided API (processing block 400). Then the user enters the gesture-based signature (processing block 402). Processing logic then compares the gesture-based signature that has been newly entered by the user with an existing registered signature in storage (processing block 404).
- Processing logic then determines if the signatures match (processing block 406). If the signatures do not match, the signature entry has failed and the process returns to block 402 for the user to re-enter the gesture-based signature. Otherwise, if the signatures match, processing logic returns the stored ASCII-based password to the application/browser for use (processing block 408) and the process is finished.
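- A sketch of this authentication loop, mirroring the FIG. 4 blocks; the callables are hypothetical seams, and like the flow diagram it retries indefinitely on a mismatch, which a deployed implementation would presumably cap:

```python
def authenticate(capture_signature, load_registered, signatures_match):
    """Sketch of the FIG. 4 flow."""
    registered = load_registered()          # reached via the API, block 400
    while True:
        attempt = capture_signature()       # block 402
        if signatures_match(registered["signature"], attempt):  # blocks 404/406
            return registered["password"]   # block 408: release the password
        # No match: loop back to block 402 and ask the user to re-enter.
```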
- FIG. 5 illustrates an embodiment of a computing device that implements a gesture-based authentication process.
- Computer system 500 is shown. The computer system in FIG. 5 generally comprises a system on a chip (SoC) layout. The SoC layout may be utilized in any type of computer system but is useful for small form factor mobile computing devices, such as cellular phones, smart phones, and personal digital assistants (PDAs).
- The computer system 500 includes a central processing unit (CPU) 502. In a SoC layout, it is common to have a single CPU, though in other embodiments that are not shown, one or more additional CPUs are also located in computer system 500.
- CPU 502 may be an Intel® Corporation CPU or a CPU of another brand. CPU 502 includes one or more cores. In the embodiment shown, CPU 502 includes Core A (504), Core B (506), Core C (508), and Core D (510). Only one core is needed for operation of the computer system, but additional cores can distribute workloads and potentially increase overall system performance. In many embodiments, each core (such as Core A (504)) includes internal functional blocks such as one or more execution units, retirement units, a set of general purpose and specific registers, etc. If the cores shown in FIG. 5 are multi-threaded or hyper-threaded, then each hardware thread may be considered a core as well.
- CPU 502 may also include one or more caches, such as last level cache (LLC) 512. In many embodiments that are not shown, additional caches other than cache 512 are implemented where multiple levels of cache exist between the execution units in each core and memory. In different embodiments cache 512 may be apportioned in different ways. Cache 512 may be one of many different sizes in different embodiments. For example, cache 512 may be an 8 megabyte (MB) cache, a 16 MB cache, etc. Additionally, in different embodiments the cache may be a direct mapped cache, a fully associative cache, a multi-way set-associative cache, or a cache with another type of mapping. The cache may include one large portion shared among all cores or may be divided into several separate functional slices (e.g., one slice for each core). Each cache may also include one portion shared among all cores and several other portions that are separate functional slices per core.
- In many embodiments, CPU 502 includes a system memory controller 514 to provide an interface to communicate with system memory 516. System memory 516 may comprise dynamic random access memory (DRAM), such as a type of double data rate (DDR) DRAM, non-volatile memory such as flash memory, phase change memory (PCM), or another type of memory technology. System memory 516 may be a general purpose memory to store data and instructions to be operated upon by CPU 502. Additionally, there may be other devices within computer system 500 with the capability to read from and write to the system memories, such as a direct memory access (DMA)-capable I/O (input/output) device. The link (i.e., bus, interconnect, etc.) that couples CPU 502 with system memory 516 may include one or more optical, metal, or other wires (i.e., lines) that are capable of transporting data, address, control, and clock information.
- CPU 502 also may include an integrated graphics subsystem 518, which is capable of computing pixel, vertex, and geometry data to be displayed on display device 520. CPU 502 additionally may include a communication subsystem 522 that provides an I/O interface to communicate with external devices. The communication subsystem 522 may include both wired 524 and wireless 526 interfaces. The wired interface 524 may be an Ethernet compatible interface, in some embodiments. The wireless interface 526 (through one or more antenna components for transmitting and receiving) may be compatible with several wireless communication protocols. For example, the communication subsystem 522 wireless interface 526 may communicate through an IEEE 802.11-based protocol, a Bluetooth protocol, a cellular protocol, a WiMAX protocol, and/or one or more other wireless protocols.
- CPU 502 also includes a storage controller 528 to provide an interface to a mass storage device 530. Mass storage device 530 may be a hard disk drive, a solid state drive, or another form of mass storage. Additionally, CPU 502 is capable of communicating with I/O devices, such as I/O device 532 and I/O device 534, through I/O adapters 536 and 538, respectively. The I/O adapters each may allow the CPU 502 to communicate with one or more I/O devices through a certain protocol. For example, one I/O adapter may be a Universal Serial Bus (USB) adapter to allow for plug-in communication through USB ports between the CPU 502 and external USB interfaces.
- An input interface 540 allows the computer system 500 to be coupled to input devices such as a touchscreen 542 or microphone 544. Additionally, a motion sensor unit 546 is located on the system, which tracks the movement of computer system 500 in three-dimensional space.
- In many other embodiments that are not shown, the computing system may be implemented in a different way, such as in a standard CPU/chipset configuration instead of as a SoC design.
- In many embodiments, gesture-based signature identification, comparison, and verification logic may be present in any one of the following locations. When at least a portion of the logic is implemented in software, the logic may be present in system memory 516 (logic 600), mass storage 530 (logic 602), cache 512 (logic 604), or potentially in any core (not shown). When at least a portion of the logic is implemented in hardware, the logic may be present in the general circuitry (uncore) of the CPU 502 outside of the cores (logic 606).
- Elements of embodiments of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions. The machine-readable medium may include, but is not limited to, flash memory, optical disks, compact disc read-only memory (CD-ROM), digital versatile/video disk (DVD) ROM, random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, propagation media, or other types of machine-readable media suitable for storing electronic instructions. For example, embodiments of the invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
- In the description above, certain terminology is used to describe embodiments of the invention. For example, the term “logic” is representative of hardware, firmware, software (or any combination thereof) to perform one or more functions. For instance, examples of “hardware” include, but are not limited to, an integrated circuit, a finite state machine, or even combinatorial logic. The integrated circuit may take the form of a processor such as a microprocessor, an application specific integrated circuit, a digital signal processor, a micro-controller, or the like.
- It should be appreciated that reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the invention.
- Similarly, it should be appreciated that in the foregoing description of embodiments of the invention, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description.
Claims (20)
1. A mobile device comprising:
a system on chip (SoC) including a first core and a second core;
a plurality of input devices; and
a storage coupled to the SoC, the storage to store a set of instructions which, if executed by the SoC, causes the SoC to perform a method comprising:
capturing a first biometric-based input of a user obtained from one or more of the plurality of input devices;
generating a first representation using the first biometric-based input;
storing the first representation in a secure storage;
capturing a second biometric-based input of the user obtained from the one or more of the plurality of input devices;
generating a second representation using the second biometric-based input;
comparing the first representation with the second representation;
authenticating the user in response to the first representation matching the second representation; and
after authenticating the user in response to the first representation matching the second representation, returning a password to a website to permit the user to access the website, the password associated with the website and stored in another storage location and obtained during a registration.
2. The mobile device of claim 1, wherein the storage is further to store instructions which, if executed by the SoC, cause the SoC to register the first biometric-based input prior to the capture of the second biometric-based input.
3. The mobile device of claim 2, wherein the storage is further to store instructions which, if executed by the SoC, cause the SoC to associate the first biometric-based input with the website and further associate the first biometric-based input with a username.
4. The mobile device of claim 1, wherein the storage is further to store instructions which, if executed by the SoC, cause the mobile device to present a graphical user interface on a display of the mobile device to request input of the first biometric-based input.
5. The mobile device of claim 1, wherein the storage is further to store instructions which, if executed by the SoC, cause the SoC to obtain a plurality of first biometric-based inputs each associated with a given website.
6. The mobile device of claim 1, wherein the storage is further to store instructions which, if executed by the SoC, cause the SoC to request the user to re-enter the second biometric-based input in response to the first representation not matching the second representation.
7. The mobile device of claim 1, wherein the plurality of input devices comprises a touch screen and a pen.
8. The mobile device of claim 1, wherein the mobile device comprises a smart phone.
9. The mobile device of claim 1, wherein the SoC further comprises at least one shared cache memory.
10. The mobile device of claim 1, wherein the website comprises an e-commerce website.
11. The mobile device of claim 1, wherein the first biometric-based input comprises a gesture-based signature.
12. The mobile device of claim 1, wherein the SoC further comprises a graphics processor.
13. The mobile device of claim 1, wherein the SoC further comprises a motion sensor.
14. A mobile device comprising:
a system on chip (SoC) including a first core and a second core;
a plurality of input devices comprising a touch screen; and
a storage coupled to the SoC, the storage to store a set of instructions which, if executed by the SoC, causes the SoC to perform a method comprising:
in a registration phase:
capturing a first biometric-based input of a user obtained from one or more of the plurality of input devices;
generating a first representation using the first biometric-based input; and
storing the first representation in a first storage;
in an authentication phase for an application:
capturing a second biometric-based input of the user obtained from the one or more of the plurality of input devices;
generating a second representation using the second biometric-based input;
comparing the first representation with the second representation; and
authenticating the user in response to the first representation matching the second representation; and
after authenticating the user in response to the first representation matching the second representation, returning a password to the application, the password stored in another storage location and associated with the application.
15. The mobile device of claim 14, wherein the storage is further to store instructions which, if executed by the SoC, cause the SoC to obtain the second biometric-based input comprising a gesture-based signature of the user.
16. The mobile device of claim 14, wherein the storage is further to store instructions which, if executed by the SoC, cause the mobile device to present a graphical user interface on the touch screen to request input of the first biometric-based input.
17. The mobile device of claim 14, wherein the password comprises an ASCII-based password.
18. At least one non-transitory computer readable storage medium having stored thereon instructions, which if performed by a machine cause the machine to perform a method comprising:
in a registration phase:
capturing a first gesture-based input of a user obtained from one or more of a plurality of input devices of a mobile device;
generating a first representation using the first gesture-based input; and
storing the first representation in a first storage location of the mobile device;
in an authentication phase:
capturing a second gesture-based input of the user obtained from the one or more of the plurality of input devices;
generating a second representation using the second gesture-based input;
comparing the first representation with the second representation; and
authenticating the user in response to the first representation matching the second representation; and
after authenticating the user in response to the first representation matching the second representation, sending a password to the application, the password stored in a second storage location of the mobile device and associated with the application.
19. The at least one non-transitory computer readable storage medium of claim 18, further comprising instructions that when executed enable the mobile device to present a graphical user interface on a display of the mobile device to request input of the first gesture-based input.
20. The at least one non-transitory computer readable storage medium of claim 18, further comprising instructions that when executed enable the SoC to request the user to re-enter the second gesture-based input in response to the first representation not matching the second representation.
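For illustration only, the registration and authentication phases recited in claims 1, 14, and 18 could fit together roughly as in the sketch below, which reuses `Sample`, `resample()`, and `matches()` from the earlier sketches; `SecureStore` is a hypothetical stand-in for the device's secure storage, and the website-to-password association is an ordinary dict for brevity.

```python
# Illustrative end-to-end flow for claim 1 (hypothetical names throughout).
import math

def generate_representation(raw_trace):
    # Placeholder feature extraction: reduce each 3-D Sample (see the
    # capture sketch) to its acceleration magnitude, then fix the length.
    mags = [math.sqrt(s.ax ** 2 + s.ay ** 2 + s.az ** 2) for s in raw_trace]
    return resample(mags)

class SecureStore:
    """Stand-in for the secure storage that holds the first representation."""
    def __init__(self):
        self._reps = {}
    def put(self, key, rep):
        self._reps[key] = rep
    def get(self, key):
        return self._reps.get(key)

secure_store = SecureStore()
passwords = {}  # website -> password, obtained during registration

def register(website, username, first_input, password):
    """Registration phase: store the first representation and the password."""
    secure_store.put((website, username), generate_representation(first_input))
    passwords[website] = password  # kept in 'another storage location'

def authenticate(website, username, second_input):
    """Authentication phase: return the password only on a gesture match."""
    first_rep = secure_store.get((website, username))
    second_rep = generate_representation(second_input)
    if first_rep is not None and matches(first_rep, second_rep):
        return passwords[website]  # returned to the website per claim 1
    return None  # caller may prompt re-entry of the gesture (claim 6)
```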
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/716,983 US20200128006A1 (en) | 2009-12-30 | 2019-12-17 | Gesture-Based Signature Authentication |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/655,380 US9292731B2 (en) | 2009-12-30 | 2009-12-30 | Gesture-based signature authentication |
US15/048,078 US9560044B2 (en) | 2009-12-30 | 2016-02-19 | Gesture-based signature authentication |
US15/397,906 US10015166B2 (en) | 2009-12-30 | 2017-01-04 | Gesture-based signature authentication |
US15/992,404 US10305897B2 (en) | 2009-12-30 | 2018-05-30 | Gesture-based signature authentication |
US16/293,951 US10681042B2 (en) | 2009-12-30 | 2019-03-06 | Gesture-based signature authentication |
US16/716,983 US20200128006A1 (en) | 2009-12-30 | 2019-12-17 | Gesture-Based Signature Authentication |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/293,951 Continuation US10681042B2 (en) | 2009-12-30 | 2019-03-06 | Gesture-based signature authentication |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200128006A1 (en) | 2020-04-23 |
Family
ID=43828340
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/655,380 Active 2030-10-02 US9292731B2 (en) | 2009-12-30 | 2009-12-30 | Gesture-based signature authentication |
US15/048,078 Active US9560044B2 (en) | 2009-12-30 | 2016-02-19 | Gesture-based signature authentication |
US15/397,906 Active US10015166B2 (en) | 2009-12-30 | 2017-01-04 | Gesture-based signature authentication |
US15/992,404 Active US10305897B2 (en) | 2009-12-30 | 2018-05-30 | Gesture-based signature authentication |
US16/293,951 Active US10681042B2 (en) | 2009-12-30 | 2019-03-06 | Gesture-based signature authentication |
US16/716,983 Abandoned US20200128006A1 (en) | 2009-12-30 | 2019-12-17 | Gesture-Based Signature Authentication |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/655,380 Active 2030-10-02 US9292731B2 (en) | 2009-12-30 | 2009-12-30 | Gesture-based signature authentication |
US15/048,078 Active US9560044B2 (en) | 2009-12-30 | 2016-02-19 | Gesture-based signature authentication |
US15/397,906 Active US10015166B2 (en) | 2009-12-30 | 2017-01-04 | Gesture-based signature authentication |
US15/992,404 Active US10305897B2 (en) | 2009-12-30 | 2018-05-30 | Gesture-based signature authentication |
US16/293,951 Active US10681042B2 (en) | 2009-12-30 | 2019-03-06 | Gesture-based signature authentication |
Country Status (3)
Country | Link |
---|---|
US (6) | US9292731B2 (en) |
EP (3) | EP3432211A1 (en) |
CN (2) | CN102117392A (en) |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7953983B2 (en) | 2005-03-08 | 2011-05-31 | Microsoft Corporation | Image or pictographic based computer login systems and methods |
US8941466B2 (en) * | 2009-01-05 | 2015-01-27 | Polytechnic Institute Of New York University | User authentication for devices with touch sensitive elements, such as touch sensitive display screens |
US8458485B2 (en) | 2009-06-17 | 2013-06-04 | Microsoft Corporation | Image-based unlock functionality on a computing device |
US9292731B2 (en) | 2009-12-30 | 2016-03-22 | Intel Corporation | Gesture-based signature authentication |
US20110271332A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferencing Services Ltd. | Participant Authentication via a Conference User Interface |
US8689349B2 (en) * | 2010-05-05 | 2014-04-01 | Intel Corporation | Information flow tracking and protection |
US20120225703A1 (en) * | 2010-10-21 | 2012-09-06 | Aibelive Co., Ltd | Method for playing a video game on a mobile device |
US9836780B2 (en) | 2010-11-19 | 2017-12-05 | Mastercard International Incorporated | Method and system for consumer transactions using voice or human based gesture actions |
US10043209B2 (en) | 2010-11-19 | 2018-08-07 | Mastercard International Incorporated | Method and system for consumer transactions using voice or human based gesture actions |
CN102073810B (en) * | 2010-12-06 | 2013-01-23 | 上海合合信息科技发展有限公司 | Method for integrating account management function in input method software |
US8843752B1 (en) * | 2011-01-24 | 2014-09-23 | Prima Cinema, Inc. | Multi-factor device authentication |
US20120194440A1 (en) * | 2011-01-31 | 2012-08-02 | Research In Motion Limited | Electronic device and method of controlling same |
US20120200391A1 (en) * | 2011-02-03 | 2012-08-09 | Sony Corporation, A Japanese Corporation | Method to identify user with security |
US8494967B2 (en) * | 2011-03-11 | 2013-07-23 | Bytemark, Inc. | Method and system for distributing electronic tickets with visual display |
AU2011202415B1 (en) * | 2011-05-24 | 2012-04-12 | Microsoft Technology Licensing, Llc | Picture gesture authentication |
US8752200B2 (en) | 2011-07-12 | 2014-06-10 | At&T Intellectual Property I, L.P. | Devices, systems and methods for security using magnetic field based identification |
US9164648B2 (en) | 2011-09-21 | 2015-10-20 | Sony Corporation | Method and apparatus for establishing user-specific windows on a multi-user interactive table |
US10725563B2 (en) | 2011-10-28 | 2020-07-28 | Wacom Co., Ltd. | Data transfer from active stylus to configure a device or application |
US9389701B2 (en) * | 2011-10-28 | 2016-07-12 | Atmel Corporation | Data transfer from active stylus |
US9773245B1 (en) * | 2011-12-05 | 2017-09-26 | Amazon Technologies, Inc. | Acquiring items using gestures on a touchscreen |
US9147059B2 (en) * | 2012-02-22 | 2015-09-29 | Polytechnic Institute Of New York University | Biometric-rich gestures for authentication on multi-touch devices |
US9519909B2 (en) * | 2012-03-01 | 2016-12-13 | The Nielsen Company (Us), Llc | Methods and apparatus to identify users of handheld computing devices |
CN103294334B (en) * | 2012-03-05 | 2017-03-01 | 北京三星通信技术研究有限公司 | Unlocking screen data access control method and safety control |
JP5993164B2 (en) * | 2012-03-08 | 2016-09-14 | オリンパス株式会社 | COMMUNICATION DEVICE, COMMUNICATION METHOD, AND PROGRAM |
US8473975B1 (en) | 2012-04-16 | 2013-06-25 | The Nielsen Company (Us), Llc | Methods and apparatus to detect user attentiveness to handheld computing devices |
CN102880888A (en) * | 2012-08-28 | 2013-01-16 | 王令徽 | Method for associating inner and outer layers of packages of commodity |
US9374618B2 (en) * | 2012-09-11 | 2016-06-21 | Intel Corporation | Interactive visual advertisement service |
GB2519710A (en) * | 2012-10-09 | 2015-04-29 | Lockheed Corp | Secure gesture |
US9372970B2 (en) | 2012-10-12 | 2016-06-21 | Apple Inc. | Gesture entry techniques |
US9147058B2 (en) * | 2012-10-12 | 2015-09-29 | Apple Inc. | Gesture entry techniques |
US9184921B2 (en) * | 2012-12-14 | 2015-11-10 | Microsoft Technology Licensing, Llc | Input challenge based authentication |
US9223297B2 (en) | 2013-02-28 | 2015-12-29 | The Nielsen Company (Us), Llc | Systems and methods for identifying a user of an electronic device |
TWI501101B (en) | 2013-04-19 | 2015-09-21 | Ind Tech Res Inst | Multi touch methods and devices |
WO2014186010A1 (en) | 2013-05-13 | 2014-11-20 | Ohio University | Motion-based identity authentication of an individual with a communications device |
US20150006385A1 (en) * | 2013-06-28 | 2015-01-01 | Tejas Arvindbhai Shah | Express transactions on a mobile device |
US9824348B1 (en) | 2013-08-07 | 2017-11-21 | Square, Inc. | Generating a signature with a mobile device |
US10083436B1 (en) | 2013-09-30 | 2018-09-25 | Asignio Inc. | Electronic payment systems and methods |
US10460096B2 (en) | 2013-10-30 | 2019-10-29 | Ohio University | Motion-based identity authentication of an individual |
WO2015084392A1 (en) * | 2013-12-06 | 2015-06-11 | Hewlett-Packard Development Company, L.P. | Object-based user authentication |
US9223955B2 (en) * | 2014-01-30 | 2015-12-29 | Microsoft Corporation | User-authentication gestures |
WO2015142031A1 (en) * | 2014-03-21 | 2015-09-24 | Samsung Electronics Co., Ltd. | User terminal apparatus, electronic apparatus, system, and control method thereof |
KR102296180B1 (en) | 2014-03-21 | 2021-09-01 | 삼성전자주식회사 | User terminal apparatus, electronic apparatus, system and control method thereof |
CN104050402A (en) * | 2014-06-12 | 2014-09-17 | 深圳市汇顶科技股份有限公司 | Mobile terminal security certification method and system and mobile terminal |
EP3167445B1 (en) | 2014-07-10 | 2021-05-26 | Intelligent Platforms, LLC | Apparatus and method for electronic labeling of electronic equipment |
US20170013464A1 (en) * | 2014-07-10 | 2017-01-12 | Gila FISH | Method and a device to detect and manage non legitimate use or theft of a mobile computerized device |
CN105447374B (en) * | 2014-09-11 | 2018-08-21 | 塔塔咨询服务有限公司 | Computer implemented system for generating and giving for change authorization code and method |
JP6524762B2 (en) * | 2015-04-03 | 2019-06-05 | 富士通株式会社 | CONTENT DISPLAY CONTROL METHOD, CONTENT DISPLAY CONTROL DEVICE, AND CONTENT DISPLAY CONTROL PROGRAM |
US10419428B2 (en) | 2015-07-05 | 2019-09-17 | NXT-ID, Inc. | System and method to authenticate electronics using electronic-metrics |
US20170053249A1 (en) | 2015-07-30 | 2017-02-23 | NXT-ID, Inc. | Electronic Crypto-Currency Management Method and System |
US9967244B2 (en) * | 2015-10-14 | 2018-05-08 | Microsoft Technology Licensing, Llc | Multi-factor user authentication framework using asymmetric key |
CA3002977C (en) | 2015-11-04 | 2019-01-08 | Screening Room Media, Inc. | Digital content delivery system |
WO2017087981A2 (en) * | 2015-11-20 | 2017-05-26 | Payeazy, Inc. | Systems and methods for authenticating users of a computer system |
US10268814B1 (en) | 2015-12-16 | 2019-04-23 | Western Digital Technologies, Inc. | Providing secure access to digital storage devices |
US11079915B2 (en) | 2016-05-03 | 2021-08-03 | Intelligent Platforms, Llc | System and method of using multiple touch inputs for controller interaction in industrial control systems |
US10845987B2 (en) | 2016-05-03 | 2020-11-24 | Intelligent Platforms, Llc | System and method of using touch interaction based on location of touch on a touch screen |
US10540491B1 (en) | 2016-10-25 | 2020-01-21 | Wells Fargo Bank, N.A. | Virtual and augmented reality signatures |
US10417327B2 (en) * | 2016-12-30 | 2019-09-17 | Microsoft Technology Licensing, Llc | Interactive and dynamically animated 3D fonts |
US10686774B2 (en) | 2017-01-13 | 2020-06-16 | Asignio Inc. | Authentication systems and methods for online services |
US10367805B2 (en) * | 2017-01-25 | 2019-07-30 | Airsig Inc. | Methods for dynamic user identity authentication |
US10452819B2 (en) | 2017-03-20 | 2019-10-22 | Screening Room Media, Inc. | Digital credential system |
WO2019079815A1 (en) | 2017-10-20 | 2019-04-25 | Asignio Inc. | Electronic verification systems and methods |
US11126705B2 (en) | 2017-11-09 | 2021-09-21 | Mastercard International Incorporated | Systems and methods for user authentication using word-gesture pairs |
WO2019240766A1 (en) * | 2018-06-12 | 2019-12-19 | Hewlett-Packard Development Company, L.P. | Gesture based accesses |
US11288347B2 (en) * | 2019-03-07 | 2022-03-29 | Paypal, Inc. | Login from an alternate electronic device |
CN111849103B (en) * | 2019-04-24 | 2021-11-02 | 歌尔股份有限公司 | Vibrating diaphragm for miniature sound generating device and miniature sound generating device |
CN113202353A (en) * | 2020-01-31 | 2021-08-03 | 青岛海尔智能家电科技有限公司 | Control method and control device for intelligent door lock and intelligent door lock |
CN111708607A (en) * | 2020-06-18 | 2020-09-25 | 哈工大机器人(合肥)国际创新研究院 | Web-based online simulation intelligent agv simulation scheduling method and device |
US11804077B2 (en) * | 2021-04-01 | 2023-10-31 | KaiKuTek Inc. | Generic gesture detecting method and generic gesture detecting device |
US11663302B1 (en) * | 2021-12-22 | 2023-05-30 | Devdan Gershon | System and method for quickly accessing a locked electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020163522A1 (en) * | 2001-05-07 | 2002-11-07 | Porter Allen J.C. | Method and apparatus for maintaining secure and nonsecure data in a shared memory system |
US20020188854A1 (en) * | 2001-06-08 | 2002-12-12 | John Heaven | Biometric rights management system |
US20030236947A1 (en) * | 2002-06-24 | 2003-12-25 | Shinya Yamazaki | Prevention of conflicting cache hits without an attendant increase in hardware |
US7173604B2 (en) * | 2004-03-23 | 2007-02-06 | Fujitsu Limited | Gesture identification of controlled devices |
US20070143833A1 (en) * | 2005-12-21 | 2007-06-21 | Conley Kevin M | Voice controlled portable memory storage device |
US20090002316A1 (en) * | 2007-01-31 | 2009-01-01 | Broadcom Corporation | Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith |
US20100250789A1 (en) * | 2009-03-27 | 2010-09-30 | Qualcomm Incorporated | System and method of managing memory at a portable computing device and a portable computing device docking station |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7331523B2 (en) * | 2001-07-13 | 2008-02-19 | Hand Held Products, Inc. | Adaptive optical image reader |
US20090106558A1 (en) * | 2004-02-05 | 2009-04-23 | David Delgrosso | System and Method for Adding Biometric Functionality to an Application and Controlling and Managing Passwords |
US7301528B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Distinguishing tilt and translation motion components in handheld devices |
US7536304B2 (en) * | 2005-05-27 | 2009-05-19 | Porticus, Inc. | Method and system for bio-metric voice print authentication |
US7809214B2 (en) * | 2005-08-22 | 2010-10-05 | Samsung Electronics Co., Ltd. | Device and a method for identifying movement patterns |
EP1962280A1 (en) | 2006-03-08 | 2008-08-27 | BIOMETRY.com AG | Method and network-based biometric system for biometric authentication of an end user |
JP5852781B2 (en) * | 2007-07-31 | 2016-02-03 | マイクロニクス, インコーポレイテッド | Hygienic swab collection system, microfluidic assay device and method for diagnostic assays |
US8565535B2 (en) | 2007-08-20 | 2013-10-22 | Qualcomm Incorporated | Rejecting out-of-vocabulary words |
CN100580685C (en) | 2008-03-14 | 2010-01-13 | 福建伊时代信息科技股份有限公司 | Path password input method based on contacts |
CN101339489A (en) | 2008-08-14 | 2009-01-07 | 炬才微电子(深圳)有限公司 | Human-computer interaction method, device and system |
CN101408822B (en) | 2008-11-13 | 2012-01-11 | 宇龙计算机通信科技(深圳)有限公司 | Unlocking method, system and mobile terminal of built-in unlocking system |
US8543415B2 (en) * | 2008-11-26 | 2013-09-24 | General Electric Company | Mobile medical device image and series navigation |
CN101408832A (en) | 2008-11-26 | 2009-04-15 | 深圳华为通信技术有限公司 | Keyboard dynamic unlocking method and electronic apparatus |
US8151344B1 (en) * | 2009-01-29 | 2012-04-03 | Intuit Inc. | Method and apparatus to authenticate a user |
US8630088B2 (en) * | 2009-03-27 | 2014-01-14 | Qualcomm Incorporated | Portable docking station for a portable computing device |
EP2239651B1 (en) * | 2009-03-27 | 2017-08-30 | CSEM Centre Suisse d'Electronique et de Microtechnique SA - Recherche et Développement | Smart Label |
US20100243507A1 (en) * | 2009-03-31 | 2010-09-30 | John Gelardi | blister pack secondary package and sleeve |
US8638939B1 (en) * | 2009-08-20 | 2014-01-28 | Apple Inc. | User authentication on an electronic device |
US8436821B1 (en) * | 2009-11-20 | 2013-05-07 | Adobe Systems Incorporated | System and method for developing and classifying touch gestures |
US9292731B2 (en) * | 2009-12-30 | 2016-03-22 | Intel Corporation | Gesture-based signature authentication |
US8923260B2 (en) * | 2011-12-06 | 2014-12-30 | Cisco Technology, Inc. | Mobility in multi-device multi-homed deployments |
US9717655B2 (en) * | 2013-08-13 | 2017-08-01 | Next Paradigm Inc. | Electronic pill box with detachable day module which uses a blister pack |
US20160013523A1 (en) * | 2014-07-08 | 2016-01-14 | Robert Bosch Battery Systems Llc | Apparatus For Electrochemical Cell Temperature Measurement In A Battery Pack |
US10375847B2 (en) * | 2014-10-10 | 2019-08-06 | QuantaEd, LLC | Connected packaging |
US10142822B1 (en) * | 2015-07-25 | 2018-11-27 | Gary M. Zalewski | Wireless coded communication (WCC) devices with power harvesting power sources triggered with incidental mechanical forces |
- 2009
  - 2009-12-30 US US12/655,380 patent/US9292731B2/en active Active
- 2010
  - 2010-12-22 EP EP18191422.7A patent/EP3432211A1/en active Pending
  - 2010-12-22 EP EP10252199.4A patent/EP2341465B1/en active Active
  - 2010-12-22 EP EP16020063.0A patent/EP3082064A1/en not_active Withdrawn
  - 2010-12-24 CN CN201010625006XA patent/CN102117392A/en active Pending
  - 2010-12-24 CN CN201610197325.2A patent/CN105893964A/en active Pending
- 2016
  - 2016-02-19 US US15/048,078 patent/US9560044B2/en active Active
- 2017
  - 2017-01-04 US US15/397,906 patent/US10015166B2/en active Active
- 2018
  - 2018-05-30 US US15/992,404 patent/US10305897B2/en active Active
- 2019
  - 2019-03-06 US US16/293,951 patent/US10681042B2/en active Active
  - 2019-12-17 US US16/716,983 patent/US20200128006A1/en not_active Abandoned
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020163522A1 (en) * | 2001-05-07 | 2002-11-07 | Porter Allen J.C. | Method and apparatus for maintaining secure and nonsecure data in a shared memory system |
US20020188854A1 (en) * | 2001-06-08 | 2002-12-12 | John Heaven | Biometric rights management system |
US20030236947A1 (en) * | 2002-06-24 | 2003-12-25 | Shinya Yamazaki | Prevention of conflicting cache hits without an attendant increase in hardware |
US7173604B2 (en) * | 2004-03-23 | 2007-02-06 | Fujitsu Limited | Gesture identification of controlled devices |
US20070143833A1 (en) * | 2005-12-21 | 2007-06-21 | Conley Kevin M | Voice controlled portable memory storage device |
US20090002316A1 (en) * | 2007-01-31 | 2009-01-01 | Broadcom Corporation | Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith |
US20100250789A1 (en) * | 2009-03-27 | 2010-09-30 | Qualcomm Incorporated | System and method of managing memory at a portable computing device and a portable computing device docking station |
Also Published As
Publication number | Publication date |
---|---|
US9292731B2 (en) | 2016-03-22 |
US9560044B2 (en) | 2017-01-31 |
US10681042B2 (en) | 2020-06-09 |
EP3432211A1 (en) | 2019-01-23 |
US20160173494A1 (en) | 2016-06-16 |
US20190297076A1 (en) | 2019-09-26 |
EP2341465B1 (en) | 2018-01-24 |
US10305897B2 (en) | 2019-05-28 |
EP3082064A1 (en) | 2016-10-19 |
EP2341465A3 (en) | 2012-12-12 |
US20190028469A1 (en) | 2019-01-24 |
EP2341465A2 (en) | 2011-07-06 |
CN102117392A (en) | 2011-07-06 |
US10015166B2 (en) | 2018-07-03 |
US20170187712A1 (en) | 2017-06-29 |
CN105893964A (en) | 2016-08-24 |
US20110156867A1 (en) | 2011-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10681042B2 (en) | Gesture-based signature authentication | |
US20240037045A1 (en) | Apparatuses and methods for securing an access protection scheme | |
US10404695B2 (en) | Portable biometric authentication device and terminal device using near field communication | |
US10169558B2 (en) | Enhancing biometric security of a system | |
US9235748B2 (en) | Dynamic handwriting verification and handwriting-based user authentication | |
Blanco‐Gonzalo et al. | Performance evaluation of handwritten signature recognition in mobile environments | |
US10063541B2 (en) | User authentication method and electronic device performing user authentication | |
US10665319B1 (en) | Memory device testing | |
TW202040385A (en) | System for using device identification to identify via telecommunication server and method thereof | |
WO2012046099A1 (en) | Method, apparatus, and computer program product for implementing sketch-based authentication | |
US20250007726A1 (en) | Key possession based verification in endpoint devices | |
US20230344620A1 (en) | Personal private key encryption device | |
JP5922071B2 (en) | Improving system biometric security | |
TWI704796B (en) | System for using network identification to sign in service server via telecommunication server and method thereof | |
CN117015776A (en) | Resolution processing of biometric data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |