WO2015062382A1 - Method and system for authenticating user of mobile terminal
- Publication number
- WO2015062382A1 (PCT/CN2014/087538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- authentication
- input
- user
- authentication process
- input mode
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3226—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using a predetermined code, e.g. password, passphrase or PIN
- H04L9/3231—Biological data, e.g. fingerprint, voice or retina
Definitions
- the presently disclosed technology relates to the field of electronic technologies and, in particular, to a method and system for authenticating a user's identity.
- authentication of a user of a mobile device is generally directed to a single authentication manner; if a program supports multiple authentication manners, such as password authentication and graph authentication, the user must manually select one of the multiple authentication manners or manually switch between them, which is rather cumbersome.
- an input interface with a defined authentication manner provides a prompt corresponding to the defined authentication manner, which gives a fraudulent or unauthorized user a single focus for cracking the authentication manner. This creates a potential security issue for the mobile device.
- the embodiments of the present disclosure provide methods and systems for authenticating a user of a client device that may address the problems stated in the background section.
- a method of authenticating a user is performed at a server system (e.g., server system 108, Figures 1-2) with one or more processors and memory.
- the method includes detecting a trigger condition and, in response to detecting the trigger condition, performing an authentication process so as to authenticate the user.
- the authentication process includes: dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria; and prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes.
- the authentication process includes: after the prompting, obtaining a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode; and, in response to receiving the first authentication input and the second authentication input, authenticating the user based on the first authentication input and the second authentication input.
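- The claimed flow can be pictured with a short sketch (Python; this is an editorial illustration, not code from the patent, and the function names, the mode list, and the random selection rule are assumptions): two distinct input modes are chosen dynamically, the corresponding inputs are obtained, and both must match previously stored authentication information.

```python
import random

AVAILABLE_MODES = ["password", "sketch", "voice", "fingerprint", "retina", "face"]

def select_input_modes(preferred_modes=None, count=2):
    """Dynamically select `count` distinct input modes (a random draw stands in
    for the patent's 'one or more predetermined criteria')."""
    pool = list(preferred_modes) if preferred_modes else AVAILABLE_MODES
    return random.sample(pool, count)

def matches(candidate, enrolled):
    """Placeholder comparison: exact equality stands in for signature matching."""
    return candidate == enrolled

def authenticate(get_input, enrolled_info, preferred_modes=None):
    """get_input(mode) plays the role of prompting the user and obtaining the
    authentication input for that mode."""
    first_mode, second_mode = select_input_modes(preferred_modes)
    first_input = get_input(first_mode)    # obtained via the first input mode
    second_input = get_input(second_mode)  # obtained via the second input mode
    return (matches(first_input, enrolled_info[first_mode]) and
            matches(second_input, enrolled_info[second_mode]))

# Demo: the "user" always supplies the enrolled value, so authentication passes.
enrolled = {mode: f"enrolled-{mode}" for mode in AVAILABLE_MODES}
print(authenticate(lambda mode: f"enrolled-{mode}", enrolled))  # True
```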
- a computer system includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs include instructions for performing, or controlling performance of, the operations of any of the methods described herein.
- a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a computer system (e.g., server system 108 ( Figures 1-2), client device 104 ( Figures 1 and 3), or a combination thereof) with one or more processors, cause the computer system to perform, or control performance of, the operations of any of the methods described herein.
- FIG. 1 is a block diagram of a server-client environment in accordance with some embodiments.
- FIG. 2 is a block diagram of a server system in accordance with some embodiments.
- Figure 3 is a block diagram of a client device in accordance with some embodiments.
- Figure 4A is a diagram of an implementation of a data structure for an authentication information database in accordance with some embodiments.
- Figure 4B is a diagram of an implementation of a data structure for a user profile in accordance with some embodiments.
- Figures 5A-5C illustrate exemplary user interfaces for authenticating a user in accordance with some embodiments.
- Figure 6 illustrates a flowchart diagram of a method of identity authentication in accordance with some embodiments.
- Figure 7 illustrates a flowchart diagram of a method of identity authentication in accordance with some embodiments.
- Figures 8A-8C illustrate a flowchart diagram of a method of authenticating a user in accordance with some embodiments.
- Figure 9 is a block diagram of a client-side module in accordance with some embodiments.
- server-client environment 100 includes client-side processing 102-1, 102-2 (hereinafter “client-side modules 102” ) executed on a client device 104-1, 104-2, and server-side processing 106 (hereinafter “server-side module 106” ) executed on a server system 108.
- client-side module 102 communicates with server-side module 106 through one or more networks 110.
- Client-side module 102 provides client-side functionalities for the social networking platform (e.g., communications, payment processing, user authentication, etc. ) and communications with server-side module 106.
- Server-side module 106 provides server-side functionalities for the social networking platform (e.g., communications, payment processing, user authentication, etc. ) for any number of client modules 102 each residing on a respective client device 104.
- server-side module 106 includes one or more processors 112, authentication information database 114, profiles database 116, an I/O interface to one or more clients 118, and an I/O interface to one or more external services 120.
- I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server-side module 106.
- One or more processors 112 perform an authentication process so as to authenticate a user of the social networking platform in response to detecting a trigger condition.
- Authentication information database 114 stores authentication information for users of the social networking platform, and profiles database 116 stores a user profile for each user of the social networking platform.
- I/O interface to one or more external services 120 facilitates communications with one or more external services 122 (e.g., merchant websites, credit card companies, and/or other payment processing services).
- client device 104 examples include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA), a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
- Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet.
- One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol.
- Server system 108 is implemented on one or more standalone data processing apparatuses or a distributed network of computers.
- server system 108 also employs various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 108.
- Server-client environment 100 shown in Figure 1 includes both a client-side portion (e.g., client-side module 102) and a server-side portion (e.g., server-side module 106).
- data processing is implemented as a standalone application installed on client device 104.
- client-side module 102 is a thin-client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g., server system 108).
- client-side module 102 performs the authentication process and a backend server (e.g., server system 108) performs other functions of the social networking platform (e.g., communications and payment processing).
- FIG. 2 is a block diagram illustrating server system 108 in accordance with some embodiments.
- Server system 108 typically includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g., including I/O interface to one or more clients 118 and I/O interface to one or more external services 120), memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset).
- Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices.
- Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112.
- Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium.
- memory 206, or the non-transitory computer readable storage medium of memory 206 stores the following programs, modules, and data structures, or a subset or superset thereof:
- operating system 210 including procedures for handling various basic system services and for performing hardware dependent tasks;
- network communication module 212 for connecting server system 108 to other computing devices (e.g., client devices 104 and external service(s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless);
- server-side module 106, which provides server-side data processing for a social networking platform (e.g., communications, payment processing, user authentication, etc.), including, but not limited to:
- biometric information obtaining module 220 for obtaining biometric information associated with users during usage of the social networking platform;
- communications module 222 for managing and routing messages sent between users of the social networking platform;
- payment processing module 224 for processing transactions for a respective user of the social networking platform based on payment data in a user profile in profiles database 116 corresponding to the respective user;
- trigger detection module 226 for detecting a trigger condition as to a respective user (e.g., in response to receiving, from a client device 104, a notification of a user input corresponding to the trigger condition);
- authentication module 228 for performing an authentication process so as to authenticate the respective user of the social networking platform in response to detecting the trigger condition as to the respective user, including, but not limited to:
- environmental information obtaining module 230 for obtaining environmental information corresponding to environmental conditions associated with a client device 104 of the respective user;
- selecting module 232 for dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria;
- prompting module 234 for prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes (e.g., by sending, to a client device 104 associated with the user, a user interface, such as a notification, or a voice message indicating the two or more dynamically selected input modes for presentation via client device 104);
- obtaining module 236 for obtaining a first authentication input via a first input mode and a second authentication input via a second input mode;
- processing module 238 for processing the first authentication input and the second authentication input by, for example, determining a vocal signature for a voice sample of the user, performing facial recognition on an image with the user's face, performing fingerprint recognition on an image with the user's fingerprint, or performing retinal recognition on an image with the user's retina, and the like;
- scoring module 240 for determining a first authentication score by comparing the first authentication input against previously stored authentication information corresponding to the first input mode and a second authentication score by comparing the second authentication input against previously stored authentication information corresponding to the second input mode;
- determining module 242 for determining whether the first authentication score satisfies a first authentication threshold corresponding to the first input mode and whether the second authentication score satisfies a second authentication threshold corresponding to the second input mode;
- adjusting module 244 for adjusting the first authentication threshold and the second authentication threshold based on the environmental information obtained by environmental information obtaining module 230;
- authenticating module 246 for authenticating or denying authentication of the respective user based on the results from determining module 242;
- server data 250 storing data for the social networking platform, including but not limited to:
- authentication information database 114 storing authentication information for users of the social networking platform;
- profiles database 116 storing user profiles for users of the social networking platform, where a respective user profile for a user includes a user identifier (e.g., an account name or handle), login credentials to the social networking platform, payment data (e.g., linked credit card information, app credit or gift card balance, billing address, shipping address, etc.), authentication preferences, run-time biometric information, custom parameters for the user (e.g., age, location, hobbies, etc.), and identified trends and/or likes/dislikes of the user.
- Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
- the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
- memory 206, optionally, stores a subset of the modules and data structures identified above.
- memory 206, optionally, stores additional modules and data structures not described above.
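- As a rough illustration of how scoring module 240, determining module 242, and adjusting module 244 described above could interact, the following sketch (Python; the threshold values, the adjustment rules, and the class and function names are editorial assumptions, not taken from the patent) compares per-mode authentication scores against thresholds that are relaxed under adverse environmental conditions.

```python
from dataclasses import dataclass

@dataclass
class Environment:
    ambient_noise_db: float   # e.g., from a microphone reading
    ambient_light_lux: float  # e.g., from an ambient light sensor

BASE_THRESHOLDS = {"voice": 0.80, "face": 0.85, "fingerprint": 0.90}

def adjusted_threshold(mode, env):
    """Relax a mode's threshold when the environment degrades that mode's input
    quality (the specific rules and numbers are illustrative only)."""
    threshold = BASE_THRESHOLDS[mode]
    if mode == "voice" and env.ambient_noise_db > 70:
        threshold -= 0.05  # noisy surroundings: accept a lower vocal-match score
    if mode == "face" and env.ambient_light_lux < 10:
        threshold -= 0.05  # dim surroundings: accept a lower facial-match score
    return threshold

def authenticate(scores, env):
    """scores: mode -> authentication score already produced by a processing and
    scoring step; the user is authenticated only if every score satisfies the
    (possibly adjusted) threshold for its mode."""
    return all(score >= adjusted_threshold(mode, env) for mode, score in scores.items())

print(authenticate({"voice": 0.78, "fingerprint": 0.93},
                   Environment(ambient_noise_db=75.0, ambient_light_lux=300.0)))  # True
```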
- FIG. 3 is a block diagram illustrating a representative client device 104 associated with a user in accordance with some embodiments.
- Client device 104 typically includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset).
- Client device 104 also includes a user interface 310.
- User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays.
- User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a camera, a gesture capturing camera, or other input buttons or controls.
- client devices 104 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard.
- Client device 104 further includes sensors 315, which provide information as to the environmental conditions associated with client device 104.
- Sensors 315 include but are not limited to one or more microphones, one or more cameras, an ambient light sensor, one or more accelerometers, one or more gyroscopes, a GPS positioning system, a Bluetooth or BLE system, a temperature sensor, one or more motion sensors, one or more biological sensors (e.g., a galvanic skin resistance sensor, a pulse oximeter, and the like), and other sensors.
- Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices.
- Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302.
- Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium.
- memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
- operating system 316 including procedures for handling various basic system services and for performing hardware dependent tasks;
- network communication module 318 for connecting client device 104 to other computing devices (e.g., server system 108 and external service(s) 122) connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless);
- presentation module 320 for enabling presentation of information (e.g., a user interface for a social networking platform, widget, websites or web pages thereof, game, and/or application, audio and/or video content, text, etc.) at client device 104 via one or more output devices 312 (e.g., displays, speakers, etc.) associated with user interface 310;
- input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
- web browser module 324 for navigating, requesting (e.g., via HTTP), and displaying websites and web pages thereof;
- applications 326-1–326-N for execution by client device 104 (e.g., games, application marketplaces, payment platforms, and/or other applications).
- client-side module 102, which provides client-side data processing and functionalities for the social networking platform, including, but not limited to:
- biometric obtaining module 332 for obtaining biometric information during usage of the social networking platform by a user of client device 104;
- environmental information obtaining module 334 for obtaining environmental information from sensors 315 corresponding to environmental conditions associated with client device 104;
- communication system 336 for sending messages to and receiving messages from other users of the social networking platform (e.g., instant messaging, group chat, message board, message/news feed, and the like);
- payment processing 338 for processing payments associated with transactions initiated within the social networking platform or at a merchant's website within web browser module 324;
- trigger detection module 340 for detecting a trigger condition as to a user of client device 104 and sending a notification of the trigger condition to server system 108;
- (optionally) authenticating module 342 for performing an authentication process so as to authenticate the user in response to detecting the trigger condition as to the user;
- client data 350 storing data associated with the social networking platform, including, but not limited to:
- user profile 352 storing a user profile associated with the user of client device 104, including a user identifier (e.g., an account name or handle), login credentials to the social networking platform, payment data (e.g., linked credit card information, app credit or gift card balance, billing address, shipping address, etc.), authentication preferences, run-time biometric information, custom parameters for the user (e.g., age, location, hobbies, etc.), and identified trends and/or likes/dislikes of the user;
- authentication information 354 storing authentication information for the user of client device 104.
- user data 356 storing data authored, saved, liked, or chosen as favorites by the user of client device 104 in the social networking platform.
- Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
- the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
- memory 306, optionally, stores a subset of the modules and data structures identified above.
- memory 306, optionally, stores additional modules and data structures not described above.
- At least some of the functions of server system 108 are performed by client device 104, and the corresponding sub-modules of these functions may be located within client device 104 rather than server system 108.
- the environmental information obtaining module 230, the selecting module 232, the prompting module 234, the authentication input obtaining module 236, and the processing module 238 may be implemented at least in part on the client device 104.
- at least some of the functions of client device 104 are performed by server system 108, and the corresponding sub-modules of these functions may be located within server system 108 rather than client device 104.
- Client device 104 and server system 108 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
- Figure 4A is a diagram of an implementation of a data structure representing authentication information database 114 in accordance with some embodiments.
- authentication information database 114 is stored and managed by server system 108 or a component thereof (e.g., server-side module 106, Figures 1-2).
- authentication information database 114 is stored remote from server system 108 but is managed by server system 108.
- authentication information database 114 includes a plurality of entries for facilitating the authentication of users of the social networking platform.
- server system 108 operates and manages the social networking platform.
- a respective entry corresponds to a user of the social networking platform, and the respective entry includes a user identifier (ID) 402 corresponding to the user and authentication information 404 corresponding to the user.
- ID 402-A is a unique number generated by server system 108 for a respective user of the social networking platform or a user name, handle, account name, number or sequence of characters created by the respective user of the social networking platform.
- authentication information 404-A includes a set of authentication information entered by the respective user during initialization or setup of an account for the social networking platform.
- authentication information 404 includes authentication information for a plurality of distinct input modes (e.g., at least two input modes) (sometimes also called herein "input manners," "authentication manners," or "authentication input manners").
- authentication information 404-N is, for example, associated with a password 412, a signature 414, a sketch password 416 (e.g., a sketch pattern or drawing), vocal information 418 (e.g., a voice signature extracted from a voice sample), fingerprint information 420 (e.g., a fingerprint signature extracted from a fingerprint image or scan), retinal information 422 (e.g., a retinal signature extracted from a retinal image or scan), facial information 424 (e.g., a facial signature extracted from a headshot or other image), and other authentication information for the user of the social networking platform that corresponds to user ID 402-N.
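- One possible in-memory representation of an entry in authentication information database 114 is sketched below (hypothetical Python; the class and field names mirror the description of Figure 4A but are not a schema defined in the patent).

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class AuthenticationInfo:
    password: Optional[str] = None
    signature: Optional[bytes] = None
    sketch_password: Optional[bytes] = None      # sketch pattern or drawing
    vocal_signature: Optional[bytes] = None      # extracted from a voice sample
    fingerprint_signature: Optional[bytes] = None
    retinal_signature: Optional[bytes] = None
    facial_signature: Optional[bytes] = None

# authentication_information_database: user identifier 402 -> authentication info 404
authentication_information_database: Dict[str, AuthenticationInfo] = {
    "user-402-N": AuthenticationInfo(
        password="correct horse battery staple",
        vocal_signature=b"\x01\x02\x03",
    ),
}
```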
- Figure 4B is a diagram of an implementation of a data structure representing user profile 352 in accordance with some embodiments.
- user profile 352 corresponds to a user of the social networking platform.
- user profile 352 is stored in profiles database 116 by server system 108 and/or in client data 350 by a client device 104 associated with the user.
- user profile 352 includes the following fields: user identifier (ID) field 452; login credentials field 454; payment information field 456; authentication preferences field 458; biometric information field 460; and other user information field 462.
- user identifier (ID) field 452 includes a unique number generated by server system 108 for the user of the social networking platform or a user name, handle, account name, number or sequence of characters generated by the user of the social networking platform.
- a respective user identifier 402 in authentication information database 114 matches user identifier field 452 in a user profile for the respective user.
- login credentials field 454 includes a user name and password entered by the user during initialization or setup of an account for the social networking platform. For example, when logging into the social networking platform, a user corresponding to user profile 352 enters login credentials, and, subsequently, server system 108 matches the entered login credentials against login credentials field 454 so as to verify and login the user.
- payment information field 456 includes payment data entered by the user during initialization or setup of an account for the social networking platform by the user.
- payment information field 456 includes credit card information, app credit or gift card balance, billing address, shipping address, and the like for the user.
- authentication preferences field 458 includes authentication preferences entered by the user during initialization or setup of an account for the social networking platform. For example, the user specifies that he/she wishes to use any two of vocal information, fingerprint information, and retinal information for authenticating the user during an authentication process.
- biometric information field 460 includes biometric information obtained by client-side module 102 during usage of the social networking platform by the user.
- client-side module 102 initiates collection of biometric information when the user starts a process that may ultimately require the authentication process such as starting an online shopping process, starting a device reconfiguration process, or starting a payment process.
- other user information field 462 includes other information corresponding to the user such as custom parameters for the user (e.g., age, location, hobbies, etc.) and identified trends and/or likes/dislikes of the user.
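- A corresponding sketch of user profile 352 (hypothetical Python; the field names track fields 452-462 of Figure 4B, but the types and example values are editorial assumptions):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserProfile:
    user_id: str                            # field 452
    login_credentials: Dict[str, str]       # field 454: user name and password
    payment_info: Dict[str, str]            # field 456: card, billing/shipping address
    authentication_preferences: List[str]   # field 458: preferred input modes
    biometric_info: Dict[str, bytes] = field(default_factory=dict)  # field 460
    other_info: Dict[str, str] = field(default_factory=dict)        # field 462

profile = UserProfile(
    user_id="user-402-N",
    login_credentials={"username": "alice", "password_hash": "<hash>"},
    payment_info={"billing_address": "1 Example Way"},
    authentication_preferences=["voice", "fingerprint", "retina"],
)
```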
- Figures 5A-5C illustrate exemplary user interfaces for authenticating a user in accordance with some embodiments.
- the device detects inputs on a touch-sensitive surface that is separate from the display.
- the touch sensitive surface has a primary axis that corresponds to a primary axis on the display.
- the device detects contacts with the touch-sensitive surface at locations that correspond to respective locations on the display. In this way, user inputs detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display of the device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
- while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based, stylus-based, or physical button-based input).
- a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact).
- a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact) or depression of a physical button.
- similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
- Figures 5A-5C show interface 508 for a social networking platform or other application displayed on client device 104 (e.g., a mobile phone); however, one skilled in the art will appreciate that the user interfaces shown in Figures 5A-5C may be implemented on other similar computing devices.
- the user interfaces in Figures 5A-5C are used to illustrate the processes described herein, including the process described with respect to Figures 8A-8C.
- Figure 5A illustrates client device 104 executing a web browser (e.g., web browser module 324, Figure 3).
- the web browser is displaying a “Checkout” web page for merchant website 510 on touch screen 506.
- the web browser includes a current address bar showing merchant website 510 as the current web address displayed by the web browser, refresh affordance 512 for reloading the current web page, back navigation affordance 514-A for displaying the last web page, and forward navigation affordance 514-B for displaying the next web page.
- the “Checkout” web page for merchant website 510 includes item 516 (e.g., Board Game A), which the user of client device 104 intends to purchase from merchant website 510.
- item 516 e.g., Board Game A
- the “Checkout” web page also includes a description of item 516, the cost of item 516, and the current quantity of item 516 selected for purchase.
- the “Checkout” web page further includes the subtotal for item 516, tax associated with item 516, shipping cost for item 516, and the total amount for the purchase of item 516 from merchant website 510.
- the “Checkout” web page further includes a plurality of payment options for completing the purchase of item 516.
- the payment options include affordance 518-A for payment option A (e.g., merchant credit or a gift card associated with merchant website 510), affordance 518-B for payment option B (e.g., an alternative payment application or a credit card previously recognized by merchant website 510), and affordance 518-C for payment processing associated with a social networking platform.
- Figure 5A also illustrates client device 104 detecting contact 520 (e.g., a tap gesture) at a location corresponding to affordance 518-C. In response to selection of affordance 518-C, payment for the purchase of item 516 from merchant website 510 will be conducted by payment processing associated with a social networking platform.
- client-side module 102 sends a notification, to server system 108, indicating selection of affordance 518-C for payment processing associated with the social networking platform.
- server system 108 detects a trigger condition in response to receiving the notification.
- server system 108 performs an authentication process so as to authenticate the user of client device 104.
- server system 108 sends a user interface (e.g., a notification) or a voice message to client device 104 for presentation to the user.
- the user interface or voice message prompts the user to provide two or more authentication inputs via the two or more distinct input modes.
- alternatively, in response to the selection of affordance 518-C via contact 520 in Figure 5A, client-side module 102 detects a trigger condition. Continuing with this example, in response to detecting the trigger condition, client-side module 102 performs an authentication process so as to authenticate the user of client device 104. As part of the authentication process, client-side module 102 causes client device 104 to present a user interface (e.g., a notification) or voice message to the user. For example, the user interface or voice message prompts the user to provide two or more authentication inputs via two or more distinct input modes.
- Figure 5B illustrates client device 104 displaying notification 522 on touch screen 506 in response to selection of affordance 518-C in Figure 5A.
- notification 522 prompts the user of client device 104 to provide authentication inputs for the authentication process without specifying the input modes.
- Figure 5C illustrates client device 104 displaying notification 524 on touch screen 506 in response to selection of affordance 518-C in Figure 5A.
- notification 524 prompts the user of client device 104 to provide authentication information for the authentication process via a first input mode corresponding to vocal recognition (e.g., by providing a voice sample) and a second input mode corresponding to facial recognition (e.g., by capturing a headshot with a camera associated with client device 104).
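- The difference between notification 522 and notification 524 can be sketched as two variants of the same prompt message (hypothetical Python; the JSON payload format and field names are editorial assumptions, since the patent does not specify a wire format):

```python
import json

def build_prompt(selected_modes, reveal_modes):
    """Build the prompt the server might send to the client after dynamically
    selecting the input modes."""
    if reveal_modes:
        # Like notification 524: name the selected input modes explicitly.
        text = "Please authenticate via: " + " and ".join(selected_modes) + "."
        payload = {"type": "auth_prompt", "text": text, "modes": selected_modes}
    else:
        # Like notification 522: request authentication inputs without
        # disclosing which input modes were selected.
        payload = {"type": "auth_prompt",
                   "text": "Please provide your authentication inputs.",
                   "modes": None}
    return json.dumps(payload)

print(build_prompt(["voice", "face"], reveal_modes=False))
print(build_prompt(["voice", "face"], reveal_modes=True))
```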
- Figure 6 illustrates a flowchart diagram of a method 600 of identity authentication in accordance with some embodiments.
- method 600 is performed by a mobile terminal (e.g., client device 104, Figures 1 and 3) with one or more processors and memory.
- client device 104 Figures 1 and 3
- method 600 is performed by client device 104 ( Figures 1 and 3) or a component thereof (e.g., client-side module 102, Figures 1 and 3).
- method 600 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the mobile terminal.
- the mobile terminal is a mobile device such as a smart phone (e.g., an Android™ or iOS™ mobile phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device.
- the mobile terminal displays (602) an identity authentication interface.
- an identity authentication program of the mobile terminal detects that it is required to perform identity authentication for a user when the user performs or intends to perform an online transaction (e.g., for goods and/or services), when the mobile terminal is powered on, when the mobile terminal is unlocked from a standby mode, or when the mobile terminal executes a secured application program (e.g., a payment processing service or application).
- the identity authentication interface corresponds to notification 522 in Figure 5B, where notification 522 only provides to the user of client device 104 (e.g., the mobile terminal) a prompt to input authentication information, but does not provide a prompt of the authentication manner (or input mode).
- the identity authentication interface increases the difficulty of cracking the authentication process because a fraudulent or unauthorized user cannot focus on a single authentication manner.
- in some embodiments, the identity authentication interface (e.g., notification 524 in Figure 5C) instead provides a prompt indicating the at least two selected authentication manners (or input modes).
- the mobile terminal determines (604) whether authentication information input by the user is obtained by using any one of at least two selected authentication information input manners (or modes). In some embodiments, the mobile terminal selects at least two distinct user input manners (or input modes). For example, the at least two distinct user input manners are selected at random or based on authentication preferences associated with a user profile of the user of the mobile terminal. After the identity authentication interface is displayed in operation 602, at least two user input sensors corresponding to the at least two distinct user input manners are triggered (i.e., started) so as to obtain authentication information input by the user.
- the at least two user input sensors include any two of: a fingerprint sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, a keyboard input sensor, and the like.
- the at least two user input sensors are triggered for a preset time window after the identity authentication interface is displayed.
- the user must enter the authentication information within the preset time window or a time expiration prompt is presented to the user and the at least two user input sensors are turned off.
- the preset time window is 30 seconds, 1 minute, or the like.
- in accordance with a determination that authentication information input by the user is obtained by using any one of the at least two selected authentication information input manners, method 600 continues to operation 606. In accordance with a determination that the authentication information input is not obtained by using any one of the at least two selected authentication information input manners, method 600 returns to operation 604.
- the mobile terminal compares (606) the authentication information with selected authentication information corresponding to the authentication information input manner.
- authentication information for each authentication information input manner is collected for a user and stored by the mobile device in advance.
- for example, authentication information corresponding to a respective manner (e.g., by using a fingerprint collection sensor) is input by the user through a corresponding input sensor of the mobile terminal (or an external device in communication with the mobile terminal) in advance.
- when the mobile terminal detects that the authentication information input by the user is obtained through a certain authentication information input manner, the obtained authentication information is compared with the previously stored authentication information corresponding to that authentication information input manner. For example, the comparison is performed to determine whether the similarity between obtained fingerprint or voiceprint information and previously collected and stored fingerprint or voiceprint information satisfies a predetermined comparison threshold.
- the mobile terminal confirms (608) the identity of the user of the mobile terminal according to a comparison result.
- the mobile terminal confirms the identity of the user of the mobile terminal. For example, after the identity authentication interface is displayed, the mobile terminal starts a fingerprint collection sensor, a voiceprint sensor (e.g., one or more microphones), and a touch screen input sensor to detect the authentication information input by the user. Continuing with this example, as long as effective authentication information is obtained through any of the authentication information input manners and the comparison of authentication information is successful, the mobile terminal confirms the identity of the current user of the mobile terminal.
- alternatively, in some embodiments, the mobile terminal confirms the identity of the current user of the mobile terminal only after successful comparisons of authentication information obtained using two or more authentication information input manners.
- in some embodiments, the authentication information is considered effective when the fingerprint information collected by the fingerprint collection sensor reaches a preset threshold or the voice information collected by the voiceprint sensor reaches a preset threshold.
- if the similarity does not satisfy the predetermined comparison threshold, the mobile terminal determines that the comparison fails and, in some circumstances, the mobile terminal presents a notification of the comparison failure to the user.
- the mobile terminal stores one or more pieces of authentication information for each user identity.
- when the comparison of the obtained authentication information against a piece of previously stored authentication information is successful, the mobile terminal confirms that the current user has the identity corresponding to the previously stored authentication information.
- if the mobile terminal determines that the current user is an authorized user, the authentication result is displayed in the authentication interface. In some circumstances, if the comparison of authentication information is successful, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application. In some circumstances, if the comparison of authentication information is successful, the mobile terminal executes the online transaction. For example, the mobile terminal executes the online transaction by sending transaction information to a merchant's server according to the determined user identity. In this example, the transaction information includes an item, an amount of money, and other account information corresponding to the determined user identity for the online transaction.
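- Method 600 can be summarized as "start several sensors and accept whichever selected manner yields a matching input within a preset time window." A hypothetical Python sketch follows (the sensor reads, the similarity measure, and the single-failure shortcut are simplifying assumptions, not the patent's implementation):

```python
import time

PRESET_WINDOW_SECONDS = 30
COMPARISON_THRESHOLD = 0.9

def similarity(sample, enrolled):
    """Placeholder similarity measure (a real system compares fingerprint,
    voiceprint, or other signatures)."""
    return 1.0 if sample == enrolled else 0.0

def authenticate_any(active_sensors, enrolled):
    """active_sensors: manner -> zero-argument read function returning a sample
    or None; identity is confirmed as soon as any one manner's input matches."""
    deadline = time.monotonic() + PRESET_WINDOW_SECONDS
    while time.monotonic() < deadline:
        for manner, read in active_sensors.items():
            sample = read()
            if sample is None:
                continue                        # operation 604: keep waiting
            # Operation 606: compare against stored info for that manner only.
            if similarity(sample, enrolled[manner]) >= COMPARISON_THRESHOLD:
                return True                     # operation 608: identity confirmed
            return False                        # comparison failed (simplified)
    return False                                # time window expired; sensors stop

sensors = {"fingerprint": lambda: "enrolled-print", "voice": lambda: None}
enrolled = {"fingerprint": "enrolled-print", "voice": "enrolled-voice"}
print(authenticate_any(sensors, enrolled))      # True
```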
- Figure 7 illustrates a flowchart diagram of a method 700 of identity authentication in accordance with some embodiments.
- method 700 is performed by a mobile terminal (e.g., client device 104, Figures 1 and 3) with one or more processors and memory.
- client device 104 Figures 1 and 3
- method 700 is performed by client device 104 ( Figures 1 and 3) or a component thereof (e.g., client-side module 102, Figures 1 and 3).
- method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the mobile terminal.
- the mobile terminal is a mobile device such as a smart phone (e.g., an Android™ or iOS™ mobile phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device.
- the mobile terminal displays (702) an identity authentication interface and starts at least two user input sensors of a mobile terminal.
- an identity authentication program of the mobile terminal detects that it is required to perform identity authentication for a user when the user performs or intends to perform an online transaction (e.g., for goods and/or services), when the mobile terminal is powered on, when the mobile terminal is unlocked from a standby mode, or when the mobile terminal executes a secured application program (e.g., a payment processing service or application).
- the identity authentication interface corresponds to notification 522 in Figure 5B, where notification 522 only provides to the user of client device 104 (e.g., the mobile terminal) a prompt to input authentication information via two or more authentication manners (or input modes), but does not provide a prompt indicating the authentication manners (or input modes).
- in some embodiments, the identity authentication interface (e.g., notification 524 in Figure 5C) instead provides a prompt indicating the authentication manners (or input modes) via which the user is to provide authentication information.
- input sensors corresponding to the authentication information input manners (or input modes) include a fingerprint collection sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, and a keyboard input sensor.
- the mobile terminal determines (704) whether a first user input sensor obtains authentication information input by the user. In accordance with a determination that the mobile terminal obtains authentication information via the first user input sensor, method 700 continues to operation 706. In accordance with a determination that the mobile terminal does not obtain authentication information via the first user input sensor, method 700 returns to operation 704.
- the mobile terminal compares (706) the authentication information obtained by the first user input sensor with preset authentication information corresponding to the first user input sensor.
- the first user input sensor in this embodiment is a fingerprint collection sensor.
- the fingerprint information is compared against preset (or previously stored) fingerprint information for the user of the mobile terminal. In this example, if the similarity between the obtained fingerprint information and preset fingerprint information reaches a predetermined comparison threshold for the fingerprint collection sensor, the mobile terminal determines that the comparison is successful.
- the mobile terminal determines (708) whether a second user input sensor obtains authentication information input by the user. In accordance with a determination that the mobile terminal obtains authentication information via the second user input sensor, method 700 continues to operation 710. In accordance with a determination that the mobile terminal does not obtain authentication information via the second user input sensor, method 700 returns to operation 708.
- the mobile terminal compares (710) the authentication information obtained by the second user input sensor with preset authentication information corresponding to the second user input sensor.
- the second user input sensor in this embodiment is a voiceprint collection sensor.
- the voiceprint information is compared against preset (or previously stored) voiceprint information for the user of the mobile terminal. In this example, if the similarity between the obtained voiceprint information and preset voiceprint information reaches a predetermined comparison threshold for the voiceprint collection sensor, the mobile terminal determines that the comparison is successful.
- operations 704-706 and operations 708-710 should not be understood as having a strict sequence.
- operations 704-706 are performed first and then operations 708-710 are performed.
- the user inputs authentication information by using the first user input sensor, and, after the comparison is successful, the user inputs authentication information by using the second user input sensor.
- operations 708-710 are performed first and then operations 704-706 are performed.
- the user inputs authentication information by using the second user input sensor, and, after the comparison is successful, the user inputs authentication information by using the first user input sensor.
- operations 704-706 and operations 708-710 are performed in parallel.
- the mobile terminal determines (712) whether comparison of the authentication information obtained by the first user input sensor and the authentication information obtained by the second input sensor is successful. For example, the comparison is performed to see (A) whether the first similarity between fingerprint information obtained by the fingerprint sensor (i.e., the first user input sensor) and previously collected and stored fingerprint information for the user satisfies a predetermined comparison threshold for the fingerprint sensor and (B) whether the second similarity between voiceprint information obtained by the voice sensor (i.e., the second user input sensor) and previously collected and stored voiceprint information for the user satisfies a predetermined comparison threshold for the voice sensor.
- authentication information for each authentication information input manner is collected for a user and stored by the mobile device in advance. For example, authentication information corresponding to a respective manner (e.g., by using a fingerprint collection sensor) is input by the user through a corresponding input sensor of the mobile terminal (or an external device in communication with the mobile terminal) in advance.
- when the mobile terminal detects that the authentication information input by the user is obtained through a certain authentication information input manner, the obtained authentication information is compared with the previously stored authentication information corresponding to that authentication information input manner.
- the mobile terminal compares the authentication information from the first input sensor to previously stored authentication information for the first input sensor and authentication information from the second input sensor to previously stored authentication information for the second input sensor. In accordance with a determination that the comparison is successful, method 700 continues to operation 714. In accordance with a determination that the comparison is not successful, method 700 continues to operation 716, where the mobile terminal prompts the user to provide the two or more inputs again and repeats operation 702.
- the mobile terminal confirms (714) the identity of the user of the mobile terminal according to a comparison result. In some embodiments, in accordance with a determination that the comparison result in operation 712 satisfies the predetermined comparison threshold, the mobile terminal confirms the identity of the user of the mobile terminal.
- if the mobile terminal determines that the current user is an authorized user, the authentication result is displayed in the authentication interface. In some circumstances, if the comparison of authentication information is successful, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application. In some circumstances, if the comparison of authentication information is successful, the mobile terminal executes the online transaction. For example, the mobile terminal executes the online transaction by sending transaction information to a merchant's server according to the determined user identity. In this example, the transaction information includes an item, an amount of money, and other account information corresponding to the determined user identity for the online transaction.
- this embodiment is an example of the mobile terminal determining user identity by obtaining authentication information through two user input sensors; in other optional embodiments, the mobile terminal may determine user identity by obtaining authentication information through three or more user input sensors.
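- In contrast to method 600, method 700 requires every selected sensor's input to pass its own comparison. A hypothetical Python sketch of operations 704-716 follows (the threshold values and function names are editorial assumptions):

```python
FINGERPRINT_THRESHOLD = 0.90   # predetermined comparison threshold, first sensor
VOICEPRINT_THRESHOLD = 0.80    # predetermined comparison threshold, second sensor

def comparison_succeeds(sample, preset, threshold):
    """Placeholder: exact equality stands in for a real similarity score."""
    score = 1.0 if sample == preset else 0.0
    return score >= threshold

def authenticate_both(fingerprint_sample, voiceprint_sample, preset):
    # Operations 704-706 and 708-710 may run in either order or in parallel;
    # operation 712 requires both comparisons to succeed.
    fingerprint_ok = comparison_succeeds(fingerprint_sample, preset["fingerprint"],
                                         FINGERPRINT_THRESHOLD)
    voiceprint_ok = comparison_succeeds(voiceprint_sample, preset["voiceprint"],
                                        VOICEPRINT_THRESHOLD)
    if fingerprint_ok and voiceprint_ok:
        return "identity confirmed"             # operation 714
    return "prompt the user to try again"       # operation 716

preset = {"fingerprint": "enrolled-print", "voiceprint": "enrolled-voice"}
print(authenticate_both("enrolled-print", "enrolled-voice", preset))
```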
- Figures 8A-8C illustrate a flowchart diagram of a method 800 of authenticating a user in accordance with some embodiments.
- method 800 is performed by a server with one or more processors and memory.
- method 800 is performed by server system 108 ( Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2).
- method 800 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the server system. Optional operations are indicated by dashed lines (e.g., boxes with dashed-line borders).
- prior to detecting the trigger condition, the server obtains (802) biometric information corresponding to the user.
- server system 108 or a component thereof (e.g., biometric information obtaining module 220, Figure 2) obtains, from a client device 104, biometric information associated with a user of client device 104 during usage of the social networking platform.
- client device 104 or a component thereof (e.g., biometric obtaining module 332, Figure 3) collects biometric information during usage of the social networking platform by a user of client device 104.
- the collected biometric information includes retinal, facial, vocal, and fingerprint information.
- the background collection of biometric information by biometric obtaining module 332 may be disabled by the user of client device 104.
- the collecting of biometric information is initiated when the user starts a process that may ultimately require the authentication process (e.g., an online shopping process), starts a device reconfiguration process, starts a payment process, and the like.
- the collection of biometric information can be performed periodically during the shopping/configuration/payment process.
- biometric obtaining module 332 turns on the retina scanner, front facing camera, fingerprint sensor, or microphone (s) for a brief moment without any indication to the user.
- the server detects (804) a trigger condition.
- server system 108 or a component thereof detects a trigger condition as to a respective user of the social networking platform in response to receiving, from a client device 104, a notification of a user input corresponding to the trigger condition.
- the notification indicates that a user input detected by client device 104 or a component thereof (e.g., trigger detection module 340, Figure 3) corresponds to initiation of an online payment, initiation of an online transaction, reactivation of client device 104 when locked, or powering on of client device 104 when off.
- detection of the trigger condition by trigger detection module 226 is preceded by the user logging into client device 104 or an application (e.g., client-side module 102, which is associated with the social networking platform or another application) with login credentials.
- in response to the selection of affordance 518-C via contact 520, client-side module 102 sends a notification, to server system 108, indicating selection of affordance 518-C for payment processing associated with the social networking platform.
- server system 108 detects a trigger condition in response to receiving the notification.
- server system 108 performs an authentication process so as to authenticate the user of client device 104 prior to processing the online transaction for item 516 from online merchant 510.
- in response to detecting the trigger condition, the server performs (806) an authentication process so as to authenticate a user.
- server system 108 or a component thereof (e.g., authentication module 228, Figure 2) performs an authentication process so as to authenticate the respective user in order to perform a secured operation such as payment processing for an online transaction, unlocking of client device 104, or access to a secured application.
- the authentication process is associated with an application into which the user previously logged in prior to detection of the trigger condition.
- the server dynamically selects (808) two or more distinct input modes for the authentication process based on one or more predetermined criteria.
- the two or more dynamically selected input modes are a subset of all possible input modes for user authentication available to the user device.
- the dynamic selection results in different input modes being selected depending on the actual conditions of the present authentication process, and/or some artificially introduced randomness in the selection process.
- server system 108 or a component thereof (e.g., selecting module 232, Figure 2) dynamically selects two or more distinct input modes for the authentication process of the respective user based on one or more predetermined criteria.
- the one or more predetermined criteria include environmental conditions associated with client device 104 of the respective user and/or authentication preferences specified by the respective user.
- the authentication preferences are stored in a user profile for the respective user (e.g., authentication preferences field 458 in user profile 352 in Figure 4B).
- the two or more dynamically selected input modes for the authentication process differ (810) from two or more dynamically selected input modes for a previous authentication process as a result of a pseudo-random selection procedure used for the dynamic selections.
- selecting module 232 dynamically selects the two or more input modes based on a random or pseudo-random selection procedure. As such, the two or more dynamically selected input modes for the current authentication process are different from the two or more dynamically selected input modes for a previous authentication process, which were also dynamically selected based on the random or pseudo-random selection procedure.
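As a rough illustration of the pseudo-random selection described above, the sketch below (an assumption, not the patent's implementation) draws two distinct modes from a candidate list, so consecutive authentication processes will often receive different mode pairs.

```python
# Minimal sketch of a pseudo-random selection of two distinct input modes; the
# candidate list is an illustrative assumption.
import random

ALL_INPUT_MODES = [
    "password", "sketch_password", "vocal_recognition",
    "fingerprint_recognition", "retinal_recognition", "facial_recognition",
]

def pick_two_modes(rng: random.Random) -> list:
    # rng.sample never returns the same element twice, so the two modes are distinct
    return rng.sample(ALL_INPUT_MODES, k=2)

rng = random.Random()
previous = pick_two_modes(rng)  # modes used for a previous authentication process
current = pick_two_modes(rng)   # modes for the current process; usually a different pair
```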
- the two or more dynamically selected input modes for the authentication process differ (812) from two or more dynamically selected input modes for a previous authentication process as a result of differing environmental conditions existing at the time of the dynamic selections.
- server system 108 or a component thereof (e.g., environmental information obtaining module 230, Figure 2) obtains environmental information corresponding to environmental conditions associated with a client device 104 of the respective user.
- the environmental information indicates: an absolute position of client device 104 (e.g., an address or latitudinal and longitudinal coordinates) based on a GPS positioning system of client device 104; an ambient temperature sensed by a temperature sensor of client device 104; a level of ambient sound detected by one or more microphones of client device 104; an ambient light level sensed by an ambient light sensor of client device 104; a velocity, acceleration, or change thereof based on one or more accelerometers of client device 104; and other information indicating environmental conditions associated with client device 104.
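For reference, a hypothetical record of the environmental information listed above might look like the following sketch; the field names and units are assumptions rather than the patent's data format.

```python
# Minimal sketch of an environmental information record; field names and units
# are assumptions.
from dataclasses import dataclass

@dataclass
class EnvironmentalInfo:
    latitude: float               # absolute position from the GPS positioning system
    longitude: float
    ambient_temperature_c: float  # from a temperature sensor
    ambient_sound_db: float       # from one or more microphones
    ambient_light_lux: float      # from an ambient light sensor
    acceleration_m_s2: float      # from one or more accelerometers

env = EnvironmentalInfo(latitude=37.4221, longitude=-122.0841,
                        ambient_temperature_c=22.5, ambient_sound_db=35.0,
                        ambient_light_lux=400.0, acceleration_m_s2=0.1)
```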
- selecting module 232 dynamically selects the two or more input modes based on environmental information obtained by environmental information obtaining module 230 from client device 104. As such, the two or more dynamically selected input modes for the current authentication process are different from the two or more dynamically selected input modes for a previous authentication process because environmental conditions associated with client device 104 have changed between the previous authentication process and the current authentication process.
- the one or more predetermined criteria include (814) environmental conditions associated with a client device corresponding to the user.
- server system 108 or a component thereof (e.g., environmental information obtaining module 230, Figure 2) obtains environmental information corresponding to environmental conditions associated with a client device 104 of the respective user.
- selecting module 232 dynamically selects the two or more input modes based on environmental information obtained by environmental information obtaining module 230 from client device 104.
- when the environmental information indicates that client device 104 is in motion, selecting module 232 dynamically selects input modes corresponding to vocal recognition and a spoken password because the user may only have the ability to talk to, but not touch, client device 104.
- the user is in a vehicle that is in motion such as a car, plane, bus, or train.
- the user is performing locomotion such as walking, cycling, etc.
- when the environmental information indicates that client device 104 is located in peak conditions (e.g., a user’s office or home with ample lighting, minimal ambient sound, and minimal movement), selecting module 232 dynamically selects input modes corresponding to facial recognition and fingerprint recognition.
- dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is located in a crowded environment; and in accordance with the determination that the user is located in a crowded environment, dynamically selecting a first authentication input mode corresponding to fingerprint recognition and a second authentication input mode corresponding to retinal recognition for the authentication process.
- the environmental information indicates that client device 104 is located in a crowded area based on the location of client device 104 from a GPS positioning system, ambient temperature information, and ambient sound information in environmental information obtained by environmental information obtaining module 230 from client device 104.
- selecting module 232 dynamically selects input modes corresponding to fingerprint recognition and retinal recognition because they cannot be eavesdropped on by nearby persons.
- dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is in a low-light environment; and in accordance with the determination that the user is in a low-light environment, dynamically selecting a first authentication input mode corresponding to vocal recognition and a second authentication input mode corresponding to fingerprint recognition for the authentication process.
- the environmental information indicates that client device 104 is located in a low light environment based on time/date information, the location of client device 104 from a GPS positioning system, and ambient light information in environmental information obtained by environmental information obtaining module 230 from client device 104.
- selecting module 232 dynamically selects input modes corresponding to vocal recognition and fingerprint recognition because the user may not be able to clearly see the screen of client device 104 or the user may not want to waste battery power to turn on a backlight for the display of client device 104.
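A hypothetical rule-based version of this environment-driven selection, mirroring the in-motion, crowded, low-light, and peak-condition examples above, could look like the sketch below; the numeric thresholds and the `env` fields (from the earlier EnvironmentalInfo sketch) are assumptions.

```python
# Hypothetical mapping from environmental conditions to two input modes; the
# thresholds are assumptions, and env is an EnvironmentalInfo-style record.
def select_input_modes(env) -> tuple:
    if env.acceleration_m_s2 > 1.0:    # device appears to be in motion
        return ("vocal_recognition", "spoken_password")
    if env.ambient_sound_db > 70.0:    # crude heuristic for a crowded area
        return ("fingerprint_recognition", "retinal_recognition")
    if env.ambient_light_lux < 10.0:   # low-light environment
        return ("vocal_recognition", "fingerprint_recognition")
    # peak conditions: ample lighting, minimal sound and movement
    return ("facial_recognition", "fingerprint_recognition")
```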
- the one or more predetermined criteria include (816) an input mode preference in a user profile corresponding to the user.
- server system 108 or a component thereof (e.g., selecting module 232, Figure 2) dynamically selects the two or more input modes based on an authentication preferences field in a user profile for the user of client device 104.
- user profile 352 for a respective user is stored in profiles database 116 by server system 108 and/or in client data 350 by a client device 104 associated with the respective user.
- user profile 352 includes authentication preferences field 458 indicating that the authentication input modes preferred by the respective user are retinal recognition, fingerprint recognition, and vocal recognition.
- selecting module 232 selects two of the authentication input modes preferred by the respective user from authentication preferences field 458 of user profile 352 for the respective user.
- the user preference is used to pre-filter all available authentication modes before the random selection and/or selection based on environmental conditions.
- the user preference may also post-filter the dynamically selected authentication modes to arrive at the final set of authentication modes presented to the user.
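One way such preference pre-filtering could be combined with the pseudo-random selection is sketched below; the fallback behavior when fewer than two preferred modes are available is an assumption.

```python
# Minimal sketch of pre-filtering the candidate modes with the user's
# authentication preferences before a pseudo-random selection.
import random

def select_with_preferences(all_modes, preferred_modes, rng):
    candidates = [m for m in all_modes if m in preferred_modes]  # pre-filter step
    if len(candidates) < 2:
        candidates = list(all_modes)  # assumed fallback when preferences are too narrow
    return rng.sample(candidates, k=2)

modes = select_with_preferences(
    ["password", "vocal_recognition", "fingerprint_recognition", "retinal_recognition"],
    ["vocal_recognition", "fingerprint_recognition", "retinal_recognition"],
    random.Random(),
)
```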
- the server prompts (818) the user to provide two or more authentication inputs via the two or more dynamically selected input modes.
- server system 108 or a component thereof (e.g., prompting module 234, Figure 2) sends a user interface (e.g., a notification) or voice message to client device 104 for presentation to the user.
- the user interface or voice message prompts the user to provide two or more authentication inputs via the two or more distinct input modes.
- prompting the user further includes (820) presenting an indication of the first input mode for the first authentication input and the second input mode for the second authentication input.
- prompting module 234 sends a user interface (e.g., a notification) for presentation on client device 104.
- notification 524 prompts the user of client device 104 to provide authentication information for the authentication process via a first input mode corresponding to vocal recognition (e.g., by providing a voice sample) and a second input mode corresponding to facial recognition (e.g., by capturing a headshot with a camera associated with client device 104).
- notification 522 prompts the user of client device 104 to provide authentication inputs for the authentication process without specifying the input modes.
- the server obtains (822) a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode.
- server system 108 or a component thereof (e.g., obtaining module 236, Figure 2) obtains a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode from client device 104 in response to the prompt in operation 818 and/or operation 820.
- a respective authentication input is one of a retinal image, a fingerprint image, an image including a user’s face, a voice sample, a sketch code, a password, and the like.
- the first authentication input and the second authentication input are received (824) within a predefined time period.
- the prompt in operation 818 also indicates a predefined time period within which the first and second authentication inputs must be entered. For example, if the first and second authentication inputs are not entered within a predefined time period, the authentication process fails or times out.
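A minimal sketch of enforcing such a predefined time period is shown below; the polling callback and sleep interval are assumptions, and a real implementation would likely be event-driven.

```python
# Minimal sketch of a predefined time period for receiving both authentication inputs.
import time
from typing import Callable, List, Optional

def collect_inputs_within(window_seconds: float,
                          get_next_input: Callable[[], Optional[str]]) -> Optional[List[str]]:
    """Return both inputs, or None if the time period expires first."""
    deadline = time.monotonic() + window_seconds
    inputs: List[str] = []
    while len(inputs) < 2:
        if time.monotonic() > deadline:
            return None              # the authentication process fails or times out
        value = get_next_input()     # hypothetical callback; returns an input or None
        if value is not None:
            inputs.append(value)
        else:
            time.sleep(0.05)         # avoid busy-waiting between polls
    return inputs
```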
- server system 108 or a component thereof (e.g., processing module 238, Figure 2) processes the first authentication input and the second authentication input.
- processing module 238 determines a vocal signature for a voice sample of the user, performs facial recognition on an image with a user’s face, performs fingerprint recognition on an image with a user’s fingerprint, or performs retinal recognition on an image with a user’s retina, and the like.
- server system 108 or a component thereof authenticates or denies authentication of the respective user.
- the server authenticates the user based on the first authentication input and the second authentication input by (828): determining a first authentication score by comparing the first authentication input against previously stored first authentication information corresponding to the first input mode; determining a second authentication score by comparing the second authentication input against previously stored second authentication information corresponding to the second input mode; and, in accordance with a determination that the first authentication score satisfies a first authentication threshold and the second authentication score satisfies a second authentication threshold, authenticating the user.
- server system 108 or a component thereof determines a first authentication score by comparing the first authentication input against previously stored first authentication information corresponding to the first input mode and a second authentication score by comparing the second authentication input against previously stored second authentication information corresponding to the second input mode.
- authentication information database 114 which is stored by server system 108, stores authentication information for users of the social networking platform including previously stored authentication information corresponding to the respective user for the first input mode and the second input mode.
- authentication information 354 stored by the client device 104 includes previously entered authentication information for each of the input modes. For example, the user enters authentication information for each of the input modes during initialization or setup of an account for the social networking platform or other application.
- server system 108 determines whether the first authentication score satisfies a first authentication threshold for the first input mode and whether the second authentication score satisfies a second authentication threshold for the second input mode.
- the authentication score is a confidence score or a number of matching features between the input and the stored information.
- an overall authentication score is computed based on a predefined algorithm that takes into account the confidence scores for each of the input modes.
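The per-mode threshold check and one possible overall-score computation could be sketched as follows; the weighted-average algorithm is an illustrative assumption rather than the patent's predefined algorithm.

```python
# Minimal sketch of the per-mode threshold check and one possible overall score.
def authenticate(first_score: float, second_score: float,
                 first_threshold: float, second_threshold: float) -> bool:
    # both per-mode scores must satisfy their respective thresholds
    return first_score >= first_threshold and second_score >= second_threshold

def overall_score(scores: dict, weights: dict) -> float:
    """Weighted average of per-mode confidence scores (illustrative choice)."""
    total_weight = sum(weights[mode] for mode in scores)
    return sum(scores[mode] * weights[mode] for mode in scores) / total_weight

ok = authenticate(first_score=0.92, second_score=0.88,
                  first_threshold=0.90, second_threshold=0.85)
combined = overall_score({"facial": 0.92, "fingerprint": 0.88},
                         {"facial": 0.4, "fingerprint": 0.6})
```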
- server system 108 or a component thereof (e.g., authenticating module 246, Figure 2) authenticates the respective user in accordance with a determination that the first authentication score satisfies a first authentication threshold for the first input mode and the second authentication score satisfies a second authentication threshold for the second input mode.
- an online transaction is processed or client device 104 is unlocked.
- in accordance with a determination that the first authentication score does not satisfy the first authentication threshold or the second authentication score does not satisfy the second authentication threshold, server system 108 or a component thereof (e.g., prompting module 234, Figure 2) sends a notification to client device 104 for presentation to the user indicating that the user is not authentic and/or prompting the user to re-input authentication information.
- the user does not get a second attempt at the authentication process, and, instead, the user is locked out of the application.
- the first authentication threshold and the second authentication threshold are based (830) at least in part on environmental conditions associated with a client device corresponding to the user.
- server system 108 or a component thereof (e.g., adjusting module 244, Figure 2) adjusts at least one of the first authentication threshold for the first input mode and the second authentication threshold for the second input mode from a default value to a custom value based on environmental information corresponding to environmental conditions associated with a client device 104 of the respective user obtained by environmental information obtaining module 230. For example, when the environmental information indicates that client device 104 is located in a low light or in-motion environment, adjusting module 244 decreases the authentication threshold associated with facial and retinal recognition.
- adjusting module 244 decreases the authentication threshold associated with voice authentication. In another example, adjusting module 244 increases an authentication threshold associated with facial recognition when the environmental information indicates that client device 104 is located in an environment that makes capturing an image more accurate.
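A hypothetical version of this threshold adjustment is sketched below; the default thresholds, the adjustment deltas, and the environmental cutoffs are assumptions.

```python
# Hypothetical threshold adjustment driven by environmental information; env is
# an EnvironmentalInfo-style record from the earlier sketch.
DEFAULT_THRESHOLDS = {
    "facial_recognition": 0.90,
    "retinal_recognition": 0.90,
    "vocal_recognition": 0.85,
    "fingerprint_recognition": 0.95,
}

def adjust_thresholds(env, thresholds=None) -> dict:
    t = dict(thresholds or DEFAULT_THRESHOLDS)
    if env.ambient_light_lux < 10.0 or env.acceleration_m_s2 > 1.0:
        t["facial_recognition"] -= 0.10   # low light or motion: relax image-based modes
        t["retinal_recognition"] -= 0.10
    if env.ambient_sound_db > 70.0:
        t["vocal_recognition"] -= 0.10    # noisy surroundings: relax voice authentication
    if env.ambient_light_lux > 500.0 and env.acceleration_m_s2 < 0.2:
        t["facial_recognition"] += 0.05   # good capture conditions: tighten facial recognition
    return t
```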
- in response to receiving the first authentication input and the second authentication input, the server (832): determines whether at least one of the first authentication input or the second authentication input matches the obtained biometric information; in accordance with a determination that at least one of the first authentication input or the second authentication input matches the obtained biometric information, increases at least one of the first authentication score and the second authentication score; and, in accordance with a determination that the obtained biometric information does not match the first authentication input or the second authentication input, notifies the user to re-input authentication information.
- server system 108 or a component thereof increases an authentication score for an authentication input when the previously collected biometric information matches the authentication input.
- server system 108 or a component thereof notifies the user to re-input authentication information or decreases an authentication score for an authentication input when the previously collected biometric information does not match the authentication input.
- a thief steals the user’s phone and uses his own fingerprints to initiate an online transaction but uses a forged fingerprint mimicking the user’s to attempt to pass the authentication process.
- the collected biometric information includes the thief’s fingerprint used to initiate the online transaction.
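The cross-check against background-collected biometric information (operations 802 and 832) could be summarized by the following sketch; the score boost value and the `matches` comparison are assumptions.

```python
# Minimal sketch of cross-checking an authentication input against biometric
# information collected before the trigger condition.
from typing import Any, Callable, Tuple

def cross_check(auth_score: float, auth_input: Any, background_biometric: Any,
                matches: Callable[[Any, Any], bool],
                boost: float = 0.05) -> Tuple[float, bool]:
    """Return (adjusted_score, must_reenter)."""
    if matches(auth_input, background_biometric):
        return auth_score + boost, False  # corroborated: increase the authentication score
    return auth_score - boost, True       # mismatch: decrease the score and prompt re-input
```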
- FIG. 9 is a block diagram of a client-side module 102 in accordance with some embodiments.
- client-side module 102 is executed on client device 104 (e.g., the mobile terminal), and client-side module 102 corresponds to the social networking platform or another application.
- client device 104 is a mobile device such as a smart phone (e.g., an Android TM mobile phone and an iOS TM mobile phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device.
- client-side module 102 includes the following modules: identity authentication triggering module 902, authentication information comparing module 904, identity determining module 906, online transaction execution module 908, and unlocking module 910.
- Optional modules are indicated by dashed lines (e.g., boxes with dashed-line borders).
- identity authentication triggering module 902 is configured to display an identity authentication interface and detect whether authentication information input by a user is obtained through any one of at least two preset authentication information input manners.
- identity authentication triggering module 902 further includes an input sensor start sub-unit 922.
- identity authentication triggering module 902 further includes an authentication input detection sub-unit 924.
- the input sensor start sub-unit 922 is configured to start at least two user input sensors of the mobile terminal.
- the mobile terminal includes at least two user input sensors, and each user input sensor respectively corresponds to one authentication information input manner.
- the input sensor start sub-unit 922 is triggered, after the identity authentication interface is displayed, to start the at least two user input sensors so as to obtain authentication information input by the user.
- the user input sensors include a fingerprint collection sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, and a keyboard input sensor.
- the authentication input detection sub-unit 924 is configured to detect whether any user input sensor of the at least two started user input sensors obtains authentication information input by the user. Further, in some embodiments, the at least two user input sensors are triggered for a preset time window after the identity authentication interface is displayed. Thus, the user must enter the authentication information within the preset time window or a time expiration prompt is presented to the user and the at least two user input sensors are turned off. For example, the preset time window is 30 seconds, 1 minute, or the like.
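A minimal client-side sketch of the preset time window handled by sub-units 922 and 924 follows; the `sensor.read()` and `sensor.turn_off()` methods and the 30-second default are assumptions, not an API from the patent.

```python
# Hypothetical client-side handling of the preset time window: poll the started
# sensors until an input arrives or the window expires, then turn them off.
import time

def wait_for_sensor_input(sensors, window_seconds: float = 30.0):
    deadline = time.monotonic() + window_seconds
    while time.monotonic() < deadline:
        for sensor in sensors:
            value = sensor.read()    # assumed non-blocking read; None if nothing captured
            if value is not None:
                return sensor, value
        time.sleep(0.05)
    for sensor in sensors:
        sensor.turn_off()            # time expired: turn the started sensors off
    return None, None                # caller presents a time expiration prompt to the user
```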
- authentication information comparing module 904 is configured to: if the authentication information input by the user is obtained through any authentication information input manner, compare the authentication information with preset authentication information corresponding to the authentication information input manner.
- identity confirming module 906 is configured to confirm the identity of the user of the mobile terminal when the comparison result from authentication information comparing module 904 satisfies a predetermined comparison threshold.
- online transaction execution module 908 is configured to execute an online transaction by sending transaction information to a merchant's server according to the determined user identity of the mobile terminal.
- the user transaction information includes an item, amount of money, and other account information corresponding to the determined user identity corresponding to the online transaction.
- unlocking module 910 is configured to unlock the mobile terminal according to the determined user identity of the mobile terminal. Specifically, in some circumstances, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application.
Abstract
A server system with one or more processors and memory detects a trigger condition and, in response to detecting the trigger condition, performs an authentication process so as to authenticate a user of a client device. During the authentication process, the server system dynamically selects two or more distinct input modes for the authentication process based on one or more predetermined criteria and prompts the user to provide two or more authentication inputs via the two or more dynamically selected input modes. During the authentication process, the server system obtains, after the prompting, a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode and, in response to receiving the first authentication input and the second authentication input, authenticates the user based on the first authentication input and the second authentication input.
Description
PRIORITY CLAIM AND RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201310518398.3, entitled “Method and Mobile Terminal for Mobile Terminal Identity Verification,” filed on October 28, 2013, which is incorporated by reference in its entirety.
FIELD OF THE TECHNOLOGY
The present disclosed technology relates to the field of electronic technologies, and in particular, to a method and system for authenticating a user’s identity.
BACKGROUND OF THE TECHNOLOGY
With the rapid development of Internet technologies, the usage of mobile devices has become an indispensable part of everyday life. However, sometimes financial or private information is involved when performing operations with a mobile device. As such, authentication of a user of the mobile device’s identity is often required in order to verify that the current user is authorized to perform certain operations. This guarantees the security and integrity of the mobile device.
In the prior art, authentication of a user of the mobile device’s identity is generally directed to a certain authentication manner, and if a program supports multiple authentication manners, such as password authentication and graph authentication, the user is required to manually select one of the multiple authentication manners or manually switch between the multiple authentication manners, which is rather complicated. Moreover, an input interface with a defined authentication manner provides a prompt corresponding to the defined authentication manner, which enables a fraudulent or unauthorized user a focus for cracking the authentication manner. This creates a potential security issue for the mobile device.
SUMMARY
The embodiments of the present disclosure provide methods and systems for authenticating a user of a client device that may address the problems stated in the background section.
In some embodiments, a method of authenticating a user is performed at a server system (e.g., server system 108, Figures 1-2) with one or more processors and memory. The method includes detecting a trigger condition and, in response to detecting the trigger condition, performing an authentication process so as to authenticate the user. The authentication process includes: dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria; and prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes. The authentication process includes: after the prompting, obtaining a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode; and, in response to receiving the first authentication input and the second authentication input, authenticating the user based on the first authentication input and the second authentication input.
In some embodiments, a computer system (e.g., server system 108 (Figures 1-2), client device 104 (Figures 1 and 3), or a combination thereof) includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs include instructions for performing, or controlling performance of, the operations of any of the methods described herein. In some embodiments, a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a computer system (e.g., server system 108 (Figures 1-2), client device 104 (Figures 1 and 3), or a combination thereof) with one or more processors, cause the computer system to perform, or control performance of, the operations of any of the methods described herein. In some embodiments, a computer system (e.g., server system 108 (Figures 1-2), client device 104 (Figures 1 and 3), or a combination thereof) includes means for performing, or controlling performance of, the operations of any of the methods described herein.
Various advantages of the present application are apparent in light of the descriptions below.
The aforementioned features and advantages of the disclosed technology as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of preferred embodiments when taken in conjunction with the drawings.
To describe the technical solutions in the embodiments of the present disclosed technology or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show merely some embodiments of the present disclosed technology, and persons of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
Figure 1 is a block diagram of a server-client environment in accordance with some embodiments.
Figure 2 is a block diagram of a server system in accordance with some embodiments.
Figure 3 is a block diagram of a client device in accordance with some embodiments.
Figure 4A is a diagram of an implementation of a data structure for an authentication information database in accordance with some embodiments.
Figure 4B is a diagram of an implementation of a data structure for a user profile in accordance with some embodiments.
Figures 5A-5C illustrate exemplary user interfaces for authenticating a user in accordance with some embodiments.
Figure 6 illustrates a flowchart diagram of a method of identity authentication in accordance with some embodiments.
Figure 7 illustrates a flowchart diagram of a method of identity authentication in accordance with some embodiments.
Figures 8A-8C illustrate a flowchart diagram of a method of authenticating a user in accordance with some embodiments.
Figure 9 is a block diagram of a client-side module in accordance with some embodiments.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
DESCRIPTION OF EMBODIMENTS
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one skilled in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The following clearly and completely describes the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are merely a part rather than all of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
As shown in Figure 1, data processing for a social networking platform or other application is implemented in a server-client environment 100 in accordance with some embodiments. In accordance with some embodiments, server-client environment 100 includes client-side processing 102-1, 102-2 (hereinafter “client-side modules 102” ) executed on a client device 104-1, 104-2, and server-side processing 106 (hereinafter “server-side
module 106” ) executed on a server system 108. Client-side module 102 communicates with server-side module 106 through one or more networks 110. Client-side module 102 provides client-side functionalities for the social networking platform (e.g., communications, payment processing, user authentication, etc. ) and communications with server-side module 106. Server-side module 106 provides server-side functionalities for the social networking platform (e.g., communications, payment processing, user authentication, etc. ) for any number of client modules 102 each residing on a respective client device 104.
In some embodiments, server-side module 106 includes one or more processors 112, authentication information database 114, profiles database 116, an I/O interface to one or more clients 118, and an I/O interface to one or more external services 120. I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server-side module 106. One or more processors 112 perform an authentication process so as to authenticate a user of the social networking platform in response to detecting a trigger condition. Authentication information database 114 stores authentication information for users of the social networking platform, and profiles database 116 stores a user profile for each user of the social networking platform. I/O interface to one or more external services 120 facilitates communications with one or more external services 122 (e.g., merchant websites, credit card companies, and/or other payment processing services).
Examples of client device 104 include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA), a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet. One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code
division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol.
Server-client environment 100 shown in Figure 1 includes both a client-side portion (e.g., client-side module 102) and a server-side portion (e.g., server-side module 106). In some embodiments, data processing is implemented as a standalone application installed on client device 104. In addition, the division of functionalities between the client and server portions of client environment data processing can vary in different embodiments. For example, in some embodiments, client-side module 102 is a thin-client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g., server system 108). In another example, client-side module 102 performs the verification process and a backend server (e.g., server system 108) performs other functions of the social networking platform (e.g., communications and payment processing).
Figure 2 is a block diagram illustrating server system 108 in accordance with some embodiments. Server system 108, typically, includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g., including I/O interface to one or more clients 118 and I/O interface to one or more external services 120), memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset). Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable
storage medium. In some implementations, memory 206, or the non-transitory computer readable storage medium of memory 206, stores the following programs, modules, and data structures, or a subset or superset thereof:
·operating system 210 including procedures for handling various basic system services and for performing hardware dependent tasks;
·network communication module 212 for connecting server system 108 to other computing devices (e.g., client devices 104 and external service (s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless);
·server-side module 106, which provides server-side data processing for a social networking platform (e.g., communications, payment processing, user authentication, etc. ), includes, but is not limited to:
·(optionally) biometric information obtaining module 220 for obtaining biometric information associated with users during usage of the social networking platform;
·communications module 222 for managing and routing messages sent between users of the social networking platform;
·payment processing module 224 for processing transactions for a respective user of the social networking platform based on payment data in a user profile in profiles database 116 corresponding to the respective user;
·trigger detection module 226 for detecting a trigger condition as to a respective user (e.g., in response to receiving, from a client device 104, a notification of a user input corresponding to the trigger condition); and
·authentication module 228 for performing an authentication process so as to authenticate the respective user of the social networking platform in response to detecting the trigger condition as to the respective user, including but not limited to:
οenvironmental information obtaining module 230 for obtaining environmental information corresponding to environmental conditions associated with a client device 104 of the respective user;
οadjusting module 244 for adjusting the first authentication threshold and the authentication threshold based on the environmental
information obtained by environmental information obtaining module 230; and
·server data 250 storing data for the social networking platform, including but not limited to:
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 206, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 206, optionally, stores additional modules and data structures not described above.
Figure 3 is a block diagram illustrating a representative client device 104 associated with a user in accordance with some embodiments. Client device 104, typically,
includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset). Client device 104 also includes a user interface 310. User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays. User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a camera, a gesture capturing camera, or other input buttons or controls. Furthermore, some client devices 104 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard. Client device 104 further includes sensors 315, which provide information as to the environmental conditions associated with client device 104. Sensors 315 include but are not limited to one or more microphones, one or more cameras, an ambient light sensor, one or more accelerometers, one or more gyroscopes, a GPS positioning system, a Bluetooth or BLE system, a temperature sensor, one or more motion sensors, one or more biological sensors (e.g., a galvanic skin resistance sensor, a pulse oximeter, and the like), and other sensors. Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302. Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium. In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
·operating system 316 including procedures for handling various basic system services and for performing hardware dependent tasks;
·network communication module 318 for connecting client device 104 to other computing devices (e.g., server system 108 and external service (s) 122)
connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless);
·presentation module 320 for enabling presentation of information (e.g., a user interface for a social networking platform, widget, websites or web pages thereof, game, and/or application, audio and/or video content, text, etc. ) at client device 104 via one or more output devices 312 (e.g., displays, speakers, etc. ) associated with user interface 310;
·input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
·web browser module 324 for navigating, requesting (e.g., via HTTP), and displaying websites and web pages thereof;
·one or more applications 326-1–326-N for execution by client device 104 (e.g., games, application marketplaces, payment platforms, and/or other applications); and
·client-side module 102, which provides client-side data processing and functionalities for the social networking platform, including but not limited to:
οbiometric obtaining module 332 for obtaining biometric information during usage of the social networking platform by a user of client device 104;
οenvironmental information obtaining module 334 for obtaining environmental information from sensors 315 corresponding to environmental conditions associated with client device 104;
ο(optionally) authenticating module 342 for performing an authentication process so as to authenticate the user in response to detecting the trigger condition as to the user;
·client data 350 storing data associated with the social networking platform, including, but is not limited to:
οuser profile 352 storing a user profile associated with the user of client device 104 including a user identifier (e.g., an account name or handle), login credentials to the social networking platform, payment data (e.g., linked credit card information, app credit or gift card balance, billing address, shipping address, etc. ), authentication preferences, run-time biometric information, custom parameters for the user (e.g., age, location, hobbies, etc. ), and identified trends and/or likes/dislikes of the user;
o(optionally) authentication information 354 storing authentication information for the user of client device 104; and
ouser data 356 storing data authored, saved, liked, or chosen as favorites by the user of client device 104 in the social networking platform.
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, modules or data structures, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 306, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 306, optionally, stores additional modules and data structures not described above.
In some embodiments, at least some of the functions of server system 108 are performed by client device 104, and the corresponding sub-modules of these functions may
be located within client device 104 rather than server system 108. For example, in some embodiments, the environmental information obtaining module 230, the selecting module 232, the prompting module 234, the authentication input obtaining module 236, the processing module 238 may be implemented at least in part on the client device 104. In some embodiments, at least some of the functions of client device 104 are performed by server system 108, and the corresponding sub-modules of these functions may be located within server system 108 rather than client device 104. Client device 104 and server system 108 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
Figure 4A is a diagram of an implementation of a data structure representing authentication information database 114 in accordance with some embodiments. In some embodiments, authentication information database 114 is stored and managed by server system 108 or a component thereof (e.g., server-side module 106, Figures 1-2). In some embodiments, authentication information database 114 is stored remote from server system 108 but is managed by server system 108.
In some embodiments, authentication information database 114 includes a plurality of entries for facilitating the authentication of users of the social networking platform. In some embodiments, server system 108 operates and manages the social networking platform. A respective entry corresponds to a user of the social networking platform, and the respective entry includes a user identifier (ID) 402 corresponding to the user and authentication information 404 corresponding to the user. For example, in Figure 4A, user ID 402-A is a unique number generated by server system 108 for a respective user of the social networking platform or a user name, handle, account name, number or sequence of characters created by the respective user of the social networking platform. For example, in Figure 4A, authentication information 404-A includes a set of authentication information entered by the respective user during initialization or setup of an account for the social networking platform.
In some embodiments, authentication information 404 includes authentication information for a plurality of distinct input modes (e.g., at least two input modes) (sometimes also here called “input manners,” “authentication manners,” or “authentication input
manner” ). In Figure 4A, authentication information 404-N is, for example, associated with a password 412, a signature 414, a sketch password 416 (e.g., a sketch pattern or drawing), vocal information 418 (e.g., a voice signature extracted from a voice sample), fingerprint information 420 (e.g., a fingerprint signature extracted from a fingerprint image or scan), retinal information 422 (e.g., a retinal signature extracted from a retinal image or scan), facial information 424 (e.g., a facial signature extracted from a headshot or other image), and other authentication information for the user of the social networking platform that correspond with user ID 402-N.
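For illustration only, one entry of authentication information database 114 might be represented as in the sketch below; the concrete keys and value encodings are assumptions.

```python
# Illustrative representation of one entry in authentication information database 114.
auth_info_entry = {
    "user_id_402": "user-N",
    "authentication_information_404": {
        "password_412": "<hashed password>",
        "signature_414": "<signature template>",
        "sketch_password_416": "<sketch pattern>",
        "vocal_information_418": "<voice signature>",
        "fingerprint_information_420": "<fingerprint signature>",
        "retinal_information_422": "<retinal signature>",
        "facial_information_424": "<facial signature>",
    },
}
```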
Figure 4B is a diagram of an implementation of a data structure representing user profile 352 in accordance with some embodiments. In some embodiments, user profile 352 corresponds to a user of the social networking platform. In some embodiments, user profile 352 is stored in profiles database 116 by server system and/or in client data 350 by a client device 104 associated with the user. In some embodiments, user profile 352 includes the following fields: user identifier (ID) field 452; login credentials field 454; payment information field 456; authentication preferences field 458; biometric information field 460; and other user information field 462.
In some embodiments, user identifier (ID) field 452 includes a unique number generated by server system 108 for the user of the social networking platform or a user name, handle, account name, number or sequence of characters generated by the user of the social networking platform. In some embodiments, a respective user identifier 402 in authentication information database 114 matches user identifier field 452 in a user profile for the respective user.
In some embodiments, login credentials field 454 includes a user name and password entered by the user during initialization or setup of an account for the social networking platform. For example, when logging into the social networking platform, a user corresponding to user profile 352 enters login credentials, and, subsequently, server system 108 matches the entered login credentials against login credentials field 454 so as to verify and login the user.
In some embodiments, payment information field 456 includes payment data entered by the user during initialization or setup of an account for the social networking platform by the user. For example, payment information field 456 includes credit card
information, app credit or gift card balance, billing address, shipping address, and the like for the user.
In some embodiments, authentication preferences field 458 includes authentication preferences entered by the user during initialization or setup of an account for the social networking platform. For example, the user specifies that he/she wishes to use any two of vocal information, fingerprint information, and retinal information for authenticating the user during an authentication process.
In some embodiments, biometric information field 460 includes biometric information obtained by client-side module 102 during usage of the social networking platform by the user. For example, client-side module 102 initiates collection of biometric information when the user starts a process that may ultimately require the authentication process such as starting an online shopping process, starting a device reconfiguration process, or starting a payment process.
In some embodiments, other user information field 462 includes other information corresponding to the user such as custom parameters for the user (e.g., age, location, hobbies, etc.) and identified trends and/or likes/dislikes of the user.
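For illustration only, the fields of user profile 352 described above might be represented as in the following sketch; the values are placeholders and the structure is an assumption.

```python
# Illustrative representation of user profile 352; values are placeholders.
user_profile_352 = {
    "user_id_452": "user-452",
    "login_credentials_454": {"username": "example_user", "password_hash": "<hash>"},
    "payment_information_456": {"billing_address": "<address>", "card_token": "<token>"},
    "authentication_preferences_458": ["vocal", "fingerprint", "retinal"],
    "biometric_information_460": {"fingerprint": None, "face": None, "voice": None},
    "other_user_information_462": {"age": 30, "location": "<city>", "hobbies": []},
}
```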
Attention is now directed towards embodiments of user interfaces and associated processes that may be implemented on a client device 104 with a touch screen 506 (sometimes also herein called a touch screen display) enabled to receive one or more contacts and display information (e.g., media content, websites and web pages thereof, and/or user interfaces for an application such as a web browser or the social networking platform). Figures 5A-5C illustrate exemplary user interfaces for authenticating a user in accordance with some embodiments.
Although some of the examples that follow will be given with reference to inputs on touch screen 506 (where the touch sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display. In some embodiments, the touch sensitive surface has a primary axis that corresponds to a primary axis on the display. In accordance with these embodiments, the device detects contacts with the touch-sensitive surface at locations that correspond to respective locations on the display. In this way, user inputs detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display
of the device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc. ), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based, stylus-based, or physical button-based input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact) or depression of a physical button. Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
Figures 5A-5C show interface 508 for a social networking platform or other application displayed on client device 104 (e.g., a mobile phone); however, one skilled in the art will appreciate that the user interfaces shown in Figures 5A-5C may be implemented on other similar computing devices. The user interfaces in Figures 5A-5C are used to illustrate the processes described herein, including the process described with respect to Figures 8A-8C.
Figure 5A illustrates client device 104 executing a web browser (e.g., web browser module 324, Figure 3). In Figure 5A, the web browser is displaying a “Checkout” web page for merchant website 510 on touch screen 506. In Figure 5A, the web browser includes a current address bar showing merchant website 510 as the current web address displayed by the web browser, refresh affordance 512 for reloading the current web page, back navigation affordance 514-A for displaying the last web page, and forward navigation affordance 514-B for displaying the next web page. In Figure 5A, the “Checkout” web page for merchant website 510 includes item 516 (e.g., Board Game A), which the user of client device 104 intends to purchase from merchant website 510. In Figure 5A, the “Checkout” web page also includes a description of item 516, the cost of item 516, and the current quantity of item 516 selected for purchase. In Figure 5A, the “Checkout” web page further
includes the subtotal for item 516, tax associated with item 516, shipping cost for item 516, and the total amount for the purchase of item 516 from merchant website 510.
In Figure 5A, the “Checkout” web page further includes a plurality of payment options for completing the purchase of item 516. In Figure 5A, the payment options include affordance 518-A for payment option A (e.g., merchant credit or a gift card associated with merchant website 510), affordance 518-B for payment option B (e.g., an alternative payment application or a credit card previously recognized by merchant website 510), and affordance 518-C for payment processing associated with a social networking platform. Figure 5A also illustrates client device 104 detecting contact 520 (e.g., a tap gesture) at a location corresponding to affordance 518-C. In response to selection of affordance 518-C, payment for the purchase of item 516 from merchant website 510 will be conducted by payment processing associated with a social networking platform.
For example, in response to the selection of affordance 518-C via contact 520 in Figure 5A, client-side module 102 (e.g., associated with the social networking platform) sends a notification, to server system 108, indicating selection of affordance 518-C for payment processing associated with the social networking platform. Continuing with this example, server system 108 detects a trigger condition in response to receiving the notification. Continuing with this example, in response to detecting the trigger condition, server system 108 performs an authentication process so as to authenticate the user of client device 104. As part of the authentication process, server system 108 sends a user interface (e.g., a notification) or voice message to client device 104 for presentation to the user. For example, the user interface or voice message prompts the user to provide two or more authentication inputs via the two or more distinct input modes.
Alternatively, in another example, in response to the selection of affordance 518-C via contact 520 in Figure 5A, client-side module 102 (e.g., associated with the social networking platform) detects a trigger condition. Continuing with this example, in response to detecting the trigger condition, client-side module 102 performs an authentication process so as to authenticate the user of client device 104. As part of the authentication process, client-side module 102 causes client device 104 to present a user interface (e.g., a notification) or voice message to a user. For example, the user interface or voice message
prompts the user to provide two or more authentication inputs via the two or more distinct input modes.
Figure 5B illustrates client device 104 displaying notification 522 on touch screen 506 in response to selection of affordance 518-C in Figure 5A. In Figure 5B, notification 522 prompts the user of client device 104 to provide authentication inputs for the authentication process without specifying the input modes.
Figure 5C illustrates client device 104 displaying notification 524 on touch screen 506 in response to selection of affordance 518-C in Figure 5A. In Figure 5C, notification 524 prompts the user of client device 104 to provide authentication information for the authentication process via a first input mode corresponding to vocal recognition (e.g., by providing a voice sample) and a second input mode corresponding to facial recognition (e.g., by capturing a headshot with a camera associated with client device 104).
Figure 6 illustrates a flowchart diagram of a method 600 of identity authentication in accordance with some embodiments. In some embodiments, method 600 is performed by a mobile terminal (e.g., client device 104, Figures 1 and 3) with one or more processors and memory. For example, in some embodiments, method 600 is performed by client device 104 (Figures 1 and 3) or a component thereof (e.g., client-side module 102, Figures 1 and 3). In some embodiments, method 600 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the mobile terminal.
In some embodiments, the mobile terminal is a mobile device such as a smart phone (e.g., an Android™ mobile phone or an iOS™ mobile phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device.
The mobile terminal displays (602) an identity authentication interface. For example, an identity authentication program of the mobile terminal detects that it is required to perform identity authentication for a user when the user performs or intends to perform an online transaction (e.g., for goods and/or services), when the mobile terminal is powered on, when the mobile terminal is unlocked from a standby mode, or when the mobile terminal executes a secured application program (e.g., a payment processing service or application). For example, the identity authentication interface corresponds to notification 522 in Figure
5B, where notification 522 only provides to the user of client device 104 (e.g., the mobile terminal) a prompt to input authentication information, but does not provide a prompt of the authentication manner (or input mode). In this example, the identity authentication interface increases the difficulty of cracking the authentication process because a fraudulent or unauthorized user cannot focus on a single authentication manner. In another example, the identity authentication interface (e.g., notification 524 in Figure 5C) prompts the user to input authentication information by using any one or more of at least two authentication information input manners (or input modes).
The mobile terminal determines (604) whether authentication information input by the user is obtained by using any one of at least two selected authentication information input manners (or modes). In some embodiments, the mobile terminal selects at least two distinct user input manners (or input modes). For example, the at least two distinct user input manners are selected at random or based on authentication preferences associated with a user profile of the user of the mobile terminal. After the identity authentication interface is displayed in operation 602, at least two user input sensors corresponding to the at least two distinct user input manners are triggered (i.e., started) so as to obtain authentication information input by the user. For example, the at least two user input sensors include any two of: a fingerprint sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, a keyboard input sensor, and the like. Further, in some embodiments, the at least two user input sensors are triggered for a preset time window after the identity authentication interface is displayed. Thus, the user must enter the authentication information within the preset time window or a time expiration prompt is presented to the user and the at least two user input sensors are turned off. For example, the preset time window is 30 seconds, 1 minute, or the like.
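The sensor-triggering and time-window behavior described above can be summarized with a short sketch. The `InputSensor` wrapper, its `poll` method, and the 30-second default are illustrative assumptions standing in for the platform's actual sensor APIs, not part of the disclosed embodiments.

```python
import time

# Hypothetical sensor interface; a real device would expose fingerprint,
# voiceprint, touch, camera, and keyboard input through platform APIs.
class InputSensor:
    def __init__(self, mode):
        self.mode = mode          # e.g., "fingerprint", "voiceprint"
        self.active = False

    def start(self):
        self.active = True

    def stop(self):
        self.active = False

    def poll(self):
        """Return captured authentication input, or None if nothing yet."""
        return None  # placeholder; a real sensor would return raw input data


def collect_authentication_input(sensors, window_seconds=30, poll_interval=0.5):
    """Start at least two input sensors and accept input obtained via any
    one of them within the preset time window (operation 604)."""
    for sensor in sensors:
        sensor.start()
    deadline = time.monotonic() + window_seconds
    try:
        while time.monotonic() < deadline:
            for sensor in sensors:
                data = sensor.poll()
                if data is not None:
                    return sensor.mode, data   # input obtained via this manner
            time.sleep(poll_interval)
        return None, None  # time expired; caller shows an expiration prompt
    finally:
        for sensor in sensors:                 # sensors are turned off either way
            sensor.stop()
```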
In accordance with a determination that the authentication information is input using any one of the at least two selected authentication information input manners, method 600 continues to operation 606. In accordance with a determination that the authentication information is not input using any one of the at least two selected authentication information input manners, method 600 returns to operation 604.
The mobile terminal compares (606) the authentication information with selected authentication information corresponding to the authentication information input manner. In some embodiments, authentication information for each authentication information input manner is collected for a user and stored by the mobile device in advance. For example, authentication information corresponding to a respective manner (e.g., by using a fingerprint collection sensor) is input by the user through a corresponding input sensor of the mobile terminal (or an external device in communication with the mobile terminal) in advance. When the mobile terminal detects that the authentication information input by the user is obtained through a certain authentication information input manner, the obtained authentication information is compared with the previously stored authentication information corresponding to the authentication information input manner. For example, the comparison is performed to see whether the similarity between obtained fingerprint or voiceprint information and previously collected and stored fingerprint or voiceprint information satisfies a predetermined comparison threshold.
The mobile terminal confirms (608) the identity of the user of the mobile terminal according to a comparison result. In some embodiments, in accordance with a determination that the comparison result in operation 606 satisfies the predetermined comparison threshold, the mobile terminal confirms the identity of the user of the mobile terminal. For example, after the identity authentication interface is displayed, the mobile terminal starts a fingerprint collection sensor, a voiceprint sensor (e.g., one or more microphones), and a touch screen input sensor to detect the authentication information input by the user. Continuing with this example, as long as effective authentication information is obtained through any of the authentication information input manners and the comparison of authentication information is successful, the mobile terminal confirms the identity of the current user of the mobile terminal. Optionally, in some embodiments, the mobile terminal confirms the identity of the current user of the mobile terminal after successful comparisons of authentication information obtained using two or more authentication information input manners. For example, the comparison result satisfies the predetermined comparison threshold and the authentication information is effective when the fingerprint information collected by the fingerprint collection sensor reaches a preset threshold or the voice information collected by the voiceprint sensor reaches a preset threshold. Optionally, in some embodiments, if the mobile terminal does not store authentication information for a respective authentication manner and
an input sensor corresponding to the respective authentication manner receives authentication information input by the user, the mobile terminal determines that the comparison fails and, in some circumstances, the mobile terminal presents a notification to the user of the comparison failure.
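A minimal sketch of the comparison in operations 606 and 608, assuming a generic `similarity_fn` and `threshold` supplied by a biometric matcher; real fingerprint or voiceprint comparison would come from a sensor SDK or matching library, and the toy usage below is purely illustrative.

```python
def compare_authentication_info(obtained, stored, similarity_fn, threshold):
    """Compare obtained input against the previously stored information for
    the same input manner and confirm the identity when the similarity
    satisfies the predetermined comparison threshold."""
    if stored is None:
        # No enrolled information for this input manner: the comparison
        # fails and, in some circumstances, a failure notification is shown.
        return False
    return similarity_fn(obtained, stored) >= threshold


# Toy usage with a placeholder similarity function (not a real biometric match):
jaccard = lambda a, b: len(set(a) & set(b)) / max(1, len(set(a) | set(b)))
identity_confirmed = compare_authentication_info("sample-A", "sample-B", jaccard, 0.8)
```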
In some embodiments, the mobile terminal stores one or more pieces of authentication information for each user identity. When a comparison between authentication information obtained by at least one sensor and previously stored authentication information for the at least one sensor is successful, the mobile terminal confirms that the current user has the identity corresponding to the previously stored authentication information.
If the comparison of authentication information is successful, the mobile terminal determines that the current user is an authorized user, and the authentication result is displayed in the authentication interface. In some circumstances, if the comparison of authentication information is successful, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application. In some circumstances, if the comparison of authentication information is successful, the mobile terminal executes the online transaction. For example, the mobile terminal executes the online transaction by sending transaction information to a merchant’s server according to the determined user identity of the mobile terminal. In this example, the transaction information includes an item, an amount of money, and other account information corresponding to the determined user identity for the online transaction.
It should be understood that the particular order in which the operations in Figure 6 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 700 and 800) are also applicable in an analogous manner to method 600 described above with respect to Figure 6.
Figure 7 illustrates a flowchart diagram of a method 700 of identity authentication in accordance with some embodiments. In some embodiments, method 700 is performed by a mobile terminal (e.g., client device 104, Figures 1 and 3) with one or more
processors and memory. For example, in some embodiments, method 700 is performed by client device 104 (Figures 1 and 3) or a component thereof (e.g., client-side module 102, Figures 1 and 3). In some embodiments, method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the mobile terminal.
In some embodiments, the mobile terminal is a mobile device such as a smart phone (e.g., an Android™ mobile phone or an iOS™ mobile phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device.
The mobile terminal displays (702) an identity authentication interface and starts at least two user input sensors of a mobile terminal. For example, an identity authentication program of the mobile terminal detects that it is required to perform identity authentication for a user when the user performs or intends to perform an online transaction (e.g., for goods and/or services), when the mobile terminal is powered on, when the mobile terminal is unlocked from a standby mode, or when the mobile terminal executes a secured application program (e.g., a payment processing service or application). For example, the identity authentication interface corresponds to notification 522 in Figure 5B, where notification 522 only provides to the user of client device 104 (e.g., the mobile terminal) a prompt to input authentication information via two or more authentication manners (or input modes), but does not provide a prompt indicating the authentication manners (or input modes). In this example, the identity authentication interface (e.g., corresponding with notification 522 in Figure 5B) increases the difficulty of cracking the authentication process by a fraudulent or unauthorized user. In another example, the identity authentication interface (e.g., notification 524 in Figure 5C) prompts the user to input authentication information by two or more authentication information input manners (or input modes). In some embodiments, input sensors corresponding to the authentication information input manners (or input modes) include a fingerprint collection sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, and a keyboard input sensor.
The mobile terminal determines (704) whether a first user input sensor obtains authentication information input by the user. In accordance with a determination that the mobile terminal obtains authentication information via the first user input sensor, method 700
continues to operation 706. In accordance with a determination that the mobile terminal does not obtain authentication information via the first user input sensor, method 700 returns to operation 704.
In accordance with a determination that the first user input sensor obtains authentication information input by the user, the mobile terminal compares (706) the authentication information obtained by the first user input sensor with preset authentication information corresponding to the first user input sensor. For example, the first user input sensor in this embodiment is a fingerprint collection sensor. Continuing with this example, after fingerprint information is collected by the fingerprint collection sensor for the user of the mobile terminal, the fingerprint information is compared against preset (or previously stored) fingerprint information for the user of the mobile terminal. In this example, if the similarity between the obtained fingerprint information and preset fingerprint information reaches a predetermined comparison threshold for the fingerprint collection sensor, the mobile terminal determines that the comparison is successful.
The mobile terminal determines (708) whether a second user input sensor obtains authentication information input by the user. In accordance with a determination that the mobile terminal obtains authentication information via the second user input sensor, method 700 continues to operation 710. In accordance with a determination that the mobile terminal does not obtain authentication information via the second user input sensor, method 700 returns to operation 708.
In accordance with a determination that the second user input sensor obtains authentication information input by the user, the mobile terminal compares (710) the authentication information obtained by the second user input sensor with preset authentication information corresponding to the second user input sensor. For example, the second user input sensor in this embodiment is a voiceprint collection sensor. Continuing with this example, after voiceprint information is collected by the voiceprint collection sensor for the user of the mobile terminal, the voiceprint information is compared against preset (or previously stored) voiceprint information for the user of the mobile terminal. In this example, if the similarity between the obtained voiceprint information and preset voiceprint
information reaches a predetermined comparison threshold for the voiceprint collection sensor, the mobile terminal determines that the comparison is successful.
It should be noted that the implementation of operations 704-706 and operations 708-710 should not be understood to have a strict sequence. In one example, operations 704-706 are performed first and then operations 708-710 are performed. In this example, the user inputs authentication information by using the first user input sensor, and, after the comparison is successful, the user inputs authentication information by using the second user input sensor. In another example, operations 708-710 are performed first and then operations 704-706 are performed. In this example, the user inputs authentication information by using the second user input sensor, and, after the comparison is successful, the user inputs authentication information by using the first user input sensor. In a further example, operations 704-706 and operations 708-710 are performed in parallel.
The mobile terminal determines (712) whether comparison of the authentication information obtained by the first user input sensor and the authentication information obtained by the second input sensor is successful. For example, the comparison is performed to see (A) whether the first similarity between fingerprint information obtained by the fingerprint sensor (i.e., the first user input sensor) and previously collected and stored fingerprint information for the user satisfies a predetermined comparison threshold for the fingerprint sensor and (B) whether the second similarity between voiceprint information obtained by the voice sensor (i.e., the second user input sensor) and previously collected and stored voiceprint information for the user satisfies a predetermined comparison threshold for the voice sensor.
In some embodiments, authentication information for each authentication information input manner is collected for a user and stored by the mobile device in advance. For example, authentication information corresponding to a respective manner (e.g., by using a fingerprint collection sensor) is input by the user through a corresponding input sensor of the mobile terminal (or an external device in communication with the mobile terminal) in advance. When the mobile terminal detects that the authentication information input by the user is obtained through a certain authentication information input manner, the obtained
authentication information is compared with the previously stored authentication information corresponding to the authentication information input manner.
In some embodiments, after the first user input sensor and the second user input sensor obtain authentication information, the mobile terminal compares the authentication information from the first input sensor to previously stored authentication information for the first input sensor and authentication information from the second input sensor to previously stored authentication information for the second input sensor. In accordance with a determination that the comparison is successful, method 700 continues to operation 714. In accordance with a determination that the comparison is not successful, method 700 continues to operation 716, where the mobile terminal prompts the user to provide the two or more inputs again and repeats operation 702.
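As a hedged sketch of the decision in operations 712-716, the two comparisons can be expressed as a single check that requires both to succeed; the `match_first` and `match_second` callables are hypothetical placeholders for sensor-specific matchers (e.g., a fingerprint matcher and a voiceprint matcher), not part of the disclosed embodiments.

```python
def authenticate_with_two_sensors(first_input, second_input,
                                  stored_first, stored_second,
                                  match_first, match_second):
    """Both the first comparison (e.g., fingerprint) and the second
    comparison (e.g., voiceprint) must succeed before the identity of the
    user is confirmed."""
    first_ok = match_first(first_input, stored_first)
    second_ok = match_second(second_input, stored_second)
    if first_ok and second_ok:
        return "identity-confirmed"   # proceed as in operation 714
    return "re-prompt"                # operation 716: prompt the user again
```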
The mobile terminal confirms (714) the identity of the user of the mobile terminal according to a comparison result. In some embodiments, in accordance with a determination that the comparison result in operation 712 satisfies the predetermined comparison threshold, the mobile terminal confirms the identity of the user of the mobile terminal.
If the comparison of authentication information is successful, the mobile terminal determines that the current user is an authorized user, and the authentication result is displayed in the authentication interface. In some circumstances, if the comparison of authentication information is successful, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application. In some circumstances, if the comparison of authentication information is successful, the mobile terminal executes the online transaction. For example, the mobile terminal executes the online transaction by sending transaction information to a merchant’s server according to the determined user identity of the mobile terminal. In this example, the transaction information includes an item, an amount of money, and other account information corresponding to the determined user identity for the online transaction.
It should be noted that this embodiment is an example of the mobile terminal determining user identity by obtaining authentication information through two user input
sensors, and, in other optional embodiments, the mobile terminal may determine user identity by obtaining authentication information through three or more user input sensors.
It should be understood that the particular order in which the operations in Figure 7 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 600 and 800) are also applicable in an analogous manner to method 700 described above with respect to Figure 7.
Figures 8A-8C illustrate a flowchart diagram of a method 800 of authenticating a user in accordance with some embodiments. In some embodiments, method 800 is performed by a server with one or more processors and memory. For example, in some embodiments, method 800 is performed by server system 108 (Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2). In some embodiments, method 800 is governed by instructions that are stored in a non-transitory computer readable storage medium and the instructions are executed by one or more processors of the server system. Optional operations are indicated by dashed lines (e.g., boxes with dashed-line borders).
In some embodiments, prior to detecting the trigger condition, the server obtains (802) biometric information corresponding to the user. In some embodiments, server system 108 or a component thereof (e.g., biometric information obtaining module 220, Figure 2) obtains, from a client device 104, biometric information associated with a user of client device 104 during usage of the social networking platform. For example, client device 104 or a component thereof (e.g., biometric obtaining module 332, Figure 3) collects biometric information during usage of the social networking platform by a user of client device 104. In some embodiments, the collected biometric information includes retinal, facial, vocal, and fingerprint information.
In some embodiments, the background collection of biometric information by biometric obtaining module 332 may be disabled by the user of client device 104. In some embodiments, the collecting of biometric information is initiated when the user starts a process that may ultimately require the authentication process (e.g., an online shopping
process), starts a device reconfiguration process, starts a payment process, and the like. The collection of biometric information can be performed periodically during the shopping/configuration/payment process. For example, biometric obtaining module 332 turns on the retina scanner, front-facing camera, fingerprint sensor, or microphone(s) for a brief moment without any indication to the user.
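One possible shape of this background collection, sketched under the assumption that each modality exposes a zero-argument capture callable; the modality names and the sampling probability are illustrative, and collection is skipped entirely when the user has disabled it.

```python
import random

def collect_background_biometrics(capture_fns, user_opted_out, sample_probability=0.2):
    """Occasionally capture biometric samples during a shopping, device
    reconfiguration, or payment process.

    capture_fns maps a modality name (e.g., "face", "voice", "fingerprint")
    to a zero-argument capture callable; both the names and the sampling
    probability are assumptions for illustration."""
    if user_opted_out:
        return {}                                  # collection disabled by the user
    samples = {}
    for modality, capture in capture_fns.items():
        if random.random() < sample_probability:   # brief, occasional capture
            samples[modality] = capture()
    return samples
```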
The server detects (804) a trigger condition. In some embodiments, server system 108 or a component thereof (e.g., trigger detection module 226, Figure 2) detects a trigger condition as to a respective user of the social networking platform in response to receiving, from a client device 104, a notification of a user input corresponding to the trigger condition. For example, the notification indicates that a user input detected by client device 104 or a component thereof (e.g., trigger detection module 340, Figure 3) corresponds to initiation of an online payment, initiation of an online transaction, reactivation of client device 104 when locked, or powering on of client device 104 when off. In some embodiments, detection of the trigger condition by trigger detection module 226 is preceded by the user logging into client device 104 or an application (e.g., client-side module 102, which is associated with the social networking platform or another application) with login credentials.
For example, with reference to Figure 5A, in response to the selection of affordance 518-C via contact 520, client-side module 102 (e.g., associated with the social networking platform) sends a notification, to server system 108, indicating selection of affordance 518-C for payment processing associated with the social networking platform. Continuing with this example, server system 108 detects a trigger condition in response to receiving the notification. Continuing with this example, in response to detecting the trigger condition, server system 108 performs an authentication process so as to authenticate the user of client device 104 prior to processing the online transaction for item 516 from online merchant 510.
In response to detecting the trigger condition, the server performs (806) an authentication process so as to authenticate a user. In some embodiments, in response to detection of the trigger condition as to a respective user by trigger detection module 226, server system 108 or a component thereof (e.g., authentication module 228, Figure 2)
performs an authentication process so as to authenticate the respective user in order to perform a secured operation such as payment processing for an online transaction, unlocking of client device 104, or access to a secured application. For example, the authentication process is associated with an application into which the user previously logged in prior to detection of the trigger condition.
During the authentication process, the server dynamically selects (808) two or more distinct input modes for the authentication process based on one or more predetermined criteria. The two or more dynamically selected input modes are a subset of all possible input modes for user authentication available to the user device. The dynamic selection results in different input modes being selected depending on the actual conditions of the present authentication process, and/or some artificially introduced randomness in the selection process. In some embodiments, server system 108 or a component thereof (e.g., selecting module 232, Figure 2) dynamically selects two or more distinct input modes for the authentication process of the respective user based on one or more predetermined criteria. For example, the one or more predetermined criteria include environmental conditions associated with client device 104 of the respective user and/or authentication preferences specified by the respective user. For example, the authentication preferences are stored in a user profile for the respective user (e.g., authentication preferences field 458 in user profile 352 in Figure 4B).
In some embodiments, the two or more dynamically selected input modes for the authentication process differ (810) from two or more dynamically selected input modes for a previous authentication process as a result of a pseudo-random selection procedure used for the dynamic selections. In some embodiments, selecting module 232 dynamically selects the two or more input modes based on a random or pseudo-random selection procedure. As such, the two or more dynamically selected input modes for the current authentication process are different from the two or more dynamically selected input modes for a previous authentication process, which were also dynamically selected based on the random or pseudo-random selection procedure.
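A minimal sketch of such a pseudo-random selection, assuming a hypothetical list of available mode names; it simply re-draws when the draw equals the previous selection, so consecutive authentication processes tend to use different modes.

```python
import random

AVAILABLE_MODES = ["fingerprint", "voiceprint", "facial", "retinal", "sketch", "password"]

def select_modes_pseudorandomly(previous_selection, count=2, modes=AVAILABLE_MODES):
    """Pick a pseudo-random subset of input modes, avoiding an exact repeat
    of the previous authentication's selection. Mode names are assumptions."""
    selection = set(random.sample(modes, count))
    # Re-draw only while the draw repeats the last selection (possible only
    # when more than one distinct subset exists).
    while (previous_selection is not None
           and selection == set(previous_selection)
           and len(modes) > count):
        selection = set(random.sample(modes, count))
    return sorted(selection)
```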
In some embodiments, the two or more dynamically selected input modes for the authentication process differ (812) from two or more dynamically selected input modes for a previous authentication process as a result of differing environmental conditions
existing at the time of the dynamic selections. In some embodiments, server system 108 or a component thereof (e.g., environmental information obtaining module 230, Figure 2) obtains environmental information corresponding to environmental conditions associated with a client device 104 of the respective user. For example, the environmental information indicates: an absolute position of client device 104 (e.g., an address or latitudinal and longitudinal coordinates) based on a GPS positioning system of client device 104; an ambient temperature sensed by a temperature sensor of client device 104; a level of ambient sound detected by one or more microphones of client device 104; an ambient light level sensed by an ambient light sensor of client device 104; a velocity, acceleration, or change thereof based on one or more accelerometers of client device 104; and other information indicating environmental conditions associated with client device 104. In some embodiments, selecting module 232 dynamically selects the two or more input modes based on environmental information obtained by environmental information obtaining module 230 from client device 104. As such, the two or more dynamically selected input modes for the current authentication process are different from the two or more dynamically selected input modes for a previous authentication process because environmental conditions associated with client device 104 have changed between the previous authentication process and the current authentication process.
In some embodiments, the one or more predetermined criteria include (814) environmental conditions associated with a client device corresponding to the user. In some embodiments, server system 108 or a component thereof (e.g., environmental information obtaining module 230, Figure 2) obtains environmental information corresponding to environmental conditions associated with a client device 104 of the respective user. In some embodiments, selecting module 232 dynamically selects the two or more input modes based on environmental information obtained by environmental information obtaining module 230 from client device 104. For example, when the environmental information indicates that client device 104 is located in a moving environment (e.g., in a car), selecting module 232 dynamically selects input modes corresponding to vocal recognition and a spoken password because the user may only have the ability to talk to, but not touch, client device 104. In this example, the user is in a vehicle that is in motion such as a car, plane, bus, or train. Alternatively, in this example, the user is in motion, such as walking or cycling. In another example, when the environmental information indicates that client device 104 is
located in peak conditions (e.g., a user’s office or home with ample lighting, minimal ambient sound, and minimal movement), selecting module 232 dynamically selects input modes corresponding to facial recognition and fingerprint recognition.
In some embodiments, dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is located in a crowded environment; and in accordance with the determination that the user is located in a crowded environment, dynamically selecting a first authentication input mode corresponding to fingerprint recognition and a second authentication input mode corresponding to retinal recognition for the authentication process. For example, the environmental information indicates that client device 104 is located in a crowded area based on the location of client device 104 from a GPS positioning system, ambient temperature information, and ambient sound information in environmental information obtained by environmental information obtaining module 230 from client device 104. Continuing with this example, after determining that client device 104 is located in a crowded area, selecting module 232 dynamically selects input modes corresponding to fingerprint recognition and retinal recognition because they cannot be eavesdropped on by nearby persons.
In some embodiments, dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is in a low-light environment; and in accordance with the determination that the user is in a low-light environment, dynamically selecting a first authentication input mode corresponding to vocal recognition and a second authentication input mode corresponding to fingerprint recognition for the authentication process. For example, the environmental information indicates that client device 104 is located in a low light environment based on time/date information, the location of client device 104 from a GPS positioning system, and ambient light information in environmental information obtained by environmental information obtaining module 230 from client device 104. Continuing with this example, after determining that client device 104 is located in a low light environment, selecting module 232 dynamically selects input modes corresponding to vocal recognition and fingerprint recognition because the user may not be able to clearly see
the screen of client device 104 or the user may not want to waste battery power to turn on a backlight for the display of client device 104.
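The environment-driven choices in the preceding examples can be condensed into a small rule table. The sketch below assumes boolean flags (in_motion, crowded, low_light) that would be derived from GPS, accelerometer, ambient-light, and ambient-sound readings; the flag names and mode strings are illustrative, not terminology from the embodiments.

```python
def select_modes_by_environment(env):
    """Map environmental conditions to a pair of input modes, following the
    examples above."""
    if env.get("in_motion"):
        # The user can talk to, but may not be able to touch, the device.
        return ["vocal_recognition", "spoken_password"]
    if env.get("crowded"):
        # Fingerprint and retinal input cannot be eavesdropped on by bystanders.
        return ["fingerprint_recognition", "retinal_recognition"]
    if env.get("low_light"):
        # The screen may be hard to see; avoid modes that need the display or camera.
        return ["vocal_recognition", "fingerprint_recognition"]
    # Peak conditions: ample light, minimal ambient sound, minimal movement.
    return ["facial_recognition", "fingerprint_recognition"]
```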
In some embodiments, the one or more predetermined criteria include (816) an input mode preference in a user profile corresponding to the user. In some embodiments, server system 108 or a component thereof (e.g., selecting module 232, Figure 2) dynamically selects the two or more input modes based on an authentication preferences field in a user profile for the user of client device 104. For example, with reference to Figure 4B, user profile 352 for a respective user is stored in profiles database 116 by server system 108 and/or in client data 350 by a client device 104 associated with the respective user. In this example, user profile 352 includes authentication preferences field 458 indicating that the authentication input modes preferred by the respective user are retinal recognition, fingerprint recognition, and vocal recognition. Continuing with this example, selecting module 232 selects two of the authentication input modes preferred by the respective user from authentication preferences field 458 of user profile 352 for the respective user. In some embodiments, the user preference is used to pre-filter all available authentication modes before the random selection and/or selection based on environmental conditions. In some embodiments, the user preference may also post-filter the dynamically selected authentication modes to arrive at the final set of authentication modes presented to the user.
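A sketch of the preference pre-filter described above, assuming the preferences arrive as a simple list of mode names (e.g., read from authentication preferences field 458); the fallback to the full set when too few preferred modes remain is an added assumption, not behavior stated in the embodiments.

```python
import random

def select_modes_with_preferences(available, preferred, count=2):
    """Pre-filter the available modes by the user's authentication
    preferences, then make the random selection from what remains."""
    candidates = [mode for mode in available if mode in preferred]
    if len(candidates) < count:
        # Not enough preferred modes to choose from; fall back to all modes
        # (assumed fallback, chosen here only so the sketch always succeeds).
        candidates = list(available)
    return random.sample(candidates, count)
```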
During the authentication process, the server prompts (818) the user to provide two or more authentication inputs via the two or more dynamically selected input modes. For example, as part of the authentication process, server system 108 or a component thereof (e.g., prompting module 234, Figure 2) sends a user interface (e.g., a notification) or voice message to client device 104 for presentation to the user. For example, the user interface or voice message prompts the user to provide two or more authentication inputs via the two or more distinct input modes.
In some embodiments, prompting the user (e.g., via an authentication interface) further includes (820) presenting an indication of the first input mode for the first authentication input and the second input mode for the second authentication input. In some embodiments, prompting module 234 sends a user interface (e.g., a notification) for presentation on client device 104. In Figure 5C, for example, notification 524 prompts the user of client device 104 to provide authentication information for the authentication process
via a first input mode corresponding to vocal recognition (e.g., by providing a voice sample) and a second input mode corresponding to facial recognition (e.g., by capturing a headshot with a camera associated with client device 104). Alternatively, in Figure 5B, for example, notification 522 prompts the user of client device 104 to provide authentication inputs for the authentication process without specifying the input modes.
During the authentication process and after the prompting, the server obtains (822) a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode. In some embodiments, server system 108 or a component thereof (e.g., obtaining module 236, Figure 2) obtains a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode from client device 104 in response to the prompt in operation 818 and/or operation 820. For example, a respective authentication input is one of a retinal image, a fingerprint image, an image including a user’s face, a voice sample, a sketch code, a password, and the like.
In some embodiments, the first authentication input and the second authentication input are received (824) within a predefined time period. In some embodiments, the prompt in operation 818 also indicates a predefined time period within which the first and second authentication inputs must be entered. For example, if the first and second authentication inputs are not entered within a predefined time period, the authentication process fails or times out.
During the authentication process and in response to receiving the first authentication input and the second authentication input, the server authenticates (826) the user based on the first authentication input and the second authentication input. In some embodiments, as a part of the authentication process, server system 108 or a component thereof (e.g., processing module 238, Figure 2) processes the first authentication input and the second authentication input. For example, processing module 238 determines a vocal signature for a voice sample of the user, performs facial recognition on an image with a user’s face, performs fingerprint recognition on an image with a user’s fingerprint, or performs retinal recognition on an image with a user’s retina, and the like. Furthermore, after performing the processing, server system 108 or a component thereof (e.g., authenticating module 246, Figure 2) authenticates or denies authentication of the respective user.
In some embodiments, the server authenticates the user based on the first authentication input and the second authentication input by (828): determining a first authentication score by comparing the first authentication input against previously stored first authentication information corresponding to the first input mode; determining a second authentication score by comparing the second authentication input against previously stored second authentication information corresponding to the second input mode; and, in accordance with a determination that the first authentication score satisfies a first authentication threshold and the second authentication score satisfies a second authentication threshold, authenticating the user. In some embodiments, server system 108 or a component thereof (e.g., scoring module 240, Figure 2) determines a first authentication score by comparing the first authentication input against previously stored first authentication information corresponding to the first input mode and a second authentication score by comparing the second authentication input against previously stored second authentication information corresponding to the second input mode. For example, authentication information database 114, which is stored by server system 108, stores authentication information for users of the social networking platform including previously stored authentication information corresponding to the respective user for the first input mode and the second input mode. In another example, authentication information 354 stored by the client device 104 includes previously entered authentication information for each of the input modes. For example, the user enters authentication information for each of the input modes during initialization or setup of an account for the social networking platform or other application.
Thereafter, in some embodiments, server system 108 or a component thereof (e.g., scoring module 240, Figure 2) determines whether the first authentication score satisfies a first authentication threshold for the first input mode and whether the second authentication score satisfies a second authentication threshold for the second input mode. In some embodiments, the authentication score is a confidence score or a number of matching features between the input and the stored information. Alternatively, in some embodiments, an overall authentication score is computed based on a predefined algorithm that takes into account the confidence scores for each of the input modes. Furthermore, in some embodiments, server system 108 or a component thereof (e.g., authenticating module 246, Figure 2) authenticates the respective user in accordance with a determination that the first authentication score
satisfies a first authentication threshold for the first input mode and the second authentication score satisfies a second authentication threshold for the second input mode. In some embodiments, after authenticating the user, an online transaction is processed or client device 104 is unlocked.
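Operation 828 can be sketched as two score computations followed by two threshold checks. The per-mode `scorers` callables and the shape of the stored authentication information are assumptions; in practice they would be backed by authentication information database 114 or authentication information 354 and by real biometric matchers.

```python
def authenticate_by_scores(first_input, second_input, stored, thresholds, scorers):
    """Determine a score for each input against previously stored
    authentication information and authenticate only if both scores satisfy
    their per-mode thresholds.

    first_input/second_input are (mode, data) pairs; stored, thresholds, and
    scorers are dicts keyed by mode name (all hypothetical structures)."""
    first_mode, first_data = first_input
    second_mode, second_data = second_input
    first_score = scorers[first_mode](first_data, stored[first_mode])
    second_score = scorers[second_mode](second_data, stored[second_mode])
    if (first_score >= thresholds[first_mode]
            and second_score >= thresholds[second_mode]):
        return True   # authenticated: process the transaction or unlock the device
    return False      # deny; re-prompt or lock out according to policy
```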
In some embodiments, in accordance with a determination that the first authentication score does not satisfy the first authentication threshold or the second authentication score does not satisfy the second authentication threshold, server system 108 or a component thereof (e.g., prompting module 234, Figure 2) sends a notification to client device 104 for presentation to the user indicating that the user is not authentic and/or prompting the user to re-input authentication information. Alternatively, in some embodiments, the user does not get a second attempt at the authentication process, and, instead, the user is locked out of the application.
In some embodiments, the first authentication threshold and the second authentication threshold are based (830) at least in part on environmental conditions associated with a client device corresponding to the user. In some embodiments, server system 108 or a component thereof (e.g., adjusting module 244, Figure 2) adjusts at least one of the first authentication threshold for the first input mode and the second authentication threshold for the second input mode from a default value to a custom value based on environmental information corresponding to environmental conditions associated with a client device 104 of the respective user obtained by environmental information obtaining module 230. For example, when the environmental information indicates that client device 104 is located in a low light or in-motion environment, adjusting module 244 decreases the authentication threshold associated with facial and retinal recognition. In another example, when the environmental information indicates that client device 104 is located in a noisy environment, adjusting module 244 decreases the authentication threshold associated with voice authentication. In another example, adjusting module 244 increases an authentication threshold associated with facial recognition when the environmental information indicates that client device 104 is located in an environment that makes capturing an image more accurate.
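A sketch of the threshold adjustment in operation 830, assuming per-mode default thresholds and boolean environment flags; the 0.8 and 1.1 adjustment factors are illustrative values, not values taken from the embodiments.

```python
def adjust_thresholds(default_thresholds, env):
    """Relax or tighten per-mode authentication thresholds based on
    environmental conditions associated with the client device."""
    adjusted = dict(default_thresholds)
    if env.get("low_light") or env.get("in_motion"):
        # Facial and retinal captures are less reliable here; lower their bars.
        for mode in ("facial_recognition", "retinal_recognition"):
            if mode in adjusted:
                adjusted[mode] *= 0.8
    if env.get("noisy") and "vocal_recognition" in adjusted:
        adjusted["vocal_recognition"] *= 0.8
    if env.get("good_lighting") and "facial_recognition" in adjusted:
        # Image capture is more accurate; raise the facial threshold.
        adjusted["facial_recognition"] *= 1.1
    return adjusted
```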
In some embodiments, in response to receiving the first authentication input and the second authentication input, the server (832): determines whether at least one of the first authentication input or the second authentication input matches the obtained biometric information; in accordance with a determination that at least one of the first authentication input or the second authentication input matches the obtained biometric information, increases at least one of the first authentication score and the second authentication score; and, in accordance with a determination that the obtained biometric information does not match the first authentication input or the second authentication input, notifies the user to re-input authentication information. In some embodiments, server system 108 or a component thereof (e.g., scoring module 240, Figure 2) increases an authentication score for an authentication input when the previously collected biometric information matches the authentication input. Alternatively, in some embodiments, server system 108 or a component thereof notifies the user to re-input authentication information or decreases an authentication score for an authentication input when the previously collected biometric information does not match the authentication input. For example, a thief steals the user’s phone and uses his own fingerprints to initiate an online transaction but uses a forged fingerprint mimicking the user’s to attempt to pass the authentication process. In this example, the collected biometric information includes the thief’s fingerprint used to initiate the online transaction.
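The cross-check in operation 832 can be sketched as a small score adjustment around a hypothetical `matcher` callable; the boost and penalty amounts are illustrative assumptions.

```python
def apply_biometric_cross_check(score, auth_input, background_biometric, matcher,
                                boost=0.1, penalty=0.1):
    """Compare an authentication input against biometric information
    collected in the background: a match increases the authentication score,
    while a mismatch decreases it and asks the user to re-input."""
    if matcher(auth_input, background_biometric):
        return score + boost, None
    return score - penalty, "re-input-required"
```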
It should be understood that the particular order in which the operations in Figures 8A-8C have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g., methods 600 and 700) are also applicable in an analogous manner to method 800 described above with respect to Figures 8A-8C.
Figure 9 is a block diagram of a client-side module 102 in accordance with some embodiments. In some embodiments, client-side module 102 is executed on client device 104 (e.g., the mobile terminal), and client-side module 102 corresponds to the social networking platform or another application. In some embodiments, client device 104 is a mobile device such as a smart phone (e.g., an Android™ mobile phone or an iOS™ mobile
phone), a tablet computer, a laptop computer, a mobile Internet device (MID), or a wearable computing device. In some embodiments, client-side module 102 includes the following modules: identity authentication triggering module 902, authentication information comparing module 904, identity confirming module 906, online transaction execution module 908, and unlocking module 910. Optional modules are indicated by dashed lines (e.g., boxes with dashed-line borders).
In some embodiments, identity authentication triggering module 902 is configured to display an identity authentication interface and detect whether authentication information input by a user is obtained through any one of at least two preset authentication information input manners. Optionally, in some embodiments, identity authentication triggering module 902 further includes an input sensor start sub-unit 922. Optionally, in some embodiments, identity authentication triggering module 902 further includes an authentication input detection sub-unit 924.
In some embodiments, the input sensor start sub-unit 922 is configured to start at least two user input sensors of the mobile terminal. The mobile terminal includes at least two user input sensors, and each user input sensor respectively corresponds to one authentication information input manner. The input sensor start sub-unit 922 is triggered, after the identity authentication interface is displayed, to start the at least two user input sensors so as to obtain authentication information input by the user. In some embodiments, the user input sensors include a fingerprint collection sensor, a voiceprint sensor, a touch screen input sensor, a face recognition sensor, an iris recognition sensor, and a keyboard input sensor.
In some embodiments, the authentication input detection sub-unit 924 is configured to detect whether any user input sensor of the at least two started user input sensors obtains authentication information input by the user. Further, in some embodiments, the at least two user input sensors are triggered for a preset time window after the identity authentication interface is displayed. Thus, the user must enter the authentication information within the preset time window or a time expiration prompt is presented to the user and the at least two user input sensors are turned off. For example, the preset time window is 30 seconds, 1 minute, or the like.
In some embodiments, authentication information comparing module 904 is configured to: if the authentication information input by the user is obtained through any authentication information input manner, compare the authentication information with preset authentication information corresponding to the authentication information input manner.
In some embodiments, identity confirming module 906 is configured to confirm the identity of the user of the mobile terminal when the comparison result from authentication information comparing module 904 satisfies a predetermined comparison threshold.
Optionally, in some embodiments, after confirming the identity of the user with identity confirming module 906, online transaction execution module 908 is configured to execute an online transaction by sending transaction information to a merchant’s server according to the determined user identity of the mobile terminal. In some embodiments, the transaction information includes an item, an amount of money, and other account information corresponding to the determined user identity for the online transaction.
Optionally, in some embodiments, after confirming the identity of the user with identity confirming module 906, unlocking module 910 is configured to unlock the mobile terminal according to the determined user identity of the mobile terminal. Specifically, in some circumstances, the mobile terminal is unlocked or the current user is provided with the ability to use the secured application.
While particular embodiments are described above, it will be understood that it is not intended to limit the application to these particular embodiments. On the contrary, the application includes alternatives, modifications and equivalents that are within the spirit and scope of the appended claims. Numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
Claims (20)
- A method of authenticating a user, comprising:
  at a server system with one or more processors and memory:
    detecting a trigger condition;
    in response to detecting the trigger condition, performing an authentication process so as to authenticate the user, the authentication process including:
      dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria;
      prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes;
      after the prompting, obtaining a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode; and
      in response to receiving the first authentication input and the second authentication input, authenticating the user based on the first authentication input and the second authentication input.
- The method of claim 1, wherein the two or more dynamically selected input modes for the authentication process differ from two or more dynamically selected input modes for a previous authentication process as a result of a pseudo-random selection procedure used for the dynamic selections.
- The method of claim 1, wherein the two or more dynamically selected input modes for the authentication process differ from two or more dynamically selected input modes for a previous authentication process as a result of differing environmental conditions existing at the time of the dynamic selections.
- The method of any of claims 1-3, wherein prompting the user further includes presenting an indication of the first input mode for the first authentication input and the second input mode for the second authentication input.
- The method of any of claims 1-4, wherein the one or more predetermined criteria include environmental conditions associated with a client device corresponding to the user.
- The method of claim 5, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes:
  determining that the user is located in a crowded environment; and
  in accordance with the determination that the user is located in a crowded environment, dynamically selecting a first authentication input mode corresponding to fingerprint recognition and a second authentication input mode corresponding to retinal recognition for the authentication process.
- The method of claim 5, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes:
  determining that the user is in a low-light environment; and
  in accordance with the determination that the user is in a low-light environment, dynamically selecting a first authentication input mode corresponding to vocal recognition and a second authentication input mode corresponding to fingerprint recognition for the authentication process.
- The method of any of claims 1-4, wherein the one or more predetermined criteria include an input mode preference in a user profile corresponding to the user.
- The method of any of claims 1-8, wherein authenticating the user based on the first authentication input and the second authentication input further includes:
  determining a first authentication score by comparing the first authentication input against previously stored first authentication information corresponding to the first input mode;
  determining a second authentication score by comparing the second authentication input against previously stored second authentication information corresponding to the second input mode; and
  in accordance with a determination that the first authentication score satisfies a first authentication threshold and the second authentication score satisfies a second authentication threshold, authenticating the user.
- The method of claim 9, wherein the first authentication threshold and the second authentication threshold are based at least in part on environmental conditions associated with a client device corresponding to the user.
- The method of any of claims 9-10, further comprising: prior to detecting the trigger condition, obtaining biometric information corresponding to the user; in response to receiving the first authentication input and the second authentication input, determining whether at least one of the first authentication input or the second authentication input matches the obtained biometric information; in accordance with a determination that at least one of the first authentication input or the second authentication input matches the obtained biometric information, increasing at least one of the first authentication score and the second authentication score; and in accordance with a determination that the obtained biometric information does not match the first authentication input or the second authentication input, notifying the user to re-input authentication information.
- A server system, comprising: one or more processors; and memory storing one or more programs to be executed by the one or more processors, the one or more programs comprising instructions for: detecting a trigger condition; in response to detecting the trigger condition, performing an authentication process so as to authenticate a user, the authentication process including: dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria; prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes; after the prompting, obtaining a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode; and in response to receiving the first authentication input and the second authentication input, authenticating the user based on the first authentication input and the second authentication input.
- The server system of claim 12, wherein the two or more dynamically selected input modes for the authentication process differ from two or more dynamically selected input modes for a previous authentication process as a result of differing environmental conditions existing at the time of the dynamic selections.
- The server system of any of claims 12-13, wherein the one or more predetermined criteria include environmental conditions associated with a client device corresponding to the user.
- The server system of claim 14, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is located in a crowded environment; and in accordance with the determination that the user is located in a crowded environment, dynamically selecting a first authentication input mode corresponding to fingerprint recognition and a second authentication input mode corresponding to retinal recognition for the authentication process.
- The server system of claim 14, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is in a low-light environment; and in accordance with the determination that the user is in a low-light environment, dynamically selecting a first authentication input mode corresponding to vocal recognition and a second authentication input mode corresponding to fingerprint recognition for the authentication process.
- A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a server system with one or more processors, cause the server system to perform operations comprising: detecting a trigger condition; in response to detecting the trigger condition, performing an authentication process so as to authenticate a user, the authentication process including: dynamically selecting two or more distinct input modes for the authentication process based on one or more predetermined criteria; prompting the user to provide two or more authentication inputs via the two or more dynamically selected input modes; after the prompting, obtaining a first authentication input via a first input mode and a second authentication input via a second input mode distinct from the first input mode; and in response to receiving the first authentication input and the second authentication input, authenticating the user based on the first authentication input and the second authentication input.
- The non-transitory computer readable storage medium of claim 17, wherein the one or more predetermined criteria include environmental conditions associated with a client device corresponding to the user.
- The non-transitory computer readable storage medium of claim 18, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is located in a crowded environment; and in accordance with the determination that the user is located in a crowded environment, dynamically selecting a first authentication input mode corresponding to fingerprint recognition and a second authentication input mode corresponding to retinal recognition for the authentication process.
- The non-transitory computer readable storage medium of claim 18, wherein dynamically selecting the two or more distinct input modes for the authentication process based on one or more predetermined criteria further includes: determining that the user is in a low-light environment; and in accordance with the determination that the user is in a low-light environment, dynamically selecting a first authentication input mode corresponding to vocal recognition and a second authentication input mode corresponding to fingerprint recognition for the authentication process.
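The claimed flow is easier to follow as executable logic. The sketch below is a minimal Python illustration of the claim 1 sequence (trigger detection, dynamic selection of two distinct input modes, prompting, collecting both inputs, authenticating on both); it is not the patented implementation, and the helper callables (prompt_user, collect_input, verify_input) and the mode list are assumptions introduced only for illustration.

```python
# Minimal sketch of the claim 1 authentication flow. All helper callables and
# mode names are hypothetical; a real server would back them with device I/O
# and stored enrollment data.
import random

AVAILABLE_MODES = ["password", "fingerprint", "voice", "face", "retina"]  # assumed mode set


def select_two_modes(criteria: dict) -> tuple[str, str]:
    """Dynamically select two distinct input modes (claims 1-2).

    A fresh pseudo-random draw means the selected pair can differ from the pair
    chosen for a previous authentication process; predetermined criteria (for
    example a user-profile preference, claim 8) narrow the candidate set first.
    """
    candidates = criteria.get("allowed_modes", AVAILABLE_MODES)
    first, second = random.sample(candidates, k=2)  # two distinct modes
    return first, second


def run_authentication(trigger_detected: bool, criteria: dict,
                       prompt_user, collect_input, verify_input) -> bool:
    """Perform the authentication process in response to a trigger (claim 1)."""
    if not trigger_detected:
        return False
    first_mode, second_mode = select_two_modes(criteria)
    prompt_user(first_mode, second_mode)          # indicate both selected modes (claim 4)
    first_input = collect_input(first_mode)       # first authentication input
    second_input = collect_input(second_mode)     # second authentication input
    # Authenticate only when both inputs verify against their respective modes.
    return verify_input(first_mode, first_input) and verify_input(second_mode, second_input)


if __name__ == "__main__":
    # Toy usage with stand-in callables; real verification is mode-specific.
    ok = run_authentication(
        trigger_detected=True,
        criteria={"allowed_modes": ["password", "fingerprint", "voice"]},
        prompt_user=lambda a, b: print(f"Please provide {a} and {b} inputs"),
        collect_input=lambda mode: f"sample-{mode}-input",
        verify_input=lambda mode, value: value.startswith("sample-"),
    )
    print("authenticated" if ok else "rejected")
```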
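Claims 5-7 (and the parallel server-system and storage-medium claims) tie the dynamic selection to environmental conditions of the client device. The sketch below shows one way such a rule table could look; the environment fields (crowded, ambient_lux) and the 10-lux cut-off are assumptions for illustration, not values from the specification.

```python
# Illustrative environment-driven mode selection (claims 5-7). Field names and
# thresholds are assumptions made for this sketch.

def select_modes_for_environment(env: dict) -> tuple[str, str]:
    # Crowded surroundings: spoken or typed secrets could be observed by
    # bystanders, so pair fingerprint recognition with retinal recognition (claim 6).
    if env.get("crowded", False):
        return ("fingerprint", "retina")
    # Low ambient light: camera-based capture may be unreliable, so pair vocal
    # recognition with fingerprint recognition (claim 7).
    if env.get("ambient_lux", 1000.0) < 10.0:  # assumed low-light threshold
        return ("voice", "fingerprint")
    # Otherwise fall back to an assumed default pairing.
    return ("password", "fingerprint")


if __name__ == "__main__":
    print(select_modes_for_environment({"crowded": True}))     # ('fingerprint', 'retina')
    print(select_modes_for_environment({"ambient_lux": 2.5}))  # ('voice', 'fingerprint')
    print(select_modes_for_environment({}))                    # ('password', 'fingerprint')
```

Because the environment can change between attempts, the same rule table naturally yields different mode pairs for successive authentication processes, which is the behaviour recited in claims 3 and 13.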
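Claims 9-11 describe a score-and-threshold check in which each threshold may depend on the environment and a score is raised when biometric information obtained before the trigger matches one of the inputs. Below is a hedged sketch of that logic; the dataclass layout, the scoring callables, the boost value, and the callbacks (prior_biometric_matches, notify_reinput) are illustrative assumptions.

```python
# Sketch of the two-score verification of claims 9-11. Scoring functions,
# thresholds, and the boost amount are illustrative, not from the specification.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ModeCheck:
    score: Callable[[str, str], float]  # compares an input against stored enrollment data
    stored: str                         # previously stored authentication information
    threshold: float                    # may be tightened or relaxed per environment (claim 10)


def authenticate_with_scores(first_input: str, second_input: str,
                             first_check: ModeCheck, second_check: ModeCheck,
                             prior_biometric_matches, notify_reinput,
                             boost: float = 0.05) -> bool:
    """Return True when both authentication scores satisfy their thresholds (claim 9)."""
    s1 = first_check.score(first_input, first_check.stored)
    s2 = second_check.score(second_input, second_check.stored)
    # Claim 11: biometric information obtained before the trigger either raises the
    # scores (when it matches an input) or prompts the user to re-input.
    if prior_biometric_matches(first_input) or prior_biometric_matches(second_input):
        s1 += boost
        s2 += boost
    else:
        notify_reinput()
        return False
    return s1 >= first_check.threshold and s2 >= second_check.threshold


if __name__ == "__main__":
    def exact_match(given: str, stored: str) -> float:
        return 1.0 if given == stored else 0.0

    fp = ModeCheck(score=exact_match, stored="fp-template", threshold=0.9)
    voice = ModeCheck(score=exact_match, stored="voice-template", threshold=0.9)
    ok = authenticate_with_scores(
        "fp-template", "voice-template", fp, voice,
        prior_biometric_matches=lambda value: value == "fp-template",
        notify_reinput=lambda: print("Please re-input authentication information"),
    )
    print("authenticated" if ok else "rejected")
```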
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310518398.3A CN104579670A (en) | 2013-10-28 | 2013-10-28 | Mobile terminal authentication method and mobile terminal |
CN201310518398.3 | 2013-10-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015062382A1 (en) | 2015-05-07 |
Family
ID=53003299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/087538 WO2015062382A1 (en) | Method and system for authenticating user of mobile terminal | 2013-10-28 | 2014-09-26 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN104579670A (en) |
TW (1) | TWI543012B (en) |
WO (1) | WO2015062382A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10410216B2 (en) | 2014-04-29 | 2019-09-10 | Mastercard International Incorporated | Methods and systems for verifying individuals prior to benefits distribution |
US20160364703A1 (en) * | 2015-06-09 | 2016-12-15 | Mastercard International Incorporated | Systems and Methods for Verifying Users, in Connection With Transactions Using Payment Devices |
US10817878B2 (en) | 2015-06-09 | 2020-10-27 | Mastercard International Incorporated | Systems and methods for verifying users, in connection with transactions using payment devices |
WO2017028250A1 (en) * | 2015-08-18 | 2017-02-23 | 张焰焰 | Method and mobile terminal for authenticating account login via voice and fingerprint |
WO2017028248A1 (en) * | 2015-08-18 | 2017-02-23 | 张焰焰 | Method and mobile terminal for indicating patent information upon voice login to account |
WO2017028251A1 (en) * | 2015-08-18 | 2017-02-23 | 张焰焰 | Method and mobile terminal for indicating information after authenticating account login with voice and fingerprint |
WO2017028247A1 (en) * | 2015-08-18 | 2017-02-23 | 张焰焰 | Method and mobile terminal for logging in to account with combination of voice, numeric password and fingerprint |
CN105141609B (en) * | 2015-08-28 | 2018-09-04 | 广东欧珀移动通信有限公司 | Fingerprint authentication method and relevant apparatus and fingerprint verification system |
CN105516983A (en) * | 2016-01-28 | 2016-04-20 | 宇龙计算机通信科技(深圳)有限公司 | Authentication method and authentication device |
CN105844468A (en) * | 2016-03-17 | 2016-08-10 | 上海新储集成电路有限公司 | Mobile-terminal ultra-low power consumption and high safety communication method |
CN106485123A (en) * | 2016-10-17 | 2017-03-08 | 信利光电股份有限公司 | A kind of cold screen awakening method and intelligent terminal |
CN108172221A (en) * | 2016-12-07 | 2018-06-15 | 广州亿航智能技术有限公司 | The method and apparatus of manipulation aircraft based on intelligent terminal |
CN107483717A (en) * | 2017-07-19 | 2017-12-15 | 广东欧珀移动通信有限公司 | The method to set up and Related product of infrared light compensating lamp |
CN108418829B (en) * | 2018-03-22 | 2020-10-27 | 平安科技(深圳)有限公司 | Account login verification method and device, computer equipment and storage medium |
CN108833359A (en) * | 2018-05-22 | 2018-11-16 | 深圳市商汤科技有限公司 | Auth method, device, equipment, storage medium and program |
CN108809983A (en) * | 2018-06-12 | 2018-11-13 | 北京智明星通科技股份有限公司 | A kind of method, apparatus and system for ensureing account safety and logging in |
CN109165490A (en) * | 2018-07-24 | 2019-01-08 | 北京全知科技有限公司 | A kind of data inputting method and device |
CN109815669A (en) * | 2019-01-14 | 2019-05-28 | 平安科技(深圳)有限公司 | Authentication method and server based on recognition of face |
CN114495338A (en) * | 2022-03-10 | 2022-05-13 | 珠海格力电器股份有限公司 | Door lock control method and device, electronic equipment and storage medium |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1581208A (en) * | 2004-05-19 | 2005-02-16 | 杭州中正生物认证技术有限公司 | Identificate-indentifying method using finger print sensor as well as being as inputting apparatus |
CN201838011U (en) * | 2010-05-26 | 2011-05-18 | 中国科学院自动化研究所 | Identity authentication equipment based on second-generation ID cards and multimode biological features |
2013
- 2013-10-28: CN application CN201310518398.3A (published as CN104579670A), active, Pending
2014
- 2014-09-26: WO application PCT/CN2014/087538 (published as WO2015062382A1), active, Application Filing
- 2014-10-13: TW application TW103135406A (published as TWI543012B), active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020087894A1 (en) * | 2001-01-03 | 2002-07-04 | Foley James M. | Method and apparatus for enabling a user to select an authentication method |
US8171298B2 (en) * | 2002-10-30 | 2012-05-01 | International Business Machines Corporation | Methods and apparatus for dynamic user authentication using customizable context-dependent interaction across multiple verification objects |
US20050268107A1 (en) * | 2003-05-09 | 2005-12-01 | Harris William H | System and method for authenticating users using two or more factors |
US20070005988A1 (en) * | 2005-06-29 | 2007-01-04 | Microsoft Corporation | Multimodal authentication |
CN101197678A (en) * | 2007-12-27 | 2008-06-11 | 腾讯科技(深圳)有限公司 | Picture identifying code generation method and generation device |
US8316436B2 (en) * | 2009-03-27 | 2012-11-20 | Sony Corporation | User-defined multiple input mode authentication |
US20110047608A1 (en) * | 2009-08-24 | 2011-02-24 | Richard Levenberg | Dynamic user authentication for access to online services |
Cited By (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11468155B2 (en) | 2007-09-24 | 2022-10-11 | Apple Inc. | Embedded authentication systems in an electronic device |
US11676373B2 (en) | 2008-01-03 | 2023-06-13 | Apple Inc. | Personal computing device control using face detection and recognition |
US11200309B2 (en) | 2011-09-29 | 2021-12-14 | Apple Inc. | Authentication with secondary approver |
US11755712B2 (en) | 2011-09-29 | 2023-09-12 | Apple Inc. | Authentication with secondary approver |
US11768575B2 (en) | 2013-09-09 | 2023-09-26 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US11494046B2 (en) | 2013-09-09 | 2022-11-08 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US11287942B2 (en) | 2013-09-09 | 2022-03-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces |
US11836725B2 (en) | 2014-05-29 | 2023-12-05 | Apple Inc. | User interface for payments |
US11783305B2 (en) | 2015-06-05 | 2023-10-10 | Apple Inc. | User interface for loyalty accounts and private label accounts for a wearable device |
US11321731B2 (en) | 2015-06-05 | 2022-05-03 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11734708B2 (en) | 2015-06-05 | 2023-08-22 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11206309B2 (en) | 2016-05-19 | 2021-12-21 | Apple Inc. | User interface for remote authorization |
US12002042B2 (en) | 2016-06-11 | 2024-06-04 | Apple, Inc | User interface for transactions |
US11481769B2 (en) | 2016-06-11 | 2022-10-25 | Apple Inc. | User interface for transactions |
US11037150B2 (en) | 2016-06-12 | 2021-06-15 | Apple Inc. | User interfaces for transactions |
US11900372B2 (en) | 2016-06-12 | 2024-02-13 | Apple Inc. | User interfaces for transactions |
US11074572B2 (en) | 2016-09-06 | 2021-07-27 | Apple Inc. | User interfaces for stored-value accounts |
US12079458B2 (en) | 2016-09-23 | 2024-09-03 | Apple Inc. | Image data for enhanced user interactions |
US11995171B2 (en) | 2016-10-25 | 2024-05-28 | Apple Inc. | User interface for managing access to credentials for use in an operation |
US11574041B2 (en) | 2016-10-25 | 2023-02-07 | Apple Inc. | User interface for managing access to credentials for use in an operation |
US11393258B2 (en) | 2017-09-09 | 2022-07-19 | Apple Inc. | Implementation of biometric authentication |
EP4155988A1 (en) * | 2017-09-09 | 2023-03-29 | Apple Inc. | Implementation of biometric authentication for performing a respective function |
US11386189B2 (en) | 2017-09-09 | 2022-07-12 | Apple Inc. | Implementation of biometric authentication |
US11765163B2 (en) | 2017-09-09 | 2023-09-19 | Apple Inc. | Implementation of biometric authentication |
EP3779780A1 (en) * | 2017-09-09 | 2021-02-17 | Apple Inc. | Implementation of biometric authentication with first and second form of authentication |
CN108804263B (en) * | 2018-05-03 | 2021-08-24 | 北京金山安全软件有限公司 | Terminal verification method, device and computer readable medium |
CN108804263A (en) * | 2018-05-03 | 2018-11-13 | 北京金山安全软件有限公司 | Terminal verification method, device and computer readable medium |
US11170085B2 (en) | 2018-06-03 | 2021-11-09 | Apple Inc. | Implementation of biometric authentication |
US11928200B2 (en) | 2018-06-03 | 2024-03-12 | Apple Inc. | Implementation of biometric authentication |
US11619991B2 (en) | 2018-09-28 | 2023-04-04 | Apple Inc. | Device control using gaze information |
US11100349B2 (en) | 2018-09-28 | 2021-08-24 | Apple Inc. | Audio assisted enrollment |
US12105874B2 (en) | 2018-09-28 | 2024-10-01 | Apple Inc. | Device control using gaze information |
US11809784B2 (en) | 2018-09-28 | 2023-11-07 | Apple Inc. | Audio assisted enrollment |
US11688001B2 (en) | 2019-03-24 | 2023-06-27 | Apple Inc. | User interfaces for managing an account |
US11669896B2 (en) | 2019-03-24 | 2023-06-06 | Apple Inc. | User interfaces for managing an account |
US11610259B2 (en) | 2019-03-24 | 2023-03-21 | Apple Inc. | User interfaces for managing an account |
US11328352B2 (en) | 2019-03-24 | 2022-05-10 | Apple Inc. | User interfaces for managing an account |
CN110442033A (en) * | 2019-07-30 | 2019-11-12 | 恒大智慧科技有限公司 | Authority control method, device, computer equipment and the storage medium of home equipment |
US11816194B2 (en) | 2020-06-21 | 2023-11-14 | Apple Inc. | User interfaces for managing secure operations |
US11907342B2 (en) | 2020-11-20 | 2024-02-20 | Qualcomm Incorporated | Selection of authentication function according to environment of user device |
WO2022109508A1 (en) * | 2020-11-20 | 2022-05-27 | Qualcomm Incorporated | Selection of authentication function according to environment of user device |
US12099586B2 (en) | 2021-01-25 | 2024-09-24 | Apple Inc. | Implementation of biometric authentication |
US12124770B2 (en) | 2023-08-24 | 2024-10-22 | Apple Inc. | Audio assisted enrollment |
Also Published As
Publication number | Publication date |
---|---|
TWI543012B (en) | 2016-07-21 |
TW201516731A (en) | 2015-05-01 |
CN104579670A (en) | 2015-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015062382A1 (en) | Method and system for authenticating user of mobile terminal | |
US10242364B2 (en) | Image analysis for user authentication | |
US11005834B2 (en) | Method and system for facilitating wireless network access | |
US12032668B2 (en) | Identifying and authenticating users based on passive factors determined from sensor data | |
US10776479B2 (en) | Authentication system | |
CN108140083B (en) | Authorizing transactions on a shared device using a personal device | |
US11194594B2 (en) | Methods and systems for detecting a user and intelligently altering user device settings | |
JP6820062B2 (en) | Identity authentication methods and devices, terminals and servers | |
CN110178179B (en) | Voice signature for authenticating to electronic device users | |
US20150186892A1 (en) | Methods and systems for verifying a transaction | |
US20170006028A1 (en) | System and Method to Authenticate Electronics Using Electronic-Metrics | |
WO2020006252A1 (en) | Biometric authentication | |
US20150161613A1 (en) | Methods and systems for authentications and online transactions | |
WO2015101036A1 (en) | Methods and systems for verifying a transaction | |
US9576135B1 (en) | Profiling user behavior through biometric identifiers | |
KR20200002785A (en) | Method and apparatus for constructing biometric feature database | |
EP2853073A1 (en) | User-based identification system for social networks | |
JP2015515694A (en) | Location-based access control for portable electronic devices | |
US9049211B1 (en) | User challenge using geography of previous login | |
AU2018217220A1 (en) | Methods and systems for capturing biometric data | |
US20230105850A1 (en) | Systems and methods for conducting remote user authentication | |
US20230359719A1 (en) | A computer implemented method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14859063; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/09/2016) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 14859063; Country of ref document: EP; Kind code of ref document: A1 |