WO2015103970A1 - Method, apparatus and system for authenticating user - Google Patents

Method, apparatus and system for authenticating user

Info

Publication number
WO2015103970A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
face
data
terminal device
user identification
Prior art date
Application number
PCT/CN2015/070226
Other languages
French (fr)
Inventor
Danqing SUN
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of WO2015103970A1 publication Critical patent/WO2015103970A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Definitions

  • the present application generally relates to the field of information security, and more particularly to a method and related apparatus and system for authenticating a user associated with transferring data.
  • passwords are commonly used in user authentication systems associated with secure data transfers.
  • some known systems authenticate a user by requesting the user to enter a preset password, and then allow the user to perform data transfer only if the password entered by the user is verified by the systems.
  • Such known systems are typically vulnerable to security risks such as a hacked or stolen password caused by, for example, intentional peeping, password stealing trojans, password phishing websites, and/or the like.
  • memorization of a password string can be challenging for some users.
  • a method for authenticating a user at a server device associated with transferring data is disclosed.
  • the method is performed at the server device, which includes one or more processors and memory for storing programs to be executed by the one or more processors.
  • the method includes receiving, from a terminal device associated with the user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user.
  • the request for transferring data includes a request for transferring a financial asset from an account of the user.
  • the method also includes generating a matching result at the server device by comparing the data associated with the user’s face with face data identified by the user identification that is stored at the server device.
  • the matching result indicates either a match between the data associated with the user’s face and the stored face data identified by the user identification or no match between the data associated with the user’s face and the stored face data identified by the user identification.
  • the face data identified by the user identification was provided by the user to the server device prior to the server device receiving the data associated with the user’s face and the user identification.
  • the server device receives updated face data identified by the user identification, and then replaces the stored face data identified by the user identification with the updated face data identified by the user identification.
  • in generating the matching result, the method includes calculating a matching value based on the data associated with the user’s face and the stored face data identified by the user identification, and then comparing the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, a matching result is generated to indicate a match between the data associated with the user’s face and the stored face data identified by the user identification. Otherwise, if the matching value is not greater than the predefined matching threshold, a matching result is generated to indicate no match between the data associated with the user’s face and the stored face data identified by the user identification.
  • the method further includes sending the matching result from the server device to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.
  • the sending the matching result includes sending the matching result from the server device to the terminal device such that if the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device performs data transfer in compliance with the request; and if the matching result indicates no match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device sends a message to the user declining the request.
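The matching rule summarized above reduces to a single threshold comparison. The following is a minimal sketch in Python, left deliberately abstract because the application does not specify how the matching value is computed; the function names and the threshold value are illustrative assumptions, not part of the disclosure.

```python
from typing import Callable

# Predefined matching threshold; the concrete value is an illustrative assumption.
MATCHING_THRESHOLD = 0.8

def generate_matching_result(
    received_face_data: bytes,
    stored_face_data: bytes,
    compute_matching_value: Callable[[bytes, bytes], float],
) -> bool:
    """Return True ("match") only if the matching value exceeds the threshold.

    A matching value exactly equal to the threshold yields "no match", mirroring
    the strict "greater than" comparison described above.
    """
    matching_value = compute_matching_value(received_face_data, stored_face_data)
    return matching_value > MATCHING_THRESHOLD
```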
  • an apparatus (e.g., a server device) includes one or more processors and memory storing one or more programs for execution by the one or more processors. The one or more programs include instructions that cause the apparatus to perform the method for authenticating a user associated with transferring data as described above.
  • a non-transitory computer readable storage medium stores one or more programs including instructions for execution by one or more processors. The instructions, when executed by the one or more processors, cause the processors to perform the operations of authenticating a user associated with transferring data as described above.
  • FIG. 1 is a flowchart illustrating a method performed at a terminal device for authenticating a user associated with transferring data in accordance with some embodiments.
  • FIG. 2 is a flowchart illustrating a method performed at a server device for authenticating a user associated with transferring data in accordance with some embodiments.
  • FIG. 3 is a block diagram of a terminal device configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • FIG. 4 is a block diagram illustrating structure of a terminal device in accordance with some embodiments.
  • FIG. 5 is a block diagram of a server device configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • FIG. 6 is a block diagram illustrating structure of a server device in accordance with some embodiments.
  • FIG. 7 is a schematic diagram illustrating a system configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • FIG. 1 is a flowchart illustrating a method 100 performed at a terminal device for authenticating a user associated with transferring data in accordance with some embodiments.
  • the terminal device performing the method 100 can be any type of device that is configured to authenticate a user operating the terminal device associated with a data transfer operation, and perform the data transfer for the user in response to the user being authenticated.
  • the terminal device can be operatively coupled to and communicate with a server device via one or more network (s) (e.g., the Internet) . Details of a system including a terminal device and a server device configured to perform user authentication associated with transferring data are shown and described below with respect to FIG. 7.
  • the terminal device performing the method 100 can be any type of electronic device configured to function as a client-side device to provide the user authentication service to a user operating that terminal device.
  • a terminal device can be, for example, a cellular phone, a smart phone, a mobile Internet device (MID) , a personal digital assistant (PDA) , a palmtop computer, a tablet computer, an e-reader, a laptop computer, a handheld computer, a wearable device, a desktop computer, a vehicle terminal, and/or the like.
  • a terminal device can be referred to as, for example, a client device, a user device, a mobile device, a portable device, a terminal, and/or the like.
  • the terminal device performing the method 100 includes a device (e.g., a camera) capable of taking pictures.
  • the terminal device is configured to take a picture of the user in association with performing the user authentication process. Details of a terminal device are shown and described below with respect to FIGS. 3-4.
  • the server device operatively coupled to and communicating with the terminal device can be any type of device configured to function as a server-side device to provide the user authentication service to the user operating the terminal device.
  • a server device can typically be configured to communicate with multiple terminal devices via one or more networks.
  • the server device can be, for example, a background server, a back end server, a database server, a workstation, a desktop computer, a cloud computing server, a data processing server, and/or the like.
  • the server device can be a server cluster or server center consisting of two or more servers (e.g., a data processing server and a database server) . Details of a server device are shown and described below with respect to FIGS. 5-6.
  • a network connecting the terminal device performing the method 100 and a server device can be any type of network configured to operatively couple one or more server devices to one or more terminal devices, and enable communications between the server device (s) and the terminal device (s) .
  • a network can include one or more networks such as, for example, a cellular network, a satellite network, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , the Internet, etc.
  • such a network can be optionally implemented using any known network protocol including various wired and/or wireless protocols such as, for example, Ethernet, universal serial bus (USB) , global system for mobile communications (GSM) , enhanced data GSM environment (EDGE) , general packet radio service (GPRS) , long term evolution (LTE) , code division multiple access (CDMA) , wideband code division multiple access (WCDMA) , time division multiple access (TDMA) , Bluetooth, Wi-Fi, voice over internet protocol (VoIP) , Wi-MAX, etc.
  • the data transfer operation associated with user authentication can be any type of operation that involves communication (e.g., transmission of data) between the terminal device and another device (e.g., a server device, another terminal device) , and requires user authentication as a security mechanism to protect the user from potential risks.
  • the data transfer operation can include transferring a financial asset (e.g., money, security bonds, stocks, mileage points, etc. ) from an account of the user (e.g., as a payer) to an account of another party (e.g., as a payee) .
  • the data transfer operation can be a transfer of money from a bank account of the user to a bank account of another party.
  • the data transfer can be a transmission of personal information and/or confidential data (e.g., a bank account number, credit card information, social security number, address, phone number, etc. ) of the user from the terminal device to another device (e.g., a server device, another terminal device) operatively coupled to the terminal device.
  • the user authentication mechanism described herein can be implemented for other applications involving data transfers such as, for example, online games, online education, cloud database, and/or the like.
  • the data transfer operation includes making an online payment and/or an offline payment associated with a transaction.
  • a terminal device configured to perform online payment transactions can be any type of personal electronic device such as, for example, personal computer (PC) , PDA, laptop, touchpad, mobile phone, and/or the like.
  • the user enters the request, which includes, for example, a user identification, a payer account number, an amount of payment, etc., into the terminal device.
  • a terminal device configured to perform offline payment transactions can be any type of device capable of obtaining information and/or data associated with the requested offline payment transactions.
  • a terminal device can be configured to swipe a card (e.g., a bank card, a credit card, a debit card, etc. ) of the user, which contains information and/or data (e.g., card number, expiration date, name of card holder, security code, card verification value (CVV) , etc. ) of the card that is used to complete an offline payment transaction.
  • a terminal device can be, for example, a point-of-sale (POS) terminal, a credit card machine, or any other type of payment terminal.
  • a terminal device can be configured to scan and read a barcode (e.g., one-dimensional barcode, two-dimensional barcode, Quick Response (QR) code, etc. ) provided by the user.
  • the barcode contains information and/or data used to complete an offline payment transaction such as, for example, a card number, expiration date, name of card holder, security code, CVV, etc.
  • the user operating the terminal device performing the method 100 can be any person that can access and operate the terminal device, and initiate or perform the data transfer operation.
  • a user can be, for example, a payer of a transaction (e.g., online payment) .
  • a user operating a terminal device can use the user authentication service and the data transfer service to, for example, make online payments, conduct online shopping, transfer financial assets, transmit confidential data, and/or the like.
  • the terminal device performing the method 100 can include one or more processors and memory.
  • the method 100 is implemented using instructions or code of an application that are stored in a non-transitory computer readable storage medium of the terminal device and executed by the one or more processors of the terminal device.
  • the application is associated with authenticating users for performing data transfer operations.
  • Such an application typically has a client-side portion that is stored in and/or executed at the terminal device, and a server-side portion that is stored in and/or executed at the server devices operatively coupled to the terminal device.
  • the method 100 is performed at the terminal device.
  • the method 100 includes the following steps.
  • the terminal device receives, from a user of the terminal device, a request for transferring data.
  • a request for transferring data can be, for example, a request to make a payment (e.g., an online payment or an offline payment) associated with purchasing a merchandise item.
  • the received request includes a user identification uniquely associated with the user.
  • a user identification can be in any suitable form such as, for example, a number, a user ID, a username, etc.
  • the user selects face data (from a set of potential password methods such as, for example, text string, fingerprint, voice, slide operation, etc. ) as the password method used in user authentication associated with data transfer operations.
  • the user is prompted to enter or select a user identification, as well as to provide face data associated with the user.
  • the user enters or selects a user identification (e.g., a user ID, a username) that uniquely identifies the user, thus differentiating the user from any other user.
  • the user also provides a picture of her face that includes the requested face data. For example, the user can operate the terminal device to take a picture of her face.
  • the user can upload a picture of her face to the terminal device.
  • the picture of the user’s face is required to show a clear image of the user’s face without any portion of the face being covered.
  • the user is required not to wear items such as glasses, a hat, a mask, or makeup when she takes the picture.
  • after the user identification is determined at the terminal device and the face data of the user (e.g., a picture of the user’s face) is provided to the terminal device, the terminal device sends the user identification and the face data to the server device.
  • the user identification of the user and the face data of the user are stored (e.g., in a database) at the server device, where the user identification of each user is associated with the face data of that user.
  • the user can generate, at the terminal device, a request for transferring data including the user identification.
  • the terminal device that the user uses to determine the user identification, the terminal device that the user uses to provide face data, and the terminal device the user uses to perform user authentication and data transfer can be the same or different terminal devices.
  • the user can use a first terminal device to determine a user identification and provide original face data, and use a second terminal device to initiate a data transfer operation (e.g., generate a request for transferring data that includes the user identification) .
  • the first terminal device sends the determined user identification and the original face data to a server device, and the second terminal device sends the user identification included in the request and collected face data (as described below with respect to the step 102) to the server device (as described below with respect to the step 103) .
  • the server device then performs the server-side portion of user authentication based on the user identification and face data initially received from the first terminal device and the user identification and face data subsequently received from the second terminal device.
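The enrollment flow described above amounts to associating a user identification with face data in server-side storage, so that a later request from any terminal device can locate the stored face data by the user identification alone. Below is a minimal sketch, assuming an in-memory dictionary as a stand-in for the server's storage; the class and identifier names are illustrative, not taken from the disclosure.

```python
from typing import Optional

class FaceEnrollmentStore:
    """In-memory stand-in for the server-side storage of enrolled face data."""

    def __init__(self) -> None:
        # Maps a user identification to the face data identified by it.
        self._face_data_by_user_id: dict[str, bytes] = {}

    def enroll(self, user_id: str, face_data: bytes) -> None:
        """Store (or overwrite) the face data identified by the user identification."""
        self._face_data_by_user_id[user_id] = face_data

    def lookup(self, user_id: str) -> Optional[bytes]:
        """Return the stored face data identified by the user identification, if any."""
        return self._face_data_by_user_id.get(user_id)


# Enrollment may happen on a first terminal device while authentication is later
# requested from a second one; the server needs only the user identification to
# locate the face data provided earlier.
store = FaceEnrollmentStore()
store.enroll("user-123", b"<face data captured at the first terminal device>")
assert store.lookup("user-123") is not None
```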
  • the terminal device collects, in response to the request, data associated with the user’s face.
  • the user may not be required to enter any type of password (e.g., a textual string, a slide operation, a fingerprint, etc. ) to complete the data transfer operation.
  • the terminal device can collect the face data of the user using any suitable method.
  • the terminal device can use an embedded camera to take a picture of the user’s face.
  • the terminal device can use an external webcam to take a picture of the user’s face.
  • the terminal device can detect the user’s face and gather data associated with the user’s face without taking a picture of the user’s face.
  • the terminal device can implement, for example, face recognition techniques.
  • the terminal device is configured to reject any face data that is not collected from the user’s face in real time. That is, the terminal device is configured to reject any face data obtained from, for example, a picture that was previously taken, scan of an existing image, data uploaded to the terminal device, and/or the like. In other words, the user has to show her “real” face to provide the face data to the terminal device.
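How a terminal device verifies that face data was captured in real time is not detailed above. The sketch below simply tags each capture with its source, which is an assumption made purely for illustration, and rejects anything that is not a live capture.

```python
from dataclasses import dataclass
from enum import Enum, auto

class CaptureSource(Enum):
    LIVE_CAMERA = auto()     # captured from the user's face in real time
    UPLOADED_IMAGE = auto()  # pre-existing picture uploaded to the device
    SCANNED_IMAGE = auto()   # scan of an existing photograph

@dataclass
class FaceCapture:
    source: CaptureSource
    face_data: bytes

def accept_for_authentication(capture: FaceCapture) -> bool:
    """Accept only face data collected from the user's face in real time."""
    return capture.source is CaptureSource.LIVE_CAMERA
```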
  • the terminal device sends the data associated with the user’s face and the user identification to the server device.
  • the server device stores the user identification and face data of the user that are previously provided to the server device.
  • the face data of the user is associated with and identified by the user identification of the user and is stored in (e.g., a memory of) the server device.
  • in response to receiving the user identification and the face data from the terminal device, the server device generates a matching result by comparing the face data of the user received from the terminal device with the face data identified by the user identification that is stored at the server device.
  • the matching result generated at the server device indicates either a match or no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
  • the server device calculates a matching value based on the face data of the user received from the terminal device and the stored face data identified by the user identification that is stored at the server device. The server device then compares the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, the server device generates a matching result indicating a match between the face data of the user received from the terminal device and the stored face data identified by the user identification. Otherwise, if the matching value is not greater than the predefined matching threshold, the server device generates a matching result indicating no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
  • the terminal device receives, from the server device, the matching result in response to sending the data associated with the user’s face and the user identification.
  • the terminal device receives a binary value as the matching result, where a positive matching result (e.g., “1” ) indicates a match between the face data of the user collected at the terminal device (at the step 102) and the face data of the user stored at the server device, and a negative matching result (e.g., “0” ) indicates no match between the face data of the user collected at the terminal device (at the step 102) and the face data of the user stored at the server device.
  • the terminal device receives other information associated with the binary matching result such as, for example, the matching value calculated at the server device.
  • the other information can indicate a level of confidence in the matching result. For example, a relatively higher matching value indicates a higher level of confidence in a positive matching result or a lower level of confidence in a negative matching result, while a relatively lower matching value indicates a higher level of confidence in a negative matching result or a lower level of confidence in a positive matching result.
  • the terminal device receives information and/or data associated with the comparison of the collected face data and the stored face data, without receiving a binary matching result. For example, the terminal device receives the matching value without receiving a binary matching result as described above. In such embodiments, the terminal device can determine a matching result based on the received information and/or data. For example, the terminal device can compare the received matching value with a predefined matching threshold to determine a positive or negative matching result as described above.
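As described above, the terminal device may receive a binary matching result, a matching value alongside it, or only a matching value that it must threshold locally. A sketch of that interpretation logic follows; the field names and the threshold value are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

TERMINAL_MATCHING_THRESHOLD = 0.8  # illustrative value for the terminal-side threshold

@dataclass
class ServerResponse:
    is_match: Optional[bool] = None         # binary matching result, if provided
    matching_value: Optional[float] = None  # matching value, if provided

def resolve_matching_result(response: ServerResponse) -> bool:
    """Prefer the server's binary result; otherwise threshold the matching value locally."""
    if response.is_match is not None:
        return response.is_match
    if response.matching_value is not None:
        return response.matching_value > TERMINAL_MATCHING_THRESHOLD
    raise ValueError("response contains neither a matching result nor a matching value")
```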
  • the terminal device responds to the request for transferring data in accordance with the matching result.
  • the matching result can be received from the server device or determined at the terminal device based on information and/or data received from the server device.
  • if the matching result (e.g., a positive binary matching result) indicates a match between the collected face data and the stored face data, the terminal device performs data transfer in compliance with the request.
  • if the matching result (e.g., a negative binary matching result) indicates no match between the collected face data and the stored face data, the terminal device sends a message to the user declining the request.
  • the terminal device sends a message prompting the user to reenter her request for transferring data (including her user identification) and/or to provide her face data again.
  • in an online payment transaction, if the matching result indicates a match between the collected face data and the stored face data, the terminal device communicates with other devices to deduct a payment amount from an account of the user, and add that payment amount to an account of a payee. Information on the payment amount, the user’s account and the payee’s account is included in the data transfer request previously generated or received at the terminal device. Furthermore, after the online payment transaction is completed, the terminal device generates and displays a message to the user, indicating the success of the online payment transaction. Otherwise, if the matching result indicates no match between the collected face data and the stored face data, the terminal device declines the data transfer request. Specifically, for example, the terminal device can generate and display an error message to the user indicating the failure of the user authentication and the decline of the data transfer operation.
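For the online-payment case described above, the terminal device either completes the transfer or declines it and notifies the user. The sketch below assumes that the interactions with the payer's and payee's accounts sit behind two hypothetical helpers; they are placeholders, not functions defined in the application.

```python
def deduct_from_payer(account: str, amount: float) -> None:
    """Hypothetical placeholder for deducting the payment amount from the user's account."""

def credit_payee(account: str, amount: float) -> None:
    """Hypothetical placeholder for adding the payment amount to the payee's account."""

def respond_to_transfer_request(is_match: bool, payer_account: str,
                                payee_account: str, amount: float) -> str:
    if is_match:
        # User authenticated: perform the data transfer in compliance with the request.
        deduct_from_payer(payer_account, amount)
        credit_payee(payee_account, amount)
        return "Payment completed successfully."
    # User not authenticated: decline the request and invite the user to try again.
    return ("Authentication failed; the transfer request was declined. "
            "Please re-enter your request or provide your face data again.")
```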
  • FIG. 2 is a flowchart illustrating a method 200 performed at a server device for authenticating a user associated with transferring data in accordance with some embodiments.
  • the server device performing the method 200 is similar to the server device described above in the method 100 with respect to FIG. 1. Particularly, the server device is operatively coupled to and communicates with one or more terminal devices similar to the terminal device performing the method 100 described above with respect to FIG. 1. Furthermore, the server device performing the method 200 is configured to store data (e.g., face data, user identifications) of users that is used in user authentication for data transfer operations.
  • the server device performing the method 200 includes one or more processors and memory.
  • the method 200 is implemented using instructions or code of an application that are stored in a non-transitory computer readable storage medium of the server device and executed by the one or more processors of the server device.
  • the application is associated with authenticating users for performing data transfer operations.
  • Such an application typically has a server-side portion that is stored in and/or executed at the server devices, and a client-side portion that is stored in and/or executed at each terminal device operatively coupled to the server device.
  • the method 200 is performed at the server device.
  • the method 200 includes the following steps.
  • the server device receives, from a terminal device associated with a user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user.
  • the user selects a password method (e.g., from a set of potential password methods such as, for example, face data, text string, fingerprint, voice, slide operation, etc. ) for user authentication associated with data transfer operations. The user can select, for example, face data as the password method for data transfer operations.
  • the user is prompted to enter or select a user identification, as well as to provide face data of the user to a terminal device.
  • the user enters or selects a user identification (e.g., a user ID, a username) that uniquely identifies the user.
  • the terminal device also obtains the face data of the user by, for example, scanning a picture of the user, taking a picture of the user, using face recognition technology to capture face data of the user, receiving an uploaded image of the user, and/or the like.
  • after obtaining the user identification and face data of the user, the terminal device sends the user identification and the face data to the server device.
  • the server device stores the user identification of the user together with the face data of the user in, for example, a memory of the server device.
  • the user identification of the user is associated with the face data of the user in the storage of the server device, such that the stored face data of the user can be located and retrieved based on the user identification of the user.
  • the server device is configured to receive updated face data identified by a user identification of a user.
  • the server device can periodically receive updated face data of a user, which is identified by a user identification of the user.
  • the server device can receive updated face data of a user in response to such updated face data of the user being received at a terminal device operatively coupled to the server device.
  • the updated face data of a user can reflect a change in the face of the user such as, for example, a removal of glasses, a newly-generated scar, newly-grown beard, a change in hairstyle, a change due to a cosmetic surgery (e.g., eyelid surgery) , a change due to natural aging, and/or the like.
  • the server device can replace the stored face data identified by the user identification of the user with the updated face data identified by the same user identification. In such a method, the face data of the user stored at the server device can be updated.
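Updating enrolled face data, as described above, is a replacement keyed by the same user identification. A minimal sketch follows, again treating the server-side storage as a plain dictionary purely for illustration.

```python
def update_face_data(face_data_by_user_id: dict[str, bytes],
                     user_id: str, updated_face_data: bytes) -> None:
    """Replace the stored face data identified by user_id with the updated face data."""
    if user_id not in face_data_by_user_id:
        raise KeyError(f"no enrolled face data for user identification {user_id!r}")
    face_data_by_user_id[user_id] = updated_face_data
```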
  • the user can generate a request for transferring data at a terminal device.
  • the terminal device then sends the request to the server device.
  • the request for transferring data includes the user identification of the user and other information and/or data associated with the requested data transfer operation (e.g., bank account number, credit card number, payment amount, etc. ) .
  • the terminal device receiving the data transfer request can be the same or different from the terminal device collecting the initial user identification and face data of the user.
  • the server device generates a matching result by comparing the data associated with the user’s face with stored face data that is identified by the user identification.
  • the server device stores the user identification and face data of the user that are previously sent to the server device in response to the user selecting face data as the password method for data transfer operations.
  • the face data of the user is associated with the user identification of the user in the storage of the server device.
  • the stored face data of the user is identified by the user identification.
  • in response to receiving the user identification and the face data from the terminal device, the server device generates a matching result by comparing the received face data of the user with the stored face data of the user that is identified by the user identification. That is, the user identification included in the request is used to locate the stored face data of the user that is identified by that user identification in the storage of the server device. The stored face data of the user is then retrieved, and compared with the received face data of the user to generate the matching result.
  • the matching result generated at the server device indicates either a match or no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
  • the server device calculates a matching value based on the received face data of the user and the stored face data of the user identified by the user identification.
  • the server device compares the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, the server device generates a matching result (e.g., a positive matching result, or a “1” result) indicating a match between the received face data of the user and the stored face data identified by the user identification.
  • otherwise, if the matching value is not greater than the predefined matching threshold, the server device generates a matching result (e.g., a negative matching result, or a “0” result) indicating no match between the received face data of the user and the stored face data identified by the user identification.
  • the server device sends the matching result to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.
  • the server device sends to the terminal device a binary value as the matching result, where a positive matching result (e.g., “1” ) indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, and a negative matching result (e.g., “0” ) indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device.
  • the matching result can be in any other suitable form.
  • the server device can send, to the terminal device, other information and/or data associated with the comparison of the collected face data and the stored face data, with or without sending a binary matching result.
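Putting the server-side steps above together: locate the stored face data by the user identification, compute a matching value, apply the predefined threshold, and return the result (optionally together with the matching value). The sketch below reuses the same assumptions as the earlier ones; the face comparison stays abstract because the application does not specify it, and all names are illustrative.

```python
from dataclasses import dataclass
from typing import Callable, Optional

MATCHING_THRESHOLD = 0.8  # illustrative predefined matching threshold

@dataclass
class MatchingResult:
    is_match: bool          # binary matching result sent to the terminal device
    matching_value: float   # optionally sent alongside the binary result

def authenticate_user(
    user_id: str,
    received_face_data: bytes,
    face_data_by_user_id: dict[str, bytes],
    compute_matching_value: Callable[[bytes, bytes], float],
) -> Optional[MatchingResult]:
    """Compare received face data with the stored face data identified by user_id."""
    stored_face_data = face_data_by_user_id.get(user_id)
    if stored_face_data is None:
        return None  # no face data enrolled under this user identification
    matching_value = compute_matching_value(received_face_data, stored_face_data)
    return MatchingResult(is_match=matching_value > MATCHING_THRESHOLD,
                          matching_value=matching_value)
```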
  • if the matching result (e.g., a positive binary matching result) indicates a match between the collected face data and the stored face data, the terminal device performs data transfer in compliance with the request.
  • if the matching result (e.g., a negative binary matching result) indicates no match between the collected face data and the stored face data, the terminal device sends a message to the user declining the request.
  • the terminal device sends a message prompting the user to reenter her request for transferring data (including her user identification) and/or to provide her face data again.
  • the terminal device sends an error message to the user indicating the failure of the user authentication and the decline of the data transfer request.
  • FIG. 3 is a block diagram of a terminal device 300 configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • the terminal device 300 can be structurally and functionally similar to the terminal devices described with respect to FIGS. 1 and 2. Particularly, the terminal device 300 can be operatively coupled to and communicate with one or more server devices, at least one of which is configured to provide the user authentication service as described herein.
  • the terminal device 300 includes a receive module 301, a data collection module 302, a transmit module 303 and a process module 304.
  • a terminal device can include more or fewer modules than those shown in FIG. 3, and/or connect to one or more external devices.
  • a terminal device can be connected to an external device (e.g., a webcam, a camera) configured to capture face data of users.
  • a terminal device can include an input module configured to receive input of users (e.g., entered user identifications, entered requests for data transfer, etc. ) .
  • each module included in the terminal device 300 can be a hardware-based module (e.g., a digital signal processor (DSP) , a field programmable gate array (FPGA) , an application-specific integrated circuit (ASIC) , etc. ) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc. ) , or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the terminal device 300 (not shown in FIG. 3) and executed at a processor (e.g., a CPU) of the terminal device 300 (not shown in FIG. 3) .
  • the receive module 301, the data collection module 302, the transmit module 303 and the process module 304 can be configured to collectively perform the method 100 (e.g., the client-side portion of a user authentication application) shown and described above with respect to FIG. 1.
  • the receive module 301 is configured to, among other functions, receive data transfer requests from users operating the terminal device 300. Each data transfer request received at the receive module 301 includes a user identification that uniquely identifies a user. The receive module 301 is also configured to receive matching results returned from a server device operatively coupled to and communicating with the terminal device 300. In some embodiments, the receive module 301 can receive data and/or information associated with a requested data transfer operation from, for example, input data entered by a user using a finger, a mouse or a keyboard; swiping and reading of a magnetic card of a user (e.g., a bank card, a debit card, a credit card, etc. ) ; scanning of a barcode (e.g., a one-dimensional barcode, a two-dimensional barcode, a QR code, etc. ) provided by a user; and/or the like.
  • the data collection module 302 is configured to, among other functions, collect face data of users.
  • the data collection module 302 can be configured to communicate with and/or control an embedded device or external device (e.g., a camera, a webcam, etc. ) that is capable of capturing face data of users.
  • the data collection module 302 is configured to implement image processing techniques such as, for example, face recognition techniques.
  • the data collection module 302 can be configured to process a captured image to detect any human face in the image.
  • the data collection module 302 is configured to analyze an image of a human face to obtain face data from that image.
  • the transmit module 303 is configured to, among other functions, send received data transfer requests including user identifications and collected face data to the server device. Specifically, the transmit module 303 sends face data of a user together with the received user identification of that user, such that the received user identification can be used to locate face data identified by the received user identification that is stored at the server device.
  • the process module 304 is configured to, among other functions, determine a responding operation for a data transfer request based on a matching result for the data transfer request that is received at the receive module 301. Specifically, as discussed above, the process module 304 can proceed to complete the requested data transfer if the matching result for that data transfer request indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device and identified by the user identification collected at the terminal device. Additionally, the process module 304 can optionally generate and display a message to the user indicating the success of the user authentication and/or the completion of the data transfer operation.
  • the process module 304 can decline the data transfer request if the matching result for that data transfer request indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device and identified by the user identification collected at the terminal device. Additionally, the process module 304 can optionally generate and display an error message to the user indicating the failure of the user authentication and/or the decline of the data transfer operation. Furthermore, in some embodiments, in addition to a decline of a data transfer request, the process module 304 can prompt the user to reenter a user identification (or other information and/or data of the requested data transfer operation) , and/or prompt the user to provide face data again.
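The four modules above divide the terminal-side work into receiving, collecting, transmitting, and processing. The skeleton below only mirrors that division and the order in which the method 100 uses it; every method body is a placeholder and every name (including the "user_id" field) is an illustrative assumption.

```python
class ReceiveModule:
    def receive_transfer_request(self) -> dict:
        """Accept a data transfer request, including a user identification, from the user."""
        raise NotImplementedError

    def receive_matching_result(self) -> bool:
        """Accept the matching result returned by the server device."""
        raise NotImplementedError

class DataCollectionModule:
    def collect_face_data(self) -> bytes:
        """Capture face data, e.g. via an embedded camera or an external webcam."""
        raise NotImplementedError

class TransmitModule:
    def send(self, user_id: str, face_data: bytes) -> None:
        """Send the user identification and the collected face data to the server device."""
        raise NotImplementedError

class ProcessModule:
    def respond(self, is_match: bool) -> None:
        """Complete the requested transfer on a match, or decline it and notify the user."""
        raise NotImplementedError

class TerminalDevice:
    """Wires the modules together in the order described for the method 100."""

    def __init__(self, receive: ReceiveModule, collect: DataCollectionModule,
                 transmit: TransmitModule, process: ProcessModule) -> None:
        self.receive, self.collect, self.transmit, self.process = receive, collect, transmit, process

    def handle_transfer(self) -> None:
        request = self.receive.receive_transfer_request()   # receive the data transfer request
        face_data = self.collect.collect_face_data()        # collect face data (step 102)
        self.transmit.send(request["user_id"], face_data)   # send to the server device (step 103)
        is_match = self.receive.receive_matching_result()   # receive the matching result
        self.process.respond(is_match)                      # respond in accordance with the result
```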
  • FIG. 4 is a block diagram illustrating structure of a terminal device 400 in accordance with some embodiments.
  • the terminal device 400 can be structurally and functionally similar to the terminal devices shown and/or described with respect to FIGS. 1-3. Particularly, the terminal device 400 can be operatively coupled to and communicate with one or more server devices, at least one of which is configured to provide the user authentication service as described herein.
  • the terminal device 400 includes a processor 401, a communication bus 402, a network interface 403, and a memory 404.
  • a terminal device can include more or fewer devices, components and/or modules than those shown in FIG. 4.
  • the processor 401 can be any processing device capable of performing the method 100 (e.g., a client-side portion of a user authentication application) described with respect to FIG. 1. Such a processor can be, for example, a CPU, a DSP, an FPGA, an ASIC, and/or the like.
  • the processor 401 can be configured to control the operations of other components and/or modules of the terminal device 400.
  • the processor 401 can be configured to control operations of the network interface 403.
  • the processor 401 can be configured to execute instructions or code stored in a software program or module (e.g., user authentication application) within the memory 404.
  • the communication bus 402 is configured to implement connections and communication among the other components of the terminal device 400.
  • the network interface 403 is configured to provide and control network interfaces of the terminal device 400 that are used to interact with other network devices (e.g., server devices, other terminal devices) .
  • the network interface 403 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface) .
  • the network interface 403 is used for connecting the terminal device 400 with one or more server devices and performing data communication with the one or more server devices.
  • the network interface 403 is configured to transmit to the server device (s) , for example, data transfer requests including user identifications, face data, etc.
  • the network interface 403 is also configured to receive from the server device (s) , for example, matching results and/or other data associated with user authentication.
  • operations of the network interface 403 are controlled by instructions or code stored in the memory 404.
  • the memory 404 can include, for example, a random-access memory (RAM) (e.g., a DRAM, an SRAM, a DDR RAM, etc. ) and/or a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 404 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the terminal device 400.
  • the memory 404 includes program code associated with, for example, the user authentication application described herein.
  • each component, program, application or module included in the memory 404 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor) , or a combination of hardware and software modules.
  • Instructions or code of each component, program, application or module can be stored in the memory 404 and executed at the processor 401.
  • the instructions or code of the method 100 (e.g., a client-side portion of a user authentication application) shown and described above with respect to FIG. 1 are stored in the program code within the memory 404.
  • the processor 401 is configured to perform the instructions or code stored in the program code within the memory 404, as shown and described above with respect to the method 100 in FIG. 1.
  • FIG. 5 is a block diagram of a server device 500 configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • the server device 500 can be structurally and functionally similar to the server devices described with respect to FIGS. 1 and 2. Particularly, the server device 500 can be operatively coupled to and communicate with one or more terminal devices; the server device 500 and at least one of the terminal devices are configured to collectively provide the user authentication service as described herein.
  • the server device 500 includes a receive module 501, a process module 502, and a transmit module 503.
  • the process module 502 includes a calculation unit 5021 and a determination unit 5022.
  • a server device can include more or fewer modules than those shown in FIG. 5.
  • a server device can include a storage module configured to store and update face data and user identifications of the users.
  • each module (or submodule, unit) included in the server device 500 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC, etc. ) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc. ) , or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the server device 500 (not shown in FIG. 5) and executed at a processor (e.g., a CPU) of the server device 500 (not shown in FIG. 5) .
  • the receive module 501, the process module 502 (including the calculation unit 5021 and the determination unit 5022) , and a transmit module 503 can be configured to collectively perform the method 200 (e.g., the server-side portion of a user authentication application) shown and described above with respect to FIG. 2.
  • the receive module 501 is configured to, among other functions, receive from the terminal device (s) data transfer requests including user identifications and face data collected at the terminal device (s) in response to the data transfer requests. Specifically, the receive module 501 receives face data of a user together with the user identification of that user, such that the received user identification of the user can be associated with and stored together with the face data of the user. Thus, the user identification of the user can be used to locate face data identified by the user identification that is stored at the server device 500.
  • the process module 502 is configured to, among other functions, compare face data that is collected at a terminal device and received at the receive module 501 with stored face data identified by the corresponding user identification received at the receive module 501.
  • the process module 502 can further generate a matching result and/or other data (e.g., a matching value) based on the comparison.
  • the calculation unit 5021 is configured to calculate a matching value based on the comparison between the received face data and the stored face data. Typically, a high matching value indicates a high degree of match, and a low matching value indicates a low degree of match. Subsequently, the determination unit 5022 is configured to determine a matching result based on the matching value calculated at the calculation unit 5021.
  • the determination unit 5022 compares the calculated matching value with a predefined matching threshold, and determines a positive matching result (i.e., indicating a match) if the calculated matching value is greater than the predefined matching threshold, or a negative matching result (i.e., indicating no match) if the calculated matching value is not greater than the predefined matching threshold.
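The description does not state how the calculation unit 5021 derives the matching value. One common way to compare two faces, used here purely as an assumed stand-in rather than the patent's stated method, is cosine similarity between fixed-length face feature vectors, rescaled so that a higher value means a closer match; the determination unit's rule then reduces to the same strict threshold comparison as above.

```python
import math

def cosine_matching_value(features_a: list[float], features_b: list[float]) -> float:
    """Cosine similarity of two face feature vectors, rescaled to the range [0, 1]."""
    dot = sum(a * b for a, b in zip(features_a, features_b))
    norm_a = math.sqrt(sum(a * a for a in features_a))
    norm_b = math.sqrt(sum(b * b for b in features_b))
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0
    cosine = dot / (norm_a * norm_b)
    return (cosine + 1.0) / 2.0  # map [-1, 1] onto [0, 1]

def determine_match(matching_value: float, threshold: float = 0.8) -> bool:
    """Positive result only if the matching value is strictly greater than the threshold."""
    return matching_value > threshold
```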
  • the transmit module 503 is configured to, among other functions, transmit matching results generated at the process module 502 to the corresponding terminal devices.
  • the transmit module 503 can transmit to the terminal devices other data (e.g., matching values) generated at the process module 502 that is associated with the user authentication.
  • the corresponding terminal devices can respond to the users in accordance with the matching results and/or other data associated with the user authentication.
  • FIG. 6 is a block diagram illustrating structure of a server device 600 in accordance with some embodiments.
  • the server device 600 can be structurally and functionally similar to the server devices shown and/or described with respect to FIGS. 1-2 and 5. Particularly, the server device 600 can be operatively coupled to and communicate with one or more terminal devices. The server device 600 and at least one of the terminal devices are configured to collectively provide the user authentication service as described herein.
  • the server device 600 includes a processor 601, a communication bus 602, a network interface 603, and a memory 604.
  • a server device can include more or fewer devices, components and/or modules than those shown in FIG. 6.
  • the processor 601 can be any processing device capable of performing the method 200 (e.g., a server-side portion of a user authentication application) described with respect to FIG. 2. Such a processor can be, for example, a CPU, a DSP, an FPGA, an ASIC, and/or the like.
  • the processor 601 can be configured to control the operations of other components and/or modules of the server device 600.
  • the processor 601 can be configured to control operations of the network interface 603.
  • the processor 601 can be configured to execute instructions or code stored in a software program or module (e.g., user authentication application) within the memory 604.
  • the communication bus 602 is configured to implement connections and communication among the other components of the server device 600.
  • the network interface 603 is configured to provide and control network interfaces of the server device 600 that are used to interact with other network devices (e.g., terminal devices, other server devices) .
  • the network interface 603 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface) .
  • the network interface 603 is used for connecting the server device 600 with one or more terminal devices and performing data communication with the one or more terminal devices.
  • the network interface 603 is configured to receive from the terminal device (s) , for example, data transfer requests including user identifications, face data, etc.
  • the network interface 603 is also configured to transmit to the terminal device (s) , for example, matching results and/or other data associated with user authentication.
  • operations of the network interface 603 are controlled by instructions or code stored in the memory 604.
  • the memory 604 can include, for example, a RAM (e.g., a DRAM, an SRAM, a DDR RAM, etc. ) and/or a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 604 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the server device 600.
  • the memory 604 includes program code associated with, for example, the user authentication application described herein.
  • each component, program, application or module included in the memory 604 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor) , or a combination of hardware and software modules.
  • Instructions or code of each component, program, application or module can be stored in the memory 604 and executed at the processor 601.
  • the instructions or code of the method 200 (e.g., a server-side portion of a user authentication application) shown and described above with respect to FIG. 2 are stored in the program code within the memory 604.
  • the processor 601 is configured to perform the instructions or code stored in the program code within the memory 604, as shown and described above with respect to the method 200 in FIG. 2.
  • FIG. 7 is a schematic diagram illustrating a system 700 configured to authenticate a user associated with transferring data in accordance with some embodiments.
  • the system 700 includes a server device 702 and a terminal device 701.
  • the server device 702 is operatively coupled to the terminal device 701 via a network 703.
  • the terminal device 701 is operated by a user 704.
  • the server device 702 can be structurally and functionally similar to the server device described above in the method 100 with respect to FIG. 1 and the server devices shown and described above with respect to FIGS. 5-6.
  • the terminal device 701 can be structurally and functionally similar to the terminal device described above with respect to FIG. 1 and the terminal devices shown and described above with respect to FIGS. 3-4.
  • the user 704 can be similar to the user described above with respect to FIG. 1.
  • the network 703 can be similar to the network described above with respect to FIG. 1.
  • the user 704 initially operates a terminal device (e.g., the terminal device 701 or another terminal device) to select face data as a password method for user authentication associated with data transfer operations.
  • the terminal device prompts the user 704 to enter a user identification (e.g., a username) that uniquely identifies the user 704.
  • the terminal device also prompts the user 704 to provide face data by, for example, taking a picture of the user 704, uploading a picture of the user 704, scanning an image of the user 704, and/or the like.
  • the terminal device sends the entered user identification of the user 704 and the received face data of the user 704 to the server device 702.
  • the server device 702 stores the user identification of the user 704 and the face data of the user 704, such that the face data of the user 704 can be identified and located using the user identification of the user 704.
  • the user 704 is allowed to initiate data transfer operations. Specifically, the user 704 operates the terminal device 701 to generate a data transfer request, which includes the user identification of the user 704 and other data associated with the requested data transfer (e.g., bank account number, payment amount, etc. ) . In response to receiving the data transfer request, the terminal device 701 collects face data of the user 704 by, for example, taking a picture of the user 704 using a camera or a webcam.
  • the terminal device 701 then sends the received user identification and the collected face data to the server device 702.
  • the server device 702 retrieves the stored face data identified by the received user identification.
  • the server device 702 compares the stored face data with the face data recently received from the terminal device 701.
  • the server device 702 generates a matching result based on the comparison, which can be, for example, a positive matching result indicating a match between the stored face data and the received face data, or a negative matching result indicating no match between the stored face data and the received face data.
  • the server device 702 sends the generated matching result to the terminal device 701.
  • the terminal device 701 responds to the data transfer request accordingly based on the matching result. If the matching result indicates a match between the stored face data and the received face data, the user 704 is authenticated and the terminal device 701 proceeds to perform the requested data transfer operation. The terminal device 701 can further generate and display a message to the user 704 indicating the success of user authentication and/or the completion of data transfer. Otherwise, if the matching result indicates no match between the stored face data and the received face data, the user 704 is not authenticated and the terminal device 701 declines to perform the requested data transfer operation. The terminal device 701 can further generate and display an error message to the user 704 indicating the failure of user authentication and/or the decline of the data transfer request. Additionally, the terminal device 701 can prompt the user 704 to reenter a user identification and/or to provide face data again.
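The exchange in FIG. 7 can be simulated end to end with the pieces sketched earlier. In the toy example below, an exact byte comparison stands in for the real face comparison (an assumption made only so the example runs), and the enrolled data lives in a plain dictionary; all identifiers are illustrative.

```python
def server_authenticate(user_id: str, face_data: bytes,
                        enrolled: dict[str, bytes]) -> bool:
    """Server 702: compare received face data with the stored data located by user_id."""
    stored = enrolled.get(user_id)
    return stored is not None and stored == face_data  # stand-in for real face matching

def terminal_transfer(user_id: str, captured_face_data: bytes,
                      enrolled: dict[str, bytes]) -> str:
    """Terminal 701: send the identification and face data, then act on the result."""
    is_match = server_authenticate(user_id, captured_face_data, enrolled)
    if is_match:
        return "User authenticated; requested data transfer performed."
    return "Authentication failed; data transfer request declined."

enrolled_face_data = {"user-704": b"<face data provided at enrollment>"}
print(terminal_transfer("user-704", b"<face data provided at enrollment>", enrolled_face_data))
print(terminal_transfer("user-704", b"<different face data>", enrolled_face_data))
```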
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
Stages that are not order dependent may be reordered, and other stages may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art, so the orderings and groupings presented herein are not an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.

Abstract

A method of authenticating a user at a server device associated with transferring data is disclosed. The method is performed at the server device having one or more processors and memory for storing programs to be executed by the one or more processors. The method includes receiving, from a terminal device associated with the user and in response to a request for transferring data received at the terminal device, data associated with the user's face and a user identification uniquely associated with the user. The method also includes generating a matching result by comparing the data associated with the user's face with face data identified by the user identification that is stored at the server device. The method further includes sending the matching result from the server device to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.

Description

METHOD, APPARATUS AND SYSTEM FOR AUTHENTICATING USER
PRIORITY CLAIM AND RELATED APPLICATION
This application claims priority to Chinese Patent Application Serial No. 201410010506.0, entitled “Method, Terminal, Server and System for Transferring Data,” filed on January 9, 2014, which is incorporated herein by reference in its entirety.
FIELD OF THE APPLICATION
The present application generally relates to the field of information security, and more particularly to a method and related apparatus and system for authenticating a user associated with transferring data.
BACKGROUND
Nowadays, passwords are commonly used in user authentication systems associated with secure data transfers. To ensure a secure data transfer, some known systems authenticate a user by requesting the user to enter a preset password, and then allow the user to perform data transfer only if the password entered by the user is verified by the systems. Such known systems, however, are typically vulnerable to security risks such as hacked or stolen password caused by, for example, intentional peeping, password stealing trojans, password phishing websites, and/or the like. Additionally, memorization of a password string can be challenging for some users.
Thus, a need exists for a method, apparatus and system that can enable a user authentication mechanism that is simplified, efficient, and more secure.
SUMMARY
The above deficiencies associated with the known user authentication systems may be reduced or eliminated by the techniques described herein.
In some embodiments, a method for authenticating a user at a server device associated with transferring data is disclosed. The method is performed at the server device, which includes one or more processors and memory for storing programs to be executed by the one or more processors. The method includes receiving, from a terminal device associated with the user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user. In some instances, the request for transferring data includes a request for transferring a financial asset from an account of the user.
The method also includes generating a matching result at the server device by comparing the data associated with the user’s face with face data identified by the user identification that is stored at the server device. In some instances, the matching result indicates either a match between the data associated with the user’s face and the stored face data identified by the user identification or no match between the data associated with the user’s face and the stored face data identified by the user identification. In some instances, the face data identified by the user identification was provided by the user to the server device prior to the server device receiving the data associated with the user’s face and the user identification. In some instances, the server device receives updated face data identified by the user identification, and then replaces the stored face data identified by the user identification with the updated face data identified by the user identification.
In some instances, in generating the matching result, the method includes calculating a matching value based on the data associated with the user’s face and the stored face data identified by the user identification, and then comparing the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, a matching result is generated to indicate a match between the data associated with the user’s face and the stored face data identified by the user identification. Otherwise, if the matching value is not greater than the predefined matching threshold, a matching result is generated to indicate no match between the data associated with the user’s face and the stored face data identified by the user identification.
After generating the matching result, the method further includes sending the matching result from the server device to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result. In some instances, the sending the matching result includes sending the matching result from the server device to the terminal device such that if the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device performs data transfer in compliance with the request; and if the matching result indicates no match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device sends a message to the user declining the request.
In some embodiments, an apparatus (e.g., a server device) includes one or more processors and memory storing one or more programs for execution by the one or more processors. The one or more programs include instructions that cause the apparatus to perform the method for authenticating a user associated with transferring data as described above. In some embodiments, a non-transitory computer readable storage medium stores one or more programs including instructions for execution by one or more processors. The instructions, when executed by the one or more processors, cause the processors to perform the operations of authenticating a user associated with transferring data as described above.
BRIEF DESCRIPTION OF DRAWINGS
The aforementioned features and advantages of the present application as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of preferred embodiments when taken in conjunction with the drawings.
FIG. 1 is a flowchart illustrating a method performed at a terminal device for authenticating a user associated with transferring data in accordance with some embodiments.
FIG. 2 is a flowchart illustrating a method performed at a server device for authenticating a user associated with transferring data in accordance with some embodiments.
FIG. 3 is a block diagram of a terminal device configured to authenticate a user associated with transferring data in accordance with some embodiments.
FIG. 4 is a block diagram illustrating structure of a terminal device in accordance with some embodiments.
FIG. 5 is a block diagram of a server device configured to authenticate a user associated with transferring data in accordance with some embodiments.
FIG. 6 is a block diagram illustrating structure of a server device in accordance with some embodiments.
FIG. 7 is a schematic diagram illustrating a system configured to authenticate a user associated with transferring data in accordance with some embodiments.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one skilled in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
In order to make the objectives, technical solutions, and advantages of the present application comprehensible, embodiments of the present application are further described in detail below with reference to the accompanying drawings.
FIG. 1 is a flowchart illustrating a method 100 performed at a terminal device for authenticating a user associated with transferring data in accordance with some embodiments. The terminal device performing the method 100 can be any type of device that is configured to authenticate a user operating the terminal device associated with a data transfer operation, and perform the data transfer for the user in response to the user being authenticated. Furthermore, to accomplish the user authentication process, the terminal device can be operatively coupled to and communicate with a server device via one or more network (s) (e.g., the Internet) . Details of a system including a terminal device and a server device configured to perform user authentication associated with transferring data are shown and described below with respect to FIG. 7.
The terminal device performing the method 100 can be any type of electronic device configured to function as a client-side device to provide the user authentication service to a user operating that terminal device. In some embodiments, such a terminal device can be, for example, a cellular phone, a smart phone, a mobile Internet device (MID) , a personal digital assistant (PDA) , a palmtop computer, a tablet computer, an e-reader, a laptop computer, a handheld computer, a wearable device, a desktop computer, a vehicle terminal, and/or the like. In some embodiments, such a terminal device can be referred to as, for example, a client device, a user device, a mobile device, a portable device, a terminal, and/or the like. In some embodiments, the terminal device performing the method 100 includes a device (e.g., a camera) capable of taking pictures. In such embodiments, as described below, the terminal device is configured to take a picture of the user in association with performing the user authentication process. Details of a terminal device are shown and described below with respect to FIGS. 3-4.
The server device operatively coupled to and communicating with the terminal device can be any type of device configured to function as a server-side device to provide the user authentication service to the user operating the terminal device. Such a server device can typically be configured to communicate with multiple terminal devices via one or more networks. In some embodiments, the server device can be, for example, a background server, a back end server, a database server, a workstation, a desktop computer, a cloud computing server, a data processing server, and/or the like. In some embodiments, the server device can be a server cluster or server center consisting of two or more servers (e.g., a data processing server and a database server) . Details of a server device are shown and described below with respect to FIGS. 5-6.
A network connecting the terminal device performing the method 100 and a server device can be any type of network configured to operatively couple one or more server devices to one or more terminal devices, and enable communications between the server device (s) and the terminal device (s) . In some embodiments, such a network can include one or more networks such as, for example, a cellular network, a satellite network, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , the Internet, etc. In some embodiments, such a network can be optionally implemented using any known network protocol including various wired and/or wireless protocols such as, for example, Ethernet, universal serial bus (USB) , global system for mobile communications (GSM) , enhanced data GSM environment (EDGE) , general packet radio service (GPRS) , long term evolution (LTE) , code division multiple access (CDMA) , wideband code division multiple Access (WCDMA) , time division multiple access (TDMA) , Bluetooth, Wi-Fi, voice over internet protocol (VoIP) , Wi-MAX, etc.
The data transfer operation associated with user authentication can be any type of operation that involves communication (e.g., transmission of data) between the terminal device and another device (e.g., a server device, another terminal device) , and requires user authentication as a security mechanism to protect the user from potential risks. Particularly, the data transfer operation can include transferring a financial asset (e.g., money, security bonds, stocks, mileage points, etc. ) from an account of the user (e.g., as a payer) to an account of another party (e.g., as a payee) . For example, the data transfer operation can be a transfer of money from a bank account of the user to a bank account of another party. For another example, the data transfer can be a transmission of personal information and/or confidential data (e.g., a bank account number, credit card information, social security number, address, phone number, etc. ) of the user from the terminal device to another device (e.g., a server device, another terminal device) operatively coupled to the terminal device. Additionally, in some embodiments, the user authentication mechanism described herein can be implemented for other applications involving data transfers such as, for example, online games, online education, cloud databases, and/or the like.
In some embodiments, the data transfer operation includes making an online payment and/or an offline payment associated with a transaction. A terminal device configured to perform online payment transactions can be any type of personal electronic device such as, for example, a personal computer (PC) , a PDA, a laptop, a touchpad, a mobile phone, and/or the like. The user enters the request into the terminal device; the request includes, for example, a user identification, a payer account number, an amount of payment, etc.
A terminal device configured to perform offline payment transactions can be any type of device capable of obtaining information and/or data associated with the requested offline payment transactions. In some embodiments, for example, a terminal device can be configured to swipe a card (e.g., a bank card, a credit card, a debit card, etc. ) of the user, which contains information and/or data (e.g., card number, expiration date, name of card holder, security code, card verification value (CVV) , etc. ) of the card that is used to complete an offline payment transaction. Such a terminal device can be, for example, a point-of-sale (POS) terminal, a credit card machine, or any other type of payment terminal. In some other embodiments, for example, a terminal device can be configured to scan and read a barcode (e.g., one-dimensional barcode, two-dimensional barcode, Quick Response (QR) code, etc. ) provided by the user. The barcode contains information and/or data used to complete an offline payment transaction such as, for example, a card number, expiration date, name of card holder, security code, CVV, etc.
The user operating the terminal device performing the method 100 can be any person that can access and operate the terminal device, and initiate or perform the data transfer operation. Such a user can be, for example, a payer of a transaction (e.g., online payment) . In some embodiments, a user operating a terminal device can use the user authentication service and the data transfer service to, for example, make online payments, conduct online shopping, transfer financial assets, transmit confidential data, and/or the like.
In some embodiments, the terminal device performing the method 100 can include one or more processors and memory. In such embodiments, the method 100 is implemented using instructions or code of an application that are stored in a non-transitory computer readable storage medium of the terminal device and executed by the one or more processors of the terminal device. The application is associated with authenticating users for performing data transfer operations. Such an application typically has a client-side portion that is stored in and/or executed at the terminal device, and a server-side portion that is stored in and/or executed at the server devices operatively  coupled to the terminal device. As a result of the client-side portion of the application being executed, the method 100 is performed at the terminal device. As shown in FIG. 1, the method 100 includes the following steps.
At 101, the terminal device receives, from a user of the terminal device, a request for transferring data. Such a request for transferring data can be, for example, a request to make a payment (e.g., an online payment or an offline payment) associated with purchasing a merchandise item. The received request includes a user identification uniquely associated with the user. Such a user identification can be in any suitable form such as, for example, a number, a user ID, a username, etc.
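For concreteness, such a request might be assembled as a simple record like the one sketched below. The field names are illustrative assumptions; the description only requires that the request include a user identification uniquely associated with the user.

```python
def build_transfer_request(user_id, payer_account, payee_account, amount):
    """Bundle the user identification with the other data of the requested transfer.

    All field names here are illustrative placeholders."""
    return {
        "user_id": user_id,              # e.g., a number, user ID, or username
        "payer_account": payer_account,  # e.g., a bank account number
        "payee_account": payee_account,
        "amount": amount,                # payment amount
    }
```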
In some embodiments, before the user can use the terminal device to perform any data transfer operation, the user selects face data (from a set of potential password methods such as, for example, text string, fingerprint, voice, slide operation, etc. ) as the password method used in user authentication associated with data transfer operations. As a result, the user is prompted to enter or select a user identification, as well as to provide face data associated with the user. In response, the user enters or selects a user identification (e.g., a user ID, a username) that uniquely identifies the user, thus differentiating the user from any other user. The user also provides a picture of her face that includes the requested face data. For example, the user can operate the terminal device to take a picture of her face. For another example, the user can upload a picture of her face to the terminal device. In some embodiments, the picture of the user’s face is required to show a clear image of the user’s face without any portion of the face being covered. For example, the user is required not to wear one or more of glasses, hat, mask, makeup, etc. , when she takes the picture.
After the user identification is determined at the terminal device and the face data of the user (e.g., a picture of the user’s face) is provided to the terminal device, the terminal device sends the user identification and the face data to the server device. Thus, the user identification of the user and the face data of the user are stored (e.g., in a database) at the server device, where the user identification of each user is associated with the face data of that user. Subsequently, when the user initiates a data transfer operation, the user can generate, at the terminal device, a request for transferring data including the user identification.
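A minimal sketch of this enrollment step is shown below. It assumes a hypothetical HTTP endpoint ("/enroll") on the server device and sends the face picture as an image file; the transport, endpoint name, and field names are all assumptions, since the description does not prescribe them.

```python
import requests  # assumed transport; any reliable channel to the server device would do

def enroll_user(server_url, user_id, face_image_path):
    """Send the user identification and the user's face picture to the server device,
    which stores them so the face data can later be located by the user identification."""
    with open(face_image_path, "rb") as image_file:
        response = requests.post(
            f"{server_url}/enroll",          # hypothetical endpoint
            data={"user_id": user_id},
            files={"face_image": image_file},
        )
    response.raise_for_status()
    return response.json()
```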
In some embodiments, the terminal device that the user uses to determine the user identification, the terminal device that the user uses to provide face data, and the terminal device the user uses to perform user authentication and data transfer can be the same or different terminal devices. For example, the user can use a first terminal device to determine a user identification and provide original face data, and use a second terminal device to initiate a data transfer operation (e.g.,  generate a request for transferring data that includes the user identification) . Thus, the first terminal device sends the determined user identification and the original face data to a server device, and the second terminal device sends the user identification included in the request and collected face data (as described below with respect to the step 102) to the server device (as described below with respect to the step 103) . The server device then performs the server-side portion of user authentication based on the user identification and face data initially received from the first terminal device and the user identification and face data subsequently received from the second terminal device.
At 102, the terminal device collects, in response to the request, data associated with the user’s face. Furthermore, the user may not be required to enter any type of password (e.g., a textual string, a slide operation, a fingerprint, etc. ) to complete the data transfer operation. The terminal device can collect the face data of the user in any suitable method. For example, the terminal device can use an embedded camera to take a picture of the user’s face. For another example, the terminal device can use an external webcam to take a picture of the user’s face. In some embodiments, the terminal device can detect the user’s face and gather data associated with the user’s face without taking a picture of the user’s face. In such embodiments, the terminal device can implement, for example, face recognition techniques.
Additionally, in some embodiments, the terminal device is configured to reject any face data that is not collected from the user’s face in real time. That is, the terminal device is configured to reject any face data obtained from, for example, a picture that was previously taken, scan of an existing image, data uploaded to the terminal device, and/or the like. In other words, the user has to show her “real” face to provide the face data to the terminal device.
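One way a terminal device might favor real-time collection is to read frames only from an attached camera instead of accepting uploaded files, as in the OpenCV-based sketch below. This is only an illustration of the capture path; by itself it does not verify that a live person, rather than a printed photo, is in front of the camera.

```python
import cv2  # OpenCV, used here only as an example capture backend

def capture_live_face_frame(camera_index=0):
    """Grab a single frame directly from the camera so the face data is collected
    in real time rather than taken from a previously stored picture."""
    camera = cv2.VideoCapture(camera_index)
    try:
        ok, frame = camera.read()
        if not ok:
            raise RuntimeError("No live camera frame available; uploaded images are rejected.")
        return frame
    finally:
        camera.release()
```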
At 103, the terminal device sends the data associated with the user’s face and the user identification to the server device. As discussed above, the server device stores the user identification and face data of the user that are previously provided to the server device. Furthermore, the face data of the user is associated with and identified by the user identification of the user and is stored in (e.g., a memory of) the server device.
In response to receiving the user identification and the face data from the terminal device, the server device generates a matching result by comparing the face data of the user received from the terminal device with the face data identified by the user identification that is stored at the server device. In some embodiments, the matching result generated at the server device indicates either a match or no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
In some embodiments, for example, the server device calculates a matching value based on the face data of the user received from the terminal device and the stored face data identified by the user identification that is stored at the server device. The server device then compares the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, the server device generates a matching result indicating a match between the face data of the user received from the terminal device and the stored face data identified by the user identification. Otherwise, if the matching value is not greater than the predefined matching threshold, the server device generates a matching result indicating no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
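In code, this decision rule is a simple threshold comparison, as sketched below. The concrete threshold value and the scale of the matching value are placeholders, since neither is specified by the description.

```python
def decide_match(matching_value, matching_threshold=0.8):
    """Apply the decision rule described above: a match only if the matching
    value is greater than the predefined matching threshold."""
    return matching_value > matching_threshold

# Example: decide_match(0.93) -> True (match); decide_match(0.55) -> False (no match)
```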
At 104, the terminal device receives, from the server device, the matching result in response to the user’s face and the user identification. In some embodiments, the terminal device receives a binary value as the matching result, where a positive matching result (e.g., “1” ) indicates a match between the face data of the user collected at the terminal device (at the step 102) and the face data of the user stored at the server device, and a negative matching result (e.g., “0” ) indicates no match between the face data of the user collected at the terminal device (at the step 102) and the face data of the user stored at the server device.
In some other embodiments, the terminal device receives other information associated with the binary matching result such as, for example, the matching value calculated at the server device. In such embodiments, the other information (e.g., the matching value) can indicate a level of confidence in the matching result. For example, a relatively higher matching value indicates a higher level of confidence in a positive matching result or a lower level of confidence in a negative matching result, while a relatively lower matching value indicates a higher level of confidence in a negative matching result or a lower level of confidence in a positive matching result.
In yet some other embodiments, the terminal device receives information and/or data associated with the comparison of the collected face data and the stored face data, without receiving a binary matching result. For example, the terminal device receives the matching value without receiving a binary matching result as described above. In such embodiments, the terminal device can determine a matching result based on the received information and/or data. For example, the terminal device can compare the received matching value with a predefined matching threshold to determine a positive or negative matching result as described above.
At 105, the terminal device responds to the request for transferring data in accordance with the matching result. As discussed above, the matching result can be received from the server  device or determined at the terminal device based on information and/or data received from the server device. To respond to the request based on the matching result, if the matching result (e.g., a positive binary matching result) indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, the user is authenticated by the server device. As a result, the terminal device performs data transfer in compliance with the request. Otherwise, if the matching result (e.g., a negative binary matching result) indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, the user is not authenticated by the server device. As a result, the terminal device sends a message to the user declining the request. In some embodiments, for example, the terminal device sends a message prompting the user to reenter her request for transferring data (including her user identification) and/or to provide her face data again.
For example, in an online payment transaction, if the matching result indicates a match between the collected face data and the stored face data, the terminal device communicates with other devices to deduct a payment amount from an account of the user, and add that payment amount to an account of a payee. Information about the payment amount, the user’s account, and the payee’s account is included in the data transfer request previously generated or received at the terminal device. Furthermore, after the online payment transaction is completed, the terminal device generates and displays a message to the user, indicating the success of the online payment transaction. Otherwise, if the matching result indicates no match between the collected face data and the stored face data, the terminal device declines the data transfer request. Specifically, for example, the terminal device can generate and display an error message to the user indicating the failure of the user authentication and the decline of the data transfer operation.
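The terminal-side handling of the matching result reduces to a single branch, sketched below; perform_transfer and show_message are placeholders for the terminal device's own payment and display facilities, not names drawn from this description.

```python
def respond_to_request(match, request, perform_transfer, show_message):
    """Complete or decline the requested data transfer based on the matching result.

    'perform_transfer' and 'show_message' stand in for whatever payment and
    user-interface facilities the terminal device provides."""
    if match:
        perform_transfer(request)
        show_message("User authenticated; the requested transfer has been completed.")
    else:
        show_message("Authentication failed; the transfer request was declined. "
                     "Please re-enter your user identification or provide your face data again.")
```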
FIG. 2 is a flowchart illustrating a method 200 performed at a server device for authenticating a user associated with transferring data in accordance with some embodiments. The server device performing the method 200 is similar to the server device described above in the method 100 with respect to FIG. 1. Particularly, the server device is operatively coupled to and communicates with one or more terminal devices similar to the terminal device performing the method 100 described above with respect to FIG. 1. Furthermore, the server device performing the method 200 is configured to store data (e.g., face data, user identifications) of users that is used in user authentication for data transfer operations.
In some embodiments, the server device performing the method 200 includes one or more processors and memory. In such embodiments, the method 200 is implemented using instructions or code of an application that are stored in a non-transitory computer readable storage medium of the server device and executed by the one or more processors of the server device. The  application is associated with authenticating users for performing data transfer operations. Such an application typically has a server-side portion that is stored in and/or executed at the server devices, and a client-side portion that is stored in and/or executed at each terminal device operatively coupled to the server device. As a result of the server-side portion of the application being executed, the method 200 is performed at the server device. As shown in FIG. 2, the method 200 includes the following steps.
At 201, the server device receives, from a terminal device associated with a user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user. As described above with respect to the steps 101-103 of the method 100 in FIG. 1, prior to the user initiating the request for transferring data, the user is prompted to select a password method (e.g., from a set of potential password methods such as, for example, face data, text string, fingerprint, voice, slide operation, etc. ) to be used in user authentication associated with data transfer operations. The user can select, for example, face data as the password method for data transfer operations. As a result, the user is prompted to enter or select a user identification, as well as to provide face data of the user to a terminal device. In response, the user enters or selects a user identification (e.g., a user ID, a username) that uniquely identifies the user and differentiates the user from any other user. The terminal device also obtains the face data of the user by, for example, scanning a picture of the user, taking a picture of the user, using face recognition technology to capture face data of the user, receiving an uploaded image of the user, and/or the like.
After obtaining the user identification and face data of the user, the terminal device sends the user identification and the face data to the server device. The server device then stores the user identification of the user together with the face data of the user in, for example, a memory of the server device. Particularly, the user identification of the user is associated with the face data of the user in the storage of the server device, such that the stored face data of the user can be located and retrieved based on the user identification of the user.
In some embodiments, the server device is configured to receive updated face data identified by a user identification of a user. For example, the server device can periodically receive updated face data of a user, which is identified by a user identification of the user. For another example, the server device can receive updated face data of a user in response to such updated face data of the user being received at a terminal device operatively coupled to the server device. The updated face data of a user can reflect a change in the face of the user such as a removal of glasses, a newly-generated scar, a newly-grown beard, a change in hairstyle, a change due to a cosmetic surgery (e.g., eyelid surgery) , a change due to natural aging, and/or the like. In response to receiving the updated face data of the user, the server device can replace the stored face data identified by the user identification of the user with the updated face data identified by the same user identification. In this manner, the face data of the user stored at the server device can be kept up to date.
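A minimal sketch of this storage-and-update behavior is shown below, using an in-memory dictionary keyed by user identification as a stand-in for whatever database the server device actually uses.

```python
# In-memory stand-in for the server-side storage of face data keyed by user identification.
face_store = {}

def store_face_data(user_id, face_data):
    """Store face data so it can later be located and retrieved by the user identification."""
    face_store[user_id] = face_data

def update_face_data(user_id, updated_face_data):
    """Replace the stored face data identified by the user identification with updated face data."""
    if user_id not in face_store:
        raise KeyError(f"No face data stored for user identification {user_id!r}")
    face_store[user_id] = updated_face_data
```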
Subsequently, when the user initiates a data transfer operation, the user can generate a request for transferring data at a terminal device. The terminal device then sends the request to the server device. The request for transferring data includes the user identification of the user and other information and/or data associated with the requested data transfer operation (e.g., bank account number, credit card number, payment amount, etc. ) . Also, as discussed above with respect to the method 100 in FIG. 1, the terminal device receiving the data transfer request can be the same or different from the terminal device collecting the initial user identification and face data of the user.
At 202, the server device generates a matching result by comparing the data associated with the user’s face with stored face data that is identified by the user identification. As discussed above, the server device stores the user identification and face data of the user that are previously sent to the server device in response to the user selecting face data as the password method for data transfer operations. Furthermore, the face data of the user is associated with the user identification of the user in the storage of the server device. Thus, the stored face data of the user is identified by the user identification.
In response to receiving the user identification and the face data from the terminal device, the server device generates a matching result by comparing the received face data of the user with the stored face data of the user that is identified by the user identification. That is, the user identification included in the request is used to locate the stored face data of the user that is identified by that user identification in the storage of the server device. The stored face data of the user is then retrieved, and compared with the received face data of the user to generate the matching result. In some embodiments, the matching result generated at the server device indicates either a match or no match between the face data of the user received from the terminal device and the stored face data identified by the user identification.
In some embodiments, for example, the server device calculates a matching value based on the received face data of the user and the stored face data of the user identified by the user identification. The server device then compares the matching value with a predefined matching threshold. If the matching value is greater than the predefined matching threshold, the server device generates a matching result (e.g., a positive matching result, or a “1” result) indicating a match between the received face data of the user and the stored face data identified by the user identification. Otherwise, if the matching value is not greater than the predefined matching threshold,  the server device generates a matching result (e.g., a negative matching result, or a “0” result) indicating no match between the received face data of the user and the stored face data identified by the user identification.
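The description does not prescribe how the matching value itself is calculated. Purely as one common illustration, the sketch below compares fixed-length face feature vectors (embeddings) with cosine similarity; the use of embeddings and cosine similarity is an assumption, not part of the described method.

```python
import numpy as np

def cosine_matching_value(received_embedding, stored_embedding):
    """Illustrative matching value: cosine similarity between two face feature vectors.

    A higher value indicates a closer match; the actual comparison technique used by
    the server device is left open by the description."""
    a = np.asarray(received_embedding, dtype=float)
    b = np.asarray(stored_embedding, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Combined with the threshold rule sketched earlier, a value greater than the predefined
# matching threshold yields a positive ("1") matching result, otherwise a negative ("0") one.
```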
At 203, the server device sends the matching result to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result. In some embodiments, for example, the server device sends to the terminal device a binary value as the matching result, where a positive matching result (e.g., “1” ) indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, and a negative matching result (e.g., “0” ) indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device. In other embodiments, the matching result can be in any other suitable form. Additionally, in some embodiments, the server device can send, to the terminal device, other information and/or data associated with the comparison of the collected face data and the stored face data, with or without sending a binary matching result.
As discussed above with respect to the step 105 of the method 100 in FIG. 1, to respond to the request based on the matching result, if the matching result (e.g., a positive binary matching result) indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, the user is authenticated by the server device. As a result, the terminal device performs data transfer in compliance with the request. Otherwise, if the matching result (e.g., a negative binary matching result) indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device, the user is not authenticated by the server device. As a result, the terminal device sends a message to the user declining the request. For example, the terminal device sends a message prompting the user to reenter her request for transferring data (including her user identification) and/or to provide her face data again. For another example, the terminal device sends an error message to the user indicating the failure of the user authentication and the decline of the data transfer request.
FIG. 3 is a block diagram of a terminal device 300 configured to authenticate a user associated with transferring data in accordance with some embodiments. The terminal device 300 can be structurally and functionally similar to the terminal devices described with respect to FIGS. 1 and 2. Particularly, the terminal device 300 can be operatively coupled to and communicate with one or more server devices, at least one of which is configured to provide the user authentication service as described herein.
As shown in FIG. 3, the terminal device 300 includes a receive module 301, a data collection module 302, a transmit module 303 and a process module 304. In some embodiments, a terminal device can include more or less modules than those shown in FIG. 3, and/or connect to one or more external devices. For example, a terminal device can be connected to an external device (e.g., a webcam, a camera) configured to capture face data of users. For another example, a terminal device can include an input module configured to receive input of users (e.g., entered user identifications, entered requests for data transfer, etc. ) .
In some embodiments, each module included in the terminal device 300 can be a hardware-based module (e.g., a digital signal processor (DSP) , a field programmable gate array (FPGA) , an application-specific integrated circuit (ASIC) , etc. ) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc. ) , or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the terminal device 300 (not shown in FIG. 3) and executed at a processor (e.g., a CPU) of the terminal device 300 (not shown in FIG. 3) . Overall, the receive module 301, the data collection module 302, the transmit module 303 and the process module 304 can be configured to collectively perform the method 100 (e.g., the client-side portion of a user authentication application) shown and described above with respect to FIG. 1.
Specifically, the receive module 301 is configured to, among other functions, receive data transfer requests from users operating the terminal device 300. Each data transfer request received at the receive module 301 includes a user identification that uniquely identifies a user. The receive module 301 is also configured to receive matching results returned from a server device operatively coupled to and communicating with the terminal device 300. In some embodiments, the receive module 301 can receive data and/or information associated with a requested data transfer operation from, for example, input data entered by a user using a finger, a mouse or a keyboard; swiping and reading of a magnetic card of a user (e.g., a bank card, a debit card, a credit card, etc. ) ; scanning of a barcode (e.g., a one-dimensional barcode, a two-dimensional barcode, a QR code, etc. ) provided by a user; and/or the like.
The data collection module 302 is configured to, among other functions, collect face data of users. The data collection module 302 can be configured to communicate with and/or control an embedded device or external device (e.g., a camera, a webcam, etc. ) that is capable of capturing face data of users. In some embodiments, the data collection module 302 is configured to implement image processing techniques such as, for example, face recognition techniques. For example, the data collection module 302 can be configured to process a captured image to detect any human face  in the image. Additionally, in some embodiments, the data collection module 302 is configured to analyze an image of a human face to obtain face data from that image.
The transmit module 303 is configured to, among other functions, send received data transfer requests including user identifications and collected face data to the server device. Specifically, the transmit module 303 sends face data of a user together with the received user identification of that user, such that the received user identification can be used to locate face data identified by the received user identification that is stored at the server device.
The process module 304 is configured to, among other functions, determine a responding operation for a data transfer request based on a matching result for the data transfer request that is received at the receive module 301. Specifically, as discussed above, the process module 304 can proceed to complete the requested data transfer if the matching result for that data transfer request indicates a match between the face data of the user collected at the terminal device and the face data of the user stored at the server device and identified by the user identification collected at the terminal device. Additionally, the process module 304 can optionally generate and display a message to the user indicating the success of the user authentication and/or the completion of the data transfer operation.
Otherwise, the process module 304 can decline the data transfer request if the matching result for that data transfer request indicates no match between the face data of the user collected at the terminal device and the face data of the user stored at the server device and identified by the user identification collected at the terminal device. Additionally, the process module 304 can optionally generate and display an error message to the user indicating the failure of the user authentication and/or the decline of the data transfer operation. Furthermore, in some embodiments, in addition to a decline of a data transfer request, the process module 304 can prompt the user to reenter a user identification (or other information and/or data of the requested data transfer operation) , and/or prompt the user to provide face data again.
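As a rough illustration of how the four modules cooperate, the sketch below wires placeholder callables together in the order described for modules 301 through 304. The class and parameter names are not from the source, and the real modules may be hardware-based rather than software objects.

```python
class TerminalAuthModules:
    """Illustrative composition of the receive, data collection, transmit and process
    roles described above (all names here are placeholders, not from the source)."""

    def __init__(self, collect_face_data, send_to_server, process_result):
        self.collect_face_data = collect_face_data  # data collection module role
        self.send_to_server = send_to_server        # transmit module role
        self.process_result = process_result        # process module role

    def handle_request(self, request):
        """Receive module role: take a data transfer request containing the user
        identification, gather face data, obtain a matching result, and respond."""
        face_data = self.collect_face_data()
        matching_result = self.send_to_server(request["user_id"], face_data)
        return self.process_result(matching_result, request)
```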
FIG. 4 is a block diagram illustrating structure of a terminal device 400 in accordance with some embodiments. The terminal device 400 can be structurally and functionally similar to the terminal devices shown and/or described with respect to FIGS. 1-3. Particularly, the terminal device 400 can be operatively coupled to and communicate with one or more server devices, at least one of which is configured to provide the user authentication service as described herein. As shown in FIG. 4, the terminal device 400 includes a processor 401, a communication bus 402, a network interface 403, and a memory 404. In some embodiments, a terminal device can include more or less devices, components and/or modules than those shown in FIG. 4.
The processor 401 can be any processing device capable of performing the method 100 (e.g., the client-side portion of a user authentication application) described with respect to FIG. 1. Such a processor can be, for example, a CPU, a DSP, a FPGA, an ASIC, and/or the like. The processor 401 can be configured to control the operations of other components and/or modules of the terminal device 400. For example, the processor 401 can be configured to control operations of the network interface 403. For another example, the processor 401 can be configured to execute instructions or code stored in a software program or module (e.g., the user authentication application) within the memory 404. The communication bus 402 is configured to implement connections and communication among the other components of the terminal device 400.
The network interface 403 is configured to provide and control network interfaces of the terminal device 400 that are used to interact with other network devices (e.g., server devices, other terminal devices) . The network interface 403 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface) . In some embodiments, the network interface 403 is used for connecting the terminal device 400 with one or more server devices and performing data communication with the one or more server devices. In such embodiments, as described above with respect to FIGS. 1-2, the network interface 403 is configured to transmit to the server device (s) , for example, data transfer requests including user identifications, face data, etc. The network interface 403 is also configured to receive from the server device (s) , for example, matching results and/or other data associated with user authentication. In some embodiments, operations of the network interface 403 are controlled by instructions or code stored in the memory 404.
In some embodiments, the memory 404 can include, for example, a random-access memory (RAM) (e.g., a DRAM, a SRAM, a DDR RAM, etc. ) , a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some embodiments, the memory 404 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the terminal device 400.
As shown in FIG. 4, the memory 404 includes program code associated with, for example, the user authentication application described herein. In some embodiments, each component, program, application or module included in the memory 404 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor) , or a combination of hardware and software modules. Instructions or code of each component, program, application or module can be stored in the memory 404 and executed at the processor 401.  Particularly, the instructions or code of the method 100 (e.g., a client-side portion of a user authentication application) shown and described above with respect to FIG. 1 are stored in the program code within the memory 404. In some embodiments, the processor 401 is configured to perform the instructions or code stored in the program code within the memory 404, as shown and described above with respect to the method 100 in FIG. 1.
FIG. 5 is a block diagram of a server device 500 configured to authenticate a user associated with transferring data in accordance with some embodiments. The server device 500 can be structurally and functionally similar to the server devices described with respect to FIGS. 1 and 2. Particularly, the server device 500 can be operatively coupled to and communicate with one or more terminal devices, at least one of which and the server device are configured to collectively provide the user authentication service as described herein.
As shown in FIG. 5, the server device 500 includes a receive module 501, a process module 502, and a transmit module 503. The process module 502 includes a calculation unit 5021 and a determination unit 5022. In some embodiments, a server device can include more or less modules than those shown in FIG. 5. For example, a server device can include a storage module configured to store and update face data and user identifications of the users.
In some embodiments, each module (or submodule, unit) included in the server device 500 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC, etc. ) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc. ) , or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the server device 500 (not shown in FIG. 5) and executed at a processor (e.g., a CPU) of the server device 500 (not shown in FIG. 5) . Overall, the receive module 501, the process module 502 (including the calculation unit 5021 and the determination unit 5022) , and a transmit module 503 can be configured to collectively perform the method 200 (e.g., the server-side portion of a user authentication application) shown and described above with respect to FIG. 2.
Specifically, the receive module 501 is configured to, among other functions, receive from the terminal device (s) data transfer requests including user identifications and face data collected at the terminal device (s) in response to the data transfer requests. Specifically, the receive module 501 receives face data of a user together with the user identification of that user, such that the received user identification of the user can be associated with and stored together with the face data of the user. Thus, the user identification of the user can be used to locate face data identified by the user identification that is stored at the server device 500.
The process module 502 is configured to, among other functions, compare face data that is collected at a terminal device and received at the receive module 501 with stored face data identified by the corresponding user identification received at the receive module 501. The process module 502 can further generate a matching result and/or other data (e.g., a matching value) based on the comparison.
In some embodiments, specifically, the calculation unit 5021 is configured to calculate a matching value based on the comparison between the received face data and the stored face data. Typically, a high matching value indicates a high degree of match, and a low matching value indicates a low degree of match. Subsequently, the determination unit 5022 is configured to determine a matching result based on the matching value calculated at the calculation unit 5021. In some embodiments, for example, the determination unit 5022 compares the calculated matching value with a predefined matching threshold, and determines a positive matching result (i.e., indicating a match) if the calculated matching value is greater than the predefined matching threshold, or a negative matching result (i.e., indicating no match) if the calculated matching value is not greater than the predefined matching threshold.
The transmit module 503 is configured to, among other functions, transmit matching results generated at the process module 502 to the corresponding terminal devices. In some embodiments, the transmit module 503 can transmit to the terminal devices other data (e.g., matching values) generated at the process module 502 that is associated with the user authentication. As a result, the corresponding terminal devices can respond to the users in accordance with the matching results and/or other data associated with the user authentication.
FIG. 6 is a block diagram illustrating structure of a server device 600 in accordance with some embodiments. The server device 600 can be structurally and functionally similar to the server devices shown and/or described with respect to FIGS. 1-2 and 5. Particularly, the server device 600 can be operatively coupled to and communicate with one or more terminal devices. The server device 600 and at least one of the terminal devices are configured to collectively provide the user authentication service as described herein. As shown in FIG. 6, the server device 600 includes a processor 601, a communication bus 602, a network interface 603, and a memory 604. In some embodiments, a server device can include more or less devices, components and/or modules than those shown in FIG. 6.
The processor 601 can be any processing device capable of performing the method 200 (e.g., the server-side portion of a user authentication application) described with respect to FIG. 2. Such a processor can be, for example, a CPU, a DSP, a FPGA, an ASIC, and/or the like. The processor 601 can be configured to control the operations of other components and/or modules of the server device 600. For example, the processor 601 can be configured to control operations of the network interface 603. For another example, the processor 601 can be configured to execute instructions or code stored in a software program or module (e.g., the user authentication application) within the memory 604. The communication bus 602 is configured to implement connections and communication among the other components of the server device 600.
The network interface 603 is configured to provide and control network interfaces of the server device 600 that are used to interact with other network devices (e.g., terminal devices, other server devices) . The network interface 603 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface) . In some embodiments, the network interface 603 is used for connecting the server device 600 with one or more terminal devices and performing data communication with the one or more terminal devices. In such embodiments, as described above with respect to FIGS. 1-2, the network interface 603 is configured to receive from the terminal device (s) , for example, data transfer requests including user identifications, face data, etc. The network interface 603 is also configured to transmit to the terminal device (s) , for example, matching results and/or other data associated with user authentication. In some embodiments, operations of the network interface 603 are controlled by instructions or code stored in the memory 604.
In some embodiments, the memory 604 can include, for example, a RAM (e.g., a DRAM, a SRAM, a DDR RAM, etc. ) , a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some embodiments, the memory 604 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the server device 600.
As shown in FIG. 6, the memory 604 includes program code associated with, for example, the user authentication application described herein. In some embodiments, each component, program, application or module included in the memory 604 can be a hardware-based module (e.g., a DSP, a FPGA, an ASIC) , a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor) , or a combination of hardware and software modules. Instructions or code of each component, program, application or module can be stored in the memory 604 and executed at the processor 601. Particularly, the instructions or code of the method 200 (e.g., a server-side portion of a user authentication application) shown and described above with respect to FIG. 2 are stored in the program code within the memory 604. In some embodiments, the processor 601 is configured to  perform the instructions or code stored in the program code within the memory 604, as shown and described above with respect to the method 200 in FIG. 2.
FIG. 7 is a schematic diagram illustrating a system 700 configured to authenticate a user associated with transferring data in accordance with some embodiments. As shown in FIG. 7, the system 700 includes a server device 702 and a terminal device 701. The server device 702 is operatively coupled to the terminal device 701 via a network 703. The terminal device 701 is operated by a user 704. The server device 702 can be structurally and functionally similar to the server device performing the method 100 described above with respect to FIG. 1 and the server devices shown and described above with respect to FIGS. 5-6. The terminal device 701 can be structurally and functionally similar to the terminal device described above with respect to FIG. 1 and the terminal devices shown and described above with respect to FIGS. 3-4. The user 704 can be similar to the user described above with respect to FIG. 1. The network 703 can be similar to the network described above with respect to FIG. 1.
In operation, the user 704 initially operates a terminal device (e.g., the terminal device 701 or another terminal device) to select face data as a password method for user authentication associated with data transfer operations. The terminal device prompts the user 704 to enter a user identification (e.g., a username) that uniquely identifies the user 704. The terminal device also prompts the user 704 to provide face data by, for example, taking a picture of the user 704, uploading a picture of the user 704, scanning an image of the user 704, and/or the like. Subsequently, the terminal device sends the entered user identification of the user 704 and the received face data of the user 704 to the server device 702. The server device 702 then stores the user identification of the user 704 and the face data of the user 704, such that the face data of the user 704 can be identified and located using the user identification of the user 704.
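For illustration only, a minimal Python sketch of the enrollment step described above is given below; the in-memory dictionary and the function name enroll_user are assumptions made for the sketch, not elements of the disclosed embodiments.

```python
# Minimal enrollment sketch, assuming an in-memory store; a real server device
# would persist the face data in a database keyed by the user identification.
stored_face_data = {}  # user identification -> enrolled face data


def enroll_user(user_identification: str, face_data: bytes) -> None:
    """Store face data so it can later be located by the user identification."""
    stored_face_data[user_identification] = face_data


# Example: the server device 702 stores what the terminal device sent.
enroll_user("user-704", b"...picture of the user's face...")
```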
After the user identification and face data of the user 704 are stored at the server device 702, the user 704 is allowed to initiate data transfer operations. Specifically, the user 704 operates the terminal device 701 to generate a data transfer request, which includes the user identification of the user 704 and other data associated with the requested data transfer (e.g., bank account number, payment amount, etc.). In response to receiving the data transfer request, the terminal device 701 collects face data of the user 704 by, for example, taking a picture of the user 704 using a camera or a webcam.
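A minimal terminal-side sketch of this step is shown below, for illustration only; capture_face_image is a hypothetical placeholder standing in for whichever camera or webcam API the terminal device 701 actually uses, and the payload keys and example values are assumed names.

```python
# Terminal-side sketch: build a data transfer request and attach collected face data.
def capture_face_image() -> bytes:
    # Hypothetical placeholder for taking a picture with a camera or webcam;
    # fixed bytes are returned here so the sketch stays self-contained.
    return b"...captured face image bytes..."


def build_transfer_request(user_identification: str,
                           account_number: str,
                           amount: float) -> dict:
    """Bundle the user identification, transfer details, and collected face data."""
    return {
        "user_identification": user_identification,
        "account_number": account_number,
        "amount": amount,
        "face_data": capture_face_image(),
    }


request = build_transfer_request("user-704", "account-123", 100.0)
```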
The terminal device 701 then sends the received user identification and the collected face data to the server device 702. The server device 702 retrieves the stored face data identified by the received user identification. The server device 702 then compares the stored face data with the face data recently received from the terminal device 701. The server device 702 generates a matching result based on the comparison, which can be, for example, a positive matching result indicating a match between the stored face data and the received face data, or a negative matching result indicating no match between the stored face data and the received face data. The server device 702 sends the generated matching result to the terminal device 701.
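For illustration only, the comparison and threshold logic described above (and recited in claim 5) can be sketched as follows; compute_matching_value is a trivial stand-in for a real face-matching algorithm, and the threshold value 0.8 and the store contents are assumed examples.

```python
# Server-side sketch of generating the matching result; not a real face matcher.
MATCHING_THRESHOLD = 0.8  # assumed example value

stored_face_data = {"user-704": b"...enrolled face data..."}  # hypothetical store


def compute_matching_value(received: bytes, stored: bytes) -> float:
    # Stand-in: a real implementation would compare extracted facial features.
    return 1.0 if received == stored else 0.0


def generate_matching_result(user_identification: str, received_face: bytes) -> bool:
    """Return True for a match, False for no match, per the threshold rule."""
    stored = stored_face_data.get(user_identification)
    if stored is None:
        return False  # no face data enrolled under this user identification
    matching_value = compute_matching_value(received_face, stored)
    return matching_value > MATCHING_THRESHOLD  # match only if strictly greater
```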
In response to receiving the matching result from the server device 702, the terminal device 701 responds to the data transfer request accordingly based on the matching result. If the matching result indicates a match between the stored face data and the received face data, the user 704 is authenticated and the terminal device 701 proceeds to perform the requested data transfer operation. The terminal device 701 can further generate and display a message to the user 704 indicating the success of user authentication and/or the completion of data transfer. Otherwise, if the matching result indicates no match between the stored face data and the received face data, the user 704 is not authenticated and the terminal device 701 declines to perform the requested data transfer operation. The terminal device 701 can further generate and display an error message to the user 704 indicating the failure of user authentication and/or the decline of the data transfer request. Additionally, the terminal device 701 can prompt the user 704 to reenter a user identification and/or to provide face data again.
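A minimal sketch of this terminal-side handling of the matching result is given below, for illustration only; perform_transfer and the displayed messages are assumptions made for the sketch, not part of the disclosed embodiments.

```python
# Terminal-side sketch: respond to the data transfer request based on the matching result.
def perform_transfer(request: dict) -> None:
    # Hypothetical stand-in for performing the requested data transfer operation.
    print(f"Transferring {request['amount']} from account {request['account_number']}")


def handle_matching_result(match: bool, request: dict) -> None:
    """Proceed with the transfer on a match; otherwise decline and prompt the user."""
    if match:
        perform_transfer(request)
        print("User authenticated; data transfer completed.")
    else:
        print("Authentication failed; data transfer request declined.")
        print("Please re-enter your user identification or provide your face data again.")
```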
The foregoing description has, for purposes of explanation, been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the present application to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present application and its practical applications, to thereby enable others skilled in the art to best utilize the present application and various embodiments with various modifications as are suited to the particular use contemplated.
While particular embodiments are described above, it will be understood that they are not intended to limit the present application to these particular embodiments. On the contrary, the present application includes alternatives, modifications and equivalents that are within the spirit and scope of the appended claims. Numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in the description of the present application and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
Although some of the various drawings illustrate a number of logical stages in a particular order, stages that are not order dependent may be reordered and other stages may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be obvious to those of ordinary skill in the art, so the orderings and groupings presented herein are not an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.

Claims (20)

  1. A method of authenticating a user at a server device associated with transferring data, comprising:
    at a computer system having one or more processors and memory for storing programs to be executed by the one or more processors:
    receiving, from a terminal device associated with the user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user;
    generating a matching result at the server device by comparing the data associated with the user’s face with face data identified by the user identification that is stored at the server device; and 
    sending the matching result from the server device to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.
  2. The method of claim 1, wherein the sending the matching result includes sending the matching result from the server device to the terminal device such that if the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device performs data transfer in compliance with the request; and if the matching result indicates no match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device sends a message to the user declining the request.
  3. The method of claim 1, wherein the face data identified by the user identification was provided by the user to the server device prior to the server device receiving the data associated with the user’s face and the user identification.
  4. The method of claim 1, wherein the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification or no match between the data associated with the user’s face and the stored face data identified by the user identification.
  5. The method of claim 1, wherein the generating the matching result includes:
    calculating a matching value based on the data associated with the user’s face and the stored face data identified by the user identification;
    comparing the matching value with a predefined matching threshold;
    if the matching value is greater than the predefined matching threshold, generating a matching result indicating a match between the data associated with the user’s face and the stored face data identified by the user identification; and
    if the matching value is not greater than the predefined matching threshold, generating a matching result indicating no match between the data associated with the user’s face and the stored face data identified by the user identification.
  6. The method of claim 1, wherein the request for transferring data includes a request for transferring a financial asset from an account of the user.
  7. The method of claim 1, further comprising:
    receiving updated face data identified by the user identification; and
    replacing the stored face data identified by the user identification with the updated face data identified by the user identification.
  8. An apparatus configured to authenticate a user associated with transferring data, comprising:
    one or more processors; and
    memory storing one or more programs to be executed by the one or more processors, the one or more programs comprising instructions for:
    receiving, from a terminal device associated with the user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user;
    generating a matching result by comparing the data associated with the user’s face with face data identified by the user identification that is stored at the apparatus; and
    sending the matching result to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.
  9. The apparatus of claim 8, wherein the sending the matching result includes sending the matching result from the apparatus to the terminal device such that if the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device performs data transfer in compliance with the request; and if the matching result indicates no match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device sends a message to the user declining the request.
  10. The apparatus of claim 8, wherein the face data identified by the user identification was provided by the user to the apparatus prior to the apparatus receiving the data associated with the user’s face and the user identification.
  11. The apparatus of claim 8, wherein the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification or no match between the data associated with the user’s face and the stored face data identified by the user identification.
  12. The apparatus of claim 8, wherein the generating the matching result includes:
    calculating a matching value based on the data associated with the user’s face and the stored face data identified by the user identification;
    comparing the matching value with a predefined matching threshold;
    if the matching value is greater than the predefined matching threshold, generating a matching result indicating a match between the data associated with the user’s face and the stored face data identified by the user identification; and
    if the matching value is not greater than the predefined matching threshold, generating a matching result indicating no match between the data associated with the user’s face and the stored face data identified by the user identification.
  13. The apparatus of claim 8, wherein the request for transferring data includes a request for transferring a financial asset from an account of the user.
  14. The apparatus of claim 8, the one or more programs further comprising instructions for:
    receiving updated face data identified by the user identification; and
    replacing the stored face data identified by the user identification with the updated face data identified by the user identification.
  15. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by one or more processors, cause the processors to perform operations comprising:
    receiving, from a terminal device associated with a user and in response to a request for transferring data received at the terminal device, data associated with the user’s face and a user identification uniquely associated with the user;
    generating a matching result at a server device by comparing the data associated with the user’s face with face data identified by the user identification that is stored at the server device; and
    sending the matching result from the server device to the terminal device such that the terminal device responds to the request for transferring data in accordance with the matching result.
  16. The non-transitory computer readable storage medium of claim 15, wherein the sending the matching result includes sending the matching result from the server device to the terminal device such that if the matching result indicates a match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device performs data transfer in compliance with the request; and if the matching result indicates no match between the data associated with the user’s face and the stored face data identified by the user identification, the terminal device sends a message to the user declining the request.
  17. The non-transitory computer readable storage medium of claim 15, wherein the face data identified by the user identification was provided by the user to the server device prior to the server device receiving the data associated with the user’s face and the user identification.
  18. The non-transitory computer readable storage medium of claim 15, wherein the generating the matching result includes:
    calculating a matching value based on the data associated with the user’s face and the stored face data identified by the user identification;
    comparing the matching value with a predefined matching threshold;
    if the matching value is greater than the predefined matching threshold, generating a matching result indicating a match between the data associated with the user’s face and the stored face data identified by the user identification; and
    if the matching value is not greater than the predefined matching threshold, generating a matching result indicating no match between the data associated with the user’s face and the stored face data identified by the user identification.
  19. The non-transitory computer readable storage medium of claim 15, wherein the request for transferring data includes a request for transferring a financial asset from an account of the user.
  20. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
    receiving updated face data identified by the user identification; and
    replacing the stored face data identified by the user identification with the updated face data identified by the user identification.
PCT/CN2015/070226 2014-01-09 2015-01-06 Method, apparatus and system for authenticating user WO2015103970A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410010506.0A CN104778389A (en) 2014-01-09 2014-01-09 Numerical value transferring method, terminal, server and system
CN201410010506.0 2014-01-09

Publications (1)

Publication Number Publication Date
WO2015103970A1 true WO2015103970A1 (en) 2015-07-16

Family

ID=53523546

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/070226 WO2015103970A1 (en) 2014-01-09 2015-01-06 Method, apparatus and system for authenticating user

Country Status (2)

Country Link
CN (1) CN104778389A (en)
WO (1) WO2015103970A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105488376B (en) * 2015-12-01 2018-08-07 北京中微锐芯科技有限公司 Determine the method and device of identity
CN110322612B (en) * 2018-03-29 2021-09-24 腾讯科技(深圳)有限公司 Business data processing method and device, computing equipment and storage medium
CN111954011A (en) * 2020-08-06 2020-11-17 广州华多网络科技有限公司 Virtual gift giving method and device, computer equipment and readable storage medium


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103093349A (en) * 2011-10-31 2013-05-08 深圳光启高等理工研究院 Mobile paying method and mobile terminal apparatus
CN202503577U (en) * 2012-03-30 2012-10-24 上海华勤通讯技术有限公司 Face recognition anti-theft mobile phone
CN103268549A (en) * 2013-04-24 2013-08-28 徐明亮 Mobile payment verification system based on facial features

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075868A (en) * 2006-05-19 2007-11-21 华为技术有限公司 Long-distance identity-certifying system, terminal, servo and method
CN101089874A (en) * 2006-06-12 2007-12-19 华为技术有限公司 Identify recognising method for remote human face image
WO2013000142A1 (en) * 2011-06-30 2013-01-03 深圳市君盛惠创科技有限公司 Mobile phone user identity authentication method, cloud server and network system
CN103034880A (en) * 2012-12-14 2013-04-10 上海第二工业大学 Examination identity authentication system and identity authentication method based on face recognition

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017155466A1 (en) * 2016-03-09 2017-09-14 Trakomatic Pte. Ltd. Method and system for visitor tracking at a pos area
IT201800006758A1 (en) * 2018-06-28 2019-12-28 System and method of online verification of the identity of a subject
WO2020003186A1 (en) * 2018-06-28 2020-01-02 Inventia S.R.L. System and method for online verification of the identity of a subject

Also Published As

Publication number Publication date
CN104778389A (en) 2015-07-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15735548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 17/11/2016)

122 Ep: pct application non-entry in european phase

Ref document number: 15735548

Country of ref document: EP

Kind code of ref document: A1