WO2015096503A1 - Method, device and system for associating and managing payment accounts - Google Patents


Info

Publication number
WO2015096503A1
WO2015096503A1 (application PCT/CN2014/085566, also referenced as CN2014085566W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
voice input
payment account
terminal device
Prior art date
Application number
PCT/CN2014/085566
Other languages
French (fr)
Inventor
Wenpeng ZHANG
Chen Gong
Wenjing Zhang
Yiyong YANG
Jiawei Jiang
Guoguo LIU
Yaqin Guo
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of WO2015096503A1 publication Critical patent/WO2015096503A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/24Accounting or billing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/02Payment architectures, schemes or protocols involving a neutral party, e.g. certification authority, notary or trusted third party [TTP]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/12Payment architectures specially adapted for electronic shopping systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3221Access to banking information through M-devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/14Session management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M15/00Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
    • H04M15/70Administration or customization aspects; Counter-checking correct charges
    • H04M15/705Account settings, e.g. limits or numbers or payment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M15/00Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
    • H04M15/70Administration or customization aspects; Counter-checking correct charges
    • H04M15/715Activating new subscriber or card
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M15/00Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
    • H04M15/70Administration or customization aspects; Counter-checking correct charges
    • H04M15/72Administration or customization aspects; Counter-checking correct charges by the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M15/00Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
    • H04M15/70Administration or customization aspects; Counter-checking correct charges
    • H04M15/72Administration or customization aspects; Counter-checking correct charges by the user
    • H04M15/723Administration or customization aspects; Counter-checking correct charges by the user using the user's device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M17/00Prepayment of wireline communication systems, wireless communication systems or telephone systems
    • H04M17/02Coin-freed or check-freed systems, e.g. mobile- or card-operated phones, public telephones or booths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M17/00Prepayment of wireline communication systems, wireless communication systems or telephone systems
    • H04M17/10Account details or usage
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M17/00Prepayment of wireline communication systems, wireless communication systems or telephone systems
    • H04M17/30Prepayment of wireline communication systems, wireless communication systems or telephone systems using a code
    • H04M17/301Code input or reading
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M17/00Prepayment of wireline communication systems, wireless communication systems or telephone systems
    • H04M17/30Prepayment of wireline communication systems, wireless communication systems or telephone systems using a code
    • H04M17/308Code management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M17/00Prepayment of wireline communication systems, wireless communication systems or telephone systems
    • H04M17/20Prepayment of wireline communication systems, wireless communication systems or telephone systems with provision for recharging the prepaid account or card, or for credit establishment
    • H04M17/204Prepayment of wireline communication systems, wireless communication systems or telephone systems with provision for recharging the prepaid account or card, or for credit establishment on-line recharging, e.g. cashless
    • H04M17/205Prepayment of wireline communication systems, wireless communication systems or telephone systems with provision for recharging the prepaid account or card, or for credit establishment on-line recharging, e.g. cashless by calling a service number, e.g. interactive voice response [IVR] or menu
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W8/00Network data management
    • H04W8/22Processing or transfer of terminal data, e.g. status or physical capabilities
    • H04W8/24Transfer of terminal data

Definitions

  • the present application generally relates to the field of mobile Internet technologies, and more particularly to a method, devices (e.g., terminal device, server device) and system for associating and managing payment accounts.
  • Some known online payment systems require a user to provide information of a payment account (e.g., a bank card) online such that the provided payment account can be bound to a user account of the user. For example, the user can be requested to provide a bank card number, a cardholder name, a cardholder identification number, a mobile number, etc.
  • Such known online payment systems typically use a physical or virtual keyboard of a terminal device (e.g., a mobile phone) as the input means, and require the user to operate on (e.g., click on) the keyboard to enter the information of the payment account.
  • Such an input means can be inconvenient and prone to errors for the user. For example, typing on the keyboard and switching between input methods (e.g., switching between a letter keyboard and a numeric keyboard) can cause errors in the user input.
  • a method for associating a payment account with a user account is disclosed.
  • the method is performed at a terminal device having one or more processors and memory for storing programs to be executed by the one or more processors.
  • the method includes receiving, through an account binding interface associated with the user account and from a user, a first voice input.
  • the first voice input includes first information of the payment account.
  • the method also includes performing user authentication and information extraction based on the first voice input.
  • the user authentication is based on a voiceprint of the first voice input
  • the information extraction is based on speech recognition of the first voice input.
  • the performing user authentication and information extraction includes sending the first voice input to a server device such that the server device (1) performs the user authentication based on the voiceprint of the first voice input and stored voice authentication information for the user account, and (2) performs the speech recognition of the first voice input to extract the first information of the payment account if the user authentication is successful.
  • the voice authentication information is provided to the server device before the first voice input is received at the terminal device.
  • the method further includes providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input.
  • the method includes presenting the first information of the payment account as extracted from the first voice input to the user through the account binding interface.
  • Such a presentation can include, for example, at least one of (1) displaying a textual message including the first information of the payment account, and (2) providing a text-to-speech output presenting the first information of the payment account as extracted from the first voice input.
  • the method includes receiving a voice command of the user associated with the first information of the payment account presented to the user.
  • the method further includes performing, according to the voice command, an operation associated with the first information of the payment account presented to the user.
  • an operation can include, for example, one of (1) submitting, to a server device, a confirmation on the first information of the payment account presented to the user, and (2) deleting or modifying the first information of the payment account presented to the user.
  • the method includes detecting a possible error in the information extraction based on speech recognition of the first voice input, and presenting the possible error to the user.
  • a possible error can be, for example, a low-confidence recognition result in the first information as extracted from the first voice input.
  • the method includes automatically selecting, based on a character type associated with the first information, a matching keyboard for the character type from a group of soft keyboards associated with the terminal device. The method further includes automatically, without further user input, displaying the automatically selected matching keyboard to prompt the user to enter correction data for the first information of the payment account presented to the user.
  • the method includes receiving a second voice input from the user that includes second information of the payment account.
  • the second information is different from the first information with respect to a predetermined characteristic.
  • the predetermined characteristic can indicate, for example, whether respective information contained in a respective voice input meets a predetermined criticality level, or whether respective information contained in a respective voice input is an initial voice input received through the account binding interface.
  • the method also includes performing information extraction based on the second voice input without performing user authentication based on the second voice input.
  • a terminal device includes one or more processors and memory storing one or more programs for execution by the one or more processors.
  • the one or more programs include instructions that cause the terminal device to perform the method for associating a payment account with a user account as described above.
  • a non-transitory computer readable storage medium stores one or more programs including instructions for execution by one or more processors. The instructions, when executed by the one or more processors, cause the processors to perform the method for associating a payment account with a user account at a terminal device as described above.
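  • To make the claimed terminal-side flow concrete, the following minimal sketch (in Python, not part of the patent) walks through the steps of the method: capture a voice input, send it to the server, receive the extracted information, present it for confirmation, and report the binding result. The helper names (send_voice_input, send_confirmation) and the response shape are assumptions for illustration only.

```python
# Minimal sketch of the terminal-side binding flow described above. All server
# interactions are hypothetical stand-ins; a real client would use the payment
# application's own API and capture real audio from the microphone.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ExtractionResult:
    authenticated: bool                 # outcome of the voiceprint check
    card_number: Optional[str] = None   # first information extracted by the server


def send_voice_input(audio: bytes, user_account: str) -> ExtractionResult:
    """Hypothetical call: upload the voice input so the server can authenticate the
    voiceprint and, if successful, run speech recognition."""
    # Canned response so the sketch runs end to end without a real server.
    return ExtractionResult(authenticated=True, card_number="6222021234567890")


def send_confirmation(user_account: str, card_number: str) -> bool:
    """Hypothetical call: ask the server to bind the payment account."""
    return True


def bind_payment_account(audio: bytes, user_account: str) -> bool:
    result = send_voice_input(audio, user_account)
    if not result.authenticated or result.card_number is None:
        print("Authentication or recognition failed; nothing was bound.")
        return False

    # Present the extracted information so the user can verify it before submission.
    print(f"Recognized card number: {result.card_number}")
    user_confirmed = True  # in a real client, ask via the account binding interface

    if user_confirmed and send_confirmation(user_account, result.card_number):
        print("Payment account bound to user account.")
        return True
    return False


if __name__ == "__main__":
    bind_payment_account(b"<captured voice input>", user_account="user-123")
```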
  • FIG. 1 is a flow chart illustrating a method performed at a terminal device for associating a payment account with a user account in accordance with some embodiments.
  • FIG. 2 is a flow chart illustrating a method performed at a server device for associating a payment account with a user account in accordance with some embodiments.
  • FIG. 3 is a flow chart illustrating a method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
  • FIG. 4 is a flow chart illustrating another method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
  • FIG. 5 is a flow chart illustrating yet another method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
  • FIG. 6 is a block diagram illustrating modules of a terminal device for associating and managing payment accounts in accordance with some embodiments.
  • FIG. 7 is a block diagram illustrating components of a terminal device for associating and managing payment accounts in accordance with some embodiments.
  • FIG. 8 is a block diagram illustrating modules of a server device for associating and managing payment accounts in accordance with some embodiments.
  • FIG. 9 is a block diagram illustrating components of a server device for associating and managing payment accounts in accordance with some embodiments.
  • FIG. 10 is a schematic diagram illustrating a terminal device and a server device configured to associate and manage payment accounts in accordance with some embodiments.
  • FIG. 11 is a schematic diagram illustrating a user interface of a terminal device for entering information of a payment account in accordance with some embodiments.
  • FIG. 12 is a schematic diagram illustrating another user interface of a terminal device for entering information of a payment account in accordance with some embodiments.
  • FIG. 13 is a schematic diagram illustrating a user interface of a terminal device for correcting information of a payment account in accordance with some embodiments.
  • FIG. 1 is a flow chart illustrating a method 100 performed at a terminal device for associating a payment account with a user account in accordance with some embodiments.
  • the terminal device performing the method 100 can be any type of electronic device that can be used by a user to access an online payment service. Such a terminal device can be configured to communicate with a server device (e.g., via a network) to access the online payment service.
  • the terminal device can be configured to interact with the user operating the terminal device to provide the online payment service and related services and/or functions to the user. Details of a terminal device and a server device configured to provide an online payment service are shown and described below with respect to FIG. 10.
  • the terminal device can be, for example, a cellular phone, a smart phone, a mobile Internet device (MID), a personal digital assistant (PDA), a tablet computer, an e-reader, a laptop computer, a handheld computer, a wearable device, a desktop computer, a vehicle terminal, and/or any other electronic device.
  • a terminal device can be referred to as, for example, a client device, a user device, a mobile device, a portable device, a terminal, and/or the like. Details of a terminal device are shown and described below with respect to FIGS. 6 and 7.
  • a server device communicating with the terminal device performing the method 100 can be any type of device configured to function as a server-side device to provide the online payment service and related services and/or functions (e.g., account management) described herein.
  • a server device can typically be configured to communicate with multiple terminal devices via one or more networks.
  • the server device can be, for example, a background server, a back end server, a database server, a workstation, a desktop computer, a cloud computing server, a data processing server, an instant messaging server, a Social Networking Service (SNS) server, a payment server, and/or the like.
  • a server device can be a server cluster or server center consisting of two or more servers (e.g., a data processing server and a database server). Details of a server device are shown and described below with respect to FIGS. 8 and 9.
  • a network connecting the terminal device and the server device can be any type of network configured to operatively couple one or more server devices to one or more terminal devices, and enable communications between the server device(s) and the terminal device(s).
  • a network can include one or more networks such as, for example, a cellular network, a satellite network, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), Internet, etc.
  • such a network can be optionally implemented using any known network protocol including various wired and/or wireless protocols such as, for example, Ethernet, universal serial bus (USB), global system for mobile communications (GSM), enhanced data GSM environment (EDGE), general packet radio service (GPRS), long term evolution (LTE), code division multiple access (CDMA), wideband code division multiple Access (WCDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over internet protocol (VoIP), Wi-MAX, etc.
  • a user operating the terminal device performing the method 100 can be any person who uses the online payment service via the terminal device.
  • Such a person can use the online payment service to, for example, make online payments, conduct online shopping, etc.
  • FIG. 10 is a schematic diagram illustrating a terminal device 1020 and a server device 1040 configured to associate and manage payment accounts in accordance with some embodiments.
  • the terminal device 1020 can be structurally and functionally similar to the terminal device performing the method 100 as described above.
  • the server device 1040 can be structurally and functionally similar to the server device described above.
  • the terminal device 1020 is operated by a user 1010 and operatively coupled to the server device 1040 via a network 1030.
  • the user 1010 can be similar to the user described above, and the network 1030 can be similar to the network described above.
  • the user 1010 can operate the terminal device 1020 to set up a payment account and/or link the payment account to a user account of the user 1010.
  • the user 1010 can operate the terminal device 1020 to provide a voice input including information of a payment account.
  • the terminal device 1020 can send the voice input to the server device 1040 such that the server device 1040 can extract the information of the payment account from the voice input.
  • the server device 1040 can then send the extracted information of the payment account to the terminal device such that the terminal device 1020 can present the extracted information of the payment account to the user 1010 for verification.
  • the user 1010 can operate the terminal device 1020 to update or confirm the extracted information.
  • the server device 1040 can bind the payment account to the user account of the user. Details of the operations associated with binding a payment account to a user account are shown and described with respect to FIGS. 1-5.
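  • As a rough illustration of the exchange shown in FIG. 10, the sketch below defines hypothetical request and response payloads for the terminal-to-server round trip; the JSON field names are assumptions and are not specified by the patent.

```python
# Hypothetical payloads for the terminal <-> server exchange sketched in FIG. 10.
# Field names are illustrative assumptions only.

import base64
import json


def make_bind_request(user_account: str, audio: bytes) -> str:
    """Terminal -> server: voice input carrying payment-account information."""
    return json.dumps({
        "user_account": user_account,
        "voice_input": base64.b64encode(audio).decode("ascii"),
    })


def make_bind_response(authenticated: bool, extracted: dict) -> str:
    """Server -> terminal: authentication outcome plus extracted information."""
    return json.dumps({
        "authenticated": authenticated,
        "payment_account": extracted,   # e.g. {"card_number": "...", "expiry": "..."}
    })


if __name__ == "__main__":
    print(make_bind_request("user-123", b"<voice input>"))
    print(make_bind_response(True, {"card_number": "6222021234567890"}))
```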
  • the terminal device performing the method 100 can include one or more processors and memory.
  • the method 100 is governed by instructions or code of an application that are stored in a non-transitory computer readable storage medium of the terminal device and executed by the one or more processors of the terminal device.
  • the application is associated with managing payment accounts and associating payment accounts with user accounts.
  • Such an application typically has a server-side portion that is stored in and/or executed at the server device, and a client-side portion that is stored in and/or executed at the terminal device.
  • the method 100 is performed at the terminal device.
  • the method 100 includes the following steps.
  • the terminal device receives, from the user, a voice input including information of a payment account.
  • the user logs onto a program or service (e.g., online payment service) of the terminal device that enables the user to set up a payment account and link that payment account to a user account of the user.
  • the user can log onto the program or service using a login credential (e.g., a combination of a user ID and a password) that is uniquely associated with the user account.
  • When the user initiates a process to set up a payment account and/or link that payment account to the user account, the user is typically prompted to enter information of the payment account on a user interface (e.g., an account binding interface) of the terminal device.
  • the user can generate a voice input including the required information of the payment account.
  • the information of the payment account included in the voice input can include, for example, a card number, a card type, an expiration date, a phone number, a user name, and/or the like.
  • the user can enter the voice input into the terminal device using various methods.
  • the user can trigger the terminal device to enter a sound capture mode using any suitable means, and then speak to a sound capture device (e.g., an embedded microphone) of the terminal device while the terminal device is in the sound capture mode.
  • the user can trigger the sound capture mode by operating on (e.g., long press, single-click, double-click) a voice input button on a user interface of the terminal device.
  • the voice message spoken by the user can be captured by the sound capture device of the terminal device as the voice input.
  • the user can operate the terminal device to switch between the sound capture mode and a text input mode for entering data into the terminal device.
  • FIG. 11 is a schematic diagram illustrating a user interface 1100 (e.g., an account binding interface) of a terminal device for entering information of a payment account in accordance with some embodiments.
  • the terminal device presenting the user interface 1100 is similar to the terminal device performing the method 100 described above.
  • a user operating the terminal device is prompted to enter a bank card number into the terminal device using the user interface 1100.
  • the user can choose to type the bank card number into the blank field 1102, or enter a voice input by long pressing the voice input button 1104 (thus triggering a sound capture mode). After completing this step, the user can proceed to the next step by clicking the button 1106.
  • FIG. 12 is a schematic diagram illustrating another user interface 1200 of a terminal device for entering information of a payment account in accordance with some embodiments.
  • the terminal device presenting the user interface 1200 is similar to the terminal device performing the method 100 described above.
  • a user operating the terminal device is prompted to enter multiple information items into the terminal device using the user interface 1200.
  • the user can choose to enter the information by typing in the corresponding blank field, and/or entering a voice input.
  • the user can type a bank card type (e.g., "XX Bank Credit Card") into the field 1202, type an expiration date into the field 1206, and type a mobile phone number into the field 1210.
  • the user can provide information of the expiration date by long pressing the voice input button 1204 to enter a voice input, and provide information of the mobile phone number by long pressing the voice input button 1208 to enter another voice input.
  • the terminal device sends the voice input to the server device such that the server device performs user authentication and information extraction based on the voice input.
  • the server device performs user authentication and information extraction based on the voice input.
  • the information of the payment account is extracted from the voice input and recognized at the server device. Details of a server device performing user authentication and/or information extraction on a voice input are shown and further described with respect to FIG. 4.
  • the terminal device receives, from the server device, the information of the payment account as extracted from the voice input.
  • the terminal device presents the information of the payment account to the user on the user interface (e.g., account binding interface) or via any other suitable means.
  • the terminal device receives and presents the extracted information of the payment account such that the user can check and confirm the accuracy of the information.
  • the terminal device can present the extracted information to the user using various suitable methods. For example, the terminal device can display a textual message including the information extracted from the voice input on the user interface such that the user can view the extracted information.
  • the terminal device can output a voice message (e.g., by a speaker of the terminal device) including the extracted information using a text- to-speech technology such that the user can hear the extracted information.
  • the user can operate the terminal device to modify the information of the payment account such that the correct information can be sent to the server device.
  • the terminal device sends a confirmation of the information of the payment account to the server device such that the payment account is bound to a user account associated with the user at the server device based on the information of the payment account.
  • the user can operate the terminal device to indicate a confirmation on the information. For example, the user can click a "confirm" or "submit" button on a submission interface of the terminal device.
  • the terminal device sends a confirmation of the information to the server device.
  • the server device can associate the payment account with the user account of the user that was identified when the user logged onto the program or service.
  • the server device can store information of the payment account into a field of a data entry for the user account in a database, where each payment account is uniquely associated with (e.g., bound to) one and only one user account.
  • the payment account provided by the user is bound to the user account of the user at the server device.
  • the server device sends a signal to the terminal device such that the terminal device provides an indication of the successful binding to the user.
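  • The one-to-one binding described above could be enforced at the storage layer. The sketch below is a minimal assumption of such an implementation, using SQLite only to keep the example self-contained: a uniqueness constraint on the payment account ensures that each payment account is bound to one and only one user account.

```python
# Sketch of storing the binding so that each payment account belongs to exactly
# one user account. SQLite is used only to keep the example self-contained.

import sqlite3


def setup(conn: sqlite3.Connection) -> None:
    conn.execute("""
        CREATE TABLE IF NOT EXISTS account_binding (
            card_number  TEXT PRIMARY KEY,   -- a payment account can be bound only once
            user_account TEXT NOT NULL       -- the user account it is bound to
        )
    """)


def bind(conn: sqlite3.Connection, user_account: str, card_number: str) -> bool:
    try:
        conn.execute(
            "INSERT INTO account_binding (card_number, user_account) VALUES (?, ?)",
            (card_number, user_account),
        )
        conn.commit()
        return True
    except sqlite3.IntegrityError:
        # The payment account is already bound to a user account.
        return False


if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    setup(db)
    print(bind(db, "user-123", "6222021234567890"))  # True: bound
    print(bind(db, "user-456", "6222021234567890"))  # False: already bound elsewhere
```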
  • FIG. 2 is a flow chart illustrating a method 200 performed at a server device for associating a payment account with a user account in accordance with some embodiments.
  • the server device performing the method 200 can be similar to the server device described above with respect to FIG. 1.
  • the server device performing the method 200 can be operatively coupled to and communicate with (e.g., via a network) one or more terminal devices that are similar to the terminal device described above with respect to FIG. 1.
  • the server device performing the method 200 can include one or more processors and memory.
  • the method 200 is governed by instructions or code of an application that are stored in a non-transitory computer readable storage medium of the server device and executed by the one or more processors of the server device.
  • the application is associated with managing payment accounts and associating payment accounts with user accounts.
  • Such an application typically has a server-side portion that is stored in and/or executed at the server device, and a client-side portion that is stored in and/or executed at the terminal device.
  • the method 200 is performed at the server device.
  • the method 200 includes the following steps.
  • the server device receives, from a terminal device, a voice input including information of a payment account.
  • the terminal device is one of the terminal devices operatively coupled to and communicating with the server device.
  • a user of the terminal device logs onto a program or service using the terminal device to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user.
  • the user account can be identified at the server device based on, for example, a login credential (e.g., a user name and a password) provided by the user.
  • the user enters a voice input including the information of the payment account into the terminal device on a user interface (e.g., an account binding interface), and the terminal device then sends the voice input to the server device.
  • the server device performs speech recognition of the voice input to extract the information of the payment account.
  • the server device performs user authentication based on the voice input prior to performing the speech recognition of the voice input. Details of a server device performing user authentication and speech recognition of a voice input are shown and further described with respect to FIG. 4. As a result of a successful speech recognition, the server device extracts the information of the payment account from the voice input.
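  • The patent does not specify how the recognized speech is parsed into account information. As one hedged illustration, the sketch below normalizes a recognizer transcript of spoken digits into a card-number string and applies a simple length check; the digit-word mapping and the 13-19 digit rule are assumptions.

```python
# Sketch: turn a speech-recognition transcript of a spoken card number into a
# digit string. The digit-word mapping and the length rule are assumptions.

import re
from typing import Optional

DIGIT_WORDS = {
    "zero": "0", "oh": "0", "one": "1", "two": "2", "three": "3", "four": "4",
    "five": "5", "six": "6", "seven": "7", "eight": "8", "nine": "9",
}


def transcript_to_card_number(transcript: str) -> Optional[str]:
    digits = []
    for token in re.findall(r"[a-z]+|\d", transcript.lower()):
        if token.isdigit():
            digits.append(token)
        elif token in DIGIT_WORDS:
            digits.append(DIGIT_WORDS[token])
    number = "".join(digits)
    # Typical bank card numbers are 13-19 digits; reject anything else.
    return number if 13 <= len(number) <= 19 else None


if __name__ == "__main__":
    spoken = "six two two two zero two one two three four five six seven eight nine zero"
    print(transcript_to_card_number(spoken))  # 6222021234567890
```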
  • the server device binds the payment account to the user account based on the information of the payment account.
  • the user account can be identified based on, for example, the login credential provided by the user.
  • the user account is uniquely associated with the user.
  • the user can use the payment account in various online payment services (e.g., make a payment) and/or other services without re-entering information of the payment account.
  • FIG. 3 is a flow chart illustrating a method 300 performed at a terminal device 310 and a server device 320 for associating a payment account with a user account in accordance with some embodiments.
  • the terminal device 310 is similar to the terminal devices described above with respect to FIGS. 1 and 2, and the server device 320 is similar to the server devices described above with respect to FIGS. 1 and 2.
  • the terminal device 310 is operatively coupled to and communicates with the server device 320 (e.g., via a network not shown in FIG. 3).
  • the terminal device 310 and the server device 320 each includes one or more processors and memory.
  • the method 300 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 320, and a client-side portion that is stored in and/or executed at the terminal device 310.
  • With the server-side portion of the application and the client-side portion of the application being executed at the server device 320 and the terminal device 310 respectively, the server device 320 and the terminal device 310 collectively perform the method 300.
  • the method 300 includes the following steps.
  • the terminal device 310 receives a voice input including information of a payment account.
  • a user operating the terminal device 310 can log onto a program or service using the terminal device 310 to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user.
  • the user account can be identified at the server device 320 based on, for example, a login credential (e.g., a pair of user id and password) entered by the user into the terminal device 310.
  • the user can enter the voice input including the information of the payment account into the terminal device 310 on a user interface (e.g., an account binding interface) of the terminal device 310.
  • the terminal device 310 receives the voice input from the user.
  • the terminal device 310 sends the voice input to the server device 320, as described with respect to S102 of the method 100 and S201 of the method 200.
  • the server device 320 performs speech recognition of the voice input to extract the information of the payment account.
  • the server device 320 can perform user authentication based on the voice input prior to performing the speech recognition of the voice input. In such embodiments, if the user authentication is successful (that is, the user's identity is confirmed based on the voice input), the server device 320 performs the speech recognition of the voice input to extract the information of the payment account from the voice input. Otherwise, if the user authentication fails (that is, the user's identity is not confirmed based on the voice input), the server device 320 aborts the speech recognition and the information of the payment account is not extracted.
  • the server device 320 extracts the information of the payment account from the voice input.
  • the server device 320 can send the extracted information back to the terminal device 310, which then presents the extracted information (e.g., display in a textual form, output in an audio form) to the user.
  • the terminal device 310 can send a confirmation to the server device 320 indicating that the extracted information is confirmed by the user.
  • the server device 320 binds the payment account to the user account based on the information of the payment account.
  • the server device 320 can associate the payment account with the user account by, for example, linking the two accounts in the same data entry uniquely assigned to the user account (or the user) in a database.
  • each payment account recorded in the database is uniquely associated with (e.g., bound to) one and only one user account.
  • the server device 320 can send a signal to cause the terminal device 310 to present an indication of the successful binding to the user.
  • FIG. 4 is a flow chart illustrating another method 400 performed at a terminal device 410 and a server device 420 for associating a payment account with a user account in accordance with some embodiments.
  • the terminal device 410 is similar to the terminal devices described above with respect to FIGS. 1-3, and the server device 420 is similar to the server devices described above with respect to FIGS. 1-3. As shown in FIG. 4, the terminal device 410 is operatively coupled to and communicates with the server device 420 (e.g., via a network not shown in FIG. 4).
  • the terminal device 410 and the server device 420 each includes one or more processors and memory.
  • the method 400 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 420, and a client-side portion that is stored in and/or executed at the terminal device 410.
  • With the server-side portion of the application and the client-side portion of the application being executed at the server device 420 and the terminal device 410 respectively, the server device 420 and the terminal device 410 collectively perform the method 400.
  • the method 400 includes the following steps.
  • the terminal device 410 receives a voice input including information of a payment account.
  • a user operating the terminal device 410 can log onto a program or service using the terminal device 410 to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user.
  • the user account can be identified at the server device 420 based on, for example, a login credential entered by the user into the terminal device 410.
  • the user can enter the voice input including the information of the payment account into the terminal device 410 on a user interface (e.g., an account binding interface) of the terminal device 410.
  • the terminal device 410 receives the voice input from the user.
  • the terminal device 410 sends the voice input to the server device 420, as described with respect to S102 of the method 100, S201 of the method 200 and S302 of the method 300.
  • the server device 420 performs user authentication based on a voiceprint of the voice input and stored voice authentication information for the user account.
  • the server device 420 can be configured to determine the voiceprint of the voice input by analyzing the voice input using any suitable method.
  • Such a voiceprint of the voice input can be a representation in any suitable form that uniquely identifies the voice of the user that generates the voice input.
  • the voiceprint can be a spectrogram.
  • the voiceprint can be referred to as, for example, spectral waterfalls, voicegrams, etc.
  • the server device 420 can also be configured to identify the stored voice authentication information for the user account.
  • the stored voice authentication information can be, for example, a stored template used to identify the user via her voice in speaker recognition.
  • the stored voice authentication information can be compared against the voiceprint obtained from the voice input to determine whether the voiceprint and the stored voice authentication information are associated with the same user.
  • the user's identity is authenticated based on the voiceprint of the voice input and the stored voice authentication information for the user account.
  • the stored voice authentication information is provided to the server device 420 prior to the terminal device 410 receiving the voice input.
  • the user is required to provide a voice sample (e.g., a template) when the user creates the user account associated with the program or service.
  • a voice sample is sent to and stored at the server device 420, and is uniquely linked to the user account (and the user).
  • the server device 420 can (optionally) perform user authentication of the voice input based on the stored voice sample and/or other voice samples associated with other user accounts (i.e., provided by other users).
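  • The comparison between the voiceprint and the stored voice authentication information is left open by the patent. One common approach, assumed here for illustration, is to derive a fixed-length embedding from the audio and compare it to the stored enrollment template with cosine similarity against a threshold.

```python
# Sketch of a voiceprint check: compare an embedding of the incoming voice input
# against the enrollment template stored for the user account. How embeddings are
# produced (e.g., by a speaker-recognition model) is outside the scope of this sketch.

import math
from typing import Sequence


def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def authenticate(voice_embedding: Sequence[float],
                 stored_template: Sequence[float],
                 threshold: float = 0.8) -> bool:
    """Accept the speaker only if the voice input is close enough to the stored sample."""
    return cosine_similarity(voice_embedding, stored_template) >= threshold


if __name__ == "__main__":
    enrolled = [0.12, 0.80, 0.35, 0.44]   # stored voice authentication information
    incoming = [0.10, 0.78, 0.40, 0.41]   # embedding of the received voice input
    print(authenticate(incoming, enrolled))  # True: same speaker under this threshold
```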
  • If the user authentication fails (that is, the user's identity is not confirmed based on the voice input), the server device 420 aborts the speech recognition of the voice input, and the information of the payment account is not extracted from the voice input (not shown in FIG. 4). Otherwise, if the user authentication is successful (that is, the user's identity is confirmed based on the voice input), at S404, the server device 420 performs speech recognition of the voice input to extract the information of the payment account. In some embodiments, the server device 420 can be configured to perform the speech recognition of the voice input using any suitable speech recognition technology (e.g., acoustic modeling, language modeling, etc.).
  • a voice input is considered as a void or invalid voice input if no meaningful information can be extracted from that voice input using speech recognition. For example, if the sound volume (i.e., loudness) of the voice input is below a threshold, then no information can be recognized from the voice input. For another example, if the noise level in the voice input is above a threshold (e.g., signal-to-noise ratio below a threshold), then no meaningful information can be successfully extracted from the voice input.
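  • The loudness and signal-to-noise checks mentioned above could be applied before speech recognition is attempted. A minimal sketch follows, assuming normalized floating-point samples and illustrative thresholds.

```python
# Sketch: reject a voice input as void/invalid when it is too quiet or too noisy.
# Thresholds and the noise-floor estimate are illustrative assumptions.

import math
from typing import Sequence


def rms(samples: Sequence[float]) -> float:
    return math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0


def is_valid_voice_input(samples: Sequence[float],
                         noise_floor: float,
                         min_loudness: float = 0.02,
                         min_snr_db: float = 10.0) -> bool:
    loudness = rms(samples)
    if loudness < min_loudness:          # too quiet: nothing to recognize
        return False
    if noise_floor <= 0:
        return True
    snr_db = 20 * math.log10(loudness / noise_floor)
    return snr_db >= min_snr_db          # too noisy below this signal-to-noise ratio


if __name__ == "__main__":
    speech = [0.3 * math.sin(i / 5) for i in range(1000)]
    print(is_valid_voice_input(speech, noise_floor=0.01))        # True
    print(is_valid_voice_input([0.0] * 1000, noise_floor=0.01))  # False: silent input
```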
  • the information of the payment account is extracted from the voice input at the server device 420.
  • the server device 420 sends the extracted information of the payment account to the terminal device 410.
  • the terminal device 410 in response to receiving the extracted information of the payment account, can be configured to present the extracted information received from the server device 420 to the user such that the user can check and confirm the accuracy of the extracted information (not shown in FIG. 4).
  • the user can operate the terminal device 410 to send a confirmation to the server device 420. Otherwise, if the user determines that the extracted information received from the server device 420 is incorrect or inaccurate (i.e., different from the original information of the payment account included in the voice input), the user can operate the terminal device 410 to modify the information and then send the updated information of the payment account to the server device 420. Details of a user operating a terminal device to modify information of a payment account are shown and further described with respect to FIG. 5.
  • the terminal device 410 sends a collection of information of the payment account to the server device 420.
  • the server device 420 stores the collection of information of the payment account received from the terminal device 410.
  • the collection of information of the payment account can include multiple pieces of information of the payment account.
  • the voice input received at the terminal device 410 at S401 and sent from the terminal device 410 to the server device 420 at S402 is a first voice input, which includes first information of the payment account (e.g., a card number).
  • the terminal device 410 receives a second voice input from the user, which includes second information of the payment account (e.g., a card expiration date).
  • the terminal device 410 then sends the second voice input to the server device 420.
  • the second information of the payment account is included in the collection of information of the payment account with respect to S406-S407.
  • each piece of information included in the collection of information of the payment account with respect to S406-S407, and/or the corresponding voice input that includes that piece of information, is processed according to S401-S405 as described above.
  • the second information of the payment account can be different from the first information of the payment account with respect to a predetermined characteristic.
  • a predetermined characteristic can be associated with, for example, a type, size, criticality level, indication, etc., of respective information.
  • the server device 420 can be configured to perform different operation(s) on different received voice inputs based on the predetermined characteristic of the received voice inputs.
  • the server device 420 can be configured to perform user authentication and then perform information extraction (if the user authentication is successful) on the first voice input including the first information; the server device 420 can be configured to perform information extraction on the second voice input including the second information without performing user authentication on the second voice input, where the second information is different from the first information with respect to the predetermined characteristic.
  • the predetermined characteristic can indicate whether respective information contained in a respective voice input (e.g., the first information contained in the first voice input, the second information contained in the second voice input) meets a predetermined criticality level.
  • information of a phone number has a higher criticality level (i.e., being more critical) than information of a type of the phone number (e.g., home phone number, mobile phone number, office phone number).
  • the server device 420 performs user authentication on a voice input if information contained in the voice input meets the predetermined criticality level (e.g., card number, phone number). If such user authentication is successful, the server device 420 performs information extraction on the voice input to extract the information. In contrast, the server device 420 performs information extraction on a voice input, without user authentication, if the information contained in that voice input fails to meet the predetermined criticality level (e.g., card expiration date, type of phone number). Thus, user authentication is performed only on voice inputs including information that meets the predetermined criticality level.
  • the predetermined characteristic can indicate whether respective voice input (e.g., the first voice input, the second voice input) is an initial voice input received at the terminal device 410 (e.g., through an account binding interface) during a session.
  • a session can be defined as, for example, communications between a particular user and the terminal device 410 and/or operations performed by the particular user on the terminal device 410 during a certain period of time.
  • a new session is initiated if a user logs off and the same user or a different user logs in again, or a user returns to operate the terminal device 410 after being silent for a certain period of time, and/or the like.
  • the server device 420 performs user authentication on a voice input if that voice input is the initial voice input (of a session) received at the terminal device 410. If such user authentication is successful, the server device 420 performs information extraction on the voice input to extract the information. In contrast, the server device 420 performs information extraction on a voice input, without user authentication, if that voice input is subsequently received at the terminal device 410 during the same session (i.e., not the initial voice input of the session). In other words, user authentication is performed on the initial voice input at the beginning of a session, and skipped for subsequent voice inputs as long as they are received during the same session.
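  • To illustrate the two variants of the predetermined characteristic described above, the sketch below decides whether a given voice input requires user authentication, based either on an assumed criticality mapping for the field being provided or on whether the input is the first one of the session.

```python
# Sketch: decide whether a voice input needs voiceprint authentication before
# information extraction. Two illustrative policies from the description:
#   1) authenticate only inputs carrying "critical" information (e.g., card number);
#   2) authenticate only the first voice input of a session.
# The field names and the criticality mapping are assumptions.

CRITICAL_FIELDS = {"card_number", "phone_number"}   # assumed criticality mapping


def needs_authentication(field_name: str,
                         is_first_input_of_session: bool,
                         policy: str = "criticality") -> bool:
    if policy == "criticality":
        return field_name in CRITICAL_FIELDS
    if policy == "session":
        return is_first_input_of_session
    raise ValueError(f"unknown policy: {policy}")


if __name__ == "__main__":
    print(needs_authentication("card_number", True))                     # True
    print(needs_authentication("card_expiry", False))                    # False
    print(needs_authentication("card_expiry", True, policy="session"))   # True
```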
  • After the collection of necessary information of the payment account is received, stored and verified at the server device 420, at S408, the server device 420 binds the payment account to the user account based on the collection of information of the payment account.
  • the server device 420 can be configured to bind the payment account to the user account by, for example, linking the two accounts in the same data entry uniquely assigned to the user account (or the user) in a database.
  • each payment account recorded in the database is uniquely associated with (e.g., bound to) one and only one user account.
  • After binding the payment account to the user account, the server device 420 can send a signal to cause the terminal device 410 to present an indication of the successful binding to the user.
  • FIG. 5 is a flow chart illustrating yet another method 500 performed at a terminal device 580 and a server device 590 for associating a payment account with a user account in accordance with some embodiments.
  • the terminal device 580 is similar to the terminal devices described above with respect to FIGS. 1-4, and the server device 590 is similar to the server devices described above with respect to FIGS. 1-4. As shown in FIG. 5, the terminal device 580 is operatively coupled to and communicates with the server device 590 (e.g., via a network not shown in FIG. 5).
  • the terminal device 580 and the server device 590 each includes one or more processors and memory.
  • the method 500 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 590, and a client-side portion that is stored in and/or executed at the terminal device 580.
  • the server device 590 and the terminal device 580 collectively perform the method 500.
  • the method 500 includes the following steps.
  • the terminal device 580 receives a first voice input including first information of a payment account.
  • the terminal device 580 sends the first voice input to the server device 590.
  • the server device 590 performs user authentication based on a voiceprint of the first voice input and stored voice authentication information for the user account.
  • the server device 590 performs speech recognition of the first voice input to extract the first information of the payment account.
  • the server device 590 sends the extracted first information of the payment account to the terminal device 580.
  • the operations of S501-S505 are similar to the operations of S401-S405 of the method 400 shown and described with respect to FIG. 4, and/or the corresponding operations shown and described with respect to FIGS. 1-3, thus not elaborated herein.
  • In response to receiving the extracted first information of the payment account, the terminal device 580 can be configured to present the extracted first information to a user operating the terminal device 580 such that the user can check and confirm the accuracy of the extracted first information. Based on the result of the checking and confirming, the user then provides a voice command indicating the next operation on the extracted first information. Such a voice command is then sent to the server device 590 for interpretation. That is, the server device 590 performs speech recognition on the voice command to extract the command from the voice command. The server device 590 then sends the extracted command to the terminal device 580 such that the terminal device 580 performs an operation based on the command extracted from the voice command.
  • If the user determines that the extracted first information is accurate (that is, identical to the original first information included in the first voice input), the user provides a voice command including a submission command such that the extracted first information is submitted to the server device 590 (as shown and described with respect to S511-S514).
  • If the user determines that the extracted first information is incorrect or inaccurate (that is, different from the original first information included in the first voice input), the user provides a voice command including a control command (e.g., a deleting or editing command) such that the extracted first information is deleted or further modified before being submitted to the server device 590 (as shown and described with respect to S506-S510).
  • the terminal device 580 receives a first voice command from the user.
  • the first voice command includes a control command.
  • the terminal device 580 sends the first voice command to the server device 590.
  • the server device 590 performs speech recognition of the first voice command to extract the control command.
  • the server device 590 sends the control command to the terminal device 580.
  • the terminal device 580 performs an operation associated with the first information of the payment account. Particularly, such an operation is based on the control command received from the server device 590.
  • such a control command can be a deleting command.
  • the voice command corresponding to the deleting command can include a keyword such as "delete," "remove," "clear," and/or the like.
  • the operation performed at the terminal device 580 at S510 includes deleting the first information of the payment account presented to the user.
  • the first information is cleared from a field on a user interface (e.g., an account binding interface), thus enabling the user to provide new information of the payment account for the respective field.
  • the user can be prompted to enter new data into the field, generate a new voice input for the field, and/or use any other suitable method to provide new information of the payment account.
  • such a control command can be an editing command.
  • the voice command corresponding to the editing command can include a keyword such as "edit," "change," "modify," "update," and/or the like.
  • the operation performed at the terminal device 580 at S510 includes entering a modification mode for modifying the first information of the payment account presented to the user.
  • In the modification mode, the user can choose to modify the first information by, for example, using a keyboard associated with the terminal device 580, using a voice input button, and/or any other suitable method.
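  • As an illustration of dispatching the recognized control command on the terminal side, the sketch below maps keywords such as those listed above to terminal-side actions; the action names are assumptions.

```python
# Sketch: map the text recognized from a voice command to a terminal-side action.
# Keyword lists follow the examples above; the action names are assumptions.

DELETE_KEYWORDS = {"delete", "remove", "clear"}
EDIT_KEYWORDS = {"edit", "change", "modify", "update"}
SUBMIT_KEYWORDS = {"confirm", "submit"}


def dispatch_voice_command(recognized_text: str) -> str:
    words = set(recognized_text.lower().split())
    if words & DELETE_KEYWORDS:
        return "clear_field"               # delete the presented information
    if words & EDIT_KEYWORDS:
        return "enter_modification_mode"   # let the user correct the information
    if words & SUBMIT_KEYWORDS:
        return "submit_to_server"          # confirm and submit for binding
    return "ask_user_to_repeat"


if __name__ == "__main__":
    print(dispatch_voice_command("please delete that number"))   # clear_field
    print(dispatch_voice_command("change the expiration date"))  # enter_modification_mode
```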
  • In the modification mode, the terminal device 580 can be configured to automatically determine a character type of the first information as extracted from the first voice input, and then automatically select, based on the character type of the extracted first information, a matching keyboard from a group of soft keyboards associated with the terminal device 580.
  • the terminal device 580 can also be configured to automatically, without further user input, display such a matching keyboard on a user interface (e.g., an account binding interface) of the terminal device 580 such that the user can enter correction data using the matching keyboard.
  • the group of soft keyboards can include, for example, a numerical keyboard (without any alphabetical key), an alphanumeric keyboard (with both alphabetical and numerical keys), an alphabetical keyboard (without any numerical key), a special character keyboard, a keyboard for a foreign language, etc.
  • if the first information has a character type of numerical characters (e.g., a card number or phone number), the terminal device 580 automatically determines the numerical characters of the first information, and selects a numerical keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected numerical keyboard to prompt the user to enter correction data for the first information using the selected numerical keyboard.
  • if the first information has a character type of alphabetical characters (e.g., a user's name), the terminal device 580 automatically determines the alphabetical characters of the first information, and selects an alphabetical keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected alphabetical keyboard to prompt the user to enter correction data for the first information using the selected alphabetical keyboard.
  • if the first information has a character type of a combination of numerical and alphabetical characters (e.g., a mailing address), the terminal device 580 automatically determines the combination of numerical and alphabetical characters of the first information, and selects an alphanumeric keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected alphanumeric keyboard to prompt the user to enter correction data for the first information using the selected alphanumeric keyboard.
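  • For illustration only, the character-type detection and keyboard selection described in the preceding paragraphs can be sketched as follows. This is a minimal sketch; the names `KeyboardType` and `select_keyboard` are hypothetical and do not correspond to any component of the disclosed embodiments.

```python
from enum import Enum

class KeyboardType(Enum):
    NUMERIC = "numeric"            # digits only, e.g., a card number or phone number
    ALPHABETIC = "alphabetic"      # letters only, e.g., a user's name
    ALPHANUMERIC = "alphanumeric"  # letters and digits, e.g., a mailing address

def select_keyboard(extracted_text: str) -> KeyboardType:
    """Pick the soft keyboard matching the character type of the extracted field value."""
    chars = [c for c in extracted_text if not c.isspace()]
    has_digits = any(c.isdigit() for c in chars)
    has_letters = any(c.isalpha() for c in chars)
    if has_digits and not has_letters:
        return KeyboardType.NUMERIC
    if has_letters and not has_digits:
        return KeyboardType.ALPHABETIC
    return KeyboardType.ALPHANUMERIC

if __name__ == "__main__":
    print(select_keyboard("6222 0201 2345 6789"))  # KeyboardType.NUMERIC
    print(select_keyboard("Zhang Wei"))            # KeyboardType.ALPHABETIC
    print(select_keyboard("12 Nanjing Road"))      # KeyboardType.ALPHANUMERIC
```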
  • FIG. 13 is a schematic diagram illustrating a user interface 1300 of a terminal device for correcting information of a payment account in accordance with some embodiments.
  • the terminal device presenting the user interface 1300 is similar to the terminal device 580 described herein. As shown in FIG. 13, after a possible error is detected (at the terminal device or a corresponding server device) for provided information associated with a card number, the terminal device automatically determines that the information for the card number has a numerical characteristic. The terminal device then automatically selects, based on the determined numerical characteristic, a numerical keyboard 1302. Moreover, the terminal device automatically, without further user input, displays such a numerical keyboard 1302 on the user interface 1300 such that a user operating the terminal device can enter correction data into the field 1301 using the numerical keyboard 1302.
  • the server device 590 or the terminal device 580 can be configured to detect a possible error in the extracted first information based on speech recognition of the first voice input performed at S508.
  • that is, due to the possible error, the extracted first information presented to the user at the terminal device 580 is different from the first information included in the first voice input that is provided by the user to the terminal device 580.
  • a possible error can be, for example, a low-confidence recognition result in the extracted first information, or a speech recognition error.
  • the terminal device 580 can be configured to present the possible error to the user, and prompt the user to enter correction data for the first information of the payment account.
  • the terminal device 580 can automatically enter the modification mode described above in response to detecting the possible error, so that the user can provide correction data for the first information.
  • the terminal device 580 can be configured to display the extracted first information to the user with the possible error being highlighted in the display, or to present a separate message indicating the possible error to the user.
  • the terminal device 580 can be configured to automatically determine a character type associated with a possible correction of the first information as extracted from the first voice input (e.g., a possible correction of the low-confidence recognition result).
  • the terminal device 580 can be configured to automatically select, based on the character type associated with the possible correction of the first information, a matching keyboard from a group of soft keyboards associated with the terminal device 580.
  • the terminal device 580 can further be configured to automatically, without further user input, display such a matching keyboard on a user interface (e.g., an account binding interface) of the terminal device 580 such that the user can enter correction data using the matching keyboard.
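  • As one possible sketch of the error-detection behavior described above, a terminal or server device might flag a low-confidence recognition result and choose a correction keyboard for it as follows. The token structure, the confidence threshold and the function names are assumptions made for illustration, not part of the disclosed embodiments.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RecognizedToken:
    text: str
    confidence: float  # 0.0-1.0, as reported by a speech recognizer

LOW_CONFIDENCE_THRESHOLD = 0.75  # hypothetical threshold; a real recognizer defines its own scale

def find_possible_error(tokens: List[RecognizedToken]) -> Optional[RecognizedToken]:
    """Return the first low-confidence token, if any, as the possible error to highlight."""
    for token in tokens:
        if token.confidence < LOW_CONFIDENCE_THRESHOLD:
            return token
    return None

def correction_keyboard_for(token: RecognizedToken) -> str:
    """Choose a soft keyboard matching the character type of the suspect token."""
    if token.text.isdigit():
        return "numeric"
    if token.text.isalpha():
        return "alphabetic"
    return "alphanumeric"

if __name__ == "__main__":
    result = [RecognizedToken("6222", 0.98), RecognizedToken("0207", 0.52), RecognizedToken("1234", 0.95)]
    suspect = find_possible_error(result)
    if suspect is not None:
        print(f"possible error in '{suspect.text}'; show the {correction_keyboard_for(suspect)} keyboard")
```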
  • the user can be requested to provide a confirmation and/or submission command for each piece of extracted information.
  • after reviewing the extracted first information, the user provides a voice command including a submission command with respect to the extracted first information.
  • the terminal device 580 sends the voice command to the server device 590, which extracts the submission command from the voice command using speech recognition, and then sends the submission command to the terminal device 580.
  • the terminal device 580 then performs an operation in response to the submission command, which includes submitting, to the server device 590, a confirmation on the extracted first information of the payment account.
  • when the user determines that a collection of information extracted from one or multiple voice inputs is accurate (that is, identical to the original information included in the one or multiple voice inputs), or that any error in the collection of extracted information has been corrected, the user provides a voice command including a submission command such that the collection of extracted information is submitted to the server device 590.
  • the collection of information can be associated with, for example, multiple fields in a form associated with setting up a payment account and/or linking the payment account to a user account. After the user provides information for each field of the form (e.g., by typing on the user interface or providing a voice input), the collection of the information can be presented to the user for verification.
  • the user can provide a voice command including a submission command indicating that each item of information included in the collection is confirmed and ready to be submitted. In that case, the user can submit the collection of information using a single submission command. Alternatively, each item of information included in the collection of information can be separately presented to the user for verification. Thus, the user can provide a voice command including a submission command indicating that a single piece of information is confirmed and ready to be submitted. In that case, the user is typically not asked to submit the collection of information using a single submission command.
  • the terminal device 580 receives a second voice command.
  • the terminal device 580 sends the second voice command to the server device 590.
  • the server device 590 performs speech recognition of the second voice command to extract a submission command.
  • the server device 590 sends the submission command to the terminal device 580.
  • the terminal device 580 submits a collection of information of the payment account to the server device 590.
  • the server device 590 stores the collection of information of the payment account received from the terminal device 580.
  • the server device 590 binds the payment account to the user account based on the collection of information of the payment account.
  • the operations of S515-S517 are similar to the operations of S406-S408 of the method 400 shown and described with respect to FIG. 4, and/or the corresponding operations shown and described with respect to FIGS. 1-3, and thus are not elaborated herein.
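  • The single-submission flow of S511-S517 can be illustrated, under stated assumptions, by the sketch below: the verified collection of fields is submitted in one request, and the server stores it and performs the binding. The `ServerStub` class and `submit_collection` function are hypothetical stand-ins for the server device 590 and the terminal-side submission logic; real embodiments would use an authenticated network API.

```python
from typing import Dict

class ServerStub:
    """In-memory stand-in for the server device (S516 store, S517 bind)."""
    def __init__(self):
        self.bindings: Dict[str, Dict[str, str]] = {}

    def store_and_bind(self, user_account: str, payment_info: Dict[str, str]) -> bool:
        # S516: store the collection of payment account information.
        # S517: bind the payment account to the user account.
        self.bindings[user_account] = payment_info
        return True

def submit_collection(server: ServerStub, user_account: str, fields: Dict[str, str]) -> bool:
    """S515: submit the verified collection of extracted fields in a single request."""
    missing = [name for name, value in fields.items() if not value]
    if missing:
        raise ValueError(f"fields not yet provided or confirmed: {missing}")
    return server.store_and_bind(user_account, fields)

if __name__ == "__main__":
    server = ServerStub()
    fields = {"card_number": "6222020123456789", "expiration_date": "09/27", "phone": "13800000000"}
    print(submit_collection(server, "user_1010", fields))  # True once every field is confirmed
```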
  • speech recognition of a voice command can be performed based on a set of predefined voice commands including predefined keywords such as, for example, “submit,” “confirm,” “send,” and/or the like corresponding to a submission command; “delete,” “remove,” “clear,” and/or the like corresponding to a deleting command; “edit,” “change,” “update,” “modify,” and/or the like corresponding to an editing command.
  • if more than one conflicting predefined keyword is recognized from the voice command, the voice command is considered void or invalid. Additionally, if no predefined keyword is recognized from the voice command (e.g., due to low sound volume, a high noise level, etc.), the voice command is likewise considered void or invalid.
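  • A minimal sketch of matching a recognized transcript against such predefined keywords, including the conflicting-keyword and no-keyword cases, is shown below. The keyword table and the `parse_voice_command` function are illustrative assumptions only.

```python
from typing import Optional

# Hypothetical keyword table mirroring the predefined commands listed above.
COMMAND_KEYWORDS = {
    "submit": {"submit", "confirm", "send"},
    "delete": {"delete", "remove", "clear"},
    "edit":   {"edit", "change", "update", "modify"},
}

def parse_voice_command(transcript: str) -> Optional[str]:
    """Map a recognized transcript to a command; return None if the command is void or invalid."""
    words = set(transcript.lower().split())
    matched = {cmd for cmd, keywords in COMMAND_KEYWORDS.items() if words & keywords}
    if len(matched) != 1:
        # No keyword recognized, or conflicting keywords recognized: treat as void/invalid.
        return None
    return matched.pop()

if __name__ == "__main__":
    print(parse_voice_command("please submit this card"))  # submit
    print(parse_voice_command("clear the field"))          # delete
    print(parse_voice_command("delete and then submit"))   # None (conflicting keywords)
    print(parse_voice_command("uh hmm"))                   # None (no keyword recognized)
```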
  • the user can be provided with an option to enter a command by manually operating on the user interface of the terminal device 580 without generating a voice command.
  • the user can indicate a submission command by generating a voice command including the submission command (e.g., by long pressing a voice input button to activate a sound capture mode) or by clicking a submission button on the user interface.
  • the user can enter a modification mode by generating a voice command including an editing command (e.g., by long pressing a voice input button to activate a sound capture mode) or by moving the cursor into the corresponding field on the user interface.
  • FIG. 6 is a block diagram illustrating modules of a terminal device 600 for associating and managing payment accounts in accordance with some embodiments.
  • the terminal device 600 can be structurally and functionally similar to the terminal devices shown and/or described with respect to FIGS. 1-5.
  • the terminal device 600 includes an acquisition module 610, a transmitting module 620, a receiving module 630 and an account management module 640.
  • the terminal device 600 can include more or fewer modules than those shown in FIG. 6.
  • each module included in the terminal device 600 can be a hardware-based module (e.g., a digital signal processor (DSP), a field programmable gate array (FPGA), etc.), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc.), or a combination of hardware and software modules.
  • Instructions or code of each module can be stored in a memory of the terminal device 600 (not shown in FIG. 6) and executed at a processor (e.g., a CPU) of the terminal device 600 (not shown in FIG. 6).
  • the acquisition module 610, the transmitting module 620, the receiving module 630 and the account management module 640 can be configured to collectively perform at least a portion of the methods 100 and 300-500 shown and described with respect to FIGS. 1 and 3-5.
  • the acquisition module 610 is configured to acquire voice inputs from a user operating the terminal device 600.
  • the voice inputs acquired by the acquisition module 610 include voice inputs containing information of payment accounts and voice commands associated with operations on the information of the payment accounts.
  • the acquisition module 610 can include two sub-modules, which are responsible for acquiring voice inputs including information of payment accounts and for acquiring voice commands including commands, respectively.
  • for example, as described above, when the terminal device 600 is in a sound capture mode (e.g., as a result of the user long pressing a voice input button), the acquisition module 610 is configured to capture sound from the environment surrounding the terminal device 600 (e.g., using a microphone or other voice acquisition device), thus receiving the voice inputs from the user.
  • the transmitting module 620 is configured to transmit the voice inputs (including the voice inputs including information of payment accounts and voice commands including commands) acquired by the acquisition module 610 to a server device operatively coupled to and communicating with the terminal device 600, such that the voice inputs can be processed at the server device.
  • the receiving module 630 is configured to receive extracted information and/or commands from the server device, such that the terminal device 600 can present the extracted information to the user and/or perform operations according to the commands.
  • the account management module 640 is configured to perform operations according to commands received from the server device. For example, as described with respect to the method 500 in FIG. 5, the account management module 640 is configured to delete, edit or submit extracted information of a payment account based on a deleting command, an editing command or a submission command received from the server device. As described above, such a command is received by the receiving module 630 after being extracted at the server device from a voice command that was acquired by the acquisition module 610 and transmitted from the terminal device 600 to the server device by the transmitting module 620.
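  • The division of labor among the four modules of FIG. 6 can be summarized by the interface-level sketch below. Only the module boundaries follow the figure; every method signature is an assumption, and audio capture, transport and UI details are deliberately omitted.

```python
class AcquisitionModule:
    """Captures voice inputs and voice commands while the device is in a sound capture mode."""
    def acquire(self) -> bytes:
        raise NotImplementedError  # e.g., read audio frames from a microphone driver

class TransmittingModule:
    """Sends captured audio to the server device for authentication and speech recognition."""
    def send(self, audio: bytes) -> None:
        raise NotImplementedError

class ReceivingModule:
    """Receives extracted information and extracted commands back from the server device."""
    def receive(self) -> dict:
        raise NotImplementedError

class AccountManagementModule:
    """Performs delete, edit or submit operations on presented payment account information."""
    def apply(self, command: str, field: str) -> None:
        raise NotImplementedError

class TerminalDevice600:
    """Wiring of the modules shown in FIG. 6 (interfaces only)."""
    def __init__(self):
        self.acquisition = AcquisitionModule()
        self.transmitting = TransmittingModule()
        self.receiving = ReceivingModule()
        self.account_management = AccountManagementModule()
```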
  • FIG. 7 is a block diagram illustrating components of a terminal device 700 for associating and managing payment accounts in accordance with some embodiments.
  • the terminal device 700 can be structurally and functionally similar to the terminal devices shown and described above with respect to FIGS. 1-6.
  • the terminal device 700 includes a processor 701, a communication bus 702, a network interface 703, a voice acquisition device 705, and a memory 704 including programming code 706.
  • the terminal device 700 can include more or fewer devices, components and/or modules than those shown in FIG. 7.
  • the processor 701 can be any processing device capable of performing at least a portion of the methods 100-500 described with respect to FIGS. 1-5. Such a processor can be, for example, a CPU, a DSP, an FPGA, and/or the like.
  • the processor 701 can be configured to control the operations of other components and/or modules of the terminal device 700.
  • the processor 701 can be configured to control operations of the network interface 703.
  • the processor 701 can be configured to execute instructions or code stored in a software program or module (e.g., the program code 706) within the memory 704.
  • the communication bus 702 is configured to implement connections and communication among the other components of the terminal device 700.
  • the network interface 703 is configured to provide and control network interfaces of the terminal device 700 that are used to interact with other network devices (e.g., server devices).
  • the network interface 703 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface).
  • the network interface 703 is used for connecting one or more server devices and performing data communication with the one or more server devices.
  • operations of the network interface 703 are controlled by instructions or code stored in the memory 704 (e.g., the program code 706).
  • the voice acquisition device 705 is configured to acquire voice inputs and/or voice commands from a user operating the terminal device 700.
  • the voice acquisition device 705 can be, for example, a microphone or any other suitable sound capture device.
  • the voice acquisition device 705 can be controlled by, for example, an acquisition module similar to the acquisition module 610 of the terminal device 600 in FIG. 6.
  • the memory 704 can include, for example, a random-access memory (RAM) (e.g., a DRAM, an SRAM, a DDR RAM, etc.) and/or a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 704 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the terminal device 700.
  • each component, program, application or module (e.g., the program code 706) included in the memory 704 can be a hardware-based module (e.g., a DSP, a FPGA), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor), or a combination of hardware and software modules.
  • Instructions or code of each component, program, application or module can be stored in the memory 704 and executed at the processor 701.
  • components, modules and devices of the terminal device 700 are configured to collectively perform at least a portion of the methods 100 and 300-500 shown and described above with respect to FIGS. 1 and 3-5.
  • FIG. 8 is a block diagram illustrating modules of a server device 800 for associating and managing payment accounts in accordance with some embodiments.
  • the server device 800 can be structurally and functionally similar to the server devices shown and/or described with respect to FIGS. 1-5.
  • the server device 800 includes a receiving module 810, a speech recognition module 820, a voice authentication module 840, an account management module 830 and a transmitting module 850.
  • the server device 800 can include more or fewer modules than those shown in FIG. 8.
  • each module included in the server device 800 can be a hardware-based module (e.g., a DSP, a FPGA, etc.), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc.), or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the server device 800 (not shown in FIG. 8) and executed at a processor (e.g., a CPU) of the server device 800 (not shown in FIG. 8).
  • the receiving module 810, the speech recognition module 820, the voice authentication module 840, the account management module 830 and the transmitting module 850 can be configured to collectively perform at least a portion of the methods 200-500 shown and described with respect to FIGS. 2-5.
  • the receiving module 810 is configured to receive voice inputs and/or voice commands from terminal devices operatively coupled to and communicating with the server device 800.
  • the voice inputs include information associated with payment accounts, and the voice commands include commands provided by users of the terminal devices.
  • the transmitting module 850 is configured to transmit the extracted information of payment accounts and extracted commands to the terminal devices, such that the extracted information of payment accounts can be presented to users of the terminal devices for verification, and the extracted commands can be performed at the terminal devices.
  • the voice authentication module 840 is configured to perform user authentication on the voice inputs and/or voice commands received by the receiving module 810. In some embodiments, the user authentication can be performed based on a voiceprint of a voice input and stored voice authentication information for a user account.
  • the speech recognition module 820 is configured to perform speech recognition on voice inputs and/or voice commands to extract information and/or commands included in the voice inputs and/or the voice commands.
  • speech recognition of a voice input or voice command is performed when the user authentication on that voice input or voice command is successful.
  • speech recognition is performed on a voice input or voice command without user authentication being performed on that voice input or voice command.
  • the account management module 830 is configured to manage payment accounts and user accounts.
  • the account management module 830 is also configured to bind a payment account to a user account if the two accounts are determined to be associated with the same user.
  • the account management module 830 can be configured to automatically retrieve information of the payment account based on the user account without requesting the user to provide any information of the payment account.
  • the account management module 830 is configured to maintain and manage a database storing information of the payment accounts and the user accounts.
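  • The server-side processing order implied by FIG. 8 (receive, authenticate, recognize, bind) can be sketched as follows. The class and method names are hypothetical; in particular, the authentication and recognition bodies are placeholders for whatever voiceprint-matching and speech-recognition engines an embodiment actually uses.

```python
from typing import Optional

class ServerDevice800:
    """Interface-level sketch of the modules of FIG. 8; all bodies are placeholders."""

    def receive(self, audio: bytes) -> bytes:
        return audio  # receiving module 810

    def authenticate(self, audio: bytes, user_account: str) -> bool:
        # voice authentication module 840: compare the voiceprint of `audio`
        # against stored voice authentication information for `user_account`.
        raise NotImplementedError

    def recognize(self, audio: bytes) -> str:
        # speech recognition module 820: extract information or a command from the audio.
        raise NotImplementedError

    def bind(self, user_account: str, payment_info: dict) -> None:
        # account management module 830: bind the payment account to the user account.
        raise NotImplementedError

    def handle_voice_input(self, audio: bytes, user_account: str) -> Optional[str]:
        """One typical ordering: authenticate first, and recognize only on success."""
        audio = self.receive(audio)
        if not self.authenticate(audio, user_account):
            return None
        return self.recognize(audio)
```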
  • FIG. 9 is a block diagram illustrating components of a server device 900 for associating and managing payment accounts in accordance with some embodiments.
  • the server device 900 can be structurally and functionally similar to the server devices shown and described above with respect to FIGS. 1-5 and FIG. 8.
  • the server device 900 includes a processor 901, a communication bus 902, a network interface 903 and a memory 904 including programming code 905.
  • the server device 900 can include more or fewer devices, components and/or modules than those shown in FIG. 9.
  • the processor 901 can be any processing device capable of performing at least a portion of the methods 100-500 described with respect to FIGS. 1-5. Such a processor can be, for example, a CPU, a DSP, an FPGA, and/or the like.
  • the processor 901 can be configured to control the operations of other components and/or modules of the server device 900.
  • the processor 901 can be configured to control operations of the network interface 903.
  • the processor 901 can be configured to execute instructions or code stored in a software program or module (e.g., the program code 905) within the memory 904.
  • the communication bus 902 is configured to implement connections and communication among the other components of the server device 900.
  • the network interface 903 is configured to provide and control network interfaces of the server device 900 that are used to interact with other network devices (e.g., terminal devices).
  • the network interface 903 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface).
  • the network interface 903 is used for connecting one or more terminal devices and performing data communication with the one or more terminal devices.
  • operations of the network interface 903 are controlled by instructions or code stored in the memory 904 (e.g., the program code 905).
  • the memory 904 can include, for example, a random-access memory (RAM) (e.g., a DRAM, an SRAM, a DDR RAM, etc.) and/or a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 904 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the server device 900.
  • each component, program, application or module included in the memory 904 can be a hardware-based module (e.g., a DSP, a FPGA), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor), or a combination of hardware and software modules.
  • Instructions or code of each component, program, application or module can be stored in the memory 904 and executed at the processor 901.
  • components, modules and devices of the server device 900 are configured to collectively perform at least a portion of the methods 200-500 shown and described above with respect to FIGS. 2-5.
  • the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
  • stages that are not order dependent may be reordered and other stages may be combined or broken out. While some reorderings or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art, so the orderings and groupings presented herein are not an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Accounting & Taxation (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Telephonic Communication Services (AREA)

Abstract

A method for associating a payment account with a user account is disclosed. The method is performed at a terminal device having one or more processors and memory for storing programs to be executed by the one or more processors. The method includes receiving, through an account binding interface associated with the user account and from a user, a first voice input including first information of the payment account. The method also includes performing user authentication and information extraction based on the first voice input, wherein the user authentication is based on a voiceprint of the first voice input, and the information extraction is based on speech recognition of the first voice input. The method further includes providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input.

Description

METHOD, DEVICE AND SYSTEM FOR ASSOCIATING AND MANAGING
PAYMENT ACCOUNTS
Description
PRIORITY CLAIM AND RELATED APPLICATION
[0001] This application claims priority to Chinese Patent Application Serial No.
201310727165.4, entitled "METHOD, DEVICE AND SYSTEM FOR ASSOCIATING AND
MANAGING PAYMENT ACCOUNTS", filed December 24, 2013, which is incorporated herein by reference in its entirety.
FIELD OF THE APPLICATION
[0002] The present application generally relates to the field of mobile Internet technologies, and more particularly to a method, devices (e.g., terminal device, server device) and system for associating and managing payment accounts.
BACKGROUND
[0003] Some known online payment systems require a user to provide information of a payment account (e.g., a bank card) online such that the provided payment account can be bound to a user account of the user. For example, the user can be requested to provide a bank card number, a cardholder name, a cardholder identification number, a mobile number, etc. Such known online payment systems typically use a physical or virtual keyboard of a terminal device (e.g., mobile phone) as an input means, and require the user to operate on (e.g., click on) the keyboard to enter
information of the payment account. Such an input means, however, can be inconvenient and prone to errors for the user. For example, typing on the keyboard and switching between input methods (e.g., switching between a letter keyboard and a numeric keyboard) can cause errors in the user input.
[0004] Therefore, a need exists for a method, device and system that can provide an improved and smooth user experience in associating and managing payment accounts.
SUMMARY
[0005] The above deficiencies associated with the known online payment systems may be addressed by the techniques described herein.
[0006] In some embodiments, a method for associating a payment account with a user account is disclosed. The method is performed at a terminal device having one or more processors and memory for storing programs to be executed by the one or more processors. The method includes receiving, through an account binding interface associated with the user account and from a user, a first voice input. The first voice input includes first information of the payment account.
[0007] The method also includes performing user authentication and information extraction based on the first voice input. The user authentication is based on a voiceprint of the first voice input, and the information extraction is based on speech recognition of the first voice input. In some instances, the performing user authentication and information extraction includes sending the first voice input to a server device such that the server device (1) performs the user authentication based on the voiceprint of the first voice input and stored voice authentication information for the user account, and (2) performs the speech recognition of the first voice input to extract the first information of the payment account if the user authentication is successful. In some instances, the voice authentication information is provided to the server device prior to the first voice input is received at the terminal device.
[0008] The method further includes providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input. In some instances, the method includes presenting the first information of the payment account as extracted from the first voice input to the user through the account binding interface. Such a presentation can include, for example, at least one of (1) displaying a textual message including the first information of the payment account, and (2) providing a text-to-speech output presenting the first information of the payment account as extracted from the first voice input.
[0009] In some instances, the method includes receiving a voice command of the user associated with the first information of the payment account presented to the user. In such instances, the method further includes performing, according to the voice command, an operation associated with the first information of the payment account presented to the user. Such an operation can include, for example, one of (1) submitting, to a server device, a confirmation on the first
information of the payment account presented to the user, (2) deleting the first information of the payment account presented to the user, and (3) entering a modification mode for modifying the first information of the payment account presented to the user.
[0010] In some instances, the method includes detecting a possible error in the information extraction based on speech recognition of the first voice input, and presenting the possible error to the user. Such a possible error can be, for example, a low-confidence recognition result in the first information as extracted from the first voice input. In such instances, the method includes automatically selecting, based on a character type associated with the first information, a matching keyboard for the character type from a group of soft keyboards associated with the terminal device. The method further includes automatically, without further user input, displaying the automatically selected matching keyboard to prompt the user to enter correction data for the first information of the payment account presented to the user.
[0011] In some instances, the method includes receiving a second voice input from the user that includes second information of the payment account. The second information is different from the first information with respect to a predetermined characteristic. The predetermined characteristic can indicate, for example, whether respective information contained in a respective voice input meets a predetermined criticality level, or whether respective information contained in a respective voice input is an initial voice input received through the account binding interface. In such instances, the method also includes performing information extraction based on the second voice input without performing user authentication based on the second voice input.
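As an illustration only of the selective-authentication behavior described in the preceding paragraph, the decision of whether a given voice input requires user authentication might be sketched as follows; the field names treated as "critical" and the function name are assumptions, not limitations of the claimed method.

```python
# Fields assumed (for illustration only) to meet the predetermined criticality level.
CRITICAL_FIELDS = {"card_number", "cardholder_id"}

def requires_user_authentication(field_name: str, is_initial_input: bool) -> bool:
    """Authenticate the voiceprint for critical or initial voice inputs; skip authentication otherwise."""
    return is_initial_input or field_name in CRITICAL_FIELDS

if __name__ == "__main__":
    print(requires_user_authentication("card_number", is_initial_input=False))      # True
    print(requires_user_authentication("expiration_date", is_initial_input=False))  # False
    print(requires_user_authentication("expiration_date", is_initial_input=True))   # True
```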
[0012] In some embodiments, a terminal device includes one or more processors and memory storing one or more programs for execution by the one or more processors. The one or more programs include instructions that cause the terminal device to perform the method for associating a payment account with a user account as described above. In some embodiments, a non-transitory computer readable storage medium stores one or more programs including instructions for execution by one or more processors. The instructions, when executed by the one or more processors, cause the processors to perform the method for associating a payment account with a user account at a terminal device as described above.
[0013] Various advantages of the present application are apparent in light of the descriptions below.
BRIEF DESCRIPTION OF DRAWINGS
[0014] The aforementioned implementation of the application as well as additional implementations will be more clearly understood as a result of the following detailed description of the various aspects of the application when taken in conjunction with the drawings.
[0015] FIG. 1 is a flow chart illustrating a method performed at a terminal device for associating a payment account with a user account in accordance with some embodiments.
[0016] FIG. 2 is a flow chart illustrating a method performed at a server device for associating a payment account with a user account in accordance with some embodiments.
[0017] FIG. 3 is a flow chart illustrating a method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
[0018] FIG. 4 is a flow chart illustrating another method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
[0019] FIG. 5 is a flow chart illustrating yet another method performed at a terminal device and a server device for associating a payment account with a user account in accordance with some embodiments.
[0020] FIG. 6 is a block diagram illustrating modules of a terminal device for associating and managing payment accounts in accordance with some embodiments.
[0021] FIG. 7 is a block diagram illustrating components of a terminal device for associating and managing payment accounts in accordance with some embodiments.
[0022] FIG. 8 is a block diagram illustrating modules of a server device for associating and managing payment accounts in accordance with some embodiments.
[0023] FIG. 9 is a block diagram illustrating components of a server device for associating and managing payment accounts in accordance with some embodiments.
[0024] FIG. 10 is a schematic diagram illustrating a terminal device and a server device configured to associate and manage payment accounts in accordance with some embodiments.
[0025] FIG. 11 is a schematic diagram illustrating a user interface of a terminal device for entering information of a payment account in accordance with some embodiments.
[0026] FIG. 12 is a schematic diagram illustrating another user interface of a terminal device for entering information of a payment account in accordance with some embodiments.
[0027] FIG. 13 is a schematic diagram illustrating a user interface of a terminal device for correcting information of a payment account in accordance with some embodiments.
[0028] Like reference numerals refer to corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
[0029] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one skilled in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0030] In order to make the objectives, technical solutions, and advantages of the present application comprehensible, embodiments of the present application are further described in detail below with reference to the accompanying drawings.
[0031] FIG. 1 is a flow chart illustrating a method 100 performed at a terminal device for associating a payment account with a user account in accordance with some embodiments. The terminal device performing the method 100 can be any type of electronic device that can be used by a user to access an online payment service. Such a terminal device can be configured to
communicate with one or more server device(s) via one or more network(s) (e.g., the Internet). The terminal device can be configured to interact with the user operating the terminal device to provide the online payment service and related services and/or functions to the user. Details of a terminal device and a server device configured to provide an online payment service are shown and described below with respect to FIG. 10.
[0032] In some embodiments, the terminal device can be, for example, a cellular phone, a smart phone, a mobile Internet device (MID), a personal digital assistant (PDA), a tablet computer, an e-reader, a laptop computer, a handheld computer, a wearable device, a desktop computer, a vehicle terminal, and/or any other electronic device. In some embodiments, such a terminal device can be referred to as, for example, a client device, a user device, a mobile device, a portable device, a terminal, and/or the like. Details of a terminal device are shown and described below with respect to FIGS. 6 and 7.
[0033] A server device communicating with the terminal device performing the method 100 can be any type of device configured to function as a server-side device to provide the online payment service and related services and/or functions (e.g., account management) described herein. Such a server device can typically be configured to communicate with multiple terminal devices via one or more networks. In some embodiments, the server device can be, for example, a background server, a back end server, a database server, a workstation, a desktop computer, a cloud computing server, a data processing server, an instant messaging server, a Social Networking Service (SNS) server, a payment server, and/or the like. In some embodiments, a server device can be a server cluster or server center consisting of two or more servers (e.g., a data processing server and a database server). Details of a server device are shown and described below with respect to FIGS. 8 and 9.
[0034] A network connecting the terminal device and the server device can be any type of network configured to operatively couple one or more server devices to one or more terminal devices, and enable communications between the server device(s) and the terminal device(s). In some embodiments, such a network can include one or more networks such as, for example, a cellular network, a satellite network, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), Internet, etc. In some embodiments, such a network can be optionally implemented using any known network protocol including various wired and/or wireless protocols such as, for example, Ethernet, universal serial bus (USB), global system for mobile communications (GSM), enhanced data GSM environment (EDGE), general packet radio service (GPRS), long term evolution (LTE), code division multiple access (CDMA), wideband code division multiple Access (WCDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over internet protocol (VoIP), Wi-MAX, etc.
[0035] A user operating the terminal device performing the method 100 can be any person
(potentially) interested in using the online payment service provided by the terminal device and the server device. Such a person can use the online payment service to, for example, make online payments, conduct online shopping, etc.
[0036] As an example, FIG. 10 is a schematic diagram illustrating a terminal device 1020 and a server device 1040 configured to associate and manage payment accounts in accordance with some embodiments. The terminal device 1020 can be structurally and functionally similar to the terminal device performing the method 100 as described above. The server device 1040 can be structurally and functionally similar to the server device described above. As shown in FIG. 10, the terminal device 1020 is operated by a user 1010 and operatively coupled to the server device 1040 via a network 1030. The user 1010 can be similar to the user described above, and the network 1030 can be similar to the network described above.
[0037] In operation, the user 1010 can operate the terminal device 1020 to set up a payment account and/or link the payment account to a user account of the user 1010. Specifically, for example, the user 1010 can operate the terminal device 1020 to provide a voice input including information of a payment account. The terminal device 1020 can send the voice input to the server device 1040 such that the server device 1040 can extract the information of the payment account from the voice input. The server device 1040 can then send the extracted information of the payment account to the terminal device 1020 such that the terminal device 1020 can present the extracted information of the payment account to the user 1010 for verification. Based on the extracted information presented to the user 1010, the user 1010 can operate the terminal device 1020 to update or confirm the extracted information. After the user 1010 confirms that accurate information of the payment account has been submitted to the server device 1040, the server device 1040 can bind the payment account to the user account of the user. Details of the operations associated with binding a payment account to a user account are shown and described with respect to FIGS. 1-5.
[0038] Returning to FIG. 1, in some embodiments, the terminal device performing the method 100 can include one or more processors and memory. In such embodiments, the method 100 is governed by instructions or code of an application that are stored in a non-transitory computer readable storage medium of the terminal device and executed by the one or more processors of the terminal device. The application is associated with managing payment accounts and associating payment accounts with user accounts. Such an application typically has a server-side portion that is stored in and/or executed at the server device, and a client-side portion that is stored in and/or executed at the terminal device. As a result of the client-side portion of the application being executed, the method 100 is performed at the terminal device. As shown in FIG. 1, the method 100 includes the following steps.
[0039] At S101, the terminal device receives, from the user, a voice input including information of a payment account. In some embodiments, initially, the user logs onto a program or service (e.g., online payment service) of the terminal device that enables the user to setup a payment account and link that payment account to a user account of the user. Typically, the user can log onto the program or service using a login credential (e.g., a combination of a user ID and a password) that is uniquely associated with the user account. Thus, when the user logs onto the program or service, the user account is identified at the server device based on the login credential provided by the user at the terminal device.
[0040] When the user initiates a process to setup a payment account and/or link that payment account to the user account, the user is typically prompted to enter information of the payment account on a user interface (e.g., an account binding interface) of the terminal device. Operating on the user interface, the user can generate a voice input including the required information of the payment account. In some embodiments, the information of the payment account included in the voice input can include, for example, a card number, a card type, an expiration date, a phone number, a user name, and/or the like.
[0041] The user can enter the voice input into the terminal device using various methods. In some embodiments, for example, the user can trigger the terminal device to enter a sound capture mode using any suitable means, and then speak to a sound capture device (e.g., an embedded microphone) of the terminal device while the terminal device is in the sound capture mode. For example, the user can trigger the sound capture mode by operating on (e.g., long press, single-click, double-click) a voice input button on a user interface of the terminal device. When the terminal device is in the sound capture mode, the voice message spoken by the user can be captured by the sound capture device of the terminal device as the voice input. Additionally, in some embodiments, the user can operate the terminal device to switch between the sound capture mode and a text input mode for entering data into the terminal device.
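For illustration, the press-and-hold behavior described in paragraph [0041] can be sketched as the small state machine below. The event names (on_press, on_audio_frame, on_release) and the callback are assumptions; a real terminal device would hook these to its platform's button and microphone APIs.

```python
class VoiceInputButton:
    """Long pressing activates the sound capture mode; releasing stops capture and hands off the audio."""

    def __init__(self, send_to_server):
        self.capturing = False
        self.frames = []
        self.send_to_server = send_to_server  # callback, e.g., a transmitting module

    def on_press(self):
        # Entering the sound capture mode: start buffering audio frames.
        self.capturing = True
        self.frames = []

    def on_audio_frame(self, frame: bytes):
        if self.capturing:
            self.frames.append(frame)

    def on_release(self):
        # Leaving the sound capture mode: the buffered frames form the voice input.
        self.capturing = False
        if self.frames:
            self.send_to_server(b"".join(self.frames))

if __name__ == "__main__":
    button = VoiceInputButton(send_to_server=lambda audio: print(f"sending {len(audio)} bytes"))
    button.on_press()
    button.on_audio_frame(b"\x01\x02")
    button.on_audio_frame(b"\x03")
    button.on_release()  # sending 3 bytes
```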
[0042] As an example, FIG. 11 is a schematic diagram illustrating a user interface 1100 (e.g., an account binding interface) of a terminal device for entering information of a payment account in accordance with some embodiments. The terminal device presenting the user interface 1100 is similar to the terminal device performing the method 100 described above. As shown in FIG. 11, a user operating the terminal device is prompted to enter a bank card number into the terminal device using the user interface 1100. The user can choose to type the bank card number into the blank field 1102, or enter a voice input by long pressing the voice input button 1104 (thus triggering a sound capture mode). After completing this step, the user can proceed to the next step by clicking the button 1106.
[0043] As another example, FIG. 12 is a schematic diagram illustrating another user interface
1200 of a terminal device for entering information of a payment account in accordance with some embodiments. The terminal device presenting the user interface 1200 is similar to the terminal device performing the method 100 described above. As shown in FIG. 12, a user operating the terminal device is prompted to enter multiple information items into the terminal device using the user interface 1200. The user can choose to enter the information by typing in the corresponding blank field, and/or entering a voice input. For example, the user can type a bank card type (e.g., "XX Bank Credit Card") into the field 1202, type an expiration date into the field 1206, and type a mobile phone number into the field 1210. For another example, the user can provide information of the expiration date by long pressing the voice input button 1204 to enter a voice input, and provide information of the mobile phone number by long pressing the voice input button 1208 to enter another voice input.
[0044] Returning to FIG. 1, at S102, the terminal device sends the voice input to the server device such that the server device performs user authentication and information extraction based on the voice input. As a result, the information of the payment account is extracted from the voice input and recognized at the server device. Details of a server device performing user authentication and/or information extraction on a voice input are shown and further described with respect to FIG. 4.
[0045] At S103, the terminal device receives, from the server device, the information of the payment account as extracted from the voice input. The terminal device then presents the information of the payment account to the user on the user interface (e.g., account binding interface) or via any other suitable means. In some embodiments, the terminal device receives and presents the extracted information of the payment account such that the user can check and confirm the accuracy of the information. In such embodiments, the terminal device can present the extracted information to the user in various suitable ways. For example, the terminal device can display a textual message including the information extracted from the voice input on the user interface such that the user can view the extracted information. For another example, the terminal device can output a voice message (e.g., by a speaker of the terminal device) including the extracted information using a text-to-speech technology such that the user can hear the extracted information. Furthermore, if the user determines that the extracted information is incorrect or inaccurate, the user can operate the terminal device to modify the information of the payment account such that the correct information can be sent to the server device.
[0046] At S104, the terminal device sends a confirmation of the information of the payment account to the server device such that the payment account is bound to a user account associated with the user at the server device based on the information of the payment account. Specifically, after the user determines that the extracted information is correct and accurate, the user can operate the terminal device to indicate a confirmation on the information. For example, the user can click a "confirm" or "submit" button on a submission interface of the terminal device.
[0047] As a result, the terminal device sends a confirmation of the information to the server device. In response to receiving the confirmation, the server device can associate the payment account with the user account of the user that has been identified since the user logged onto the program or service. For example, the server device can store information of the payment account into a field of a data entry for the user account in a database, where each payment account is uniquely associated with (e.g., bound to) one and only one user account. Thus, the payment account provided by the user is bound to the user account of the user at the server device. Furthermore, in some embodiments, after the payment account is bound to the user account at the server device, the server device sends a signal to the terminal device such that the terminal device provides an indication of the successful binding to the user.
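For illustration, the client-side sequence S101-S104 can be condensed into the sketch below, in which each step is supplied as a callback so that no particular UI toolkit or network stack is implied; all names are hypothetical.

```python
def bind_payment_account(capture_voice, send_voice, receive_extracted, present, confirmed, send_confirmation):
    """S101-S104 as a single client-side sequence; each argument is an app-supplied callback."""
    voice_input = capture_voice()      # S101: receive the voice input from the user
    send_voice(voice_input)            # S102: send it to the server device for extraction
    extracted = receive_extracted()    # S103: receive the extracted payment account information
    present(extracted)                 #        and present it to the user for verification
    if confirmed(extracted):           # the user reviews (and, if needed, corrects) the result
        send_confirmation(extracted)   # S104: confirm so that the server device binds the account
        return True
    return False

if __name__ == "__main__":
    ok = bind_payment_account(
        capture_voice=lambda: b"audio",
        send_voice=lambda audio: None,
        receive_extracted=lambda: {"card_number": "6222020123456789"},
        present=print,
        confirmed=lambda info: True,
        send_confirmation=lambda info: print("confirmation sent"),
    )
    print(ok)
```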
[0048] FIG. 2 is a flow chart illustrating a method 200 performed at a server device for associating a payment account with a user account in accordance with some embodiments. The server device performing the method 200 can be similar to the server device described above with respect to FIG. 1. Particularly, the server device performing the method 200 can be operatively coupled to and communicate with (e.g., via a network) one or more terminal devices that are similar to the terminal device described above with respect to FIG. 1.
[0049] In some embodiments, the server device performing the method 200 can include one or more processors and memory. In such embodiments, the method 200 is governed by instructions or code of an application that are stored in a non-transitory computer readable storage medium of the server device and executed by the one or more processors of the server device. The application is associated with managing payment accounts and associating payment accounts with user accounts. Such an application typically has a server-side portion that is stored in and/or executed at the server device, and a client-side portion that is stored in and/or executed at the terminal device. As a result of the server-side portion of the application being executed, the method 200 is performed at the server device. As shown in FIG. 2, the method 200 includes the following steps.
[0050] At S201, the server device receives, from a terminal device, a voice input including information of a payment account. The terminal device is one of the terminal devices operatively coupled to and communicating with the server device. Corresponding to S101-S102 of the method 100 described above, a user of the terminal device logs onto a program or service using the terminal device to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user. The user account can be identified at the server device based on, for example, a login credential (e.g., a user name and a password) provided by the user. In response to a prompt displayed on a user interface (e.g., account binding interface) of the terminal device, the user enters a voice input including the information of the payment account into the terminal device, which then sends the voice input to the server device.
[0051] At S202, the server device performs speech recognition of the voice input to extract the information of the payment account. In some embodiments, the server device performs user authentication based on the voice input prior to performing the speech recognition of the voice input. Details of a server device performing user authentication and speech recognition of a voice input are shown and further described with respect to FIG. 4. As a result of a successful speech recognition, the server device extracts the information of the payment account from the voice input.
[0052] At S203, the server device binds the payment account to the user account based on the information of the payment account. As described herein, the user account can be identified based on, for example, the login credential provided by the user. Thus, such a user account is uniquely associated with the user. As a result of the payment account being bound to the user account, for example, the user can use the payment account in various online payment services (e.g., make a payment) and/or other services without re-entering information of the payment account.
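The one-to-one binding described at S203 (each payment account bound to one and only one user account) can be illustrated with the in-memory sketch below; the AccountStore class is a hypothetical stand-in for the server-side database.

```python
class AccountStore:
    """In-memory stand-in for the server database; enforces one user account per payment account."""

    def __init__(self):
        self._payment_to_user = {}

    def bind(self, user_account: str, payment_account: str) -> None:
        owner = self._payment_to_user.get(payment_account)
        if owner is not None and owner != user_account:
            raise ValueError("payment account is already bound to another user account")
        self._payment_to_user[payment_account] = user_account

    def owner_of(self, payment_account: str) -> str:
        return self._payment_to_user[payment_account]

if __name__ == "__main__":
    store = AccountStore()
    store.bind("user_1010", "6222020123456789")
    print(store.owner_of("6222020123456789"))  # user_1010
```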
[0053] FIG. 3 is a flow chart illustrating a method 300 performed at a terminal device 310 and a server device 320 for associating a payment account with a user account in accordance with some embodiments. The terminal device 310 is similar to the terminal devices described above with respect to FIGS. 1 and 2, and the server device 320 is similar to the server devices described above with respect to FIGS. 1 and 2. As shown in FIG. 3, the terminal device 310 is operatively coupled to and communicates with the server device 320 (e.g., via a network not shown in FIG. 3).
[0054] In some embodiments, the terminal device 310 and the server device 320 each includes one or more processors and memory. In such embodiments, the method 300 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 320, and a client-side portion that is stored in and/or executed at the terminal device 310. As a result of the server-side portion of the application and the client-side portion of the application being executed at the server device 320 and the terminal device 310 respectively, the server device 320 and the terminal device 310 collectively perform the method 300. As shown in FIG. 3, the method 300 includes the following steps.
[0055] At S301, the terminal device 310 receives a voice input including information of a payment account. As described with respect to S101 of the method 100 and S201 of the method 200, a user operating the terminal device 310 can log onto a program or service using the terminal device 310 to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user. The user account can be identified at the server device 320 based on, for example, a login credential (e.g., a user ID and password) entered by the user into the terminal device 310. In the process, the user can enter the voice input including the information of the payment account into the terminal device 310 on a user interface (e.g., an account binding interface) of the terminal device 310. Thus, the terminal device 310 receives the voice input from the user.
[0056] At S302, the terminal device 310 sends the voice input to the server device 320, as described with respect to S102 of the method 100 and S201 of the method 200. At S303, the server device 320 performs speech recognition of the voice input to extract the information of the payment account. As described with respect to S102 of the method 100 and S202 of the method 200, in some embodiments, the server device 320 can perform user authentication based on the voice input prior to performing the speech recognition of the voice input. In such embodiments, if the user
authentication is successful (that is, the user's identity is confirmed based on the voice input), the server device 320 performs the speech recognition of the voice input to extract the information of the payment account from the voice input. Otherwise, if the user authentication fails (that is, the user's identity is not confirmed based on the voice input), the server device 320 aborts the speech
recognition of the voice input, and the information of the payment account is not extracted from the voice input.
[0057] As a result of successfully performing speech recognition of the voice input, the server device 320 extracts the information of the payment account from the voice input. In some embodiments, as described with respect to S103-S104 of the method 100 (although not shown in FIG. 3), the server device 320 can send the extracted information back to the terminal device 310, which then presents the extracted information (e.g., display in a textual form, output in an audio form) to the user. After obtaining a confirmation from the user on the extracted information, the terminal device 310 can send a confirmation to the server device 320 indicating that the extracted information is confirmed by the user.
[0058] Subsequently, at S304, the server device 320 binds the payment account to the user account based on the information of the payment account. As described with respect to S104 of the method 100 and S203 of the method 200, the server device 320 can associate the payment account with the user account by, for example, linking the two accounts in the same data entry uniquely assigned to the user account (or the user) in a database. As a result, each payment account recorded in the database is uniquely associated with (e.g., bound to) one and only one user account.
Additionally, in some embodiments (although not shown in FIG. 3), after binding the payment account to the user account, the server device 320 can send a signal to cause the terminal device 310 to present an indication of the successful binding to the user.
[0059] FIG. 4 is a flow chart illustrating another method 400 performed at a terminal device
410 and a server device 420 for associating a payment account with a user account in accordance with some embodiments. The terminal device 410 is similar to the terminal devices described above with respect to FIGS. 1-3, and the server device 420 is similar to the server devices described above with respect to FIGS. 1-3. As shown in FIG. 4, the terminal device 410 is operatively coupled to and communicates with the server device 420 (e.g., via a network not shown in FIG. 4).
[0060] In some embodiments, the terminal device 410 and the server device 420 each includes one or more processors and memory. In such embodiments, the method 400 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 420, and a client-side portion that is stored in and/or executed at the terminal device 410. As a result of the server-side portion of the application and the client-side portion of the application being executed at the server device 420 and the terminal device 410 respectively, the server device 420 and the terminal device 410 collectively perform the method 400. As shown in FIG. 4, the method 400 includes the following steps.
[0061] At S401, the terminal device 410 receives a voice input including information of a payment account. As described with respect to S101 of the method 100, S201 of the method 200 and S301 of the method 300, a user operating the terminal device 410 can log onto a program or service using the terminal device 410 to initiate a process for setting up a payment account and/or linking the payment account to a user account of the user. The user account can be identified at the server device 420 based on, for example, a login credential entered by the user into the terminal device 410. In the process, the user can enter the voice input including the information of the payment account into the terminal device 410 on a user interface (e.g., an account binding interface) of the terminal device 410. Thus, the terminal device 410 receives the voice input from the user.
[0062] At S402, the terminal device 410 sends the voice input to the server device 420, as described with respect to S102 of the method 100, S201 of the method 200 and S302 of the method 300. At S403, the server device 420 performs user authentication based on a voiceprint of the voice input and stored voice authentication information for the user account. In some embodiments, the server device 420 can be configured to determine the voiceprint of the voice input by analyzing the voice input using any suitable method. Such a voiceprint of the voice input can be a representation in any suitable form that uniquely identifies the voice of the user that generates the voice input. For example, the voiceprint can be a spectrogram. In some embodiments, the voiceprint can also be referred to as, for example, a spectral waterfall, a voicegram, etc.
[0063] The server device 420 can also be configured to identify the stored voice
authentication information for the user account based on, for example, the login credential entered by the user that uniquely links to the user account. The stored voice authentication information can be, for example, a stored template used to identify the user via her voice in speaker recognition.
Furthermore, the stored voice authentication information can be compared against the voiceprint obtained from the voice input to determine whether the voiceprint and the stored voice authentication information are associated with the same user. Thus, the user's identity is authenticated based on the voiceprint of the voice input and the stored voice authentication information for the user account.
[0064] In some embodiments, the stored voice authentication information is provided to the server device 420 prior to the terminal device 410 receiving the voice input. For example, the user is required to provide a voice sample (e.g., a template) when the user creates the user account associated with the program or service. Such a voice sample is sent to and stored at the server device 420, and is uniquely linked to the user account (and the user). Subsequently, when a voice input is received at the server device 420, the server device 420 can (optionally) perform user authentication of the voice input based on the stored voice sample and/or other voice samples associated with other user accounts (i.e., provided by other users).
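The enrollment and verification described in the preceding two paragraphs can be sketched as follows. This is a non-limiting illustration only: the reduction of a voiceprint to a fixed-length feature vector, the cosine-similarity comparison, and the 0.8 threshold are assumptions of the sketch, not requirements of the embodiments.

```python
import math

# In-memory store of enrolled voiceprints, keyed by user account id (assumption:
# in practice this would live in the server device's database).
enrolled_voiceprints: dict[str, list[float]] = {}

def enroll(user_id: str, voiceprint: list[float]) -> None:
    """Store the voice sample's feature vector when the user account is created."""
    enrolled_voiceprints[user_id] = voiceprint

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(user_id: str, voiceprint: list[float], threshold: float = 0.8) -> bool:
    """Compare the voiceprint of the incoming voice input against the stored template."""
    template = enrolled_voiceprints.get(user_id)
    return template is not None and cosine_similarity(voiceprint, template) >= threshold

# Example usage with toy 4-dimensional "voiceprints".
enroll("user-42", [0.9, 0.1, 0.3, 0.5])
print(authenticate("user-42", [0.88, 0.12, 0.29, 0.52]))  # True (same speaker)
print(authenticate("user-42", [0.1, 0.9, 0.8, 0.0]))      # False (different speaker)
```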
[0065] If the user authentication fails (that is, the user's identity is not confirmed based on the voice input), the server device 420 aborts the speech recognition of the voice input, and the information of the payment account is not extracted from the voice input (not shown in FIG. 4). Otherwise, if the user authentication is successful (that is, the user's identity is confirmed based on the voice input), at S404, the server device 420 performs speech recognition of the voice input to extract the information of the payment account. In some embodiments, the server device 420 can be configured to perform speech recognition of the voice input using any suitable speech recognition technology (e.g., acoustic modeling, language modeling, etc.).
[0066] In some embodiments, a voice input is considered as a void or invalid voice input if no meaningful information can be extracted from that voice input using speech recognition. For example, if the sound volume (i.e., loudness) of the voice input is below a threshold, then no information can be recognized from the voice input. For another example, if the noise level in the voice input is above a threshold (e.g., signal-to-noise ratio below a threshold), then no meaningful information can be successfully extracted from the voice input.
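A non-limiting sketch of such a validity check is shown below; it assumes the raw audio is available as normalized samples and uses illustrative RMS-loudness and signal-to-noise thresholds.

```python
import math

def is_valid_voice_input(samples: list[float],
                         noise_floor_rms: float,
                         min_rms: float = 0.01,
                         min_snr_db: float = 10.0) -> bool:
    """Reject a voice input whose loudness is too low or whose noise level is too high.

    `samples` are normalized audio samples in [-1, 1]; `noise_floor_rms` is an
    estimate of the background-noise level (both are assumptions of this sketch).
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0
    if rms < min_rms:                      # sound volume below threshold
        return False
    if noise_floor_rms <= 0:
        return True
    snr_db = 20 * math.log10(rms / noise_floor_rms)
    return snr_db >= min_snr_db            # signal-to-noise ratio above threshold

print(is_valid_voice_input([0.2, -0.18, 0.21, -0.19], noise_floor_rms=0.01))  # True
print(is_valid_voice_input([0.001, -0.001], noise_floor_rms=0.01))            # False (too quiet)
```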
[0067] As a result of performing S404, the information of the payment account is extracted from the voice input at the server device 420. At S405, the server device 420 sends the extracted information of the payment account to the terminal device 410. In some embodiments, as described with respect to S103-S104 of the method 100, in response to receiving the extracted information of the payment account, the terminal device 410 can be configured to present the extracted information received from the server device 420 to the user such that the user can check and confirm the accuracy of the extracted information (not shown in FIG. 4). If the user determines that the extracted information received from the server device 420 is accurate (i.e., identical to the original information of the payment account included in the voice input), the user can operate the terminal device 410 to send a confirmation to the server device 420. Otherwise, if the user determines that the extracted information received from the server device 420 is incorrect or inaccurate (i.e., different from the original information of the payment account included in the voice input), the user can operate the terminal device 410 to modify the information and then send the updated information of the payment account to the server device 420. Details of a user operating a terminal device to modify information of a payment account are shown and further described with respect to FIG. 5. [0068] At S406, the terminal device 410 sends a collection of information of the payment account to the server device 420. At S407, the server device 420 stores the collection of information of the payment account received from the terminal device 410. In some embodiments, the collection of information of the payment account can include multiple pieces of information of the payment account. For example, the voice input received at the terminal device 410 at S401 and sent from the terminal device 410 to the server device 420 at S402 is a first voice input, which includes first information of the payment account (e.g., a card number). After receiving the first voice input, the terminal device 410 receives a second voice input from the user, which includes second information of the payment account (e.g., a card expiration date). The terminal device 410 then sends the second voice input to the server device 420. The second information of the payment account is included in the collection of information of the payment account with respect to S406-S407. In some embodiments, each piece of information included in the collection of information of the payment account with respect to S406-S407, and/or the corresponding voice input that includes that information, is processed according to S401-S405 as described above.
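A non-limiting sketch of accumulating such a collection of information on the server side before binding (S406-S407) is shown below; the field names are illustrative assumptions.

```python
# Accumulate extracted pieces of payment-account information, one per voice input,
# until the collection is complete enough to bind (field names are assumptions).
REQUIRED_FIELDS = {"card_number", "expiration_date", "holder_name"}

def add_extracted_info(collection: dict[str, str], field: str, value: str) -> dict[str, str]:
    collection[field] = value
    return collection

def is_complete(collection: dict[str, str]) -> bool:
    return REQUIRED_FIELDS <= collection.keys()

collection: dict[str, str] = {}
add_extracted_info(collection, "card_number", "6222021234567890")   # from first voice input
add_extracted_info(collection, "expiration_date", "12/2026")        # from second voice input
add_extracted_info(collection, "holder_name", "Zhang Wei")          # from third voice input
print(is_complete(collection))  # True: the collection is ready for S406-S408
```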
[0069] In some embodiments, the second information of the payment account can be different from the first information of the payment account with respect to a predetermined characteristic. Such a predetermined characteristic can be associated with, for example, a type, size, criticality level, indication, etc., of respective information. Furthermore, the server device 420 can be configured to perform different operation(s) on different received voice inputs based on the predetermined characteristic of the received voice inputs. For example, the server device 420 can be configured to perform user authentication and then perform information extraction (if the user authentication is successful) on the first voice input including the first information; the server device 420 can be configured to perform information extraction on the second voice input including the second information without performing user authentication on the second voice input, where the second information is different from the first information with respect to the predetermined characteristic.
[0070] For example, the predetermined characteristic can indicate whether respective information contained in a respective voice input (e.g., the first information contained in the first voice input, the second information contained in the second voice input) meets a predetermined criticality level. For instance, information of a card number has a higher criticality level (i.e., being more critical) than information of a card expiration date. For another instance, information of a phone number has a higher criticality level (i.e., being more critical) than information of a type of the phone number (e.g., home phone number, mobile phone number, office phone number). Thus, the information of a card number and the information of a phone number meet the predetermined criticality level, while information of a card expiration date and information of a type of the phone number do not meet the predetermined criticality level.
[0071] As a result, the server device 420 performs user authentication on a voice input if information contained in the voice input meets the predetermined criticality level (e.g., card number, phone number). If such user authentication is successful, the server device 420 performs information extraction on the voice input to extract the information. On the contrary, the server device 420 performs information extraction on a voice input to extract information contained in that voice input if the information contained in the voice input fails to meet the predetermined criticality level (e.g., card expiration date, type of phone number). Thus, user authentication is performed only on voice inputs including information that meets the predetermined criticality level.
[0072] For another example, the predetermined characteristic can indicate whether respective voice input (e.g., the first voice input, the second voice input) is an initial voice input received at the terminal device 410 (e.g., through an account binding interface) during a session. Such a session can be defined as, for example, communications between a particular user and the terminal device 410 and/or operations performed by the particular user on the terminal device 410 during a certain period of time. As a result, for example, a new session is initiated if a user logs off and the same user or a different user logs in again, or a user returns to operate the terminal device 410 after being silent for a certain period of time, and/or the like.
[0073] Subsequently, the server device 420 performs user authentication on a voice input if that voice input is the initial voice input (of a session) received at the terminal device 410. If such user authentication is successful, the server device 420 performs information extraction on the voice input to extract the information. On the contrary, the server device 420 performs information extraction on a voice input to extract information contained in that voice input if that voice input is a voice input subsequently received at the terminal device 410 in a session (i.e., not the initial voice input of a session). In other words, user authentication is performed on the initial voice input at the beginning of a session, and ignored for subsequent voice inputs as long as they are received during the same session.
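One possible, non-limiting realization combining the two example rules above (the criticality level of the information and whether the voice input is the initial voice input of a session) is sketched below; the field names and the session flag are assumptions of the sketch.

```python
# Fields treated as meeting the predetermined criticality level (assumption).
CRITICAL_FIELDS = {"card_number", "phone_number"}

def needs_user_authentication(field: str, is_initial_of_session: bool) -> bool:
    """Decide whether to run voiceprint authentication before extracting a field.

    Rule 1: critical information (e.g., a card number) triggers authentication.
    Rule 2: the initial voice input of a session triggers authentication
            regardless of the field it carries.
    """
    return field in CRITICAL_FIELDS or is_initial_of_session

print(needs_user_authentication("card_number", is_initial_of_session=False))      # True
print(needs_user_authentication("expiration_date", is_initial_of_session=False))  # False
print(needs_user_authentication("expiration_date", is_initial_of_session=True))   # True
```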
[0074] After the collection of necessary information of the payment account is received, stored and verified at the server device 420, at S408, the server device 420 binds the payment account to the user account based on the collection of information of the payment account. As described with respect to S104 of the method 100, S203 of the method 200 and S304 of the method 300, the server device 420 can be configured to bind the payment account to the user account by, for example, linking the two accounts in the same data entry uniquely assigned to the user account (or the user) in a database. As a result, each payment account recorded in the database is uniquely associated with (e.g., bound to) one and only one user account. Additionally, in some embodiments (although not shown in FIG. 4), after binding the payment account to the user account, the server device 420 can send a signal to cause the terminal device 410 to present an indication of the successful binding to the user.
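A minimal, non-limiting sketch of the binding step at S408 using an in-memory SQLite table is shown below; the table layout and the primary-key constraint enforcing the one-to-one binding are assumptions about how the described database could be organized.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE account_binding (
        payment_account TEXT PRIMARY KEY,   -- each payment account is bound to one and
        user_account    TEXT NOT NULL,      -- only one user account, per the description
        info            TEXT                -- collected payment-account information (e.g., JSON)
    )
""")

def bind_payment_account(user_account: str, payment_account: str, info: str) -> None:
    """Link the payment account to the user account in the same data entry."""
    conn.execute(
        "INSERT INTO account_binding (payment_account, user_account, info) VALUES (?, ?, ?)",
        (payment_account, user_account, info),
    )
    conn.commit()

bind_payment_account("user-42", "6222021234567890", '{"expiration_date": "12/2026"}')
print(conn.execute("SELECT * FROM account_binding").fetchall())
```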
[0075] FIG. 5 is a flow chart illustrating another method 500 performed at a terminal device
580 and a server device 590 for associating a payment account with a user account in accordance with some embodiments. The terminal device 580 is similar to the terminal devices described above with respect to FIGS. 1-4, and the server device 590 is similar to the server devices described above with respect to FIGS. 1-4. As shown in FIG. 5, the terminal device 580 is operatively coupled to and communicates with the server device 590 (e.g., via a network not shown in FIG. 5).
[0076] In some embodiments, the terminal device 580 and the server device 590 each includes one or more processors and memory. In such embodiments, the method 500 is governed by instructions or code of an application, which includes a server-side portion that is stored in and/or executed at the server device 590, and a client-side portion that is stored in and/or executed at the terminal device 580. As a result of the server-side portion of the application and the client-side portion of the application being executed at the server device 590 and the terminal device 580 respectively, the server device 590 and the terminal device 580 collectively perform the method 500. As shown in FIG. 5, the method 500 includes the following steps.
[0077] At S501, the terminal device 580 receives a first voice input including first information of a payment account. At S502, the terminal device 580 sends the first voice input to the server device 590. At S503, the server device 590 performs user authentication based on a voiceprint of the first voice input and stored voice authentication information for the user account. At S504, if the user authentication is successful, the server device 590 performs speech recognition of the first voice input to extract the first information of the payment account. At S505, the server device 590 sends the extracted first information of the payment account to the terminal device 580. The operations of S501-S505 are similar to the operations of S401-S405 of the method 400 shown and described with respect to FIG. 4, and/or the corresponding operations shown and described with respect to FIGS. 1-3, thus not elaborated herein.
[0078] In some embodiments, in response to receiving the extracted first information of the payment account, the terminal device 580 can be configured to present the extracted first information to a user operating the terminal device 580 such that the user can check and confirm the accuracy of the extracted first information. Based on the result of the checking and confirming, the user then provides a voice command indicating the next operation on the extracted first information. Such a voice command is then sent to the server device 590 for interpretation. That is, the server device 590 performs speech recognition on the voice command to extract information including the command from the voice command. The server device 590 then sends the extracted command to the terminal device 580 such that the terminal device 580 performs an operation based on the command extracted from the voice command.
[0079] Typically, if the user determines that the extracted first information is accurate (that is, identical to the original first information included in the first voice input), the user provides a voice command including a submission command such that the extracted first information is submitted to the server device 590 (as shown and described with respect to S511-S514). Otherwise, if the user determines that the extracted first information is incorrect or inaccurate (that is, different from the original first information included in the first voice input), the user provides a voice command including a control command (e.g., a deleting or editing command) such that the extracted first information is deleted or further modified before being submitted to the server device 590 (as shown and described with respect to S506-S510).
[0080] Assuming that the user determines that the extracted first information is incorrect or inaccurate, at S506, the terminal device 580 receives a first voice command from the user. The first voice command includes a control command. At S507, the terminal device 580 sends the first voice command to the server device 590. At S508, the server device 590 performs speech recognition of the first voice command to extract the control command. At S509, the server device 590 sends the control command to the terminal device 580. At S510, the terminal device 580 performs an operation associated with the first information of the payment account. Particularly, such an operation is based on the control command received from the server device 590.
[0081] In some embodiments, such a control command can be a deleting command. The voice command corresponding to the deleting command can include a keyword "delete," "remove," "clear," and/or the like. In such embodiments, the operation performed at the terminal device 580 at S510 includes deleting the first information of the payment account presented to the user. For example, the first information is cleared from a field on a user interface (e.g., an account binding interface), thus enabling the user to provide new information of the payment account for the respective field. In some embodiments, the user can be prompted to enter new data into the field, generate a new voice input for the field, and/or use any other suitable method to provide new information of the payment account.
[0082] In some embodiments, such a control command can be an editing command. The voice command corresponding to the editing command can include a keyword "edit," "change," "modify," "update," and/or the like. In such embodiments, the operation performed at the terminal device 580 at S510 includes entering a modification mode for modifying the first information of the payment account presented to the user. In the modification mode, the user can choose to modify the first information by, for example, using a keyboard associated with the terminal device 580, using a voice input button, and/or any other suitable methods. [0083] In some embodiments, in the modification mode, the terminal device 580 can be configured to automatically determine a character type of the first information as extracted from the first voice input, and then automatically select, based on the character type of the extracted first information, a matching keyboard from a group of soft keyboards associated with the terminal device 580. The terminal device 580 can also be configured to automatically, without further user input, display such a matching keyboard on a user interface (e.g., an account binding interface) of the terminal device 580 such that the user can enter correction data using the matching keyboard. The group of soft keyboards can include, for example, a numerical keyboard (without any alphabetical key), an alphanumeric keyboard (with both alphabetical and numerical keys), an alphabetical keyboard (without any numerical key), a special character keyboard, a keyboard for a foreign language, etc.
[0084] For example, if the first information has a character type of numerical characters (e.g., a card number, phone number), the terminal device 580 automatically determines the numerical characters of the first information, and selects a numerical keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected numerical keyboard to prompt the user to enter correction data for the first information using the selected numerical keyboard.
[0085] For another example, if the first information has a character type of alphabetical characters (e.g., a user's name), the terminal device 580 automatically determines the alphabetical characters of the first information, and selects an alphabetical keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected alphabetic keyboard to prompt the user to enter correction data for the first information using the selected alphabetic keyboard.
[0086] For yet another example, if the first information has a character type of a combination of numerical and alphabetic characters (e.g., a mailing address), the terminal device 580
automatically determines the combination of numerical and alphabetic characters of the first information, and selects an alphanumeric keyboard for the first information. The terminal device 580 then automatically, without further user input, displays the selected alphanumeric keyboard to prompt the user to enter correction data for the first information using the selected alphanumeric keyboard.
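A non-limiting sketch of the automatic keyboard selection described in the three examples above follows; the keyboard identifiers are illustrative.

```python
def select_keyboard(extracted_value: str) -> str:
    """Pick a soft keyboard matching the character type of the extracted information.

    Returns an illustrative keyboard identifier; an actual device would map this
    to one of its soft-keyboard layouts.
    """
    has_digits = any(c.isdigit() for c in extracted_value)
    has_alpha = any(c.isalpha() for c in extracted_value)
    if has_digits and has_alpha:
        return "alphanumeric_keyboard"   # e.g., a mailing address
    if has_digits:
        return "numerical_keyboard"      # e.g., a card number or phone number
    if has_alpha:
        return "alphabetical_keyboard"   # e.g., a user's name
    return "special_character_keyboard"

print(select_keyboard("6222021234567890"))   # numerical_keyboard
print(select_keyboard("Zhang Wei"))          # alphabetical_keyboard
print(select_keyboard("12 Nanshan Road"))    # alphanumeric_keyboard
```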
[0087] As an example, FIG. 13 is a schematic diagram illustrating a user interface 1300 of a terminal device for correcting information of a payment account in accordance with some
embodiments. The terminal device presenting the user interface 1300 is similar to the terminal device 580 described herein. As shown in FIG. 13, after a possible error is detected (at the terminal device or a corresponding server device) for provided information associated with a card number, the terminal device automatically determines that the information for the card number has a numerical characteristic. The terminal device then automatically selects, based on the determined numerical characteristic, a numerical keyboard 1302. Moreover, the terminal device automatically, without further user input, displays such a numerical keyboard 1302 on the user interface 1300 such that a user operating the terminal device can enter correction data into the field 1301 using the numerical keyboard 1302.
[0088] Returning to FIG. 5, in some embodiments, the server device 590 or the terminal device 580 can be configured to detect a possible error in the extracted first information based on speech recognition of the first voice input performed at S504. In such embodiments, the extracted first information presented to the user at the terminal device 580 is different from the first information included in the first voice input that is provided by the user to the terminal device 580. Such a possible error can be, for example, a low-confidence recognition result in the extracted first information, or a speech recognition error. As a result of detecting the possible error, the terminal device 580 can be configured to present the possible error to the user, and prompt the user to enter correction data for the first information of the payment account.
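A non-limiting sketch of flagging such a low-confidence recognition result is shown below; it assumes the speech recognizer returns a per-field confidence score, which is an assumption of the sketch.

```python
def detect_possible_error(extracted_value: str, confidence: float,
                          threshold: float = 0.85) -> dict | None:
    """Flag an extracted field whose recognition confidence falls below a threshold.

    Returns a small error record to be presented to the user (e.g., highlighted
    in the account binding interface), or None if no possible error is detected.
    """
    if confidence < threshold:
        return {"value": extracted_value,
                "confidence": confidence,
                "message": "Please check this field; it may have been recognized incorrectly."}
    return None

print(detect_possible_error("6222021234567890", confidence=0.97))  # None
print(detect_possible_error("6222021234567898", confidence=0.62))  # error record
```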
[0089] In such embodiments, the terminal device 580 can automatically enter the
modification mode without receiving a voice command including a control command (e.g., editing command) from the user. In other words, operations of S506-S509 can be skipped and the user is not prompted to check and confirm the accuracy of the extracted first information. Furthermore, in such embodiments, the terminal device 580 can be configured to display the extracted first information to the user with the possible error being highlighted in the display, or to present a separate message indicating the possible error to the user.
[0090] Additionally, in such embodiments, the terminal device 580 can be configured to automatically determine a character type associated with a possible correction of the first information as extracted from the first voice input (e.g., a possible correction of the low-confidence recognition result). The terminal device 580 can be configured to automatically select, based on the character type associated with the possible correction of the first information, a matching keyboard from a group of soft keyboards associated with the terminal device 580. The terminal device 580 can further be configured to automatically, without further user input, display such a matching keyboard on a user interface (e.g., an account binding interface) of the terminal device 580 such that the user can enter correction data using the matching keyboard.
[0091] In some embodiments, the user can be requested to provide a confirmation and/or submission command for each piece of extracted information. For example, after reviewing the extracted first information, the user provides a voice command including a submission command with respect to the extracted first information. The terminal device 580 sends the voice command to the server device 590, which extracts the submission command from the voice command using speech recognition, and then sends the submission command to the terminal device 580. The terminal device 580 then performs an operation in response to the submission command, which includes submitting, to the server device 590, a confirmation on the extracted first information of the payment account.
[0092] In some other embodiments, if the user determines that a collection of information extracted from one or multiple voice inputs is accurate (that is, identical to the original information included in the one or multiple voice inputs), or any error in the collection of extracted information has been corrected, the user provides a voice command including a submission command such that the collection of extracted information is submitted to the server device 590. In such embodiments, the collection of information can be associated with, for example, multiple fields in a form associated with setting up a payment account and/or linking the payment account to a user account. After the user provides information for each field of the form (e.g., by typing on the user interface or providing a voice input), the collection of the information can be presented to the user for verification. The user can provide a voice command including a submission command indicating that each piece of information included in the collection is confirmed and ready to be submitted. In that case, the user can submit the collection of information using a single submission command. Alternatively, each piece of information included in the collection of information can be separately presented to the user for verification. Thus, the user can provide a voice command including a submission command indicating that a single piece of information is confirmed and ready to be submitted. In that case, the user is typically not asked to submit the collection of information using a single submission command.
[0093] Specifically, at S511, the terminal device 580 receives a second voice command. At
S512, the terminal device 580 sends the second voice command to the server device 590. At S513, the server device 590 performs speech recognition of the second voice command to extract a submission command. At S514, the server device 590 sends the submission command to the terminal device 580. At S515, the terminal device 580 submits a collection of information of the payment account to the server device 590. At S516, the server device 590 stores the collection of information of the payment account received from the terminal device 580. At S517, the server device 590 binds the payment account to the user account based on the collection of information of the payment account. The operations of S515-S517 are similar to the operations of S406-S408 of the method 400 shown and described with respect to FIG. 4, and/or the corresponding operations shown and described with respect to FIGS. 1-3, thus not elaborated herein.
[0094] In some embodiments, speech recognition of a voice command (e.g., the first voice command, the second voice command) can be performed based on a set of predefined voice commands including predefined keywords such as, for example, "submit," "confirm," "send," and/or the like corresponding to a submission command; "delete," "remove," "clear," and/or the like corresponding to a deleting command; "edit," "change," "update," "modify," and/or the like corresponding to an editing command. When one of the predefined keywords is extracted from the voice command using speech recognition, then the voice command can be considered to include the corresponding command. In some embodiments, if more than one conflicting predefined keywords are recognized from the voice command, then the voice command is considered to be void or invalid. Additionally, if no predefined keyword is recognized from the voice command (e.g., due to low sound volume, due to high noise level, etc.), then the voice command is considered to be void or invalid.
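A non-limiting sketch of this keyword-based interpretation, including the conflicting-keyword and no-keyword cases, is shown below; the keyword lists mirror the examples in the preceding paragraph.

```python
COMMAND_KEYWORDS = {
    "submit": {"submit", "confirm", "send"},
    "delete": {"delete", "remove", "clear"},
    "edit":   {"edit", "change", "update", "modify"},
}

def interpret_voice_command(transcript: str) -> str | None:
    """Map a recognized voice-command transcript to a command.

    Returns None (void/invalid command) if no predefined keyword is found, or if
    keywords corresponding to more than one command type are found.
    """
    words = set(transcript.lower().split())
    matched = {cmd for cmd, keywords in COMMAND_KEYWORDS.items() if words & keywords}
    return matched.pop() if len(matched) == 1 else None

print(interpret_voice_command("please submit this"))      # submit
print(interpret_voice_command("delete and then submit"))  # None (conflicting keywords)
print(interpret_voice_command("mumble mumble"))           # None (no keyword recognized)
```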
[0095] In some embodiments, the user can be provided with an option to enter a command by manually operating on the user interface of the terminal device 580 without generating a voice command. For example, the user can indicate a submission command by generating a voice command including a submission command (e.g., long press a voice input button to activate a sound capture mode) or clicking a submission button on the user interface. For another example, the user can enter a modification mode by generating a voice command including an editing command (e.g., long press a voice input button to activate a sound capture mode) or moving the cursor into the corresponding field on the user interface.
[0096] FIG. 6 is a block diagram illustrating modules of a terminal device 600 for associating and managing payment accounts in accordance with some embodiments. The terminal device 600 can be structurally and functionally similar to the terminal devices shown and/or described with respect to FIGS. 1-5. As shown in FIG. 6, the terminal device 600 includes an acquisition module 610, a transmitting module 620, a receiving module 630 and an account management module 640. In some embodiments, the terminal device 600 can include more or less modules than those shown in FIG. 6.
[0097] In some embodiments, each module included in the terminal device 600 can be a hardware-based module (e.g., a digital signal processor (DSP), a field programmable gate array (FPGA), etc.), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc.), or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the terminal device 600 (not shown in FIG. 6) and executed at a processor (e.g., a CPU) of the terminal device 600 (not shown in FIG. 6). Overall, the acquisition module 610, the transmitting module 620, the receiving module 630 and the account management module 640 can be configured to collectively perform at least a portion of the methods 100 and 300-500 shown and described with respect to FIGS. 1 and 3-5. [0098] Specifically, the acquisition module 610 is configured to acquire voice inputs from a user operating the terminal device 600. The voice inputs acquired by the acquisition module 610 include voice inputs including information of payment accounts and voice commands associated with operations on the information of the payment account. Alternatively, in some embodiments, the acquisition module 610 can include two sub-modules, which are responsible for acquiring voice inputs including information of payment accounts and acquiring voice commands including commands, respectively. For example, as shown and described with respect to FIGS. 11-12, when the terminal device 600 is in a sound capture mode (e.g., as a result of the user long pressing a voice input button), the acquisition module 610 is configured to capture sound from the environment surrounding the terminal device 600 (e.g., using a microphone or other voice acquisition device), thus receiving the voice inputs from the user.
[0099] The transmitting module 620 is configured to transmit the voice inputs (including the voice inputs including information of payment accounts and voice commands including commands) acquired by the acquisition module 610 to a server device operatively coupled to and communicating with the terminal device 600, such that the voice inputs can be processed at the server device.
Similarly, the receiving module 630 is configured to receive extracted information and/or commands from the server device, such that the terminal device 600 can present the extracted information to the user and/or perform operations according to the commands.
[00100] The account management module 640 is configured to perform operations according to commands received from the server device. For example, as described with respect to the method 500 in FIG. 5, the account management module 640 is configured to delete, edit or submit extracted information of a payment account based on a deleting command, an editing command or a submission command received from the server device. As described above, such a command can be received by the receiving module 630, which is extracted at the server device from a voice command acquired by the acquisition module 610 and transmitted from the terminal device 600 to the server device by the transmitting module 620.
[00101] FIG. 7 is a block diagram illustrating components of a terminal device 700 for associating and managing payment accounts in accordance with some embodiments. The terminal device 700 can be structurally and functionally similar to the terminal devices shown and described above with respect to FIGS. 1-6. As shown in FIG. 7, the terminal device 700 includes a processor 701, a communication bus 702, a network interface 703, a voice acquisition device 705, and a memory 704 including programming code 706. In some embodiments, the terminal device 700 can include more or less devices, components and/or modules than those shown in FIG. 7.
[00102] The processor 701 can be any processing device capable of performing at least a portion of the methods 100-500 described with respect to FIGS. 1-5. Such a processor can be, for example, a CPU, a DSP, a FPGA, and/or the like. The processor 701 can be configured to control the operations of other components and/or modules of the terminal device 700. For example, the processor 701 can be configured to control operations of the network interface 703. For another example, the processor 701 can be configured to execute instructions or code stored in a software program or module (e.g., the program code 706) within the memory 704.
[00103] The communication bus 702 is configured to implement connections and
communication among the other components of the terminal device 700. The network interface 703 is configured to provide and control network interfaces of the terminal device 700 that are used to interact with other network devices (e.g., server devices). The network interface 703 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface). In some embodiments, the network interface 703 is used for connecting one or more server devices and performing data communication with the one or more server devices. In some embodiments, operations of the network interface 703 are controlled by instructions or code stored in the memory 704 (e.g., the program code 706).
[00104] The voice acquisition device 705 is configured to acquire voice inputs and/or voice commands from a user operating the terminal device 700. Such a voice acquisition device 705 can be, for example, a microphone or any other suitable sound capture device. The voice acquisition device 705 can be controlled by, for example, an acquisition module similar to the acquisition module 610 of the terminal device 600 in FIG. 6.
[00105] In some embodiments, the memory 704 can include, for example, a random-access memory (RAM) (e.g., a DRAM, a SRAM, a DDR RAM, etc.), a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some embodiments, the memory 704 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the terminal device 700.
[00106] In some embodiments, each component, program, application or module (e.g., the program code 706) included in the memory 704 can be a hardware-based module (e.g., a DSP, a FPGA), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor), or a combination of hardware and software modules. Instructions or code of each component, program, application or module can be stored in the memory 704 and executed at the processor 701. In some embodiments, components, modules and devices of the terminal device 700 are configured to collectively perform at least a portion of the methods 100 and 300-500 shown and described above with respect to FIGS. 1 and 3-5.
[00107] FIG. 8 is a block diagram illustrating modules of a server device 800 for associating and managing payment accounts in accordance with some embodiments. The server device 800 can be structurally and functionally similar to the server devices shown and/or described with respect to FIGS. 1-5. As shown in FIG. 8, the server device 800 includes a receiving module 810, a speech recognition module 820, a voice authentication module 840, an account management module 830 and a transmitting module 850. In some embodiments, the server device 800 can include more or less modules than those shown in FIG. 8.
[00108] In some embodiments, each module included in the server device 800 can be a hardware-based module (e.g., a DSP, a FPGA, etc.), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor, etc.), or a combination of hardware and software modules. Instructions or code of each module can be stored in a memory of the server device 800 (not shown in FIG. 8) and executed at a processor (e.g., a CPU) of the server device 800 (not shown in FIG. 8). Overall, the receiving module 810, the speech recognition module 820, the voice authentication module 840, the account management module 830 and the transmitting module 850 can be configured to collectively perform at least a portion of the methods 200-500 shown and described with respect to FIGS. 2-5.
[00109] Specifically, the receiving module 810 is configured to receive voice inputs and/or voice commands from terminal devices operatively coupled to and communicating with the server device 800. The voice inputs include information associated with payment accounts, and the voice commands include commands provided by users of the terminal devices. Similarly, the transmitting module 850 is configured to transmit the extracted information of payment accounts and extracted commands to the terminal devices, such that the extracted information of payment accounts can be presented to users of the terminal devices for verification, and the extracted commands can be performed at the terminal devices.
[00110] The voice authentication module 840 is configured to perform user authentication on the voice inputs and/or voice commands received by the receiving module 810. In some
embodiments, the user authentication can be performed based on a voiceprint of a voice input and stored voice authentication information for a user account. The speech recognition module 820 is configured to perform speech recognition on voice inputs and/or voice commands to extract information and/or commands included in the voice inputs and/or the voice commands. In some embodiments, speech recognition of a voice input or voice command is performed when the user authentication on that voice input or voice command is successful. In some other embodiments, speech recognition is performed on a voice input or voice command without user authentication being performed on that voice input or voice command.
[00111] The account management module 830 is configured to manage payment accounts and user accounts. The account management module 830 is also configured to bind a payment account to a user account if the two accounts are determined to be associated with the same user. Thus, the account management module 830 can be configured to automatically retrieve information of the payment account based on the user account without requesting the user to provide any information of the payment account. In some embodiments, the account management module 830 is configured to maintain and manage a database storing information of the payment accounts and the user accounts.
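A non-limiting sketch of how the server-side modules described above might be chained for a single voice input follows; the callables stand in for the voice authentication module 840, the speech recognition module 820 and the account management module 830, and are assumptions of the sketch.

```python
def handle_voice_input(user_account: str, voice_input: bytes,
                       authenticate, recognize, bind) -> str:
    """Illustrative server-side pipeline: authenticate, recognize, then bind.

    `authenticate`, `recognize`, and `bind` are assumed callables standing in for
    the voice authentication, speech recognition, and account management modules.
    """
    if not authenticate(user_account, voice_input):
        return "authentication_failed"        # abort; nothing is extracted
    payment_info = recognize(voice_input)     # extract payment-account information
    bind(user_account, payment_info)          # bind payment account to user account
    return "bound"

# Example usage with trivial stand-in callables.
result = handle_voice_input(
    "user-42", b"...audio...",
    authenticate=lambda user, audio: True,
    recognize=lambda audio: {"card_number": "6222021234567890"},
    bind=lambda user, info: None,
)
print(result)  # bound
```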
[00112] FIG. 9 is a block diagram illustrating components of a server device 900 for associating and managing payment accounts in accordance with some embodiments. The server device 900 can be structurally and functionally similar to the server devices shown and described above with respect to FIGS. 1-5 and FIG. 8. As shown in FIG. 9, the server device 900 includes a processor 901, a communication bus 902, a network interface 903 and a memory 904 including programming code 905. In some embodiments, the server device 900 can include more or less devices, components and/or modules than those shown in FIG. 9.
[00113] The processor 901 can be any processing device capable of performing at least a portion of the methods 100-500 described with respect to FIGS. 1-5. Such a processor can be, for example, a CPU, a DSP, a FPGA, and/or the like. The processor 901 can be configured to control the operations of other components and/or modules of the server device 900. For example, the processor 901 can be configured to control operations of the network interface 903. For another example, the processor 901 can be configured to execute instructions or code stored in a software program or module (e.g., the program code 905) within the memory 904.
[00114] The communication bus 902 is configured to implement connections and
communication among the other components of the server device 900. The network interface 903 is configured to provide and control network interfaces of the server device 900 that are used to interact with other network devices (e.g., terminal devices). The network interface 903 can include, for example, a standard wired interface and/or a standard wireless interface (e.g., a Wi-Fi interface). In some embodiments, the network interface 903 is used for connecting one or more terminal devices and performing data communication with the one or more terminal devices. In some embodiments, operations of the network interface 903 are controlled by instructions or code stored in the memory 904 (e.g., the program code 905).
[00115] In some embodiments, the memory 904 can include, for example, a random-access memory (RAM) (e.g., a DRAM, a SRAM, a DDR RAM, etc.), a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some embodiments, the memory 904 can include one or more storage devices (e.g., a removable memory) remotely located from other components of the server device 900.
[00116] In some embodiments, each component, program, application or module (e.g., the program code 905) included in the memory 904 can be a hardware-based module (e.g., a DSP, a FPGA), a software-based module (e.g., a module of computer code executed at a processor, a set of processor-readable instructions executed at a processor), or a combination of hardware and software modules. Instructions or code of each component, program, application or module can be stored in the memory 904 and executed at the processor 901. In some embodiments, components, modules and devices of the server device 900 are configured to collectively perform at least a portion of the methods 200-500 shown and described above with respect to FIGS. 2-5.
[00117] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the present application to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present application and its practical applications, to thereby enable others skilled in the art to best utilize the present application and various embodiments with various modifications as are suited to the particular use contemplated.
[00118] While particular embodiments are described above, it will be understood it is not intended to limit the present application to these particular embodiments. On the contrary, the present application includes alternatives, modifications and equivalents that are within the spirit and scope of the appended claims. Numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[00119] The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in the description of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises," and/or "comprising," when used in this specification, specify the presence of stated features, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, operations, elements, components, and/or groups thereof.
[00120] As used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in accordance with a determination" or "in response to detecting," that a stated condition precedent is true, depending on the context. Similarly, the phrase "if it is determined [that a stated condition precedent is true]" or "if [a stated condition precedent is true]" or "when [a stated condition precedent is true]" may be construed to mean "upon determining" or "in response to determining" or "in accordance with a determination" or "upon detecting" or "in response to detecting" that the stated condition precedent is true, depending on the context.
[00121] Although some of the various drawings illustrate a number of logical stages in a particular order, stages that are not order dependent may be reordered and other stages may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be obvious to those of ordinary skill in the art and so do not present an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.

Claims
1. A method of associating a payment account with a user account, comprising:
at a terminal device having one or more processors and memory for storing programs to be executed by the one or more processors:
through an account binding interface associated with the user account, receiving a first voice input from a user, the first voice input including first information of the payment account;
performing user authentication and information extraction based on the first voice input, wherein the user authentication is based on a voiceprint of the first voice input, and the information extraction is based on speech recognition of the first voice input; and
providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input.
2. The method of claim 1, wherein performing the user authentication and the information extraction further comprises:
sending the first voice input to a server device such that the server device (1) performs the user authentication based on the voiceprint of the first voice input and stored voice authentication information for the user account, and (2) performs the speech recognition of the first voice input to extract the first information of the payment account if the user authentication is successful.
3. The method of claim 2, wherein the voice authentication information is provided to the server device prior to the receiving the first voice input.
4. The method of claim 1, further comprising:
receiving a second voice input from the user, the second voice input including second information of the payment account, the second information being different from the first
information with respect to a predetermined characteristic; and
in accordance with the second information being different from the first information with respect to the predetermined characteristic, performing information extraction based on the second voice input without performing user authentication based on the second voice input.
5. The method of claim 4, wherein the predetermined characteristic indicates whether respective information contained in a respective voice input meets a predetermined criticality level.
6. The method of claim 4, wherein the predetermined characteristic indicates whether respective voice input is an initial voice input received through the account binding interface.
7. The method of claim 1, further comprising:
presenting the first information of the payment account as extracted from the first voice input to the user through the account binding interface.
8. The method of claim 7, wherein the presenting the first information of the payment account as extracted from the first voice input includes at least one of (1) displaying a textual message including the first information of the payment account and (2) providing a text-to-speech output presenting the first information of the payment account as extracted from the first voice input.
9. The method of claim 7, further comprising:
receiving a voice command of the user associated with the first information of the payment account presented to the user; and
performing, according to the voice command, an operation associated with the first information of the payment account presented to the user.
10. The method of claim 9, wherein performing the operation includes one of (1) submitting, to a server device, a confirmation on the first information of the payment account presented to the user, (2) deleting the first information of the payment account presented to the user, and (3) entering a modification mode for modifying the first information of the payment account presented to the user.
11. The method of claim 1, further comprising:
detecting a possible error in the information extraction based on speech recognition of the first voice input; and presenting the possible error to the user.
12. The method of claim 11, further comprising:
based on a character type associated with the first information as extracted from the first voice input, automatically selecting a matching keyboard for the character type from a plurality of soft keyboards associated with the terminal device; and
automatically, without further user input, displaying the automatically selected matching keyboard to prompt the user to enter correction data for the first information of the payment account presented to the user.
13. The method of claim 11, wherein the possible error is a low-confidence recognition result in the first information as extracted from the first voice input, and wherein the method further comprises:
based on a character type associated with a possible correction of the low-confidence recognition result, automatically selecting a matching keyboard for the character type from a plurality of soft keyboards associated with the terminal device; and
automatically, without further user input, displaying the automatically selected matching keyboard to prompt the user to enter correction data for the first information of the payment account presented to the user.
14. A terminal device, comprising:
one or more processors; and
memory storing one or more programs to be executed by the one or more processors, the one or more programs comprising instructions for: through an account binding interface associated with a user account, receiving a first voice input from a user, the first voice input including first information of a payment account;
performing user authentication and information extraction based on the first voice input, wherein the user authentication is based on a voiceprint of the first voice input, and the information extraction is based on speech recognition of the first voice input; and
providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input.
15. The terminal device of claim 14, wherein performing the user authentication and the information extraction includes:
sending the first voice input to a server device such that the server device (1) performs the user authentication based on the voiceprint of the first voice input and stored voice authentication information for the user account, and (2) performs the speech recognition of the first voice input to extract the first information of the payment account if the user authentication is successful.
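To illustrate the division of work in claim 15, the sketch below has the server verify the voiceprint first and run speech recognition only if verification succeeds; VoiceprintVerifier and SpeechRecognizer are hypothetical interfaces, and the transcription returned in main is made-up sample data.

    import java.util.Optional;

    // Illustrative sketch only; VoiceprintVerifier and SpeechRecognizer are hypothetical
    // interfaces, and the transcription in main is made-up sample data.
    public class ServerSideBinding {

        interface VoiceprintVerifier {
            boolean matches(byte[] voiceInput, String userAccountId);
        }

        interface SpeechRecognizer {
            String transcribe(byte[] voiceInput);
        }

        // Step (1): verify the voiceprint against stored voice authentication information.
        // Step (2): run speech recognition only if the verification succeeded.
        static Optional<String> authenticateAndExtract(byte[] voiceInput,
                                                       String userAccountId,
                                                       VoiceprintVerifier verifier,
                                                       SpeechRecognizer recognizer) {
            if (!verifier.matches(voiceInput, userAccountId)) {
                return Optional.empty();   // authentication failed; nothing is extracted
            }
            return Optional.of(recognizer.transcribe(voiceInput));
        }

        public static void main(String[] args) {
            VoiceprintVerifier alwaysMatches = (audio, account) -> true;
            SpeechRecognizer fakeRecognizer = audio -> "card number 6222 0212 3456 1234";
            System.out.println(authenticateAndExtract(new byte[0], "user-123",
                    alwaysMatches, fakeRecognizer));
        }
    }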
16. The terminal device of claim 14, wherein the one or more programs further comprise instructions for:
presenting the first information of the payment account as extracted from the first voice input to the user through the account binding interface, wherein the presenting includes at least one of (1) displaying a textual message including the first information of the payment account and (2) providing a text-to-speech output presenting the first information of the payment account as extracted from the first voice input.
17. The terminal device of claim 14, wherein the one or more programs further comprise instructions for:
presenting the first information of the payment account as extracted from the first voice input to the user through the account binding interface;
receiving a voice command of the user associated with the first information of the payment account presented to the user; and
performing, according to the voice command, an operation associated with the first information of the payment account presented to the user.
18. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by one or more processors, cause the processors to perform operations comprising:
at a terminal device:
through an account binding interface associated with a user account, receiving a first voice input from a user, the first voice input including first information of a payment account;
performing user authentication and information extraction based on the first voice input, wherein the user authentication is based on a voiceprint of the first voice input, and the information extraction is based on speech recognition of the first voice input; and
providing an indication of the payment account being bound to the user account based on at least a success of the user authentication and the information extraction based on the first voice input.
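Viewed end to end, the terminal-side flow of claims 14 and 18 is: capture the voice input through the account binding interface, have the user authentication and information extraction performed (for example by a backend, as in claim 15), and indicate that the payment account is bound only if both steps succeed. The sketch below assumes a hypothetical BindingBackend interface for that delegation; it is not the patented system's API.

    import java.util.Optional;

    // Illustrative sketch only; BindingBackend is a hypothetical interface standing in
    // for whatever local or server-side component performs authentication and extraction.
    public class AccountBindingFlow {

        interface BindingBackend {
            // Present iff both voiceprint authentication and information extraction succeed.
            Optional<String> authenticateAndExtract(byte[] voiceInput, String userAccountId);
        }

        // Provide an indication of the payment account being bound only on success.
        static String bind(byte[] voiceInput, String userAccountId, BindingBackend backend) {
            return backend.authenticateAndExtract(voiceInput, userAccountId)
                    .map(info -> "Payment account (" + info + ") is now bound to " + userAccountId)
                    .orElse("Binding failed: voice authentication or recognition was unsuccessful");
        }

        public static void main(String[] args) {
            BindingBackend stub = (audio, account) -> Optional.of("card ending in 1234");
            System.out.println(bind(new byte[0], "user-123", stub));
        }
    }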
19. The non-transitory computer readable storage medium of claim 18, wherein the operations further comprise:
detecting a possible error in the information extraction based on speech recognition of the first voice input; and
presenting the possible error to the user.
20. The non-transitory computer readable storage medium of claim 18, wherein the operations further comprise:
based on a character type associated with the first information as extracted from the first voice input, automatically selecting a matching keyboard for the character type from a plurality of soft keyboards associated with the terminal device; and
automatically, without further user input, displaying the automatically selected matching keyboard to prompt the user to enter correction data for the first information of the payment account presented to the user.
PCT/CN2014/085566 2013-12-24 2014-08-29 Method, device and system for associating and managing payment accounts WO2015096503A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310727165.4 2013-12-24
CN201310727165.4A CN104735634B (en) 2013-12-24 2013-12-24 Associated payment account management method, mobile terminal, server and system

Publications (1)

Publication Number Publication Date
WO2015096503A1 (en) 2015-07-02

Family

ID=53458980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/085566 WO2015096503A1 (en) 2013-12-24 2014-08-29 Method, device and system for associating and managing payment accounts

Country Status (4)

Country Link
CN (1) CN104735634B (en)
HK (1) HK1207237A1 (en)
TW (1) TW201525894A (en)
WO (1) WO2015096503A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106713111B (en) * 2015-11-17 2020-04-07 腾讯科技(深圳)有限公司 Processing method for adding friends, terminal and server
CN105893554A (en) * 2016-03-31 2016-08-24 广东小天才科技有限公司 Wearable device friend making method and system
CN106209604A (en) * 2016-08-26 2016-12-07 北京小米移动软件有限公司 Add the method and device of good friend
CN106648308A (en) * 2016-11-17 2017-05-10 北京小度信息科技有限公司 Interface display method and information input method and device
CN109428804B (en) * 2017-08-28 2021-07-27 腾讯科技(深圳)有限公司 Account management method and device
CN108196812A (en) * 2017-12-26 2018-06-22 广东美的厨房电器制造有限公司 A kind of cooking equipment menu control method, device and cooking equipment
CN109146492B (en) * 2018-07-24 2020-09-11 吉利汽车研究院(宁波)有限公司 Vehicle-end mobile payment device and method
CN111429143A (en) * 2019-01-10 2020-07-17 上海小蚁科技有限公司 Transfer method, device, storage medium and terminal based on voiceprint recognition
CN114500426B (en) * 2020-10-26 2023-07-14 腾讯科技(深圳)有限公司 Message reminding method, device, computer equipment and storage medium
CN112835900A (en) * 2021-02-01 2021-05-25 深圳市科荣软件股份有限公司 Rural sewage intelligent operation system and method, electronic equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7200555B1 (en) * 2000-07-05 2007-04-03 International Business Machines Corporation Speech recognition correction for devices having limited or no display
US20090070109A1 (en) * 2007-09-12 2009-03-12 Microsoft Corporation Speech-to-Text Transcription for Personal Communication Devices
CN103188238B (en) * 2011-12-30 2017-11-07 上海博泰悦臻电子设备制造有限公司 The Activiation method and system of payment accounts
CN103366743A (en) * 2012-03-30 2013-10-23 北京千橡网景科技发展有限公司 Voice-command operation method and device
CN102708867A (en) * 2012-05-30 2012-10-03 北京正鹰科技有限责任公司 Method and system for identifying faked identity by preventing faked recordings based on voiceprint and voice
US8543397B1 (en) * 2012-10-11 2013-09-24 Google Inc. Mobile device voice activation
CN102930868A (en) * 2012-10-24 2013-02-13 北京车音网科技有限公司 Identity recognition method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002077790A2 (en) * 2001-03-22 2002-10-03 Canon Kabushiki Kaisha Information processing apparatus and method, and program
WO2012129231A1 (en) * 2011-03-21 2012-09-27 Apple Inc. Device access using voice authentication
CN103377652A (en) * 2012-04-25 2013-10-30 上海智臻网络科技有限公司 Method, device and equipment for carrying out voice recognition
CN103390232A (en) * 2013-07-29 2013-11-13 西安工程大学 Method for paying accounts based on mobile phone

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170139650A (en) * 2015-11-17 2017-12-19 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Method for adding accounts, terminals, servers, and computer storage media
KR102081495B1 (en) * 2015-11-17 2020-02-25 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 How to add accounts, terminals, servers, and computer storage media
CN106257518A (en) * 2016-07-25 2016-12-28 四川易想电子商务有限公司 A kind of password method of payment
CN106485499A (en) * 2016-10-24 2017-03-08 安徽百慕文化科技有限公司 One kind is based on voice-operated on-line payment system
US11551219B2 (en) * 2017-06-16 2023-01-10 Alibaba Group Holding Limited Payment method, client, electronic device, storage medium, and server
US10990944B2 (en) * 2019-09-25 2021-04-27 Cameron May Methods and systems for relaying a payment card detail during a telephone call between a customer's telephone and a vendor's telephone
CN111612470A (en) * 2020-06-05 2020-09-01 中国银行股份有限公司 Mobile banking transaction method and system

Also Published As

Publication number Publication date
TW201525894A (en) 2015-07-01
CN104735634A (en) 2015-06-24
CN104735634B (en) 2019-06-25
HK1207237A1 (en) 2016-01-22

Similar Documents

Publication Publication Date Title
WO2015096503A1 (en) Method, device and system for associating and managing payment accounts
US9361891B1 (en) Method for converting speech to text, performing natural language processing on the text output, extracting data values and matching to an electronic ticket form
US11275728B2 (en) Processing method and device of the user input information
US11777918B2 (en) Dynamic and cryptographically secure augmentation of participants in programmatically established chatbot sessions
US10270736B2 (en) Account adding method, terminal, server, and computer storage medium
US10673786B2 (en) Artificial intelligence system for automatically generating custom travel documents
US11711326B2 (en) Bot group messaging method
WO2016165590A1 (en) Speech translation method and device
US11404052B2 (en) Service data processing method and apparatus and related device
US10930288B2 (en) Mobile device for speech input and text delivery
US11757870B1 (en) Bi-directional voice authentication
US10965623B2 (en) Shared and per-user bot group messaging method
CN106713111B (en) Processing method for adding friends, terminal and server
US20160080558A1 (en) Electronic device and method for displaying phone call content
US20220157663A1 (en) Bot group messaging using bot-specific voice libraries
WO2016188456A1 (en) Screen capture method and apparatus, and mobile terminal
WO2018166367A1 (en) Real-time prompt method and device in real-time conversation, storage medium, and electronic device
CN112767936A (en) Voice conversation method, device, storage medium and electronic equipment
WO2016169364A1 (en) Method and apparatus for automatically processing self-service voice service, and mobile terminal
US20180278556A1 (en) Bot group messaging using general voice libraries
KR101968287B1 (en) Apparatus and method for providing transaction of an intellectual property service
US20190179902A1 (en) Systems and methods for task automation using natural language processing
CN113852694B (en) Message pushing system and pushing method for multi-terminal access client system
CN111968630B (en) Information processing method and device and electronic equipment
US20230328143A1 (en) Communication platform shifting for voice-enabled device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14875454

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the EP bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.01.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 14875454

Country of ref document: EP

Kind code of ref document: A1