DK201770505A1 - User interfaces for peer-to-peer transfers - Google Patents

User interfaces for peer-to-peer transfers


Publication number
DK201770505A1
Authority
DK
Denmark
Prior art keywords
item
user
messages
message
participant
Prior art date
Application number
DKPA201770505A
Other languages
Danish (da)
Inventor
Van Os Marcel
D. Anton Peter
W. Dryer Allison
Original Assignee
Apple Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc. filed Critical Apple Inc.
Priority to CN202011206499.3A priority Critical patent/CN112150133B/en
Priority to CN202310634790.8A priority patent/CN116521302A/en
Priority to EP18730556.0A priority patent/EP3586481B1/en
Priority to KR1020227019902A priority patent/KR102495947B1/en
Priority to KR1020217035417A priority patent/KR102372228B1/en
Priority to KR1020247004706A priority patent/KR20240023212A/en
Priority to KR1020237036172A priority patent/KR102636696B1/en
Priority to KR1020197033768A priority patent/KR102154850B1/en
Priority to KR1020227007288A priority patent/KR102409769B1/en
Priority to CN202010174749.3A priority patent/CN111490926B/en
Priority to CN202210023470.4A priority patent/CN114363278B/en
Priority to KR1020237003678A priority patent/KR102594156B1/en
Priority to JP2019572834A priority patent/JP6983261B2/en
Priority to CN202210639919.XA priority patent/CN114936856A/en
Priority to PCT/US2018/033054 priority patent/WO2018213508A1/en
Priority to KR1020217011434A priority patent/KR102321894B1/en
Priority to CN201880048209.1A priority patent/CN110999228A/en
Priority to EP20204436.8A priority patent/EP3800837B1/en
Priority to EP23190272.7A priority patent/EP4250679A3/en
Priority to AU2018269512A priority patent/AU2018269512B2/en
Priority to KR1020207025711A priority patent/KR102243500B1/en
Publication of DK201770505A1 publication Critical patent/DK201770505A1/en
Priority to AU2020202953A priority patent/AU2020202953B2/en
Priority to JP2021157213A priority patent/JP2022000802A/en
Priority to AU2021290334A priority patent/AU2021290334B2/en
Priority to AU2023203197A priority patent/AU2023203197B2/en
Priority to JP2023138172A priority patent/JP2023169179A/en


Classifications

    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F3/0484: GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/16: Sound input; Sound output
    • G06F3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06F1/32: Means for saving power
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • G06Q20/10: Payment architectures specially adapted for electronic funds transfer [EFT] systems or for home banking systems
    • G06Q20/102: Bill distribution or payments
    • G06Q20/322: Aspects of commerce using mobile devices [M-devices]
    • G06Q20/386: Payment protocols using messaging services or messaging apps
    • G06Q20/40: Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; review and approval of payers
    • H04L51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/42: Mailbox-related aspects, e.g. synchronisation of mailboxes
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04M1/725: Cordless telephones
    • H04M1/72412: User interfaces for cordless or mobile telephones supporting applications that interface with external accessories using two-way short-range wireless interfaces
    • H04M1/72436: User interfaces for cordless or mobile telephones with interactive means for internal management of messages, e.g. for text messaging such as SMS or e-mail
    • H04M2250/74: Telephonic subscriber devices with voice recognition means
    • H04W4/12: Messaging; Mailboxes; Announcements

Abstract

The present disclosure generally relates to user interfaces for managing peer-to-peer transfers. In some examples, a device provides user interfaces for initiating and managing transfers. In some examples, a device provides user interfaces corresponding to completed transfers. In some examples, a device provides user interfaces for providing visually distinguishable message object appearances based on message designation. In some examples, a device provides user interfaces for activating accounts for accepting and sending transfers. In some examples, a device provides user interfaces for exchanging accounts for use in a transfer. In some examples, a device provides user interfaces for splitting transfers between two or more accounts. In some examples, a device provides user interfaces for generating and displaying a transfers history list. In some examples, a device provides user interfaces for voice-activation of transfers. In some examples, a device provides visual or haptic feedback corresponding to a transfer operation.

Description

USER INTERFACES FOR PEER-TO-PEER TRANSFERS
CROSS-REFERENCE TO RELATED APPLICATIONS [0001] This application claims priority to U.S. Provisional Patent Application Serial No. 62/507,161, titled “USER INTERFACES FOR PEER-TO-PEER TRANSFERS,” filed on May 16, 2017, and to U.S. Provisional Patent Application Serial No. 62/514,945, titled “USER INTERFACES FOR PEER-TO-PEER TRANSFERS,” filed on June 4, 2017. The contents of these applications are hereby incorporated by reference in their entireties for all purposes.
FIELD [0002] The present disclosure relates generally to computer user interfaces, and more specifically to interfaces and techniques for managing peer-to-peer transfers.
BACKGROUND [0003] Peer-to-peer transfers, such as transfers of resources and files, using electronic devices are a convenient and efficient method of exchanging resources and files. Peer-to-peer transfers enable a user, using an electronic device, to quickly and easily send an outgoing transfer and to quickly and easily accept an incoming transfer.
BRIEF SUMMARY [0004] Some techniques for managing peer-to-peer transfers using electronic devices, however, are generally cumbersome and inefficient. For example, some existing techniques require the use of certain applications that may not be commonly used by a user of a device, which may unnecessarily cause the user to open a seldom-used application. For another example, some existing techniques have limited options for making and receiving transfers. For another example, some existing techniques use a complex and time-consuming user interface, which may include multiple key presses or keystrokes. As such, existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.
DK 2017 70505 A1

[0005] Accordingly, the present technique provides electronic devices with faster, more efficient methods and interfaces for managing peer-to-peer transfers. Such methods and interfaces optionally complement or replace other methods for managing peer-to-peer transfers. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated computing devices, such methods and interfaces conserve power and increase the time between battery charges. Such methods and interfaces also reduce the number of unnecessary, extraneous, or repetitive inputs required at computing devices, such as smartphones and smartwatches.
[0006] In accordance with some embodiments, a method performed at an electronic device with a display, one or more input devices, and a wireless communication radio is described. The method comprises: receiving, via the wireless communication radio, one or more messages; displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants; while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message; in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item; while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
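The determination described in [0006] — analyzing the text of an incoming message to decide whether it relates to a transfer of an item type the messaging application can handle — can be sketched as a simple classifier. This is a minimal illustrative sketch only, assuming currency amounts as the item type and a keyword heuristic; the pattern, keywords, and `transfer_intent` API are hypothetical, not the implementation described in the disclosure.

```python
import re
from typing import Optional

# Hypothetical pattern for the first type of item (a currency amount).
AMOUNT_PATTERN = re.compile(r"(?:\$|€|£)\s?\d+(?:\.\d{2})?")

# Transfer-like verbs required near the amount, to reduce false positives.
TRANSFER_KEYWORDS = ("send", "pay", "owe", "request")

def transfer_intent(message_text: str) -> Optional[dict]:
    """Return details for a selectable transfer indication, or None.

    A non-None result corresponds to displaying, alongside the message
    bubble, a selectable indication for the detected item type; activating
    it would open the transfer user interface pre-filled with the amount.
    """
    match = AMOUNT_PATTERN.search(message_text)
    if match is None:
        return None
    lowered = message_text.lower()
    if not any(kw in lowered for kw in TRANSFER_KEYWORDS):
        return None
    return {"item_type": "currency", "amount": match.group(0)}

print(transfer_intent("Can you send me $28.50 for dinner?"))
# → {'item_type': 'currency', 'amount': '$28.50'}
```

A message that fails either check is displayed normally, without the selectable indication, matching the conditional display logic of the claim.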
[0007] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, one or more input devices, and a wireless communication radio, the one or more programs including instructions for: receiving, via the wireless communication radio, one or more messages; displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants; while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message; in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item; while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
[0008] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, one or more input devices, and a wireless communication radio, the one or more programs including instructions for: receiving, via the wireless communication radio, one or more messages; displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants; while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message; in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item; while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
[0009] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; a wireless communication radio; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving, via the wireless communication radio, one or more messages; displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants; while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message; in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item; while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
[0010] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; a wireless communication radio; means for receiving, via the wireless communication radio, one or more messages; means for displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants; means, while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, for receiving, from one of the participants, a respective message; means, in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, for concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item; means, while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, for detecting, via the one or more input devices, user activation of the selectable indication; and means, in response to detecting the user activation of the selectable indication, for displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
[0011] In accordance with some embodiments, a method performed at an electronic device with a display and one or more sensor devices is described. The method comprises: displaying, on the display, a graphical representation of a communication; while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display: in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
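The method of [0011] gates orientation-driven feedback on the state of the displayed communication: in a first state, feedback is output whose magnitude tracks how far the device has tilted; in a second state, the representation is displayed with no such feedback. The following is a minimal sketch of that gating logic only; the state names ("pending"/"redeemed"), the 45° normalization cap, and the `feedback_magnitude` function are illustrative assumptions, not the disclosure's implementation.

```python
from dataclasses import dataclass

@dataclass
class Communication:
    # "pending" stands in for the first state; any other value for the second.
    state: str

def feedback_magnitude(comm: Communication, orientation_delta_deg: float) -> float:
    """Return feedback intensity in [0, 1]; 0.0 means no feedback is output."""
    if comm.state != "pending":
        # Second state: display the representation without orientation feedback.
        return 0.0
    # First state: feedback scales with the orientation change relative to the
    # reference point, saturating at an assumed 45-degree tilt.
    return min(abs(orientation_delta_deg) / 45.0, 1.0)

print(feedback_magnitude(Communication("pending"), 22.5))   # → 0.5
print(feedback_magnitude(Communication("redeemed"), 22.5))  # → 0.0
```

The returned intensity could drive either channel named in the abstract (visual shimmer or haptic strength); the key property, per the claim, is that it is a function of the magnitude of the orientation change and is suppressed entirely in the second state.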
[0012] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for: displaying, on the display, a graphical representation of a communication; while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display: in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
[0013] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for: displaying, on the display, a graphical representation of a communication; while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display: in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the
communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
[0014] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more sensor devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, on the display, a graphical representation of a communication; while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display: in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
[0015] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more sensor devices; means for displaying, on the display, a graphical representation of a communication; means, while displaying the graphical representation of the communication on the display, for detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and means, in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display, for: in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the
feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
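The state-dependent feedback behavior recited in paragraphs [0012]-[0015] can be illustrated with a minimal sketch. The names below (the enum, its cases, and the function) are hypothetical and chosen for illustration only; they are not part of the described embodiments.

```python
# Hypothetical model of the state-dependent feedback logic; all names
# are illustrative and not taken from the specification.
from enum import Enum
from typing import Optional


class CommunicationState(Enum):
    FIRST = "first"    # e.g. a communication in the first state
    SECOND = "second"  # e.g. a communication in the second state


def feedback_magnitude(state: CommunicationState,
                       orientation_delta: float) -> Optional[float]:
    """Return the magnitude of feedback to output for a detected change
    in device orientation, or None when the communication's state means
    the representation is displayed without such feedback."""
    if state is CommunicationState.FIRST:
        # First state: feedback indicates the magnitude of the change.
        return abs(orientation_delta)
    # Second state: the representation is displayed without
    # magnitude-indicating feedback.
    return None
```

In this reading, the same orientation change produces feedback whose magnitude tracks the change for one state, and no feedback at all for the other.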
[0016] In accordance with some embodiments, a method performed at an electronic device with a display and one or more input devices is described. The method comprises: displaying, on the display, a numerical value selection user interface; while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface; in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface; while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and: in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging 
application different from the first message object, wherein, in the second message object: the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
[0017] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: displaying, on the display, a numerical value selection user interface; while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface; in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface; while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and: in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance
with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object: the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
[0018] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a
display and one or more input devices, the one or more programs including instructions for: displaying, on the display, a numerical value selection user interface; while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface; in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface; while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and: in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in
the second message object: the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
[0019] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, on the display, a numerical value selection user interface; while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a
respective numerical value from a plurality of numerical values in the numerical value selection interface; in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface; while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and: in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object: the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
[0020] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; means for displaying, on the display, a numerical value selection user interface; means, while displaying the numerical value selection user interface, for receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface; means, in response to receiving the input that corresponds to the selection of the respective numerical value, for displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface; means, while displaying the representation of the respective numerical value in the numerical
value selection user interface, for receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and means, in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, for sending the message that corresponds to the respective numerical value to one or more participants, and: means, in accordance with a determination that the message is designated as a transmission message for the respective numerical value, for displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and means, in accordance with a determination that the message is designated as a request message for the respective numerical value, for displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object: the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
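The designation-dependent appearance recited in paragraphs [0016]-[0020] can be sketched as a simple mapping from a message's designation to the two styling decisions: whether the numerical value is drawn in the respective font, and whether the predetermined request indicator is shown. The class and function names below are illustrative, not from the specification.

```python
# Hypothetical sketch of how a sent message's designation selects
# between the two message-object appearances; all names are illustrative.
from dataclasses import dataclass
from enum import Enum


class Designation(Enum):
    TRANSMISSION = "transmission"  # sends the respective numerical value
    REQUEST = "request"            # requests the respective numerical value


@dataclass(frozen=True)
class MessageObjectStyle:
    value_in_respective_font: bool  # value drawn in the larger, respective font
    shows_request_indicator: bool   # predetermined indicator in that font


def message_object_style(designation: Designation) -> MessageObjectStyle:
    if designation is Designation.TRANSMISSION:
        # First message object: the value itself appears in the
        # respective font; no request indicator is shown.
        return MessageObjectStyle(value_in_respective_font=True,
                                  shows_request_indicator=False)
    # Second message object: the value appears in a smaller font, and
    # the request indicator appears in the respective font.
    return MessageObjectStyle(value_in_respective_font=False,
                              shows_request_indicator=True)
```

The two branches correspond to the first and second message objects: the transmission message emphasizes the value itself, while the request message shrinks the value and promotes the request indicator into the respective font.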
[0021] In accordance with some embodiments, a method performed at an electronic device with a display and one or more input devices is described. The method comprises: displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device; while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items: in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first content, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
[0022] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device; while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items: in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first content, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
[0023] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device; while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items: in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first content, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
[0024] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device; while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items: in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first content, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
[0025] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; means for displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device; means, while displaying at least a portion of the message conversation, for detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and means, in response to detecting the input that corresponds to the request to obtain the first one or more items, for: in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first content, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
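The account-activation gate recited in paragraphs [0021]-[0025] reduces to a single determination at the time the obtain input is detected. The sketch below models that determination; the outcome names and function signature are hypothetical.

```python
# Hypothetical sketch of the account-activation gate; names illustrative.
from enum import Enum


class ObtainOutcome(Enum):
    PROCEED = "proceed_to_obtain_items"
    SHOW_ACTIVATION = "show_activation_affordance"


def handle_obtain_request(has_activated_authorized_account: bool) -> ObtainOutcome:
    """Decide how to respond to an input requesting the first one or
    more items indicated in the message object."""
    if has_activated_authorized_account:
        # The device is associated with an activated, authorized
        # account: proceed to obtain the items.
        return ObtainOutcome.PROCEED
    # Otherwise, display an affordance for activating such an account.
    return ObtainOutcome.SHOW_ACTIVATION
```

Under this reading, the items are never obtained without an activated, authorized account; the fallback path instead surfaces the activation affordance.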
[0026] In accordance with some embodiments, a method performed at an electronic device with a display, a wireless transmission device, and one or more input devices is described. The method comprises: receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device; in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display: a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account; detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
[0027] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, a wireless transmission device, and one or more input devices, the one or more programs including instructions for: receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device; in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display: a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account; detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the
representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
[0028] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, a wireless transmission device, and one or more input devices, the one or more programs including instructions for: receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device; in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display: a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account; detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
[0029] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; a wireless transmission device; one or more input devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device; in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display: a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second
location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account; detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
[0030] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; a wireless transmission device; one or more input devices; means for receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device; means, in response to receiving the request to provide the restricted credentials, for concurrently displaying, on the display: a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account; means for detecting, via the one or more input devices, user selection of the representation of the second account; and means, in response to detecting the user selection of the representation of the second account, for: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
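One possible reading of the selection behavior in paragraphs [0026]-[0030] is that the chosen account moves into the first display location and thereby becomes the account used for providing the restricted credentials. The class below is a hypothetical model of that reading, not the claimed implementation; all names are illustrative.

```python
# Hypothetical model: selecting an account moves its representation into
# the first display location and marks it for credential provisioning.
class AccountSelector:
    def __init__(self, accounts):
        # accounts[0] occupies the first location of the display.
        self.accounts = list(accounts)

    @property
    def selected_account(self):
        # The account at the first location is the one selected for use
        # in providing the restricted credentials.
        return self.accounts[0]

    def select(self, index):
        """Handle user selection of the representation at `index`."""
        chosen = self.accounts.pop(index)
        self.accounts.insert(0, chosen)
```

For example, with a first and second account displayed, selecting the second account's representation replaces the first account at the front position, so subsequent credential provisioning uses the second account.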
[0031] In accordance with some embodiments, a method performed at an electronic device with a display and one or more input devices is described. The method comprises: receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account: in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of
resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
[0032] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account: in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
[0033] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account: in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource
DK 2017 70505 A1 account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
[0034] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account: in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
[0035] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more input devices; means for receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and means, in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, for: in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
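The determination restated in paragraphs [0032]-[0035] above reduces to a single branch on the amount available in the first resource account. The following Python sketch is purely illustrative (the function name, account labels, and return shape are assumptions for this example, not part of the disclosed embodiments):

```python
def select_accounts(requested_amount, first_available):
    """Choose accounts for a transfer per the determination above.

    Illustrative sketch only: returns (account_label, amount) pairs;
    the labels and return shape are hypothetical.
    """
    if requested_amount <= first_available:
        # Requested amount is equal to or less than the resources
        # available via the first account: proceed using it alone.
        return [("first", requested_amount)]
    # Requested amount exceeds the first account: automatically draw
    # the remainder from a second, different resource account.
    remainder = requested_amount - first_available
    return [("first", first_available), ("second", remainder)]
```

For example, a request for 150 units against a first account holding 100 would proceed as 100 from the first account and 50 from the second.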
[0036] In accordance with some embodiments, a method performed at an electronic device with a display is described. The method comprises: receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device
to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item; receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display: a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
[0037] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs including instructions for: receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item; receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display: a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
[0038] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs including instructions for: receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item; receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display: a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
[0039] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item; receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display: a first item associated with the first participant, wherein the first item includes first information from the
first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
[0040] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; means for receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item; means for receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and means for concurrently displaying, on the display: a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
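The concurrent-display behavior described in paragraphs [0036]-[0040] can be sketched as scanning each conversation for its transfer-associated message and building one display item per participant. This is an illustrative sketch under stated assumptions (the `Message` class, the string representation of a transferred item, and the dictionary shape of a display item are all hypothetical):

```python
from typing import List, Optional, Tuple


class Message:
    """A message in a conversation; a non-None `transfer_item` marks a
    message associated with the transfer of an additional item
    (represented here, by assumption, as a plain string)."""

    def __init__(self, sender: str, text: str,
                 transfer_item: Optional[str] = None):
        self.sender = sender
        self.text = text
        self.transfer_item = transfer_item


def items_for_display(conversations: List[Tuple[str, List[Message]]]):
    """Build the concurrently displayed list: one item per participant,
    carrying information from the transfer-associated message plus a
    representation of the transferred item. Illustrative sketch only."""
    items = []
    for participant, messages in conversations:
        for msg in messages:
            if msg.transfer_item is not None:
                items.append({
                    "participant": participant,
                    "info": msg.text,
                    "item": msg.transfer_item,
                })
                break  # one item per conversation in this sketch
    return items
```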
[0041] In accordance with some embodiments, a method performed at an electronic device with one or more output devices including a display and one or more input devices is described. The method comprises: receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation; in response to receiving the utterance, preparing to perform the operation: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device: a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation; after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input: in
accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input; in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
[0042] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with one or more output devices including a display and one or more input devices, the one or more programs including instructions for: receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation; in response to receiving the utterance, preparing to perform the operation: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device: a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation; after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input: in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input; in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
[0043] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with one or more output devices including a display and one or more input devices, the one or more programs including instructions for: receiving, via the one or more input devices, an utterance
from a user that corresponds to a request to perform an operation; in response to receiving the utterance, preparing to perform the operation: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device: a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation; after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input: in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input; in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
[0044] In accordance with some embodiments, an electronic device is described. The electronic device comprises: one or more output devices including a display; one or more input devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation; in response to receiving the utterance, preparing to perform the operation: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device: a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation; after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input: in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input; in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
[0045] In accordance with some embodiments, an electronic device is described. The electronic device comprises: one or more output devices, including a display; one or more input devices; means for receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation; means, responsive to receiving the utterance, preparing to perform the operation, for: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device: a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation; means, after preparing to perform the operation, for receiving a confirmation input associated with performing the operation; and means, responsive to receiving the confirmation input, for: in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input; in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
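The voice-initiated flow of paragraphs [0041]-[0045] amounts to two decision points: what to present while preparing the operation, and whether to perform or forgo the operation on confirmation input. A minimal sketch, assuming string tokens stand in for the presented output and that nothing extra is presented when no authorization is required (that branch is an assumption, since the embodiments specify only the authorization-required case):

```python
def prepare_operation(requires_authorization):
    """Prepare to perform a requested operation. When authorization is
    required, presenting includes a representation of the operation and
    instructions for providing authorization. Illustrative sketch."""
    if requires_authorization:
        return ["operation_representation", "authorization_instructions"]
    return []  # assumption: no extra presentation when not required


def on_confirmation(requires_authorization, authorized):
    """Apply the three determinations made on confirmation input.
    Returns True if the operation is performed, False if forgone."""
    if requires_authorization and not authorized:
        return False  # forgo performing the operation
    return True       # authorized, or no authorization needed
```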
[0046] In accordance with some embodiments, a method performed at an electronic device with a display and one or more sensor devices is described. The method comprises: while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object; while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point; in response to detecting the change in orientation of the device: changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point; in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation,
that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
[0047] In accordance with some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for: while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object; while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point; in response to detecting the change in orientation of the device: changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point; in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
[0048] In accordance with some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for: while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object; while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point; in response to detecting the change in orientation of the device: changing an appearance of the user interface object by applying a visual effect to the user interface object
that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point; in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
[0049] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more sensor devices; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object; while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point; in response to detecting the change in orientation of the device: changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point; in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
[0050] In accordance with some embodiments, an electronic device is described. The electronic device comprises: a display; one or more sensor devices; means, while the device is at a first orientation relative to a baseline orientation with respect to a reference point, for displaying, on the display, a user interface object; means, while displaying the user interface object, for detecting, via the one or more sensor devices, a change in orientation of the device
from the first orientation relative to the reference point to a respective orientation relative to the reference point; means, in response to detecting the change in orientation of the device, for: changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point; in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
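The amplitude behavior of paragraphs [0046]-[0050] distinguishes qualifying movement toward the baseline orientation (amplitude reduced) from qualifying movement away from it (effect continues at full amplitude). A hedged sketch of that rule follows; the multiplicative decay factor is an assumption for illustration, not part of the disclosed embodiments:

```python
def updated_amplitude(amplitude, toward_baseline, meets_criteria,
                      decay=0.5):
    """Update the visual-effect amplitude on an orientation change.

    Qualifying movement toward the baseline orientation reduces the
    amplitude; qualifying movement away from it leaves the effect
    applied without reduction. The decay factor is hypothetical.
    """
    if toward_baseline and meets_criteria:
        return amplitude * decay
    return amplitude
```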
[0051] Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are, optionally, included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
[0052] Thus, devices are provided with faster, more efficient methods and interfaces for managing peer-to-peer transfers, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace other methods for managing peer-to-peer transfers.
DESCRIPTION OF THE FIGURES

[0053] For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0054] FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display, in accordance with some embodiments.
[0055] FIG. 1B is a block diagram illustrating exemplary components for event handling, in accordance with some embodiments.
[0056] FIG. 1C is a block diagram illustrating exemplary components for generating a tactile output, in accordance with some embodiments.
[0057] FIG. 2 illustrates a portable multifunction device having a touch screen, in accordance with some embodiments.
[0058] FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface, in accordance with some embodiments.
[0059] FIG. 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device, in accordance with some embodiments.
[0060] FIG. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display, in accordance with some embodiments.
[0061] FIGS. 4C-4H illustrate exemplary tactile output patterns that have a particular waveform, in accordance with some embodiments.
[0062] FIG. 5A illustrates a personal electronic device, in accordance with some embodiments.
[0063] FIG. 5B is a block diagram illustrating a personal electronic device, in accordance with some embodiments.
[0064] FIGS. 5C-5D illustrate exemplary components of a personal electronic device having a touch-sensitive display and intensity sensors, in accordance with some embodiments.
[0065] FIGS. 5E-5H illustrate exemplary components and user interfaces of a personal electronic device, in accordance with some embodiments.
[0066] FIG. 6 illustrates exemplary devices connected via one or more communication channels, in accordance with some embodiments.
[0067] FIGS. 7A-7E illustrate exemplary user interfaces for initiating and managing file transfers, in accordance with some embodiments.
[0068] FIGS. 8A-8AH illustrate exemplary user interfaces for initiating and managing transfers, in accordance with some embodiments.
[0069] FIGS. 9A-9I are a flow diagram illustrating methods of initiating and managing transfers, in accordance with some embodiments.
[0070] FIGS. 10A-10D illustrate exemplary user interfaces for providing feedback to message objects corresponding to completed file transfers, in accordance with some embodiments.
[0071] FIGS. 11A-11V illustrate exemplary user interfaces for providing feedback to message objects corresponding to completed transfers, in accordance with some embodiments.
[0072] FIGS. 12A-12C are a flow diagram illustrating methods of providing feedback to message objects corresponding to completed transfers, in accordance with some embodiments.
[0073] FIGS. 13A-13D illustrate exemplary user interfaces for providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments.
[0074] FIGS. 14A-14M illustrate exemplary user interfaces for providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments.
[0075] FIGS. 15A-15K are a flow diagram illustrating methods of providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments.
[0076] FIGS. 16A-16F illustrate exemplary user interfaces for activating accounts for accepting and sending encrypted message transfers, in accordance with some embodiments.
[0077] FIGS. 17A-17L illustrate exemplary user interfaces for activating accounts for accepting and sending transfers, in accordance with some embodiments.
[0078] FIGS. 18A-18F are a flow diagram illustrating methods of activating accounts for accepting and sending transfers, in accordance with some embodiments.
[0079] FIGS. 19A-19D illustrate exemplary user interfaces for exchanging a user identification with a different user identification, in accordance with some embodiments.
[0080] FIGS. 20A-20J illustrate exemplary user interfaces for exchanging an account for use in a transfer, in accordance with some embodiments.
[0081] FIGS. 21A-21D are a flow diagram illustrating methods of exchanging an account for use in a transfer, in accordance with some embodiments.
[0082] FIGS. 22A-22F illustrate exemplary user interfaces for splitting resource transfers between two or more resource accounts, in accordance with some embodiments.
[0083] FIGS. 23A-23O illustrate exemplary user interfaces for splitting transfers between two or more accounts, in accordance with some embodiments.
[0084] FIGS. 24A-24C are a flow diagram illustrating methods of splitting transfers between two or more accounts, in accordance with some embodiments.
[0085] FIGS. 25A-25C illustrate exemplary user interfaces for generating and displaying an attachment transfers history list, in accordance with some embodiments.
[0086] FIGS. 26A-26T illustrate exemplary user interfaces for generating and displaying a transfers history list, in accordance with some embodiments.
[0087] FIGS. 27A-27E are a flow diagram illustrating methods of generating and displaying a transfers history list, in accordance with some embodiments.
[0088] FIGS. 28A-28F illustrate exemplary user interfaces for voice-activation of file transfers, in accordance with some embodiments.
[0089] FIGS. 29A-29S illustrate exemplary user interfaces for voice-activation of transfers, in accordance with some embodiments.
[0090] FIGS. 30A-30D are a flow diagram illustrating methods of voice-activation of transfers, in accordance with some embodiments.
[0091] FIGS. 31A-31M illustrate exemplary user interfaces for user verification, in accordance with some embodiments.
[0092] FIGS. 32A-32D illustrate exemplary user interfaces for automatic account onboarding, in accordance with some embodiments.
[0093] FIGS. 33A-33O illustrate exemplary user interfaces for providing feedback corresponding to an operation associated with a transfer, in accordance with some embodiments.
[0094] FIGS. 34A-34D are a flow diagram illustrating a method for providing feedback corresponding to an operation associated with a transfer, in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS [0095] The following description sets forth exemplary methods, parameters, and the like. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure but is instead provided as a description of exemplary embodiments.
[0096] There is a need for electronic devices that provide efficient methods and interfaces for managing peer-to-peer transfers. For example, there is a need for electronic devices that provide a convenient and efficient method for sending and receiving transfers using commonly used messaging applications. For another example, there is a need for electronic devices that provide easier management for peer-to-peer transfers in a secure manner. For another example, there is a need for electronic devices that provide a quick and intuitive technique for viewing and managing transfer history. For another example, there is a need for electronic devices that can accept transfers without user input or with minimal user input. Such techniques can reduce the cognitive burden on a user who accesses and utilizes peer-to-peer transfers, thereby enhancing
productivity. Further, such techniques can reduce processor and battery power otherwise wasted on redundant user inputs.
[0097] Below, FIGS. 1A-1C, 2, 3, 4A-4H, and 5A-5H provide a description of exemplary devices for performing the techniques for managing peer-to-peer transfers. FIG. 6 illustrates exemplary devices connected via one or more communication channels, in accordance with some embodiments. FIGS. 7A-7E illustrate exemplary user interfaces for initiating and managing file transfers, in accordance with some embodiments. FIGS. 8A-8AH illustrate exemplary user interfaces for initiating and managing transfers, in accordance with some embodiments. FIGS. 9A-9I are a flow diagram illustrating methods of initiating and managing transfers, in accordance with some embodiments. The user interfaces in FIGS. 7A-7E and FIGS. 8A-8AH are used to illustrate the processes described below, including the processes in FIGS. 9A-9I. FIGS. 10A-10D illustrate exemplary user interfaces for providing feedback to message objects corresponding to completed file transfers, in accordance with some embodiments. FIGS. 11A-11V illustrate exemplary user interfaces for providing feedback to message objects corresponding to completed transfers, in accordance with some embodiments. FIGS. 12A-12C are a flow diagram illustrating methods of providing feedback to message objects corresponding to completed transfers, in accordance with some embodiments. The user interfaces in FIGS. 10A-10D and FIGS. 11A-11V are used to illustrate the processes described below, including the processes in FIGS. 12A-12C. FIGS. 13A-13D illustrate exemplary user interfaces for providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments. FIGS. 14A-14M illustrate exemplary user interfaces for providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments. FIGS. 
15A-15K are a flow diagram illustrating methods of providing visually distinguishable message object appearances based on message designation, in accordance with some embodiments. The user interfaces in FIGS. 13A-13D and FIGS. 14A-14M are used to illustrate the processes described below, including the processes in FIGS. 15A-15K. FIGS. 16A-16F illustrate exemplary user interfaces for activating accounts for accepting and sending encrypted message transfers, in accordance with some embodiments. FIGS. 17A-17L illustrate exemplary user interfaces for activating accounts for accepting and sending transfers, in accordance with some embodiments. FIGS. 18A-18F are a
flow diagram illustrating methods of activating accounts for accepting and sending transfers, in accordance with some embodiments. The user interfaces in FIGS. 16A-16F and FIGS. 17A-17L are used to illustrate the processes described below, including the processes in FIGS. 18A-18F. FIGS. 19A-19D illustrate exemplary user interfaces for exchanging a user identification with a different user identification, in accordance with some embodiments. FIGS. 20A-20J illustrate exemplary user interfaces for exchanging an account for use in a transfer, in accordance with some embodiments. FIGS. 21A-21D are a flow diagram illustrating methods of exchanging an account for use in a transfer, in accordance with some embodiments. The user interfaces in FIGS. 19A-19D and FIGS. 20A-20J are used to illustrate the processes described below, including the processes in FIGS. 21A-21D. FIGS. 22A-22F illustrate exemplary user interfaces for splitting resource transfers between two or more resource accounts, in accordance with some embodiments. FIGS. 23A-23O illustrate exemplary user interfaces for splitting transfers between two or more accounts, in accordance with some embodiments. FIGS. 24A-24C are a flow diagram illustrating methods of splitting transfers between two or more accounts, in accordance with some embodiments. The user interfaces in FIGS. 22A-22F and FIGS. 23A-23O are used to illustrate the processes described below, including the processes in FIGS. 24A-24C. FIGS. 25A-25C illustrate exemplary user interfaces for generating and displaying an attachment transfers history list, in accordance with some embodiments. FIGS. 26A-26T illustrate exemplary user interfaces for generating and displaying a transfers history list, in accordance with some embodiments. FIGS. 27A-27E are a flow diagram illustrating methods of generating and displaying a transfers history list, in accordance with some embodiments. The user interfaces in FIGS. 25A-25C and FIGS. 
26A-26T are used to illustrate the processes described below, including the processes in FIGS. 27A-27E. FIGS. 28A-28F illustrate exemplary user interfaces for voice-activation of file transfers, in accordance with some embodiments. FIGS. 29A-29S illustrate exemplary user interfaces for voice-activation of transfers, in accordance with some embodiments. FIGS. 30A-30D are a flow diagram illustrating methods of voice-activation of transfers, in accordance with some embodiments. The user interfaces in FIGS. 28A-28F and FIGS. 29A-29S are used to illustrate the processes described below, including the processes in FIGS. 30A-30D. FIGS. 31A-31M illustrate exemplary user interfaces for user verification, in accordance with some embodiments. FIGS. 32A-32D illustrate exemplary user interfaces for
automatic account onboarding, in accordance with some embodiments. FIGS. 33A-33O illustrate exemplary user interfaces for providing feedback corresponding to an operation associated with a transfer, in accordance with some embodiments. FIGS. 34A-34D are a flow diagram illustrating a method for providing feedback corresponding to an operation associated with a transfer, in accordance with some embodiments. The user interfaces in FIGS. 33A-33O are used to illustrate the processes described below, including the processes in FIGS. 34A-34D.
[0098] Although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
[0099] The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0100] The term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
[0101] Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
[0102] In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
[0103] The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
[0104] The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
[0105] Attention is now directed toward embodiments of portable devices with touch-sensitive displays. FIG. 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes called a “touch screen” for convenience and is sometimes known as or called a “touch-sensitive display system.” Device 100 includes memory 102 (which optionally includes one or more computer-readable storage mediums), memory controller 122, one or more processing units (CPUs) 120, peripherals interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, secure element 115, input/output (I/O) subsystem 106, other input control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more contact intensity sensors 165 for detecting intensity of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
[0106] As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance
of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
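The combining-and-thresholding approach described above can be sketched in code. The following Python sketch is illustrative only and not part of the disclosed embodiments; the sensor readings, weights, and threshold value are hypothetical, and the units are the arbitrary substitute-measurement units mentioned in the text.

```python
# Hypothetical sketch: estimate contact intensity as a weighted average of
# several force-sensor readings, then test the estimate against an
# intensity threshold expressed in the same substitute units.

def estimated_intensity(readings, weights):
    """Weighted average of per-sensor force readings (arbitrary units)."""
    return sum(r * w for r, w in zip(readings, weights)) / sum(weights)

def exceeds_threshold(readings, weights, threshold):
    """Compare the combined estimate directly against an intensity threshold."""
    return estimated_intensity(readings, weights) >= threshold

# Three force sensors near the contact; sensors closer to the contact
# point are weighted more heavily (all values are illustrative).
readings = [0.8, 0.5, 0.2]
weights = [0.5, 0.3, 0.2]
print(exceeds_threshold(readings, weights, threshold=0.5))
```

A lighter contact (smaller readings) with the same weights would fall below the threshold, so the same comparison distinguishes, for example, a light touch from a press.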
[0107] As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user’s sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user’s hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user’s movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to
the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user. Using tactile outputs to provide haptic feedback to a user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0108] In some embodiments, a tactile output pattern specifies characteristics of a tactile output, such as the amplitude of the tactile output, the shape of a movement waveform of the tactile output, the frequency of the tactile output, and/or the duration of the tactile output.
[0109] When tactile outputs with different tactile output patterns are generated by a device (e.g., via one or more tactile output generators that move a moveable mass to generate tactile outputs), the tactile outputs may invoke different haptic sensations in a user holding or touching the device. While the sensation of the user is based on the user’s perception of the tactile output, most users will be able to identify changes in waveform, frequency, and amplitude of tactile outputs generated by the device. Thus, the waveform, frequency and amplitude can be adjusted to indicate to the user that different operations have been performed. As such, tactile outputs with tactile output patterns that are designed, selected, and/or engineered to simulate characteristics (e.g., size, material, weight, stiffness, smoothness, etc.); behaviors (e.g., oscillation, displacement, acceleration, rotation, expansion, etc.); and/or interactions (e.g., collision, adhesion, repulsion, attraction, friction, etc.) of objects in a given environment (e.g., a user interface that includes graphical features and objects, a simulated physical environment with virtual boundaries and virtual objects, a real physical environment with physical boundaries and physical objects, and/or a combination of any of the above) will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user’s operation of the device. Additionally, tactile outputs are, optionally, generated to correspond to
feedback that is unrelated to a simulated physical characteristic, such as an input threshold or a selection of an object. Such tactile outputs will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user’s operation of the device.
[0110] In some embodiments, a tactile output with a suitable tactile output pattern serves as a cue for the occurrence of an event of interest in a user interface or behind the scenes in a device. Examples of the events of interest include activation of an affordance (e.g., a real or virtual button, or toggle switch) provided on the device or in a user interface, success or failure of a requested operation, reaching or crossing a boundary in a user interface, entry into a new state, switching of input focus between objects, activation of a new mode, reaching or crossing an input threshold, detection or recognition of a type of input or gesture, etc. In some embodiments, tactile outputs are provided to serve as a warning or an alert for an impending event or outcome that would occur unless a redirection or interruption input is timely detected. Tactile outputs are also used in other contexts to enrich the user experience, improve the accessibility of the device to users with visual or motor difficulties or other accessibility needs, and/or improve efficiency and functionality of the user interface and/or the device. Tactile outputs are optionally accompanied with audio outputs and/or visible user interface changes, which further enhance a user’s experience when the user interacts with a user interface and/or the device, and facilitate better conveyance of information regarding the state of the user interface and/or the device, and which reduce input errors and increase the efficiency of the user’s operation of the device.
[0111] FIGS. 4C-4E provide a set of sample tactile output patterns that may be used, either individually or in combination, either as is or through one or more transformations (e.g., modulation, amplification, truncation, etc.), to create suitable haptic feedback in various scenarios and for various purposes, such as those mentioned above and those described with respect to the user interfaces and methods discussed herein. This example of a palette of tactile outputs shows how a set of three waveforms and eight frequencies can be used to produce an array of tactile output patterns. In addition to the tactile output patterns shown in this figure, each of these tactile output patterns is optionally adjusted in amplitude by changing a gain value for the tactile output pattern, as shown, for example for FullTap 80Hz, FullTap 200Hz, MiniTap
80Hz, MiniTap 200Hz, MicroTap 80Hz, and MicroTap 200Hz in FIGS. 4F-4H, which are each shown with variants having a gain of 1.0, 0.75, 0.5, and 0.25. As shown in FIGS. 4F-4H, changing the gain of a tactile output pattern changes the amplitude of the pattern without changing the frequency of the pattern or changing the shape of the waveform. In some embodiments, changing the frequency of a tactile output pattern also results in a lower amplitude as some tactile output generators are limited by how much force can be applied to the moveable mass and thus higher frequency movements of the mass are constrained to lower amplitudes to ensure that the acceleration needed to create the waveform does not require force outside of an operational force range of the tactile output generator (e.g., the peak amplitudes of the FullTap at 230Hz, 270Hz, and 300Hz are lower than the amplitudes of the FullTap at 80Hz, 100Hz, 125Hz, and 200Hz).
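The gain relationship described above can be illustrated with a small numerical sketch. This code is hypothetical and not the device's implementation; the sample rate and frequency are illustrative. It shows that scaling every sample of a waveform by a gain value changes the peak amplitude while leaving the sample count (and hence the frequency and duration) and the waveform shape unchanged.

```python
import math

# Hypothetical sketch: a one-cycle, MiniTap-like sine waveform whose
# amplitude is scaled by a gain value (compare the 1.0/0.75/0.5/0.25
# gain variants described in the text).

def minitap_samples(freq_hz, gain=1.0, sample_rate=8000):
    """Sampled displacement for one complete cycle, scaled by gain."""
    n = int(sample_rate / freq_hz)  # samples in one complete cycle
    return [gain * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

full_gain = minitap_samples(200, gain=1.0)
half_gain = minitap_samples(200, gain=0.5)

# Same length (same frequency and duration), half the peak amplitude.
print(len(full_gain) == len(half_gain))
print(max(half_gain) == 0.5 * max(full_gain))
```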
[0112] FIGS. 4C-4H show tactile output patterns that have a particular waveform. The waveform of a tactile output pattern represents the pattern of physical displacements relative to a neutral position (e.g., xzero) versus time that a moveable mass goes through to generate a tactile output with that tactile output pattern. For example, a first set of tactile output patterns shown in FIG. 4C (e.g., tactile output patterns of a “FullTap”) each have a waveform that includes an oscillation with two complete cycles (e.g., an oscillation that starts and ends in a neutral position and crosses the neutral position three times). A second set of tactile output patterns shown in FIG. 4D (e.g., tactile output patterns of a “MiniTap”) each have a waveform that includes an oscillation that includes one complete cycle (e.g., an oscillation that starts and ends in a neutral position and crosses the neutral position one time). A third set of tactile output patterns shown in FIG. 4E (e.g., tactile output patterns of a “MicroTap”) each have a waveform that includes an oscillation that includes one half of a complete cycle (e.g., an oscillation that starts and ends in a neutral position and does not cross the neutral position). The waveform of a tactile output pattern also includes a start buffer and an end buffer that represent the gradual speeding up and slowing down of the moveable mass at the start and at the end of the tactile output. The example waveforms shown in FIGS. 4C-4H include xmin and xmax values which represent the maximum and minimum extent of movement of the moveable mass. For larger electronic devices with larger moveable masses, there may be larger or smaller minimum and maximum extents of movement of the mass. The examples shown in FIGS. 4C-4H describe movement of a mass in one
dimension; however, similar principles would also apply to movement of a moveable mass in two or three dimensions.
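The three waveform families can be sketched numerically. The following code is a hypothetical illustration, not the disclosed implementation: the sample rate and frequency are arbitrary, and the start/end buffers and moveable-mass physics are omitted. A FullTap spans two complete cycles, a MiniTap one cycle, and a MicroTap half a cycle, which yields three, one, and zero crossings of the neutral position respectively.

```python
import math

# Hypothetical sketch of FullTap / MiniTap / MicroTap displacement
# waveforms as sampled sine oscillations (buffers omitted).

def tap_waveform(cycles, freq_hz, sample_rate=8000):
    """Sampled displacement relative to neutral for the given cycle count."""
    n = int(cycles / freq_hz * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n + 1)]

def neutral_crossings(wave):
    """Count sign changes, i.e. crossings of the neutral (zero) position."""
    return sum(1 for a, b in zip(wave, wave[1:]) if a * b < 0)

full_tap = tap_waveform(2.0, 150)    # two complete cycles
mini_tap = tap_waveform(1.0, 150)    # one complete cycle
micro_tap = tap_waveform(0.5, 150)   # half a cycle

# FullTap crosses neutral three times, MiniTap once, MicroTap never.
print(neutral_crossings(full_tap),
      neutral_crossings(mini_tap),
      neutral_crossings(micro_tap))
```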
[0113] As shown in FIGS. 4C-4E, each tactile output pattern also has a corresponding characteristic frequency that affects the “pitch” of a haptic sensation that is felt by a user from a tactile output with that characteristic frequency. For a continuous tactile output, the characteristic frequency represents the number of cycles that are completed within a given period of time (e.g., cycles per second) by the moveable mass of the tactile output generator. For a discrete tactile output, a discrete output signal (e.g., with 0.5, 1, or 2 cycles) is generated, and the characteristic frequency value specifies how fast the moveable mass needs to move to generate a tactile output with that characteristic frequency. As shown in FIGS. 4C-4H, for each type of tactile output (e.g., as defined by a respective waveform, such as FullTap, MiniTap, or MicroTap), a higher frequency value corresponds to faster movement(s) by the moveable mass, and hence, in general, a shorter time to complete the tactile output (e.g., including the time to complete the required number of cycle(s) for the discrete tactile output, plus a start and an end buffer time). For example, a FullTap with a characteristic frequency of 80Hz takes longer to complete than a FullTap with a characteristic frequency of 100Hz (e.g., 35.4ms vs. 28.3ms in FIG. 4C). In addition, for a given frequency, a tactile output with more cycles in its waveform at a respective frequency takes longer to complete than a tactile output with fewer cycles in its waveform at the same respective frequency. For example, a FullTap at 150Hz takes longer to complete than a MiniTap at 150Hz (e.g., 19.4ms vs. 12.8ms), and a MiniTap at 150Hz takes longer to complete than a MicroTap at 150Hz (e.g., 12.8ms vs. 9.4ms). 
However, for tactile output patterns with different frequencies this rule may not apply (e.g., tactile outputs with more cycles but a higher frequency may take a shorter amount of time to complete than tactile outputs with fewer cycles but a lower frequency, and vice versa). For example, at 300Hz, a FullTap takes as long as a MiniTap (e.g., 9.9 ms).
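The timing relationship above can be approximated with a simple model. This sketch is hypothetical: it ignores the start and end buffers (which are device-specific and account for the difference between the figures' durations and the bare cycle times), and simply divides the cycle count by the characteristic frequency, which is enough to reproduce the orderings described in the text.

```python
# Hypothetical sketch: ignoring start/end buffers, a discrete tactile
# output's core duration is its cycle count divided by its characteristic
# frequency, so fewer cycles or a higher frequency means a shorter output.

def core_duration_ms(cycles, freq_hz):
    """Buffer-free duration of `cycles` oscillations at `freq_hz`, in ms."""
    return cycles / freq_hz * 1000.0

# At a fixed 150Hz, a FullTap (2 cycles) outlasts a MiniTap (1 cycle),
# which outlasts a MicroTap (0.5 cycle).
print(core_duration_ms(2.0, 150) > core_duration_ms(1.0, 150)
      > core_duration_ms(0.5, 150))

# For a fixed waveform, a FullTap at 100Hz completes sooner than at 80Hz.
print(core_duration_ms(2.0, 100) < core_duration_ms(2.0, 80))
```

The model also shows why the rule can reverse across frequencies: two cycles at a high enough frequency can finish sooner than one cycle at a low frequency.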
[0114] As shown in FIGS. 4C-4E, a tactile output pattern also has a characteristic amplitude that affects the amount of energy that is contained in a tactile signal, or a “strength” of a haptic sensation that may be felt by a user through a tactile output with that characteristic amplitude. In some embodiments, the characteristic amplitude of a tactile output pattern refers to an absolute
or normalized value that represents the maximum displacement of the moveable mass from a neutral position when generating the tactile output. In some embodiments, the characteristic amplitude of a tactile output pattern is adjustable, e.g., by a fixed or dynamically determined gain factor (e.g., a value between 0 and 1), in accordance with various conditions (e.g., customized based on user interface contexts and behaviors) and/or preconfigured metrics (e.g., input-based metrics, and/or user-interface-based metrics). In some embodiments, an input-based metric (e.g., an intensity-change metric or an input-speed metric) measures a characteristic of an input (e.g., a rate of change of a characteristic intensity of a contact in a press input or a rate of movement of the contact across a touch-sensitive surface) during the input that triggers generation of a tactile output. In some embodiments, a user-interface-based metric (e.g., a speed-across-boundary metric) measures a characteristic of a user interface element (e.g., a speed of movement of the element across a hidden or visible boundary in a user interface) during the user interface change that triggers generation of the tactile output. In some embodiments, the characteristic amplitude of a tactile output pattern may be modulated by an “envelope” and the peaks of adjacent cycles may have different amplitudes, where one of the waveforms shown above is further modified by multiplication by an envelope parameter that changes over time (e.g., from 0 to 1) to gradually adjust amplitude of portions of the tactile output over time as the tactile output is being generated.
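Envelope modulation as described above can be sketched as a sample-wise multiplication. This code is purely illustrative: the base waveform, sample count, and the linear ramp-up envelope are hypothetical choices, standing in for whatever context-dependent envelope an implementation might select.

```python
import math

# Hypothetical sketch of envelope modulation: each sample of a base
# waveform is multiplied by a time-varying envelope parameter in [0, 1],
# so the peaks of adjacent cycles can have different amplitudes.

def apply_envelope(wave, envelope):
    """Multiply each waveform sample by the matching envelope value."""
    return [s * e for s, e in zip(wave, envelope)]

n = 200
base = [math.sin(2 * math.pi * 4 * i / n) for i in range(n)]  # four cycles
ramp = [i / (n - 1) for i in range(n)]                        # 0 up to 1
shaped = apply_envelope(base, ramp)

# With a ramp-up envelope, peaks in the first half of the output are
# attenuated more than peaks in the second half.
print(max(abs(s) for s in shaped[:n // 2])
      < max(abs(s) for s in shaped[n // 2:]))
```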
[0115] Although specific frequencies, amplitudes, and waveforms are represented in the sample tactile output patterns in FIGS. 4C-4E for illustrative purposes, tactile output patterns with other frequencies, amplitudes, and waveforms may be used for similar purposes. For example, waveforms that have between 0.5 and 4 cycles can be used. Other frequencies in the range of 60Hz-400Hz may be used as well. Table 1 provides examples of particular haptic feedback behaviors, configurations, and examples of their use.
[0116] It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in FIG. 1A are implemented in
hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application-specific integrated circuits.
[0117] Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
[0118] Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data. In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
[0119] RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EVDO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field
communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
[0120] Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, FIG. 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
[0121] Secure element (e.g., 115) is a hardware component (e.g., a secure microcontroller chip) configured to securely store data or an algorithm such that the securely stored data is not accessible by the device without proper authentication information from a user of the device. Keeping the securely stored data in a secure element that is separate from other storage on the device prevents access to the securely stored data even if other storage locations on the device are compromised (e.g., by malicious code or other attempts to compromise information stored on the device). In some examples, the secure element provides (or releases) payment information
(e.g., an account number and/or a transaction-specific dynamic security code). In some examples, the secure element provides (or releases) the payment information in response to the device receiving authorization, such as a user authentication (e.g., fingerprint authentication; passcode authentication; detecting double-press of a hardware button when the device is in an unlocked state, and optionally, while the device has been continuously on a user’s wrist since the device was unlocked by providing authentication credentials to the device, where the continuous presence of the device on the user’s wrist is determined by periodically checking that the device is in contact with the user’s skin). For example, the device detects a fingerprint at a fingerprint sensor (e.g., a fingerprint sensor integrated into a button) of the device. The device determines whether the fingerprint is consistent with a registered fingerprint. In accordance with a determination that the fingerprint is consistent with the registered fingerprint, the secure element provides (or releases) payment information. In accordance with a determination that the fingerprint is not consistent with the registered fingerprint, the secure element forgoes providing (or releasing) payment information.
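The authorization gate in the paragraph above — release payment information only when the fingerprint is consistent with a registered one, otherwise forgo releasing it — can be sketched roughly as below; the function name, match test, and store shape are hypothetical, not the secure element's actual interface:

```python
def release_payment_info(scanned_fingerprint, registered_fingerprints, secure_store):
    # In accordance with a determination that the fingerprint is consistent
    # with a registered fingerprint, provide (release) the payment info;
    # otherwise forgo providing it. Set membership stands in for the real
    # biometric consistency check, which is far more involved.
    if scanned_fingerprint in registered_fingerprints:
        return secure_store.get("payment_info")
    return None  # forgo releasing payment information
```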
[0122] Additional details regarding the secure element and related techniques are described in the following applications: U.S. Patent Application Serial No. 61/912,727, entitled “PROVISIONING AND AUTHENTICATING CREDENTIALS ON AN ELECTRONIC DEVICE”, filed December 6, 2013; U.S. Patent Application Serial No. 62/004,182, entitled “ONLINE PAYMENTS USING A SECURE ELEMENT OF AN ELECTRONIC DEVICE”, filed May 28, 2014; U.S. Patent Application Serial No. 61/899,737, entitled “USING BIOAUTHENTICATION IN NEAR-FIELD-COMMUNICATION TRANSACTIONS”, filed November 4, 2013; U.S. Patent Application Serial No. 61/905,035, entitled “GENERATING TRANSACTION IDENTIFIERS”, filed November 15, 2013; U.S. Patent Application Serial No. 62/004,837, entitled “METHODS FOR MANAGING PAYMENT APPLETS ON A SECURE ELEMENT TO CONDUCT MOBILE PAYMENT TRANSACTIONS”, filed May 29, 2014; U.S. Patent Application Serial No. 62/004,832, entitled “METHODS FOR USING A RANDOM AUTHORIZATION NUMBER TO PROVIDE ENHANCED SECURITY FOR A SECURE ELEMENT”, filed May 29, 2014; and U.S. Patent Application Serial No. 62/004,338, entitled “USER DEVICE SECURE PARTICIPATION IN TRANSACTIONS VIA LOCAL SECURE
ELEMENT DETECTION OF MECHANICAL INPUT”, filed May 29, 2014; which are hereby incorporated by reference in their entirety.
[0123] I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, FIG. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206, FIG. 2).
[0124] A quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. Patent Application 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed December 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons is, optionally, user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
[0125] Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output optionally corresponds to user-interface objects.
[0126] Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
[0127] Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, California.
[0128] A touch-sensitive display in some embodiments of touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following U.S. Patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive touchpads do not provide visual output.
[0129] A touch-sensitive display in some embodiments of touch screen 112 is described in the following applications: (1) U.S. Patent Application No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. Patent Application No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. Patent Application No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed July 30, 2004; (4) U.S. Patent Application No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed January 31, 2005; (5) U.S. Patent
Application No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed January 18, 2005; (6) U.S. Patent Application No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed September 16, 2005; (7) U.S. Patent Application No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed September 16, 2005; (8) U.S. Patent Application No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed September 16, 2005; and (9) U.S. Patent Application No. 11/367,749, “Multi-Functional Hand-Held Device,” filed March 3, 2006. All of these applications are incorporated by reference herein in their entirety.
[0130] Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
[0131] In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
[0132] Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
[0133] Device 100 optionally also includes one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to optical sensor controller 158 in I/O subsystem 106. Optical sensor 164 optionally includes charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. Optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch screen display 112 on the front of the device so that the touch screen display is enabled for use as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user’s image is, optionally, obtained for video conferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 is used along with the touch screen display for both video conferencing and still and/or video image acquisition.
[0134] Device 100 optionally also includes one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to intensity sensor controller 159 in I/O subsystem 106. Contact intensity sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface). Contact intensity sensor 165 receives contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100.
[0135] Device 100 optionally also includes one or more proximity sensors 166. FIG. 1A shows proximity sensor 166 coupled to peripherals interface 118. Alternately, proximity sensor
166 is, optionally, coupled to input controller 160 in I/O subsystem 106. Proximity sensor 166 optionally performs as described in U.S. Patent Application Nos. 11/241,839, “Proximity Detector In Handheld Device”; 11/240,788, “Proximity Detector In Handheld Device”;
11/620,702, “Using Ambient Light Sensor To Augment Proximity Sensor Output”; 11/586,862, “Automated Response To And Sensing Of User Activity In Portable Devices”; and 11/638,251, “Methods And Systems For Automatic Configuration Of Peripherals,” which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables touch screen 112 when the multifunction device is placed near the user’s ear (e.g., when the user is making a phone call).
[0136] Device 100 optionally also includes one or more tactile output generators 167. FIG. 1A shows a tactile output generator coupled to haptic feedback controller 161 in I/O subsystem 106. Tactile output generator 167 optionally includes one or more electroacoustic devices such as speakers or other audio components and/or electromechanical devices that convert energy into linear motion such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile output generating component (e.g., a component that converts electrical signals into tactile outputs on the device). Tactile output generator 167 receives tactile feedback generation instructions from haptic feedback module 133 and generates tactile outputs on device 100 that are capable of being sensed by a user of device 100. In some embodiments, at least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of a surface of device 100) or laterally (e.g., back and forth in the same plane as a surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100.
[0137] Device 100 optionally also includes one or more accelerometers 168. FIG. 1A shows accelerometer 168 coupled to peripherals interface 118. Alternately, accelerometer 168 is, optionally, coupled to an input controller 160 in I/O subsystem 106. Accelerometer 168 optionally performs as described in U.S. Patent Publication No. 20050190059, “Acceleration-based Theft Detection System for Portable Electronic Devices,” and U.S. Patent Publication No.
20060017692, “Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer,” both of which are incorporated by reference herein in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100.
[0138] In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 (FIG. 1A) or 370 (FIG. 3) stores device/global internal state 157, as shown in FIGS. 1A and 3. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 112; sensor state, including information obtained from the device’s various sensors and input control devices 116; and location information concerning the device’s location and/or attitude.
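The device/global internal state enumerated above could be modeled as a simple record; the field names below are illustrative stand-ins for the four categories listed (active application state, display state, sensor state, and location information), not the actual structure stored in memory 102:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class DeviceGlobalState:
    active_applications: list = field(default_factory=list)  # active application state
    display_regions: dict = field(default_factory=dict)      # screen region -> app/view shown there
    sensor_readings: dict = field(default_factory=dict)      # sensor/input device -> last reading
    location: Optional[Tuple[float, float]] = None           # (latitude, longitude)
    orientation: str = "portrait"                            # device attitude
```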
[0139] Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
[0140] Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a
multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod® (trademark of Apple Inc.) devices.
[0141] Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
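Determining speed, velocity, and direction from a series of contact data, as described above, reduces to differencing positions over time; a minimal sketch with an assumed (t, x, y) sample format (not the module's actual API):

```python
import math

def contact_motion(samples):
    # samples: time-ordered (t, x, y) contact data points.
    # Returns speed (magnitude), velocity (vx, vy), and direction (radians)
    # of the point of contact between the first and last samples.
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return math.hypot(vx, vy), (vx, vy), math.atan2(vy, vx)
```

Acceleration (a change in magnitude and/or direction) would follow the same pattern, differencing successive velocity estimates rather than positions.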
[0142] In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds
and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
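Because the click threshold described above is a software parameter rather than a property of a physical actuator, it can be adjusted at run time without any hardware change; a toy sketch (class and attribute names are illustrative):

```python
class IntensityClickDetector:
    def __init__(self, click_threshold=0.5):
        self.click_threshold = click_threshold  # software parameter, not hardware

    def is_click(self, contact_intensity):
        # A contact counts as a "click" when its measured intensity
        # reaches the currently configured threshold.
        return contact_intensity >= self.click_threshold
```

Adjusting `click_threshold` (e.g., from a system-level intensity setting) changes the behavior for all subsequent contacts while leaving the trackpad or touch screen hardware untouched.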
[0143] Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
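The tap and swipe contact patterns described above can be distinguished by inspecting the event sequence; a sketch assuming ("down"/"drag"/"up", x, y) event tuples and an illustrative position tolerance:

```python
import math

def classify_gesture(events, tap_slop=10.0):
    # Tap: finger-down followed by finger-up at (substantially) the same
    # position, with no intervening drag events. Swipe: finger-down, one
    # or more finger-dragging events, then finger-up elsewhere.
    (_, x0, y0), (_, x1, y1) = events[0], events[-1]
    moved = math.hypot(x1 - x0, y1 - y0)
    dragged = any(kind == "drag" for kind, _, _ in events[1:-1])
    return "tap" if moved <= tap_slop and not dragged else "swipe"
```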
[0144] Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
[0145] In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
[0146] Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
[0147] Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
[0148] GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
[0149] Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:
• Contacts module 137 (sometimes called an address book or contact list);
• Telephone module 138;
• Video conference module 139;
• E-mail client module 140;
• Instant messaging (IM) module 141;
• Workout support module 142;
• Camera module 143 for still and/or video images;
• Image management module 144;
• Video player module;
• Music player module;
• Browser module 147;
• Calendar module 148;
• Widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
• Widget creator module 150 for making user-created widgets 149-6;
• Search module 151;
• Video and music player module 152, which merges video player module and music player module;
• Notes module 153;
• Map module 154; and/or
• Online video module 155.
[0150] Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
[0151] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference module 139, e-mail 140, or IM 141; and so forth.
[0152] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
[0153] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephone module 138, video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
[0154] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
[0155] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are
supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
[0156] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
[0157] In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
[0158] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
[0159] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
[0160] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, e-mail client module
140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
[0161] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
[0162] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
[0163] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
[0164] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port
124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
[0165] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
[0166] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions.
[0167] In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed June 20, 2007, and U.S. Patent Application No.
11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed December 31, 2007, the contents of which are hereby incorporated by reference in their entirety.
[0168] Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods
described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. For example, video player module is, optionally, combined with music player module into a single module (e.g., video and music player module 152, FIG. 1A). In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
[0169] In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
[0170] The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
[0171] FIG. 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments. In some embodiments, memory 102 (FIG. 1A) or 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
[0172] Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In
some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
[0173] In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
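As an illustrative, non-limiting sketch of the redo/undo queue described above, the following Python fragment models a pair of stacks tracking previous user actions. The class and method names are hypothetical and do not correspond to any actual implementation of application internal state 192:

```python
# Illustrative sketch of a redo/undo queue of previous user actions,
# as described for application internal state 192. Names are hypothetical.

class UndoRedoQueue:
    def __init__(self):
        self.undo_stack = []  # actions that can be undone, most recent last
        self.redo_stack = []  # actions that were undone and can be redone

    def do(self, action):
        self.undo_stack.append(action)
        self.redo_stack.clear()  # a new action invalidates the redo history

    def undo(self):
        if not self.undo_stack:
            return None
        action = self.undo_stack.pop()
        self.redo_stack.append(action)
        return action

    def redo(self):
        if not self.redo_stack:
            return None
        action = self.redo_stack.pop()
        self.undo_stack.append(action)
        return action
```

For example, after performing actions "a" then "b", an undo returns "b" and a subsequent redo re-applies "b".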
[0174] Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
[0175] In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
[0176] In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
[0177] Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
[0178] Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
[0179] Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
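The search for the lowest view containing the touch can be illustrated by the following Python sketch, which recurses through a simple view hierarchy. This is a hypothetical model for explanatory purposes only; the names `View` and `hit_view` are not part of any described implementation of hit view determination module 172:

```python
# Illustrative sketch of hit-view determination: given a view hierarchy
# and a touch point, return the lowest (deepest) view containing the point.

class View:
    def __init__(self, name, x, y, w, h, subviews=None):
        self.name = name
        self.frame = (x, y, w, h)       # origin and size in display coordinates
        self.subviews = subviews or []  # child views, lower in the hierarchy

    def contains(self, px, py):
        x, y, w, h = self.frame
        return x <= px < x + w and y <= py < y + h

def hit_view(view, px, py):
    """Return the deepest view whose frame contains the point, or None."""
    if not view.contains(px, py):
        return None
    # Prefer a matching subview (deeper in the hierarchy) over its parent.
    for sub in view.subviews:
        found = hit_view(sub, px, py)
        if found is not None:
            return found
    return view
```

For a root view containing a button, a touch inside the button's frame yields the button (the hit view), while a touch elsewhere in the root yields the root itself.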
[0180] Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
[0181] Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event
dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver 182.
[0182] In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
[0183] In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application’s user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
[0184] A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
[0185] Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as
location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
[0186] Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event (187) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
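The double-tap and drag definitions above can be modeled as predefined sub-event sequences compared against received input. The following Python sketch is purely illustrative; the sub-event names and the definitions mapping are hypothetical, not the actual content of event definitions 186:

```python
# Illustrative sketch of comparing a received sub-event sequence against
# predefined event definitions, in the spirit of event comparator 184.

DOUBLE_TAP = ["touch_begin", "touch_end", "touch_begin", "touch_end"]
DRAG = ["touch_begin", "touch_move", "touch_end"]

EVENT_DEFINITIONS = {
    "event1_double_tap": DOUBLE_TAP,  # cf. event 1 (187-1)
    "event2_drag": DRAG,              # cf. event 2 (187-2)
}

def match_event(sub_events):
    """Return the name of the first definition the sequence matches, else None."""
    for name, definition in EVENT_DEFINITIONS.items():
        if sub_events == definition:
            return name
    return None
```

A sequence of touch begin, liftoff, touch begin, liftoff thus matches the double-tap definition, while touch begin, movement, liftoff matches the drag definition.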
[0187] In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be
activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
[0188] In some embodiments, the definition for a respective event (187) also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer’s event type.
[0189] When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
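The state transitions described above (a recognizer remains possible while the received sub-events are a prefix of its definition, succeeds on a full match, and otherwise fails and ignores further sub-events) can be sketched as follows. The class is a hypothetical model, not the implementation of event recognizer 180:

```python
# Illustrative sketch of a recognizer state machine: possible -> recognized
# on a full match, possible -> failed on a mismatched prefix, and a failed
# recognizer disregards subsequent sub-events of the gesture.

class Recognizer:
    def __init__(self, definition):
        self.definition = definition
        self.received = []
        self.state = "possible"

    def feed(self, sub_event):
        if self.state != "possible":
            return self.state  # failed/recognized: ignore further sub-events
        self.received.append(sub_event)
        if self.received == self.definition:
            self.state = "recognized"
        elif self.definition[:len(self.received)] != self.received:
            self.state = "failed"
        return self.state
```

For a definition of touch begin followed by touch end, feeding those two sub-events yields the recognized state, while feeding an unexpected first sub-event fails the recognizer permanently for that gesture.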
[0190] In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
[0191] In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
[0192] In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
[0193] In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
[0194] In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
[0195] FIG. 1C is a block diagram illustrating a tactile output module in accordance with some embodiments. In some embodiments, I/O subsystem 106 (e.g., haptic feedback controller 161 (FIG. 1A) and/or other input controller(s) 160 (FIG. 1A)) includes at least some of the example components shown in FIG. 1C. In some embodiments, peripherals interface 118 includes at least some of the example components shown in FIG. 1C.
[0196] In some embodiments, the tactile output module includes haptic feedback module 133. In some embodiments, haptic feedback module 133 aggregates and combines tactile outputs for user interface feedback from software applications on the electronic device (e.g., feedback that is responsive to user inputs that correspond to displayed user interfaces and alerts and other notifications that indicate the performance of operations or occurrence of events in user
interfaces of the electronic device). Haptic feedback module 133 includes one or more of: waveform module 123 (for providing waveforms used for generating tactile outputs), mixer 125 (for mixing waveforms, such as waveforms in different channels), compressor 127 (for reducing or compressing a dynamic range of the waveforms), low-pass filter 129 (for filtering out high frequency signal components in the waveforms), and thermal controller 131 (for adjusting the waveforms in accordance with thermal conditions). In some embodiments, haptic feedback module 133 is included in haptic feedback controller 161 (FIG. 1A). In some embodiments, a separate unit of haptic feedback module 133 (or a separate implementation of haptic feedback module 133) is also included in an audio controller (e.g., audio circuitry 110, FIG. 1A) and used for generating audio signals. In some embodiments, a single haptic feedback module 133 is used for generating audio signals and generating waveforms for tactile outputs.
[0197] In some embodiments, haptic feedback module 133 also includes trigger module 121 (e.g., a software application, operating system, or other software module that determines a tactile output is to be generated and initiates the process for generating the corresponding tactile output). In some embodiments, trigger module 121 generates trigger signals for initiating generation of waveforms (e.g., by waveform module 123). For example, trigger module 121 generates trigger signals based on preset timing criteria. In some embodiments, trigger module 121 receives trigger signals from outside haptic feedback module 133 (e.g., in some embodiments, haptic feedback module 133 receives trigger signals from hardware input processing module 146 located outside haptic feedback module 133) and relays the trigger signals to other components within haptic feedback module 133 (e.g., waveform module 123) or software applications that trigger operations (e.g., with trigger module 121) based on activation of a user interface element (e.g., an application icon or an affordance within an application) or a hardware input device (e.g., a home button or an intensity-sensitive input surface, such as an intensity-sensitive touch screen). In some embodiments, trigger module 121 also receives tactile feedback generation instructions (e.g., from haptic feedback module 133, FIGS. 1A and 3). In some embodiments, trigger module 121 generates trigger signals in response to haptic feedback module 133 (or trigger module 121 in haptic feedback module 133) receiving tactile feedback instructions (e.g., from haptic feedback module 133, FIGS. 1A and 3).
[0198] Waveform module 123 receives trigger signals (e.g., from trigger module 121) as an input, and in response to receiving trigger signals, provides waveforms for generation of one or more tactile outputs (e.g., waveforms selected from a predefined set of waveforms designated for use by waveform module 123, such as the waveforms described in greater detail below with reference to FIGS. 4C-4D).
[0199] Mixer 125 receives waveforms (e.g., from waveform module 123) as an input, and mixes together the waveforms. For example, when mixer 125 receives two or more waveforms (e.g., a first waveform in a first channel and a second waveform that at least partially overlaps with the first waveform in a second channel) mixer 125 outputs a combined waveform that corresponds to a sum of the two or more waveforms. In some embodiments, mixer 125 also modifies one or more waveforms of the two or more waveforms to emphasize particular waveform(s) over the rest of the two or more waveforms (e.g., by increasing a scale of the particular waveform(s) and/or decreasing a scale of the rest of the waveforms). In some circumstances, mixer 125 selects one or more waveforms to remove from the combined waveform (e.g., the waveform from the oldest source is dropped when there are waveforms from more than three sources that have been requested to be output concurrently by tactile output generator 167).
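The summing behavior described for mixer 125 can be illustrated by the following Python sketch, which adds same-rate waveforms sample by sample. This is a simplified, hypothetical model of the mixing operation, not the device's implementation:

```python
# Illustrative sketch of waveform mixing: the combined waveform is the
# sample-by-sample sum of two or more (possibly overlapping) waveforms.

def mix(waveforms):
    """Sum same-rate waveforms of possibly different lengths."""
    length = max(len(w) for w in waveforms)
    return [sum(w[i] for w in waveforms if i < len(w)) for i in range(length)]
```

For example, mixing a three-sample waveform with a two-sample waveform that overlaps its first two samples produces `mix([[1, 2, 3], [10, 10]])`, i.e., `[11, 12, 3]`.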
[0200] Compressor 127 receives waveforms (e.g., a combined waveform from mixer 125) as an input, and modifies the waveforms. In some embodiments, compressor 127 reduces the waveforms (e.g., in accordance with physical specifications of tactile output generators 167 (FIG. 1A) or 357 (FIG. 3)) so that tactile outputs corresponding to the waveforms are reduced. In some embodiments, compressor 127 limits the waveforms, such as by enforcing a predefined maximum amplitude for the waveforms. For example, compressor 127 reduces amplitudes of portions of waveforms that exceed a predefined amplitude threshold while maintaining amplitudes of portions of waveforms that do not exceed the predefined amplitude threshold. In some embodiments, compressor 127 reduces a dynamic range of the waveforms. In some embodiments, compressor 127 dynamically reduces the dynamic range of the waveforms so that the combined waveforms remain within performance specifications of the tactile output generator 167 (e.g., force and/or moveable mass displacement limits).
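The amplitude-limiting behavior described for compressor 127 (reducing portions of waveforms that exceed a predefined threshold while leaving the rest unchanged) can be sketched as simple clipping. This is an illustrative simplification; a real compressor would typically apply a smoother gain curve:

```python
# Illustrative sketch of amplitude limiting per compressor 127: samples
# beyond a predefined maximum amplitude are clipped; others pass through.

def limit(waveform, max_amplitude):
    return [max(-max_amplitude, min(max_amplitude, s)) for s in waveform]
```

For example, with a maximum amplitude of 1.0, the samples 0.2, 1.5, and -2.0 become 0.2, 1.0, and -1.0.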
[0201] Low-pass filter 129 receives waveforms (e.g., compressed waveforms from compressor 127) as an input, and filters (e.g., smooths) the waveforms (e.g., removes or reduces high frequency signal components in the waveforms). For example, in some instances, compressor 127 includes, in compressed waveforms, extraneous signals (e.g., high frequency signal components) that interfere with the generation of tactile outputs and/or exceed performance specifications of tactile output generator 167 when the tactile outputs are generated in accordance with the compressed waveforms. Low-pass filter 129 reduces or removes such extraneous signals in the waveforms.
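The smoothing behavior attributed to low-pass filter 129 can be illustrated with a single-pole recursive smoother, one of the simplest low-pass filters. The patent does not specify the filter design, so the following Python sketch is a hypothetical stand-in:

```python
# Illustrative sketch of low-pass filtering (smoothing): a single-pole
# recursive filter that attenuates high-frequency components.
# The smoothing factor alpha (0 < alpha <= 1) is an assumed parameter.

def low_pass(waveform, alpha=0.5):
    out = []
    prev = 0.0
    for s in waveform:
        prev = prev + alpha * (s - prev)  # move partway toward each sample
        out.append(prev)
    return out
```

A step input of 1.0, 1.0 with alpha = 0.5 yields 0.5, 0.75: the output approaches the input gradually, suppressing abrupt (high-frequency) changes.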
[0202] Thermal controller 131 receives waveforms (e.g., filtered waveforms from low-pass filter 129) as an input, and adjusts the waveforms in accordance with thermal conditions of device 100 (e.g., based on internal temperatures detected within device 100, such as the temperature of haptic feedback controller 161, and/or external temperatures detected by device 100). For example, in some cases, the output of haptic feedback controller 161 varies depending on the temperature (e.g., haptic feedback controller 161, in response to receiving same waveforms, generates a first tactile output when haptic feedback controller 161 is at a first temperature and generates a second tactile output when haptic feedback controller 161 is at a second temperature that is distinct from the first temperature). For example, the magnitude (or the amplitude) of the tactile outputs may vary depending on the temperature. To reduce the effect of the temperature variations, the waveforms are modified (e.g., an amplitude of the waveforms is increased or decreased based on the temperature).
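The temperature-based amplitude adjustment described for thermal controller 131 can be sketched as scaling the waveform by a temperature-dependent gain. The gain curve below (linear around a reference temperature) is an assumption for illustration only; the patent does not specify how amplitude varies with temperature:

```python
# Illustrative sketch of thermal compensation per thermal controller 131:
# scale waveform amplitude based on measured temperature. The reference
# temperature and gain_per_degree values are assumed, not from the patent.

def thermal_adjust(waveform, temperature_c, reference_c=25.0, gain_per_degree=0.01):
    gain = 1.0 + gain_per_degree * (reference_c - temperature_c)
    return [s * gain for s in waveform]
```

At the reference temperature the waveform passes through unchanged; at lower temperatures the amplitude is boosted to offset reduced actuator output.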
[0203] In some embodiments, haptic feedback module 133 (e.g., trigger module 121) is coupled to hardware input processing module 146. In some embodiments, other input controller(s) 160 in FIG. 1A includes hardware input processing module 146. In some embodiments, hardware input processing module 146 receives inputs from hardware input device 145 (e.g., other input or control devices 116 in FIG. 1A, such as a home button or an intensity-sensitive input surface, such as an intensity-sensitive touch screen). In some embodiments, hardware input device 145 is any input device described herein, such as touch-sensitive display system 112 (FIG. 1A), keyboard/mouse 350 (FIG. 3), touchpad 355 (FIG. 3), one of other input or control devices 116 (FIG. 1A), or an intensity-sensitive home button. In some embodiments,
hardware input device 145 consists of an intensity-sensitive home button, and not touch-sensitive display system 112 (FIG. 1A), keyboard/mouse 350 (FIG. 3), or touchpad 355 (FIG. 3). In some embodiments, in response to inputs from hardware input device 145 (e.g., an intensity-sensitive home button or a touch screen), hardware input processing module 146 provides one or more trigger signals to haptic feedback module 133 to indicate that a user input satisfying predefined input criteria, such as an input corresponding to a “click” of a home button (e.g., a “down click” or an “up click”), has been detected. In some embodiments, haptic feedback module 133 provides waveforms that correspond to the “click” of a home button in response to the input corresponding to the “click” of a home button, simulating a haptic feedback of pressing a physical home button.
[0204] In some embodiments, the tactile output module includes haptic feedback controller 161 (e.g., haptic feedback controller 161 in FIG. 1A), which controls the generation of tactile outputs. In some embodiments, haptic feedback controller 161 is coupled to a plurality of tactile output generators, and selects one or more tactile output generators of the plurality of tactile output generators and sends waveforms to the selected one or more tactile output generators for generating tactile outputs. In some embodiments, haptic feedback controller 161 coordinates tactile output requests that correspond to activation of hardware input device 145 and tactile output requests that correspond to software events (e.g., tactile output requests from haptic feedback module 133) and modifies one or more waveforms of the two or more waveforms to emphasize particular waveform(s) over the rest of the two or more waveforms (e.g., by increasing a scale of the particular waveform(s) and/or decreasing a scale of the rest of the waveforms, such as to prioritize tactile outputs that correspond to activations of hardware input device 145 over tactile outputs that correspond to software events).
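The prioritization described in this paragraph, emphasizing waveforms tied to hardware input activations over waveforms tied to software events, can be sketched as follows. The scale factors and function names are illustrative assumptions, not part of this specification.

```python
# Hypothetical sketch of coordinating tactile output requests: requests that
# correspond to hardware input activations pass at full scale, while requests
# that correspond to software events are attenuated before mixing.

HW_SCALE = 1.0   # assumed scale for hardware-input waveforms
SW_SCALE = 0.5   # assumed scale for software-event waveforms

def mix_waveforms(requests):
    """requests: list of (source, samples) pairs, source being 'hardware'
    or 'software'. Returns the sample-wise sum of the scaled waveforms."""
    length = max(len(samples) for _, samples in requests)
    mixed = [0.0] * length
    for source, samples in requests:
        scale = HW_SCALE if source == "hardware" else SW_SCALE
        for i, s in enumerate(samples):
            mixed[i] += scale * s
    return mixed
```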
[0205] In some embodiments, as shown in FIG. 1C, an output of haptic feedback controller 161 is coupled to audio circuitry of device 100 (e.g., audio circuitry 110, FIG. 1A), and provides audio signals to audio circuitry of device 100. In some embodiments, haptic feedback controller 161 provides both waveforms used for generating tactile outputs and audio signals used for providing audio outputs in conjunction with generation of the tactile outputs. In some embodiments, haptic feedback controller 161 modifies audio signals and/or waveforms (used for
generating tactile outputs) so that the audio outputs and the tactile outputs are synchronized (e.g., by delaying the audio signals and/or waveforms). In some embodiments, haptic feedback controller 161 includes a digital-to-analog converter used for converting digital waveforms into analog signals, which are received by amplifier 163 and/or tactile output generator 167.
[0206] In some embodiments, the tactile output module includes amplifier 163. In some embodiments, amplifier 163 receives waveforms (e.g., from haptic feedback controller 161) and amplifies the waveforms prior to sending the amplified waveforms to tactile output generator 167 (e.g., any of tactile output generators 167 (FIG. 1A) or 357 (FIG. 3)). For example, amplifier 163 amplifies the received waveforms to signal levels that are in accordance with physical specifications of tactile output generator 167 (e.g., to a voltage and/or a current required by tactile output generator 167 for generating tactile outputs so that the signals sent to tactile output generator 167 produce tactile outputs that correspond to the waveforms received from haptic feedback controller 161) and sends the amplified waveforms to tactile output generator 167. In response, tactile output generator 167 generates tactile outputs (e.g., by shifting a moveable mass back and forth in one or more dimensions relative to a neutral position of the moveable mass).
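The amplification step described in this paragraph, scaling a waveform to signal levels matching the generator's physical specifications, can be sketched as follows. The normalized-waveform assumption and all names are illustrative, not taken from the specification.

```python
# Hypothetical sketch of the amplifier stage: scale a normalized waveform
# (peak assumed within [-1, 1]) so its peak matches the drive voltage the
# tactile output generator requires.

def amplify(samples, target_peak_volts):
    """Return the waveform scaled to the generator's required peak voltage."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)          # silent waveform needs no gain
    gain = target_peak_volts / peak
    return [s * gain for s in samples]
```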
[0207] In some embodiments, the tactile output module includes sensor 169, which is coupled to tactile output generator 167. Sensor 169 detects states or state changes (e.g., mechanical position, physical displacement, and/or movement) of tactile output generator 167 or one or more components of tactile output generator 167 (e.g., one or more moving parts, such as a membrane, used to generate tactile outputs). In some embodiments, sensor 169 is a magnetic field sensor (e.g., a Hall effect sensor) or other displacement and/or movement sensor. In some embodiments, sensor 169 provides information (e.g., a position, a displacement, and/or a movement of one or more parts in tactile output generator 167) to haptic feedback controller 161 and, in accordance with the information provided by sensor 169 about the state of tactile output generator 167, haptic feedback controller 161 adjusts the waveforms output from haptic feedback controller 161 (e.g., waveforms sent to tactile output generator 167, optionally via amplifier 163).
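The feedback path described in this paragraph, where the controller adjusts its output waveforms based on the state reported by sensor 169, can be sketched as a simple proportional correction. The gain and the proportional-control choice are illustrative assumptions; the specification does not state how the adjustment is computed.

```python
# Hypothetical closed-loop sketch: nudge the drive amplitude up when the
# moveable mass displaced less than intended, and down when it displaced more.

def adjust_drive(current_amplitude, measured_displacement,
                 target_displacement, gain=0.1):
    """Return a proportionally corrected drive amplitude."""
    error = target_displacement - measured_displacement
    return current_amplitude + gain * error
```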
[0208] It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate
multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
[0209] FIG. 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments. The touch screen optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
[0210] Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
[0211] In some embodiments, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, subscriber identity module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval;
to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
[0212] FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. Device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child’s learning toy), a gaming system, or a control device (e.g., a home or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communications interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Device 300 includes input/output (I/O) interface 330 comprising display 340, which is typically a touch screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, tactile output generator 357 for generating tactile outputs on device 300 (e.g., similar to tactile output generator(s) 167 described above with reference to FIG. 1A), sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to FIG. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices remotely located from CPU(s) 310.
In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (FIG. 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable
multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (FIG. 1A) optionally does not store these modules.
[0213] Each of the above-identified elements in FIG. 3 is, optionally, stored in one or more of the previously mentioned memory devices. Each of the above-identified modules corresponds to a set of instructions for performing a function described above. The above-identified modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above.
[0214] Attention is now directed towards embodiments of user interfaces that are, optionally, implemented on, for example, portable multifunction device 100.
[0215] FIG. 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof:
• Signal strength indicator(s) 402 for wireless communication(s), such as cellular and WiFi signals;
• Time 404;
• Bluetooth indicator 405;
• Battery status indicator 406;
• Tray 408 with icons for frequently used applications, such as:
o Icon 416 for telephone module 138, labeled “Phone,” which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
o Icon 418 for e-mail client module 140, labeled “Mail,” which optionally includes an indicator 410 of the number of unread e-mails;
o Icon 420 for browser module 147, labeled “Browser;” and
o Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled “iPod;” and
• Icons for other applications, such as:
o Icon 424 for IM module 141, labeled “Messages;”
o Icon 426 for calendar module 148, labeled “Calendar;”
o Icon 428 for image management module 144, labeled “Photos;”
o Icon 430 for camera module 143, labeled “Camera;”
o Icon 432 for online video module 155, labeled “Online Video;”
o Icon 434 for stocks widget 149-2, labeled “Stocks;”
o Icon 436 for map module 154, labeled “Maps;”
o Icon 438 for weather widget 149-1, labeled “Weather;”
o Icon 440 for alarm clock widget 149-4, labeled “Clock;”
o Icon 442 for workout support module 142, labeled “Workout Support;”
o Icon 444 for notes module 153, labeled “Notes;” and
o Icon 446 for a settings application or module, labeled “Settings,” which provides access to settings for device 100 and its various applications 136.
[0216] It should be noted that the icon labels illustrated in FIG. 4A are merely exemplary. For example, icon 422 for video and music player module 152 is labeled “Music” or “Music Player.” Other labels are, optionally, used for various application icons. In some embodiments, a label for a respective application icon includes a name of an application corresponding to the
respective application icon. In some embodiments, a label for a particular application icon is distinct from a name of an application corresponding to the particular application icon.
[0217] FIG. 4B illustrates an exemplary user interface on a device (e.g., device 300, FIG. 3) with a touch-sensitive surface 451 (e.g., a tablet or touchpad 355, FIG. 3) that is separate from the display 450 (e.g., touch screen display 112). Device 300 also, optionally, includes one or more contact intensity sensors (e.g., one or more of sensors 359) for detecting intensity of contacts on touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of device 300.
[0218] Although some of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in FIG. 4B. In some embodiments, the touch-sensitive surface (e.g., 451 in FIG. 4B) has a primary axis (e.g., 452 in FIG. 4B) that corresponds to a primary axis (e.g., 453 in FIG. 4B) on the display (e.g., 450). In accordance with these embodiments, the device detects contacts (e.g., 460 and 462 in FIG. 4B) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (e.g., in FIG. 4B, 460 corresponds to 468 and 462 corresponds to 470). In this way, user inputs (e.g., contacts 460 and 462, and movements thereof) detected by the device on the touch-sensitive surface (e.g., 451 in FIG. 4B) are used by the device to manipulate the user interface on the display (e.g., 450 in FIG. 4B) of the multifunction device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
[0219] Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by
ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
[0220] FIG. 5A illustrates exemplary personal electronic device 500. Device 500 includes body 502. In some embodiments, device 500 can include some or all of the features described with respect to devices 100 and 300 (e.g., FIGS. 1A-4B). In some embodiments, device 500 has touch-sensitive display screen 504, hereafter touch screen 504. Alternatively, or in addition to touch screen 504, device 500 has a display and a touch-sensitive surface. As with devices 100 and 300, in some embodiments, touch screen 504 (or the touch-sensitive surface) optionally includes one or more intensity sensors for detecting intensity of contacts (e.g., touches) being applied. The one or more intensity sensors of touch screen 504 (or the touch-sensitive surface) can provide output data that represents the intensity of touches. The user interface of device 500 can respond to touches based on their intensity, meaning that touches of different intensities can invoke different user interface operations on device 500.
[0221] Exemplary techniques for detecting and processing touch intensity are found, for example, in related applications: International Patent Application Serial No.
PCT/US2013/040061, titled “Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application,” filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No.
PCT/US2013/069483, titled “Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships,” filed November 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
[0222] In some embodiments, device 500 has one or more input mechanisms 506 and 508. Input mechanisms 506 and 508, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets,
bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit device 500 to be worn by a user.
[0223] FIG. 5B depicts exemplary personal electronic device 500. In some embodiments, device 500 can include some or all of the components described with respect to FIGS. 1A, 1B, and 3. Device 500 has bus 512 that operatively couples I/O section 514 with one or more computer processors 516 and memory 518. I/O section 514 can be connected to display 504, which can have touch-sensitive component 522 and, optionally, intensity sensor 524 (e.g., contact intensity sensor). In addition, I/O section 514 can be connected with communication unit 530 for receiving application and operating system data, using Wi-Fi, Bluetooth, near field communication (NFC), cellular, and/or other wireless communication techniques. Device 500 can include input mechanisms 506 and/or 508. Input mechanism 506 is, optionally, a rotatable input device or a depressible and rotatable input device, for example. Input mechanism 508 is, optionally, a button, in some examples.
[0224] Input mechanism 508 is, optionally, a microphone, in some examples. Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532, accelerometer 534, directional sensor 540 (e.g., compass), gyroscope 536, motion sensor 538, and/or a combination thereof, all of which can be operatively connected to I/O section 514.
[0225] Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516, for example, can cause the computer processors to perform the techniques described below, including processes 900, 1200, 1500, 1800, 2100, 2400, 2700, and 3000 (FIGS. 9A-9I, 12A-12C, 15A-15K, 18A-18F, 21A-21D, 24A-24C, 27A-27E, and 30A-30D). A computer-readable storage medium can be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based
on CD, DVD, or Blu-ray technologies, as well as persistent solid-state memory such as flash, solid-state drives, and the like. Personal electronic device 500 is not limited to the components and configuration of FIG. 5B, but can include other or additional components in multiple configurations.
[0226] As used here, the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100, 300, and/or 500 (FIGS. 1A, 3, and 5A-5B). For example, an image (e.g., icon), a button, and text (e.g., hyperlink) each optionally constitute an affordance.
[0227] As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in FIG. 3 or touch-sensitive surface 451 in FIG. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in FIG. 1A or touch screen 112 in FIG. 4A) that enables direct interaction with user interface elements on the touch screen display, a detected contact on the touch screen acts as a “focus selector” so that when an input (e.g., a press input by the contact) is detected on the touch screen display at a location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of a user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on a touch screen display (e.g., by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface. 
Without regard to the specific form taken by the focus selector, the focus selector is generally the user interface element (or contact on a touch screen display) that is controlled by the user so as to communicate the user’s intended interaction with the user interface (e.g., by indicating, to the device, the element of the user
interface with which the user is intending to interact). For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or touch screen) will indicate that the user is intending to activate the respective button (as opposed to other user interface elements shown on a display of the device).
[0228] As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. 
In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo
performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
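The three-way threshold comparison in the example above can be sketched as follows. Here the characteristic intensity is taken as the maximum of the intensity samples, which is only one of the options the paragraph lists, and the threshold values and names are illustrative assumptions.

```python
# Hypothetical sketch: compare a characteristic intensity (here, the maximum
# sampled intensity) against two thresholds to select among three operations,
# as in the example above. Threshold values are illustrative.

FIRST_THRESHOLD = 1.0
SECOND_THRESHOLD = 2.0

def operation_for_contact(intensity_samples):
    characteristic = max(intensity_samples)
    if characteristic > SECOND_THRESHOLD:
        return "third operation"     # exceeds the second intensity threshold
    if characteristic > FIRST_THRESHOLD:
        return "second operation"    # exceeds only the first threshold
    return "first operation"         # exceeds neither threshold
```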
[0229] FIG. 5C illustrates detecting a plurality of contacts 552A-552E on touch-sensitive display screen 504 with a plurality of intensity sensors 524A-524D. FIG. 5C additionally includes intensity diagrams that show the current intensity measurements of the intensity sensors 524A-524D relative to units of intensity. In this example, the intensity measurements of intensity sensors 524A and 524D are each 9 units of intensity, and the intensity measurements of intensity sensors 524B and 524C are each 7 units of intensity. In some implementations, an aggregate intensity is the sum of the intensity measurements of the plurality of intensity sensors 524A-524D, which in this example is 32 intensity units. In some embodiments, each contact is assigned a respective intensity that is a portion of the aggregate intensity. FIG. 5D illustrates assigning the aggregate intensity to contacts 552A-552E based on their distance from the center of force 554. In this example, each of contacts 552A, 552B, and 552E are assigned an intensity of contact of 8 intensity units of the aggregate intensity, and each of contacts 552C and 552D are assigned an intensity of contact of 4 intensity units of the aggregate intensity. More generally, in some implementations, each contact j is assigned a respective intensity Ij that is a portion of the aggregate intensity, A, in accordance with a predefined mathematical function, Ij = A·(Dj/ΣDi), where Dj is the distance of the respective contact j to the center of force, and ΣDi is the sum of the distances of all the respective contacts (e.g., i=1 to last) to the center of force. The operations described with reference to FIGS. 5C-5D can be performed using an electronic device similar or identical to device 100, 300, or 500. In some embodiments, a characteristic intensity of a contact is based on one or more intensities of the contact.
In some embodiments, the intensity sensors are used to determine a single characteristic intensity (e.g., a single characteristic intensity of a single contact). It should be noted that the intensity diagrams are not part of a displayed user interface, but are included in FIGS. 5C-5D to aid the reader.
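The assignment function Ij = A·(Dj/ΣDi) from the paragraph above can be written out directly. The distances in the usage check are hypothetical values chosen so the result reproduces the 8/8/4/4/8-unit assignment of the 32-unit aggregate intensity in the FIG. 5D example; the specification does not give the actual distances.

```python
# Direct implementation of Ij = A * (Dj / sum(Di)) from the passage above,
# where A is the aggregate intensity and Dj is the distance of contact j
# from the center of force.

def assign_intensities(aggregate, distances):
    """Split the aggregate intensity A across contacts per their distances."""
    total = sum(distances)
    return [aggregate * d / total for d in distances]
```

With assumed distances [2, 2, 1, 1, 2] for contacts 552A-552E, a 32-unit aggregate yields [8, 8, 4, 4, 8] units, matching the example.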
[0230] In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface optionally receives a continuous swipe contact transitioning from a start location and reaching an end location, at which point the intensity of the contact increases. In this example, the characteristic intensity of
the contact at the end location is, optionally, based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm is, optionally, applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
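One of the smoothing options named above, the unweighted sliding-average algorithm, can be sketched as follows. The window size and edge handling (shrinking the window at the boundaries) are illustrative assumptions.

```python
# Hypothetical sketch of an unweighted sliding-average smoothing algorithm
# applied to intensity samples before the characteristic intensity is
# computed; narrow spikes are flattened toward their neighbors.

def sliding_average(samples, window=3):
    """Return samples averaged over a centered window (shrunk at the edges)."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out
```

A narrow 9-unit spike in [0, 0, 9, 0, 0] is reduced to 3 units, illustrating how such filters suppress spikes and dips.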
[0231] The intensity of a contact on the touch-sensitive surface is, optionally, characterized relative to one or more intensity thresholds, such as a contact-detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
[0232] An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold is sometimes referred to as a “deep press”
input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold to an intensity between the contact-detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold to an intensity below the contact-detection intensity threshold is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments, the contact-detection intensity threshold is zero. In some embodiments, the contact-detection intensity threshold is greater than zero.
[0233] In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., an “up stroke” of the respective press input).
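The "down stroke" / "up stroke" detection in paragraph [0233] amounts to watching for crossings of the press-input intensity threshold. A minimal sketch, assuming a normalized intensity scale and a hypothetical threshold value:

```python
def detect_press_events(intensities, press_threshold=0.3):
    """Return 'down stroke' / 'up stroke' events as a contact's intensity
    rises above and then falls below the press-input intensity threshold,
    per paragraph [0233]. The threshold value is illustrative."""
    events = []
    pressed = False
    for intensity in intensities:
        if not pressed and intensity >= press_threshold:
            pressed = True
            events.append("down stroke")
        elif pressed and intensity < press_threshold:
            pressed = False
            events.append("up stroke")
    return events
```

A respective operation can then be triggered on either event, depending on whether the embodiment responds to the down stroke or the up stroke.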
[0234] FIGS. 5E-5H illustrate detection of a gesture that includes a press input that corresponds to an increase in intensity of a contact 562 from an intensity below a light press intensity threshold (e.g., “ITL”) in FIG. 5E, to an intensity above a deep press intensity threshold (e.g., “ITD”) in FIG. 5H. The gesture performed with contact 562 is detected on touch-sensitive surface 560 while cursor 576 is displayed over application icon 572B corresponding to App 2, on a displayed user interface 570 that includes application icons 572A-572D displayed in predefined region 574. In some embodiments, the gesture is detected on touch-sensitive display 504. The intensity sensors detect the intensity of contacts on touch-sensitive surface 560. The device
determines that the intensity of contact 562 peaked above the deep press intensity threshold (e.g., “ITD”). Contact 562 is maintained on touch-sensitive surface 560. In response to the detection of the gesture, and in accordance with contact 562 having an intensity that goes above the deep press intensity threshold (e.g., “ITD”) during the gesture, reduced-scale representations 578A-578C (e.g., thumbnails) of recently opened documents for App 2 are displayed, as shown in FIGS. 5F-5H. In some embodiments, the intensity, which is compared to the one or more intensity thresholds, is the characteristic intensity of a contact. It should be noted that the intensity diagram for contact 562 is not part of a displayed user interface, but is included in FIGS. 5E-5H to aid the reader.
[0235] In some embodiments, the display of representations 578A-578C includes an animation. For example, representation 578A is initially displayed in proximity of application icon 572B, as shown in FIG. 5F. As the animation proceeds, representation 578A moves upward and representation 578B is displayed in proximity of application icon 572B, as shown in FIG. 5G. Then, representation 578A moves upward, 578B moves upward toward representation 578A, and representation 578C is displayed in proximity of application icon 572B, as shown in FIG. 5H. Representations 578A-578C form an array above icon 572B. In some embodiments, the animation progresses in accordance with an intensity of contact 562, as shown in FIGS. 5F-5G, where the representations 578A-578C appear and move upwards as the intensity of contact 562 increases toward the deep press intensity threshold (e.g., “ITD”). In some embodiments, the intensity, on which the progress of the animation is based, is the characteristic intensity of the contact. The operations described with reference to FIGS. 5E-5H can be performed using an electronic device similar or identical to device 100, 300, or 500.
[0236] In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent
decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
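The hysteresis scheme of paragraph [0236] uses a lower release threshold so that small intensity fluctuations ("jitter") near the press-input threshold do not repeatedly toggle the press. A minimal sketch, with hypothetical threshold values and the 75% proportion mentioned above:

```python
def detect_press_with_hysteresis(intensities, press_threshold=0.3,
                                 hysteresis_ratio=0.75):
    """Recognize a 'down stroke' on crossing press_threshold, but recognize
    the 'up stroke' only when intensity falls below the lower hysteresis
    threshold (here 75% of the press-input threshold), per paragraph [0236]."""
    release_threshold = press_threshold * hysteresis_ratio
    events = []
    pressed = False
    for intensity in intensities:
        if not pressed and intensity >= press_threshold:
            pressed = True
            events.append("down stroke")
        elif pressed and intensity < release_threshold:
            pressed = False
            events.append("up stroke")
    return events
```

With a single-threshold detector, an intensity sequence that dips just below the press threshold and rises again would emit a spurious release and re-press; the hysteresis band suppresses both.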
[0237] For ease of explanation, the descriptions of operations performed in response to a press input associated with a press-input intensity threshold or in response to a gesture including the press input are, optionally, triggered in response to detecting either: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, and/or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold.
[0238] FIG. 6 illustrates exemplary devices connected via one or more communication channels to participate in a transaction in accordance with some embodiments. One or more exemplary electronic devices (e.g., devices 100, 300, and 500) are configured to optionally detect input (e.g., a particular user input, an NFC field) and optionally transmit payment information (e.g., using NFC). The one or more electronic devices optionally include NFC hardware and are configured to be NFC-enabled.
[0239] The electronic devices (e.g., devices 100, 300, and 500) are optionally configured to store payment account information associated with each of one or more payment accounts. Payment account information includes, for example, one or more of: a person’s or company’s name, a billing address, a login, a password, an account number, an expiration date, a security code, a telephone number, a bank associated with the payment account (e.g., an issuing bank), and a card network identifier. In some examples, payment account information includes an image, such as a picture of a payment card (e.g., taken by the device and/or received at the device). In some examples, the electronic devices receive user input including at least some payment account information (e.g., receiving a user-entered credit, debit, account, or gift card number and expiration date). In some examples, the electronic devices detect at least some payment account information from an image (e.g., of a payment card captured by a camera sensor of the device). In some examples, the electronic devices receive at least some payment account information from another device (e.g., another user device or a server). In some examples, the electronic device receives payment account information from a server associated with another service for which an account for a user or user device previously made a purchase or identified payment account data (e.g., an app for renting or selling audio and/or video files).
[0240] In some embodiments, a payment account is added to an electronic device (e.g., device 100, 300, and 500), such that payment account information is securely stored on the electronic device. In some examples, after a user initiates such process, the electronic device transmits information for the payment account to a transaction-coordination server, which then communicates with a server operated by a payment network for the account (e.g., a payment server) to ensure a validity of the information. The electronic device is optionally configured to receive a script from the server that allows the electronic device to program payment information for the account onto the secure element.
[0241] In some embodiments, communication among electronic devices 100, 300, and 500 facilitates transactions (e.g., generally or specific transactions). For example, a first electronic device (e.g., 100) can serve as a provisioning or managing device, and can send notifications of new or updated payment account data (e.g., information for a new account, updated information for an existing account, and/or an alert pertaining to an existing account) to a second electronic
device (e.g., 500). In another example, a first electronic device (e.g., 100) can send data to a second electronic device, wherein the data reflects information about payment transactions facilitated at the first electronic device. The information optionally includes one or more of: a payment amount, an account used, a time of purchase, and whether a default account was changed. The second device (e.g., 500) optionally uses such information to update a default payment account (e.g., based on a learning algorithm or explicit user input).
[0242] Electronic devices (e.g., 100, 300, 500) are configured to communicate with each other over any of a variety of networks. For example, the devices communicate using a Bluetooth connection 608 (e.g., which includes a traditional Bluetooth connection or a Bluetooth Low Energy connection) or using a WiFi network 606. Communications among user devices are, optionally, conditioned to reduce the possibility of inappropriately sharing information across devices. For example, communications relating to payment information require that the communicating devices be paired (e.g., be associated with each other via an explicit user interaction) or be associated with a same user account.
[0243] In some embodiments, an electronic device (e.g., 100, 300, 500) is used to communicate with a point-of-sale (POS) payment terminal 600, which is optionally NFC-enabled. The communication optionally occurs using a variety of communication channels and/or technologies. In some examples, the electronic device (e.g., 100, 300, 500) communicates with payment terminal 600 using an NFC channel 610. In some examples, payment terminal 600 communicates with an electronic device (e.g., 100, 300, 500) using a peer-to-peer NFC mode. The electronic device (e.g., 100, 300, 500) is optionally configured to transmit a signal to payment terminal 600 that includes payment information for a payment account (e.g., a default account or an account selected for the particular transaction).
[0244] In some embodiments, proceeding with a transaction includes transmitting a signal that includes payment information for an account, such as a payment account. In some embodiments, proceeding with the transaction includes reconfiguring the electronic device (e.g., 100, 300, 500) to respond as a contactless payment card, such as an NFC-enabled contactless payment card, and then transmitting credentials of the account via NFC, such as to payment terminal 600. In some embodiments, subsequent to transmitting credentials of the account via
NFC, the electronic device reconfigures to not respond as a contactless payment card (e.g., requiring authorization before again reconfigured to respond as a contactless payment card via NFC).
[0245] In some embodiments, generation of and/or transmission of the signal is controlled by a secure element in the electronic device (e.g., 100, 300, 500). The secure element optionally requires a particular user input prior to releasing payment information. For example, the secure element optionally requires detection that the electronic device is being worn, detection of a button press, detection of entry of a passcode, detection of a touch, detection of one or more option selections (e.g., received while interacting with an application), detection of a fingerprint signature, detection of a voice or voice command, and/or detection of a gesture or movement (e.g., rotation or acceleration). In some examples, if a communication channel (e.g., an NFC communication channel) with another device (e.g., payment terminal 600) is established within a defined time period from detection of the input, the secure element releases payment information to be transmitted to the other device (e.g., payment terminal 600). In some examples, the secure element is a hardware component that controls release of secure information. In some examples, the secure element is a software component that controls release of secure information.
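The timed-release condition in paragraph [0245] — release payment information only if a communication channel is established within a defined period after the authorizing input — can be sketched as follows. The window length and function name are hypothetical:

```python
# Hypothetical release window; the document does not specify a duration.
RELEASE_WINDOW_SECONDS = 30.0

def may_release_payment_info(input_time: float, channel_time: float,
                             window: float = RELEASE_WINDOW_SECONDS) -> bool:
    """Return True if a communication channel (e.g., NFC) was established
    within the defined time period after the authorizing user input.
    Times are in seconds on a common monotonic clock."""
    elapsed = channel_time - input_time
    # The channel must come after the input, and within the window.
    return 0.0 <= elapsed <= window
```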
[0246] In some embodiments, protocols related to transaction participation depend on, for example, device types. For example, a condition for generating and/or transmitting payment information can be different for a wearable device (e.g., device 500) and a phone (e.g., device 100). For example, a generation and/or transmission condition for a wearable device includes detecting that a button has been pressed (e.g., after a security verification), while a corresponding condition for a phone does not require button-depression and instead requires detection of particular interaction with an application. In some examples, a condition for transmitting and/or releasing payment information includes receiving particular input on each of multiple devices. For example, release of payment information optionally requires detection of a fingerprint and/or passcode at the device (e.g., device 100) and detection of a mechanical input (e.g., button press) on another device (e.g., device 500).
[0247] Payment terminal 600 optionally uses the payment information to generate a signal to transmit to a payment server 604 to determine whether the payment is authorized. Payment
server 604 optionally includes any device or system configured to receive payment information associated with a payment account and to determine whether a proposed purchase is authorized. In some examples, payment server 604 includes a server of an issuing bank. Payment terminal 600 communicates with payment server 604 directly or indirectly via one or more other devices or systems (e.g., a server of an acquiring bank and/or a server of a card network).
[0248] Payment server 604 optionally uses at least some of the payment information to identify a user account from among a database of user accounts (e.g., 602). For example, each user account includes payment information. An account is, optionally, located by locating an account with particular payment information matching that from the POS communication. In some examples, a payment is denied when provided payment information is not consistent (e.g., an expiration date does not correspond to a credit, debit or gift card number) or when no account includes payment information matching that from the POS communication.
[0249] In some embodiments, data for the user account further identifies one or more restrictions (e.g., credit limits); current or previous balances; previous transaction dates, locations and/or amounts; account status (e.g., active or frozen), and/or authorization instructions. In some examples, the payment server (e.g., 604) uses such data to determine whether to authorize a payment. For example, a payment server denies a payment when a purchase amount added to a current balance would result in exceeding an account limit, when an account is frozen, when a previous transaction amount exceeds a threshold, or when a previous transaction count or frequency exceeds a threshold.
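The server-side checks of paragraphs [0248] and [0249] — locate an account by matching payment information, then apply account restrictions — can be sketched as below. All field names and values are illustrative, not from the document:

```python
def authorize_payment(accounts, card_number, expiration, amount):
    """Locate the account whose payment information matches the POS
    communication, then deny per the restrictions in paragraph [0249]:
    inconsistent information, frozen account, or exceeding the limit."""
    account = next((a for a in accounts if a["card_number"] == card_number), None)
    if account is None or account["expiration"] != expiration:
        return "denied"  # no matching account, or inconsistent information
    if account["frozen"]:
        return "denied"  # frozen accounts cannot transact
    if account["balance"] + amount > account["limit"]:
        return "denied"  # purchase would exceed the account limit
    return "authorized"
```

A real payment server would also apply the other criteria mentioned above (previous transaction amounts, counts, and frequency); this sketch shows only the lookup-then-restrict structure.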
[0250] In some embodiments, payment server 604 responds to POS payment terminal 600 with an indication as to whether a proposed purchase is authorized or denied. In some examples, POS payment terminal 600 transmits a signal to the electronic device (e.g., 100, 300, 500) to identify the result. For example, POS payment terminal 600 sends a receipt to the electronic device (e.g., 100, 300, 500) when a purchase is authorized (e.g., via a transaction-coordination server that manages a transaction app on the user device). In some instances, POS payment terminal 600 presents an output (e.g., a visual or audio output) indicative of the result. Payment can be sent to a merchant as part of the authorization process or can be subsequently sent.
[0251] In some embodiments, the electronic device (e.g., 100, 300, 500) participates in a transaction that is completed without involvement of POS payment terminal 600. For example, upon detecting that a mechanical input has been received, a secure element in the electronic device (e.g., 100, 300, 500) releases payment information to allow an application on the electronic device to access the information (e.g., and to transmit the information to a server associated with the application).
[0252] In some embodiments, the electronic device (e.g., 100, 300, 500) is in a locked state or an unlocked state. In the locked state, the electronic device is powered on and operational but is prevented from performing a predefined set of operations in response to user input. The predefined set of operations may include navigation between user interfaces, activation or deactivation of a predefined set of functions, and activation or deactivation of certain applications. The locked state may be used to prevent unintentional or unauthorized use of some functionality of the electronic device or activation or deactivation of some functions on the electronic device. In the unlocked state, the electronic device 100 is powered on and operational and is not prevented from performing at least a portion of the predefined set of operations that cannot be performed while in the locked state.
[0253] When the device is in the locked state, the device is said to be locked. In some embodiments, the device in the locked state may respond to a limited set of user inputs, including input that corresponds to an attempt to transition the device to the unlocked state or input that corresponds to powering the device off.
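The locked-state gating of paragraphs [0252] and [0253] reduces to filtering inputs against a small allowed set while locked. A minimal sketch with hypothetical input names:

```python
# Inputs honored while locked, per paragraph [0253]: attempts to unlock and
# powering the device off. The identifiers are illustrative.
ALLOWED_WHEN_LOCKED = {"unlock_attempt", "power_off"}

def handle_input(locked: bool, user_input: str) -> bool:
    """Return True if the device acts on the input in its current state."""
    if locked:
        return user_input in ALLOWED_WHEN_LOCKED
    return True  # unlocked: the predefined set of operations is available
```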
[0254] In some examples, a secure element (e.g., 115) is a hardware component (e.g., a secure microcontroller chip) configured to securely store data or an algorithm such that the securely stored data is not accessible by the device without proper authentication information from a user of the device. Keeping the securely stored data in a secure element that is separate from other storage on the device prevents access to the securely stored data even if other storage locations on the device are compromised (e.g., by malicious code or other attempts to compromise information stored on the device). In some examples, the secure element provides (or releases) payment information (e.g., an account number and/or a transaction-specific dynamic security code). In some examples, the secure element provides (or releases) the payment
information in response to the device receiving authorization, such as a user authentication (e.g., fingerprint authentication; passcode authentication; detecting double-press of a hardware button when the device is in an unlocked state, and optionally, while the device has been continuously on a user’s wrist since the device was unlocked by providing authentication credentials to the device, where the continuous presence of the device on the user’s wrist is determined by periodically checking that the device is in contact with the user’s skin). For example, the device detects a fingerprint at a fingerprint sensor (e.g., a fingerprint sensor integrated into a button) of the device. The device determines whether the fingerprint is consistent with a registered fingerprint. In accordance with a determination that the fingerprint is consistent with the registered fingerprint, the secure element provides (or releases) payment information. In accordance with a determination that the fingerprint is not consistent with the registered fingerprint, the secure element forgoes providing (or releasing) payment information.
[0255] Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
[0256] FIGS. 7A-7E illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 7A-7E relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 8A-8AH, which in turn are used to illustrate the processes described below, including the processes in FIGS. 9A-9I.
[0257] FIG. 7A illustrates an electronic device 700 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 7A-7E, electronic device 700 is a smartphone. In other embodiments, electronic device 700 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 700 has a display 702, one or more input devices (e.g., touchscreen of display 702, a mechanical button 704, a mic), and a wireless communication radio.
[0258] In FIG. 7A, electronic device 700 displays, on display 702, a message conversation 708 of a messaging application 706 between a user of the device (e.g., “Kate Appleseed”) and a
message participant 710 (e.g., “John Appleseed”). In some embodiments, message participant 710 is a contact stored on the device. In some embodiments, message participant 710 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 710 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0259] In some embodiments, electronic device 700 also displays, on display 702, a virtual keyboard 712 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 714 displaying the text of a message as a message is typed using virtual keyboard 712. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 712 to type a message. In some embodiments, compose bar 714 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 714 includes a mic button 716 which, when activated, enables the user to record a message using voice input.
[0260] As shown in FIG. 7A, message conversation 708 includes two visible message objects 718 and 720. Message object 718 corresponds to a message sent by the user of the device to message participant 710. In message object 718, the user states to message participant 710: “That restaurant was so good!” Message object 720 corresponds to a message sent by message participant 710 to the user (as a response to the message corresponding to message object 718). In message object 720, message participant 710 responds to the user: “Yeah! Can you send me the photos from last night?”

[0261] In some embodiments, electronic device 700 performs an analysis of the contents (e.g., the text) of the message corresponding to message object 720 (stating, as a response to the message corresponding to message object 718 stating “That restaurant was so good!”, “Yeah! Can you send me the photos from last night?”). In some embodiments, the analysis of the contents (e.g., the text) of the message is performed by electronic device 700 using a language processing component or a language analysis component of the device. In some embodiments, the analysis is performed at an external device (e.g., a server), and electronic device 700 receives a result of the analysis from the external device.
[0262] Based on the analysis of the contents (e.g., the text) of message object 720 (and, optionally, one or more other previous or subsequent message objects of message conversation 708, such as message objects 718 and 720), in accordance with a determination (e.g., made at electronic device 700 or received from an external device, such as a server) that the contents (e.g., the text) of the message corresponding to message object 720 relates to a transfer of an electronic file (e.g., a photo, a video, a document, an audio file) that messaging application 706 is configured to transfer, electronic device 700 displays a selectable indication that corresponds to a transfer of one or more files (e.g., photos, video files, audio files, documents) or to an intent to proceed with a transfer of one or more files (e.g., photos, video files, audio files, documents), as discussed below. For example, in FIG. 7A, a determination is made, based on the text of message object 720 (stating “Yeah! Can you send me the photos from last night?”) that message participant 710 is requesting a transfer of photos taken from a specific time period (e.g., last night).
[0263] In some embodiments, as shown in FIG. 7A, electronic device 700 provides a marking 722 (e.g., underlining, bolding, highlighting) of a phrase (e.g., “photos from last night”) within message object 720 that is determined (based on the analysis discussed above) by the device or by an external device (e.g., a server) communicating with the device to correspond to the request for the transfer of one or more files (e.g., photos from last night). Additionally, in response to the determination that a phrase within the message corresponds to a request for a transfer of one or more files, electronic device 700 displays (e.g., over a portion of virtual keyboard 712, between virtual keyboard 712 and compose bar 714) a suggestions bar 724 that includes a transfer button 726 for proceeding with a transfer of the requested one or more files corresponding to the message of message object 720. For example, in FIG. 7A, transfer button 726 shows “PHOTOS” to indicate that the button relates to the transfer of the requested photos, and that the transfer can be made using an operating-system (first-party) controlled transfer application (and not by a third-party application). In some embodiments, suggestions bar 724 also includes suggested responses (e.g., “Sure,” “OK”) for responding to the message of message object 720 (without proceeding with a transfer of a file).
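The phrase detection behind marking 722 could be far more sophisticated (the document describes a language analysis component, possibly server-side), but the idea of flagging a span of text as a file-transfer request can be sketched with a simple pattern. The pattern and function name are hypothetical stand-ins:

```python
import re

# Illustrative stand-in for the language-analysis component of paragraphs
# [0261]-[0263]: a pattern that flags "send me the photos ..." requests.
REQUEST_PATTERN = re.compile(r"send me the (photos?[\w\s]*)\??", re.IGNORECASE)

def detect_transfer_request(message_text: str):
    """Return the phrase to mark (e.g., underline) in the message bubble,
    or None if no transfer request is detected."""
    match = REQUEST_PATTERN.search(message_text)
    if match is None:
        return None
    return match.group(1).rstrip("? ")
```

The returned span corresponds to the marked phrase ("photos from last night") that, when tapped, launches the photo gallery user interface.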
[0264] In FIG. 7B, while displaying message conversation 708, electronic device 700 detects (e.g., via the touchscreen) user activation of marking 722 of the phrase corresponding to the request for transfer of one or more files (e.g., photos from last night) included in message object 720. For example, as shown in FIG. 7B, the user activation is a tap gesture 701 of marking 722 of the phrase (e.g., the underlined “photos from last night”) included in message object 720. Alternatively, in some embodiments, the user activation can be user selection (e.g., a tap gesture) of transfer button 726 (e.g., showing “PHOTOS”) within suggestions bar 724.
[0265] In FIG. 7C, in response to detecting tap gesture 701 on marking 722 corresponding to message participant 710’s request for “photos from last night,” electronic device 700 displays, on display 702, a photo gallery user interface 728. In some embodiments, as shown in FIG. 7C, photo gallery user interface 728 replaces display of messaging application 706 and virtual keyboard 712. In some embodiments, photo gallery user interface 728 slides into the display from an edge of the display (e.g., slides up from the bottom edge of the display) to replace display of virtual keyboard 712 (and, optionally, messaging application 706).
[0266] In some embodiments, as shown in FIG. 7C, photo gallery user interface 728 includes a plurality of selectable preview images corresponding to photos stored on electronic device 700 (or accessible by the device via a remote server). In some embodiments, as shown in FIG. 7C, the plurality of selectable preview images are organized based on time (e.g., a date during which a photo was taken) and/or based on location (e.g., of where a photo was taken). For example, the plurality of selectable preview images 730A-730F shown under header 730 correspond to photos taken on April 30 at Cupertino, CA, the plurality of selectable preview images 732A-732C shown under header 732 correspond to photos taken yesterday at San Francisco, CA, and the plurality of selectable preview images 734A-734B shown under header 734 correspond to photos taken today.
[0267] Further, because photo gallery user interface 728 was launched via user activation of marking 722 corresponding to the detected request for a transfer of “photos from last night” from message participant 710, selectable preview images that are consistent with the detected request are pre-selected (to be transferred) when photo gallery user interface 728 is displayed. In photo gallery user interface 728, plurality of selectable preview images 732A-732C correspond to
photos taken last night (e.g., as indicated by header 732). Thus, as shown in FIG. 7C, each of selectable preview images 732A-732C is pre-selected (e.g., as indicated by graphical checkmarks on the preview images) to be transferred to message participant 710 via messaging application 706.
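The pre-selection in paragraph [0267] amounts to selecting photos whose timestamps fall inside the requested period. A sketch under the assumption that "last night" maps to 6 PM yesterday through 6 AM today; the window, data shape, and function name are all hypothetical:

```python
from datetime import datetime, timedelta

def preselect_last_night(photos, now):
    """photos: list of (photo_id, datetime_taken) pairs. Return the IDs of
    photos taken 'last night', here assumed to mean 6 PM of the previous
    day through 6 AM of the current day."""
    yesterday = now.date() - timedelta(days=1)
    start = datetime.combine(yesterday, datetime.min.time()) + timedelta(hours=18)
    end = datetime.combine(now.date(), datetime.min.time()) + timedelta(hours=6)
    return [pid for pid, taken in photos if start <= taken <= end]
```

Preview images whose IDs are returned would be shown with the graphical checkmarks of FIG. 7C when the gallery opens.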
[0268] In some embodiments, as also shown in FIG. 7C, photo gallery user interface 728 includes a compose bar 736 (e.g., corresponding to compose bar 714) for including a comment to accompany the transfer (e.g., of selected photos), and a send button 738 for initiating the transfer (e.g., of the selected photos).
[0269] In FIG. 7D, while displaying photo gallery user interface 728 with selectable preview images 732A-732C (corresponding to photos from last night) pre-selected to be transferred (to message participant 710), electronic device 700 detects user activation of send button 738 for initiating the transfer of the photos corresponding to selectable preview images 732A-732C. For example, as shown in FIG. 7D, the user activation is a tap gesture 703 on send button 738.
[0270] In FIG. 7E, in response to detecting tap gesture 703, electronic device 700 transmits, using messaging application 706 and via a wireless communication radio, the photos corresponding to the selected selectable preview images 732A-732C to message participant 710 and again displays (e.g., replaces display of photo gallery user interface 728 with), on display 702, message conversation 708 of messaging application 706. As shown in FIG. 7E, message conversation 708 shows photo message objects 733A-733C (corresponding to the photos corresponding to selectable preview images 732A-732C) having been sent to message participant 710 via messaging application 706. In some embodiments, message conversation 708 further displays an indication 740 (e.g., stating “Delivered”) informing the user that the photos have been sent to the intended recipient (e.g., message participant 710).
[0271] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 7A-7E described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 8A-8AH described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 7A-7E and the processes described below with respect to the exemplary user
interfaces illustrated in FIGS. 8A-8AH are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 700, or 800).
[0272] FIGS. 8A-8AH illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 9A-9I.
[0273] FIG. 8A illustrates an electronic device 800 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 8A-8AH, electronic device 800 is a smartphone. In other embodiments, electronic device 800 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 800 has a display 802, one or more input devices (e.g., a touchscreen of display 802, a mechanical button 804, a microphone), and a wireless communication radio.
[0274] In FIG. 8A, electronic device 800 displays, on display 802, a message conversation 808 of a messaging application 806 between a user of the device (e.g., “Kate Appleseed”) and a message participant 810 (e.g., “John Appleseed”). In some embodiments, message participant 810 is a contact stored on the device. In some embodiments, message participant 810 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 810 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0275] In some embodiments, electronic device 800 also displays, on display 802, a virtual keyboard 812 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 814 displaying the text of a message as a message is typed using virtual keyboard 812. In some embodiments, a mechanical keyboard can be used in addition to, or as an alternative to, virtual keyboard 812 to type a message. In some embodiments, compose bar 814 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 814 includes a mic button 814A which, when activated, enables the user to record a message using voice input.
[0276] As shown in FIG. 8A, message conversation 808 includes two visible message objects 816 and 818. Message object 816 corresponds to a message sent by message participant 810 to the user and message object 818 corresponds to a message sent by the user to message participant 810. In message object 816, message participant 810 states to the user: "Last night was fun. Pay me back when you can." In message object 818, the user asks message participant 810: "How much do I owe you?"

[0277] FIG. 8B shows, in message conversation 808 of messaging application 806, message object 820 corresponding to a new received message from message participant 810 to the user, responding to the user's question of "How much do I owe you?" Specifically, in message object 820, message participant 810 responds: "Dinner and the cab ride together was $28." In response to receiving the new message corresponding to message object 820, an analysis of the contents (e.g., the text) of message object 820 is performed. In some embodiments, the analysis is performed by electronic device 800 using a language processing component or a language analysis component of the device. In some embodiments, the analysis is performed at an external device (e.g., a server), and electronic device 800 receives a result of the analysis from the external device.
[0278] Based on the analysis of the contents (e.g., the text) of message object 820 (and, optionally, one or more other previous or subsequent message objects of message conversation 808, such as message objects 816 and 818), in accordance with a determination (e.g., made at electronic device 800 or received from an external device, such as a server) that the contents (e.g., the text) of the message corresponding to message object 820 relates to a transfer of a payment (e.g., a request for a payment, agreement to send a payment) that messaging application 806 is configured to transfer, electronic device 800 displays a selectable indication that corresponds to a payment amount or to an intent to proceed with a payment transfer, as discussed below.
[0279] In some embodiments, the determination that the contents of a message relates to a payment (or, alternatively, relates to a request for a payment) is made based at least in part on an indication of an amount (e.g., "$28") of the payment included in the message (or one or more previous or subsequent messages). In some embodiments, the determination that the contents of
a message relates to a payment (or, alternatively, relates to a request for a payment) is made based at least in part on an indication of an amount (e.g., "$28") of the payment included in the message and/or one or more text triggers in the message (and, optionally, one or more previous or subsequent messages) (e.g., "I owe you," "Pay me," "Here is the payment"). In some embodiments, the determination that the contents of a message relates to a payment (or, alternatively, relates to a request for a payment) is made based at least in part on an indication of an amount (e.g., "$28") of the payment included in the message and a more detailed analysis of the text of the message (and, optionally, one or more previous or subsequent messages) using language processing and interpretation techniques to decipher an intent of the message (and, optionally, one or more previous or subsequent messages).
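The amount-plus-trigger determination described above can be sketched as a simple heuristic. The following is a minimal illustration only, not the claimed implementation: the function name and the regular expression are assumptions, and only the three quoted trigger phrases come from the text.

```python
import re
from typing import Optional

# Trigger phrases suggesting a payment context. The first three come from
# the text above; "was $" is an illustrative addition.
PAYMENT_TRIGGERS = ("i owe you", "pay me", "here is the payment", "was $")

# Matches a currency amount such as "$28" or "$28.50".
AMOUNT_PATTERN = re.compile(r"\$(\d+(?:\.\d{2})?)")

def detect_payment_amount(message: str) -> Optional[float]:
    """Return the detected payment amount when the message contains both a
    currency amount and a text trigger; otherwise return None."""
    match = AMOUNT_PATTERN.search(message)
    if match is None:
        return None
    if any(trigger in message.lower() for trigger in PAYMENT_TRIGGERS):
        return float(match.group(1))
    return None
```

On the example message above, `detect_payment_amount("Dinner and the cab ride together was $28.")` would yield 28.0, an amount that could then back a marking such as 822 and a pay amount button such as 826.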
[0280] For example, in FIG. 8B, in response to the user’s question of “How much do I owe you?” shown in message object 818, message participant 810 responds, as shown in message object 820, “Dinner and the cab ride together was $28.” In response, an analysis (e.g., by an external device, such as a server, or by electronic device 800) of the contents (e.g., “Dinner and the cab ride together was $28”) of the message is performed and a determination is made (e.g., by an external device, such as a server, or by electronic device 800) that the message corresponds to a request for a payment in the amount of $28 by message participant 810 from the user.
[0281] As shown in FIG. 8B, electronic device 800 provides a marking 822 (e.g., underlining, bolding, highlighting) of the payment amount shown in message object 820. Additionally, in response to the determination that the message relates to a payment, electronic device 800 displays (e.g., over a portion of virtual keyboard 812, between virtual keyboard 812 and compose bar 814) a suggestions bar 824 that includes a pay amount button 826 that includes a selectable indication (e.g., “$28”) of the payment amount. In FIG. 8B, in addition to pay amount button 826, suggestions bar 824 includes a pay button 828 (e.g., showing “PAY”) that does not include an indication of the payment amount but includes an indication that the button is for proceeding with a payment (or, alternatively, proceeding with a payment request). In some embodiments, one or more of the in-message selectable indications is omitted. In some embodiments, pay button 828 indicates (e.g., by showing “PAY”) to the user that a payment can be made, with respect to the potential payment detected from the message (or one or more
previous or subsequent messages), using an operating-system controlled payment transfer application (and not by a third-party application).
[0282] FIG. 8C illustrates a different message conversation 830 of messaging application 806. In FIG. 8C, message conversation 830 is between the user of electronic device 800 and an unknown participant 832. In some embodiments, unknown participant 832 is a participant that does not correspond to a contact stored on the device. In some embodiments, unknown participant 832 is a participant that is not included in a contact list associated with the user account logged onto the device. In some embodiments, unknown participant 832 is a participant not included in a trusted contacts list associated with the user account logged onto the device. In some embodiments, unknown participant 832 is a participant included in a non-trusted contacts list (e.g., a spam list) associated with the user account logged onto the device. In some embodiments, unknown participant 832 is a participant included in a non-trusted user list (e.g., a spam list) maintained by an external device, such as a server.
[0283] As shown in FIG. 8C, electronic device 800 displays in message conversation 830 a message object 834 corresponding to a message received from unknown participant 832. For example, the message corresponding to message object 834 received from unknown participant 832 has the same content as (e.g., states the same thing as) the message corresponding to message object 820 received from message participant 810 (e.g., "Dinner and the cab ride together was $28."). However, because the message corresponding to message object 834 is from an unknown participant, the device, even if a determination (e.g., by an external device, such as a server, or by electronic device 800) is made that the message corresponds to a request for a payment in the amount of $28, forgoes displaying a selectable indication (e.g., marking 822 of the payment $28, pay amount button 826, pay button 828) that corresponds to a payment amount (of $28) or to an intent to proceed with a payment transfer (of $28).
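The known-versus-unknown sender gate in this passage reduces to a membership check over the contact lists described above. This sketch is illustrative only; the type and function names are assumptions, and the real determination may also consult a server-maintained spam list as noted.

```python
from dataclasses import dataclass, field

@dataclass
class ContactLists:
    """Stand-in for the per-account lists described above (illustrative)."""
    contacts: set = field(default_factory=set)
    trusted: set = field(default_factory=set)
    spam: set = field(default_factory=set)

def should_show_pay_affordance(sender: str, lists: ContactLists,
                               payment_detected: bool) -> bool:
    """Show the selectable payment indication only for a detected payment
    message from a known, non-spam sender; otherwise the device forgoes it
    (and may show a spam notification instead)."""
    if not payment_detected or sender in lists.spam:
        return False
    return sender in lists.contacts or sender in lists.trusted
```

Under this sketch, a message from message participant 810 (a stored contact) would surface the pay affordance, while the same message from unknown participant 832 would not.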
[0284] In some embodiments, in accordance with the determination that the message corresponding to the message object (e.g., message object 834) is from an unknown participant (e.g., unknown participant 832), electronic device 800 forgoes displaying a selectable indication (e.g., marking 822 of the payment $28, pay amount button 826, pay button 828 (showing “PAY”)). In some embodiments, instead of displaying the selectable indication (e.g., marking
822 of the payment $28, pay amount button 826, pay button 828), the device displays (e.g., within message conversation 830) a spam notification 836 (e.g., a textual notification, a graphical notification, a prompt) that the message is from an unknown participant. For example, as shown in FIG. 8C, the device displays within message conversation 830 spam notification 836 (a notification message) stating "this sender is not in your contacts list." In some embodiments, the device further displays (e.g., below spam notification 836) a selectable indication 838 (e.g., a selectable text, a button) for reporting (e.g., transmitting information about) the unknown participant to an external device (e.g., a server). For example, as shown in FIG. 8C, the device displays below spam notification 836 selectable indication 838 (selectable text) stating "Report Spam."

[0285] In FIG. 8D, electronic device 800 again displays message conversation 808 of messaging application 806 with message participant 810 (e.g., as first shown in FIG. 8B). In some embodiments, while displaying message conversation 808, electronic device 800 detects (e.g., via the touchscreen) user activation of marking 822 of the payment amount (e.g., the underlined "$28") included in message object 820. For example, as shown in FIG. 8D, the user activation is a tap gesture 801 on marking 822 of the payment amount (e.g., the underlined "$28") included in message object 820. Alternatively, in some embodiments, the user activation can be user selection (e.g., a tap gesture) of payment amount button 826 (with the payment amount (e.g., "$28") shown) within suggestions bar 824. Alternatively, in some embodiments, the user activation can be user selection (e.g., a tap gesture) of payment button 828 (with an indication (e.g., "PAY") that the button is for proceeding with a payment) within suggestions bar 824.
[0286] In FIG. 8E, in response to detecting tap gesture 801, electronic device 800 displays, on display 802, a payment transfer user interface 840. In some embodiments, payment transfer user interface 840 replaces display of virtual keyboard 812. In some embodiments, payment transfer user interface 840 slides into the display from an edge of the display (e.g., slides up from the bottom edge of the display) to replace display of virtual keyboard 812.
[0287] In some embodiments, payment transfer user interface 840 includes an interface switching menu bar 842 that includes a plurality of shortcut icons for switching between
different user interfaces (e.g., switching between payment transfer user interface 840 and a user interface for playing music) associated with different application features (e.g., manage peer-to-peer transfers, play music, set alarm clock) accessible from within messaging application 806 while maintaining display of message conversation 808. In some embodiments, the plurality of shortcut icons of interface switching menu bar 842 correspond to different applications, thus enabling the user to quickly switch between user interfaces of different applications. In some embodiments, interface switching menu bar 842 includes a payment transfer shortcut icon 844 corresponding to payment transfer user interface 840. Thus, because payment transfer user interface 840 is the currently-displayed user interface, the device in FIG. 8E shows payment transfer shortcut icon 844 currently being selected within interface switching menu bar 842. In some embodiments, payment transfer user interface 840 also includes an indication 841 (e.g., stating "PAY") informing the user that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application).
[0288] As also shown in FIG. 8E, payment transfer user interface 840 includes a request button 845 for initiating a request for a payment from a different user (e.g., message participant 810) via messaging application 806 and a send button 847 for initiating a sending of a payment to a different user (e.g., message participant 810) via messaging application 806.
[0289] As also shown in FIG. 8E, payment transfer user interface 840 includes a value change region 846 that includes an indication 848 of the transfer amount (e.g., "$28"). As shown in FIG. 8E, when payment transfer user interface 840 is displayed in response to user activation (e.g., tap gesture 801) of marking 822 of the payment amount (or of payment amount button 826 with the payment amount shown), the device displays payment transfer user interface 840 with the payment amount (e.g., "$28") pre-populated in indication 848. In some embodiments, the pre-populated payment amount in indication 848 includes a currency symbol (e.g., "$" of USD). In some embodiments, the pre-populated payment amount in indication 848 does not include a currency symbol.
[0290] Payment transfer user interface 840 also includes, within value change region 846, a value increase button 850 (e.g., indicated as a "+") for increasing the displayed payment amount
(e.g., "$28") within indication 848 and a value decrease button 852 (e.g., indicated as a "-") for decreasing the displayed payment amount (e.g., "$28") within indication 848. In some embodiments, in response to detecting user activation (e.g., a user input) of value increase button 850, the displayed payment amount within indication 848 is increased.
[0291] For example, as shown in FIG. 8F, in response to detecting tap gesture 803 on value increase button 850, the displayed payment amount within indication 848 is increased from "$28" to "$29." In some embodiments, if the user activation is a tap gesture on value increase button 850 (e.g., tap gesture 803), one tap gesture causes a one unit increase (e.g., an increase by one dollar, an increase by one cent) in the payment amount displayed in indication 848. In some embodiments, if the user activation is a continued press (e.g., a press for at least a predetermined time period) on value increase button 850, the payment amount displayed in indication 848 continually increases by a unit increment at a constant rate. In some embodiments, if the user activation is a continued press (e.g., a press for at least a predetermined time period) on value increase button 850, the payment amount displayed in indication 848 continually increases by a unit increment at an accelerating rate corresponding to the length of the continued press on value increase button 850. In some embodiments, if the user activation is a continued input having a first contact intensity, the payment amount displayed in indication 848 continually increases by a unit increment at a first constant rate, and if the user activation is a continued input having a second contact intensity that is greater than the first contact intensity, the payment amount displayed in indication 848 continually increases by a unit increment at a second constant rate that is faster than the first constant rate. The same features described above can apply, in the opposite direction (e.g., decreasing the payment amount displayed in indication 848 instead of increasing), with respect to value decrease button 852.
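The tap-versus-hold behavior of the value buttons described above can be modeled as a step calculation. The unit size, base rate, and acceleration constant below are illustrative assumptions, not values from the text.

```python
def adjusted_amount(amount_cents: int, direction: int,
                    hold_seconds: float = 0.0) -> int:
    """Payment amount (in cents) after activating the + or - button.

    direction is +1 for value increase button 850 and -1 for value
    decrease button 852. A tap changes the amount by one unit (here, one
    dollar); a continued press repeats the unit step at a rate that grows
    with how long the button has been held.
    """
    unit = 100  # one unit = one dollar (illustrative choice)
    if hold_seconds <= 0:
        steps = 1  # single tap: one unit
    else:
        # Accelerating rate: the longer the hold, the faster the stepping.
        rate_per_second = 1.0 + 0.5 * hold_seconds
        steps = 1 + int(rate_per_second * hold_seconds)
    return max(0, amount_cents + direction * steps * unit)
```

A single tap on the "+" button takes the displayed $28 to $29, while a two-second hold advances it several units further; the amount is clamped so it never goes negative.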
[0292] FIG. 8G shows, as a result of tap gesture 803 on value increase button 850 of value change region 846, indication 848 displaying a payment amount (e.g., “$29”) that is greater than the previously displayed payment amount (e.g., “$28”). In FIGS. 8H-8I, while displaying indication 848 displaying the payment amount (e.g., “$29”) that is greater than the previously displayed payment amount (e.g., “$28”), electronic device 800 detects another user input on
value change region 846 that decreases the payment amount displayed in indication 848 (e.g., from "$29" to "$28").
[0293] For example, as shown in the transition from FIG. 8H to FIG. 8I, the user input is a sliding gesture 805 in the right-to-left direction within value change region 846. As shown in FIG. 8I, in response to detecting sliding gesture 805, the payment amount displayed in indication 848 is decreased (e.g., from "$29" to "$28"). In some embodiments, in accordance with a determination that the length of the sliding gesture is within a predetermined length limit, the payment amount is decreased by one unit (e.g., by one dollar, by one cent). In some embodiments, in accordance with a determination that the length of the sliding gesture is at least a predetermined length limit, the payment amount is decreased by multiple units (e.g., by five dollars, by fifty cents). In some embodiments, in accordance with a determination that the sliding gesture is increasing in speed as it is being detected within value change region 846, the rate of change (e.g., the rate of decrease) of the payment amount accelerates proportionally with the increasing speed of the sliding gesture. In some embodiments, the sliding gesture that decreases the payment amount displayed in indication 848 can be a top-to-bottom sliding gesture. The same features described above can apply, in the opposite direction (e.g., increasing the payment amount displayed in indication 848 instead of decreasing), with respect to a sliding gesture in a left-to-right direction (or a bottom-to-top direction).
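A slide-to-adjust mapping consistent with the description above might look like the following; the pixel threshold and the five-unit jump for long slides are illustrative assumptions.

```python
def amount_after_slide(amount_cents: int, gesture_length_px: float,
                       leftward: bool, length_limit_px: float = 120.0) -> int:
    """Payment amount (in cents) after a horizontal slide in value change
    region 846. A right-to-left slide decreases the amount and a
    left-to-right slide increases it; a slide shorter than the limit moves
    one unit, a longer slide moves multiple units."""
    units = 1 if gesture_length_px < length_limit_px else 5
    delta_cents = units * 100  # one unit = one dollar (illustrative)
    signed = -delta_cents if leftward else delta_cents
    return max(0, amount_cents + signed)
```

Under these assumptions, the short right-to-left slide of FIGS. 8H-8I takes $29 back to $28, while a longer slide would jump by five dollars at once.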
[0294] FIG. 8J shows, as a result of sliding gesture 805 within value change region 846 of payment transfer user interface 840, indication 848 displaying a payment amount (e.g., "$28") that is smaller than the previously displayed payment amount (e.g., "$29"). In some embodiments, payment transfer user interface 840 includes an expand region 854 for expanding the interface to a second (e.g., larger, full-screen) mode. In some embodiments, in response to a user input on expand region 854 (e.g., a tap gesture on the expand region, a sliding-up gesture on the expand region), electronic device 800 displays (e.g., replaces display of payment transfer user interface 840 and at least a portion of messaging application 806 with) expanded payment transfer user interface 856. For example, as shown in FIG. 8K, the user input expanding payment transfer user interface 840 to expanded payment transfer user interface 856 is a sliding-up gesture 807, on expand region 854, towards the top edge of display 802. In some
embodiments, expanded payment transfer user interface 856 covers at least a portion of (or all of) the displayed messaging application 806. In some embodiments, payment transfer user interface 840 includes an expand button (e.g., instead of or in addition to expand region 854) which, when selected, causes display of expanded payment transfer user interface 856.
[0295] FIG. 8L shows an embodiment of expanded payment transfer user interface 856 after it has been expanded from payment transfer user interface 840 by sliding-up gesture 807. As shown in FIG. 8L, in some embodiments, expanded payment transfer user interface 856 maintains display of value change region 846 from payment transfer user interface 840. In some embodiments, expanded payment transfer user interface 856 also maintains display of request button 845 and send button 847 from payment transfer user interface 840. In some embodiments, expanded payment transfer user interface 856 also maintains display of interface switching menu bar 842. In some embodiments, expanded payment transfer user interface 856 includes an indication 858 of a balance associated with a payment account (e.g., a default payment account, a stored-value account, a debit account, a checking account) provisioned on the device.
[0296] As also shown in FIG. 8L, expanded payment transfer user interface 856 also includes a plurality of selection buttons 860A-860L. Selection buttons 860A-860L correspond to buttons of a numerical keypad (e.g., including digit buttons 0-9, a symbol button, and a back/clear button). Selection buttons 860A-860L allow the user to change the payment amount displayed in indication 848 as if the user were typing on a numerical keypad. In some embodiments, expanded payment transfer user interface 856 includes a return button (e.g., instead of or in addition to expand region 854) that, when selected, causes a return to payment transfer user interface 840.
[0297] FIGS. 8M-8O show a transition from expanded payment transfer user interface 856 to a suggestions mode expanded payment transfer user interface 862. In some embodiments, the transition involves a user input on expanded payment transfer user interface 856 to switch to suggestions mode expanded payment transfer user interface 862. For example, as shown in FIGS. 8M-8N, the user input is a sliding gesture 809 from a right-to-left direction on expanded payment transfer user interface 856. In response to detecting sliding gesture 809, electronic
device 800 gradually replaces display of expanded payment transfer user interface 856 with suggestions mode expanded payment transfer user interface 862 (e.g., expanded payment transfer user interface 856 slides off of the display at one edge of the display and suggestions mode expanded payment transfer user interface 862 slides onto the display at an opposite edge of the display). In some embodiments, the sliding gesture is in a left-to-right direction on expanded payment transfer user interface 856. In some embodiments, expanded payment transfer user interface 856 includes a switch button that, when selected, causes display of suggestions mode expanded payment transfer user interface 862. In some embodiments, suggestions mode expanded payment transfer user interface 862 also includes a corresponding switch button that, when selected, causes display of expanded payment transfer user interface 856.
[0298] As shown in FIG. 8O, in some embodiments, suggestions mode expanded payment transfer user interface 862 maintains display of value change region 846. As depicted, suggestions mode expanded payment transfer user interface 862 also maintains display of interface switching menu bar 842. As shown, suggestions mode expanded payment transfer user interface 862 also maintains display of request button 845 and send button 847. In FIG. 8O, suggestions mode expanded payment transfer user interface 862 includes indication 858 of the balance associated with a payment account (e.g., a default payment account, a stored-value account, a debit account, a checking account) provisioned on the device. In some embodiments, payment transfer user interface 840 includes an expand button (e.g., instead of or in addition to expand region 854) that, when selected, causes display of suggestions mode expanded payment transfer user interface 862 (instead of expanded payment transfer user interface 856). In some embodiments, suggestions mode expanded payment transfer user interface 862 includes a return button (e.g., instead of or in addition to expand region 854) that, when selected, causes a return to payment transfer user interface 840.
[0299] Suggestions mode expanded payment transfer user interface 862 includes a plurality of selection buttons 864A-864L. In some embodiments, selection buttons 864A-864L include suggested (or recommended) payment amounts (or suggested/recommended additional payment amounts) that can relate to, for example, a suggested tip amount button (e.g., 10%, 15%, 18%, 20%, $2, $5, etc.), a suggested tax amount button (which can vary depending on a detected
location (e.g., a US state, such as California) of the device), a dividing factor button (e.g., divide by two, divide by four) for splitting the payment amount across two or more individuals, an "undo" button, and a "reset" button. In some examples, the buttons include representations of amounts corresponding to the various options (e.g., if the payment is $100, the buttons include $110 for a 10% tip, $120 for a 20% tip, $102 for a $2 tip, $50 for a two-way split, or $25 for a four-way split). For example, in FIG. 8O, user selection of selection button 864C causes the payment amount displayed in indication 848 to be adjusted by +20% (e.g., to add a 20% tip). For another example, in FIG. 8O, user selection of selection button 864D causes the payment amount displayed in indication 848 to be adjusted by, for example, 8% (e.g., to account for a sales tax). As mentioned above, the suggested tax amount can be automatically adjusted by the device based on location information. For another example, in FIG. 8O, user selection of selection button 864G causes the payment amount displayed in indication 848 to be adjusted by a divisional factor of 2 (e.g., from "$28" to "$14").
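The arithmetic behind these suggestion buttons reduces to a small adjustment formula. The helper below reproduces the $100 examples from the text; the function name and signature are illustrative assumptions.

```python
def apply_suggestion(amount: float, tip_percent: float = 0.0,
                     tip_flat: float = 0.0, tax_percent: float = 0.0,
                     split_ways: int = 1) -> float:
    """Amount shown in indication 848 after one suggestion button:
    a percentage tip, a flat tip, a location-based tax rate, or an even
    split across split_ways individuals."""
    adjusted = amount * (1.0 + (tip_percent + tax_percent) / 100.0) + tip_flat
    return round(adjusted / split_ways, 2)
```

For the $100 examples above, `apply_suggestion(100, tip_percent=10)` gives 110.0 and `apply_suggestion(100, split_ways=4)` gives 25.0; for selection button 864G on the $28 payment, `apply_suggestion(28, split_ways=2)` gives 14.0.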
[0300] FIG. 8P shows electronic device 800 again displaying message conversation 808 of messaging application 806 and payment transfer user interface 840 (e.g., as first shown in FIGS. 8E and 8J). While displaying payment transfer user interface 840, the device detects user activation of send button 847 (e.g., to send a payment). For example, as shown in FIG. 8P, the user activation is a tap gesture 811 on send button 847.
[0301] In FIG. 8Q, in response to detecting tap gesture 811 on send button 847, the device displays (e.g., replaces display of payment transfer user interface 840 with) virtual keyboard 812. Further, in response to detecting tap gesture 811 on send button 847, the device displays, in message conversation 808, a payment message object 866 that includes an amount indication 868 of the payment amount. In some embodiments, payment message object 866 also includes a mode indication 870 (e.g., stating “PAY”) that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). In some embodiments, payment transfer user interface 840 is replaced by virtual keyboard 812.
[0302] Alternatively, while not shown in FIGS. 8P-8Q, while displaying message conversation 808 of messaging application 806 and payment transfer user interface 840 (e.g., as
first shown in FIGS. 8E and 8J), the device can also detect user activation (e.g., a tap gesture) of request button 845. Then, payment message object 866 includes, in indication 868, the requested payment amount and an additional textual indication (e.g., "$28 Requested") informing the user that the payment message object corresponds to a request for payment (instead of a payment).
[0303] In some embodiments, as shown in FIG. 8Q, payment message object 866 is displayed inside an expanded compose bar 872 (e.g., an expanded region of the compose region that is adjacent to / above compose bar 814). The payment message object being located within expanded compose bar 872 indicates to the user that the payment corresponding to the payment message object has not yet been sent (to message participant 810) but is being created.
[0304] In some embodiments, indication 868 of the payment amount (or, alternatively, of the payment request amount) within payment message object 866 is prominently displayed (e.g., in a thick font, in a large font) at the center of the message object. In some examples, indication 870 indicating that the message object corresponds to an outgoing payment (or, alternatively, that the message object corresponds to a request for a payment) is less-prominently displayed (e.g., in a thinner font, a smaller font) at a corner of the message object.
[0305] In some embodiments, payment message object 866 is displayed with a visual characteristic (e.g., a different background color, a different background shade, a different background pattern) that distinguishes it from a non-payment message object (e.g., message object 818 and message object 820). For example, payment message object 866 is displayed with a dark color/shade (e.g., black) background color, whereas a non-payment message object (e.g., message object 818 and message object 820) is displayed with a lighter background color/shade (e.g., gray, white, blue).
[0306] In some embodiments, as also shown in FIG. 8Q, when a payment message object (e.g., payment message object 866) is being displayed in an expanded compose bar (and has not yet been sent), electronic device 800 displays, within compose bar 814, an indication 873 (e.g., “Add Comment or Send”) that a comment (e.g., a note, a message) can be added to (or sent together with) payment message object 866. The device also displays, within compose bar 814, a final send button 874 for sending the payment corresponding to the payment message object
(or, alternatively, for sending the payment request corresponding to the payment message object) to the intended participant within the message conversation (e.g., message participant 810 of message conversation 808).
[0307] In some embodiments, in response to detecting (e.g., via a tap gesture) user input on compose bar 814 (e.g., a region of compose bar 814 that does not include final send button 874, a region of compose bar 814 with indication 873 stating “Add Comment or Send”), electronic device 800 displays (e.g., replaces display of indication 873 with) a cursor indicating that a comment is ready to be inputted (e.g., typed) into compose bar 814 (e.g., using virtual keyboard 812). For example, FIG. 8R shows a comment 876 (e.g., “Dinner + Cab”) added by the user to send together with payment message object 866 to message participant 810.
[0308] In FIG. 8S, while displaying payment message object 866 within expanded compose bar 872 and comment 876 added to the payment, electronic device 800 detects user activation of final send button 874. For example, the user activation is a tap gesture 813 on final send button 874. In FIGS. 8T-8U, in response to detecting tap gesture 813, electronic device 800 displays a payment confirmation user interface 878. In some embodiments, as shown in the transition from FIG. 8T to FIG. 8U, payment confirmation user interface 878 appears from the bottom edge of display 802 and slides up onto the display to eventually replace display of virtual keyboard 812 (or display of payment transfer user interface 840, whichever is currently displayed). In some embodiments, while payment confirmation user interface 878 is sliding up onto the display, the remaining portion of the display that is not covered by payment confirmation user interface is shaded (e.g., displayed with a darker shade, grayed-out), thus drawing the user’s attention to payment confirmation user interface 878 (instead of other portions of the display, such as message conversation 808).
[0309] In some embodiments, if payment message object 866 instead relates to a payment request by the user to message participant 810 (as opposed to an outgoing payment from the user to message participant 810), user activation of final send button 874 does not cause display of payment confirmation user interface 878. Instead, if payment message object 866 relates to a payment request, in response to the user activation of final send button 874, electronic device
800 displays, within message conversation 808, payment message object 866 (thereby indicating that the payment request associated with the payment message object has been sent).
[0310] In some embodiments, (while a payment message object is displayed in expanded compose bar 872) electronic device 800 displays a pay button within compose bar 814 for sending the payment (or payment request) corresponding to the payment message object. Thus, in some embodiments, the user can, subsequent to entering a note (e.g., a comment, a message) to accompany the payment (or payment request), select the pay button to send the payment and the entered note. In some embodiments, the pay button is shown within compose bar 814 while virtual keyboard 812 is displayed. In some embodiments, the pay button is shown within compose bar 814 while payment transfer user interface 840 is displayed.
[0311] As shown in FIG. 8U, payment confirmation user interface 878 includes a mode indication 880 (e.g., stating “PAY”) that the payment message object being created by payment confirmation user interface 878 corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). In some embodiments, payment confirmation user interface 878 also includes a cancel button 827 for cancelling the payment (e.g., to message participant 810). In some embodiments, as also shown in FIG. 8U, payment confirmation user interface 878 includes an indication 884 (e.g., a graphical indication, a textual indication) of a payment account and a balance of the payment account to be used for the payment (or, alternatively, to receive a payment for a payment request). For example, indication 884 can include a mini-thumbnail image of a physical card associated with the payment account. For another example, if the payment account is a stored-value account, indication 884 can include the balance (e.g., “$50”) stored on the account. In some embodiments, payment confirmation user interface 878 includes an additional accounts button 886 for viewing other payment accounts provisioned on the device that can be used to make the payment corresponding to payment message object 866. In some embodiments, as also shown in FIG. 8U, payment confirmation user interface 878 includes an indication 882 of the intended recipient of the payment (e.g., “Pay John”) and an indication 888 of the payment amount (e.g., to serve as another reminder to the user of the amount to be paid).
[0312] In some embodiments, as also shown in FIG. 8U, payment confirmation user interface 878 includes an authentication request 890 (e.g., a graphical request, a textual request) requesting that the user provide authentication information to proceed with making the payment to message participant 810. In some embodiments, the requested authentication is biometric authentication, such as facial recognition authentication, fingerprint authentication, voice recognition authentication, iris scan authentication, or retina scan authentication. For example, in FIG. 8U, the requested authentication information (e.g., as shown in authentication request 890) is fingerprint information (e.g., “Pay with Fingerprint”).
[0313] In FIG. 8V, while displaying payment confirmation user interface 878, electronic device 800 receives, from the user, the requested fingerprint information 815 (e.g., via mechanical button 804). While (or subsequent to) receiving, from the user, fingerprint information 815, a determination is made (e.g., by the device or by an external device, such as a server) whether fingerprint information 815 is consistent with enrolled authentication information (e.g., enrolled fingerprint information) of the user. As shown in FIG. 8W, in accordance with a determination that fingerprint information 815 is consistent with enrolled fingerprint information of the user, the device updates authentication request 890 (previously showing a request for a certain type of authentication information) to indicate that the authentication was successful (e.g., by displaying a checkmark, by displaying “Authorization Successful” or “Payment Complete”).
[0314] In some embodiments, in accordance with a determination that fingerprint information 815 is not consistent with enrolled fingerprint information of the user (i.e., authentication was not successful), the device displays an indication that the authentication was unsuccessful and a request to re-provide the requested authentication information. In some embodiments, in accordance with a determination that fingerprint information 815 is (e.g., for a second time) not consistent with enrolled fingerprint information of the user, the device displays a verification user interface (e.g., as described below with reference to FIGS. 31A-31M) for providing a different type of authentication information or for verifying that the user is the user that is associated with the user account logged onto the device.
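The authentication flow described above (one retry on a mismatch, then a fallback verification user interface) can be sketched as follows. This is a minimal illustration only; the function and state names are assumptions, not the actual implementation:

```python
# Hypothetical sketch of the fingerprint-authentication flow: a first
# mismatch re-displays the authentication request, and a second mismatch
# falls back to a separate verification user interface (cf. FIGS. 31A-31M).
def authenticate(provided: str, enrolled: str, attempt: int) -> str:
    """Return the next UI state after a fingerprint attempt."""
    if provided == enrolled:
        return "authorization_successful"   # e.g., show checkmark
    if attempt == 1:
        return "retry_requested"            # re-display authentication request
    return "verification_ui"                # alternative verification flow

print(authenticate("fp-kate", "fp-kate", 1))  # authorization_successful
print(authenticate("fp-x", "fp-kate", 1))     # retry_requested
print(authenticate("fp-x", "fp-kate", 2))     # verification_ui
```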
[0315] As shown in FIG. 8X, in response to the successful user authentication from FIG. 8W, electronic device 800 removes display of payment confirmation user interface 878 (and again displays virtual keyboard 812 in place of the removed payment confirmation user interface 878). Further, as also shown in FIG. 8X, the device displays payment message object 866 within message conversation 808 of messaging application 806, thereby indicating that the payment has been sent to message participant 810. In addition, the device also displays, adjacent to (or beneath or within) payment message object 866, a sent note message object 892 corresponding to added comment 876 previously entered by the user.
[0316] In some embodiments, payment message object 866, once sent, includes a first status indicator 894 informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, in FIG. 8X, first status indicator 894 shows “pending,” thus indicating to the user that the payment associated with sent payment message object 866 has not yet been accepted by message participant 810. In some embodiments, once a payment message object is sent, the device displays (in addition to or instead of first status indicator 894), a second status indicator 896 informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, as shown in FIG. 8X, second status indicator 896 (e.g., “pending”) shows the same status as shown by first status indicator 894 (e.g., “pending”).
[0317] FIG. 8Y shows the payment (or, alternatively, the payment request) corresponding to payment message object 866 having been accepted by message participant 810. In response to the determination that the payment (or, alternatively, the payment request) corresponding to payment message object 866 has been accepted by message participant 810, electronic device 800 updates first status indicator 894 (e.g., from “pending” to “paid”) to inform the user that the payment has been accepted by message participant 810 (or, alternatively, to inform the user that the payment request has been accepted, and thus a payment by message participant 810 in the requested payment amount has been made by message participant 810 to the user). In some embodiments, the device updates second status indicator 896 (e.g., from “pending” to “paid”) to inform the user that the payment has been accepted by message participant 810 (or, alternatively, to inform the user that the payment request has been accepted, and thus a payment by message
participant 810 in the requested payment amount has been made by message participant 810 to the user).
[0318] As also shown in FIG. 8Y, in response to the payment (or, alternatively, the payment request) corresponding to payment message object 866 having been accepted by message participant 810, the device changes (e.g., applies a special graphical effect to, applies a special animation to, applies a special pattern to) display of indication 868 of the payment amount within payment message object 866. In some embodiments, indication 868 of the payment amount is changed to a more prominent font (e.g., a larger font, a thicker font). In some embodiments, indication 868 of the payment amount is changed to show a special holographic effect (e.g., as described in more detail with reference to FIGS. 11A-11V). In addition to first status indicator 894 and second status indicator 896, the change to indication 868 of the payment amount within accepted payment message object 866 confirms to the user that the payment has been accepted by message participant 810 (or, alternatively, that the payment request has been accepted/acknowledged by message participant 810).
[0319] FIG. 8Z shows, in contrast to FIG. 8Y, the payment (or, alternatively, the payment request) corresponding to payment message object 866 having not been accepted by message participant 810 within a predetermined time period (e.g., 24 hours, 3 days, 1 week, etc.). In response to the determination that the payment (or, alternatively, the payment request) corresponding to payment message object 866 has not been accepted by message participant 810 within the predetermined time period, electronic device 800 updates first status indicator 894 (e.g., from “pending” to “expired”) to inform the user that the payment has not been accepted by message participant 810 within the predetermined time period (or, alternatively, to inform the user that the payment request has not been accepted within the predetermined time period, and thus a payment by message participant 810 in the requested payment amount has not been made by message participant 810 to the user). In some embodiments, the device updates second status indicator 896 (e.g., from “pending” to “expired”) to inform the user that the payment has not been accepted by message participant 810 within the predetermined time period (or, alternatively, to inform the user that the payment request has not been accepted within the
predetermined time period, and thus a payment by message participant 810 in the requested payment amount has not been made by message participant 810 to the user).
[0320] In some embodiments, as also shown in FIG. 8Z, in response to the determination that the payment (or, alternatively, the payment request) corresponding to payment message object 866 has not been accepted by message participant 810 within the predetermined time period, electronic device 800 changes display (e.g., blurs out, lightens the displayed text) of payment message object 866 (and sent note message object 892 associated with the payment message object) corresponding to the expired payment to indicate that the payment (or, alternatively, the payment request) has expired.
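The status transitions described above ("pending" changing to "paid" upon acceptance, or to "expired" once the predetermined time period elapses) can be sketched as a small state function. This is an illustrative sketch only; the three-day window and all names are assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the status shown by indicators 894/896 for a sent
# payment message object. The expiry period is one of the example windows
# mentioned in the text (24 hours, 3 days, 1 week); 3 days is assumed here.
EXPIRY_PERIOD = timedelta(days=3)

def payment_status(sent_at: datetime, accepted: bool, now: datetime) -> str:
    """Return "paid", "expired", or "pending" for a sent payment."""
    if accepted:
        return "paid"
    if now - sent_at > EXPIRY_PERIOD:
        return "expired"
    return "pending"

sent = datetime(2017, 5, 16, 12, 0)
print(payment_status(sent, False, sent + timedelta(hours=2)))  # pending
print(payment_status(sent, True, sent + timedelta(days=1)))    # paid
print(payment_status(sent, False, sent + timedelta(days=5)))   # expired
```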
[0321] FIGS. 8AA-8AH illustrate exemplary user interfaces for managing peer-to-peer transfers similar to the exemplary user interfaces for managing peer-to-peer transfers described above with reference to FIGS. 8A-8Z. In particular, while the non-limiting exemplary user interfaces of FIGS. 8A-8Z were illustrated and described with respect to electronic device 800, a smartphone, the non-limiting exemplary user interfaces of FIGS. 8AA-8AH are illustrated and described with respect to an electronic device 850, a wearable device (e.g., a smartwatch). Similar to electronic device 800, electronic device 850 has a display 851, one or more input devices (e.g., touchscreen of display 851, a rotatable input button 853, a mechanical button 855, and a mic), and a wireless communication radio.
[0322] In FIG. 8AA, electronic device 850 displays, on display 851, a message conversation 859 of a messaging application 857 between the user (e.g., “Kate Appleseed”) and message participant 810 (e.g., “John Appleseed”) (e.g., similar to message conversation 808 of messaging application 806 described above with reference to FIGS. 8A-8Z). In some embodiments, messaging application 857 includes display of one or more message input buttons 861A-861C for inputting a message (e.g., using different input methods, using different input objects) to be sent via the messaging application. For example, in FIG. 8AA, the message input buttons include a mic input button 861A for inputting a message via voice input (e.g., a spoken user input), an emoticon input button 861B for selecting an emoticon to be transmitted as (or with) a message, and a dynamic input button 861C for creating a dynamic (e.g., moving, non-static)
message. In some embodiments, messaging application 857 also includes display of a scribble input button 863 for allowing a user to enter text of a message using hand-scribbled input.
[0323] As shown in FIG. 8AA, message conversation 859 includes a message object 865 sent by message participant 810 to the user. In the message corresponding to message object 865, message participant 810 informs the user: “Dinner was $28.” In response to receiving the new message corresponding to message object 865, an analysis of the contents (e.g., the text) of message object 865 is performed (e.g., similar to the analysis performed with respect to message object 820 in FIG. 8B above). Based on the analysis of the contents (e.g., the text) of message object 865 (and, optionally, one or more other previous or subsequent message objects of message conversation 859), in accordance with a determination (e.g., made at electronic device 850 or received from an external device, such as a server) that the contents (e.g., the text) of the message corresponding to message object 865 relate to a transfer of a payment (e.g., a request for a payment, agreement to send a payment) that messaging application 857 is configured to transfer, electronic device 850 displays one or more selectable indications that correspond to a payment amount or to an intent to proceed with a payment transfer, as discussed below (e.g., similar to the selectable indications that are displayed with respect to message object 820, as described above in FIG. 8B).
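The text analysis described above can be sketched as a simple heuristic that detects a monetary amount in the presence of transfer-related wording. The following is a minimal illustration only; the regex, cue words, and function name are assumptions, not the actual analysis performed by the device or server:

```python
import re

# Hypothetical sketch: decide whether a message (e.g., "Dinner was $28.")
# relates to a payment transfer and, if so, extract the amount that would
# back a selectable indication such as marking 822/867.
AMOUNT_PATTERN = re.compile(r"\$(\d+(?:\.\d{2})?)")
TRANSFER_CUES = ("was", "owe", "send", "pay", "request")

def detect_payment_amount(text: str):
    """Return the detected amount as a float, or None if the message
    does not appear to relate to a payment transfer."""
    match = AMOUNT_PATTERN.search(text)
    if match is None:
        return None
    # Require a transfer-related cue word so arbitrary mentions of
    # dollar figures are not flagged.
    lowered = text.lower()
    if not any(cue in lowered for cue in TRANSFER_CUES):
        return None
    return float(match.group(1))

print(detect_payment_amount("Dinner was $28."))     # 28.0
print(detect_payment_amount("Nice weather today"))  # None
```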
[0324] In some embodiments, as shown in FIG. 8AA, the selectable indication is a marking 867 (e.g., similar to marking 822 of message object 820) of the payment amount shown in message object 865. In some embodiments, as shown in FIG. 8AA, in addition to (or alternatively to) marking 867 on message object 865, electronic device 850 displays a pay button 869 (e.g., similar to pay button 828 associated with message object 820). In some embodiments, pay button 869 is displayed below scribble button 863 within messaging application 857.
[0325] In FIG. 8AB, while displaying message conversation 859 with marking 867 and pay button 869 displayed, electronic device 850 detects user activation of pay button 869 (or, alternatively, of marking 867 of message object 865) to proceed with transferring the requested payment amount (e.g., “$28”) to message participant 810. For example, as shown in FIG. 8AB, the user activation is a tap gesture 871 on the pay button (or, alternatively, on the marking of the message object). In FIG. 8AC, in response to detecting tap gesture 871 on pay button 869 (or,
alternatively, on marking 867 of message object 865), electronic device 850 displays, on display 851, a payment transfer user interface 875 (e.g., similar to payment transfer user interface 840 illustrated, for example, in FIG. 8E).
[0326] As with payment transfer user interface 840, payment transfer user interface 875 includes a value change region 879 (e.g., corresponding to value change region 846 of payment transfer user interface 840). As with value change region 846 of payment transfer user interface 840, value change region 879 of payment transfer user interface 875 includes an indication 881 of the transfer amount (e.g., “$28”). As shown in FIG. 8AC, in some embodiments, payment transfer user interface 875 is displayed with the payment amount (e.g., “$28”) pre-populated in indication 881 (e.g., as described above in FIG. 8E with respect to indication 848 of payment transfer user interface 840).
[0327] In some embodiments, payment transfer user interface 875 also includes an indication 877 (e.g., stating “PAY,” similar to indication 841 of payment transfer user interface 840) informing the user that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). In some embodiments, payment transfer user interface 875 includes a request button 877 (e.g., corresponding to request button 845) and a send button 889 (e.g., corresponding to send button 847).
[0328] As also shown in FIG. 8AC, payment transfer user interface 875 also includes, within value change region 879 (e.g., similar to value change region 846 of payment transfer user interface 840), a value increase button 885 (e.g., indicated as a “+,” corresponding to increase button 850 of value change region 846) for increasing and a value decrease button 883 (e.g., indicated as a “-,” corresponding to decrease button 852 of value change region 846) for decreasing the displayed payment amount within indication 881. In some embodiments, in response to detecting user activation of value increase button 885, the displayed payment amount within indication 881 is increased, and in response to detecting user activation of value decrease button 883, the displayed payment amount within indication 881 is decreased.
[0329] In some embodiments, in addition to (or alternatively to) changing the payment amount displayed in indication 881 using value increase button 885 and value decrease button 883, the payment amount can be increased or decreased based on rotation of rotatable input button 853. In some embodiments, the value of the displayed payment amount in indication 881 of value change region 879 is increased in response to a clockwise rotation of rotatable input button 853 and the value of the displayed payment amount in indication 881 in value change region 879 is decreased in response to a counter-clockwise rotation of rotatable input button 853. For example, in FIG. 8AD, electronic device 850 receives a user rotation input 891 on rotatable input button 853, where user rotation input 891 is a rotation of the input button in the clockwise direction. As shown in FIG. 8AD, in response to receiving user rotation input 891, the displayed payment amount in indication 881 is increased (e.g., from “$28” to “$29”). In some embodiments, the same result can be achieved by user activation of value increase button 885.
[0330] In FIG. 8AE, electronic device 850 detects a user input on value decrease button 883 of value change region 879. For example, as shown in FIG. 8AE, the user input is a tap gesture 893 on value decrease button 883. As shown in FIG. 8AE, in response to detecting tap gesture 893, the displayed payment amount in indication 881 is decreased (e.g., from “$29” to “$28”). In some embodiments, the same result can be achieved via user rotation of the rotatable input button in a counter-clockwise direction.
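The input handling described in the two paragraphs above (clockwise rotation or the “+” button increases the amount; counter-clockwise rotation or the “-” button decreases it) can be sketched as follows. The one-dollar step size, event names, and clamping at zero are assumptions for illustration:

```python
# Hypothetical sketch of how input events on the wearable device could map
# to the payment amount displayed in indication 881 of value change region 879.
def adjust_amount(amount: int, event: str) -> int:
    """Return the new displayed amount after an input event."""
    if event in ("rotate_clockwise", "tap_increase"):
        return amount + 1
    if event in ("rotate_counterclockwise", "tap_decrease"):
        return max(0, amount - 1)   # never display a negative amount
    return amount                   # unrelated events leave the amount unchanged

amount = 28                                        # pre-populated from the request
amount = adjust_amount(amount, "rotate_clockwise")  # $28 -> $29 (FIG. 8AD)
amount = adjust_amount(amount, "tap_decrease")      # $29 -> $28 (FIG. 8AE)
print(amount)  # 28
```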
[0331] In FIG. 8AF, while displaying payment transfer user interface 875 with the payment amount (e.g., “$28”) corresponding to the amount requested by message participant 810 (in message object 865) displayed in indication 881 of value change region 879, electronic device 850 detects user activation of send button 889. For example, as shown in FIG. 8AF, the user activation is a tap gesture 895 on the send button.
[0332] As shown in FIG. 8AG, in response to detecting tap gesture 895 on send button 889, electronic device 850 displays (e.g., replaces display of payment transfer user interface 875 with) a payment confirmation user interface 831 (e.g., similar to payment confirmation user interface 878 described above with reference to, for example, FIG. 8U). As with payment confirmation user interface 878, payment confirmation user interface 831 includes a mode indication 877 (e.g., stating “PAY,” corresponding to mode indication 880 of payment confirmation user interface
878), a cancel button 897 (e.g., corresponding to cancel button 827 of payment confirmation user interface 878), an indication 839 (e.g., corresponding to indication 884 of payment confirmation user interface 878) of a payment account, an indication 835 (e.g., corresponding to indication 882 of payment confirmation user interface 878) of the intended recipient of the payment (e.g., “To John Appleseed”), and an indication 833 (e.g., corresponding to indication 888 of payment confirmation user interface 878) of the payment amount (e.g., to serve as another reminder to the user of the amount to be paid). In some embodiments, payment confirmation user interface 831 includes a confirmation request 837 (e.g., similar to authentication request 890 of payment confirmation user interface 878; a graphical request, a textual request) requesting that the user provide confirmation to proceed with making the payment (e.g., of $28) to message participant 810. For example, in FIG. 8AG, confirmation request 837 states “Double Click to Pay.”

[0333] As shown in FIG. 8AG, while displaying confirmation user interface 831, electronic device 850 receives a user input corresponding to the confirmation requested via confirmation request 837 to proceed with completing the payment transfer. For example, as shown in FIG. 8AG, the user input is a double-click (or a double-push) input 899 on mechanical button 855.
[0334] In FIG. 8AH, in response to receiving double-click input 899 on mechanical button 855, electronic device 850 again displays (e.g., replaces display of payment confirmation user interface 831 with), on display 851, message conversation 859 of messaging application 857 with message participant 810. As shown in FIG. 8AH, (below message object 865) message conversation 859 now includes a payment message object 853 (e.g., similar to payment message object 866 described above with reference to, for example, FIG. 8Q) corresponding to the payment (of $28) transmitted to message participant 810 (in response to the message participant’s request contained in the message corresponding to message object 865).
[0335] In some embodiments, as with payment message object 866, payment message object 853 includes an amount indication 859 (e.g., corresponding to amount indication 868 of payment message object 866) of the payment amount. In some embodiments, also as with payment message object 866, payment message object 853 includes a mode indication 859 (e.g., stating “PAY,” corresponding to mode indication 870 of payment message object 866). In some embodiments, also as with payment message object 866, payment message object 853 includes a
status indicator 859 (e.g., stating “PENDING,” corresponding to status indicator 894 of payment message object 866) indicating a status of the payment associated with the payment message object sent to message participant 810.
[0336] FIGS. 9A-9I are a flow diagram illustrating a method for managing peer-to-peer transfers using an electronic device in accordance with some embodiments. Method 900 is performed at a device (e.g., 100, 300, 500, 700, 800, 850) with a display, one or more input devices (e.g., a touchscreen, a mic, a camera, a biometric sensor), and a wireless communication radio (e.g., a Bluetooth connection, WiFi connection, a mobile broadband connection such as a 4G LTE connection). Some operations in method 900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0337] As described below, method 900 provides an intuitive way for managing peer-to-peer transfers. The method reduces the cognitive burden on a user for managing peer-to-peer transfers, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transfers faster and more efficiently conserves power and increases the time between battery charges.
[0338] The electronic device (e.g., 700, 800, 850) receives (902), via the wireless communication radio, one or more messages (e.g., a text message, an email message, an instant message) (e.g., from a remote user).
[0339] The electronic device (e.g., 700, 800, 850) displays (904) (e.g., in response to / subsequent to receiving the one or more messages), on the display (e.g., 702, 802, 851), a user interface for a messaging application (e.g., 706, 806, 857) that includes at least one of the one or more messages (e.g., 718, 720, 816, 818, 820, 865) in a message conversation (e.g., 708, 808, 859; an instant message conversation, a text message thread, an email thread) between a plurality of conversation participants (e.g., 710, 810, a user of the device and one or more other participants). Displaying the user interface for the messaging application (e.g., 706, 806, 857) and, in particular, the messages (e.g., 718, 720, 816, 818, 820, 865) in the conversation provides the user with contextual feedback regarding the sender/receiver of messages in the conversation and reduces the need for the user to investigate the sender/receiver for further messages
displayed in the conversation. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the message conversation (e.g., 708, 808, 859) (906) involves two or more participants, other than a user of the device.
[0340] While concurrently displaying, on the display (e.g., 702, 802, 851), at least one of the one or more messages (e.g., 718, 720, 816, 818, 820, 865) in the message conversation (e.g., 708, 808, 859), the electronic device (e.g., 700, 800, 850) receives (908), from one of the participants (e.g., 710, 810), a respective message (e.g., 720, 820, 865) (from the user or one of the other participants).
[0341] In response (910) to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message (and, optionally one or more prior messages in the message conversation), that the respective message relates to a transfer of a first type of item (e.g., a sticker, a photo, or a payment object) that the messaging application is configured to transfer, the electronic device (e.g., 700, 800, 850) concurrently displays (912), on the display (e.g., 702, 802, 851), a representation of the message and a selectable indication (e.g., 722, 822, 867, underlining a portion of the text that relates to the first type of item and updating the portion of the text to be a selectable affordance, or displaying in a virtual keyboard a representation of the first type of item) that corresponds to the first type of item. Concurrently displaying the representation of the message (e.g., 720, 820, 865) and the selectable indication (e.g., 722, 822, 867) in response to receiving a message that is determined to relate to a transfer of a type of item (e.g., a photo, a payment) provides the user with feedback to indicate that the selectable indication corresponds to the received message and that activating the selectable indication will cause an operation to be performed that relates to the message. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by
providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0342] In some examples, the text in the respective message (e.g., 720, 820, 865) includes (914) a first quantity (e.g., a number of stickers, a number of photos, a payment amount, a resource amount) of content of the first type of item. In some examples, the representation of the respective message includes the first quantity.
[0343] In some examples, in accordance with the determination, based on the analysis of the text in the respective message, that the respective message relates to the transfer of the first type of item that the messaging application is configured to transfer, the electronic device (e.g., 700, 800, 850) displays (916) (e.g., at a suggestions region of a virtual keyboard (e.g., 712, 812) that includes one or more suggested quantities (e.g., 724, 824) of content of the first type of item), on the display (e.g., 702, 802), a transfer affordance (e.g., 726, 826, 828, 869) (e.g., an affordance for opening a sticker/photo gallery user interface, a payment affordance for opening a payment user interface, a resource-transfer affordance for opening a resource-transfer user interface). In some examples, the electronic device (e.g., 700, 800, 850) detects user activation of the transfer affordance (e.g., 726, 826, 828, 869), and in response to detecting the user activation of the transfer affordance (e.g., 726, 826, 828, 869), the electronic device (e.g., 700, 800, 850) displays, on the display (e.g., 702, 802, 851), the transfer user interface (e.g., 728, 840, 875) (e.g., a sticker/photo gallery for selecting stickers/photos to transfer, a numerical value selection user interface for selecting an amount of funds or an amount of resources) for initiating transfer of the first type of item to a participant in the message conversation (and ceasing to display the virtual keyboard). Displaying the transfer affordance (e.g., 726, 826, 828, 869) when the respective message relates to a transfer of an item and displaying the transfer user interface (e.g., 728, 840, 875) when the transfer affordance (e.g., 726, 826, 828, 869) is activated avoids the need for the device to receive multiple user inputs to initiate the transfer user interface.
Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
121
DK 2017 70505 A1
[0344] In some examples, in accordance with a determination that the respective message includes (or corresponds to) one or more features that indicate that the transfer request is a fraudulent transfer request (e.g., a transfer request from an unknown user/account, a transfer request from an unverified source, a transfer request from a flagged user/account), the electronic device (e.g., 700, 800, 850) forgoes displaying the transfer affordance (e.g., 726, 826, 828, 869). In some examples, the electronic device further provides a prompt/notification (e.g., 726, 836, 838) indicating that the respective message is suspected to be a spam/junk message. In some examples, messages from participants not in a list of contacts (e.g., address book app) of the electronic device are flagged as relating to a fraudulent transfer request. In some examples, messages from participants in a list of contacts (e.g., a list of known spammers) of the electronic device are flagged as relating to a fraudulent transfer request. Not displaying the transfer affordance when the respective message relates to a fraudulent transfer request reduces the likelihood that the user will participate in the transfer without further investigating the transfer, because the user must take additional steps to participate in the transfer, thereby enhancing the security of the technique and reducing the number of fraudulent transfers. Reducing the number of fraudulent transfers enhances the operability of the device and makes the user-device interface more secure (e.g., by reducing fraud when operating/interacting with the device).
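The fraud signals listed in paragraph [0344] can be combined in a simple gate that decides whether to forgo displaying the transfer affordance. The function name and the two lookup collections below are illustrative assumptions standing in for the device's real heuristics:

```python
def is_suspected_fraud(sender, contacts, known_spammers):
    """Return True when a transfer request shows a fraud signal.

    Illustrative sketch: a sender outside the contact list, or on a
    stored spammer list, is treated as a suspected fraudulent source.
    """
    if sender not in contacts:    # unknown/unverified user or account
        return True
    if sender in known_spammers:  # flagged user or account
        return True
    return False
```

When this returns `True`, the device would suppress the transfer affordance and could surface a spam/junk warning instead.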
[0345] In some examples, further in response to receiving the respective message, in accordance with a determination, based on the analysis of text in the respective message (and, optionally one or more prior messages in the message conversation), that the respective message does not relate to a transfer of the first type of item (e.g., a sticker, a photo, or a payment object), the electronic device (e.g., 700, 800, 850) displays (920), on the display (e.g., 702, 802, 851), a representation of the respective message (e.g., 720, 820, 865) (e.g., a regular text message, a regular chat bubble, a regular email message) without displaying the selectable indication (e.g., 722, 822, 867) that corresponds to the first type of item.
[0346] While the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, the electronic device (e.g., 700, 800, 850) detects (922), via the one or more input devices, user activation (e.g., 801, 871, a touch gesture, such as a tap) of the selectable indication.
[0347] In response (924) to detecting the user activation (e.g., 801, 871) of the selectable indication, the electronic device (e.g., 700, 800, 850) displays (926), on the display (702, 802, 851), a transfer user interface (e.g., 728, 840, 875) for initiating transfer of the first type of item between participants (e.g., 810, the user) in the message conversation (e.g., a sticker sharing interface, a photo sharing interface, a payment interface, or a resource-numerical value selection user interface for receiving user adjustment of the amount of resources, such as points, credits, or funds, to be sent or requested). Displaying an indication (e.g., 722, 822, 867) that is selectable when the respective message relates to a transfer of an item and displaying the transfer user interface (e.g., 728, 840, 875) when the indication (e.g., 722, 822, 867) is selected (e.g., activated) avoids the need for the device to receive multiple user inputs to initiate the transfer user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0348] In some examples, the text in the respective message includes (928) a first quantity (e.g., a number of stickers, a number of photos, a payment amount, a resource amount) of content of the first type of item. In some examples, the transfer user interface (e.g., 728, 840, 875) includes an indication of the first quantity (e.g., 848, 881) of the content of the first type of item. In some examples, a quantity (e.g., a numerical value, a numerical value adjacent to a currency symbol/character) being contained in the text is used during analysis of the text in the respective message to determine that the respective message relates to a transfer of the first type of item (e.g., a sticker, a photo, or a payment object) that the messaging application is configured to transfer. Automatically displaying the quantity of the item from the message in the transfer user
interface as a starting point allows the user to make adjustments (e.g., incrementing, decrementing) to the quantity derived from the message, rather than adjusting an unrelated value (e.g., value of 0), and helps to reduce the number of inputs needed to reach a desired adjusted value. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0349] In some examples, the transfer user interface (e.g., 728, 840, 875) includes (930) an indication of a second quantity of content of the first type of item, wherein the second quantity is a numerical value divided (e.g., proportionally) among the two or more participants based on the first quantity. For example, if the text in the respective message includes a first quantity of the content (e.g., a payment amount) of $20, and the number of other participants in the message conversation is 5 participants, the second quantity of the content is $20/5 = $4. Automatically displaying a value based on the quantity of the item from the message in the transfer user interface as a starting point allows the user to make adjustments (e.g., incrementing, decrementing) to the value, rather than adjusting an unrelated value (e.g., 0), and reduces the number of inputs needed to reach a desired adjusted value. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
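The per-participant division in paragraph [0349] reduces to a single arithmetic step, mirroring the $20 / 5 = $4 example. The function name and rounding to two decimal places (cents) are illustrative assumptions:

```python
def split_quantity(first_quantity, num_other_participants):
    """Divide a detected amount evenly among the other participants.

    Illustrative sketch of the second quantity described above,
    rounded to cents as an assumption.
    """
    return round(first_quantity / num_other_participants, 2)
```

For example, `split_quantity(20, 5)` yields `4.0`, the second quantity that the transfer user interface could display as a starting point.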
[0350] In some examples, the transfer user interface (e.g., 728, 840, 875) is concurrently displayed (932) with at least a portion of (e.g., some or all of) the representation of the respective message (e.g., 720, 820, 865) (and, optionally, with the selectable indication (e.g., 722, 822, 867) of the first resource amount). In some examples, the transfer user interface (e.g., 728, 840, 875)
is displayed in a bottom-half portion of the display, and the messaging application (or the conversation of the messaging application) containing the representation of the respective message (e.g., 720, 820, 865) is displayed in a top-half portion of the display.
[0351] In some examples, the transfer user interface (e.g., 728, 840, 875) includes (934) a transfer mode affordance (e.g., a toggle for switching between a “transfer out” mode and a “requesting transfer” mode).
[0352] In some examples, the transfer user interface (e.g., 728, 840, 875) includes (936) a send affordance (e.g., 738, 847, 889) (e.g., an affordance for sending a message associated with selected stickers, an affordance for sending a message associated with selected photos, an affordance for sending a message associated with a selected amount of funds, an affordance for sending a message with a selected amount of resources).
[0353] In some examples, while displaying the transfer user interface (e.g., 728, 840, 875), the electronic device (e.g., 700, 800, 850) receives (938) user input (e.g., a swipe in an upwards direction from an area of the transfer user interface towards and out of the top edge of the transfer user interface). In response (940) to receiving the user input, the electronic device (e.g., 700, 800, 850) displays (942), on the display (e.g., 702, 802, 851), a keypad user interface (e.g., 856, 862) (e.g., containing a numbers pad), wherein the keypad user interface (e.g., 856, 862) includes one or more suggested numerical values for a quantity of the first type of item to transfer. In some examples, the keypad user interface (e.g., 856, 862) replaces display of the transfer user interface (e.g., 728, 840, 875). In some examples, the suggested numerical values are based on location (e.g., local sales tax). In some examples, the suggested numerical values are based on a number of participants in the message conversation. In some examples, the suggested numerical values are based on context (e.g., an indication that a payment will be split or multiple items need to be paid for). In some examples, the suggested numerical values include a suggestion with a tip included. In some examples, the one or more suggested numerical values are displayed as part of the keypad user interface (e.g., 856, 862) in response to receiving the respective message and in accordance with the determination, based on analysis of text in the respective message (and, optionally one or more prior messages in the message
conversation), that the respective message relates to transfer of the first type of item (e.g., a sticker, a photo, or a payment object) that the messaging application is configured to transfer.
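The suggestion sources named in paragraph [0353] (location-based tax, the number of participants, and a tip-inclusive value) can be sketched as one function. The default rates, ordering of suggestions, and function name are illustrative assumptions:

```python
def suggested_values(amount, participants=1, tax_rate=0.0, tip_rate=0.15):
    """Build suggested numerical values for the keypad user interface.

    Illustrative sketch: the detected amount plus local tax, an even
    split per participant, and a total with a tip included.
    """
    with_tax = amount * (1 + tax_rate)
    return [
        round(with_tax, 2),                   # amount plus local sales tax
        round(with_tax / participants, 2),    # even split per participant
        round(with_tax * (1 + tip_rate), 2),  # suggestion with a tip included
    ]
```

With a detected amount of $20 and four participants, this yields suggestions of 20.0, 5.0, and 23.0 under the assumed rates.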
[0354] In some examples, the electronic device (e.g., 700, 800, 850) detects (944) a first activation (e.g., touchscreen tap on) of the transfer mode affordance. In response (946) to detecting the first activation of the transfer mode affordance, the electronic device designates (948) the message associated with the transfer of the first type of item as corresponding to a transmission (e.g., sending out) of the first type of item. The electronic device detects (950) a second activation of the transfer mode affordance. In response (952) to detecting the second activation of the transfer mode affordance, the electronic device designates (954) the message associated with the transfer of the first type of item as corresponding to a request for the first type of item.
[0355] In some examples, the electronic device detects (956) user activation of the send affordance (e.g., 738, 847, 889). In response (958) to detecting the user activation of the send affordance (e.g., 738, 847, 889), the electronic device (e.g., 700, 800, 850) displays (960), on the display (e.g., 702, 802, 851), a graphical representation of a message (e.g., 866, 853) (e.g., a message associated with selected stickers, a message associated with selected photos, a message associated with a selected amount of funds, a message with a selected amount of resources) associated with the transfer of the first type of item (e.g., stickers, photos, funds, resources) in the message conversation, wherein the graphical representation of the message (e.g., 866, 853) associated with the transfer of the first type of item includes an indication of a quantity of content (e.g., 868, 859) (e.g., a number of stickers, a number of photos, an amount of funds, an amount of resources) of the first type of item being transferred. Displaying a message that includes an indication of the quantity of the item transferred provides the user with visual feedback of the operation being performed and enables the user to subsequently review the message conversation to understand the amount of the item transferred and to whom it was transferred. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device)
which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0356] In some examples, in response (958) to detecting the user activation of the send affordance (e.g., 738, 847, 889) and prior to displaying, on the display (e.g., 702, 802), the graphical representation of the message (e.g., 866, 853) associated with the transfer of the first type of item in the message conversation, in accordance with a determination that the message associated with the transfer of the first type of item corresponds to a transmission of the first type of item, the electronic device (e.g., 700, 800, 850) displays (964), on the display (e.g., 702, 802, 851), an authentication user interface (e.g., 878, 831) requesting authentication information (e.g., biometric information, such as a fingerprint, facial features, iris/retina features, or input information such as a passcode or pattern). The electronic device (e.g., 700, 800, 850) receives (966), via the one or more input devices (e.g., 704, 804, 855), the authentication information. In accordance with a determination that the received authentication information corresponds to enrolled authentication information (stored on the device) for authorizing transfers, the electronic device displays (968), on the display, the graphical representation of the message (e.g., 866, 853) associated with the transfer of the first type of item in the message conversation (e.g., 708, 808, 859). In accordance with a determination that the received authentication information does not correspond to the enrolled authentication information for authorizing transfers, the electronic device forgoes displaying (970), on the display (e.g., 702, 802, 851), the graphical representation of the message (e.g., 866, 853) associated with the transfer of the first type of item in the message conversation (e.g., 708, 808, 859).
[0357] In some examples, while displaying (972), on the display (e.g., 702, 802, 851), the transfer user interface (e.g., 728, 840, 875), blocks 974-980 are performed. The electronic device (e.g., 700, 800, 850) displays (974) a numerical value (e.g., 848, 881) representing a quantity of the first type of item (e.g., “0” or a non-zero value determined based on the text analysis of the text in the respective message). The electronic device (e.g., 700, 800, 850) detects (976), via the one or more input devices, a user input (e.g., 803, 805). In accordance with a determination that the user input corresponds to a first type of user input, the electronic device increases (978) the displayed numerical value (e.g., 848, 881) by an amount corresponding to the
first type of user input. In some examples, the first type of user input corresponds to a selection of a first affordance, such as a “+” affordance (e.g., 850, 885). In some examples, the first type of user input corresponds to a horizontal/vertical scrub in a first direction. In accordance with a determination that the user input corresponds to a second type of user input, the electronic device decreases (980) the displayed numerical value (e.g., 848, 881) by an amount corresponding to the second type of user input. In some examples, the second type of user input corresponds to a selection of a second affordance, such as a “-” affordance (e.g., 852, 883). In some examples, the second type of user input corresponds to a vertical/horizontal scrub in a second direction.
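The two input types of paragraph [0357] amount to a small dispatch over the displayed numerical value. The input-type labels, step size, and clamping at zero are illustrative assumptions:

```python
def adjust_value(value, input_type, step=1):
    """Apply the first or second type of user input to the value.

    Illustrative sketch: "increase" models the "+" affordance or a
    scrub in the first direction; "decrease" models the "-" affordance
    or a scrub in the second direction. Clamping at zero is an
    assumption, not stated in the source.
    """
    if input_type == "increase":
        return value + step
    if input_type == "decrease":
        return max(0, value - step)
    return value  # unrecognized input leaves the value unchanged
```

For example, `adjust_value(5, "increase")` yields `6` and `adjust_value(5, "decrease")` yields `4`.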
[0358] In some examples, the user input is a continuous input (e.g., a “touch and hold” input, a prolonged input) on an affordance for at least a predetermined time. In accordance with the determination that the user input corresponds to the first type of user input (e.g., an input on a first affordance, such as a “+” affordance), the electronic device (e.g., 700, 800, 850) increases the displayed numerical value (e.g., 848, 881) at an increasingly faster rate based on the duration (and/or characteristic intensity) of the user input. In accordance with the determination that the user input corresponds to the second type of user input (e.g., an input on a second affordance, such as a “-” affordance), the electronic device decreases the displayed numerical value (e.g., 848, 881) at an increasingly faster rate based on the duration (and/or characteristic intensity) of the user input. Thus, in some examples, the displayed numerical value (e.g., 848, 881) changes (increases or decreases) at a progressively faster rate as the user input is held for an increasingly longer duration of time. Increasing or decreasing the numerical value at an increasingly faster rate based on the duration (or intensity) of the user input provides the user with feedback about the duration (or level of intensity) that is being detected by the device based on the user’s input and provides visual feedback to the user indicating that holding longer (or pressing harder) will cause the device to increase the rate of the change.
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
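The duration-based acceleration of paragraph [0358] can be modeled as a step size that grows with hold time. The specific thresholds and step sizes below are illustrative assumptions; only the monotonic speed-up is taken from the description:

```python
def step_for_duration(held_seconds):
    """Return the per-tick change for a press-and-hold of a given duration.

    Illustrative sketch: longer holds produce progressively larger
    steps, so the displayed value changes at a faster rate over time.
    """
    if held_seconds < 1.0:
        return 1   # slow stepping at first
    if held_seconds < 3.0:
        return 5   # faster after one second
    return 10      # fastest for long holds
```

An analogous mapping from characteristic intensity to step size would model the pressure-based variant in paragraph [0359].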
[0359] In some examples, the user input (e.g., 803, 805) is a continuous input on an affordance (e.g., a “touch and hold” input, a prolonged input) having a first characteristic intensity at a first time and a second characteristic intensity (e.g., stronger than the first contact intensity) at a second time (e.g., a time later than the first time). Thus, in some examples, the user input is a contact that becomes stronger/firmer as time passes. In accordance with the determination that the user input corresponds to the first type of user input (e.g., an input on a first affordance, such as a “+” affordance (e.g., 850, 885)), the electronic device increases the displayed numerical value (e.g., 848, 881) by a first rate at the first time and by a second rate (e.g., a rate faster than the first rate) at the second time. In accordance with the determination that the user input corresponds to the second type of user input (e.g., an input on a second affordance, such as a “-” affordance (e.g., 852, 883)), the electronic device decreases the displayed numerical value (e.g., 848, 881) by the first rate at the first time and by the second rate at the second time. Thus, in some examples, the displayed numerical value changes (increases or decreases) at a progressively faster rate as a user’s touch/contact input becomes increasingly firmer/stronger. Increasing or decreasing the numerical value at an increasingly faster rate based on the intensity of the user input provides the user with feedback about the level of intensity that is being detected by the device based on the user’s input and provides visual feedback to the user indicating that pressing harder will cause the device to increase the rate of the change.
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0360] In some examples, the electronic device (e.g., 700, 800, 850) provides feedback (e.g., haptic feedback in the form of one or more tactile outputs, audio feedback) while changing (increasing or decreasing) the displayed numerical value (e.g., by the amount corresponding to the first type of user input or to the second type of user input).
[0361] In some examples, in accordance with a determination that the user input corresponds to a third type of user input (e.g., an upwards swipe on the transfer user interface), the electronic
device (e.g., 700, 800, 850) replaces display of the transfer user interface (e.g., 728, 840, 875) with a numerical keypad user interface (e.g., 856, 862) (e.g., a user interface that includes an icon for each digit), wherein the numerical keypad user interface (e.g., 856, 862) includes a plurality of suggested values (for the quantity of the first type of item to transfer).
[0362] In some examples, an amount of at least one of the plurality of suggested values is determined based on stored historical use data (e.g., the most frequently used values, the most recently used value) associated with a user of the electronic device (e.g., 700, 800, 850).
[0363] In some examples, further in response to receiving the respective message, in accordance with a determination, based on the analysis of text in the respective message (and, optionally one or more prior messages in the message conversation), that the respective message does not relate to a transfer of the first type of item (e.g., a sticker, a photo, or a payment object), the electronic device displays, on the display, a representation of the respective message (e.g., a regular text message, a regular chat bubble, a regular email message) without displaying the selectable indication (e.g., 722, 822, 867) that corresponds to the first type of item.
[0364] In some examples, the selectable indication (e.g., 722, 822, 867) is a portion of the text (e.g., a name or quantity of a sticker(s), a name or quantity of a photo(s), an amount of funds, an amount of resources) in (the representation of) the respective message that relates to the first type of item that is visually distinguished (e.g., by underlining the portion of the text or displaying the portion of the text in a different color) from other text in the respective message.
[0365] In some examples, displaying, on the display (e.g., 702, 802, 851), the transfer user interface (e.g., 728, 840, 875) comprises replacing display of a virtual keyboard (e.g., 712, 812, a regular virtual keyboard of the operating system of the device) having a plurality of alphanumeric keys with the transfer user interface (e.g., 728, 840, 875).
[0366] In some examples, in accordance with a determination that a message prepared to be sent corresponds to the first type of item, the send affordance (e.g. 874) is displayed with a first visual characteristic (e.g., a color, a shade, a graphical pattern, a shape). In some examples, in accordance with a determination that the message prepared to be sent corresponds to a second
type of item (e.g., a textual message) different from the first type of item, the send affordance (e.g., 874) is displayed with a second visual characteristic (e.g., a different color, a different shade, a different graphical pattern, a different shape) different from the first visual characteristic. In some examples, when a message has been prepared to be sent that includes a payment or a request for payment, the send affordance (e.g., 874) is a first color, and when the message to be sent does not include a payment or a request for payment, the send affordance (e.g., 874) is a second color that is different from the first color. Visually differentiating between drafts of messages that do and drafts of messages that do not correspond to transfer of items helps the user avoid unintentionally sending messages that include transfer of items. This is particularly helpful because non-transfer messages involve limited consequences and users may send such messages with little review, while messages that correspond to transfers involve relatively higher consequences. The differentiated visual feedback prompts the user to review such messages more carefully. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
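The determination in paragraph [0366] reduces to selecting a visual characteristic from the prepared message's type. The type labels and the particular colors below are illustrative assumptions; the source specifies only that the two characteristics differ:

```python
def send_affordance_color(message_type):
    """Pick the send affordance's visual characteristic by message type.

    Illustrative sketch: a payment (or payment-request) draft gets the
    first color; any other draft gets a different second color.
    """
    if message_type == "payment":  # message corresponds to the first type of item
        return "first_color"
    return "second_color"          # e.g., an ordinary textual message
```

The same pattern covers paragraph [0367], where sent message objects associated with transfers are rendered with a third characteristic distinct from ordinary messages.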
[0367] In some examples, the graphical representation of the message (e.g., 866, 853) associated with transfer of the first type of item is displayed with a third visual characteristic (e.g., a color, a shade, a graphical pattern, a shape) in the message conversation (e.g., 708, 808, 859), and a representation of a message (e.g., 718, 720, 816, 818, 820, 865) in the message conversation not associated with transfer of the first type of item is displayed with a fourth visual characteristic (e.g., a different color, a different shade, a different graphical pattern, a different shape) that is different from the third visual characteristic. Visually differentiating between messages that do and do not correspond to transfer of items helps the user quickly identify messages that include transfers of items. This is particularly helpful because non-transfer messages involve limited consequences and users may glance over such messages with little review, while messages that correspond to transfers involve relatively higher consequences. The differentiated visual feedback prompts the user to review such messages more carefully (and
potentially take action). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0368] In some examples, in accordance with a determination that the respective message corresponds to a transmission, from a first participant in the message conversation, of a first quantity of content of the first type of item, the electronic device (e.g., 700, 800, 850) automatically (e.g., without checking for authentication, without requesting authentication information, without requiring user input) transfers the first quantity of content of the first type of item to the first participant (e.g., 710, 810). In some examples, in accordance with the determination that the respective message corresponds to a transmission, from the first participant in the message conversation, the electronic device displays (e.g., without checking for authentication, without requesting authentication information), on the display, a graphical representation of a message associated with transferring the first quantity of content of the first type of item to the first participant (e.g., 710, 810). Automatically accepting transfers of content when the message is a transfer of items to the user of the device allows quicker processing of the transfer and avoids the need for additional user inputs to accept the transfer. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0369] In some examples, while displaying, on the display (e.g., 702, 802, 851), the transfer user interface (e.g., 728, 840, 875), the electronic device (e.g., 700, 800, 850) displays an affordance for changing an account (e.g., a payment account, such as a debit card account or a credit card account, a points account, a resources account) for use in the transfer of the first type
of item. The electronic device detects, via the one or more input devices, user activation of the affordance for changing the account. In response to detecting the user activation of the affordance for changing the account, the electronic device displays, on the display, an account user interface including a representation of a current account and a representation of a second account, wherein the current account is currently selected for use in the transfer. The electronic device detects, via the one or more input devices, user selection of the representation of the second account. In response to detecting the user selection of the representation of the second account, the electronic device selects the second account for use in the transfer (e.g., without using the first account).
[0370] In some examples, in response to (or subsequent to) transferring the first type of item to participants in the message conversation, the electronic device (e.g., 700, 800, 850) provides (e.g., in addition to the outputted feedback described in method 1200 with reference to FIGS. 12A-12C) a dynamic graphical animation (e.g., moving cash, falling cash) within the representation of the message (or, alternatively, within the entire displayed message conversation or within the entire display). In some examples, in response to (or subsequent to) receiving the first type of item from participants in the message conversation, the electronic device (e.g., 700, 800, 850) provides (e.g., in addition to the outputted feedback described in method 1200 with reference to FIGS. 12A-12C) a dynamic graphical animation (e.g., moving cash/currency symbols, falling cash/currency symbols) within the representation of the message (or, alternatively, within the entire displayed message conversation or within the entire display).
[0371] In some examples, subsequent to initiating the transfer of the first type of item between the participants (e.g., 710, 810, the user) in the message conversation (e.g., 708, 808, 859), and in accordance with a determination that the transfer of the first type of item has been accepted by a participant, a dynamic visual, audio, and/or sensory feedback is applied to the representation of the message and/or the selectable indication that corresponds to the first type of item by the device (e.g., as described below in method 1200 with reference to FIGS. 12A-12C). In some examples, the dynamic feedback is a visual feedback where the font changes with the orientation of the device (e.g., as described below in method 1200 with reference to FIGS. 12A12C). In some examples, the dynamic feedback is a visual feedback where the font changes with
the movement of the user's face relative to the device (e.g., as described below in method 1200 with reference to FIGS. 12A-12C). In some examples, the dynamic feedback is a sensory feedback, such as a haptic feedback (e.g., as described below in method 1200 with reference to FIGS. 12A-12C).
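The conditional, orientation-driven feedback in paragraph [0371] can be sketched as a small function. This is an illustrative model only; the parameter names, the 45-degree working range, and the output scaling are assumptions, not values from the specification. Feedback is applied only once the transfer has been accepted, and its magnitude tracks the device's tilt relative to the reference point (e.g., the viewer's face).

```python
def dynamic_feedback(accepted: bool, tilt_degrees: float) -> dict:
    """Hypothetical mapping from acceptance state and device tilt to
    feedback parameters (e.g., a font shadow offset and haptic strength)."""
    if not accepted:
        # Before acceptance, the message renders statically: no feedback.
        return {"shadow_offset": 0.0, "haptic_strength": 0.0}
    # Clamp tilt to an assumed +/-45 degree range, then scale feedback
    # proportionally so it changes gradually with orientation.
    t = max(-45.0, min(45.0, tilt_degrees)) / 45.0
    return {"shadow_offset": round(4.0 * t, 2), "haptic_strength": abs(t)}
```

The gradual scaling models the later statement that the amount and direction of the feedback change are determined by the amount and direction of the orientation change.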
[0372] Note that details of the processes described above with respect to method 900 (e.g., FIGS. 9A-9I) are also applicable in an analogous manner to the methods described below. For example, method 900 optionally includes one or more of the characteristics of the various methods described below with reference to methods 1200, 1500, 1800, 2100, 2400, 2700, 3000, and 3400. For example, the outputting of dynamic feedback described in method 1200 can be applied with respect to the graphical representation of a message (e.g., 866, 853). For another example, the different visual appearances of a message object based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be applied with respect to the graphical representation of a message (e.g., 866, 853). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied with respect to the graphical representation of a message (e.g., 866, 853) when retrieving one or more items (e.g., a sticker, a photo, a payment) associated with the message. For another example, displaying representations of a first account and a second account, as described in method 2100, can also be displayed on the authentication user interface (e.g., 878, 831). For another example, automatically proceeding with a transfer, as described in method 2400, without any user input can be used to accept a transfer corresponding to the graphical representation of a message (e.g., 866, 853). For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can be displayed in response to user selection of the graphical representation of a message (e.g., 866, 853).
For another example, an utterance can be used, as described in method 3000, to create the graphical representation of a message (e.g., 866, 853). For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element (e.g., 868, 853) of a graphical representation of a message (e.g., 866, 853) when a transfer (e.g., of a resource, of a file, of a payment) associated with the message is completed. For brevity, these details are not repeated below.
[0373] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 9A-9I are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, receiving operation 902, displaying operation 904, receiving operation 908, displaying operation 912, detecting operation 922, and displaying operation 926 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
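The event-sorter/recognizer/handler chain of paragraph [0373] can be sketched in miniature. This is a hypothetical simplification of the architecture in FIGS. 1A-1B, not the actual event sorter 170 / event recognizer 180 / event handler 190 code: a dispatcher compares an incoming contact against each recognizer's definition (here, a hit region) and activates the matching recognizer's handler.

```python
class EventRecognizer:
    """Hypothetical recognizer: matches touch events whose location falls
    inside a rectangular region, then activates its handler."""
    def __init__(self, region, handler):
        self.region = region      # (x, y, width, height) this recognizer watches
        self.handler = handler    # callable invoked when the event matches

    def matches(self, event):
        x, y, w, h = self.region
        ex, ey = event["location"]
        return x <= ex < x + w and y <= ey < y + h

def dispatch(event, recognizers):
    # Analogous to the event dispatcher delivering event information:
    # the first recognizer whose definition matches handles the event.
    for recognizer in recognizers:
        if recognizer.matches(event):
            return recognizer.handler(event)
    return None

log = []
accept_button = EventRecognizer(
    (10, 10, 80, 30),
    lambda e: log.append("accept tapped") or "handled",
)
result = dispatch({"location": (20, 20)}, [accept_button])
```

A touch outside every recognizer's region simply falls through, mirroring how a contact that matches no event definition activates no handler.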
[0374] FIGS. 10A-10D illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 10A-10D relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 11A-11V, which in turn are used to illustrate the processes described below, including the processes in FIGS. 12A-12C.
[0375] FIG. 10A illustrates an electronic device 1000 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 10A-10D, electronic device 1000 is a smartphone. In other embodiments, electronic
device 1000 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1000 has a display 1002 and one or more sensor devices (e.g., an accelerometer, one or more cameras). In some embodiments, optionally, electronic device 1000 also has one or more input devices (e.g., a touchscreen of display 1002, a mechanical button 1004, a mic).
[0376] In FIG. 10A, electronic device 1000 displays, on display 1002, a message conversation 1008 of a messaging application 1006 between a user of the device (e.g., “Kate Appleseed”) and a message participant 1010 (e.g., “John Appleseed”). In some embodiments, message participant 1010 is a contact stored on the device. In some embodiments, message participant 1010 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 1010 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0377] In some embodiments, electronic device 1000 also displays, on display 1002, a virtual keyboard 1012 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 1014 displaying the text of a message as a message is typed using virtual keyboard 1012. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 1012 to type a message. In some embodiments, compose bar 1014 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 1014 includes a mic button 1016 which, when activated, enables the user to record a message using voice input.
[0378] As shown in FIG. 10A, message conversation 1008 includes a message object 1018. Message object 1018 corresponds to a message sent by message participant 1010 to the user of electronic device 1000. In message object 1018, the message participant states to the user: “Can you send me the photo from last night?” In other words, message participant 1010 is requesting to the user that the user send to the message participant, via the messaging application, a photo stored on electronic device 1000 (or accessible by the device) taken last night.
[0379] As shown in FIG. 10A, message conversation 1008 also includes a pending transfer message object 1020. Pending transfer message object 1020 corresponds to a pending transfer of a photo (from last night) sent by the user to message participant 1010 in response to the message participant's request in message object 1018. In some embodiments, instead of a photo, the transfer can be a different type of file, such as a video file, an audio file, or a document. In some embodiments, the transfer can be of a plurality of files. In some embodiments, pending transfer message object 1020 includes a mini-file object 1022 (e.g., a (selectable) thumbnail, a (selectable) preview image, a link) corresponding to the photo (from last night) sent by the user to message participant 1010 via pending transfer message object 1020. In some embodiments, pending transfer message object 1020 also includes a status indicator 1024 (e.g., stating "PENDING") that informs the user of a status of the transfer associated with the message object. In some embodiments, a transfer is "pending" when an intended recipient of the message corresponding to the transfer, which in this example is message participant 1010, has not yet accepted (e.g., viewed, downloaded) the file (e.g., photo, video file, audio file, document) corresponding to the transfer message object. In some embodiments, the file (e.g., the photo corresponding to preview image 1022) corresponding to the transfer is selected using the process described above with respect to FIGS. 7A-7E.
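The status lifecycle of a transfer message object, pending until the recipient views or downloads the file, can be sketched as a tiny state model. The class, its fields, and the file name used below are hypothetical illustrations, not part of the specification.

```python
class TransferMessage:
    """Hypothetical model of a transfer message object: the status
    indicator reads PENDING until the intended recipient accepts
    (views or downloads) the attached file, then reads VIEWED."""
    def __init__(self, file_name: str):
        self.file_name = file_name   # illustrative placeholder name
        self.status = "PENDING"

    def recipient_viewed(self) -> str:
        # Viewing/downloading by the intended recipient completes the
        # transfer, updating the status indicator.
        self.status = "VIEWED"
        return self.status

msg = TransferMessage("photo.jpg")
msg.recipient_viewed()
```

Once the transition fires, the pending transfer message object becomes a completed transfer message object, which is when the device begins generating the dynamic feedback described below.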
[0380] FIG. 10B shows electronic device 1000, while displaying the display (including pending transfer message object 1020 within message conversation 1008) shown in FIG. 10A, being viewed at two different angles (angle 1000A and angle 1000B) relative to a reference point 1026 that is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device. Alternatively, in some embodiments, the reference point is a static point external to the device, such as a location on the ground or floor. As shown in FIG. 10B, from the perspective of reference point 1026 of a viewer (e.g., the user) viewing display 1002 of the device at either angle 1000A or at angle 1000B, pending transfer message object 1020 appears the same at either angle. In other words, whether a viewer (e.g., the user) views display 1002 of the device at angle 1000A, or whether a viewer (e.g., the user) views display 1002 of the device at angle 1000B, or whether a viewer (e.g., the user) views display 1002 of the device from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 10A), there is no change in how the pending transfer message object is perceived
by the user, for there is no change in how the pending transfer message object is displayed on display 1002 by the device. Thus, in FIG. 10B (in contrast to FIG. 10D, described below), the device does not provide any feedback (e.g., visual feedback, audio feedback) associated with pending transfer message object 1020 to a viewer (e.g., the user) of the device in response to a change in orientation (e.g., change in movement, change in viewing angle) of the device relative to a reference point (e.g., the viewer's face, a static point external to the device).
[0381] In FIG. 10C, the photo corresponding to mini-file object 1022 has been viewed (or downloaded) by message participant 1010. Thus, FIG. 10C shows, in place of pending transfer message object 1020, a completed transfer message object 1028. A transfer message object is a completed transfer message object (as opposed to a pending transfer message object) when a file (e.g., the photo corresponding to mini-file object 1022) associated with the transfer corresponding to the transfer message object has been viewed (or downloaded) by the intended recipient (e.g., message participant 1010) of the transfer. In some embodiments, status indicator 1024 is updated (e.g., to state “VIEWED” instead of “PENDING”) to inform the user that the file (e.g., the photo corresponding to mini-file object 1022) corresponding to the transfer associated with the transfer message object has been viewed (or downloaded) by the intended recipient.
[0382] In some embodiments, electronic device 1000 generates a feedback (e.g., a visual effect, a sensory feedback, such as a haptic effect, an audio feedback) associated with a completed transfer message object, which indicates to the user that the transfer of the file corresponding to the message object has been accepted (e.g., viewed, downloaded). For example, once the transfer of the file corresponding to the transfer message object has been completed, a visual effect is applied to mini-file object 1022 of completed transfer message object 1028. In some embodiments, the visual effect applied to the mini-file object is a bolding (or thickening) of a border of the mini-file object. In some embodiments, the visual effect applied to the mini-file object is a black outline (e.g., a shadow) applied to a border of the mini-file object. In some embodiments, the visual effect applied to the mini-file object is a change in color of at least a portion of the mini-file object.
[0383] In some embodiments, electronic device 1000 generates feedback (e.g., a visual feedback, a haptic feedback, an audio feedback) that is associated with the completed transfer message object or associated with an element (e.g., mini-file object 1022) of the completed transfer message object. In some embodiments, the feedback is a dynamic visual feedback causing display of the completed transfer message object (e.g., completed transfer message object 1028) or an element (e.g., mini-file object 1022) of the transfer message object to change as changes in the orientation of the device relative to reference point 1026 are detected. In some embodiments, changes in orientation of the device are detected via the one or more sensors of the device (e.g., an accelerometer, a camera). For example, the device detects movement, and thus changes in its orientation, via an accelerometer. For another example, the device detects changes in its position relative to the face of a viewer (e.g., the user) via a camera. In some embodiments, the dynamic feedback (e.g., visual, haptic, and/or audio feedback) gradually changes as the orientation of the device and/or the position of the device relative to the face of the user changes (e.g., the amount and/or direction of the change in the dynamic feedback is determined by an amount and/or direction of the change in the orientation of the device and/or the position of the device relative to the face of the user).
[0384] For example, in FIG. 10D, the dynamic visual feedback is a 3D effect that provides the user with the visual effect that mini-file object 1022 of the completed transfer message object 1028 is three-dimensional (e.g., similar to the one or more types of visual feedback applied to amount object 3324 described below with reference to, for example, FIGS. 33D-33J). Thus, in FIG. 10D, based on reference point 1026, mini-file object 1022 of completed transfer message object 1028 looks visually different (e.g., the mini-file object that corresponds to a photo appears to be a dynamic cube, with the photo displayed on one side of the cube, instead of a two dimensional photo) from angle 1000A of the device and from angle 1000B of the device and, optionally, both the view of mini-file object 1022 of completed transfer message object 1028 from angle 1000A and angle 1000B look different from the appearance of the mini-file object of the completed transfer message object from straight on (e.g., such that the display is not tilted at an angle relative to the reference point, as shown in FIG. 10C). In some embodiments, the dynamic visual feedback is a changing color applied to at least a portion of the mini-file object or to at least a portion of the completed transfer message object. In some embodiments, the
dynamic visual feedback is a changing background applied to the completed transfer message object. In some embodiments, the dynamic visual feedback is a moving of one or more elements, such as mini-file object 1022, of the completed transfer message object.
[0385] In some embodiments, as also shown in FIG. 10D, in addition to, or instead of, generating a dynamic visual feedback, the device generates a dynamic haptic feedback 1030 (e.g., similar to the generated tactile output 3336 described below with reference to, for example, FIGS. 33F-33H). In some embodiments, the dynamic haptic feedback is a dynamically strengthening and weakening tactile output caused by the device. In some embodiments, the dynamic haptic feedback is a tactile output with changing tactile output patterns caused by the device. In some embodiments, the strength or frequency of the tactile output changes as the device detects changes in the orientation of the device relative to the reference point (e.g., reference point 1026).
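The dynamic haptic feedback of paragraph [0385], tactile output whose strength or frequency changes as the device detects orientation changes, can be sketched as a mapping function. The 30-degree normalization range and the frequency band below are invented for illustration and are not values from the specification.

```python
def haptic_output(orientation_delta_deg: float) -> dict:
    """Hypothetical mapping from detected change in device orientation
    (relative to the reference point) to tactile-output parameters:
    larger orientation changes strengthen the output and shift its
    pattern, modeling a dynamically strengthening/weakening haptic."""
    # Normalize against an assumed 30-degree range, clamped to [0, 1].
    magnitude = min(abs(orientation_delta_deg) / 30.0, 1.0)
    return {
        "strength": round(magnitude, 2),             # 0.0 (still) .. 1.0 (full tilt)
        "frequency_hz": round(80 + 90 * magnitude),  # changing tactile pattern
    }
```

A device at rest produces no tactile output; as tilt grows, both the strength and the output pattern (frequency) change, consistent with the described dynamically varying haptic.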
[0386] In some embodiments, the generated feedback (e.g., visual feedback, haptic feedback, audio feedback) is caused (e.g., only) by an operating system program of the device, and non-operating system programs of the device are not enabled to cause the feedback.
[0387] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 10A-10D described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 11A-11V described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 10A-10D and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 11A-11V are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 1000, or 1100).
[0388] FIGS. 11A-11V illustrate exemplary user interfaces for peer-to-peer transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 12A-12C.
[0389] FIG. 11A illustrates an electronic device 1100 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 11A-11V, electronic device 1100 is a smartphone. In other embodiments, electronic device 1100 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1100 has a display 1102 and one or more sensor devices (e.g., an accelerometer, one or more cameras). In some embodiments, optionally, electronic device 1100 also has one or more input devices (e.g., a mechanical button 1104).
[0390] In FIG. 11A, electronic device 1100 displays, on display 1102, a message conversation 1108 of a messaging application 1106 between a user of the device (e.g., “Kate Appleseed”) and a message participant 1110 (e.g., “John Appleseed”). In some embodiments, message participant 1110 is a contact stored on the device. In some embodiments, message participant 1110 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 1110 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0391] In some embodiments, electronic device 1100 also displays, on display 1102, a virtual keyboard 1112 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 1114 displaying the text of a message as a message is typed using virtual keyboard 1112. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 1112 to type a message. In some embodiments, compose bar 1114 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 1114 includes a mic button 1114A which, when activated, enables the user to record a message using voice input.
[0392] As shown in FIG. 11A, message conversation 1108 includes a message object 1116 and a payment message object 1118. Message object 1116 corresponds to a message sent by the user of electronic device 1100 to message participant 1110. In message object 1116, the user states to message participant 1110: "Dinner was $28." In other words, the user is informing message participant 1110 that the user is owed $28 by message participant 1110 (and thus requesting that $28 be paid by message participant 1110 to the user). Payment message object 1118 corresponds
to a payment sent by message participant 1110 to the user for $28 (responding to the user's request for payment of $28). In addition, an accompanying note message object 1126 corresponding to an accompanying note (e.g., "for dinner") sent together with the message corresponding to payment message object 1118 is also displayed.
[0393] As shown, payment message object 1118 also includes a mode indication 1120 (e.g., stating “PAY”) that the payment message object corresponds to a payment made by message participant 1110 to the user via an operating-system controlled payment transfer application (and not by a third-party application). Payment message object 1118 also includes an amount indication 1122 (e.g., “$28”) of the amount of the payment sent by message participant 1110 to the user. Alternatively, if payment message object 1118 corresponded to a request for payment (instead of a sent payment) by message participant 1110 to the user, amount indication 1122 would indicate an amount of the requested payment and, optionally, further indication that the amount is being requested (e.g., “$28 requested”).
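The amount indication described above renders differently for a sent payment (e.g., "$28") versus a payment request (e.g., "$28 Request", as in FIG. 11H). A minimal sketch of that formatting rule follows; the function name and the exact request wording are illustrative assumptions.

```python
def amount_indication(amount_cents: int, is_request: bool) -> str:
    """Hypothetical formatter for amount indication 1122: plain amount
    for a sent payment, amount plus request text for a payment request."""
    # Show whole dollars without cents (e.g., "$28"); otherwise two decimals.
    if amount_cents % 100 == 0:
        text = f"${amount_cents // 100}"
    else:
        text = f"${amount_cents / 100:.2f}"
    return f"{text} Request" if is_request else text
```

So the $28 payment from message participant 1110 renders as "$28", while the $28 payment request from the unknown participant in FIG. 11H renders with the additional request text.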
[0394] Payment message object 1118 also includes an accept button 1124 for accepting the payment (or, alternatively, agreeing to send the payment made by a payment request) corresponding to received message object 1118 in the amount shown in amount indication 1122. In some embodiments, payment message object 1118 also includes an accompanying note message object 1126. In FIG. 11A, message participant 1110 informs the user, via note message object 1126, that the payment corresponding to payment object 1118 is “For dinner” (that was requested by the user via message object 1116).
[0395] FIG. 11B shows electronic device 1100, while displaying the display (including payment message object 1118 within message conversation 1108) shown in FIG. 11A, being viewed at two different angles (angle 1100A and angle 1100B) relative to a reference point 1128 that is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device. Alternatively, in some embodiments, the reference point is a static point external to the device, such as a location on the ground or floor. As shown in FIG. 11B, from the perspective of reference point 1128 of a viewer (e.g., the user) viewing display 1102 of the device at either angle 1100A or at angle 1100B, payment message object 1118 appears the same at either angle. In other words, whether a viewer (e.g., the user) views display 1102 of the
device at angle 1100A, or whether a viewer (e.g., the user) views display 1102 of the device at angle 1100B, or whether a viewer (e.g., the user) views display 1102 of the device from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 11A), there is no change in how the payment message object is perceived by the user, for there is no change in how the payment message object is displayed on display 1102 by the device. Thus, in FIG. 11B (in contrast to FIG. 11E, described below), the device does not provide any feedback associated with payment message object 1118 to a viewer (e.g., the user) of the device in response to a change in orientation (e.g., change in movement, change in viewing angle) of the device relative to the viewer's (e.g., the user's) face.
[0396] In FIG. 11C, while displaying payment message object 1118 within message conversation 1108, electronic device 1100 detects user input on accept button 1124 of the payment message object. For example, as shown in FIG. 11C, the user input is a tap gesture 1101 on accept button 1124 of payment message object 1118.
[0397] FIG. 11D shows, in place of (non-completed) payment message object 1118, a completed payment message object 1132. Specifically, as shown in FIG. 11D, in response to detecting tap gesture 1101 on payment message object 1118 (thereby accepting the payment from message participant 1110), accept button 1124 ceases to be displayed on the payment message object. As also shown in FIG. 11D, in response to detecting tap gesture 1101 on payment message object 1118 (thereby accepting the payment from message participant 1110), electronic device 1100 generates a feedback (e.g., a visual effect, a sensory feedback, such as a haptic effect, an audio feedback) indicating to the user that the payment corresponding to payment message object 1118 has been accepted and that payment message object 1118, now completed payment message object 1132, corresponds to a payment that has already been accepted.
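The accept flow of FIGS. 11C-11D, tapping the accept button removes the button, converts the object into a completed payment message object, and triggers feedback, can be sketched as follows. This is a hypothetical model; the field names and feedback labels are invented for illustration.

```python
class PaymentMessage:
    """Hypothetical model of payment message object 1118 becoming
    completed payment message object 1132 when the accept button is tapped."""
    def __init__(self, amount: str):
        self.amount = amount
        self.show_accept_button = True
        self.completed = False
        self.feedback_events = []

    def tap_accept(self):
        # A second tap on an already-accepted payment does nothing here.
        if not self.show_accept_button:
            return
        self.show_accept_button = False   # accept button ceases to be displayed
        self.completed = True             # now a completed payment message object
        # Visual change to the amount indication plus sensory/audio feedback.
        self.feedback_events += ["bold_amount", "haptic", "audio"]

msg = PaymentMessage("$28")
msg.tap_accept()
```

The point of the model is the atomicity of the transition: button removal, completion state, and feedback generation all follow from the single tap gesture 1101.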
[0398] For example, as shown in FIG. 11D, in response to detecting tap gesture 1101 on payment message object 1118, amount indication 1122 (e.g., "$28") of completed payment message object 1132 is visually changed. In some embodiments, the visual change to amount indication 1122 is a bolding (or thickening) of the font of the displayed amount (e.g., "$28"). In some embodiments, the visual change to amount indication 1122 includes a black outline (e.g., a
shadow) applied to the font of the displayed amount (e.g., "$28"). In some embodiments, the visual change to amount indication 1122 is a change in color (e.g., from black to white) of the displayed amount (e.g., "$28").
[0399] In some embodiments, in response to detecting tap gesture 1101 on payment message object, electronic device 1100 generates feedback (e.g., a visual feedback, a haptic feedback, an audio feedback) associated with the payment message object. In some embodiments, the feedback is a dynamic visual feedback causing display of the payment message object (e.g., completed payment message object 1132) to change as changes in the orientation of the device relative to reference point 1128 are detected. In some embodiments, changes in orientation of the device are detected via the one or more sensors of the device (e.g., an accelerometer, a camera). For example, the device detects movement, and thus changes in its orientation, via an accelerometer. For another example, the device detects changes in its position relative to the face of a viewer (e.g., the user) via a camera. In some embodiments, the dynamic feedback (e.g., visual, haptic, and/or audio feedback) gradually changes as the orientation of the device and/or the position of the device relative to the face of the user changes (e.g., the amount and/or direction of the change in the dynamic feedback is determined by an amount and/or direction of the change in the orientation of the device and/or the position of the device relative to the face of the user).
[0400] For example, in FIG. 11E, the dynamic visual feedback is a 3D effect (e.g., the simulated depth effect 3325 described below with reference to FIGS. 33D-33J) that provides the user with the visual effect that amount indication 1122 of the payment message object is three-dimensional (e.g., similar to the one or more types of visual feedback applied to amount object 3324 described below with reference to, for example, FIGS. 33D-33J). Thus, in FIG. 11E, based on reference point 1128 of the user, amount indication 1122 of payment message object 1118 looks visually different (e.g., shadows behind the displayed numbers of amount indication 1122 appear different) from angle 1100A of the device and from angle 1100B of the device and, optionally, both the view of payment message object 1118 from angle 1100A and angle 1100B look different from the appearance of the payment message object 1118 from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG.
11D). In some embodiments, the dynamic visual feedback is a changing color applied to the amount indication (or to the entire payment message object). In some embodiments, the dynamic visual feedback is a changing background applied to the payment message object. In some embodiments, the dynamic visual feedback is a moving of one or more elements, such as amount indication 1122 or mode indication 1120, of the payment message object.
[0401] In some embodiments, as also shown in FIG. 11E, in addition to, or instead of, generating a dynamic visual feedback, the device generates a dynamic haptic feedback 1130 (e.g., similar to the generated tactile output 3336 described below with reference to, for example, FIGS. 33F and 33H). In some embodiments, the dynamic haptic feedback is a dynamically strengthening and weakening tactile output caused by the device. In some embodiments, the dynamic haptic feedback is a tactile output with changing tactile output patterns caused by the device. In some embodiments, the strength or frequency of the tactile output changes as the device detects changes in the orientation of the device relative to the reference point (e.g., reference point 1128).
[0402] In some embodiments, the generated feedback (e.g., visual feedback, haptic feedback, audio feedback) is caused (e.g., only) by an operating system program of the device, and non-operating system programs of the device are not enabled to cause the feedback.
[0403] In FIG. 11F, while displaying completed payment message object 1132 within message conversation 1108, electronic device 1100 detects a user input on the completed payment message object. For example, as shown in FIG. 11F, the user input is a tap gesture 1103 on completed payment message object 1132.
[0404] In FIG. 11G, in response to detecting tap gesture 1103, electronic device 1100 displays (e.g., replaces display of messaging application 1106 and virtual keyboard 1112 with) a transaction detail user interface 1134 that includes a graphical representation 1135 (e.g., a copy) of the completed payment message object (e.g., completed payment message object 1132), a textual indication 1137 of a note accompanying the payment (e.g., accompanying note message object 1126 stating "For dinner"), and a plurality of details relating to the selected transaction (e.g., the payment corresponding to completed payment message object 1132). In some embodiments,
transaction detail user interface 1134 includes an indication 1134A of the payment (or, alternatively, payment request) sender/recipient. In some embodiments, transaction detail user interface 1134 includes an indication 1134B of the payment account or the payment receipt account. In some embodiments, transaction detail user interface 1134 includes an indication 1134C of the date and time of the payment (or, alternatively, of the payment request). In some embodiments, transaction detail user interface 1134 includes an indication 1134D of the date and time of the completed transaction. In some embodiments, transaction detail user interface 1134 includes an indication 1134E of a transaction number. In some embodiments, transaction detail user interface 1134 includes an indication 1134F of the payment (or, alternatively, payment request) amount.
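The fields of transaction detail user interface 1134 (indications 1134A-1134F) can be mirrored in a simple record type. The field names and the sample values below are purely illustrative placeholders, not data from the specification.

```python
from dataclasses import dataclass

@dataclass
class TransactionDetail:
    """Hypothetical field-for-field model of transaction detail
    user interface 1134 (indications 1134A-1134F)."""
    sender_or_recipient: str   # 1134A: payment (or payment request) sender/recipient
    account: str               # 1134B: payment account or payment receipt account
    sent_at: str               # 1134C: date/time of the payment or request
    completed_at: str          # 1134D: date/time of the completed transaction
    transaction_number: str    # 1134E: transaction number
    amount: str                # 1134F: payment (or payment request) amount

detail = TransactionDetail(
    sender_or_recipient="John Appleseed",   # illustrative values only
    account="Payment Account",
    sent_at="2017-05-16 18:32",
    completed_at="2017-05-16 18:47",
    transaction_number="0000001",
    amount="$28",
)
```

Modeling the interface as a flat record reflects that each indication is an independent, read-only detail of one selected transaction.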
[0405] As shown, transaction detail user interface 1134 includes a wallet button 1136 (e.g., a "View in Wallet" selectable indication) for viewing the transaction details in an electronic wallet application of electronic device 1100. In some embodiments, transaction detail user interface 1134 includes a send again button 1131 (e.g., if the payment associated with payment message object 1135 was a payment made by the user to a message participant) for creating a new payment message object corresponding to a payment in the same amount as the currently-viewed transaction intended for the same recipient as the currently-viewed transaction. Thus, send again button 1131 provides the user with a quick and easy option to perform another payment in the same amount (e.g., "$28") to the same recipient (e.g., message participant 1110) via the transaction detail user interface of the last transaction with that recipient. In some embodiments, transaction detail user interface 1134 includes a refund button 1133 (e.g., if the payment associated with payment message object 1135 was a payment made by the user to a message participant) for requesting a refund of a sent payment. In some embodiments, refund button 1133 is only available (e.g., is only visible, is only selectable) if the payment associated with the payment message object has been accepted (e.g., is no longer pending because the intended recipient (e.g., message participant 1110) has accepted the payment).
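The availability rule for refund button 1133, visible and selectable only once the payment is no longer pending, reduces to a small predicate. The status strings below are illustrative, not from the specification.

```python
def refund_button_available(payment_status: str) -> bool:
    """Hypothetical predicate: the refund button is only available
    (visible/selectable) once the intended recipient has accepted the
    payment, i.e., the payment is no longer pending."""
    return payment_status == "accepted"
```

A pending payment therefore shows no refund option; it appears only after acceptance by the intended recipient.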
[0406] FIG. 11H illustrates a different message conversation 1140 of messaging application 806. In FIG. 11H, message conversation 1140 is between the user of electronic device 1100 and an unknown participant 1142. In some embodiments, unknown participant 1142 is a participant
that does not correspond to a contact stored on the device. In some embodiments, unknown participant 1142 is a participant that is not included in a contact of a contact list associated with the user account logged onto the device. In some embodiments, unknown participant 1142 is a participant not included in a trusted contacts list associated with the user account logged onto the device. In some embodiments, unknown participant 1142 is a participant included in a non-trusted contacts list (e.g., a spam list) associated with the user account logged onto the device. In some embodiments, unknown participant 1142 is a participant included in a non-trusted user list (e.g., a spam list) maintained by an external device, such as a server. Note that FIG. 8C and the corresponding description provide additional examples relating to unknown participants.
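The several determinations above can be folded into one check, sketched below. This is an illustrative Python sketch; the set-based list representations and the function name are assumptions, not part of the described embodiments:

```python
def is_unknown_participant(handle: str,
                           contacts: set[str],
                           trusted_contacts: set[str],
                           spam_lists: list[set[str]]) -> bool:
    """Return True when the sender should be treated as unknown.

    Mirrors the checks listed above: a handle on any non-trusted
    (spam) list — whether kept locally or maintained by a server —
    is always unknown; otherwise the handle is unknown unless it
    appears in the stored contacts or the trusted contacts list.
    """
    if any(handle in spam for spam in spam_lists):
        return True
    return handle not in contacts and handle not in trusted_contacts
```

Note the spam-list check takes precedence: a handle that is both a stored contact and spam-listed is still treated as unknown under this sketch.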
[0407] As shown in FIG. 11H, electronic device 1100 displays in message conversation 1140 a payment message object 1144 (and an accompanying note message object 1146) corresponding to a payment request of $28 made by unknown participant 1142 to the user. In some embodiments, a payment message object (e.g., payment message object 1144) corresponding to a payment request includes a payment amount indication 1122 that also includes additional text (e.g., “$28 Request”) that indicates to the user that the payment message object relates to a payment request, and not a payment. As with a payment message object relating to a payment (e.g., payment message object 1118), a payment message object relating to a payment request (e.g., payment message object 1144) also includes an accept button 1124 for accepting the payment request (e.g., agreeing to pay the requested amount in the payment request).
[0408] In some embodiments, as also shown in FIG. 11H, a payment message object corresponding to a payment request (e.g., payment message object 1144) (instead of a payment) includes a request indicator 1145 that indicates to the user that the payment message object corresponds to a payment request (e.g., a payment request made by the user of the device to a message participant or a payment request sent by a message participant to the user) and not to a payment. In some embodiments, as shown in FIG. 11H, request indicator 1145 is a currency symbol (e.g., the dollar symbol “$”) displayed at a center region of the message object. In some embodiments, request indicator 1145 is a graphical symbol. In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1145 correspond with the visual characteristics (e.g., font type,
boldness / thickness, color, shading) of an amount indication of a payment message object that corresponds to a (pending or completed) payment (e.g., amount indication 1122 of payment message objects 1118, 1132). In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1145 are different from (and thus do not correspond with) the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of an amount indication of a payment message object that corresponds to a (pending or completed) payment request (e.g., amount indication 1122 of payment message object 1144).
[0409] In FIG. 11H, as described above, the payment request corresponding to payment message object 1144 is from an unknown participant (e.g., unknown participant 1142). In some embodiments, in accordance with a determination that the payment request corresponding to the payment message object is from an unknown participant, electronic device 1100 displays a spam notification 1158 (e.g., a textual notification, a graphical notification, a prompt) that the message is from an unknown participant. For example, as shown in FIG. 11H, the device displays within message conversation 1140 spam notification 1158 (a notification message) stating “this sender is not in your contacts list.” In some embodiments, the device further displays (e.g., below spam notification 1158) a selectable reporting notification 1160 (e.g., a selectable text, a button) for reporting (e.g., transmitting information about) the unknown participant to an external device (e.g., a server). For example, as shown in FIG. 11H, the device displays, below spam notification 1158, selectable reporting notification 1160 (e.g., selectable text) stating “Report Spam.”

[0410] In FIG. 11I, while displaying spam notification 1158, electronic device 1100 detects (despite the displayed spam notification 1158) a user input on accept button 1124 of payment message object 1144 from unknown participant 1142. For example, as shown in FIG. 11I, the detected user input is a tap gesture 1105 on accept button 1124.
[0411] In some embodiments, as shown in FIG. 11J, in response to detecting (despite the displayed spam notification 1158) tap gesture 1105, electronic device 1100 displays, on display 1102, a pop-up warning 1162 further informing the user that the payment request corresponding to payment message object 1144 is from an unknown sender (e.g., unknown participant 1142).
In some embodiments, pop-up warning 1162 includes a cancel button 1162A and a proceed button 1162B.
[0412] In FIG. 11K, while displaying pop-up warning 1162, electronic device 1100 detects (despite the pop-up warning again informing the user that the payment request is from an unknown sender) a user input on proceed button 1162B of pop-up warning 1162. For example, as shown in FIG. 11K, the detected user input is a tap gesture 1107 on proceed button 1162B of pop-up warning 1162.
[0413] In FIG. 11L, in response to detecting tap gesture 1107 on proceed button 1162B of pop-up warning 1162, electronic device 1100 ceases displaying pop-up warning 1162. Further, in some embodiments, in response to detecting tap gesture 1107 on proceed button 1162B of pop-up warning 1162, the device displays (e.g., replaces display of virtual keyboard 1112 with) a payment transfer user interface 1164 corresponding to payment transfer user interface 840 described above with reference to FIGS. 8A-8AH. In some embodiments, as shown in FIG. 11L, based on the payment request corresponding to payment message object 1144 for the amount of $28, payment transfer user interface 1164 is displayed with the requested payment amount (of $28) pre-populated in value change region 1166.
[0414] In FIG. 11M, while displaying payment transfer user interface 1164, electronic device 1100 detects user activation of send button 1168 (e.g., to send the requested payment). For example, as shown in FIG. 11M, the user activation is a tap gesture 1109 on send button 1168.
[0415] In FIG. 11N, in response to detecting tap gesture 1109 on send button 1168, electronic device 1100 displays (e.g., replaces display of payment transfer user interface 1164 with) payment confirmation user interface 1178 corresponding to payment confirmation user interface 878 described above with reference to FIGS. 8T-8W.
[0416] In some embodiments, as shown in FIG. 11N, payment confirmation user interface 1178 includes an authentication request 1180 (e.g., a graphical request, a textual request) requesting that the user provide authentication information to proceed with making the payment requested by the payment request. In some embodiments, the requested authentication is
biometric authentication, such as facial recognition authentication, fingerprint authentication, voice recognition authentication, iris scan authentication, or retina scan authentication. For example, in FIG. 11N, the requested authentication information (e.g., as shown in authentication request 1180) is fingerprint information (e.g., “Pay with Fingerprint”). In some embodiments, payment confirmation user interface 1178 also includes an indication 1176 that the current payment is intended for an unknown recipient (e.g., unknown participant 1142).
[0417] In FIG. 11O, while displaying payment confirmation user interface 1178, electronic device 1100 receives, from the user, the requested fingerprint information 1111 (e.g., via mechanical button 1104). While (or subsequent to) receiving fingerprint information 1111 from the user, a determination is made (e.g., by the device or by an external device, such as a server) whether fingerprint information 1111 is consistent with enrolled authentication information (e.g., enrolled fingerprint information) of the user. As shown in FIG. 11P, in accordance with a determination that fingerprint information 1111 is consistent with enrolled fingerprint information of the user, the device updates authentication request 1180 (previously showing a request for a certain type of authentication information) to indicate that the authentication was successful (e.g., by displaying a checkmark, by displaying “Authorization Successful” or “Payment Complete”).
[0418] In some embodiments, in accordance with a determination that fingerprint information 1111 is not consistent with enrolled fingerprint information of the user (e.g., authentication was not successful), the device displays an indication that the authentication was unsuccessful and a request to re-provide the requested authentication information. In some embodiments, in accordance with a determination that fingerprint information 1111 is (e.g., for a second time) not consistent with enrolled fingerprint information of the user, the device displays a verification user interface (e.g., as described below with reference to FIGS. 31A-31M) for providing a different type of authentication information or for verifying that the user is the user that is associated with the user account logged onto the device.
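The retry-then-fallback flow described in the last two paragraphs can be sketched as follows. This is an illustrative Python sketch; the injected callables and the attempt limit of two are assumptions (the description says "e.g., for a second time"), not a fixed part of the embodiments:

```python
def authorize_payment(read_sample, matches_enrolled, fallback_verification,
                      max_attempts: int = 2) -> str:
    """Run the biometric confirmation flow sketched above.

    Each failed attempt corresponds to the device asking the user to
    re-provide the credential; after max_attempts failures the flow
    drops into a secondary verification UI (e.g., a different type of
    authentication information). Callables are injected so the sketch
    stays device-agnostic.
    """
    for _ in range(max_attempts):
        if matches_enrolled(read_sample()):
            return "authorization_successful"
    return fallback_verification()
```

For example, a first bad reading followed by a good one succeeds, while repeated failures hand off to the fallback verification result.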
[0419] As shown in FIG. 11Q, in response to a successful user authentication, electronic device 1100 removes display of payment confirmation user interface 1178 (and again displays virtual keyboard 1112 in place of the removed payment confirmation user interface 1178). In
some embodiments, as also shown in FIG. 11Q, the device displays a new payment message object 1170 corresponding to the payment made by the user to unknown participant 1142 in response to the payment request from the unknown participant. As also shown in FIG. 11Q, payment message object 1170 includes a first status indicator 1194 informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, in FIG. 11Q, first status indicator 1194 shows “pending,” thus indicating to the user that the payment associated with sent payment message object 1170 has not yet been accepted by unknown participant 1142. In some embodiments, the device displays (in addition to or instead of first status indicator 1194) a second status indicator 1196 informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, in FIG. 11Q, second status indicator 1196 (e.g., “pending”) shows the same status as shown by first status indicator 1194 (e.g., “pending”).
[0420] In some embodiments, payment message object 1144 corresponding to the payment request sent by unknown participant 1142 to the user is maintained (within message conversation 1140 (e.g., above payment message object 1170), and not removed from the message conversation) when payment message object 1170 corresponding to the payment made by the user to unknown participant 1142 in response to the payment request from the unknown participant is created. In some embodiments, if payment message object 1144 is maintained (within message conversation 1140 (e.g., above payment message object 1170)), payment message object 1144 is (e.g., in response to a successful authentication from FIG. 11P) updated to indicate that the payment request has been accepted by the user (e.g., that the user has agreed to make the payment requested via payment message object 1144). For example, in some embodiments, accept button 1124 is removed from the message object and a status indicator (e.g., stating “PENDING”) is updated (e.g., to state “ACCEPTED”). In some embodiments, once the payment corresponding to payment message object 1170 (which also corresponds to the payment requested by the unknown participant via payment message object 1144) has been accepted by unknown participant 1142 (and thus the payment corresponding to payment message object 1170 has been completed), a dynamic three-dimensional visual effect (e.g., as described with reference to completed payment message object 1172 in FIG. 11T) is applied to a request indicator 1149 (e.g., displayed as a currency symbol (e.g., “$”) in the same font and/or style as an amount indicator (e.g., “$28”) of payment message objects 1170 and 1172) of payment message object 1144 or to the entire payment message object. In some embodiments, payment message object 1144 is removed (from message conversation 1140) when payment message object 1170 corresponding to the payment made by the user to unknown participant 1142 in response to the payment request from the unknown participant is created.
[0421] FIG. 11R shows (as also described with reference to payment message object 1118 in FIG. 11B) electronic device 1100, while displaying the user interface shown in FIG. 11Q (including payment message object 1170 within message conversation 1140), being viewed at two different angles (angle 1100A and angle 1100B) relative to a reference point 1128 that is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device. Alternatively, in some embodiments, the reference point is a static point external to the device, such as a location on the ground or floor. As shown in FIG. 11R, from the perspective of reference point 1128 of a viewer (e.g., the user) viewing display 1102 of the device at either angle 1100A or angle 1100B, payment message object 1170 (because it has not yet been accepted by the intended recipient (e.g., unknown participant 1142)) appears the same at either angle. In other words, whether a viewer (e.g., the user) views display 1102 of the device at angle 1100A, at angle 1100B, or from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 11Q), there is no change in how the payment message object is perceived by the user, because there is no change in how the payment message object is displayed on display 1102 by the device. Thus, in FIG. 11R, the device does not provide any feedback associated with payment message object 1170 to a viewer (e.g., the user) of the device.
[0422] FIG. 11S shows the payment (or, alternatively, the payment request) corresponding to payment message object 1170 having been accepted by unknown participant 1142. Thus, the device updates display of payment message object 1170 to a completed message object 1172. Further, in response to the determination that the payment (or, alternatively, the payment request) corresponding to payment message object 1170 has been accepted by unknown participant 1142,
electronic device 1100 updates first status indicator 1194 (e.g., from “pending” to “paid”) to inform the user that the payment has been accepted by unknown participant 1142 (or, alternatively, to inform the user that the payment request has been accepted, and thus a payment by unknown participant 1142 in the requested payment amount has been made by the unknown participant to the user). In some embodiments, the device updates second status indicator 1196 (e.g., from “pending” to “paid”) to inform the user that the payment has been accepted by unknown participant 1142 (or, alternatively, to inform the user that the payment request has been accepted, and thus a payment by unknown participant 1142 in the requested payment amount has been made by the unknown participant to the user).
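The lifecycle just described — both status indicators mirroring one underlying state that moves from "pending" to "paid" on acceptance, with acceptance also unlocking the completed visual treatment — can be sketched as follows. This is an illustrative Python sketch; the class and attribute names are assumptions:

```python
class PaymentMessageObject:
    """Minimal model of a sent payment message object's lifecycle."""

    def __init__(self, amount: str):
        self.amount = amount
        self.status = "pending"   # shown by both status indicators
        self.completed = False    # gates the dynamic 3D visual effect

    def recipient_accepted(self) -> None:
        """Recipient accepted the payment: update status and mark the
        object completed so the completed-object feedback applies."""
        self.status = "paid"
        self.completed = True
```

A single state keeps the first and second status indicators from ever disagreeing, matching the description that both show the same value.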
[0423] FIG. 11T shows (as also described with reference to completed payment message object 1132 in FIG. 11B), in place of (non-completed) payment message object 1170, a completed payment message object 1172. Specifically, as shown in FIG. 11T, electronic device 1100 generates feedback (e.g., a visual effect, a haptic feedback, an audio feedback) indicating to the user that the payment corresponding to payment message object 1170 has been accepted and that payment message object 1170, now completed payment message object 1172, corresponds to a payment that has already been accepted.
[0424] For example, as shown in FIG. 11T, amount indication 1174 (e.g., “$28”) of completed payment message object 1172 is visually changed. In some embodiments, the visual change to amount indication 1174 is a bolding (or thickening) of the font of the displayed amount (e.g., “$28”). In some embodiments, the visual change to amount indication 1174 includes a black outline (e.g., a shadow) applied to the font of the displayed amount (e.g., “$28”). In some embodiments, the visual change to amount indication 1174 is a change in color (e.g., from black to white) of the displayed amount (e.g., “$28”).
[0425] In some embodiments, electronic device 1100 generates feedback (e.g., a visual feedback, a haptic feedback, an audio feedback) associated with completed payment message object 1172. In some embodiments, the feedback is a dynamic visual feedback causing display of the payment message object (e.g., completed payment message object 1172) to change as changes in the orientation of the device relative to reference point 1128 are detected. For example, in FIG. 11T, the dynamic visual feedback is a 3D effect (e.g., the simulated depth
effect 3325 described below with reference to FIGS. 33D-33J) that provides the user with the visual effect that amount indication 1174 of the payment message object is three-dimensional. Thus, in FIG. 11T, based on reference point 1128 of the user, amount indication 1174 of completed payment message object 1172 looks visually different (e.g., shadows behind the displayed numbers of amount indication 1174 appear different) from angle 1100A of the device and from angle 1100B of the device and, optionally, both the view of completed payment message object 1172 from angle 1100A and angle 1100B look different from the appearance of the object from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 11S). In some embodiments, the dynamic visual feedback is a changing color applied to the amount indication (or to the entire payment message object). In some embodiments, the dynamic visual feedback is a changing background applied to the payment message object. In some embodiments, the dynamic visual feedback is a moving of one or more elements, such as amount indication 1174, of the payment message object.
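One plausible way to realize the angle-dependent shadows behind the amount digits is to map device tilt to a shadow offset. This is an illustrative Python sketch, not the described simulated depth effect 3325; the function name, the sine mapping, and the 4-pixel maximum are assumptions:

```python
import math

def shadow_offset(tilt_x_deg: float, tilt_y_deg: float,
                  max_offset_px: float = 4.0) -> tuple[float, float]:
    """Map the device's tilt relative to the reference point to a 2-D
    shadow offset (in pixels) behind the amount digits.

    Viewed straight on (zero tilt) the offset is zero, so the object
    looks flat; tilting to angle 1100A versus angle 1100B yields
    different offsets, hence the visually different shadows in
    FIG. 11T.
    """
    return (max_offset_px * math.sin(math.radians(tilt_x_deg)),
            max_offset_px * math.sin(math.radians(tilt_y_deg)))
```

The sine mapping is one design choice among many; any monotonic map from tilt to offset that vanishes at zero tilt would give the same qualitative behavior.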
[0426] In some embodiments, as also shown in FIG. 11T, in addition to or instead of a dynamic visual feedback, the device generates a dynamic haptic feedback 1176. In some embodiments, the dynamic haptic feedback is a dynamically strengthening and weakening tactile output caused by the device. In some embodiments, the dynamic haptic feedback is a tactile output with changing tactile output patterns caused by the device. In some embodiments, the strength or frequency of the tactile output changes as the device detects changes in the orientation of the device relative to the reference point (e.g., reference point 1128).
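A strengthening-and-weakening tactile output tied to orientation change can likewise be sketched as a simple mapping from the rate of change to output strength. This is an illustrative Python sketch; the gain and cap values are assumptions, not parameters from the described embodiments:

```python
def haptic_intensity(delta_angle_deg: float,
                     gain: float = 0.02, cap: float = 1.0) -> float:
    """Scale tactile output strength with how much the orientation
    changed since the last sample, clamped to the hardware maximum.

    A larger orientation change produces a stronger output; no change
    produces none, so the output dynamically strengthens and weakens
    as the device is tilted back and forth.
    """
    return min(cap, gain * abs(delta_angle_deg))
```

An analogous mapping could drive tactile output frequency instead of strength, matching the alternative the paragraph describes.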
[0427] In some embodiments, the generated feedback (e.g., visual feedback, sensory feedback, audio feedback) is caused (e.g., only) by an operating system program of the device and non-operating system programs of the device are not enabled to cause the feedback.
[0428] In FIG. 11U, while displaying completed payment message object 1172 within message conversation 1140, electronic device 1100 detects a user input on the completed payment message object. For example, as shown in FIG. 11U, the user input is a tap gesture 1113 on completed payment message object 1172.
[0429] In FIG. 11V, in response to detecting tap gesture 1113, electronic device 1100 displays (e.g., replaces display of messaging application 1106 and virtual keyboard 1112 with) transaction detail user interface 1134 (as first described above with reference to FIG. 11G) that includes a graphical representation 1135 (e.g., a copy) of the completed payment message object (e.g., completed payment message object 1172), a textual indication 1137 of a note accompanying the payment (e.g., accompanying message object 1146 stating “For dinner”), and a plurality of details relating to the selected transaction (e.g., the payment corresponding to completed payment message object 1172). In some embodiments, transaction detail user interface 1134 includes an indication 1134A of the payment sender/recipient (e.g., the user (Kate Appleseed), “From Kate’s Payment account”). In some embodiments, transaction detail user interface 1134 includes an indication 1134B of the payment account or the payment recipient account. In some embodiments, transaction detail user interface 1134 includes an indication 1134C of the intended recipient of the payment (or, alternatively, a payment request) (e.g., unknown participant 1142, message participant 1110). In some embodiments, transaction detail user interface 1134 includes an indication 1134D of the date and time of the payment. In some embodiments, transaction detail user interface 1134 includes an indication 1134E of the date and time of the completed transaction. In some embodiments, transaction detail user interface 1134 includes an indication 1134F of a transaction number.
[0430] In some embodiments, transaction detail user interface 1134 includes a wallet button 1136 (e.g., a “View in Wallet” selectable indication) for viewing the transaction details in an electronic wallet application of electronic device 1100. In some embodiments, transaction detail user interface 1134 includes a send again button 1131 (e.g., if the payment associated with payment message object 1135 was a payment made by the user to a message participant) for creating a new payment message object corresponding to a payment in the same amount as the currently-viewed transaction, intended for the same recipient as the currently-viewed transaction. Thus, send again button 1131 provides the user with a quick and easy option to perform another payment in the same amount (e.g., “$28”) to the same recipient (e.g., message participant 1110) via the transaction detail user interface of the last transaction with that recipient. In some embodiments, transaction detail user interface 1134 includes a refund button 1133 (e.g., if the payment associated with payment message object 1135 was a payment made by the user to a message participant) for requesting a refund of a sent payment. In some embodiments, refund button 1133 is only available (e.g., is only visible, is only selectable) if the payment associated with the payment message object has been accepted (e.g., is no longer pending because the intended recipient (e.g., message participant 1110) has accepted the payment).
[0431] FIGS. 12A-12C are a flow diagram illustrating a method for managing peer-to-peer transfers using an electronic device in accordance with some embodiments. Method 1200 is performed at a device (e.g., 100, 300, 500, 1000, 1100) with a display and one or more sensor devices (e.g., an accelerometer, a camera). Some operations in method 1200 are, optionally, combined, the order of some operations is, optionally, changed, and some operations are, optionally, omitted.
[0432] As described below, method 1200 provides an intuitive way for managing peer-to-peer transfers. The method reduces the cognitive burden on a user for managing peer-to-peer transfers, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transfers faster and more efficiently conserves power and increases the time between battery charges.
[0433] In some examples, prior to displaying (1204), on the display (e.g., 1002, 1102), a graphical representation of the communication (e.g., 1020, 1118), the electronic device (e.g., 1000, 1100) receives (1202) a communication with a predetermined type of message from an external device. In some examples, the message corresponds to a resource transfer (e.g., computing resources, points, credits, funds, virtual resources) from a first account associated with the electronic device to a second account associated with an external device (e.g., a transfer of resources from an account of the electronic device’s user to an account of the user of a different electronic device, such as a smartphone, a smartwatch, a laptop computer, a desktop computer).
[0434] In some examples, the communication is associated with a completed transfer of a first type of item between a user of the device (e.g., 1000, 1100) and a participant (e.g., 1010, 1110) in a message conversation (e.g., 1008, 1108). In some examples, the transfer of the first type of item is a transfer of a sticker using a sticker sharing interface, a transfer of a photo using
a photo sharing interface, a transfer of a payment using a payment interface, or a transfer of a resource using a resource-numerical value selection user interface for receiving user adjustment of the amount of resources, such as points, credits, or funds, to be sent or requested. In some examples, the communication corresponds to a request for transfer of the first type of item (e.g., funds) between the user of the device and a participant in the message conversation.
[0435] The electronic device (e.g., 1000, 1100) displays (1204), on the display (e.g., 1002, 1102), the graphical representation of the communication (e.g., 1020, 1118). In some examples, a state (1206) of the communication (e.g., the communication being in the first state or the second state) is indicative of an action taken by a participant (e.g., 1010, 1110), other than a user of the device, in a message conversation (e.g., a confirmation of payment by the participant on an external device). In some examples, the state of a communication indicates whether the receiver of the communication has read the message corresponding to the communication or accepted a payment corresponding to the communication.
[0436] In some examples, a state (1208) of the communication is indicative of an action taken by a user of the device (e.g., user has read a message corresponding to a communication, user has confirmed payment on the device). For example, the action taken by the user is accepting a payment associated with the communication or initiating accepting of a payment (e.g., by setting up a required payment account to accept the payment) associated with the communication (e.g., as described below in method 1800 with reference to FIGS. 18A-18F). In some examples, the state of a communication indicates whether a message corresponding to the communication has been sent or whether a payment has been made (e.g., accepted) for a payment request corresponding to the communication.
[0437] In some examples, the graphical representation of the communication (e.g., 1118) includes (1210) an indication (e.g., 1122) of a quantity of an item associated with the communication. In some examples, the quantity is the number of times a message has been viewed by a remote recipient. In some examples, the quantity is the amount of currency transferred, the amount of currency to be transferred, or the amount of currency requested to be transferred. In some examples, the quantity of the item associated with the communication is displayed with a special visual characteristic (e.g., a special font) that distinguishes the quantity from other items and/or elements on the display (e.g., as described below with reference to FIGS. 15A-15K).
[0438] While displaying the graphical representation of the communication (e.g., 1020, 1118) on the display (e.g., 1002, 1102), the electronic device (e.g., 1000, 1100) detects (1212), via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point (e.g., 1026, 1128, a point in space, a point on the floor/ground, a point in a field of view of a camera). In some examples, the orientation of the electronic device relative to the reference point (e.g., 1026, 1128) changes when a user holding the electronic device is moving the electronic device relative to a point in space, on the floor, or on the ground while the user remains stationary. In some examples, the orientation of the electronic device relative to the reference point changes when the user (e.g., the user’s head or eyes) is moving relative to the electronic device, which remains stationary relative to a point in space, on the floor, or on the ground.
[0439] In some examples, the reference point (e.g., 1026, 1128) is (1214) a face (e.g., face of the user of the device, a point on a face, such as an eye or nose) in a field of view of a sensor (e.g., a camera) of the one or more sensors.
[0440] In some examples, the reference point (e.g., 1026, 1128) is (1216) a static location (e.g., a fixed point on the ground/floor, a fixed point that is external to the device) external to the electronic device (e.g., 1000, 1100). In some examples, the device uses one or more sensors, such as accelerometers and/or a compass, to determine a change in orientation of the electronic device relative to the reference point, such as the earth.
[0441] In response (1218) to detecting the change in the orientation of the electronic device (e.g., 1000, 1100) relative to the reference point (e.g., 1026, 1128) while displaying the graphical representation of the communication on the display, blocks 1220-1226 are optionally performed.
[0442] In accordance with a determination that the communication has a first state (e.g., message read for an incoming message, payment accepted for an incoming payment, message sent for an outgoing message, payment approved for an outgoing payment, payment accepted for
an outgoing payment), the electronic device (e.g., 1000, 1100) displays (1220) the graphical representation of the communication (e.g., 1020, 1118) and outputs a respective type of feedback (e.g., feedback on amount indication 1174 described in FIG. 11T, 1130, visual feedback, sensory feedback, audio feedback) corresponding to the graphical representation of the communication. The feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point. In some examples, if the communication is a payment, the respective type of feedback is output as the device detects the change in the orientation if the payment has been accepted for an incoming payment or the payment has been approved for an outgoing payment. Outputting a particular feedback corresponding to the graphical representation of the communication as the device detects a change in orientation relative to a reference point provides the user with feedback about the state of the communication. Thus, for example, a user can determine the state of the communication by changing the device’s orientation and checking whether the feedback is provided. Providing improved feedback to the user enhances the operability of the device and indicates the state of an element of the device, thus making the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
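The state-gated feedback decision of this block can be sketched as below. This is an illustrative Python sketch of the branch for the first state only; the string states and callback are assumptions, not the claimed method steps:

```python
def on_orientation_change(state: str, magnitude: float, emit_feedback) -> bool:
    """Decide whether an orientation change produces feedback.

    Only a communication in the first state (e.g., an accepted
    incoming payment or an approved outgoing one) gets the dynamic
    feedback, whose strength tracks the magnitude of the orientation
    change; otherwise the representation is simply redisplayed with
    no feedback. Returns True when feedback was emitted.
    """
    if state == "first":
        emit_feedback(magnitude)
        return True
    return False
```

This is how a user can probe a communication's state by tilting the device: feedback appears only when the communication is in the first state.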
[0443] In some examples, the respective type of feedback is a dynamic visual feedback (e.g., feedback on amount indication 1174 described in FIG. 11T) (e.g., a changing visual effect, such as a changing color, a changing pattern, a changing background, a moving of one or more elements of the graphical representation of the communication (e.g., 1028, 1172)). In some examples, the visual feedback changes as the change in orientation of the electronic device relative to a reference point is detected. In some examples, the visual feedback is a 3D effect (e.g., the simulated depth effect 3325 described below with reference to FIGS. 33D-33J) that provides the user with the effect that an element of the graphical representation (such as the quantity) of the communication is three-dimensional. Outputting a dynamic visual feedback (such as a three-dimensional effect) that corresponds to the change in orientation allows the user to know that the feedback is legitimate (e.g., is tied to the change in orientation) and is not
illegitimate (e.g., pre-made video that is not tied to the change in orientation) and enables the user to identify whether a visual feedback being provided is legitimate and therefore the associated communication is authentic. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0444] In some examples, the respective type of feedback is a dynamic haptic feedback (e.g., 1030, 1130) (e.g., dynamically strengthening and weakening tactile outputs, a tactile output with changing tactile output patterns). For example, the strength or frequency of the tactile output changes as the device detects changes in orientation of the electronic device relative to the reference point is detected. Outputting a dynamic haptic feedback (e.g., 1030, 1130) (such as a haptic feedback that changes in strength or frequency) that corresponds to the change in orientation allows the user to know that the feedback is legitimate (e.g., is tied to the change in orientation) and is not illegitimate (e.g., pre-made haptic that is not tied to the change in orientation) and enables the user to identify whether a haptic feedback being provided is legitimate and therefore the associated communication is authentic. Providing improved haptic feedback to the user enhances the operability of the device and makes the user-device interface more efficient which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
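One way to realize the dynamic haptic feedback of paragraph [0444] is to map the orientation change onto tactile output parameters that strengthen as the device tilts further. The specific intensity and frequency ranges below are assumptions for illustration only:

```python
def haptic_parameters(delta_deg: float) -> tuple:
    """Map an orientation change (degrees) to (intensity, frequency_hz) for a
    tactile output. The 0.2-1.0 intensity and 80-200 Hz frequency ranges are
    illustrative assumptions, not values from the disclosure."""
    t = max(0.0, min(delta_deg / 45.0, 1.0))  # normalize change to [0, 1]
    intensity = 0.2 + 0.8 * t                 # dynamically strengthening output
    frequency_hz = 80.0 + 120.0 * t           # changing tactile output pattern
    return intensity, frequency_hz
```

Because both parameters vary continuously with the detected orientation, a pre-made (canned) haptic could not reproduce the coupling, which is what lets the user judge the feedback as legitimate.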
[0445] In some examples, the respective type of feedback (e.g., visual feedback, sensory feedback, audio feedback) is caused (e.g., only) by an operating system program of the electronic device and non-operating system programs of the electronic device are not enabled to cause the respective type of feedback. In some examples, only the operating system is enabled to initiate/cause the respective type of feedback. In some examples, the operating system only enables certain applications to initiate/cause the respective type of feedback (e.g., a particular frequency of tactile output or accelerometer controlled visual animation).
[0446] In some examples, the respective type of feedback is a graphical animation (e.g., a lighting effect) displayed over the graphical representation of the communication (e.g., 1028,
1172). In some examples, the graphical representation of the communication (e.g., 1170, 1172) includes a currency indicator (e.g., a “$” symbol, a “€” symbol).
[0447] In some examples, the respective type of feedback is a graphical animation (e.g., a shadow) displayed under the graphical representation (e.g., 1028, 1172). In some examples, the graphical representation of the communication (e.g., 1170, 1172) includes a currency indicator (e.g., a “$” symbol, a “€” symbol).
[0448] In some examples, the respective type of feedback is a graphical animation (e.g., shifting colors, shifting shapes) that creates an illusion that the graphical representation (e.g., 1028, 1172) (or portion thereof) is a three dimensional object that is being viewed from different angles as the angle (or orientation) of the device changes. In some examples, the graphical representation of the communication (e.g., 1170, 1172) includes a currency indicator (e.g., a “$” symbol, a “€” symbol).
[0449] In some examples, outputting the respective type of feedback comprises outputting a non-visual feedback (e.g., a haptic feedback that includes one or more tactile outputs and/or an audio feedback). In some examples, the haptic feedback uses frequencies of tactile outputs that are only available to first party apps (and thus cannot be simulated by any other app developer).
[0450] In accordance with a determination that the communication has a second state (e.g., message unread for an incoming message, payment unaccepted for an incoming payment, message unsent for an outgoing message, payment unapproved for an outgoing payment) that is different from the first state, the electronic device (e.g., 1000, 1100) displays (1226) the graphical representation of the communication (e.g., 1028, 1172) without outputting feedback (e.g., feedback on mini-file object 1022 or on amount indication 1174 described in FIG. 11T, 1130) that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point. In some examples, at an external device of the sender of the communication, a corresponding feedback is outputted as the external device detects changes in orientation even when the communication has the second state. In some examples, the second state and the first state are mutually exclusive (e.g., if the communication has the first state it cannot have the second state, and if the communication has the second state it cannot have the
first state). Forgoing outputting a particular feedback corresponding to the graphical representation as the device detects a change in orientation relative to a reference point provides the user with feedback about the state of the communication. Thus, for example, a user can determine the state of the communication by changing the device's orientation and checking whether the feedback is provided. When the feedback is not provided, the user knows that the communication is not in the first state. Providing improved feedback to the user enhances the operability of the device and indicates the state of an element of the device, thus making the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0451] In some examples, the communication is a message in a message conversation (e.g., 1108) between a plurality of conversation participants (e.g., 1010, 1110) and the communication is associated with a confirmation. Prior to displaying, on the display (e.g., 1002, 1102), the graphical representation of the communication (e.g., 1170, 1172), the electronic device (e.g., 1000, 1100) detects user activation of a confirmation affordance (e.g., to confirm a payment). In response to detecting user activation of the confirmation affordance, the electronic device displays, on the display, the graphical representation of the communication (e.g., 1170) in the message conversation, and outputs a second type of feedback (e.g., visual feedback, haptic feedback that includes one or more tactile outputs, audio feedback) corresponding to the graphical representation of the communication, wherein the feedback indicates that the communication has been confirmed. In some examples, the haptic feedback uses frequencies of tactile outputs that are only available to first party apps (and thus cannot be simulated by any other app developer). In some examples, the second type of feedback is a portion of the first type of feedback. In some examples, the second type of feedback is different from the first type of feedback. In some examples, the second type of feedback does not vary based on an angle of the device.
[0452] In some examples, the electronic device (e.g., 1000, 1100) receives (1228) user selection (e.g., a tap) of the graphical representation of the communication (e.g., 1028, 1172). In response to receiving the user selection of the graphical representation of the communication (e.g., 1028, 1172), the electronic device displays (1230), on the display (e.g., 1002, 1102), a details user interface (e.g., 1134) including information (e.g., 1134A-E, an amount of a payment transaction, a quantity of a resource transfer, a date and time of a transaction/transfer, a note/comment relating to a transaction/transfer) associated with the communication.
[0453] In some examples, in response to receiving the communication (e.g., 1144) with the predetermined type of message: in accordance with a determination, based on an analysis (of the contents) of the communication, that the communication meets a first predefined condition (e.g., raises a predetermined flag (e.g., because the communication is suspected to be from an untrusted sender, because the communication is suspected to be spam/junk)), the electronic device (e.g., 1000, 1100) displays, on the display (e.g., 1002, 1102), a first indication (e.g., 1158, 1160, a notification, message, or prompt warning the user that the communication (e.g., 1144) is from an untrusted sender, a notification, message, or prompt indicating that the communication is suspected to be spam/junk, a notification, message, or prompt warning the user that the communication may not be safe to view/select/open) that the communication meets the first predefined condition (e.g., raised/set the predetermined flag) and, optionally, the electronic device forgoes outputting the respective type of feedback corresponding to the graphical representation of the communication (e.g., 1144). Automatically displaying an indication (e.g., 1158, 1160, an indication that the message is spam) when the predefined condition is met reduces the likelihood that the user will participate in a transfer corresponding to the message without further investigating the transfer, thereby enhancing the security of the technique and reducing the number of fraudulent transfers. Reducing the number of fraudulent transfers enhances the operability of the device and makes the user-device interface more secure (e.g., by reducing fraud when operating/interacting with the device).
In accordance with a determination, based on the analysis (of the contents) of the communication (e.g., 1118), that the communication does not meet the first predefined condition (e.g., the communication does not raise/set the predetermined flag) (e.g., because the communication is from a trusted sender, because the communication is not suspected to be spam/junk), the electronic device forgoes
displaying, on the display, the first indication (e.g., 1158, 1160, the indication that the message may be spam). In some examples, the first indication is that the message is potentially a spam message.
[0454] In some examples, the communication (e.g., 1144) meets the first predefined condition (e.g., raises the predetermined flag) when the external device does not correspond to one of a plurality of contacts (e.g., a contacts list, a trusted contacts list, a user-configured contacts list) associated with the electronic device (e.g., 1000, 1100) (e.g., the communication is from an unknown number).
[0455] In some examples, the communication (e.g., 1144) meets the first predefined condition (e.g., raises/sets the predetermined flag) when the external device corresponds to one of a plurality of contacts (e.g., a spam numbers list, a suspected fraudulent accounts list). In some examples, the plurality of contacts is a list of contacts identified as being untrustworthy.
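Paragraphs [0453]-[0455] describe two contact-based triggers for the first predefined condition: the sender is absent from the device's contacts, or the sender appears on an untrusted/spam list. A minimal sketch of that determination, with hypothetical data structures, might look like:

```python
def meets_first_predefined_condition(sender: str,
                                     device_contacts: set,
                                     flagged_contacts: set) -> bool:
    """True when the communication should be flagged (e.g., as spam/junk).
    The sets are illustrative stand-ins for the contacts lists described
    in paragraphs [0454] and [0455]."""
    # [0455]: the sender appears on a spam-numbers / suspected-fraud list.
    if sender in flagged_contacts:
        return True
    # [0454]: the sender is not among the contacts associated with the device.
    return sender not in device_contacts
```

When the function returns True, the device would display the first indication (e.g., 1158, 1160) and optionally forgo the orientation-based feedback, as described in paragraph [0453].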
[0456] In some examples, in accordance with the determination, based on the analysis (of the contents) of the communication (e.g., 1144), that the communication meets the first predetermined condition (e.g., raises/sets the predetermined flag) (e.g., because the communication is suspected to be from an untrusted sender, because the communication is suspected to be spam/junk), the electronic device (e.g., 1000, 1100) displays, on the display (e.g., 1002, 1102), a reporting affordance (e.g., 1160) (e.g., for reporting the spam/junk communication to a remote server). While displaying, on the display, the reporting affordance (e.g., 1160), the electronic device detects user activation of the reporting affordance (e.g., 1160). In response to detecting the user activation of the reporting affordance, the electronic device transmits, to an external device (e.g., an external server), information associated with the communication that raised (or set) the predetermined flag.
[0457] In some examples, subsequent to displaying, on the display, the first indication (e.g., 1158, 1160, a notification, message, or prompt warning the user that the communication is from an untrusted sender, a notification, message, or prompt indicating that the communication is suspected to be spam/junk, a notification, message, or prompt warning the user that the communication may not be safe to view/select/open) that the communication raised (or set) the
164
predetermined flag, the electronic device (e.g., 1000, 1100) receives user activation of a send affordance (e.g., 1124) displayed on the graphical representation of the communication. In response to receiving the user activation of the send affordance (e.g., 1124) (and in accordance with the determination, based on an analysis (of the contents) of the communication, that the communication meets a first predefined condition): the electronic device displays a second indication (e.g., 1162, “Sender Unknown, Do You Still Wish To Proceed?”) that the communication met the first predetermined condition (e.g., raised/set the predetermined flag), wherein the second indication is visually distinguishable from the first indication (e.g., the second indication is more visibly prominent than the first indication), and displays, on the display, a cancel affordance (e.g., 1162A) for forgoing proceeding with (e.g., canceling completion of, forgoing display of a transfer user interface for initiating transfer of the first type of item) a transfer of the first type of item between a user of the device and a participant in a message conversation. In some examples, the transfer corresponds to the received communication. In some examples, a second send affordance (e.g., 1162B) is displayed concurrently with the cancel affordance. In response to detecting activation of the second send affordance (e.g., 1162B), the electronic device (e.g., 1000, 1100) displays a transfer user interface (e.g., 1164) for proceeding with the transfer of the first type of item between the user of the device and the participant in the message conversation.
Displaying a second indication (e.g., 1162, an indication that the message is spam) when the user provides input to continue with the transfer reduces the likelihood that the user will participate in a transfer corresponding to the message without further investigating the transfer and/or message, thereby enhancing the security of the technique and reducing the number of fraudulent transfers. Reducing the number of fraudulent transfers enhances the operability of the device and makes the user-device interface more secure (e.g., by reducing fraud when operating/interacting with the device).
[0458] In some examples, the details user interface (e.g., 1134) includes a cancellation affordance (e.g., 1141, an affordance for requesting a refund if the communication is related to a payment, an affordance for requesting return of a sent item/resource if the communication is related to an item/resource). The cancellation affordance is user-selectable when the communication is in the first state (e.g., 1170). The electronic device (e.g., 1000, 1100) detects user activation of the cancellation affordance (e.g., 1141). In response to detecting the user
activation of the cancellation affordance (e.g., 1141): in accordance with the determination that the communication has the first state, the electronic device transmits a second communication with the predetermined type of message to an external device associated with the communication requesting a return transfer of a first type of item that was transferred via the communication. In some examples, the cancellation affordance (e.g., 1141) is not user-selectable when the communication is in the second state. In some examples, the cancellation affordance (e.g., 1141) is not displayed when the communication is in the second state. In some examples, in accordance with the determination that the communication is in the second state (e.g., transitions to the second state from the first state) the electronic device causes the graphical representation of the communication to no longer be selectable, and provides (e.g., displays, on the display) an indication that the graphical representation of the communication is no longer selectable. In some examples, the cancellation affordance (e.g., 1141) is conditionally displayed depending on the state of the communication (e.g., in accordance with a determination that the communication has the first state, displaying the cancellation affordance in the details user interface, and in accordance with a determination that the communication has the second state, forgoing display of the cancellation affordance in the details user interface).
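The conditional display and behavior of the cancellation affordance in paragraph [0458] reduce to a small state check. The string labels and function names below are placeholders invented for this sketch:

```python
def shows_cancellation_affordance(state: str) -> bool:
    """[0458]: the affordance is displayed (and selectable) only when the
    communication is in the first state."""
    return state == "first"

def on_cancellation_activated(state: str):
    """Return the action taken when the affordance is activated, or None."""
    if state == "first":
        # Transmit a second communication requesting a return transfer of
        # the item that was transferred via the communication.
        return "transmit_return_transfer_request"
    return None  # second state: affordance not selectable; nothing happens
```

This mirrors the mutually exclusive first/second states described earlier: a refund or item-return request is only possible while the communication remains in the first state.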
[0459] In some examples, the graphical representation of the communication (e.g., 1028, 1172) having the first state (e.g., message read for an incoming message, payment accepted for an incoming payment, message sent for an outgoing message, payment approved for an outgoing payment) includes a graphical indication (e.g., 1174) of a completed transfer of a first type of item (e.g., a sticker, a photo, or a payment object) between the electronic device (e.g., 1000, 1100) and an external device.
[0460] Note that details of the processes described above with respect to method 1200 (e.g., FIGS. 12A-12C) are also applicable in an analogous manner to the methods described herein. For example, method 1200 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1500, 1800, 2100, 2400, 2700, 3000, and 3400. For example, concurrently displaying the representation of a message and a selectable indication that corresponds to a type of item (being transferred, such as a photo, sticker, resources, or a payment) as described in method 900 can be applied with respect to the graphical
representation of a communication (e.g., 1118). For another example, the different visual appearances of a message object based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be applied with respect to the graphical representation of a communication (e.g., 1118). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied with respect to the graphical representation of a communication (e.g., 1118) when retrieving one or more items (e.g., a sticker, a photo, resources, a payment) associated with the message. For another example, displaying representations of a first account and a second account, as described in method 2100, can also be displayed when authenticating/confirming an incoming transfer corresponding to the graphical representation of a communication (e.g., 1118). For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can also be used to accept the contents of a communication having the second state. For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can be displayed in response to user selection of the graphical representation of a communication (e.g., 1172). For another example, an utterance can be used, as described in method 3000, to create the graphical representation of a communication (e.g., 1118). For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element (e.g., 1122) of a graphical representation of a communication (e.g., 1118) when a transfer (e.g., of a resource, of a file, of a payment) associated with the communication is completed.
For brevity, these details are not repeated below.
[0461] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 12A-12C are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, displaying operation 1204, detecting operation 1212, displaying operation 1220, and displaying operation 1226 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact
on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
[0462] FIGS. 13A-13D illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 13A-13D relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 14A-14M, which are in turn used to illustrate the processes described below, including the processes in FIGS. 15A-15K.
[0463] FIG. 13A illustrates an electronic device 1300 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 13A-13D, electronic device 1300 is a smartphone. In other embodiments, electronic device 1300 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1300 has a display 1302 and one or more input devices (e.g., touchscreen of display 1302, a mechanical button 1304, a mic).
[0464] In FIG. 13A, electronic device 1300 displays, on display 1302, a message conversation 1308 of a messaging application 1306 between a user of the device (e.g., “Kate Appleseed”) and a message participant 1310 (e.g., “John Appleseed”). As shown in FIG. 13A, message conversation 1308 includes a message object 1318 corresponding to a message sent by message participant 1310 to the user. In message object 1318, message participant 1310 states: “Last night was fun.”
[0465] FIG. 13A also shows electronic device 1300 displaying, on display 1302, a transfer user interface 1320 (e.g., for transferring a file, such as a photo, video file, audio file, or document, via messaging application 1306). In some embodiments, transfer user interface 1320 includes an indication 1322 (e.g., stating “TRANSFER”) informing the user that the user interface corresponds to a (first-party, an operating-system controlled) application for transferring a file using messaging application 1306. In some embodiments, transfer user interface 1320 includes an interface switching menu bar 1324 that includes a plurality of shortcut icons for switching between different user interfaces (e.g., switching between transfer user interface 1320 and a user interface for playing music) associated with different application features (e.g., manage peer-to-peer transfers, play music, set alarm clock) accessible from within messaging application 1306 while maintaining display of message conversation 1308. In some embodiments, the plurality of shortcut icons of interface switching menu bar 1324 correspond to different applications, thus enabling the user to quickly switch between user interfaces of different applications. In some embodiments, interface switching menu bar 1324 includes a transfer shortcut icon 1326 corresponding to transfer user interface 1320. Thus, because transfer user interface 1320 is the currently-displayed user interface, the device shows transfer shortcut icon 1326 currently being selected within interface switching menu bar 1324.
[0466] As also shown in FIG. 13A, transfer user interface 1320 includes a request button 1338 for initiating a request for a transfer of a file from a different user (e.g., message participant 1310) via messaging application 1306 and send button 1340 for initiating a transfer of a file to a different user (e.g., message participant 1310) via messaging application 1306.
[0467] As also shown in FIG. 13A, transfer user interface 1320 includes a value change region 1330 that includes an indication 1332 (e.g., stating “#5”) of a number of files (e.g., a number of photos, a number of video files, a number of audio files, a number of documents) to be transferred or a specific file (e.g., a fifth photo, a fifth video file, a fifth audio file, a fifth document) to be transferred. In some embodiments, transfer user interface 1320 is pre-populated with a number of files (or a specific file) to be transferred based on an analysis of the content of the message corresponding to message object 1318 (e.g., similar to the pre-selection process described above with respect to FIGS. 7A-7E). In some embodiments, the pre-populated number
in indication 1332 includes a symbol (e.g., “#”) indicating that the numerical value in indication 1332 relates to a number of files. In some embodiments, transfer user interface 1320 includes an indication 1328 (e.g., stating “Photos from last night”) informing the user of the type of files (e.g., photos) and the specific files from the type of files (e.g., photos from last night) that are selected to be transferred. In some embodiments, value change region 1330 also includes a value increase button 1336 (e.g., indicated as a “+”) for increasing the displayed numerical value amount (e.g., “#5”) within indication 1332 and a value decrease button 1334 (e.g., indicated as a “-”) for decreasing the displayed numerical value (e.g., “#5”) within indication 1332.
[0468] FIG. 13B shows electronic device 1300 displaying, on display 1302, in response to detecting user activation of request button 1338, a request message object 1344 corresponding to a request (e.g., made by the user to message participant 1310) for transfer of 5 photos (selected via transfer user interface 1320 in FIG. 13A) from last night. In some embodiments, in response to the user activation of request button 1338, the device displays (e.g., replaces display of transfer user interface 1320 with), on display 1302, a virtual keyboard 1312.
[0469] In some embodiments, as shown in FIG. 13B, request message object 1344 is displayed within an expanded compose region 1342, which is an extension of a compose region 1314, to indicate to the user that the message object has not yet been transmitted to the intended recipient (e.g., message participant 1310). In some embodiments, compose region 1314 includes a send button 1316 for transmitting the message object.
[0470] In some embodiments, request message object 1344 includes a request indicator 1346 (e.g., showing “#”) that indicates to the user that the transfer associated with the message object is a request for files (e.g., a request for the 5 photos from last night), as opposed to an outgoing transfer of files. In some embodiments, request message object 1344 also includes a textual indication 1348 (e.g., stating “5 photos from last night request”) of the specific files (e.g., type of files, number of files) that are being requested. In some embodiments, as also shown in FIG. 13B, request indicator 1346 is displayed in a different font (e.g., a thicker font, a bolder font, a special type of font reserved for transfer requests) than textual indication 1348.
[0471] In FIG. 13C, electronic device 1300 displays, on display 1302, a message conversation 1350, different from message conversation 1308, of messaging application 1306 between a user of the device (e.g., “Kate Appleseed”) and a message participant 1352 (e.g., “Sarah James”). As shown in FIG. 13C, message conversation 1350 includes a message object 1354 corresponding to a message sent by message participant 1352 to the user. In message object 1354, message participant 1352 states: “Can you send me the 3 photos from last night?”

[0472] FIG. 13C also shows electronic device 1300 displaying, on display 1302, transfer user interface 1320. In some embodiments, transfer user interface 1320 is displayed in response to detecting user selection of a marking 1356 of a phrase (e.g., stating “3 photos from last night”) of the message corresponding to message object 1354. In some embodiments, marking 1356 is applied to message object 1354 based on an analysis (e.g., by the device, by a remote server communicating with the device) of the contents (e.g., the text) of the message corresponding to the message object and a determination that a phrase of the text relates to a request for transfer of one or more files (e.g., photos, video files, audio files, documents).
[0473] As shown in FIG. 13C, value change region 1330 of transfer user interface 1320 includes indication 1332 (e.g., stating “#3”) showing a numerical value that corresponds to a number of files (e.g., a number of photos, a number of video files, a number of audio files, a number of documents) that are being requested to be transferred or a specific file (e.g., a third photo, a third video file, a third audio file, a third document) that is being requested to be transferred by message participant 1352. In some embodiments, transfer user interface 1320 is pre-populated with the number of files (or the specific file) that is being requested to be transferred based on the analysis of the content of the message corresponding to message object 1354 (e.g., similar to the pre-selection process described above with respect to FIGS. 7A-7E).
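The pre-population described here (deriving “#3” from “Can you send me the 3 photos from last night?”) implies a content-analysis step over the message text. One hypothetical form of that analysis is a simple pattern match; the regular expression and function name are invented for illustration and are not the disclosed technique:

```python
import re

def extract_transfer_request(message: str):
    """Return (count, file_type) when the message text contains a phrase
    that looks like a request to transfer files, else None. The pattern is
    an illustrative guess at the analysis, not the actual implementation."""
    m = re.search(r"\b(\d+)\s+(photos?|videos?|songs?|documents?)\b",
                  message, re.IGNORECASE)
    if m is None:
        return None
    return int(m.group(1)), m.group(2).lower()
```

For the message shown in FIG. 13C, such a function would yield a count of 3 and a file type of “photos”, which the device could use to pre-populate indication 1332.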
[0474] FIG. 13D shows electronic device 1300 displaying, on display 1302, in response to detecting user activation of send button 1340, a send message object 1358 corresponding to an outgoing transfer (e.g., to message participant 1352) of 3 photos (selected via transfer user interface 1320 in FIG. 13C) from last night. In some embodiments, in response to the user activation of send button 1340, the device displays (e.g., replaces display of transfer user interface 1320 with), on display 1302, a virtual keyboard 1312.
[0475] In some embodiments, as shown in FIG. 13D, send message object 1358 is displayed within expanded compose region 1342 to indicate to the user that the message object has not yet been transmitted to the intended recipient (e.g., message participant 1352). In some embodiments, compose region 1314 includes a send button 1316 for transmitting the message object.
[0476] In some embodiments, send message object 1358 includes a textual indication 1360 (e.g., stating “#3 photos from last night”) of the specific files (e.g., type of files, number of files) that are being transferred. In some embodiments, send message object 1358 includes a plurality of (selectable) mini-file objects 1362A-1362C corresponding to the photos that are being transferred via the transfer associated with send message object 1358.
[0477] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 13A-13D described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 14A-14M described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 13A-13D and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 14A-14M are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 1300, or 1400).
[0478] FIGS. 14A-14M illustrate exemplary user interfaces for peer-to-peer transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 15A-15K.
[0479] FIG. 14A illustrates an electronic device 1400 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 14A-14M, electronic device 1400 is a smartphone. In other embodiments, electronic device 1400 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1400 has a display 1402 and one or more input devices (e.g., touchscreen of display 1402, a mechanical button 1404, a mic).
[0480] In FIG. 14A, electronic device 1400 displays, on display 1402, a message conversation 1408 of a messaging application 1406 corresponding to message conversation 808 of messaging application 806 described above, for example, with reference to FIGS. 8A-8B. As in FIGS. 8A-8B, message conversation 1408 of messaging application 1406 is between a user of the device (e.g., “Kate Appleseed”) and a message participant 1410 (e.g., “John Appleseed”).
[0481] As shown in FIG. 14A, message conversation 1408 includes two visible message objects 1416 and 1418. Message object 1416 corresponds to message object 818 of FIGS. 8A-8B (e.g., a message sent by the user to message participant 1410). Message object 1418 corresponds to message object 820 of FIGS. 8A-8B (e.g., a message sent by message participant 1410 to the user). In message object 1416, the user asks message participant 1410: “How much do I owe you?” In message object 1418, message participant 1410 responds: “Dinner and the cab ride together was $28.”

[0482] FIG. 14A also shows a payment transfer user interface 1440 that corresponds to payment transfer user interface 840 described above, for example, with reference to FIGS. 8E-8P. As with payment transfer user interface 840, payment transfer user interface 1440 displays, in an amount indication 1448 of a value change region 1446, the amount (e.g., “$28”) of the payment requested by message participant 1410 in the message corresponding to message object 1418 pre-populated. For example, as described above with respect to payment transfer user interface 840, electronic device 1400 pre-populates the payment amount in payment transfer user interface 1440 based on an analysis of the contents (e.g., the text) of message object 1418. In some embodiments, the analysis is performed by electronic device 1400 using a language processing component or a language analysis component of the device. In some embodiments, the analysis is performed at an external device (e.g., a server), and the device receives a result of the analysis from the external device.
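The pre-population of the payment amount described above (analyzing the text of a received message for a monetary amount) might be sketched minimally as below. The regex-based helper is a hypothetical stand-in for the language analysis component; the name and behavior are illustrative assumptions, not the actual implementation.

```python
import re

# Hypothetical amount extractor: finds the first dollar amount in a message
# so the value change region (e.g., "$28") can be pre-populated.
AMOUNT_PATTERN = re.compile(r"\$(\d+(?:\.\d{2})?)")

def extract_payment_amount(text):
    """Return the first dollar amount mentioned in the message, or None."""
    match = AMOUNT_PATTERN.search(text)
    return float(match.group(1)) if match else None

amount = extract_payment_amount("Dinner and the cab ride together was $28.")
# amount == 28.0
```

In the embodiments above, this analysis could run either on-device (a language processing component) or on a server, with the device receiving only the result.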
[0483] In FIG. 14B, while displaying payment user interface 1440, electronic device 1400 detects a user input on send button 1447 of payment transfer user interface 1440, which corresponds to send button 847 of payment transfer user interface 840 described above, for example, with reference to FIGS. 8E-8P. For example, as shown in FIG. 14B, the user input is a tap gesture 1401 on send button 1447.
[0484] In FIG. 14C, in response to detecting tap gesture 1401 on send button 1447, electronic device 1400 displays an expanded compose bar 1472 (e.g., an expanded region of the compose region that is adjacent to / above a compose bar 1414), corresponding to expanded compose bar 872 described, for example, with reference to FIGS. 8Q-8W, with a payment message object 1420, corresponding to payment message object 866 described, for example, with reference to FIGS. 8Q-8W, located inside expanded compose bar 1472. Payment message object 1420 corresponds to a payment (e.g., an outgoing payment from the user to message participant 1410).
[0485] As mentioned above, the payment message object being located within expanded compose bar 1472 indicates to the user that the payment corresponding to payment message object 1420 has not yet been sent (to message participant 1410) but is being created. As also described above with reference to FIGS. 8Q-8W, payment message object 1420 includes an amount indication 1468 (e.g., “$28”), corresponding to amount indication 868 of payment message object 866, of the amount of the payment to be sent by the user to message participant 1410 and a mode indication 1470 (e.g., stating “PAY”), corresponding to mode indication 870, indicating that the payment message object corresponds to a payment made via an operating-system-controlled payment transfer application (and not by a third-party application).
[0486] As also shown in FIG. 14C, electronic device 1400 displays compose bar 1414, corresponding to compose bar 814, for entering (e.g., via typing on virtual keyboard 1412) a note (e.g., a comment or message) accompanying a sent payment message object. In some embodiments, in response to detecting (e.g., via a tap gesture) user input on compose bar 1414 (which, in some examples, includes an indication 1473 stating “Add Comment or Send” informing the user that a note can be added), electronic device 1400 displays (e.g., replaces display of indication 1473 with) a cursor indicating that a note (e.g., comment or message) is ready to be inputted (e.g., typed) into compose bar 1414 (e.g., using virtual keyboard 1412). For example, FIG. 14D shows a note 1476 (e.g., “Dinner + Cab”) added by the user to accompany payment message object 1420.
[0487] In FIG. 14E, while displaying payment message object 1420 within expanded compose bar 1472 and note 1476 within compose bar 1414 added to the payment, electronic
device 1400 detects user activation of a final send button 1474 (corresponding to final send button 874). For example, the user activation is a tap gesture 1403 on final send button 1474.
[0488] In FIG. 14F, upon successful user authentication (e.g., via the authentication process described in FIGS. 8T-8U using a payment confirmation user interface, such as payment confirmation user interface 878), electronic device 1400 displays virtual keyboard 1412 (e.g., in place of a removed payment confirmation user interface). Further, the device displays payment message object 1420 within message conversation 1408 of messaging application 1406, thereby indicating to the user that the payment corresponding to payment message object 1420 has been sent to message participant 1410. In addition, the device also displays, adjacent to (or below or within) payment message object 1420, a note message object 1473 corresponding to note 1476 added by the user to accompany the payment.
[0489] Further, in some embodiments, sent payment message object 1420 includes a first status indicator 1494 (corresponding to status indicator 894) informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, in FIG. 14F, first status indicator 1494 shows “pending,” thus indicating to the user that the payment associated with sent payment message object 1420 has not yet been accepted by message participant 1410. In some embodiments, the device also displays (in addition to or instead of first status indicator 1494) a second status indicator 1496 (corresponding to status indicator 896) informing the user of a status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, as shown in FIG. 14F, second status indicator 1496 (e.g., “pending”) shows the same status as shown by first status indicator 1494 (e.g., “pending”).
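The status indicators described above track a payment through a small lifecycle (e.g., “pending,” then “accepted” or “expired”). A minimal sketch of such a lifecycle is given below; the enum values and transition rules are illustrative assumptions, not the embodiment's actual state model.

```python
from enum import Enum

# Hypothetical payment status lifecycle backing the status indicators
# (e.g., first status indicator 1494 / second status indicator 1496).
class PaymentStatus(Enum):
    PENDING = "pending"
    ACCEPTED = "accepted"
    EXPIRED = "expired"

# A pending payment can be accepted or expire; accepted/expired are terminal.
ALLOWED_TRANSITIONS = {
    PaymentStatus.PENDING: {PaymentStatus.ACCEPTED, PaymentStatus.EXPIRED},
    PaymentStatus.ACCEPTED: set(),
    PaymentStatus.EXPIRED: set(),
}

def advance_status(current, new):
    """Move a payment message object to a new status if the transition is
    allowed; otherwise keep the current status."""
    return new if new in ALLOWED_TRANSITIONS[current] else current
```

Under this sketch, both indicators would simply render `status.value` next to the payment message object, which keeps them in agreement as shown in FIG. 14F.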
[0490] Further, in some embodiments, because payment message object 1420 corresponds to a payment by the user (instead of a payment request), amount indication 1468 of payment message object 1420 displays the numerical value of the payment amount (e.g., “$28”) in a payment font that is predetermined (or controlled, set, configured) by the operating system (of the device) to be associated with payments generated using the payment transfer user interface. In some embodiments, the payment font is a type of font that is larger (in size) than a font used for regular text message objects (e.g., message object 1418 in the messaging application).
[0491] FIG. 14G shows electronic device 1400 displaying, on display 1402, message conversation 1408 of messaging application 1406 showing message object 1417 (e.g., corresponding to a message sent by the user to message participant 1410 asking “How much do you owe me?”) and message object 1419 (e.g., corresponding to a message sent by message participant 1410 to the user responding “Dinner and the cab ride together was $28.”) and payment transfer user interface 1440 with the payment amount (e.g., “$28”) pre-populated, as described above with reference to FIG. 14A.
[0492] In FIG. 14G, while displaying payment transfer user interface 1440 with the payment amount (e.g., “$28”) pre-populated, electronic device 1400 detects a user activation of request button 1445 (corresponding to request button 845) of payment transfer user interface 1440. For example, as shown in FIG. 14G, the user activation is a tap gesture 1405 on request button 1445.
[0493] In FIG. 14H, in response to detecting tap gesture 1405 on request button 1445, electronic device 1400 displays a pending payment message object 1460 within expanded compose bar 1472. Unlike payment message object 1420 (which corresponds to a payment by the user), payment message object 1460 corresponds to a request for payment (a payment request) by the user (e.g., to message participant 1410). In some embodiments, a payment message object that corresponds to a payment request (e.g., payment message object 1460) (instead of a payment) includes additional information in payment amount indication 1468 (e.g., additional text) (e.g., “$28 Request”) informing the user that the payment message object corresponds to a payment request (as opposed to a payment). In some embodiments, a payment message object that corresponds to a payment request (instead of a payment) (e.g., payment message object 1460) also includes a mode indication 1470 (e.g., stating “PAY”) indicating that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). In some embodiments, payment message object 1460 that corresponds to a payment request also includes an accept button 1471.
[0494] Further, in some embodiments, as also shown in FIG. 14H, a payment message object corresponding to a payment request (e.g., payment message object 1460) (instead of a payment) includes a request indicator 1449 that indicates to the user that the payment message object
corresponds to a payment request (e.g., a payment request made by the user of the device to a message participant or a payment request sent by a message participant to the user) and not to a payment. In some embodiments, as shown in FIG. 14H, request indicator 1449 is a currency symbol (e.g., the dollar symbol “$”) displayed at a center region of the message object. In some embodiments, request indicator 1449 is a graphical symbol. In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1449 correspond with the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of an amount indication of a payment message object that corresponds to a (pending or completed) payment (e.g., amount indication 1468 of payment message objects 1420, 1491). In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1449 are different from (and thus do not correspond with) the visual characteristics (e.g., font type, boldness / thickness, color, shading) of an amount indication of a payment message object that corresponds to a (pending or completed) payment request (e.g., amount indication 1468 of payment message objects 1460, 1490).
[0495] In FIG. 14I, while displaying payment message object 1460 (corresponding to a payment request) within expanded compose bar 1472, the device receives user input of a note 1461 (e.g., using virtual keyboard 1412) to accompany the payment request. For example, in FIG. 14I, note 1461 to accompany the payment request of payment message object 1460 states “Dinner + Cab.”

[0496] In FIG. 14J, while displaying payment message object 1460 (corresponding to a payment request) within expanded compose bar 1472 and note 1461 (e.g., “Dinner + Cab”) in compose bar 1414, electronic device 1400 detects a user activation of final send button 1474. For example, as shown in FIG. 14J, the user activation is tap gesture 1407 on final send button 1474.
[0497] In FIG. 14K, in response to detecting tap gesture 1407 on final send button 1474, electronic device 1400 displays payment message object 1460 (corresponding to a payment request) within message conversation 1408 of messaging application 1406, thereby indicating to the user that the payment request corresponding to payment message object 1460 has been sent to
message participant 1410. The device also displays, adjacent to (or below or within) payment message object 1460, note message object 1463 corresponding to note 1461 added by the user to accompany the payment request.
[0498] Further, in some embodiments, sent payment message object 1460 (corresponding to a payment request) includes first status indicator 1494 informing the user of a status of the payment request corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, in FIG. 14K, first status indicator 1494 shows “pending,” thus indicating to the user that the payment request associated with sent payment message object 1460 has not yet been accepted (i.e., the payment has not yet been made) by message participant 1410. In some embodiments, second status indicator 1496 (corresponding to status indicator 896) is displayed (in addition to or instead of first status indicator 1494) informing the user of a status of the payment request corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired”). For example, as shown in FIG. 14K, second status indicator 1496 (e.g., “pending”) shows the same status as shown by first status indicator 1494 (e.g., “pending”).
[0499] FIG. 14L shows a payment message object 1490 within conversation 1408 with message participant 1410 (e.g., “John Appleseed”) sent by message participant 1410 to the user. In some embodiments, payment message object 1490 includes elements that are analogous to elements of payment message object 1460 corresponding to a payment request that is sent by the user to the message participant. For example, as with payment message object 1460, payment message object 1490 includes amount indication 1468, which includes additional information (e.g., additional text) (e.g., “$28 Request”) informing the user that the payment message object corresponds to a payment request (as opposed to a payment). For another example, as with payment message object 1460, payment message object 1490 includes mode indication 1470 (e.g., stating “PAY”) indicating that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). For another example, as with payment message object 1460, payment message object 1490 includes an accept button 1471 (e.g., stating “PAY”) for accepting the payment request (i.e., to make the payment requested by the payment request).
[0500] In some embodiments, as with payment message object 1460 shown in FIG. 14H, payment message object 1490, because it corresponds to a payment request (instead of a payment), includes a request indicator 1449 that indicates to the user that the payment message object corresponds to a payment request and not to a payment. In some embodiments, as shown in FIG. 14L, request indicator 1449 is a currency symbol (e.g., the dollar symbol “$”) displayed at a center region of the message object. In some embodiments, request indicator 1449 is a graphical symbol. In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1449 correspond with the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of an amount indication of a payment message object that corresponds to a (pending or completed) payment (e.g., amount indication 1468 of payment message objects 1420, 1491). In some embodiments, the visual characteristics (e.g., font type, boldness / thickness, color, shading, dynamic feedback, such as a 3D effect) of request indicator 1449 are different from (and thus do not correspond with) the visual characteristics (e.g., font type, boldness / thickness, color, shading) of an amount indication of a payment message object that corresponds to a (pending or completed) payment request (e.g., amount indication 1468 of payment message objects 1460, 1490).
[0501] In some embodiments, in response to detecting user activation (e.g., a tap gesture) of accept button 1471 in payment message object 1490, a payment confirmation user interface corresponding to payment confirmation user interface 878 described above with reference to FIGS. 8T-8W is displayed. Then, upon successful user authentication via the payment confirmation user interface, the payment requested by message participant 1410 via payment message object 1490 can be paid. In some embodiments, upon successful user authentication via the payment confirmation user interface, the device creates a new payment message object corresponding to the authenticated payment sent by the user to message participant 1410 (e.g., payment message object 1420 described above with reference to FIGS. 14C-14F).
[0502] FIG. 14M shows a payment message object 1491 within conversation 1408 with message participant 1410 (e.g., “John Appleseed”) sent by message participant 1410 to the user. In some embodiments, payment message object 1491 includes elements that are analogous to
elements of payment message object 1420 corresponding to a payment sent by the user to the message participant. For example, as with payment message object 1420, payment message object 1491 includes amount indication 1468 that shows the payment amount (e.g., “$28”) displayed in the payment font. For another example, as with payment message object 1420, payment message object 1491 includes mode indication 1470 (e.g., stating “PAY”) indicating that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). For another example, as with payment message object 1420, payment message object 1491 includes an accept button 1471 (e.g., stating “PAY”) for accepting the payment (i.e., to receive the payment sent by message participant 1410).
[0503] In some embodiments, in response to detecting user activation (e.g., a tap gesture) of accept button 1471 in payment message object 1491 (thereby accepting the payment from message participant 1410), accept button 1471 ceases to be displayed on the payment message object. Further, in some embodiments, as described above with reference to payment message object 1118 in FIGS. 11D-11E, electronic device 1400 generates feedback (e.g., a visual effect, a sensory feedback, such as a haptic effect, an audio feedback) indicating to the user that the payment corresponding to payment message object 1491 has been accepted. As noted, the different types of exemplary feedback that can be generated by the device are described above with reference to payment message object 1118 in FIGS. 11D-11E.
[0504] In some embodiments, a payment message object associated with a payment (e.g., payment message object 1420) sent by the user (to a message participant, such as message participant 1410), a payment request (e.g., payment message object 1460) made by the user (to a message participant, such as message participant 1410), a payment request (e.g., payment message object 1490) made by a message participant (e.g., message participant 1410) to the user, and a payment (e.g., payment message object 1491) sent by a message participant (e.g., message participant 1410) to the user are displayed, on a display (e.g., display 1402) with the same visual characteristic, such as the same (background) color, the same shade, the same graphical pattern, and/or the same shape. In some embodiments, this consistency of visual characteristics is true
across the two communicating devices (e.g., on the user’s device and on the message participant’s device).
[0505] In some embodiments, while payment related objects are displayed in a similar manner on two communicating devices, non-payment related objects are displayed in a different manner, as between the devices. For example, while a payment message object associated with a payment (e.g., payment message object 1420) sent by the user (to a message participant, such as message participant 1410), a payment request (e.g., payment message object 1460) made by the user (to a message participant, such as message participant 1410), a payment request (e.g., payment message object 1490) made by a message participant (e.g., message participant 1410) to the user, and a payment (e.g., payment message object 1491) sent by a message participant (e.g., message participant 1410) to the user are displayed, on a display (e.g., display 1402), with the same visual characteristic (e.g., a first (background) color, shade, graphical pattern, and/or shape) on both communicating devices (e.g., on both the user’s device and the message participant’s device), non-payment message objects (e.g., message object 1416 and message object 1418) are displayed with a different visual characteristic on the two communicating devices.
For example, on the user’s device (e.g., electronic device 1400), while all payment message objects are displayed with the first visual characteristic, message object 1416 (corresponding to a message sent by the user to message participant 1410) is displayed with a second visual characteristic (e.g., a second (background) color, shade, graphical pattern, and/or shape that is different from the first (background) color, shade, graphical pattern, and/or shape), and message object 1418 (corresponding to a message sent by message participant 1410 to the user) is displayed with a third visual characteristic (e.g., a third (background) color, shade, graphical pattern, and/or shape that is different from both the first (background) color, shade, graphical pattern, and/or shape and the second (background) color, shade, graphical pattern, and/or shape). By contrast, on the message participant’s (e.g., message participant 1410’s) device, while all payment message objects are displayed with the first visual characteristic, a message object corresponding to message object 1416 on the user’s device is displayed with the third (instead of the second) visual characteristic and the message object corresponding to message object 1418 on the user’s device is displayed with the second (instead of the third) visual characteristic.
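The styling rule described in the preceding two paragraphs can be summarized in a small sketch: payment-related objects get one shared visual characteristic on both devices, while ordinary message bubbles swap characteristics depending on whether the viewing device sent or received them. The style names and function below are hypothetical placeholders, not the embodiment's actual rendering code.

```python
# Hypothetical visual-characteristic selection for message objects, per the
# rule above. "first"/"second"/"third" stand in for distinct background
# colors, shades, patterns, and/or shapes.
PAYMENT_STYLE = "first"   # shared by all payment objects on both devices
SENT_STYLE = "second"     # messages the viewing device sent
RECEIVED_STYLE = "third"  # messages the viewing device received

def bubble_style(is_payment_object, viewer_is_sender):
    """Pick the visual characteristic for a message object as seen on one
    device. Payment objects look identical on both communicating devices;
    plain messages swap styles between sender and recipient."""
    if is_payment_object:
        return PAYMENT_STYLE
    return SENT_STYLE if viewer_is_sender else RECEIVED_STYLE
```

Under this sketch, message object 1416 renders with the second characteristic on the user's device (viewer is sender) and the third on the participant's device (viewer is recipient), matching the contrast described above.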
[0506] FIGS. 15A-15K are a flow diagram illustrating a method for managing peer-to-peer transfers using an electronic device in accordance with some embodiments. Method 1500 is performed at a device (e.g., 100, 300, 500, 1300, 1400) with a display and one or more input devices (e.g., a touch-sensitive surface, a mic). Some operations in method 1500 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0507] As described below, method 1500 provides an intuitive way for managing peer-to-peer transfers. The method reduces the cognitive burden on a user for managing peer-to-peer transfers, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transfers faster and more efficiently conserves power and increases the time between battery charges.
[0508] In some examples, prior to displaying, on the display (e.g., 1302, 1402), a numerical value selection user interface (e.g., 1320, 1440), the electronic device (e.g., 1300, 1400) displays (1502), on the display (e.g., 1302, 1402), a third message object (e.g., 1418) that corresponds to a message (e.g., a message requesting the respective numerical value) received from a participant (e.g., 1310, 1410), other than a user of the electronic device, of the one or more participants. In some examples, in accordance with a determination that the third message was authenticated (e.g., via fingerprint authentication, via facial recognition authentication, via iris/retina scan authentication, or via passcode), by the participant, on an external device of the participant, the electronic device concurrently displays (1504), with the third message object (e.g., 1418), an indication (e.g., a message, notification, or note/comment stating that the received message is a verified message) that the third message was biometrically authenticated (e.g., verified) by the participant (e.g., a request for payment that was made with authentication by the sender is displayed differently at the recipient than a request for payment that was made without authentication to indicate to the recipient when the request for payment is an authorized request).
[0509] In some examples, prior to displaying, on the display (e.g., 1302, 1402), a numerical value selection user interface (e.g., 1320, 1440), the electronic device (e.g., 1300, 1400) displays, based on an analysis of the contents of the third message object (e.g., 1418) (or one or more additional message objects in the message transcript), a selectable indication (e.g., corresponding
to a payment amount included in the contents of the message object). In some examples, in response to detecting user selection of the selectable indication, the electronic device displays (launches) the numerical value selection user interface (e.g., 1320, 1440) with the numerical value corresponding to the requested amount (e.g., of funds) indicated in the contents of the third message object (e.g., 1418) pre-populated within the numerical value selection user interface (e.g., as described in method 900 with reference to FIGS. 9A-9I). In some examples, in response to detecting the user selection of the selectable indication, the electronic device displays (launches) the numerical value selection user interface (e.g., 1320, 1440) without the numerical value corresponding to the requested amount (e.g., of funds) indicated in the contents of the third message object (e.g., 1418) pre-populated within the numerical value selection user interface.
[0510] The electronic device (e.g., 1300, 1400) displays (1506), on the display (e.g., 1302, 1402), a numerical value selection user interface (e.g., 1320, 1440). While displaying the numerical value selection user interface (e.g., 1320, 1440), the electronic device (e.g., 1300, 1400) receives (1508), via the one or more input devices, an input (e.g., a user input on a touch sensitive surface of the device) that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface.
[0511] In response (1510) to receiving the input that corresponds to the selection of the respective numerical value, the electronic device (e.g., 1300, 1400) displays (1512), on the display (e.g., 1302, 1402), a representation of the respective numerical value (e.g., 1448) in the numerical value selection user interface (e.g., 1320, 1440).
[0512] While displaying the representation of the respective numerical value (e.g., 1448) in the numerical value selection user interface (e.g., 1320, 1440), the electronic device (e.g., 1300, 1400) receives (1514), via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application (e.g., 1306, 1406), that corresponds to the respective numerical value.
[0513] In response (1516) to receiving the input (e.g., 1401) that corresponds to the request to send the message, via the messaging application (e.g., 1306, 1406), that corresponds to the
respective numerical value, the electronic device (e.g., 1300, 1400) sends (1518) the message that corresponds to the respective numerical value to one or more participants (e.g., 1310, 1410).
[0514] In some examples, the one or more participants (e.g., 1310, 1410) includes (1520) a first participant and a second participant, and the first participant and the second participant are different from a user of the electronic device (e.g., 1300, 1400).
[0515] In accordance with (1522) a determination that the message is designated as a transmission message for the respective numerical value (e.g., a sending out of computing resources, a sending out of points, a sending out of credits, a sending out of funds, a sending out of virtual resources), the electronic device (e.g., 1300, 1400) displays (1524), on the display, a first message object (e.g., 1344, 1420, a text message, a chat bubble, an open email) in a message transcript (e.g., 1308, 1408) of the messaging application (between a user of the electronic device and a remote user). The first message object (e.g., 1344, 1420) includes a graphical representation of the respective numerical value (e.g., 1346, 1468) in a respective font that is associated with requests generated using the numerical value selection user interface (e.g., a special type of font controlled by the operating system). In some examples, the respective font is a font that is larger than a font used for text in other message objects in the message transcript. Displaying, in a message transcript (e.g., 1308, 1408), a message that includes a graphical representation of a value in a particular font provides the user with feedback about how the message was generated, that the message relates to a transmission (e.g., a transmission of funds, rather than a request for funds), and the value that corresponds to the transmission. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0516] In accordance with (1528) a determination that the message is designated as a request message for the respective numerical value (e.g., a request for computing resources, a request for points, a request for credits, a request for funds, a request for virtual resources), the electronic device (e.g., 1300, 1400) displays (1530), on the display (e.g., 1302, 1402), a second message object (e.g., 1358, 1460) in the message transcript (e.g., 1308, 1408) of the messaging application (e.g., 1306, 1406) different from the first message object (e.g., 1344, 1420, a text message, a chat bubble, an open email).
[0517] In the second message object (e.g., 1358, 1460), the respective numerical value is displayed (1532) in a font that is smaller (e.g., smaller in height) than the respective font. In the second message object (e.g., 1360, 1460), a predetermined request indicator (e.g., a symbol, such as a currency symbol (e.g., “$”), or a textual indicator, such as “Request for Resources”) associated with requests generated using the numerical value selection user interface is displayed (1534) in the respective font. Displaying, in the message transcript (e.g., 1308, 1408), a message that includes a predetermined request indicator in a particular font without displaying the numerical value in the same font provides the user with feedback about how the message was generated, that the message relates to a request (e.g., a request for funds, rather than a transmission of funds), and the value that corresponds to the request. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
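The two display branches above amount to a simple styling rule: a transmission message object renders the value itself in the respective font, while a request message object renders the value in a smaller font and reserves the respective font for the predetermined request indicator. A minimal sketch follows; the names (MessageKind, StyledElement) and point sizes are illustrative assumptions, not part of this specification.

```python
# Illustrative sketch of the message-object styling rule described above.
# Names and point sizes are assumptions, not taken from the specification.
from dataclasses import dataclass
from enum import Enum, auto

class MessageKind(Enum):
    TRANSMISSION = auto()  # sending out items (e.g., funds)
    REQUEST = auto()       # requesting items

RESPECTIVE_FONT_PT = 28    # the respective font tied to the value-selection UI
SMALLER_FONT_PT = 15       # a font smaller (in height) than the respective font

@dataclass
class StyledElement:
    text: str
    font_pt: int

def style_message_object(kind: MessageKind, value: str) -> list[StyledElement]:
    """Build the styled elements of the first or second message object."""
    if kind is MessageKind.TRANSMISSION:
        # First message object: the numerical value itself is rendered in the
        # respective font.
        return [StyledElement(value, RESPECTIVE_FONT_PT)]
    # Second message object: the value is rendered in the smaller font, while
    # the predetermined request indicator appears in the respective font.
    return [
        StyledElement(value, SMALLER_FONT_PT),
        StyledElement("$", RESPECTIVE_FONT_PT),  # predetermined request indicator
    ]
```

Note that the rule is deliberately asymmetric: in a request, the most prominent element is the indicator rather than the value, which is what allows a reader to distinguish the two message types at a glance.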
[0518] In some examples, the message transcript (e.g., 1308, 1408) of the messaging application (e.g., 1306, 1406) includes (1536) a third message object (e.g., 1491, received from a participant (e.g., 1310, 1410) other than the user). In some examples, the third message object (e.g., 1491) corresponds (1538) to a transmission message (e.g., a payment message sending funds in the sent numerical value amount to the user) for sending one or more items corresponding to a numerical value generated at an external device of a participant (e.g., 1310, 1410) of the one or more participants. In some examples, the third message object includes (1540) an accept affordance (e.g., 1471) for accepting one or more items associated with the third message object at the electronic device (e.g., one or more files or a payment from the participant from whom the third message object was received).
[0519] In some examples, the message transcript (e.g., 1308, 1408) of the messaging application (e.g., 1306, 1406) includes (1542) a fourth message object (e.g., 1490, received from a participant (e.g., 1310, 1410) other than the user). In some examples, the fourth message object (e.g., 1490) corresponds (1544) to a request message (e.g., a payment request message requesting funds in the requested numerical value amount) for requesting one or more items corresponding to a numerical value generated at an external device of a participant of the one or more participants. In some examples, the fourth message object (e.g., 1490) includes (1546) a send affordance (e.g., 1471) for sending one or more items associated with the fourth message object to a participant from whom the fourth message object (e.g., 1490) was received (e.g., one or more files or a payment from a user of the device to the participant from whom the fourth message object was received).
[0520] In response (1516) to receiving the input that corresponds to the request to send the message, via the messaging application (e.g., 1306, 1406), that corresponds to the respective numerical value, in accordance with a determination that a first participant (e.g., 1310, 1410) of the one or more participants is ineligible to receive the message (e.g., the first participant’s device does not support transfers of resources), the electronic device (e.g., 1300, 1400) displays (1548), on the display (e.g., 1402), an indication (e.g., a pop-up notification, an error message in the message application, a note/comment accompanying the message in the messaging application) that the first participant is ineligible to receive the message. In some examples, where the intended recipient (or an intended recipient of the plurality) is ineligible (e.g., not enabled) to participate in transfers/requests for resources, a send affordance (e.g., used to send a drafted message) is greyed out or otherwise prevented from being activated. Displaying an indication that the remote user is ineligible to receive the message provides the user with feedback about the capabilities of the remote user’s device and provides the user with visual feedback that the message will not be sent. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
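The eligibility handling above can be thought of as a pre-send check: before a transfer message is dispatched, each intended recipient's ability to receive it is tested, and failure both disables the send affordance and produces the indication. The sketch below is a hypothetical illustration; the function name and data shapes are assumptions, not an API defined by this specification.

```python
# Hypothetical pre-send eligibility check; names are illustrative only.
def prepare_send(participants: list, supports_transfers: dict):
    """Return (send_enabled, indication).

    If any intended recipient is ineligible to receive the message (e.g., the
    recipient's device does not support transfers of resources), the send
    affordance is disabled (greyed out) and an indication is shown instead.
    """
    ineligible = [p for p in participants if not supports_transfers.get(p, False)]
    if ineligible:
        return False, f"{ineligible[0]} is ineligible to receive this message"
    # All recipients eligible: the send affordance remains active.
    return True, None
```

In a group conversation this check runs over every participant, which matches the behavior described above where a single ineligible recipient is enough to prevent the message from being sent.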
[0521] In some examples, the electronic device receives (1550), from a participant (e.g., 1310, 1410) of the one or more participants, a message that corresponds to a second respective numerical value. In some examples, in accordance with a determination that the received message is designated as a transmission message for the second respective numerical value (e.g., a sending out of computing resources, a sending out of points, a sending out of credits, a sending out of funds, a sending out of virtual resources), the electronic device displays (1552), on the display, a first received message object (e.g., 1491, a text message bubble, a chat bubble, an open email that is received from a different participant) in the message transcript (e.g., 1308, 1408) of the messaging application (between a user of the electronic device and a remote user). In some examples, the first received message object (e.g., 1491) includes (1554) a graphical representation of the second respective numerical value (e.g., 1468) in the respective font that is associated with requests generated using the numerical value selection user interface (e.g., a special type of font controlled by the operating system). In some examples, the respective font is a font that is larger than a font used for text in other message objects in the message transcript. Displaying, in a message transcript, a message that includes a graphical representation of a value in a particular font provides the user with feedback about how the message was generated, that the message relates to a transmission (e.g., a transmission of funds, rather than a request for funds), and the value of the transmission.
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0522] In some examples, in accordance with a determination that the received message is designated as a request message for the second respective numerical value (e.g., a request for computing resources, a request for points, a request for credits, a request for funds, a request for virtual resources), the electronic device (e.g., 1300, 1400) displays (1556), on the display (e.g., 1302, 1402), a second received message object (e.g., 1490) in the message transcript (e.g., 1308, 1408) of the messaging application different from the first received message object (e.g., 1491, a text message, a chat bubble, an open email). In some examples, in the second received message
object (e.g., 1490), the respective numerical value (e.g., 1468) is displayed (1558) in the font that is smaller (e.g., smaller in height) than the respective font. In some examples, a predetermined request indicator (e.g., a symbol, such as a currency symbol, or a textual indicator, such as “Request for Resources”) associated with requests generated using the numerical value selection user interface is displayed (1560) in the respective font. Displaying, in the message transcript, a message that includes a predetermined request indicator in a particular font without displaying the numerical value in the same font provides the user with feedback about how the message was generated, that the message relates to a request (e.g., a request for funds, rather than a transmission of funds), and the value of the request. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0523] In some examples, the electronic device (e.g., 1300, 1400) concurrently displays (1562) (e.g., with the first message object, with the second message object), at a first location (e.g., within the first message object, adjacent to the first message object) associated with a message object (e.g., the first message object or the second message object) in the message transcript (e.g., 1308, 1408) of the messaging application (e.g., 1306, 1406), a visual indicator (e.g., 1494, 1496, text, such as “accepted,” “pending,” “viewed,” or a graphical indicator) indicating a status associated with an action of a participant (e.g., 1310, 1410) of the one or more participants. In some examples, in accordance with a determination that the participant (e.g., 1310, 1410) has taken an action (e.g., accepted a transfer, accepted a payment, viewed a transfer, viewed a payment, declined to accept a transfer, declined to accept a payment) changing the status, the electronic device updates (1564) the visual indicator (e.g., 1494, 1496) to reflect the change in status associated with the action of the participant.
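The status indicator described above is effectively a small state machine: the remote participant's action maps to new indicator text, and absent any action the existing indicator is kept. A minimal sketch follows; the action and indicator strings are examples, not values defined by the specification.

```python
# Illustrative mapping from a participant's action to the visual indicator
# text displayed with the message object; strings are assumptions.
INDICATOR_FOR_ACTION = {
    "viewed": "viewed",
    "accepted": "accepted",
    "declined": "declined",
}

def update_visual_indicator(current, action):
    """Update the status indicator when the participant takes an action that
    changes the status; otherwise keep the current indicator unchanged."""
    if action in INDICATOR_FOR_ACTION:
        return INDICATOR_FOR_ACTION[action]
    return current  # e.g., remains "pending" until the participant acts
```

Because the indicator content is controlled by the operating system rather than by message text, updating it is a matter of re-rendering this derived state next to (or overlapping) the message object.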
[0524] In some examples, the first location at least partially overlaps with the displayed message object (e.g., 1344, 1358, 1420, 1460, 1490, 1491). In some examples, the first location does not overlap with the displayed message object. In some examples, content of the visual
indicator is controlled by an operating system of the electronic device (e.g., the visual indicator is displayed in a background of a transcript on which representations of messages are displayed, and is visually distinguished from the representations of messages in the transcript).
[0525] In some examples, subsequent to displaying (1530), on the display (e.g., 1302, 1402), the second message object (e.g., 1358, 1460) in the messaging application, in accordance with a determination that a transfer of a first type of item in a quantity corresponding to the respective numerical value has been initiated (or accepted) by an intended recipient of the message associated with the second message object, the electronic device (e.g., 1300, 1400) changes (1566) display of a visual characteristic of the second message object (e.g., 1358, 1460) from a first visual characteristic to a second visual characteristic.
[0526] In some examples, the electronic device (e.g., 1300, 1400) receives (1568), from an external device associated with a participant of the one or more participants, a second message (e.g., a resource request message, a payment request message, a regular text message containing only text) associated with a request for a second respective numerical value (e.g., a request message that includes an embedded request for computing resources, points, credits, funds, or virtual resources, or a regular text message that includes a mention of a request for computing resources, points, credits, funds, or virtual resources). Subsequent to (1570) receiving the second message associated with the request for the second respective numerical value and in accordance with a determination that a predetermined amount of time (e.g., a pre-set time limit, such as 1 hour) has passed since receiving the second message, and in accordance with a determination that the second message is designated (e.g., is sent using a corresponding numerical value selection user interface by the participant on the external device) as a request message for one or more items corresponding to the second respective numerical value, the electronic device generates (1572) a reminder (e.g., displayed on a lock screen of the electronic device, displayed as a numerical indicator on an icon for starting the messaging application) of the received second message. Subsequent to (1570) receiving the second message associated with the request for the second respective numerical value and in accordance with a determination that the second message is not designated as a request message (e.g., is not sent using the corresponding numerical value selection user interface on the external device, but is a regular text message
containing text relating to a request for the second respective numerical value) for one or more items corresponding to the second respective numerical value, the electronic device forgoes (1574) generating the reminder of the received second message.
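The reminder logic above hinges on two independent conditions: the message must have been formally designated as a request (not merely mention one in plain text), and the predetermined amount of time must have elapsed. A sketch of that rule follows; the one-hour delay and the names are illustrative assumptions.

```python
# Sketch of the reminder rule described above; the delay value and names are
# illustrative assumptions, not taken from the specification.
REMINDER_DELAY_S = 3600  # predetermined amount of time (e.g., 1 hour)

def should_generate_reminder(designated_request: bool,
                             received_at: float, now: float) -> bool:
    """Generate a reminder only for messages designated (via the numerical
    value selection interface) as request messages, once the predetermined
    amount of time has passed; forgo the reminder for ordinary text that
    merely mentions a request."""
    if not designated_request:
        # Regular text message containing request-like text: no reminder.
        return False
    return (now - received_at) >= REMINDER_DELAY_S
```

Gating on the formal designation rather than on message content means the device never has to parse free text to decide whether a reminder is warranted.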
[0527] In some examples, the first message object (e.g., 1344, 1420), the second message object (e.g., 1460), the first received message object (e.g., 1491), and the second received message object (e.g., 1490) are displayed with a first visual characteristic (e.g., a color, a shade, a graphical pattern, a shape). Thus, in some examples, the first message object, the second message object, the first received message object, and the second received message object are all displayed with the same visual characteristic, such as the same color, the same background color, the same shade, the same graphical pattern, and/or the same shape.
[0528] In some examples, a third message object (e.g., 1416) that corresponds to a message of the messaging application (e.g., 1306, 1406) that was sent by the electronic device (e.g., 1300, 1400) and does not correspond to the respective numerical value (and/or does not correspond to a message generated using a numerical value selection user interface) is displayed with a second visual characteristic (e.g., a color, a shade, a graphical pattern, a shape) and a third received message object (e.g., 1418) that corresponds to a message of the messaging application that was received from the one or more participants and does not correspond to the second respective numerical value (and/or does not correspond to a message generated using a numerical value selection user interface) is displayed with a third visual characteristic (e.g., a color, a shade, a graphical pattern, a shape) that is different from the second visual characteristic. Differentiating messages based on whether they were sent by the device or received by the device provides the user with visual feedback about the sender and recipient of the message. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0529] In some examples, the first message object (e.g., 1344, 1420) and the second message object (e.g., 1460) are displayed with a first visual characteristic (e.g., a color, a shade, a
graphical pattern, a shape). In some examples, messages that do not correspond to transfers or requests for resources/funds have a different background color from messages that do correspond to transfers or requests for resources/funds.
[0530] In some examples, a third message object (e.g., 1416) that corresponds to a message of the messaging application (e.g., 1306, 1406) that does not correspond to the respective numerical value (and/or does not correspond to a message generated using a numerical value selection user interface) is displayed with a second visual characteristic (e.g., a color, a shade, a graphical pattern, a shape) that is different from the first visual characteristic. Visually differentiating between messages that do and do not correspond to transfer of items helps the user quickly identify messages that include transfers of items. This is particularly helpful because non-transfer messages involve limited consequences and users may glance over such messages with little review, while messages that correspond to transfers involve relatively higher consequences. The differentiated visual feedback prompts the user to review such messages more carefully (and potentially take action). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
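Taken together, the visual-characteristic rules above reduce to two questions: does the message correspond to a transfer or request, and was it sent or received? A compact sketch follows; the specific background values are assumptions chosen for illustration, not colors prescribed by the specification.

```python
# Illustrative choice of visual characteristic (here, a background style);
# the specific values are assumptions, not taken from the specification.
def message_background(is_transfer: bool, is_outgoing: bool) -> str:
    """Transfer/request message objects share one visual characteristic
    regardless of direction, making them stand out for careful review;
    ordinary messages are differentiated by sent vs. received direction."""
    if is_transfer:
        return "dark"  # first visual characteristic, shared by all transfers
    # Second vs. third visual characteristic for ordinary messages.
    return "blue" if is_outgoing else "gray"
```

Keeping one shared characteristic for all transfer messages, independent of direction, is what prompts the user to review higher-consequence messages more carefully.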
[0531] In some examples, the one or more participants include a first participant (e.g., 1310, 1410) and a second participant, and the first participant and the second participant are different from a user of the electronic device (e.g., 1300, 1400). In some examples, the electronic device receives an indication (e.g., based on user input, based on text of one or more messages between the participants) that an intended recipient of the message is the first participant and not the second participant. Subsequent to receiving the indication that the intended recipient of the message is the first participant and not the second participant, and in accordance with the determination that the message is designated as a transmission message for the respective numerical value (e.g., a sending out of computing resources, a sending out of points, a sending out of credits, a sending out of funds, a sending out of virtual resources), the electronic device
(e.g., 1300, 1400) displays, on the display (e.g., 1302, 1402), the first message object (e.g., 1416) in a second message transcript (e.g., a chat screen between only the user of the electronic device and the first participant, and not involving the second participant) different from the message transcript of the messaging application. The second message transcript is not associated with the second participant (e.g., is only associated with the first participant and the user of the electronic device). Thus, in some examples, if the original messaging conversation was a group conversation, and the user intends to send the message to only one participant of the group conversation, a new conversation is created between only the user and the intended recipient of the message. Subsequent to receiving the indication that the intended recipient of the message is the first participant and not the second participant, and in accordance with the determination that the message is designated as a request message for the respective numerical value (e.g., a request for computing resources, a request for points, a request for credits, a request for funds, a request for virtual resources), the electronic device displays, on the display, the second message object in the second message transcript of the messaging application.
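The routing behavior described above can be sketched as a selection between the original group transcript and a second, one-on-one transcript. This is a hypothetical illustration only; the function and parameter names are not from the specification.

```python
# Hypothetical sketch of routing a transfer message out of a group
# conversation into a second, one-on-one transcript; names are illustrative.
def target_transcript(user, group_participants, intended_recipients):
    """Return the participant set of the transcript in which the message
    object should be displayed. If the intended recipient is only a subset
    of the group, a second transcript between the user and that recipient
    is used instead of the group transcript."""
    if set(intended_recipients) == set(group_participants):
        # Message addressed to the whole group: keep the original transcript.
        return sorted({user, *group_participants})
    # Second message transcript, not associated with the other participants.
    return sorted({user, *intended_recipients})
```

The same routing decision applies to both branches above, whether the message object being placed is the transmission object or the request object.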
[0532] In some examples, prior to sending the message to the one or more participants, the electronic device (e.g., 1300, 1400) receives, via the input mechanism (e.g., 1412), a user comment (e.g., text relating to the message to be sent to the one or more participants). In some examples, prior to receiving the user comment, the electronic device receives an input (e.g., a tap) on a comment indicator (e.g., a comment region or comment bar for entering text, a comment affordance for bringing up a keyboard). In some examples, the numerical value selection user interface (e.g., 1320, 1440) includes a comment region or comment bar for entering comments. In some examples, the numerical value selection user interface (e.g., 1320, 1440) includes a comment affordance for bringing up a keyboard that enables the user to enter comments. In some examples, in response to receiving the input on the comment indicator, the device displays an input mechanism (e.g., a virtual keyboard for typing text, a digital assistant for entering text via spoken user input) for entering a comment. In some examples, subsequent to receiving the user comment, and in accordance with the determination that the message is designated as a transmission message for one or more items corresponding to the respective numerical value (e.g., a sending out of computing resources, a sending out of points, a sending out of credits, a sending out of funds, a sending out of virtual resources), the electronic device concurrently
displays, adjacent to (e.g., below) the first message object (e.g., 1344, 1420), a message object (e.g., 1463) including the user comment (e.g., 1461). In some examples, subsequent to receiving the user comment, and in accordance with the determination that the message is designated as a request message for one or more items corresponding to the respective numerical value (e.g., a requesting of computing resources, a requesting of points, a requesting of credits, a requesting of funds, a requesting of virtual resources), the electronic device concurrently displays, adjacent to (e.g., below) the second message object (e.g., 1460), the message object including the user comment.
[0533] Note that details of the processes described above with respect to method 1500 (e.g., FIGS. 15A-15K) are also applicable in an analogous manner to the methods described herein. For example, method 1500 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1200, 1800, 2100, 2400, 2700, 3000, and 3400. For example, concurrently displaying the representation of a message and a selectable indication that corresponds to a type of item (being transferred, such as a photo, sticker, resources, or a payment), as described in method 900, can be applied with respect to the first message object (e.g., 1420), the second message object (e.g., 1460), the first received message object (e.g., 1491), or the second received message object (e.g., 1490). For another example, the outputting of dynamic feedback described in method 1200 can be applied with respect to the first message object (e.g., 1420), the second message object (e.g., 1460), the first received message object (e.g., 1491), or the second received message object (e.g., 1490). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied with respect to the first message object (e.g., 1420), the second message object (e.g., 1460), the first received message object (e.g., 1491), or the second received message object (e.g., 1490) when retrieving one or more items (e.g., a sticker, a photo, resources, a payment) associated with the message. For another example, displaying representations of a first account and a second account, as described in method 2100, can also be displayed when authenticating / confirming an incoming transfer corresponding to the second received message object (e.g., 1491).
For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can also be used to accept the contents of an incoming transfer corresponding to the second
received message object (e.g., 1491). For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can be displayed in response to user selection of the first message object (e.g., 1420), the second message object (e.g., 1460), the first received message object (e.g., 1491), or the second received message object (e.g., 1490). For another example, an utterance can be used, as described in method 3000, to create the first message object (e.g., 1420) or the second message object (e.g., 1460). For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element (e.g., 1468) of a message object (e.g., 1420) or an element (e.g., 1468) of a received message object (e.g., 1490) when a transfer (e.g., of a resource, of a file, of a payment) associated with the message objects is completed. For brevity, these details are not repeated below.
[0534] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 15A-15K are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, displaying operation 1506, receiving operation 1508, displaying operation 1512, receiving operation 1514, sending operation 1518, displaying operation 1524, and displaying operation 1530 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would
be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
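The event-delivery chain described above (event sorter to event recognizer to event handler) can be loosely mirrored in a few lines. The classes below are Python stand-ins for illustration only; they mimic the flow of the components, not the actual implementation of event sorter 170, event recognizer 180, or event handler 190.

```python
# Loose stand-in for the event sorter / recognizer / handler chain described
# above; illustrative only, not the actual components.
class EventRecognizer:
    def __init__(self, event_definition, handler):
        self.event_definition = event_definition  # cf. event definitions 186
        self.handler = handler                    # cf. event handler 190

    def recognize(self, event):
        # Compare the incoming event information to the event definition; on
        # a match, activate the associated event handler.
        if event == self.event_definition:
            return self.handler(event)
        return None

class EventSorter:
    """Receives event information and delivers it to registered recognizers
    (cf. event dispatcher module 174 in event sorter 170)."""
    def __init__(self):
        self.recognizers = []

    def dispatch(self, event):
        for recognizer in self.recognizers:
            result = recognizer.recognize(event)
            if result is not None:
                return result
        return None  # no recognizer matched the event
```

For example, registering a recognizer for a tap on the send affordance and dispatching that event would invoke the corresponding handler, analogous to how a displaying or sending operation is triggered.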
[0535] FIGS. 16A-16F illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 16A-16F relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 17A-17L, which in turn are used to illustrate the processes described below, including the processes in FIGS. 18A-18F.
[0536] FIG. 16A illustrates an electronic device 1600 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 16A-16F, electronic device 1600 is a smartphone. In other embodiments, electronic device 1600 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1600 has a display 1602 and one or more input devices (e.g., touchscreen of display 1602, a mechanical button 1604, a mic).
[0537] FIG. 16A shows electronic device 1600 displaying, on display 1602, a message conversation 1608 of a messaging application 1606 between a user of electronic device 1600 (e.g., “Kate Appleseed”) and a message participant 1610 (e.g., “John Appleseed”). In some embodiments, the device also displays (e.g., beneath or covering a portion of messaging application 1606) a virtual keyboard 1612 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 1614 displaying the text of a message as a message is typed using virtual keyboard 1612. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 1612 to type a message. In some embodiments, compose bar 1614 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 1614 includes a mic button 1616 which, when activated, enables the user to enter a message using voice input.
[0538] As shown in FIG. 16A, message conversation 1608 includes a message object 1618 that corresponds to a message sent by the user to message participant 1610. In the message
corresponding to message object 1618, the user asks message participant 1610: “Can you send me the account info?” As also shown in FIG. 16A, message conversation 1608 also includes an encrypted message object 1620 sent by message participant 1610 to the user in response to the user’s request to send “the account info.” In some embodiments, encrypted message object 1620 corresponds to a transfer of an encrypted message (e.g., as indicated by indication 1622) and includes an accept button 1624 for accepting (and thereby decrypting) the contents of the encrypted message associated with the encrypted message object. In some embodiments, an activated decrypting account is required in order to decrypt an encrypted message, such as the encrypted message corresponding to encrypted message object 1620.
[0539] FIG. 16B shows electronic device 1600 displaying, on display 1602, in response to detecting user activation of accept button 1624, and in accordance with a determination (e.g., made by electronic device 1600 based on accounts stored or logged into the device or made by an external device, such as a server, storing information about accounts associated with the user of the device) that a decrypting account associated with the user, which is required to view and send encrypted messages (e.g., via messaging application 1606), is not yet activated (e.g., not yet set up, not yet configured), the device displays (e.g., replaces display of message conversation 1608 of messaging application 1606 with) an initial setup notification user interface 1626.
[0540] In some embodiments, initial setup notification user interface 1626 includes a (graphical and/or textual) indication 1628 informing the user that a decrypting account associated with the user account (logged into the device and belonging to the user of the device) must be activated (e.g., set up, configured). For example, in FIG. 16B, indication 1628 includes text stating: “To send and receive encrypted messages, please set up your decrypting account.” Initial setup notification user interface 1626 also includes a proceed button 1630 for proceeding with activating the decrypting account.
[0541] In some embodiments, in accordance with a determination (e.g., made by electronic device 1600 based on accounts stored or logged into the device or made by an external device, such as a server, storing information about accounts associated with the user of the device) that a required decrypting account associated with the user is already activated (e.g., is already set up, is already configured), the device proceeds with decrypting the encrypted message corresponding to encrypted message object 1620 and displays the contents of the message (e.g., as shown in FIG. 16F).
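The gate described above (decrypt and display when an activated decrypting account exists, otherwise display the initial setup notification) can be sketched as follows. This is a hypothetical illustration, not an implementation from the disclosure; all names are invented for clarity.

```python
# Sketch of the determination described for FIGS. 16B and 16F.
# Function and return-value names are hypothetical.

def handle_accept_tap(decrypting_account_activated: bool) -> str:
    """Return which user interface the device displays after the accept
    button (1624) on an encrypted message object (1620) is activated."""
    if decrypting_account_activated:
        # Account already set up: decrypt and show the message contents.
        return "decrypted_contents"            # cf. FIG. 16F
    # No activated decrypting account: prompt initial setup first.
    return "initial_setup_notification"        # cf. FIG. 16B
```

Either branch leaves the encrypted message object in the conversation; only the displayed interface differs.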
[0542] In FIG. 16C, in response to detecting user selection of proceed button 1630, electronic device 1600 displays, on display 1602, an account activation user interface 1626. As shown in FIG. 16C, account activation user interface 1626 includes a graphical representation 1632 (e.g., an image representing a card associated with the decrypting account) of the decrypting account and a progress indication 1638 (e.g., stating “Activating,” “Setting up your decrypting account”) informing the user that activation of the decrypting account is in progress. In some embodiments, as shown in FIG. 16C, graphical representation 1632 of the decrypting account includes a textual indication 1634 that the representation corresponds to a decrypting account and a plurality of patterns 1636 (e.g., user interface objects, shapes) that can be dynamic (e.g., moving, changing colors, changing location, changing a depth effect). In some embodiments, when the decrypting account is not yet activated (as is the case in FIG. 16C), a dynamic feedback (e.g., visual, sensory, audio) is not generated by electronic device 1600. The dynamic feedback is instead generated once the decrypting account has been activated, as shown in FIG. 16D.
[0543] FIG. 16D shows the account activation process being completed. Thus, in FIG. 16D, progress indication 1638 of account activation user interface 1626 informs the user that activation of the decrypting account has been successfully completed (e.g., by stating “Activated,” “Your decrypting account is ready to use”).
[0544] In some embodiments, as shown in FIG. 16E, electronic device 1600 generates a dynamic feedback on graphical representation 1632 of the decrypting account (e.g., a plurality of moving patterns 1636, a 3D animation of moving patterns/elements) akin to the dynamic visual feedback applied to a completed transfer message object (e.g., similar to the visual feedback applied to amount object 3324 described below with reference to, for example, FIGS. 33D-33J). In some embodiments, the feedback is a dynamic visual feedback causing display of graphical representation 1632 (or of patterns 1636) to change as changes in the orientation of the device relative to a reference point 1640 are detected, where reference point 1640 is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device (alternatively, in some embodiments, the reference point is a static point external to the device,
such as a location on the ground or floor). For example, in FIG. 16E, the dynamic visual feedback is a 3D effect that provides the user with the visual effect that graphical representation 1632 (or patterns 1636) is three-dimensional. Thus, in FIG. 16E, based on reference point 1640 of the user, graphical representation 1632 (or patterns 1636) looks visually different (e.g., shadows behind plurality of patterns 1636 change) from angle 1600A of the device and from angle 1600B of the device and, optionally, both the view of graphical representation 1632 from angle 1600A and angle 1600B look different from the appearance of the representation from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 16D). In some embodiments, the dynamic visual feedback is a changing color applied to the graphical representation (or to the plurality of patterns of the graphical representation).
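One way to realize the orientation-dependent 3D effect is to offset the simulated shadows behind the patterns in proportion to the angle between the display and the reference point (e.g., the viewer's face). The sketch below is a minimal illustration under that assumption; the constants and function name are invented, and the disclosure does not prescribe any particular formula.

```python
import math

def shadow_offset_px(device_angle_deg: float, reference_angle_deg: float,
                     max_offset_px: float = 8.0) -> float:
    """Shift the shadows behind the patterns (e.g., patterns 1636) as the
    device tilts relative to the reference point (e.g., reference point
    1640). A straight-on view (zero relative tilt) produces no offset, so
    the representation looks different from angle 1600A, angle 1600B, and
    straight on, as described for FIG. 16E."""
    relative_tilt = math.radians(device_angle_deg - reference_angle_deg)
    return max_offset_px * math.sin(relative_tilt)
```

Opposite tilts produce offsets of opposite sign, which is what makes the two viewing angles visually distinguishable.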
[0545] In some embodiments, in addition to or instead of a dynamic visual feedback, the device generates a dynamic haptic feedback (e.g., similar to the generated tactile output 3336 described below with reference to, for example, FIGS. 33F and 33H). In some embodiments, the dynamic haptic feedback is a dynamically strengthening and weakening tactile output caused by the device. In some embodiments, the dynamic haptic feedback is a tactile output with changing tactile output patterns caused by the device. In some embodiments, the strength or frequency of the tactile output changes as the device detects changes in the orientation of the device relative to the reference point (e.g., reference point 1640).
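The strengthening-and-weakening tactile output can likewise be modeled as a clamped function of the same relative tilt. The following sketch is hypothetical; the disclosure states only that strength or frequency changes with orientation, so the constants and mapping here are assumptions for illustration.

```python
def tactile_intensity(relative_tilt_deg: float,
                      base: float = 0.2, gain: float = 0.01,
                      ceiling: float = 1.0) -> float:
    """Map the detected change in device orientation (relative to a
    reference point such as reference point 1640) to a tactile output
    strength in the range [base, ceiling]. Larger tilts strengthen the
    output until the haptic engine's maximum is reached."""
    return min(ceiling, base + gain * abs(relative_tilt_deg))
```

A frequency-modulated variant would use the same input but drive the tactile output pattern rate instead of its amplitude.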
[0546] In some embodiments, the generated feedback (e.g., visual feedback, sensory feedback, audio feedback) is caused (e.g., only) by an operating system program of the device and non-operating system programs of the device are not enabled to cause the feedback.
[0547] FIG. 16F shows electronic device 1600 displaying, on display 1602, message conversation 1608 (with message participant 1610) of messaging application 1606 after the user has successfully activated the decrypting account. Because the decrypting account is now active, encrypted message object 1620 now shows the contents of the message (e.g., stating “The password on your account is 12345678”) associated with the encrypted message object.
[0548] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 16A-16F described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 17A-17L described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 16A-16F and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 17A-17L are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 1600, or 1700).
[0549] FIGS. 17A-17L illustrate exemplary user interfaces for peer-to-peer transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 18A-18F.
[0550] FIG. 17A illustrates an electronic device 1700 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 17A-17L, electronic device 1700 is a smartphone. In other embodiments, electronic device 1700 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 1700 has a display 1702 and one or more input devices (e.g., touchscreen of display 1702, a mechanical button 1704, a mic).
[0551] In FIG. 17A, electronic device 1700 displays, on display 1702, a user interface locked screen 1730 showing a notification 1732 corresponding to a payment received from a message participant 1710 (e.g., “John Appleseed”). For example, as shown in FIG. 17A, notification 1732 reads “John Appleseed sent you a payment.” In some embodiments (e.g., if the received payment is a gift), an amount of the received payment is not shown on notification 1732. In some embodiments (e.g., if the received payment is not a gift), an amount of the received payment is shown on notification 1732.
[0552] In some embodiments, notification 1732 is shown on a different user interface other than user interface locked screen 1730. For example, notification 1732 can be shown on a homescreen of the device (e.g., as a pop-up banner). For another example, notification 1732 can be shown on a notification user interface (e.g., a notification panel) of the device. For another
example, notification 1732 can be shown as a pop-up notification over an application user interface of a currently-running application on the device.
[0553] FIG. 17B shows a message conversation 1708 of a messaging application 1706 between a user of electronic device 1700 (e.g., “Kate Appleseed”) and message participant 1710 (e.g., “John Appleseed”), the sender of the payment corresponding to notification 1732. In some embodiments, as shown in FIG. 17B, the device 1700 also displays (e.g., beneath or covering a portion of messaging application 1706) a virtual keyboard 1712 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 1714 displaying the text of a message as a message is typed using virtual keyboard 1712. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 1712 to type a message. In some embodiments, compose bar 1714 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 1714 includes a mic button 1714A which, when activated, enables the user to enter a message using voice input.
[0554] As shown in FIG. 17B, message conversation 1708 includes a message object 1716 that corresponds to a message sent by the user to message participant 1710. In the message corresponding to message object 1716, the user states to message participant 1710: “See you at the party tomorrow!”

[0555] As also shown in FIG. 17B, message conversation 1708 includes a gift payment message object 1718 that corresponds to the received payment (e.g., a gift payment) notified by notification 1732. In some embodiments, gift payment message object 1718 includes a mode indication 1720 (e.g., corresponding to mode indication 1120 described, for example, in FIG. 11A) (e.g., stating “PAY”) indicating that the payment message object corresponds to a payment made via an operating-system controlled payment transfer application (and not by a third-party application). In some embodiments, gift payment message object 1718 includes a status indicator 1722 (e.g., corresponding to first status indicator 894) informing the user of a status of the payment corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 17B, status indicator 1722 shows “pending,” thus
indicating to the user that the payment associated with gift payment message object 1718 has not yet been accepted by the user.
[0556] In some embodiments, gift payment message object 1718 includes a graphical indication 1724 (e.g., instead of an indication of the payment amount, thus hiding the payment amount). In some embodiments, graphical indication 1724 is a graphical animation (e.g., a gift box, an envelope, a birthday cake) that informs the user that the payment corresponding to gift payment message object 1718 is a gift. In some embodiments, graphical indication 1724 is a dynamic graphical animation (e.g., an opening gift box, an opening envelope, a birthday cake with lighted candles) that informs the user that the payment corresponding to gift payment message object 1718 is a gift.
[0557] In FIG. 17B, gift payment message object 1718 includes an accompanying note message object 1719 corresponding to a note (e.g., a comment or message) sent by the sender of the gift payment (e.g., message participant 1710). For example, in FIG. 17B, the message corresponding to note message object 1719 accompanying gift payment message object 1718 states “Happy Birthday!,” thus providing further indication (e.g., in addition to graphical indication 1724) that the gift payment is intended as a gift for the user’s birthday.
[0558] In some embodiments, prior to sending of a gift payment (e.g., the payment corresponding to gift payment message object 1718), a payment (e.g., the payment corresponding to gift payment message object 1718) is marked as a gift payment (instead of a regular payment) at the sender’s device (e.g., message participant 1710’s device) in response to detecting user selection of a send gift payment option (e.g., on payment transfer user interface 840 described, for example, in FIG. 8E). In some embodiments, in addition to the send gift payment option, the device provides (e.g., on a payment transfer user interface, such as payment transfer user interface 840, or on a gift options user interface accessible from the payment transfer user interface) a plurality of (dynamic) graphical animations (e.g., a gift box, an envelope, a birthday cake) that can be used for graphical indication 1724 to be applied to the gift payment message object (e.g., gift payment message object 1718) corresponding to the gift payment.
[0559] In FIG. 17C, while displaying gift payment message object 1718 within message conversation 1708, electronic device 1700 detects a user input on (graphical indication 1724 of) gift payment message object 1718. For example, as shown in FIG. 17C, the user input is a tap gesture 1701 on graphical indication 1724. In some embodiments, the user input (e.g., tap gesture 1701) is detected at any region within gift payment message object 1718.
[0560] In FIG. 17D, in response to detecting tap gesture 1701, electronic device 1700 replaces display of graphical indication 1724 hiding the gift (payment) amount with an amount indication 1726 (e.g., corresponding to amount indication 1122 described, for example, in FIG. 11A), thereby revealing the amount of the gift payment, and an accept button 1728 (e.g., corresponding to accept button 1124 described, for example, in FIG. 11A) for accepting the payment sent by message participant 1710 to the user as a gift. The revealing of amount indication 1726 (showing the gift amount) from graphical indication 1724 (hiding the gift amount) in this fashion provides a “surprise” effect to the recipient (e.g., the user) receiving the payment as a gift.
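The hide-then-reveal behavior just described can be summarized as a small state machine: the amount stays hidden behind the gift animation, a first tap reveals the amount indication and accept button, and activating the accept button marks the payment accepted. The sketch below is illustrative only; the class and method names are hypothetical.

```python
class GiftPaymentMessageObject:
    """Sketch of the states walked through in FIGS. 17B-17E for a gift
    payment message object (1718)."""

    def __init__(self, amount: str):
        self.amount = amount
        self.state = "hidden"          # graphical indication 1724 shown

    def tap_graphical_indication(self) -> None:
        # Tap gesture 1701: reveal amount indication 1726 and accept button 1728.
        if self.state == "hidden":
            self.state = "revealed"

    def tap_accept_button(self) -> None:
        # Tap gesture 1703: accept the gift payment.
        if self.state == "revealed":
            self.state = "accepted"

    def displayed_amount(self) -> str:
        # The "surprise" effect: no amount is shown until the recipient taps.
        return self.amount if self.state != "hidden" else "(gift)"
```

Note that tapping the accept button while the amount is still hidden has no effect, matching the two-step reveal-then-accept flow of the figures.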
[0561] In FIG. 17E, while displaying amount indication 1726 and accept button 1728 within gift payment message object 1718, electronic device 1700 detects a user activation of accept button 1728. For example, as shown in FIG. 17E, the user activation is a tap gesture 1703 on accept button 1728 of gift payment message object 1718.
[0562] In some embodiments, as shown in FIG. 17F, in response to detecting tap gesture 1703, in accordance with a determination (e.g., made by electronic device 1700 based on accounts stored or logged into the device or made by an external device, such as a server, storing information about accounts associated with the user of the device) that a payment account associated with the user, which is required to receive and send payments (e.g., via an operating-system controlled transfer application), is not yet activated (e.g., not yet set up, not yet configured), the device displays (e.g., replaces display of message conversation 1708 of messaging application 1706 with) an initial setup notification user interface 1740.
[0563] In some embodiments, initial setup notification user interface 1740 includes a (graphical and/or textual) indication 1742 informing the user that a payment account associated
with the user account (logged into the device and belonging to the user of the device) must be activated (e.g., set up, configured). For example, in FIG. 17F, indication 1742 includes a graphical indication of a payment account and states: “To send and receive payments, please set up your Payment account.” Initial setup notification user interface 1740 also includes a proceed button 1744 for proceeding with activating the payment account. In some embodiments, initial setup notification user interface 1740 also includes a cancel button 1745 for forgoing proceeding with activating the payment account.
[0564] In some embodiments, in accordance with a determination (e.g., made by electronic device 1700 based on accounts stored or logged into the device or made by an external device, such as a server, storing information about accounts associated with the user of the device) that a required payment account associated with the user is already activated (e.g., is already set up, is already configured), the device proceeds with accepting the gift payment sent by message participant 1710 via, for example, the steps described above with reference to FIGS. 11A-11G, causing the received gift payment (e.g., in the amount of $50) to be added to the payment account associated with the user.
[0565] In some embodiments, if the payment account is already activated, the payment corresponding to a payment message object (e.g., gift payment message object 1718) is automatically accepted without any user input (e.g., without tap gesture 1703). In some embodiments, electronic device 1700 proceeds to automatically accept the received payment (without any user input, such as tap gesture 1703, on the payment message object corresponding to the received payment) if an automatic accept option is enabled on the device. In some embodiments, even if the automatic accept option is enabled on the device, the device forgoes automatically accepting a payment if the sender of the payment (e.g., message participant 1710) is not on a contacts list or a trusted contacts list associated with the user account logged onto the device. In some embodiments, even if the payment account is already activated and an automatic accept option is enabled, if electronic device 1700 determines that there is no record of any prior transactions involving the payment account (e.g., if the device determines that the user has not yet received a first payment), the device forgoes automatically accepting the payment in that first instance and instead requires user input (e.g., tap gesture 1703) to accept the payment.
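The guards on automatic acceptance described in the paragraph above reduce to a conjunction: every condition must hold, or the device falls back to requiring an explicit tap. A hypothetical sketch (all parameter names are assumptions for illustration):

```python
def should_auto_accept_payment(account_activated: bool,
                               auto_accept_enabled: bool,
                               sender_in_trusted_contacts: bool,
                               has_prior_transaction_record: bool) -> bool:
    """A received payment is accepted without user input only when the
    payment account is activated, the automatic accept option is enabled,
    the sender is on a contacts or trusted contacts list, and at least one
    prior transaction involving the account exists. Failing any guard, an
    explicit tap on the payment message object is required instead."""
    return (account_activated
            and auto_accept_enabled
            and sender_in_trusted_contacts
            and has_prior_transaction_record)
```

The last guard implements the first-instance exception: even with auto-accept enabled, the very first payment always requires user input.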
[0566] FIG. 17G shows a “Terms and Conditions” page 1746 that is displayed in response to detecting user selection (e.g., a tap gesture) of proceed button 1744 on initial setup notification user interface 1740. As shown in FIG. 17G, “Terms and Conditions” page 1746 includes a textual description 1748 of the (legal) terms and conditions associated with activating a payment account, and requests user confirmation of the user’s understanding of the terms and conditions and agreement with the terms and conditions. “Terms and Conditions” page 1746 includes an agree button 1750 for proceeding with the account activation (and thereby indicating agreement with the terms and conditions) and a disagree button 1752 for forgoing proceeding with the account activation (and thereby indicating non-agreement with the terms and conditions).
[0567] In FIG. 17H, in response to detecting user selection of agree button 1750 on “Terms and Conditions” page 1746, electronic device 1700 displays an account activation user interface 1754. As shown in FIG. 17H, account activation user interface 1754 includes a graphical representation 1756 (e.g., an image representing a card associated with the payment account) of the payment account and a progress indication 1758 (e.g., stating “Activating,” “Setting up your Payment account”) informing the user that activation of the payment account is in progress.
[0568] In some embodiments, as shown in FIG. 17I, electronic device 1700 generates a dynamic feedback animation on graphical representation 1756 of the payment account (e.g., a plurality of moving patterns 1757, a 3D animation of moving patterns/elements) akin to the dynamic visual feedback applied to a completed payment message object (e.g., the dynamic visual feedback applied to completed payment message object 1132 described, for example, in FIG. 11E). In some embodiments, the feedback is a dynamic visual feedback causing display of the graphical representation 1756 to change as changes in the orientation of the device relative to a reference point 1729 are detected, where reference point 1729 is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device (alternatively, in some embodiments, the reference point is a static point external to the device, such as a location on the ground or floor). For example, in FIG. 17I, the dynamic visual feedback is a 3D effect that provides the user with the visual effect that graphical representation 1756 is three-dimensional. Thus, in FIG. 17I, based on reference point 1729 of the user, graphical representation 1756 looks visually different (e.g., shadows behind plurality of moving patterns 1757 change) from angle 1700A of the device and from angle 1700B of the device and, optionally, both the view of graphical representation 1756 from angle 1700A and angle 1700B look different from the appearance of the representation from straight on (e.g., such that the display is not tilted at an angle relative to the face of the viewer, as shown in FIG. 17H). In some embodiments, the dynamic visual feedback is a changing color applied to the graphical representation.
[0569] In some embodiments, in addition to or instead of a dynamic visual feedback, the device generates a dynamic haptic feedback. In some embodiments, the dynamic haptic feedback is a dynamically strengthening and weakening tactile output caused by the device. In some embodiments, the dynamic haptic feedback is a tactile output with changing tactile output patterns caused by the device. In some embodiments, the strength or frequency of the tactile output changes as the device detects changes in the orientation of the device relative to the reference point (e.g., reference point 1729).
[0570] In some embodiments, the generated feedback (e.g., visual feedback, sensory feedback, audio feedback) is caused (e.g., only) by an operating system program of the device and non-operating system programs of the device are not enabled to cause the feedback.
[0571] FIG. 17J shows the account activation process being completed. Thus, in FIG. 17J, progress indication 1758 of account activation user interface 1754 informs the user that activation of the payment account has been successfully completed (e.g., by stating “Activated,” “Your Payment account is ready to use”). FIG. 17K shows account activation user interface 1754 from FIG. 17J, with activation being completed. In some embodiments, as shown in FIG. 17K, the dynamic visual feedback applied to graphical representation 1756 of the payment account, shown and described with reference to FIG. 17I, is maintained after activation of the payment account.
[0572] Following activation of a payment account, FIG. 17L again shows the payment corresponding to gift payment message object 1718 within message conversation 1708 of
messaging application 1706 having been accepted. In particular, because the payment has been accepted (and the gift payment of $50 has been credited to the activated payment account), a dynamic visual effect is applied to amount indication 1726 of gift payment message object 1718 (or to the entire payment message object), where the dynamic visual effect is akin to the visual effect applied to completed payment message object 1132 described above with reference to FIGS. 11D-11E.
[0573] In some embodiments, in response to detecting user selection of completed gift payment message object 1718 shown in FIG. 17L, electronic device 1700 displays (e.g., replaces display of messaging application 1706 with) a transaction detail user interface, corresponding to transaction detail user interface 1134 described above in FIGS. 11G and 11V, that includes a list of details (e.g., a copy of the payment message object, a copy of an accompanying note, payment sender/recipient information, transaction date and time information, information of one or more accounts used in the transaction, etc.). In some embodiments, the transaction detail user interface further includes a wallet button (e.g., a “View in Wallet” selectable indication) for viewing the transaction details in an electronic wallet application of the device. In some embodiments, the transaction detail user interface further includes a return button to return the received payment to the sender (e.g., message participant 1710) of the payment.
[0574] FIGS. 18A-18F are a flow diagram illustrating a method for managing peer-to-peer transfers using an electronic device in accordance with some embodiments. Method 1800 is performed at a device (e.g., 100, 300, 500, 1600, 1700) with a display and one or more input devices (e.g., a touchscreen, a mic, a camera, a biometric sensor). Some operations in method 1800 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0575] As described below, method 1800 provides an intuitive way for managing peer-to-peer transfers. The method reduces the cognitive burden on a user for managing peer-to-peer transfers, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transfers faster and more efficiently conserves power and increases the time between battery charges.
[0576] The electronic device (e.g., 1600, 1700) displays (1802), on the display (e.g., 1602, 1702), a message object (e.g., 1718, a text message, a chat bubble, an open email) in a message conversation (e.g., 1608, 1708) (between a user of the electronic device and a remote user (e.g., 1610, 1710), in a messaging application). The message object (e.g., 1620, 1718) includes (1804) an indication (e.g., 1622, 1724, 1726) of a first one or more items sent from a participant in the conversation to a user of the electronic device (e.g., a specially encrypted message or a payment object that corresponds to a payment from the participant to the user of the device). In some examples, the indication (e.g., 1622, 1726) indicates the first amount of a resource, which can be deposited into an account of the user. Displaying messages in the conversation provides the user with contextual feedback regarding the sender/receiver of messages in the conversation and reduces the need for the user to investigate the sender/receiver for further messages displayed in the conversation. Displaying a message that includes an indication of the items (or quantity of items) provides the user with visual feedback regarding what has been received. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0577] In some examples, while (1806) displaying at least the portion of the message conversation (e.g., 1608, 1708) and prior to (1808) detecting, via the one or more input devices, the input that corresponds to the request to obtain the first one or more items (e.g., an input on an accept affordance for playing the specially encrypted message or accepting the payment), and in accordance with (1810) a determination that the electronic device is associated with an activated account that is authorized to obtain the first content without further user confirmation (e.g., in accordance with a determination that the user has already set up a message decryption account configured to automatically decrypt messages or a peer-to-peer payment account configured to automatically accept payments), the electronic device (e.g., 1600, 1700) proceeds (1812) to obtain the first one or more items without detecting the input that corresponds to the request to obtain the first one or more items (and without requiring any additional user inputs).
Thus, in some examples, the electronic device proceeds to automatically obtain the first content without any user input. In some examples, the electronic device proceeds to automatically obtain the first content without any user input from any participant. In some examples, the electronic device proceeds to automatically obtain the first content without any user input from a participant that is on a contacts list of the user’s device. Automatically obtaining the items without detecting further user input when the device is associated with an activated, authorized account enables the user to more quickly obtain the items. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0578] In some examples, the first one or more items are items of a first type (e.g., encrypted messages or payments). While (1806) displaying at least the portion of the message conversation (e.g., 1608, 1708) and in accordance with (1810) the determination that the electronic device (e.g., 1600, 1700) is associated with the activated account that is authorized to obtain the items of the first type without further user confirmation, and in accordance with (1814) a determination that there is no record of prior obtained items of the first type using the activated account (e.g., in accordance with a determination that the user has already set up a message decryption account but has not yet received an encrypted message, in accordance with a determination that the user has already set up a peer-to-peer payment account but has not yet received a payment), the electronic device: forgoes proceeding (1816) to obtain the first one or more items without detecting the input (e.g., input is required the first time, even if the user has configured the device to automatically decrypt messages/accept payments) and proceeds (1818) to obtain the first content in response to detecting the input that corresponds to the request to obtain the first one or more items.
[0579] In some examples, the first one or more items are items of a first type (e.g., encrypted messages or payments). While (1806) displaying at least the portion of the message
conversation (e.g., 1608, 1708) and in accordance with (1810) the determination that the electronic device (e.g., 1600, 1700) is associated with the activated account that is authorized to obtain the items of the first type without further user confirmation, in accordance with (1820) a determination that there is a record of prior obtained items of the first type using the activated account (e.g., in accordance with a determination that the user has already set up a message decryption account and has already received at least one encrypted message, in accordance with a determination that the user has already set up a peer-to-peer payment account and has already received at least one payment), the electronic device proceeds (1822) to obtain the items of the first type without requiring detection of a user input that corresponds to a request to obtain items of the first type.
[0580] In some examples, while (1806) displaying at least the portion of the message conversation (e.g., 1608, 1708) and prior to (1808) detecting, via the one or more input devices, the input that corresponds to the request to obtain the first one or more items (e.g., an input on an accept affordance for playing the specially encrypted message or accepting the payment), and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items without further user confirmation, the electronic device (e.g., 1600, 1700) displays (1824), on the display (e.g., 1602, 1702), the accept affordance (e.g., an activation affordance requesting or prompting the user to set up a resource account) for activating an account that is authorized to obtain the first one or more items. Displaying the accept affordance when the device is not associated with an activated, authorized account provides the user with feedback regarding the state of the device and enables the user to easily activate an authorized account. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0581] While displaying (1806) at least a portion of the message conversation (e.g., 1608, 1708), the electronic device (e.g., 1600, 1700) detects (1826), via the one or more input devices,
an input that corresponds to a request to obtain the first one or more items. In some examples, the electronic device detects activation of an accept affordance for playing/viewing the specially encrypted message or accepting the payment. In some examples, the first one or more items are items of a first type (e.g., encrypted messages or payments).
[0582] In response (1830) to detecting the input that corresponds to the request to obtain the first one or more items, in accordance with (1832) a determination that the electronic device (e.g., 1600, 1700) is associated with an activated account (of the user) that is authorized to obtain the first one or more items (e.g., in accordance with a determination that the user has already set up a message decryption account or a peer-to-peer payment account), the electronic device proceeds (1834) to obtain the first one or more items.
[0583] In response (1830) to detecting the input that corresponds to the request to obtain the first one or more items, in accordance with (1836) a determination that the electronic device (e.g., 1600, 1700) is not associated with an activated account that is authorized to obtain the first content, the electronic device displays (1838), on the display (e.g., 1602, 1702), a second affordance (e.g., 1630, 1744, 1750, an activation affordance requesting or prompting the user to set up a resource account) for activating an account that is authorized to obtain the first one or more items. In some examples, the second affordance (e.g., 1630, 1744, 1750) is displayed as part of a user interface (e.g., 1626, 1740, 1746) that covers at least a portion of the message user interface (e.g., 1706, 1708, the message transcript). In some examples, the electronic device already has one or more activated accounts; however, the accounts are not authorized to obtain the first content. For example, the accounts are not the right type of accounts or are not enabled to obtain the first content. Automatically displaying an affordance for activating an account when the device determines that an appropriate account is not already activated provides the user with contextual feedback regarding the status of the device and reduces the need for the user to navigate the user interface of the device to activate the account. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage
and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
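The account checks in the preceding blocks (1810/1820–1822 and 1830–1838) can be read as two simple predicates. In this sketch the `Account` fields and function names are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass


@dataclass
class Account:
    activated: bool                  # an activated account exists on the device
    authorized_for_item_type: bool   # authorized to obtain items of this type
    has_prior_obtained_items: bool   # record of a prior obtained item (block 1820)


def should_auto_obtain(account):
    # Blocks 1810/1820-1822: obtain without requiring a user input only when
    # the account is activated, is authorized without further confirmation,
    # and has a record of prior obtained items of the first type.
    return (account.activated
            and account.authorized_for_item_type
            and account.has_prior_obtained_items)


def respond_to_accept_input(account):
    # Blocks 1830-1838: in response to an accept input, proceed to obtain the
    # items with an activated, authorized account; otherwise display the
    # second affordance (e.g., 1630, 1744, 1750) for activating such an account.
    if account.activated and account.authorized_for_item_type:
        return "obtain_items"
    return "display_activation_affordance"
```

A device without a prior record still obtains the items, but only after the accept input; only the combination of all three conditions skips the input entirely.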
[0584] In some examples, in accordance with (1840) a determination that the first content sent from the participant (e.g., 1610, 1710) corresponds to a gift (e.g., an encrypted message, a payment intended to be a surprise to the recipient), the electronic device (e.g., 1600, 1700) displays (1842) a graphical indication (e.g., 1622, 1724, an indication that the message is encrypted, a graphical indication of a wrapped gift box, a graphical indication of a closed envelope) that the first one or more items sent from the participant corresponds to a gift. In some examples, the message object (e.g., 1620, 1718) is displayed at least partially as a graphical representation of a wrapped gift box (e.g., 1724). In some examples, the message object (e.g., 1724) is displayed at least partially as a graphical representation of a closed envelope. In some examples, the graphical indication (e.g., 1620, 1724, an indication that the message is encrypted, a graphical indication of a wrapped gift box, a graphical indication of a closed envelope) applied to the graphical representation of the communication is selected based on a special input on a corresponding pending graphical representation of the communication on an external device (e.g., the device where the communication originated from). For example, the special input is a deep press input having a contact intensity greater than a predetermined threshold intensity (e.g., a deep press intensity threshold). In some examples, in response to detecting a deep press on the corresponding pending graphical representation of the communication on the device (before it is transmitted from the external device to the electronic device), the external device displays a list of one or more graphical indications (a wrapped gift box, a closed envelope) that can be selected and applied to the communication.
[0585] In some examples, in accordance with (1840) a determination that the first content sent from the participant corresponds to a gift (e.g., an encrypted message, a payment intended to be a surprise to the recipient), the electronic device (e.g., 1600, 1700) conceals (1844) (e.g., forgoes) display of an indication of the amount (e.g., 1726) of the first one or more items. Displaying a graphical indication (e.g., 1724) that the item corresponds to a gift without displaying an indication of the amount (e.g., 1726) of the gift provides the user with feedback regarding the state of the content (e.g., that it is a gift) and, optionally, enables the user to reject
the gift without seeing the amount. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0586] In some examples, the electronic device (e.g., 1600, 1700) detects (1846) user selection of the message object (e.g., 1620, 1718). In response to detecting the user selection of the message object (e.g., 1718), the electronic device (e.g., 1600, 1700) (optionally decrypting the message and) displays (1848), on the display, the indication of the amount (e.g., 1726) of the first one or more items. Displaying the indication of the amount (e.g., 1726) of the gift provides the user with feedback regarding the state of the content (e.g., the quantity of the gift). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
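The conceal-then-reveal behavior of blocks 1840–1848 amounts to a small piece of display state logic. The `MessageObject` type and `visible_amount` helper below are assumed names for illustration only:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MessageObject:
    amount: int
    is_gift: bool
    selected: bool = False


def visible_amount(msg: MessageObject) -> Optional[int]:
    # Block 1844: for a gift, conceal the amount indication (e.g., 1726) and
    # show only the gift indication (e.g., wrapped gift box 1724) until the
    # user selects the message object (blocks 1846-1848), which reveals it.
    if msg.is_gift and not msg.selected:
        return None
    return msg.amount
```

A `None` return stands in for "display the gift indication without an amount"; selection flips the state and the amount becomes visible.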
[0587] In some examples, the electronic device (e.g., 1600, 1700) receives (1850) a second input on the message object (e.g., 1620, 1718, a text message, a chat bubble, an open email) in the message conversation (between a user of the electronic device and a remote user). In response to receiving the second input on the message object (e.g., 1620, 1718), the electronic device displays (1852), on the display (e.g., 1602, 1702), a details user interface including information (e.g., encryption/decryption information, the amount of content, the amount of payment, information related to the participant, a time and date, a note/comment relating to the obtaining of the first content) associated with the message object.
[0588] In some examples, the first one or more items are items of a first type (e.g., encrypted messages or payments). In some examples, in accordance with a determination that obtaining the first one or more items moves a total number of prior transfers of items of the first type (e.g.,
obtaining/receiving of encrypted messages or payments, transmission/sending out of encrypted messages or payments) associated with the activated account over a predetermined limit, the electronic device (e.g., 1600, 1700) displays (1854), on the display (e.g., 1602, 1702), a verification user interface (e.g., as described below with reference to FIGS. 31A-31M) corresponding to a request to verify identity of the user associated with the activated account. In some examples, when the user attempts to accept (or send) funds that would cause the total amount of funds accepted (or sent) over a certain period (or total) to exceed a threshold amount, a verification user interface is displayed to enable the user to verify the user’s identity, such as by taking a picture of an identification (e.g., government issued identification).
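The limit check in block 1854 reduces to a running-total comparison; per the later passage ([0592]), only transfers associated with obtaining items count toward the total. The function name and parameters here are assumptions:

```python
def needs_identity_verification(prior_received, incoming, limit):
    # Block 1854: display the verification user interface when obtaining the
    # incoming items would move the total of prior *received* transfers over
    # the predetermined limit. Sent transfers are excluded per [0592].
    return sum(prior_received) + incoming > limit
```

For example, with $300 of prior received transfers and a $500 limit, accepting a $250 payment would trigger the verification user interface, while a $150 payment would not.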
[0589] In some examples, the electronic device (e.g., 1600, 1700) includes an application that is configured to manage (e.g., handle) the first one or more items. In some examples, the application is configured to decrypt encrypted messages. In some examples, the application is configured to encrypt messages. In some examples, the application is configured to manage a payment account to receive/send payments. In some examples, although an application that can handle the first content is available on the device, an activated account is also required to obtain the first content.
[0590] In some examples, prior to displaying, on the display (e.g., 1602, 1702), the message object (e.g., 1620, 1718) in the message conversation (e.g., 1608, 1708) (e.g., prior to the user viewing the decrypted message, prior to the user viewing the payment), and in accordance with a determination that the first one or more items sent from the participant (e.g., 1610, 1710) corresponds to a gift (e.g., an encrypted message, a payment intended to be a surprise to the recipient), the electronic device displays, on the display (e.g., 1602, 1702), a notification (e.g., 1732) (e.g., a pop-up notification, a notification banner) of the first content received from the participant that does not include display of an indication of the amount of the first one or more items. In some examples, prior to displaying, on the display, the message object (e.g., 1718) in the message conversation (e.g., 1608, 1708) (e.g., prior to the user viewing the decrypted message, prior to the user viewing the payment), in accordance with a determination that the first one or more items sent from the participant does not correspond to a gift, the electronic device displays, on the display (e.g., 1602, 1702), a notification of the first one or more items received
from the participant that includes display of the indication of the amount of the first one or more items. In some examples, the notification (e.g., 1732) is displayed on a home screen of the device. In some examples, the notification (e.g., 1732) is displayed on a lock screen (e.g., 1730) of the device. In some examples, the notification (e.g., 1732) is displayed over a user interface of an application running on the device. Displaying a notification (e.g., 1732) without displaying an indication of the amount when the notification relates to a gift provides the user with feedback regarding the state of the content (e.g., that it is a gift) and, optionally, enables the user to reject the gift without seeing the amount. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
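The notification rule in [0590] (omit the amount for a gift, include it otherwise) might be sketched as a simple formatter; the wording and function name are illustrative, not from the disclosure:

```python
def notification_text(sender, amount, is_gift):
    # Gift notifications (e.g., 1732): no indication of the amount, so the
    # recipient can accept or reject without seeing it.
    if is_gift:
        return f"{sender} sent you a gift"
    # Non-gift transfers: the notification includes the amount indication.
    return f"{sender} sent you {amount}"
```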
[0591] In some examples, the message object (e.g., 1620, 1718) includes an accept affordance (e.g., 1624, 1728, an accept affordance for playing the specially encrypted message or accepting the payment), and the input that corresponds to the request to obtain the first one or more items comprises an input on the accept affordance.
[0592] In some examples, the first one or more items are items of a first type (e.g., encrypted messages or payments), and the total amount (e.g., number or aggregate quantity) of prior transfers of items of the first type associated with the activated account includes only prior transfers of items of the first type associated with an obtaining (or receiving) of items of the first type (e.g., receiving a payment, a funding of the activated account by the user of the account) (e.g., by an account associated with a user of the electronic device), and does not include prior transfers of items of the first type associated with a transmission (or sending out) of items of the first type (e.g., from an account associated with a user of the electronic device to another user such as a user of an external device).
[0593] Note that details of the processes described above with respect to method 1800 (e.g., FIGS. 18A-18F) are also applicable in an analogous manner to the methods described herein. For example, method 1800 optionally includes one or more of the characteristics of the various
methods described herein with reference to methods 900, 1200, 1500, 2100, 2400, 2700, 3000, and 3400. For example, concurrently displaying the representation of a message and a selectable indication that corresponds to a type of item (being transferred, such as a photo, sticker, resources, or a payment), as described in method 900, can be applied with respect to the message object (e.g., 1718). For another example, the outputting of dynamic feedback described in method 1200 can be applied with respect to the message object (e.g., 1718). For another example, the different visual appearances of a message object based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be applied with respect to the message object (e.g., 1718). For another example, displaying representations of a first account and a second account, as described in method 2100, can also be displayed when authenticating / confirming an outgoing transfer of a gift analogous to the gift message object (e.g., 1718). For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can also be used to accept the contents of an incoming transfer corresponding to a (gift) payment object (e.g., 1718). For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can be displayed in response to user selection of the (gift) message object (e.g., 1718). For another example, an utterance can be used, as described in method 3000, to accept a gift corresponding to the (gift) message object (e.g., 1718) or to create an outgoing (gift) message object analogous to the message object (e.g., 1718).
For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element of a message object (e.g., 1726) when a transfer (e.g., of a resource, of a file, of a payment) associated with the message object is completed. For brevity, these details are not repeated below.
[0594] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 18A-18F are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, displaying operation 1802, detecting operation 1826, proceeding operation 1834, and displaying operation 1838 are, optionally, implemented by event sorter 170, event
recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
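The path described above (event sorter 170 → event recognizer 180 → event handler 190) follows a common dispatch pattern. This toy analogue is not the implementation from the disclosure, just a sketch of the pattern:

```python
class EventRecognizer:
    # Analogue of event recognizer 180: a predicate (cf. event definitions
    # 186) paired with a handler (cf. event handler 190).
    def __init__(self, matches, handler):
        self.matches = matches
        self.handler = handler

    def try_handle(self, event):
        # Compare the event information to the event definition; activate the
        # handler only when the predefined event is detected.
        if self.matches(event):
            self.handler(event)
            return True
        return False


def dispatch(event, recognizers):
    # Analogue of event sorter 170 / event dispatcher module 174: deliver the
    # event information to the application's recognizers until one handles it.
    return any(r.try_handle(event) for r in recognizers)
```

A recognizer for object selection would match, say, a tap event at a location inside the object's bounds and then call a handler that updates application state.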
[0595] FIGS. 19A-19D illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 19A-19D relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 20A-20J, which in turn are used to illustrate the processes described below, including the processes in FIGS. 21A-21D.
[0596] In FIG. 19A, electronic device 1900 displays, on display 1902, a user interface locked screen 1906. In some embodiments, a user interface locked screen is displayed when the device is in a user interface locked state (e.g., a state where one or more functions of the operating system is prohibited from use by a user (e.g., “Kate Appleseed”) of the device). In some embodiments, user interface locked screen 1906 includes an indication 1908 (e.g., stating “Locked”) that the device is in the user interface locked state.
[0597] In some embodiments, while electronic device 1900 remains in the user interface locked state, the device receives, via the wireless transmission device, a signal from an external device. In some embodiments, the external device is a near field communication (NFC) terminal. In some embodiments, the external device is a user device (e.g., a smartphone, a smartwatch) different from electronic device 1900. In some embodiments, the signal from the external device
corresponds to a request for identification credentials (associated with the user of the device) for providing identification information from electronic device 1900 to the external device.
[0598] In FIG. 19B, in response to receiving the request for identification credentials, electronic device 1900 displays, on display 1902, an identifications user interface 1910. In some embodiments, as shown in FIG. 19B, identifications user interface 1910 includes, at a first location (e.g., a top-half portion of the interface), a graphical representation 1912 of a default identification (e.g., a general identification card, such as a driver’s license) stored on the device. In some embodiments, the identification (or two or more identifications) that is located at the first location of identifications user interface 1910 is the identification that is currently selected for use to provide identification information.
[0599] In some embodiments, as also shown in FIG. 19B, identifications user interface 1910 includes, at a second location (e.g., a bottom portion of the interface), graphical representations 1916 of one or more identifications stored on the device other than the identification corresponding to graphical representation 1912. In some embodiments, the one or more identifications stored on the device include a limited use identification card (e.g., an identification card that has a limited number of available uses for providing identification information).
[0600] In some embodiments, as shown in FIG. 19B, while maintaining display of graphical representation 1912 of the default identification (e.g., a driver’s license), electronic device 1900 displays a larger portion of a graphical representation 1918 corresponding to the limited use identification card within graphical representations 1916 of the identifications. In some embodiments, graphical representation 1918 slides up (e.g., after a predefined amount of time from when identifications user interface 1910 is first displayed) from graphical representations 1916 to display the larger portion. As also shown in FIG. 19B, graphical representation 1918 of the limited use identification card includes a limit indication 1922 (e.g., stating “5 uses remaining”) corresponding to the available number of uses remaining on the limited use identification card. Limit indication 1922 provides the user with a reminder of the remaining number of uses for which the limited use identification card (corresponding to graphical representation 1918) can be used to provide identification information. Graphical representation 1918 of the
limited use identification card also includes an indication (e.g., stating “Limited use ID card”) that the representation corresponds to an identification that is a limited use identification.
[0601] In FIG. 19C, while displaying the larger portion of graphical representation 1918 of the limited use identification card, electronic device 1900 detects a user input on graphical representation 1918. For example, as shown in FIG. 19C, the user input is a tap gesture 1901 on graphical representation 1918.
[0602] As shown in FIG. 19D, in response to detecting tap gesture 1901, electronic device 1900 replaces display of graphical representation 1912 of the general use identification card with graphical representation 1918 of the limited use identification card at the first location of the identifications user interface (and graphical representation 1912 of the default identification becomes part of graphical representations 1916 of the one or more other identifications). In some embodiments, as shown in FIG. 19D, graphical representation 1918 slides up from its location within graphical representations 1916 as it is replacing graphical representation 1912 at the first location. In some embodiments, graphical representation 1912 slides down from the first location towards graphical representations 1916.
[0603] In some embodiments, as shown in FIG. 19D, similar to graphical representation 1632 of a decrypting account described above with reference to FIGS. 16A-16F, graphical representation 1918 of the limited use identification card includes a plurality of moving patterns 1924 which can provide dynamic feedback (e.g., a 3D animation of the moving patterns). Further, as also shown in FIG. 19D, graphical representation 1918 maintains display of indication 1920 that the identification corresponds to a limited use identification and limit indication 1922 while displayed at the first location of the identification user interface. Once having fully replaced display of graphical representation 1912 of the default identification at the first location, the limited use identification corresponding to graphical representation 1918 is ready for use in providing requested identification information.
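The selection and limit bookkeeping in FIGS. 19B–19D reduce to reordering a list of representations plus decrementing a counter. The `Identification` type and function names here are assumed purely for illustration:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Identification:
    name: str
    remaining_uses: Optional[int] = None  # None = no use limit (default ID)


def select_identification(stack, index):
    # FIGS. 19C-19D: the tapped representation (e.g., 1918) moves to the
    # first location; the previous front card joins the other representations.
    stack.insert(0, stack.pop(index))
    return stack[0]


def use_identification(ident):
    # Decrement the limit indication (e.g., 1922, "5 uses remaining") each
    # time the limited use identification provides identification information.
    if ident.remaining_uses is not None:
        if ident.remaining_uses <= 0:
            raise ValueError("no uses remaining")
        ident.remaining_uses -= 1
    return ident.remaining_uses
```

The same reordering applies to the wallet user interface of FIGS. 20C–20G, where the payment account representation replaces the default account at the first location.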
[0604] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 19A-19D described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 20A-20J described below. Therefore, it is to be
understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 19A-19D and the processes described below with respect to the exemplary user interfaces for exchanging an account illustrated in FIGS. 20A-20J are largely analogous processes that similarly involve managing transfers using an electronic device (e.g., 100, 300, 500, 1900, or 2000).
[0605] FIGS. 20A-20J illustrate exemplary user interfaces for exchanging an account for use in a transfer, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 21A-21D.
[0606] FIG. 20A illustrates an electronic device 2000 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 20A-20J, electronic device 2000 is a smartphone. In other embodiments, electronic device 2000 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 2000 has a display 2002, a wireless transmission device, and one or more input devices (e.g., a touchscreen of display 2002, a mechanical button 2004, a microphone).
[0607] In FIG. 20A, electronic device 2000 displays, on display 2002, a user interface locked screen 2016. In some embodiments, a user interface locked screen is displayed when the device is in a user interface locked state (e.g., a state where one or more functions of the operating system is prohibited from use by a user (e.g., “Kate Appleseed”) of the device). In some embodiments, user interface locked screen 2016 includes an indication 2018 that the device is in the user interface locked state.
[0608] In FIG. 20B, while electronic device 2000 remains in the user interface locked state, the device receives, via the wireless transmission device, a signal from an external device 2020. In some embodiments, external device 2020 is a near field communication (NFC) terminal (e.g., for making payment transactions). In some embodiments, external device 2020 is a point-of-sale (POS) terminal (e.g., for making payment transactions). In some embodiments, external device 2020 is a user device (e.g., a smartphone, a smartwatch) different from the electronic device 2000.
[0609] In FIG. 20B, the signal from external device 2020 (e.g., an NFC terminal, a POS terminal) corresponds to a request for payment credentials (associated with the user of the device) for making a payment to be transmitted from electronic device 2000 to external device 2020. In some embodiments, as shown in FIG. 20B, the device displays, on display 2002, user interface locked screen 2016 when the signal from external device 2020 is received. In some embodiments, the display 2002 of the device is off when the signal from external device 2020 is received.
[0610] In FIG. 20C, in response to receiving the request for payment credentials from external device 2020, electronic device 2000 displays, on display 2002, a wallet user interface 2022. In some embodiments, as shown in FIG. 20C, wallet user interface 2022 includes, at a first location (e.g., a top-half portion of the interface), a graphical representation 2024 of a default account (e.g., a payment account corresponding to a stored-value account, a payment account corresponding to a debit account, a payment account corresponding to a checking account) provisioned on the device 2000. In some embodiments, the account (or two or more accounts) that is located at the first location of wallet user interface 2022 is the account that is currently selected for use in a payment transaction.
[0611] In some embodiments, as also shown in FIG. 20C, wallet user interface 2022 includes, at a second location (e.g., a bottom portion of the interface), graphical representations 2026 of one or more accounts provisioned on the device other than the account corresponding to graphical representation 2024. For example, the one or more accounts provisioned on the device can include an operating system-controlled payment account, a debit card account, a checking account, a credit account, and a (loyalty) points card account. In some embodiments, as shown in FIG. 20C, each graphical representation of an account within graphical representations 2026 is only partially visible on wallet user interface 2022. In some embodiments, graphical representations 2026 include a (partial) graphical representation 2030 corresponding to an operating system-controlled payment account (e.g., an account corresponding to the payment account associated with graphical representation 1756 described above with reference to, for example, FIGS. 17H-17K). In some embodiments, the payment account is a unique operating system-controlled and managed account.
[0612] In some embodiments, subsequent to receiving the request for payment credentials from external device 2020 (and displaying wallet user interface 2022), electronic device 2000 is moved (e.g., by the user) away from the external device such that the signal from the external device is no longer detected. In some embodiments, subsequent to receiving the request for payment credentials from external device 2020 (and displaying wallet user interface 2022), the device is maintained (e.g., by the user) near external device 2020 such that the signal from the external device continues to be detected.
[0613] In some embodiments, as shown in FIG. 20C, wallet user interface also displays an indication 2028 (e.g., graphical and/or textual) informing the user of an authentication method for authorizing a transaction using an account provisioned on electronic device 2000. For example, in FIG. 20C, indication 2028 (e.g., depicting a graphical representation of a fingerprint and stating “Pay with Fingerprint”) informs the user that fingerprint authentication can be used to authorize a transaction.
[0614] FIG. 20D shows wallet user interface 2022, while maintaining display of graphical representation 2024 of the default account, displaying a larger portion of graphical representation 2030 corresponding to the payment account. In some embodiments, the larger portion of graphical representation 2030 corresponding to the payment account is displayed after a predetermined time (e.g., 0.3 second, 0.5 seconds, 1 second) has passed since first receiving the signal from external device 2020. In some embodiments, the larger portion of graphical representation 2030 corresponding to the payment account is displayed when wallet user interface 2022 is first displayed in response to receiving the signal from external device 2020.
[0615] In some embodiments, graphical representation 2030 slides up from graphical representations 2026 to display the larger portion (as shown in FIG. 20D). As also shown in FIG. 20D, graphical representation 2030 includes a balance indication 2032 corresponding to the available balance of the payment account (corresponding to graphical representation 2030). Balance indication 2032 provides the user with a reminder of the available balance of the payment account (corresponding to graphical representation 2030) when the larger portion of graphical representation 2030 is displayed.
[0616] In FIG. 20E, while displaying the larger portion of graphical representation 2030 of the payment account, electronic device 2000 detects a user input on graphical representation 2030. For example, as shown in FIG. 20E, the user input is a tap gesture 2001 on graphical representation 2030.
[0617] As shown in FIGS. 20F-20G, in response to detecting tap gesture 2001, electronic device 2000 replaces display of graphical representation 2024 corresponding to the default account with graphical representation 2030 of the payment account at the first location of wallet user interface 2022 (and graphical representation 2024 of the default account becomes part of graphical representations 2026 of the one or more accounts). In some embodiments, as shown in FIG. 20F, graphical representation 2030 slides up from its location within graphical representations 2026 (e.g., as shown in FIG. 20D) as it is replacing graphical representation 2024 at the first location. In some embodiments, graphical representation 2024 slides down from the first location towards graphical representations 2026. As shown in FIG. 20F, the device maintains display of balance indication 2032 on graphical representation 2030 as it slides up on the display.
[0618] As mentioned above, in some embodiments, the payment account associated with graphical representation 2030 corresponds to the payment account associated with graphical representation 1756 described above with reference to FIGS. 17H-17K. As with graphical representation 1756, graphical representation 2030 includes a plurality of moving patterns 2034 corresponding to plurality of patterns 1757 of graphical representation 1756. Thus, electronic device 2000 generates a dynamic feedback animation on graphical representation 2030 of the payment account (e.g., a 3D animation of the moving patterns) akin to the dynamic visual feedback applied to a completed payment message object as described, for example, in FIG. 11E.
[0619] FIG. 20G shows, in response to detecting tap gesture 2001, graphical representation 2030 of the payment account having fully replaced display of graphical representation 2024 of the default account at the first location of wallet user interface 2022 and graphical representation 2024 having replaced graphical representation 2030 within graphical representations 2026 of the one or more other accounts.
[0620] FIG. 20H shows electronic device 2000 (again) in communication, via the wireless transmission radio, with external device 2020. In some embodiments, if the device had been moved (e.g., in FIG. 20C) away from external device 2020, the device in FIG. 20H again receives the signal (e.g., corresponding to a request for payment credentials) from the external device (e.g., by being placed close to external device 2020). In some embodiments, if the device had been maintained near external device 2020, the device in FIG. 20H continues to receive the signal (e.g., corresponding to a request for payment credentials) from the external device.
[0621] In FIG. 20H, while displaying wallet user interface 2022 with graphical representation 2030 of the payment account located at the first location of the interface (and thus the payment account is currently selected for use in a payment transaction), electronic device 2000 receives a user input 2003 corresponding to the authentication request indicated in indication 2028. For example, as shown in FIG. 20H, indication 2028 (e.g., stating “Pay with Fingerprint”) requests fingerprint authentication, and thus the user input is a fingerprint scan input 2003 on a fingerprint sensor (e.g., a mechanical button 2004) of the device.
[0622] FIG. 20I shows, via indication 2028 (e.g., stating “Payment Complete”), that the fingerprint authentication was successful, and thus the payment transaction has been completed using the payment account associated with graphical representation 2030. In some embodiments, authentication (e.g., fingerprint authentication) is successful if the received authentication information (e.g., fingerprint scan input 2003) is consistent with enrolled authentication information (e.g., enrolled fingerprint authentication information) stored on the device (or accessible, via an external server, by the device). In some embodiments, if the authentication is not successful (e.g., because the fingerprint information obtained from fingerprint scan input 2003 is not consistent with an enrolled fingerprint authentication information), electronic device 2000 requests that the user try inputting the requested authentication information again or cancels the payment transaction with external device 2020.
[0623] In FIG. 20I, because the payment account (corresponding to graphical representation 2030) was the account currently-selected for use in a payment transaction, the successful payment transaction (e.g., indicated by indication 2028) was performed with funds from the
payment account (which, as indicated by balance indication 2032, had funds in the amount of $30) instead of funds from, for example, the default account associated with graphical representation 2024 that, prior to tap gesture 2001, had been the currently-selected account for use in a payment transaction. In some examples, electronic device 2000 updates balance indication 2032 within graphical representation 2030 of the payment account to reflect the amount of funds (e.g., “$10.00”) that was withdrawn from (or taken out of) the payment account to fund the successful transaction and displays the updated balance indication 2032 concurrently with successful payment indication 2028.
[0624] FIG. 20J shows wallet user interface 2022 displaying transaction summary information 2036 following the successful payment transaction (using the payment account). In some embodiments, transaction summary information includes an indication 2038 of the other party (e.g., a business, a restaurant, a different non-business individual) and/or location (e.g., an address, a city) of the transaction. For example, in FIG. 20J, the current transaction was with Sandwich Shop in San Francisco, California. In some embodiments, transaction summary information includes an indication 2040 of the transaction amount (e.g., “$10.00”). In some embodiments, transaction summary information includes an indication 2044 of the account (e.g., the payment account) that was used in the transaction, and an indication 2046 of the amount of funds (e.g., “$10.00”) that was taken out of the account corresponding to indication 2044 to fund the transaction.
[0625] As also shown in FIG. 20J, subsequent to the successful transaction (in the amount of $10.00), electronic device 2000 updates balance indication 2032 within graphical representation 2030 of the payment account to reflect the amount of funds (e.g., “$10.00”) that was withdrawn from (or taken out of) the payment account to fund the successful transaction. For example, in FIG. 20J, because balance indication 2032 showed $30 prior to the successful transaction, and the amount of the transaction was $10.00 (as indicated by indication 2046), the device updates balance indication 2032 to show a post-transaction amount of $20.
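The balance-indication update described above is simple subtraction; the following sketch illustrates it with a hypothetical helper (the function name and integer-cents representation are illustrative, not from the specification):

```python
def updated_balance(balance_cents: int, amount_cents: int) -> int:
    """Return the post-transaction balance shown by a balance indication
    after a transaction of amount_cents is funded from the account."""
    if amount_cents > balance_cents:
        raise ValueError("insufficient funds in the payment account")
    return balance_cents - amount_cents

# As in FIG. 20J: a $30.00 balance less a $10.00 transaction leaves $20.00.
print(updated_balance(3000, 1000))  # → 2000
```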
[0626] FIGS. 21A-21D are a flow diagram illustrating a method for exchanging an account for use in a transfer using an electronic device in accordance with some embodiments. Method 2100 is performed at a device (e.g., 100, 300, 500, 1900, 2000) with a display, a wireless
transmission device, and one or more input devices (e.g., a touchscreen, a mic, a camera, a biometric sensor). Some operations in method 2100 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0627] As described below, method 2100 provides an intuitive way for managing peer-to-peer transactions. The method reduces the cognitive burden on a user for managing peer-to-peer transactions, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transactions faster and more efficiently conserves power and increases the time between battery charges.
[0628] The electronic device (e.g., 1900, 2000) receives (2102) (e.g., via the wireless transmission device) a request (e.g., a user input on the electronic device, a signal from an external device) to provide restricted credentials (e.g., identification credentials, or payment credentials) associated with a user of the device via the wireless transmission device to an external device (e.g., 2020, a point-of-sale (POS) terminal, a smartphone or smartwatch different from the electronic device). In some examples, the request is to transmit the credentials via the wireless transmission device. In some examples, providing the restricted credentials to the external device includes transmitting, via the wireless transmission device, the credentials to the external device (e.g., 2020).
[0629] In some examples, the electronic device (e.g., 1900, 2000) includes a secure element (e.g., 115) and the restricted credentials (e.g., for the first account and the second account) are stored (2104) in the secure element of the electronic device. In some examples, the restricted credentials are (or include) payment information. In some examples, the secure element provides (or releases) payment information (e.g., an account number and/or a transaction-specific dynamic security code). In some examples, the secure element provides (or releases) the payment information in response to the device receiving authorization, such as a user authentication (e.g., fingerprint authentication; passcode authentication; detecting double-press of a hardware button when the device is in an unlocked state, and optionally, while the device has been continuously on a user’s wrist since the device was unlocked by providing authentication credentials to the device, where the continuous presence of the device on the user’s wrist is determined by periodically checking that the device is in contact with the user’s
skin). For example, the device detects a fingerprint at a fingerprint sensor (e.g., a fingerprint sensor integrated into a button) of the device. The device determines whether the fingerprint is consistent with a registered fingerprint. In accordance with a determination that the fingerprint is consistent with the registered fingerprint, the secure element provides (or releases) payment information. In accordance with a determination that the fingerprint is not consistent with the registered fingerprint, the secure element forgoes providing (or releasing) payment information.
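The gating just described can be modeled minimally as follows; this is an illustrative sketch only, in which a simple membership check stands in for the device's actual fingerprint-matching, and all names are hypothetical:

```python
class SecureElementSketch:
    """Illustrative model (not the device's implementation) of a secure
    element that releases payment information only when the received
    fingerprint is consistent with an enrolled fingerprint."""

    def __init__(self, enrolled_fingerprints, payment_info):
        self.enrolled = set(enrolled_fingerprints)
        self._payment_info = payment_info

    def release_payment_info(self, scanned_fingerprint):
        # Consistent with an enrolled fingerprint -> release credentials.
        if scanned_fingerprint in self.enrolled:
            return self._payment_info
        # Otherwise the secure element forgoes releasing payment information.
        return None

se = SecureElementSketch({"fp-user-1"}, {"account": "A1", "dynamic_code": "123"})
print(se.release_payment_info("fp-user-1"))   # credentials released
print(se.release_payment_info("fp-unknown"))  # → None
```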
[0630] In some examples, the received request to provide restricted credentials associated with the user of the device via the wireless transmission device to the external device (e.g., 2020) is an input (e.g., a tap input 2303, a voice input, an input on a button (e.g., 1904, 2304, 2356) of the device) from the user of the device. In some examples, the input from the user is a double press of a button (e.g., 1904, 2304, a home button) of the device. In some examples, the input from the user is a double press of a power button of the device.
[0631] In some examples, the external device (e.g., 2020) is a contactless terminal (e.g., a transaction terminal, a POS terminal 2020, a NFC payment terminal). In some examples, the received request to provide restricted credentials associated with the user of the device via the wireless transmission device to the external device (e.g., 2020) is a signal from the contactless terminal (e.g., 2020). In some examples, the electronic device (e.g., 1900, 2000) is placed within range of the contactless terminal (e.g., a contactless payment terminal, 2020) and receives (e.g., via NFC) a request for payment.
[0632] In response (2110) to receiving the request to provide the restricted credentials, the electronic device (e.g., 1900, 2000) concurrently displays, on the display a representation of a first account (2112) (e.g., 2024) associated with first restricted credentials (e.g., a default user identification account, a default resource account, a default points account, a debit card account, a credit card account) at a first location (e.g., a prominently-visible portion of the display, such as a region at or near the center of the display) of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion (e.g., a top portion, a top portion without a bottom portion, a first portion with a second portion) of a representation of a second account (2114) (e.g., 1918, 2030) associated with second restricted credentials (e.g., an alternative identifier) at a second location (e.g., a corner or edge of the display, such as the
bottom edge of the display) of the display. The display of at least the portion of the representation of the second account includes display of a usage metric (e.g., 1922, 2032, usage limit, available resources) for the second account (e.g., an amount of time that the alternate identifier is available for use, a number of uses that the alternative identifier is available for use, a quantity of currency available for use in the payment account, stored in the account or associated with the account). In some examples, the representation of the second resource account (e.g., 1918, 2030) is only partially displayed on the bottom of the display such that the indication of the available resources (e.g., 1922, 2032) is visible in a top-right corner or a top-left corner of the displayed account representation. Concurrently displaying representations of multiple accounts at different locations with a usage metric (e.g., 1922, 2032) on the display provides the user with (location-based) visual feedback about the state of the accounts, such as whether they are selected for use or available for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0633] In some examples, in accordance with a determination that the signal from the contactless terminal (e.g., 2020) is detected for at least a second predetermined amount of time (and alternatively, or in addition, in accordance with a determination that the device has not been removed from within range of the contactless terminal for the second predetermined period of time), the electronic device (e.g., 1900, 2000) proceeds (2116) (e.g., automatically) with providing the restricted credentials using the first account.
[0634] In some examples, in accordance with a determination that the signal from the contactless terminal (e.g., 2020) is detected for less than the second predetermined amount of time (and alternatively, or in addition, in accordance with a determination that the device has been removed from within range of the contactless terminal before at least the second predetermined period of time), the electronic device (e.g., 1900, 2000) forgoes proceeding (2118) with providing (e.g., to the contactless terminal, via wireless communication) the
restricted credentials using the first account. Thus, the electronic device provides the user an opportunity to switch from using the first account to using the second account for providing the restricted credentials, such as in a payment transaction with the contactless terminal (e.g., 2020). Forgoing automatically proceeding with the first account when the device is placed into range (e.g., within RF range) of the contactless terminal for less than the second predetermined period of time enables the user to view the status of the account as the device is placed into range of the contactless terminal and provides the user with the control and time to withdraw the device from range of the contactless terminal to change the selected account, thereby helping to avoid use of undesired or unintended accounts. Providing additional control enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
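The dwell-time behavior in the two preceding paragraphs reduces to a threshold comparison; a sketch, with an assumed 1-second default standing in for the unspecified "second predetermined amount of time":

```python
def should_auto_proceed(signal_detected_seconds: float,
                        threshold_seconds: float = 1.0) -> bool:
    """Proceed automatically with the currently selected (first) account only
    if the contactless terminal's signal has been detected for at least the
    predetermined time; otherwise forgo proceeding so the user can still
    switch accounts. The 1-second default is illustrative, not from the
    specification."""
    return signal_detected_seconds >= threshold_seconds

print(should_auto_proceed(1.5))  # → True  (proceed with the first account)
print(should_auto_proceed(0.4))  # → False (user may still switch accounts)
```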
[0635] The electronic device (e.g., 1900, 2000) detects, via the one or more input devices, user selection (e.g., a touch gesture, such as a tap 2001, on a portion of the displayed second resource account by a user, a voice selection through a digital assistant) of the representation of the second account (e.g., 1918, 2030). In response (2122) to detecting the user selection of the representation of the second account (e.g., 1918, 2030), the electronic device optionally proceeds to one or more of blocks 2124-2130.
[0636] In some examples, the electronic device (e.g., 1900, 2000) replaces (2124) display of the at least a portion of the representation of the second account (e.g., 1918, 2030) with display of at least a portion of the representation of the first account (e.g., 1912, 2024), and the electronic device selects (2126) the second account for use in providing the restricted credentials while maintaining selection of the first account for concurrent use in providing the restricted credentials (e.g., both the first account and the second account are partially used in providing the restricted credentials, both the first account and the second account are partially used in a payment transaction). In some examples, when the second account is selected for use in providing the restricted credentials, the second account will, in some circumstances, not have
sufficient funds for the payment and, accordingly, the device provides payment using both the first account and the second account. In some examples, as described below with reference to the first resource account and the second resource account in method 2400 of FIGS. 24A-24C, the electronic device provides all available funds of the second account and provides funds from the first account for the outstanding portion of the payment.
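The split-funding behavior described above (all available funds from the second account, the outstanding portion from the first account) can be sketched as follows; function and parameter names are hypothetical, and amounts are plain integers for illustration:

```python
def split_payment(amount: int, second_balance: int, first_balance: int):
    """Draw as much as possible from the selected second account and cover
    the outstanding portion from the first account. Returns the two
    contributions as (from_second, from_first)."""
    from_second = min(amount, second_balance)
    from_first = amount - from_second
    if from_first > first_balance:
        raise ValueError("combined accounts cannot fund the payment")
    return from_second, from_first

# A $25 payment against a second account holding $20: $20 + $5 outstanding.
print(split_payment(25, 20, 100))  # → (20, 5)
```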
[0637] In response (2122) to detecting the user selection of the representation of the second account (e.g., 1918, 2030), the electronic device (e.g., 1900, 2000) replaces (2128) display of the representation of the first account (e.g., 1912, 2024) with the representation of the second account (e.g., 1918, 2030) at the first location of the display. Changing the locations on the display of the various accounts provides the user with (location-based) visual feedback about the updated states of the accounts, such as whether they are selected for use or available for use, and provides the user with visual feedback that the input they provided has changed the account selected for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0638] In response (2122) to detecting the user selection of the representation of the second account (e.g., 1918, 2030), the electronic device (e.g., 1900, 2000) selects the second account for use in providing the restricted credentials (e.g., preparing to use the alternative identification by making the alternative identification credentials available via the wireless transmission device, or by preparing to use the payment account by making the payment account available via the wireless transmission device). In some examples, the electronic device also deselects the first account for use in providing the restricted credentials when the electronic device selects the second account for the use. In some examples, the electronic device also does not deselect the first account for use in providing the restricted credentials when the electronic device selects the second account for the use.
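The selection behavior of the two paragraphs above, where the tapped account moves to the first location and the previously selected account joins the other accounts, can be modeled as a list swap; this is an illustrative sketch under assumed names, not the device's implementation:

```python
class WalletSketch:
    """Hypothetical model in which the account at index 0 occupies the
    'first location' and is the one selected for providing restricted
    credentials; the remaining entries model the stack of other accounts."""

    def __init__(self, accounts):
        self.accounts = list(accounts)

    @property
    def selected(self):
        return self.accounts[0]

    def select(self, name):
        # Move the tapped account to the first location; the previously
        # selected account takes its former place among the other accounts.
        i = self.accounts.index(name)
        self.accounts[0], self.accounts[i] = self.accounts[i], self.accounts[0]

w = WalletSketch(["default", "payment", "other"])
w.select("payment")
print(w.selected)  # → payment
```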
[0639] In some examples, subsequent to selecting the second account for use in providing the restricted credentials, the electronic device (e.g., 1900, 2000) proceeds (2132) with providing (e.g., by transmitting, using the wireless transmission device) the restricted credentials using the second account. In some examples, the electronic device updates (2134) display of the usage metric (e.g., 1922, 2032, usage limit, available resources) for the second account to reflect the change in the usage metric caused by providing the restricted credentials using the second account (e.g., the amount of time that the alternate identifier is available for use is decreased, the number of uses that the alternative identifier is available for use is decreased, the quantity of currency available for use in the payment account stored in the account or associated with the account is decreased). Updating the displayed usage metric (e.g., 1922, 2032) to reflect the usage of the restricted credentials provides the user with real-time (or near-real-time) visual feedback about the state of the second account resulting from use of the second account, such as the amount of resources remaining in the account. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0640] In some examples, the restricted credentials (e.g., for the first account and the second account) are uniquely associated (e.g., via a user-specific identifier) with a user of the electronic device.
[0641] In some examples, the electronic device (e.g., 1900, 2000) forgoes transmitting (e.g., rejects requests, such as user requests, to transmit) the restricted credentials to an external device (e.g., 2020) unless user authentication (e.g., biometric authentication, such as fingerprint, facial recognition, iris, or retina authentication) has been successfully provided by a user of the electronic device. In some examples, user authentication is successfully received when the electronic device receives biometric information and determines that the biometric information corresponds to biometric information enabled to authorize transmitting the restricted credentials.
[0642] In some examples, the at least a portion of the representation of the second account (e.g., 2030) is displayed after a predetermined amount of time (e.g., 2 seconds) has passed from displaying the representation of the first account (e.g., 1912, 2024). Thus, in some examples, initially the representation of the first account (e.g., 1912, 2024) is displayed without the representation of the second account (e.g., 1918, 2030) being displayed. The representation of the second account (e.g., 1918, 2030) is displayed after the predetermined amount of time has passed since displaying the representation of the first account (e.g., 1912, 2024). In some examples, after the predetermined period of time has passed, representations (or portions thereof) of both the first account (e.g., 1912, 2024) and the second account (e.g., 1918, 2030) are displayed on the display at the same time. Displaying the selected account first, followed by displaying the unselected account after a short time delay provides the user with (time-based) visual feedback about the states of the accounts, such as whether they are selected for use or available for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0643] In some examples, replacing display of the representation of the first account (e.g., 2024) with the representation of the second account (e.g., 1918, 2030) at the first location of the display includes: displaying the entirety of the representation of the second account (e.g., 1918, 2030) at the first location (e.g., a prominently-visible portion of the display, such as a region at or near the center of the display) of the display (e.g., because the second account, instead of the first account, is set as the selected account), and displaying at least a portion (e.g., less than all of the representation of the first account (e.g., 1912, 2024), a first portion but not a second portion) of the representation of the first account (e.g., 1912, 2024) at the second location (e.g., a corner or edge of the display, such as the bottom edge of the display) of the display. In some examples, a user of the device can change the default account to be the second account instead of the first account. Displaying the entire representation of the second account (the selected account) and a portion of the first account (e.g., less than the entire representation of the unselected account)
provides the user with (size-based) visual feedback about the states of the accounts, such as whether they are selected for use or available for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the entirety of a representation of an account is larger in size (when displayed, on the display) than a portion of the representation of an account that is not the entirety of the representation of the account.
[0644] In some examples, the electronic device (e.g., 1900, 2000) concurrently displays, on the display, at least a portion (e.g., a top portion) of a representation of a third account (e.g., one of 1916, one of 2026, the third account is enabled to provide corresponding restricted credentials from the secure element) at a location adjacent to the second location of the display (e.g., adjacent to the representation of the second account (e.g., 2030 of FIG. 20D), above the representation of the second account, behind the representation of the second account, such as items arranged in a stack) while maintaining display of the at least a portion of the representation of the second account at the second location. In some examples, the electronic device detects, via the one or more input devices, user selection (e.g., a touch gesture, such as a tap, on a portion of the displayed second resource account by a user, a voice selection through a digital assistant) of the representation of the third account (e.g., one of 1916, one of 2026). In response to detecting the user selection of the representation of the third account: the electronic device replaces display of the representation of the first account (e.g., 1912, 2024) with the representation of the third account (e.g., one of 1916, one of 2026) at the first location of the display, and the electronic device maintains display of the at least a portion of the representation of the second account (e.g., 1918, 2030) at the second location of the display. Thus, in some examples, display of the representation of the second account at the second location of the display is always maintained no matter (independent of) which account (other than the second account) is selected and displayed at the first location of the display. Changing the location on the display of the various accounts provides the user with (location-based) visual feedback about
the updated states of the accounts, such as whether they are selected for use or available for use, and provides the user with visual feedback that the input they provided has changed the account selected for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0645] In some examples, the usage metric (e.g., 1922, 2032, usage limit, available resources) for the second account is displayed after a third predetermined amount of time (e.g., 1 second) has passed from displaying the at least a portion of the representation of the second account. Thus, in some examples, the displayed usage metric (e.g., 1922, 2032) is displayed after a delay from when the representation of the second account (e.g., 1918, 2030) is first displayed. In some examples, the at least a portion of the representation of the second account (e.g., 2030) is displayed without displaying the usage metric (e.g., 1922, 2032). After the third predetermined period of time has passed, the usage metric (e.g., 1922, 2032) is displayed such that the representation of the second account (e.g., 1918, 2030) is displayed concurrently with the usage metric (e.g., 2032).
[0646] In some examples, the usage metric (e.g., 1922, 2032, usage limit, available resources) for the second account ceases to be displayed after a fourth predetermined amount of time (e.g., 3 seconds) has passed from first displaying the usage metric (e.g., 1922, 2032). Thus, in some examples, the displayed usage metric auto-hides from the display if a user does not select the second account after a certain amount of time.
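The show-then-auto-hide timing for the usage metric in the two paragraphs above reduces to a visibility window; a sketch using the example delays from the text (appears 1 second after the account representation, auto-hides 3 seconds after first being displayed):

```python
def usage_metric_visible(t_since_representation_shown: float,
                         show_delay: float = 1.0,
                         hide_after: float = 3.0) -> bool:
    """Whether the usage metric is on screen at a given time since the
    account representation appeared, assuming the account has not been
    selected in the meantime. The delay values follow the examples in
    the text and are not fixed by the specification."""
    shown_at = show_delay               # third predetermined amount of time
    hidden_at = show_delay + hide_after  # fourth predetermined amount of time
    return shown_at <= t_since_representation_shown < hidden_at

print(usage_metric_visible(2.0))  # → True  (within the visibility window)
print(usage_metric_visible(0.5))  # → False (not yet shown)
print(usage_metric_visible(5.0))  # → False (already auto-hidden)
```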
[0647] In some examples, selected accounts are displayed at the first location (e.g., a location towards a center of the display, indicating that the account is selected) and non-selected accounts are displayed at the second location (e.g., at a location towards an edge of the display, indicating that the account is not selected) (e.g., a region of representations of non-selected accounts or non-selected payment cards arranged as a stack). Thus, in some examples, if a representation of an account is displayed at the first location of the display, the user of the device is made aware
that the account is currently selected for use in providing the restricted credentials or for use in a payment transaction, whereas if a representation of an account is displayed at the second location of the display, the user is made aware that the account is currently not selected for use in providing the restricted credentials or for use in a payment transaction. Displaying representations of accounts at different locations on the display provides the user with (location-based) visual feedback about the state of the corresponding accounts, such as whether they are selected for use or available for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0648] In some examples, a plurality of representations of non-selected accounts are displayed in a stack configuration (e.g., a three-dimensional stack, graphical representations 1916, 2026 of other accounts shown in, for example, FIG. 20C) at the second location (e.g., the representation of accounts piled on top of one another, the representation of payment cards piled on top of one another with at least a portion of each visible).
[0649] In some examples, in response to detecting the user selection of the representation of the second account, the electronic device (e.g., 1900, 2000) replaces display of the at least a portion of the representation of the second account (e.g., 2030) with display of at least a portion of the representation of the first account (e.g., 1912, 2024). Changing the location on the display of the various accounts provides the user with (location-based) visual feedback about the updated states of the accounts, such as whether they are selected for use or available for use, and provides the user with visual feedback that the provided input has changed the account selected for use. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
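The selected/non-selected layout described above (a selected account at a first location, the remaining accounts in a stack at a second location, with selection swapping the two) can be sketched as follows. This is an illustrative model only; the class, location names, and account labels are assumptions, not part of the disclosed embodiments.

```python
# Illustrative sketch of the account layout: the selected account is shown at
# the first location (center of the display) and non-selected accounts are
# shown as a stack at the second location (edge of the display).
FIRST_LOCATION = "center"   # selected account
SECOND_LOCATION = "edge"    # stack of non-selected accounts

class AccountLayout:
    def __init__(self, accounts, selected):
        self.selected = selected  # displayed at the first location
        # All other accounts are piled in a stack at the second location.
        self.stack = [a for a in accounts if a != selected]

    def location_of(self, account):
        return FIRST_LOCATION if account == self.selected else SECOND_LOCATION

    def select(self, account):
        """Swap: the newly selected account replaces the previously selected
        one at the first location, which joins the stack at the second."""
        if account == self.selected:
            return
        self.stack.remove(account)
        self.stack.append(self.selected)
        self.selected = account

layout = AccountLayout(["payment", "debit", "credit"], selected="payment")
layout.select("debit")  # "debit" moves to the center; "payment" joins the stack
```

The swap in `select` mirrors the replacement of the displayed representation described in paragraph [0649].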
[0650] In some examples, the representation of the second account (e.g., 1918, 2030) includes a distinguishing visual characteristic (e.g., a graphical animation, a graphical pattern, a dynamic animation, a dynamic pattern) and representations of other accounts (e.g., 1912, 1916, 2024, 2026) that are not the second account, including the representation of the first account (e.g., 2024), do not include the distinguishing visual characteristic. In some examples, the representation of the second account (e.g., 1918, 2030) includes a visual effect that changes commensurate with changes in the orientation of the device (e.g., as described above with reference to FIGS. 17I and 17K), such as a three-dimensional visual effect (e.g., with drop shadows) that provides an appearance of the card having engravings. For example, the three-dimensional visual effect involves causing display of the representation of the second account (e.g., 2030) to change as changes in the orientation of the device relative to a reference point are detected. In some examples, the reference point is a face of a viewer (e.g., the user) of the device in a field of view of a sensor (e.g., a camera) of the device. Alternatively, in some examples, the reference point is a static point external to the device, such as a location on the ground or floor. Based on the reference point (e.g., the face of the user), the representation of the second account (e.g., 1918, 2030) looks visually different (e.g., shadows behind the plurality of moving patterns 1924, 2034 change) from one slanted angle view of the device as compared to a different slanted angle view of the device and, optionally, the representation from either angle looks different from a straight-on view of the device (e.g., such that the display is not tilted at an angle relative to the face of the user, as shown in FIG. 20H).
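One possible way to realize the orientation-dependent shadow effect described above is to derive a drop-shadow offset from the angle between the device and the reference point. The following is a minimal geometric sketch under assumed units and scale; the function name, depth factor, and trigonometric mapping are illustrative choices, not the disclosed implementation.

```python
import math

def shadow_offset(device_tilt_deg, reference_tilt_deg=0.0, depth=4.0):
    """Return an (x, y) drop-shadow offset in points for a given tilt.

    A straight-on view (tilt equal to the reference) yields no offset; the
    offset grows with the angle between the device and the reference point
    (e.g., the viewer's face), producing the engraved, three-dimensional
    appearance in which shadows shift as the device is tilted."""
    angle = math.radians(device_tilt_deg - reference_tilt_deg)
    return (depth * math.sin(angle), depth * (1 - math.cos(angle)))

straight_on = shadow_offset(0.0)   # no tilt relative to the viewer
left = shadow_offset(-30.0)        # tilted one way: shadow shifts left
right = shadow_offset(30.0)        # tilted the other way: shadow shifts right
```

Tilting in opposite directions produces mirrored horizontal shadow shifts, matching the description that the two slanted views look different from each other and from the straight-on view.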
[0651] Note that details of the processes described above with respect to method 2100 (e.g., FIGS. 21A-21D) are also applicable in an analogous manner to the methods described herein. For example, method 2100 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1200, 1500, 1800, 2400, 2700, 3000, and 3400. For example, displaying a transfer user interface for initiating transfer of a first type of item (e.g., a photo, stickers, resources, payments) between participants as described in method 900 can apply in response to detecting user selection on representation of the second account
(e.g., 2030). For another example, the outputting of feedback, as described in method 1200, can be applied to the representation of the second account (e.g., 2030). For another example, the different visual appearances of a message object based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be applied with respect to the representation of the second account (e.g., 2030). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied when setting up the second account. For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can also be used when proceeding with a transfer using the second account. For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can include information from transfers using the first account and the second account. For another example, an utterance can be used, as described in method 3000, to initiate a transfer (e.g., initiate a payment) using the first account or the second account. For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to one or more elements (e.g., 2034) of a representation of an account (e.g., 2030) when the account is ready to be used in a transfer (e.g., of a resource, of a file, of a payment) and/or when a transfer (e.g., of a resource, of a file, of a payment) using the account is completed. For brevity, these details are not repeated below.
[0652] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 21A-21D are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, receiving operation 2102, displaying operation 2110, detecting operation 2120, replacing operation 2128, and selecting operation 2130 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines
whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
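The dispatch flow described above (event monitor detects a contact, the dispatcher delivers event information, a recognizer compares it to event definitions and activates a handler) can be sketched as follows. The classes here only loosely mirror the numbered components (event sorter 170, event recognizer 180, event handler 190); this is an illustrative model, not the actual framework code.

```python
class EventRecognizer:
    def __init__(self, event_definition, handler):
        self.event_definition = event_definition  # e.g., a tap on an object
        self.handler = handler                    # activated on recognition

    def recognize(self, event):
        # Compare the delivered event information to this recognizer's
        # event definition; on a match, activate the associated handler.
        if event["type"] == self.event_definition:
            self.handler(event)
            return True
        return False

class EventDispatcher:
    """Stands in for the event sorter/dispatcher: delivers event
    information to each registered recognizer until one handles it."""
    def __init__(self):
        self.recognizers = []

    def dispatch(self, event):
        return any(r.recognize(event) for r in self.recognizers)

log = []
dispatcher = EventDispatcher()
dispatcher.recognizers.append(
    EventRecognizer("tap", lambda e: log.append(("handled", e["location"]))))
dispatcher.dispatch({"type": "tap", "location": "accept_button"})
```

A contact that matches no registered definition simply goes unhandled, analogous to an event for which no predefined sub-event is detected.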
[0653] FIGS. 22A-22F illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 22A-22F relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 23A-23O, which in turn are used to illustrate the processes described below, including the processes in FIGS. 24A-24C.
[0654] FIG. 22A illustrates an electronic device 2200 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 22A-22F, electronic device 2200 is a smartphone. In other embodiments, electronic device 2200 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 2200 has a display 2202 and one or more input devices (e.g., touchscreen of display 2202, a mechanical button 2204, a mic).
[0655] In FIG. 22A, electronic device 2200 displays, on display 2202, a message conversation 2208 of a messaging application 2206 between the user (e.g., “Kate Appleseed”) of the device and a message participant 2210 (e.g., “John Appleseed”). The user and the message participant are engaged in a conversation concerning the transfer of central processing unit (“CPU”) cycles. In some embodiments, message participant 2210 is a contact stored on the device. In some embodiments, message participant 2210 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2210
is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0656] In some embodiments, electronic device 2200 also displays, on display 2202, a virtual keyboard 2212 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 2214 for displaying the text of a message as the message is typed using virtual keyboard 2212. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 2212 to type a message. In some embodiments, compose bar 2214 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 2214 includes a mic button 2216 which, when activated, enables the user to enter a message using voice input.
[0657] FIG. 22A also shows a message object 2218 corresponding to a message sent by the user to message participant 2210. In the message corresponding to message object 2218, the user asks message participant 2210 about an amount of CPU cycles that are needed by the message participant: “How much more do you need?” FIG. 22A also shows a cycle transfer message object 2220 corresponding to a request for a specific number (e.g., 1 million) of CPU cycles sent by message participant 2210 to the user.
[0658] In some embodiments, as shown in FIG. 22A, cycle transfer message object 2220 includes a request indicator 2221 (e.g., a symbol “#”) indicating that the transfer message object corresponds to a request for CPU cycles (as opposed to a transmission of CPU cycles). In some embodiments, as also shown in FIG. 22A, cycle transfer message object 2220 includes a textual indication 2222 (e.g., stating “1 M cycles request”) of the number of cycles that are being requested. In some embodiments, request indicator 2221 is displayed in a different font (e.g., a thicker font, a bolder font, a special type of font reserved for transfer request messages) than textual indication 2222. In some embodiments, as also shown in FIG. 22A, cycle transfer message object 2220 includes an accept button 2224 for accepting the request for cycles (and thus agreeing to transmit the requested number of cycles to the message participant).
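The structure of the request-style transfer message object described above (request indicator, textual indication, accept affordance) can be sketched as a small data model. The field names and formatting are illustrative assumptions; only the “#” indicator and the “1 M cycles request” wording come from the description above.

```python
from dataclasses import dataclass

@dataclass
class CycleTransferMessage:
    """Hypothetical model of a cycle transfer message object (e.g., 2220)."""
    amount: int        # number of CPU cycles
    is_request: bool   # request for cycles vs. transmission of cycles

    @property
    def request_indicator(self):
        # The "#" symbol marks a request, as opposed to a transmission.
        return "#" if self.is_request else ""

    @property
    def textual_indication(self):
        kind = "request" if self.is_request else "sent"
        return f"{self.amount // 1_000_000} M cycles {kind}"

# The 1-million-cycle request shown in FIG. 22A:
msg = CycleTransferMessage(amount=1_000_000, is_request=True)
```

A transmission-style message object would omit the request indicator and use different wording, matching the distinction drawn in paragraph [0658].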
[0659] In FIG. 22A, while displaying cycle transfer message object 2220 (corresponding to a request for 1 million CPU cycles) within message conversation 2208 with message participant 2210, electronic device 2200 detects a user input on accept button 2224 of the cycle transfer message object. For example, as shown in FIG. 22A, the user input is a tap gesture 2201 on accept button 2224.
[0660] In FIG. 22B, in response to detecting tap gesture 2201, electronic device 2200 displays, on display 2202, a transfer confirmation user interface 2226. In some embodiments, transfer confirmation user interface 2226 includes an indication 2228 (e.g., a graphical indication, a textual indication) of a primary (e.g., priority) resource account (e.g., storing priority CPU cycles). In FIG. 22B, the primary resource account does not have a sufficient balance of CPU cycles to cover the requested amount of 1 million CPU cycles. Thus, in some embodiments, indication 2228 includes a textual indication (e.g., stating “Insufficient Balance”) informing the user that the resource account has an insufficient balance of CPU cycles, and an accounts selection button 2230 for selecting one or more additional accounts to use in the transfer of CPU cycles. In some embodiments, transfer confirmation user interface 2226 also includes a status indication 2232 (e.g., a graphical and/or textual indication) informing the user that the currently-selected resource account (e.g., the primary resource account associated with indication 2228) has an insufficient number of CPU cycles remaining to cover the number of resources requested by the transfer request.
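The insufficient-balance determination described above reduces to a simple comparison between the primary account's balance and the requested amount; when the balance falls short, the interface surfaces the “Insufficient Balance” indication and the accounts selection button. The sketch below illustrates that decision under assumed names; it is not the disclosed implementation.

```python
def transfer_status(primary_balance, requested_amount):
    """Return (status_text, show_accounts_button) for the confirmation UI.

    If the currently-selected account covers the request, no additional
    account selection is needed; otherwise the interface shows the
    "Insufficient Balance" indication and offers the accounts selection
    button (e.g., 2230) for choosing one or more additional accounts."""
    if primary_balance >= requested_amount:
        return "Ready to send", False
    return "Insufficient Balance", True

# The FIG. 22B scenario: the primary account cannot cover 1 M cycles alone.
status, show_accounts_button = transfer_status(
    primary_balance=400_000, requested_amount=1_000_000)
```

The hypothetical 400,000-cycle primary balance is chosen only to exercise the insufficient branch; the figure itself does not state the primary account's balance.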
[0661] In FIG. 22C, while displaying transfer confirmation user interface 2226, electronic device 2200 detects a user input on accounts selection button 2230 of indication 2228. For example, as shown in FIG. 22C, the user input is a tap gesture 2203 on accounts selection button 2230.
[0662] In FIG. 22D, in response to detecting tap gesture 2203, electronic device 2200 displays indications of one or more accounts 2228 and 2236 stored on or provisioned on the device. Indication 2228 corresponds to the primary resource account. Because the primary resource account is currently selected for use in responding to the request for CPU cycles, indication 2228 of the primary resource account includes a selection mark 2234 informing the user that the account is currently selected for use in the CPU cycles transfer. Indication 2236
corresponds to a backup (e.g., non-priority) resource account, which has sufficient CPU cycles to (either alone or together with the primary resource account) cover the received CPU cycles request. Because the backup resource account is not currently selected for use in the CPU cycles transfer, it does not include a selection mark.
[0663] As shown in FIG. 22D, while both indication 2228 of the primary resource account and indication 2236 of the backup resource account are displayed, electronic device 2200 detects user selection of indication 2236 corresponding to the backup resource account. For example, as shown in FIG. 22D, the user selection is a tap gesture 2205 on indication 2236. In FIG. 22E, in response to detecting tap gesture 2205, the device updates display of indication 2236 to include a selection mark 2238. Thus, in FIG. 22E, following the detection of tap gesture 2205, both the primary resource account and the backup account are selected for use in the CPU cycles transfer.
[0664] In FIG. 22F, electronic device 2200 requests, as indicated by status indication 2232 (e.g., stating “Send with fingerprint”), authentication information (e.g., biometric authentication, such as fingerprint authentication, facial recognition, voice recognition, iris/retina recognition, or passcode authentication) to proceed with transferring the requested CPU cycles to message participant 2210 using both the primary resource account and the backup resource account. Once authentication information that is consistent with enrolled authentication information (for proceeding with CPU cycle transfers) is provided by the user, the device transmits, via messaging application 2206, the requested (1 million) CPU cycles to message participant 2210 by withdrawing available CPU cycles from (first) the primary resource account and (second) the backup resource account.
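The split described above, in which cycles are withdrawn first from the primary resource account and the remainder from the backup account, can be sketched as follows. The function and parameter names are illustrative; only the drain-primary-first ordering comes from the description.

```python
def split_withdrawal(requested, primary_balance, backup_balance):
    """Return (from_primary, from_backup) for a transfer split across
    two accounts, or raise if both accounts together cannot cover it."""
    if primary_balance + backup_balance < requested:
        raise ValueError("insufficient combined balance")
    from_primary = min(primary_balance, requested)  # drain primary first
    from_backup = requested - from_primary          # remainder from backup
    return from_primary, from_backup

# 1 M cycles requested; with an assumed 400 k in the primary account, the
# backup account covers the remaining 600 k.
plan = split_withdrawal(1_000_000, primary_balance=400_000,
                        backup_balance=800_000)
```

When the primary account alone can cover the request, the backup contribution is zero and the backup account is left untouched.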
[0665] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 22A-22F described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 23A-23O described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 22A-22F and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 23A-23O are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 2200, or 2300).
[0666] FIGS. 23A-23O illustrate exemplary user interfaces for splitting transfers between two or more accounts, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 24A-24C.
[0667] FIG. 23A illustrates an electronic device 2300 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 23A-23O, electronic device 2300 is a smartphone. In other embodiments, electronic device 2300 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 2300 has a display 2302 and one or more input devices (e.g., touchscreen of display 2302, a mechanical button 2304, a mic).
[0668] In FIG. 23A, electronic device 2300 displays, on display 2302, a wallet user interface 2322 corresponding to wallet user interface 2022 described above with reference to FIGS. 20A-20J. In some embodiments, as shown in FIG. 23A, the device 2300 displays wallet user interface 2322 in response to receiving a signal from an external device 2320 (e.g., a different device, such as a smartphone or a smartwatch, a near field communication (NFC) terminal, a point-of-sale (POS) terminal) requesting user credentials (e.g., payment credentials). In some embodiments, the device displays wallet user interface 2322 in response to detecting user activation (e.g., a double press) of the interface via mechanical button 2304. In some embodiments, the device displays wallet user interface 2322 in response to receiving, via a mic, a voice command.
[0669] As with wallet user interface 2022, wallet user interface 2322 includes a graphical representation 2330 (e.g., corresponding to graphical representation 2030) corresponding to a payment account (e.g., a unique operating system-controlled and managed account) at a first location (e.g., a top-half portion) of the display, and a balance indication 2332 (e.g., “$30”) within graphical representation 2330 informing a user (e.g., “Kate Appleseed”) of the device of the amount of funds available from the payment account. As shown in FIG. 23A, the payment account associated with graphical representation 2330 has a current balance of (as indicated by balance indication 2332) $30.
[0670] As with wallet user interface 2022, wallet user interface 2322 includes graphical representations 2326 (e.g., corresponding to graphical representations 2026) corresponding to one or more accounts stored on or provisioned on the device at a second location (e.g., at the bottom edge) of the display. As with wallet user interface 2022, wallet user interface 2322 includes an indication (e.g., graphical and/or textual, corresponding to indication 2028) informing the user of an authentication method for authorizing a transaction using an account provisioned on the device. For example, in FIG. 23A, indication 2328 (e.g., depicting a graphical representation of a fingerprint and stating “Pay with Fingerprint”) informs the user that fingerprint authentication can be used to authorize a transaction on the device.
[0671] Further, in FIG. 23A, electronic device 2300 displays a graphical representation 2324 (e.g., corresponding to graphical representation 2024) corresponding to a default account at the first location of the interface, together with graphical representation 2330. In FIG. 23A, graphical representation 2324 (corresponding to the default account) covers a portion of graphical representation 2330 (corresponding to the payment account) at the first location of the interface. In some embodiments, graphical representation 2330 (corresponding to the payment account) covers a portion of graphical representation 2324 (corresponding to the default account) at the first location of the interface.
[0672] In FIG. 23A, electronic device 2300 received, from external device 2320, a request for payment credentials to authorize payment in an amount (e.g., $50) that is greater than the currently available balance of the payment account ($30). Thus, in some embodiments, in accordance with the determination (based on the request signal from external device 2320) that the payment account alone has insufficient funds to fully pay for the current transaction, the device automatically displays a graphical representation 2324 (e.g., corresponding to graphical representation 2024) corresponding to a default account together with graphical representation 2330 at the first location of the interface (thereby indicating that both the payment account and the default account will be used for the current transaction). In some embodiments, the device automatically displays the representation of the default account (as opposed to a different account stored or provisioned on the device) because the default account is designated as the “default” account. In some embodiments, the device displays graphical representation 2324 of
the default account at the first location in response to receiving user selection of the default account as the second account to be used for the current transaction (e.g., after the device first prompts the user that the payment account has insufficient funds to fully pay for the current transaction or after the user realizes, based on balance indication 2332, that the payment account has insufficient funds).
[0673] In some embodiments, as shown in FIG. 23A, graphical representation 2324 (corresponding to the default account) covers a portion of graphical representation 2330 (corresponding to the payment account) at the first location of the interface. In some embodiments, graphical representation 2330 (corresponding to the payment account) covers a portion of graphical representation 2324 (corresponding to the default account) at the first location of the interface.
[0674] As shown in FIG. 23A, while displaying both graphical representation 2330 (of the payment account) and graphical representation 2324 (of the default account) at the first location of wallet user interface 2322, electronic device 2300 receives a user input (e.g., fingerprint scan input 2301 on a fingerprint sensor of mechanical button 2304) to authorize a payment (e.g., of $50) for the current transaction. As shown in FIG. 23B, as indicated by indication 2328 (now stating “Payment complete”), the authentication (based on fingerprint scan input 2301) is successful (e.g., because the received fingerprint information is consistent with enrolled fingerprint information for authorizing transactions). Because the graphical representations of both the payment account and the default account are displayed at the first location of wallet user interface 2322 when the current transaction (of $50) is successfully authorized by the user, both accounts are authorized to be used to pay for the transaction.
[0675] FIG. 23C shows wallet user interface 2322 displaying transaction summary information 2336 following the successful payment transaction (using both the payment account and the default account). In some embodiments, transaction summary information 2336 includes an indication 2338 of the other party (e.g., a business, a restaurant, a non-business individual) and/or location (e.g., an address, a city) of the transaction. For example, in FIG. 23C, the current transaction was with Tim’s Toy Store in San Francisco, California. In some embodiments, transaction summary information includes an indication 2340 of the transaction amount (e.g.,
“$50.00”). In some embodiments, transaction summary information includes an indication 2344A of the first account (e.g., the payment account), of the two different accounts, that was used in the transaction and an indication 2344B, of the second account (e.g., the default account) of the two different accounts, that was used in the transaction. In some embodiments, transaction summary information includes a first indication 2346A of the amount of funds (e.g., “$30.00”) that was taken out of the first account (e.g., the payment account) for use in the transaction and a second indication of the amount of funds (e.g., “$20.00”) that was taken out of the second account (e.g., the default account) for use in the transaction.
[0676] As shown in FIG. 23C, subsequent to the successful transaction (in the amount of $50.00), electronic device 2300 updates balance indication 2332 within graphical representation 2330 of the payment account to reflect the amount of funds (e.g., “$30.00”) that was withdrawn from (or taken out of) the payment account to fund the successful transaction. For example, in FIG. 23C, because balance indication 2332 showed $30 prior to the successful transaction, and the amount of the transaction was $50.00 (which is more than the available funds of the payment account), the device updates balance indication 2332 to show a post-transaction amount of $0 (because all available funds, in the amount of $30, were used to cover the transaction). Thus, in some embodiments, if the transaction amount of the current transaction (e.g., “$50”) is greater than the available funds in the payment account (e.g., “$30”), all available funds from the payment account are automatically used in the transaction (thereby leaving the balance of the payment account at $0), and the shortfall left by the payment account (e.g., “$20”) is automatically covered by the other account selected for the transaction (e.g., the default account).
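The summary arithmetic described above, in which the payment account contributes its full balance and the default account covers the shortfall, can be sketched as follows. The function and dictionary keys are illustrative assumptions; the $50/$30/$20 figures come from the example in FIG. 23C.

```python
def settle(amount, payment_balance):
    """Split a transaction across the payment account (used first, up to its
    full balance) and the default account (covers any shortfall), and report
    the payment account's updated balance indication."""
    from_payment = min(payment_balance, amount)  # drain payment account first
    from_default = amount - from_payment         # shortfall from default
    return {
        "payment": from_payment,
        "default": from_default,
        "payment_balance_after": payment_balance - from_payment,
    }

# The FIG. 23C transaction: $50 total against a $30 payment-account balance.
summary = settle(amount=50.00, payment_balance=30.00)
```

The resulting split ($30 from the payment account, $20 from the default account, balance indication updated to $0) matches indications 2346A, the second amount indication, and the updated balance indication 2332.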
[0677] FIG. 23D shows electronic device 2300 displaying, on display 2302, a message conversation 2308 of a messaging application 2306 between the user (e.g., “Kate Appleseed”) of the device and a message participant 2310 (e.g., “John Appleseed”). In some embodiments, message participant 2310 is a contact stored on the device. In some embodiments, message participant 2310 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2310 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0678] In some embodiments, electronic device 2300 also displays, on display 2302, a virtual keyboard 2312 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 2314 for displaying the text of a message as the message is typed using virtual keyboard 2312. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 2312 to type a message. In some embodiments, compose bar 2314 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 2314 includes a mic button 2314A which, when activated, enables the user to enter a message using voice input.
[0679] FIG. 23D also shows a message object 2348 corresponding to a message sent by message participant 2310 to the user. For example, in FIG. 23D, message object 2348 states “Dinner and the cab ride together was $28.” FIG. 23D also shows a payment message object 2350 (e.g., corresponding to payment message object 1490 described above with reference to FIG. 14L) corresponding to a payment request sent by message participant 2310 to the user, requesting payment (e.g., for dinner and the cab ride stated in message object 2348) in the amount of $28.
[0680] In some embodiments, as shown in FIG. 23D, payment message object 2350 has a mode indication 2352 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment request made via an operating system-controlled payment transfer application (and not by a third-party application). As shown in FIG. 23D, payment message object 2350 also includes an amount indication 2354 informing the recipient (e.g., the user) of the amount of the requested payment (e.g., “$28”) and a further indication (e.g., “$28 Request”) that the payment message object corresponds to a request for payment. As shown in FIG. 23D, payment message object 2350 also includes an accept button 2356 for agreeing to make (or initiating the process for making) the payment (e.g., in the amount of $28) corresponding to the requested payment. In some embodiments, as shown in FIG. 23D, payment message object 2350 includes a status indicator 2358 informing the user of a status of the payment request corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 23D, status indicator 2358 shows “pending,”
thus indicating to the user that the payment request associated with payment message object 2350 has not yet been accepted by the user. In some embodiments, as shown in FIG. 23D, a note message object 2360 corresponding to a note (e.g., a comment, a message) related to the payment request accompanies the payment message object.
[0681] In FIG. 23D, while displaying payment message object 2350 (corresponding to a payment request) within message conversation 2308 with message participant 2310, electronic device 2300 detects a user input on accept button 2356 of the payment message object. For example, as shown in FIG. 23D, the user input is a tap gesture 2303 on accept button 2356.
[0682] In FIG. 23E, in response to detecting tap gesture 2303, electronic device 2300 displays, on display 2302, a payment confirmation user interface 2362 corresponding to payment confirmation user interface 878 described with reference to FIGS. 8T-8W. As with payment confirmation user interface 878, payment confirmation user interface 2362 includes a mode indication (e.g., corresponding to mode indication 880, stating “PAY”) indicating to the user that the current payment relates to a payment request (or payment) made via an operating system-controlled payment transfer application (and not by a third-party application). As with payment confirmation user interface 878, payment confirmation user interface 2362 also includes an indication 2366 (e.g., corresponding to indication 884) (e.g., a graphical indication, a textual indication) of a payment account and a balance of the payment account that is currently selected for the payment. Indication 2364 informs the user that the device is requesting authorization for a payment. For example, in FIG. 23E, indication 2366 includes a thumbnail image of a graphical representation of the payment account and a current balance (e.g., “$20”) of the payment account. As with payment confirmation user interface 878, payment confirmation user interface 2362 also includes an indication 2370 (e.g., corresponding to indication 882) of the intended recipient of the payment (e.g., “Pay John”) and an indication 2372 (e.g., corresponding to indication 888) of the payment amount (e.g., to serve as a reminder to the user of the amount to be paid). In some embodiments, payment confirmation user interface 2362 also includes a cancel button 2376 for canceling the payment (and closing the payment confirmation interface).
[0683] As shown by indication 2372 (e.g., “$28”) showing the payment amount (e.g., the payment requested via payment message object 2350), the current balance (e.g., “$20”) of the
payment account, as shown by indication 2366, is insufficient to cover the entirety of the requested payment. Thus, in some embodiments, in accordance with a determination that the current balance (e.g., “$20”) is insufficient to cover the full amount of the current transaction, electronic device 2300 displays, within payment confirmation user interface 2362, a (graphical and/or textual) indication 2374 (e.g., stating “Insufficient Balance”) that the account (e.g., the payment account) that is currently selected for the transaction has insufficient funds to cover the amount of the current transaction. Further, in some embodiments, the device displays within indication 2366 a warning indication 2365 (e.g., stating “Insufficient Balance”) and an accounts selection button 2368 for selecting one or more additional accounts to be used together with the currently-selected account (e.g., the payment account) in the transaction.
[0684] In FIG. 23F, while displaying payment confirmation user interface 2362, electronic device 2300 detects a user input on accounts selection button 2368 of indication 2366. For example, as shown in FIG. 23F, the user input is a tap gesture 2305 on accounts selection button 2368.
[0685] In FIG. 23G, in response to detecting tap gesture 2305, electronic device 2300 displays (e.g., replaces display of payment confirmation user interface 2362 with) an accounts selection user interface 2378. In some embodiments, accounts selection user interface 2378 includes a back button 2382 for returning to payment confirmation user interface 2362. In some embodiments, accounts selection user interface 2378 maintains display of cancel button 2376 (for canceling the accounts selection or for canceling the payment). In some embodiments, accounts selection user interface 2378 maintains display of indication 2374 informing the user of the insufficient balance of the payment account (and thus that one or more additional (or alternative) accounts need to be selected for use in the transaction).
[0686] In some embodiments, accounts selection user interface 2378 includes indications of one or more accounts stored on or provisioned on the device. For example, as shown in FIG. 23G, accounts selection user interface 2378 shows indication 2366 corresponding to the payment account. Because the payment account is currently selected for use in the transaction (e.g., as indicated by indication 2366 in FIG. 23F), indication 2366 of the payment account includes a selection mark 2367 informing the user that the payment account is currently selected for use in the transaction. Accounts selection user interface 2378 also shows an indication 2380 corresponding to a debit card account and an indication 2384 corresponding to a credit card account. Because neither the debit card account nor the credit card account is currently selected for use in the transaction, neither indication includes a selection mark.
[0687] In FIG. 23H, while displaying accounts selection user interface 2378, electronic device 2300 detects user selection of indication 2380 corresponding to the debit card account. For example, as shown in FIG. 23H, the user selection is a tap gesture 2307 on indication 2380 corresponding to the debit card account. In FIG. 23I, in response to detecting tap gesture 2307, the device updates display of indication 2380 to include a selection mark 2381. Thus, in FIG. 23I, following the detection of tap gesture 2307, accounts selection user interface 2378 shows (via selection marks 2367 and 2381) that both the payment account and the debit card account are selected for use in the transaction, but that the credit card is not selected for use in the transaction. In some embodiments, user selection of an indication that is already selected will cause the indication (and thus the corresponding account) to be unselected (and thus not be selected for use in the transaction).
[0688] In FIG. 23J, while displaying accounts selection user interface 2378 with both the payment account and the debit card account selected, electronic device 2300 detects user selection of back button 2382 for returning to payment confirmation user interface 2362. For example, as shown in FIG. 23J, the user selection is a tap gesture 2309 on back button 2382.
[0689] In FIG. 23K, in response to detecting tap gesture 2309, electronic device 2300 again displays (e.g., replaces display of accounts selection user interface 2378 with) payment confirmation user interface 2362. As shown in FIG. 23K, payment confirmation user interface 2362 now displays indication 2366 corresponding to the payment account and indication 2380 corresponding to the debit card account selected by tap gesture 2307. Thus, the user is informed that both the payment account and the debit card account will be (or are authorized to be) used for the transaction. Further, because the debit card account will also be used for the transaction, the transaction can proceed, even though the payment account still has insufficient funds (e.g., $20) to cover the amount of the transaction (e.g., $28) alone. As such, the device ceases to display warning indication 2365 and accounts selection button 2368. Further, the device changes display of indication 2374 to (instead of warning of an insufficient balance) request user authentication information to proceed with the transaction.
[0690] In some embodiments, instead of providing the manual accounts selection option using accounts selection user interface 2378 described above with reference to FIGS. 23F-23J, electronic device 2300 automatically selects, in accordance with a determination that the currently selected account (e.g., the payment account) has insufficient funds, a default backup account (e.g., the debit card account) as a second account for use in the transaction. Thus, in some embodiments, in accordance with a determination that the payment account has insufficient funds, instead of or in addition to displaying accounts selection button 2368 providing the user with the option to manually select a second account for use in the transaction, the device automatically sets (and displays an indication of) the default backup account (e.g., the debit card account) to be used with the payment account in the transaction. In some embodiments, the default backup account is pre-configured by the user (e.g., the user pre-selects an account from a plurality of accounts stored on or provisioned on the device as the default backup account).
[0691] In FIG. 23L, electronic device 2300 receives a user input corresponding to the requested authentication information indicated by indication 2374 (e.g., requesting fingerprint information). For example, as shown in FIG. 23L, the user input is a fingerprint scan input 2311 on a fingerprint sensor (e.g., of mechanical button 2304) of the device.
[0692] FIG. 23M shows, subsequent to a successful authentication (e.g., because the fingerprint information obtained from fingerprint scan input 2311 on mechanical button 2304 is consistent with enrolled fingerprint information stored on electronic device 2300 for authorizing transactions), the payment (e.g., in the amount of $28) being completed. Thus, in some embodiments, indication 2374 is updated to indicate that the payment is complete (e.g., by stating “Payment Complete” and/or replacing a fingerprint request graphical indication with a checkmark graphical indication).
[0693] In response to the successful transaction using both the payment account (corresponding to indication 2366) and the debit card account (corresponding to indication 2380),
electronic device 2300 updates display of the balance of the payment account from $20 to $0. Because the payment account did not have sufficient funds to alone cover the amount of the payment (of $28), all available funds (of $20) from the payment account were used for the transaction, and the remaining balance (of $8) was paid from the debit card account.
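The allocation in this example (all available funds from the currently selected payment account first, with the remainder drawn from the backup account) can be sketched as follows. This is an illustrative sketch only; the function and variable names below are not part of the disclosure, and amounts are represented in cents to avoid floating-point rounding:

```python
def split_payment(amount_cents: int, primary_balance_cents: int) -> tuple[int, int]:
    """Split a requested payment between a primary account and a backup
    account: drain the primary account first, then charge the remainder
    (if any) to the backup (e.g., debit card) account."""
    from_primary = min(amount_cents, primary_balance_cents)
    from_backup = amount_cents - from_primary
    return from_primary, from_backup
```

For the $28 payment above with a $20 balance, `split_payment(2800, 2000)` yields `(2000, 800)`: the full $20 from the payment account and the remaining $8 from the debit card account.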
[0694] FIG. 23N shows electronic device 2300 displaying, on display 2302, a shopping cart screen 2386 of a third-party online store shown via a third-party application. For example, in FIG. 23N, shopping cart screen 2386 includes a first item 2388 (e.g., a t-shirt) with a price tag of $16.00 and a second item 2390 with a price tag of $12.00. Thus, shopping cart screen 2386 indicates a total cost 2392 of the checkout to be $28.00.
[0695] As shown in FIG. 23N, shopping cart screen 2386 also includes a pay button 2394 for proceeding with payment of the items in the shopping cart (e.g., for an amount of $28.00). In FIG. 23N, electronic device 2300 detects user selection (e.g., tap gesture 2313) of pay button 2394 to proceed with the payment.
[0696] In FIG. 23O, in response to user tap gesture 2313, electronic device 2300 displays payment confirmation user interface 2362 with both indication 2366 corresponding to the payment account and indication 2380 corresponding to the debit card account automatically (e.g., without user input) shown on the interface. Thus, while the payment account has insufficient funds (e.g., “$20”), the user can still easily proceed with the payment for first item 2388 and second item 2390 (for the amount of $28, as indicated by indication 2372) using both the payment account and the debit card account.
[0697] FIGS. 24A-24C are a flow diagram illustrating a method for splitting transfers between two or more accounts using an electronic device in accordance with some embodiments. Method 2400 is performed at a device (e.g., 100, 300, 500, 2200, 2300) with a display and one or more input devices (e.g., a touchscreen, a microphone, a camera, a biometric sensor). Some operations in method 2400 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0698] As described below, method 2400 provides an intuitive way for managing peer-to-peer transactions. The method reduces the cognitive burden on a user for managing peer-to-peer transactions, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transactions faster and more efficiently conserves power and increases the time between battery charges.
[0699] In some examples, prior to receiving the request (e.g., a user input on the electronic device, a signal from an external device) to participate in the transfer of resources (e.g., a transfer of computing resources, a transfer of points, a transfer of credits, a transfer of funds, a transfer of virtual resources) for the requested resource amount using the first resource account, the electronic device (e.g., 2200, 2300) receives (2402) an initiation input (e.g., a user input on the electronic device, a signal from an external device, such as a POS terminal).
[0700] In some examples, in response to receiving the initiation input (and, optionally, in accordance with the determination that the requested resource amount is greater than the amount of resources available via the first resource account (e.g., in accordance with a determination that the first resource account does not have sufficient resources to cover the requested resource amount of the resource transfer), and/or in accordance with the determination that the requested resource amount is equal to or less than the amount of resources available via the first resource account), the electronic device (e.g., 2200, 2300) concurrently displays (2404), on the display (e.g., 2202, 2302), a representation (e.g., a graphical representation, a textual representation) of the first resource account (e.g., an account stored in a secure element of the device) and a representation (e.g., a graphical representation, a textual representation) of the second resource account (e.g., a backup resource account, a debit account, a checking account).
[0701] In some examples, the electronic device (e.g., 2200, 2300) receives (2406) user input (e.g., 2303, a touch input, a voice input) for proceeding with the transfer of resources.
[0702] In some examples, in response to receiving the user input for proceeding with the transfer of resources, the electronic device (e.g., 2200, 2300) displays (2408), on the display (e.g., 2202, 2302), an authentication user interface (e.g., 2226, 2362) requesting authentication information for proceeding with the transfer of resources. Displaying a request for
authentication provides the user with visual feedback about the state of the device (state in which authentication is required) and prompts the user to provide the authentication (e.g., through biometric authentication, such as via a fingerprint authentication or facial recognition).
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0703] In some examples, the representation of the first resource account (e.g., 2330) includes an indication of the amount of funds (e.g., 2332) available via the first resource account. In some examples, the representation of the first resource account (e.g., 2330) and representation of the second resource account (e.g., 2324) are displayed in a list. In some examples, the representation of the first resource account (e.g., 2330) is displayed prior to displaying the representation of the second resource account (e.g., 2324). In some examples, the representation of the first resource account (e.g., 2330) is displayed higher up in the list than the representation of the second resource account (e.g., 2324). In some examples, the representation of the second resource account (e.g., 2324) is displayed before the representation of the first resource account (e.g., 2330) in the list. In some examples, the list is a three-dimensional stack. In some examples, the representations of resource accounts partially overlap each other.
[0704] The electronic device (e.g., 2200, 2300) receives (2410) a request (e.g., 2303, a user input on the electronic device, a signal from an external device) to participate in a transfer of resources (e.g., a transfer of computing resources, a transfer of points, a transfer of credits, a transfer of funds, a transfer of virtual resources) for a requested resource amount using a first resource account.
[0705] In some examples, the resource is (2412) an amount of funds (e.g., dollars, euros) and the second resource account is a stored-value account (e.g., a debit card account, a checking account) containing stored funds (e.g., stored-value account that is available for use in
sending/receiving payments via a messaging app as described in greater detail above with reference to methods 900, 1200, 1500, and 1800).
[0706] In some examples, receiving (2410) the request to participate in the transfer of resources includes receiving (2414) authentication information (e.g., biometric information, such as fingerprint information, facial recognition information, voice recognition information, iris/retina scan information, or authentication information that corresponds to a passcode or pattern). In some examples, the device (e.g., 2200, 2300) determines whether the authentication information is consistent with registered authentication information. In some examples, transferring resources includes transmitting credentials. In some examples, in accordance with a determination that the authentication information is consistent with the registered authentication information, a secure element of the electronic device provides (or releases) credentials (e.g., payment information). In some examples, in accordance with a determination that the authentication information is not consistent with the registered authentication information, the secure element forgoes providing (or releasing) credentials (e.g., payment information).
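The consistency check and conditional credential release described in this paragraph can be sketched as below. The sketch is illustrative only: the secure-element interface shown is a hypothetical stand-in for whatever hardware-backed store the device actually uses, and the function names are assumptions:

```python
import hmac


class StubSecureElement:
    """Hypothetical stand-in for a hardware secure element holding payment credentials."""

    def provide_credentials(self) -> str:
        return "payment-credentials"


def authorize_transfer(provided: bytes, registered: bytes,
                       secure_element: StubSecureElement):
    """Release credentials only when the provided authentication information
    is consistent with the registered (enrolled) information; otherwise
    forgo releasing them."""
    # Constant-time comparison avoids leaking information through timing.
    if hmac.compare_digest(provided, registered):
        return secure_element.provide_credentials()
    return None
```

In practice the comparison would operate on biometric templates or passcode-derived digests rather than raw bytes; the control flow (release on match, forgo on mismatch) is the point of the sketch.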
[0707] In response (2416) to (or subsequent to) receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, the electronic device (e.g., 2200, 2300) optionally performs blocks 2418 and 2428.
[0708] In accordance with (2418) a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account (e.g., in accordance with a determination that the first resource account has sufficient resources to cover the requested resource amount of the resource transfer), the electronic device (e.g., 2200, 2300) optionally performs one or more of blocks 2420-2426.
[0709] In some examples, the electronic device (e.g., 2200, 2300) displays (2420), on the display (e.g., 2202, 2302), an indication of the amount of resources available via the first resource account (e.g., 2228, 2332), and the electronic device forgoes (2422) displaying a selectable representation (e.g., a graphical representation, a textual representation) of the second resource account. Displaying an indication of available resources from the first resource account (e.g., 2332) without displaying the representation of the second resource account when sufficient
resources are available on the first resource account provides the user with visual feedback confirming that the first resource account has sufficient resources (e.g., funds) to fulfill the request and that the second resource account will not be used. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0710] The electronic device (e.g., 2200, 2300) automatically (e.g., without additional user input) proceeds (2424) with the transfer of resources using only the first resource account (e.g., using the first resource account and without using the second resource account). In some examples, the first resource account is associated with an amount of transferrable resources. Automatically proceeding with the transfer of resources using the appropriate account(s) based on the requested resource amount being (or not being) greater than the amount of resources available on a particular account enables the correct amount of resources to be transferred without requiring further user input. Performing an operation without requiring further user inputs enhances the operability of the device and makes the user-device interface more efficient, which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0711] In some examples, subsequent to (or, optionally, in response to) automatically proceeding with the transfer of resources using only the first resource account (and not the second resource account), the electronic device (e.g., 2200, 2300) displays (2426), on the display (e.g., 2202, 2302), a first representation (e.g., 2330, a graphical representation, a textual representation) associated with the first resource account and forgoes displaying a second representation (e.g., 2324) associated with the second resource account.
[0712] In accordance with (2428) a determination that the requested resource amount is greater than the amount of resources available via the first resource account (e.g., in accordance with a determination that the first resource account does not have sufficient resources to cover the requested resource amount of the resource transfer), the electronic device (e.g., 2200, 2300) optionally performs one or more of blocks 2430-2434.
[0713] In some examples, the electronic device (e.g., 2200, 2300) displays (2430) (e.g., concurrently), on the display (e.g., 2202, 2302), the indication of the amount of resources available via the first resource account (e.g., 2228, 2332) and the selectable representation (e.g., 2324, a graphical representation, a textual representation) of the second resource account (e.g., a backup resource account, a different type of resource account from the first resource account). Displaying an indication of available resources from the first resource account and displaying the representation of the second resource account when sufficient resources are not available on the first resource account provides the user with visual feedback that the first resource account has insufficient resources (e.g., funds) to fulfill the request and that the second resource account will be used. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0714] The electronic device (e.g., 2200, 2300) automatically (e.g., without user input, without user input after receiving the request to participate in a transfer of resources) proceeds (2432) with the transfer of resources using the first resource account and a second resource account (e.g., a backup resource account) different from the first resource account. In some examples, the second resource account is associated with an amount of transferrable resources. Automatically proceeding with the transfer of resources using the appropriate account(s) based on the requested resource amount being (or not being) greater than the amount of resources available on a particular account enables the correct amount of resources to be transferred without requiring further user input. Performing an operation without requiring further user inputs enhances the operability of the device and makes the user-device interface more efficient, which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0715] In some examples, subsequent to (or, optionally, in response to) proceeding (2432) with the transfer of resources using the first resource account and the second resource account (e.g., in accordance with a determination that the first resource account does not have sufficient resources to cover the requested resource amount of the resource transfer), the electronic device (e.g., 2200, 2300) displays (2434) (e.g., concurrently), on the display (e.g., 2202, 2302), a first representation (e.g., a graphical representation, a textual representation) associated with the first resource account (e.g., 2228, 2330) and a second representation associated with the second resource account (e.g., 2236, 2324). In some examples, the device further concurrently displays an amount of the resource transferred using the first resource account and an amount of the resource transferred using the second resource account.
[0716] In some examples, prior to proceeding with the transfer of resources (e.g., using only the first resource account or using both the first resource account and the second resource account) (and, optionally, prior to receiving the request to participate in a transfer of resources), the electronic device (e.g., 2200, 2300) displays, on the display (e.g., 2202, 2302), an authentication user interface (e.g., 2226, 2362) requesting authentication information (e.g., biometric information, such as a fingerprint, facial features, iris/retina features, or input information such as a passcode or pattern). Displaying a request for authentication provides the user with visual feedback about the state of the device (state in which authentication is required) and prompts the user to provide the authentication (e.g., through biometric authentication, such as via a fingerprint authentication or facial recognition). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the electronic device receives, via the one or more input devices, the authentication information, wherein automatically proceeding with the transfer of resources (e.g., using the first resource account or both the first and second resource accounts) is in accordance with a determination that the received authentication information corresponds to enrolled authentication information (stored on the device) for authorizing transfers. In some examples, in
accordance with a determination that the received authentication information does not correspond to the enrolled authentication information for authorizing transfers, the electronic device forgoes proceeding with the transfer of resources (and, optionally, indicates that authorization is required).
[0717] In some examples, the resource is an amount of funds (e.g., dollars, euros) and the second resource account is a credit account (e.g., a credit card account). In some examples, in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, and in accordance with a determination that the second resource account is associated with a transaction fee, the electronic device (e.g., 2200, 2300) displays, on the display (e.g., 2202, 2302), an indication that a transaction fee (e.g., a transaction fee for using a credit card account, a percentage (e.g., 2%) of the amount of funds to be transmitted in the transfer) will be added to the transfer. In some examples, in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, and in accordance with a determination that the second resource account is not associated with a transaction fee, the electronic device forgoes displaying, on the display, the indication that a transaction fee (e.g., a transaction fee for using a credit card account, a percentage (e.g., 2%) of the amount of funds to be transmitted in the transfer) will be added to the transfer.
[0718] In some examples, in accordance with proceeding with the transfer of resources using the second account (e.g., not using the first account, using both the first account and the second account), the electronic device (e.g., 2200, 2300) applies a first charge (e.g., a charge made to the second resource account, which is a credit account) in a first amount to the second resource account, wherein the first amount includes the transaction fee. In some examples, the transfer of funds only uses the second resource account and the total amount charged to the second resource account is the sum of the amount of funds transmitted and the transaction fee. In some examples, the transfer of funds uses the second resource account and one or more accounts (e.g., the first account), and the total amount charged to the second resource account is the sum of the amount of funds transmitted using the second resource account and the transaction fee. In some examples, the transaction fee is based on (e.g., a percent of) the amount transmitted using the
second resource account. In some examples, the transaction fee is a flat fee. In some examples, the transaction fee is a combined percentage and flat fee.
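The three fee variants named above (a percentage of the amount transferred from the backup account, a flat fee, or a combination of both) reduce to one formula. The sketch below is illustrative only; the function and parameter names are assumptions, and amounts are in cents:

```python
def backup_account_charge(amount_cents: int, percent_fee: float = 0.0,
                          flat_fee_cents: int = 0) -> int:
    """Total charged to the backup (e.g., credit) account: the amount
    transferred from that account plus a fee that may be a percentage of
    that amount, a flat fee, or a combination of the two."""
    fee_cents = round(amount_cents * percent_fee) + flat_fee_cents
    return amount_cents + fee_cents
```

With a 2% fee, for example, an $8.00 remainder charged to a credit account would total `backup_account_charge(800, percent_fee=0.02)`, i.e., 816 cents; with no fee configured, the charge equals the transferred amount.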
[0719] In some examples, receiving the request to participate in the transfer of resources includes receiving a sequence of one or more inputs from the user to transmit the resources to another participant (e.g., 2210, 2310) in a message conversation (e.g., 2208, 2308).
[0720] In some examples, receiving the request to participate in the transfer of resources includes receiving information from an external source with information about a transaction and receiving a sequence of one or more inputs from the user to transmit resources selected based on the information from the external source.
[0721] In some examples, receiving the request to participate in the transfer of resources includes receiving a sequence of one or more inputs from the user that authorizes transmission of restricted credentials to a nearby device via a short range wireless communication.
[0722] In some examples, proceeding with the transfer of resources using the first resource account and the second resource account is in accordance with a determination that a split account option (e.g., a user setting for enabling/disabling automatic transfer of resources using two or more different resource accounts) is enabled on the device. In some examples, the default state is that the split account option is enabled on the device. In some examples, in accordance with a determination that a split account option is not enabled on the device, the electronic device (e.g., 2200, 2300) displays, on the display (e.g., 2202, 2302), a notification (e.g., a pop-up notification, a prompt) that the requested resource amount is greater than the amount of resources available via the first resource account (e.g., that the first resource account does not have sufficient resources to cover the requested resource amount of the resource transfer). In some examples, when the split account option is not enabled on the device, the electronic device forgoes proceeding with the transfer of resources using the first resource account and the second resource account, and, optionally, proceeds with the transfer of resources using the second resource account (and not using the first resource account).
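Taken together with blocks 2418 and 2428 of method 2400, the split account option yields a three-way account-selection decision. The sketch below is illustrative only; the function name, string labels, and cent-denominated amounts are assumptions, not part of the disclosure:

```python
def accounts_for_transfer(amount_cents: int, primary_balance_cents: int,
                          split_enabled: bool = True) -> list[str]:
    """Decide which account(s) to use for a transfer: only the primary
    account when its balance covers the amount; otherwise split across
    the primary and backup accounts when the split option is enabled,
    or fall back to the backup account alone when it is disabled."""
    if amount_cents <= primary_balance_cents:
        return ["primary"]
    if split_enabled:
        return ["primary", "backup"]
    return ["backup"]
```

For the running example, a $28 request against a $20 balance selects both accounts when the option is enabled and only the backup account when it is disabled.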
[0723] In some examples, prior to receiving the request to participate in the transfer of resources, the electronic device (e.g., 2200, 2300) receives one or more inputs selecting a different (e.g., third) resource account for use in the transfer. When the device receives the request to participate in the transfer, the device uses the selected (e.g., the different, third) resource account for use in the transfer rather than the first resource account. In some examples, the electronic device displays, on the display (e.g., 2202, 2302), a selectable representation (e.g., a graphical representation, a textual representation) of the second (or third) resource account (e.g., a backup resource account, a different type of resource account from the first resource account). In some examples, the electronic device receives user selection of the selectable representation of the second (or third) resource account. In response to receiving the user selection of the selectable representation of the second resource account, the electronic device selects the second (or third) resource account for use in the transfer (e.g., without using the first resource account in the transfer).
[0724] In some examples, prior to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, the electronic device (e.g., 2200, 2300) displays, on the display (e.g., 2202, 2302), a message conversation (e.g., 2208, 2308) of a messaging application (e.g., 2206, 2306) between a plurality of participants (e.g., the user of the device and a contact of the user, 2210, 2310). In some examples, the initiation input corresponds to user selection of a resource message object (e.g., a message bubble having an indication of the requested resource amount, an email message having an indication of the requested resource amount) received from a first participant of the message conversation. In some examples, the request to participate in the transfer of resources is received while displaying, on the display (e.g., 2202, 2302), the message conversation (e.g., 2208, 2308) of the messaging application (e.g., 2206, 2306) between a plurality of participants, and wherein the initiation input corresponds to user selection (e.g., tap input on a touch-screen display) of a resource message object (e.g., a message bubble having an indication of the requested resource amount, an email message having an indication of the requested resource amount). In response to detecting the initiation input (e.g., in response to detecting the user selection (e.g., activation) of the resource message object or in response to selection of a payment send affordance in a payment creation interface), the electronic device concurrently displays, on the display (e.g.,
2202, 2302), a representation (e.g., 2366, a graphical representation, a textual representation) of the first resource account (e.g., an account stored in a secure element of the device) and a representation (e.g., 2380, a graphical representation, a textual representation) of the second resource account (e.g., a backup resource account, a debit account, a checking account). In some examples, the electronic device displays, on the display (e.g., 2202, 2302), a transfer user interface (e.g., a resource-numerical value selection user interface for receiving user adjustment of the amount of resources, such as points, credits, or funds), wherein the transfer user interface includes an indication (e.g., a list that includes a representation of the first resource account and a representation of the second resource account) that resources from the first resource account and the second resource account will be used for the transfer. In some examples, the representation of the first resource account is a graphical representation of the account, such as a thumbnail image of a card associated with the account. In some examples, the representation of the first resource account is a textual representation of the account, such as an identification number (e.g., identification number, card number) associated with the account. In some examples, the representation of the second resource account is a graphical representation of the account, such as a thumbnail image of a card associated with the account. In some examples, the representation of the second resource account is a textual representation of the account, such as an identification number (e.g., identification number, card number) associated with the account. In some examples, the message conversation is concurrently displayed with the representations of the resource accounts.
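The disclosure does not specify how a requested amount is divided between the first and second resource accounts when both will be used. One plausible rule, sketched here purely as an assumption, is to draw the first account down before touching the backup account:

```python
def split_transfer(amount, first_balance, second_balance):
    """Illustrative split rule (an assumption, not from the disclosure):
    draw as much as possible from the first resource account, then cover
    the remainder from the second (backup) resource account."""
    from_first = min(amount, first_balance)
    from_second = amount - from_first
    if from_second > second_balance:
        raise ValueError("insufficient resources across both accounts")
    return {"first": from_first, "second": from_second}

# A $50 transfer with only $30 in the first (payment) account:
print(split_transfer(50, 30, 100))  # {'first': 30, 'second': 20}
```

A transfer user interface like the one described above could then list both accounts alongside the portions this rule computes.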
[0725] In some examples, concurrently displaying, on the display (e.g., 2202, 2302), the representation (e.g., 2330, 2366, a graphical representation, a textual representation) of the first resource account (e.g., an account stored in a secure element of the device) and the representation (e.g., 2324, 2380, a graphical representation, a textual representation) of the second resource account includes displaying a transaction detail region that also includes additional information about the transaction (e.g., a total price, shipping information, tax, etc.) and instructions for providing authorization information (e.g., a passcode or a biometric authorization such as a fingerprint or face) to authorize participation in the transaction.
[0726] Note that details of the processes described above with respect to method 2400 (e.g., FIGS. 24A-24C) are also applicable in an analogous manner to the methods described herein. For example, method 2400 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1200, 1500, 1800, 2100, 2700, 3000, and 3400. For example, displaying a transfer user interface for initiating transfer of a first type of item (e.g., a photo, stickers, resources, payments) between participants as described in method 900 can apply when adjusting the transfer amount to send using both the first resource account and the second resource account. For another example, the outputting of feedback, as described in method 1200, can be applied to a transfer message object made using resources from both the first resource account and the second resource account via a messaging application (e.g., 2206, 2306). For another example, the different visual appearances of a message object based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be applied to a transfer message object made using resources from both the first resource account and the second resource account via a messaging application (e.g., 2206, 2306). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied when setting up the first resource account. For another example, switching the account to be used in a resource transfer based on an indication that resources are insufficient in the currently-selected account, as described in method 2100, can be used when proceeding with a transfer using a single account that is not the first resource account when the first resource account has insufficient resources.
For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can include information associated with the first resource account and with the second resource account. For another example, an utterance can be used, as described in method 3000, to initiate a transfer (e.g., initiate a payment) using both the first resource account and the second resource account. For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to one or more elements (e.g., one or more user interface objects on a surface of a graphical representation of an account, one or more patterns) of a graphical representation (e.g., 2330) of a payment account when the payment account is ready to be used in a transfer (e.g., of a resource, of a file, of a payment) and/or when a transfer (e.g., of a
resource, of a file, of a payment) using the payment account is completed. For brevity, these details are not repeated below.
[0727] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 24A-24C are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, receiving operation 2410, proceeding operation 2424, and proceeding operation 2432 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
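The event flow described in the preceding paragraph (an event sorter delivering event information to recognizers, which compare it against event definitions and activate handlers on a match) can be sketched as follows. This is a deliberately simplified illustration of the pattern, not the actual implementation of the components numbered above.

```python
class EventRecognizer:
    """Compares event information against an event definition and, on a
    match, activates the associated event handler (simplified sketch)."""

    def __init__(self, definition, handler):
        self.definition = definition  # predicate for a predefined event/sub-event
        self.handler = handler        # e.g., updates application state and the GUI

    def recognize(self, event):
        if self.definition(event):
            self.handler(event)
            return True
        return False

class EventSorter:
    """Delivers event information to each registered recognizer in turn."""

    def __init__(self, recognizers):
        self.recognizers = recognizers

    def dispatch(self, event):
        # any() stops at the first recognizer that handles the event.
        return any(r.recognize(event) for r in self.recognizers)

selected = []
sorter = EventSorter([
    EventRecognizer(
        lambda e: e.get("target") == "ui_object" and e.get("type") == "tap",
        selected.append),
])
sorter.dispatch({"target": "ui_object", "type": "tap"})   # matches: handler runs
sorter.dispatch({"target": "background", "type": "tap"})  # no match
```

The handler here merely records the event; in the described system it would correspond to calling a data updater, object updater, or GUI updater.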
[0728] FIGS. 25A-25C illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 25A-25C relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 26A-26T.
[0729] FIG. 25A illustrates an electronic device 2500 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 25A-25C, electronic device 2500 is a smartphone. In other embodiments, electronic device 2500 can be a different type of electronic device, such as a wearable device (e.g., a
smartwatch). Electronic device 2500 has a display 2502 and, optionally, one or more input devices (e.g., a touchscreen of display 2502, a mechanical button 2505, a mic).
[0730] In FIG. 25A, electronic device 2500 displays, on display 2502, a message conversation 2508 of a messaging application 2506 between a user of the device (e.g., “Kate Appleseed”) and a message participant 2510 (e.g., “John Appleseed”). In some embodiments, message participant 2510 is a contact stored on the device. In some embodiments, message participant 2510 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2510 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0731] In some embodiments, electronic device 2500 also displays, on display 2502, a virtual keyboard 2512 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 2514 displaying the text of a message as a message is typed using virtual keyboard 2512. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 2512 to type a message. In some embodiments, compose bar 2514 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 2514 includes a mic button 2516 which, when activated, enables the user to enter a message using voice input.
[0732] As shown in FIG. 25A, message conversation 2508 includes a message object 2518 corresponding to a message sent by the user to message participant 2510. In the message corresponding to message object 2518, the user asks message participant 2510: “Can you send me the video from last night?” As also shown in FIG. 25A, message conversation 2508 includes a transfer message object 2520 sent by message participant 2510 to the user. Transfer message object 2520 corresponds to a transmission of a file (e.g., a video file) that is requested by the user in the message corresponding to message object 2518.
[0733] In some embodiments, as shown in FIG. 25A, transfer message object 2520 includes an attachment object 2522 corresponding to a file (e.g., a video file) that is being transmitted via transfer message object 2520. For example, in FIG. 25A, the file is a video file, and thus
attachment object 2522 corresponds to a video file. In some embodiments, as also shown in FIG. 25A, transfer message object 2520 also includes a status indicator 2524 (e.g., stating “PENDING”) informing the user that the file (e.g., the video file corresponding to attachment object 2522) has not yet been accepted (e.g., viewed or downloaded) by the user.
[0734] In FIG. 25B, electronic device 2500 displays, on display 2502, a message conversation 2509 (different from message conversation 2508) of messaging application 2506 between the user of the device (e.g., “Kate Appleseed”) and a message participant 2530 (e.g., “Sarah James”). In some embodiments, message participant 2530 is a contact stored on the device. In some embodiments, message participant 2530 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2530 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0735] As shown in FIG. 25B, message conversation 2509 includes a message object 2532 corresponding to a message sent by message participant 2530 to the user. In the message corresponding to message object 2532, message participant 2530 states to the user: “Last night was fun!” As also shown in FIG. 25B, message conversation 2509 includes a transfer message object 2534 sent by the user to message participant 2530. Transfer message object 2534 corresponds to a request for transfer of photos (e.g., 5 photos) from the time period (e.g., last night) mentioned by message participant 2530 in the message corresponding to message object 2532.
[0736] In some embodiments, transfer message object 2534 includes a request indicator 2535 (e.g., a symbol “#”) indicating to the user that the message object corresponds to a request for a transfer of files (e.g., photos). In some embodiments, transfer message object 2534 also includes a textual indication 2536 (e.g., stating “5 photos from last night request”) indicating a number (e.g., “5”) of the files (e.g., photos) being requested to be transferred and a description (e.g., “from last night”) of the specific type of files that are being requested to be transferred. In some embodiments, transfer message object 2534 also includes a status indicator 2538 (e.g., stating “PENDING”) informing the user that the request for transfer has not yet been accepted by message participant 2530.
[0737] In FIG. 25C, electronic device 2500 displays, on display 2502, an attachments details user interface 2540 that includes details associated with attachments sent, received, and/or requested using messaging application 2506 with various contacts. In some embodiments, attachments details user interface 2540 includes a graphical representation 2542 of a user account logged onto the device and associated with the sent, received, and/or requested attachments using messaging application 2506.
[0738] In some embodiments, attachments details user interface 2540 includes a plurality of attachment detail items, each corresponding to an attachment (e.g., a photo, a video file, an audio file, a document) sent to, received from, or requested from or by a contact associated with the user account logged onto electronic device 2500. In some embodiments, attachments details user interface 2540 includes one or more incoming items 2550 corresponding to incoming (i.e., received) attachments and/or incoming requests for transmission of an attachment. For example, in FIG. 25C, incoming items 2550 include a first incoming item 2552 of a video file corresponding to the video file associated with transfer message object 2520 received from message participant 2510 (e.g., “John Appleseed”), as shown in FIG. 25A. In some embodiments, first incoming item 2552 includes an indication 2552A (e.g., stating “John Appleseed”) of the contact associated with the item and a selectable indication 2552B of the attachment (e.g., the video file corresponding to attachment object 2522) which, when selected, causes the device to display a details screen that includes details about the video file corresponding to attachment object 2522.
[0739] In some embodiments, attachments details user interface 2540 includes one or more outgoing items 2554 corresponding to outgoing (i.e., transmitted) attachments and/or outgoing requests for transmission of an attachment. For example, in FIG. 25C, outgoing items 2554 include a first outgoing item 2556 of a pending request for photos corresponding to the request for “5 photos from last night” associated with transfer message object 2534 sent to message participant 2530 (e.g., “Sarah James”), as shown in FIG. 25B. In some embodiments, first outgoing item 2556 includes an indication 2556A (e.g., stating “Sarah James”) of the contact associated with the item and a selectable indication 2556B of the request for attachments (e.g., the “5 photos from last night”) which, when selected, causes the device to display a details
screen that includes details about the pending request for the photos corresponding to the transfer message object 2534.
[0740] In some embodiments, attachments details user interface 2540 includes one or more today items 2558 corresponding to incoming and/or outgoing attachments and/or requests for transmission of an attachment from the current day. For example, in FIG. 25C, today items 2558 include a first today item 2560 corresponding to an attachment of 4 photos (e.g., as indicated by note indication 2560B stating “Team Photos” and selectable indication 2560C) sent to a different message participant (e.g., “Matthew Smith,” as indicated by indication 2560A) and a second today item 2562 corresponding to an attachment of a birthday video (e.g., as indicated by note indication 2562B stating “Happy Birthday” and selectable indication 2562C) sent to message participant 2510 (e.g., as indicated by indication 2562A), where the video file corresponding to second today item 2562 is different from the video file corresponding to first incoming item 2552 (which in turn corresponds to transfer message object 2520).
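The sectioning of the attachments details user interface described above (incoming items, outgoing items, and today items) can be modeled with a simple grouping function. The field names and the rule that current-day items take precedence over the incoming/outgoing split are assumptions for illustration, inferred from the examples in FIGS. 25A-25C.

```python
def group_attachment_items(items, today):
    """Group attachment detail items into the sections of the attachments
    details user interface. Items from the current day go in 'today';
    older items split into 'incoming' and 'outgoing' (illustrative rule)."""
    sections = {"incoming": [], "outgoing": [], "today": []}
    for item in items:
        if item["date"] == today:
            sections["today"].append(item)
        elif item["direction"] == "incoming":
            sections["incoming"].append(item)
        else:
            sections["outgoing"].append(item)
    return sections

items = [
    {"contact": "John Appleseed", "direction": "incoming",
     "kind": "video", "date": "May 10"},
    {"contact": "Sarah James", "direction": "outgoing",
     "kind": "photo request", "date": "May 12"},
    {"contact": "Matthew Smith", "direction": "outgoing",
     "kind": "4 photos", "date": "May 16"},
]
grouped = group_attachment_items(items, "May 16")
print(len(grouped["today"]))  # the "Team Photos" item lands in today
```

Each grouped item would then render with its contact indication (e.g., 2552A) and selectable attachment indication (e.g., 2552B).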
[0741] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 25A-25C described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 26A-26T described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 25A-25C and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 26A-26T are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 2500, or 2600).
[0742] FIGS. 26A-26T illustrate exemplary user interfaces for generating and displaying a transfers history list, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 27A-27E.
[0743] FIG. 26A illustrates an electronic device 2600 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 26A-26T, electronic device 2600 is a smartphone. In other embodiments, electronic
device 2600 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 2600 has a display 2602.
[0744] In FIG. 26A, electronic device 2600 displays, on display 2602, a message conversation 2608 of a messaging application 2606 between a user of the device (e.g., “Kate Appleseed”) and a message participant 2610 (e.g., “John Appleseed”). In some embodiments, message participant 2610 is a contact stored on the device. In some embodiments, message participant 2610 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2610 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0745] In some embodiments, electronic device 2600 also displays, on display 2602, a virtual keyboard 2612 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 2614 displaying the text of a message as a message is typed using virtual keyboard 2612. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 2612 to type a message. In some embodiments, compose bar 2614 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 2614 includes a mic button 2614A which, when activated, enables the user to enter a message using voice input.
[0746] As shown in FIG. 26A, message conversation 2608 includes a message object 2616 corresponding to a message sent by message participant 2610 to the user. In message object 2616, message participant 2610 states to the user: “Dinner and the cab ride together was $28.” As also shown in FIG. 26A, message conversation 2608 includes a follow-up payment message object 2618 sent by message participant 2610 to the user. Payment message object 2618 (e.g., corresponding to a payment message object associated with a payment request received by the user, as described above with reference to, for example, payment message object 1490 in FIG. 14L) corresponds to a payment request (e.g., of $28 for the dinner and the cab ride indicated in the message corresponding to message object 2616). In some embodiments, as shown in FIG. 26A, payment message object 2618 (associated with a payment request received by the user) includes a mode indication 2620 (e.g., stating “PAY”) indicating to the user that the payment
message object corresponds to a payment request (or payment) made via an operating system-controlled payment transfer application (and not by a third-party application). As shown in FIG. 26A, payment message object 2618 also includes an amount indication 2622 informing the recipient (e.g., the user) of the amount of the requested payment (e.g., “$28”) and a further indication (e.g., “$28 Request”) that the payment message object corresponds to a request for payment. In some embodiments, as shown in FIG. 26A, payment message object 2618 also includes an accept button 2624 for accepting the payment request (e.g., for agreeing to make the requested payment and proceed with making the requested payment). In some embodiments, as shown in FIG. 26A, payment message object 2618 also includes a status indicator 2626 informing the user of a status of the payment request corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 26A, status indicator 2626 shows “pending,” thus indicating to the user that the payment request associated with payment message object 2618 has not yet been accepted by the user. In some embodiments, as shown in FIG. 26A, message conversation 2608 includes a note message object 2628 that accompanies the payment message object (e.g., stating “Dinner + Cab”) corresponding to a note (e.g., a comment, a message) related to the payment request.
[0747] In FIG. 26B, electronic device 2600 displays, on display 2602, a message conversation 2630 (different from message conversation 2608) of messaging application 2606 between the user of the device (e.g., “Kate Appleseed”) and a message participant 2621 (e.g., “Sarah James”). In some embodiments, message participant 2621 is a contact stored on the device. In some embodiments, message participant 2621 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2621 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0748] As shown in FIG. 26B, message conversation 2630 includes a message object 2632 corresponding to a message sent by message participant 2621 to the user. In message object 2632, message participant 2621 states to the user: “Let me know how much I owe you for brunch.” As also shown in FIG. 26B, message conversation 2630 includes a payment message object 2634 sent by the user to message participant 2621. Payment message object 2634 (e.g.,
corresponding to a payment message object associated with a payment request made by the user, as described above with reference to, for example, payment message object 1460 in FIGS. 14G-14K) corresponds to a payment request (e.g., of $35 for the brunch indicated in the message corresponding to message object 2632). In some embodiments, as shown in FIG. 26B, payment message object 2634 (associated with a payment request made by the user) includes mode indication 2620 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment request (or payment) made via an operating system-controlled payment transfer application (and not by a third-party application). As shown in FIG. 26B, payment message object 2634 also includes amount indication 2622 informing the recipient (e.g., message participant 2621) of the amount of the requested payment (e.g., “$35”) and a further indication (e.g., “$35 Request”) that the payment message object corresponds to a request for payment. In some embodiments, as shown in FIG. 26B, payment message object 2634 also includes a first status indicator 2626 informing the user of a status of the payment request corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 26B, status indicator 2626 shows “pending,” thus indicating to the user that the payment request associated with payment message object 2634 has not yet been accepted by message participant 2621. In some embodiments, as shown in FIG. 26B, payment message object 2634 also includes (in addition to or instead of first status indicator 2626), a second status indicator 2636 informing the user of the status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG.
26B, second status indicator 2636 (e.g., “pending”) shows the same status as shown by first status indicator 2626 (e.g., “pending”). In some embodiments, as shown in FIG. 26B, message conversation 2630 includes a note message object 2638 that accompanies the payment message object (e.g., stating “Brunch”) corresponding to a note (e.g., a comment, a message) related to the payment request.
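The anatomy of a payment message object described in these paragraphs (a mode indication, an amount indication with an optional "Request" suffix, and first/second status indicators that mirror the same status) could be modeled as follows. This is a sketch under assumed names, not the disclosed implementation.

```python
class PaymentMessageObject:
    """Sketch of a payment message object's display state."""
    STATUSES = ("pending", "paid", "accepted", "expired")

    def __init__(self, amount, is_request):
        self.amount = amount
        self.is_request = is_request
        self.status = "pending"  # a new payment or payment request starts pending

    def amount_indication(self):
        # "$35 Request" for a payment request, "$40" for a payment.
        suffix = " Request" if self.is_request else ""
        return f"${self.amount}{suffix}"

    def status_indicators(self):
        # The first and second status indicators show the same status.
        return (self.status, self.status)

request = PaymentMessageObject(35, is_request=True)
print(request.amount_indication())   # $35 Request
request.status = "paid"
print(request.status_indicators())   # ('paid', 'paid')
```

When the counterpart accepts, only the shared status field needs to change for both indicators to update together.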
[0749] In FIG. 26C, electronic device 2600 displays, on display 2602, a message conversation 2640 (different from message conversations 2608 and 2630) of messaging application 2606 between the user of the device (e.g., “Kate Appleseed”) and a message participant 2631 (e.g., “Matthew Smith”). In some embodiments, message participant 2631 is a contact stored on the device. In some embodiments, message participant 2631 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 2631 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0750] As shown in FIG. 26C, message conversation 2640 includes a message object 2642 corresponding to a message sent by message participant 2631 to the user. In message object 2642, message participant 2631 states to the user: “Team fees this season are $40 per player. See you at the game!” As also shown in FIG. 26C, message conversation 2640 includes a payment message object 2644 sent by the user to message participant 2631. Payment message object 2644 (e.g., corresponding to a payment message object associated with a payment made by the user, as described above with reference to, for example, payment message object 1420 in FIGS. 14B-14F) corresponds to a payment (e.g., of $40 for the team fees indicated in the message corresponding to message object 2642). In some embodiments, as shown in FIG. 26C, payment message object 2644 (associated with a payment made by the user) includes mode indication 2620 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment request (or payment) made via an operating system-controlled payment transfer application (and not by a third-party application). As shown in FIG. 26C, payment message object 2644 also includes amount indication 2622 informing the recipient (e.g., message participant 2631) of the amount of the payment made (e.g., “$40”). In some embodiments, as shown in FIG. 26C, payment message object 2644 also includes a first status indicator 2626 informing the user of a status of the payment corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 26C, first status indicator 2626 shows “paid,” thus indicating to the user that the payment request associated with payment message object 2644 has been accepted by message participant 2631. In some embodiments, as shown in FIG.
26C, payment message object 2644 also includes (in addition to or instead of first status indicator 2626), a second status indicator 2636 informing the user of the status of the payment corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 26C, second status indicator 2636 (e.g., “paid”) shows the same status as shown by first status indicator 2626 (e.g., “paid”). In some embodiments, as shown in FIG. 26C, message conversation 2640 includes a note message object 2646 that
accompanies the payment message object (e.g., stating “Team Fees”) corresponding to a note (e.g., a comment, a message) related to the payment.
[0751] In FIG. 26C, while displaying message conversation 2640 with message participant 2631, electronic device 2600 detects user selection of payment message object 2644. For example, as shown in FIG. 26C, the user selection is a tap gesture 2601 on payment message object 2644.
[0752] In FIG. 26D, in response to detecting tap gesture 2601 on payment message object 2644, electronic device 2600 displays, on display 2602, a transaction details user interface 2648 that includes transaction details associated with the payment (or payment request) corresponding to the selected payment message object (e.g., payment message object 2644). In some embodiments, transaction details user interface 2648 includes a payment message object image 2650 corresponding to the selected payment message object (e.g., payment message object 2644). In some embodiments, transaction details user interface 2648 includes an indication 2652 of the note (e.g., stating “Team Fees”) corresponding to note message object 2646. In some embodiments, transaction details user interface 2648 includes a plurality of transaction details 2648A-G related to the payment made via payment message object 2644. For example, transaction details user interface 2648 includes an indication 2648A of the payment account (e.g., “From Kate’s Payment account”) used in the transaction and the amount (e.g., “$40”) that was withdrawn from the used account and an indication 2648B of the total amount (e.g., “$40”) of the transaction. For another example, transaction details user interface 2648 includes an indication 2648C of the account details (e.g., account number) of the used account (e.g., Kate’s payment account). For another example, transaction details user interface 2648 includes an indication 2648D of the recipient (e.g., message participant 2631, “Matthew Smith”) of the payment. For another example, transaction details user interface 2648 includes an indication 2648E of the date and time when the payment was sent (by the user) and an indication 2648F of the date and time when the payment was accepted (by the recipient, message participant 2631). For another example, transaction details user interface 2648 includes an indication 2648G of the transaction number.
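The transaction detail fields enumerated above (indications 2648A-2648G) map naturally onto a record type. The field names and sample values in this sketch are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransactionDetails:
    """Illustrative record of the fields shown on the transaction
    details user interface (indications 2648A-2648G)."""
    payment_account: str        # 2648A: account used and amount withdrawn
    total_amount: str           # 2648B: total amount of the transaction
    account_number: str         # 2648C: details of the used account
    recipient: str              # 2648D: recipient of the payment
    sent_at: str                # 2648E: date and time the payment was sent
    accepted_at: Optional[str]  # 2648F: None while the payment is pending
    transaction_number: str     # 2648G

details = TransactionDetails(
    payment_account="Kate's Payment account",
    total_amount="$40",
    account_number="1234",
    recipient="Matthew Smith",
    sent_at="May 16, 1:00 PM",
    accepted_at="May 16, 1:05 PM",
    transaction_number="TXN-0001",
)
```

Making the acceptance timestamp optional reflects that a pending transaction has no acceptance date to display yet.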
[0753] In some embodiments, as shown in FIG. 26D, transaction details user interface 2648 includes a wallet button 2654 for viewing the transaction details in a wallet application (e.g., corresponding to wallet user interface 2022 described above with reference to FIGS. 20A-20J). In some embodiments, as shown in FIG. 26D, transaction details user interface 2648 includes a send again button 2656 for sending a new payment with corresponding payment details (e.g., the same amount, the same intended recipient, from the same payment account) of the currently-shown payment. In some embodiments, as shown in FIG. 26D, transaction details user interface 2648 includes a refund button 2658 for requesting a return of the payment corresponding to the currently-shown transaction details page. In some embodiments, a refund can be requested for a completed transaction (where the recipient has accepted the payment) but not for a pending transaction. Thus, in some embodiments, if the payment corresponds to a pending transaction (and not to a completed transaction), a refund button (e.g., refund button 2658) is not shown on the transaction details user interface. In some embodiments, transaction details user interface 2648 includes a return button 2613 (e.g., stating the name of message participant 2631 associated with the current transaction) which, when selected, causes the device to return to message conversation 2640 (and cease to display the transaction details user interface).
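The rule that the refund button appears only for completed transactions can be expressed as a small predicate over the transaction status. The status names follow the indicator values quoted earlier in this section; the function and action names are assumptions.

```python
def transaction_actions(status):
    """Buttons shown on the transaction details page. A refund can be
    requested only for a completed transaction (the recipient has accepted
    the payment), never for a pending one (illustrative rule)."""
    actions = ["wallet", "send again"]
    if status in ("paid", "accepted"):  # completed transaction
        actions.append("refund")
    return actions

print(transaction_actions("paid"))     # refund offered
print(transaction_actions("pending"))  # refund withheld
```

Keeping the visibility decision in one place makes the pending-versus-completed distinction easy to test.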
[0754] In FIG. 26E, while displaying transaction details user interface 2648, electronic device 2600 detects user selection of wallet button 2654 to view account details of the account that was used in the current transaction (e.g., the payment account, as shown in, for example, 2648A and 2648C). For example, as shown in FIG. 26E, the user selection is a tap gesture 2603 on wallet button 2654.
[0755] As shown in FIG. 26F, in response to detecting tap gesture 2603 on wallet button 2654 from transaction details user interface 2648, electronic device 2600 displays, on display 2602, a transactions history user interface 2661 including a list of pending and past transactions associated with the currently-viewed account (e.g., the payment account). For example, in FIG. 26E, the account that was used for the payment corresponding to payment message object 2644 was the payment account. Therefore, in response to detecting user selection 2603 on wallet button 2654 from the transaction details user interface associated with payment message object
2644, the electronic device displays, on display 2602, transactions history user interface 2661 corresponding to the payment account.
[0756] In some embodiments, as shown in FIG. 26F, transactions history user interface 2661 includes a graphical representation 2662 (e.g., a thumbnail image, a mini-image) corresponding to the currently-viewed account (e.g., the payment account). In some embodiments, transactions history user interface 2661 includes a balance indication 2664 (e.g., “$215”) of the currently-available amount of funds in the payment account. In some embodiments, transactions history user interface 2661 includes a switch bar 2666 for switching between an account information view (e.g., corresponding to an account information user interface associated with the currently-viewed account (e.g., the payment account)) and an account history view (e.g., corresponding to the transactions history user interface). The account information view corresponds to an info tab 2666A and the account history view corresponds to a transactions tab 2666B. As shown in FIG. 26F, because the transactions history user interface is currently displayed (instead of an account information user interface), transactions tab 2666B is highlighted (e.g., marked with thicker borders) to indicate to the user that the currently-displayed view corresponds to the transactions history user interface.
[0757] As also shown in FIG. 26F, transactions history user interface 2661 includes one or more transaction items (e.g., a pending transaction item corresponding to a currently-pending transaction or a past transaction item corresponding to a completed transaction). In some embodiments, as shown in FIG. 26F, the one or more items are organized based on pending transactions (e.g., incoming pending requests 2668 and outgoing pending requests 2672), and completed transactions (e.g., today transactions 2676 corresponding to transactions completed today and earlier transactions 2682 corresponding to transactions completed earlier than today). In some embodiments, completed transactions are organized (in chronological order) by days up to a predetermined point (e.g., today, yesterday, Tuesday, Monday, then “earlier”).
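The grouping behavior described in paragraph [0757] (pending requests first, then completed transactions bucketed by day up to a predetermined point such as “earlier”) can be sketched as follows. This is an illustrative sketch only; the data model, field names, and section labels are assumptions and not part of the disclosure:

```python
from datetime import date, timedelta

def section_label(txn_date, today):
    """Bucket a completed transaction by day, up to a predetermined point
    (today, yesterday, weekday name for the past week, then "Earlier")."""
    delta = (today - txn_date).days
    if delta == 0:
        return "Today"
    if delta == 1:
        return "Yesterday"
    if delta < 7:
        return txn_date.strftime("%A")  # e.g., "Tuesday"
    return "Earlier"

def group_transactions(transactions, today):
    """Organize items as in FIG. 26F: pending incoming and outgoing
    requests first, then completed transactions in day-based sections."""
    sections = {"Incoming Requests": [], "Outgoing Requests": []}
    for txn in transactions:
        if txn["status"] == "pending":
            key = "Incoming Requests" if txn["incoming"] else "Outgoing Requests"
        else:
            key = section_label(txn["date"], today)
        sections.setdefault(key, []).append(txn)
    return sections
```

The sections preserve the ordering of the interface: pending items surface above the chronological history, and anything older than the named-day window collapses into a single “Earlier” section.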
[0758] For example, in FIG. 26F, incoming requests 2668 includes an incoming payment request item 2670 from message participant 2610 (e.g., “John Appleseed”) corresponding to the payment request associated with payment message object 2618 described above with reference to FIG. 26A. Incoming payment request item 2670 includes an indication 2670A (e.g., showing
“John Appleseed”) of the sender (e.g., message participant 2610) of the incoming payment request. In some embodiments, because incoming payment request item 2670 corresponds to a payment request that is pending, the item also includes a pay button 2670B for paying the requested payment. In some embodiments, incoming payment request item 2670 is selectable. In response to receiving user selection of incoming payment request item 2670, the device displays a participant history user interface associated with the payments and/or payment requests corresponding to the message participant associated with the selected item.
[0759] Further, in FIG. 26F, outgoing requests 2672 includes an outgoing payment request item 2674 from the user to message participant 2621 (e.g., “Sarah James”) corresponding to the payment request associated with payment message object 2634 described above with reference to FIG. 26B. Outgoing payment request item 2674 includes an indication 2674A (e.g., showing “Sarah James”) of the recipient (e.g., message participant 2621) of the outgoing payment request. In some embodiments, because outgoing payment request item 2674 corresponds to a payment request that the recipient (e.g., message participant 2621) has not yet accepted, and thus is still pending, the item also includes a reminder button 2674B for sending a reminder to the recipient (e.g., message participant 2621) to make the requested payment. In some embodiments, outgoing payment request item 2674 is selectable. In response to receiving user selection of outgoing payment request item 2674, the device displays a participant history user interface associated with the payments and/or payment requests corresponding to the message participant associated with the selected item.
[0760] Further in FIG. 26F, today transactions 2676 (or transactions completed during the current day) includes a completed outgoing payment item 2678 associated with a payment from the user to message participant 2631 (e.g., “Matthew Smith”) corresponding to the payment associated with payment message object 2644 described above with reference to FIG. 26C and a completed incoming payment item 2680 (e.g., from message participant 2610, “John Appleseed”). Completed outgoing payment item 2678 includes an indication 2678A (e.g., showing “Matthew Smith”) of the recipient (e.g., message participant 2631) of the completed outgoing payment and an indication 2678B of an amount (e.g., “$40”) of the payment made. In some embodiments, indication 2678B shows the amount (e.g., “$40”) without a positive (e.g.,
“+”) or negative (e.g., “-”) indicator to inform the user that the item corresponds to an outgoing payment (e.g., a payment made by the user to a recipient). Completed outgoing payment item 2678 also includes an indication 2678C of other details associated with the completed transaction, such as a note (e.g., stating “Team Fees,” corresponding to the note of note message object 2646) associated with the transaction and a time and/or date of when the payment was sent. Completed incoming payment item 2680 includes an indication 2680A of the sender (e.g., message participant 2610, “John Appleseed”) of the incoming payment and an indication 2680B of an amount (e.g., “$50”) of the received payment. In some embodiments, indication 2680B shows the amount (e.g., “$50”) with a positive (e.g., “+”) indicator to inform the user that the item corresponds to a received payment (e.g., a payment received by the user from a sender). Completed incoming payment item 2680 also includes an indication 2680C of other details associated with the completed transaction, such as a note (e.g., stating “Happy Birthday”) associated with the transaction and a time and/or date of when the payment was received. Further, in some embodiments, items within today transactions 2676 (e.g., completed outgoing payment item 2678 and completed incoming payment item 2680) are selectable. In response to receiving user selection of a transaction item from among the today transactions 2676, the device displays a participant history user interface associated with the payments and/or payment requests corresponding to the message participant associated with the selected item.
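The sign convention described above (no sign indicator for outgoing payments such as “$40”, a “+” prefix for received payments such as “+$50”) could be rendered roughly as follows; the function name and the integer-cents representation are illustrative assumptions:

```python
def format_amount(amount_cents, incoming):
    """Render a transaction amount as in FIG. 26F: incoming payments
    carry a "+" prefix, outgoing payments carry no sign indicator.
    Whole-dollar amounts omit the cents, matching "$40" vs "$4.75"."""
    dollars = amount_cents // 100
    cents = amount_cents % 100
    text = f"${dollars}" if cents == 0 else f"${dollars}.{cents:02d}"
    return f"+{text}" if incoming else text
```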
[0761] Further in FIG. 26F, earlier transactions 2682 (or transactions completed during a day earlier than the current day) includes a completed outgoing payment item 2684 associated with a payment made by the user to a commercial entity (e.g., a coffee shop), as indicated by indication 2684A. In some embodiments, transactions with commercial entities can be made in the same manner as transactions with non-commercial entities, such as message participants 2610, 2621, and 2631. As with other items, completed outgoing payment item 2684 includes indication 2684A of the name of the recipient (or commercial entity) (e.g., “Coffee Shop”), an indication 2684B of the payment made to the commercial entity (e.g., “$4.75”), and an indication 2684C of the time and/or date of the commercial transaction.
[0762] In FIG. 26G, while displaying transactions history user interface 2661, electronic device 2600 detects user selection of incoming payment request item 2670 (e.g., at a region other
than the region corresponding to pay button 2670B) of incoming request items 2668. For example, as shown in FIG. 26G, the user selection is a tap gesture 2605 on incoming payment request item 2670 (e.g., at a region other than the region corresponding to pay button 2670B).
[0763] In FIG. 26H, in response to detecting tap gesture 2605, electronic device 2600 displays, on display 2602, a participant history user interface 2686 that includes one or more items specific to the participant associated with the item selected by tap gesture 2605. For example, in FIG. 26H, because tap gesture 2605 corresponded to a selection of incoming payment request item 2670, which corresponds to message participant 2610, participant history user interface 2686 corresponds to message participant 2610.
[0764] As shown in FIG. 26H, participant history user interface 2686 includes an indication 2610 (e.g., “John Appleseed”) of the message participant (e.g., message participant 2610) associated with the currently-viewed participant history user interface. In some embodiments, as also shown in FIG. 26H, participant history user interface 2686 includes menu icons 2692A-E associated with different types of operations that can be performed concerning the currently-viewed message participant (e.g., message participant 2610). For example, menu icon 2692A is a messaging icon which allows the user to initiate a messaging conversation (e.g., via messaging application 2606) with message participant 2610, menu icon 2692B is a calling icon which allows the user to initiate a phone call (e.g., via a phone application) with message participant 2610, menu icon 2692C is a video call icon which allows the user to initiate a video call with message participant 2610, menu icon 2692D is a mail icon which allows the user to initiate an electronic mail (e.g., email) communication with message participant 2610, and menu icon 2692E is an information icon which allows the user to view information/details (e.g., name, contact information, address information, associated group) of message participant 2610.
[0765] As also shown in FIG. 26H, similar to transactions history user interface 2661, participant history user interface 2686 (e.g., specific to message participant 2610) includes one or more items (corresponding to transactions performed with message participant 2610) corresponding to pending transactions (e.g., pending payments or payment requests) and completed (e.g., paid payments or payment requests) transactions. Thus, in some embodiments, the one or more items are organized based on pending transactions (e.g., incoming pending
requests 2694), and completed transactions (e.g., today transactions 2698 corresponding to transactions completed today and earlier transactions 2697 corresponding to transactions completed earlier than today).
[0766] For example, in FIG. 26H, incoming pending requests 2694 includes a payment request item 2696 corresponding to payment request item 2670 described above with reference to FIG. 26F. Payment request item 2696 includes an indication 2696A (e.g., stating “Dinner + Cab”) of a note associated with the payment request item (e.g., to serve as a reminder to the user of the reason for the payment request), a pay button 2696B (e.g., corresponding to pay button 2670B) indicating the amount (e.g., “$28”) of the payment request and which, when selected (as with pay button 2670B), proceeds with payment of the payment request (e.g., via messaging application 2606), and an indication 2696C of a date and/or time of when the payment request was received.
[0767] Further, in FIG. 26H, today transactions 2698 includes a completed incoming payment item 2699 corresponding to a payment received by the user from message participant 2610 during the current day. Completed incoming payment item 2699 includes an indication 2699A (e.g., stating “Happy Birthday”) of a note associated with the incoming payment (e.g., to serve as a reminder to the user of the reason for the payment), an amount indication 2699B (e.g., stating “+$50”) showing the received payment amount, and an indication 2699C of the date and/or time of when the payment was received. In some embodiments, because completed incoming payment item 2699 corresponds to a received payment (as opposed to a made payment), amount indication 2699B includes a positive symbol (e.g., “+”) to indicate that the payment corresponds to an incoming payment.
[0768] Further, in FIG. 26H, earlier transactions 2697 includes a completed outgoing payment item 2695 corresponding to a payment made by the user to message participant 2610 and a completed incoming payment item 2693 corresponding to a payment received by the user from message participant 2610. Completed outgoing payment item 2695 includes an indication 2695A (e.g., stating “Mom’s Gift”) of a note associated with the payment (e.g., to serve as a reminder to the user of the reason for making the payment), an amount indication 2695B (e.g., “$60.00”) of the payment amount, and an indication 2695C of the time and/or date of when the
payment was made. Completed incoming payment item 2693 includes an indication 2693A (e.g., stating “Lunch”) of a note associated with the payment (e.g., to serve as a reminder to the user of the reason for the received payment), an amount indication 2693B (e.g., “+$13.50”) of the received payment amount, and an indication 2693C of the time and/or date of when the payment was received.
[0769] In some embodiments, as also shown in FIG. 26H, participant history user interface 2686 includes a back button 2688 for returning to transactions history user interface 2661. In some embodiments, as also shown in FIG. 26H, participant history user interface also includes a pay button 2690 for initiating (e.g., via messaging application 2606) a new payment or a new payment request with messaging participant 2610.
[0770] In FIG. 26I, while displaying participant history user interface 2686 corresponding to message participant 2610, electronic device 2600 detects user selection of completed incoming payment item 2699. For example, as shown in FIG. 26I, the user selection is a tap gesture 2607 on completed incoming payment item 2699.
[0771] In FIG. 26J, in response to detecting tap gesture 2607 on completed incoming payment item 2699, electronic device 2600 displays, on display 2602, a transaction details user interface 2691 (e.g., similar to transaction details user interface 2648 described above with reference to FIG. 26D) that includes transaction details associated with completed incoming payment item 2699. In some embodiments, transaction details user interface 2691 includes a payment message object image 2689 corresponding to the payment message object associated with completed incoming payment item 2699. In some embodiments, transaction details user interface 2691 includes an indication 2687 of the note (e.g., stating “Happy Birthday”) associated with the payment message object associated with the completed incoming payment item 2699. In some embodiments, transaction details user interface 2691 includes a plurality of transaction details 2691A-E related to the incoming payment. For example, transaction details user interface 2691 includes an indication 2691A of the sender (e.g., stating “John,” message participant 2610) of the payment and the payment amount (e.g., “$50”). For another example, transaction details user interface 2691 includes an indication 2691B of the account details (e.g., account number) of the account where the payment was deposited (e.g., the user, Kate’s, payment account). For
another example, transaction details user interface 2691 includes an indication 2691C of the date and time when the payment was sent (by message participant 2610) and an indication 2691D of the date and time when the payment was accepted (by the user). For another example, transaction details user interface 2691 includes an indication 2691E of the transaction number. In some embodiments, as shown in FIG. 26J, transaction details user interface 2691 includes a view in message button 2654 for viewing the payment message object corresponding to the current payment in a message conversation (e.g., message conversation 2608 with message participant 2610) of messaging application 2606.
[0772] In FIG. 26K, while displaying participant history user interface 2686, electronic device 2600 detects user activation of pay button 2696B (corresponding to pay button 2670B) for proceeding with making the payment corresponding to the payment request associated with payment request item 2696 (corresponding to payment request item 2670). For example, as shown in FIG. 26K, the user activation is a tap gesture 2609 on pay button 2696B (or, alternatively, a tap gesture on the corresponding pay button 2670B on transactions history user interface 2661).
[0773] In FIG. 26L, in response to detecting tap gesture 2609 on pay button 2696B, electronic device 2600 again displays, on display 2602, message conversation 2608 (with message participant 2610) of messaging application 2606, as first described above with reference to FIG. 26A. As described above, message conversation 2608 includes a message object 2616 sent by message participant 2610 to the user and a payment message object 2618 corresponding to the payment request that corresponds to the payment request associated with payment request item 2696 (as displayed in participant history user interface 2686) and payment request item 2670 (as displayed in transactions history user interface 2661).
[0774] As also shown in FIG. 26L, in response to detecting tap gesture 2609 on pay button 2696B (or pay button 2670B), electronic device 2600 displays, on display 2602, a payment transfer user interface 2683 (e.g., corresponding to payment transfer user interface 840 described above with reference to FIGS. 8E-8P). As with payment transfer user interface 840, payment transfer user interface 2683 includes a value change region 2681 (e.g., corresponding to value change region 846) and an indication 2679 of the transfer amount (e.g., “$28”) within value change region 2681.
[0775] In FIG. 26M, while displaying payment transfer user interface 2683, electronic device 2600 detects user activation of a send button 2677 (e.g., corresponding to send button 847 of payment transfer user interface 840) for sending a payment in the indicated amount (e.g., of $28). For example, as shown in FIG. 26M, the user activation is a tap gesture 2611 on send button 2677.
[0776] In FIG. 26N, in response to detecting tap gesture 2611 on send button 2677 (and thus sending the payment requested by the payment request associated with payment message object 2618), electronic device 2600 updates the appearance of payment message object 2618 to indicate that the requested payment has been made. Specifically, as shown in FIG. 26N, amount indication 2622 of payment message object 2618 is visually changed. In some embodiments, the visual change to amount indication 2622 is a bolding (or thickening) of the font of the displayed amount (e.g., “$28”). In some embodiments, the visual change to amount indication 2622 includes a black outline (e.g., a shadow) applied to the font of the displayed amount (e.g., “$28”). In some embodiments, the visual change to amount indication 2622 is a change in color (e.g., from black to white) of the displayed amount (e.g., “$28”). In some embodiments, in response to detecting a change in orientation of the device, electronic device 2600 generates feedback (e.g., a visual feedback, a haptic feedback, audio feedback) associated with the payment message object. In some embodiments, the feedback is a dynamic visual feedback causing display of the payment message object (e.g., payment message object 2618) to change as changes in the orientation of the device relative to a reference point are detected, as described above, for example, with reference to payment message object 1118 in FIG. 11E. In some embodiments, the device also displays (e.g., replaces display of payment transfer user interface 2683 with) virtual keyboard 2612.
[0777] FIG. 26O shows a wallet user interface 2673 (e.g., similar to wallet user interface 2022 described above with reference to FIGS. 20C-20J). As shown in FIG. 26O, wallet user interface 2673 shows a graphical representation 2669 of the payment account and a graphical representation 2671 of a debit card account (e.g., which is a default backup account). Graphical representation 2669 of the payment account (e.g., Kate’s payment account, a unique operating system-controlled and managed account) includes a balance indication 2669A (e.g., “$187”)
indicating the available funds of the payment account, where graphical representations 2669 and 2671 are displayed at the first location (e.g., the top-half portion) of the interface, thereby indicating that the two accounts (e.g., the payment account and the debit card account) are currently selected for use in a transaction. In some embodiments, as also shown in FIG. 26O, wallet user interface 2673 includes (partial) graphical representations 2667 of a plurality of other accounts (e.g., of an airline ticket 2667A, of a concert pass 2667B, of a loyalty card 2667C) displayed at the second location (e.g., a bottom-edge region) of the interface, thereby indicating that these accounts are currently not selected for use in a transaction.
[0778] In FIG. 26P, while displaying wallet user interface 2673, electronic device 2600 detects a user input on graphical representation 2669 of the payment account. For example, as shown in FIG. 26P, the user input is a tap gesture 2613 on graphical representation 2669 corresponding to the payment account.
[0779] In FIG. 26Q, in response to detecting tap gesture 2613, electronic device 2600 displays within wallet user interface 2673 a selected account-specific page (e.g., the payment account-specific page) that includes summary information 2665 of the most recent transaction (e.g., the outgoing payment associated with payment message object 2618) that was made using the selected account (e.g., the payment account). For example, as shown in FIG. 26Q, summary information 2665 includes the recipient of the payment (e.g., message participant 2610, “John Appleseed”), a note (e.g., stating “Dinner + Cab”) associated with the payment (e.g., to serve as a reminder to the user of the reason for the payment), a date and/or time of the payment, and an amount (e.g., “$28.00”) of the payment. Further, as also shown in FIG. 26Q, in response to detecting tap gesture 2613 on graphical representation 2669 (of the payment account), the device maintains display of the graphical representation of the selected account (e.g., graphical representation 2669 of the payment account) at the first location of the wallet user interface.
[0780] In some embodiments, as also shown in FIG. 26Q, in response to detecting user selection (e.g., tap gesture 2613) of the graphical representation of the payment account (and not the debit card account), electronic device 2600 moves display of graphical representation 2671 of the (non-selected) debit card account from the first location to the second location of the interface. For example, as shown in FIG. 26Q, graphical representation 2671 of the debit card account is
moved such that it is one of graphical representations 2667 of the other accounts and is only partially visible on the display.
[0781] In some embodiments, as also shown in FIG. 26Q, the payment account-specific page of wallet user interface 2673 includes an account information button 2663 for viewing more details/information associated with the currently-selected account (e.g., the payment account). In FIG. 26R, while displaying the payment account-specific page of wallet user interface 2673 (e.g., as indicated by graphical representation 2669 of the payment account being displayed at the first location of the interface), electronic device 2600 detects user activation of account information button 2663 for viewing more details/information associated with the currently-selected payment account. For example, the user activation is a tap gesture 2615 on account information button 2663.
[0782] In FIG. 26S, in response to detecting tap gesture 2615 on account information button 2663, electronic device 2600 displays (e.g., replaces display of the payment account-specific page of wallet user interface 2673 with) an account information user interface 2659 (of the payment account). In some embodiments, account information user interface 2659 corresponds to the account information view navigable from transactions history user interface 2661 by selecting info tab 2666A from switch bar 2666, as described above with reference to FIG. 26F. Similarly, account information user interface 2659 includes corresponding switch bar 2666 (having corresponding info tab 2666A and corresponding transactions tab 2666B). Because the account information user interface is currently displayed, info tab 2666A is highlighted (e.g., marked with thicker borders). Further, as also shown in FIG. 26S, account information user interface 2659 (of the payment account) includes a graphical representation 2662 (e.g., a thumbnail image, a mini-image) corresponding to the currently-viewed account (e.g., the payment account). In some embodiments, account information user interface 2659 includes a balance indication 2664 (e.g., “$187”) of the currently-available amount of funds in the payment account. For example, the balance of the payment account as shown by balance indication 2664 in FIG. 26S is $28 less (e.g., $187 vs. $215) than the balance of the payment account as shown by balance indication 2664 in FIG. 26F as a result of the payment of $28 made to message participant 2610 via payment message object 2618.
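The balance change described above ($215 in FIG. 26F less the $28 payment yields the $187 shown in FIG. 26S) amounts to a simple debit against the payment account. The sketch below is an illustrative assumption about the bookkeeping, not the disclosed implementation; the disclosure separately describes backup accounts for cases of insufficient funds:

```python
def apply_payment(balance_cents, payment_cents):
    """Debit an outgoing payment from the payment account balance.
    Refusing to overdraw is an illustrative simplification; in the
    described system a backup account may cover a shortfall instead."""
    if payment_cents > balance_cents:
        raise ValueError("insufficient funds in payment account")
    return balance_cents - payment_cents
```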
[0783] In some embodiments, account information user interface 2659 includes an add funds selectable indication 2659A (e.g., showing “Add Money”) for adding funds to the currently-viewed account (e.g., the payment account). In some embodiments, account information user interface 2659 also includes a transfer selectable indication 2659B (e.g., showing “Transfer to Bank”) for transferring funds from the payment account to a different account (e.g., a bank account). In some embodiments, account information user interface 2659 also includes, within automatic payment acceptance list 2659C, an “everyone” option 2659D which, when selected, causes electronic device 2600 to automatically accept (e.g., without any user input from the user) an incoming payment (or, in some embodiments, to also automatically accept and agree to an incoming payment request) from any message participant. In some embodiments, account information user interface 2659 also includes, within automatic payment acceptance list 2659C, a “contacts only” option 2659K which, when selected, causes the device to automatically accept (e.g., without any user input from the user) an incoming payment (or, in some embodiments, to also automatically accept and agree to an incoming payment request) from a message participant that corresponds to a contact within a contacts list (e.g., the main contacts list, a favorites contacts list, a trusted contacts list) associated with the user account logged into the device.
In some embodiments, account information user interface 2659 also includes, within automatic payment acceptance list 2659C, a manual option 2659E which, when selected, causes electronic device 2600 to automatically accept (e.g., without any user input from the user) an incoming payment (or, in some embodiments, to also automatically accept and agree to an incoming payment request) from a message participant (e.g., message participant 2610) that is a member of a manually created (by the user) list, such as a trusted message participant list. In some embodiments, account information user interface 2659 also includes, within automatic payment acceptance list 2659C, an off option which, when selected, causes electronic device 2600 to not automatically accept (e.g., unless the user provides user input accepting) an incoming payment (or, in some embodiments, an incoming payment request) from any message participant.
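The automatic payment acceptance options of list 2659C (everyone, contacts only, manual list, off) reduce to a small policy check on the sender of an incoming payment. The function and argument names below are illustrative assumptions, not part of the disclosure:

```python
def should_auto_accept(setting, sender, contacts, trusted_list):
    """Decide whether an incoming payment is accepted without user input,
    per the options in automatic payment acceptance list 2659C."""
    if setting == "everyone":          # option 2659D
        return True
    if setting == "contacts_only":     # option 2659K
        return sender in contacts
    if setting == "manual":            # option 2659E: user-created trusted list
        return sender in trusted_list
    return False  # "off": explicit user acceptance is always required
```

When the check returns False, the payment remains pending until the user accepts it manually, as with pay button 2670B described above.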
[0784] In some embodiments, account information user interface 2659 also includes an allow requests option 2659F (which may, as shown in FIG. 26S, have a toggle button to switch between an “off” mode and an “on” mode) which, when in the “on” mode, allows the device to
receive and provide payment requests from other message participants (e.g., message participants 2610, 2621, and 2631) via messaging application 2606 and, when in the “off” mode, disallows the device from receiving and providing payment requests from other message participants (e.g., message participants 2610, 2621, and 2631). In some embodiments, account information user interface 2659 also includes a card (or account) details region 2659H which includes account information specific to the currently-viewed account (e.g., the payment account). For example, as shown in FIG. 26S, card details region 2659H includes an indication 2659G of a card number (or an account number) associated with the currently-viewed account (e.g., the payment account), an indication 2659I of an (initial) pin number set for the account (and/or a change pin button for changing the pin number set for the account), and a selectable indication 2659J for deactivating (e.g., de-provisioning from the device) the currently-viewed account (e.g., the payment account). In some examples, the indication 2659G is at least a portion of the card number (or account number).
[0785] FIG. 26T shows transactions history user interface 2661, as described above with reference to FIG. 26F. As described above, in some embodiments, account information user interface 2659 corresponds to the account information view navigable from transactions history user interface 2661 by selecting info tab 2666A from switch bar 2666. Similarly, account information user interface 2659 includes corresponding switch bar 2666 (having corresponding info tab 2666A and corresponding transactions tab 2666B). In FIG. 26T, because the transactions history user interface is currently displayed (e.g., in response to detecting user selection of transactions tab 2666B while viewing account information user interface 2659), transactions tab 2666B is highlighted (e.g., marked with thicker borders).
[0786] FIGS. 27A-27E are a flow diagram illustrating a method 2700 for generating and displaying a transfers history list using an electronic device in accordance with some embodiments. Method 2700 is performed at a device (e.g., 100, 300, 500, 2500, 2600) with a display. Some operations in method 2700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0787] As described below, method 2700 provides an intuitive way for managing peer-to-peer transactions. The method reduces the cognitive burden on a user for managing peer-to-peer
transactions, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transactions faster and more efficiently conserves power and increases the time between battery charges.
[0788] The electronic device (e.g., 2500, 2600) receives (2702) one or more messages (e.g., one or more text messages, one or more emails) in a first conversation (e.g., 2608) of electronic messages (e.g., a text conversation thread, an email thread) that includes messages from a user of the electronic device to a first participant (e.g., 2510, 2610) and messages from the first participant (e.g., a first friend) to the user of the electronic device. The one or more messages in the first conversation (e.g., 2508, 2608) include (2704) a first message (e.g., from the first participant or the user) that is associated with the transfer of a first additional item (e.g., a photo, video, file, or payment).
[0789] In some examples, the first additional item is (2706) a first transfer between the user of the electronic device and the first participant (e.g., 2510, 2610). In some examples, the first transfer is a first media transfer. In some examples, the first transfer is a first payment transfer.
[0790] In some examples, the first transfer is (2708) a transfer from the user of the electronic device to the first participant (e.g., 2510, 2610). In some examples, the first transfer from the user of the device to the first participant is a media transfer from the user of the device to the first participant. In some examples, the first transfer from the user of the device to the first participant is a payment from the user of the device to the first participant.
[0791] In some examples, the first transfer is (2710) a transfer request by the user of the electronic device to the first participant (e.g., 2510, 2610).
[0792] The electronic device (e.g., 2500, 2600) receives (2712) one or more messages (e.g., one or more text messages, one or more emails) in a second conversation (e.g., 2509, 2630) of electronic messages (e.g., a text conversation thread, an email thread) that includes messages from the user of the electronic device to a second participant (e.g., 2530, 2621) and messages from the second participant (e.g., a second friend different from the first friend) to the user of the electronic device. The one or more messages in the second conversation include (2714) a second
message (e.g., from the second participant or the user) that is associated with the transfer of a second additional item (e.g., a photo, video, file, or payment).
[0793] In some examples, the second additional item is (2716) a second transfer between the user of the electronic device and the second participant (e.g., 2530, 2621). In some examples, the second transfer is a second media transfer. In some examples, the second transfer is a second payment transfer.
[0794] In some examples, the second transfer is (2718) a transfer from the user of the electronic device to the second participant (e.g., one of 2510, 2530, 2610, 2621, 2631 that does not correspond to the first participant). In some examples, the second transfer from the user of the device to the second participant is a media transfer from the user of the device to the second participant. In some examples, the second transfer from the user of the device to the second participant is a payment from the user of the device to the second participant.
[0795] In some examples, the second transfer is (2720) a transfer request by the user of the electronic device to the second participant.
[0796] The electronic device (e.g., 2500, 2600) concurrently displays (2722), on the display (e.g., 2502, 2602), a first item (e.g., 2552, 2670) (2724) associated with the first participant and a second item (e.g., 2556, 2674) (2736) associated with the second participant. Concurrently displaying multiple items (e.g., the first item and the second item) that include information from messages of different conversations provides the user with visual feedback that the items are related to transfers while allowing the user to concurrently view the information from the different conversations. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
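The concurrent-display step (2722) can be pictured as gathering one list entry per transfer-bearing message across the conversations. The following Python sketch is purely illustrative of that behavior; the `Conversation`, `Message`, and `Transfer` names and field layout are hypothetical and not part of the claimed device.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Transfer:
    kind: str      # hypothetical: "photo", "video", "file", or "payment"
    amount: str    # e.g. "3.1MB" for media, "$20.17" for a payment

@dataclass
class Message:
    sender: str
    text: str
    transfer: Optional[Transfer] = None  # present only for transfer messages

@dataclass
class Conversation:
    participant: str
    messages: list = field(default_factory=list)

def transfer_items(conversations):
    """Collect one display item per transfer-bearing message, across all
    conversations, so the items can be shown concurrently in one list."""
    items = []
    for conv in conversations:
        for msg in conv.messages:
            if msg.transfer is not None:
                items.append({
                    "participant": conv.participant,  # indication of the participant
                    "info": msg.text,                 # information from the message
                    "amount": msg.transfer.amount,    # representation of the item
                })
    return items
```

Under this model, a payment message in one conversation and a photo message in another both surface as entries in the same concurrently displayed list.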
[0797] In some examples, the first item (e.g., 2552, 2670) includes an indication (e.g., 2670A, a photo, a name) of the associated contact (participant) and other information (e.g., transaction information, such as an amount of the transaction, a time of the transaction, a location of the transaction) related to a conversation with the associated contact. The first item includes (2726) first information (e.g., 2622) from the first message in the first conversation of electronic messages and a representation of the first additional item (e.g., 2552B, 2670B).
[0798] In some examples, the representation of the first additional item (e.g., 2552B, 2670B) includes (2728) a numerical representation (e.g., a size (bytes), an edit date, an amount of funds) of the first additional item. In some examples, the first additional item is a video file, and thus the representation of the first additional item includes a size (bytes) and/or edit date of the video file. In some examples, the second additional item is a photo, and thus the representation of the second additional item includes a size (bytes) and/or edit date of the photo.
[0799] In some examples, the representation of the first additional (e.g., 2552B, 2670B) item includes (2730) an indication (e.g., 3.1MB, $20.17) of an amount of the first transfer. In some examples, the amount is an amount of resource. In some examples, the amount is an amount of storage used or size (e.g., in bytes). In some examples, the amount is an amount of funds/currency.
[0800] In some examples, the first item (alternatively, or in addition, the second item) (e.g., the first item and/or the second item) includes (2734) an indication (e.g., 2552A, 2670A) of the first participant (alternatively, the second participant) (e.g., the first participant or the second participant) associated with the first item and an indication of a time (e.g., the time the message associated with the item was sent/received) associated with the first item. Displaying indications of participants and time provides the user with visual feedback about what other people were involved in the transfer and when the transfer took place. Such information is particularly helpful when transfers using the same account are grouped together, providing the user with a summary of transfers for a particular account and the corresponding details of the transfers. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended
result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0801] The second item includes (2738) second information from the second message in the second conversation (e.g., 2509, 2630) of electronic messages and a representation of the second additional item (e.g., 2556B, 2674B). Concurrently displaying multiple items (e.g., the first item and the second item) that include information from messages of different conversations provides the user with visual feedback that the items are related to transfers while allowing the user to concurrently view the information from the different conversations. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0802] In some examples, the representation of the second additional item (e.g., 2556B, 2674B) includes (2740) an indication (e.g., 3.1MB, $5.16) of an amount of the second transfer. In some examples, the amount is an amount of resource. In some examples, the amount is an amount of storage used or size (e.g., in bytes). In some examples, the amount is an amount of funds/currency.
[0803] In some examples, the first additional item is a video file and the second additional item is a photo.
[0804] In some examples, the electronic device (e.g., 2500, 2600) detects an input (e.g., on a touch-sensitive surface of the electronic device) at a location corresponding to the first item (e.g., 2552, 2670). In response to detecting the input at the location corresponding to the first item (e.g., 2552, 2670), and in accordance with a determination that the location corresponds to the representation of the first additional item, the electronic device displays an item-specific user interface (e.g., an item detail page including details associated with the first message). In
response to detecting the input at the location corresponding to the first item, and in accordance with a determination that the location does not correspond to the representation of the first additional item, the electronic device displays a first participant-specific user interface (e.g., 2686, a contact detail page including several different items associated with the participant).
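The routing in this example is effectively a two-level hit test: an input inside the representation of the additional item opens the item-specific interface, while an input elsewhere within the item opens the participant-specific interface. The Python sketch below illustrates the idea only; the `(x, y, w, h)` frame tuples and the function name are hypothetical.

```python
def handle_item_tap(tap_x, tap_y, item_frame, representation_frame):
    """Route a tap on a list item: a tap inside the representation of the
    additional item opens the item-specific page; a tap elsewhere within
    the item opens the participant-specific page. Frames are (x, y, w, h)."""
    def inside(frame):
        x, y, w, h = frame
        return x <= tap_x < x + w and y <= tap_y < y + h
    if inside(representation_frame):   # checked first: it lies within the item
        return "item-specific"
    if inside(item_frame):
        return "participant-specific"
    return None  # tap landed outside the item entirely
```

Note the order of the tests: because the representation's frame is nested within the item's frame, the more specific region must be checked first.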
[0805] In some examples, the first item (e.g., 2552, 2670) (alternatively, or in addition, the second item) corresponds to a pending (e.g., not yet completed) payment transaction and the representation of the first additional item (e.g., 2552B, 2670B) (alternatively, the representation of the second additional item) includes an indication of an amount (of funds) of the pending payment transaction. The electronic device (e.g., 2500, 2600) receives user input on the representation of the first additional item (e.g., 2552B, 2670B) of the first item (alternatively, the second item). In response to receiving the user input, the electronic device displays, on the display (e.g., 2502, 2602), an authentication user interface requesting authentication information (e.g., biometric authentication information, such as a fingerprint for fingerprint authentication, facial features for facial recognition, voice input for voice recognition, iris/retina scan for iris/retina identification) for authorizing the transaction. Displaying a request for authentication provides the user with visual feedback about the state of the device (state in which authentication is required) and prompts the user to provide the authentication (e.g., through biometric authentication, such as via a fingerprint authentication or facial recognition). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0806] In some examples, the first item (e.g., 2552, 2670) (alternatively, or in addition, the second item) corresponds to a payment sent to the user by the first participant (alternatively, the second participant) (e.g., the first participant or the second participant) associated with the first item. In some examples, the first item includes an affordance (e.g., 2624) for transferring an amount of the payment to an external account (e.g., a linked bank account of the user) associated
with the user. In some examples, upon receiving user selection of the affordance, funds equivalent to the amount of the payment received from the participant are transferred to a default account of the user, such as a default stored-value account (e.g., a debit account).
[0807] In some examples, the first participant-specific user interface (e.g., 2686) includes contact information (e.g., a phone number, an email address, a webpage URL) associated with the first participant and a list of one or more first participant-specific items (e.g., 2696, 2698, 2695, 2693, previous items associated with the participant, wherein the previous items each include information from an associated message in a conversation of electronic messages), including the first item, associated with the first participant. Displaying information about the participant provides the user with additional context and visual feedback about the transfer and enables the user to easily contact the participant (e.g., by activating a phone number affordance in the participant-specific user interface) to discuss the transfer. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0808] In some examples, the first item-specific user interface includes (2746) a representation of content (2748) (e.g., a preview or mini-representation of the first additional item, such as a preview of a photo, video, or file or an indication of a payment amount) associated with the first item, an indication of the first participant (2750), and an indication of a time (2752) (e.g., the time the first message was sent/received) associated with the first message. Displaying information about an item provides the user with additional context relating to the item and provides the user with visual feedback about the item. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage
and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0809] In some examples, the representation of the first additional item includes a status indicator (e.g., “pending,” “completed”) associated with the first transfer and an affordance for viewing additional details (e.g., date and time of the transaction, participants in the transaction, location of the transaction) associated with the first transfer. In some examples, the electronic device (e.g., 2500, 2600) detects user activation of the affordance for viewing additional details associated with the first transfer. In response to detecting the user activation of the affordance, the electronic device displays, on the display (e.g., 2502, 2602), a details user interface. The details user interface includes (e.g., concurrently displayed): the first information from the first message in the first conversation of the electronic messages, an authorization affordance for authorizing the first transfer, and a cancel affordance for cancelling (e.g., refusing) the first transfer.
[0810] In some examples, the electronic device (e.g., 2500, 2600) detects user activation of the authorization affordance. In response to detecting the user activation of the authorization affordance, the electronic device displays an authentication user interface for requesting authentication information (e.g., biometric authentication, such as a fingerprint, facial recognition, iris scan, retina scan authentication). The electronic device receives the authentication information. In accordance with a determination that the received authentication information is consistent with (e.g., matches, corresponds to) enrolled authentication information (stored on the device) for authorizing transactions, the electronic device authorizes the first transfer and updates display of the first message (e.g., changing a color, changing a shade, changing a pattern, changing a status indicator) in the first conversation of electronic messages to indicate that the first transfer has been authorized. In accordance with a determination that the received authentication information is not consistent with the enrolled authentication information for authorizing transactions, the electronic device forgoes authorizing the first transfer and, optionally, does not update display of the first message (e.g., changing a color, changing a shade, changing a pattern, changing a status indicator) in the first conversation of electronic messages to indicate that the first transfer has been authorized.
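The authorization branch above can be summarized as: compare the received authentication information with the enrolled information; on a match, authorize the transfer and update the message's status indicator, otherwise forgo both. The Python sketch below is illustrative only; direct equality stands in for biometric matching (which in practice is a probabilistic comparison performed in secure hardware), and the function and field names are hypothetical.

```python
def authorize_transfer(received_auth, enrolled_auth, message):
    """Authorize the transfer only when the received authentication
    information is consistent with the enrolled information; update the
    message's status indicator on success, leave it unchanged otherwise."""
    if received_auth == enrolled_auth:      # "consistent with" stand-in
        message["status"] = "authorized"    # update display of the message
        return True
    return False                            # forgo authorizing; no UI update
```

The same shape covers the cancel path: on cancellation the status field would instead be set to a "canceled" value and a new indication added to the conversation.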
[0811] In some examples, the electronic device (e.g., 2500, 2600) detects user activation of the cancel affordance. In response to detecting the user activation of the cancel affordance, the electronic device displays, on the display (e.g., 2502, 2602), the first conversation of electronic messages. The first conversation (e.g., 2508, 2608) includes an indication (e.g., a new message indicating) that the first transfer has been canceled. In some examples, the electronic device further updates display of the first message (e.g., 2520, 2618, changing a color, changing a shade, changing a pattern, changing a status indicator) to indicate that the first payment transfer has been canceled. Updating the display of a message to reflect a change in status (e.g., from pending to canceled) provides the user with visual feedback about the state of the message and that a request made by the user (e.g., to cancel a payment transfer) has been received (and implemented) by the device. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0812] In some examples, the first item corresponds to a pending (e.g., not yet completed) payment transaction and the second item corresponds to a completed payment transaction.
[0813] In some examples, the first item-specific user interface includes an annotation of text in the first message in the first conversation of electronic messages.
[0814] In some examples, the first item-specific user interface includes an annotation of text from one or more messages that are adjacent to the first message (including or not including the first message) (e.g., a previous message received immediately before the first message, a subsequent message received immediately after the first message) in the first conversation of electronic messages. Displaying text from adjacent messages in the conversations provides the user with visual feedback regarding the context of the item, such as why the item was sent to the user or what event the item corresponds to. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when
operating/interacting with the device), which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0815] In some examples, the first item (e.g., 2552, 2670) and the second item (e.g., 2556, 2674) correspond to transactions made using a first payment account (e.g., a default account, a debit account, a stored-value account). Prior to concurrently displaying, on the display, the first item (e.g., 2552, 2670) and the second item (e.g., 2556, 2674), the electronic device (e.g., 2500, 2600) displays, on the display (e.g., 2502, 2602), a representation (e.g., 2669, a graphical representation, such as a thumbnail image of the payment account or a preview image of the payment account) of the first payment account. The electronic device receives user selection of the representation of the first payment account (e.g., 2669). In response to receiving the user selection of the representation of the first payment account, the electronic device concurrently displays, on the display, a list of items (e.g., 2670, 2674, 2678, 2680, 2684) associated with the first payment account. The list of items (e.g., 2670, 2674, 2678, 2680, 2684) includes the first item (e.g., 2552, 2670) and the second item (e.g., 2556, 2674). Concurrently displaying a list of items (e.g., corresponding to messages of different conversations) provides the user with visual feedback about transfers that used the first payment account. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
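The list produced upon selecting the representation of the first payment account reduces to a filter over the device's transfer items. The Python sketch below is illustrative only; the item dictionaries and the `items_for_account` name are hypothetical.

```python
def items_for_account(items, account_id):
    """Return, in order, the transfer items recorded against the given
    payment account, for display as that account's transaction list."""
    return [item for item in items if item["account"] == account_id]
```

Selecting a different account representation would rerun the same filter with that account's identifier, so items from different conversations that share an account appear together.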
[0816] In some examples, in accordance with a determination that the first item (alternatively, the second item) is associated with a transfer of an amount of funds from the user to the first participant (alternatively, the second participant) associated with the first item, the electronic device forgoes adding a directional indicator (e.g., a “+” symbol or a “-” symbol) to a numerical representation of the amount of funds included in the first item. In some examples, in accordance with a determination that the first item (e.g., 2670) (alternatively, the second item) is associated with a transfer of the amount of funds to the user from the first participant
(alternatively, the second participant) associated with the first item, the electronic device adds the directional indicator (e.g., a “+” symbol) to the numerical representation of the amount of funds included in the first item. Visually differentiating between transfers from and to the user by including or not including a particular indicator provides the user with visual feedback about the direction of flow of resources (e.g., funds) between the user and others. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
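The asymmetric treatment of the two directions described here (no indicator on outgoing amounts, a “+” indicator on incoming ones) amounts to a one-line formatting rule. The Python sketch below is an illustration; the function name is hypothetical.

```python
def format_amount(amount, incoming):
    """Apply the described convention: incoming transfer amounts are
    prefixed with '+', outgoing amounts carry no directional indicator."""
    return "+" + amount if incoming else amount
```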
[0817] In some examples, the first item (e.g., 2552, 2670) includes a graphical indication (e.g., a photo of the participant, a picture of the participant) of the first participant associated with the first item. In some examples, if the first participant is not an individual but a commercial entity (e.g., a company), the graphical indication is a logo associated with the commercial entity. In some examples, the second item includes a graphical indication (e.g., a photo of the participant, a picture of the participant) of the second participant associated with the second item.
[0818] In some examples, the representation of the first additional item (alternatively, the representation of the second additional item) includes a thumbnail image of the first additional item. In some examples, the first additional item is a video file, and thus the representation of the first additional item includes a thumbnail image (e.g., a preview image) of the video file. In some examples, the second additional item is a photo, and thus the representation of the second additional item includes a thumbnail image (e.g., a smaller image) of the photo.
[0819] In some examples, transactions between participants may be commercial transactions between the user of the electronic device and a merchant. In some examples, the user of the electronic device makes a payment to a merchant or requests a payment from the merchant. In some examples, the merchant makes a payment (e.g., refund of previous purchase) to the user of
the electronic device or requests payment (e.g., for a good or service) from the user of the electronic device.
[0820] Note that details of the processes described above with respect to method 2700 (e.g., FIGS. 27A-27E) are also applicable in an analogous manner to the methods described herein. For example, method 2700 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1200, 1500, 1800, 2100, 2400, 3000, and 3400. For example, a payment message object created to transfer the first type of item (e.g., a sticker, a photo, a payment object), as described in method 900, can be selected to view the item-specific user interface. For another example, the outputting of feedback, as described in method 1200, can be applied to a representation of a first item (e.g., 2689) shown in the first item-specific user interface (e.g., 2691). For another example, the message objects with different visual appearances based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, can be selected to view the first item-specific user interface. For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied when setting up the account associated with the first item and the second item. For another example, switching the account to be used in a resource transfer based on an indication that resources are insufficient in the currently-selected account, as described in method 2100, can be used when proceeding with a transfer from the first participant-specific user interface (e.g., 2686) using the first item (e.g., 2696). For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can be used when proceeding with a transfer from the first item-specific user interface or the first participant-specific user interface.
For another example, an utterance can be used, as described in method 3000, to initiate a transfer (e.g., initiate a payment) while viewing the first participant-specific user interface (e.g., 2686) via the first item (e.g., 2696). For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element (e.g., 2622) of a message object (e.g., 2644) when a transfer (e.g., of a resource, of a file, of a payment) associated with a message corresponding to the message object is completed. For brevity, these details are not repeated below.
[0821] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 27A-27E are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, receiving operation 2702 and displaying operation 2722 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
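The event-handling path just described (an event sorter delivering an event to a recognizer, which compares it against event definitions and, on a match, activates an associated handler) can be modeled abstractly as predicate-based dispatch. The Python sketch below is a loose illustration of that flow, not the actual component architecture; the class and function names are hypothetical.

```python
class EventRecognizer:
    """Pairs an event definition (a predicate) with the handler to
    activate when an incoming event matches that definition."""
    def __init__(self, matches, handler):
        self.matches = matches   # predicate: does the event fit this definition?
        self.handler = handler   # callback activated on a match

def dispatch(event, recognizers):
    """Deliver an event to the first recognizer whose definition it
    matches, mirroring the sorter -> recognizer -> handler pipeline."""
    for rec in recognizers:
        if rec.matches(event):
            rec.handler(event)
            return True
    return False  # no recognizer claimed the event
```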
[0822] FIGS. 28A-28F illustrate exemplary user interfaces for managing peer-to-peer transfers, in accordance with some embodiments. As described in greater detail below, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 28A-28F relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 29A-29S.
[0823] FIG. 28A illustrates an electronic device 2800 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 28A-28F, electronic device 2800 is a smartphone. In other embodiments, electronic device 2800 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch).
[0824] In FIG. 28A, the user (e.g., “Kate Appleseed”) of electronic device 2800 is providing (e.g., while a digital assistant user interface of a digital assistant is shown on a display of the device) a spoken user input containing a request. For example, as shown in FIG. 28A, the user provides spoken user input 2801 stating “Send the 5 photos from last night to John,” thus requesting that the device send 5 photos from last night to intended recipient John Appleseed.
[0825] FIG. 28B shows electronic device 2800 displaying, on a display 2802 of the device, a digital assistant user interface 2804 of the digital assistant following the request received via spoken user input 2801. Specifically, in response to receiving spoken user input 2801 stating “Send the 5 photos from last night to John,” the device performs speech recognition and natural language processing on the spoken user input and displays, on display 2802, a transcription 2801A corresponding to spoken user input 2801 (e.g., to provide confirmation that the user’s intended input was received by the device). Further, from the performed speech recognition and natural language processing on the spoken user input, a determination is made (e.g., by the device or by an external device, such as a server, communicating with the device) of a user intent (e.g., an intent to send the 5 photos from last night to John Appleseed).
[0826] In accordance with a determination (e.g., made by the device or made by an external device, such as a server, communicating with the device) that the user’s intent, based on spoken user input 2801, is to send one or more files (e.g., photos, video files, audio files, documents) to an intended recipient (e.g., to send John Appleseed the 5 photos from last night), electronic device 2800 displays within digital assistant user interface 2804 a message object box 2808 that includes a draft transfer message object 2812 corresponding to the requested transfer of photos determined from spoken user input 2801. As shown in FIG. 28B, draft transfer message object 2812 includes a plurality of mini-file objects 2814A-E corresponding to the 5 photos to be sent to John Appleseed. In some embodiments, message object box 2808 also includes an indication 2806 of the intended recipient (e.g., stating “John Appleseed”) of the transfer. In some embodiments, message object box 2808 includes a send button 2818 (for proceeding with the transfer of the transfer message object associated with 5 selected photos as shown by message object box 2808) and a forgo sending button 2816 (for cancelling proceeding with the transfer of
the transfer message object associated with 5 selected photos as shown by message object box 2808).
[0827] In FIG. 28C, while displaying message object box 2808 on digital assistant user interface 2804, electronic device 2800 detects user activation of send button 2818 for proceeding with the transfer of the selected 5 photos as indicated by draft transfer message object 2812 of the message object box. For example, as shown in FIG. 28C, the user activation is a tap gesture 2803 on send button 2818. In some embodiments, the user activation is made via a spoken user input (e.g., “Send the selected photos”) to the digital assistant.
[0828] In some embodiments, as shown in FIG. 28D, in response to detecting user input 2803 on send button 2818 for proceeding with the transfer as shown in draft transfer message object 2812 of message object box 2808, electronic device 2800 displays on digital assistant user interface a confirmation request 2820 (e.g., stating “Are you sure you want to send these 5 selected photos to John Appleseed?”). As shown in FIG. 28D, the user provides the device with (e.g., via a voice input) the requested confirmation 2805 (e.g., stating “Yes, send these photos to John.”).
[0829] In some embodiments, as shown in FIG. 28E, in response to receiving requested confirmation 2805, electronic device 2800 displays (e.g., over at least a portion of digital assistant user interface 2804) a transfer confirmation user interface 2822. In some embodiments, transfer confirmation user interface 2822 includes an authentication request 2830 (e.g., a graphical request, a textual request) requesting that the user provide authentication information (e.g., “Send with Fingerprint”) to proceed with transmitting the selected files (e.g., the 5 photos corresponding to mini-file objects 2814A-2814E) to the intended recipient (e.g., “John Appleseed”). In some embodiments, as also shown in FIG. 28E, transfer confirmation user interface 2822 includes an indication 2824 (e.g., “5 photos to John”) of the items (e.g., files, photos, video files, audio files, documents) that will be transferred and the intended recipient of the transfer, a change button 2826 for changing the items to be sent and/or changing one or more intended recipients of the transfer, and a cancel button 2828 for cancelling the transfer.
[0830] In FIG. 28E, while displaying transfer confirmation user interface 2822, electronic device 2800 detects a user input that corresponds to the requested authentication information for proceeding with the transfer. For example, as shown in FIG. 28E, the user input is a fingerprint input 2807 on a fingerprint sensor of a mechanical button 2817 of the device.
[0831] In FIG. 28F, in response to a determination that authentication was successful, the digital assistant provides, on digital assistant user interface 2804 (e.g., below message object box 2808), an affirmation 2830 (e.g., stating “Okay, I’ll send your message”) informing the user that a transfer message object corresponding to draft transfer message object 2812 will be sent (e.g., via a messaging application) to the intended recipient (e.g., “John Appleseed”) with the associated files (e.g., the 5 photos corresponding to mini-file objects 2814A-E of draft transfer message object 2812).
[0832] As mentioned above, the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 28A-28F described above relates to the non-limiting exemplary embodiment of the user interfaces illustrated in FIGS. 29A-29S described below. Therefore, it is to be understood that the processes described above with respect to the exemplary user interfaces illustrated in FIGS. 28A-28F and the processes described below with respect to the exemplary user interfaces illustrated in FIGS. 29A-29S are largely analogous processes that similarly involve initiating and managing transfers using an electronic device (e.g., 100, 300, 500, 2800, or 2900).
[0833] FIGS. 29A-29S illustrate exemplary user interfaces for voice-activation of transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 30A-30D.
[0834] FIG. 29A illustrates an electronic device 2900 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 29A-29S, electronic device 2900 is a smartphone. In other embodiments, electronic device 2900 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 2900 has a display 2902 and one or more input devices (e.g., touchscreen of display 2902, a mechanical button 2904, a mic).
[0835] In FIG. 29A, electronic device 2900 displays, on display 2902, a user interface locked screen 2920. In some embodiments, a user interface locked screen is displayed when the device is in a user interface locked state (e.g., a state where one or more functions of the operating system are prohibited from use by a user (e.g., “Kate Appleseed”) of the device). In some embodiments, user interface locked screen 2920 includes an indication 2922 that the device is in the user interface locked state.
[0836] In some embodiments, in FIG. 29A, while electronic device 2900 remains in the user interface locked state, the device receives a user input initiating a digital assistant of the device. For example, as shown in FIG. 29A, the user input is a press-and-hold gesture 2901 on a home button (e.g., mechanical button 2904) of the device. In some embodiments, the device receives the user input (e.g., press-and-hold gesture 2901, detecting a press of button 2904 for longer than a threshold duration) while the device is in a user interface unlocked state. In some embodiments, the user input is (instead of press-and-hold gesture 2901) a voice input (e.g., calling for, via speech, the digital assistant), detected via a mic of the device, for initiating the digital assistant of the device.
[0837] The digital assistant of electronic device 2900 is a (voice-controlled) digital assistant that can respond to the user’s spoken requests. In order to do so, the digital assistant requires speech recognition capability. In some examples, speech recognition is performed using speech-to-text (STT) processing, such as through an Automatic Speech Recognition (ASR) system. One or more ASR systems can process the speech input to produce a recognition result. Each ASR system includes a front-end speech pre-processor. The front-end speech pre-processor extracts representative features from the speech input. For example, the front-end speech pre-processor performs a Fourier transform on the speech input to extract spectral features that characterize the speech input as a sequence of representative multi-dimensional vectors. Further, each ASR system includes one or more speech recognition models (e.g., acoustic models and/or language models) and implements one or more speech recognition engines. Examples of speech recognition models include Hidden Markov Models, Gaussian-Mixture Models, Deep Neural Network Models, n-gram language models, and other statistical models. Examples of speech recognition engines include the dynamic time warping based engines and weighted finite-state
transducers (WFST) based engines. The one or more speech recognition models and the one or more speech recognition engines are used to process the extracted representative features of the front-end speech pre-processor to produce intermediate recognition results (e.g., phonemes, phonemic strings, and sub-words), and ultimately, text recognition results (e.g., words, word strings, or sequence of tokens). In some examples, the speech input is processed at least partially by a third-party service or on the user’s device (e.g., the electronic device) to produce the recognition result. Once the STT processing produces recognition results containing a text string (e.g., words, or sequence of words, or sequence of tokens), the recognition result is passed to a natural language processing module for intent deduction. In some examples, STT processing produces multiple candidate text representations of the speech input. Each candidate text representation is a sequence of words or tokens corresponding to the speech input. In some examples, each candidate text representation is associated with a speech recognition confidence score. Based on the speech recognition confidence scores, STT processing ranks the candidate text representations and provides the n-best (e.g., n highest ranked) candidate text representation(s) to the natural language processing module for intent deduction, where n is a predetermined integer greater than zero. In one example, only the highest ranked (n=1) candidate text representation is passed to the natural language processing module for intent deduction. In another example, the five highest ranked (n=5) candidate text representations are passed to the natural language processing module for intent deduction. More details on the speech-to-text processing are described in U.S. Utility Application Serial No.
13/236,942 for “Consolidating Speech Recognition Results,” filed on September 20, 2011, the entire disclosure of which is incorporated herein by reference.
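The n-best ranking step described above can be sketched in a few lines. This is an illustrative sketch only: the function name, candidate texts, and confidence scores are assumptions, not part of the referenced STT implementation.

```python
# Illustrative sketch of n-best candidate selection: each candidate text
# representation carries a speech recognition confidence score, and the n
# highest-ranked representations are passed on for intent deduction.
# All names and scores below are hypothetical.

def n_best(candidates, n):
    """Return the n highest-confidence candidate text representations."""
    ranked = sorted(candidates, key=lambda c: c["confidence"], reverse=True)
    return [c["text"] for c in ranked[:n]]

candidates = [
    {"text": "send john twenty eight dollars", "confidence": 0.91},
    {"text": "send jon twenty eight dollars", "confidence": 0.74},
    {"text": "sent john twenty eight dollars", "confidence": 0.55},
]

# n=1: only the highest-ranked representation reaches intent deduction
assert n_best(candidates, 1) == ["send john twenty eight dollars"]
```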
[0838] The natural language processing module (“natural language processor”) of a digital assistant takes the n-best candidate text representation(s) (“word sequence(s)” or “token sequence(s)”) generated by STT processing, and attempts to associate each of the candidate text representations with one or more “actionable intents” recognized by the digital assistant. An “actionable intent” (or “user intent”) represents a task that can be performed by the digital assistant. The associated task flow is a series of programmed actions and steps that the digital assistant takes in order to perform the task. The scope of a digital assistant’s capabilities is dependent on the number and variety of task flows that have been implemented and stored in
various task flow models, or in other words, on the number and variety of “actionable intents” that the digital assistant recognizes. The effectiveness of the digital assistant, however, also depends on the assistant’s ability to infer the correct “actionable intent(s)” from the user request expressed in natural language. Other details of inferring a user intent based on candidate actionable intents determined from multiple candidate text representations of a speech input are described in U.S. Utility Application Serial No. 14/298,725 for “System and Method for Inferring User Intent From Speech Inputs,” filed June 6, 2014, the entire disclosure of which is incorporated herein by reference.
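As a rough, hypothetical illustration of associating candidate text representations with “actionable intents,” the sketch below uses simple keyword rules. The intent names and rules are assumptions for illustration only and do not describe the actual natural language processing module.

```python
# Illustrative-only mapping from a candidate text representation to an
# "actionable intent." Real natural language processors use far richer
# models; these keyword rules and intent names are hypothetical.

INTENT_KEYWORDS = {
    "send_payment":    ["send", "$"],       # e.g., "Send John $28"
    "request_payment": ["request", "$"],    # e.g., "Request $28 from John"
    "send_files":      ["send", "photos"],  # e.g., "Send the 5 photos ..."
}

def infer_intent(text):
    """Return the first actionable intent whose keywords all appear."""
    lowered = text.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if all(k in lowered for k in keywords):
            return intent
    return None  # no actionable intent recognized
```

For the utterances in this disclosure, `infer_intent("Send John $28")` would map to the hypothetical `send_payment` intent, whose task flow (a series of programmed actions) the assistant would then execute.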
[0839] In FIG. 29B, in response to receiving press-and-hold gesture 2901 on mechanical button 2904 to activate the digital assistant of electronic device 2900, the device displays, on display 2902, a digital assistant user interface 2924 that includes an indication 2924A (e.g., stating “What Can I Help You With?”) indicating to the user that the digital assistant is ready to assist the user, and a graphical indication 2924B showing whether the user’s speech input is being (or is not being) detected by the device. For example, graphical indication 2924B dynamically changes shape as the device detects and while the device is detecting the user’s spoken input.
[0840] FIG. 29C shows the user (e.g., “Kate Appleseed”) providing electronic device 2900 (e.g., while digital assistant user interface 2924 is shown on the display) with a spoken user input containing a request. For example, as shown in FIG. 29C, the user provides spoken user input 2903 stating “Send John $28,” thus requesting that the device send to recipient John Appleseed a payment in the amount of $28.
[0841] FIG. 29D shows digital assistant user interface 2924 following the request received via spoken user input 2903. Specifically, in response to receiving spoken user input 2903 stating “Send John $28,” the electronic device 2900 performs speech recognition and natural language processing on the spoken user input and displays, on display 2902, a transcription 2926 of spoken user input 2903 (e.g., to provide confirmation that the user’s intended input was received by the device). Further, from the performed speech recognition and natural language processing on the spoken user input, a determination is made (e.g., by the device or by an external device,
such as a server, communicating with the device) of a user intent (e.g., an intent to send a payment of $28 to John).
[0842] In accordance with a determination (e.g., made by the device or made by an external device, such as a server, communicating with the device) that the user’s intent, based on spoken user input 2903, is to send a payment to an intended recipient (e.g., to send John a payment in the amount of $28), electronic device 2900 displays within digital assistant user interface 2924 a message object box 2928 that includes a draft payment message object 2932 corresponding to the requested payment determined from spoken user input 2903. As also shown in FIG. 29D, draft message object 2932 includes a mode indication 2934 (e.g., stating “PAY”) indicating to the user that the draft payment message object corresponds to a payment to be made via an operating system-controlled payment transfer application (and not by a third-party application). As also shown in FIG. 29D, draft message object 2932 includes an amount indication 2936 (e.g., “$28”) indicating the amount of the intended payment.
[0843] In some embodiments, message object box 2928 includes an indication 2930 of the intended recipient (e.g., recipient 2910, “John Appleseed”) of the payment. In some embodiments, message object box 2928 includes a pay button 2940 (for proceeding with the payment as shown by message object box 2928) and a forgo pay button 2938 (for cancelling proceeding with the payment as shown by message object box 2928). In some embodiments, digital assistant user interface also displays a request 2942 from the digital assistant asking whether the user intends to add a comment (e.g., a note, a message) to accompany the payment.
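A draft payment message object of this kind could, under assumed naming, be derived from the utterance with a simple pattern match. The regular expression and field names below are illustrative only, not the disclosed implementation.

```python
# Hypothetical sketch: derive the fields shown in draft payment message
# object 2932 (mode indication, intended recipient, amount indication)
# from an utterance such as "Send John $28". Pattern and names assumed.
import re

def parse_payment_utterance(utterance):
    m = re.match(r"send\s+(\w+)\s+\$(\d+(?:\.\d{2})?)", utterance, re.IGNORECASE)
    if m is None:
        return None  # not a payment utterance
    return {
        "mode": "PAY",                  # mode indication 2934
        "recipient": m.group(1),        # indication 2930 of intended recipient
        "amount": float(m.group(2)),    # amount indication 2936
    }

draft = parse_payment_utterance("Send John $28")
assert draft == {"mode": "PAY", "recipient": "John", "amount": 28.0}
```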
[0844] FIG. 29E shows, while electronic device 2900 displays digital assistant user interface 2924 with message object box 2928, the user (e.g., “Kate Appleseed”) providing to the device a confirmation (e.g., “Yes”) that the user does intend to add a comment, and further providing the comment (e.g., “For Dinner + Cab”) to be added. For example, as shown in FIG. 29E, the user provides the confirmation and the comment via one continuous spoken user input 2905 (e.g., stating “Yes, For Dinner + Cab”).
[0845] In FIG. 29F, in response to receiving spoken user input 2905 responding to request 2942 (to add a comment to accompany the payment), electronic device 2900 provides display of
an updated message object box 2928 on digital assistant user interface 2924 to include (e.g., below the draft payment message object) a draft note message object 2946 (e.g., stating “Dinner + Cab”) corresponding to the comment from spoken user input 2905. In some embodiments, digital assistant user interface 2924 provides, prior to displaying the updated message object box, a transcription 2944 (e.g., stating “Yes, For Dinner + Cab”) of spoken user input 2905 received from the user.
[0846] In FIG. 29G, while displaying message object box 2928, electronic device 2900 detects user activation of pay button 2940 for proceeding with the payment as indicated by the message object box. For example, as shown in FIG. 29G, the user activation is a tap gesture 2907 on pay button 2940. In some embodiments, the user activation is made via a spoken user input (e.g., “Proceed with the payment,” “Make the payment as shown”) to the digital assistant.
[0847] In FIG. 29H, in response to detecting user input 2907 on pay button 2940 for proceeding with the payment as shown in message object box 2928, electronic device 2900 displays, on display 2902, a payment confirmation user interface 2948 (e.g., corresponding to payment confirmation user interface 878 described above with reference to FIGS. 8T-8W). As with payment confirmation user interface 878, payment confirmation user interface 2948 includes an authentication request 2950 (e.g., a graphical request, a textual request) requesting that the user provide authentication information (e.g., “Pay with Fingerprint”) to proceed with making the payment to recipient 2910 (e.g., “John Appleseed”).
[0848] In FIG. 29I, while displaying payment confirmation user interface 2948 including authentication request 2950, electronic device 2900 detects a user input corresponding to the authentication request 2950. For example, as shown in FIG. 29I, the requested authentication is a fingerprint authentication, and the user input is a fingerprint scan input 2909 on a fingerprint sensor (e.g., of mechanical button 2904) of the device.
[0849] In FIG. 29J, while (or subsequent to) detecting fingerprint scan input 2909 on mechanical button 2904, a determination is made (e.g., by the device or by an external device communicating with the device) whether the fingerprint information received from fingerprint scan input 2909 is consistent with enrolled fingerprint information for authorizing transactions.
In accordance with a determination that the received fingerprint information is consistent with the enrolled fingerprint information, electronic device 2900 updates authentication request 2950 to indicate (e.g., stating “Authentication Successful”) that authentication was successful. In some embodiments, in accordance with a determination that the received fingerprint information is not consistent with the enrolled fingerprint information, the device displays a prompt requesting a second attempt at authentication. In some embodiments, in accordance with a determination that the received fingerprint information is not consistent with the enrolled fingerprint information, the device terminates the pending payment and displays an indication (e.g., on digital assistant user interface 2924) that authentication was unsuccessful.
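The authentication outcomes described in this paragraph (success, a prompted second attempt, and termination of the pending payment) can be summarized in a small sketch. The function name, data model, and return values are assumptions for illustration, not the disclosed implementation.

```python
# Hedged sketch of the fingerprint authentication decision: compare the
# received fingerprint information against enrolled information; on a
# first mismatch prompt a second attempt, on a further mismatch terminate
# the pending payment. All names here are hypothetical.

def authorize_payment(received, enrolled, attempt):
    """Return the device's response to one authentication attempt."""
    if received == enrolled:
        return "authentication_successful"  # proceed with sending the payment
    if attempt == 1:
        return "retry_requested"            # prompt a second attempt
    return "payment_terminated"             # indicate authentication failed
```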
[0850] In FIG. 29K, in response to a determination that authentication was successful, the digital assistant provides, on digital assistant user interface 2924 (e.g., below message object box 2928), an affirmation 2952 (e.g., stating “I’ll Send Your Message”) informing the user that a payment message object corresponding to draft payment message object 2932 (along with a note message object corresponding to draft note message object 2946) will be sent (e.g., via a messaging application). In some embodiments, the payment message object (and thus the corresponding payment) is sent (via a messaging application to recipient 2910) even when the device is currently in the user interface locked state. In some embodiments, the payment message object (and thus the corresponding payment) is sent (via a messaging application to recipient 2910) when the device is changed from a user interface locked state to a user interface unlocked state.
[0851] FIG. 29L shows electronic device 2900 again in the user interface locked state and displaying user interface locked screen 2920, as described in FIG. 29A. In FIG. 29L, while displaying user interface locked screen 2920, the device detects a user input that corresponds to successfully unlocking the device from the user interface locked state to the user interface unlocked state. For example, as shown in FIG. 29L, the user input is a fingerprint scan input 2911 on a fingerprint sensor (e.g., of mechanical button 2904) that matches enrolled fingerprint information for unlocking the device.
[0852] FIG. 29M shows (e.g., after receiving fingerprint scan input 2911 unlocking the device) a home user interface 2954 of electronic device 2900. As shown in FIG. 29M, home
user interface 2954 includes a plurality of application icons 2954A-2954I corresponding to different applications (e.g., an application icon 2954A corresponding to a watch application, an application icon 2954B corresponding to a camera application, an application icon 2954C corresponding to a weather application, an application icon 2954D corresponding to an alarm clock application, an application icon 2954E corresponding to a music application, an application icon 2954F corresponding to a messaging application, an application icon 2954G corresponding to a phone application, an application icon 2954H corresponding to a mail application, and an application icon 2954I corresponding to a browser application).
[0853] In FIG. 29M, while displaying home user interface 2954, electronic device 2900 detects user selection of icon 2954F corresponding to a messaging application. For example, as shown in FIG. 29M, the user selection is a tap gesture 2913 on icon 2954F.
[0854] In FIG. 29N, in response to detecting tap gesture 2913 on icon 2954F corresponding to a messaging application, electronic device 2900 displays, on display 2902, a message conversation 2908 of the messaging application 2906 between the user of the device (e.g., “Kate Appleseed”) and recipient 2910 (e.g., “John Appleseed”). In some embodiments, recipient 2910 is a contact stored on the device. In some embodiments, recipient 2910 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, recipient 2910 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0855] In some embodiments, electronic device 2900 also displays, on display 2902, a virtual keyboard 2912 (e.g., an alphanumeric keyboard for typing a message) and compose bar 2914 for displaying the text of a message as a message is typed using a virtual keyboard 2912. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 2912 to type a message. In some embodiments, compose bar 2914 can expand (e.g., expand upwards) to accommodate a longer message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 2914 includes a mic button 2914A which, when activated, enables the user to record a message using voice input.
[0856] As shown in FIG. 29N, message conversation 2908 includes a payment message object 2956 created via the digital assistant and sent via messaging application 2906 to recipient 2910 (e.g., “John Appleseed”). Payment message object 2956 (e.g., similar to payment message object 1420 described above with reference to FIGS. 14C-14F) corresponds to draft payment message object 2932, which in turn corresponds to the requested payment determined from spoken user input 2903. In some embodiments, payment message object 2956 includes a mode indication 2958 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment (or payment request) made via an operating system-controlled payment transfer application (and not by a third-party application). Payment message object 2956 also includes an amount indication 2960 informing the user of the amount of the payment (e.g., “$28”). In some embodiments, payment message object 2956 also includes a first status indicator 2962 informing the user of a status of the payment corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 29N, first status indicator 2962 shows “paid,” thus indicating to the user that the payment associated with payment message object 2956 has been accepted by the recipient (e.g., recipient 2910). In some embodiments, a second status indicator 2964 informing the user of the status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.) is also displayed (e.g., outside of the payment message object). For example, in FIG. 29N, second status indicator 2964 (e.g., “paid”) shows the same status as shown by first status indicator 2962 (e.g., “paid”).
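The relationship between the first and second status indicators (both reflecting one underlying payment status) can be illustrated with an assumed data model; the class and method names are hypothetical, and the allowed status values are taken from the description above.

```python
# Illustrative sketch: one payment status drives both the first status
# indicator (inside the payment message object) and the second status
# indicator (outside it), so the two always agree, as in FIG. 29N.
# The class itself is an assumption for illustration.

VALID_STATUSES = {"pending", "paid", "accepted", "expired"}

class PaymentMessageObject:
    def __init__(self, amount):
        self.amount = amount
        self.status = "pending"  # initial status of a sent payment

    def mark(self, status):
        assert status in VALID_STATUSES
        self.status = status

    def indicators(self):
        """Return (first status indicator, second status indicator)."""
        return (self.status, self.status)
```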
[0857] As indicated by status indicators 2962 and 2964, the payment corresponding to payment message object 2956 has been accepted by recipient 2910. Thus, in some embodiments, amount indication 2960 of the payment amount (e.g., “$28”) is displayed with a bolder (or thicker) font than if the payment was pending. In some embodiments, amount indication 2960 includes a black outline (e.g., a shadow) applied to the font of the displayed amount (e.g., “$28”). In some embodiments, amount indication 2960 of the payment amount (e.g., “$28”) is shown in a different color (e.g., white as opposed to black) than if the payment was pending. In some embodiments, in response to detecting a change in orientation of the device, electronic device 2900 generates feedback (e.g., a visual feedback, a haptic feedback, audio feedback) associated with the payment message object. In some embodiments, the feedback is a dynamic
visual feedback causing display of the payment message object (e.g., payment message object 2956) to change as changes in the orientation of the device relative to a reference point are detected, as described above, for example, with reference to payment message object 1172 in FIG. 11T.
[0858] FIG. 29O shows (e.g., while the digital assistant is active, while electronic device 2900 is displaying digital assistant user interface 2924) the user (e.g., “Kate Appleseed”) providing a spoken user input 2915 to the device requesting that a payment request be made to an intended recipient. For example, as shown in FIG. 29O, spoken user input 2915 states “Request $28 from John for dinner + cab” (thus requesting that the digital assistant cause a payment request to be sent to recipient 2910 (John) in the amount of $28 for “dinner + cab”).
[0859] In FIG. 29P, in response to receiving spoken user input 2915 requesting that a payment request (of $28) be sent to recipient 2910 (John) for “dinner + cab,” electronic device 2900 displays digital assistant user interface 2924 with a transcription 2968 (e.g., stating “Request $28 from John for dinner + cab”) of the spoken user input 2915 and a message object box 2970 corresponding to the request for creating a payment request received from spoken user input 2915. As shown in FIG. 29P, corresponding to the request received from spoken user input 2915, message object box 2970 includes an indication 2972 of the intended recipient of the payment request (e.g., recipient 2910, “John Appleseed”), a draft payment message object 2974 for a payment request, and a draft note message object 2980 corresponding to the comment detected from spoken user input 2915. As with draft payment message object 2932, draft payment message object 2974 includes a mode indication 2976 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment request made via an operating system-controlled payment transfer application (and not by a third-party application). Draft payment message object 2974 also includes an amount indication 2978 informing the recipient of the payment request (e.g., recipient 2910) of the amount of the requested payment (e.g., “$28”) and a further indication (e.g., “$28 Request”) that the payment message object corresponds to a request for payment.
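Under assumed names, the fields shown in message object box 2970 (intended recipient, requested amount, and accompanying note) could be recovered from such an utterance with a pattern match like the following illustrative sketch; the regular expression and field names are hypothetical.

```python
# Hypothetical parse of a payment-request utterance such as
# "Request $28 from John for dinner + cab", yielding the fields that
# populate message object box 2970. Pattern and names are assumptions.
import re

def parse_payment_request(utterance):
    m = re.match(r"request\s+\$(\d+)\s+from\s+(\w+)(?:\s+for\s+(.+))?$",
                 utterance, re.IGNORECASE)
    if m is None:
        return None  # not a payment-request utterance
    return {
        "mode": "PAY",              # mode indication 2976
        "kind": "request",          # "$28 Request" indication
        "amount": int(m.group(1)),  # amount indication 2978
        "recipient": m.group(2),    # indication 2972 of intended recipient
        "note": m.group(3),         # draft note message object 2980
    }

req = parse_payment_request("Request $28 from John for dinner + cab")
assert req["amount"] == 28 and req["recipient"] == "John"
assert req["note"] == "dinner + cab"
```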
[0860] As also shown in FIG. 29P, message object box 2970 includes a request button 2984 (for proceeding with sending the payment request) and a forgo request button 2982 (for forgoing proceeding with sending the payment request). In FIG. 29Q, while displaying message object box 2970, electronic device 2900 detects user selection of request button 2984 for proceeding with sending the payment request. For example, as shown in FIG. 29Q, the user selection is a tap gesture 2917 on request button 2984. In some embodiments, the user selection is a spoken input indicating an intent to select request button 2984 (or an intent to proceed with sending the payment request).
[0861] In FIG. 29R, in response to detecting tap gesture 2917 on request button 2984 for proceeding with sending the payment request, electronic device 2900 displays (e.g., below message object box 2970), an affirmation 2986 (e.g., stating “I’ll Send Your Message”) informing the user that the device will proceed with sending the payment request (e.g., via messaging application 2906).
[0862] FIG. 29S shows electronic device 2900 displaying message conversation 2908 (with recipient 2910, “John Appleseed”) of messaging application 2906. As shown in FIG. 29S, message conversation 2908 includes a payment message object 2990 corresponding to the payment request created via the digital assistant and sent via messaging application 2906 to recipient 2910 (e.g., “John Appleseed”). Payment message object 2990 (e.g., similar to payment message object 1460 described above with reference to FIGS. 14H-14K) corresponds to draft payment message object 2974, which in turn corresponds to the payment request created from spoken user input 2915. In some embodiments, payment message object 2990 includes a mode indication 2992 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment request made via an operating system-controlled payment transfer application (and not by a third-party application). Payment message object 2990 also includes an amount indication 2994 informing the recipient of the payment request (e.g., recipient 2910) of the amount of the requested payment (e.g., “$28”) and a further indication (e.g., “$28 Request”) that the payment message object corresponds to a request for payment. In some embodiments, payment message object 2990 also includes a first status indicator 2996 informing the user of a status of the payment corresponding to the payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.). For example, in FIG. 29S, first status indicator 2996 shows “pending,” thus indicating to the user that the payment associated with payment message object
2990 has not yet been accepted by the recipient (e.g., recipient 2910). In some embodiments, a second status indicator 2998 informing the user of the status of the payment corresponding to the sent payment message object (e.g., “pending,” “paid,” “accepted,” “expired,” etc.) is also displayed (e.g., outside of the payment message object). For example, in FIG. 29S, second status indicator 2998 (e.g., “pending”) shows the same status as shown by first status indicator 2996 (e.g., “pending”).
[0863] FIGS. 30A-30D are a flow diagram illustrating a method for voice-activation of transfers using an electronic device in accordance with some embodiments. Method 3000 is performed at a device (e.g., 100, 300, 500, 2800, 2900) with one or more output devices (e.g., a display, a speaker) including a display and one or more input devices (e.g., a microphone for receiving voice input, a touch-sensitive surface). Some operations in method 3000 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0864] As described below, method 3000 provides an intuitive way for managing peer-to-peer transactions. The method reduces the cognitive burden on a user for managing peer-to-peer transactions, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transactions faster and more efficiently conserves power and increases the time between battery charges.
[0865] The electronic device (e.g., 2800, 2900) receives (3002), via the one or more input devices, an utterance (e.g., 2801, 2903, a word, a phrase, a natural language input) from a user that corresponds to a request to perform an operation (e.g., accessing secure content, sending secure content, sending a payment, accepting a payment, sending a request for payment).
[0866] In some examples, the utterance (e.g., 2801, 2903) from the user that corresponds to the request to perform the operation is received (3004) while the electronic device (e.g., 2800, 2900) is in a locked mode of operation (e.g., a mode of operation in which the user is not authenticated to the device and the device is prevented from performing one or more operations that the device can perform when in an unlocked mode of operation).
[0867] In some examples, the utterance (e.g., 2801, 2903) from the user that corresponds to the request to perform the operation is received while the device (e.g., 2800, 2900) is in an unlocked mode (e.g., the device performs the operation while the device is unlocked) of operation (e.g., a mode of operation in which the user is authenticated to the device and the device is enabled to perform one or more operations that the device is prevented from performing when in a locked mode of operation).
[0868] In some examples, the operation includes (3006) sending a message (e.g., a text message, a chat message, an email) to a message participant (other than a user of the device) in a message conversation of a messaging application (e.g., a text message application, a chat application, an email application). In some examples, the message includes (3008) an attached item (e.g., a file, a photo, a video, a payment). In some examples, the attached item (i.e., attachment) is not marked as requiring authorization.
[0869] In response to receiving the utterance, the electronic device (e.g., 2800, 2900) prepares (3010) to perform the operation, wherein in accordance with (3012) a determination that the operation requires authorization, preparing to perform the operation includes (3014) presenting, via the one or more output devices of the device: a representation (e.g., 2932) (3016) of the operation and instructions (3018) for providing authorization to the device, via the one or more input devices of the device, to perform the operation. Presenting, to the user, a representation (e.g., 2932) of the operation and instructions for providing authorization to perform the operation provides the user with feedback about the operation that will be performed (once authorized) and about the state of the device (state in which authentication is required), and prompts the user to provide the authorization (e.g., through biometric authentication, such as via a fingerprint authentication or facial recognition). Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0870] In some examples, presenting instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation comprises (3020) displaying, on the display (e.g., 2802, 2902), an authorization user interface (e.g., 2822, 2948, a user interface for receiving authentication information from the user of the device to authorize the operation, such as a payment). In some examples, the authorization user interface (e.g., 2948) includes a request for authentication information (e.g., 2830, 2950, biometric authentication information) from the user of the device to authorize the operation. In some examples, the authorization user interface (e.g., 2822, 2948) includes an indication of a payment method to be used, along with an option to change the payment method (e.g., to select from among a plurality of payment methods, such as credit card accounts, debit card accounts, payment accounts, provisioned onto the electronic device). In some examples, the authorization user interface (e.g., 2822, 2948) includes instructions for providing the authorization (e.g., 2830, 2950). In some examples, the authorization user interface (e.g., 2948) is a system-generated authorization user interface that is used for payments in other contexts (e.g., in-app and web payments). In some examples, the system-generated authorization user interface is a user interface for the second application (e.g., operating system or electronic wallet application), as described in U.S. Patent Application Serial No. 14/503,296, filed September 30, 2014, titled “USER INTERFACE FOR PAYMENTS,” the contents of which are incorporated herein by reference.
[0871] After preparing to perform the operation, the electronic device receives (3022) a confirmation input (e.g., a tap input on the device, a tap input on a touch-sensitive surface of the device, a verbal confirmation input) associated with (or corresponding to) performing the operation. In response to (3024) receiving the confirmation input, the electronic device performs one or more of blocks 3026, 3032, and 3038.
[0872] In accordance with (3026) a determination that the operation requires authorization and the operation has not been authorized, the electronic device (e.g., 2800, 2900) forgoes (3028) performing the operation in response to the confirmation input. In some examples, in accordance with (3026) a determination that the operation requires authorization and the operation has not been authorized, the electronic device forgoes (3030) unlocking the device from the locked mode
of operation to an unlocked mode of operation (e.g., a mode of operation in which the user is authenticated to the device and the device is enabled to perform one or more operations that the device is prevented from performing when in a locked mode of operation). Forgoing unlocking the device in accordance with a determination that authorization has not been successfully provided enhances device security and allows the user to store files and information (e.g., documents, photos, accounts) on the device knowing that access to the device is protected by security measures. Increasing the security of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0873] In accordance with (3032) a determination that the operation requires authorization and the operation has been authorized, the electronic device performs (3034) the operation in response to the confirmation input (e.g., sending the message with the secure attachment, or sending the payment). In some examples, in accordance with (3032) a determination that the operation requires authorization and the operation has been authorized, the electronic device (e.g., 2800, 2900) unlocks (3036) the device from the locked mode of operation to the unlocked mode of operation.
[0874] In accordance with (3038) a determination that the operation does not require authorization, the electronic device (e.g., 2800, 2900) performs (3040) the operation in response to the confirmation input (e.g., sending the message that does not include the secure attachment or payment). In some examples, in accordance with (3038) a determination that the operation does not require authorization, the electronic device forgoes unlocking the device from the locked mode of operation to the unlocked mode of operation.
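The three branches taken in response to the confirmation input (blocks 3026, 3032, and 3038) can be sketched as follows. The function and its return value are illustrative only; the patent describes device behavior, not an API. The unlocking behavior assumes the device starts in the locked mode of operation.

```python
def handle_confirmation(requires_authorization: bool, is_authorized: bool) -> dict:
    """Hypothetical sketch of the confirmation-input branching in method 3000."""
    if requires_authorization and not is_authorized:
        # Forgo performing the operation and forgo unlocking the device (3028, 3030).
        return {"performed": False, "unlocked": False}
    if requires_authorization and is_authorized:
        # Perform the operation; the single authentication that authorized the
        # operation also unlocks the device (3034, 3036).
        return {"performed": True, "unlocked": True}
    # No authorization required: perform the operation but forgo unlocking (3040).
    return {"performed": True, "unlocked": False}

# A single authentication both authorizes the operation and unlocks the device.
assert handle_confirmation(True, True) == {"performed": True, "unlocked": True}
# Without authorization, neither the operation nor the unlock proceeds.
assert handle_confirmation(True, False) == {"performed": False, "unlocked": False}
```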
[0875] Thus, in some examples, when the electronic device is locked and the electronic device (e.g., 2800, 2900) receives valid authentication to authorize an operation, the device also transitions to an unlocked mode of operation. Accordingly, a single authentication (e.g., a single biometric authentication, a single fingerprint authentication, a single facial recognition authentication) is used to both authorize the operation and to unlock the device. In some examples, when the electronic device is locked and the electronic device receives invalid
authentication to authorize an operation, the device does not transition to the unlocked mode of operation.
[0876] In some examples, the attached item is marked as requiring authorization (e.g., to authorize opening of a protected file, to authorize a payment associated with the attachment).
[0877] In some examples, the attached item is a payment object that represents a payment to the message participant (e.g., a payment object that authorizes payment to the message participant by a bank or other financial institution or a digital representation of a payment made to the message participant).
[0878] In some examples, the attached item is a request for payment (e.g., a request for a certain amount of funds) by the user of the device from the message participant.
[0879] In some examples, performing the operation in response to the confirmation input includes displaying, on the display (e.g., 2802, 2902), an indication (e.g., 2830, 2952, a confirmation notification, a textual confirmation (e.g., “Your message will be sent,” “I’ll send your message”), an audio confirmation, a feedback indicating confirmation) that the message (with the attachment) will be sent to the message participant in the message conversation of the messaging application. Displaying an indication that the message will be sent to the participant provides the user with visual feedback about the state of the device, such as whether the operation has been (or will be) performed. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0880] In some examples, prior to performing the operation in response to the confirmation input, the electronic device (e.g., 2800, 2900) outputs a prompt (e.g., a visual indication such as a text bar or a prompt, and/or an audio indication) to include a user-specified message along with the attached item (e.g., in the message or in a subsequent or prior message in the message
conversation). In some examples, subsequent to outputting the prompt to include the user-specified message along with the attached item, the electronic device receives, via one or more input devices (e.g., a microphone, a displayed keyboard), additional input (e.g., an utterance, a typed input). The electronic device sends text corresponding to the additional input to the participant in the message conversation along with the attachment.
[0881] In some examples, in accordance with a determination, based on the utterance from the user, that a graphical animation (e.g., a dynamic visual effect, such as a moving pattern, moving elements, and/or changing colors) is to be associated with the message, the electronic device (e.g., 2800, 2900) requests, via the one or more output devices (e.g., a visual request via the display, an audio request via speakers), user selection of a graphical animation. In some examples, the electronic device provides a plurality of different graphical animations that can be applied for the user to choose from. In some examples, the electronic device receives, via the one or more input devices, the user selection of a first graphical animation (e.g., animation comprising falling cash, an animation comprising fireworks, an animation comprising an unwrapping gift box, an animation comprising an opening envelope). In some examples, the electronic device associates the first graphical animation with the message prior to sending the message to the message participant. In some examples, if the message is an instant message (e.g., a text message), when the message participant receives the message in a text messaging application on the message participant’s external device, the message is displayed in the message conversation of the text messaging application with the first graphical animation being applied.
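The association of a user-selected graphical animation with a message, described above, can be sketched as follows. The animation names are taken from the examples in the paragraph; the function and message representation are hypothetical.

```python
# Example animations offered for user selection (from the paragraph above).
ANIMATIONS = ("falling cash", "fireworks", "unwrapping gift box", "opening envelope")

def attach_animation(message: dict, choice: str) -> dict:
    """Hypothetical sketch: associate the user-selected graphical animation
    with the message prior to sending it to the message participant."""
    if choice not in ANIMATIONS:
        raise ValueError(f"unknown animation: {choice}")
    # The recipient's messaging application applies the animation on display.
    return {**message, "animation": choice}

msg = attach_animation({"text": "Congrats!", "amount": "$28"}, "fireworks")
assert msg["animation"] == "fireworks"
```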
[0882] In some examples, prior to presenting the instructions for providing authorization to the device, via the one or more output devices of the device, to perform the operation, the electronic device (e.g., 2800, 2900) displays, on the display (e.g., 2802, 2902), an indication of a send option (e.g., 2940, for sending an attachment, for sending a payment) and an indication of a request option (e.g., 2984, for requesting an attachment, for requesting a payment).
[0883] In some examples, prior to presenting the instructions for providing authorization to the device, via the one or more output devices of the device, to perform the operation, the electronic device (e.g., 2800, 2900) displays, on the display (e.g., 2802, 2902), a send option (e.g., 2818, 2940, for sending an attachment, for sending a payment) and a request option (e.g.,
2984, for requesting an attachment, for requesting a payment). For example, the send option (e.g., 2818, 2940) and the request option (e.g., 2984) are displayed in accordance with the electronic device determining that the utterance (e.g., a word, a phrase, a natural language input) corresponds to the operation (e.g., with high confidence, confidence above a threshold) but with a confidence below a confidence threshold as to whether the utterance corresponds to a send operation or a request operation. The electronic device receives user input selecting the send option (e.g., 2818, 2940) or the request option (e.g., 2984). In accordance with the received user input corresponding to activation of the send option, the electronic device presents the instructions for providing authorization to the device. In some examples, in accordance with the received user input corresponding to activation of the request option (e.g., 2984), the electronic device forgoes presenting the instructions for providing authorization to the device and, optionally, transmits the request (e.g., request for payment). Thus, in some examples, the device requires authorization for sending payments and does not require authorization for requesting payments. In some examples, the send option and the request option are not displayed in accordance with the electronic device determining that the utterance (e.g., a word, a phrase, a natural language input) corresponds to the operation (e.g., with high confidence, confidence above a threshold) and with a confidence above the confidence threshold as to whether the utterance corresponds to a send operation or a request operation.
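The confidence-based disambiguation described above can be sketched as follows. The numeric thresholds and the scoring scheme are hypothetical; the patent only describes confidences relative to unspecified thresholds, where `send_confidence` is assumed to score how likely the utterance is a send (versus a request) operation.

```python
OP_CONFIDENCE_THRESHOLD = 0.8      # hypothetical: utterance corresponds to the operation
SEND_VS_REQUEST_THRESHOLD = 0.6    # hypothetical: send-vs-request disambiguation

def resolve_operation(op_confidence: float, send_confidence: float) -> str:
    """Hypothetical sketch of the send/request disambiguation in paragraph [0883]."""
    if op_confidence < OP_CONFIDENCE_THRESHOLD:
        return "no-op"  # utterance not confidently recognized as the operation
    if send_confidence >= SEND_VS_REQUEST_THRESHOLD:
        return "send"
    if send_confidence <= 1 - SEND_VS_REQUEST_THRESHOLD:
        return "request"
    # Confident about the operation, but unsure whether it is a send or a
    # request: display both the send option and the request option.
    return "show-send-and-request-options"

assert resolve_operation(0.9, 0.5) == "show-send-and-request-options"
```

When both options are shown, activating the send option leads to the authorization instructions, while activating the request option forgoes them, matching the asymmetry described above.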
[0884] In some examples, the electronic device (e.g., 2800, 2900) receives a user input selection. In accordance with a determination that the user input selection corresponds to activation of the send option, the electronic device designates the attachment as a payment corresponding to a transfer of an amount of funds from the user to the message participant. In some examples, the technique subsequently proceeds to preparing to perform the operation, as described above. In accordance with a determination that the user input selection corresponds to activation of the request option, the electronic device designates the attachment as a payment request corresponding to a request for transfer of an amount of funds to the user from the message participant. In some examples, the technique subsequently proceeds to preparing to perform the operation, as described above.
[0885] In some examples, the electronic device (e.g., 2800, 2900) receives user selection (e.g., via a touch input of the option, via verbal instructions to select the option) of the send option (e.g., 2940). In response to receiving the user selection of the send option (e.g., 2818, 2940), the electronic device designates the attachment as a payment corresponding to a transfer of an amount of funds from the user to the message participant. In some examples, the technique subsequently proceeds to preparing to perform the operation, as described above.
[0886] In some examples, the electronic device (e.g., 2800, 2900) receives user selection (e.g., via a touch input of the option, via verbal instructions to select the option) of the request option (e.g., 2984). In response to receiving the user selection of the request option (e.g., 2984), the electronic device designates the attachment as a payment request corresponding to a request for transfer of an amount of funds to the user from the message participant. In some examples, the technique subsequently proceeds to preparing to perform the operation, as described above.
[0887] In some examples, the authentication information includes biometric authentication information (e.g., a fingerprint for fingerprint authentication, a facial feature for facial recognition, a voice input for voice recognition, an iris scan for iris recognition, retina scan for retina recognition).
[0888] In some examples, the authorization user interface (e.g., 2822, 2948) includes an indication of a resource account (e.g., a payment account, such as a debit card or a checking account, a points account, a credit account) for use in performing the operation.
[0889] In some examples, presenting, via the one or more output devices of the device, the representation of the operation and the instructions for providing the authorization to the device includes concurrently displaying, on the display: the representation of the operation (e.g., 2932), and the instructions for providing the authorization to the device, via the one or more input devices of the device, to perform the operation. Concurrently displaying a representation of the operation and instructions for providing authorization to perform the operation provides the user with visual feedback about the operation that will be performed (once authorized) and about the state of the device (state in which authentication is required), and prompts the user to provide the authorization (e.g., through biometric authentication, such as via a fingerprint authentication or
facial recognition). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0890] In some examples, presenting, via the one or more output devices of the device, the representation of the operation and the instructions for providing the authorization to the device includes: outputting, via the one or more output devices (e.g., an audio output via a speaker), an audio (e.g., verbal) (or, alternatively or in addition, a visual description) description of the operation; and outputting, via the one or more output devices (e.g., an audio output via a speaker), audio (e.g., verbal) (or, alternatively or in addition, a visual instruction) instructions for providing authorization to the device to enable performing of the operation. Outputting audio description of the operation and audio instructions for providing authorization to perform the operation provides the user with audio feedback about the operation that will be performed (once authorized) and about the state of the device (state in which authentication is required), and prompts the user to provide the authorization (e.g., through biometric authentication, such as via a fingerprint authentication or facial recognition). Providing improved audio feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0891] In some examples, in response to receiving the utterance, and prior to preparing to perform the operation, the electronic device performs speech recognition on the utterance to determine a text representation of the utterance, wherein the operation is performed based on an analysis of the text representation of the utterance.
[0892] In some examples, speech recognition is performed using speech-to-text (STT) processing, such as through an Automatic Speech Recognition (ASR) system. One or more ASR
systems can process the speech input to produce a recognition result. Each ASR system includes a front-end speech pre-processor. The front-end speech pre-processor extracts representative features from the speech input. For example, the front-end speech pre-processor performs a Fourier transform on the speech input to extract spectral features that characterize the speech input as a sequence of representative multi-dimensional vectors. Further, each ASR system includes one or more speech recognition models (e.g., acoustic models and/or language models) and implements one or more speech recognition engines. Examples of speech recognition models include Hidden Markov Models, Gaussian-Mixture Models, Deep Neural Network Models, n-gram language models, and other statistical models. Examples of speech recognition engines include the dynamic time warping based engines and weighted finite-state transducers (WFST) based engines. The one or more speech recognition models and the one or more speech recognition engines are used to process the extracted representative features of the front-end speech pre-processor to produce intermediate recognition results (e.g., phonemes, phonemic strings, and sub-words), and ultimately, text recognition results (e.g., words, word strings, or sequence of tokens). In some examples, the speech input is processed at least partially by a third-party service or on the user’s device (e.g., the electronic device) to produce the recognition result. Once the STT processing produces recognition results containing a text string (e.g., words, or sequence of words, or sequence of tokens), the recognition result is passed to a natural language processing module for intent deduction. In some examples, STT processing produces multiple candidate text representations of the speech input. Each candidate text representation is a sequence of words or tokens corresponding to the speech input.
In some examples, each candidate text representation is associated with a speech recognition confidence score. Based on the speech recognition confidence scores, STT processing ranks the candidate text representations and provides the n-best (e.g., n highest ranked) candidate text representation(s) to the natural language processing module for intent deduction, where n is a predetermined integer greater than zero. For example, in one example, only the highest ranked (n=1) candidate text representation is passed to the natural language processing module for intent deduction. In another example, the five highest ranked (n=5) candidate text representations are passed to the natural language processing module for intent deduction.
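The n-best selection described above can be sketched as follows. The ranking by speech recognition confidence score is as described; the candidate utterances and scores are made up for illustration.

```python
def n_best(candidates: list[tuple[str, float]], n: int = 1) -> list[str]:
    """Hypothetical sketch: rank candidate text representations by their speech
    recognition confidence scores and return the n highest-ranked candidates,
    which are passed on to natural language processing for intent deduction."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    return [text for text, _score in ranked[:n]]

# Illustrative candidate text representations with confidence scores.
candidates = [
    ("pay john twenty eight dollars", 0.92),
    ("pay jon twenty eight dollars", 0.81),
    ("play john twenty eight dollars", 0.44),
]
assert n_best(candidates) == ["pay john twenty eight dollars"]  # n = 1
assert len(n_best(candidates, n=5)) == 3  # fewer candidates than n
```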
[0893] More details on the speech-to-text processing are described in U.S. Utility Application Serial No. 13/236,942 for “Consolidating Speech Recognition Results,” filed on September 20, 2011, the entire disclosure of which is incorporated herein by reference.
[0894] In some examples, the analysis of the text representation of the utterance comprises performing natural language processing on the text representation of the utterance to determine an actionable intent (of a user of the device).
[0895] In some examples, the natural language processing module (“natural language processor”) of a digital assistant takes the n-best candidate text representation(s) (“word sequence(s)” or “token sequence(s)”) generated by STT processing, and attempts to associate each of the candidate text representations with one or more “actionable intents” recognized by the digital assistant. An “actionable intent” (or “user intent”) represents a task that can be performed by the digital assistant. The associated task flow is a series of programmed actions and steps that the digital assistant takes in order to perform the task. The scope of a digital assistant’s capabilities is dependent on the number and variety of task flows that have been implemented and stored in various task flow models, or in other words, on the number and variety of “actionable intents” that the digital assistant recognizes. The effectiveness of the digital assistant, however, also depends on the assistant’s ability to infer the correct “actionable intent(s)” from the user request expressed in natural language.
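The association of candidate text representations with actionable intents can be sketched with a toy keyword matcher. The intent names and keyword lists are hypothetical; a real natural language processor uses the task flow models described above, not keyword lookup.

```python
from typing import Optional

# Hypothetical intent vocabulary; the actual set of "actionable intents" is
# defined by the task flows implemented in the digital assistant.
INTENT_KEYWORDS = {
    "send_payment": ("send", "pay"),
    "request_payment": ("request", "ask for"),
}

def infer_intent(candidate_texts: list[str]) -> Optional[str]:
    """Hypothetical sketch of intent deduction: try to associate each candidate
    text representation (in rank order) with a recognized actionable intent."""
    for text in candidate_texts:
        for intent, keywords in INTENT_KEYWORDS.items():
            if any(kw in text.lower() for kw in keywords):
                return intent
    return None  # no candidate matched a recognized actionable intent

assert infer_intent(["pay john $28"]) == "send_payment"
assert infer_intent(["request $28 from john"]) == "request_payment"
```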
[0896] Other details of inferring a user intent based on candidate actionable intents determined from multiple candidate text representations of a speech input are described in U.S. Utility Application Serial No. 14/298,725 for “System and Method for Inferring User Intent From Speech Inputs,” filed June 6, 2014, the entire disclosure of which is incorporated herein by reference.
[0897] Note that details of the processes described above with respect to method 3000 (e.g., FIGS. 30A-30D) are also applicable in an analogous manner to the methods described herein. For example, method 3000 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 900, 1200, 1500, 1800, 2100, 2400, 2700, and 3400. For example, a payment message object created to transfer the first type of item (e.g.,
a sticker, a photo, a payment object), as described in method 900, can be created via an utterance (e.g., 2903). For another example, the outputting of feedback, as described in method 1200, can be applied to a representation of the operation after it has been sent via a messaging application. For another example, the message objects with different visual appearances based on whether the message object corresponds to a transmission message or a request message, as described in method 1500, are applicable to the different types of operations that can be initiated by an utterance (e.g., 2903). For another example, a request for activating an account that is authorized to obtain one or more items (e.g., a sticker, a photo, resources, a payment), as described in method 1800, can be applied when setting up an account associated with an operation initiated by an utterance (e.g., 2903). For another example, switching the account to be used in a resource transfer based on an indication that resources are insufficient in the currently-selected account, as described in method 2100, can be used when switching the account to be used in the operation initiated by an utterance (e.g., 2903). For another example, automatically proceeding with a transfer, as described in method 2400, instead of requiring user input, can be used when transmitting an operation initiated by an utterance (e.g., 2903). For another example, the plurality of items including information from messages in a message conversation, as described in method 2700, can include information from operations initiated by an utterance (e.g., 2903). For another example, a visual effect (e.g., a coloring effect, a geometric alteration effect) can be applied, as described in method 3400, to an element of a message object (e.g., 2932) when a transfer (e.g., of a resource, of a file, of a payment) associated with a message corresponding to the message object is completed. For brevity, these details are not repeated below.
[0898] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with reference to FIGS. 30A-30D are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, receiving operation 3002, preparing operation 3010, presenting operation 3014, receiving operation 3022, performing operation 3028, performing operation 3034, and performing operation 3040 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
[0899] FIGS. 31A-31M illustrate exemplary user interfaces for user verification, in accordance with some embodiments. FIG. 31A illustrates an electronic device 3100 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 31A-31M, electronic device 3100 is a smartphone. In other embodiments, electronic device 3100 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 3100 has a display 3102 and one or more input devices (e.g., touchscreen of display 3102, a mechanical button 3104, a mic, a camera).
[0900] In some embodiments, the exemplary user interfaces for user verification described below with reference to FIGS. 31A-31M can be used by an electronic device described herein (e.g., electronic device 700, 800, 850, 1000, 1100, 1300, 1400, 1600, 1700, 1900, 2000, 2100, 2300, 2500, 2600, 2800, 2900, 3200, and/or 3300) to verify a user of the device (e.g., to verify that the current user of the device is the user corresponding to a user account logged into the device). For example, in FIG. 8W, in accordance with a determination that fingerprint information 815 is (e.g., for a second time) not consistent with the enrolled fingerprint information (for authorizing a transaction or for verifying the user) associated with the user account, the user verification techniques described in FIGS. 31A-31M can be used by the device to verify the user. For another example, in FIG. 11P, in accordance with a determination that
fingerprint information 1111 is (e.g., for a second time) not consistent with the enrolled fingerprint information (for authorizing a transaction or for verifying the user) associated with the user account, the user verification techniques described in FIGS. 31A-31M can be used by the device to verify the user. For another example, in FIG. 20I, in accordance with a determination that fingerprint information 2003 is (e.g., for a second time) not consistent with the enrolled fingerprint information (for authorizing a transaction or for verifying the user) associated with the user account, the user verification techniques described in FIGS. 31A-31M can be used by the device to verify the user. For another example, in FIG. 23M, in accordance with a determination that fingerprint information 2311 is (e.g., for a second time) not consistent with the enrolled fingerprint information (for authorizing a transaction or for verifying the user) associated with the user account, the user verification techniques described in FIGS. 31A-31M can be used by the device to verify the user.
[0901] In FIG. 31A, electronic device 3100 displays, on display 3102, an automatic verification user interface 3106 for verifying a user of the device (e.g., to verify that the current user of the device is the user corresponding to a user account logged into the device). As shown in FIG. 31A, automatic verification user interface 3106 includes a capture region 3108. In some embodiments, automatic verification user interface 3106 also includes a verification request 3110 (e.g., stating “Verify Photo ID”) informing the user of a request for user verification. In some embodiments, automatic verification user interface 3106 also includes an indication 3112 (e.g., “Use DL or State ID in the frame”) informing the user of an allowable input object (e.g., a driver’s license, a government-issued identification card, a passport) for the verification.
[0902] In FIG. 31B, while displaying automatic verification user interface 3106, electronic device 3100 detects (e.g., via a camera of the device) an input object provided by the user of the device (e.g., “Kate Appleseed”). For example, as shown in FIG. 31B, the input object is a government-issued identification card 3114 (e.g., a California state ID) provided by the user. In some embodiments, the device displays, while and after capturing an image of government-issued identification card 3114, the captured image of government-issued identification card 3114 within capture region 3108.
[0903] As shown in FIG. 31C, in response to capturing the image of government-issued identification card 3114, automatic verification user interface 3106 displays a progress page 3106 including an indication 3116 that the ID-verification is currently in progress.
[0904] As shown in FIG. 31D, in accordance with a determination (e.g., made by the device or by an external device, such as a server, communicating with the device) that verification was successful (e.g., because user identification information obtained from the captured government-issued identification card 3114 is consistent with enrolled user identification information stored on the device or stored on an external device, such as a server, communicating with the device), automatic verification user interface 3106 displays a confirmation indication 3116 (e.g., stating “Verified,” “Your Account Has Been Verified”) informing the user of the successful verification. In some embodiments, automatic verification user interface 3106 also displays a done button 3118 for leaving the verification interface.
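The verification determination described above turns on whether captured identification information is consistent with the enrolled information. A minimal, hypothetical sketch of that decision (field names and records are illustrative only):

```python
# Hypothetical sketch: verification succeeds only when every required field
# captured from the identification card matches the enrolled record.
def verify_user(captured: dict, enrolled: dict,
                required=("name", "id_number")) -> bool:
    return all(captured.get(field) == enrolled.get(field)
               for field in required)

# Usage with illustrative records:
enrolled = {"name": "Kate Appleseed", "id_number": "D1234567"}
ok = verify_user({"name": "Kate Appleseed", "id_number": "D1234567"}, enrolled)
bad = verify_user({"name": "Kate Appleseed", "id_number": "X0000000"}, enrolled)
```

On success the interface would show the confirmation indication; on failure the device would fall back to the manual verification flow described next.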
[0905] In some embodiments, in accordance with a determination (e.g., made by the device or by an external device, such as a server, communicating with the device) that verification using identification information captured from government-issued identification card 3114 was not successful, electronic device 3100 displays, on display 3102, a manual verification user interface 3120. In some embodiments, manual verification user interface 3120 is displayed in the first instance of the verification (e.g., instead of automatic verification user interface 3106). In some embodiments, as shown in FIG. 31E, manual verification user interface 3120 includes an (e.g., graphical and/or textual) indication that manual verification is required for user verification and a continue button 3124 for proceeding with the manual verification process.
[0906] In FIG. 31F, in response to detecting selection (e.g., a tap gesture) of continue button 3124, electronic device 3100 displays, on display 3102, a first questions page of manual verification user interface 3120 that requests from the user a first set of identification information. For example, as shown in FIG. 31F, the first set of identification information includes a request 3120A for a first name, a request 3120B for a last name, a request 3120C for a street address, a request 3120D for a resident state, and a request 3120E for a zip code. In some embodiments, first questions page of manual verification user interface 3120 also includes a cancel button 3126
for canceling the verification process and a next button 3125 for proceeding with the verification process (after having provided the requested information from the first set of questions).
[0907] In some embodiments, in FIG. 31G, in response to detecting user selection (e.g., a tap gesture) of next button 3125 from the first questions page of manual verification user interface 3120, electronic device 3100 displays a second questions page of manual verification user interface 3120 that requests from the user a second set of identification information. For example, as shown in FIG. 31G, the second set of identification information includes a request 3120F for (a portion of) a government identification number (e.g., last four digits of a social security number, last four digits of an individual taxpayer identification number) and a request 3120G for date of birth information of the user. In some embodiments, second questions page of manual verification user interface 3120 maintains display of cancel button 3126 and next button 3125.
[0908] In some embodiments, in FIG. 31H, in response to detecting user selection (e.g., a tap gesture) of next button 3125 from the second questions page of manual verification user interface 3120, electronic device 3100 displays a third questions page of manual verification user interface 3120 that requests from the user a third set of identification information. For example, as shown in FIG. 31H, the third set of identification information includes a request 3120H for a full government identification number (e.g., full social security number, full individual taxpayer identification number). In some embodiments, request 3120H corresponds to a selectable indication, and, in response to detecting a user input (e.g., a tap gesture) on a selectable region of request 3120H, the device displays a virtual keypad 3128 for entering the requested information. In some embodiments, in response to a determination that the full digits of the requested verification information (e.g., all digits of a requested social security number) have been entered, the device displays an indication 3129 that the entered identification number (e.g., the full social security number) is being verified (e.g., by an external device, such as a server, in communication with the device).
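The behavior described above, where the device proceeds to verification only once all digits of the requested number have been entered, can be sketched as follows. The digit count and state names are illustrative assumptions, not part of the described interface:

```python
# Illustrative sketch: the keypad input is monitored, and verification
# (indication 3129) begins once the full number of digits is present.
REQUIRED_DIGITS = 9  # e.g., a full social security number

def on_keypad_input(entered: str) -> str:
    digits = [c for c in entered if c.isdigit()]
    if len(digits) == REQUIRED_DIGITS:
        return "verifying"   # display the "being verified" indication
    return "collecting"      # keep the virtual keypad active

# Usage: eight digits keep collecting; nine digits trigger verification.
state_partial = on_keypad_input("123-45-678")
state_full = on_keypad_input("123-45-6789")
```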
[0909] In some embodiments, in FIG. 31I, in response to a determination (e.g., by the external device in communication with the device) that the provided government identification number (e.g., a social security number) was consistent with enrolled identification information,
electronic device 3100 displays a first question page of manual verification user interface 3120 that requests from the user an answer to a first question 3130 provided in the first question page of manual verification user interface 3120. For example, as shown in FIG. 31I, first question 3130 requests correct selection of a current or former street address of the user, and includes four possible answer choices 3130A-3130D (e.g., with only one being the correct answer choice). In FIG. 31I, answer choice 3130C is selected (as indicated by the checkmark).
[0910] In some embodiments, in FIG. 31J, in accordance with a determination that answer choice 3130C was the correct answer choice to first question 3130, first question page of manual verification user interface 3120 provides an indication (e.g., stating “Verified”) that the verification (that the current user of the device is the user associated with the user account logged into the device) was successful. In some embodiments, additional questions are asked by the device, via manual verification user interface 3120, for further verification.
[0911] In some embodiments, as shown in FIG. 31K, in accordance with a determination that verification was unsuccessful (e.g., subsequent to a determination that the answer choice provided for first question 3130 was not correct), electronic device 3100 displays a verification failed page of manual verification user interface 3120. For example, as shown in FIG. 31K, verification failed page of manual verification user interface 3120 includes an indication 3134 (e.g., stating “Verification Failed”) informing the user that the verification was unsuccessful and an indication 3136 (e.g., stating “Account Under Review”) that the user account currently logged into the device will undergo review (e.g., via an external server). In some embodiments, as shown in FIG. 31L, (e.g., if a user account is undergoing review) a plurality of features connected with and/or associated with the user account is disabled (e.g., the user account is restricted). Thus, in some embodiments, as shown in FIG. 31L, verification failed page of manual verification user interface 3120 includes an indication 3138 (e.g., stating “Account is Restricted”) that the user account is currently restricted from use (e.g., while the account is undergoing review).
[0912] In some embodiments, as shown in FIG. 31M, (e.g., if a user account is undergoing review) all features connected with and/or associated with the user account are disabled (e.g., the
user account is locked). Thus, in some embodiments, as shown in FIG. 31M, verification failed page of manual verification user interface 3120 includes an indication 3140 (e.g., stating “Account Locked”) that the user account is currently locked from use (e.g., while the account is undergoing review). In some embodiments, as also shown in FIG. 31M, verification failed page of manual verification user interface 3120 includes a contact affordance 3143 for reporting the locked account and/or contacting an account management team (e.g., in order to discuss unlocking the locked account).
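The distinction drawn above, between a restricted account (a plurality of features disabled) and a locked account (all features disabled), can be sketched as a simple state check. The feature names and states below are hypothetical examples:

```python
# Hypothetical sketch of account states during review: "restricted" disables
# some features, "locked" disables all of them.
FEATURES = {"send_payment", "request_payment", "view_balance"}
RESTRICTED_DISABLED = {"send_payment", "request_payment"}

def available_features(account_state: str) -> set:
    if account_state == "locked":
        return set()                          # all features disabled
    if account_state == "restricted":
        return FEATURES - RESTRICTED_DISABLED # a plurality disabled
    return FEATURES                           # account in good standing

locked = available_features("locked")
restricted = available_features("restricted")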
[0913] FIGS. 32A-32D illustrate exemplary user interfaces for automatic account onboarding, in accordance with some embodiments. FIG. 32A illustrates an electronic device 3200 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 32A-32D, electronic device 3200 is a smartphone. In other embodiments, electronic device 3200 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch). Electronic device 3200 has a display 3202 and one or more input devices (e.g., touchscreen of display 3202, a mechanical button 3204, a mic, a camera).
[0914] The exemplary user interfaces for provisioning a user’s account on the device described below with reference to FIGS. 32A-32D can be used, for example, to provision one or more of the accounts described above on the device. For example, the provisioning techniques described in FIGS. 32A-32D can be used to provision the payment account (corresponding to graphical representation 1756) described above with reference to FIGS. 17H-17K on the device. For another example, the provisioning techniques described in FIGS. 32A-32D can be used to provision the payment account (corresponding to graphical representation 2030) and/or the default account (corresponding to graphical representation 2024) described above with reference to FIGS. 20C-20J on the device. For another example, the provisioning techniques described in FIGS. 32A-32D can be used to provision the payment account (corresponding to graphical representation 2330) and/or the default account (corresponding to graphical representation 2324) described above with reference to FIGS. 23A-23C on the device. For another example, the provisioning techniques described in FIGS. 32A-32D can be used to provision the payment account (corresponding to graphical representation 2669) and the debit card account
(corresponding to graphical representation 2671) described above with reference to FIGS. 26O-26R on the device.
[0915] In FIG. 32A, electronic device 3200 displays, on display 3202, an automatic account setup user interface 3206 for provisioning a user’s account on electronic device 3200 for use (in making payment transactions) via the device. In some embodiments, automatic account setup user interface 3206 corresponds to automatic verification user interface 3106 described above with reference to FIG. 31A.
[0916] As shown in FIG. 32A, automatic account setup user interface 3206 includes a capture region 3208 for capturing (e.g., via a camera of the device) an image of an account (e.g., an image of a check associated with a checking account, an image of a physical debit card associated with a checking account) to be provisioned onto electronic device 3200. In some embodiments, automatic account setup user interface 3206 also includes a setup request 3210 (e.g., stating “Add account”) indicating to the user that an account for provisioning is being requested to be captured (via capture region 3208). In some embodiments, automatic account setup user interface 3206 also includes a selectable manual setup option 3212 (e.g., stating “Add Account Details Manually”) informing the user of a manual process (instead of an automatic process using capture region 3208).
[0917] In some embodiments, if the user provides a check 3209 corresponding to a checking account of the user to be captured (e.g., via a camera) by electronic device 3200, the device automatically detects account information from the captured check (e.g., user name information, bank name information, account number information, routing number information) to automatically (e.g., without any other user input of account information) provision the account corresponding to the captured check on the device for use by the device (e.g., for making payment transactions).
[0918] In some embodiments, as shown in FIG. 32B, in response to user selection of manual setup option 3212, electronic device 3200 displays, on display 3202, a manual account setup user interface 3214 for manually (e.g., using a virtual keyboard) entering account information for provisioning the account onto the device. For example, as shown in FIG. 32B, manual account
setup user interface 3214 includes a request 3214A for the user’s full name, a request 3214B for the bank name (of the account to be provisioned), a request 3214C for a routing number (of the account to be provisioned), and a request 3214D for an account number (of the account to be provisioned). In some embodiments, as also shown in FIG. 32B, manual account setup user interface 3214 includes a selectable account details unavailable option 3216 for when account details are currently unavailable to be entered. In some embodiments, in response to user selection (e.g., a tap gesture) of account details unavailable option 3216, the device exits the account setup process (and ceases to display the account setup user interface).
[0919] In some embodiments, as shown in FIG. 32C, while displaying manual account setup user interface 3214, in response to detecting user input (e.g., a tap gesture) on a selectable region of a request (e.g., where text can be entered, such as region 3218 corresponding to request 3214C), electronic device 3200 displays a virtual keypad 3220 (or a virtual alphanumeric keyboard) for use by the user when entering the requested account details information.
[0920] FIG. 32D shows manual account setup user interface 3214 with all of requested information 3214A-3214D having been entered (by the user of the device). In some embodiments, in response to a determination (e.g., by the device or by an external device, such as a server, in communication with the device) that all requested account detail information corresponding to 3214A-3214D has been entered, manual account setup user interface 3214 displays a done button 3222 for completing the setup process, and thereby provisioning the account (e.g., of the Western Bank) onto the device for use by the device (when performing payment transactions).
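The completion check described above, where the done button appears only once all requested account details (3214A-3214D) have been entered, can be sketched as follows. The field names are hypothetical stand-ins for the four requests:

```python
# Illustrative sketch: the done button is shown only when every requested
# account detail field has a non-empty value.
def show_done_button(fields: dict) -> bool:
    required = ("full_name", "bank_name", "routing_number", "account_number")
    return all(fields.get(key) for key in required)

# Usage with illustrative entries:
partial = {"full_name": "Kate Appleseed", "bank_name": "Western Bank"}
complete = dict(partial, routing_number="121000000", account_number="00012345")
done_partial = show_done_button(partial)
done_complete = show_done_button(complete)
```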
[0921] FIGS. 33A-33O illustrate exemplary user interfaces for peer-to-peer transfers, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 34A-34D.
[0922] FIG. 33A illustrates an electronic device 3300 (e.g., portable multifunction device 100, device 300, or device 500). In the non-limiting exemplary embodiment illustrated in FIGS. 33A-33O, electronic device 3300 is a smartphone. In other embodiments, electronic
device 3300 can be a different type of electronic device, such as a wearable device (e.g., a smartwatch).
[0923] FIG. 33A shows a user of electronic device 3300 (e.g., “Kate Appleseed”) viewing a display of the device while holding the device at a predefined default orientation (e.g., predefined by an operating system of the device) relative to a baseline orientation with respect to a reference point (e.g., the ground, a face of a viewer, such as the user of the device). In some embodiments, the predefined default orientation is a resting state orientation. In some embodiments, the predefined default orientation is a 45 degree tilt from the surface of the ground.
[0924] FIG. 33B shows a display 3302 of electronic device 3300, which also has one or more sensor devices (e.g., an accelerometer, one or more cameras) and, optionally one or more input devices (e.g., a touchscreen of the display, a mechanical button 3304, a mic).
[0925] In FIG. 33B, electronic device 3300 displays, on display 3302, a message conversation 3308 of a messaging application 3306 between the user and a message participant 3310 (e.g., “John Appleseed”) (e.g., while the device is at the predefined default orientation relative to the baseline orientation). In some embodiments, message participant 3310 is a contact stored on the device. In some embodiments, message participant 3310 is a contact of a contact list associated with the user account logged onto the device. In some embodiments, message participant 3310 is a contact included in a trusted contacts list associated with the user account logged onto the device.
[0926] In some embodiments, electronic device 3300 also displays, on display 3302, a virtual keyboard 3312 (e.g., an alphanumeric keyboard for typing a message) and a compose bar 3314 displaying the text of a message as a message is typed using virtual keyboard 3312. In some embodiments, a mechanical keyboard can be used in addition to or alternatively to virtual keyboard 3312 to type a message. In some embodiments, compose bar 3314 can expand (e.g., expand upwards) to accommodate a longer or larger message or message object (e.g., an image, an emoticon, a special type of message object, such as a payment object). In some embodiments, compose bar 3314 includes a mic button 3316 which, when activated, enables the user to record a message using voice input.
[0927] As shown in FIG. 33B, message conversation 3308 includes a message object 3318 sent by the user to message participant 3310. In the message corresponding to message object 3318, the user states to message participant 3310: “Dinner was $28.” As also shown in FIG. 33B, message conversation 3308 includes a pending payment message object 3320 (e.g., similar to payment message object 1118 described above with respect to FIGS. 11A-11C) sent by message participant 3310 to the user of the device. As with payment message object 1118, pending payment message object 3320 includes a mode indication 3322 (e.g., stating “PAY”) indicating to the user that the payment message object corresponds to a payment made by message participant 3310 to the user via an operating-system controlled (first-party) payment transfer application (and not by a third-party application). Pending payment message object 3320 also includes an amount object 3324 (e.g., “$28”) of the amount of the payment sent by message participant 3310 to the user. Further, pending payment message object 3320 includes an accept button 3326 for accepting the payment corresponding to the message object in the amount shown in amount object 3324. Additionally, pending payment message object 3320 includes a status indicator 3328 (e.g., stating “PENDING”) informing the user that the payment (e.g., of $28) corresponding to the payment message object is pending (e.g., as opposed to being accepted / completed). In some embodiments, as with payment message object 1118, pending payment message object 3320 also includes an accompanying note message object 3330. In FIG. 33B, message participant 3310 informs the user, via note message object 3330, that the payment corresponding to payment message object 3320 is “For dinner” (that was requested by the user via message object 3318).
[0928] FIG. 33C shows electronic device 3300, while displaying the display (including payment message object 3320 of message conversation 3308) depicted in FIG. 33B, being viewed at two different orientations (e.g., at an angle 3300A and at an angle 3300B) relative to the baseline orientation with respect to a reference point 3332 (e.g., where the reference point is the ground). As shown in FIG. 33C, even if the device is viewed from the perspective of the two different orientations (e.g., represented by angle 3300A and angle 3300B relative to a baseline orientation with respect to reference point 3332), pending payment message object 3320 is displayed the same at either angle. In other words, whether a viewer (e.g., the user) views display 3302 of the device at an orientation corresponding to angle 3300A, or whether a viewer (e.g., the user) views display 3302 of the device at an orientation corresponding to angle 3300B,
or whether a viewer (e.g., the user) views display 3302 of the device from straight on (e.g., such that the device is at the predefined default orientation), there is no change in how payment message object 3320, or an element of the payment message object (e.g., amount object 3324), is displayed on display 3302 by the device. In some embodiments, moving towards the orientations corresponding to angles 3300A and 3300B includes movement of the device away from the baseline orientation.
[0929] In some embodiments, while displaying payment message object 3320 within message conversation 3308, electronic device 3300 detects a user activation of accept button 3326 of the payment message object. As shown in FIG. 33D, in response to detecting the user activation of accept button 3326 (and thereby accepting the payment from message participant 3310), accept button 3326 ceases to be displayed on the payment message object.
[0930] As also shown in FIG. 33D, in response to detecting the user activation of accept button 3326 (and thereby accepting the payment from message participant 3310), pending payment message object 3320 as shown in FIG. 33B is updated to a corresponding completed payment message object 3334, and status indicator 3328 (e.g., stating “PAID”) is also updated accordingly. Further, electronic device 3300 generates one or more feedbacks (e.g., a visual effect, a sensory feedback, such as a haptic effect, an audio feedback) indicating to the user that the payment has been accepted and that the payment message object now corresponds to a completed (instead of a pending) payment.
[0931] In some embodiments, one or more visual feedbacks are applied to amount object 3324 of completed payment message object 3334. To more specifically describe the one or more visual feedbacks applied to amount object 3324, attention is drawn to FIGS. 33E-33I. FIG. 33E shows an enlarged view of amount object 3324 of completed payment message object 3334 from FIG. 33D at three different tilts (orientations) of electronic device 3300 (relative to the predefined default orientation). When the device is at an orientation 3301A, the device is at (or within a predefined tilt limit of) the predefined default orientation (e.g., a resting state orientation, a 45 degree tilt from the surface of the ground) relative to the baseline orientation with respect to a reference point (e.g., the ground). When the device is at an orientation 3301B, the device is at a (small) clockwise vertical angular tilt from the predefined default orientation
(e.g., such that the top edge of the device is (slightly) closer to the user and the bottom edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation). When the device is at an orientation 3301C, the device is at a (small) counter-clockwise vertical angular tilt from the predefined default orientation (e.g., such that the bottom edge of the device is (slightly) closer to the user and the top edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation).
[0932] In some embodiments, at orientation 3301A, a visual feedback (e.g., having a small magnitude) is applied to amount object 3324 of completed payment message object 3334. In some embodiments, the visual feedback is a geometry alteration effect (e.g., a skewing effect, a 3D effect, a simulated depth effect) applied to at least a portion of the amount object (e.g., changing an angle or distance between lines or curves that define a shape of the object). For example, FIG. 33E shows a simulated depth effect including depth line 3325 being applied to the amount object at all orientations 3301A-3301C. In some embodiments, the simulated depth effect is also applied in conjunction with a skewing effect (e.g., which changes the amount of skew of the geometry of the amount object). In some embodiments, the skewing effect includes shifting a line that represents an upper extent of a simulated three-dimensional object (e.g., the amount object with the simulated depth effect applied) toward or away from a line that represents a lower extent of the simulated three-dimensional object (e.g., shifting a line that represents a top of a raised pattern toward or away from edges of the raised pattern), or shifting a center line (e.g., depth line 3325) that represents a lower extent of a simulated three-dimensional object (e.g., the amount object with the simulated depth effect applied) toward or away from a line that represents an upper extent of the simulated three-dimensional object (e.g., shifting a line that represents a bottom of an engraved pattern toward or away from edges of the engraved pattern). In some embodiments, reducing the skewing effect includes decreasing the amount of shifting of the line (e.g., depth line 3325) as the orientation of the device relative to the baseline orientation changes.
In some embodiments, depth line 3325 remains stationary relative to other lines (e.g., the border lines) of the amount object, and the other lines (e.g., the border lines) shift relative to the depth line as the orientation of the device relative to the baseline orientation changes.
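The skewing behavior described above, where the displayed offset of the depth line tracks the device's angular tilt away from the predefined default orientation, can be sketched as a simple mapping. The scale factor and clamp value below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: the depth line's on-screen offset is proportional to
# the tilt away from the default orientation, clamped so the effect stays
# subtle at large tilts.
def depth_line_offset(tilt_degrees: float, scale: float = 0.5,
                      max_offset: float = 8.0) -> float:
    # Positive tilt (e.g., clockwise) shifts the line one way, negative tilt
    # the other; zero tilt (default orientation) leaves it centered.
    offset = tilt_degrees * scale
    return max(-max_offset, min(max_offset, offset))

centered = depth_line_offset(0.0)    # default orientation: line centered
tilted = depth_line_offset(10.0)     # small clockwise tilt: line shifted
clamped = depth_line_offset(-40.0)   # large tilt: shift is clamped
```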
[0933] In some embodiments, when the device is at orientation 3301A (corresponding to the predefined default orientation), the electronic device applies (slightly, with a weak magnitude) the simulated depth effect to amount object 3324 of completed payment message object 3334 (e.g., by applying depth line 3325 down the center of the amount object indicating the bottom of the simulated depth of the object). In some embodiments, when the device is (gradually) moved to orientation 3301B, a corresponding dynamic movement of depth line 3325 is displayed (e.g., the depth line of the amount object is moved (slightly) up because the device is slightly tilted upwards relative to the predefined default orientation). In some embodiments, when the device is (gradually) moved to orientation 3301C, a corresponding dynamic movement of depth line 3325 is displayed (e.g., the depth line of the amount object is moved (slightly) down because the device is slightly tilted downwards relative to the predefined default orientation). In some embodiments, at orientations 3301B and 3301C, the device also generates a haptic feedback (e.g., a tactile output), as described in greater detail below with reference to FIG. 33F.
[0934] In some embodiments, at the predefined default orientation of orientation 3301A, there is no change in the visual feedback that is applied to amount object 3324 of completed payment message object 3334 as compared to when the device is at orientation 3301B. Thus, in some embodiments, there is no visual feedback applied to amount object 3324 at any of orientations 3301A-3301C.
[0935] FIG. 33F shows electronic device 3300, while displaying amount object 3324 of completed payment message object 3334, being viewed at two different orientations (e.g., at angle 3300A and at angle 3300B) relative to the baseline orientation with respect to a reference point 3332, as first shown in FIG. 33C. More specifically, the orientation corresponding to angle 3300A is an orientation that is a (slight, such as 10 degrees or 15 degrees) counter-clockwise horizontal angular tilt from the predefined default orientation and the orientation corresponding to angle 3300B is an orientation that is a (slight, such as 10 degrees or 15 degrees) clockwise horizontal angular tilt from the predefined default orientation. In some embodiments, moving towards the orientations corresponding to angles 3300A and 3300B includes movement of the device away from the baseline orientation.
[0936] In some embodiments, electronic device 3300 includes one or more tactile output generators, and when the electronic device is at an orientation (e.g., corresponding to angle 3300A or angle 3300B) that is not the predefined default orientation, in addition to (or instead of) visual feedback applied to amount object 3324 of completed payment message object 3334, the device generates a haptic feedback (e.g., a tactile output 3336). For example, in some embodiments, in response to detecting the change in orientation of the device from the predefined default orientation (e.g., of FIG. 33D) to an orientation corresponding to angle 3300A or angle 3300B, the device generates, via the one or more tactile output generators, (e.g., for the duration of the change in the orientation of the device) tactile output 3336 that is indicative of the change in the orientation of the device (e.g., a tactile output that includes a parameter that is adjusted based on a magnitude, speed, and/or direction of change in the orientation of the device relative to the baseline orientation). Thus, in some embodiments, tactile output 3336 provides further feedback to the user that tracks the change in orientation of the device.
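The tactile output described above adjusts a parameter based on the magnitude and speed of the orientation change. A minimal sketch of one such mapping follows; the intensity formula and constants are hypothetical, chosen only to illustrate how both magnitude and speed could feed into a single output parameter:

```python
# Hypothetical sketch: tactile output intensity grows with both the size of
# the orientation change and how quickly it happens, capped at full strength.
def tactile_intensity(delta_degrees: float, delta_seconds: float,
                      k_magnitude: float = 0.02, k_speed: float = 0.01) -> float:
    magnitude = abs(delta_degrees)                            # size of change
    speed = magnitude / delta_seconds if delta_seconds > 0 else 0.0
    return min(1.0, k_magnitude * magnitude + k_speed * speed)

# Usage: a slow 10-degree tilt yields a weaker output than a fast one.
slow = tactile_intensity(10.0, 2.0)
fast = tactile_intensity(10.0, 0.25)
```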
[0937] FIG. 33G shows enlarged views of amount object 3324 of completed payment message object 3334 from FIG. 33F and how, in some embodiments, the amount object is displayed by the electronic device while at six different tilts of electronic device 3300. Orientations 3303A and 3303B correspond to angle 3300A and angle 3300B of FIG. 33F, respectively (or are within a predefined tilt limit of those angles). Orientations 3303C and 3303D correspond to when the device is at a (small) clockwise vertical angular tilt from angle 3300A and 3300B, respectively (e.g., such that the top edge of the device is (slightly) closer to the user and the bottom edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation). Orientations 3303E and 3303F correspond to when the device is at a (small) counter-clockwise vertical angular tilt from angle 3300A and 3300B, respectively (e.g., such that the bottom edge of the device is (slightly) closer to the user and the top edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation).
[0938] In some embodiments, at orientations 3303A and 3303B (e.g., where the device has moved (e.g., horizontally) away from the baseline orientation), one or more visual feedbacks are
applied to amount object 3324 of completed payment message object 3334 (e.g., at a stronger magnitude than at orientation 3301A). In some embodiments, the visual feedback is a simulated depth effect including depth line 3325 (as described above with respect to FIG. 33E). At orientations 3303A and 3303B, the change in depth line 3325 is more emphasized (and therefore more perceivable by the user) (as compared to the depth effect perceived at orientation 3301A). For example, at orientation 3303A, the depth line of the amount object has (gradually, while the device changed orientations) moved left from its position at orientation 3301A, because the user is now viewing the simulated depth within the amount object at a slanted (left) side angle relative to a straight-on view. At orientation 3303B, the depth line of the amount object has (gradually, while the device changed orientations) moved right from its position at orientation 3301A, because the user is now viewing the simulated depth within the amount object at a slanted (right) side angle relative to a straight-on view. In some embodiments, depth line 3325 remains stationary and the outline of the content shifts in location, thereby simulating the depth effect.
[0939] Further, in some embodiments, when electronic device 3300 is moved to orientations 3303C and 3303D, a corresponding dynamic movement of the simulated depth effect is displayed. For example, at orientation 3303C, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) up relative to depth line 3325 at orientation 3303A, because the orientation of the device has shifted to a higher angle relative to orientation 3303A. Similarly, at orientation 3303D, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) up relative to the depth line at orientation 3303B, because the orientation of the device has shifted to a higher angle relative to orientation 3303B. Further, in some embodiments, when the device is moved to orientations 3303E and 3303F, a corresponding dynamic movement of depth line 3325 is displayed. For example, at orientation 3303E, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) down relative to the depth line at orientation 3303A, because the orientation of the device has shifted to a lower angle relative to orientation 3303A. Similarly, at orientation 3303F, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) down relative to the depth line at orientation 3303B, because the orientation of the device has shifted to a lower angle relative to orientation 3303B. In some embodiments, depth
line 3325 remains stationary and the outline of the content shifts in location, thereby simulating the depth effect (using a dynamic boundary instead of a dynamic depth line).
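The depth-line motion in paragraphs [0938] and [0939] amounts to a parallax mapping: horizontal tilt shifts the simulated depth line left or right, and vertical tilt shifts it up or down, gradually as the device moves. A minimal sketch, with hypothetical gain values and sign conventions (the document gives no numbers):

```python
def depth_line_offset(horiz_tilt_deg, vert_tilt_deg, gain=2.0):
    """Map device tilt (degrees away from the baseline orientation) to
    a 2D pixel offset for a simulated depth line.

    Clockwise horizontal tilt moves the line right; a higher viewing
    angle moves the line up. Sign conventions are illustrative only.
    """
    dx = gain * horiz_tilt_deg   # left/right parallax
    dy = -gain * vert_tilt_deg   # screen y grows downward, so negate
    return (dx, dy)
```

The alternative noted in the text (a stationary depth line with a shifting content outline) would apply the same offset, negated, to the outline instead.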
[0940] In some embodiments, in addition to (or instead of) simulated depth effect including depth line 3325, an additional visual feedback is applied to amount object 3324. In some embodiments, as shown in FIG. 33G, the additional visual feedback is a coloring effect of a plurality of colored patterns 3327 applied to several different portions of the amount object, where each colored pattern 3327 consists of one or more different colors (e.g., three different colors corresponding to 3327A-3327C). In some embodiments, each colored pattern is a rainbow-colored pattern comprising colors of the rainbow.
[0941] In some embodiments, as shown in FIG. 33G, at orientations 3303A and 3303B (corresponding to angles 3300A and 3300B, where the device has moved (e.g., horizontally) away from the baseline orientation), the coloring effect of colored patterns 3327 (e.g., including colors 3327A-3327C) is applied to amount object 3324 of completed payment message object 3334. For example, at orientations 3303A and 3303B, a plurality of colored patterns 3327 is displayed at various portions of the amount object. Further, in some embodiments, when the device is moved to orientations 3303C and 3303D, a corresponding dynamic movement of colored patterns 3327 is displayed by the electronic device. For example, at orientation 3303C, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) up relative to their positions at orientation 3303A. Similarly, at orientation 3303D, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) up relative to their positions at orientation 3303B. Further, in some embodiments, when the device is moved to orientations 3303E and 3303F, a corresponding dynamic movement of colored patterns 3327 is displayed. For example, at orientation 3303E, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) down relative to their positions at orientation 3303A. Similarly, at orientation 3303F, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) down relative to their positions at orientation 3303B.
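The colored patterns' movement can be sketched the same way: each pattern's on-screen anchor point is offset as a function of tilt, so tilting the device up or down slides all of the patterns together. The function name and gain are hypothetical:

```python
def shift_patterns(anchors, vert_tilt_deg, gain=1.5):
    """Shift each colored pattern's (x, y) anchor vertically in
    response to a vertical tilt; positive tilt moves the patterns up
    (smaller y on a screen whose y axis grows downward)."""
    dy = -gain * vert_tilt_deg
    return [(x, y + dy) for (x, y) in anchors]
```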
[0942] FIG. 33H shows electronic device 3300, while maintaining display of amount object 3324 of completed payment message object 3334, being viewed at two different orientations
(e.g., at angle 3300C, which is farther away from the baseline orientation than angle 3300A, and at angle 3300D, which is also farther away from the baseline orientation than angle 3300B) relative to the baseline orientation with respect to reference point 3332. More specifically, the orientation corresponding to angle 3300C is an orientation that is a further counter-clockwise horizontal angular tilt from the orientation of the device at angle 3300A in FIG. 33F, and the orientation corresponding to angle 3300D is an orientation that is a further clockwise horizontal angular tilt from the orientation of the device at angle 3300B in FIG. 33F. In some embodiments, moving towards the orientations corresponding to angles 3300C and 3300D constitutes further movement of the device away from the baseline orientation. In some embodiments, when the device is at (and while the device is moving towards) orientations corresponding to angle 3300C or angle 3300D, the device continues to generate (e.g., at an increasingly stronger magnitude) the haptic feedback (e.g., tactile output 3336).
[0943] FIG. 33I shows an enlarged view of amount object 3324 of completed payment message object 3334 from FIG. 33H and how, in some embodiments, the amount object is displayed at six different tilts (orientations) of electronic device 3300. Orientations 3305A and 3305B correspond to angle 3300C and angle 3300D of FIG. 33H, respectively (or are within a predefined tilt limit of those angles). Orientations 3305C and 3305D correspond to when the device is at a (small) clockwise vertical angular tilt from angle 3300C and 3300D, respectively (e.g., such that the top edge of the device is (slightly) closer to the user and the bottom edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation). Orientations 3305E and 3305F correspond to when the device is at a (small) counter-clockwise vertical angular tilt from angle 3300C and 3300D, respectively (e.g., such that the bottom edge of the device is (slightly) closer to the user and the top edge of the device is (slightly) farther away from the user as compared to when the device is being held by the user at the predefined default orientation).
[0944] At orientations 3305A and 3305B, the displayed simulated depth effect including depth line 3325 is emphasized as compared to the depth effect displayed at orientations 3303A and 3303B, respectively. For example, at orientation 3305A, depth line 3325 of the amount object has (gradually) moved farther left as compared to the depth line at orientation 3303A, because
the orientation of the device has shifted to an even more slanted (left) side angle relative to orientation 3303A. At orientation 3305B, depth line 3325 of the amount object has (gradually) moved farther right as compared to the depth line at orientation 3303B, because the orientation of the device has shifted to an even more slanted (right) side angle relative to orientation 3303B. In some embodiments, depth line 3325 remains stationary and the outline of the content shifts in location, thereby simulating the depth effect (using a dynamic boundary instead of a dynamic depth line).
[0945] In addition, in some embodiments, when the device is moved to orientations 3305C and 3305D, a corresponding dynamic movement of depth line 3325 is displayed. For example, at orientation 3305C, depth line 3325 of the amount object has moved (gradually) up relative to the depth line at orientation 3305A, because the orientation of the device has shifted to a higher angle relative to orientation 3305A. Similarly, at orientation 3305D, depth line 3325 of the amount object has moved (gradually) up relative to the depth line at orientation 3305B, because the orientation of the device has shifted to a higher angle relative to orientation 3305B. Further, in some embodiments, when the device is moved to orientations 3305E and 3305F, a corresponding dynamic movement of depth line 3325 is displayed. For example, at orientation 3305E, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) down relative to the depth line at orientation 3305A, because the orientation of the device has shifted to a lower angle relative to orientation 3305A. Similarly, at orientation 3305F, depth line 3325 of the amount object has moved (gradually, while the device changed orientations) down relative to the depth line at orientation 3305B, because the orientation of the device has shifted to a lower angle relative to orientation 3305B.
[0946] Further, in some embodiments, as shown in FIG. 33I, at orientations 3305A-3305F, colors 3327A-3327C of colored patterns 3327 applied to amount object 3324 are more saturated relative to corresponding colors 3327A-3327C of colored patterns 3327 applied to amount object 3324 at orientations 3303A-3303F, respectively. For example, color 3327A of colored patterns 3327 at orientations 3305A-3305F is more saturated relative to corresponding color 3327A of colored patterns 3327 at orientations 3303A-3303F. Likewise, color 3327B of colored patterns 3327 at orientations 3305A-3305F is more saturated relative to corresponding color 3327B of
colored patterns 3327 at orientations 3303A-3303F. Likewise, color 3327C of colored patterns 3327 at orientations 3305A-3305F is more saturated relative to corresponding color 3327C of colored patterns 3327 at orientations 3303A-3303F. In some examples, rather than (or in addition to) changing the saturation of the colors, the thickness or brightness of the colors is changed.
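The saturation behavior above — colors become more saturated the farther the device tilts from the baseline — can be sketched with a standard HSV adjustment. The mapping and constants are assumptions for illustration; only the direction of the effect (more tilt, more saturation) comes from the text:

```python
import colorsys

def saturate_for_tilt(rgb, tilt_deg, max_tilt_deg=45.0, base_saturation=0.4):
    """Drive a color's HSV saturation by the angular distance of the
    device from the baseline orientation (rgb components in 0-1).
    The input saturation is replaced by the tilt-derived value."""
    h, _, v = colorsys.rgb_to_hsv(*rgb)
    frac = min(abs(tilt_deg) / max_tilt_deg, 1.0)
    s = base_saturation + (1.0 - base_saturation) * frac
    return colorsys.hsv_to_rgb(h, s, v)
```

The same tilt fraction could instead scale brightness or line thickness, matching the alternatives the paragraph mentions.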
[0947] Furthermore, in some embodiments, when the device is moved to orientations 3305C and 3305D from orientations 3305A and 3305B, respectively, a corresponding dynamic movement of colored patterns 3327 is displayed. For example, at orientation 3305C, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) up relative to their positions at orientation 3305A as the device (gradually) tilts from orientation 3305A to orientation 3305C. Similarly, at orientation 3305D, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) up relative to their positions at orientation 3305B as the device (gradually) tilts from orientation 3305B to orientation 3305D. Furthermore, in some embodiments, when the device is moved to orientations 3305E and 3305F from orientations 3305A and 3305B, respectively, a corresponding dynamic movement of colored patterns 3327 is displayed. For example, at orientation 3305E, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) down relative to their positions at orientation 3305A as the device (gradually) tilts from orientation 3305A to orientation 3305E. Similarly, at orientation 3305F, each of the colored patterns on the amount object has moved (gradually, while the device changed orientations) down relative to their positions at orientation 3305B as the device (gradually) tilts from orientation 3305B to orientation 3305F.
[0948] In addition, in some embodiments, electronic device 3300 continues to generate tactile output 3336 as the device changes in orientation (e.g., from orientations 3303A-3303F to orientations 3305A-3305F, respectively). In some embodiments, the device gradually ceases to generate tactile output 3336 when the orientation of the device relative to the baseline orientation stops changing.
[0949] In some embodiments, as electronic device 3300 changes orientation in a direction that is towards the baseline orientation (instead of in a direction that is moving away from the
baseline orientation), a magnitude of the one or more applied or generated feedbacks (e.g., the simulated depth effect including depth line 3325, the coloring effect represented by colored patterns 3327, tactile output 3336) is reduced. For example, the feedback is gradually reduced for the duration that the orientation of the device is changing (e.g., moving towards the baseline orientation). For another example, the rate of change of the feedback is gradually reduced as the orientation of the device changes (e.g., the greater the amount of change in the orientation, the greater the change in the feedback). In some embodiments, reducing the magnitude of the simulated depth effect includes reducing the simulated depth of the geometry of the amount object (or of the text object) (e.g., reducing an angle from a bottom of an engraved pattern to the surface into which the engraved pattern is engraved, or reducing an angle from the top of a raised object to the surface on which the raised pattern is placed). In some embodiments, reducing the magnitude of the coloring effect includes reducing a saturation of colors 3327A-3327C of colored patterns 3327.
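The reduction described in paragraph [0949] can be sketched as scaling every feedback channel by the remaining angular distance from the baseline, so that moving toward baseline attenuates the depth and coloring effects together. The proportional mapping, names, and constants are assumptions for illustration:

```python
def attenuate_feedback(depth_angle_deg, saturation, tilt_deg, prev_tilt_deg,
                       max_tilt_deg=45.0):
    """Reduce the simulated-depth angle and color saturation when the
    device is moving toward the baseline (|tilt| shrinking); leave them
    unchanged while it moves away."""
    if abs(tilt_deg) >= abs(prev_tilt_deg):
        return depth_angle_deg, saturation
    scale = min(abs(tilt_deg) / max_tilt_deg, 1.0)
    return depth_angle_deg * scale, saturation * scale
```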
[0950] FIG. 33J shows a text object 3354 (e.g., stating “YAY”) similar to amount object 3324. As with amount object 3324, text object 3354 can be sent to, or received from, a message participant (e.g., message participant 3310) in a message conversation (e.g., message conversation 3308) of a messaging application (e.g., messaging application 3306) using a message object similar to payment message objects 3320 and 3334. Thus, in some examples, visual feedback (e.g., the simulated depth effect including depth line 3325, the coloring effect represented by colored patterns 3327 of colors 3327A-C) is similarly applied to text object 3354. For example, FIG. 33J shows how, in some embodiments, text object 3354 is displayed when the electronic device is at orientations 3307A-3307F, which correspond to orientations 3305A-3305F described above with reference to FIG. 33I. One or more types of feedback, such as simulated depth feedback including depth line 3325 and coloring effects represented by colored patterns 3327 of colors 3327A-C, can be similarly applied to text object 3354 as described above with respect to amount object 3324. Further, tactile output 3336 that is generated in connection with the feedbacks applied to amount object 3324 can also be generated in connection with feedbacks applied to text object 3354.
[0951] FIG. 33K shows electronic device 3300 displaying, on display 3302, an accounts user interface 3337 (e.g., similar to wallet user interface 2022 described above with reference to FIGS. 20B-20J) for selecting one or more accounts from a plurality of available accounts for use in a transfer (e.g., of a payment, of a resource, of points, of a message). In some embodiments, accounts user interface 3337 includes a graphical representation 3338 (e.g., of a payment account, similar to graphical representation 2030 of a payment account described above with reference to FIGS. 20D-20J) displayed at a first location of accounts user interface 3337. At the first location of the user interface, the account corresponding to the displayed graphical representation is currently selected for use in a transfer. As shown in FIG. 33K, graphical representation 3338 (e.g., of a payment account) includes a plurality of pattern objects 3344 (e.g., similar to one or more elements 2034 of graphical representation 2030 of a payment account). In some embodiments, accounts user interface 3337 also includes other selectable accounts located at a second location of the user interface that are also available for use in a transfer. In some embodiments, accounts user interface 3337 also includes an indication 3340 of the account (e.g., the payment account) associated with graphical representation 3338 and a balance indication 3342 (e.g., stating “$30”) of the amount of funds (or resources, points, usage limits) available in the account associated with graphical representation 3338.
[0952] FIG. 33L shows accounts user interface 3337 with a graphical representation 3339 corresponding to a birthday card (instead of a payment account) that is analogous to graphical representation 3338 corresponding to a payment account, as shown in FIG. 33K. As shown in FIG. 33L, graphical representation 3339 of the birthday card (e.g., showing “Happy Birthday!”) also includes a plurality of pattern objects 3343 that are analogous to the plurality of pattern objects 3344 of graphical representation 3338 of the payment account, as shown in FIG. 33K.
[0953] FIGS. 33M-33O illustrate feedback (e.g., a visual feedback, a haptic feedback) that is applied to or in connection with pattern objects 3344 of graphical representation 3338 (e.g., of a payment account, of a birthday card). Thus, it is to be understood that, while the feedback effects described below are described with respect to a graphical representation of a payment
account, the feedback effects can analogously apply to pattern objects of graphical representations of other objects, such as a birthday card, as shown in FIG. 33L.
[0954] FIG. 33M shows electronic device 3300, while maintaining display of graphical representation 3338 at the first location of accounts user interface 3337, being viewed at two different orientations (e.g., at angle 3300A and at angle 3300B) relative to the baseline orientation with respect to a reference point 3332, as first shown in FIG. 33C. More specifically, the orientation corresponding to angle 3300A is an orientation that is a (slight, such as 10 degrees or 15 degrees) counter-clockwise horizontal angular tilt from the predefined default orientation and the orientation corresponding to angle 3300B is an orientation that is a (slight, such as 10 degrees or 15 degrees) clockwise horizontal angular tilt from the predefined default orientation. In some embodiments, moving towards the orientations corresponding to angles 3300A and 3300B constitutes movement of the device away from the baseline orientation.
[0955] In some embodiments, a visual feedback is a coloring effect of a plurality of colored patterns 3345 applied to one or more pattern objects 3344 (or portions thereof), where colored patterns 3345 consist of one or more different colors (e.g., the colors of a rainbow). In some embodiments, as shown in FIG. 33M, at orientations corresponding to angles 3300A and 3300B, the coloring effect of colored patterns 3345 is applied to one or more pattern objects 3344 (or portions thereof). In some embodiments, only a portion of the full colors (e.g., red and orange of all rainbow colors) of colored patterns 3345 is visible on one or more of pattern objects 3344. In some embodiments, colored patterns 3345 cover a portion of (but not all of) a pattern object. In some embodiments, as the device changes in orientation from the predefined default orientation to the orientations corresponding to angles 3300A and 3300B (e.g., moves away from the baseline orientation), one or more colors of colored patterns 3345 slide into pattern objects 3344 from one side of graphical representation 3338. In some embodiments, an individual pattern object of pattern objects 3344 is covered by two or more colors of colored patterns 3345. In some embodiments, depending on the angular distance of the current orientation of the device from the predefined default orientation, a first set of pattern objects 3344 is covered by colored patterns 3345 and a second set of pattern objects 3344 is not covered by colored patterns 3345. In some embodiments, the visual feedback is also applied to indication 3340 of the
account associated with graphical representation 3338 (e.g., the payment account). In some embodiments, the visual feedback is also applied to balance indication 3342 of the amount of funds (or resources, points, usage limits) available in the account associated with graphical representation 3338.
[0956] In some embodiments, electronic device 3300 includes one or more tactile output generators, and when the electronic device is at orientations (e.g., corresponding to angle 3300A or angle 3300B) that are not the predefined default orientation, in addition to (or instead of) the visual feedback applied to graphical representation 3338, the device generates haptic feedback (e.g., a tactile output 3336). For example, in some embodiments, in response to detecting the change in orientation of the device from the predefined default orientation (e.g., of FIG. 33K) to an orientation corresponding to angle 3300A or angle 3300B, the device generates, via the one or more tactile output generators (e.g., for the duration of the change in the orientation of the device), tactile output 3336 that is indicative of the change in the orientation of the device (e.g., a tactile output that includes a parameter that is adjusted based on a magnitude, speed, and/or direction of change in the orientation of the device relative to the baseline orientation). Thus, in some embodiments, tactile output 3336 provides further feedback to the user that tracks the change in orientation of the device.
[0957] FIG. 33N shows electronic device 3300, while maintaining display of graphical representation 3338 (e.g., of an account, of a payment account, of a birthday card), tilted at larger angles (e.g., angles 3300C and 3300D) (and thus farther away from the baseline orientation) compared to orientations corresponding to angles 3300A and 3300B. In some embodiments, the device, in response to detecting the change in orientation of the device from orientations corresponding to angles 3300A and 3300B to angles 3300C and 3300D, respectively, continues to apply the colored patterns effect to pattern objects 3344 of graphical representation 3338 (e.g., of a payment account, of a birthday card). In some embodiments, as shown in FIG. 33N, one or more of the colors that are applied to pattern objects 3344 at orientations corresponding to angles 3300C and 3300D are different from one or more of the colors that are applied to pattern objects 3344 at orientations corresponding to angles 3300A and 3300B, as shown in FIG. 33M. In some embodiments, the different colors of colored patterns
3345 wash across pattern objects 3344 as the device changes orientation (e.g., from orientations corresponding to angles 3300A and 3300B to orientations corresponding to angles 3300C and 3300D, respectively). In some embodiments, once the device reaches a sufficient threshold orientation (sufficiently away from the baseline orientation or sufficiently close to the baseline orientation), colored patterns 3345 slide off of pattern objects 3344 and the device eventually ceases applying the visual feedback (e.g., of colored patterns 3345) to graphical representation 3338 (e.g., of a payment account, of a birthday card). In some embodiments, when the electronic device is at (and while the device is moving towards) orientations corresponding to angles 3300C and 3300D (from orientations corresponding to angles 3300A and 3300B, respectively), the device continues to generate (e.g., at an increasingly stronger magnitude) the haptic feedback (e.g., tactile output 3336).
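The slide-in/slide-off behavior above can be modeled as a piecewise coverage function of tilt: no color near the baseline, color sliding in past a small tilt, full coverage at moderate tilts, and the color sliding off entirely once a large threshold is crossed. All thresholds and names here are hypothetical:

```python
def pattern_coverage(tilt_deg, slide_in_start=5.0, full_cover=25.0,
                     slide_off=60.0):
    """Fraction (0-1) of a pattern object covered by the colored
    patterns at a given tilt away from the baseline orientation."""
    t = abs(tilt_deg)
    if t <= slide_in_start or t >= slide_off:
        return 0.0  # near baseline, or past the slide-off threshold
    if t >= full_cover:
        return 1.0  # fully covered at moderate tilts
    # Ramp in linearly as the color slides across the pattern object.
    return (t - slide_in_start) / (full_cover - slide_in_start)
```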
[0958] FIG. 33O shows electronic device 3300, while maintaining display of graphical representation 3338 (e.g., of an account, of a payment account, of a birthday card), tilted at even larger angles (e.g., angles 3300E and 3300F) (and thus farther away from the baseline orientation) compared to orientations corresponding to angles 3300C and 3300D. In some embodiments, the device, in response to detecting the change in orientation of the device from orientations corresponding to angles 3300C and 3300D to angles 3300E and 3300F, respectively, gradually ceases to display (e.g., decreases the brightness of, fades out, washes out, gradually slides out) the visual feedback (e.g., coloring effect 3345) applied to pattern objects 3344 of graphical representation 3338. Further, in some embodiments, in accordance with a determination (or subsequent to the determination) that the visual feedback is no longer being applied to pattern objects 3344 of graphical representation 3338, the device further ceases generating tactile output 3336.
[0959] In some embodiments, as electronic device 3300 changes orientation in a direction that is towards the baseline orientation (instead of in a direction that is moving away from the baseline orientation), a magnitude of the one or more applied or generated feedbacks (e.g., the coloring effect represented by colored patterns 3345, tactile output 3336) is reduced. For example, the feedback is gradually reduced for the duration that the orientation of the device is changing (e.g., moving towards the baseline orientation). For another example, the rate of
change of the feedback is gradually reduced as the orientation of the device changes (e.g., the greater the amount of change in the orientation, the greater the change in the feedback). In some embodiments, reducing the magnitude of the coloring effect includes reducing a saturation of the colors (e.g., the rainbow colors) of colored patterns 3345.
[0960] FIGS. 34A-34D are a flow diagram illustrating a method for providing feedback corresponding to an operation associated with a transfer, in accordance with some embodiments. Method 3400 is performed at a device (e.g., 100, 300, 500, 3300) with a display and one or more sensor devices (e.g., an accelerometer for detecting an orientation of the device, one or more cameras). Some operations in method 3400 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
[0961] As described below, method 3400 provides an intuitive way for managing peer-to-peer transactions. The method reduces the cognitive burden on a user for managing peer-to-peer transactions, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage peer-to-peer transactions faster and more efficiently conserves power and increases the time between battery charges.
[0962] The electronic device (e.g., 3300), while the device is at a first orientation (e.g., a predefined default orientation, a resting state orientation, a 45 degree tilt from the surface of the ground) relative to a baseline orientation with respect to a reference point (e.g., 3332, the ground, a face of a viewer), displays (3402), on the display (e.g., 3302), a user interface object (e.g., 3324, 3354, 3344, a numerical number, a shape, a pattern, a part of the item that is visually distinguishable from the background of the item). In some embodiments, the object (e.g., 3324, 3354) is (part of, an element of, displayed within) a message object within a message conversation of a messaging application. In some embodiments, the object (e.g., 3344) is (part of, an element of, displayed within) a graphical representation of an account (e.g., a user account, a resource account, a payment account) stored/provisioned on the device (e.g., as described above in greater detail with respect to methods 900, 1200, 1500, 1800, 2100, 2400, 2700, and 3000).
[0963] The electronic device (e.g., 3300), while displaying the user interface object (e.g., 3324, 3354, 3344), detects (3404), via the one or more sensor devices (e.g., an accelerometer, a camera), a change in orientation (e.g., from 3300A to 3300C, from 3300C to 3300A, from 3300A to 3300E, from 3300E to 3300A, from 3300B to 3300D, from 3300D to 3300B, from 3300B to 3300F, from 3300F to 3300B) of the device from the first orientation (e.g., 3300A, 3300B, 3300C, 3300D, 3300E, 3300F) relative to the reference point (e.g., 3332) to a respective orientation (e.g., 3300A, 3300B, 3300C, 3300D, 3300E, 3300F) relative to the reference point.
[0964] In some embodiments, detecting the change in orientation of the device (e.g., 3300) from the first orientation relative to the reference point (e.g., 3332) to a respective orientation relative to the reference point (e.g., 3332) includes detecting a change in orientation of the device (e.g., detecting a change in orientation of the device relative to a fixed reference point on the earth, for example based on orientation sensors of the device such as an accelerometer, a gyroscope, a magnetometer).
[0965] In some embodiments, detecting the change in orientation of the device (e.g., 3300) from the first orientation relative to the reference point (e.g., 3332) to a respective orientation relative to the reference point (e.g., 3332) includes detecting a change in orientation of a user relative to the device (e.g., based on a face tracking sensor such as a camera or other face tracking sensor that can detect changes of the point of view of a viewing angle of the device by a face that is being tracked by the device). In some examples, detecting the change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point includes detecting a change in orientation of a user relative to the device and detecting a change in orientation of the device.
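Detecting the device's orientation relative to a fixed reference, as described in paragraph [0964], is commonly done by treating the accelerometer reading as a gravity vector and deriving pitch and roll from it. A standard sketch (axis and sign conventions are assumptions and vary by platform):

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate device pitch and roll in degrees from a
    gravity-dominated accelerometer sample (ax, ay, az in g units)."""
    roll = math.degrees(math.atan2(ax, az))                   # side-to-side
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))  # front/back
    return pitch, roll
```

The face-tracking alternative in paragraph [0965] would instead estimate the viewing angle from a camera, but the resulting angles could feed the same feedback mappings.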
[0966] The electronic device (e.g., 3300), in response to detecting the change in orientation of the device (3406), changes (3408) an appearance of the user interface object (e.g., 3324, 3354, 3344) by applying a visual effect (e.g., 3325, 3327, 3345) to the user interface object that varies a set of one or more parameters of the user interface object (e.g., 3324, 3354, 3344) as the orientation of the device changes relative to the reference point (e.g., 3332). Changing an appearance of a user interface object (e.g., 3324, 3344) by applying a visual effect (e.g., 3325, 3327, 3345) to the user interface object as the orientation of the device changes relative to a
reference point (e.g., 3332) provides the user with visual feedback about a state of the user interface object and/or information about the user interface object, such as whether a transfer (e.g., of a message, of a file, of a resource, of a payment) associated with the user interface object has been successfully completed. Further, the change in amplitude of the visual effect as the orientation of the device changes indicates to the user that the displayed object is authentic and not a video that is displayed independent of the device orientation. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0967] In some embodiments, (3410) the visual effect (e.g., 3325, 3327, 3345) includes a coloring effect (e.g., 3327, 3345, of one or more colors, of two or more colors, of one or more colored patterns, such as a rainbow-colored pattern) applied to at least a portion of the user interface object. In some examples, the coloring effect is an iridescence effect of one or more colors that varies in color across a surface of the user interface object and changes as the orientation of the device relative to the baseline orientation changes. In some examples, the coloring effect is an iridescence effect that includes one or more rainbow-colored patterns. Applying a color effect as (part of) the visual effect allows the user to more easily perceive and recognize the visual effect when it is applied (and thus, in some embodiments, allows the user to more easily recognize that a transfer associated with the user interface object to which the coloring effect is applied has been successfully completed). Further, modifying the color of the visual effect as the orientation of the device changes indicates to the user that the displayed object is authentic and not a video that is displayed independent of the device orientation.
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0968] In some embodiments, (3412) a color of the coloring effect (e.g., 3327, 3345) applied to at least the portion of the user interface object (e.g., 3324, 3354, 3344) changes (e.g., shifts, transitions, smoothly changes) from a first color to a second color different from the first color in response to a change in orientation of the device of at least a predefined angular distance (e.g., colors shift across the user interface object as the orientation of the device changes relative to the baseline orientation, and/or the color displayed at any particular portion of the user interface object gradually transitions from one color to another color (optionally, through a sequence of intermediate colors in rainbow order) as the orientation of the device changes relative to the baseline orientation).
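As a hedged sketch of the color transition described above (the 45-degree angular span and the hue endpoints are assumed values, not taken from the disclosure), the gradual shift from a first color to a second color over a predefined angular distance can be modeled as a linear interpolation of hue:

```python
def iridescent_hue(angle_deg, start_hue=0.0, end_hue=0.75, span_deg=45.0):
    """Map the device's angular distance from the baseline orientation to a
    hue in [0, 1], sweeping from start_hue to end_hue (passing through the
    intermediate rainbow hues) over span_deg degrees, then holding steady."""
    t = max(0.0, min(1.0, abs(angle_deg) / span_deg))
    return start_hue + t * (end_hue - start_hue)
```

Because the hue is a continuous function of the angle, small rotations of the device produce the smooth color transitions (optionally through a sequence of intermediate colors in rainbow order) that the paragraph describes.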
[0969] In some embodiments, (3414) the visual effect (e.g., 3325, 3327, 3345) includes a geometry alteration effect (e.g., 3325, a skewing effect, a 3D effect, a depth effect) applied to at least a portion of the user interface object (e.g., changing an angle or distance between lines or curves that define a shape of the object). Applying a geometry alteration effect (e.g., 3325) as (part of) the visual effect allows the user to more easily perceive and recognize the visual effect when it is applied (and thus, in some embodiments, allows the user to more easily recognize that a transfer associated with the user interface object to which the geometry alteration effect is applied has been successfully completed). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface
more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0970] In some embodiments, the visual effect (e.g., 3325, 3327, 3345) includes a brightness effect. In some examples, the brightness effect is a light shine effect that causes the user interface object to appear as if a beam of light (e.g., in one or more colors) is being shined on the surface of the object. In some examples, the light shine effect causes the user interface object to sparkle as the beam of light is being shined on the surface of the object. In some examples, the brightness effect is a gloss or glazing effect that causes the surface of the user interface object to appear as if it has a glossy or polished texture. In some examples, the brightness effect is a shadow effect that causes the appearance of shadows accompanying the user interface object.
[0971] In some embodiments, (3416) the user interface object (e.g., 3324, 3354, 3344) is displayed on (e.g., located on, engraved into, on a surface of) a user interface item (e.g., 3334, 3338, a text message item (e.g., 3334) of a message conversation of a messaging application, such as a message object (as described above), a graphical representation of a payment account (e.g., 3338), such as a stored-value account, a cash account, or a checking account, a graphical representation of a user account, such as a resource account, a graphical representation of a card, such as a points card, a graphical representation of a payment card, such as a debit card or a credit card). In some examples, the user interface object (e.g., 3324, 3354, 3344) is a first user interface object of a plurality of user interface objects displayed on the surface of the user interface item (e.g., 3334, 3338). In some examples, the user interface object (e.g., 3324, 3354, 3344) appears “engraved” into the surface of the user interface item (e.g., 3334, 3338) in a V-shaped engraving pattern, where applying the visual effect to the user interface object includes applying a first magnitude of the visual effect to a first portion of the user interface object and applying a second magnitude (different from the first magnitude) of the visual effect to a second portion of the user interface object. Applying a first magnitude of the visual effect (e.g., 3325, 3327, 3345) to a first portion of a user interface object (e.g., 3324, 3354, 3344) and applying a second magnitude (that is different from the first magnitude) to a second portion of the user interface object (e.g., 3324, 3354, 3344) allows the user to more easily perceive and recognize the visual effect when it is applied (and thus, in some embodiments, allows the user to more easily recognize that a transfer associated with the user interface object to which the visual effect having two different magnitudes has been applied has been successfully completed). Further, the change in the amplitude of the effect as the orientation of the device changes indicates to the user that the displayed object is authentic and not a video that is displayed independent of the device orientation. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. 
Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0972] In some embodiments, (3418) the user interface item (e.g., 3334, 3338) corresponds to an (outgoing or incoming) message object (e.g., 3334, a text message object, a payment message object) of a message conversation (e.g., 3308) of a messaging application (e.g., 3306) (that is controlled by an operating system of the device).
[0973] In some embodiments, the visual effect (e.g., 3325, 3327, 3345) that varies the set of one or more parameters is applied (e.g., irrespective of a change in orientation of the device relative to the baseline orientation) to the user interface object (e.g., 3324, 3354, 3344) when a transfer (e.g., of resources, of a file, of a payment) associated with the message object corresponding to the user interface item (e.g., 3334, 3338) is completed (e.g., a payment is sent to a recipient of the message conversation of the messaging application, a payment is accepted by a recipient of the message conversation of the messaging application, for example as described in greater detail above with reference to method 1200).
[0974] In some embodiments, (3420) the user interface item (e.g., 3334, 3338) corresponds to a graphical representation of an account (e.g., 3338, graphical representations of a payment card, described with respect to methods 2100 and 2400).
[0975] The electronic device (e.g., 3300), (3406) in response to detecting the change in orientation of the device, in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, (gradually) reduces (3422) an amplitude of the visual effect (e.g., 3325, 3327, 3345). In some examples, the visual effect (e.g., 3325, 3327, 3345) is gradually reduced for the duration that the orientation of the device is changing. In some examples, the visual effect (e.g., 3325, 3327, 3345) is gradually reduced as the orientation of the device changes (e.g., the greater the amount of change in the orientation the greater the change in the visual effect). In some examples, in accordance with a determination that the change in orientation of the device includes movement towards the baseline orientation that does not meet the predetermined criteria, the amplitude of the visual effect (e.g., 3325, 3327, 3345) is maintained or increased. Reducing an amplitude of a visual effect (e.g., 3325, 3327, 3345) in accordance with a determination that a change in orientation of the device includes movement, towards a baseline orientation, that meets predetermined criteria provides the user with visual feedback about a state of the user interface object and/or information about the user interface object, such as whether a transfer (e.g., of a message, of a file, of a resource, of a payment) associated with the user interface object has been successfully completed, and indicates to the user that the transfer associated with the user interface object is a special type of transfer (e.g., a transfer made using a first-party application
as opposed to a third-party application). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0976] In some embodiments, the predetermined criteria include movement (e.g., movement of the device or movement of a face of a user of the device) within a predefined angular distance from the baseline orientation.
[0977] In some embodiments, (3424) reducing the amplitude of the visual effect (e.g., 3325, 3327, 3345) comprises continuing to apply the visual effect (e.g., with a reduced amplitude compared to the amplitude of the visual effect applied to the user interface object prior to the change in orientation of the device) to the user interface object (e.g., 3324, 3354, 3344) (reducing the amplitude of the visual effect applied to the user interface object without ceasing to apply the visual effect to the user interface object).
[0978] In some embodiments, (3426) reducing the amplitude of the visual effect (e.g., 3325, 3327, 3345) comprises gradually decreasing the amplitude while the orientation of the device (e.g., 3300) moves towards the baseline orientation. In some examples, the magnitude of the reduction of the amplitude of the visual effect is dependent on the magnitude of the change in orientation of the device such that a first amount of movement toward the baseline orientation results in a first amount of decrease in the amplitude of the visual effect and a second amount of movement toward the baseline orientation that results in the orientation of the device being
closer to the baseline orientation results in a second amount of decrease in the amplitude of the visual effect that is greater than the first amount of decrease in the amplitude of the visual effect.
[0979] In some embodiments, (3428) reducing the amplitude of the visual effect comprises (gradually) ceasing to apply the visual effect to the user interface object.
[0980] In some embodiments, (3430) the visual effect (e.g., 3325, 3327, 3345) includes a coloring effect (e.g., 3327, 3345, of one or more colors, of two or more colors, of one or more colored patterns, such as a rainbow-colored pattern) applied to at least a portion of the user interface object, and reducing the amplitude of the coloring effect (e.g., 3327, 3345) includes reducing a saturation of a color of the coloring effect applied to at least the portion of the user interface object (e.g., 3324, 3354, 3344). In some examples, increasing the amplitude of the visual effect includes increasing a saturation of the coloring effect (e.g., 3327, 3345).
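As an illustrative sketch only (the use of Python's standard `colorsys` module and the HLS decomposition are implementation assumptions, not part of the disclosure), reducing the amplitude of the coloring effect by reducing color saturation might look like:

```python
import colorsys

def apply_amplitude_to_color(rgb, amplitude):
    """Scale the saturation of an RGB color (components in [0, 1]) by the
    current effect amplitude, so the coloring effect fades toward gray as
    the amplitude is reduced and regains saturation as it increases."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return colorsys.hls_to_rgb(h, l, s * amplitude)
```

At full amplitude the color is unchanged; at zero amplitude it collapses to a neutral gray of the same lightness, which matches the described reduction of saturation without altering the object's overall brightness.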
[0981] In some embodiments, (3432) the visual effect (e.g., 3325, 3327, 3345) includes a geometry alteration effect (e.g., 3325, a skewing effect, a 3D effect, a depth effect) applied to at least a portion of the user interface object (e.g., changing an angle or distance between lines or curves that define a shape of the object), and the geometry alteration effect (e.g., 3325) is a skewing effect, and wherein reducing the amplitude of the visual effect includes reducing an amount of skew of the geometry of the user interface object (e.g., 3324, 3354, 3344). In some examples, increasing the amplitude of the visual effect includes increasing an amount of skew of the geometry of the user interface object (e.g., 3324, 3354, 3344). In some examples, the skewing effect includes shifting a line that represents an upper extent of a simulated three-dimensional object toward or away from a line that represents a lower extent of the simulated three-dimensional object (e.g., shifting a line that represents a top of a raised pattern toward or away from edges of the raised pattern), or a center line that represents a lower extent of a simulated three-dimensional object toward or away from a line that represents an upper extent of the simulated three-dimensional object (e.g., shifting a line that represents a bottom of an engraved pattern toward or away from edges of the engraved pattern). In some examples, reducing the skewing effect includes decreasing the amount of shifting of the line as the orientation of the device relative to a baseline orientation changes.
[0982] In some embodiments, (3434) the visual effect (e.g., 3325, 3327, 3345) includes a geometry alteration effect (e.g., 3325, a skewing effect, a 3D effect, a depth effect) applied to at least a portion of the user interface object (e.g., changing an angle or distance between lines or curves that define a shape of the object), and the geometry alteration effect is a simulated depth effect, and reducing the amplitude of the visual effect includes reducing a simulated depth of the geometry of the user interface object (e.g., 3324, 3354, 3344) (e.g., reducing an angle from a bottom of an engraved pattern to the surface into which the engraved pattern is engraved, or reducing an angle from the top of a raised object to the surface on which the raised pattern is placed). In some examples, increasing the amplitude of the visual effect includes increasing a simulated depth of the geometry of the user interface object (e.g., 3324, 3354, 3344).
[0983] The electronic device (e.g., 3300), in response to detecting the change in orientation of the device, in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continues (3436) to apply the visual effect (e.g., 3325, 3327, 3345) to the user interface object (e.g., 3324, 3354, 3344) without reducing the amplitude of the visual effect (e.g., at a constant amplitude, at a gradually increasing amplitude). In some examples, in accordance with a determination that the change in orientation of the device includes movement away from the baseline orientation that does not meet the predetermined criteria, the amplitude of the visual effect (e.g., 3325, 3327, 3345) is reduced or the visual effect is ceased to be displayed. Continuing to apply the visual effect (e.g., 3325, 3327, 3345) to the user interface object (e.g., 3324, 3354, 3344) without reducing the amplitude of the visual effect provides the user with visual feedback about a state of the user interface object and/or information about the user interface object, such as whether a transfer (e.g., of a message, of a file, of a resource, of a payment) associated with the user interface object has been successfully completed, and, by continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect, also enables the user to more easily notice the application of the visual effect to the user interface object (e.g., as opposed to if the visual effect is immediately removed or removed after only a very brief period after the successful completion of a transfer). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing
user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0984] In some embodiments, (3438) continuing to apply the visual effect (e.g., 3325, 3327, 3345) to the user interface object (e.g., 3324, 3354, 3344) without reducing the amplitude of the visual effect comprises (gradually) increasing the amplitude of the visual effect while the orientation of the device (e.g., 3300) moves away from the baseline orientation. In some examples, the magnitude of the increase of the amplitude of the visual effect (e.g., 3325, 3327, 3345) is dependent on the magnitude of the change in orientation of the device such that a first amount of movement away from the baseline orientation results in a first amount of increase in the amplitude of the visual effect and a second amount of movement away from the baseline orientation that results in the orientation of the device being further from the baseline orientation results in a second amount of increase in the amplitude of the visual effect that is greater than the first amount of increase in the amplitude of the visual effect. Increasing the amplitude of the visual effect (e.g., 3325, 3327, 3345) provides the user with visual feedback about a state of the user interface object and/or information about the user interface object, such as whether a transfer (e.g., of a message, of a file, of a resource, of a payment) associated with the user interface object has been successfully completed, and, because the amplitude of the visual effect is increased, also enables the user to more easily perceive the application of the visual effect to the user interface object (e.g., as opposed to if the visual effect was static). 
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device, by enhancing legibility of user interface elements to the user while the device is at natural viewing angles) which, additionally, reduces
power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, providing the improved visual feedback also provides a security verification measure that cannot be duplicated (e.g., faked, copied) by a third-party application that is not an operating system-controlled (first-party) application. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0985] In some embodiments, the electronic device (e.g., 3300), in response to detecting the change in orientation of the device, detects, via the one or more sensor devices, that the device is at a second orientation relative to the baseline orientation, wherein the second orientation is at least a predefined limit angular distance (e.g., 25 degrees, 30 degrees, 45 degrees, 90 degrees) from the baseline orientation. In some embodiments, the device further, in response to detecting the change in orientation of the device, and in response to detecting that the device is at the second orientation relative to the baseline orientation, gradually ceases to apply the visual effect (e.g., 3325, 3327, 3345) to the user interface object (e.g., 3324, 3354, 3344) (e.g., the magnitude of the reduction of the amplitude of the visual effect while it is being gradually ceased to be displayed is dependent on the magnitude of the change in orientation of the device such that a first amount of movement toward the baseline orientation results in a first amount of decrease in the amplitude of the visual effect and a second amount of movement toward the baseline orientation that results in the orientation of the device being closer to the baseline orientation results in a second amount of decrease in the amplitude of the visual effect that is greater than the first amount of decrease in the amplitude of the visual effect).
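Taken together, the amplitude behavior described in these paragraphs (growth with movement away from the baseline orientation, reduction with movement toward it, and a gradual cutoff past a predefined limit angular distance) can be sketched as a single piecewise mapping; the peak and limit angles used here are assumed values, not taken from the disclosure:

```python
def effect_amplitude(angle_deg, peak_deg=15.0, limit_deg=30.0):
    """Amplitude in [0, 1] of the visual effect as a function of the
    device's angular distance from the baseline orientation: zero at the
    baseline, rising to full amplitude at peak_deg as the device moves
    away, then fading back out and ceasing at limit_deg and beyond."""
    a = abs(angle_deg)
    if a <= peak_deg:
        return a / peak_deg
    if a >= limit_deg:
        return 0.0
    return (limit_deg - a) / (limit_deg - peak_deg)
```

Because the mapping depends only on the current angle, moving toward the baseline inside the rising segment necessarily reduces the amplitude while moving away increases it, matching the monotonic behavior described above.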
[0986] In some embodiments, the visual effect (e.g., 3325, 3327, 3345) that varies the set of one or more parameters is applied to the user interface object (e.g., 3324, 3354, 3344) when a transfer (e.g., of resources, of one or more files, of a payment, a payment is transmitted using near-field communication) is completed using the account corresponding to the graphical representation corresponding to the user interface item (e.g., 3334, 3338) (e.g., as described in greater detail above with reference to method 1200).
[0987] In some embodiments, changing the appearance of the user interface object (e.g., 3324, 3354, 3344) is (only) controlled by a first application (e.g., a first-party application, such as a first-party messaging application, a first-party payment application) that is integrated with (e.g., controlled or wholly managed by) an operating system of the device (e.g., 3300) and the ability to change the appearance of a user interface object based on a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point is not available to applications that are not integrated with the operating system of the device (e.g., applications that are not controlled by the operating system of the device, such as third-party applications). In some examples, the first application (e.g., a first-party application) that is controlled by the operating system of the device has access to motion data (e.g., data from an accelerometer) of the device or user orientation data (e.g., face tracking data from one or more cameras or other sensors) that is used to manage display of the user interface object, while a second application (e.g., a third-party application) that is not controlled by the operating system of the device does not have access to the motion data or user orientation data (e.g., face tracking data from one or more cameras or other sensors). 
Restricting control of changing the appearance of a user interface object (e.g., 3324, 3354, 3344) to a first application (e.g., a first-party application) that is integrated with an operating system of the device and prohibiting the ability to change the appearance of a user interface object based on a change in orientation of the device from applications (e.g., third-party applications) that are not integrated with the operating system of the device enhances device security by preventing other applications that are not integrated with an operating system of the device from using the same (or similar) changing appearances of a user interface object for actions that are not connected with a successful transfer (e.g., of a file, of a resource, of a payment) made using an application that is integrated with an operating system of the device. Improving security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations and, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more efficiently.
[0988] In some embodiments, further in response to detecting the change in orientation of the device (e.g., 3300), in accordance with a determination that the device is at the baseline orientation (or with an orientation that is within a predetermined delta of the baseline orientation),
the device (e.g., 3300) continues (3440) to apply the visual effect (e.g., 3325, 3327, 3345) (e.g., with a reduced amplitude or with a constant amplitude compared to the amplitude of the visual effect applied to the user interface object prior to the change in orientation of the device) to the user interface object (e.g., 3324, 3354, 3344).
[0989] In some embodiments, the electronic device (e.g., 3300) further includes one or more tactile output generators, and, in response to detecting the change in orientation of the device from the first orientation relative to the reference point (e.g., 3332) to the respective orientation relative to the reference point, the device generates (3442), via the one or more tactile output generators, (e.g., for the duration of the change in the orientation of the device) a tactile output (e.g., 3336) that is indicative of the change in orientation of the device from the first orientation relative to the reference point (e.g., 3332) to the respective orientation relative to the reference point (e.g., a tactile output that includes a parameter that is adjusted based on a magnitude, speed, and/or direction of change in the orientation of the device relative to the baseline orientation). Generating a tactile output (e.g., 3336) provides the user with sensory feedback (e.g., in addition to visual feedback, to supplement visual feedback, or in place of visual feedback) about an operation that will be performed or has been performed by the device, such as that a transfer (e.g., of a message, of a file, of a resource, of a payment) has been successfully completed by the device. Providing improved sensory feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0990] In some embodiments, the electronic device (e.g., 3300) further includes one or more tactile output generators, and, in response to detecting the change in orientation of the device (e.g., 3300) from the first orientation relative to the reference point (e.g., 3332) to the respective orientation relative to the reference point, in accordance with a determination that the visual effect (e.g., 3325, 3327, 3345) being applied to the user interface object (e.g., 3324, 3354, 3344) exceeds a predefined amplitude limit (e.g., a minimum amount of visual effect, a trigger amount of visual effect), the device generates, via the one or more tactile output generators, (e.g., for the
duration of the change in the orientation of the device) a tactile output (e.g., 3336) that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point. In some embodiments, the device (e.g., 3300), in accordance with a determination that the visual effect (e.g., 3325, 3327, 3345) being applied to the user interface object (e.g., 3324, 3354, 3344) does not exceed the predefined amplitude limit (e.g., a minimum amount of visual effect, a trigger amount of visual effect), further forgoes generating, via the one or more tactile output generators, (e.g., for the duration of the change in the orientation of the device) the tactile output (e.g., 3336) that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point. Forgoing generating a tactile output (e.g., 3336) (e.g., that is associated with a visual effect being applied to a user interface object) in accordance with a determination that the visual effect being applied to the user interface object does not exceed a predefined amplitude (e.g., magnitude) limit allows the device to avoid providing unnecessary (or inappropriate / false) sensory feedback. Reducing unnecessary output provided by the device enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to achieve an intended result by providing feedback indicative of an input that will cause the device to generate the intended result and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
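The gating logic in this paragraph (generating the tactile output only when the applied visual effect exceeds the predefined amplitude limit, and forgoing it otherwise) can be sketched as follows. The threshold value and function name are hypothetical placeholders, not values from this disclosure:

```python
# Hypothetical sketch of the amplitude-gated tactile output described above.
# The limit value is an assumed placeholder, not a value from the disclosure.
PREDEFINED_AMPLITUDE_LIMIT = 0.25

def should_generate_tactile_output(visual_effect_amplitude: float) -> bool:
    """Generate the tactile output only when the visual effect applied to
    the user interface object exceeds the predefined amplitude limit;
    otherwise the device forgoes generating it."""
    return visual_effect_amplitude > PREDEFINED_AMPLITUDE_LIMIT
```

Note that an effect exactly at the limit does not "exceed" it, so no output is generated in that case either.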
[0991] In some embodiments, (3444) a parameter (e.g., magnitude, frequency, rate of repetition) of the generated tactile output (e.g., an amount of physical displacement of the device or a component of the device caused by the tactile output, a waveform with which a mass driven by the tactile output generator is driven, such as the waveforms discussed above with reference to FIGS. 4C-4H, or a spacing between repetitions of a tactile output) changes based on (e.g., in correlation with) a velocity (speed) of the movement of the device (e.g., 3300) (while the orientation of the device changes). In some examples, the magnitude of the generated tactile output increases (e.g., the amount of physical displacement of the device or a component of the device increases) as (or in correlation with) the velocity of movement of the device increases. In some examples, the magnitude of the generated tactile output decreases (e.g., the amount of
physical displacement of the device or a component of the device decreases) as (or in correlation with) the velocity of movement of the device decreases. Generating a tactile output (e.g., 3336) that changes based on a velocity of movement of the device allows the user to more easily sense and recognize the tactile output when it is generated (and thus, in some embodiments, allows the user to more easily recognize that an operation that corresponds to this type of tactile output has been performed by the device). Providing a unique sensory feedback (such as the tactile output based on changes in velocity) to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
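The velocity-correlated parameter described in this paragraph (and the analogous amount-of-movement correlation described in the following paragraph) could be sketched as a simple scaling function. The base, gain, and clamp values below are illustrative assumptions only, not values from this disclosure:

```python
def tactile_magnitude(motion: float, base: float = 0.2, gain: float = 0.1,
                      max_magnitude: float = 1.0) -> float:
    """Scale the tactile output's magnitude (e.g., the amount of physical
    displacement) in correlation with a motion metric -- the velocity of
    movement of the device here, or equally an amount (distance) of
    movement -- clamped to the generator's maximum."""
    return min(max_magnitude, base + gain * abs(motion))
```

Faster (or larger) movement thus produces a larger tactile magnitude, up to the generator's limit.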
[0992] In some embodiments, (3446) a parameter (e.g., magnitude, frequency, rate of repetition) of the generated tactile output (e.g., an amount of physical displacement of the device or a component of the device caused by the tactile output, a waveform with which a mass driven by the tactile output generator is driven, or a spacing between repetitions of a tactile output) changes based on (e.g., in correlation with) an amount (e.g., a distance) of movement of the device (e.g., 3300) (while the orientation of the device changes). In some examples, the magnitude of the generated tactile output increases (e.g., the amount of physical displacement of the device or a component of the device increases) as (or in correlation with) the amount of movement of the device increases. In some examples, the magnitude of the generated tactile output decreases (e.g., the amount of physical displacement of the device or a component of the device decreases) as (or in correlation with) the amount of movement of the device decreases. Generating a tactile output (e.g., 3336) that changes based on an amount of movement of the device allows the user to more easily sense and recognize the tactile output when it is generated (and thus, in some embodiments, allows the user to more easily recognize that an operation that corresponds to this type of tactile output has been performed by the device). Providing a unique sensory feedback (such as the tactile output based on changes in an amount of movement) to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0993] In some embodiments, the electronic device (e.g., 3300), (3448) while no longer detecting a change in orientation of the device relative to the reference point (e.g., 3332) (e.g., immediately after or in response to ceasing to detect the change in orientation of the device), ceases (3450) to change the appearance of the user interface object (e.g., 3324, 3354, 3344). In some examples, the visual effect (e.g., 3325, 3327, 3345) continues to be applied but does not change while the orientation of the device does not change. In some embodiments, (3448) while no longer detecting a change in orientation of the device relative to the reference point (e.g., 3332), the device further (continues to) generates (3452), via the one or more tactile output generators, (e.g., continuing to perform (for a predetermined period of time) the tactile output from when the orientation of the device was changing) the tactile output (e.g., 3336) that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point.
[0994] In some embodiments, the electronic device (e.g., 3300) detects (3454), via the one or more sensor devices, a ceasing (e.g., stopping) of the change in orientation of the device. In some embodiments, in response to detecting the ceasing of the change in orientation of the device, the device further gradually ceases (3456) to generate the tactile output (e.g., 3336). In some examples, gradually ceasing to generate the tactile output (e.g., 3336) includes gradually ceasing to generate the tactile output over the predefined period based on a speed or an amount of movement of the device relative to the baseline orientation prior to the stopping of the device. For example, if the device had been moving (on average) at a faster speed prior to stopping, the predefined period is longer than if the device had been moving (on average) at a slower speed prior to stopping. For another example, if the device had moved a longer (aggregate) distance prior to stopping, the predefined period is longer than if the device had moved a shorter (aggregate) distance prior to stopping. In some examples, the predefined period is based on a predefined time limit, such as 0.2 seconds, 0.5 seconds, or 1 second. In some embodiments, gradually ceasing to generate the tactile output (e.g., 3336) includes ceasing to generate the tactile output based on a simulated physical system (e.g., an energy dissipation system). In some
embodiments, parameters of the simulated physical system are selected to ensure that the tactile output (e.g., 3336) gradually ceases within a threshold amount of time (e.g., an energy dissipation system with a predefined drain rate and a limited capacity for energy storage).
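The simulated energy-dissipation system described here can be sketched as an exponential drain that guarantees the tactile output ceases within a bounded time. The drain rate, step size, and floor below are assumed placeholder values:

```python
def decay_tactile_output(initial_energy: float, drain_rate: float = 0.5,
                         step_seconds: float = 0.05, floor: float = 0.01):
    """Gradually cease the tactile output using a simulated energy
    dissipation system: each step drains a fixed fraction of the
    remaining energy, so the output falls below the floor (and stops)
    within a bounded amount of time."""
    levels, energy, elapsed = [], initial_energy, 0.0
    while energy >= floor:
        levels.append(energy)
        energy *= (1.0 - drain_rate)
        elapsed += step_seconds
    return levels, elapsed
```

Because the drain is proportional and the floor is fixed, the stopping time grows only logarithmically with the initial energy, which matches the requirement that the output ceases within a threshold amount of time.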
[0995] In some embodiments, the generated tactile output is a repetition of two or more distinctive tactile output patterns (e.g., one or more of the tactile output patterns described above with reference to FIGS. 4C-4H) including a first tactile output pattern and a second tactile output pattern, wherein the first tactile output pattern is different from the second tactile output pattern. In some examples, the first tactile output pattern and the second tactile output pattern have the same predetermined duration. In some examples, the first tactile output pattern and the second tactile output pattern have different durations. In some examples, the first tactile output pattern and the second tactile output pattern have different frequency patterns. In some examples, the first tactile output pattern and the second tactile output pattern have different magnitude patterns. Generating a tactile output (e.g., 3336) that is a repetition of two or more distinctive tactile output patterns allows the user to more easily sense and recognize the tactile output when it is generated (and thus, in some embodiments, allows the user to more easily recognize that an operation that corresponds to this type of tactile output has been performed by the device). Providing a unique sensory feedback (such as the tactile output that is a repetition of two or more distinctive tactile output patterns) to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
[0996] In some embodiments, the electronic device is a wearable device (e.g., a smartwatch). In some embodiments, subsequent to receiving authorization (from the user of the device) (e.g., a biometric authorization, such as a fingerprint authorization, a facial recognition authorization, voice recognition authorization, retina/iris scan authorization) to proceed with a transfer (e.g., a payment transaction), and in accordance with a determination that the received authorization is successful (e.g., the received authorization information is consistent with enrolled authorization information for authorizing transfers) the device transmits, via a wireless transmission device,
account credentials (e.g., payment credentials of a payment account, such as a stored-value account, a debit card account, a credit card account) to a transaction terminal (e.g., a near field communication terminal, a point of sale terminal) for proceeding with the transfer. In some embodiments, subsequent to transmitting the account credentials to the transaction terminal, or in response to receiving a response signal from the transaction terminal acknowledging successful receipt of the account credentials and that the transfer has been successfully completed, the device (e.g., after turning off a display and/or while maintaining a display in an off state) generates a tactile output (e.g., to indicate to the user that the payment transaction with the transaction terminal was successfully completed).
[0997] Note that details of the processes described above with respect to method 3400 (e.g., FIGS. 34A-34D) are also applicable in an analogous manner to the methods described above. For example, method 3400 optionally includes one or more of the characteristics of the various methods described above with reference to methods 900, 1200, 1500, 1800, 2100, 2400, 2700, and 3000. For example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a message (e.g., corresponding to graphical representation of a message 866) is completed, as described in method 900, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element (e.g., 868) of a graphical representation of the message (e.g., 866) to indicate to the user that the transfer is successfully completed. For another example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a communication (e.g., corresponding to graphical representation of a communication 1118) is completed, as described in method 1200, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element (e.g., 1122) of a graphical representation of the communication (e.g., 1118) to indicate to the user that the transfer is successfully completed. For another example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a message (e.g., corresponding to received message object 1490) is completed, as described in method 1500, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element (e.g., 1468) of a received message object (e.g., 1490) corresponding to the message to indicate to the user that the transfer is successfully completed. For another example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a message (e.g., corresponding to message object 1726) is completed,
as described in method 1800, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element of a message object (e.g., 1726) corresponding to the message to indicate to the user that the transfer is successfully completed. For another example, when an account (e.g., corresponding to representation of the second account 2030) is ready to be used in a transfer (e.g., of a resource, of a file, of a payment) and/or when a transfer using the account is completed, as described in method 2100, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to one or more elements (e.g., 2034) of a representation of the account (e.g., 2030) to indicate to the user that the account is ready to be used in the transfer and/or that the transfer is successfully completed. For another example, when a payment account (e.g., corresponding to graphical representation 2330) is ready to be used in a transfer (e.g., of a resource, of a file, of a payment) and/or when a transfer using the payment account is completed, as described in method 2400, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to one or more elements of the graphical representation of the payment account (e.g., 2330) to indicate to the user that the account is ready to be used in the transfer and/or that the transfer is successfully completed. For another example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a message (e.g., corresponding to message object 2644) is completed, as described in method 2700, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element (e.g., 2622) of the message object (e.g., 2644) corresponding to the message to indicate to the user that the transfer is successfully completed.
For another example, when a transfer (e.g., of a resource, of a file, of a payment) associated with a message (e.g., corresponding to message object 2932) is completed, as described in method 3000, a visual effect (e.g., 3325, 3327, 3345, a coloring effect, a geometric alteration effect) can be applied to an element of the message object (e.g., 2932) corresponding to the message to indicate to the user that the transfer is successfully completed. For brevity, these details are not repeated below.
[0998] The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to FIGS. 1A, 3, and 5A) or application specific chips. Further, the operations described above with
reference to FIGS. 34A-34D are, optionally, implemented by components depicted in FIGS. 1A-1B. For example, displaying operation 3402, detecting operation 3404, changing operation 3408, reducing operation 3422, and continuing operation 3436 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in FIGS. 1A-1B.
[0999] In accordance with some implementations, a computer-readable storage medium (e.g., a non-transitory computer readable storage medium) is provided, the computer-readable storage medium storing one or more programs for execution by one or more processors of an electronic device, the one or more programs including instructions for performing any of the methods or processes described herein.
[1000] In accordance with some implementations, an electronic device (e.g., a portable electronic device) is provided that comprises means for performing any of the methods or processes described herein.
[1001] In accordance with some implementations, an electronic device (e.g., a portable electronic device) is provided that comprises a processing unit configured to perform any of the methods or processes described herein.
[1002] In accordance with some implementations, an electronic device (e.g., a portable electronic device) is provided that comprises one or more processors and memory storing one or
more programs for execution by the one or more processors, the one or more programs including instructions for performing any of the methods or processes described herein.
[1003] Exemplary methods, non-transitory computer-readable storage media, systems, and electronic devices are set out in the following items:
1. A method, comprising:
at an electronic device with a display, one or more input devices, and a wireless communication radio:
receiving, via the wireless communication radio, one or more messages;
displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants;
while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message;
in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item;
while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
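The determination in item 1 (deciding, from an analysis of text in the respective message, whether the message relates to a transfer of the first type of item) might be approximated as below. This is only a sketch under the assumption that the first type of item is a currency amount detected with a simple pattern; the pattern and function name are hypothetical, not taken from this disclosure:

```python
import re

# Hypothetical: treat a dollar amount in the text as indicating a transfer
# of the "first type of item" (a payment). Illustrative pattern only.
AMOUNT_PATTERN = re.compile(r"\$\s?(\d+(?:\.\d{1,2})?)")

def detect_transfer_quantity(text: str):
    """Return the detected quantity if the message text appears to relate
    to a transfer (so a selectable indication would be displayed),
    otherwise None (no selectable indication)."""
    match = AMOUNT_PATTERN.search(text)
    return float(match.group(1)) if match else None
```

A non-None result corresponds to the branch that concurrently displays the representation of the message and the selectable indication; None corresponds to displaying the representation alone (as in item 4).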
2. The method of item 1, wherein the text in the respective message includes a first quantity of content of the first type of item, and wherein the transfer user interface includes an indication of the first quantity of the content of the first type of item.
3. The method of any of items 1-2, wherein:
the message conversation involves two or more participants, other than a user of the device;
the text in the respective message includes a first quantity of content of the first type of item; and the transfer user interface includes an indication of a second quantity of content of the first type of item, wherein the second quantity is a numerical value divided among the two or more participants based on the first quantity.
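The second quantity in item 3 (a numerical value divided among the two or more participants other than the user, based on the first quantity stated in the message) could be computed as in this sketch. An even split and two-decimal rounding for a currency-like item are assumptions, not requirements of the item:

```python
def split_quantity(first_quantity: float, other_participants: int) -> float:
    """Divide the first quantity from the message text among the
    participants other than the user (assumed even split, rounded to
    two decimal places for a currency-like item)."""
    return round(first_quantity / other_participants, 2)
```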
4. The method of any of items 1 - 3, further comprising:
further in response to receiving the respective message, in accordance with a determination, based on the analysis of text in the respective message, that the respective message does not relate to a transfer of the first type of item, displaying, on the display, a representation of the respective message without displaying the selectable indication that corresponds to the first type of item.
5. The method of any of items 1 - 4, further comprising:
in accordance with the determination, based on the analysis of the text in the respective message, that the respective message relates to the transfer of the first type of item that the messaging application is configured to transfer, displaying, on the display, a transfer affordance;
detecting user activation of the transfer affordance; and in response to detecting the user activation of the transfer affordance, displaying, on the display, the transfer user interface for initiating transfer of the first type of item to a participant in the message conversation.
6. The method of any of items 1 - 5, further comprising:
in accordance with a determination that the respective message includes one or more features that indicate that the transfer request is a fraudulent transfer request, forgoing displaying the transfer affordance.
7. The method of any of items 1 - 6, wherein the selectable indication is a portion of the text in the respective message that relates to the first type of item that is visually distinguished from other text in the respective message.
8. The method of any of items 1 - 7, further comprising:
while displaying the transfer user interface, receiving user input;
in response to receiving the user input, displaying, on the display, a keypad user interface, wherein the keypad user interface includes one or more suggested numerical values for a quantity of the first type of item to transfer.
9. The method of any of items 1 - 8, wherein displaying, on the display, the transfer user interface comprises replacing display of a virtual keyboard having a plurality of alphanumeric keys with the transfer user interface.
10. The method of any of items 1-9, wherein the transfer user interface is concurrently displayed with at least a portion of the representation of the respective message.
11. The method of any of items 1-10, wherein the transfer user interface includes a transfer mode affordance, the method further comprising:
detecting a first activation of the transfer mode affordance;
in response to detecting the first activation of the transfer mode affordance, designating the message associated with the transfer of the first type of item as corresponding to a transmission of the first type of item;
detecting a second activation of the transfer mode affordance; and in response to detecting the second activation of the transfer mode affordance, designating the message associated with the transfer of the first type of item as corresponding to a request for the first type of item.
12. The method of any of items 1 -11, wherein the transfer user interface includes a send affordance, the method further comprising:
detecting user activation of the send affordance; and in response to detecting the user activation of the send affordance, displaying, on the display, a graphical representation of a message associated with the transfer of the first type of item in the message conversation, wherein the graphical representation of the message associated with the transfer of the first type of item includes an indication of a quantity of content of the first type of item being transferred.
13. The method of item 12, wherein:
in accordance with a determination that a message prepared to be sent corresponds to the first type of item, the send affordance is displayed with a first visual characteristic; and in accordance with a determination that the message prepared to be sent corresponds to a second type of item different from the first type of item, the send affordance is displayed with a second visual characteristic different from the first visual characteristic.
14. The method of any of items 12-13, wherein the graphical representation of the message associated with transfer of the first type of item is displayed with a third visual characteristic in the message conversation, and a representation of a message in the message conversation not associated with transfer of the first type of item is displayed with a fourth visual characteristic that is different from the third visual characteristic.
15. The method of any of items 12-14, further comprising:
in response to detecting the user activation of the send affordance and prior to displaying, on the display, the graphical representation of the message associated with the transfer of the first type of item in the message conversation, in accordance with a determination that the message associated with the transfer of the first type of item corresponds to a transmission of the first type of item, displaying, on the display, an authentication user interface requesting authentication information;
receiving, via the one or more input devices, the authentication information, and:
in accordance with a determination that the received authentication information corresponds to enrolled authentication information for authorizing transfers, displaying, on the
display, the graphical representation of the message associated with the transfer of the first type of item in the message conversation; and in accordance with a determination that the received authentication information does not correspond to the enrolled authentication information for authorizing transfers, forgoing displaying, on the display, the graphical representation of the message associated with the transfer of the first type of item in the message conversation.
16. The method of any of items 1 -15, further comprising:
in accordance with a determination that the respective message corresponds to a transmission, from a first participant in the message conversation, of a first quantity of content of the first type of item, automatically transferring the first quantity of content of the first type of item to the first participant.
17. The method of any of items 1 -16, further comprising:
while displaying, on the display, the transfer user interface:
displaying a numerical value representing a quantity of the first type of item;
detecting, via the one or more input devices, a user input;
in accordance with a determination that the user input corresponds to a first type of user input, increasing the displayed numerical value by an amount corresponding to the first type of user input; and in accordance with a determination that the user input corresponds to a second type of user input, decreasing the displayed numerical value by an amount corresponding to the second type of user input.
18. The method of item 17, wherein the user input is a continuous input on an affordance for at least a predetermined time, the method further comprising:
in accordance with the determination that the user input corresponds to the first type of user input, increasing the displayed numerical value by an increasingly faster rate based on the duration of the user input; and
in accordance with the determination that the user input corresponds to the second type of user input, decreasing the displayed numerical value by an increasingly faster rate based on the duration of the user input.
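The behavior in item 18 (a continuous input whose effect on the displayed numerical value accelerates with its duration) can be sketched with a discrete simulation in which each successive step applies a larger increment. The base rate and acceleration constants are illustrative assumptions:

```python
def value_after_hold(initial: float, hold_steps: int,
                     base_rate: float = 1.0, acceleration: float = 0.2) -> float:
    """Simulate a continuous press on the affordance sampled in discrete
    steps: the per-step increment grows with the duration of the hold,
    so the displayed numerical value changes at an increasingly
    faster rate."""
    value = initial
    for step in range(hold_steps):
        value += base_rate + acceleration * step
    return value
```

Decreasing (the second type of user input) would subtract the same accelerating increments instead of adding them.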
19. The method of item 17, wherein the user input is a continuous input on an affordance having a first characteristic intensity at a first time and a second characteristic intensity at a second time, the method further comprising:
in accordance with the determination that the user input corresponds to the first type of user input, increasing the displayed numerical value by a first rate at the first time and by a second rate at the second time; and in accordance with the determination that the user input corresponds to the second type of user input, decreasing the displayed numerical value by the first rate at the first time and by the second rate at the second time.
20. The method of any of items 17-19, further comprising:
in accordance with a determination that the user input corresponds to a third type of user input, replacing display of the transfer user interface with a numerical keypad user interface, wherein the numerical keypad user interface includes a plurality of suggested values.
21. The method of item 20, wherein an amount of at least one of the plurality of suggested values is determined based on stored historical use data associated with a user of the electronic device.
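The suggested values of item 21, determined from stored historical use data associated with the user, might be derived as in this sketch (most frequently used amounts first; the ranking strategy is an assumption, not stated in the item):

```python
from collections import Counter

def suggested_values(history, count=3):
    """Derive suggested quantities from stored historical use data
    associated with the user: the most frequently transferred amounts,
    most frequent first (assumed ranking strategy)."""
    return [amount for amount, _ in Counter(history).most_common(count)]
```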
22. The method of any of items 17-21, wherein the electronic device provides feedback while changing the displayed numerical value.
23. The method of any of items 1 - 22, further comprising:
while displaying, on the display, the transfer user interface:
displaying an affordance for changing an account for use in the transfer of the first type of item;
detecting, via the one or more input devices, user activation of the affordance for changing the account;
in response to detecting the user activation of the affordance for changing the account, displaying, on the display, an account user interface including a representation of a current account and a representation of a second account, wherein the current account is currently selected for use in the transfer;
detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account, selecting the second account for use in the transfer.
24. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, one or more input devices, and a wireless communication radio, the one or more programs including instructions for performing the method of any of items 1 - 23.
25. An electronic device, comprising:
a display;
one or more input devices;
a wireless communication radio;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 1 - 23.
26. An electronic device, comprising:
a display;
one or more input devices;
a wireless communication radio; and means for performing the method of any of items 1 - 23.
27. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, one or more input devices, and a wireless communication radio, the one or more programs including instructions for:
receiving, via the wireless communication radio, one or more messages;
displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants;
while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message;
in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item;
while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
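Item 27 turns on "an analysis of text in the respective message" that decides whether the message relates to a transfer. The claims do not specify the analysis; the toy classifier below is one hypothetical stand-in, with an assumed currency pattern and keyword list:

```python
import re

# Assumed heuristics, not from the source: a message is treated as
# transfer-related when it contains a currency amount or a payment
# keyword, which would trigger display of the selectable indication.
AMOUNT = re.compile(r"\$\d+(?:\.\d{2})?")
KEYWORDS = ("owe", "pay", "send me")

def relates_to_transfer(text):
    t = text.lower()
    return bool(AMOUNT.search(t)) or any(k in t for k in KEYWORDS)
```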
28. An electronic device, comprising:
a display;
one or more input devices;
a wireless communication radio;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving, via the wireless communication radio, one or more messages;
displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants;
while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, receiving, from one of the participants, a respective message;
in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item;
while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, detecting, via the one or more input devices, user activation of the selectable indication; and in response to detecting the user activation of the selectable indication, displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
29. An electronic device, comprising:
a display;
one or more input devices;
a wireless communication radio;
means for receiving, via the wireless communication radio, one or more messages;
means for displaying, on the display, a user interface for a messaging application that includes at least one of the one or more messages in a message conversation between a plurality of conversation participants;
means, while concurrently displaying, on the display, at least one of the one or more messages in the message conversation, for receiving, from one of the participants, a respective message;
means, in response to receiving the respective message, in accordance with a determination, based on an analysis of text in the respective message, that the respective message relates to a transfer of a first type of item that the messaging application is configured to transfer, for concurrently displaying, on the display, a representation of the message and a selectable indication that corresponds to the first type of item;
means, while the representation of the message and the selectable indication that corresponds to the first type of item are concurrently displayed on the display, for detecting, via the one or more input devices, user activation of the selectable indication; and means, in response to detecting the user activation of the selectable indication, for displaying, on the display, a transfer user interface for initiating transfer of the first type of item between participants in the message conversation.
30. A method, comprising:
at an electronic device with a display and one or more sensor devices:
displaying, on the display, a graphical representation of a communication;
while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display:
in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
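The state-dependent feedback of item 30 reduces to a dispatch on the communication's state: only the first state produces feedback whose magnitude tracks the orientation change. The Python sketch below is illustrative; the state names, feedback type, and return shape are assumptions:

```python
def feedback_for_orientation_change(state, delta_angle):
    """Return feedback for a detected orientation change, or None.

    "completed" stands in for the first state (feedback whose
    magnitude indicates the orientation change); any other state
    stands in for the second state (no such feedback).
    """
    if state == "completed":
        return {"type": "visual", "magnitude": abs(delta_angle)}
    return None
```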
31. The method of item 30, wherein the communication is associated with a completed transfer of a first type of item between a user of the device and a participant in a message conversation.
32. The method of any of items 30 - 31, further comprising:
prior to displaying, on the display, the graphical representation of the communication, receiving the communication with a predetermined type of message from an external device.
33. The method of item 32, further comprising:
in response to receiving the communication with the predetermined type of message: in accordance with a determination, based on an analysis of the communication, that the communication meets a first predefined condition, displaying, on the display, a first indication that the communication meets the first predefined condition; and in accordance with a determination, based on the analysis of the communication, that the communication does not meet the first predefined condition, forgoing displaying, on the display, the first indication.
34. The method of item 33, wherein the communication meets the first predefined condition when the external device does not correspond to one of a plurality of contacts associated with the electronic device.
35. The method of item 33, wherein the communication meets the first predefined condition when the external device corresponds to one of a plurality of contacts.
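Items 34 and 35 recite opposite readings of the first predefined condition: met when the sender's device is not among the device's contacts (item 34), or when it is (item 35). A minimal sketch of that check, with hypothetical parameter names:

```python
def meets_first_condition(sender, contacts, unknown_sender_variant=True):
    """Evaluate the first predefined condition for a received communication.

    unknown_sender_variant=True models item 34 (condition met for an
    unknown sender); False models item 35 (condition met for a known
    contact). Both variants share the same membership test.
    """
    known = sender in contacts
    return (not known) if unknown_sender_variant else known
```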
36. The method of any of items 33 - 35, further comprising:
in accordance with the determination, based on the analysis of the communication, that the communication meets the first predefined condition, displaying, on the display, a reporting affordance;
while displaying, on the display, the reporting affordance, detecting user activation of the reporting affordance; and
in response to detecting the user activation of the reporting affordance, transmitting, to an external device, information associated with the communication that meets the first predefined condition.
37. The method of any of items 33 - 36, further comprising:
subsequent to displaying, on the display, the first indication that the communication meets the first predefined condition, receiving user activation of a send affordance displayed on the graphical representation of the communication; and in response to receiving the user activation of the send affordance:
displaying a second indication that the communication meets the first predefined condition, wherein the second indication is visually distinguishable from the first indication, and displaying, on the display, a cancel affordance for forgoing proceeding with a transfer of the first type of item between a user of the device and a participant in a message conversation.
38. The method of any of items 30 - 37, wherein a state of the communication is indicative of an action taken by a participant, other than a user of the device, in a message conversation.
39. The method of any of items 30 - 38, wherein a state of the communication is indicative of an action taken by a user of the device.
40. The method of any of items 30 - 39, wherein the graphical representation of the communication includes an indication of a quantity of an item associated with the communication.
41. The method of any of items 30 - 40, wherein the reference point is a face in a field of view of a sensor of the one or more sensor devices.
42. The method of any of items 30 - 41, wherein the reference point is a static location external to the electronic device.
43. The method of any of items 30 - 42, wherein the respective type of feedback is a dynamic visual feedback.
44. The method of any of items 30 - 43, wherein the respective type of feedback is a dynamic haptic feedback.
45. The method of any of items 30 - 44, wherein the respective type of feedback is caused by an operating system program of the electronic device and non-operating system programs of the electronic device are not enabled to cause the respective type of feedback.
46. The method of any of items 30 - 45, wherein the respective type of feedback is a graphical animation displayed over the graphical representation.
47. The method of any of items 30 - 46, wherein the respective type of feedback is a graphical animation displayed under the graphical representation.
48. The method of any of items 30 - 47, wherein the respective type of feedback is a graphical animation that creates an illusion that the graphical representation is a three-dimensional object that is being viewed from different angles as the angle of the device changes.
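The parallax illusion of item 48 is commonly produced by offsetting visual layers as a function of device tilt. The mapping below is a hypothetical sketch — the depth factor and clamp are assumed values, not from the claims:

```python
def parallax_offset(tilt_degrees, depth=0.5, max_offset=8.0):
    """Map a device tilt angle to a layer offset (e.g. in points).

    Scaling tilt by a per-layer depth factor and clamping the result
    makes the representation appear to be a 3D object viewed from
    different angles as the device angle changes.
    """
    offset = tilt_degrees * depth
    return max(-max_offset, min(max_offset, offset))
```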
49. The method of any of items 30 - 48, wherein outputting the respective type of feedback comprises outputting a non-visual feedback.
50. The method of any of items 30 - 49, wherein the communication is a message in a message conversation between a plurality of conversation participants and the communication is associated with a confirmation, the method further comprising:
prior to displaying, on the display, the graphical representation of the communication, detecting user activation of a confirmation affordance; and in response to detecting user activation of the confirmation affordance:
displaying, on the display, the graphical representation of the communication in the message conversation, and outputting a second type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates that the communication has been confirmed.
51. The method of any of items 30 - 50, further comprising:
receiving user selection of the graphical representation of the communication; and in response to receiving the user selection of the graphical representation of the communication, displaying, on the display, a details user interface including information associated with the communication.
52. The method of item 51, wherein the details user interface includes a cancellation affordance, wherein the cancellation affordance is user-selectable when the communication is in the first state, the method further comprising:
detecting user activation of the cancellation affordance; and in response to detecting the user activation of the cancellation affordance:
in accordance with the determination that the communication has the first state, transmitting a second communication with the predetermined type of message to an external device associated with the communication requesting a return transfer of a first type of item that was transferred via the communication.
53. The method of any of items 30 - 52, wherein the graphical representation of the communication having the first state includes a graphical indication of a completed transfer of a first type of item between the electronic device and an external device.
54. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for performing the method of any of items 30 - 53.
55. An electronic device, comprising:
a display;
one or more sensor devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 30 - 53.
56. An electronic device, comprising:
a display;
one or more sensor devices; and means for performing the method of any of items 30 - 53.
57. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for:
displaying, on the display, a graphical representation of a communication;
while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display:
in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and
in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
58. An electronic device, comprising:
a display;
one or more sensor devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a graphical representation of a communication;
while displaying the graphical representation of the communication on the display, detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display:
in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
59. An electronic device, comprising:
a display;
one or more sensor devices;
means for displaying, on the display, a graphical representation of a communication; means, while displaying the graphical representation of the communication on the display, for detecting, via the one or more sensor devices, a change in orientation of the electronic device relative to a reference point; and means, in response to detecting the change in the orientation of the electronic device relative to the reference point while displaying the graphical representation of the communication on the display, for:
in accordance with a determination that the communication has a first state, displaying the graphical representation of the communication and outputting a respective type of feedback corresponding to the graphical representation of the communication, wherein the feedback indicates a magnitude of the change in the orientation of the electronic device relative to the reference point; and in accordance with a determination that the communication has a second state that is different from the first state, displaying the graphical representation of the communication without outputting feedback that indicates a magnitude of the change in the orientation of the electronic device relative to the reference point.
60. A method, comprising:
at an electronic device with a display and one or more input devices:
displaying, on the display, a numerical value selection user interface;
while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface;
in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface;
while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that
corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and:
in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object:
the respective numerical value is displayed in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
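Item 60 distinguishes a transmission message object (the value itself rendered in the large "respective font") from a request message object (the value in a smaller font, with a request indicator in the respective font). A hypothetical data-only sketch of the two layouts — the font sizes, field names, and "$" indicator are assumptions:

```python
def message_object(kind, value, respective_font=24, small_font=14):
    """Build a dict describing the message object recited in item 60.

    "transmission": the numerical value in the respective font.
    "request": the value in a smaller font plus a predetermined
    request indicator displayed in the respective font.
    """
    if kind == "transmission":
        return {"value": value, "value_font": respective_font}
    if kind == "request":
        return {"value": value, "value_font": small_font,
                "indicator": "$", "indicator_font": respective_font}
    raise ValueError(f"unknown message kind: {kind}")
```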
61. The method of item 60, further comprising:
receiving, from a participant of the one or more participants, a message that corresponds to a second respective numerical value, and:
in accordance with a determination that the received message is designated as a transmission message for the second respective numerical value, displaying, on the display, a first received message object in the message transcript of the messaging application, wherein the first received message object includes a graphical representation of the second respective numerical value in the respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the received message is designated as a request message for the second respective numerical value, displaying, on the display, a second
received message object in the message transcript of the messaging application different from the first received message object, wherein, in the second received message object:
the respective numerical value is displayed in the font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
62. The method of item 61, wherein the first message object, the second message object, the first received message object, and the second received message object are displayed with a first visual characteristic.
63. The method of item 62, wherein a third message object that corresponds to a message of the messaging application that was sent by the electronic device and does not correspond to the respective numerical value is displayed with a second visual characteristic and a third received message object that corresponds to a message of the messaging application that was received from the one or more participants and does not correspond to the second respective numerical value is displayed with a third visual characteristic that is different from the second visual characteristic.
64. The method of any of items 60 - 63, wherein the first message object and the second message object are displayed with a first visual characteristic.
65. The method of item 64, wherein a third message object that corresponds to a message of the messaging application that does not correspond to the respective numerical value is displayed with a second visual characteristic that is different from the first visual characteristic.
66. The method of any of items 60 - 65, further comprising:
in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, in accordance with a determination that a first participant of the one or more participants is ineligible to receive
the message, displaying, on the display, an indication that the first participant is ineligible to receive the message.
67. The method of any of items 60 - 66, wherein the message transcript of the messaging application includes a third message object, wherein the third message object corresponds to a transmission message for sending one or more items corresponding to a numerical value generated at an external device of a participant of the one or more participants and the third message object includes an accept affordance for accepting one or more items associated with the third message object at the electronic device.
68. The method of any of items 60 - 67, wherein the message transcript of the messaging application includes a fourth message object, wherein the fourth message object corresponds to a request message for requesting one or more items corresponding to a numerical value generated at an external device of a participant of the one or more participants and the fourth message object includes a send affordance for sending one or more items associated with the fourth message object to a participant from whom the fourth message object was received.
69. The method of any of items 60 - 68, further comprising:
concurrently displaying, at a first location associated with a message object in the message transcript of the messaging application, a visual indicator indicating a status associated with an action of a participant of the one or more participants; and in accordance with a determination that the participant has taken an action changing the status, updating the visual indicator to reflect the change in status associated with the action of the participant.
70. The method of item 69, wherein the first location at least partially overlaps with the displayed message object.
71. The method of item 69, wherein the first location does not overlap with the displayed message object, and wherein content of the visual indicator is controlled by an operating system of the electronic device.
72. The method of any of items 60 - 71, further comprising:
prior to displaying, on the display, the numerical value selection user interface, displaying, on the display, a third message object that corresponds to a message received from a participant, other than a user of the electronic device, of the one or more participants;
in accordance with a determination that the third message was authenticated, by the participant, on an external device of the participant, concurrently displaying, with the third message object, an indication that the third message was biometrically authenticated by the participant.
73. The method of any of items 60 - 72, wherein the one or more participants includes a first participant and a second participant, and the first participant and the second participant are different from a user of the electronic device, the method further comprising:
receiving an indication that an intended recipient of the message is the first participant and not the second participant;
subsequent to receiving the indication that the intended recipient of the message is the first participant and not the second participant:
in accordance with the determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, the first message object in a second message transcript different from the message transcript of the messaging application, wherein the second message transcript is not associated with the second participant; and in accordance with the determination that the message is designated as a request message for the respective numerical value, displaying, on the display, the second message object in the second message transcript of the messaging application.
74. The method of any of items 60 - 73, further comprising:
prior to sending the message to the one or more participants, receiving, via the one or more input devices, a user comment; and subsequent to receiving the user comment:
in accordance with the determination that the message is designated as a transmission message for one or more items corresponding to the respective numerical value, concurrently displaying, adjacent to the first message object, a message object including the user comment; and in accordance with the determination that the message is designated as a request message for one or more items corresponding to the respective numerical value, concurrently displaying, adjacent to the second message object, the message object including the user comment.
75. The method of any of items 60 - 74, further comprising:
subsequent to displaying, on the display, the second message object in the messaging application, in accordance with a determination that a transfer of a first type of item in a quantity corresponding to the respective numerical value has been initiated by an intended recipient of the message associated with the second message object, changing display of a visual characteristic of the second message object from a first visual characteristic to a second visual characteristic.
76. The method of any of items 60 - 75, further comprising:
receiving, from an external device associated with a participant of the one or more participants, a second message associated with a request for a second respective numerical value;
subsequent to receiving the second message associated with the request for the second respective numerical value and in accordance with a determination that a predetermined amount of time has passed since receiving the second message:
in accordance with a determination that the second message is designated as a request message for one or more items corresponding to the second respective numerical value, generating a reminder of the received second message; and
in accordance with a determination that the second message is not designated as a request message for one or more items corresponding to the second respective numerical value, forgoing generating the reminder of the received second message.
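The reminder logic of item 76 hinges on two gates: a predetermined amount of time has elapsed, and the received message is designated as a request (not a transmission). A hypothetical sketch using plain timestamps — the one-day threshold and field names are assumed, not recited:

```python
def maybe_remind(message, received_at, now, threshold=86400):
    """Return a reminder string for a stale request message, else None.

    'threshold' is an assumed predetermined amount of time (one day,
    in seconds). Only messages designated as requests generate a
    reminder; transmission messages never do.
    """
    if now - received_at < threshold:
        return None
    if message.get("designation") == "request":
        return f"Reminder: {message['sender']} requested {message['value']}"
    return None
```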
77. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for performing the method of any of items 60 - 76.
78. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 60 - 76.
79. An electronic device, comprising:
a display;
one or more input devices; and means for performing the method of any of items 60 - 76.
80. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for:
displaying, on the display, a numerical value selection user interface;
while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface;
in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface;
while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and:
in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object:
the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
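The transmission-versus-request rendering that item 80 recites can be sketched, purely for illustration, as follows. The font sizes and the `"$"` request indicator are invented placeholders, not values taken from the claims:

```python
from dataclasses import dataclass
from typing import Optional

LARGE_FONT = 28  # hypothetical size of the "respective font" from the selection UI
SMALL_FONT = 17  # hypothetical smaller font used for the amount in a request


@dataclass
class MessageObject:
    amount_text: str
    amount_font: int
    indicator_text: Optional[str] = None  # predetermined request indicator
    indicator_font: Optional[int] = None


def build_message_object(value: float, is_request: bool) -> MessageObject:
    # Transmission message: the value itself is shown in the respective font.
    if not is_request:
        return MessageObject(amount_text=f"${value:g}", amount_font=LARGE_FONT)
    # Request message: the value is shown in a smaller font, while the
    # request indicator is shown in the respective (larger) font.
    return MessageObject(
        amount_text=f"${value:g}",
        amount_font=SMALL_FONT,
        indicator_text="$",
        indicator_font=LARGE_FONT,
    )
```

The sketch only captures the visual distinction the claim draws between the two message objects; an actual messaging application would carry far more state.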
81. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a numerical value selection user interface;
while displaying the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface;
in response to receiving the input that corresponds to the selection of the respective numerical value, displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface;
while displaying the representation of the respective numerical value in the numerical value selection user interface, receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, sending the message that corresponds to the respective numerical value to one or more participants, and:
in accordance with a determination that the message is designated as a transmission message for the respective numerical value, displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and in accordance with a determination that the message is designated as a request message for the respective numerical value, displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object:
the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
82. An electronic device, comprising:
a display;
one or more input devices;
means for displaying, on the display, a numerical value selection user interface; means, while displaying the numerical value selection user interface, for receiving, via the one or more input devices, an input that corresponds to selection of a respective numerical value from a plurality of numerical values in the numerical value selection interface;
means, in response to receiving the input that corresponds to the selection of the respective numerical value, for displaying, on the display, a representation of the respective numerical value in the numerical value selection user interface;
means, while displaying the representation of the respective numerical value in the numerical value selection user interface, for receiving, via the one or more input devices, an input that corresponds to a request to send a message, via a messaging application, that corresponds to the respective numerical value; and means, in response to receiving the input that corresponds to the request to send the message, via the messaging application, that corresponds to the respective numerical value, for sending the message that corresponds to the respective numerical value to one or more participants, and:
means, in accordance with a determination that the message is designated as a transmission message for the respective numerical value, for displaying, on the display, a first message object in a message transcript of the messaging application, wherein the first message object includes a graphical representation of the respective numerical value in a respective font that is associated with requests generated using the numerical value selection user interface; and means, in accordance with a determination that the message is designated as a request message for the respective numerical value, for displaying, on the display, a second message object in the message transcript of the messaging application different from the first message object, wherein, in the second message object:
the respective numerical value is displayed in the message object in a font that is smaller than the respective font; and a predetermined request indicator associated with requests generated using the numerical value selection user interface is displayed in the respective font.
83. A method, comprising:
at an electronic device with a display and one or more input devices:
displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device;
while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items:
in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
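The two branches item 83 recites reduce to a single authorization check. The following is an illustrative sketch only; the return strings are invented labels for the two claimed outcomes:

```python
def handle_obtain_request(has_authorized_account: bool) -> str:
    """Item 83's branch: proceed when the device is associated with an
    activated, authorized account; otherwise surface an affordance for
    activating one. Names here are purely illustrative."""
    if has_authorized_account:
        return "proceed_to_obtain"
    return "show_activation_affordance"
```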
84. The method of item 83, wherein the electronic device includes an application that is configured to manage the first one or more items.
85. The method of any of items 83 - 84, further comprising:
in accordance with a determination that the first one or more items sent from the participant corresponds to a gift:
displaying a graphical indication that the first one or more items sent from the participant corresponds to a gift; and concealing display of an indication of the amount of the first one or more items.
86. The method of item 85, further comprising:
detecting user selection of the message object; and
in response to detecting the user selection of the message object, displaying, on the display, the indication of the amount of the first one or more items.
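The gift-concealment behavior of items 85 and 86 can be modeled as a small state machine, sketched here under the assumption that the amount is a simple number; the class and method names are invented:

```python
class GiftMessageObject:
    """Illustrative model of items 85-86: a gift conceals its amount until
    the message object is selected; a non-gift shows it immediately."""

    def __init__(self, amount: int, is_gift: bool):
        self.amount = amount
        self.revealed = not is_gift  # non-gifts are never concealed

    def displayed_amount(self):
        # Concealed gifts display no amount indication.
        return self.amount if self.revealed else None

    def select(self):
        # Item 86: user selection of the message object reveals the amount.
        self.revealed = True
```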
87. The method of any of items 83 - 86, further comprising:
prior to displaying, on the display, the message object in the message conversation:
in accordance with a determination that the first one or more items sent from the participant corresponds to a gift, displaying, on the display, a notification of the first one or more items received from the participant that does not include display of an indication of the amount of the first one or more items; and in accordance with a determination that the first one or more items sent from the participant does not correspond to a gift, displaying, on the display, a notification of the first one or more items received from the participant that includes display of the indication of the amount of the first one or more items.
88. The method of any of items 83 - 87, wherein the message object includes an accept affordance, and the input that corresponds to the request to obtain the first one or more items comprises an input on the accept affordance.
89. The method of any of items 83 - 88, further comprising:
while displaying at least the portion of the message conversation and prior to detecting, via the one or more input devices, the input that corresponds to the request to obtain the first one or more items:
in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items without further user confirmation, proceeding to obtain the first one or more items without detecting the input that corresponds to the request to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items without further user confirmation, displaying, on the display, the accept affordance for activating an account that is authorized to obtain the first one or more items.
90. The method of item 89, wherein the first one or more items are items of a first type and the method further comprises:
while displaying at least the portion of the message conversation:
in accordance with the determination that the electronic device is associated with the activated account that is authorized to obtain the items of the first type without further user confirmation:
in accordance with a determination that there is no record of prior obtained items of the first type using the activated account:
forgoing proceeding to obtain the first one or more items without detecting the input;
proceeding to obtain the first one or more items in response to detecting the input that corresponds to the request to obtain the first one or more items; and in accordance with a determination that there is a record of prior obtained items of the first type using the activated account, proceeding to obtain the items of the first type without requiring detection of a user input that corresponds to a request to obtain items of the first type.
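Item 90's first-use rule can be condensed into a single predicate: even on an account authorized to obtain items of the type without confirmation, the very first obtainment still waits for an explicit input. A minimal sketch, with invented parameter names:

```python
def should_auto_obtain(auto_authorized: bool, has_prior_record: bool) -> bool:
    """Item 90's gate: auto-obtain only when the account is authorized to
    proceed without confirmation AND there is a record of a prior obtained
    item of the same type on that account."""
    return auto_authorized and has_prior_record
```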
91. The method of any of items 83 - 90, further comprising:
receiving a second input on the message object in the message conversation; and
in response to receiving the second input on the message object, displaying, on the display, a details user interface including information associated with the message object.
92. The method of any of items 83 - 91, wherein the first one or more items are items of a first type and the method further comprises:
in accordance with a determination that obtaining the first one or more items moves a total number of prior transfers of items of the first type associated with the activated account over a predetermined limit, displaying, on the display, a verification user interface corresponding to a request to verify identity of the user associated with the activated account.
93. The method of item 92, wherein the first one or more items are items of a first type, and the total number of prior transfers of items of the first type associated with the activated account includes only prior transfers of items of the first type associated with an obtaining of items of the first type, and does not include prior transfers of items of the first type associated with a transmission of items of the first type.
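The limit check of items 92 and 93 sums only transfers in the "obtained" direction before deciding whether identity verification is required. An illustrative sketch; the tuple encoding and direction labels are assumptions:

```python
def needs_identity_verification(prior_transfers, new_amount, limit) -> bool:
    """Items 92-93: count only prior transfers associated with obtaining
    (receiving) items of the type, not transmissions, against the
    predetermined limit."""
    obtained_total = sum(amount for amount, direction in prior_transfers
                         if direction == "obtained")
    return obtained_total + new_amount > limit
```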
94. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for performing the method of any of items 83 - 93.
95. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 83 - 93.
96. An electronic device, comprising:
a display;
one or more input devices; and means for performing the method of any of items 83 - 93.
97. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for:
displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device;
while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items:
in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
98. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device;
while displaying at least a portion of the message conversation, detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and in response to detecting the input that corresponds to the request to obtain the first one or more items:
in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and
in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
99. An electronic device, comprising:
a display;
one or more input devices;
means for displaying, on the display, a message object in a message conversation, wherein the message object includes an indication of a first one or more items sent from a participant in the conversation to a user of the electronic device;
means, while displaying at least a portion of the message conversation, for detecting, via the one or more input devices, an input that corresponds to a request to obtain the first one or more items; and means, in response to detecting the input that corresponds to the request to obtain the first one or more items, for:
in accordance with a determination that the electronic device is associated with an activated account that is authorized to obtain the first one or more items, proceeding to obtain the first one or more items; and in accordance with a determination that the electronic device is not associated with an activated account that is authorized to obtain the first one or more items, displaying, on the display, a second affordance for activating an account that is authorized to obtain the first one or more items.
100. A method, comprising:
at an electronic device with a display, a wireless transmission device, and one or more input devices:
receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device;
in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display:
a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account;
detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
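The account-switching behavior of item 100 (and the stack of non-selected accounts in items 113-115) can be modeled as a reorderable list in which index 0 is the first (selected) location. A minimal sketch; the class name and account labels are invented:

```python
class AccountPicker:
    """Illustrative model of item 100: the selected account occupies the
    first location; the remaining accounts sit behind it (item 114's stack)
    at the second location."""

    def __init__(self, accounts):
        self.accounts = list(accounts)  # index 0 = first location (selected)

    def selected(self):
        return self.accounts[0]

    def select(self, name):
        # Item 100: the chosen account replaces the representation at the
        # first location, and (item 115) the old front joins the stack.
        self.accounts.insert(0, self.accounts.pop(self.accounts.index(name)))
```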
101. The method of item 100, wherein the restricted credentials are stored in a secure element of the electronic device.
102. The method of any of items 100-101, wherein the restricted credentials are uniquely associated with a user of the electronic device.
103. The method of any of items 100 - 102, wherein the electronic device forgoes transmitting the restricted credentials to an external device unless user authentication has been successfully provided by a user of the electronic device.
104. The method of any of items 100 - 103, wherein the at least a portion of the representation of the second account is displayed after a predetermined amount of time has passed from displaying the representation of the first account.
105. The method of any of items 100 - 104, further comprising:
subsequent to selecting the second account for use in providing the restricted credentials, proceeding with providing the restricted credentials using the second account; and updating display of the usage metric for the second account to reflect the change in the usage metric caused by providing the restricted credentials using the second account.
106. The method of any of items 100 - 105, wherein the external device is a contactless terminal, and the received request to provide restricted credentials associated with the user of the device via the wireless transmission device to the external device is a signal from the contactless terminal.
107. The method of item 106, further comprising:
in accordance with a determination that the signal from the contactless terminal is detected for at least a second predetermined amount of time, proceeding with providing the restricted credentials using the first account; and in accordance with a determination that the signal from the contactless terminal is detected for less than the second predetermined amount of time, forgoing proceeding with providing the restricted credentials using the first account.
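Item 107 gates transmission on how long the contactless terminal's signal has been detected. A one-line predicate suffices as a sketch; the 500 ms threshold is an invented placeholder for the claimed "second predetermined amount of time":

```python
def should_provide_credentials(signal_ms: float, threshold_ms: float = 500.0) -> bool:
    """Item 107: provide the restricted credentials only when the terminal's
    signal is detected for at least a dwell threshold (500 ms is assumed)."""
    return signal_ms >= threshold_ms
```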
108. The method of any of items 100 - 107, wherein the received request to provide restricted credentials associated with the user of the device via the wireless transmission device to the external device is an input from the user of the device.
109. The method of any of items 100 - 108, wherein replacing display of the representation of the first account with the representation of the second account at the first location of the display includes:
displaying the entirety of the representation of the second account at the first location of the display; and displaying at least a portion of the representation of the first account at the second location of the display.
110. The method of any of items 100 - 109, further comprising:
concurrently displaying, on the display, at least a portion of a representation of a third account at a location adjacent to the second location of the display while maintaining display of the at least a portion of the representation of the second account at the second location;
detecting, via the one or more input devices, user selection of the representation of the third account; and
in response to detecting the user selection of the representation of the third account: replacing display of the representation of the first account with the representation of the third account at the first location of the display, and maintaining display of the at least a portion of the representation of the second account.
111. The method of any of items 100-110, wherein the usage metric for the second account is displayed after a third predetermined amount of time has passed from displaying the at least a portion of the representation of the second account.
112. The method of any of items 100-111, wherein the usage metric for the second account ceases to be displayed after a fourth predetermined amount of time has passed from first displaying the usage metric.
113. The method of any of items 100-112, wherein selected accounts are displayed at the first location and non-selected accounts are displayed at the second location.
114. The method of any of items 100-113, wherein a plurality of representations of non-selected accounts is displayed in a stack configuration at the second location.
115. The method of any of items 100-114, further comprising:
in response to detecting the user selection of the representation of the second account:
replacing display of the at least a portion of the representation of the second account with display of at least a portion of the representation of the first account.
116. The method of any of items 100 - 115, further comprising:
in response to detecting the user selection of the representation of the second account: replacing display of the at least a portion of the representation of the second account with display of at least a portion of the representation of the first account, and selecting the second account for use in providing the restricted credentials while maintaining selection of the first account for concurrent use in providing the restricted credentials.
117. The method of any of items 100 - 116, wherein the representation of the second account includes a distinguishing visual characteristic and representations of other accounts that are not the second account, including the representation of the first account, do not include the distinguishing visual characteristic.
118. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, a wireless transmission device, and one or more input devices, the one or more programs including instructions for performing the method of any of items 100 - 117.
119. An electronic device, comprising:
a display;
a wireless transmission device;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 100 - 117.
120. An electronic device, comprising:
a display;
a wireless transmission device;
one or more input devices; and means for performing the method of any of items 100-117.
121. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, a wireless transmission device, and one or more input devices, the one or more programs including instructions for:
receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device;
in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display:
a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account;
detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
122. An electronic device, comprising:
a display;
a wireless transmission device;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device;
in response to receiving the request to provide the restricted credentials, concurrently displaying, on the display:
a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account;
detecting, via the one or more input devices, user selection of the representation of the second account; and in response to detecting the user selection of the representation of the second account: replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
123. An electronic device, comprising:
a display;
a wireless transmission device;
one or more input devices;
means for receiving a request to provide restricted credentials associated with a user of the device via the wireless transmission device to an external device;
means, in response to receiving the request to provide the restricted credentials, for concurrently displaying, on the display:
a representation of a first account associated with first restricted credentials at a first location of the display, wherein the first account is selected for use in providing the restricted credentials, and at least a portion of a representation of a second account associated with second restricted credentials at a second location of the display, wherein display of at least the portion of the representation of the second account includes display of a usage metric for the second account;
means for detecting, via the one or more input devices, user selection of the representation of the second account; and means, in response to detecting the user selection of the representation of the second account, for:
replacing display of the representation of the first account with the representation of the second account at the first location of the display, and selecting the second account for use in providing the restricted credentials.
124. A method, comprising:
at an electronic device with a display and one or more input devices:
receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account:
in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
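Item 124's two branches amount to a split of the requested amount across the two accounts. A minimal sketch, with the dictionary keys chosen here for illustration:

```python
def plan_transfer(requested: int, first_balance: int) -> dict:
    """Item 124: cover the whole requested amount from the first resource
    account when it suffices; otherwise drain the first account and draw
    the remainder from the second."""
    if requested <= first_balance:
        return {"first": requested, "second": 0}
    return {"first": first_balance, "second": requested - first_balance}
```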
125. The method of item 124, further comprising:
prior to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, receiving an initiation input.
126. The method of any of items 124 - 125, further comprising:
prior to proceeding with the transfer of resources, displaying, on the display, an authentication user interface requesting authentication information;
receiving, via the one or more input devices, the authentication information, wherein automatically proceeding with the transfer of resources is in accordance with a determination that the received authentication information corresponds to enrolled authentication information for authorizing transfers; and in accordance with a determination that the received authentication information does not correspond to the enrolled authentication information for authorizing transfers, forgoing proceeding with the transfer of resources.
127. The method of any of items 124 - 126, further comprising:
subsequent to proceeding with the transfer of resources using the first resource account and the second resource account, displaying, on the display, a first representation associated with the first resource account and a second representation associated with the second resource account.
128. The method of any of items 124 - 127, wherein the resource is an amount of funds and the second resource account is a stored-value account containing stored funds.
129. The method of any of items 124 - 128, wherein the resource is an amount of funds, and the method further comprises, in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account:
in accordance with a determination that the second resource account is associated with a transaction fee, displaying, on the display, an indication that a transaction fee will be added to the transfer; and
in accordance with a determination that the second resource account is not associated with a transaction fee, forgoing displaying, on the display, the indication that a transaction fee will be added to the transfer.
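Item 129's conditional fee indication can be sketched as a function that yields a notice only when a fee applies; the wording and dollar formatting are invented for illustration:

```python
from typing import Optional


def transaction_fee_notice(fee: Optional[float]) -> Optional[str]:
    """Item 129: display a fee indication only when the second resource
    account carries a transaction fee; otherwise show nothing."""
    if fee:
        return f"A transaction fee of ${fee:.2f} will be added to the transfer."
    return None
```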
130. The method of any of items 124 - 129, wherein receiving the request to participate in the transfer of resources includes receiving a sequence of one or more inputs from the user to transmit the resources to another participant in a message conversation.
131. The method of any of items 124 - 130, wherein receiving the request to participate in the transfer of resources includes receiving information from an external source with information about a transaction and receiving a sequence of one or more inputs from the user to transmit resources selected based on the information from the external source.
132. The method of any of items 124 - 131, wherein receiving the request to participate in the transfer of resources includes receiving a sequence of one or more inputs from the user that authorizes transmission of restricted credentials to a nearby device via a short range wireless communication.
133. The method of any of items 124 - 132, further comprising:
in accordance with the determination that the requested resource amount is equal to or less than the amount of resources available via the first resource account:
displaying, on the display, an indication of the amount of resources available via the first resource account, and forgoing displaying a selectable representation of the second resource account; and in accordance with the determination that the requested resource amount is greater than the amount of resources available via the first resource account, displaying, on the display, the indication of the amount of resources available via the first resource account and the selectable representation of the second resource account.
134. The method of any of items 125 - 133, further comprising:
in response to receiving the initiation input:
concurrently displaying, on the display, a representation of the first resource account and a representation of the second resource account;
receiving user input for proceeding with the transfer of resources;
in response to receiving the user input for proceeding with the transfer of resources, displaying, on the display, an authentication user interface requesting authentication information for proceeding with the transfer of resources; and wherein receiving the request to participate in the transfer of resources includes receiving authentication information.
135. The method of any of items 125 - 134, further comprising:
prior to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, displaying, on the display, a message conversation of a messaging application between a plurality of participants; and in response to detecting the initiation input, concurrently displaying, on the display, a representation of the first resource account and a representation of the second resource account.
136. The method of any of items 125 - 135, wherein concurrently displaying, on the display, the representation of the first resource account and the representation of the second resource account includes displaying a transaction detail region that also includes additional information about the transaction and instructions for providing authorization information to authorize participation in the transaction.
137. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for performing the method of any of items 124 - 136.
138. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 124 - 136.
139. An electronic device, comprising:
a display;
one or more input devices; and means for performing the method of any of items 124 - 136.
140. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for:
receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account:
in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
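As an illustrative, non-normative sketch, the account-selection conditional recited in item 140 can be modeled as follows; the function and account names are hypothetical and not part of the claimed subject matter:

```python
def select_accounts(requested, primary_balance):
    """Return which resource accounts a transfer would draw on.

    Models the conditional of item 140: proceed using only the first
    (primary) resource account when it can cover the requested amount;
    otherwise proceed using both the first account and a second
    (backup) account different from the first.
    """
    if requested <= primary_balance:
        return ["primary"]
    return ["primary", "backup"]
```

For example, a request for 50 against a primary balance of 100 draws only on the primary account, while a request for 150 draws on both accounts.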
141. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account:
in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
142. An electronic device, comprising:
a display;
one or more input devices;
means for receiving a request to participate in a transfer of resources for a requested resource amount using a first resource account; and means, in response to receiving the request to participate in the transfer of resources for the requested resource amount using the first resource account, for:
in accordance with a determination that the requested resource amount is equal to or less than an amount of resources available via the first resource account, automatically proceeding with the transfer of resources using only the first resource account, and in accordance with a determination that the requested resource amount is greater than the amount of resources available via the first resource account, automatically proceeding with the transfer of resources using the first resource account and a second resource account different from the first resource account.
143. A method, comprising:
at an electronic device with a display:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
144. The method of item 143, wherein the representation of the first additional item includes a numerical representation of the first additional item.
145. The method of any of items 143 - 144, wherein the first additional item is a first transfer between the user of the electronic device and the first participant and the second additional item is a second transfer between the user of the electronic device and the second participant, and wherein the representation of the first additional item includes an indication of an amount of the first transfer and the representation of the second additional item includes an indication of an amount of the second transfer.
146. The method of item 145, wherein the first transfer is a transfer from the user of the electronic device to the first participant and the second transfer is a transfer from the user of the electronic device to the second participant.
147. The method of item 145, wherein the first transfer is a transfer request by the user of the electronic device to the first participant and the second transfer is a transfer request by the user of the electronic device to the second participant.
148. The method of any of items 145 - 147, wherein the representation of the first additional item includes a status indicator associated with the first transfer and an affordance for viewing additional details associated with the first transfer, the method further comprising:
detecting user activation of the affordance for viewing additional details associated with the first transfer;
in response to detecting the user activation of the affordance for viewing additional details associated with the first transfer, displaying, on the display, a details user interface, wherein the details user interface includes:
the first information from the first message in the first conversation of the electronic messages, an authorization affordance for authorizing the first transfer, and a cancel affordance for cancelling the first transfer.
149. The method of item 148, further comprising:
detecting user activation of the authorization affordance;
in response to detecting the user activation of the authorization affordance, displaying an authentication user interface for requesting authentication information;
receiving the authentication information;
in accordance with a determination that the received authentication information is consistent with enrolled authentication information for authorizing transactions:
authorizing the first transfer, and
updating display of the first message in the first conversation of electronic messages to indicate that the first transfer has been authorized; and in accordance with a determination that the received authentication information is not consistent with the enrolled authentication information for authorizing transactions, forgoing authorizing the first transfer.
150. The method of any of items 148 - 149, further comprising:
detecting user activation of the cancel affordance;
in response to detecting the user activation of the cancel affordance:
displaying, on the display, the first conversation of electronic messages, wherein the first conversation includes an indication that the first transfer has been canceled.
151. The method of any of items 143 - 150, wherein the first item corresponds to a pending payment transaction and the second item corresponds to a completed payment transaction.
152. The method of any of items 143 - 151, further comprising:
detecting an input at a location corresponding to the first item; and in response to detecting the input at the location corresponding to the first item:
in accordance with a determination that the location corresponds to the representation of the first additional item, displaying a first item-specific user interface; and in accordance with a determination that the location does not correspond to the representation of the first additional item, displaying a first participant-specific user interface.
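The input-routing determination of item 152 can be sketched, purely as an illustration with hypothetical names and one-dimensional bounds, as:

```python
def route_tap(tap_x, item_bounds, payment_bounds):
    """Decide which user interface a tap on a list item opens.

    Per item 152: a tap at a location corresponding to the
    representation of the additional item (e.g. the payment amount)
    opens an item-specific interface; a tap elsewhere on the item's row
    opens a participant-specific interface. Bounds are (start, end).
    """
    row_start, row_end = item_bounds
    pay_start, pay_end = payment_bounds
    if not (row_start <= tap_x <= row_end):
        return None  # tap landed outside the item entirely
    if pay_start <= tap_x <= pay_end:
        return "item-specific"
    return "participant-specific"
```

So a tap inside the payment region yields the item-specific interface, and a tap on the rest of the row yields the participant-specific interface.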
153. The method of item 152, wherein the first item corresponds to a pending payment transaction and the representation of the first additional item includes an indication of an amount of the pending payment transaction, the method further comprising:
receiving user input on the representation of the first additional item of the first item;
in response to receiving the user input, displaying, on the display, an authentication user interface requesting authentication information for authorizing the transaction.
154. The method of any of items 143 - 153, wherein the first item includes an indication of the first participant associated with the first item and an indication of a time associated with the first item.
155. The method of any of items 143 - 154, wherein the first item corresponds to a payment sent to the user by the first participant associated with the first item, and the first item includes an affordance for transferring an amount of the payment to an external account associated with the user.
156. The method of any of items 152 - 155, wherein the first participant-specific user interface includes contact information associated with the first participant and a list of one or more first participant-specific items, including the first item, associated with the first participant.
157. The method of any of items 152 - 156, wherein the first item-specific user interface includes:
a representation of content associated with the first item, an indication of the first participant, and an indication of a time associated with the first message.
158. The method of item 157, wherein the first item-specific user interface includes an annotation of text in the first message in the first conversation of electronic messages.
159. The method of any of items 157 - 158, wherein the first item-specific user interface includes an annotation of text from one or more messages that are adjacent to the first message in the first conversation of electronic messages.
160. The method of any of items 143 - 159, wherein the first item and the second item correspond to transactions made using a first payment account, the method further comprising:
prior to concurrently displaying, on the display, the first item and the second item, displaying, on the display, a representation of the first payment account;
receiving user selection of the representation of the first payment account; and in response to receiving the user selection of the representation of the first payment account, concurrently displaying, on the display, a list of items associated with the first payment account, wherein the list of items includes the first item and the second item.
161. The method of any of items 143 - 160, further comprising:
in accordance with a determination that the first item is associated with a transfer of an amount of funds from the user to the first participant associated with the first item, forgoing adding a directional indicator to a numerical representation of the amount of funds included in the first item; and in accordance with a determination that the first item is associated with a transfer of the amount of funds to the user from the first participant associated with the first item, adding the directional indicator to the numerical representation of the amount of funds included in the first item.
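Item 161's directional-indicator behavior can be sketched as follows; the choice of "+" as the indicator and the dollar formatting are assumptions for illustration only:

```python
def format_amount(amount, incoming):
    """Format a transfer amount per item 161.

    A transfer of funds *to* the user gets a directional indicator
    (here assumed to be a '+' prefix); a transfer *from* the user is
    shown without one.
    """
    text = f"${amount:.2f}"
    return f"+{text}" if incoming else text
```

An incoming transfer of 5 thus renders as "+$5.00", while an outgoing one renders as "$5.00".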
162. The method of any of items 143 - 161, wherein:
the first item includes a graphical indication of the first participant associated with the first item; and the second item includes a graphical indication of the second participant associated with the second item.
163. The method of any of items 143 - 162, wherein the first additional item is a video file and the second additional item is a photo.
164. The method of any of items 143 - 163, wherein the representation of the first additional item includes a thumbnail image of the first additional item.
165. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input
devices, the one or more programs including instructions for performing the method of any of items 143 - 164.
166. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 143 - 164.
167. An electronic device, comprising:
a display;
one or more input devices; and means for performing the method of any of items 143 - 164.
168. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs including instructions for:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
169. An electronic device, comprising:
a display;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
170. An electronic device, comprising:
a display;
means for receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
means for receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and means for concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
171. A method, comprising:
at an electronic device with one or more output devices including a display and one or more input devices:
receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation;
in response to receiving the utterance, preparing to perform the operation:
in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device:
a representation of the operation; and
instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation;
after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input:
in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input;
in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
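The three-way confirmation gate recited in item 171 reduces to the following sketch; names are hypothetical and the logic is only a model of the recited determinations:

```python
def handle_confirmation(requires_auth, authorized):
    """Model of item 171's response to a confirmation input.

    Perform the operation only when it either does not require
    authorization, or requires authorization and has already been
    authorized; otherwise forgo performing it.
    """
    if requires_auth and not authorized:
        return False  # forgo performing the operation
    return True       # perform the operation
```

This collapses the two "perform" branches (authorized, or no authorization required) into a single return, which is behaviorally equivalent to the recitation.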
172. The method of item 171, wherein the utterance from the user that corresponds to the request to perform the operation is received while the device is in an unlocked mode of operation.
173. The method of item 171, wherein the utterance from the user that corresponds to the request to perform the operation is received while the device is in a locked mode of operation.
174. The method of any of items 171 - 173, further comprising:
in accordance with the determination that the operation requires authorization and the operation has not been authorized, forgoing unlocking the device from the locked mode of operation to an unlocked mode of operation;
in accordance with the determination that the operation requires authorization and the operation has been authorized, unlocking the device from the locked mode of operation to the unlocked mode of operation; and in accordance with the determination that the operation does not require authorization, forgoing unlocking the device from the locked mode of operation to the unlocked mode of operation.
175. The method of any of items 171 - 174, wherein the operation includes sending a message to a message participant in a message conversation of a messaging application, and wherein the message includes an attached item.
176. The method of item 175, wherein the attached item is marked as requiring authorization.
177. The method of any of items 175 - 176, wherein the attached item is a payment object that represents a payment to the message participant.
178. The method of any of items 175 - 176, wherein the attached item is a request for payment by the user of the device from the message participant.
179. The method of any of items 175 - 178, wherein performing the operation in response to the confirmation input includes displaying, on the display, an indication that the message will be sent to the message participant in the message conversation of the messaging application.
180. The method of any of items 175 - 179, further comprising:
prior to performing the operation in response to the confirmation input, outputting a prompt to include a user-specified message along with the attached item.
181. The method of any of items 175 - 180, further comprising:
in accordance with a determination, based on the utterance from the user, that a graphical animation is to be associated with the message, requesting, via the one or more output devices, user selection of a graphical animation; and associating the selected graphical animation with the message prior to sending the message to the message participant.
182. The method of any of items 171 - 181, wherein presenting instructions for providing authorization to the device, via the one or more input devices of the device, to perform the
operation comprises displaying, on the display, an authorization user interface, wherein the authorization user interface includes a request for authentication information from the user of the device to authorize the operation.
183. The method of item 182, wherein the authentication information includes biometric authentication information.
184. The method of any of items 182 - 183, wherein the authorization user interface includes an indication of a resource account for use in performing the operation.
185. The method of any of items 171 - 184, wherein presenting, via the one or more output devices of the device, the representation of the operation and the instructions for providing the authorization to the device includes concurrently displaying, on the display:
the representation of the operation; and the instructions for providing the authorization to the device, via the one or more input devices of the device, to perform the operation.
186. The method of any of items 171 - 185, wherein presenting, via the one or more output devices of the device, the representation of the operation and the instructions for providing the authorization to the device includes:
outputting, via the one or more output devices, a description of the operation; and outputting, via the one or more output devices, audio instructions for providing authorization to the device to enable performing of the operation.
187. The method of any of items 171 - 186, further comprising:
in response to receiving the utterance, and prior to preparing to perform the operation, performing speech recognition on the utterance to determine a text representation of the utterance, wherein the operation is performed based on an analysis of the text representation of the utterance.
188. The method of any of items 171 - 187, wherein the analysis of the text representation of the utterance comprises performing natural language processing on the text representation of the utterance to determine an actionable intent.
189. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with one or more output devices including a display and one or more input devices, the one or more programs including instructions for performing the method of any of items 171 - 188.
190. An electronic device, comprising:
one or more output devices including a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 171 - 188.
191. An electronic device, comprising:
one or more output devices including a display;
one or more input devices; and means for performing the method of any of items 171 - 188.
192. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with one or more output devices including a display and one or more input devices, the one or more programs including instructions for:
receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation;
in response to receiving the utterance, preparing to perform the operation:
in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device:
a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation;
after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input:
in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input;
in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
193. An electronic device, comprising:
one or more output devices including a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation;
in response to receiving the utterance, preparing to perform the operation:
in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device:
a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation;
after preparing to perform the operation, receiving a confirmation input associated with performing the operation; and in response to receiving the confirmation input:
in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input;
in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
194. An electronic device, comprising:
one or more output devices, including a display;
one or more input devices;
means for receiving, via the one or more input devices, an utterance from a user that corresponds to a request to perform an operation;
means, responsive to receiving the utterance, preparing to perform the operation, for: in accordance with a determination that the operation requires authorization, preparing to perform the operation includes presenting, via the one or more output devices of the device:
a representation of the operation; and instructions for providing authorization to the device, via the one or more input devices of the device, to perform the operation;
means, after preparing to perform the operation, for receiving a confirmation input associated with performing the operation; and means, responsive to receiving the confirmation input, for:
in accordance with a determination that the operation requires authorization and the operation has not been authorized, forgoing performing the operation in response to the confirmation input;
in accordance with a determination that the operation requires authorization and the operation has been authorized, performing the operation in response to the confirmation input; and in accordance with a determination that the operation does not require authorization, performing the operation in response to the confirmation input.
195. A method, comprising:
at an electronic device with a display and one or more sensor devices:
while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object;
while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point;
in response to detecting the change in orientation of the device:
changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point;
in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
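The asymmetry in item 195, where amplitude is reduced only for qualifying movement toward the baseline orientation, can be sketched as follows. The sign convention for `delta_angle` and the linear update are illustrative assumptions, not the claimed behavior.

```python
# Hedged sketch of the orientation-dependent visual-effect amplitude:
# movement toward the baseline reduces the amplitude; movement away
# does not (and here grows it, in the spirit of item 199).

def update_amplitude(amplitude: float, delta_angle: float,
                     meets_criteria: bool) -> float:
    """delta_angle < 0 models movement toward the baseline orientation,
    delta_angle > 0 movement away from it (an assumed convention)."""
    if not meets_criteria:
        return amplitude                       # keep applying, unchanged
    if delta_angle < 0:                        # toward baseline
        return max(0.0, amplitude - abs(delta_angle))  # reduce amplitude
    return amplitude + delta_angle             # away: do not reduce
```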
196. The method of item 195, wherein reducing the amplitude of the visual effect comprises continuing to apply the visual effect to the user interface object.
197. The method of any of items 195 - 196, wherein reducing the amplitude of the visual effect comprises gradually decreasing the amplitude while the orientation of the device moves towards the baseline orientation.
198. The method of item 195, wherein reducing the amplitude of the visual effect comprises ceasing to apply the visual effect to the user interface object.
199. The method of any of items 195 - 198, wherein continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect comprises increasing the amplitude of the visual effect while the orientation of the device moves away from the baseline orientation.
200. The method of any of items 195 - 199, further comprising:
further in response to detecting the change in orientation of the device, in accordance with a determination that the device is at the baseline orientation, continuing to apply the visual effect to the user interface object.
201. The method of any of items 195 - 200, wherein the visual effect includes a coloring effect applied to at least a portion of the user interface object.
202. The method of item 201, wherein reducing the amplitude of the coloring effect includes reducing a saturation of a color of the coloring effect applied to at least the portion of the user interface object.
203. The method of item 201, wherein a color of the coloring effect applied to at least the portion of the user interface object changes from a first color to a second color different from the first color in response to a change in orientation of the device of at least a predefined angular distance.
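Items 201-203 together describe a coloring effect whose saturation tracks the effect's amplitude and whose color switches once the orientation change passes a predefined angular distance. A minimal sketch, in which the threshold angle, the two hues, and the HSV rendering are all assumed example values:

```python
import colorsys

# Illustrative coloring effect: saturation is reduced with amplitude
# (item 202), and the hue switches from a first color to a second once
# the orientation change reaches an assumed angular threshold (item 203).

COLOR_SWITCH_ANGLE = 30.0          # predefined angular distance (assumed, degrees)
FIRST_HUE, SECOND_HUE = 0.6, 0.1   # e.g. blue -> orange (assumed)

def coloring_effect(angle_moved: float, amplitude: float):
    """Return an (r, g, b) tint for the portion of the UI object."""
    hue = SECOND_HUE if angle_moved >= COLOR_SWITCH_ANGLE else FIRST_HUE
    saturation = max(0.0, min(1.0, amplitude))  # amplitude drives saturation
    return colorsys.hsv_to_rgb(hue, saturation, 1.0)
```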
204. The method of any of items 195 - 203, wherein the visual effect includes a geometry alteration effect applied to at least a portion of the user interface object.
205. The method of item 204, wherein the geometry alteration effect is a skewing effect, and wherein reducing the amplitude of the visual effect includes reducing an amount of skew of the geometry of the user interface object.
206. The method of item 204, wherein the geometry alteration effect is a simulated depth effect, and wherein reducing the amplitude of the visual effect includes reducing a simulated depth of the geometry of the user interface object.
207. The method of any of items 195 - 206, wherein the predetermined criteria include movement within a predefined angular distance from the baseline orientation.
208. The method of any of items 195 - 207, wherein:
the user interface object is displayed on a user interface item, wherein applying the visual effect to the user interface object includes:
applying a first magnitude of the visual effect to a first portion of the user interface object; and applying a second magnitude of the visual effect to a second portion of the user interface object.
209. The method of item 208, wherein the user interface item corresponds to a message object of a message conversation of a messaging application.
210. The method of item 209, wherein the visual effect that varies the set of one or more parameters is applied to the user interface object when a transfer associated with the message object corresponding to the user interface item is completed.
211. The method of item 208, wherein the user interface item corresponds to a graphical representation of an account.
212. The method of item 211, wherein the visual effect that varies the set of one or more parameters is applied to the user interface object when a transfer is completed using the account corresponding to the graphical representation corresponding to the user interface item.
213. The method of any of items 195 - 212, wherein changing the appearance of the user interface object is controlled by a first application that is integrated with an operating system of the device, and wherein the ability to change the appearance of a user interface object based on a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point is not available to applications that are not integrated with the operating system of the device.
214. The method of any of items 195 - 213, wherein detecting the change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point includes detecting a change in orientation of the device.
215. The method of any of items 195 - 214, wherein detecting the change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point includes detecting a change in orientation of a user relative to the device.
216. The method of any of items 195 - 215, further comprising:
in response to detecting the change in orientation of the device:
detecting, via the one or more sensor devices, that the device is at a second orientation relative to the baseline orientation, wherein the second orientation is at least a predefined limit angular distance from the baseline orientation; and in response to detecting that the device is at the second orientation relative to the baseline orientation, gradually ceasing to apply the visual effect to the user interface object.
217. The method of any of items 195 - 216, wherein the visual effect includes a brightness effect.
218. The method of any of items 195 - 217, wherein the electronic device further includes one or more tactile output generators, the method further comprising:
in response to detecting the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point, generating, via the one or more tactile output generators, a tactile output that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point.
219. The method of any of items 195 - 217, wherein the electronic device further includes one or more tactile output generators, the method further comprising:
in response to detecting the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point:
in accordance with a determination that the visual effect being applied to the user interface object exceeds a predefined amplitude limit, generating, via the one or more tactile output generators, a tactile output that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point; and in accordance with a determination that the visual effect being applied to the user interface object does not exceed the predefined amplitude limit, forgoing generating, via the one or more tactile output generators, the tactile output that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point.
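The gating in item 219, and the parameter scaling of items 221-222, can be sketched as below. The limit value and the linear scaling are placeholder assumptions.

```python
# Sketch: a tactile output accompanies the orientation change only while
# the visual effect exceeds a predefined amplitude limit (item 219), and
# a parameter of that output scales with movement velocity (item 221).
# Threshold and linear scaling are illustrative assumptions.

AMPLITUDE_LIMIT = 0.2  # predefined amplitude limit (assumed value)

def should_generate_tactile(visual_amplitude: float) -> bool:
    """True -> generate the tactile output; False -> forgo it."""
    return visual_amplitude > AMPLITUDE_LIMIT

def tactile_parameter(base: float, velocity: float) -> float:
    """A parameter (e.g. intensity) of the generated output, varied with
    the velocity of the device's movement (linear purely for illustration)."""
    return base * velocity
```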
220. The method of any of items 218 - 219, further comprising:
while no longer detecting a change in orientation of the device relative to the reference point:
ceasing to change the appearance of the user interface object, and
generating, via the one or more tactile output generators, the tactile output that is indicative of the change in orientation of the device from the first orientation relative to the reference point to the respective orientation relative to the reference point.
221. The method of any of items 218 - 220, wherein a parameter of the generated tactile output changes based on a velocity of the movement of the device.
222. The method of any of items 218 - 220, wherein a parameter of the generated tactile output changes based on an amount of movement of the device.
223. The method of any of items 218 - 222, further comprising:
detecting, via the one or more sensor devices, a ceasing of the change in orientation of the device; and in response to detecting the ceasing of the change in orientation of the device, gradually ceasing to generate the tactile output.
224. The method of any of items 218 - 223, wherein the generated tactile output is a repetition of two or more distinctive tactile output patterns including a first tactile output pattern and a second tactile output pattern, wherein the first tactile output pattern is different from the second tactile output pattern.
225. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for performing the method of any of items 195 - 224.
226. An electronic device, comprising:
a display;
one or more sensor devices;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of items 195 - 224.
227. An electronic device, comprising:
a display;
one or more sensor devices; and means for performing the method of any of items 195 - 224.
228. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more sensor devices, the one or more programs including instructions for:
while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object;
while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point;
in response to detecting the change in orientation of the device:
changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point;
in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
229. An electronic device, comprising:
a display;
one or more sensor devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
while the device is at a first orientation relative to a baseline orientation with respect to a reference point, displaying, on the display, a user interface object;
while displaying the user interface object, detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point;
in response to detecting the change in orientation of the device:
changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point;
in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
230. An electronic device, comprising:
a display;
one or more sensor devices;
means, while the device is at a first orientation relative to a baseline orientation with respect to a reference point, for displaying, on the display, a user interface object;
means, while displaying the user interface object, for detecting, via the one or more sensor devices, a change in orientation of the device from the first orientation relative to the reference point to a respective orientation relative to the reference point;
means, in response to detecting the change in orientation of the device, for: changing an appearance of the user interface object by applying a visual effect to the user interface object that varies a set of one or more parameters of the user interface object as the orientation of the device changes relative to the reference point;
in accordance with a determination that the change in orientation of the device includes movement, towards the baseline orientation, that meets predetermined criteria, reducing an amplitude of the visual effect; and in accordance with a determination that the change in orientation of the device includes movement, away from the baseline orientation, that meets the predetermined criteria, continuing to apply the visual effect to the user interface object without reducing the amplitude of the visual effect.
[1004] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical applications. Others skilled in the art are thereby enabled to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
[1005] Although the disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the claims.
[1006] As described above, one aspect of the present technology is the gathering and use of data available from various sources to improve the delivery to users of invitational content or any other content that may be of interest to them. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include
demographic data, location-based data, telephone numbers, email addresses, home addresses, or any other identifying information.
[1007] The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.
[1008] The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
[1009] Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of advertisement delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services. In another example, users can select not to provide location information
for targeted content delivery services. In yet another example, users can select to not provide precise location information, but permit the transfer of location zone information.
[1010] Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other nonpersonal information available to the content delivery services, or publicly available information.

Claims (28)

1. A method, comprising:
at an electronic device with a display:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
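The structure claim 1 recites, one displayable item per conversation, each pairing message information with a representation of the transferred additional item, can be sketched as a small data model. The field names are illustrative assumptions.

```python
from dataclasses import dataclass

# Sketch of claim 1's concurrently displayed items: each conversation
# contributes one item combining message information with a
# representation of the "additional item". Names are assumptions.

@dataclass
class TransferItem:
    participant: str
    message_info: str      # information taken from the associated message
    additional_item: str   # representation (e.g. an amount or thumbnail)

def items_for_display(conversations):
    """conversations: iterable of (participant, message_info,
    additional_item) tuples; returns the items to display together."""
    return [TransferItem(p, m, a) for p, m, a in conversations]
```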
2. The method of claim 1, wherein the representation of the first additional item includes a numerical representation of the first additional item.
3. The method of any of claims 1 - 2, wherein the first additional item is a first transfer between the user of the electronic device and the first participant and the second additional item is a second transfer between the user of the electronic device and the second participant, and wherein the representation of the first additional item includes an indication of an amount of the first transfer and the representation of the second additional item includes an indication of an amount of the second transfer.
4. The method of claim 3, wherein the first transfer is a transfer from the user of the electronic device to the first participant and the second transfer is a transfer from the user of the electronic device to the second participant.
5. The method of claim 3, wherein the first transfer is a transfer request by the user of the electronic device to the first participant and the second transfer is a transfer request by the user of the electronic device to the second participant.
6. The method of any of claims 3 - 5, wherein the representation of the first additional item includes a status indicator associated with the first transfer and an affordance for viewing additional details associated with the first transfer, the method further comprising:
detecting user activation of the affordance for viewing additional details associated with the first transfer;
in response to detecting the user activation of the affordance for viewing additional details associated with the first transfer, displaying, on the display, a details user interface, wherein the details user interface includes:
the first information from the first message in the first conversation of the electronic messages, an authorization affordance for authorizing the first transfer, and a cancel affordance for cancelling the first transfer.
7. The method of claim 6, further comprising:
detecting user activation of the authorization affordance;
in response to detecting the user activation of the authorization affordance, displaying an authentication user interface for requesting authentication information;
receiving the authentication information:
in accordance with a determination that the received authentication information is consistent with enrolled authentication information for authorizing transactions:
authorizing the first transfer, and
updating display of the first message in the first conversation of electronic messages to indicate that the first transfer has been authorized; and in accordance with a determination that the received authentication information is not consistent with the enrolled authentication information for authorizing transactions, forgoing authorizing the first transfer.
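The two-branch authentication check of claim 7 can be sketched as below. A plain equality comparison stands in for the unspecified consistency test, and the state dictionary keys are illustrative assumptions.

```python
# Sketch of claim 7's authentication branch: authorize the transfer and
# update the message display only when the received authentication
# information is consistent with the enrolled information; otherwise
# forgo authorizing. Equality is an assumed stand-in for "consistent".

def authorize_transfer(received: str, enrolled: str, state: dict) -> dict:
    if received == enrolled:                # consistent with enrolled info
        state["authorized"] = True
        state["message_label"] = "transfer authorized"  # updated display
    else:
        state["authorized"] = False         # forgo authorizing
    return state
```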
8. The method of any of claims 6 - 7, further comprising:
detecting user activation of the cancel affordance;
in response to detecting the user activation of the cancel affordance:
displaying, on the display, the first conversation of electronic messages, wherein the first conversation includes an indication that the first transfer has been canceled.
9. The method of any of claims 1 - 8, wherein the first item corresponds to a pending payment transaction and the second item corresponds to a completed payment transaction.
10. The method of any of claims 1 - 9, further comprising:
detecting an input at a location corresponding to the first item; and in response to detecting the input at the location corresponding to the first item:
in accordance with a determination that the location corresponds to the representation of the first additional item, displaying a first item-specific user interface; and in accordance with a determination that the location does not correspond to the representation of the first additional item, displaying a first participant-specific user interface.
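The location-dependent dispatch of claim 10 is essentially a two-level hit test. A sketch, in which the rectangle convention (x, y, width, height) and the returned labels are assumptions:

```python
# Sketch of claim 10: a tap landing on the representation of the
# additional item opens an item-specific interface; a tap elsewhere on
# the first item opens a participant-specific interface. Rectangle
# handling and labels are illustrative assumptions.

def handle_tap(x: float, y: float, item_rect, additional_rect) -> str:
    def inside(rect):
        rx, ry, rw, rh = rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    if inside(additional_rect):           # location on the representation
        return "item-specific user interface"
    if inside(item_rect):                 # elsewhere on the first item
        return "participant-specific user interface"
    return "no-op"
```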
11. The method of claim 10, wherein the first item corresponds to a pending payment transaction and the representation of the first additional item includes an indication of an amount of the pending payment transaction, the method further comprising:
receiving user input on the representation of the first additional item of the first item;
in response to receiving the user input, displaying, on the display, an authentication user interface requesting authentication information for authorizing the transaction.
12. The method of any of claims 1 - 11, wherein the first item includes an indication of the first participant associated with the first item and an indication of a time associated with the first item.
13. The method of any of claims 1 - 12, wherein the first item corresponds to a payment sent to the user by the first participant associated with the first item, and the first item includes an affordance for transferring an amount of the payment to an external account associated with the user.
14. The method of any of claims 10 - 13, wherein the first participant-specific user interface includes contact information associated with the first participant and a list of one or more first participant-specific items, including the first item, associated with the first participant.
15. The method of any of claims 10 - 14, wherein the first item-specific user interface includes:
a representation of content associated with the first item, an indication of the first participant, and an indication of a time associated with the first message.
16. The method of claim 15, wherein the first item-specific user interface includes an annotation of text in the first message in the first conversation of electronic messages.
17. The method of any of claims 15 - 16, wherein the first item-specific user interface includes an annotation of text from one or more messages that are adjacent to the first message in the first conversation of electronic messages.
18. The method of any of claims 1 - 17, wherein the first item and the second item correspond to transactions made using a first payment account, the method further comprising:
prior to concurrently displaying, on the display, the first item and the second item, displaying, on the display, a representation of the first payment account;
receiving user selection of the representation of the first payment account; and
in response to receiving the user selection of the representation of the first payment account, concurrently displaying, on the display, a list of items associated with the first payment account, wherein the list of items includes the first item and the second item.
19. The method of any of claims 1 - 18, further comprising:
in accordance with a determination that the first item is associated with a transfer of an amount of funds from the user to the first participant associated with the first item, forgoing adding a directional indicator to a numerical representation of the amount of funds included in the first item; and in accordance with a determination that the first item is associated with a transfer of the amount of funds to the user from the first participant associated with the first item, adding the directional indicator to the numerical representation of the amount of funds included in the first item.
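Claim 19's directional-indicator rule, no indicator on outgoing amounts, an indicator on incoming ones, reduces to a one-branch formatter. The "+" prefix and dollar formatting are assumed renderings, not claimed specifics.

```python
# Sketch of claim 19: forgo a directional indicator for funds sent by
# the user; add one for funds received. The "+" prefix is an assumption.

def format_amount(amount: float, incoming: bool) -> str:
    text = f"${amount:.2f}"
    return f"+{text}" if incoming else text  # indicator only when incoming
```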
20. The method of any of claims 1 - 19, wherein:
the first item includes a graphical indication of the first participant associated with the first item; and the second item includes a graphical indication of the second participant associated with the second item.
21. The method of any of claims 1 - 20, wherein the first additional item is a video file and the second additional item is a photo.
22. The method of any of claims 1 - 21, wherein the representation of the first additional item includes a thumbnail image of the first additional item.
23. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for performing the method of any of claims 1 - 22.
24. An electronic device, comprising:
a display;
one or more input devices;
one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of claims 1 - 22.
25. An electronic device, comprising:
a display;
one or more input devices; and means for performing the method of any of claims 1 - 22.
26. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs including instructions for:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
27. An electronic device, comprising:
a display;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and
concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and
a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
28. An electronic device, comprising:
a display;
means for receiving one or more messages in a first conversation of electronic messages that includes messages from a user of the electronic device to a first participant and messages from the first participant to the user of the electronic device, the one or more messages in the first conversation including a first message that is associated with the transfer of a first additional item;
means for receiving one or more messages in a second conversation of electronic messages that includes messages from the user of the electronic device to a second participant and messages from the second participant to the user of the electronic device, the one or more messages in the second conversation including a second message that is associated with the transfer of a second additional item; and
means for concurrently displaying, on the display:
a first item associated with the first participant, wherein the first item includes first information from the first message in the first conversation of electronic messages and a representation of the first additional item; and
a second item associated with the second participant, wherein the second item includes second information from the second message in the second conversation of electronic messages and a representation of the second additional item.
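The concurrent-display behavior recited in claims 26 - 28 can be illustrated with a minimal sketch: for each conversation, the message associated with a transferred additional item is located, and a display item is built that pairs information from that message with a representation of the item. Python is used purely for illustration; all class, field, and function names here are hypothetical and do not appear in the application.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Message:
    sender: str                              # "user" or the participant's name
    text: str
    additional_item: Optional[str] = None    # e.g. a transferred payment or attachment

@dataclass
class Conversation:
    participant: str
    messages: List[Message]

def build_display_items(conversations: List[Conversation]) -> List[dict]:
    """For each conversation, find the message that carries an additional
    item and produce one display item combining the participant, the
    message's information, and a representation of the additional item.
    The resulting items would then be displayed concurrently."""
    items = []
    for convo in conversations:
        for msg in convo.messages:
            if msg.additional_item is not None:
                items.append({
                    "participant": convo.participant,
                    "information": msg.text,
                    "representation": f"[{msg.additional_item}]",
                })
                break  # one display item per conversation
    return items
```

A caller would pass the first and second conversations together, obtaining a first item (first participant, first message information, first representation) and a second item (second participant, second message information, second representation) for concurrent display.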
DKPA201770505A 2017-05-16 2017-06-26 User interfaces for peer-to-peer transfers DK201770505A1 (en)

Priority Applications (26)

Application Number Priority Date Filing Date Title
CN202210639919.XA CN114936856A (en) 2017-05-16 2018-05-16 User interface for peer-to-peer transmission
AU2018269512A AU2018269512B2 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
JP2019572834A JP6983261B2 (en) 2017-05-16 2018-05-16 User interface for peer-to-peer transfer
KR1020227019902A KR102495947B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
KR1020217035417A KR102372228B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
KR1020247004706A KR20240023212A (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
KR1020237036172A KR102636696B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
KR1020197033768A KR102154850B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transmissions
KR1020227007288A KR102409769B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
CN202010174749.3A CN111490926B (en) 2017-05-16 2018-05-16 Method for operating electronic device, computer-readable storage medium, and electronic device
CN202210023470.4A CN114363278B (en) 2017-05-16 2018-05-16 Method for messaging, electronic device, and computer-readable storage medium
KR1020237003678A KR102594156B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
EP18730556.0A EP3586481B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
CN202011206499.3A CN112150133B (en) 2017-05-16 2018-05-16 User interface for peer-to-peer transmission
PCT/US2018/033054 WO2018213508A1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
KR1020217011434A KR102321894B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
CN201880048209.1A CN110999228A (en) 2017-05-16 2018-05-16 User interface for peer-to-peer transmission
EP20204436.8A EP3800837B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
EP23190272.7A EP4250679A3 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
CN202310634790.8A CN116521302A (en) 2017-05-16 2018-05-16 User interface for peer-to-peer transmission
KR1020207025711A KR102243500B1 (en) 2017-05-16 2018-05-16 User interfaces for peer-to-peer transfers
AU2020202953A AU2020202953B2 (en) 2017-05-16 2020-05-04 User interfaces for peer-to-peer transfers
JP2021157213A JP2022000802A (en) 2017-05-16 2021-09-27 User interface for peer-to-peer transfer
AU2021290334A AU2021290334B2 (en) 2017-05-16 2021-12-23 User interfaces for peer-to-peer transfers
AU2023203197A AU2023203197B2 (en) 2017-05-16 2023-05-22 User interfaces for peer-to-peer transfers
JP2023138172A JP2023169179A (en) 2017-05-16 2023-08-28 User interface for peer-to-peer transfer

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762507161P 2017-05-16 2017-05-16
US62/507,161 2017-05-16
US201762514945P 2017-06-04 2017-06-04
US62/514,945 2017-06-04

Publications (1)

Publication Number Publication Date
DK201770505A1 true DK201770505A1 (en) 2019-01-25

Family

ID=69140696

Family Applications (3)

Application Number Title Priority Date Filing Date
DKPA201770505A DK201770505A1 (en) 2017-05-16 2017-06-26 User interfaces for peer-to-peer transfers
DKPA201770502A DK180636B1 (en) 2017-05-16 2017-06-26 USER INTERFACES FOR PEER-TO-PEER TRANSFERS
DKPA201770503A DK180093B1 (en) 2017-05-16 2017-06-26 USER INTERFACES FOR PEER-TO-PEER TRANSFER

Family Applications After (2)

Application Number Title Priority Date Filing Date
DKPA201770502A DK180636B1 (en) 2017-05-16 2017-06-26 USER INTERFACES FOR PEER-TO-PEER TRANSFERS
DKPA201770503A DK180093B1 (en) 2017-05-16 2017-06-26 USER INTERFACES FOR PEER-TO-PEER TRANSFER

Country Status (1)

Country Link
DK (3) DK201770505A1 (en)

Also Published As

Publication number Publication date
DK180636B1 (en) 2021-11-04
DK201770503A1 (en) 2019-01-29
DK201770502A1 (en) 2019-01-29
DK180093B1 (en) 2020-04-23

Similar Documents

Publication Publication Date Title
AU2023203197B2 (en) User interfaces for peer-to-peer transfers
US11221744B2 (en) User interfaces for peer-to-peer transfers
WO2018213508A1 (en) User interfaces for peer-to-peer transfers
AU2020202953B2 (en) User interfaces for peer-to-peer transfers
DK180636B1 (en) USER INTERFACES FOR PEER-TO-PEER TRANSFERS

Legal Events

Date Code Title Description
PAT Application published

Effective date: 20181117

PHB Application deemed withdrawn due to non-payment or other reasons

Effective date: 20220101