DK180170B1 - Devices, procedures, and graphical messaging user interfaces - Google Patents

Devices, procedures, and graphical messaging user interfaces

Info

Publication number
DK180170B1
DK180170B1 (application DKPA201670653A)
Authority
DK
Denmark
Prior art keywords
touch
contact
user
menu
avatar
Prior art date
Application number
DKPA201670653A
Other languages
Danish (da)
Inventor
Imran A. Chaudhri
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Application filed by Apple Inc
Publication of DK201670653A1
Application granted
Publication of DK180170B1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device displays a messaging user interface of a messaging application on a display, the messaging user interface including a conversation transcript of a messaging session between a user of the electronic device and a plurality of other users, a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of avatars, with a first avatar in the plurality of avatars displayed on the top of the stack of avatars. While displaying the messaging user interface, the device detects an input by a first contact on the touch-sensitive surface while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar. In response to detecting the input by the first contact: in accordance with a determination that the input meets menu-activation criteria, wherein the menu-activation criteria require that a characteristic intensity of the contact on the touch-sensitive surface meet a respective intensity threshold in order for the menu-activation criteria to be met, the device displays a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface; and, in accordance with a determination that the input meets avatar-spreading criteria, wherein the avatar-spreading criteria do not require that a characteristic intensity of the contact on the touchscreen meet the respective intensity threshold in order for the selection criteria to be met, the device displays the plurality of avatars in an array.

Description

Devices, Methods, and Graphical User Interfaces for Messaging
RELATED APPLICATIONS
[0001] This application claims priority to: (1) U.S. Provisional Application Ser. No. 62/349,114, filed June 12, 2016, entitled “Devices, Methods, and Graphical User Interfaces for Messaging”; (2) U.S. Provisional Application Ser. No. 62/339,078, filed May 19, 2016, entitled “Devices, Methods, and Graphical User Interfaces for Messaging”; and (3) U.S. Provisional Application Ser. No. 62/338,502, filed May 18, 2016, entitled “Devices, Methods, and Graphical User Interfaces for Messaging”.
TECHNICAL FIELD
[0002] This relates generally to electronic devices with touch-sensitive surfaces, including but not limited to electronic devices with touch-sensitive surfaces that send and receive messages, such as instant messages.
BACKGROUND
[0003] The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly in recent years. Exemplary touch-sensitive surfaces include touchpads and touch-screen displays. Such devices are often used to send messages, such as instant messages, between users using messaging applications.
[0004] But current messaging applications have numerous drawbacks and limitations. For example, they are limited in their ability to easily: acknowledge messages; edit previously sent messages; express what a user is trying to communicate; display private messages; synchronize viewing of content between users; incorporate handwritten inputs; quickly locate content in a message transcript; integrate a camera; integrate search and sharing; integrate interactive applications; integrate stickers; make payments; interact with avatars; and make suggestions.
[0005] US2010/0153844 A1 shows a method performed by a user device including displaying a graphical user interface (GUI) on a touch screen, where the GUI includes a stack having a plurality of items.
SUMMARY
[0006] Accordingly, there is a need for electronic devices with improved methods and interfaces for messaging. Such methods and interfaces optionally complement or replace conventional methods for messaging. Such methods and interfaces change the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
[0007] The above deficiencies and other problems associated with user interfaces for electronic devices with touch-sensitive surfaces are reduced or eliminated by the disclosed devices. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device is a personal electronic device (e.g., a wearable electronic device, such as a watch). In some embodiments, the device has a touchpad. In some embodiments, the device has a touch-sensitive display (also known as a “touch screen” or “touch-screen display”). In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, the functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are, optionally, included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors.
[0008] The present invention is disclosed by the subject-matter of the independent claims. One aspect of the present invention is a method as defined in independent claim 1. Other aspects of the invention are an information processing apparatus as defined in independent claim 9, computer readable storage medium and electronic device as defined in claims 17 and 18, respectively. Further aspects of the invention are the subject of the
dependent claims. Any reference throughout this disclosure to an embodiment may point to alternative aspects relating to the invention, which are not necessarily embodiments encompassed by the claims, rather examples and technical descriptions forming background art or examples useful for understanding the invention. The scope of the present invention is defined by the claims.
[0009] There is a need for electronic devices with improved methods and interfaces for selectively activating menus in a messaging session. Such methods and interfaces may complement or replace conventional methods for selectively activating menus in a messaging session. Such methods and interfaces reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface.
[0010] In accordance with some embodiments, a method is performed at an electronic device with one or more processors, memory, a touch-sensitive surface, one or more sensors to detect intensities of contacts with the touch-sensitive surface, and a display. The device displays a messaging user interface of a messaging application on the display, the messaging user interface including a conversation transcript of a messaging session between a user of the electronic device and a plurality of other users, a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of avatars, with a first avatar in the plurality of avatars displayed on the top of the stack of avatars. While displaying the messaging user interface, the device detects an input by a first contact on the touch-sensitive surface while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar. In response to detecting the input by the first contact: in accordance with a determination that the input meets menu-activation-criteria, wherein the menu-activation-criteria require that a characteristic intensity of the contact on the touch-sensitive surface meet a respective intensity threshold in order for the menu-activation criteria to be met, the device displays a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface. In accordance with a determination that the input meets avatar-spreading-criteria, wherein the avatar-spreading-criteria do not require that a characteristic intensity of the contact on the touchscreen meet the respective intensity threshold in order for the selection criteria to be met, the device displays the plurality of avatars in an array.
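The branch in paragraph [0010] reduces to a single comparison of the contact's characteristic intensity against the respective intensity threshold. The following Swift sketch illustrates that decision logic only; the type, function, and parameter names are illustrative assumptions, not the disclosed implementation.

    import UIKit

    // A minimal sketch of the decision logic described in paragraph [0010].
    enum AvatarStackResponse {
        case displayMenu   // menu-activation criteria met
        case spreadAvatars // avatar-spreading criteria met (no intensity requirement)
    }

    func respond(toAvatarContactWith intensity: CGFloat,
                 intensityThreshold: CGFloat) -> AvatarStackResponse {
        // Menu-activation criteria: the contact's characteristic intensity
        // must meet the respective intensity threshold.
        if intensity >= intensityThreshold {
            return .displayMenu
        }
        // Avatar-spreading criteria: no intensity requirement, so any other
        // qualifying input on the first avatar spreads the stack into an array.
        return .spreadAvatars
    }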
[0011] In accordance with some embodiments, there is an electronic device, including a display unit configured to display user interfaces; a touch-sensitive surface unit configured to detect contacts; and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit enables display of a messaging user interface of a messaging application on the display unit, the messaging user interface including a conversation transcript of a messaging session between a user of the electronic device and a plurality of other users, a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of avatars, with a first avatar in the plurality of avatars displayed on the top of the stack of avatars. While displaying the messaging user interface, the processing unit detects an input by a first contact on the touch-sensitive surface unit while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar. In response to detecting the input by the first contact: in accordance with a determination that the input meets menu-activation-criteria, wherein the menu-activation-criteria require that a characteristic intensity of the contact on the touch-sensitive surface unit meet a respective intensity threshold in order for the menu-activation criteria to be met, the processing unit enables display of a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface. In accordance with a determination that the input meets avatar-spreading-criteria, wherein the avatar-spreading-criteria do not require that a characteristic intensity of the contact on the touchscreen meet the respective intensity threshold in order for the selection criteria to be met, the processing unit enables display of the plurality of avatars in an array.
[0012] Thus, electronic devices with displays and touch-sensitive surfaces are provided with improved methods and interfaces for selectively activating menus in a messaging session, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for selectively activating menus in a messaging session.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the
following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0014] Figure 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
[0015] Figure 1B is a block diagram illustrating example components for event handling in accordance with some embodiments.
[0016] Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
[0017] Figure 3 is a block diagram of an example multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
[0018] Figure 4A illustrates an example user interface for a menu of applications on a portable multifunction device in accordance with some embodiments.
[0019] Figure 4B illustrates an example user interface for a multifunction device with a touch-sensitive surface that is separate from the display in accordance with some embodiments.
[0020] Figures 4C-4E illustrate examples of dynamic intensity thresholds in accordance with some embodiments.
[0021] Figures 5A-5K illustrate exemplary user interfaces for displaying message transcripts and message acknowledgments.
[0022] Figures 5L-5T illustrate exemplary user interfaces for editing previously sent messages while displaying a message transcript.
[0023] Figures 5U-5BF illustrate exemplary user interfaces for applying an impact effect option to a message input or message region.
[0024] Figures 5BG-5CA illustrate exemplary user interfaces for interacting with concealed messages.
[0025] Figures 5CB-5CW illustrate exemplary user interfaces for triggering enhanced message content and applying an effect to a messaging user interface when a message includes an enhanced message content trigger.
[0026] Figures 5CX-5DC illustrate exemplary user interfaces for detecting and responding to combinable content in separate messages.
[0027] Figures 5DD-5DI illustrate exemplary user interfaces for selecting a message region type or shape.
[0028] Figures 5DJ-5DQ illustrate exemplary user interfaces for displaying and selecting automatically suggested emoji while composing a message.
[0029] Figures 40A-40W illustrate exemplary user interfaces for interacting with other users of a messaging transcript through an avatar in accordance with some embodiments.
[0030] Figures 41A-41H illustrate exemplary user interfaces for integrating data detectors into a messaging application in accordance with some embodiments.
[0031] Figures 68A-68B are flow diagrams illustrating a method 6800 of interacting with a single user included in a group messaging session in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0032] The methods, devices, and GUIs described herein improve messaging in multiple ways. For example, they make it easier to: acknowledge messages; edit previously sent messages; express what a user is trying to communicate; display private messages; synchronize viewing of content between users; incorporate handwritten inputs; quickly locate content in a message transcript; integrate a camera; integrate search and sharing; integrate interactive applications; integrate stickers; make payments; interact with avatars; and make suggestions.
EXAMPLE DEVICES
[0033] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks
have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0034] It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless the context clearly indicates otherwise.
[0035] The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0036] As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
[0037] Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Example embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad®
devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
[0038] In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.
[0039] In addition to a messaging application, the device typically supports a variety of other applications, such as one or more of the following: a note taking application, a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
[0040] The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
[0041] Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Figure 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display system 112 is sometimes called a “touch screen” for convenience, and is sometimes simply called a touch-sensitive display. Device 100 includes memory 102 (which optionally includes one or more computer readable storage mediums), memory controller 122, one or more processing units (CPUs) 120, peripherals interface 118, RF
circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 includes one or more intensity sensors 165 for detecting intensities of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
[0042] As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to
physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user. Using tactile outputs to provide haptic feedback to a user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
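As an aside on how click-like tactile outputs are produced in practice, the sketch below uses UIKit's public feedback-generator API; this is an illustrative example, not the mechanism claimed in this disclosure, and the choice of style is an assumption.

    import UIKit

    // Illustrative only: UIKit's public feedback-generator API drives the
    // device's tactile output hardware to produce a click-like sensation of
    // the sort described in paragraph [0042]. The .medium style is an
    // arbitrary choice for this sketch.
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.prepare()        // wake the haptic hardware to reduce latency
    generator.impactOccurred() // emit a single, click-like tactile output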
[0043] It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in Figure 1A are implemented in hardware, software, firmware, or a combination thereof, including one or more signal processing and/or application specific integrated circuits.
[0044] Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU(s) 120 and the peripherals interface 118, is, optionally, controlled by memory controller 122.
[0045] Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU(s) 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
[0046] In some embodiments, peripherals interface 118, CPU(s) 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
[0047] RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal
processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication optionally uses any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE
802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
[0048] Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, Figure 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a
headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
[0049] I/O subsystem 106 couples input/output peripherals on device 100, such as touch-sensitive display system 112 and other input or control devices 116, with peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled with any (or none) of the following: a keyboard, infrared port, USB port, stylus, and/or a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206, Figure 2).
[0050] Touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user interface objects. As used herein, the term “affordance” refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, hyperlink, or other user interface control.
[0051] Touch-sensitive display system 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and converts the detected contact into
interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch-sensitive display system 112. In an example embodiment, a point of contact between touch-sensitive display system 112 and the user corresponds to a finger of the user or a stylus.
[0052] Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In an example embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, California.
[0053] Touch-sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen video resolution is in excess of 400 dpi (e.g., 500 dpi, 800 dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
[0054] In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
[0055] Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
[0056] Device 100 optionally also includes one or more optical sensors 164. Figure 1A shows an optical sensor coupled with optical sensor controller 158 in I/O subsystem
106. Optical sensor(s) 164 optionally include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. Optical sensor(s) 164 receive light from the environment, projected through one or more lenses, and convert the light to data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor(s) 164 optionally capture still images and/or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch-sensitive display system 112 on the front of the device, so that the touch screen is enabled for use as a viewfinder for still and/or video image acquisition. In some embodiments, another optical sensor is located on the front of the device so that the user's image is obtained (e.g., for selfies, for videoconferencing while the user views the other video conference participants on the touch screen, etc.).
[0057] Device 100 optionally also includes one or more contact intensity sensors
165. Figure 1A shows a contact intensity sensor coupled with intensity sensor controller 159 in I/O subsystem 106. Contact intensity sensor(s) 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface). Contact intensity sensor(s) 165 receive contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch-screen display system 112 which is located on the front of device 100.
[0058] Device 100 optionally also includes one or more proximity sensors 166. Figure 1A shows proximity sensor 166 coupled with peripherals interface 118. Alternately, proximity sensor 166 is coupled with input controller 160 in I/O subsystem 106. In some embodiments, the proximity sensor turns off and disables touch-sensitive display system 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
[0059] Device 100 optionally also includes one or more tactile output generators
167. Figure 1A shows a tactile output generator coupled with haptic feedback controller 161 in I/O subsystem 106. Tactile output generator(s) 167 optionally include one or more electroacoustic devices such as speakers or other audio components and/or electromechanical devices that convert energy into linear motion such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile output generating component (e.g., a component that converts electrical signals into tactile outputs on the device). Tactile output generator(s) 167 receive tactile feedback generation instructions from haptic feedback module 133 and generates tactile outputs on device 100 that are capable of being sensed by a user of device 100. In some embodiments, at least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of a surface of device 100) or laterally (e.g., back and forth in the same plane as a surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the back of device 100, opposite touch-sensitive display system 112, which is located on the front of device 100.
[0060] Device 100 optionally also includes one or more accelerometers 168. Figure 1A shows accelerometer 168 coupled with peripherals interface 118. Alternately, accelerometer 168 is, optionally, coupled with an input controller 160 in I/O subsystem 106. In some embodiments, information is displayed on the touch-screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100.
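A minimal Core Motion sketch of the portrait/landscape inference described in paragraph [0060], assuming the device is roughly still so that gravity dominates the accelerometer reading; the update interval and the axis comparison are illustrative choices, not values from this disclosure.

    import CoreMotion

    // Sketch: infer portrait vs. landscape from accelerometer data, assuming
    // the device is roughly still so gravity dominates the measured
    // acceleration. The update interval is an illustrative choice.
    let motionManager = CMMotionManager()
    motionManager.accelerometerUpdateInterval = 0.1 // seconds

    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Gravity mostly along the y-axis: device upright (portrait);
        // mostly along the x-axis: device on its side (landscape).
        print(abs(a.x) > abs(a.y) ? "landscape" : "portrait")
    }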
[0061] In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in Figures 1A and 3. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch-sensitive display system 112; sensor state, including information obtained from the device's various sensors and other input or control devices 116; and location and/or positional information concerning the device's location and/or attitude.
[0062] Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
[0063] Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with the Lightning connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California.
[0064] Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive
devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
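The speed, velocity, and acceleration determinations in paragraph [0064] amount to finite differences over timestamped contact samples. A sketch follows, with illustrative types, since the module's internal representation of contact data is not disclosed.

    import Foundation
    import CoreGraphics

    // Illustrative types only: the disclosure does not specify how contact
    // data is represented, just that movement, velocity, and acceleration
    // are derived from a series of contact samples.
    struct ContactSample {
        let position: CGPoint
        let timestamp: TimeInterval
    }

    // Velocity between two samples: displacement over elapsed time
    // (magnitude and direction).
    func velocity(from a: ContactSample, to b: ContactSample) -> CGVector {
        let dt = max(b.timestamp - a.timestamp, 1e-9) // guard against zero dt
        return CGVector(dx: (b.position.x - a.position.x) / CGFloat(dt),
                        dy: (b.position.y - a.position.y) / CGFloat(dt))
    }

    // Speed is the magnitude of the velocity vector; acceleration would be
    // the analogous finite difference of two successive velocities.
    func speed(_ v: CGVector) -> CGFloat {
        (v.dx * v.dx + v.dy * v.dy).squareRoot()
    }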
[0065] Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event. Similarly, tap, swipe, drag, and other gestures are optionally detected for a stylus by detecting a particular contact pattern for the stylus.
[0066] In some embodiments, detecting a finger tap gesture depends on the length of time between detecting the finger-down event and the finger-up event, but is independent of the intensity of the finger contact between detecting the finger-down event and the finger-up event. In some embodiments, a tap gesture is detected in accordance with a determination that the length of time between the finger-down event and the finger-up event is less than a predetermined value (e.g., less than 0.1, 0.2, 0.3, 0.4 or 0.5 seconds), independent of whether the intensity of the finger contact during the tap meets a given intensity threshold (greater than a nominal contact-detection intensity threshold), such as a light press or deep press intensity threshold. Thus, a finger tap gesture can satisfy particular input criteria that do not require that the characteristic intensity of a contact satisfy a given intensity threshold in order for the particular input criteria to be met. For clarity, the finger contact in a tap gesture typically needs to satisfy a nominal contact-detection intensity threshold, below which the contact is not detected, in order for the finger-down event to be detected. A similar analysis applies to detecting a tap gesture by a stylus or other contact. In cases where the device is capable of detecting a finger or stylus contact hovering over a touch-sensitive surface, the nominal contact-detection intensity threshold optionally does not correspond to physical contact between the finger or stylus and the touch-sensitive surface.
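Expressed as code, the tap criteria of paragraph [0066] test only timing (and, per paragraph [0065], that the contact stays at substantially the same position); no characteristic-intensity comparison appears. The threshold values below are illustrative picks from the ranges the text permits.

    import Foundation
    import CoreGraphics

    // Sketch of intensity-independent tap criteria: no characteristic
    // intensity appears anywhere in the test. Both thresholds are
    // illustrative values, not values fixed by the disclosure.
    struct TapCriteria {
        var maxDuration: TimeInterval = 0.3 // the text allows e.g. 0.1-0.5 s
        var maxMovement: CGFloat = 10       // "substantially the same position"

        func isSatisfied(duration: TimeInterval, movement: CGFloat) -> Bool {
            duration < maxDuration && movement <= maxMovement
        }
    }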
[0067] The same concepts apply in an analogous manner to other types of gestures. For example, a swipe gesture, a pinch gesture, a depinch gesture, and/or a long press gesture are optionally detected based on the satisfaction of criteria that are either independent of intensities of contacts included in the gesture, or do not require that contact(s) that perform the gesture reach intensity thresholds in order to be recognized. For example, a swipe gesture is detected based on an amount of movement of one or more contacts; a pinch gesture is detected based on movement of two or more contacts towards each other; a depinch gesture is detected based on movement of two or more contacts away from each other; and a long press gesture is detected based on a duration of the contact on the touch-sensitive surface with less than a threshold amount of movement. As such, the statement that particular gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met means that the particular gesture recognition criteria are capable of being satisfied if the contact(s) in the gesture do not reach the respective intensity threshold, and are also capable of being satisfied in circumstances where one or more of the contacts in the gesture do reach or exceed the respective intensity threshold. In some embodiments, a tap gesture is detected based on a determination that the finger-down and finger-up event are detected within a predefined time period, without regard to whether the contact is above or below the respective intensity threshold during the predefined time period, and a swipe gesture is detected based on a determination that the contact movement is greater than a predefined
magnitude, even if the contact is above the respective intensity threshold at the end of the contact movement. Even in implementations where detection of a gesture is influenced by the intensities of contacts performing the gesture (e.g., the device detects a long press more quickly when the intensity of the contact is above an intensity threshold or delays detection of a tap input when the intensity of the contact is higher), the detection of those gestures does not require that the contacts reach a particular intensity threshold so long as the criteria for recognizing the gesture can be met in circumstances where the contact does not reach the particular intensity threshold (e.g., even if the amount of time that it takes to recognize the gesture changes).
[0068] Contact intensity thresholds, duration thresholds, and movement thresholds are, in some circumstances, combined in a variety of different combinations in order to create heuristics for distinguishing two or more different gestures directed to the same input element or region so that multiple different interactions with the same input element are enabled to provide a richer set of user interactions and responses. The statement that a particular set of gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met does not preclude the concurrent evaluation of other intensity-dependent gesture recognition criteria to identify other gestures that do have a criteria that is met when a gesture includes a contact with an intensity above the respective intensity threshold. For example, in some circumstances, first gesture recognition criteria for a first gesture — which do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the first gesture recognition criteria to be met — are in competition with second gesture recognition criteria for a second gesture — which are dependent on the contact(s) reaching the respective intensity threshold. In such competitions, the gesture is, optionally, not recognized as meeting the first gesture recognition criteria for the first gesture if the second gesture recognition criteria for the second gesture are met first. For example, if a contact reaches the respective intensity threshold before the contact moves by a predefined amount of movement, a deep press gesture is detected rather than a swipe gesture. Conversely, if the contact moves by the predefined amount of movement before the contact reaches the respective intensity threshold, a swipe gesture is detected rather than a deep press gesture. Even in such circumstances, the first gesture recognition criteria for the first gesture still do not require that the intensity of the contact(s) meet a respective intensity
threshold in order for the first gesture recognition criteria to be met because if the contact stayed below the respective intensity threshold until an end of the gesture (e.g., a swipe gesture with a contact that does not increase to an intensity above the respective intensity threshold), the gesture would have been recognized by the first gesture recognition criteria as a swipe gesture. As such, particular gesture recognition criteria that do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met will (A) in some circumstances ignore the intensity of the contact with respect to the intensity threshold (e.g., for a tap gesture) and/or (B) in some circumstances still be dependent on the intensity of the contact with respect to the intensity threshold in the sense that the particular gesture recognition criteria (e.g., for a long press gesture) will fail if a competing set of intensity-dependent gesture recognition criteria (e.g., for a deep press gesture) recognize an input as corresponding to an intensity-dependent gesture before the particular gesture recognition criteria recognize a gesture corresponding to the input (e.g., for a long press gesture that is competing with a deep press gesture for recognition).
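The competition described above can be illustrated with a short, hypothetical sketch: samples are examined in chronological order, and whichever criteria are satisfied first (the intensity-dependent deep press, or the movement-based swipe) win. All names and values below are placeholders for illustration only.

```swift
import CoreGraphics
import Foundation

// Hypothetical per-sample reading used only for this sketch.
struct InputSample {
    let position: CGPoint
    let intensity: Double
}

enum Winner { case deepPress, swipe, undecided }

// Samples are assumed to be in chronological order, so the first criterion
// satisfied wins the competition, as described in paragraph [0068].
func resolveCompetition(_ samples: [InputSample],
                        deepPressIntensity: Double = 0.8,
                        swipeDistance: CGFloat = 10.0) -> Winner {
    guard let origin = samples.first?.position else { return .undecided }
    for sample in samples {
        let dx = sample.position.x - origin.x
        let dy = sample.position.y - origin.y
        let moved = (dx * dx + dy * dy).squareRoot()
        if sample.intensity >= deepPressIntensity { return .deepPress }
        if moved >= swipeDistance { return .swipe }
    }
    return .undecided
}
```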
[0069] Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
[0070] In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
[0071] Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
[0072] Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
[0073] GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
[0074] Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:
• contacts module 137 (sometimes called an address book or contact list);
• telephone module 138;
• video conferencing module 139;
• e-mail client module 140;
• instant messaging (IM) module 141;
• workout support module 142;
• camera module 143 for still and/or video images;
• image management module 144;
• browser module 147;
• calendar module 148;
• widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
• widget creator module 150 for making user-created widgets 149-6;
• search module 151;
• video and music player module 152, which is, optionally, made up of a video player module and a music player module;
• notes module 153;
• map module 154; and/or
• online video module 155.
[0075] Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
[0076] In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
[0077] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols and technologies.
[0078] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module
139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
[0079] In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
[0080] In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, Apple Push Notification Service (APNs) or IMPS for Internet-based instant messages), to receive instant messages, to view received instant messages, and to perform the functions of the messaging application described in greater detail below. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
[0081] In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (in sports devices and smart watches); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store and transmit workout data.
[0082] In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics
module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, and/or delete a still image or video from memory 102.
[0083] In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
[0084] In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
[0085] In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
[0086] In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
[0087] In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 includes executable
instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
[0088] In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
[0089] In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch-sensitive display system 112, or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
[0090] In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.
[0091] In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
[0092] In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module — 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen
112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
[0093] Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
[0094] In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
[0095] The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
[0096] Figure 1B is a block diagram illustrating example components for event handling in accordance with some embodiments. In some embodiments, memory 102 (in Figure 1A) or 370 (Figure 3) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 136, 137-155, 380-390).
[0097] Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
[0098] In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
[0099] Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display system 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.
[00100] In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
[00101] In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
[00102] Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display system 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
[00103] Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
[00104] Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
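A minimal sketch of this hit-view search follows, assuming a hypothetical View type whose frames are all expressed in a single coordinate space; the actual module 172 may differ in detail.

```swift
import CoreGraphics

// Hypothetical view type; frames are expressed in window coordinates so the
// sketch can avoid coordinate conversion.
final class View {
    let frame: CGRect
    let subviews: [View]
    init(frame: CGRect, subviews: [View] = []) {
        self.frame = frame
        self.subviews = subviews
    }
}

// Returns the lowest view in the hierarchy that contains the initiating
// touch location, i.e., the hit view described in paragraph [00104].
func hitView(in root: View, at point: CGPoint) -> View? {
    guard root.frame.contains(point) else { return nil }
    for subview in root.subviews {
        if let deeper = hitView(in: subview, at: point) {
            return deeper   // prefer the deepest descendant containing the point
        }
    }
    return root
}
```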
[00105] Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
[00106] Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to
an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
[00107] In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
[00108] In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
[00109] A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
[00110] Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a
touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
[00111] Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
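As a hedged illustration, event definitions of this kind can be modeled as expected sub-event sequences and compared against the sub-events actually received; the timing ("predetermined phase") checks are omitted here, and all names are hypothetical rather than the structures of this application.

```swift
// Hypothetical sub-event and definition types for this sketch only.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

// Event 1 (double tap) and event 2 (drag), modeled as sub-event sequences.
let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])
let drag = EventDefinition(
    name: "drag",
    sequence: [.touchBegin, .touchMove, .touchEnd])

// Exact-sequence comparison, standing in for the comparison performed by
// event comparator 184.
func matches(_ received: [SubEvent], _ definition: EventDefinition) -> Bool {
    return received == definition.sequence
}
```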
[00112] In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
[00113] In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer’s event type.
[00114] When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
[00115] In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
[00116] In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined — process.
[00117] In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
[00118] In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
[00119] In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
[00120] It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc., on touch-pads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
[00121] Figure 2 illustrates a portable multifunction device 100 having a touch screen (e.g., touch-sensitive display system 112, Figure 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, — the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to
right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
[00122] Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display.
[00123] In some embodiments, device 100 includes the touch-screen display, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also includes one or more contact intensity sensors 165 for detecting intensities of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
[00124] Figure 3 is a block diagram of an example multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. Device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child’s learning toy), a gaming system, or a control device (e.g., a home or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communications interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes called a chipset) that
interconnects and controls communications between system components. Device 300 includes input/output (I/O) interface 330 comprising display 340, which is typically a touch-screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, tactile output generator 357 for generating tactile outputs on device 300 (e.g., similar to tactile output generator(s) 167 described above with reference to Figure 1A), sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to Figure 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices remotely located from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (Figure 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (Figure 1A) optionally does not store these modules.
[00125] Each of the above identified elements in Figure 3 is, optionally, stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above.
[00126] Attention is now directed towards embodiments of user interfaces ("UI") that are, optionally, implemented on portable multifunction device 100.
[00127] Figure 4A illustrates an example user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof:
• Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
• Time 404;
• Bluetooth indicator 405;
• Battery status indicator 406;
• Tray 408 with icons for frequently used applications, such as:
  o Icon 416 for telephone module 138, labeled “Phone,” which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
  o Icon 418 for e-mail client module 140, labeled “Mail,” which optionally includes an indicator 410 of the number of unread e-mails;
  o Icon 420 for browser module 147, labeled “Browser;” and
  o Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled “iPod;” and
• Icons for other applications, such as:
  o Icon 424 for IM module 141, labeled “Messages;”
  o Icon 426 for calendar module 148, labeled “Calendar;”
  o Icon 428 for image management module 144, labeled “Photos;”
  o Icon 430 for camera module 143, labeled “Camera;”
  o Icon 432 for online video module 155, labeled “Online Video;”
  o Icon 434 for stocks widget 149-2, labeled “Stocks;”
  o Icon 436 for map module 154, labeled “Map;”
  o Icon 438 for weather widget 149-1, labeled “Weather;”
  o Icon 440 for alarm clock widget 149-4, labeled “Clock;”
  o Icon 442 for workout support module 142, labeled “Workout Support;”
  o Icon 444 for notes module 153, labeled “Notes;” and
  o Icon 446 for a settings application or module, which provides access to settings for device 100 and its various applications 136.
[00128] It should be noted that the icon labels illustrated in Figure 4A are merely examples. For example, in some embodiments, icon 422 for video and music player module 152 is labeled “Music” or “Music Player.” Other labels are, optionally, used for various application icons. In some embodiments, a label for a respective application icon includes a name of an application corresponding to the respective application icon. In some embodiments, a label for a particular application icon is distinct from a name of an application corresponding to the particular application icon.
[00129] Figure 4B illustrates an example user interface on a device (e.g., device 300, Figure 3) with a touch-sensitive surface 451 (e.g., a tablet or touchpad 355, Figure 3) that is separate from the display 450. Device 300 also, optionally, includes one or more contact intensity sensors (e.g., one or more of sensors 359) for detecting intensities of contacts on touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of device 300.
[00130] Although many of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in Figure 4B. In some embodiments, the touch-sensitive surface (e.g., 451 in Figure 4B) has a primary axis (e.g., 452 in Figure 4B) that corresponds to a primary axis (e.g., 453 in Figure 4B) on the display (e.g., 450). In accordance with these embodiments, the device detects contacts (e.g., 460 and 462 in Figure 4B) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (e.g., in Figure 4B, 460 corresponds to 468 and 462 corresponds to 470). In this way, user inputs (e.g., contacts 460 and 462, and movements
thereof) detected by the device on the touch-sensitive surface (e.g., 451 in Figure 4B) are used by the device to manipulate the user interface on the display (e.g., 450 in Figure 4B) of the multifunction device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
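A minimal sketch of this location correspondence follows, assuming the primary axes of the surface and the display are aligned so that a simple per-axis normalization suffices; the function and parameter names are hypothetical.

```swift
import CoreGraphics

// Maps a contact location on the separate touch-sensitive surface into display
// coordinates by normalizing along each primary axis and rescaling.
func displayLocation(forSurfacePoint p: CGPoint,
                     surfaceBounds: CGRect,
                     displayBounds: CGRect) -> CGPoint? {
    guard surfaceBounds.width > 0, surfaceBounds.height > 0 else { return nil }
    let nx = (p.x - surfaceBounds.minX) / surfaceBounds.width
    let ny = (p.y - surfaceBounds.minY) / surfaceBounds.height
    return CGPoint(x: displayBounds.minX + nx * displayBounds.width,
                   y: displayBounds.minY + ny * displayBounds.height)
}
```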
[00131] Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact).
Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
[00132] As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector,” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in Figure 3 or touch-sensitive surface 451 in Figure 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch-screen display (e.g., touch-sensitive display system 112 in Figure 1A or the touch screen in Figure 4A) that enables direct interaction with user interface elements on the touch-screen display, a detected contact on the touch-screen acts as a “focus selector,” so that when an input (e.g., a press input by the contact) is detected on the touch-screen display at a location of a particular user interface element (e.g., a button, window, slider or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of a user interface to another region of the
user interface without corresponding movement of a cursor or movement of a contact on a touch-screen display (e.g., by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface. Without regard to the specific form taken by the focus selector, the focus selector is generally the user interface element (or contact on a touch-screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user is intending to interact). For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or touch screen) will indicate that the user is intending to activate the respective button (as opposed to other user interface elements shown on a display of the device).
[00133] As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact or a stylus contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average or a sum) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In
some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be readily accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
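One of the substitute approaches above, combining multiple force sensor readings into an estimated force by weighted average, can be sketched as follows; the types and calibration weights are hypothetical placeholders.

```swift
// Hypothetical reading from one force sensor beneath the surface, with a
// placeholder calibration weight for that sensor's position.
struct ForceSensorReading {
    let force: Double
    let weight: Double
}

// Weighted average of the individual sensor readings, one of the combination
// strategies (weighted average or sum) mentioned in paragraph [00133].
func estimatedForce(from readings: [ForceSensorReading]) -> Double {
    let totalWeight = readings.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    return readings.reduce(0) { $0 + $1.force * $1.weight } / totalWeight
}
```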
[00134] In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch-screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch-screen display hardware. Additionally, in some implementations a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
[00135] As used in the specification and claims, the term “characteristic intensity” of a contact is a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation) rather than being used to determine whether to perform a first operation or a second operation.
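The two-threshold decision in this paragraph can be sketched as below, taking the maximum sample as one possible characteristic intensity; the enum cases and threshold values are placeholders for illustration.

```swift
enum Operation { case first, second, third }

// The characteristic intensity here is taken to be the maximum sample; the
// two thresholds then select among three operations as described in [00135].
func operation(forSamples samples: [Double],
               firstThreshold: Double = 0.3,
               secondThreshold: Double = 0.7) -> Operation {
    let characteristic = samples.max() ?? 0
    if characteristic > secondThreshold { return .third }
    if characteristic > firstThreshold { return .second }
    return .first
}
```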
[00136] In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface may receive a continuous swipe contact transitioning from a start location and reaching an end location (e.g., a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm may be applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
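A sketch of the first of those algorithms, an unweighted sliding average, is shown below; the window size is a placeholder, and samples shorter than the window are returned unchanged.

```swift
// Unweighted sliding average over a fixed window, one of the smoothing
// algorithms named in [00136], applied to intensity samples before the
// characteristic intensity is computed.
func slidingAverage(_ samples: [Double], window: Int = 5) -> [Double] {
    guard window > 1, samples.count >= window else { return samples }
    return (0...(samples.count - window)).map { start in
        samples[start ..< start + window].reduce(0, +) / Double(window)
    }
}
```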
[00137] The user interface figures described herein optionally include various intensity diagrams that show the current intensity of the contact on the touch-sensitive surface relative to one or more intensity thresholds (e.g., a contact detection intensity threshold IT0, a light press intensity threshold ITL, a deep press intensity threshold ITD (e.g., that is at least initially higher than IL), and/or one or more other intensity thresholds (e.g., an intensity threshold IH that is lower than IL)). This intensity diagram is typically not part of the displayed user interface, but is provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold IT0 below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
[00138] In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some “light press” inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some “deep press” inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms in duration (e.g., 40, 100, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental deep press inputs. As another example, for some “deep press” inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detection of a deep press input does not depend on time-based criteria.
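A hedged sketch of the delay-time criterion follows: the deep press response is reported only if the second threshold is met at least the delay time after the first threshold was first met. All threshold and timing values are placeholders.

```swift
import Foundation

// One intensity reading with its time since touch-down.
struct TimedSample {
    let intensity: Double
    let time: TimeInterval
}

// The deep press triggers only if the delay time has elapsed between meeting
// the first (light press) threshold and meeting the second (deep) threshold.
func deepPressTriggered(_ samples: [TimedSample],
                        lightThreshold: Double = 0.3,
                        deepThreshold: Double = 0.7,
                        delay: TimeInterval = 0.1) -> Bool {
    guard let t0 = samples.first(where: { $0.intensity >= lightThreshold })?.time
    else { return false }
    return samples.contains { $0.intensity >= deepThreshold && $0.time - t0 >= delay }
}
```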
[00139] In some embodiments, one or more of the input intensity thresholds and/or — the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, application running, rate at which the intensity is applied, number of concurrent inputs, user history, environmental factors (e.g., ambient noise), focus selector position, and the like. Example factors are described in U.S. Patent Application Serial Nos. 14/399,606 and 14/624,296.
[00140] For example, Figure 4C illustrates a dynamic intensity threshold 480 that changes over time based in part on the intensity of touch input 476 over time. Dynamic intensity threshold 480 is a sum of two components, first component 474 that decays over time after a predefined delay time p1 from when touch input 476 is initially detected, and second component 478 that trails the intensity of touch input 476 over time. The initial high intensity threshold of first component 474 reduces accidental triggering of a “deep press” response, while still allowing an immediate “deep press” response if touch input 476 provides sufficient intensity. Second component 478 reduces unintentional triggering of a “deep press” response by gradual intensity fluctuations in a touch input. In some embodiments, when touch input 476 satisfies dynamic intensity threshold 480 (e.g., at point 481 in Figure 4C), the “deep press” response is triggered.
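The two-component threshold of Figure 4C can be sketched as the sum of a decaying term and a trailing term; the decay shape (exponential here) and all constants are assumptions for illustration, not the actual curve of the figure.

```swift
import Foundation

// Dynamic threshold as the sum of a first component that decays after delay
// p1 and a second component that trails the input's recent intensity.
func dynamicThreshold(timeSinceTouchDown t: TimeInterval,
                      recentIntensity: Double,
                      initialComponent: Double = 0.8,
                      p1: TimeInterval = 0.1,
                      decayRate: Double = 4.0,
                      trailingFraction: Double = 0.5) -> Double {
    let first = t <= p1
        ? initialComponent
        : initialComponent * exp(-decayRate * (t - p1))   // decay after p1
    let second = trailingFraction * recentIntensity       // trails the input
    return first + second
}

// The "deep press" response triggers once the touch intensity meets the
// dynamic threshold, e.g., at point 481 in Figure 4C.
func deepPressTriggers(intensity: Double, at t: TimeInterval,
                       recentIntensity: Double) -> Bool {
    return intensity >= dynamicThreshold(timeSinceTouchDown: t,
                                         recentIntensity: recentIntensity)
}
```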
[00141] Figure 4D illustrates another dynamic intensity threshold 486 (e.g., intensity threshold ID). Figure 4D also illustrates two other intensity thresholds: a first intensity threshold IH and a second intensity threshold IL. In Figure 4D, although touch input 484 satisfies the first intensity threshold IH and the second intensity threshold IL prior to time p2, no response is provided until delay time p2 has elapsed at time 482. Also in Figure 4D, dynamic intensity threshold 486 decays over time, with the decay starting at time 488 after a predefined delay time p1 has elapsed from time 482 (when the response associated with the second intensity threshold IL was triggered). This type of dynamic intensity threshold reduces accidental triggering of a response associated with the dynamic intensity threshold ID immediately after, or concurrently with, triggering a response associated with a lower intensity threshold, such as the first intensity threshold IH or the second intensity threshold IL.
[00142] Figure 4E illustrates yet another dynamic intensity threshold 492 (e.g., intensity threshold ID). In Figure 4E, a response associated with the intensity threshold IL is triggered after the delay time p2 has elapsed from when touch input 490 is initially detected. Concurrently, dynamic intensity threshold 492 decays after the predefined delay time p1 has elapsed from when touch input 490 is initially detected. So a decrease in intensity of touch input 490 after triggering the response associated with the intensity threshold IL, followed by an increase in the intensity of touch input 490, without releasing touch input 490, can trigger a response associated with the intensity threshold ID (e.g., at time 494) even when the intensity of touch input 490 is below another intensity threshold, for example, the intensity threshold IL.
[00143] An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold ITL to an intensity between the light press intensity threshold ITL and the deep press intensity threshold ITD is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold ITD to an intensity above the deep press intensity threshold ITD is sometimes referred to as a “deep press” input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold IT0 to an intensity between the contact-detection intensity threshold IT0 and the light press intensity threshold ITL is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold IT0 to an intensity below the contact-detection intensity threshold IT0 is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments IT0 is zero. In some embodiments, IT0 is greater than zero. In some illustrations a shaded circle or oval is used to represent intensity of a contact on the touch-sensitive surface. In some illustrations, a circle or oval without shading is used to represent a respective contact on the touch-sensitive surface without specifying the intensity of the respective contact.
[00144] In some embodiments, described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., the respective operation is performed on a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input).
[00145] In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
[00146] For ease of explanation, the operations performed in response to a press input associated with a press-input intensity threshold, or in response to a gesture including the press input, are, optionally, triggered in response to detecting: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold. As described above, in some embodiments, the triggering of these responses also depends on time-based criteria being met (e.g., a delay time has elapsed between a first intensity threshold being met and a second intensity threshold being met).
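The hysteresis behavior described in the preceding two paragraphs can be sketched as a small state machine. The 75% proportion is one of the example values given above; the type and method names are illustrative, not from this document:

```swift
// Press recognizer with hysteresis: the "down stroke" fires at the
// press-input threshold; the "up stroke" fires only after intensity drops
// below the lower hysteresis threshold, ignoring jitter in between.
struct PressRecognizer {
    let pressThreshold: Double
    var hysteresisThreshold: Double { pressThreshold * 0.75 }  // e.g., 75%
    var isPressed = false

    mutating func update(intensity: Double) -> String? {
        if !isPressed && intensity >= pressThreshold {
            isPressed = true
            return "down stroke"
        }
        if isPressed && intensity < hysteresisThreshold {
            isPressed = false
            return "up stroke"
        }
        return nil   // no transition
    }
}

var recognizer = PressRecognizer(pressThreshold: 0.5)
_ = recognizer.update(intensity: 0.6)   // "down stroke"
_ = recognizer.update(intensity: 0.45)  // nil: still above 0.375, jitter ignored
_ = recognizer.update(intensity: 0.3)   // "up stroke"
```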
USER INTERFACES AND ASSOCIATED PROCESSES
[00147] Attention is now directed towards examples of user interfaces (“UI”) and associated processes that may be implemented on an electronic device, such as portable multifunction device 100 or device 300, with a display, a touch-sensitive surface, and one or more sensors to detect intensities of contacts with the touch-sensitive surface.
[00148] These user interfaces and associated processes provide new, improved ways to:
• acknowledge messages;
• edit previously sent messages;
• display an edit history for a previously sent message;
• apply impact effect options to a message (e.g., to express what a user is trying to communicate);
• display private messages using “invisible ink”;
• display enhanced message interface content (e.g., “full screen moments”);
• display content (e.g., “magic moments”) that corresponds to particular combinations of content in two separate messages;
• build message bubbles;
• suggest emojis;
• synchronize viewing of content between users;
• incorporate handwritten inputs;
• selectively scrunch content in a message transcript;
• integrate a camera;
• integrate search and sharing;
• integrate interactive applications;
• integrate stickers;
• make payments;
• interact with avatars; and
• make suggestions.
[00149] Figures 5A-5K illustrate exemplary user interfaces for displaying message transcripts and message acknowledgments in accordance with some examples. In particular, Figure 5A illustrates a messaging user interface 5002 (e.g., of a messaging application), displayed on the display of an electronic device, sometimes called the first electronic device to help distinguish it from other electronic devices in communication with the first electronic device. The first electronic device also includes a touch-sensitive surface and one or more sensors, such as in a touch-screen display or trackpad, configured to detect intensities of contacts on the touch-sensitive surface.
[00150] The messaging user interface 5002, as shown in Figure 5A, includes a conversation transcript 5004 of a messaging session between a user (e.g., Genevive) of the electronic device and at least one other user (e.g., Isaac) of another electronic device. The conversation transcript 5004 in this example includes multiple messages, each in a respective message region 5008. A first input with a first contact 5010 is detected at the location on the touch-sensitive surface corresponding to a first message 5006. In response to detecting the first input 5010, the electronic device displays an acknowledgement selection affordance 5012, Figure 5B, at a location in the messaging interface that corresponds to the first message region. In this example, the acknowledgement selection affordance displays a plurality of acknowledgement options: 5014-1 (heart), 5014-2 (thumbs up), 5014-3 (thumbs down), 5014-4 (“HA”), 5014-5 (“!!”) and 5014-6 (“?”). Other examples may include fewer, additional, or different acknowledgment options. The electronic device responds to receiving a second input 5106 by a second contact (e.g., a tap gesture, light press gesture, deep press gesture, or lift off) on a respective acknowledgment option (or a hit region corresponding to the respective acknowledgment option), such as option 5014-2 (thumbs up), by selecting the respective acknowledgment option and applying it to the first message 5006 or first message region 5008-1, as shown in Figure 5C. The selected acknowledgment option, in this example option 5014-2, is displayed in a respective acknowledgement region 5018. As shown in Figure 5H, the electronic device(s) of other user(s) (in this example, Isaac) participating in the messaging session (sometimes called a conversation) display the same selected acknowledgment option proximate to the first message region.
[00151] To edit the selected acknowledgement option, the user makes a third input by a third contact 5022, as shown in Figure 5C. Upon detecting the third input by the third contact, the electronic device displays an acknowledgement editing interface 5024, as shown in Figure 5D. Optionally, the acknowledgement editing interface is displayed on top of and/or in place of the conversation transcript. Optionally, the currently selected acknowledgment option, in this example option 5014-2, is visually distinguished from the other available acknowledgement options in the acknowledgement editing interface 5024, as shown in Figure 5D.
[00152] Figure 5E shows an input, contact 5026, selecting a different, second acknowledgement option 5014-1, and Figure 5F shows a message region in a conversation transcript with the edited acknowledgment (i.e., with second acknowledgement option 5014-1 displayed instead of first acknowledgement option 5014-2).
[00153] In some examples, while the user of the first electronic device is using the acknowledgement editing interface 5024 to edit a previously selected acknowledgment option for the first message region, the electronic device of another user in the messaging session displays an acknowledgement-preparation indicator 5020, as shown in Figure 5G, proximate (e.g., near, adjacent or partially overlapping) the first message region in the conversation transcript 5004 displayed by that electronic device.
[00154] Different users in the same messaging session may select different acknowledgment options for the same message or message region. In some examples, as shown in Figure 5I, the electronic device displays, in the conversation transcript, an indicia 5036 that users in the messaging session have selected a plurality of acknowledgement options for the second message region. In the example shown in Figure 5I, indicia 5036 is a stack of overlapping acknowledgement option icons, but could alternatively be a plurality of individual acknowledgement option icons. As shown in Figure 5I, indicia 5036 is optionally adjacent to and/or partially overlapping the second message region 5008-2.
[00155] Figure 5J shows an example of a user interface that includes a tally for each type of acknowledgement option selected by the users in the messaging session. For example, as shown in Figure 5J, the user interface includes tallies 5040-1, 5040-2, 5040-3 for three distinct acknowledgement options selected by users in the messaging session. In some examples, the user messaging interface shown in Figure 5J is displayed by the electronic device in response to detecting an input by a contact (e.g., contact 5034, Figure 5I) at a location on the touch-sensitive surface that corresponds to a location of the message region having indicia 5036, which indicates that multiple users in the messaging session have selected acknowledgement options for the same message (or corresponding message region).
[00156] By selecting one of the tallies 5040, a user can see the users who selected any particular acknowledgement option. The electronic device, in response to an input by a contact (e.g., contact 5041, as shown in Figure 5J) at a location on the touch-sensitive surface that corresponds to the tally 5040 for a particular acknowledgement option for a respective message region, displays icons 5042 (e.g., avatars) that represent users that selected the particular acknowledgement option for the respective message region, as shown in Figure 5K.
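The tallies of Figure 5J and the per-option user lists of Figure 5K suggest a simple grouping computation. The following is a hedged sketch, assuming a mapping from participant names to their selected options; all type and function names are illustrative, not from this document:

```swift
// Each participant's selected option, keyed by participant name (assumed).
enum Acknowledgement: String { case heart, thumbsUp, thumbsDown, ha, doubleExclamation, questionMark }

func tallies(for selections: [String: Acknowledgement]) -> [(option: Acknowledgement, count: Int)] {
    var counts: [Acknowledgement: Int] = [:]
    for option in selections.values { counts[option, default: 0] += 1 }
    return counts.sorted { $0.value > $1.value }
                 .map { (option: $0.key, count: $0.value) }
}

// Selecting a tally reveals who selected that option (Figure 5K).
func users(whoSelected option: Acknowledgement,
           in selections: [String: Acknowledgement]) -> [String] {
    selections.filter { $0.value == option }.keys.sorted()
}
```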
[00157] Figures 5L-5T illustrate exemplary user interfaces for editing previously sent messages while displaying a message transcript. Figure 5L shows a messaging user interface 5002 of a messaging application on the display of an electronic device. The messaging user interface 5002 includes a conversation transcript 5004 of a messaging session between a user of the electronic device and at least one other user, and a first message region 5044 that includes a first message 5046-1 that was sent from the electronic device of the user to the at least one other user in the messaging session.
[00158] The first message 5046-1 can be edited, despite the fact that it has already been sent. To initiate the editing of the first message, the user selects the first message with a predefined touch input 5048 (e.g., a tap gesture, long press gesture, light press gesture, or deep press gesture) on the first message or the message region for the first message. In some examples, the electronic device, in response to detecting the input 5048, displays a menu interface, such as the one shown in Figure 5M. Alternatively, the electronic device, in response to detecting the input 5048, displays a message editing interface 5052, as shown in Figure 5N. In some examples, a first input gesture (e.g., a tap gesture) on the first message is used to transition to the menu interface (e.g., as shown in Figure 5M), while a second input gesture (e.g., a deep press) is used to transition to the message editing interface 5052, Figure 5N. From the menu interface, as shown in Figure 5M, a user can transition the messaging application to the message editing interface 5052, shown in Figure 5N, by a touch input 5050 (Figure 5M) that selects an edit option in the menu interface.
[00159] The message editing interface 5052, Figure 5N, for editing a respective message, such as first message 5046-1, includes a keyboard 5054 and an update affordance 5056. While displaying the message editing interface 5052 for the respective message, the electronic device detects one or more inputs, such as input 5057 shown in Figure 5N, that revise the respective message, displays a revised version of the message, and detects an input that activates the update affordance (e.g., for sending the updated message to the one or more other electronic devices of the one or more other participants in the messaging session).
[00160] Figure 5O shows the conversation transcript 5004 after message 5046 has been updated. Because the conversation transcript 5004 includes an edited message, the edited message includes one or more indications 5058 that a particular message was revised after the original version of the message was sent to the other participant(s) in the messaging session. In Figure 5O, there are two such indications of revision: indication 5058-1 is a shadow region behind the message region; indication 5058-2 is text (e.g., “Edited”) displayed below the message region that contains the revised version of the message.
[00161] An edited or revised message can be edited yet again. The electronic device, in response to an input (e.g., input 5060, Figure 5O) that corresponds to a request to edit the revised version 5046-2 of a message, displays a message editing interface for the revised version of the message, as shown in Figure 5R. The message editing interface, as shown in Figure 5R, includes the message 5046-2 to be edited, a keyboard, and an update affordance (which is optionally not shown until at least one revision has been made to the message).
[00162] A participant in the messaging session can request to see all versions, or two or more versions, of an edited message. For example, in response to a predefined input 5060, Figure 5O, on an edited message, the electronic device displays the user interface shown in Figure 5P, in which display of the conversation transcript is suppressed except for the revised version of the first message 5046-2, and a menu 5062 or list of editing options. In this example, the displayed menu 5062 includes a copy affordance 5064 (for copying the message selected by input 5060), a show edits affordance 5066 (for showing edits to the message selected by input 5060), a delete affordance 5068 (for deleting the message selected by input 5060, or alternatively for undoing all edits made to the message selected by input 5060), and a display more options affordance 5070. In the example shown in Figure 5P, input 5072 (e.g., a tap gesture) is detected at the location on the touch-sensitive surface that corresponds to the location of the “show edits” affordance 5066, which when activated, displays a user interface 5074 (Figure 5Q) that includes the current version 5046-2 of the first message as well as a prior version 5046-1, as shown in Figure 5Q. A further input, 5076 or 5078, in the user interface of Figure 5Q, selects a version of the message to edit. For example, the selected version is highlighted, and then a touch input on Edit (5080) would initiate editing of the selected version of the message. Alternatively, a touch input on Done (5082) terminates the editing of the selected version of the message.
[00163] An alternative to the edit menu interface shown in Figure 5P is the edit menu interface shown in Figure 5S, which includes the selected message (selected in response to input 5060, Figure 5O), and a menu that includes an affordance (e.g., a “show edits” option, 5086), which when activated results in display of the user interface shown in Figure 5T, which includes the current version 5046-2 of the selected message and one or more prior versions of the selected message (e.g., version 5046-1).
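One plausible data model for the edit history shown in Figures 5P-5T keeps the original message and each revision in order; the type and property names below are assumptions for illustration, with the wasRevised flag standing in for the “Edited” indication 5058-2:

```swift
struct EditableMessage {
    private(set) var versions: [String]          // oldest first; versions[0] is the original
    init(original: String) { versions = [original] }

    var currentText: String { versions.last! }   // the revised version shown in the transcript
    var wasRevised: Bool { versions.count > 1 }  // drives the "Edited" indication

    mutating func revise(to newText: String) { versions.append(newText) }
    mutating func undoAllEdits() { versions.removeSubrange(1...) }  // keep only the original
}

var message = EditableMessage(original: "Congratulations!")
message.revise(to: "Congratulations!!")
// message.versions now holds both the prior and current versions (Figure 5Q).
print(message.currentText, message.wasRevised)   // "Congratulations!! true"
```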
[00164] Figures 5U-5BF illustrate exemplary user interfaces for applying an impact effect option to a message input or message region. Figure 5U illustrates a messaging user interface 5002 having a conversation transcript 5004 of a messaging session between a user (Wendy, in this example) of the electronic device and at least one other user (e.g., Max, in this example, a user of another electronic device), and a message-input area 5100 that includes first message input 5102 (e.g., text, stickers, images, and/or other graphics entered by a user of the electronic device in the message-input area, but not yet sent to the at least one other user in the messaging session). In Figure 5U, first message input 5102 has not yet been sent.
[00165] In some examples, to trigger the activation of an impact selection interface 5110, Figure 5AA, a first input by a first contact, at a location on the touch-sensitive surface that corresponds to a location in the message-input area 5100, includes a particular gesture (e.g., a swipe gesture) received at the message-input area 5100. In this example, the message-input area includes an impact selection affordance 5104 and the location of the contact 5106 of the first input corresponds to the impact selection affordance. In some examples, impact selection affordance 5104 is a multipurpose affordance, and a second gesture (e.g., a tap) on the same affordance 5104 is used to send the message 5102 in message input area 5100. The electronic device, upon detecting an input that includes the second gesture on affordance 5104, sends message 5102 and transitions to the user interface shown in Figure 5V. Similarly, after message 5102 is sent, the electronic device of the other user receives that message and shows the user interface shown in Figure 5W.
[00166] In some alternative examples, illustrated in Figures 5X, 5Y and 5Z, a deep press input 5108-1, 5108-2, 5108-3, as shown in Figures 5X, 5Y and 5Z, when detected by the electronic device, causes the messaging application to display an impact selection interface 5110, Figure 5AA, that includes a plurality of impact effect options (e.g., impact effect options 5112-1 through 5112-4). The increasing intensity of the deep press input is represented by the intensity diagrams in Figures 5X, 5Y and 5Z. Further, the deep press input 5108-1, 5108-2, 5108-3 on impact selection affordance 5104, as shown in Figures 5X, 5Y and 5Z, followed by a drag to the first impact effect option (e.g., input 5108-4 on impact effect option 5112-1, Figure 5AA) and then pausing while over the affordance for a respective impact effect option 5112, selects that impact effect option.
[00167] The impact selection interface 5110, Figure 5AA, includes a “bubble” affordance 5114 for selecting message region impact effect options, and a “screen” affordance 5116 for selecting full screen impact effect options.
[00168] In some examples, the impact selection interface 5110 shows an animated preview of the currently selected impact effect option. For example, the sequence of Figures 5AB-5AD shows an animated preview of the “loud” impact effect option, which is a message region impact effect option, being applied to first message input (“Congratulations!”) in response to an input 5108-5 at a location on the touch-sensitive surface that corresponds to a location of the “loud” impact effect option. In this example, the preview of the “loud” impact effect option shows the font size of the message and the size of a message region at first increasing and then decreasing.
[00169] In another example, the sequence of Figures 5AE-5AG shows an animated preview of the “slam” impact effect option, which is a message region impact effect option, being applied to first message input (“Congratulations!”) in response to a second input 5108-6 at a location on the touch-sensitive surface that corresponds to a location of the “slam” impact effect option. In this example, the preview of the “slam” impact effect option shows the font size of the message and the size of a message region at first increasing and then decreasing, and at the same time changing the tilt or rotation state of the message region, and optionally changing the shade or color of a background region surrounding the message region. Optionally, the application of a respective impact effect option may change additional characteristics of the message region to which the impact effect option is applied, such as font color, background shade or color within the message region, etc.
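The “loud” preview described above (font and message region size first increasing, then decreasing) can be modeled as a simple overshoot-and-settle curve. The curve below is an assumed shape for illustration, not a specification from this document:

```swift
// Scale factor over normalized progress t in 0...1: rises to an assumed
// peak at the midpoint, then settles back to the default size.
func loudPreviewScale(progress t: Double) -> Double {
    let peak = 1.6   // assumed overshoot factor
    return t < 0.5
        ? 1 + (peak - 1) * (t / 0.5)
        : peak - (peak - 1) * ((t - 0.5) / 0.5)
}
```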
[00170] Figure 5AG also shows a third input 5108-7 on a send affordance, at a location corresponding to a user-selected impact effect option, in this case the “slam” option. In response, the electronic device of the user composing the message ceases to display the impact selection interface and displays a message region that contains the first message input in the conversation transcript, and in some examples, applies the selected impact effect option to the message region, as shown in Figure 5AH, or to the entire conversation transcript, as shown in Figure 5AI (enlarging the message region to which the impact effect option was applied and decreasing the size of one or more other message regions in the conversation transcript), and then transitioning to a final or static display of the conversation transcript that includes the sent message, as shown in Figure 5AJ. Similarly, the sent message is displayed at the electronic device of one or more other users in the messaging session, such as Max, with the selected impact effect option applied to either the message region of the sent message or the entire conversation transcript, depending on which impact effect option was selected by the sending user, one example of which is shown in the sequence of Figures 5AK, 5AL, 5AM.
[00171] The sequence of Figures 5AN-5AS shows an example of the “loud” impact effect option applied to a respective sent message (“congrats”). The sequence of Figures 5AT-5AW shows an example of the “gentle” impact effect option applied to a respective sent message (“I'm sorry.”).
[00172] In some examples, the selected impact effect option is applied to all (or substantially all) of the display screen of an electronic device that sends or receives the message, which includes the message region with the sent message, for a full-screen effect, an example of which is shown by the sequence of user interface images in Figures 5AX through 5BF. Figures 5AX through 5BF show the progression of the “slam” effect of a first message input (“Hell No!!”), starting with the first message input shown enlarged, rotated and with a dark background (Figure 5AX), then displayed even larger and with a somewhat lighter background (Figure 5AY), then less enlarged and rotated at a different angle than before (Figure 5AZ), then shown with further reduced enlargement and with blurry borders (Figure 5BA), then shown with reduced size, a different background and a different rotation (Figure 5BB), and then with a sequence of different shadows around the message region (Figures 5BC, 5BD, 5BE) until the first message input is shown at normal (default) size, not rotated, and with a normal (default) background (Figure 5BF).
[00173] Figures 5BG-5CA illustrate exemplary user interfaces for interacting with concealed messages. An impact selection interface 5110, shown in Figure 5BG, includes a plurality of impact effect options (e.g., impact effect options 5112-1 through 5112-4, described elsewhere in this document with respect to Figure 5AA). In this example, the impact selection interface 5110 also includes a “bubble” affordance 5114 for selecting message region impact effect options, and a “screen” affordance 5116 for selecting full screen impact effect options.
[00174] In some examples, the displayed impact effect options include an option that conceals the content of a message in the conversation transcript (“invisible ink” option 5112-1, Figure 5AA, indicated in Figure 5BG by an empty message region). In the example shown in Figure 5BG, the invisible ink option includes a send affordance 5118 for sending a message (with user-specified message input) with the respective impact effect option. In the example shown in Figure 5BG, impact selection interface 5110 includes a cancel affordance 5120 for canceling the selection of any impact effect options and returning the messaging application to either a prior user interface of the messaging application or a predefined user interface of the messaging application.
[00175] Once a message (hereinafter called the “concealed message” for ease of reference) has been sent with the invisible ink option, the concealed message is not displayed or is obscured, for example by screen elements 5124 that conceal the message, as shown in Figure 5BH, until the user of the electronic device displaying a conversation transcript that includes the concealed message performs a respective gesture, such as sliding a touch input 5128-a, 5128-b, 5128-c over the message region containing the concealed message, as shown in the sequence of Figures 5BH through 5BL, which temporarily reveals a portion of the message corresponding to the position of the touch input 5128-a, 5128-b, 5128-c; or performing a deep press gesture on the message region containing the concealed message, as shown in the sequence of Figures 5BM-5BP, in which the portion of the concealed message that is revealed corresponds to the intensity of the deep press input 5130-a, 5130-b, 5130-c, and optionally also to the position of the deep press input 5130-a, 5130-b, 5130-c. In these examples, the concealed message is concealed again in response to detecting termination of the input by the contact that caused the concealed message, or portions thereof, to be temporarily revealed.
[00176] In another example, the sequence of Figures 5BQ-5BV shows a concealed message being gradually revealed, as shown in Figures 5BQ to 5BT, and then gradually concealed, as shown in Figures 5BT to 5BV. In yet another example, the sequence of Figures 5BW-5CA shows a concealed message that contains a picture or photograph being gradually revealed.
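A minimal sketch of the reveal behavior described above, assuming a normalized intensity in the range 0 to 1 and illustrative radii; none of the names or values below come from this document:

```swift
// The revealed region follows the touch position; for a deep press its
// radius grows with the contact intensity.
struct Point { let x: Double; let y: Double }
struct RevealRegion { let center: Point; let radius: Double }

func revealRegion(at touch: Point, intensity: Double) -> RevealRegion {
    let minRadius = 20.0, maxRadius = 120.0   // assumed sizes, in points
    let clamped = min(max(intensity, 0), 1)
    return RevealRegion(center: touch,
                        radius: minRadius + (maxRadius - minRadius) * clamped)
}
// On liftoff no region is produced, and the message is concealed again.
```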
[00177] Figures 5CB-5CW illustrate exemplary user interfaces for triggering enhanced message content and applying an effect to a messaging user interface when a message includes an enhanced message content trigger. More particularly, a respective message in a messaging session can be sent with an enhanced message content trigger, and when an electronic device with a compatible messaging application receives that message with the enhanced message content trigger, the electronic device displays a conversation transcript with the received message and with enhanced message content. In some examples, the particular enhanced message content to be displayed is indicated by one or more parameters in the enhanced message content trigger.
[00178] Figures 5CB and 5CC show an example of a user interface displaying a preview of a full screen impact effect, in this case a full screen impact effect that includes moving or animated balloons 5152-1. As shown in Figure 5CB, the user of the electronic device composing the message “Congratulations!” has selected the “screen” affordance 5116, for selecting full screen impact effect options. Similarly, the sequence of Figures 5CD-5CF shows an example of a user interface displaying a preview of a full screen impact effect, in this case a full screen impact effect that includes moving or animated confetti 5152-2. In the example shown here, the user navigates through the available full screen impact effects using swipe gestures, such as swipe gesture 5156 shown in Figure 5CC, which causes the electronic device to move from a preview of the balloon full screen impact effect shown in Figure 5CC to the confetti full screen impact effect shown in Figure 5CD. As shown in Figures 5CB and 5CC, the user interface may include an effect option indicator 5154 (sometimes called page dots) to indicate which full screen effect option is currently selected or is currently being previewed, and also to indicate how many full screen effect options are available and which one of those options in a sequence of the full screen effect options is currently being viewed.
[00179] Figure 5CG is an example of a user interface having a conversation transcript in which none of the messages include an enhanced message content trigger, and thus the messages in the transcript are displayed without displaying enhanced message content corresponding to a trigger.
[00180] The sequence of Figures 5CH-5CO shows an example of a balloons full screen effect being displayed when a message (“Happy Birthday!!!!!”) containing a corresponding enhanced message content trigger is received. Similarly, the sequence of Figures 5CP-5CW shows an example of a fireworks full screen effect being displayed when a message (“Happy New Year!!!”) containing a corresponding enhanced message content trigger is received.
[00181] Figures 5CX-5DC illustrate exemplary user interfaces for detecting and responding to combinable content in separate messages. In Figure 5CX, a messaging user interface 5002 includes a conversation transcript 5004 of a messaging session between a user of the electronic device and at least one other user (e.g., a user of another electronic device) including a first other user, and a message input area 5100. A received first message 5170 in a first message region 5008 is shown in the conversation transcript 5004. The first message 5170 has first combinable content (e.g., an emoticon or an image such as an emoji or a sticker), in this example a beer glass. In Figure 5CY, the user of the electronic device inputs a second message 5172 in message input area 5100 of the messaging user interface 5002, and sends the second message 5172 by an input 5174 that selects a send affordance 5118. This results in the display of the messaging user interface shown in Figure 5CZ.
[00182] If the second message 5172 contains second combinable content that forms a predefined combination with the first combinable content in the first message 5170, the electronic device displays content 5176 that corresponds to the predefined combination, such as an animation of two beer glasses being clinked together (as shown in Figure 5DA) and/or display of the word “Cheers!” (as shown in Figure 5DA). Similarly, the electronic devices of the one or more other users in the messaging session would display the first message, the second message, and the content that corresponds to the predefined combination. It is noted that the content 5176 that corresponds to the predefined combination may be an animation that is briefly displayed after the second message is sent. Optionally, display of the content that corresponds to the predefined combination is repeated periodically while both the first and second messages with the combinable content are displayed in the conversation transcript 5004, or when other predefined criteria are satisfied (e.g., both the first and second messages with the combinable content are displayed in the conversation transcript 5004 and there has been no new activity in the messaging session for at least N seconds, where N is a predefined value between 1 and 60 seconds).
[00183] The sequence of Figures 5DB and 5DC shows another example of combinable content, in which a first user has sent a first message 5178, “Happy,” and the second user inputs a second message 5180, “Birthday!” If the first message and second message both include combinable content corresponding to a predefined combination, then content corresponding to the predefined combination would be displayed in the message transcript of the messaging user interface displayed by either the sender's electronic device, or a recipient's electronic device, or (typically) both. For example, the content that corresponds to the predefined combination in this example might be balloons, similar to those shown in Figures 5CH-5CO.
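The predefined-combination matching described in the preceding two paragraphs can be sketched as a lookup from an ordered pair of message contents to a combined effect; the table entries below merely restate the two examples from the text, and the effect identifiers are illustrative:

```swift
// Pairs of combinable contents mapped to the effect they trigger; the keys
// restate the beer-glass and "Happy"/"Birthday!" examples. Order matters:
// the first element is the earlier message in the transcript.
let combinations: [[String]: String] = [
    ["🍺", "🍺"]: "cheers-animation",
    ["Happy", "Birthday!"]: "balloons-full-screen-effect",
]

func combinedEffect(firstMessage: String, secondMessage: String) -> String? {
    combinations[[firstMessage, secondMessage]]
}

print(combinedEffect(firstMessage: "Happy", secondMessage: "Birthday!") ?? "none")
```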
[00184] Figures 5DD-5DI illustrate exemplary user interfaces for selecting a message region type or shape, in accordance with some examples. Figure 5DD shows a messaging user interface 5002 with a message input region 5100 having a first message input 5102 (“Yes!”) already entered in the message input region. Messaging user interface 5002 includes a send affordance 5200 in or near message input region 5100, for sending the message that has been input in that region. In addition, Figure 5DD shows an input with a contact 5202 on an options affordance 5204.
[00185] The messaging user interface shown in Figure 5DE is displayed in response to input with contact 5202 on options affordance 5204, which includes affordances for a number of message processing options, including a message region type/shape selection affordance 5208, labeled the “bubbles” affordance in Figure 5DE. While displaying this messaging user interface, the electronic device receives an input with contact 5206 on the “bubbles” option affordance 5208.
[00186] The messaging user interface shown in Figure 5DF is displayed in response to input with contact 5206 on “bubbles” affordance 5208. Figure 5DF also shows a preview message region 5210-1 that includes the first message input 5102, and that has a default, or previously selected, message region type. In these examples, the messaging application has a plurality of message region types or options, each having a corresponding shape, and typically having a corresponding font size, and optionally having one or more of: a font color, border color, and/or background color. In the example shown in Figure 5DF, the send affordance 5200 is displayed near the preview message region 5210-1 instead of in message input region 5100. To select a different message region type for the first message input 5102, or to view other available message region types, the user inputs a swipe gesture 5212. In response to swipe gesture 5212, a scrolling set of message region type icons is scrolled, as shown in the progression from Figure 5DF to Figure 5DG.
[00187] In the example of the messaging user interface shown in Figure 5DG, an input with contact 5214 is received, which causes the electronic device to change the message region type for the message region to be used with message input 5102, as shown in Figure 5DH. In Figure 5DH, message input 5102 is now shown in a message region 5210-2 having a message region type corresponding to the message region type selected by input with contact 5214 (shown in Figure 5DG). The user can continue to select different message region types. For example, in Figure 5DI, message input 5102 is now shown in a message region 5210-3 having a message region type corresponding to the message region type selected by input with contact 5216.
[00188] Figures 5DJ-5DQ illustrate exemplary user interfaces for displaying and selecting automatically suggested emoji while composing a message. In Figure 5DJ, a messaging user interface 5002 having a conversation transcript 5004 with an input, but not yet sent, message 5220 in message input region 5100 is shown. In this example, the messaging user interface 5002 includes a send affordance 5200 in or near message input region 5100 for sending the input message.
[00189] As shown in Figures 5DK-5DN, the electronic device begins to automatically make suggestions of emoji to replace one or more words or terms in the message input region 5100. In the example shown, the electronic device highlights words or terms, in sequence in the message, that are candidates for being replaced by emoji, first highlighting the term “sushi” 5222 in the input message 5220, as shown in Figures 5DK and 5DL, then highlighting the term “wine” 5224 in the input message 5220, as shown in Figure 5DM, and then highlighting the term “bowling” 5226 in the input message 5220, as shown in Figure 5DN. In response to the user of the electronic device selecting one of the highlighted terms (e.g., by an input with a contact on a highlighted term), such as “sushi” 5222 in the input message 5220, that term is replaced by an emoji 5232 corresponding to that term, as shown in Figure 5DO. Similarly, in Figure 5DP, the term “wine” in the input message 5220 has been replaced by an emoji 5234 (e.g., a wine glass emoji) corresponding to that term in response to a user selection of the highlighted term “wine” in the input message 5220. In a third example, the term “bowling” in the input message 5220 has been replaced by an emoji 5236 (e.g., a bowling emoji) corresponding to that term in response to a user selection of the highlighted term “bowling” in the input message 5220.
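The highlight-and-replace behavior described above can be sketched as a term-to-emoji lookup over the draft message; the table, function names, and tokenization are assumptions for illustration only:

```swift
import Foundation

// Assumed term-to-emoji table covering the three examples in the text.
let emojiForTerm = ["sushi": "🍣", "wine": "🍷", "bowling": "🎳"]

// Terms in the draft that are candidates for highlighting.
func candidateTerms(in draft: String) -> [String] {
    draft.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
        .filter { emojiForTerm[$0] != nil }
}

// Replaces a selected highlighted term with its corresponding emoji.
func replace(term: String, in draft: String) -> String {
    guard let emoji = emojiForTerm[term] else { return draft }
    return draft.replacingOccurrences(of: term, with: emoji, options: .caseInsensitive)
}
```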
[00190] Figures 40A-40W illustrate exemplary user interfaces for interacting with other users of a messaging transcript through an avatar in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 66 and 68A-68B. For convenience of explanation, some of the embodiments will be discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (e.g., a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451 in response to detecting the contacts on the touch-sensitive surface 451 while displaying the user interfaces shown in the figures on the display 450, along with a focus selector.
[00191] Figures 40A-40W illustrate an exemplary user interface 3500 for a messaging application which includes conversation transcript 3503, message-input area 3502, and three activatable affordances—digital image affordance 3504, digital canvas affordance 3506, and application affordance 3508. Conversation transcript 3503 includes messages from participants of a corresponding messaging session, including the user of portable multifunction device 100-1 and other users included in the messaging session. Each of the other users included in the messaging transcript is represented by an avatar (e.g., avatar 3510 for “Abe” and avatar 4008 for “Mary Todd”) displayed in stack of avatars 4002.
[00192] Figures 40A-40F illustrate two exemplary embodiments for interacting with another user of a messaging session through an avatar, where the device performs different operations based on detecting different types of user inputs.
[00193] In a first embodiment, a menu of actions for interacting with another user included in the messaging session is displayed by tapping on the user's avatar. Device 100-1 detects a tap gesture, including contact 4004 in Figure 40B, on Abe's avatar 3510. In response, the device displays menu 3882 of activatable menu items 3884-1 to 3884-6 for interacting directly with Abe by a phone call, video call, individual message, e-mail, digital drawing, or payment, respectively.
[00194] In a second embodiment, the stacked avatars are cycled to display a different avatar on top of the stack by deep pressing on the stack of avatars. Device 100-1 detects a press gesture, including contact 4006 in Figure 40D, on stack of avatars 4002 displaying Abe's avatar 3510 on top. The device then detects an increase in the intensity of contact 4006 above a predefined intensity threshold (e.g., ITL or ITD) in Figure 40E and, in response, shuffles the stack of avatars 4002 to display Mary Todd's avatar 4008 on top, in Figures 40E-40F.
[00195] Figures 40G-40R illustrate an exemplary embodiment for paying another user of a messaging session through an avatar. Device 100-1 detects a press gesture, including contact 4010 in Figure 40G, on stack of avatars 4002 displaying Mary Todd's avatar 4008 on top. The device then detects an increase in the intensity of contact 4010 above a predefined intensity threshold (e.g., ITL or ITD) and, in response, displays menu 3882 of activatable menu items 3884-1 to 3884-6, for interacting with Mary Todd directly, and blurs display of conversation transcript 3503 in Figure 40H. Menu 3882 remains displayed after the device detects lift-off of contact 4010 in Figure 40I.
[00196] Device 100-1 then detects a tap gesture, including contact 4012 in Figure 40J, on activatable menu item 3884-6 for a payment action. In response, the device displays payment area 4014, including termination affordance 4016, execution affordance 4018, and digital keypad 4015 for inputting a payment amount, in Figure 40K. The device then detects input of a payment amount ($60) and, subsequently, a tap gesture including contact 4020 on message-input area 4019, in Figure 40L. In response, the device replaces digital touchpad 4014 with digital keyboard 4017 in Figure 40M. Responsive to detecting input of message 4021, in Figure 40N, and a tap gesture, including contact 4022 in Figure 40O, on execution affordance 4018, the device prompts the user of device 100-1 to confirm their identity by displaying confirmation area 4024 in Figure 40P. In response to receiving an identity confirming input, including contact 4026 in Figure 40Q, the device executes payment of $60 to Mary Todd and posts confirmation of the payment to the messaging session, displaying payment confirmation 4028 within conversation transcript 3503 in Figure 40R.
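The payment interaction in Figures 40K-40R proceeds through a fixed sequence of stages, which the following sketch models as a simple state progression; the step names are illustrative, not taken from the patent text:

```swift
// Assumed stages of the payment flow, in order.
enum PaymentStep { case enterAmount, composeMessage, confirmIdentity, postConfirmation }

func nextStep(after step: PaymentStep) -> PaymentStep? {
    switch step {
    case .enterAmount:      return .composeMessage   // tap on message-input area 4019
    case .composeMessage:   return .confirmIdentity  // tap on execution affordance 4018
    case .confirmIdentity:  return .postConfirmation // identity-confirming input 4026
    case .postConfirmation: return nil               // payment confirmation 4028 posted
    }
}
```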
[00197] Figures 40S-40W illustrate two exemplary embodiments for interacting with another user of a messaging session through an avatar, where the device performs different operations based on detecting different types of user inputs.
[00198] In a first embodiment, tapping on the stack of avatars spreads the avatars out, such that a particular avatar can be selected. Device 100-1 detects a tap gesture, including contact 4030 in Figure 40S, on stack of avatars 4002. In response, the device spreads the avatars in stack of avatars 4002—avatar 3510 for “Abe,” avatar 4008 for “Mary Todd,” avatar 4032 for “Chuck,” avatar 4034 for “Issac,” and avatar 4036 for “Edwin”—across the top of touch screen 112 in Figure 40T.
[00199] In a second embodiment, which is also a continuation of the first embodiment, deep pressing on a particular avatar calls up a menu of activatable actions for interacting with the user corresponding to the avatar. Device 100-1 detects a press gesture, including contact 4038 in Figure 40U, on Mary Todd's avatar 4008. The device then detects an increase in the intensity of contact 4038 above a predefined intensity threshold (e.g., ITL or ITD) and, in response, displays menu 3882 of activatable menu items 3884-1 to 3884-6, for interacting with Mary Todd directly, and blurs display of conversation transcript 3503 in Figure 40V. Menu 3882 remains displayed after the device detects lift-off of contact 4038 in Figure 40W.
[00200] Figures 41A-41H illustrate exemplary user interfaces for integrating data detectors into a messaging application in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 70A-70B. For convenience of explanation, some of the embodiments will be discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (e.g., a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451 in response to detecting the contacts on the touch-sensitive surface 451 while displaying the user interfaces shown in the figures on the display 450, along with a focus selector.
[00201] Figures 41A-41H illustrate two exemplary embodiments where a first electronic device 100-1 detects a word or phrase, in a message received from a second electronic device associated with another user included in the messaging session, associated with additional content available on the internet. Figure 41A illustrates an exemplary user interface 3500 for a messaging application which includes conversation transcript 3503, message-input area 3502, and three activatable affordances—digital image affordance 3504, digital canvas affordance 3506, and application affordance 3508. Conversation transcript 3503 includes messages from participants of a corresponding messaging session, including the user of portable multifunction device 100-1 and other users included in the messaging session. Each of the other users included in the messaging transcript is represented by an avatar (e.g., avatar 4008 for “Mary Todd”) displayed in stack of avatars 4002.
[00202] In a first embodiment, the device prompts the user to view additional content associated with the identified word or phrase within the messaging user interface by displaying a selectable affordance. Device 100-1 receives message 4102 from Mary Todd, displayed in conversation transcript 3503 in Figure 41A. The device recognizes phrase 4101 (“Meteorite Catcher”) as the name of a movie about which information is available on the internet. In response, the device displays selectable affordance 4104, prompting the user to “See more information” about the movie. The device then detects a tap gesture, including contact 4106, on selectable affordance 4104 in Figure 41B. In response, the device displays information area 4108 displaying information about the movie found on the internet, including a representation 4110 of a poster for the film and biographical information 4112 about the movie, in Figure 41C.
[00203] In a second embodiment, the device prompts the user to view additional content associated with the identified word or phrase in a separate search user interface by highlighting the word or phrase. Device 100-1 receives message 4114 from Abe, displayed in conversation transcript 3503 in Figure 41D. The device recognizes phrase 4103 (“Astron Omer”) as the name of an actor about whom information is available on the internet. In response, the device displays highlighting 4116 of phrase 4103, prompting the user to select the phrase. The device then detects a tap gesture, including contact 4118, on phrase 4103 in Figure 41E. In response, the device displays search user interface 3661 displaying categorized results—news articles 3664, 3678, and 3680, and movie previews 3670, 3682, and 3684—of an internet search of phrase 4103, in Figure 41F. The device detects a tap gesture, including contact 4120 in Figure 41G, on movie preview 3670. In response, the device posts movie preview 3670 to the messaging session, displaying movie preview 3670 in conversation transcript 3503 in Figure 41H.
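The recognition step shared by the two embodiments above can be sketched as a scan of the incoming message against a table of known entities; a real implementation would consult a data-detector service rather than a fixed table, and everything named below is purely illustrative:

```swift
import Foundation

// Assumed table of known entities and their kinds.
let knownEntities = ["Meteorite Catcher": "movie", "Astron Omer": "actor"]

func detectEntities(in message: String)
    -> [(phrase: String, kind: String, range: Range<String.Index>)] {
    var hits: [(phrase: String, kind: String, range: Range<String.Index>)] = []
    for (phrase, kind) in knownEntities {
        if let range = message.range(of: phrase, options: .caseInsensitive) {
            hits.append((phrase: phrase, kind: kind, range: range))
        }
    }
    return hits
}

// Each hit can then drive a selectable affordance such as "See more
// information" (first embodiment) or highlighting (second embodiment).
```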
Interacting with avatars in a group messaging session
[00204] Figures 68A-68B are flow diagrams illustrating a method 6800 of interacting with a single user included in a group messaging session in accordance with some embodiments. The method 6800 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1A) with a display, a touch-sensitive surface, and one or more sensors to detect intensities of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 6800 are, optionally, combined and/or the order of some operations is, optionally, changed.
[00205] As described below, the method 6800 provides an intuitive way to interact with a single user included in a group messaging session. The method reduces the number, extent, and/or nature of the inputs from a user when interacting with a single user included in a group messaging session, thereby creating a more efficient human-machine interface.
For battery-operated electronic devices, enabling a user to interact with a single user included in a group messaging session faster and more efficiently conserves power and increases the time between battery charges.
[00206] The device displays (6802) a messaging user interface of a messaging application on the display, the messaging user interface including a conversation transcript (e.g., displayed in a first area of the display) of a messaging session between a user of the electronic device and a plurality of other users (e.g., of respective other electronic devices), a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of (e.g., overlapping) avatars, with a first avatar in the plurality of avatars displayed on the top of the stack of avatars. For example, messaging user interface 3500, in Figure 40S, includes conversation transcript 3503 and message-input area 3502. Conversation transcript 3503 includes messages from participants of a corresponding messaging session, including the user of portable multifunction device 100-1 and other users included in the messaging session. Each of the other users included in the messaging transcript is represented by an avatar (e.g., avatar 3510 for “Abe”) displayed on top of stack of avatars 4002, in Figure 40S.
[00207] While displaying the messaging user interface, the device detects an input by a first contact on the touch-sensitive surface while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar (e.g., detect a gesture by a contact on a touch-sensitive display at the location of the first avatar, or detect a gesture by a contact on a touch-sensitive surface while a cursor or other pointer is at the location of the first avatar). For example, device 100-1 detects an input including contact 4030 on Abe’s avatar (e.g., displayed on top of stack of avatars 4002), in Figure 40S. In another example, device 100-1 detects an input including contact 4038 on Mary Todd’s avatar 4008, in Figure 40U.
[00208] In response to detecting the input by the first contact, in accordance with a determination that the input meets menu-activation-criteria, wherein the menu-activation-criteria require that a characteristic intensity of the contact on the touch-sensitive surface meet a respective intensity threshold in order for the menu-activation criteria to be met, the device displays (6806) a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface. For example, in response to detecting an increase in a characteristic intensity of contact 4038 on Mary Todd's avatar 4008, meeting a predetermined intensity threshold (e.g., ITL or ITD), between Figures 40U and 40V, device 100-1 displays action menu 3882, including actions 3884 for directly interacting with Mary Todd (e.g., interacting only with Mary Todd), in Figure 40V.
[00209] In response to detecting the input by the first contact, in accordance with a determination that the input meets avatar-spreading-criteria, wherein the avatar-spreading-criteria do not require that a characteristic intensity of the contact on the touchscreen meet the respective intensity threshold in order for the selection criteria to be met, the device displays (6808) the plurality of avatars in an array (e.g., a substantially or completely non-overlapping array). For example, in response to detecting an input including contact 4030 on Abe's avatar 3510, in Figure 40S, where a characteristic intensity of contact 4030 does not meet a predetermined intensity threshold (e.g., ITL or ITD), device 100-1 displays an array of avatars in avatar stack 4002—e.g., Abe's avatar 3510, Mary Todd's avatar 4008, Chuck's avatar 4032, Issac's avatar 4034, and Edwin's avatar 4036—in Figure 40T.
[00210] In some embodiments, in accordance with a determination that a characteristic (e.g., a maximum) intensity of the first contact met a predefined intensity threshold (e.g., was between a first intensity threshold and a second intensity threshold), the device replaces display of the first avatar on top of the stack of avatars with display of a second avatar in the plurality of avatars on top of the stack of avatars. For example, in response to detecting an increase in a characteristic intensity of contact 4006 on Abe’s avatar 3510, meeting a predetermined intensity threshold (e.g., ITL or ITD), between Figures 40D and 40E, device 100-1 cycles (e.g., switches or shuffles) stack of avatars 4002 to display Mary Todd’s avatar 4008 on top of stack of avatars 4002 in Figures 40E-40F (e.g., Mary Todd’s avatar 4008 replaces Abe’s avatar 3510 on top of stack of avatars 4002).
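The cycling behavior of paragraph [00210] amounts to rotating the stack so that the second avatar becomes the new top. A sketch, with names taken from the figures for illustration:

struct AvatarStack {
    private(set) var avatars: [String]  // index 0 is the top of the stack

    mutating func cycle() {
        guard avatars.count > 1 else { return }
        let top = avatars.removeFirst()
        avatars.append(top)  // e.g., "Mary Todd" replaces "Abe" on top
    }
}

var stack = AvatarStack(avatars: ["Abe", "Mary Todd", "Chuck", "Issac", "Edwin"])
stack.cycle()
// stack.avatars is now ["Mary Todd", "Chuck", "Issac", "Edwin", "Abe"]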
[00211] In some embodiments, in accordance with a determination that a characteristic (e.g., a maximum) intensity of the first contact did not meet a predetermined intensity threshold (e.g., was between the second intensity threshold and a third intensity threshold), the device displays a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface. For example, in response to detecting an input including contact 4004 on Abe’s avatar 3510, in Figure 40B, where a characteristic intensity of contact 4004 does not meet a predetermined intensity threshold (e.g., ITL or ITD), device 100-1 displays action menu 3882, including actions 3884 for directly interacting with Abe (e.g., interacting only with Abe), in Figure 40C.
[00212] In some embodiments, the second intensity threshold is above the first intensity threshold and the third intensity threshold is above the second intensity threshold
(e.g., tap to cycle through the avatars and light press or deep press to call up a quick action menu for the avatar at the top of the stack).
[00213] In some embodiments, the second intensity threshold is below the first intensity threshold and the third intensity threshold is below the second intensity threshold (e.g., deep press or light press to cycle through the avatars and tap to call up the quick action menu for the avatar at the top of the stack).
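Paragraphs [00212]-[00213] describe two opposite mappings between intensity bands and behaviors. One way to read them is as a configurable policy that assigns either action to the lower or the upper band; the threshold value below is an assumption for illustration only.

enum AvatarAction { case cycleStack, showQuickActionMenu }

struct IntensityPolicy {
    let threshold: Double
    let belowThreshold: AvatarAction
    let atOrAboveThreshold: AvatarAction

    func action(forMaxIntensity intensity: Double) -> AvatarAction {
        intensity < threshold ? belowThreshold : atOrAboveThreshold
    }
}

// [00212]-style: tap cycles; light press or deep press calls up the quick action menu.
let tapCycles = IntensityPolicy(threshold: 0.3,
                                belowThreshold: .cycleStack,
                                atOrAboveThreshold: .showQuickActionMenu)
// [00213]-style: press cycles; tap calls up the quick action menu.
let pressCycles = IntensityPolicy(threshold: 0.3,
                                  belowThreshold: .showQuickActionMenu,
                                  atOrAboveThreshold: .cycleStack)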
[00214] In some embodiments, the avatar-spreading criteria are met (6810) upon detection of a tap gesture on the touch-sensitive surface. For example, avatars 3510, 4008, 4032, 4034, and 4036, in avatar stack 4002, displayed in Figure 40S, are displayed as a spread array (e.g., a substantially or completely non-overlapping array), in Figure 40T, in response to detecting a tap gesture including contact 4030 on Abe’s avatar 3510, in Figure 40S.
[00215] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6812) a menu item that when activated initiates a canvas for sharing digital drawings with the other user (e.g., digital touch action 3884-5 in Figure 40V).
[00216] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6814) a menu item that when activated initiates messaging with only the first other user (e.g., message action 3884-3 in Figure 40V).
[00217] For example, in some embodiments, selecting a messaging menu item (e.g., activatable action 3884-3) causes the electronic device to display a private messaging user interface between the user of the electronic device and the first other user. For example, in response to detecting an input including contact 3886 on activatable message action 3884-3, in Figure 38AY, while messaging user interface 3500 is displaying conversation transcript 3503 corresponding to a messaging session between the user of device 100-1 (e.g., "Andrew") and a plurality of other users, each represented by an avatar in the stack of avatars 4002, in Figure 38AY, device 100-1 replaces display of conversation transcript 3503, in Figure 38AY, with conversation transcript 3700, corresponding to a messaging session between only Abe and Andrew, in Figure 38AZ.
[00218] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6816) a menu item that when activated initiates an email with the first other user (e.g., mail action 3884-4 in Figure 40V).
[00219] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6818) a menu item that when activated initiates a call with the first other user (e.g., call action 3884-1 in Figure 40V).
[00220] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6820) a menu item that when activated initiates a video conference with the first other user (e.g., video call action 3884-2 in Figure 40V).
[00221] In some embodiments, the menu that contains activatable menu items associated with the first avatar (e.g., action menu 3882 in Figure 40V) includes (6822) a menu item that when activated initiates a payment action with the first other user (e.g., payment action 3884-6 in Figure 40V).
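Taken together, paragraphs [00215]-[00221] enumerate six activatable items for the per-avatar action menu. A sketch of that enumeration follows, with the figure reference numbers kept as comments; the enum and the activate function are illustrative models, not API from the patent.

enum AvatarMenuItem: String, CaseIterable {
    case call          // call action 3884-1
    case videoCall     // video call action 3884-2
    case message       // message action 3884-3 (messaging with only the first other user)
    case mail          // mail action 3884-4
    case digitalTouch  // digital touch action 3884-5 (canvas for shared digital drawings)
    case payment       // payment action 3884-6
}

func activate(_ item: AvatarMenuItem, for user: String) {
    switch item {
    case .message:
        print("Open a private transcript with \(user)")  // cf. paragraph [00217]
    default:
        print("Initiate \(item.rawValue) with \(user)")
    }
}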
[00222] It should be understood that the particular order in which the operations in Figures 68A-68B have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein are also applicable in an analogous manner to method 6800 described above with respect to Figures 68A-68B. For example, the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described above with reference to method 6800 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, tactile outputs, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein. For brevity, these details are not repeated here.
[00223] The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings.
The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
The scope of the present invention is defined by the claims.

Claims (18)

1. A method, comprising: at an electronic device with one or more processors, memory, a touch-sensitive surface, one or more sensors to detect intensities of contacts with the touch-sensitive surface, and a display: displaying (6802) a messaging user interface of a messaging application on the display, the messaging user interface including a conversation transcript of a messaging session between a user of the electronic device and a plurality of other users, a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of avatars, with a first avatar in the plurality of avatars displayed on top of the stack of avatars; while displaying (6804) the messaging user interface, detecting an input by a first contact on the touch-sensitive surface while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar; in response to detecting (6806) the input by the first contact: in accordance with a determination that the input meets menu-activation criteria, wherein the menu-activation criteria require that a characteristic intensity of the contact on the touch-sensitive surface meet a respective intensity threshold in order for the menu-activation criteria to be met, displaying a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface; and in accordance with a determination that the input meets avatar-spreading criteria, wherein the avatar-spreading criteria do not require that a characteristic intensity of the contact on the touch screen meet the respective intensity threshold in order for the avatar-spreading criteria to be met, displaying the plurality of avatars in an array.
2. The method of claim 1, wherein the avatar-spreading criteria are met upon detection of a tap gesture on the touch-sensitive surface.
3. The method of claim 1 or 2, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a canvas for sharing digital drawings with the other user.
4. The method of any one of claims 1-3, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates messaging with only the first other user.
5. The method of any one of claims 1-4, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates an email with the first other user.
6. The method of any one of claims 1-5, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a call with the first other user.
7. The method of any one of claims 1-6, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a video conference with the first other user.
8. The method of any one of claims 1-7, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a payment action with the first other user.
9. An information processing apparatus for use in an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensities of contacts with the touch-sensitive surface, comprising: means for displaying a messaging user interface of a messaging application on the display, the messaging user interface including a conversation transcript of a messaging session between a user of the electronic device and a plurality of other users, a message-input area, and a plurality of avatars, each respective avatar in the plurality of avatars corresponding to a respective other user in the plurality of other users included in the messaging session, wherein the plurality of avatars are displayed as a stack of avatars, with a first avatar in the plurality of avatars displayed on top of the stack of avatars; means for, while displaying the messaging user interface, detecting an input by a first contact on the touch-sensitive surface while a focus selector is at a first location in the messaging user interface that corresponds to the first avatar; in response to detecting the input by the first contact: means for, in accordance with a determination that the input meets menu-activation criteria, wherein the menu-activation criteria require that a characteristic intensity of the contact on the touch-sensitive surface meet a respective intensity threshold in order for the menu-activation criteria to be met, displaying a menu that contains activatable menu items associated with the first avatar overlaid on the messaging user interface; and means for, in accordance with a determination that the input meets avatar-spreading criteria, wherein the avatar-spreading criteria do not require that a characteristic intensity of the contact on the touch screen meet the respective intensity threshold in order for the selection criteria to be met, displaying the plurality of avatars in an array.
10. The information processing apparatus of claim 9, wherein the avatar-spreading criteria are met upon detection of a tap gesture on the touch-sensitive surface.
11. The information processing apparatus of claim 9 or 10, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a canvas for sharing digital drawings with the other user.
12. The information processing apparatus of any one of claims 9-11, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates messaging with only the first other user.
13. The information processing apparatus of any one of claims 9-12, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates an email with the first other user.
14. The information processing apparatus of any one of claims 9-13, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a call with the first other user.
15. The information processing apparatus of any one of claims 9-14, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a video conference with the first other user.
16. The information processing apparatus of any one of claims 9-15, wherein the menu that contains activatable menu items associated with the first avatar includes a menu item that, when activated, initiates a payment action with the first other user.
17. A computer-readable storage medium storing one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensities of contacts with the touch-sensitive surface, cause the electronic device to perform any of the methods of claims 1-8.
18. An electronic device comprising a display, a touch-sensitive surface, one or more sensors to detect intensities of contacts with the touch-sensitive surface, and an information processing apparatus according to any one of claims 9-16.
DKPA201670653A 2016-05-18 2016-08-26 Devices, procedures, and graphical messaging user interfaces DK180170B1 (en)

Priority Applications (44)

Application Number Priority Date Filing Date Title
CN202011289542.7A CN112799569A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN202011289707.0A CN112748840A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
AU2017266930A AU2017266930C1 (en) 2016-05-18 2017-05-18 Devices, Methods, and Graphical User Interfaces for Messaging
CN201910607633.1A CN110333806B (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN201810396354.0A CN108762862B (en) 2016-05-18 2017-05-18 Device, method and graphical user interface for messaging
KR1020187003537A KR101947140B1 (en) 2016-05-18 2017-05-18 Applying acknowledgment options within the graphical messaging user interface
EP19181254.4A EP3594795B1 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
KR1020217040161A KR102511443B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
BR112018073693A BR112018073693A2 (en) 2016-05-18 2017-05-18 devices, methods, and graphical user interfaces for messaging
KR1020197003574A KR102134455B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
CN202011291826.XA CN112748841A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
KR1020197019197A KR102091368B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
CN201911387675.5A CN111176509B (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN201910607635.0A CN110333926A (en) 2016-05-18 2017-05-18 Using confirmation option in graphical messages transmission user interface
CN201910607447.8A CN110399061B (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN202011289515.XA CN113157176A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
EP17728317.3A EP3295615B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement options in a graphical messaging user interface
EP23213783.6A EP4311201A3 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
CN202011291800.5A CN112799570A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
PCT/US2017/033396 WO2017201326A1 (en) 2016-05-18 2017-05-18 Applying acknowledgement options in a graphical messaging user interface
EP22190375.0A EP4113268B1 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
KR1020237008767A KR102636000B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
CN202011289558.8A CN112732147A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN201810396289.1A CN109117068B (en) 2016-05-18 2017-05-18 Device, method and graphical user interface for messaging
CN201780002856.4A CN108476168B (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN202011289545.0A CN112783403A (en) 2016-05-18 2017-05-18 Applying confirmation options in a graphical messaging user interface
CN201910607634.6A CN110377193A (en) 2016-05-18 2017-05-18 Using confirmation option in graphical messages transmission user interface
KR1020197036410A KR102172901B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
EP18167254.4A EP3376358B1 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
KR1020207019976A KR102338357B1 (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
EP19180887.2A EP3620902B1 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
EP19218201.2A EP3680763B1 (en) 2016-05-18 2017-05-18 Devices, methods, and graphical user interfaces for messaging
JP2018510791A JP6538966B2 (en) 2016-05-18 2017-05-18 Apparatus, method and graphical user interface for messaging
KR1020247004154A KR20240023200A (en) 2016-05-18 2017-05-18 Applying acknowledgement of options in a graphical messaging user interface
JP2019106495A JP6851115B2 (en) 2016-05-18 2019-06-06 Using acknowledgment options in the graphical message user interface
AU2019204403A AU2019204403B2 (en) 2016-05-18 2019-06-24 Devices, Methods, and Graphical User Interfaces for Messaging
JP2019218615A JP6710806B2 (en) 2016-05-18 2019-12-03 Using the Acknowledgment Option in the Graphical Message User Interface
AU2019283863A AU2019283863B2 (en) 2016-05-18 2019-12-18 Devices, methods, and graphical user interfaces for messaging
AU2020202396A AU2020202396B2 (en) 2016-05-18 2020-04-06 Devices, methods, and graphical user interfaces for messaging
JP2020092417A JP6967113B2 (en) 2016-05-18 2020-05-27 Using acknowledgment options in the graphical message user interface
JP2021173046A JP7263481B2 (en) 2016-05-18 2021-10-22 Using the Acknowledgment Option in the Graphical Message User Interface
AU2021269318A AU2021269318B2 (en) 2016-05-18 2021-11-16 Devices, methods, and graphical user interfaces for messaging
AU2023201827A AU2023201827A1 (en) 2016-05-18 2023-03-23 Devices, methods, and graphical user interfaces for messaging
JP2023065202A JP2023099007A (en) 2016-05-18 2023-04-12 Use of confirmation response option in graphical message user interface

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662338502P 2016-05-18 2016-05-18
US201662339078P 2016-05-19 2016-05-19
US201662349114P 2016-06-12 2016-06-12
DKPA201670636A DK180169B1 (en) 2016-05-18 2016-08-24 Devices, procedures, and graphical messaging user interfaces

Publications (2)

Publication Number Publication Date
DK201670653A1 DK201670653A1 (en) 2017-12-04
DK180170B1 true DK180170B1 (en) 2020-07-14

Family

ID=60516186

Family Applications (13)

Application Number Title Priority Date Filing Date
DKPA201670641A DK201670641A1 (en) 2016-05-18 2016-08-24 Devices, Methods, and Graphical User Interfaces for Messaging
DKPA201670642A DK179831B1 (en) 2016-05-18 2016-08-24 Devices, methods and graphical user interfaces for messaging
DKPA201670636A DK180169B1 (en) 2016-05-18 2016-08-24 Devices, procedures, and graphical messaging user interfaces
DKPA201670653A DK180170B1 (en) 2016-05-18 2016-08-26 Devices, procedures, and graphical messaging user interfaces
DKPA201670648A DK179363B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670647A DK179830B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670655A DK179829B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670650A DK179478B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670654A DK179753B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670649A DK179174B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670652A DK180979B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670651A DK179747B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA202070483A DK202070483A1 (en) 2016-05-18 2020-07-10 Devices, Methods, and Graphical User Interfaces for Messaging

Family Applications Before (3)

Application Number Title Priority Date Filing Date
DKPA201670641A DK201670641A1 (en) 2016-05-18 2016-08-24 Devices, Methods, and Graphical User Interfaces for Messaging
DKPA201670642A DK179831B1 (en) 2016-05-18 2016-08-24 Devices, methods and graphical user interfaces for messaging
DKPA201670636A DK180169B1 (en) 2016-05-18 2016-08-24 Devices, procedures, and graphical messaging user interfaces

Family Applications After (9)

Application Number Title Priority Date Filing Date
DKPA201670648A DK179363B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670647A DK179830B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670655A DK179829B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670650A DK179478B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670654A DK179753B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670649A DK179174B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670652A DK180979B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA201670651A DK179747B1 (en) 2016-05-18 2016-08-26 Devices, methods and graphical user interfaces for messaging
DKPA202070483A DK202070483A1 (en) 2016-05-18 2020-07-10 Devices, Methods, and Graphical User Interfaces for Messaging

Country Status (1)

Country Link
DK (13) DK201670641A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
US9716825B1 (en) 2016-06-12 2017-07-25 Apple Inc. User interface for camera effects
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
JP7332518B2 (en) * 2020-03-30 2023-08-23 本田技研工業株式会社 CONVERSATION SUPPORT DEVICE, CONVERSATION SUPPORT SYSTEM, CONVERSATION SUPPORT METHOD AND PROGRAM
DK202070624A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
CN117899478B (en) * 2024-03-18 2024-06-04 腾讯科技(深圳)有限公司 Virtual character control method and related device

Family Cites Families (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997019429A1 (en) * 1995-11-20 1997-05-29 Motorola Inc. Displaying graphic messages in a radio receiver
GB0113537D0 (en) * 2001-06-05 2001-07-25 Superscape Plc Improvements in message display
US7702315B2 (en) * 2002-10-15 2010-04-20 Varia Holdings Llc Unified communication thread for wireless mobile communication devices
JP2006520053A (en) * 2003-03-03 2006-08-31 アメリカ オンライン インコーポレイテッド How to use an avatar to communicate
US8171084B2 (en) * 2004-01-20 2012-05-01 Microsoft Corporation Custom emoticons
US20060041629A1 (en) * 2004-08-23 2006-02-23 Luigi Lira Magnification of incoming and outgoing messages in the user interface of instant messaging and other digital communication services
US7619616B2 (en) * 2004-12-21 2009-11-17 Microsoft Corporation Pressure sensitive controls
US8464176B2 (en) * 2005-01-19 2013-06-11 Microsoft Corporation Dynamic stacking and expansion of visual items
KR100704153B1 (en) * 2005-07-06 2007-04-06 삼성전자주식회사 Method for changing character size of message in mobile communication terminal
US8510109B2 (en) * 2007-08-22 2013-08-13 Canyon Ip Holdings Llc Continuous speech transcription performance indication
US7861175B2 (en) * 2006-09-29 2010-12-28 Research In Motion Limited IM contact list entry as a game in progress designate
US8943018B2 (en) * 2007-03-23 2015-01-27 At&T Mobility Ii Llc Advanced contact management in communications networks
TWI374382B (en) * 2008-09-01 2012-10-11 Htc Corp Icon operation method and icon operation module
US8762885B2 (en) * 2008-12-15 2014-06-24 Verizon Patent And Licensing Inc. Three dimensional icon stacks
US9159074B2 (en) * 2009-03-23 2015-10-13 Yahoo! Inc. Tool for embedding comments for objects in an article
WO2011085248A1 (en) * 2010-01-07 2011-07-14 Swakker, Llc Methods and apparatus for modifying a multimedia object within an instant messaging session at a mobile communication device
KR101740433B1 (en) * 2010-06-01 2017-05-26 엘지전자 주식회사 Mobile Terminal and Method for Displaying Message thereof
CN101867636B (en) * 2010-06-02 2015-02-04 华为终端有限公司 Method for executing user command and terminal equipment
KR101701151B1 (en) * 2010-09-20 2017-02-02 삼성전자주식회사 Integrated Message Transmitting and Receiving Method and Apparatus Using Portable Device
US20120110064A1 (en) * 2010-11-01 2012-05-03 Google Inc. Content sharing interface for sharing content in social networks
KR20120081368A (en) * 2011-01-11 2012-07-19 주식회사 엔씨소프트 Method of game invitation with chatting window in mobile platform
KR20120107836A (en) * 2011-03-22 2012-10-04 주식회사 아카북 Method and apparatus for providing sticker image services in user terminal
CN103797438B (en) * 2011-06-24 2018-02-23 谷歌有限责任公司 Group talks between multiple participants
KR101801188B1 (en) * 2011-07-05 2017-11-24 엘지전자 주식회사 Mobile device and control method for the same
IL214855A0 (en) * 2011-08-28 2011-10-31 Arnon Joseph A method and device for carrying out a computerized group session
US8819154B2 (en) * 2011-10-14 2014-08-26 Blackberry Limited User interface methods and apparatus for use in communicating text and photo messages
KR101521332B1 (en) * 2011-11-08 2015-05-20 주식회사 다음카카오 Method of provicing a lot of services extended from a instant messaging service and the instant messaging service
US8366546B1 (en) * 2012-01-23 2013-02-05 Zynga Inc. Gamelets
US9685160B2 (en) * 2012-04-16 2017-06-20 Htc Corporation Method for offering suggestion during conversation, electronic device using the same, and non-transitory storage medium
KR101917689B1 (en) * 2012-05-22 2018-11-13 엘지전자 주식회사 Mobile terminal and control method thereof
KR101685226B1 (en) * 2012-06-12 2016-12-20 라인 가부시키가이샤 Messenger interworking service system and method using social graph based on relationships of messenger platform
KR102013443B1 (en) * 2012-09-25 2019-08-22 삼성전자주식회사 Method for transmitting for image and an electronic device thereof
EP2713323A1 (en) * 2012-09-27 2014-04-02 BlackBerry Limited Apparatus and method pertaining to automatically-suggested emoticons
KR102014778B1 (en) * 2012-12-14 2019-08-27 엘지전자 주식회사 Digital device for providing text messaging service and the method for controlling the same
FR3003715A1 (en) * 2013-03-25 2014-09-26 France Telecom METHOD OF EXCHANGING MULTIMEDIA MESSAGES
KR102027899B1 (en) * 2013-05-21 2019-10-02 삼성전자 주식회사 Method and apparatus for providing information using messenger
KR20140143971A (en) * 2013-06-10 2014-12-18 삼성전자주식회사 Shared home screen apparatus and method thereof
WO2014200621A1 (en) * 2013-06-13 2014-12-18 Evernote Corporation Initializing chat sessions by pointing to content
JP5809207B2 (en) * 2013-07-30 2015-11-10 グリー株式会社 Message communication program, message communication method, and message communication system
US9553832B2 (en) * 2013-08-13 2017-01-24 Facebook, Inc. Techniques to interact with an application via messaging
US10320730B2 (en) * 2013-09-10 2019-06-11 Xiaomi Inc. Method and device for displaying message
KR102057944B1 (en) * 2013-09-17 2019-12-23 삼성전자주식회사 Terminal device and sharing method thereof
WO2015050966A1 (en) * 2013-10-01 2015-04-09 Filmstrip, Inc. Image and message integration system and method
KR102161764B1 (en) * 2013-10-31 2020-10-05 삼성전자주식회사 Method and computer readable recording medium for displaying a communication window of a messenger using a cartoon image
KR20150061336A (en) * 2013-11-27 2015-06-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20150188861A1 (en) * 2013-12-26 2015-07-02 Aaren Esplin Mechanism for facilitating dynamic generation and transmission of canned responses on computing devices
US10050926B2 (en) * 2014-02-05 2018-08-14 Facebook, Inc. Ideograms based on sentiment analysis
US20160014059A1 (en) * 2015-09-30 2016-01-14 Yogesh Chunilal Rathod Presenting one or more types of interface(s) or media to calling and/or called user while acceptance of call
JP2015185173A (en) * 2014-03-24 2015-10-22 株式会社 ハイヂィープ Emergency operation method and terminal machine for target to be run by touch pressure and touch area
JP6484079B2 (en) * 2014-03-24 2019-03-13 株式会社 ハイディープHiDeep Inc. Kansei transmission method and terminal for the same
US10845982B2 (en) * 2014-04-28 2020-11-24 Facebook, Inc. Providing intelligent transcriptions of sound messages in a messaging application
US9367215B2 (en) * 2014-04-30 2016-06-14 AthenTek Incorporated Mobile devices and related methods for configuring a remote device
KR102105961B1 (en) * 2014-05-13 2020-05-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20150332534A1 (en) * 2014-05-15 2015-11-19 Narvii Inc. Systems and methods implementing user interface objects
WO2015183756A1 (en) * 2014-05-31 2015-12-03 Apple Inc. Message user interfaces for capture and transmittal of media and location content
WO2015183699A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Predictive messaging method
US9207835B1 (en) * 2014-05-31 2015-12-08 Apple Inc. Message user interfaces for capture and transmittal of media and location content
US10218652B2 (en) * 2014-08-08 2019-02-26 Mastercard International Incorporated Systems and methods for integrating a chat function into an e-reader application
US9432314B2 (en) * 2014-08-15 2016-08-30 Microsoft Technology Licensing, Llc Quick navigation of message conversation history
US10361986B2 (en) * 2014-09-29 2019-07-23 Disney Enterprises, Inc. Gameplay in a chat thread
US10965633B2 * 2014-09-29 2021-03-30 Microsoft Technology Licensing, LLC Session history horizon control
US20160117299A1 (en) * 2014-10-24 2016-04-28 Apple Inc. Device, method, and graphical user interface for visible and interactive corrected content
US20160117665A1 (en) * 2014-10-27 2016-04-28 Facebook, Inc. Facilitating initiating payments without a payment credential
CN104601812A (en) * 2015-01-05 2015-05-06 小米科技有限责任公司 Message content showing method, message content showing determination method, device and system
US10042900B2 (en) * 2015-03-23 2018-08-07 Dropbox, Inc. External user notifications in shared folder backed integrated workspaces
KR101567555B1 (en) * 2015-05-18 2015-11-20 이미지랩409(주) Social network service system and method using image

Also Published As

Publication number Publication date
DK179174B1 (en) 2018-01-02
DK201670653A1 (en) 2017-12-04
DK179831B1 (en) 2019-07-22
DK201670650A1 (en) 2017-12-04
DK180169B1 (en) 2020-07-13
DK201670655A1 (en) 2017-12-04
DK201670648A1 (en) 2017-12-04
DK179747B1 (en) 2019-05-01
DK201670652A1 (en) 2017-12-04
DK179478B1 (en) 2018-12-07
DK180979B1 (en) 2022-08-29
DK179753B1 (en) 2019-05-08
DK201670649A1 (en) 2017-12-04
DK179829B1 (en) 2019-07-22
DK201670636A1 (en) 2017-12-04
DK202070483A1 (en) 2020-07-17
DK201670641A1 (en) 2017-12-04
DK201670647A1 (en) 2017-12-04
DK201670651A1 (en) 2017-12-04
DK179363B1 (en) 2018-05-22
DK201670642A1 (en) 2017-12-04
DK201670654A1 (en) 2017-12-11
DK179830B1 (en) 2019-07-22

Similar Documents

Publication Publication Date Title
US11966579B2 (en) Devices, methods, and graphical user interfaces for messaging
US11954323B2 (en) Devices, methods, and graphical user interfaces for initiating a payment action in a messaging session
AU2022224714B2 (en) Devices and methods for interacting with an application switching user interface
US10884608B2 (en) Devices, methods, and graphical user interfaces for content navigation and manipulation
DK180170B1 (en) Devices, procedures, and graphical messaging user interfaces
AU2018260930B2 (en) Handwriting keyboard for small screens
EP3335103B1 (en) Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
DK179037B1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with a Control Object while Dragging Another Object
WO2014105279A1 (en) Device, method, and graphical user interface for switching between user interfaces
WO2013169865A2 (en) Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US20180088750A1 (en) Devices, Methods, and Graphical User Interfaces for Creating and Displaying Application Windows
US11966578B2 (en) Devices and methods for integrating video with user interface navigation
US20240069716A1 (en) Devices and Methods for Interacting with an Application Switching User Interface
US10540071B2 (en) Device, method, and graphical user interface for displaying a zoomed-in view of a user interface

Legal Events

Date Code Title Description
PME Patent granted

Effective date: 20200714