CN117971106A - System and method for interacting with a user interface

System and method for interacting with a user interface

Info

Publication number
CN117971106A
Authority
CN
China
Prior art keywords
user interface
input
contact
touch
search
Prior art date
Legal status
Pending
Application number
CN202410161475.2A
Other languages
Chinese (zh)
Inventor
N. de Vries
C. G. Karunamuni
W. M. Taylor
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Priority claimed from US17/745,782 (granted as US12026364B2)
Application filed by Apple Inc
Priority claimed from PCT/US2022/029659 (published as WO2022245846A1)
Publication of CN117971106A


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to systems and methods for interacting with a user interface. A device detects a first user input while displaying a first user interface including a first plurality of notifications in a notification list. In response, in accordance with a determination that the first input includes a swipe input in a first direction and an end of the notification list has been reached, displaying a search input area; and in accordance with a determination that the first input includes a swipe input in the first direction and an end of the notification list has not been reached, displaying a second plurality of notifications between the first plurality of notifications and the end of the notification list.

Description

System and method for interacting with a user interface
Divisional Application Statement
This application is a divisional application of Chinese patent application No. 202280035146.2, entitled "System and method for interacting with a user interface", filed on May 17, 2022.
Related patent application
The present application is a continuation of U.S. patent application Ser. No. 17/745,782, filed May 16, 2022, which claims priority to U.S. provisional patent application Ser. No. 63/189,652, filed May 17, 2021.
Technical Field
The present disclosure relates generally to electronic devices having touch-sensitive surfaces, including but not limited to electronic devices having touch-sensitive surfaces that provide functionality through a graphical user interface, and more particularly to devices and methods for displaying a search user interface in accordance with user interactions.
Background
The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has grown significantly in recent years. Exemplary touch sensitive surfaces include touchpads and touch screen displays. Such surfaces are widely used to select, launch and manage software applications.
For portable electronic devices, existing methods for searching for and accessing content and/or applications of interest are inefficient and cumbersome. For example, portable devices with small screens (e.g., smart phones and other pocket-sized devices) typically display a single application at a time. With such devices, locating desired content in an application, or desired functionality within an application, without accurate knowledge of its whereabouts on the device is difficult and may take multiple steps and multiple inputs from the user. This situation places a significant cognitive burden on the user when searching for content and/or applications of interest on the device. In addition, existing methods for searching for and accessing content take longer than necessary, which degrades the user experience and consumes more energy. This latter consideration is particularly acute in battery-powered devices.
Disclosure of Invention
Accordingly, there is a need for computing devices with faster, more efficient methods and interfaces for searching and accessing content and applications on portable electronic devices. Such methods and interfaces may supplement or replace conventional methods for searching and accessing content and applications. Such methods and interfaces may reduce the cognitive burden on the user and result in a more efficient human-machine interface. For battery-powered computing devices, such methods and interfaces conserve power and increase the time interval between battery charges.
An electronic device having a display generating component and a touch-sensitive surface displays a first user interface in a display area. While displaying the first user interface, the device detects a first touch gesture that includes detecting a first set of one or more contacts on the touch-sensitive surface. In response to detecting the first touch gesture and in accordance with a determination that the first touch gesture meets a first criterion, the device replaces at least a portion of the first user interface with a first search user interface in the display area. The first criterion includes a first requirement that is satisfied in accordance with a determination that: the first touch gesture includes a first movement of a first contact in a first direction across a first portion of a first edge of the touch-sensitive surface. In accordance with a determination that the first touch gesture satisfies a second criterion different from the first criterion, the device replaces at least a portion of the first user interface with a plurality of previously received notifications in the display area. The second criterion includes a second requirement that is satisfied in accordance with a determination that: the first touch gesture includes a second movement of the first contact in the first direction across a second portion of the first edge of the touch-sensitive surface. The second portion of the first edge is different from the first portion of the first edge.
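To make the two criteria concrete, the following is a minimal Swift sketch of the dispatch logic described above. It assumes the example mapping given later in this disclosure (middle portion of the top edge for search, left portion for notifications); the type and function names are illustrative assumptions, not the claimed implementation.

```swift
// Minimal sketch of the first/second criteria. All names are illustrative.

enum TopEdgePortion { case left, middle, right }

enum EdgeSwipeAction {
    case showSearchUserInterface   // first criterion satisfied
    case showNotificationCenter    // second criterion satisfied
    case none                      // neither criterion satisfied
}

/// Decides which user interface replaces the current one when a downward
/// swipe crosses the given portion of the top edge of the touch-sensitive
/// surface.
func action(forDownwardSwipeAcross portion: TopEdgePortion) -> EdgeSwipeAction {
    switch portion {
    case .middle: return .showSearchUserInterface  // "first portion" of the first edge
    case .left:   return .showNotificationCenter   // "second portion" of the first edge
    case .right:  return .none                     // e.g., reserved for other system UI
    }
}

// Example: a downward swipe over the middle of the top edge.
print(action(forDownwardSwipeAcross: .middle))  // showSearchUserInterface
```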
In some embodiments, the first criteria and the second criteria can be met when displaying a home screen, an application user interface, a wake screen user interface, or a cover user interface of the device (see, e.g., Figs. 5A, 5R, 5U, and 5X). In some embodiments, the device detects an additional touch gesture while the second user interface is displayed. In accordance with a determination that the additional touch gesture meets the first criteria, or an additional preset criteria different from the first criteria, the device replaces at least a portion of the second user interface with the first search user interface. In some other embodiments, the device replaces the at least a portion of the second user interface with the first search user interface only in accordance with a determination that the additional gesture meets the first criteria.
Thus, the electronic device is provided with a faster and more efficient method for displaying search user interfaces in various contexts, such as from an application, home screen, wake screen, lock screen, or notification center user interface. The displayed search user interface can be used to search, and then present search results from, various content sources, including web pages, documents, and applications (e.g., messages, photos, calendars, contacts, etc.). The search user interface may be displayed in response to a convenient swipe-down gesture starting from a middle portion of a top edge of the electronic device. Additionally, the method further provides for displaying a notification center user interface in response to a swipe-down gesture starting from a different portion of the top edge of the electronic device (e.g., the left portion of the top edge). Thus, the user is provided with a convenient and efficient way to display frequently desired user interfaces with different edge swipe gestures.
An electronic device in communication with the display generation component and the one or more input devices displays a first user interface. The first user interface includes a first plurality of notifications in a notification list. While the first user interface is displayed, the device detects a first user input, the first user input comprising a first input. In response to detecting the first user input and in accordance with a determination that the first input includes a swipe input in a first direction and that the end of the notification list has been reached, the device displays a search input area. In accordance with a determination that the first input includes the swipe input in the first direction and an end of the notification list has not been reached, the device displays a second plurality of notifications between the first plurality of notifications and the end of the notification list.
In some embodiments, in accordance with a determination that the first input includes a swipe input in a second direction different from the first direction, the device displays a plurality of previously received notifications in the first user interface. The plurality of previously received notifications includes notifications that are not among the first plurality of notifications in the notification list. In some embodiments, the first user interface is a wake screen user interface that is displayed in response to detecting a request to wake the display generation component from a low power mode.
Thus, the electronic device is also provided with a faster and more efficient method for navigating between a search user interface and a notification center user interface. The user may scroll through a notification list that includes notifications generated by multiple applications, for example by applying upward and downward swipe gestures while the notification center user interface is displayed. When the end of the notification list has been reached, the user may further display the search user interface by applying an additional swipe gesture. The ability to navigate from the notification center user interface to the search user interface improves efficiency and convenience, without the need for additional user input to close and open the respective user interfaces.
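A minimal Swift sketch of this navigation logic follows, assuming the "first direction" is an upward scroll of the list and the "second direction" is a downward swipe that reveals the notification history; the types and names are hypothetical.

```swift
// Sketch of the notification-list navigation described above.

enum SwipeDirection { case firstDirection, secondDirection }

enum NotificationListResponse {
    case showSearchInputArea                  // end of list reached
    case revealMoreNotifications(count: Int)  // scroll the list further
    case showNotificationHistory              // previously received notifications
}

struct NotificationListState {
    var displayedCount: Int   // the "first plurality" currently shown
    var totalCount: Int       // all notifications in the list
    var endReached: Bool { displayedCount >= totalCount }
}

func respond(to swipe: SwipeDirection,
             state: NotificationListState) -> NotificationListResponse {
    switch swipe {
    case .firstDirection where state.endReached:
        return .showSearchInputArea
    case .firstDirection:
        // Display a "second plurality" of notifications between the first
        // plurality and the end of the list.
        return .revealMoreNotifications(count: state.totalCount - state.displayedCount)
    case .secondDirection:
        return .showNotificationHistory
    }
}
```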
The methods described herein for accessing frequently used user interfaces generally reduce the cognitive burden on a user when locating content, applications, and/or notifications on an electronic device. Furthermore, the methods reduce the time needed to access such frequently used user interfaces, thereby increasing the operating time of a battery-operated electronic device between charges.
Drawings
For a better understanding of the various described embodiments, reference should be made to the following detailed description taken in conjunction with the following drawings, in which like reference numerals designate corresponding parts throughout the several views.
Fig. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
Fig. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments.
Fig. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.
Fig. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
Fig. 4A illustrates an exemplary user interface for an application menu on a portable multifunction device in accordance with some embodiments.
Fig. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface separate from a display in accordance with some embodiments.
Figs. 5A-5AX illustrate exemplary user interactions for displaying a search user interface in various contexts, according to some embodiments.
Figs. 6A-6J are flowcharts illustrating methods of displaying a search user interface in response to user interaction, according to some embodiments.
Detailed Description
The methods, apparatus, and GUIs described herein improve the interaction and functionality of search user interfaces in a variety of ways.
In some embodiments, an electronic device having a display generation component and a touch-sensitive surface displays a first user interface in a display area. While displaying the first user interface, the device detects a first touch gesture, which includes detecting a first set of one or more contacts on the touch-sensitive surface. In some embodiments, the first user interface is a home screen, an application user interface, a wake screen user interface, or a cover user interface of the device. In response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture meets a first criterion, the device replaces at least a portion of the first user interface with a first search user interface in the display area. The first criterion includes a first requirement that is satisfied in accordance with a determination that the first touch gesture includes a first movement of a first contact in a first direction across a first portion of a first edge of the touch-sensitive surface. In some embodiments, detecting the first touch gesture includes detecting movement of the first set of one or more contacts from outside the touch-sensitive surface, across the first edge of the touch-sensitive surface, and onto the touch-sensitive surface. In accordance with a determination that the first touch gesture meets a second criterion different from the first criterion, the device replaces at least a portion of the first user interface with a plurality of previously received notifications (e.g., a notification center user interface) in the display area. The second criterion includes a second requirement that is satisfied in accordance with a determination that the first touch gesture includes a second movement of the first contact in the first direction across a second portion of the first edge of the touch-sensitive surface. The second portion of the first edge is different from the first portion of the first edge. In some embodiments, the first portion and the second portion of the first edge are, optionally, adjacent to each other (e.g., the left portion of the first edge and the middle portion of the first edge). In some embodiments, the device further detects an additional touch gesture (e.g., a multi-contact swipe gesture, or a swipe gesture starting from a middle portion of the touch-sensitive surface) while the second user interface is displayed. In accordance with a determination that the additional touch gesture meets the first criteria, or an additional preset criteria different from the first criteria, the device replaces at least a portion of the second user interface with the first search user interface.
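The disclosure requires only that the first and second portions of the edge differ. For illustration, the sketch below partitions the top edge into equal thirds and could feed the dispatch sketch shown earlier; the one-third boundaries are an assumption, not a claimed value.

```swift
// Assumed partition of the top edge into three equal portions. The patent
// only requires that the first and second portions differ, so the 1/3 and
// 2/3 boundaries are illustrative, not prescribed.

enum EdgeThird { case left, middle, right }

/// Classifies where along the top edge a contact crossed, given the
/// horizontal coordinate of the crossing and the width of the edge.
func portion(ofTopEdgeAtX x: Double, edgeWidth: Double) -> EdgeThird {
    precondition(edgeWidth > 0, "edge width must be positive")
    switch x / edgeWidth {
    case ..<(1.0 / 3.0): return .left
    case ..<(2.0 / 3.0): return .middle
    default:             return .right
    }
}
```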
In some implementations, an electronic device in communication with a display generation component and one or more input devices displays a first user interface (e.g., a wake screen user interface). The first user interface includes a first plurality of notifications in a notification list. While the first user interface is displayed, the device detects a first user input, the first user input comprising a first input. In response to detecting the first user input, and in accordance with a determination that the first input includes a swipe input in a first direction and that the end of the notification list has been reached, the device displays a search input area. In accordance with a determination that the first input includes a swipe input in the first direction and the end of the notification list has not been reached, the device displays a second plurality of notifications (e.g., the device scrolls the notification list) between the first plurality of notifications and the end of the notification list. In some implementations, in accordance with a determination that the first input includes a swipe input in a second direction different from the first direction, the device displays a plurality of previously received notifications in the first user interface. The plurality of previously received notifications includes notifications that are not among the first plurality of notifications in the notification list.
Figs. 1A-1B, 2, and 3 provide a description of an exemplary device. Figs. 5A-5AX illustrate exemplary user interactions for displaying a search user interface in various contexts. Figs. 6A-6J are flowcharts illustrating methods of displaying a search user interface in response to user interaction. The user interfaces in Figs. 5A-5AX are used to illustrate the processes in Figs. 6A-6J.
Exemplary apparatus
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. Numerous specific details are set forth in the following detailed description in order to provide a thorough understanding of the various described embodiments. It will be apparent, however, to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms "first," "second," etc. may be used herein to describe various elements in some cases, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first contact may be named a second contact, and similarly, a second contact may be named a first contact without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact unless the context clearly indicates otherwise.
The terminology used in the description of the various illustrated embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and in the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" is optionally interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is optionally interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]", depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also includes other functions, such as PDA and/or music player functions. Exemplary embodiments of the portable multifunction device include, but are not limited to, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as a laptop or tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad), are optionally used. It should also be appreciated that, in some embodiments, the device is not a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device typically supports various applications, such as one or more of the following: note applications, drawing applications, presentation applications, word processing applications, website creation applications, disk editing applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, mail applications, instant messaging applications, fitness support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
The various applications executing on the device optionally use at least one generic physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or changed for different applications and/or within the respective applications. In this way, the common physical architecture of the devices (such as the touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and transparent to the user.
Attention is now directed to embodiments of a portable device having a touch-sensitive display. Fig. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display system 112 is sometimes referred to as a "touch screen" for convenience and is sometimes referred to simply as a touch-sensitive display. Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external ports 124. The device 100 optionally includes one or more optical sensors 164. The device 100 optionally includes one or more intensity sensors 165 for detecting the intensity of contacts on the device 100 (e.g., on a touch-sensitive surface, such as the touch-sensitive display system 112 of the device 100). The device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on the device 100 (e.g., on a touch-sensitive surface, such as the touch-sensitive display system 112 of the device 100 or the touchpad 355 of the device 300). These components optionally communicate via one or more communication buses or signal lines 103.
It should be understood that the device 100 is merely one example of a portable multifunction device, and that the device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in fig. 1A are implemented in hardware, software, firmware, or any combination thereof (including one or more signal processing circuits and/or application specific integrated circuits).
Memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to the memory 102 by other components of the device 100, such as the CPU 120 and the peripheral interface 118, is optionally controlled by a memory controller 122.
Peripheral interface 118 may be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the device 100 and process data.
In some embodiments, peripheral interface 118, CPU 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates via wireless communication with networks, such as the Internet (also known as the World Wide Web (WWW)), intranets, and/or wireless networks, such as cellular telephone networks, wireless local area networks (LANs), and/or metropolitan area networks (MANs), and other devices. The wireless communication optionally uses any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPA), Long Term Evolution (LTE), Near Field Communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. Audio circuitry 110 receives audio data from peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to speaker 111. The speaker 111 converts electrical signals into sound waves that are audible to humans. The audio circuitry 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuitry 110 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 118 for processing. The audio data is optionally retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 108 by the peripheral interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g., 212 in Fig. 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset having both an output (e.g., a headphone for one or both ears) and an input (e.g., a microphone).
The I/O subsystem 106 couples input/output peripheral devices on the device 100, such as the touch-sensitive display system 112 and other input or control devices 116, to the peripheral device interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. One or more input controllers 160 receive electrical signals from/transmit electrical signals to other input or control devices 116. Other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In some alternative implementations, one or more input controllers 160 are optionally coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, a stylus, and/or a pointing device such as a mouse. One or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2).
The touch sensitive display system 112 provides an input interface and an output interface between the device and the user. The display controller 156 receives electrical signals from and/or transmits electrical signals to the touch sensitive display system 112. The touch sensitive display system 112 displays visual output to a user. Visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively, "graphics"). In some embodiments, some or all of the visual output corresponds to a user interface object. As used herein, the term "affordance" refers to a user-interactive graphical user interface object (e.g., a graphical user interface object configured to respond to input directed to the graphical user interface object). Examples of user interactive graphical user interface objects include, but are not limited to, buttons, sliders, icons, selectable menu items, switches, hyperlinks, or other user interface controls.
The touch-sensitive display system 112 has a touch-sensitive surface, sensor, or set of sensors that receives input from a user based on haptic and/or tactile contact. The touch-sensitive display system 112 and the display controller 156 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or interruption of the contact) on the touch-sensitive display system 112 and translate the detected contact into interactions with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on the touch-sensitive display system 112. In some implementations, the point of contact between the touch-sensitive display system 112 and the user corresponds to a user's finger or stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or interruption thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In some embodiments, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, California.
The touch sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some implementations, the touch screen video resolution exceeds 400dpi (e.g., 500dpi, 800dpi, or greater). The user optionally uses any suitable object or appendage, such as a stylus, finger, or the like, to make contact with the touch-sensitive display system 112. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which may not be as accurate as stylus-based input due to the large contact area of the finger on the touch screen. In some embodiments, the device translates the finger-based coarse input into a precise pointer/cursor position or command for performing the action desired by the user.
In some embodiments, the device 100 optionally includes a touch pad for activating or deactivating a particular function in addition to the touch screen. In some embodiments, the touch pad is a touch sensitive area of the device that, unlike the touch screen, does not display visual output. The touch pad is optionally a touch-sensitive surface separate from the touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
The apparatus 100 also includes a power system 162 for powering the various components. The power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in the portable device.
The apparatus 100 optionally further comprises one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The one or more optical sensors 164 optionally include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The one or more optical sensors 164 receive light projected through the one or more lenses from the environment and convert the light into data representing an image. In conjunction with imaging module 143 (also referred to as a camera module), one or more optical sensors 164 optionally capture still images and/or video. In some embodiments, the optical sensor is located on the back of the device 100 opposite the touch sensitive display system 112 on the front of the device, enabling the touch screen to be used as a viewfinder for still image and/or video image acquisition. In some embodiments, another optical sensor is located on the front of the device to acquire an image of the user (e.g., for self-timer shooting, for video conferencing while the user views other video conference participants on a touch screen, etc.).
The apparatus 100 optionally further comprises one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The one or more contact strength sensors 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other strength sensors (e.g., sensors for measuring force (or pressure) of a contact on a touch-sensitive surface). One or more contact strength sensors 165 receive contact strength information (e.g., pressure information or a surrogate for pressure information) from the environment. In some implementations, at least one contact intensity sensor is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on a rear of the device 100 opposite the touch sensitive display system 112 located on the front of the device 100.
The device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is coupled to the input controller 160 in the I/O subsystem 106. In some implementations, the proximity sensor turns off and disables the touch-sensitive display system 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
The device 100 optionally further comprises one or more tactile output generators 167. FIG. 1A shows a haptic output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. In some embodiments, the tactile output generator 167 includes one or more electroacoustic devices such as speakers or other audio components; and/or electromechanical devices for converting energy into linear motion such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other tactile output generating means (e.g., means for converting an electrical signal into a tactile output on a device). The haptic output generator 167 receives haptic feedback generation instructions from the haptic feedback module 133 and generates a haptic output on the device 100 that is capable of being perceived by a user of the device 100. In some embodiments, at least one tactile output generator is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates tactile output by moving the touch-sensitive surface vertically (e.g., inward/outward of the surface of device 100) or laterally (e.g., backward and forward in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on a rear of the device 100 opposite the touch sensitive display system 112 located on a front of the device 100.
The device 100 optionally further includes one or more accelerometers 168. Fig. 1A shows accelerometer 168 coupled to peripheral interface 118. Alternatively, accelerometer 168 is optionally coupled with input controller 160 in I/O subsystem 106. In some implementations, information is displayed in a portrait view or a landscape view on the touch screen display based on analysis of data received from the one or more accelerometers. The device 100 optionally includes a magnetometer and a GPS (or GLONASS or other global navigation system) receiver, in addition to the accelerometer 168, for obtaining information about the location and orientation (e.g., portrait or landscape) of the device 100.
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a haptic feedback module (or instruction set) 133, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and an application program (or instruction set) 136. Further, in some embodiments, memory 102 stores device/global internal state 157, as shown in fig. 1A and 3. The device/global internal state 157 includes one or more of the following: an active application state indicating which applications (if any) are currently active; display status, which indicates what applications, views, or other information occupy various regions of the touch-sensitive display system 112; sensor status, which includes information obtained from various sensors of the device and other input or control devices 116; and location and/or position information regarding the location and/or pose of the device.
Operating system 126 (e.g., iOS, darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between the various hardware and software components.
The communication module 128 facilitates communication with other devices through one or more external ports 124 and also includes various software components for processing data received by the RF circuitry 108 and/or the external ports 124. External port 124 (e.g., Universal Serial Bus (USB), FireWire, etc.) is adapted to be coupled directly to other devices or indirectly via a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used in iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with, the Lightning connector used in iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California.
The contact/motion module 130 optionally detects contact with the touch-sensitive display system 112 (in conjunction with the display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to contact detection (e.g., by a finger or stylus), such as determining whether a contact has occurred (e.g., detecting a finger-down event), determining the intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-drag events), and determining whether the contact has ceased (e.g., detecting a finger-up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining a speed (magnitude), a velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single contacts (e.g., single-finger contacts or stylus contacts) or to simultaneous multi-point contacts (e.g., "multi-touch"/multi-finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on a touchpad.
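As an illustration of the contact-tracking arithmetic just described, the following Swift sketch estimates speed, velocity, and acceleration from a series of timestamped contact samples using finite differences. The sample type and the differencing scheme are assumptions for illustration.

```swift
import Foundation

// Hedged sketch: estimating motion of a contact point from contact data.

struct ContactSample {
    var x: Double, y: Double   // position in points
    var t: TimeInterval        // timestamp in seconds
}

/// Velocity (magnitude and direction) between two consecutive samples.
func velocity(from a: ContactSample, to b: ContactSample) -> (vx: Double, vy: Double) {
    let dt = max(b.t - a.t, .leastNonzeroMagnitude)  // guard against zero dt
    return ((b.x - a.x) / dt, (b.y - a.y) / dt)
}

/// Speed is the magnitude of the velocity.
func speed(from a: ContactSample, to b: ContactSample) -> Double {
    let v = velocity(from: a, to: b)
    return (v.vx * v.vx + v.vy * v.vy).squareRoot()
}

/// Acceleration estimated from three consecutive samples, using the change
/// in velocity over the average sampling interval.
func acceleration(_ s0: ContactSample, _ s1: ContactSample,
                  _ s2: ContactSample) -> (ax: Double, ay: Double) {
    let v0 = velocity(from: s0, to: s1)
    let v1 = velocity(from: s1, to: s2)
    let dt = max(s2.t - s0.t, .leastNonzeroMagnitude) / 2
    return ((v1.vx - v0.vx) / dt, (v1.vy - v0.vy) / dt)
}
```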
The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different movements, timings, and/or intensities of the detected contacts). Thus, gestures are optionally detected by detecting a particular contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on a touch-sensitive surface includes detecting a finger press event, then detecting one or more finger drag events, and then detecting a finger lift (lift off) event. Similarly, taps, swipes, drags, and other gestures of the stylus are optionally detected by detecting a particular contact pattern of the stylus.
In some embodiments, detecting a finger tap gesture depends on the length of time between detecting the finger-down event and the finger-up event, but is independent of the intensity of the finger contact between detecting the finger-down event and the finger-up event. In some embodiments, a tap gesture is detected in accordance with a determination that the length of time between the finger-down event and the finger-up event is less than a predetermined value (e.g., less than 0.1, 0.2, 0.3, 0.4, or 0.5 seconds), regardless of whether the intensity of the finger contact during the tap reaches a given intensity threshold (greater than a nominal contact-detection intensity threshold), such as a light press or deep press intensity threshold. Thus, a finger tap gesture can satisfy particular input criteria that do not require that the characteristic intensity of a contact satisfy a given intensity threshold in order for the particular input criteria to be met. For clarity, the finger contact in a tap gesture typically needs to satisfy a nominal contact-detection intensity threshold, below which the contact is not detected, in order for the finger-down event to be detected. A similar analysis applies to detecting a tap gesture by a stylus or other contact. In cases where the device is capable of detecting a finger or stylus contact hovering over a touch-sensitive surface, the nominal contact-detection intensity threshold optionally does not correspond to physical contact between the finger or stylus and the touch-sensitive surface.
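A minimal sketch of the intensity-independent tap recognition described above: apart from the nominal contact-detection requirement, only the time between the finger-down and finger-up events matters. The 0.3-second default is one of the example values above; the nominal-intensity value is an assumption.

```swift
import Foundation

// Hedged sketch: tap recognition that never consults light/deep press
// intensity thresholds.

struct ContactEvent {
    var downTime: TimeInterval
    var upTime: TimeInterval
    var maxIntensity: Double                      // tracked, but unused for taps
    var nominalDetectionIntensity: Double = 0.05  // assumed nominal threshold
}

func isTap(_ event: ContactEvent, maxDuration: TimeInterval = 0.3) -> Bool {
    // The contact must at least be detectable at all...
    guard event.maxIntensity >= event.nominalDetectionIntensity else { return false }
    // ...but beyond that, only duration matters; light press and deep press
    // intensity thresholds are irrelevant to tap recognition.
    return event.upTime - event.downTime < maxDuration
}
```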
The same concepts apply in an analogous manner to other types of gestures. For example, a swipe gesture, a pinch gesture, a spread gesture, and/or a long press gesture are optionally detected based on the satisfaction of criteria that are either independent of the intensities of the contacts included in the gesture, or that do not require that the contact(s) performing the gesture reach an intensity threshold in order to be recognized. For example, a swipe gesture is detected based on an amount of movement of one or more contacts; a pinch gesture is detected based on movement of two or more contacts toward each other; a spread gesture is detected based on movement of two or more contacts away from each other; and a long press gesture is detected based on a duration of the contact on the touch-sensitive surface with less than a threshold amount of movement. As such, the statement that particular gesture recognition criteria do not require that the contact intensity meet a respective intensity threshold in order for the particular gesture recognition criteria to be met means that the particular gesture recognition criteria can be satisfied when a contact in the gesture does not reach the respective intensity threshold, and can also be satisfied when one or more contacts in the gesture meet or exceed the respective intensity threshold. In some embodiments, a tap gesture is detected based on a determination that the finger-down and finger-up events are detected within a predefined time period, without regard to whether the contact is above or below the respective intensity threshold during the predefined time period, and a swipe gesture is detected based on a determination that the contact movement is greater than a predefined magnitude, even if the contact is above the respective intensity threshold at the end of the contact movement. Even in implementations where the detection of a gesture is affected by the intensity of the contacts performing the gesture (e.g., the device detects a long press more quickly when the intensity of the contact is above an intensity threshold, or delays detection of a tap input when the intensity of the contact is higher), the detection of those gestures does not require that the contacts reach a particular intensity threshold, so long as the criteria for recognizing the gesture can be met without the contact reaching the particular intensity threshold (e.g., even if the amount of time it takes to recognize the gesture changes).
In some cases, contact intensity thresholds, duration thresholds, and movement thresholds are combined in a variety of different combinations in order to create heuristics for distinguishing two or more different gestures directed to the same input element or region, so that multiple different interactions with the same input element can provide a richer set of user interactions and responses. The statement that a particular set of gesture recognition criteria do not require that the intensity of one or more contacts meet a respective intensity threshold in order for the particular gesture recognition criteria to be met does not preclude the concurrent evaluation of other intensity-dependent gesture recognition criteria, used to identify other gestures that have criteria that are met when a gesture includes a contact with an intensity above the respective intensity threshold. For example, in some cases, first gesture recognition criteria for a first gesture (which do not require that the intensity of the contact meet a respective intensity threshold in order for the first gesture recognition criteria to be met) are in competition with second gesture recognition criteria for a second gesture (which depend on the contact reaching the respective intensity threshold). In such competitions, the gesture is optionally not recognized as meeting the first gesture recognition criteria for the first gesture if the second gesture recognition criteria for the second gesture are met first. For example, if the contact reaches the respective intensity threshold before the contact moves by a predefined amount of movement, a deep press gesture is detected rather than a swipe gesture. Conversely, if the contact moves by the predefined amount of movement before the contact reaches the respective intensity threshold, a swipe gesture is detected rather than a deep press gesture. Even in such cases, the first gesture recognition criteria for the first gesture still do not require that the intensity of the contact meet the respective intensity threshold in order for the first gesture recognition criteria to be met, because if the contact stays below the respective intensity threshold until the gesture ends (e.g., a swipe gesture with a contact that does not increase in intensity above the respective intensity threshold), the gesture would be recognized by the first gesture recognition criteria as a swipe gesture. As such, particular gesture recognition criteria that do not require that the intensity of the contact meet a respective intensity threshold in order for the particular gesture recognition criteria to be met will (A) in some cases ignore the intensity of the contact with respect to the intensity threshold (e.g., for a tap gesture), and/or (B) in some cases still be dependent on the intensity of the contact with respect to the intensity threshold, in the sense that the particular gesture recognition criteria (e.g., for a long press gesture) will fail if a competing set of intensity-dependent gesture recognition criteria (e.g., for a deep press gesture) recognize the input as corresponding to an intensity-dependent gesture before the particular gesture recognition criteria recognize a gesture corresponding to the input (e.g., for a long press gesture that competes with a deep press gesture for recognition).
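The deep-press-versus-swipe competition in the example above can be sketched as a race between two criteria evaluated over the same stream of contact updates, where whichever threshold is crossed first wins. The event model and threshold values below are illustrative assumptions.

```swift
// Hedged sketch of "first criterion met wins" gesture competition.

enum RecognizedGesture { case deepPress, swipe, undecided }

struct ContactUpdate {
    var intensity: Double
    var totalMovement: Double  // distance travelled since finger-down, in points
}

func recognize(updates: [ContactUpdate],
               intensityThreshold: Double = 0.8,
               movementThreshold: Double = 10.0) -> RecognizedGesture {
    for update in updates {
        // Evaluate both competing criteria on every update; whichever
        // threshold is crossed first determines the gesture. If both are
        // crossed in the same update, the deep press wins here, which is a
        // tie-breaking assumption of this sketch.
        if update.intensity >= intensityThreshold { return .deepPress }
        if update.totalMovement >= movementThreshold { return .swipe }
    }
    return .undecided
}
```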
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other displays, including means for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual attribute) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. The graphics module 132 receives, from applications and the like, one or more codes designating the graphics to be displayed, together with (if necessary) coordinate data and other graphic attribute data, and then generates screen image data to output to the display controller 156.
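A minimal sketch of the code-to-graphic resolution just described, in which each graphic is registered under a code, and a code plus coordinate data yields a draw command; all of the types here are hypothetical.

```swift
// Hedged sketch: mapping graphic codes plus coordinate data to draw commands.

struct DrawCommand {
    var graphicName: String
    var x: Double, y: Double
}

struct GraphicsRegistry {
    private var graphicsByCode: [Int: String] = [:]

    /// Assigns a code to a graphic.
    mutating func register(code: Int, graphicName: String) {
        graphicsByCode[code] = graphicName
    }

    /// Resolves a code and coordinate data into a draw command,
    /// or nil if no graphic was registered under that code.
    func command(forCode code: Int, x: Double, y: Double) -> DrawCommand? {
        graphicsByCode[code].map { DrawCommand(graphicName: $0, x: x, y: y) }
    }
}
```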
Haptic feedback module 133 includes various software components for generating instructions (e.g., instructions used by haptic feedback controller 161) to generate haptic output at one or more locations on device 100 using haptic output generator 167 in response to user interaction with device 100.
Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to the telephone 138 for location-based dialing, to the camera 143 as picture/video metadata, and to applications that provide location-based services, such as weather desktop applets, local yellow pages desktop applets, and map/navigation desktop applets).
The application 136 optionally includes the following modules (or sets of instructions) or a subset or superset thereof:
● A contacts module 137 (sometimes referred to as an address book or contact list);
● A telephone module 138;
● A video conference module 139;
● An email client module 140;
● An Instant Messaging (IM) module 141;
● A fitness support module 142;
● A camera module 143 for still and/or video images;
● An image management module 144;
● A browser module 147;
● A calendar module 148;
● A desktop applet module 149, optionally including one or more of: weather desktop applet 149-1, stock market desktop applet 149-2, calculator desktop applet 149-3, alarm clock desktop applet 149-4, and dictionary desktop applet 149-5, as well as other desktop applets obtained by the user and user-created desktop applet 149-6;
● A desktop applet creator module 150 for forming a user-created desktop applet 149-6;
● A search module 151;
● A video and music player module 152 optionally comprised of a video player module and a music player module;
● A notepad module 153;
● A map module 154; and/or
● An online video module 155.
Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with the touch-sensitive display system 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the contacts module 137 includes executable instructions for managing an address book or contact list (e.g., in the application internal state 192 of the contacts module 137 stored in the memory 102 or the memory 370), including: adding names to the address book; deleting names from the address book; associating a telephone number, email address, physical address, or other information with a name; associating an image with a name; categorizing and sorting names; providing a telephone number and/or email address to initiate and/or facilitate communication through telephone 138, video conference 139, email 140, or IM 141; etc.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, phone module 138 includes executable instructions for: inputting a character sequence corresponding to the telephone numbers, accessing one or more telephone numbers in the address book 137, modifying the inputted telephone numbers, dialing the corresponding telephone numbers, conducting a conversation, and disconnecting or hanging up when the conversation is completed. As described above, wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephony module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured by the camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant message module 141 includes executable instructions for: entering a character sequence corresponding to an instant message, modifying previously entered characters, transmitting a respective instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages, or using XMPP, SIMPLE, Apple Push Notification Service (APNs), or IMPS for Internet-based instant messages), receiving instant messages, and viewing received instant messages. In some implementations, the transmitted and/or received instant messages optionally include graphics, photos, audio files, video files, and/or other attachments supported in an MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and video and music player module 152, workout support module 142 includes executable instructions for creating a workout (e.g., with time, distance, and/or calorie burn targets); communication with fitness sensors (in sports equipment and smart watches); receiving fitness sensor data; calibrating a sensor for monitoring fitness; selecting and playing music for exercise; and displaying, storing and transmitting the fitness data.
In conjunction with touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for: capturing still images or videos (including video streams) and storing them in the memory 102, modifying features of the still images or videos, and/or deleting the still images or videos from the memory 102.
In conjunction with the touch-sensitive display system 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, and the camera module 143, the image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, labeling, deleting, presenting (e.g., in a digital slide or album), and storing still images and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet (including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages) according to user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do items, etc.) according to user instructions.
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, and the browser module 147, the widget module 149 provides mini-applications optionally downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5), or mini-applications created by a user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 includes executable instructions for creating widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) according to user instructions.
In conjunction with the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the audio circuit 110, the speaker 111, the RF circuit 108, and the browser module 147, the video and music player module 152 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files), as well as executable instructions for displaying, presenting, or otherwise playing back video (e.g., on the touch-sensitive display system 112 or on an external display wirelessly connected via the external port 124). In some embodiments, the device 100 optionally includes the functionality of an MP3 player such as an iPod (trademark of Apple inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notepad module 153 includes executable instructions for creating and managing notes, to-do lists, and the like in accordance with user instructions.
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, the GPS module 135, and the browser module 147, the map module 154 includes executable instructions for receiving, displaying, modifying, and storing maps and data associated with maps (e.g., driving directions, data of stores and other points of interest at or near a particular location, and other location-based data) according to user instructions.
In conjunction with the touch sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the audio circuit 110, the speaker 111, the RF circuit 108, the text input module 134, the email client module 140, and the browser module 147, the online video module 155 includes executable instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on the touch screen 112 or on an external display connected wirelessly or via the external port 124), send emails with links to particular online videos, and otherwise manage online videos in one or more file formats such as H.264. In some embodiments, the instant messaging module 141 is used to send links to particular online videos instead of the email client module 140.
Each of the modules and applications identified above corresponds to a set of executable instructions for performing one or more of the functions described above, as well as the methods described in the present disclosure (e.g., computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented in separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device in which the operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or touch pad. By using a touch screen and/or a touch pad as the primary input control device for operating the device 100, the number of physical input control devices (e.g., push buttons, dials, etc.) on the device 100 is optionally reduced.
A predefined set of functions performed solely by the touch screen and/or touch pad optionally includes navigation between user interfaces. In some embodiments, the touchpad, when touched by a user, navigates the device 100 from any user interface displayed on the device 100 to a main menu, a main desktop menu, or a root menu. In such implementations, a touch pad is used to implement a "menu button". In some other embodiments, the menu buttons are physical push buttons or other physical input control devices, rather than touch pads.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (in FIG. 1A) or memory 370 (in FIG. 3) includes event classifier 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 136, 137-155, 380-390).
The event classifier 170 receives the event information and determines the application 136-1 to which the event information is to be delivered, and the application view 191 of the application 136-1. The event classifier 170 includes an event monitor 171 and an event dispatcher module 174. In some implementations, the application 136-1 includes an application internal state 192 that indicates one or more current application views that are displayed on the touch-sensitive display system 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application(s) are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some implementations, the application internal state 192 includes additional information, such as one or more of the following: resume information to be used when the application 136-1 resumes execution, user interface state information indicating information being displayed or ready for display by the application 136-1, a state queue for enabling the user to return to a previous state or view of the application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about sub-events (e.g., user touches on the touch sensitive display system 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as a proximity sensor 166, one or more accelerometers 168, and/or microphone 113 (via audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display system 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 transmits event information. In other embodiments, the peripheral interface 118 transmits event information only if there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or receiving an input exceeding a predetermined duration).
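As an illustration of the significant-event filtering just described, the following Swift sketch shows one way a peripherals interface might decide whether event information is worth transmitting. It is a minimal sketch only; the RawInput and PeripheralsInterface types, the thresholds, and all other names are hypothetical and are not part of any actual device API.

```swift
// Minimal sketch: a peripherals interface that transmits event information
// only for significant events, i.e., inputs above a noise threshold and/or
// inputs exceeding a predetermined duration. All names and thresholds are
// hypothetical, not part of any real device API.
struct RawInput {
    let magnitude: Double   // measured signal level
    let duration: Double    // seconds the input has persisted
}

struct PeripheralsInterface {
    let noiseThreshold = 0.1
    let minDuration = 0.05

    // Returns the input only if it qualifies as a significant event.
    func significantEvent(for input: RawInput) -> RawInput? {
        guard input.magnitude > noiseThreshold || input.duration > minDuration else {
            return nil   // below both thresholds: treated as noise, nothing is sent
        }
        return input
    }
}

let peripherals = PeripheralsInterface()
print(peripherals.significantEvent(for: RawInput(magnitude: 0.05, duration: 0.01)) as Any) // nil
print(peripherals.significantEvent(for: RawInput(magnitude: 0.40, duration: 0.20)) as Any) // event
```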
In some implementations, the event classifier 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
When the touch-sensitive display system 112 displays more than one view, the hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a level of programming within the application's programming or view hierarchy. For example, the lowest horizontal view in which a touch is detected is optionally referred to as a hit view, and the set of events that are recognized as correct inputs is optionally determined based at least in part on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of the touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should process sub-events. In most cases, the hit view is the lowest level view in which the initiating sub-event (i.e., the first sub-event in the sequence of sub-events that form the event or potential event) occurs. Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as a hit view.
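The hit-view search described above can be sketched as a recursive descent through the view hierarchy until the lowest view containing the initial sub-event's location is found. The following Swift sketch is a minimal illustration under the assumption of a hypothetical View type with frames in screen coordinates; it is not the actual hit-testing implementation.

```swift
// Minimal sketch of hit-view determination: descend the view hierarchy to
// find the lowest view that contains the location of the initial sub-event.
// The View type is hypothetical; frames are given in screen coordinates to
// keep the sketch short.
struct Point { let x: Double; let y: Double }

final class View {
    let name: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
    var subviews: [View] = []

    init(name: String, frame: (x: Double, y: Double, width: Double, height: Double)) {
        self.name = name
        self.frame = frame
    }

    func contains(_ p: Point) -> Bool {
        p.x >= frame.x && p.x < frame.x + frame.width &&
        p.y >= frame.y && p.y < frame.y + frame.height
    }

    // Returns the deepest descendant (or self) whose frame contains the point.
    func hitView(for p: Point) -> View? {
        guard contains(p) else { return nil }
        for sub in subviews.reversed() {   // front-most subviews first
            if let hit = sub.hitView(for: p) { return hit }
        }
        return self
    }
}

// Usage: a root window containing a panel that contains a button.
let root = View(name: "root", frame: (0, 0, 320, 480))
let panel = View(name: "panel", frame: (0, 100, 320, 200))
let button = View(name: "button", frame: (20, 120, 80, 40))
panel.subviews = [button]
root.subviews = [panel]
print(root.hitView(for: Point(x: 30, y: 130))?.name ?? "none")   // "button"
```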
The active event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some implementations, the active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively engaged views, and therefore determines that all actively engaged views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain actively engaged views.
The event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue that is retrieved by the corresponding event receiver module 182.
In some embodiments, the operating system 126 includes an event classifier 170. Alternatively, the application 136-1 includes an event classifier 170. In yet another embodiment, the event classifier 170 is a stand-alone module or part of another module stored in the memory 102, such as the contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface toolkit or a higher-level object from which the application 136-1 inherits methods and other properties. In some implementations, a respective event handler 190 includes one or more of the following: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from the event classifier 170. Event handler 190 optionally utilizes or invokes data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Additionally, in some implementations, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from the event classifier 170 and identifies an event from the event information. The event recognizer 180 includes an event receiver 182 and an event comparator 184. In some embodiments, the event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from the event classifier 170. The event information includes information about a sub-event, such as a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When a sub-event relates to movement of a touch, the event information optionally also includes the speed and direction of the sub-event. In some embodiments, the event includes rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation of the device (also referred to as the device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and determines an event or sub-event or determines or updates the state of the event or sub-event based on the comparison. In some embodiments, event comparator 184 includes event definition 186. Event definition 186 includes definitions of events (e.g., a predefined sequence of sub-events), such as event 1 (187-1), event 2 (187-2), and others. In some implementations, sub-events in event 187 include, for example, touch start, touch end, touch movement, touch cancellation, and multi-touch. In one example, the definition of event 1 (187-1) is a double click on the displayed object. For example, the double click includes a first touch (touch start) for a predetermined period of time on the displayed object, a first lift-off (touch end) for a predetermined period of time, a second touch (touch start) for a predetermined period of time on the displayed object, and a second lift-off (touch end) for a predetermined period of time. In another example, the definition of event 2 (187-2) is a drag on the displayed object. For example, dragging includes touching (or contacting) on the displayed object for a predetermined period of time, movement of the touch on the touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
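The following Swift sketch illustrates how an event definition such as event 1 (187-1) might be matched against an incoming sub-event sequence: the recognizer accepts the touch-begin, lift-off, touch-begin, lift-off sequence of a double tap and fails if sub-events arrive out of order or too far apart in time. All states, types, and timing values are hypothetical simplifications of the comparator logic described above.

```swift
// Minimal sketch of matching a sub-event sequence against an event
// definition: a double tap is touch-begin, lift-off, touch-begin, lift-off,
// each arriving within a predetermined interval.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

final class DoubleTapRecognizer {
    enum State { case possible, recognized, failed }
    private(set) var state: State = .possible

    private let expected: [SubEvent] = [.touchBegin, .touchEnd, .touchBegin, .touchEnd]
    private var index = 0
    private var lastTimestamp: Double?
    private let maxInterval = 0.3   // assumed maximum gap between sub-events

    func consume(_ subEvent: SubEvent, at timestamp: Double) {
        guard state == .possible else { return }   // ignore sub-events after failure
        if let last = lastTimestamp, timestamp - last > maxInterval {
            state = .failed          // sequence too slow: event impossible
            return
        }
        if subEvent == expected[index] {
            index += 1
            lastTimestamp = timestamp
            if index == expected.count { state = .recognized }
        } else {
            state = .failed          // wrong sub-event: does not match the definition
        }
    }
}

let recognizer = DoubleTapRecognizer()
for (event, time) in [(SubEvent.touchBegin, 0.00), (.touchEnd, 0.10),
                      (.touchBegin, 0.25), (.touchEnd, 0.35)] {
    recognizer.consume(event, at: time)
}
print(recognizer.state)   // recognized
```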
In some implementations, the event definitions 187 include definitions of events for respective user interface objects. In some implementations, the event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object that triggered the hit test.
In some implementations, the definition of the respective event 187 also includes delay actions that delay delivery of event information until after it has been determined that the sequence of sub-events does or does not correspond to an event type of the event recognizer.
When the respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in the event definition 186, the respective event recognizer 180 enters an event impossible, event failed, or event end state after which subsequent sub-events of the touch-based gesture are ignored. In this case, the other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to the actively engaged event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers interact or are able to interact with each other. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to different levels in a view or programmatic hierarchy.
In some embodiments, when one or more particular sub-events of an event are recognized, the respective event recognizer 180 activates an event handler 190 associated with the event. In some implementations, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is distinct from sending (and deferring the sending of) sub-events to a respective hit view. In some embodiments, the event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.
In some implementations, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about the sub-event without activating the event handler. Instead, the sub-event delivery instructions deliver the event information to an event handler associated with the sub-event sequence or to an actively engaged view. Event handlers associated with the sequence of sub-events or with the actively engaged views receive the event information and perform a predetermined process.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates telephone numbers used in the contacts module 137 or stores video files used in the video or music player module 152. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, the object updater 177 creates a new user interface object or updates the location of the user interface object. GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares the display information and sends the display information to the graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It should be appreciated that the above discussion regarding event handling of user touches on a touch-sensitive display also applies to other forms of user input for operating the multifunction device 100 with an input device, not all of which are initiated on a touch screen. For example, mouse movements and mouse button presses, optionally in conjunction with single or multiple keyboard presses or holds; contact movements on a touchpad, such as taps, drags, scrolls, etc.; stylus input; movement of the device; verbal instructions; detected eye movement; biometric input; and/or any combination thereof, are optionally used as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 with a touch screen (e.g., touch-sensitive display system 112 of fig. 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within a User Interface (UI) 200. In these embodiments, as well as other embodiments described below, a user can select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of the one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100. In some implementations or in some cases, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application.
The device 100 optionally also includes one or more physical buttons, such as a "home desktop" or menu button 204. As previously described, menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on a touch screen display.
In some embodiments, the device 100 includes a touch screen display, menu buttons 204 (sometimes referred to as a main desktop button 204), a push button 206 for powering the device on/off and locking the device, a volume adjustment button 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. Pressing button 206 is optionally used to turn on/off the device by pressing the button and holding the button in the pressed state for a predefined time interval; locking the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or unlock the device or initiate an unlocking process. In some implementations, the device 100 also accepts voice input through the microphone 113 for activating or deactivating certain functions. The device 100 also optionally includes one or more contact intensity sensors 165 for detecting intensity of contacts on the touch-sensitive display system 112, and/or one or more tactile output generators 167 for generating tactile outputs for a user of the device 100.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, the device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home controller or an industrial controller). The device 300 generally includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes referred to as a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output (I/O) interface 330 with a display 340, typically a touch screen display. The I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and a touchpad 355, a tactile output generator 357 (e.g., similar to the one or more tactile output generators 167 described above with reference to fig. 1A) for generating tactile outputs on the device 300, and sensors 359 (e.g., optical sensors, acceleration sensors, proximity sensors, touch-sensitive sensors, and/or contact intensity sensors similar to the one or more contact intensity sensors 165 described above with reference to fig. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices located remotely from CPU 310. In some embodiments, memory 370 stores programs, modules, and data structures similar to those stored in memory 102 of portable multifunction device 100 (fig. 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (fig. 1A) optionally does not store these modules.
Each of the above identified elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing the above described functions. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed to embodiments of a user interface ("UI") optionally implemented on the portable multifunction device 100.
Fig. 4A illustrates an example user interface 400 of an application menu on the portable multifunction device 100 in accordance with some embodiments. A similar user interface is optionally implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
● One or more signal strength indicators for one or more wireless communications, such as cellular signals and Wi-Fi signals;
● Time;
● A bluetooth indicator;
● A battery status indicator;
● A tray 408 with icons for commonly used applications such as:
Icon 416 for phone module 138, labeled "phone", optionally including an indicator 414 of the number of missed calls or voice messages;
Icon 418 for email client module 140, labeled "mail", optionally including an indicator 410 of the number of unread emails;
Icon 420 for browser module 147, labeled "browser"; and
Icon 422 for video and music player module 152, labeled "music"; and
● Icons of other applications, such as:
Icon 424 for IM module 141, labeled "message";
Icon 426 for calendar module 148, labeled "calendar";
Icon 428 for image management module 144, labeled "photo";
Icon 430 for camera module 143, labeled "camera";
Icon 432 for online video module 155, labeled "online video";
Icon 434 for stocks widget 149-2, labeled "stock market";
Icon 436 for map module 154, labeled "map";
Icon 438 for weather widget 149-1, labeled "weather";
Icon 440 for alarm clock widget 149-4, labeled "clock";
Icon 442 for workout support module 142, labeled "fitness support";
Icon 444 for notepad module 153, labeled "memo"; and
Icon 446 for a settings application or module, which provides access to settings of the device 100 and its various applications 136.
It should be noted that the icon labels shown in fig. 4A are merely exemplary. For example, other labels are optionally used for various application icons. In some embodiments, the label of a respective application icon includes a name of the application corresponding to the respective application icon. In some embodiments, the label of a particular application icon is different from the name of the application corresponding to the particular application icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 in fig. 3) having a touch-sensitive surface 451 (e.g., a tablet or touchpad 355 in fig. 3) separate from the display 450. While many examples will be given later with reference to inputs on touch screen display 112 (where the touch sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4B. In some implementations, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary axis (e.g., 452 in fig. 4B) that corresponds to the primary axis (e.g., 453 in fig. 4B) on the display (e.g., 450). According to these implementations, the device detects contact with the touch-sensitive surface 451 at locations corresponding to respective locations on the display (e.g., 460 and 462 in fig. 4B) (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4B). Thus, user inputs (e.g., contacts 460 and 462 and movement thereof) detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display (e.g., 450 in FIG. 4B) when the touch-sensitive surface (e.g., 451 in FIG. 4B) is separate from the display of the multifunction device. It should be appreciated that similar approaches are optionally used for other user interfaces described herein.
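A minimal sketch of the coordinate correspondence implied above follows: a contact location on a separate touch-sensitive surface is scaled along each primary axis so that it maps to the corresponding location on the display. The types and dimensions below are illustrative assumptions.

```swift
// Minimal sketch of mapping a contact location on a separate touch-sensitive
// surface to the corresponding location on the display, scaling along each
// primary axis.
struct Size { let width: Double; let height: Double }
struct Location { let x: Double; let y: Double }

func displayLocation(for contact: Location, surface: Size, display: Size) -> Location {
    // Scale each coordinate by the ratio of display extent to surface extent.
    Location(x: contact.x * display.width / surface.width,
             y: contact.y * display.height / surface.height)
}

let touchSurface = Size(width: 100, height: 70)
let screen = Size(width: 1000, height: 700)
print(displayLocation(for: Location(x: 25, y: 35), surface: touchSurface, display: screen))
// Location(x: 250.0, y: 350.0)
```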
Additionally, while the following examples are primarily presented with reference to finger inputs (e.g., finger contacts, single-finger flick gestures, finger swipe gestures, etc.), it should be understood that in some embodiments one or more of these finger inputs are replaced by input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., rather than a contact), followed by movement of the cursor along the path of the swipe (e.g., rather than movement of the contact). As another example, a flick gesture is optionally replaced by a mouse click (e.g., instead of detection of contact, followed by ceasing to detect contact) when the cursor is over the position of the flick gesture. Similarly, when multiple user inputs are detected simultaneously, it should be appreciated that multiple computer mice are optionally used simultaneously, or that the mice and finger contacts are optionally used simultaneously.
As used herein, the term "focus selector" refers to an input element that indicates the current portion of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a "focus selector" such that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in fig. 3 or touch-sensitive surface 451 in fig. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in fig. 1A or the touch screen in fig. 4A) that enables direct interaction with user interface elements on the touch screen display, a contact detected on the touch screen acts as a "focus selector" such that when an input (e.g., a press input by the contact) is detected on the touch screen display at the location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of the user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on the touch screen display (e.g., by using a tab key or arrow keys to move focus from one button to another); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface. Regardless of the specific form taken by the focus selector, the focus selector is generally controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user intends to interact). For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or a touch screen) will indicate that the user intends to activate the respective button (as opposed to other user interface elements shown on a display of the device).
As used in this specification and the claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact or a stylus contact) on the touch-sensitive surface, or to a substitute (surrogate) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). The intensity of a contact is optionally determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are optionally used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average or a sum) to determine an estimated force of contact. Similarly, a pressure-sensitive tip of a stylus is optionally used to determine the pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are optionally used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows the user to access additional device functionality that would otherwise not be readily accessible on a reduced-size device with limited real estate for displaying affordances and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
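For example, the combination of force measurements from multiple sensors mentioned above might be computed as in the following sketch, in which each sensor reading contributes to a weighted average. The weighting scheme and all names are hypothetical.

```swift
// Minimal sketch of combining force measurements from several sensors into
// one estimated contact force via a weighted average.
struct ForceSample {
    let reading: Double   // force measured by one sensor
    let weight: Double    // e.g., derived from sensor proximity to the contact
}

func estimatedContactForce(_ samples: [ForceSample]) -> Double {
    let totalWeight = samples.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = samples.reduce(0) { $0 + $1.reading * $1.weight }
    return weightedSum / totalWeight
}

let samples = [ForceSample(reading: 1.2, weight: 0.7),
               ForceSample(reading: 0.8, weight: 0.2),
               ForceSample(reading: 0.5, weight: 0.1)]
print(estimatedContactForce(samples))   // 1.05
```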
In some implementations, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether the user has "clicked" on an icon). In some implementations, at least a subset of the intensity thresholds are determined according to software parameters (e.g., the intensity thresholds are not determined by activation thresholds of particular physical actuators, and may be adjusted without changing the physical hardware of the device 100). For example, without changing the touchpad or touch screen display hardware, the mouse "click" threshold of the touchpad or touch screen display may be set to any of a wide range of predefined thresholds. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more intensity thresholds in a set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting multiple intensity thresholds at once with a system-level click on an "intensity" parameter).
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on a plurality of intensity samples. The characteristic intensity is optionally based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined period of time (e.g., 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, 10 seconds) relative to a predefined event (e.g., after detecting the contact, before or after detecting lift-off of the contact, before or after detecting the start of movement of the contact, before or after detecting the end of the contact, and/or before or after detecting a decrease in intensity of the contact). The characteristic intensity of the contact is optionally based on one or more of: a maximum value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at half maximum of the intensities of the contact, a value at 90 percent maximum of the intensities of the contact, a value produced by low-pass filtering the intensity of the contact over a predefined period or starting at a predefined time, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by the user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some implementations, a comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether to perform one or more operations (e.g., whether to perform a respective operation or to forgo performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
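The two-threshold example above can be sketched as follows: a characteristic intensity is derived from a set of intensity samples (here simply their mean) and compared against a first and a second intensity threshold to select one of three operations. The threshold values and names are illustrative assumptions.

```swift
// Minimal sketch: derive a characteristic intensity from intensity samples
// and pick one of three operations by comparing it against two thresholds.
enum Operation { case first, second, third }

func characteristicIntensity(of samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    return samples.reduce(0, +) / Double(samples.count)   // mean of the samples
}

func operation(forSamples samples: [Double],
               firstThreshold: Double = 0.3,
               secondThreshold: Double = 0.6) -> Operation {
    let intensity = characteristicIntensity(of: samples)
    if intensity > secondThreshold { return .third }
    if intensity > firstThreshold { return .second }
    return .first
}

print(operation(forSamples: [0.10, 0.20, 0.25]))   // first
print(operation(forSamples: [0.40, 0.50, 0.45]))   // second
print(operation(forSamples: [0.70, 0.90, 0.80]))   // third
```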
In some implementations, a portion of the gesture is identified for the purpose of determining a characteristic intensity. For example, the touch-sensitive surface may receive a continuous swipe contact transitioning from a start location and reaching an end location (e.g., a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, rather than the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some implementations, a smoothing algorithm may be applied to the intensities of the swipe gesture prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for the purpose of determining a characteristic intensity.
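As an illustration of the unweighted sliding-average option mentioned above, the sketch below averages each window of intensity samples before a characteristic intensity would be determined, flattening a narrow spike. The window size is an illustrative assumption.

```swift
// Minimal sketch of an unweighted sliding-average smoothing pass over
// intensity samples.
func smoothed(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 1, samples.count >= window else { return samples }
    return (0...(samples.count - window)).map { start in
        let slice = samples[start..<start + window]
        return slice.reduce(0, +) / Double(window)   // unweighted average of the window
    }
}

let raw = [0.2, 0.2, 0.9, 0.2, 0.3, 0.3]   // one narrow spike at index 2
print(smoothed(raw))
// [0.433..., 0.433..., 0.466..., 0.266...] -- the spike is flattened
```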
The user interface diagrams described herein optionally include various intensity diagrams that illustrate the current intensity of a contact on the touch-sensitive surface relative to one or more intensity thresholds (e.g., a contact detection intensity threshold IT0, a light press intensity threshold ITL, a deep press intensity threshold ITD (e.g., at least initially higher than ITL), and/or one or more other intensity thresholds (e.g., an intensity threshold ITH that is lower than ITL)). These intensity diagrams are typically not part of the displayed user interface, but are provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a touchpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a touchpad. In some implementations, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact detection intensity threshold IT0, below which the contact is no longer detected), the device moves the focus selector in accordance with movement of the contact across the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
In some embodiments, the response of the device to an input detected by the device depends on criteria based on the contact intensity during the input. For example, for some "light press" inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to an input detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some "deep press" inputs, the intensity of a contact exceeding a second intensity threshold, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold during the input. The duration of the delay time is typically less than 200 ms (milliseconds) (e.g., 40 ms, 100 ms, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental recognition of deep press inputs. As another example, for some "deep press" inputs, a period of reduced sensitivity occurs after the first intensity threshold is met. During the period of reduced sensitivity, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detection of a deep press input does not depend on time-based criteria.
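The time-based deep-press criteria just described can be sketched as follows: the deep-press response triggers only once the delay time has elapsed after the first intensity threshold was met, and the second threshold is temporarily raised during the reduced-sensitivity period. All thresholds, durations, and names are hypothetical.

```swift
// Minimal sketch of time-based deep-press criteria: a delay time plus a
// temporarily increased second threshold during a reduced-sensitivity period.
struct DeepPressDetector {
    let secondThreshold = 0.6
    let delayTime = 0.1                  // e.g., 100 ms
    let reducedSensitivityPeriod = 0.15  // period during which the threshold is raised
    let temporaryIncrease = 0.2

    func isDeepPress(intensity: Double, now: Double, firstThresholdMetAt t0: Double?) -> Bool {
        guard let t0 = t0, now - t0 >= delayTime else {
            return false   // the delay time has not elapsed yet
        }
        let threshold = (now - t0) < reducedSensitivityPeriod
            ? secondThreshold + temporaryIncrease   // temporarily increased threshold
            : secondThreshold
        return intensity > threshold
    }
}

let detector = DeepPressDetector()
print(detector.isDeepPress(intensity: 0.7, now: 0.05, firstThresholdMetAt: 0.0))  // false: too soon
print(detector.isDeepPress(intensity: 0.7, now: 0.12, firstThresholdMetAt: 0.0))  // false: raised threshold
print(detector.isDeepPress(intensity: 0.7, now: 0.20, firstThresholdMetAt: 0.0))  // true
```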
In some implementations, one or more of the input intensity threshold and/or corresponding outputs varies based on one or more factors, such as user settings, touch movement, input timing, application execution, rate at which intensity is applied, number of simultaneous inputs, user history, environmental factors (e.g., environmental noise), focus selector position, etc. Exemplary factors are described in U.S. patent application Ser. Nos. 14/399,606 and 14/624,296, which are incorporated herein by reference in their entireties.
User interface and associated process
Attention is now directed to embodiments of a user interface ("UI") and associated processes that may be implemented on an electronic device, such as portable multifunction device 100 or device 300 (shown in fig. 1A, 2, 3, 4A, 4B, etc.), having a display, a touch-sensitive surface, optionally one or more tactile output generators for generating tactile outputs, and optionally one or more sensors for detecting intensities of contacts with the touch-sensitive surface.
Fig. 5A-5AX illustrate exemplary user interactions for displaying and interacting with a search user interface in various contexts and invoking other system and/or application functionality, according to some embodiments.
The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 6A-6J. For ease of explanation, some of the embodiments will be discussed with reference to operations performed on a device having a touch-sensitive display system 112. In such embodiments, the focus selector is optionally: a respective finger or stylus contact, a representative point corresponding to the finger or stylus contact (e.g., a centroid of the respective contact or a point associated with the respective contact), or a centroid of two or more contacts detected on the touch-sensitive display system 112. However, analogous operations are optionally performed on a device having a display 450 and a separate touch-sensitive surface 451, in response to detecting the contacts on the touch-sensitive surface 451 while the user interfaces shown in the figures are displayed on the display 450 along with a focus selector.
Fig. 5A-5C illustrate a swipe down gesture that causes a device to navigate from a home screen user interface of the device to a search user interface, according to some embodiments.
In fig. 5A, device 100 displays a home screen user interface (e.g., user interface 5002, such as an application launch user interface or a respective page of a multi-page home screen user interface) in full screen mode (e.g., user interface 5002 occupies substantially all of touch screen 112). The home screen user interface includes a plurality of application icons corresponding to different applications installed on the device. A respective application icon, when activated according to preset criteria (e.g., activated by a tap gesture, a double tap gesture, etc.), causes the device to display the respective application corresponding to that application icon. In fig. 5A, the device detects a contact (e.g., contact 5004) on touch screen 112 in a first portion (e.g., region 5003-1) of an edge region of touch screen 112 (e.g., in the middle top edge region of touch screen 112). In some implementations, the edge of touch screen 112 refers to a location on the device where the area covered by the plurality of contact intensity sensors (e.g., contact intensity sensor 165 described with respect to fig. 1A) ends and the bezel of the device (e.g., the edge of the cover or housing) (e.g., a portion of the edge of the device that does not have touch sensitivity) begins. In some implementations, the relative width of the region 5003-1 is in the range of about 50% to about 20% of the width of the top edge of the touch screen 112. In some embodiments, the relative width of the region 5003-1 is about 50%, about 45%, about 40%, about 35%, about 33%, about 30%, about 25%, or about 20%. The relative width may vary depending on the orientation of the device (e.g., portrait versus landscape orientation). The width also varies depending on the type of device (e.g., whether the device is a mobile phone or a tablet computer). In some implementations, the device detects that a portion of contact 5004 on touch screen 112 is just at the top edge of touch screen 112 (e.g., another portion of contact 5004 may be outside of touch screen 112). In fig. 5A-5C, the device detects movement of contact 5004 in a first direction (e.g., downward movement 5005 of contact 5004 from the middle top edge of touch screen 112 toward an interior portion of touch screen 112). As shown, the movement 5005 of the contact 5004 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5005 of contact 5004 corresponds to a downward swipe starting from the middle top edge of touch screen 112. In some embodiments, the movement 5005 of the contact 5004 includes downward movement from outside touch screen 112 (e.g., across bezel 112-1 surrounding touch screen 112), across the top edge of touch screen 112, onto the interior portion of touch screen 112 (e.g., movement 5005 of contact 5004 moves past the top edge of touch screen 112 and away from the top edge of touch screen 112 by at least a threshold distance). In some embodiments, detecting movement across the edge comprises detecting a contact at the extreme outer periphery of the touch screen edge that has continuous movement toward the interior of the touch screen 112.
As shown in fig. 5B, in accordance with a determination that movement 5005 of contact 5004 meets first criteria for displaying a search user interface (e.g., user interface 5006 in fig. 5C), the device initiates display of the search user interface. In some implementations, the first criteria for displaying the search user interface include a requirement that movement of the contact 5004 start from a middle portion of the top edge of the touch screen 112 (e.g., region 5003-1) in order for the first criteria to be met. In some implementations, the middle region 5003-1 is the middle half of the top touch screen edge. In other embodiments, the middle region 5003-1 is the middle third of the top touch screen edge. In still other embodiments, the middle region 5003-1 is the middle quarter of the top touch screen edge. The smaller the region 5003-1, the lower the chance that a user accidentally invokes one feature rather than another; the larger the region 5003-1, the easier it is for the user to invoke the feature. The optimal size of the region 5003-1 depends on the size of the touch screen 112.
In some embodiments, the first criteria include a requirement that movement of the contact 5004 be in a substantially downward direction in order for the first criteria to be met. In some embodiments, initiating display of the search user interface includes: displaying the search user interface sliding onto the touch screen in accordance with the downward movement of contact 5004 (e.g., the search user interface slides down with the contact). For example, as contact 5004 moves downward, the lower edge of the search user interface follows the movement of contact 5004. In some implementations, in accordance with a determination that termination of the swipe gesture is detected and that the contact has made a movement in the downward direction that exceeds a threshold amount, the device displays a search user interface that completely replaces the home screen user interface on the touch screen (e.g., the search user interface continues to move downward on the touch screen until it completely replaces the home screen user interface on the touch screen). In some implementations, if the movement of the contact 5004 is reversed and/or the threshold distance or direction requirement of the first criteria is not met, the device does not complete the process for displaying the search user interface and redisplays the home screen user interface after the swipe gesture terminates. In some embodiments, initiating display of the search user interface includes: causing the search user interface to gradually appear on top of the home screen user interface (e.g., the search user interface gradually fades into the foreground while the home screen user interface gradually fades into the background). In the example shown in fig. 5B, the device displays a portion of a search user interface (e.g., user interface 5006) in a top portion of touch screen 112. As shown, the device replaces a portion of the home screen user interface with a portion of the search user interface as the home screen user interface recedes into the background and/or is visually de-emphasized (e.g., obscured, darkened, reduced in size, etc.) relative to the portion of the search user interface. In fig. 5B, the portion of the search user interface includes: a first search user interface area (e.g., area 5006-1) including suggested actions; and a portion of a second search user interface area (e.g., area 5006-2) that includes suggested applications (e.g., application icons of a plurality of automatically selected applications). In some implementations, the device displays suggested actions (e.g., in region 5006-1) and/or suggested applications (e.g., in region 5006-2) based on context information (e.g., previous searches performed on the device, recently accessed actions and/or applications, etc.). In some embodiments, the device displays the suggested actions and/or suggested applications based on common actions and/or applications accessed across a large number of devices similar to the device. As the portion of the search user interface moves progressively downward on the touch screen, the device continuously changes the appearance of the home screen user interface displayed in the background of the portion of the search user interface, as shown in fig. 5B, to further de-emphasize the home screen user interface relative to the search user interface (e.g., to further obscure, darken, reduce in size, etc. the home screen user interface relative to the search user interface).
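A minimal sketch of the first criteria described above follows: the input must start within the middle region of the top edge of the screen and move substantially downward by at least a threshold distance. The region fraction, edge tolerance, and distance values are illustrative assumptions, not values used by any particular device.

```swift
// Minimal sketch of the first criteria: start in the middle region of the
// top edge, then move substantially downward past a threshold distance.
struct SwipeInput {
    let startX: Double, startY: Double
    let endX: Double, endY: Double
}

func meetsFirstCriteria(_ swipe: SwipeInput,
                        screenWidth: Double,
                        middleFraction: Double = 1.0 / 3.0,   // e.g., middle third of the top edge
                        edgeTolerance: Double = 10,
                        minDistance: Double = 120) -> Bool {
    // Requirement 1: the contact starts in the middle portion of the top edge.
    let halfRegion = screenWidth * middleFraction / 2
    let startsInMiddleRegion = abs(swipe.startX - screenWidth / 2) <= halfRegion
        && swipe.startY <= edgeTolerance
    // Requirement 2: the movement is substantially downward and long enough.
    let dy = swipe.endY - swipe.startY
    let dx = abs(swipe.endX - swipe.startX)
    let substantiallyDownward = dy >= minDistance && dy > dx
    return startsInMiddleRegion && substantiallyDownward
}

let swipe = SwipeInput(startX: 200, startY: 2, endX: 210, endY: 300)
print(meetsFirstCriteria(swipe, screenWidth: 400))   // true
```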
In some implementations, the appearance changes applied to the home screen user interface and the search user interface as described herein and the movement of the home screen user interface and the search user interface are reversibly and dynamically adjusted based on characteristics (e.g., direction of movement, speed of movement, etc.) of the movement of the contact 5004 before the gesture terminates (e.g., lift-off of the contact 5004).
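One way to read the reversible, movement-driven behavior described above is as a single progress value derived from the drag distance, which drives both the search user interface's position and the home screen's de-emphasis until liftoff either commits or cancels the transition. The following sketch illustrates that reading; the specific thresholds and effect magnitudes are assumptions.

```swift
import CoreGraphics

// A sketch of a progress-driven, reversible transition. Threshold values
// are illustrative assumptions.
struct SearchRevealTransition {
    let commitDistance: CGFloat = 200   // assumed drag distance to commit
    let commitVelocity: CGFloat = 500   // assumed downward velocity to commit

    // 0.0 = home screen fully shown, 1.0 = search UI fully shown; moving
    // the contact back up simply lowers the progress again.
    func progress(forDragDistance distance: CGFloat) -> CGFloat {
        min(max(distance / commitDistance, 0), 1)
    }

    // De-emphasis applied to the home screen behind the search UI.
    func homeScreenBlurRadius(at progress: CGFloat) -> CGFloat { 20 * progress }
    func homeScreenDimming(at progress: CGFloat) -> CGFloat { 0.4 * progress }

    // On liftoff: either commit to the full-screen search UI or cancel and
    // redisplay the home screen.
    func shouldCommit(distance: CGFloat, velocity: CGFloat) -> Bool {
        distance >= commitDistance || velocity >= commitVelocity
    }
}
```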
In fig. 5C, the device has detected lift-off of contact 5004 at a location substantially in the middle portion of touch screen 112 (e.g., at a movement distance that meets the requirements of the first criterion, and/or at a movement speed that meets the first criterion when lifted off, etc.); and in accordance with a determination that the swipe down gesture satisfies the first criteria for displaying the search user interface when contact 5004 is lifted off, the device displays the search user interface (e.g., user interface 5006) in full screen mode. In some implementations, the device replaces the home screen user interface (e.g., user interface 5002) with a search user interface (e.g., user interface 5006). In some implementations, the home screen user interface is no longer displayed in the background of the search user interface. In some embodiments, the reduced visibility version of the home screen user interface is used as a background layer for the search user interface. In fig. 5C, in addition to displaying search user interface area 5006-1 (e.g., suggested actions) and search user interface area 5006-2 (e.g., suggested applications) of the search user interface, the device also displays a search input area in the search user interface (e.g., search input area 5008 positioned in a top portion of the search user interface). In some implementations, the search input area includes a text cursor 5010 indicating a current location for entering search criteria for a search query based on input received from a user (e.g., text input entered through a keyboard or through voice-to-text functionality provided by the device, text input or image input pasted from a clipboard, etc.). In some implementations, after displaying the search user interface, the device further displays a virtual keyboard (e.g., virtual keyboard 5011) in response to a tap input in search input area 5008. In some implementations, the device automatically displays the keyboard 5011 (e.g., as part of, or overlaying, the search user interface) in response to detecting termination of the swipe gesture by the contact 5004 (e.g., without the user tapping the search input area). In some implementations, the search user interface can scroll to reveal additional portions including other suggested content automatically selected by the device, such as suggested images, suggested contacts, suggested search queries, and so forth. In some implementations, the search user interface scrolls under the virtual keyboard, e.g., in response to a swipe gesture in a vertical direction, while the virtual keyboard remains at the same location on the touch screen. In some implementations, the virtual keyboard ceases to be displayed in response to user input scrolling through the search user interface.
In some implementations, in response to detecting the swipe gesture by contact 5004, in accordance with a determination that movement of contact 5004 does not meet the first criteria for displaying the search user interface, the device foregoes displaying the search user interface. For example, in accordance with a determination that the downward movement of contact 5004 has not reached a threshold distance away from the top edge of the touch screen (e.g., has not reached the middle portion of touch screen 112 (e.g., the location of the touch lift-off shown in fig. 5C)), the device foregoes displaying the search user interface and instead redisplays the home screen user interface after termination of the swipe gesture by contact 5004. In some implementations, after displaying the search user interface in response to the swipe gesture (e.g., movement 5005 of contact 5004 as shown in fig. 5C), the device detects another input corresponding to a request to dismiss the search user interface (e.g., another swipe gesture in an upward direction from a bottom edge of the touch screen by another contact, a tap gesture in an area of the search user interface not occupied by a user interface object, etc.); in response, the device stops displaying the search user interface (e.g., dismisses the search user interface) and redisplays the home screen user interface (e.g., as shown in fig. 5A).
In fig. 5C, the search user interface is responsive to user inputs directed to different user interface objects included in the search user interface, and in some embodiments, corresponding operations associated with the different user interface objects are performed. For example, if the device detects a contact (e.g., a tap) on the region 5006-1 (e.g., the suggested actions) or on the region 5006-2 (e.g., the suggested applications), the device selects which action to perform or which application to display based on the location of the contact. For example, if the device detects a tap gesture on a first application icon (e.g., application icon 416 of a telephony application) displayed in region 5006-2, the device displays the first application corresponding to the first application icon (e.g., replaces the search user interface with the user interface of the first application (e.g., a telephony user interface) on the touch screen); and if the device detects a tap gesture on a second application icon (e.g., icon 418) displayed in region 5006-2, the device displays the second application corresponding to the second application icon (e.g., replaces the search user interface with the user interface of the second application (e.g., an email user interface) on the touch screen). Further, if the device detects a tap gesture on a first action (e.g., "send message to John") displayed in region 5006-1, the device replaces the search user interface with a user interface associated with the first action (e.g., a user interface of a messaging application for composing a message to John); and optionally, if the device detects a tap gesture on a second action (e.g., turn on call and notification mute mode) displayed in region 5006-1, the device performs the second action (e.g., turns on call and notification mute mode and displays a notification that call and notification mute mode has been turned on) and optionally maintains display of the search user interface; and optionally, if the device detects a tap input on a third action (e.g., setting an alarm) displayed in region 5006-1, the device replaces the search user interface with, or overlays on the search user interface, a user interface object for setting an alarm. In some implementations, the region 5006-1 also includes application icons alongside the suggested actions, and a tap gesture on a respective application icon also causes display of the corresponding application. For example, upon detecting a tap gesture on icon 424 of "message" in region 5006-1, the device displays a user interface of the messaging application. In some implementations, tapping an application icon in region 5006-1 results in a user interface of the corresponding application that is different from the user interface of the corresponding application displayed in response to a tap gesture on the same application icon in region 5006-2. In some implementations, the respective user interface object (e.g., an application icon, an action, a container object for the suggested applications and/or actions, etc.) is also responsive to other types of gesture inputs (such as swipe gestures in different directions, touch-hold gestures, press gestures, etc.), and performs different operations corresponding to the respective user interface object and the type of gesture input currently detected.
Fig. 5D-5E illustrate exemplary user interactions for searching using a search user interface according to some embodiments.
In fig. 5D, the device detects user input comprising a sequence of touch inputs (e.g., a sequence of tap gestures made through contacts on different keys of virtual keyboard 5011, including contact 5012 on virtual key "l"). For example, a sequence of touch inputs on a virtual keyboard corresponds to text input (e.g., a sequence of characters or words, etc.). In response to detecting text input, the device displays corresponding text in search input area 5008. For example, as shown in fig. 5D, the text input includes the text string "Appl". In response to receiving text input in search input area 5008, a device (e.g., search module 115 of device 100) performs a search using the text input as search criteria (e.g., optionally, with other contextual information (e.g., time, location, past searches, past user interactions, etc.) as supplemental search criteria and/or search filters) to identify relevant content corresponding to the search criteria. In some embodiments, the search is performed in a search corpus corresponding to different content sources, including: content associated with an application installed on the device (e.g., content and/or data within the application (e.g., files, messages generated or stored within the application), metadata associated with the application (e.g., application name, application icon, etc.)); content from external sources (e.g., from the internet, on other related devices connected to the device, etc.); files stored on the device and/or on a user account associated with the device, etc. In some embodiments, the search is performed in a search corpus corresponding to different categories or content types of search results, the search corpus comprising: images, photographs, videos, media files, contacts with contact information (e.g., names, addresses, usernames, aliases, websites, social media handles, etc.), applications, actions or operations that may be performed on the device, and so forth. In some implementations, the search is updated as the user types the input (e.g., the user does not have to select "search" or "return"). As shown in fig. 5E, in response to detecting a search input (e.g., a partial or complete search input), the search user interface updates (e.g., refreshes or replaces) the displayed first search user interface area (e.g., area 5006-1) and second search user interface area (e.g., area 5006-2) with search results (e.g., search results 5016) corresponding to the search input detected in search input area 5008. In fig. 5E, the search results include content from various applications installed on the device that is identified as being relevant to the received search input. For example, as shown in FIG. 5E, the search results 5016 include a first area (e.g., area 5016-1) displaying content identified from a web browser application (e.g., web page "apple.com"), a second area (e.g., area 5016-2) displaying content identified from a photo application (e.g., an image showing apples), and a third area (e.g., area 5016-3) displaying content identified from a contacts application (e.g., contact information associated with "Appleman, John"). In some embodiments, additional regions are included in the search results to show additional categories and/or types of search results, such as applications, photos, text messages, emails, actions, and the like. In some embodiments, the respective region for a given type of search result includes a plurality of search results of the given type.
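Because the search updates on every keystroke, the behavior can be sketched as re-running one query function over several corpora each time the typed string grows. The corpus contents, the case-insensitive substring match, and all type names below are simplifying assumptions for illustration.

```swift
import Foundation

// A sketch of incremental, multi-corpus search; assumed types and data.
struct SearchResultGroup {
    let category: String       // e.g. "Web", "Photos", "Contacts"
    let matches: [String]
}

struct SearchIndex {
    var corpora: [String: [String]]   // category -> searchable titles

    func results(for query: String) -> [SearchResultGroup] {
        guard !query.isEmpty else { return [] }
        return corpora.compactMap { (category, items) -> SearchResultGroup? in
            let matches = items.filter {
                $0.range(of: query, options: .caseInsensitive) != nil
            }
            return matches.isEmpty
                ? nil
                : SearchResultGroup(category: category, matches: matches)
        }
    }
}

// Results refresh as the typed text grows ("A", "Ap", ..., "Appl"),
// with no explicit "search" confirmation by the user.
let index = SearchIndex(corpora: [
    "Web": ["apple.com"],
    "Photos": ["image of apple"],
    "Contacts": ["Appleman, John"],
])
for typed in ["A", "Ap", "App", "Appl"] {
    print(typed, index.results(for: typed).flatMap(\.matches))
}
```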
In some implementations, the respective search results displayed in the search results user interface are associated with a set of respective operations, and the device performs a respective operation of the set of respective operations in response to a user input selecting the respective search result in accordance with preset criteria (e.g., using a tap input, using a swipe gesture, using a touch-hold gesture, etc.). For example, in response to a tap gesture on the search result "apple.com" in area 5016-1, the device replaces the display of the search result user interface with the display of the user interface of the web browser application displaying the web page at "apple.com"; and optionally, in response to a touch-and-hold gesture on the search result "apple.com" in area 5016-1, the device displays a preview of the web page at "apple.com" that overlays a portion of the search result user interface (e.g., a blurred and/or darkened version of the search result user interface that resumes once the touch-and-hold gesture is terminated or the preview is dismissed by another input); and optionally, in response to a swipe gesture to the left or right across the search result "apple.com", the device displays a plurality of selectable options (e.g., save, mark, delete, etc.) associated with the search result. Similarly, in response to a tap gesture on the search result "image of apple" in area 5016-2, the device replaces the display of the search result user interface with the display of the user interface of the photo library displaying the photo "image of apple"; and optionally, in response to a touch-and-hold gesture on the search result "image of apple" in area 5016-2, the device displays a preview of the photograph that covers a portion of the search result user interface (e.g., a blurred and/or darkened version of the search result user interface that resumes once the touch-and-hold gesture is terminated or the preview is dismissed by another input); and optionally, in response to a swipe gesture to the left or right across the search result "image of apple," the device displays a plurality of selectable options (e.g., save, mark, delete, etc.) associated with the search result. In some implementations, the search results include user interface objects that correspond to different operations that may be performed with respect to the search results. For example, a tap input on a phone application icon or email application icon within the search result "contact—Appleman, John" in area 5016-3 causes the device to initiate a phone call to John Appleman or begin composing an email based on the contact information in the search result; and a tap input on the search result outside of the application icons causes the device to display John Appleman's contact information stored in the contacts application.
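The per-gesture behavior on a search result amounts to a small dispatch from gesture type to operation. A sketch, with assumed type names, mirroring the tap / touch-and-hold / horizontal-swipe examples above:

```swift
// Assumed types; the operations mirror the examples in the text.
enum ResultGesture { case tap, touchAndHold, horizontalSwipe }

enum ResultOperation {
    case open(String)            // replace the search UI with the result's UI
    case preview(String)         // overlay a preview on the search UI
    case showOptions([String])   // e.g. save, mark, delete
}

func operation(for gesture: ResultGesture, on result: String) -> ResultOperation {
    switch gesture {
    case .tap:             return .open(result)
    case .touchAndHold:    return .preview(result)
    case .horizontalSwipe: return .showOptions(["save", "mark", "delete"])
    }
}
```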
In some embodiments, the device detects search input in the form of an image file or other object. For example, the device detects that an image file is copied from another source onto the clipboard and pasted into the search input area (e.g., a selectable option "paste" is invoked via a touch-and-hold gesture in the search input area, followed by selection of the selectable option to paste content that has been saved/copied onto the clipboard). In response to detecting the search input, the device (e.g., search module 115 of device 100) performs a search to identify content corresponding to the image file and displays search results including content identified as related to the image file.
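Accepting either typed text or a pasted image as the search input suggests modeling the input as a small sum type, with the image treated as opaque data handed to a content-identification step. This is a sketch under those assumptions, not the disclosed implementation:

```swift
import Foundation

// Assumed model: the search input is either text or pasted image data.
enum SearchInput {
    case text(String)
    case image(Data)   // e.g. an image file pasted from the clipboard

    // Describe the search the device would run for this input.
    var searchDescription: String {
        switch self {
        case .text(let query):
            return "search corpora for text: \(query)"
        case .image(let data):
            return "identify content related to a pasted image (\(data.count) bytes)"
        }
    }
}
```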
The home screen user interface, search user interface, and search results user interface described above are exemplary, and in some embodiments, the user interface objects, appearances, and/or response behaviors of the home screen user interface, search user interface, and search results user interface may be different or modified in one or more aspects.
Fig. 5F-5H illustrate a swipe down gesture that causes a device to navigate from a home screen user interface of the device to a notification center user interface, according to some embodiments.
In fig. 5F, the device 100 displays the home screen user interface in full screen mode. The device detects a contact (e.g., contact 5018) in a second portion of the edge area of touch screen 112 (e.g., region 5003-2) (e.g., in the left top edge area of touch screen 112). In some embodiments, the first portion and the second portion of the edge region of touch screen 112 are different from each other and independent of each other. In some implementations, the relative width of the region 5003-2 is in the range of about 50% to about 20% of the width of the top edge of the touch screen 112. In some embodiments, the relative width of the region 5003-2 is about 50%, about 45%, about 40%, about 35%, about 33%, about 30%, about 25%, or about 20%. The relative width may vary depending on the orientation of the device (e.g., portrait versus landscape). The relative width also varies depending on the type of device (e.g., whether the device is a mobile phone or a tablet computer). The relative width of region 5003-2 can correspond to the relative width of region 5003-1, be greater than the relative width of region 5003-1, or be less than the relative width of region 5003-1. In some embodiments, regions 5003-1 and 5003-2 are operatively adjacent to each other such that there is no intervening region between regions 5003-1 and 5003-2. In some implementations, the regions 5003-1 and 5003-2 are not visually marked with respective boundaries on edge regions of the touch screen 112 (e.g., the regions 5003-1 and 5003-2 do not have corresponding user interface elements displayed at or near the regions). In some embodiments, region 5003-2 is next to region 5003-1 (e.g., region 5003-2 is positioned on the left side of region 5003-1). In some embodiments, region 5003-2 is separated from region 5003-1 by a region in which neither of the functions associated with swiping from region 5003-2 or 5003-1 is available. As with region 5003-1, region 5003-2 can occupy the leftmost third, quarter, or less of the width of the top edge. In some embodiments, region 5003-2 includes a portion of a home screen user interface. For example, in FIG. 5F, the time element of the home screen user interface is displayed in region 5003-2. In fig. 5F-5H, the device detects movement of contact 5018 in a first direction (e.g., downward movement 5019 of contact 5018 from the left top edge of touch screen 112 to the middle left portion of touch screen 112). As shown, the movement 5019 of the contact 5018 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5019 of contact 5018 corresponds to a downward swipe starting from the left top edge of touch screen 112. In some embodiments, the movement 5019 of the contact 5018 comprises: downward movement across the top edge of touch screen 112 from outside touch screen 112 along the left portion of touch screen 112 (e.g., movement 5019 of contact 5018 moves past the top edge of touch screen 112 and away from the top edge of touch screen 112 by at least a threshold distance along the left portion of touch screen 112).
As shown in fig. 5G, in accordance with a determination that the movement 5019 of the contact 5018 meets a first criteria for displaying a notification center user interface (e.g., user interface 5020 in fig. 5G), the device initiates display of the notification center user interface. The notification center user interface includes a first set of notifications (e.g., one or more notifications) associated with applications of the device. In some implementations, the first criteria for displaying the notification center user interface includes a requirement that movement of the contact 5018 begin from a left portion of the top edge of the touch screen 112 (e.g., region 5003-2 in FIG. 5F) in order to meet the first criteria. In some embodiments, the first criteria for displaying the notification center includes a requirement that movement of the contact 5018 be in a substantially downward direction in order to meet the first criteria. In some embodiments, initiating display of the notification center user interface includes: displaying the notification center user interface sliding onto the touch screen based on the downward movement of contact 5018. For example, as the contact 5018 moves downward, the lower edge of the notification center user interface follows the movement of the contact 5018. In some implementations, in accordance with a determination that termination of the swipe gesture is detected and that the contact has made more than a threshold amount of movement in the downward direction, the device displays a notification center user interface that completely replaces the home screen user interface on the touch screen (e.g., the notification center user interface continues to move downward on the touch screen until it completely replaces the home screen user interface on the touch screen 112). In some implementations, if the movement of the contact 5018 is reversed and/or does not meet a threshold distance or direction requirement of the first criteria for displaying the notification center user interface, the device does not complete the process for displaying the notification center user interface and redisplays the home screen user interface after the swipe gesture has terminated. In some embodiments, initiating display of the notification center user interface includes: causing the notification center user interface to gradually appear on top of the home screen user interface (e.g., the notification center user interface gradually fades into the foreground while the home screen user interface gradually fades into the background). In the example shown in fig. 5G, the device displays a portion of a notification center user interface (e.g., user interface 5020) in a top portion of touch-screen 112. As shown, the device replaces a portion of the home screen user interface with a portion of the notification center user interface as the home screen user interface is backed into the background and/or visually de-emphasized (e.g., obscured, darkened, reduced in size, etc.) relative to the portion of the notification center user interface.
In FIG. 5G, the portion of the notification center user interface includes a first notification center user interface area (e.g., area 5020-1) that includes a first notification; and a second notification center user interface area (e.g., area 5020-2) that includes a second notification. The first notification and the second notification are associated with applications of the device. For example, the first notification is associated with a "work" application and the second notification is associated with a "message" application. In some embodiments, the respective notification includes a description of the notification and a time indicating when the notification was generated by the respective application (e.g., a time when a message was received or a time when a reminder was prompted). In some embodiments, the first set of notifications is arranged in a list. In some embodiments, the list is arranged chronologically (e.g., the notification displayed at the lowest position of the list corresponds to the earliest notification in the first set of notifications, and the notification displayed at the highest position of the list corresponds to the latest notification in the first set of notifications). Similar to the features described above with respect to fig. 5B, as the portion of the notification center user interface moves progressively downward on the touch-screen, the device continuously changes the appearance of the home screen user interface displayed in the background of the portion of the notification center user interface, as shown in fig. 5G.
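The chronological list described above can be modeled as entries carrying the generating application, a description, and a generation time, kept sorted newest-first so the latest notification occupies the highest position. A sketch with assumed type names:

```swift
import Foundation

// Assumed model of a notification list ordered newest-first.
struct AppNotification {
    let application: String   // e.g. "work", "message"
    let body: String
    let generatedAt: Date     // time the notification was generated
}

struct NotificationList {
    private(set) var entries: [AppNotification] = []

    // Insert while keeping newest-first order, so the latest notification
    // is displayed at the highest position of the list.
    mutating func add(_ notification: AppNotification) {
        entries.append(notification)
        entries.sort { $0.generatedAt > $1.generatedAt }
    }
}
```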
In fig. 5H, the device has detected lift-off of contact 5018 at a location in the middle left portion of touch screen 112 (e.g., at a movement distance that meets the requirements of the first criteria for displaying the notification center user interface, and/or at a movement speed that meets the first criteria upon lift-off, etc.); and in accordance with a determination that the swipe down gesture satisfies the first criteria for displaying the notification center user interface when contact 5018 is lifted off, the device displays the notification center user interface (e.g., user interface 5020) in full screen mode. In some embodiments, the device replaces the home screen user interface (e.g., user interface 5002) with the notification center user interface (e.g., user interface 5020). In some implementations, the home screen user interface is no longer displayed in the background of the notification center user interface. In some embodiments, the reduced visibility version of the home screen user interface serves as a background layer for the notification center user interface. In accordance with a determination that the swipe down gesture does not meet the first criteria for displaying the notification center user interface when contact 5018 is lifted off, the device foregoes displaying the notification center user interface in full screen mode and instead continues displaying the home screen user interface.
In fig. 5H, the device displays the notification center user interface in full screen mode. In FIG. 5H, in addition to displaying notification center user interface area 5020-1 (e.g., a first notification) and notification center user interface area 5020-2 (e.g., a second notification) of the notification center user interface, the device also displays notification center area 5020-3 (e.g., including a third notification), notification center area 5020-4 (e.g., including a fourth notification), and notification center area 5020-5 (e.g., including a fifth notification). In some embodiments, the device also displays an affordance (e.g., affordance 5022) for dismissing the displayed notifications. In some embodiments, in response to detecting a tap input on the affordance for dismissing the displayed notifications, the device ceases to display notification center user interface areas 5020-1 to 5020-5. Instead, the device replaces the notification center user interface areas with the background of the notification center user interface. In some embodiments, the device also displays a time element (e.g., time element 5024) comprising the current time and optionally the current date. In some embodiments, the notification center user interface may scroll to reveal additional notification center user interface areas that include other notifications (e.g., notifications that were generated prior to the first notification). In some implementations, the notification center user interface may scroll, for example, in response to a swipe gesture in a vertical direction (e.g., an upward or downward swipe gesture).
Fig. 5I-5L illustrate upward and downward gestures that cause a device to scroll through a notification center user interface to display different portions of a notification list, according to some embodiments. In some embodiments, the up and down gestures described with respect to fig. 5I-5L for scrolling the notification center user interface may be used to scroll other user interfaces that display a list of items (e.g., search user interface 5006, which includes a list of suggested actions (e.g., regions 5006-1 and 5006-2 shown in fig. 5C)).
In fig. 5I, the device displays the notification center user interface in full screen mode. The device displays a first set of notifications (e.g., notification center user interface areas 5020-1 through 5020-5). In fig. 5I, the device detects a contact (e.g., contact 5026) in a middle portion (e.g., a middle center portion) of touch screen 112 (e.g., a portion 5025 located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In some embodiments, the portion 5025 has a rectangular, oval, or other symmetrical or asymmetrical shape. In fig. 5I and 5J, the device detects movement of the contact 5026 in a second direction (e.g., upward movement 5027 from the middle portion of the touch screen 112). In some embodiments, the second direction is substantially parallel and substantially opposite to the first direction. As shown, the movement 5027 of the contact 5026 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5027 of the contact 5026 corresponds to an upward swipe starting from a middle portion of the touch screen 112 toward a top edge of the touch screen 112.
In accordance with a determination that the movement 5027 of the contact 5026 meets a first criterion for scrolling the notification center user interface, the device scrolls the notification center user interface in accordance with the movement 5027 of the contact 5026. In some embodiments, the first criteria for scrolling through the notification center includes a requirement that contact 5026 be detected while a notification list is displayed (e.g., including a first set of notifications in the regions 5020-1 to 5020-5 arranged in a list corresponding to a portion of the notification list) in order to satisfy the first criteria. In some embodiments, the notification list includes more than a threshold number of notifications such that fewer than all of the notifications in the notification list can be displayed simultaneously (e.g., scrolling the notification list shifts the first set of notifications to reveal notifications that are not currently shown). In some embodiments, the first criteria includes a requirement that contact 5026 be detected when the last notification (e.g., earliest notification) in the notification list is not displayed in order to satisfy the first criteria. For example, the notification list includes a predefined number of notifications (e.g., the predefined number of notifications includes all notifications generated during a preset time period or a fixed number of notifications). The notification list may be scrolled with a swipe up gesture only when the last (e.g., earliest) notification in the notification list is not displayed. In some embodiments, the first criteria for scrolling the notification center user interface includes a requirement that movement of the contact 5026 begin at a middle portion of the touch screen 112 in order to meet the first criteria. In some implementations, movement of the contact 5026 begins within a portion of the touch screen 112 that is located a first threshold distance away from the top and bottom edges of the touch screen 112 and a second threshold distance away from the left and right edges of the touch screen 112 in order to meet the first criteria.
In fig. 5J, the device has detected liftoff of the contact 5026 (e.g., at a movement distance that meets the requirements of the first criteria, and/or at a movement speed that meets the first criteria for scrolling the notification center user interface when lifted off, etc.); and in accordance with a determination that the first criteria for scrolling the notification center user interface are met, the device displays notification user interface areas 5020-1 to 5020-5 that have been shifted in accordance with the direction of movement. In FIG. 5J, the device has stopped displaying time element 5024 and has replaced time element 5024 with areas 5020-1 to 5020-5 that have been shifted up on touch screen 112. For example, in FIG. 5J, the areas 5020-1 through 5020-5 have been shifted upward such that the device stops displaying a portion of area 5020-5 (e.g., the top portion of area 5020-5 is no longer displayed). In some embodiments, the device displaces the regions 5020-1 to 5020-5 upward according to the direction of movement 5027 of the contact 5026. In some embodiments, the device shifts the regions 5020-1 to 5020-5 upward by a distance corresponding to the distance moved by the contact 5026 on the touch screen 112. In some embodiments, the displacing includes progressively moving the regions 5020-1 to 5020-5 upward according to the movement 5027 of the contact 5026.
Additionally, in FIG. 5J, the device displays a second set of notifications (e.g., areas 5020-6 and 5020-7) in the lower portion of touch screen 112 (e.g., areas 5020-6 and 5020-7 correspond to a second portion of the notification list of the notification center user interface). Area 5020-6 includes a sixth notification and area 5020-7 includes a seventh notification. In some embodiments, the sixth notification and the seventh notification were generated by the respective applications prior to the first notification (e.g., the sixth notification and the seventh notification are earlier than the first notification). In some embodiments, the first set of notifications and the second set of notifications are arranged in a continuous list. In some embodiments, displaying the second set of notifications includes progressively shifting areas 5020-6 and 5020-7 upward from the bottom edge of the touch screen 112 in accordance with movement 5027 of the contact 5026.
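The scrolling in figs. 5I-5J can be viewed as sliding a fixed-size window down the chronological list: accumulated upward drag distance maps to an offset, clamped so the list cannot scroll past its earliest notification. The row height, window size, and type names in this sketch are assumptions.

```swift
import CoreGraphics

// A sketch of window-based scrolling over the notification list.
struct NotificationScroller {
    let allNotifications: [String]   // newest first, e.g. areas 5020-1 onward
    let visibleCount = 5             // assumed number of rows shown at once
    let rowHeight: CGFloat = 120     // assumed height of one notification row

    // Map accumulated upward drag distance to the visible slice, clamped so
    // scrolling stops at the earliest (last) notification.
    func visibleSlice(forUpwardDrag distance: CGFloat) -> ArraySlice<String> {
        let maxOffset = max(allNotifications.count - visibleCount, 0)
        let offset = min(max(Int(distance / rowHeight), 0), maxOffset)
        let end = min(offset + visibleCount, allNotifications.count)
        return allNotifications[offset..<end]
    }
}
```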
In FIG. 5K, the device displays a notification center user interface that includes a first set of notifications (e.g., areas 5020-1 through 5020-5) and a second set of notifications (e.g., areas 5020-6 and 5020-7). In fig. 5K, the device detects a contact (e.g., contact 5028) in a middle portion (e.g., a middle center portion) of touch screen 112. In fig. 5K and 5L, the device detects movement of the contact 5028 in a first direction (e.g., downward movement 5029 from a middle portion of the touch screen 112). In some implementations, the movement 5029 of the contact 5028 corresponds to a downward swipe starting from a middle portion of the touch screen 112 toward a bottom edge of the touch screen 112.
In accordance with a determination that the movement 5029 of the contact 5028 meets a second criterion for scrolling the notification center user interface, the device scrolls the notification center user interface in accordance with the movement 5029 of the contact 5028. In some embodiments, the second criteria for scrolling through the notification center includes a requirement that contact 5028 be detected while the notification list is displayed (e.g., the regions 5020-1 through 5020-7 arranged in a list) in order to satisfy the second criteria. In some embodiments, the second criteria for scrolling through the notification center includes at least a requirement that the first notification in the list (e.g., the most recent notification in the list) not be fully displayed in order to satisfy the second criteria. In some embodiments, the second criteria for scrolling through the notification center includes a requirement that the first notification in the list not be displayed at a preset position of touch-screen 112 in order to satisfy the second criteria. For example, the second criterion for scrolling the notification center user interface is met when the first notification of the list (e.g., area 5020-5) is not displayed at the position a threshold distance from the top edge of touch-screen 112 shown in FIG. 5H. In some embodiments, the second criteria for scrolling through the notification center further includes a requirement that the time element 5024 shown in fig. 5H not be displayed on the touch-screen 112 in order to satisfy the second criteria. In some embodiments, the second criteria for scrolling the notification center user interface includes a requirement that movement of the contact 5028 begin at a middle portion of the touch screen 112 in order to meet the second criteria. In some implementations, movement of the contact 5028 begins within a portion of the touch screen 112 that is located a first threshold distance away from the top and bottom edges of the touch screen 112 and a second threshold distance away from the left and right edges of the touch screen 112.
In fig. 5L, the device has detected liftoff of the contact 5028 (e.g., at a distance of movement that meets the requirements of the second criteria, and/or at a speed of movement that meets the second criteria for scrolling the notification center user interface when lifted off, etc.); and in accordance with a determination that the second criterion is met, the device displays the notification list shifted downward in accordance with the direction of movement of the contact 5028. For example, in FIG. 5L, the device displays the notification center user interface including the first set of notifications (e.g., areas 5020-1 through 5020-5), as described with respect to FIG. 5H. In some embodiments, the operation for scrolling the notification center user interface to display different portions of the notification list in accordance with movement of a detected gesture (e.g., the upward or downward swipe gestures described with respect to fig. 5I-5L) may be repeated in order to display further portions of the notification list. In some embodiments, the device detects lift-off of the contact after the first swipe gesture before detecting the start of the second swipe gesture. In some embodiments, the device does not detect lift-off of the contact of the first swipe gesture before detecting the start of the second swipe gesture. Instead, the first swipe gesture and the second swipe gesture are detected as continuous movement of the contact. For example, in response to detecting an additional swipe-up gesture (e.g., the gesture of FIG. 5I with movement 5027 of contact 5026) while displaying the first and second sets of notifications in FIG. 5J, the device shifts the notification list upward. In some embodiments, the device displays a third set of notifications below the second set of notifications while ceasing to display at least a portion of the first set of notifications. As another example, in response to detecting another downward swipe gesture (e.g., a gesture with movement 5029 of contact 5028) while displaying the second and third sets of notifications and optionally a portion of the first set of notifications, the device shifts the notification list downward (e.g., the device stops displaying the third set of notifications).
Fig. 5M-5N illustrate a swipe down gesture that causes a device to navigate from a notification center user interface of the device to a search user interface, according to some embodiments.
As shown in fig. 5M, the device displays the notification center user interface in full screen mode, as described above with respect to fig. 5H. A device (e.g., including touch screen 112) detects a contact (e.g., contact 5030) in a middle portion of touch screen 112 (e.g., portion 5025 of touch screen 112 shown in fig. 5I, which is located away from the edges of touch screen 112 and corresponds to an inner portion of touch screen 112). In fig. 5M-5N, the device detects movement of contact 5030 in a first direction (e.g., downward movement 5031 of contact 5030 from a middle portion of touch screen 112). As shown, the movement 5031 of the contact 5030 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5031 of contact 5030 corresponds to a downward swipe from a middle portion of touch screen 112.
In accordance with a determination that movement 5031 of contact 5030 meets a second criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface. In some implementations, the second criteria for displaying the search user interface includes a requirement that movement 5031 of contact 5030 begin from a middle portion of touch screen 112 (e.g., portion 5025 of touch screen 112 in fig. 5I) in order to meet the second criteria. In some implementations, the second criteria for displaying the search user interface includes a requirement that movement 5031 of the contact 5030 be in a substantially downward direction in order to meet the second criteria. In some embodiments, the second criteria for displaying the search user interface includes a requirement that contact 5030 be detected when the device displays the top of the notification list of the notification center user interface (e.g., the topmost notification corresponding to area 5020-5 in fig. 5M) in order to satisfy the second criteria. For example, this requirement is satisfied when the device displays area 5020-5 corresponding to the most recent notification in the notification list. Alternatively, instead of displaying the search user interface, the device scrolls down the notification list (e.g., as described with respect to fig. 5K-5L) in accordance with a determination that contact 5030 is detected when the device does not display the top of the notification list of the notification center user interface.
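The branch described above (a downward swipe from the middle either reveals the search user interface or scrolls the list, depending on whether the top of the list is already showing) reduces to a single condition. A sketch with assumed names:

```swift
// Assumed types for routing a downward swipe from the middle of the
// notification center user interface.
enum MiddleDownSwipeAction { case revealSearch, scrollListDown }

func action(whenTopOfListDisplayed isTopDisplayed: Bool) -> MiddleDownSwipeAction {
    // Reveal search only when the most recent notification (the top of the
    // list) is already displayed; otherwise the swipe scrolls the list.
    isTopDisplayed ? .revealSearch : .scrollListDown
}
```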
In fig. 5N, the device has detected lift-off of contact 5030 (e.g., at a distance of movement that meets the requirements of the second criteria for displaying the search user interface, and/or at a speed of movement that meets the second criteria upon lift-off, etc.); and in accordance with a determination that the second criteria for displaying the search user interface are met, the device displays the search user interface (e.g., user interface 5006) in full screen mode. In some embodiments, the device progressively replaces the notification center user interface with the search user interface in accordance with movement 5031 of the contact 5030, as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In accordance with a determination that the swipe down gesture upon liftoff of contact 5030 does not meet the second criteria for displaying the search user interface, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the notification center user interface.
Fig. 5O-5Q illustrate a swipe down gesture that causes a device to navigate from a home screen user interface to a control panel user interface of the device, according to some embodiments.
In fig. 5O, the device 100 displays the home screen user interface in full screen mode. The device detects a contact (e.g., contact 5032) in a third portion of the edge region of touch screen 112 (e.g., region 5003-3) (e.g., in the right top edge region of touch screen 112). In some embodiments, the first, second, and third portions of the edge region of touch screen 112 are different from each other and independent of each other. In some implementations, the relative width of the region 5003-3 is in the range of about 50% to about 20% of the width of the top edge of the touch screen 112. The relative width may vary depending on the orientation of the device (e.g., portrait versus landscape). The relative width also varies depending on the type of device (e.g., whether the device is a mobile phone or a tablet computer). In some embodiments, regions 5003-1, 5003-2, and 5003-3 are operatively adjacent to one another such that there is no intervening region between regions 5003-1 and 5003-2 or between regions 5003-1 and 5003-3. In some implementations, the regions 5003-1, 5003-2, and 5003-3 are separated from each other such that there are inactive regions therebetween (e.g., a downward swipe gesture detected on a respective inactive region between the regions 5003-1 and 5003-2 or between the regions 5003-1 and 5003-3 does not invoke any of the operations invoked by the downward swipe gestures detected on the regions 5003-1, 5003-2, or 5003-3). In some implementations, the regions 5003-1, 5003-2, and 5003-3 have substantially corresponding widths (e.g., the top edge of the touch screen 112 is divided into three equal portions). In some embodiments, the regions 5003-1, 5003-2, and 5003-3 have different relative widths (e.g., the centrally located region 5003-1 has a greater relative width than the regions 5003-2 and 5003-3, or the regions 5003-2 and 5003-3 have a greater relative width than the central region 5003-1). It should be appreciated that the relative widths and positions of the regions may vary depending on the type of device or orientation of the device. In some implementations, the regions 5003-1, 5003-2, and 5003-3 are not visually marked with respective boundaries on edge regions of the touch screen 112 (e.g., the regions 5003-1, 5003-2, and 5003-3 do not have corresponding user interface elements displayed at or near the regions). Region 5003-1 is positioned between regions 5003-2 and 5003-3. In some embodiments, region 5003-3 includes a portion of a home screen user interface. For example, in fig. 5O, device status indicators of the home screen user interface (e.g., a battery status indicator and signal strength indicators for cellular and Wi-Fi signals) are displayed in region 5003-3. In fig. 5O-5Q, the device detects movement of contact 5032 in a first direction (e.g., downward movement 5033 of contact 5032 from the right top edge of touch screen 112 to the middle right portion of touch screen 112). In some implementations, movement 5033 of contact 5032 corresponds to a downward swipe from the right top edge of touch screen 112. In some embodiments, the movement 5033 of the contact 5032 includes: downward movement along the right portion of touch screen 112 from outside touch screen 112 across the top edge of touch screen 112 (e.g., movement 5033 of contact 5032 moves past the top edge of touch screen 112 and away from the top edge of touch screen 112 by at least a threshold distance along the right portion of the touch screen).
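Taken together with figs. 5A-5C and 5F-5H, the three edge regions amount to routing a top-edge downward swipe by its starting x-coordinate. The sketch below assumes an equal-thirds split with no inactive gaps; as the text notes, the widths and any gaps may vary.

```swift
import CoreGraphics

// Assumed equal-thirds routing of a downward swipe from the top edge.
enum TopEdgeDestination { case notificationCenter, search, controlPanel }

func destination(forSwipeStartX x: CGFloat,
                 screenWidth: CGFloat) -> TopEdgeDestination {
    let third = screenWidth / 3
    switch x {
    case ..<third:            return .notificationCenter   // region 5003-2
    case third..<(2 * third): return .search               // region 5003-1
    default:                  return .controlPanel         // region 5003-3
    }
}
```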
In accordance with a determination that movement 5033 of contact 5032 meets a first criteria for displaying a control panel user interface (e.g., user interface 5034 in fig. 5P), the device initiates display of the control panel user interface. As used herein, a control panel user interface (also referred to as a control center user interface or control user interface) is used to control a plurality of system level operations. The control panel user interface includes a plurality of controls (e.g., affordances) corresponding to a plurality of system functions of the device. In some implementations, the first criteria for displaying the control panel user interface includes a requirement that movement of the contact 5032 begin from a right portion of the top edge of the touch screen 112 (e.g., region 5003-3 in fig. 5O) in order to meet the first criteria. In some embodiments, the first criteria for displaying the control panel user interface includes a requirement that movement of the contact 5032 be in a substantially downward direction in order to meet the first criteria. In some embodiments, initiating display of the control panel user interface includes: displaying the control panel user interface sliding onto the touch screen in accordance with the downward movement of contact 5032. In some embodiments, replacing the display of the home screen user interface with the control panel user interface in accordance with the movement of contact 5032 shown in fig. 5O-5Q is performed as described with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the control panel user interface is dismissed as described with respect to dismissal of the search user interface in fig. 5C.
In fig. 5Q, the device has detected lift-off of contact 5032 at a location in the middle right portion of touch screen 112 (e.g., at a movement distance that meets the requirements of the first criteria for displaying the control panel user interface, and/or at a movement speed that meets the first criteria upon lift-off, etc.); and in accordance with a determination that the swipe down gesture satisfies the first criteria for displaying the control panel user interface when the contact 5032 is lifted off, the device displays the control panel user interface (e.g., user interface 5034) in full screen mode. In some implementations, the device replaces the home screen user interface (e.g., user interface 5002) with the control panel user interface (e.g., user interface 5034). In some embodiments, the home screen user interface is no longer displayed in the background of the control panel user interface. In some embodiments, the reduced visibility version of the home screen user interface serves as a background layer for the control panel user interface. In accordance with a determination that the swipe down gesture does not meet the first criteria for displaying the control panel user interface when the contact 5032 is lifted off, the device foregoes displaying the control panel user interface in full screen mode and instead continues displaying the home screen user interface.
The control panel user interface includes one or more control affordances. As shown in fig. 5Q, the control panel user interface includes: airplane mode icon 5034-1 (which when activated causes the device to turn on/off a mode that limits wireless connections), cellular data icon 5034-3, Wi-Fi icon 5034-2, Bluetooth icon 5034-4, audio control 5034-5, orientation lock icon 5034-9 (for locking the orientation of the touch screen so that the orientation of the touch screen does not change when the orientation of the device changes), call and notification mute icon 5034-8 (for muting calls and notifications during a selected period of time), content cast icon 5034-10 (for causing a nearby device to play content currently being played on the device), brightness control 5034-7, volume control 5034-6, and one or more user configurable control affordances including flashlight icon 5034-14, timer icon 5034-11, calculator icon 5034-12, and camera icon 5034-13. In some embodiments, one or more of the control affordances on the control panel user interface are not user-configurable (e.g., may not be removed or rearranged by a user of the device 100). For example, in some embodiments, the control affordances such as airplane mode icon 5034-1, cellular data icon 5034-3, Wi-Fi icon 5034-2, Bluetooth icon 5034-4, audio control 5034-5, orientation lock icon 5034-9, call and notification mute icon 5034-8, content cast icon 5034-10, brightness control 5034-7, and volume control 5034-6 are not user configurable. In some embodiments, one or more of the control affordances on the control panel user interface are user-configurable (e.g., can be added, removed, or rearranged by a user of the device 100). For example, in some embodiments, control affordances such as the flashlight icon 5034-14, timer icon 5034-11, calculator icon 5034-12, and camera icon 5034-13 are user configurable.
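The split between fixed and user-configurable controls suggests a model in which each affordance carries a flag saying whether the user may add, remove, or rearrange it. A sketch, with the type and the exact split assumed from the examples above:

```swift
// Assumed model of control affordances and their configurability.
struct ControlAffordance {
    let name: String
    let isUserConfigurable: Bool
}

let controlPanel: [ControlAffordance] = [
    ControlAffordance(name: "Airplane Mode", isUserConfigurable: false),
    ControlAffordance(name: "Wi-Fi", isUserConfigurable: false),
    ControlAffordance(name: "Bluetooth", isUserConfigurable: false),
    ControlAffordance(name: "Orientation Lock", isUserConfigurable: false),
    ControlAffordance(name: "Flashlight", isUserConfigurable: true),
    ControlAffordance(name: "Timer", isUserConfigurable: true),
    ControlAffordance(name: "Calculator", isUserConfigurable: true),
    ControlAffordance(name: "Camera", isUserConfigurable: true),
]

// Only the user-configurable controls may be added, removed, or rearranged.
let editable = controlPanel.filter(\.isUserConfigurable).map(\.name)
```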
Fig. 5R-5S illustrate a swipe down gesture that causes a device to navigate from an application user interface of the device to a search user interface, according to some embodiments.
In fig. 5R, device 100 displays an application user interface (e.g., email user interface 5036) in full screen mode. The device detects a contact (e.g., contact 5038) in a first portion of an edge region of touch screen 112 (e.g., region 5003-1) (e.g., in a middle top edge region of touch screen 112). In fig. 5R-5S, the device detects movement of contact 5038 in a first direction (e.g., downward movement 5039 of contact 5038 from the middle portion of the top edge of touch screen 112). As shown, the movement 5039 of the contact 5038 is substantially perpendicular to the top edge of the touch screen 112. In some embodiments, movement 5039 of contact 5038 corresponds to movement 5005 of contact 5004 described above with respect to fig. 5A-5C.
In accordance with a determination that movement 5039 of contact 5038 meets a third criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface. In some embodiments, the third criteria for displaying the search user interface corresponds to the first criteria for displaying the search user interface described above with respect to fig. 5A-5C. The device thereby displays the search user interface in accordance with determining that movement of the contact satisfies the first criteria for displaying the search user interface, regardless of whether the contact is detected when the home screen user interface (e.g., user interface 5002) or the application user interface (e.g., email user interface 5036) is displayed. In accordance with a determination that the movement of contact 5038 does not meet the third criteria for displaying the search user interface, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the application user interface.
As shown in fig. 5S, the device has detected lift-off of contact 5038 (e.g., at a distance of movement that meets the requirements of the third criterion, and/or at a speed of movement that meets the third criterion for displaying the search user interface upon lift-off, etc.); and in accordance with a determination that the movement of contact 5038 meets the third criterion for displaying the search user interface (e.g., user interface 5006), the device displays the search user interface. In some embodiments, the device progressively replaces the application user interface with the search user interface in accordance with movement 5039 of contact 5038, as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C.
Fig. 5T-5W illustrate a swipe down gesture that causes the device to navigate from a low-power mode to a wake-up screen user interface and further to a search user interface, according to some embodiments.
In fig. 5T, the device 100 is in a low power mode (e.g., low power mode 5040) (e.g., a low power always-on mode, a display-off mode, a power saving sleep mode, etc.). In some embodiments, the device has turned off the touch screen 112 when in the low power mode, thereby reducing the power consumption of the device. For example, the device has turned off any display made by the display generating component (e.g., display controller 156 associated with touch screen 112). In some embodiments, the device turns on the low power mode in accordance with a determination that there is no user interaction with touch screen 112 for a predetermined period of time. In some implementations, the device turns on the low power mode in response to detecting a user input for turning on the low power mode. In some embodiments, the device turns off the low power mode in response to detecting a change in orientation of the device. For example, the orientation of the device has changed from a horizontal orientation (e.g., lying down) to a vertical orientation. When the device exits the low power mode (e.g., wakes up when a communication is received, the user picks up the device, presses an on/off button, or touches the screen), the display generation component displays a wake screen user interface (e.g., user interface 5042), as shown in fig. 5U. In some implementations, the wake screen user interface is initially displayed in a locked state and later transitions to an unlocked state after authentication information has been obtained (e.g., through password entry or biometric information verification). In some embodiments, the wake screen user interface and the lock screen user interface have similar appearances. In some embodiments, the wake screen user interface includes a time element (e.g., time element 5024) displaying the current time and optionally the current date. In some embodiments, when the device is locked, the wake screen user interface displays a prompt (e.g., prompt element 5046) for unlocking the device. In some embodiments, the wake screen displays one or more notifications (e.g., notification 5044) when the notification(s) are newly received (and optionally while the notifications remain unread in the always-on low power mode). In some embodiments, as shown in FIG. 5U, the wake screen user interface includes one or more of the control affordances, such as flashlight icon 5034-14 and camera icon 5034-13.
In fig. 5U, the device detects a contact (e.g., contact 5048) in a middle portion of touch screen 112 (e.g., portion 5025 shown in fig. 5I, which is located away from the edges of touch screen 112 and corresponds to an inner portion of touch screen 112). In fig. 5U and 5V, the device detects movement of contact 5048 in a second direction (e.g., upward movement 5049 from a middle portion of touch screen 112). As shown, movement 5049 of contact 5048 is substantially perpendicular to the top edge of touch screen 112. In some implementations, movement 5049 of contact 5048 corresponds to an upward swipe starting from a middle portion of touch screen 112 toward a top edge of touch screen 112.
In fig. 5V, the device has detected lift-off of contact 5048; and in accordance with a determination that the movement of the contact 5048 upon liftoff meets a second criterion for displaying a notification center user interface (e.g., user interface 5020), the device displays the notification center user interface (e.g., interface 5020 including areas 5020-4 and 5020-5). In some embodiments, the second criteria for displaying the notification center user interface includes a requirement that contact 5048 be detected while the wake screen user interface (e.g., user interface 5042 in fig. 5U) is displayed in order to satisfy the second criteria. In some embodiments, the second criteria for displaying the notification center user interface includes a requirement that movement of contact 5048 begin at a middle portion of touch-screen 112 in order to meet the second criteria. In some embodiments, the second criteria for displaying the notification center user interface includes a requirement that movement of the contact 5048 be in a substantially upward direction in order to meet the second criteria. In some embodiments, initiating display of the notification center user interface includes: displaying the notification center user interface sliding onto the touch screen in accordance with the upward movement of contact 5048. For example, as the contact 5048 moves upward, the top edge of the notification center user interface (e.g., the top edge of the area 5020-5) follows the movement of the contact 5048. In some embodiments, in accordance with a determination that a swipe gesture is detected and the contact has made more than a threshold amount of movement in the upward direction, the device displays a notification center user interface that partially replaces the wake screen user interface on the touch screen 112 (e.g., the notification center user interface continues to move upward on the touch screen until it replaces a portion of the wake screen user interface on the touch screen). In some implementations, if the movement of the contact 5048 is reversed and/or the threshold distance or direction requirement of the second criterion is not met, the device does not complete the process for displaying the notification center user interface and redisplays the wake screen user interface after the swipe gesture is terminated. In some embodiments, initiating display of the notification center user interface includes: causing the notification center user interface to gradually appear on top of the wake screen user interface.
In fig. 5V, the device further detects a contact (e.g., contact 5052) in a first portion of an edge region of touch screen 112 (e.g., region 5003-1) (e.g., in a middle top edge region of touch screen 112). In fig. 5V-5W, the device detects movement of contact 5052 in a first direction (e.g., downward movement 5053 of contact 5052 from the middle top edge region of touch screen 112). As shown, the movement 5053 of the contact 5052 is substantially perpendicular to the top edge of the touch screen 112. In some embodiments, the movement 5053 of the contact 5052 corresponds to the movement 5005 of the contact 5004 described above with respect to fig. 5A-5C.
In fig. 5W, the device has detected lift-off of contact 5052; and in accordance with a determination that movement of contact 5052 upon liftoff meets a fourth criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface in full screen mode. In some embodiments, the device displays gradual replacement of the wake screen user interface with the search user interface in accordance with the movement 5053 of the contact 5052, as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the fourth criteria for displaying the search user interface correspond to the first criteria for displaying the search user interface described above with respect to fig. 5A-5C. The device thereby displays the search user interface in accordance with a determination that movement of the contact satisfies the first criteria for displaying the search user interface, regardless of whether the contact is detected when the home screen user interface (e.g., user interface 5002), the application user interface (e.g., email user interface 5036), or the wake screen user interface (e.g., user interface 5042) is displayed. In accordance with a determination that the movement of contact 5052 does not meet the fourth criteria, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the wake screen user interface.
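This context independence can be summarized in a small sketch. The following hypothetical Swift snippet (names and types are illustrative only, not part of the disclosure) makes the point that the originating user interface does not enter into the decision once the movement criteria are met.

enum ActiveUI { case homeScreen, applicationUI, wakeScreen, coverSheet }
enum NavigationResult { case showSearch, stay }

func resolveTopEdgeDownSwipe(meetsSearchCriteria: Bool,
                             from ui: ActiveUI) -> NavigationResult {
    // The originating user interface `ui` is intentionally ignored here:
    // home screen, application, wake screen, and cover sheet all route
    // to the full-screen search user interface when the criteria hold.
    meetsSearchCriteria ? .showSearch : .stay
}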
Fig. 5X-5Y illustrate a swipe down gesture that causes a device to navigate from a cover user interface of the device to a search user interface, according to some embodiments.
In fig. 5X, the device 100 displays a cover user interface (e.g., user interface 5050) in full screen mode. In some embodiments, the cover user interface has the same or similar appearance as the wake screen user interface (e.g., user interface 5042 in fig. 5U and 5V) or the lock screen user interface. In some embodiments, the cover user interface includes a time element (e.g., time element 5024) that displays the current time and optionally the current date. In some embodiments, the cover user interface includes one or more notifications (e.g., notification 5044) when the notification(s) are newly received (and optionally while the notifications remain unread in the always-on low power mode). In some embodiments, the cover user interface includes a portion of the notification center user interface 5020 (e.g., including areas 5020-5 and 5020-2). However, in some embodiments, unlike the wake screen user interface, display of the cover user interface need not be initiated from the low power mode, and displaying the cover user interface does not cause the device to be locked (e.g., dismissing the cover user interface does not require reentry of authentication information). In some embodiments, the cover user interface has the same behavior as that shown in fig. 5I-5N. In fig. 5X, the device detects a contact (e.g., contact 5054) in a middle portion of touch screen 112 (e.g., portion 5025 shown in fig. 5I located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5X-5Y, the device detects movement of contact 5054 in a first direction (e.g., downward movement 5055 of contact 5054 from a middle portion of touch screen 112). As shown, the movement 5055 of the contact 5054 is substantially perpendicular to the top edge of the touch screen 112.
In fig. 5Y, the device has detected lift-off of contact 5054; and in accordance with a determination that movement of contact 5054 meets a fifth criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface in full screen mode. In some embodiments, the device displays gradual replacement of the cover user interface with the search user interface in accordance with movement 5055 of the contact 5054, as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the fifth criteria for displaying the search user interface include a requirement that movement of contact 5054 begin from a middle portion of touch screen 112 (e.g., portion 5025 of touch screen 112 shown in fig. 5I) in order to meet the fifth criteria. In some embodiments, the fifth criteria include a requirement that movement of the contact 5054 be in a substantially downward direction in order to meet the fifth criteria. In some embodiments, the fifth criteria include a requirement that movement of contact 5054 be detected when the cover user interface is displayed in order to meet the fifth criteria. In accordance with a determination that the movement of contact 5054 does not meet the fifth criteria, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the cover user interface.
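The gradual replacement behavior described in these passages can be modeled as an interpolation driven by gesture progress. The Swift sketch below is hypothetical (the 0.4 commit threshold and the linear mapping are invented assumptions); it shows the incoming search user interface being revealed in proportion to the downward drag, with the transition committed or rolled back at liftoff.

import CoreGraphics

struct TransitionModel {
    let commitThreshold: CGFloat = 0.4   // fraction of screen height

    // 0.0 = old user interface fully visible, 1.0 = search UI fully visible.
    func progress(forDrag dy: CGFloat, screenHeight: CGFloat) -> CGFloat {
        max(0, min(1, dy / screenHeight))
    }

    // On liftoff, commit the transition only if the gesture traveled far
    // enough; otherwise redisplay the original user interface.
    func shouldCommit(onLiftoffDrag dy: CGFloat,
                      screenHeight: CGFloat) -> Bool {
        progress(forDrag: dy, screenHeight: screenHeight) >= commitThreshold
    }
}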
Fig. 5Z-5 AA illustrate a swipe down gesture that causes a device to navigate from a home screen user interface of the device to a search user interface, according to some embodiments.
In fig. 5Z, the device 100 displays the home screen user interface (e.g., user interface 5002) in full screen mode. The device detects a contact (e.g., contact 5056) in a middle portion of touch screen 112 (e.g., portion 5025 in fig. 5I located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5Z-5AA, the device detects movement of contact 5056 in a first direction (e.g., downward movement 5057 of contact 5056 from a middle portion of touch screen 112). In some embodiments, the movement 5057 of the contact 5056 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, the movement 5057 of the contact 5056 corresponds to a downward swipe starting from a middle portion of the touch screen 112 toward a bottom edge of the touch screen 112.
In fig. 5AA, the device has detected lift-off of contact 5056; and in accordance with a determination that movement of contact 5056 meets a sixth criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface in full screen mode. In some embodiments, the device displays gradual replacement of the home screen user interface with the search user interface in accordance with movement 5057 of contact 5056, similarly as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some implementations, the sixth criteria for displaying the search user interface include a requirement that movement of contact 5056 begin from a middle portion of touch screen 112 (e.g., portion 5025 of touch screen 112 in fig. 5I) in order to meet the sixth criteria. In some implementations, the sixth criteria for displaying the search user interface include a requirement that movement of the contact 5056 be in a substantially downward direction in order to meet the sixth criteria. In some embodiments, the sixth criteria for displaying the search user interface include a requirement that contact 5056 be detected while the device displays the home screen user interface in order to meet the sixth criteria. In some implementations, the sixth criteria for displaying the search user interface correspond to the first criteria for displaying the search user interface, except that movement of the contact begins from a middle portion of touch screen 112 instead of from a first portion of a top edge region of touch screen 112 (e.g., region 5003-1). In some embodiments, the device displays the search user interface in response to a swipe down gesture starting from a first portion of the top edge region of touch screen 112 (e.g., as shown in fig. 5A-5C) or from a middle portion of touch screen 112 (e.g., as shown in fig. 5M-5N and 5Z-5AA). In accordance with a determination that the movement of contact 5056 does not meet the sixth criteria, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the home screen user interface.
Fig. 5 AB-5 AG illustrate gestures that cause a device to navigate from an application user interface to a multitasking user interface, a home screen user interface, or another application user interface according to the direction of the respective gesture, according to some embodiments.
In fig. 5AB, the device 100 displays an application user interface (e.g., email user interface 5036) in full screen mode. The device detects a contact (e.g., contact 5058) in a fourth portion of the edge region of touch screen 112 (e.g., region 5003-4) (e.g., in the middle bottom edge region of touch screen 112). In some implementations, the region 5003-4 is positioned on a different side of the touch screen 112 than the regions 5003-1, 5003-2, and 5003-3. For example, region 5003-4 is positioned in a bottom edge region of touch screen 112 while regions 5003-1, 5003-2, and 5003-3 are positioned in a top edge region of touch screen 112. In some implementations, the region 5003-4 corresponds to one third of the bottom edge region, one half of the bottom edge region, or substantially the entire width of the bottom edge region of the touch screen 112. In some embodiments, the device detects a portion of contact 5058 on touch screen 112 while another portion of contact 5058 is outside touch screen 112. In fig. 5AB-5AC, the device further detects movement of contact 5058 (e.g., upward movement 5059 of contact 5058 from the middle bottom edge of touch screen 112 to an inner portion of touch screen 112). In some embodiments, the movement 5059 of the contact 5058 includes: upward movement from outside of touch screen 112 across the bottom edge of touch screen 112 to an interior portion of touch screen 112 (e.g., movement 5059 of contact 5058 crosses the bottom edge of touch screen 112 and moves away from the bottom edge of touch screen 112 by at least a threshold distance). In some implementations, the gesture is an upward swipe gesture starting from a middle bottom edge region of the touch screen 112 toward an interior portion of the touch screen 112.
In accordance with a determination that the movement 5059 of the contact 5058 meets a first criterion for displaying a multitasking user interface (e.g., user interface 5060 in fig. 5AD), the device initiates display of the multitasking user interface. The multitasking user interface (e.g., also referred to as an application switcher user interface) includes one or more representations of one or more user interfaces of opened applications (e.g., a multitasking user interface including a plurality of cards that are scaled-down images of the last viewed user interfaces of different opened applications).
In some implementations, the first criteria for displaying the multitasking user interface include a requirement that movement of the contact 5058 begin from a bottom edge region of the touch screen 112 (e.g., region 5003-4 in fig. 5AB) in order to meet the first criteria. In some implementations, the first criteria for displaying the multitasking user interface include a requirement that movement of the contact 5058 begin from a middle bottom edge region of the touch screen 112 (e.g., region 5003-4 shown in fig. 5AB) in order to meet the first criteria. In some implementations, the first criteria for displaying the multitasking user interface include a requirement that movement of the contact 5058 be in a substantially upward direction in order to meet the first criteria. In some implementations, the first criteria for displaying the multitasking user interface include a requirement that the movement 5059 have a first distance, a first movement acceleration, and/or a first movement speed when the contact 5058 lifts off in order to satisfy the first criteria. In some embodiments, initiating display of the multitasking user interface comprises: reducing the size of the currently displayed application user interface (e.g., email user interface 5036) while gradually shifting the application user interface upward in accordance with the upward movement of contact 5058. In some implementations, a reduced-visibility (e.g., blurred, darkened, etc.) version of the home screen user interface (e.g., user interface 5002) serves as a background layer for the multitasking user interface. For example, as the size and position of the application user interface change, a portion of the home screen user interface with reduced visibility is displayed in the background.
In fig. 5AD, the device has detected lift-off of contact 5058 at a location in the middle portion of touch screen 112 (e.g., having met the required movement distance of the first criteria for displaying the multitasking user interface, and/or the required movement speed upon liftoff, etc.). In some embodiments, the device has detected a pause in the movement of contact 5058 rather than lift-off of contact 5058 (e.g., movement of contact 5058 has stopped for a duration longer than a threshold duration without lift-off of contact 5058). In accordance with a determination that the swipe-up gesture satisfies the first criteria for displaying the multitasking user interface when contact 5058 is lifted off (e.g., or paused), the device displays the multitasking user interface (e.g., user interface 5060) in full screen mode. In accordance with a determination that the swipe-up gesture does not meet the first criteria for displaying the multitasking user interface when contact 5058 is lifted off (e.g., or paused), the device foregoes displaying the multitasking user interface in full screen mode and instead continues displaying the application user interface (e.g., email user interface 5036).
The multitasking user interface includes one or more representations of one or more open application user interfaces (e.g., email user interface 5036, web browser user interface 5060-1, and message user interface 5060-2). The multitasking user interface includes representations of all or some of the open applications. In some embodiments, the multitasking user interface includes representations of a preset number or fewer of the open applications. As shown in fig. 5AD, the multitasking user interface includes scaled-down images of the opened applications. In some embodiments, a scaled-down image of an opened application is partial (e.g., email user interface 5036, web browser user interface 5060-1, and message user interface 5060-2 in fig. 5AD are partial representations of images of the corresponding opened applications). In some embodiments, the cards show images of the last viewed user interfaces of the corresponding applications. In some implementations, the device displays the cards arranged in a stacked manner. In some implementations, cards in the stack are arranged chronologically based on the time at which each application was last viewed. For example, in fig. 5AD, a card representing email user interface 5036 is displayed on top of the stack, as email user interface 5036 is the most recently viewed user interface. In addition, web browser user interface 5060-1 is displayed behind email user interface 5036, while message user interface 5060-2 is displayed behind web browser user interface 5060-1. A representation of an application user interface, when activated according to preset criteria (e.g., activated by a tap gesture, a double tap gesture, etc.), causes the device to display the respective application user interface corresponding to the representation. Thus, the multitasking user interface allows navigation through multiple open applications.
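The card stack described here can be modeled with a simple data structure. The following Swift sketch is hypothetical (the card limit and all type names are illustrative assumptions); it orders cards by the time each application's user interface was last viewed, most recent on top, and resolves a card activation to its application.

import Foundation

struct AppCard {
    let appName: String
    let lastViewed: Date   // used to order cards in the stack
}

struct MultitaskingStack {
    private(set) var cards: [AppCard]

    init(openApps: [AppCard], limit: Int = 10) {
        // Most recently viewed first; optionally cap at a preset number.
        cards = Array(openApps.sorted { $0.lastViewed > $1.lastViewed }
                              .prefix(limit))
    }

    // Activating a card (e.g., by a tap gesture) selects its application
    // user interface for full-screen display.
    func activateCard(at index: Int) -> String? {
        cards.indices.contains(index) ? cards[index].appName : nil
    }
}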
In fig. 5AE, the device has detected movement 5059-1 of contact 5058 (e.g., movement 5059-1 of contact 5058 in the second direction). In some embodiments, the movement 5059-1 of the contact 5058 begins at a middle bottom edge region of the touch screen 112 and has a substantially upward direction that is the same as the direction described with respect to the movement 5059 of the contact 5058 in fig. 5AB-5AC. However, the movement 5059-1 has certain features that are different from the features of the movement 5059 of the contact 5058, which distinguish the upward swipe gesture of fig. 5AE from the upward swipe gesture of fig. 5AB-5AC. In some implementations, such features include the acceleration, speed, duration, or length of the gesture. In accordance with a determination that lift-off of the contact 5058 is detected and in accordance with a determination that the movement 5059-1 of the contact 5058 meets a first criterion for displaying the home screen user interface (e.g., user interface 5002 in fig. 5AE), the device initiates display of the home screen user interface. In some embodiments, the first criteria for displaying the home screen user interface include a requirement that the movement 5059-1 have a second distance, a second movement acceleration, and/or a second movement speed when the contact 5058 lifts off in order to meet the first criteria. The second distance, the second movement acceleration, and/or the second movement speed are different from the first distance, the first movement acceleration, and/or the first movement speed required by the first criteria for displaying the multitasking user interface described with respect to fig. 5AC-5AD. In some embodiments, the first criteria for displaying the home screen user interface include a requirement that movement 5059-1 of contact 5058 be detected while the multitasking user interface is displayed in order to meet the first criteria. In some embodiments, the first criteria for displaying the home screen user interface include a requirement that the movement 5059-1 of contact 5058 begin at a middle portion of touch screen 112 (e.g., a middle portion positioned a threshold distance from an edge of touch screen 112). In some embodiments, initiating display of the home screen user interface comprises: causing the home screen user interface to gradually appear on top of the multitasking user interface (e.g., the home screen user interface gradually fades into the foreground while the multitasking user interface gradually fades out into the background). In some embodiments, initiating display of the home screen user interface comprises: gradually reducing the size of the multitasking user interface (e.g., reducing the size of the scaled-down representations of the one or more user interfaces of different applications) while the home screen user interface gradually fades into the foreground. In accordance with a determination that the swipe-up gesture satisfies the first criteria for displaying the home screen user interface upon liftoff of contact 5058, the device displays the home screen user interface (e.g., user interface 5002) in full screen mode. In accordance with a determination that the swipe-up gesture does not meet the first criteria for displaying the home screen user interface when the contact 5058 is lifted off, the device foregoes displaying the home screen user interface in full screen mode and instead continues displaying the application user interface (e.g., email user interface 5036).
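The distinction drawn here, where the same bottom-edge upward swipe leads to either the multitasking user interface or the home screen user interface depending on liftoff features, can be sketched as a classifier. The Swift code below is hypothetical; both distance and speed thresholds are invented placeholders intended to illustrate the branching, not actual tuning.

import CoreGraphics

enum UpSwipeOutcome { case multitasking, homeScreen, stayInApp }

func classifyBottomEdgeUpSwipe(distance: CGFloat,
                               liftoffSpeed: CGFloat,
                               paused: Bool) -> UpSwipeOutcome {
    // A medium-length swipe, or one that pauses mid-gesture, reveals
    // the multitasking user interface.
    if paused || (distance > 80 && liftoffSpeed < 300) {
        return .multitasking
    }
    // A longer and/or faster flick goes all the way to the home screen.
    if distance > 160 || liftoffSpeed >= 300 {
        return .homeScreen
    }
    return .stayInApp   // criteria not met; keep the application UI
}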
Fig. 5AF illustrates an alternative movement of the contact 5058 after the movement 5059 of the contact 5058 in the second direction (e.g., an upward swipe gesture) is detected in fig. 5AB-5AC. In fig. 5AF, the device detects movement 5059-2 in a third direction, instead of continuing to detect movement of contact 5058 in the second direction (e.g., upward) as detected in fig. 5AC and 5AE. As shown, the movement 5059-2 of the contact 5058 is substantially parallel to the bottom edge of the touch screen 112. In some implementations, the movement 5059-2 of the contact 5058 corresponds to a lateral swipe (e.g., a left-to-right or right-to-left swipe gesture). In some embodiments, the movement 5059-2 includes a lateral movement from an interior portion of the touch screen 112 toward a left or right edge of the touch screen 112. For example, the movement of the contact 5058 includes movement 5059 in a substantially upward direction, followed by movement 5059-2 of the contact 5058 from left to right. In some embodiments, the device detects the movements 5059 and 5059-2 of the contact 5058 as a continuous swipe gesture. In some embodiments, the device detects a pause or lift-off between the movements 5059 and 5059-2 of the contact 5058.
As shown in fig. 5AF, in accordance with a determination that the movement 5059-2 of the contact 5058 meets a first criterion for switching between user interfaces of applications having a stored state (e.g., switching between the email user interface 5036 in fig. 5AB and the web browser user interface 5060-1 in fig. 5AF), the device initiates a switch between the opened user interfaces. For example, an application having a stored state may be a currently running application whose current user interface is stored, or an application having a stored state may be closed while its previously viewed user interface is stored, such that the previously viewed user interface can be displayed upon request. In some implementations, an application having a stored state is alternatively referred to as an opened application. As set forth above with respect to fig. 5AD, the multitasking user interface includes one or more representations (e.g., cards) of the user interfaces of applications having corresponding stored states. In some implementations, switching between the opened user interfaces includes: replacing the user interface of a first application (e.g., email user interface 5036 in fig. 5AB) with the user interface of a second application, where the first application and the second application are currently running on the device. In some implementations, initiating a switch between the opened user interfaces includes: laterally shifting the user interface of the first application (e.g., email user interface 5036) in accordance with the movement 5059-2 of contact 5058 while gradually displaying the user interface of the second application (e.g., web browser user interface 5060-1) in full screen mode, as shown in fig. 5AF.
In some embodiments, the first criteria for switching between opened user interfaces include a requirement that two or more user interfaces of different applications have a stored state on the device in order to meet the first criteria. In some implementations, the first criteria for switching between the opened user interfaces include a requirement that movement of the contact 5058 begin from a bottom edge region of the touch screen 112 (e.g., region 5003-4 in fig. 5AB) in order to meet the first criteria. In some implementations, the first criteria for switching between the opened user interfaces include a requirement that movement of the contact 5058 begin from a middle bottom edge region (e.g., region 5003-4 in fig. 5AB) of the touch screen 112 in order to meet the first criteria. In some embodiments, the first criteria for switching between the opened user interfaces include the following requirement in order to satisfy the first criteria: a first portion of the movement of contact 5058 is in a substantially upward direction (e.g., movement 5059 in fig. 5AB) and is followed by a second portion of the movement of contact 5058 that is in a substantially lateral direction (e.g., movement 5059-2 in fig. 5AF). In some embodiments, initiating the switch between the opened user interfaces comprises: reducing the size of the first application user interface (e.g., email user interface 5036) while gradually shifting the first application user interface sideways in accordance with the movement (e.g., movement 5059-2) of the contact 5058.
In fig. 5AG, the device has detected lift-off of contact 5058 at a location in the middle right portion of touch screen 112 (e.g., having met the required movement distance of the first criteria for switching between opened user interfaces, and/or the required movement speed upon liftoff, etc.); and in accordance with a determination that the swipe gesture satisfies the first criteria for switching between opened user interfaces when contact 5058 is lifted off, the device displays the user interface of the second application (e.g., web browser user interface 5060-1) in full screen mode. In some implementations, the device replaces the user interface of the first application (e.g., email user interface 5036) with the user interface of the second application (e.g., web browser user interface 5060-1). In some implementations, the device stops displaying the user interface of the first application. In accordance with a determination that the swipe gesture does not meet the first criteria for switching between opened user interfaces when contact 5058 is lifted off, the device foregoes displaying the user interface of the second application (e.g., web browser user interface 5060-1) in full screen mode and instead continues to display the user interface of the first application (e.g., email user interface 5036).
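The two-phase gesture described across figs. 5AF-5AG, an upward segment followed by a lateral segment, can be sketched as follows. This hypothetical Swift snippet (the 2:1 dominance ratios and the wrap-around neighbor selection are assumptions for illustration) classifies the two segments and picks the neighboring open application in the direction of the lateral movement.

import CoreGraphics

struct GestureSegment {
    let dx: CGFloat
    let dy: CGFloat   // negative = upward in screen coordinates

    var isSubstantiallyUpward: Bool { dy < 0 && abs(dy) > 2 * abs(dx) }
    var isSubstantiallyLateral: Bool { abs(dx) > 2 * abs(dy) }
}

// Returns the index of the application to display after the gesture,
// given the index of the currently displayed application in a list of
// open applications.
func appAfterSwitchGesture(first: GestureSegment,
                           second: GestureSegment,
                           currentIndex: Int,
                           openAppCount: Int) -> Int {
    guard first.isSubstantiallyUpward,
          second.isSubstantiallyLateral,
          openAppCount > 1 else { return currentIndex }
    // Lateral direction chooses the neighboring open app, wrapping around.
    let step = second.dx > 0 ? 1 : -1
    return (currentIndex + step + openAppCount) % openAppCount
}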
Fig. 5 AH-5 AL illustrate swipe gestures that cause a device to navigate between an application user interface and a notification center user interface, according to some embodiments.
Fig. 5AH through 5AJ illustrate a swipe down gesture for navigating from an application user interface to a notification center user interface. In fig. 5AH, device 100 displays an application user interface (e.g., email user interface 5036) in full screen mode. The device detects a contact (e.g., contact 5062) in a second portion of the edge region of touch screen 112 (e.g., region 5003-2) (e.g., in the left top edge region of touch screen 112). In fig. 5 AH-5 AI, the device detects movement of contact 5062 in a first direction (e.g., downward movement 5063 of contact 5062 from a left top edge of touch screen 112 to a middle left portion of touch screen 112). In some embodiments, the movement 5063 of the contact 5062 corresponds to the movement 5019 of the contact 5018 described with respect to fig. 5F to 5H.
In accordance with a determination that movement 5063 of contact 5062 meets a third criterion for displaying a notification center user interface (e.g., user interface 5020), the device initiates display of the notification center user interface. In some embodiments, the third criteria for displaying the notification center user interface correspond to the first criteria for displaying the notification center user interface described above with respect to fig. 5F-5H. As in fig. 5F-5H, the device initiates display of the notification center user interface in accordance with a determination that movement of the contact meets the first criteria for displaying the notification center user interface, regardless of whether movement of the contact is detected when the home screen user interface (e.g., user interface 5002) or the application user interface (e.g., email user interface 5036) is displayed. In some embodiments, the device displays gradual replacement of the application user interface with the notification center user interface in accordance with movement 5063 of contact 5062, as described above with respect to replacing the home screen user interface with the notification center user interface in fig. 5F-5H.
In fig. 5AJ, the device has detected lift-off of contact 5062 at a location in the middle left portion of touch screen 112 (e.g., having met the required movement distance of the third criteria, and/or the required movement speed upon liftoff, etc.); and in accordance with a determination that the swipe down gesture satisfies the third criteria for displaying the notification center user interface when the contact 5062 is lifted off, the device displays the notification center user interface (e.g., user interface 5020) in full screen mode. In accordance with a determination that the movement of contact 5062 does not meet the third criteria for displaying the notification center user interface, the device foregoes displaying the notification center user interface in full screen mode and instead continues displaying the application user interface.
Fig. 5AK-5AL illustrate a swipe-up gesture for navigating from the notification center user interface to a previously displayed application user interface (e.g., dismissing the notification center user interface). For example, in fig. 5AK-5AL, the device replaces the notification center user interface with an email user interface (e.g., user interface 5036) in response to the swipe-up gesture, the email user interface having been displayed prior to detecting the swipe-down gesture in fig. 5AH for displaying the notification center user interface. In fig. 5AK, while displaying the notification center user interface in full screen mode, the device detects a contact (e.g., contact 5064) in a fourth portion of the edge region of touch screen 112 (e.g., region 5003-4 shown in fig. 5AB) (e.g., in the middle bottom edge region of touch screen 112). In some implementations, the device detects at least a portion of the contact 5064 on an edge of the touch screen 112. In fig. 5AK, the device detects movement 5065 in a second direction (e.g., upward movement 5065 of contact 5064 from a middle bottom edge of touch screen 112 to an inner portion of touch screen 112).
In accordance with a determination that movement 5065 of contact 5064 satisfies a first criterion for dismissing the notification center user interface (e.g., user interface 5020) and replacing the notification center user interface with the previously displayed application user interface, the device initiates display of the application user interface. In some embodiments, the first criteria for dismissing the notification center user interface include a requirement that the notification center user interface have previously replaced a user interface (e.g., an application user interface) in order to satisfy the first criteria. For example, display of the notification center user interface is initiated in response to a swipe gesture detected while the application user interface is displayed. In some implementations, the first criteria for dismissing the notification center user interface include a requirement that movement of the contact 5064 begin from a bottom edge region of the touch screen 112 (e.g., region 5003-4 in fig. 5AB) in order to meet the first criteria. In some implementations, the first criteria for dismissing the notification center user interface include a requirement that movement of the contact 5064 begin from a middle bottom edge region of the touch screen 112 (e.g., region 5003-4 in fig. 5AB) in order to meet the first criteria. In some embodiments, the first criteria for dismissing the notification center user interface include a requirement that movement of the contact 5064 be in a substantially upward direction in order to satisfy the first criteria. In some implementations, the device displays gradual replacement of the notification center user interface with the previously displayed application user interface in accordance with movement 5065 of contact 5064. For example, the bottom edge of the notification center user interface slides up in accordance with the movement 5065 of the contact 5064 while the application user interface is displayed in the background.
In fig. 5AL, the device has detected lift-off of contact 5064 at a location in the middle portion of touch screen 112 (e.g., having met the required movement distance of the first criteria for dismissing the notification center user interface, and/or the required movement speed upon liftoff, etc.); and in accordance with a determination that the swipe-up gesture satisfies the first criteria for dismissing the notification center user interface when the contact 5064 is lifted off, the device displays the application user interface (e.g., user interface 5036) in full screen mode. In accordance with a determination that the movement of contact 5064 does not meet the first criteria for dismissing the notification center user interface, the device foregoes displaying the application user interface in full screen mode and instead continues to display the notification center user interface.
In the case where the user interface displayed before the notification center user interface was displayed is the home screen, the dismissal of the notification center user interface described with respect to fig. 5AK-5AL may also be performed to navigate from the notification center user interface back to the home screen. For example, the dismissal of the notification center user interface described in fig. 5AK-5AL may be performed after the operation for navigating from the home screen to the notification center by the swipe down gesture described above with respect to fig. 5F-5H has been performed.
Fig. 5 AM-5 AN illustrate a downward multi-contact swipe gesture that causes a device to navigate from a home screen user interface of the device to a search user interface, according to some embodiments.
In fig. 5AM, device 100 displays a home screen user interface (e.g., user interface 5002) in full screen mode. The device detects a plurality of contacts (e.g., two, three, or four contacts) (e.g., a plurality of contacts 5066 including contacts 5066-1, 5066-2, 5066-3, and 5066-4) in a middle portion of touch screen 112 (e.g., portion 5025 of fig. 5AM located away from an edge of touch screen 112 that corresponds to an inner portion of touch screen 112). In fig. 5AM, the device detects movement of a plurality of contacts in a first direction (e.g., downward movement 5067 of contact 5066 from a middle portion of touch screen 112). In some implementations, the movement 5067 of the contact 5066 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5067 of contact 5066 corresponds to a downward multi-contact swipe starting from a middle portion of touch screen 112 toward a bottom edge of touch screen 112.
In accordance with a determination that movement of contact 5066 meets a seventh criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface. In some embodiments, the device displays progressively replaces the home screen user interface with the search user interface in accordance with movement 5067 of contact 5066, similarly as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the seventh criteria for displaying the search user interface includes a requirement that the device detect a preset number of contacts (e.g., two, three, or four contacts) in order to satisfy the seventh criteria. In some implementations, the seventh criteria for displaying the search user interface includes a requirement that the preset number of contacts be configured in a particular configuration (e.g., that the preset number of contacts be positioned in a particular manner relative to each other) in order to satisfy the seventh criteria. In some embodiments, the seventh criteria for displaying the search user interface includes a requirement that the preset number of contacts move substantially simultaneously in the first direction (e.g., the preset number of contacts move in substantially the same direction at substantially the same speed) in order to meet the seventh criteria. In some embodiments, the seventh criteria for displaying the search user interface includes a requirement that movement of the contact have reached a threshold distance and/or have a threshold speed in order to meet the seventh criteria.
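The multi-contact requirements enumerated above (a preset number of contacts, substantially simultaneous movement in the same direction, at substantially the same speed, past a distance threshold) can be captured in one predicate. The Swift sketch below is hypothetical; the contact count, distance, and speed-tolerance values are illustrative assumptions, not the device's actual thresholds.

import CoreGraphics

struct ContactTrack {
    let displacement: CGVector   // total movement of one contact
    let speed: CGFloat           // points per second
}

func meetsMultiContactSearchCriteria(_ tracks: [ContactTrack],
                                     requiredCount: Int = 3,
                                     minDistance: CGFloat = 100,
                                     speedTolerance: CGFloat = 0.5) -> Bool {
    // Requirement 1: exactly the preset number of contacts.
    guard tracks.count == requiredCount else { return false }

    // Requirement 2: every contact moved substantially downward far enough.
    guard tracks.allSatisfy({ $0.displacement.dy > minDistance &&
                              abs($0.displacement.dy) > 2 * abs($0.displacement.dx) })
    else { return false }

    // Requirement 3: contacts moved at substantially the same speed.
    guard let fastest = tracks.map(\.speed).max(),
          let slowest = tracks.map(\.speed).min(), fastest > 0
    else { return false }
    return (fastest - slowest) / fastest <= speedTolerance
}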
In fig. 5AN, the device has detected liftoff of contact 5066; and in accordance with a determination that movement of contact 5066 meets a seventh criterion for displaying the search user interface, the device displays the search user interface in full screen mode. In some embodiments, replacing home screen user interface 5002 with a search user interface is performed as described with respect to fig. 5A-5C. In accordance with a determination that the movement of contact 5066 does not meet the seventh criterion, the device foregoes displaying the search user interface in full screen mode and instead continues displaying the home screen user interface.
In some embodiments, the device displays the search user interface in response to a single-contact swipe down gesture starting from a first portion of the top edge region of touch screen 112 (e.g., as shown in fig. 5A-5C), a single-contact swipe down gesture starting from a middle portion of touch screen 112 (e.g., as shown in fig. 5M-5N and 5Z-5AA), or a multi-contact swipe down gesture starting from a middle portion of touch screen 112 (e.g., as shown in fig. 5AM-5AN).
Fig. 5 AO-5 AP illustrate an upward multi-touch swipe gesture that causes a device to navigate from a home screen user interface of the device to a multi-tasking user interface according to some embodiments.
In fig. 5AO, the device 100 displays the home screen user interface (e.g., user interface 5002) in full screen mode. The device detects a plurality of contacts (e.g., two, three, or four contacts) (e.g., a plurality of contacts 5068 including contacts 5068-1, 5068-2, 5068-3, and 5068-4) in a middle portion of touch screen 112 (e.g., portion 5025 of fig. 5AM located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5AO, the device detects movement of the plurality of contacts in a second direction (e.g., upward movement 5069 of contact 5068 from a middle portion of touch screen 112). In some implementations, the movement 5069 of the contact 5068 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5069 of contact 5068 corresponds to an upward multi-contact swipe starting from a middle portion of touch screen 112 toward a top edge of touch screen 112.
In accordance with a determination that movement of the contacts 5068 meets a second criterion for displaying a multitasking user interface (e.g., user interface 5060), the device displays the multitasking user interface. The multitasking user interface (e.g., also referred to as an application switcher user interface) includes representations of one or more open application user interfaces having a stored state (e.g., a multitasking user interface including a plurality of cards that are scaled-down images of the last viewed user interfaces of different applications having a stored state). In fig. 5AP, representations of a web browser user interface 5060-1, a photo user interface 5072, and a message user interface 5060-2 are shown. In the configuration shown in fig. 5AP, the representations of the user interfaces of the one or more applications having a stored state are arranged next to each other, rather than in a stacked manner as shown in fig. 5AD. In some embodiments, the device displays gradual replacement of the home screen user interface with the multitasking user interface in accordance with movement 5069 of contact 5068, similarly as described above with respect to replacing the application user interface with the multitasking user interface in fig. 5AB-5AD. In some embodiments, the second criteria for displaying the multitasking user interface include a requirement that the device detect a preset number of contacts 5068 (e.g., two, three, or four contacts) in order to meet the second criteria. In some implementations, the second criteria for displaying the multitasking user interface include a requirement that the preset number of contacts 5068 be configured in a particular configuration (e.g., that the preset number of contacts be positioned in a particular manner relative to each other) in order to satisfy the second criteria. In some embodiments, the second criteria for displaying the multitasking user interface include a requirement that the preset number of contacts 5068 move substantially simultaneously in the second direction (e.g., the preset number of contacts move in substantially the same direction and at substantially the same speed) in order to meet the second criteria. In some implementations, the second criteria for displaying the multitasking user interface include a requirement that the movement of contact 5068 have reached a threshold distance and/or have a threshold speed in order to meet the second criteria.
In fig. 5AP, the device has detected lift-off of contact 5068; and in accordance with a determination that movement of contact 5068 meets the second criteria for displaying the multitasking user interface, the device displays the multitasking user interface in full screen mode. In some embodiments, replacing the home screen user interface with the multitasking user interface is performed as described with respect to fig. 5AB-5AD. In accordance with a determination that the movement of contact 5068 does not meet the second criteria for displaying the multitasking user interface, the device foregoes displaying the multitasking user interface and instead displays the home screen user interface.
In some implementations, the device displays the multi-tasking user interface in response to an upward swipe gesture (e.g., shown in fig. 5 AB-5 AD) starting from a fourth portion of the bottom edge region of the touch screen 112 or a multi-contact upward swipe gesture (e.g., shown in fig. 5 AO-5 AP) starting from a middle portion of the touch screen 112.
Fig. 5 AQ-5 AR illustrate an upward multi-contact swipe gesture that causes a device to navigate from an application user interface to a home screen user interface of the device, according to some embodiments.
In fig. 5AQ, device 100 displays an application user interface (e.g., photo user interface 5072) in full screen mode. The device detects a plurality of contacts (e.g., a plurality of contacts 5070 including contacts 5070-1, 5070-2, 5070-3, and 5070-4) in a middle portion of touch screen 112 (e.g., portion 5025 in fig. 5AM located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5AQ, the device further detects movement of the plurality of contacts in a second direction (e.g., upward movement 5071 of contact 5070 from a middle portion of touch screen 112). In some implementations, the movement 5071 of the contact 5070 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5071 of contact 5070 corresponds to an upward multi-contact swipe starting from a middle portion of touch screen 112 toward a top edge of touch screen 112.
In accordance with a determination that movement of contact 5070 meets a second criterion for displaying a home screen user interface (e.g., user interface 5002), the device displays the home screen user interface. In some embodiments, the device displays gradual replacement of the application user interface with the home screen user interface in accordance with movement 5071 of contact 5070, similarly as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the second criteria for displaying the home screen user interface include a requirement that the device detect a preset number of contacts 5070 (e.g., two, three, or four contacts) in order to meet the second criteria. In some embodiments, the second criteria for displaying the home screen user interface include a requirement that the device detect the preset number of contacts 5070 while displaying the application user interface (e.g., photo user interface 5072) in order to satisfy the second criteria. In some embodiments, the second criteria for displaying the home screen user interface include a requirement that the preset number of contacts 5070 be configured in a particular configuration (e.g., that the preset number of contacts be positioned in a particular manner relative to each other) in order to satisfy the second criteria. In some embodiments, the second criteria for displaying the home screen user interface include a requirement that the preset number of contacts 5070 move substantially simultaneously in the second direction (e.g., the preset number of contacts move in substantially the same direction at substantially the same speed) in order to meet the second criteria. In some implementations, the second criteria for displaying the home screen user interface include a requirement that the movement of contact 5070 have reached a threshold distance and/or have a threshold speed in order to meet the second criteria.
In fig. 5AR, the device has detected lift-off of contact 5070; and in accordance with a determination that movement of contact 5070 meets the second criteria for displaying the home screen user interface, the device displays the home screen user interface. In some embodiments, replacing the application user interface with the home screen user interface is performed as described with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In accordance with a determination that the movement of contact 5070 does not meet the second criteria for displaying the home screen user interface, the device foregoes displaying the home screen user interface and instead continues displaying the application user interface.
Fig. 5 AS-5 AT illustrate right-to-left or left-to-right multi-touch swipe gestures that cause a device to switch between user interfaces of applications having a stored state, according to some embodiments.
In fig. 5AS, device 100 displays an application user interface (e.g., photo user interface 5072) in full screen mode. The device detects a plurality of contacts (e.g., a plurality of contacts 5074 including contacts 5074-1, 5074-2, 5074-3, and 5074-4) in a middle left portion of touch screen 112 (e.g., region 5073 in fig. 5AS positioned in an interior portion of touch screen 112 and near a left edge of touch screen 112). In fig. 5AS, the device further detects movement of the plurality of contacts in a third direction (e.g., left-to-right movement 5075 of contact 5074 from the middle left portion of touch screen 112). In some implementations, the movement 5075 of the contact 5074 is substantially parallel to the top edge of the touch screen 112. In some implementations, movement 5075 of contact 5074 corresponds to a substantially horizontal (e.g., left-to-right or right-to-left) multi-contact swipe starting from a middle left portion of touch screen 112 toward a middle right portion of touch screen 112.
In accordance with a determination that movement of the contacts 5074 meets a second criterion for switching between opened user interfaces (e.g., switching from the photo user interface 5072 to the web browser user interface 5060-1), the device switches between the opened user interfaces. In some implementations, switching between the opened user interfaces includes: replacing the user interface of a first application (e.g., photo user interface 5072 in fig. 5AS) with the user interface of a second application (e.g., web browser user interface 5060-1 in fig. 5AT), where the first application and the second application are currently open on the device. In some implementations, initiating the switch between the opened user interfaces includes: sliding the user interface of the second application (e.g., web browser user interface 5060-1) from the left edge of touch screen 112 toward the center of touch screen 112 while gradually revealing the user interface of the second application in accordance with movement 5075 of contact 5074. In some embodiments, the second criteria for switching between opened user interfaces include a requirement that two or more user interfaces of different applications be open on the device in order to meet the second criteria. In some implementations, the second criteria for switching between the opened user interfaces include a requirement that movement of the contacts 5074 be detected while an application user interface is displayed in order to meet the second criteria. In some implementations, the second criteria for switching between the opened user interfaces include a requirement that movement of the contacts 5074 begin from a middle left portion of the touch screen 112 (e.g., region 5073 in fig. 5AS) in order to meet the second criteria. In some embodiments, the second criteria for switching between the opened user interfaces include a requirement that movement of the contacts 5074 be in a substantially lateral (e.g., horizontal) direction (e.g., movement 5075 in fig. 5AS) in order to meet the second criteria.
In fig. 5AT, the device has detected lift-off of contact 5074 at a location in the middle right portion of touch screen 112 (e.g., having met the required movement distance of the second criteria for switching between opened user interfaces, and/or the required movement speed upon liftoff, etc.); and in accordance with a determination that the lateral swipe gesture satisfies the second criteria for switching between opened user interfaces when contact 5074 is lifted off, the device displays the user interface of the second application (e.g., web browser user interface 5060-1) in full screen mode. In some implementations, the device replaces the user interface of the first application (e.g., photo user interface 5072) with the user interface of the second application (e.g., web browser user interface 5060-1). In some implementations, the device stops displaying the user interface of the first application.
Fig. 5 AU-5 AV illustrate downward multi-contact swipe gestures that cause a device to navigate from an application user interface of the device to a search user interface, according to some embodiments.
In fig. 5AU, the device 100 displays an application user interface (e.g., photo user interface 5072) in full screen mode. The device detects a plurality of contacts (e.g., a plurality of contacts 5076 including contacts 5076-1, 5076-2, 5076-3, and 5076-4) in a middle portion of touch screen 112 (e.g., portion 5025 in FIG. 5AM located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5AU, the device further detects movement of the plurality of contacts in a first direction (e.g., downward movement 5077 of contact 5076 from the middle portion of touch screen 112). In some implementations, the movement 5077 of the contact 5076 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5077 of contact 5076 corresponds to a downward multi-contact swipe starting from a middle portion of touch screen 112 toward a bottom edge of touch screen 112.
In accordance with a determination that movement of contact 5076 meets an eighth criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface. In some embodiments, the device displays gradual replacement of the application user interface with the search user interface in accordance with movement 5077 of contact 5076, similarly as described above with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In some embodiments, the eighth criteria for displaying the search user interface correspond to the seventh criteria for displaying the search user interface described above with respect to fig. 5AM-5AN. The device thus initiates display of the search user interface in accordance with a determination that movement of the contacts satisfies the criteria for displaying the search user interface, regardless of whether movement of the contacts is detected while the home screen user interface (e.g., user interface 5002) or the application user interface (e.g., photo user interface 5072) is displayed.
In fig. 5AV, the device has detected the lift-off of contact 5076; and in accordance with a determination that movement of contact 5076 meets the eighth criterion for displaying the search user interface, the device displays the search user interface in full screen mode. In some embodiments, replacing the application user interface with the search user interface is performed as described with respect to replacing the home screen user interface with the search user interface in fig. 5A-5C. In accordance with a determination that the movement of contact 5076 does not meet the eighth criterion, the device foregoes displaying the search user interface and instead continues displaying the application user interface.
Fig. 5 AW-5 AX illustrate a downward multi-contact swipe gesture that causes a device to navigate from an application user interface of the device to a search user interface, according to some embodiments.
In fig. 5AW, the device 100 displays an application user interface (e.g., web browser user interface 5060-1) in full screen mode. The application user interface in fig. 5AW (e.g., web browser user interface 5060-1) is different from the application user interface in fig. 5AU (e.g., photo user interface 5072). The device detects a plurality of contacts (e.g., a plurality of contacts 5078 including contacts 5078-1, 5078-2, 5078-3, and 5078-4) in a middle portion of touch screen 112 (e.g., portion 5025 in fig. 5AM located away from an edge of touch screen 112, which corresponds to an inner portion of touch screen 112). In fig. 5AW, the device further detects movement of the plurality of contacts in a first direction (e.g., downward movement 5079 of contact 5078 from the middle portion of touch screen 112). In some implementations, the movement 5079 of the contact 5078 is substantially perpendicular to the top edge of the touch screen 112. In some implementations, movement 5079 of contact 5078 corresponds to a downward multi-contact swipe starting from a middle portion of touch screen 112 toward a bottom edge of touch screen 112.
In accordance with a determination that the movement of contact 5078 meets the eighth criterion for displaying a search user interface (e.g., user interface 5006), the device displays the search user interface. The device thus initiates display of the search user interface in accordance with a determination that movement of the contacts meets the eighth criteria for displaying the search user interface, regardless of whether movement of the contacts is detected when the photo user interface 5072 or the web browser user interface 5060-1 is displayed.
Fig. 6A-6J are flowcharts illustrating methods of interacting with an application switching user interface according to some embodiments. Although some of the examples that follow will be given with reference to inputs on a touch-sensitive display (where the touch-sensitive surface and the display are combined), in some implementations, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in fig. 4B.
In some embodiments, methods 6000 and 6100 are performed by an electronic device (e.g., portable multifunction device 100 of fig. 1A) and/or one or more components of an electronic device (e.g., I/O subsystem 106, operating system 126, etc.). In some embodiments, methods 6000 and 6100 are managed by instructions stored on a non-transitory computer readable storage medium and executed by one or more processors of the device, such as one or more processors 122 (fig. 1A) of device 100. For ease of explanation, methods 6000 and 6100 performed by the device 100 are described below. In some implementations, referring to fig. 1A, the operations of method 6000 are performed or used, at least in part, by operating system 126, communication module 128, and/or graphics module 132, and a touch-sensitive display (e.g., touch screen 112). Some operations in methods 6000 and 6100 are optionally combined and/or the order of some operations is optionally changed.
Methods 6000 and 6100 (and associated user interfaces) provide an intuitive way to invoke search features, such as displaying a search user interface in response to gestures detected in different contexts (e.g., while displaying an application user interface, a wake screen user interface, a cover user interface, or a home screen user interface, etc.), as described below. These methods reduce the number, extent, and/or nature of inputs from a user when performing search operations, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to perform search operations faster and more efficiently conserves power and increases the time between battery charges.
Method 6000 is performed at an electronic device having a display generating component (e.g., a touch screen display, a projector, a standalone display, a heads-up display, a head-mounted display, etc.) and a touch-sensitive surface.
In method 6000, the device displays (6002) a first user interface (e.g., a home screen, lock screen, wake screen, cover, application UI, etc.) in a display area via a display generating component (e.g., home screen user interface 5002 displayed on touch screen 112 in fig. 5A). In some implementations, the display generating component and the touch-sensitive surface are integrated into a touch screen display. In some implementations, the display generating component is a display housed separately from the touch-sensitive surface, and respective locations on the touch-sensitive surface optionally have corresponding locations on the display. In some implementations, respective locations on the touch-sensitive surface have corresponding locations in a display area provided via the display generating component (e.g., a spatial area having a limited and preset spatial extent in which computer-generated content can be visually presented), and respective movements of contacts across the touch-sensitive surface correspond to movements (e.g., in magnitude, speed, direction, etc.) across the display area provided by the display generating component. In some embodiments, the first user interface occupies substantially all of the display area provided by the display generating component and is a full screen user interface. In some embodiments, the first user interface reaches a boundary of the display area provided by the display generating component, wherein the boundary corresponds to a first edge of the display area. While displaying the first user interface, the device detects (6004) a first touch gesture, including detecting a first set of one or more contacts on the touch-sensitive surface (e.g., a flick gesture, a touch-and-hold gesture, a swipe gesture, a multi-finger pinch gesture, a multi-finger swipe gesture, or a gesture comprising multiple types of movements, such as movements in two or more directions, pinch movements, spread movements, pivoting movements, etc.) of one or more simultaneously detected contacts provided by one or more fingers or a stylus. In response to detecting the first touch gesture (6006), and in accordance with a determination that the first touch gesture meets a first criterion, the device replaces (6008) at least a portion of the first user interface with a first search user interface in the display area, wherein the first criterion includes a first requirement that is met in accordance with a determination that the first touch gesture includes a first movement of the first contact in a first direction across a first portion (e.g., region 5003-1; e.g., the left half of the top edge, the right quarter of the top edge, the top half of the left edge, the bottom half of the left edge, the left third of the bottom edge, the right third of the bottom edge, etc.) of a first edge of the touch-sensitive surface (e.g., a gesture corresponding to movement 5005 of contact 5004 in figs. 5A-5C) (e.g., the first movement is in a direction substantially perpendicular to the first edge and/or across the first portion of the first edge toward a center or central portion of the touch-sensitive surface, such as downward from the top edge, leftward from the right edge, or rightward from the left edge). This is shown, for example, in figs. 5B-5C following fig. 5A, where the device displays search user interface 5006 in response to a downward swipe gesture by contact 5004 starting from the middle top edge region of touch screen 112 (e.g., region 5003-1 in fig. 5A).
In some implementations, the first user interface is shifted in the display area to make room for display of the first search user interface, which includes the first search input area and, optionally, background layers and/or other user interface objects (e.g., suggested searches, dynamically generated search results, etc.). In some implementations, the first user interface fades into the background to allow the first search user interface to be displayed in a display layer on top of the first user interface. In some implementations, the first user interface is shrunk in size to make room for display of the first search user interface. In some implementations, the first user interface recedes in depth in a direction away from the user, and the first search user interface is displayed in the display layer previously occupied by the first user interface. In some embodiments, the first user interface is completely replaced in the display area by the first search user interface, which includes the first search input area. In some implementations, in response to detecting the first touch gesture, at least a portion of the first user interface remains displayed concurrently with the first search input area (e.g., some elements of the first user interface remain displayed and concurrently visible with the first search input area in the first search user interface). In some implementations, the first user interface ceases to be displayed in response to detecting the first touch gesture. In some implementations, the first search user interface including the first search input area slides in from an edge of the touch screen according to the direction of movement of the swipe input of the first touch gesture, while the first user interface recedes in a direction away from the surface of the touch screen. In some embodiments, the first search user interface including the first search input area has a translucent background, and after the first search user interface is fully displayed on the touch screen, the background partially reveals the first user interface in a display layer below the first search user interface, e.g., as a blurred and/or darkened image of the first user interface. In accordance with a determination that the first touch gesture meets a second criterion different from the first criterion, the device replaces (6010) at least a portion of the first user interface with a plurality of previously received notifications in the display area (e.g., notification center user interface 5020 in fig. 5H), wherein the second criterion includes a second requirement that is met in accordance with a determination that the first touch gesture includes a second movement (e.g., a gesture corresponding to movement 5019 of contact 5018 in figs. 5F-5H) of the first contact in the first direction across a second portion (e.g., area 5003-2; e.g., the right quarter of the top edge, the left half of the top edge, the bottom half of the left edge, the top half of the left edge, the right third of the bottom edge, the left third of the bottom edge, etc.)
of the first edge (e.g., a second movement across the second portion of the first edge in a direction substantially perpendicular to the first edge and/or toward a center or central portion of the touch-sensitive surface, such as downward from the top edge, leftward from the right edge, or rightward from the left edge), where the second portion of the first edge is different from the first portion of the first edge (e.g., the first movement and the second movement cross the same edge in different predefined interaction areas). This is shown, for example, in figs. 5G-5H following fig. 5F, where the device displays notification center user interface 5020 in response to a downward swipe gesture by contact 5018 starting from the left top edge area of touch screen 112 (e.g., area 5003-2 in fig. 5F). For example, in some embodiments, instead of replacing a portion of the first user interface with the first search user interface, the device replaces a portion of the first user interface with a user interface that includes a plurality of previously received notifications. In some implementations, the first user interface is shifted in the display area to make room for display of the previously received notifications. In some implementations, the first user interface fades into the background to allow the previously received notifications to be displayed on top of the first user interface. In some embodiments, the first user interface is shrunk in size to make room for display of the previously received notifications. In some implementations, the first user interface is completely replaced in the display area by another user interface that includes the previously received notifications. In some implementations, in response to detecting the first touch gesture, at least a portion of the first user interface remains displayed concurrently with the previously received notifications. In some implementations, the first user interface ceases to be displayed in response to detecting the first touch gesture. In some implementations, the plurality of previously received notifications and the first search input area are not displayed simultaneously in response to the first touch gesture (e.g., the plurality of previously received notifications are displayed without displaying the first search input area, and the first search input area is displayed without displaying the plurality of previously received notifications). In some implementations, in response to detection of the first touch gesture, the plurality of previously received notifications are not displayed simultaneously with the first search input area in the display area provided by the display generating component, because the first touch gesture cannot satisfy both the first criterion and the second criterion. In some implementations, the user interface including the previously received notifications slides in from an edge of the touch screen according to the direction of movement of the swipe input of the first touch gesture, while the first user interface recedes in a direction away from the surface of the touch screen. In some embodiments, the user interface including the previously received notifications has a semi-transparent background, and after that user interface is fully displayed on the touch screen, the background partially reveals the first user interface in a display layer below it, e.g., as a blurred and/or darkened image of the first user interface.
In some implementations, the first user interface is obscured more by the user interface including the previously received notifications than by the first search user interface including the first search input area. In some implementations, the first user interface is obscured less, or equally, by the user interface including the previously received notifications than by the first search user interface including the first search input area. In some implementations, the first criterion and the second criterion cannot be met simultaneously (e.g., the downward swipe gesture is either on the first portion of the top edge region of touch screen 112 (e.g., region 5003-1 in fig. 5A) or on the second portion of the top edge region (e.g., region 5003-2 in fig. 5F)).
Displaying a user interface having a search input area in accordance with a determination that the touch gesture includes movement across a first portion of an edge of the touch-sensitive surface and displaying a user interface having a plurality of previously received notifications in accordance with a determination that the touch gesture includes movement across a second portion of the edge of the touch-sensitive surface provides additional control options for displaying the user interface without requiring display of user interface controls for invoking the user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
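As a rough illustration of this first-criterion/second-criterion routing, the sketch below dispatches a downward edge swipe based on where along the top edge it began. The one-third boundary between the two portions is an assumed value; the description requires only that the portions be distinct.

```swift
import CoreGraphics

// Actions associated with the two top-edge portions described above.
enum TopEdgeSwipeAction {
    case showNotificationCenter   // second portion (e.g., region 5003-2, left of the top edge)
    case showSearchUserInterface  // first portion (e.g., region 5003-1, middle of the top edge)
}

// Dispatch on the normalized x-coordinate where the swipe crossed the edge.
// The one-third boundary is an assumption for illustration only.
func action(forTopEdgeSwipeStartingAt startX: CGFloat,
            screenWidth: CGFloat) -> TopEdgeSwipeAction {
    let normalized = startX / screenWidth
    return normalized < 1.0 / 3.0 ? .showNotificationCenter : .showSearchUserInterface
}
```

Keeping the dispatch a pure function of the crossing point fits the point made below that the portions need no visible boundaries or on-screen controls.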
In some embodiments, the first portion of the first edge and the second portion of the first edge (e.g., regions 5003-1 and 5003-2 of the top edge of touch screen 112, shown in figs. 5A and 5F, respectively) are operatively adjacent (6012) to each other (e.g., the first portion of the first edge is immediately adjacent (in close proximity) to the second portion of the first edge; there is no other interaction region between the first portion of the first edge and the second portion of the first edge; there is no other input region between the first portion of the first edge and the second portion of the first edge; etc.). In some embodiments, the edge portions of the display area corresponding respectively to the first portion and the second portion of the first edge are not visually marked with boundaries on the display area. For example, no graphical user interface elements are displayed at or near the portions of the display area that correspond to the first and second portions of the first edge (e.g., the first and second portions of the first edge of the touch screen, or the first and second edge portions along the first edge of the display area corresponding respectively to the first and second portions of the touch-sensitive surface, etc.). In some implementations, the display area displays portions of the first user interface, which are later replaced by the first search user interface or the previously received notifications, at or near the locations corresponding to the first portion and the second portion of the first edge (e.g., an edge portion of the touch screen, an edge portion of the display area provided by the display generating component, etc.).
Displaying the different user interfaces in response to touch gestures comprising movements across different, operatively adjacent portions of the edge of the touch-sensitive surface provides additional control options for displaying the user interfaces without requiring display of user interface controls for the different user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, replacing the portion of the first user interface with the first search user interface in the display area includes (6014) displaying a search input area (e.g., search input area 5008, along with area 5006-1 including suggested actions and area 5006-2 including suggested applications, in fig. 5C) in a respective portion of the display area, and replacing the portion of the first user interface with the plurality of previously received notifications includes displaying the notifications in a respective portion of the display area. In some implementations, a respective portion of the display area provided by the display generating component displays at least a portion of the first search input area in accordance with a determination that the first touch gesture meets the first criterion, and displays at least a portion of the plurality of previously received notifications in accordance with a determination that the first touch gesture meets the second criterion. In other words, in some embodiments, the display area displaying the first search input area and the display area displaying the previously received notifications are the same display area, or overlap. Displaying the different user interfaces on respective portions of the display area in accordance with a determination that the touch gesture meets the first criterion or the second criterion provides additional control options for displaying the different user interfaces without displaying user interface controls for the different user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, detecting the first touch gesture includes detecting (6016) movement of the first set of one or more contacts from outside the touch-sensitive surface, across a first edge of the touch-sensitive surface (e.g., the top edge of touch screen 112), onto the touch-sensitive surface (e.g., movement 5005 of contact 5004), where the first direction is substantially perpendicular to the first edge of the touch-sensitive surface (e.g., movement 5005 of contact 5004 is substantially downward) (e.g., the first edge is the top edge of the touch screen display area and the first touch gesture is a downward swipe from the top edge; the first edge is the left edge of the touch screen display and the first touch gesture is a rightward swipe from the left edge; and so forth).
Displaying different user interfaces in response to a touch gesture (e.g., an edge gesture) that includes movement of a contact from outside the touch-sensitive surface onto the touch-sensitive surface across different portions of an edge of the touch-sensitive surface provides additional control options for displaying the different user interfaces without displaying user interface controls for the different user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture satisfies a third criterion, the device replaces (6018) at least a portion of the first user interface with the first search user interface in the display area, wherein the third criterion includes a third requirement that is satisfied in accordance with a determination that: the first touch gesture includes a third movement of the first contact across the touch-sensitive surface in the first direction; and the third movement begins in a first interior portion of the touch-sensitive surface that is outside of and located away from the first edge (e.g., separated from the first edge by one or more other input regions (e.g., one or more edge input regions and/or one or more interior input regions, etc.) on the touch-sensitive surface) (and, optionally, separated from other edges of the touch-sensitive surface). These features are shown, for example, in figs. 5Z-5AA, where the device replaces the home screen user interface (e.g., user interface 5002) with the search user interface (e.g., user interface 5006) in response to a downward swipe gesture starting from the middle portion of touch screen 112. In some implementations, the first search user interface including the first search input area displayed in response to the touch gesture meeting the first criterion and the first search user interface including the first search input area displayed in response to the touch gesture meeting the third criterion are the same search user interface having the same first search input area. In some implementations, the first search user interface including the first search input area has a semi-transparent background that partially reveals the first user interface as a blurred and/or darkened image of the first user interface (e.g., home screen, application user interface, wake screen, cover, lock screen, notification history user interface, etc.) that has receded into a display layer below the first search input area. In some implementations, the first search input area is a user interface object displayed on the first user interface in response to a touch gesture that meets the first criterion and/or the third criterion (e.g., in an original state, or, optionally, in a blurred and/or darkened version of the original state, etc.). In some implementations, the third criterion requires that the first user interface not already have another function associated with a swipe gesture that meets the direction and location requirements of the third criterion. In some implementations, the first criteria, the second criteria, and the third criteria cannot be met simultaneously (e.g., the downward swipe gesture is on a first portion of the top edge region of touch screen 112 (e.g., region 5003-1 in fig. 5A), on a second portion of the top edge region (e.g., region 5003-2 in fig. 5F), or on a third portion of the top edge region (e.g., region 5003-3 in fig. 5F)).
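The caveat noted above, that the third criterion yields to an application function already bound to the same swipe, can be pictured as a simple dispatch. This is a hypothetical sketch; the hook name and structure are not from the description.

```swift
// Hypothetical dispatch for a single-contact downward swipe that begins in the
// interior of the touch-sensitive surface (third criterion). If the current
// user interface already associates this gesture with an application function
// (e.g., scrolling, drawing a line), that function wins; otherwise the search
// user interface is shown.
func handleInteriorSwipeDown(uiClaimsGesture: Bool,
                             performApplicationFunction: () -> Void,
                             presentSearchUserInterface: () -> Void) {
    if uiClaimsGesture {
        performApplicationFunction()
    } else {
        presentSearchUserInterface()
    }
}
```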
Displaying the user interface with the search input area in accordance with determining that the touch gesture includes movement across a first portion of an edge of the touch-sensitive surface or in accordance with determining that the touch gesture includes movement across the touch-sensitive surface starting from an interior portion of the touch-sensitive surface provides additional control options for displaying the user interface without displaying user interface controls of the user interface with the search input area. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture satisfies a fourth criterion, the device replaces (6020) at least a portion of the first user interface with a control panel user interface, wherein the fourth criterion includes a fourth requirement that is satisfied in accordance with a determination that: the first touch gesture includes a fourth movement of the first contact in the first direction across a third portion of the first edge of the touch-sensitive surface that is different from the first portion and the second portion of the first edge (e.g., region 5003-3 of the top edge of touch screen 112 in fig. 5O) (e.g., a third portion, operatively adjacent to the first portion or the second portion, that is an end portion of the first edge (e.g., the left top corner of the top edge or the right top corner of the top edge, etc.)). These features are shown, for example, in figs. 5O-5Q, where the device replaces a portion of the home screen user interface (e.g., user interface 5002) with the control panel user interface (e.g., user interface 5034) in response to a downward swipe gesture (e.g., movement 5033 of contact 5032) starting from the right top edge region (e.g., region 5003-3) of touch screen 112. The control panel user interface includes a plurality of user interface objects (e.g., icons 5034-1 to 5034-14) (e.g., buttons, sliders, dials, etc.) corresponding to different device control functions (e.g., turning WiFi on/off, turning airplane mode on/off, turning other network connection modes on/off, playback control of a media player, display brightness control, audio volume control, turning a mute mode for calls and notifications on/off, turning a flashlight on/off, turning a camera on/off, etc.).
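With the fourth criterion added, the two-way top-edge dispatch sketched earlier becomes a three-way one. The region boundaries below remain assumptions; the description fixes only that the three portions of the edge are distinct.

```swift
import CoreGraphics

// The three top-edge portions discussed so far, as a sketch. Boundaries are
// illustrative: left portion -> notifications, middle -> search, right
// corner -> control panel (e.g., regions 5003-2, 5003-1, and 5003-3).
enum TopEdgeRegion {
    case notifications, search, controlPanel

    init(startX: CGFloat, screenWidth: CGFloat) {
        switch startX / screenWidth {
        case ..<(1.0 / 3.0): self = .notifications
        case ..<0.75:        self = .search
        default:             self = .controlPanel
        }
    }
}
```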
Displaying a user interface with a search input area in accordance with determining that the touch gesture includes movement across a first portion of an edge of the touch-sensitive surface, displaying a user interface with a plurality of previously received notifications in accordance with determining that the touch gesture includes movement across a second portion of the edge of the touch-sensitive surface, and displaying a user interface with a control panel in accordance with determining that the touch gesture includes movement across a third portion of the edge of the touch-sensitive surface provides additional control options for displaying the user interfaces without displaying user interface controls for the user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, the first criteria and the second criteria (and optionally, the third criteria, the fourth criteria, and/or the fifth criteria, etc.) can be met (6022) while an application user interface is displayed (e.g., the first user interface is a user interface of a first application, such as a messaging application, a map application, a game application, a media player application, a browser application, etc.). This is illustrated, for example, in figs. 5R-5S, where in response to detecting a downward swipe gesture (e.g., movement 5039 of contact 5038) starting from the top middle edge region of touch screen 112, the device replaces the application user interface (e.g., email user interface 5036) with the search user interface (e.g., user interface 5006). For example, the user interface of the first application is displayed in response to activation of an application icon corresponding to the first application (e.g., by a tap input, a voice command, etc.). In some implementations, the user interface of the first application has an application function associated with a touch gesture that satisfies the third criterion (e.g., scrolling within the application user interface, or drawing a line within the application user interface, etc.), and if such a touch gesture is detected while the first user interface is the user interface of the first application, the touch gesture that satisfies the third criterion does not trigger display of the first search user interface (e.g., it instead triggers the application function, or requires additional confirmation input, etc.); only a touch gesture that satisfies the first criterion (and, optionally, a touch gesture that satisfies the fifth criterion) triggers display of the first search user interface.
Displaying a user interface with a search input area or a user interface with a plurality of previously received notifications in response to a touch gesture received while displaying an application user interface provides a way to display the user interface with a reduced amount of input (e.g., without having to close the displayed application user interface and open a home screen before providing a touch gesture for displaying the user interface). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, the first criteria and the second criteria (and optionally, the third criteria, the fourth criteria, and/or the fifth criteria, etc.) can be satisfied (6024) (e.g., met when the condition, and optionally one or more other conditions, are satisfied) while a home screen user interface including a plurality of application icons corresponding to different applications is displayed (e.g., the first user interface is a home screen user interface). This is illustrated, for example, in figs. 5A-5C, where in response to detecting a downward swipe gesture (e.g., movement 5005 of contact 5004) starting from the top middle edge region of touch screen 112, the device replaces the home screen user interface (e.g., user interface 5002) with the search user interface (e.g., user interface 5006). A respective application icon of the plurality of application icons (e.g., icon 440 for the alarm clock, icon 432 for the online video module, icon 434 for the stocks widget, icon 436 for the map, and icon 438 for the weather widget), when activated according to preset criteria (e.g., by a tap input, by a double-tap input, etc.), causes display of a respective one of the different applications corresponding to the respective application icon. For example, in response to detecting a tap gesture on an application icon (e.g., icon 440 for the alarm clock), the device displays the user interface corresponding to the respective icon (e.g., an alarm clock user interface). In some implementations, the home screen user interface is displayed in response to activation of a home button or detection of a home gesture, and replaces the display of the currently displayed user interface in response to that activation or detection. In some implementations, the home screen user interface includes a plurality of pages through which a user can navigate using one or more navigational inputs (e.g., horizontal swipes, vertical swipes, etc.).
Displaying a user interface with a search input area or a user interface with a plurality of previously received notifications in response to a touch gesture received while displaying a home screen user interface provides a way to display the user interface with a reduced amount of input (e.g., without having to scroll through a plurality of application icons on the home screen and provide input on a corresponding icon). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, the first criteria and the second criteria (and optionally, the third criteria, the fourth criteria, and/or the fifth criteria, etc.) can be satisfied (6026) while a wake screen user interface is displayed (e.g., the first user interface is a wake screen user interface), the wake screen user interface being displayed in response to detecting a request to wake the display generating component from a low power mode (e.g., a low power always-on mode, a display-off mode, a power saving sleep mode, etc.). These features are illustrated, for example, in figs. 5T-5W, where the device replaces the wake screen user interface 5042 with the search user interface (e.g., user interface 5006) in response to detecting a downward swipe gesture (e.g., movement 5053 of contact 5054) starting from the middle top edge region of touch screen 112. The wake screen user interface is displayed in response to detecting a request to wake the display generating component from a low power mode (e.g., low power mode 5040 in fig. 5T). In some implementations, the wake screen user interface is initially displayed in a locked state and later transitions to an unlocked state after authentication information has been obtained (e.g., through password entry or biometric verification). In some embodiments, the wake screen user interface includes a time element showing the current time and, optionally, the date. In some embodiments, when the device is locked, the wake screen user interface displays a prompt to unlock the device. In some embodiments, the wake screen displays one or more notifications when the notifications are newly received (and, optionally, when the notifications remain unread in the always-on low power mode) and when the display generating component wakes from the low power mode and enters the normal operating mode.
Displaying a user interface with a search input area or a user interface with a plurality of previously received notifications in response to a touch gesture received while displaying a wake screen user interface provides a way to display a frequently used user interface with a reduced amount of input (e.g., without having to close the displayed wake screen user interface and open a home screen before providing a touch gesture for opening the frequently used user interface). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, the first and second criteria (and optionally, the third, fourth, and/or fifth criteria, etc.) can be met (6028) while an overlay screen user interface is displayed (e.g., the first user interface is an overlay screen user interface) that, in response to a preset touch gesture, overlays the currently displayed user interface in multiple contexts (e.g., over different application user interfaces). For example, the cover user interface (e.g., user interface 5050) shown in fig. 5X has the same features as the wake screen user interface (e.g., user interface 5043) shown in figs. 5U-5V. In figs. 5U-5W, the wake screen user interface 5043 is replaced with the search user interface (e.g., user interface 5006) in response to detecting a downward swipe gesture (e.g., movement 5053 of contact 5054) starting from the middle top edge region of touch screen 112. The currently displayed user interface is redisplayed when the overlay screen user interface is dismissed. For example, where the cover user interface is displayed in response to a request to display the cover user interface while an application user interface is displayed, the device redisplays the application user interface in response to a request to dismiss the cover user interface. In some embodiments, the overlay screen user interface has an appearance that is the same as or similar to the appearance of the wake screen user interface or the lock screen user interface, and has the same time element showing the current time. In some embodiments, unlike the wake screen user interface, display of the overlay screen user interface does not need to be initiated while the device is in a low power mode, and displaying it does not cause the device to become locked (e.g., dismissing the overlay screen user interface does not require re-entry of authentication information). In some embodiments, in accordance with the first criteria and the second criteria being capable of being met while any of a plurality of user interfaces (e.g., a wake screen user interface, a lock screen user interface, an overlay screen user interface, a home screen user interface, an application user interface, etc.) is displayed, the device detects a touch gesture while a respective user interface is displayed, and in response to detecting the touch gesture, the device determines whether the respective user interface is one of the plurality of user interfaces; in accordance with a determination that the respective user interface is one of the plurality of user interfaces, the device evaluates the gesture against the first criteria and the second criteria (and the other criteria described herein) to determine an operation to be performed (e.g., displaying the first search user interface, the previously received notifications, the control panel user interface, etc.). In some embodiments, when a touch gesture is detected while different ones of the plurality of user interfaces are respectively displayed, the device performs the above-mentioned evaluation and operation sequentially at the different times.
Displaying a user interface with a search input area or a user interface with a plurality of previously received notifications in response to a touch gesture received while displaying a cover user interface provides a way to display the user interface with a reduced amount of input (e.g., without having to close the displayed cover user interface and open the home screen before providing a touch gesture for displaying the user interface). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, in response to detecting the first touch gesture and in accordance with a determination that the first touch gesture satisfies a fifth criterion, the device replaces (6030) at least a portion of the first user interface with the first search input area in the display area, wherein the fifth criterion includes a fifth requirement that is satisfied in accordance with a determination that: the first touch gesture includes a fifth movement, in the first direction across the touch-sensitive surface, of a preset number of simultaneously detected contacts (e.g., the preset number of contacts are two or more simultaneously detected contacts, two simultaneously detected contacts, three simultaneously detected contacts, and/or four simultaneously detected contacts, etc.), the fifth movement beginning in a respective portion (e.g., an interior portion) of the touch-sensitive surface that is different from, and optionally separated from, the first edge (e.g., a portion of the touch-sensitive surface that is outside of and away from the first edge, separated from it by one or more other input regions (e.g., one or more edge input regions and/or one or more interior input regions, etc.) on the touch-sensitive surface). These features are shown, for example, in figs. 5AM-5AN, where in response to detecting a downward multi-contact swipe gesture (e.g., movement 5067 of contacts 5066), the device replaces the home screen user interface (e.g., user interface 5002) with the search user interface (e.g., user interface 5006). In some implementations, the first search user interface including the first search input area displayed in response to the touch gesture meeting the first criterion and the first search user interface including the first search input area displayed in response to the touch gesture meeting the fifth criterion are the same search user interface having the same first search input area. In some implementations, the first search input area is a user interface object in a display layer displayed over the first user interface (e.g., in an original state, or, optionally, in a blurred and/or darkened version of the original state, etc.) in response to a touch gesture that meets the first criterion, the third criterion, and/or the fifth criterion.
Displaying user interfaces having search input areas in accordance with determining that a touch gesture includes movement of a single contact across a first portion of an edge of a touch-sensitive surface or in accordance with determining that a touch gesture includes movement of a preset number of contacts across the touch-sensitive surface starting from an interior portion of the touch-sensitive surface provides additional control options for displaying user interfaces without displaying user interface controls for those user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, in response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture satisfies a sixth criterion, the device replaces (6032) display of the first user interface with display of an application selection user interface (e.g., a multitasking user interface, a user interface for selecting among recent applications, etc.) that includes a plurality of representations (e.g., application views, application icons) of recently opened applications (e.g., applications that are currently open or running, or closed applications whose prior state of use has been saved), wherein the sixth criterion includes a sixth requirement that is satisfied in accordance with a determination that: the first touch gesture includes a sixth movement of the preset number of simultaneously detected contacts across the touch-sensitive surface in a second direction different from (e.g., substantially opposite to, substantially perpendicular to, etc.) the first direction. These features are illustrated, for example, in figs. 5AO-5AP, where in response to detecting an upward multi-contact swipe gesture (e.g., movement 5069 of contacts 5068), the device replaces the home screen user interface (e.g., user interface 5002) with a multitasking user interface (e.g., user interface 5060). A respective representation of the plurality of representations, when activated according to preset criteria (e.g., via a tap input, a double-tap input, etc.), causes redisplay of the respective recently opened application corresponding to that representation. For example, in response to detecting an input (e.g., a tap input) on a respective representation of an application user interface (e.g., a representation of web browser user interface 5060-1, photo user interface 5072, or message user interface 5060-2), the device redisplays the application corresponding to the respective representation. In some embodiments, a first touch gesture comprising the preset number of simultaneously detected contacts is detected anywhere in an interior portion of the touch-sensitive surface away from the first edge and, optionally, other edges of the touch-sensitive surface. The movement of the preset number of concurrently detected contacts prior to lift-off of the contacts is evaluated against the different sets of criteria (e.g., the fifth criterion, the sixth criterion, etc.) to determine whether to display the first search input area, the application selection user interface, or some other user interface (e.g., a home screen, a widget user interface, an application library, an overlay screen user interface, etc.).
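Read together, the fifth and sixth criteria classify the same preset number of interior contacts by their direction of travel. A minimal sketch, with an assumed travel threshold and screen coordinates in which positive y points downward:

```swift
import CoreGraphics

// Possible outcomes of an interior multi-contact swipe, per the fifth and
// sixth criteria above. Downward movement summons search; upward movement
// summons the application selection (multitasking) user interface.
enum MultiContactSwipeOutcome {
    case searchUserInterface, applicationSelection, undecided
}

func classify(translation: CGVector, minimumTravel: CGFloat = 60) -> MultiContactSwipeOutcome {
    // Require predominantly vertical movement of at least the assumed distance.
    guard abs(translation.dy) >= minimumTravel,
          abs(translation.dy) > abs(translation.dx) else { return .undecided }
    return translation.dy > 0 ? .searchUserInterface : .applicationSelection
}
```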
Displaying different user interfaces in response to gestures of a preset number of contacts in different directions provides additional control options for displaying the different user interfaces without displaying user interface controls for the different user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, in response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture satisfies a seventh criterion, the device replaces (6034) display of the first user interface with display of a home screen user interface, wherein the seventh criterion includes a seventh requirement that is satisfied in accordance with a determination that: the first touch gesture includes a seventh movement of the preset number of simultaneously detected contacts across the touch-sensitive surface in a second direction different from (e.g., substantially opposite to, substantially perpendicular to, etc.) the first direction. This is illustrated, for example, in figs. 5AQ-5AR, where in response to an upward multi-contact swipe gesture (e.g., movement 5071 of contacts 5070), the device replaces the application user interface (e.g., photo user interface 5072) with home screen user interface 5002. The home screen user interface includes a plurality of application icons (e.g., application icons 504 through 524) corresponding to different applications. A respective application icon, when activated according to preset criteria (e.g., by a tap input, a double-tap input, etc.), causes display of a respective one of the different applications corresponding to the respective application icon. For example, in response to detecting an input (e.g., a tap input on application icon 504 for the messages application), the device displays a user interface of the messages application (e.g., user interface 5060-2). In some embodiments, a first touch gesture comprising the preset number of simultaneously detected contacts is detected anywhere in an interior portion of the touch-sensitive surface away from the first edge and, optionally, other edges of the touch-sensitive surface. The movement of the preset number of concurrently detected contacts prior to lift-off of the contacts is evaluated against the different sets of criteria (e.g., the fifth criterion, the sixth criterion, the seventh criterion, etc.) to determine whether to display the first search input area, the home screen user interface, or some other user interface (e.g., the application selection user interface, a widget user interface, an application library, an overlay screen user interface, etc.). In some embodiments, both the sixth criterion and the seventh criterion can be met by movement of the preset number of contacts in the second direction opposite the first direction; however, the sixth criterion and the seventh criterion differ in their requirements on various features of the movement (e.g., speed, distance, position, acceleration, or a combination of two or more of these, etc.). In some embodiments, a movement that eventually meets the seventh criterion and triggers display of the home screen user interface may have met the requirements of the sixth criterion earlier, and would have caused display of the application selection user interface instead of the home screen if it had terminated earlier. Similarly, in some embodiments, a movement that eventually satisfies the sixth criterion and results in display of the application selection user interface may have satisfied the seventh criterion earlier, and would have resulted in display of the home screen user interface instead of the application selection user interface if it had terminated earlier.
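The passage above distinguishes the sixth and seventh criteria not by direction but by features of the movement such as speed and distance. One plausible reading of that disambiguation, sketched with invented thresholds:

```swift
import CoreGraphics

// Both criteria involve upward multi-contact movement; the device commits to
// one target based on features of the movement at liftoff. Thresholds are
// assumptions, not values from the description.
enum UpwardSwipeTarget { case applicationSelection, homeScreen }

func target(forUpwardSwipeDistance distance: CGFloat,
            liftoffSpeed: CGFloat) -> UpwardSwipeTarget {
    // A long or fast upward movement goes home; a shorter or slower one
    // settles on the application selection user interface.
    (distance > 300 || liftoffSpeed > 1000) ? .homeScreen : .applicationSelection
}
```

A movement evaluated this way can pass through an application-selection-qualifying state before crossing the home-screen thresholds, which matches the "if it had terminated earlier" behavior described above.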
Displaying different user interfaces in response to gestures of a preset number of contacts in different directions provides additional control options for displaying the different user interfaces without displaying user interface controls for the different user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, in response to detecting the first touch gesture and in accordance with a determination that the first touch gesture satisfies an eighth criterion, the device replaces (6036) display of the user interface of the first application with a second user interface, wherein the eighth criterion includes an eighth requirement that is satisfied in accordance with a determination that: the first touch gesture includes an eighth movement of the preset number of simultaneously detected contacts across the touch-sensitive surface in a third direction different from the first direction and the second direction (e.g., substantially perpendicular to the first direction and the second direction; a left-to-right swipe or a right-to-left swipe). This is illustrated, for example, in figs. 5AS-5AT, where the device switches from displaying a user interface of a first application (e.g., the photo user interface in fig. 5AS) to displaying a user interface of a second application (e.g., the web browser user interface in fig. 5AT) that is different from the first application. The second user interface is the user interface of a recently opened application (e.g., the last displayed user interface of a second application different from the first application providing the first user interface). In some implementations, the first user interface is a home screen user interface when the first touch gesture that satisfies the eighth criterion is detected, and the device navigates from the home screen user interface to the last displayed user interface of the last displayed application in response to the first touch input that satisfies the eighth criterion.
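Folding the eighth criterion into the same direction-based classification, a predominantly horizontal movement of the preset number of contacts switches to an adjacent recently opened application. A sketch extending the earlier classifier, again with assumed values:

```swift
import CoreGraphics

// Extends the vertical classifier with the eighth criterion: a predominantly
// horizontal multi-contact swipe switches to an adjacent recently opened
// application instead of invoking search or the application selection UI.
enum MultiContactSwipeOutcomeWithHorizontal {
    case searchUserInterface, applicationSelection, switchApplication, undecided
}

func classifyWithHorizontal(translation: CGVector,
                            minimumTravel: CGFloat = 60) -> MultiContactSwipeOutcomeWithHorizontal {
    guard max(abs(translation.dx), abs(translation.dy)) >= minimumTravel else { return .undecided }
    if abs(translation.dx) > abs(translation.dy) {
        return .switchApplication          // third direction: left or right
    }
    return translation.dy > 0 ? .searchUserInterface : .applicationSelection
}
```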
Switching between user interfaces of different recently opened applications by touch gestures reduces the amount of input required to display user interfaces of different recently opened applications (e.g., without closing a currently displayed application user interface and opening a recently opened application user interface). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, in response to detecting the first touch gesture and in accordance with a determination that the first touch gesture meets a ninth criterion, the device replaces (6038) at least a portion of the first user interface with the first search input area in the display area provided by the display generating component, wherein the ninth criterion includes a ninth requirement that is met in accordance with a determination that: the first touch gesture includes a ninth movement of the first contact in the first direction that begins in a third interior portion of the touch-sensitive surface that is outside of and located away from the first edge (e.g., separated from the first edge by one or more other input regions (e.g., one or more edge input regions and/or one or more interior input regions, etc.) on the touch-sensitive surface) (and, optionally, separated from the other edges of the touch-sensitive surface); and the first user interface is a wake screen user interface or an overlay screen user interface. These features are shown, for example, in figs. 5X-5Y, where in response to detecting a downward swipe gesture (e.g., movement 5055 of contact 5054) starting from a middle portion of touch screen 112, the device replaces the cover user interface (e.g., user interface 5050) with the search user interface (e.g., user interface 5006). In some implementations, the touch gesture that causes display of the wake screen user interface and/or the overlay screen user interface is a touch gesture that meets the first criteria or the second criteria; in such scenarios, the first search user interface including the first search input area is displayed in response to the touch gesture meeting the ninth criterion, but is not displayed in response to the touch gesture meeting the first criteria or the second criteria if the touch gesture is detected while a user interface other than the wake screen or overlay screen user interface is displayed (e.g., a home screen user interface, an application user interface, a widget screen, an application library, etc.).
Displaying a user interface with a search input area or a user interface with a plurality of previously received notifications in response to a touch gesture received while displaying a wake screen user interface provides a way to display the user interface with a reduced amount of input (e.g., without having to close the displayed wake screen user interface and open the home screen before providing a touch gesture for displaying the user interface). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, the first user interface includes (6040) a plurality of notifications (e.g., notification center user interface 5020 includes areas 5020-1 to 5020-5 corresponding to notifications in fig. 5H) (e.g., arranged in a vertical list extending beyond the display area provided by the display generating component (e.g., the list includes previously received notifications arranged in sequential order according to the time the notifications were received, alphabetical order, notification type, etc.) and/or in a scrollable window, etc.) (e.g., the plurality of notifications have been displayed prior to detection of the first touch gesture and/or are displayed in response to an initial portion of the first touch gesture, etc.). In response to detecting the first touch gesture, and in accordance with a determination that the first touch gesture includes a tenth movement of the first contact across the touch-sensitive surface (e.g., in the first direction, or first in the second direction and then in the first direction, etc.) and that the tenth movement begins in a fourth interior portion of the touch-sensitive surface: in accordance with a determination that an end of the plurality of notifications, in the direction of the tenth movement of the first contact, has been reached in the display area provided by the display generating component, the device replaces (6040) display of at least a portion of the first user interface with the first search input area; and in accordance with a determination that the end of the plurality of notifications, in the direction of the tenth movement of the first contact, has not been reached in the display area provided by the display generating component, the device scrolls the plurality of notifications in accordance with the tenth movement of the first contact. These features are illustrated, for example, in figs. 5K-5N, which describe features of notification center user interface 5020. The wake screen user interface (e.g., user interface 5042 in fig. 5U) and the cover user interface (e.g., user interface 5050 in fig. 5X) have the same features as those described with respect to the notification user interface in figs. 5K-5N. In figs. 5K-5L, in response to a downward swipe gesture (e.g., corresponding to movement 5029 of contact 5028) starting from an interior portion of touch screen 112, and in accordance with a determination that the end of the notification list has not been reached (e.g., the topmost notification region 5020-5 in the notification list is not yet fully displayed), the device scrolls the notification list, shifting the notification list downward (e.g., fully displaying region 5020-5 in fig. 5L). In figs. 5M-5N, in response to a downward swipe gesture (e.g., corresponding to movement 5031 of contact 5030) starting from an interior portion of touch screen 112, and in accordance with a determination that the end of the notification list has been reached (e.g., region 5020-5 is fully displayed), the device replaces the notification center user interface (e.g., user interface 5020) with the search user interface (e.g., user interface 5006).
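A compact sketch of this tenth-movement behavior: the same downward pull either scrolls earlier notifications into view or, once the end of the list is already on screen, brings up the search input area. The list model below is a hypothetical stand-in for the device's notification list state.

```swift
// Hypothetical model of the notification list's scroll state.
struct NotificationListState {
    var notificationCount: Int
    var topmostVisibleIndex: Int   // 0 when the topmost notification is fully shown

    var endReached: Bool { topmostVisibleIndex == 0 }
}

// Handle one downward pull: scroll if more notifications remain above,
// otherwise replace part of the user interface with the search input area.
func handleDownwardPull(state: inout NotificationListState,
                        presentSearchInputArea: () -> Void) {
    if state.endReached {
        presentSearchInputArea()
    } else {
        state.topmostVisibleIndex -= 1   // reveal the next earlier notification
    }
}
```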
Scrolling through an arrangement (e.g., a list) that includes a plurality of notifications, or displaying a search input area, in accordance with a determination that the end of the arrangement has or has not been reached, provides additional control options for navigating between a user interface including a plurality of notifications and a user interface having a search input area without displaying user interface controls for those user interfaces. Providing additional control options without cluttering the user interface with controls for additional display enhances the operability of the device, which in turn reduces power usage and extends battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, while displaying the first user interface, the device detects (6042) a second touch gesture that includes movement of a second set of contacts on the touch-sensitive surface (e.g., one or more simultaneously detected tap gestures, touch-and-hold gestures, swipe gestures, multi-finger pinch gestures, multi-finger swipe gestures, or gestures that include multiple types of movements (e.g., movements in two or more directions, pinch movements, spread movements, pivot movements, etc.)). In response to detecting the second touch gesture, and in accordance with a determination that the second touch gesture satisfies an eleventh criterion, the device replaces the display of the first user interface with a second user interface different from the first user interface, wherein the eleventh criterion includes an eleventh requirement that is satisfied in accordance with a determination that: the second touch gesture includes an eleventh movement of the second contact across a second edge of the touch-sensitive surface different from the first edge. These features are illustrated, for example, in figs. 5AB-5AG, where the device navigates from an application user interface to the multitasking user interface, the home screen user interface, or another application user interface, depending on the direction and characteristics of the corresponding gesture. For example, as shown in figs. 5AB-5AD, in response to an upward swipe gesture starting from the bottom edge of touch screen 112 and having a first set of features (e.g., movement 5059 of contact 5058 in fig. 5AB), the device replaces email user interface 5036 with a multitasking user interface. As shown in figs. 5AB-5AE, the device replaces email user interface 5036 with home screen user interface 5002 in response to an upward swipe gesture (e.g., movement 5059-1 of contact 5058) that also starts from the bottom edge of touch screen 112 but has a second set of features (e.g., a different acceleration, speed, pressure, or distance) different from the first set of features of the swipe gesture corresponding to movement 5059 of contact 5058. Further, as shown in figs. 5AF-5AG, in response to a swipe gesture that includes an upward portion and a lateral portion (e.g., a left-to-right swipe corresponding to movement 5059-2 of contact 5058), the device replaces email user interface 5036 with another application user interface (e.g., web browser user interface 5060-1 in fig. 5AG). In some implementations, the second user interface is a home screen user interface (e.g., the first user interface is an application user interface of a first application, an overlay screen user interface, a wake screen user interface, a widget screen user interface, a system-arranged application library user interface, etc.). In some implementations, the second user interface is an application selection user interface (e.g., the first user interface is an application user interface of a first application, an overlay screen user interface, a wake screen user interface, a widget screen user interface, a system-arranged application library user interface, a home screen user interface, etc.).
In some implementations, the second user interface is a user interface of the second application (e.g., where the first user interface is an application user interface of the first application, an overlay screen user interface, a wake screen user interface, an applet screen user interface, a system-arranged application library user interface, a home screen user interface, an application selection user interface, etc.). In some implementations, the second user interface is another user interface within the same currently displayed application (e.g., a previously displayed user interface within the same application). In some implementations, each edge of the touch-sensitive surface other than the first edge is associated with a corresponding function selected from those listed above (e.g., navigate to a home screen, navigate to another application, display a wake screen user interface, bring in an overlay screen user interface, navigate to an applet screen, navigate to a system-arranged home screen, navigate to a different page of the home screen, navigate to a different user interface of the currently displayed application, etc.), and the device determines which function to perform based on which edge the touch input crosses and in which direction. In some embodiments, the applet screen also includes the same first search input area.
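The edge-based routing just described can be pictured as a small dispatch function. The following is a minimal, illustrative sketch only: the type names, the thresholds, and the exact mapping of gesture features to destinations are assumptions made for exposition, not values or APIs from the disclosure.

```swift
// Hypothetical destinations corresponding to the navigation outcomes above.
enum NavigationTarget {
    case multitaskingUI      // bottom-edge upward swipe with the "first set of features"
    case homeScreen          // bottom-edge upward swipe with the "second set of features"
    case adjacentApplication // bottom-edge swipe with an upward and a lateral portion
    case searchUI            // downward swipe across (a portion of) the top edge
    case unhandled
}

struct EdgeSwipe {
    enum Edge { case top, bottom, left, right }
    let edge: Edge
    let translation: (dx: Double, dy: Double) // dy < 0 is upward movement
    let velocity: Double
}

// Thresholds (1_000, -400) are illustrative assumptions, not disclosed values.
func target(for swipe: EdgeSwipe) -> NavigationTarget {
    switch swipe.edge {
    case .bottom where abs(swipe.translation.dx) > abs(swipe.translation.dy):
        return .adjacentApplication            // mostly lateral movement
    case .bottom where swipe.velocity > 1_000 || swipe.translation.dy < -400:
        return .homeScreen                     // faster or longer swipe
    case .bottom:
        return .multitaskingUI                 // slower or shorter swipe
    case .top:
        return .searchUI
    default:
        return .unhandled
    }
}
```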
Displaying a user interface having a search input area in accordance with a determination that the touch gesture includes movement across a first portion of an edge of the touch-sensitive surface, displaying a user interface having a plurality of previously received notifications in accordance with a determination that the touch gesture includes movement across a second portion of that edge, and displaying a different user interface in accordance with a determination that the touch gesture includes movement across another edge of the touch-sensitive surface provide additional control options for navigating among user interfaces without displaying additional user interface controls. Providing additional control options without cluttering the user interface with additional displayed controls enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while displaying the second user interface (e.g., the same as, or different from, the first user interface), the device detects (6044) a third touch gesture (e.g., different from the first touch gesture) that includes a third set of contacts detected on the touch-sensitive surface (e.g., one or more simultaneously detected tap gestures, touch-and-hold gestures, swipe gestures, multi-finger pinch gestures, multi-finger swipe gestures, or compound gestures that include multiple types of movement (e.g., movement in two or more directions, pinch movement, spread movement, pivot movement, etc.)). In response to detecting the third touch gesture and in accordance with a determination that the third touch gesture meets the first criteria, or additional preset criteria different from the first criteria (e.g., third criteria, fifth criteria, ninth criteria, other criteria, etc.), at least a portion of the second user interface is replaced with the first search user interface in the display area. For example, in some implementations, the same search interface is invoked in various contexts with any of a plurality of gestures (e.g., a downward swipe from the top edge of the touch screen, a swipe anywhere in the interior of the wake screen, and/or a multi-finger downward swipe anywhere in the interior of the touch screen, etc.), as sketched below. These features are shown, for example, in figs. 5A-5C, figs. 5Z-5AA, and figs. 5AM-5AN (e.g., invoking a search user interface when a home screen user interface is displayed), figs. 5M-5N (e.g., invoking a search user interface when a notification center user interface is displayed), figs. 5R-5S and figs. 5AU-5AX (e.g., invoking a search user interface when an application user interface is displayed), and figs. 5V-5Z (e.g., invoking a search user interface when a wake screen or cover user interface is displayed).
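The "any one of several gesture criteria" behavior can be expressed as a union of independent predicates. In the illustrative sketch below, the structure fields and the three example criteria are placeholders for the numbered criteria in the text, not a disclosed API.

```swift
// Hypothetical summary of a detected gesture's features.
struct GestureInfo {
    let startsAtTopEdge: Bool
    let isDownwardSwipe: Bool
    let fingerCount: Int
    let onWakeScreenInterior: Bool
}

// Each closure stands in for one of the preset criteria named in the text.
let searchInvocationCriteria: [(GestureInfo) -> Bool] = [
    { $0.startsAtTopEdge && $0.isDownwardSwipe },      // e.g., top-edge swipe
    { $0.isDownwardSwipe && $0.fingerCount > 1 },      // e.g., multi-finger swipe
    { $0.onWakeScreenInterior && $0.isDownwardSwipe }, // e.g., wake-screen swipe
]

// The search user interface is invoked when any one criterion is satisfied.
func shouldInvokeSearch(for gesture: GestureInfo) -> Bool {
    searchInvocationCriteria.contains { $0(gesture) }
}
```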
Displaying the user interface with the search input area in accordance with a determination that the touch gesture includes movement across a first portion of an edge of the touch-sensitive surface, or in accordance with a determination that the touch gesture meets some other preset criteria, provides additional control options for displaying the user interface with the search input area without requiring display of additional user interface controls. Providing additional control options without cluttering the user interface with additional displayed controls enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
Method 6100 is performed at an electronic device in communication with a display generation component (e.g., a touch screen display, a projector, a standalone display, a head-up display, a head-mounted display, etc.) and one or more input devices (e.g., devices that receive mouse-based input or stylus input).
In method 6100, the device displays (6102) a first user interface (e.g., a wake screen user interface, a cover user interface, etc.) via a display generation component (e.g., wake screen user interface 5042 in fig. 5V, or cover user interface 5050 in fig. 5X, having an appearance and features similar to those described with respect to the notification center user interface in figs. 5H-5N). The first user interface includes a first plurality of notifications (e.g., a subset of the notifications, less than all of the notifications) in a notification list (e.g., areas 5020-1 through 5020-5 shown in fig. 5H). In some embodiments, the first user interface is a wake screen user interface, which is the initial user interface that is displayed when the electronic device transitions from a display-off state or a low-power always-on state to a normal operating state. In some implementations, the wake screen user interface is initially displayed in a locked state and transitions to an unlocked state when authentication has been completed (e.g., by a fingerprint sensor, retinal scanner, facial recognition, voice authentication, password entry, etc.). In some embodiments, the wake screen user interface is a lock screen user interface. In some implementations, the first user interface is a cover user interface displayed in response to user input corresponding to a request to overlay a currently displayed user interface in various contexts (e.g., when a home screen is displayed, when an application user interface is displayed, when a control center user interface is displayed, when a setup user interface is displayed, etc.). In some embodiments, the cover user interface is not a lock screen and is dismissed without the need to re-enter authentication information; and upon dismissal of the cover user interface in response to a preset user input, the previously displayed user interface (the user interface "overlaid" by the cover user interface) is restored. In some embodiments, the wake screen user interface and the cover user interface have the same or similar appearance, e.g., both include time elements showing the current time and optionally the date, both share the same background image, and/or both provide similar functionality (e.g., display of unread or saved notifications, navigation to the same user interface in response to preset gestures, etc.). In some embodiments, the first user interface is not a home screen user interface, an application user interface, or a control panel user interface. While displaying the first user interface, the device detects (6104) a first user input comprising a first input (e.g., contact 5026 in fig. 5I) (e.g., the first input is a swipe input of a first contact (e.g., one contact, two simultaneously detected contacts, etc.) moving across the touch-sensitive surface) (e.g., movement of the contact from a middle region of the user interface toward an edge of the first user interface, movement of a contact detected (e.g., touched down, passing through, etc.) on the touch-sensitive surface at a location remote from an edge region of the touch-sensitive surface and corresponding to the location of the first plurality of notifications displayed on the display, etc.).
In response to detecting the first user input (6106), in accordance with a determination that the first input includes a swipe input in a first direction (e.g., a direction toward a top edge of the touch-sensitive surface, a direction toward a bottom edge of the touch-sensitive surface, a direction corresponding to a scrolling direction of the notification list, etc.) and an end of the notification list has been reached (e.g., a top of the list, a bottom of the list, a top or bottom of the list, a top of the list only, a bottom of the list only, etc.) (e.g., the end reached when scrolling the notifications in the first direction), the device displays (6108) a search input area (e.g., a search input area for receiving a text search query entered by a user, which causes search results to be returned that correspond to the search query and are identified from a plurality of sources (e.g., applications, text messages, emails, images, web pages, address books, etc.)). In some embodiments, displaying the search input area includes replacing the display of the first user interface with the display of the first search user interface in the display area provided by the display generation component. In some embodiments, displaying the search input area includes displaying the first search input area in the first user interface without replacing the entire first user interface. In some implementations, the search input area is configured to receive search input (e.g., text search criteria, search keywords, image-based search criteria, etc.) and cause search results (e.g., representations of applications, messages, emails, web pages, images, photographs, etc.) corresponding to the search criteria received in the search input area to be returned. This is illustrated, for example, in figs. 5M-5N, where in response to detecting a downward swipe gesture (e.g., movement 5031 of contact 5030) starting from the middle portion of touch screen 112, and in accordance with a determination that the end of the notification list has been reached (e.g., the topmost region 5020-5 of the notifications is fully displayed in fig. 5M), the device replaces the notification center user interface (e.g., user interface 5020) with a search user interface (e.g., user interface 5006). In accordance with a determination that the first input includes a swipe input in the first direction and the end of the notification list has not been reached (e.g., neither at the top of the list nor at the bottom of the list, not at the top of the list, not at the bottom of the list, etc.) (e.g., the end reached when scrolling the notifications in the first direction), the device displays (6110) a second plurality of notifications between the first plurality of notifications and the end of the notification list (e.g., in response to the swipe input in the first direction, the second plurality of notifications are scrolled into view on the first user interface, while some or all of the plurality of notifications initially displayed are scrolled out of view on the first user interface) (e.g., the notification list is ordered according to the time at which each notification was received, and is scrollable in a direction corresponding to the first direction of the swipe input). This is illustrated, for example, in figs. 5K-5L, where in response to detecting a downward swipe gesture starting from a middle portion of touch screen 112 (e.g., movement 5029 of contact 5028), and in accordance with a determination that the end of the notification list has not been reached (e.g., the topmost region 5020-5 of the notifications is not fully displayed in fig. 5K), the device scrolls down the notification list. In some embodiments, a single continuous swipe input causes the device to scroll through notifications in the notification list until the end of the notification list is displayed, and then causes display of the search input area. In some embodiments, at least some of the notifications from the notification list are visible on the first user interface at the same time as the search input area. In some embodiments, when the search input area is displayed, notifications from the notification list cease to be displayed or are pushed into a background layer.
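The branch at (6108)/(6110) can be summarized as a small state transition: a swipe in the first direction either reveals the next span of the list or, once the end has been reached, surfaces the search input area. The sketch below is illustrative only; the names and the index-based scrolling model are assumptions, not an API from the disclosure.

```swift
// Hypothetical model: index 0 is the end (e.g., top) of the notification list.
struct NotificationListState {
    var items: [String]        // stand-ins for notification content
    var firstVisibleIndex: Int // 0 means the end of the list is on screen
    let pageSize: Int
    var endReached: Bool { firstVisibleIndex == 0 }
}

enum FirstDirectionSwipeResult {
    case showSearchInputArea
    case revealNotifications(ArraySlice<String>)
}

func handleSwipeInFirstDirection(_ state: inout NotificationListState) -> FirstDirectionSwipeResult {
    if state.endReached {
        return .showSearchInputArea          // corresponds to (6108)
    }
    let newFirst = max(0, state.firstVisibleIndex - state.pageSize)
    defer { state.firstVisibleIndex = newFirst }
    return .revealNotifications(state.items[newFirst..<state.firstVisibleIndex]) // (6110)
}
```

On this model, a single continuous swipe simply invokes the handler repeatedly: each call reveals another span until `endReached` becomes true, at which point the next call yields the search input area, matching the single-swipe behavior described above.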
In response to detecting a user input while displaying a user interface that includes a first plurality of notifications in the notification list, displaying a user interface that includes a search input area in accordance with a determination that the user input is detected while the end of the notification list is displayed, or displaying a second plurality of notifications in accordance with a determination that the user input is detected while the end of the notification list is not displayed, provides additional control options for displaying user interfaces without displaying additional user interface controls. Providing additional control options without cluttering the user interface with additional displayed controls enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, in response to detecting the first user input and in accordance with a determination that the first input includes a swipe input in a second direction different from the first direction (e.g., opposite the first direction, perpendicular to it, at greater than a threshold angle relative to the first direction, etc.), the device displays (6112) a plurality of previously received notifications in the first user interface. The plurality of previously received notifications includes notifications that are not in the first plurality of notifications in the notification list (e.g., the previously received notifications are different from the second plurality of notifications) (e.g., the previously received notifications correspond to notifications received prior to the first plurality of notifications and the second plurality of notifications) (e.g., the previously received notifications are not between the first plurality of notifications and the end of the notification list). These features are shown, for example, in figs. 5I-5J, where the device scrolls up the notification list in response to an upward swipe gesture (e.g., movement 5027 of contact 5026). As shown in fig. 5I, the device displays a first portion of the notification list (e.g., areas 5020-1 through 5020-5) before detecting the swipe gesture, and in fig. 5J, the device displays a second portion of the notification list (e.g., areas 5020-6 and 5020-7) in addition to the first portion of the notification list. In some embodiments, the plurality of previously received notifications are stored in a notification history, and the notification list contains unread notifications that have not yet been stored in the notification history. In some embodiments, in accordance with a determination that the swipe input is in the second direction different from the first direction and the end of the notification list has been reached, the plurality of previously received notifications are displayed in the first user interface. For example, in some embodiments, the notification list contains unread notifications displayed in the first user interface, and the notifications in the notification list can be scrolled in response to an upward swipe input on them. The notification list scrolls upward in response to the upward swipe input, thereby presenting a second plurality of notifications in the notification list; and when the end of the list is reached, the notification history is displayed on the first user interface. The notification list scrolls downward in response to a downward swipe input, redisplaying notifications that were scrolled out of view by the earlier upward swipe input. In response to the downward swipe input, the search input area is displayed when the end of the notification list is reached.
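Putting the two branches together, the outcome of the first user input is determined by the swipe direction and by whether the end of the list is on screen. The standalone sketch below is a hypothetical summary of that dispatch, including the branch added at (6112); none of the names come from the disclosure.

```swift
// "first" is the direction that scrolls toward the end of the list;
// "second" is a different (e.g., opposite) direction.
enum SwipeDirection { case first, second }

enum FirstUserInputOutcome {
    case searchInputArea               // first direction, end of list reached
    case moreNotifications([String])   // first direction, end not yet reached
    case notificationHistory([String]) // second direction: previously received
}

func outcome(of direction: SwipeDirection,
             endReached: Bool,
             nextNotifications: [String],
             history: [String]) -> FirstUserInputOutcome {
    switch direction {
    case .first:
        return endReached ? .searchInputArea : .moreNotifications(nextNotifications)
    case .second:
        return .notificationHistory(history)
    }
}
```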
Displaying the plurality of previously received notifications in accordance with a determination that the user input includes a swipe input in a direction different from the direction used to display the user interface that includes the search input area, or to display a plurality of notifications different from the previously received notifications, provides additional control options for displaying user interfaces without requiring display of additional user interface controls. Providing additional control options without cluttering the user interface with additional displayed controls enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, the first user interface is (6114) a wake screen user interface (e.g., wake screen user interface 5042 in figs. 5U-5V) that is displayed in response to detecting a request to wake the display generation component from a low power mode (e.g., low power mode 5040 in fig. 5T) (e.g., a low-power always-on mode, a display-off mode, a power-saving sleep mode, etc.). In some implementations, the wake screen user interface is initially displayed in a locked state and later transitions to an unlocked state after authentication information has been obtained (e.g., through password entry or biometric verification, such as facial, iris, or fingerprint verification). In some embodiments, the wake screen user interface includes a visual cue prompting the user to unlock the device when the device is locked. In some embodiments, the wake screen user interface includes user interface objects corresponding to device functions, such as a flashlight, a camera, etc., and/or user interface objects indicating the locked/unlocked state of the device.
Displaying a user interface including a search input area, or displaying a second plurality of notifications, in response to detecting user input while displaying the wake screen user interface provides a way to display the user interface with a reduced number of inputs (e.g., without having to close the displayed wake screen user interface and open the home screen before providing the user input for displaying the user interface including the search input area or the second plurality of notifications). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the wake screen user interface includes (6116) a first area displaying notifications (e.g., newly received notifications, unread notifications, notifications that have not been handled by the user, etc.), including the first plurality of notifications (e.g., notification 5044 in fig. 5V includes the newly received notifications). In some embodiments, the first region of the wake screen displays one or more notifications when the one or more notifications are newly received (and optionally, when the notifications remain unread in the always-on lower power mode), and when the display generation component wakes from the low power mode and enters the normal operating mode. In some implementations, each notification is associated with an application (e.g., email, messages, news, etc.). In some embodiments, when a small number of notifications are received, all notifications are displayed in the first area at the same time. In some embodiments, when a greater number of notifications accumulate without being handled, the latest notifications are displayed in the first area, while earlier unhandled notifications are pushed out of the first area but remain in a list of notifications that can be scrolled back into the first area by the user's swipe input.
Displaying the notifications while displaying the wake screen user interface provides a way to display the notifications with a reduced number of inputs (e.g., without providing additional user input for displaying the notifications). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, the wake screen user interface includes (6118) a second area that displays the stored notification history (e.g., wake screen user interface 5042 includes a portion that displays notification center user interface 5020, including areas 5020-4 and 5020-5 in fig. 5V) (e.g., previously received notifications that have been automatically stored in the notification history without user interaction, previously received notifications that were stored based on user interaction with the notification when it was first received, etc.). In some embodiments, the first region and the second region are the same region. In some embodiments, the first region and the second region are adjacent to each other (e.g., the second region is immediately adjacent to the first region, and there is no visible gap between the first region and the second region). In some embodiments, the notification list includes the stored notification history. In some embodiments, when one or more notifications are newly received (and optionally, when the notifications remain unread in the always-on lower power mode), and when the display generation component wakes from the low power mode and enters the normal operating mode, a second area of the wake screen that overlaps the first area of the wake screen displays the one or more notifications. In some implementations, the second area replaces the first area on the display when the stored notification history is displayed. In some implementations, the stored notification history is displayed on the wake screen user interface in response to a request to display the wake screen user interface. In some implementations, the stored notification history is displayed on the wake screen user interface in response to a request to display the stored notification history received while the wake screen user interface is displayed.
Displaying the stored notifications while displaying the wake screen user interface provides a way to display the stored notifications with a reduced number of inputs (e.g., without providing additional user input for displaying the notifications). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some implementations, the wake screen user interface includes (6120) a user interface object displaying the current time (e.g., time element 5024 in fig. 5V). In some embodiments, the user interface object also displays the current date. The user interface object continues to update as the current time changes.
Displaying the user interface object that shows the current time while displaying the wake screen user interface provides a way to display the current time with a reduced number of inputs (e.g., without providing additional user input for displaying the current time). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the first user input (e.g., in accordance with a determination that the first swipe input is in the first direction, and optionally in accordance with a determination that the end of the notification list has been reached), the device stops (6122) displaying one or more user interface objects of the wake screen user interface (e.g., stops displaying the current date and time elements, a flashlight affordance, and/or an affordance prompting unlocking of the device) (e.g., stops displaying some, but less than all, of the wake screen elements). This feature is illustrated, for example, in figs. 5I-5J, where the device stops displaying the time element 5024 when the notification center user interface (e.g., user interface 5020) is scrolled up. The wake screen user interface has features similar to those of the notification center user interface. In some implementations, notifications on the wake screen user interface (e.g., notifications in the notification list, notifications in the notification history, etc.) scroll in response to the first user input, and some of the user interface objects of the wake screen user interface move across the display in unison with the notifications in the direction of the scrolling movement. For example, in some implementations, when the unread notifications and/or notification history scroll upward in response to an upward swipe input on the touch screen, the currently displayed portion of the unread notifications and/or notification history moves upward to reveal additional notifications previously hidden on the display; and as the notifications move up, the user interface objects on the wake screen user interface showing the current time and/or date, an indication of the locked/unlocked state of the device, etc., also move up and may cease to be displayed when they cross the top of the display.
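The move-in-unison behavior reduces to offsetting each piece of wake-screen chrome by the scroll delta and hiding it once it crosses the top of the display. The sketch below is a hypothetical illustration; the coordinate convention and names are assumptions.

```swift
// Hypothetical model of a wake-screen element such as the time element.
struct WakeScreenElement {
    var bottomY: Double  // bottom edge, measured down from the top of the display
    var isDisplayed = true
}

// dy < 0 moves the element upward, in unison with an upward notification scroll.
func scroll(_ element: inout WakeScreenElement, by dy: Double) {
    element.bottomY += dy
    element.isDisplayed = element.bottomY > 0 // hidden once fully above the top
}
```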
Stopping the display of one or more user interface objects of the wake screen user interface in response to detecting user input to display the user interface having the search input area, or to display the second plurality of notifications, provides a way to perform the operation with a reduced number of inputs (e.g., without providing additional user input to stop displaying the one or more user interface objects while leaving room for displaying the user interface having the search input area or the second plurality of notifications). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the first user input and in accordance with a determination that the first input includes a swipe input in the first direction and the end of the notification list has not been reached, the device stops (6124) displaying one or more notifications of the first plurality of notifications in the notification list. This feature is illustrated, for example, in figs. 5M-5N, where the device stops displaying the notification list when the notification center user interface is replaced with a search user interface. For example, in some embodiments, the notification list is a scrollable list that scrolls according to the first direction, and some of the plurality of notifications previously visible in the first area are scrolled out of the first area and cease to be displayed in the first user interface.
Stopping the display of the first plurality of notifications in the notification list in response to detecting user input for displaying the second plurality of notifications provides a way to reduce the number of inputs (e.g., scrolling through the notification list instead of closing and opening different portions of the notification list). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in response to detecting the first user input and in accordance with a determination that the first input includes a swipe input in the first direction and the end of the notification list has been reached, the device displays (6126) a keyboard (e.g., keyboard 5011) for entering search input into the search input area while displaying the search input area (e.g., the search input area is displayed in a first portion (e.g., a top portion) of the display area provided by the display generation component, and the keyboard is displayed in a second portion (e.g., a bottom portion) of the display area provided by the display generation component). In some embodiments, the keyboard slides into the display area provided by the display generation component during a later portion of the swipe input, after the swipe input in the first direction has reached the end of the notification list. In some embodiments, in response to detecting activation of a key in the keyboard, the device displays a character corresponding to the activated key in the search input area and, correspondingly, displays search results and/or suggested searches corresponding to the search criteria that include the entered character.
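The per-keystroke behavior just described can be sketched as a small session object in which each activated key appends a character to the query and the result set is recomputed against the new partial query. This is a minimal illustration under assumed names; the flat string corpus is a stand-in for the multiple search sources named above.

```swift
import Foundation

struct SearchSession {
    var query = ""
    var corpus: [String]  // stand-ins for searchable items (apps, mail, etc.)

    // Results are refreshed against the current partial query.
    var results: [String] {
        query.isEmpty ? [] : corpus.filter { $0.localizedCaseInsensitiveContains(query) }
    }

    // Echo the activated key's character in the search input area.
    mutating func keyActivated(_ character: Character) {
        query.append(character)
    }
}
```

For example, activating the keys "m", "a", and "p" in turn would narrow `results` after each call, mirroring the per-keystroke refresh described above.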
Displaying the keyboard in response to user input for displaying a user interface having a search input area provides a way to reduce the number of inputs (e.g., without providing additional user input for opening the keyboard). Reducing the number of inputs required to perform the operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling a user to more quickly and efficiently use the device.
In some embodiments, in response to detecting the first user input and in accordance with a determination that the first input includes a swipe input in the first direction and the end of the notification list has been reached, the device displays (6128) one or more suggested searches (e.g., region 5006-1 including suggested actions and region 5006-2 including suggested applications in fig. 5C) while displaying the search input area (e.g., the search input area is displayed in a first portion (e.g., a top portion) of the display area provided by the display generation component, and the one or more suggested searches are displayed in a second portion (e.g., a bottom portion) of the display area). For example, the suggested searches include search criteria that are automatically generated based on the user's search history, popular searches, content, contacts, and/or applications on the device, and/or partial input that the user has already entered in the search input area. In some implementations, the suggested searches are text keywords, images, and/or photographs, etc., that are selectable by the user and that cause search results corresponding to the selected text keywords, images, and/or photographs to be returned.
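One plausible way to assemble suggestions from the signals listed above is to pool them in priority order, filter by any partial input, and de-duplicate. The sketch below is an assumption-laden illustration: the pooling order, de-duplication, and limit are not specified by the disclosure.

```swift
import Foundation

func suggestedSearches(history: [String],
                       popular: [String],
                       onDeviceContent: [String],
                       partialInput: String,
                       limit: Int = 6) -> [String] {
    // Pool the signal sources in an assumed priority order.
    let pool = history + popular + onDeviceContent
    let matching = partialInput.isEmpty
        ? pool
        : pool.filter { $0.localizedCaseInsensitiveContains(partialInput) }
    // Keep first occurrences only, preserving the pool's priority order.
    var seen = Set<String>()
    return Array(matching.filter { seen.insert($0).inserted }.prefix(limit))
}
```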
Displaying one or more suggested searches in response to user input for displaying a user interface having a search input area provides a way to reduce the number of inputs (e.g., without providing additional user input for displaying the one or more suggested searches). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, the device detects (6130) a second user input (e.g., after scrolling the plurality of notifications in response to the first swipe input and/or an earlier swipe input in the second direction before the first swipe input), the second user input including a second input that moves across the touch-sensitive surface (e.g., the second input is a swipe input of a second contact (e.g., one contact, two simultaneously detected contacts, etc.)) (e.g., movement of the contact from a middle region of the user interface toward an edge of the first user interface, movement of a contact detected (e.g., touched down, passing through, etc.) on the touch-sensitive surface at a location away from an edge region of the touch-sensitive surface and corresponding to the location of the plurality of notifications displayed on the display, etc.). In some embodiments, the second contact is the same contact as the first contact (e.g., a contact continuously maintained throughout the movement of the first swipe input and the second swipe input). In some embodiments, the first contact and the second contact are different contacts separated by lift-off of the first contact and touch-down of the second contact. In response to detecting the second user input and in accordance with a determination that the second input includes a swipe input in a second direction different from the first direction (e.g., opposite the first direction, perpendicular to it, at greater than a threshold angle relative to the first direction, etc.), the device scrolls through notifications in the notification list in accordance with the movement of the second contact in the second direction. These features are shown, for example, in figs. 5I-5J, where the device scrolls through the notification list in response to an upward swipe gesture (e.g., movement 5027 of contact 5026). In some implementations, if the second swipe input is detected while a notification history including a plurality of previously received notifications is displayed in the first user interface, the notifications in the notification history are scrolled in accordance with the second swipe input.
Scrolling through notifications in the notification list based on movement of the contact provides a way to reduce the number of inputs (e.g., scrolling through the notification list without having to close and open different portions of the notification list). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, the second user input (e.g., movement 5027 of contact 5026 in figs. 5I-5J) is detected after the first user input is detected and while the search input area is displayed as a result of the first user input (6132). For example, in some embodiments, after the notifications have been scrolled in the first direction and the search input area is displayed as a result of the end of the notification list having been reached, the device stops displaying the search input area, redisplays the notifications, and scrolls the notifications in the second direction in response to the second user input in the second direction.
Scrolling through notifications in the notification list in response to user input detected while displaying the user interface with the search input area provides a way to reduce the number of inputs (e.g., scrolling through the notification list without providing additional user input for closing the user interface with the search input area). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, prior to detecting the first user input, the device detects (6134) a third user input including a third input that moves across the touch-sensitive surface (e.g., the third input is a swipe input of a third contact (e.g., one contact, two simultaneously detected contacts, etc.)) (e.g., movement of the contact from a middle region of the user interface toward an edge of the first user interface, movement of a contact detected (e.g., touched down, passing through, etc.) at a location on the touch-sensitive surface away from an edge region of the touch-sensitive surface and corresponding to the location of the plurality of notifications displayed on the display, etc.). In response to detecting the third user input, in accordance with a determination that the third input includes a swipe input in a second direction different from the first direction (e.g., opposite the first direction, perpendicular to it, at greater than a threshold angle relative to the first direction, etc.), the device scrolls through the notification list in accordance with the second direction to present the first plurality of notifications in the first user interface. These features are shown, for example, in figs. 5K-5L, where the device scrolls through the notification list in response to a downward swipe gesture (e.g., movement 5029 of contact 5028).
Scrolling through notifications in the notification list based on movement of the contact provides a way to reduce the number of inputs (e.g., scrolling through the notification list without having to close and open different portions of the notification list). Reducing the number of inputs required to perform an operation enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, while displaying the second plurality of notifications (e.g., without displaying the search user interface), the device detects (6136) a fourth user input including a fourth input that moves across the touch-sensitive surface (e.g., the fourth input is a swipe input of a fourth contact (e.g., one contact, two simultaneously detected contacts, etc.)). In response to detecting the fourth user input, and in accordance with a determination that the fourth input includes a swipe input in the first direction (e.g., a direction toward a top edge of the touch-sensitive surface, a direction toward a bottom edge of the touch-sensitive surface, a direction corresponding to a scrolling direction of the notification list, etc.) and the end of the notification list has been reached (e.g., a top of the list, a bottom of the list, a top or bottom of the list, a top of the list only, a bottom of the list only, etc.) (e.g., the end reached when scrolling the notifications in the first direction), the device displays a search input area (e.g., a search input area for receiving a text search query entered by the user, which causes search results to be returned that correspond to the search query and are identified from a plurality of sources (e.g., applications, text messages, emails, images, web pages, address books, etc.)). These features are shown, for example, in figs. 5M-5N, where the device displays a search user interface in response to a downward swipe gesture and in accordance with a determination that the end of the notification list has been reached. The device performs this operation regardless of which notifications are currently displayed. In some embodiments, displaying the search input area includes replacing the display of the first user interface with the display of the first search user interface in the display area provided by the display generation component. In some embodiments, displaying the search input area includes displaying the first search input area in the first user interface without replacing the entire first user interface. In some implementations, the search input area is configured to receive search input (e.g., text search criteria, search keywords, image-based search criteria, etc.) and cause search results (e.g., representations of applications, messages, emails, web pages, images, photographs, etc.) corresponding to the search criteria received in the search input area to be returned. In accordance with a determination that the fourth input includes a swipe input in the first direction and the end of the notification list has not been reached (e.g., neither at the top of the list nor at the bottom of the list, not at the top of the list, not at the bottom of the list, etc.) (e.g., the end reached when scrolling the notifications in the first direction), the device displays a third plurality of notifications between the second plurality of notifications and the end of the notification list (e.g., in response to the swipe input in the first direction, the third plurality of notifications are scrolled into view on the first user interface, while some or all of the previously displayed notifications are scrolled out of view on the first user interface) (e.g., the notification list is ordered according to the time at which each notification was received, and is scrollable in a direction corresponding to the first direction of the swipe input).
In some embodiments, a single continuous swipe input causes the device to scroll through notifications in the notification list until the end of the notification list is displayed, and then causes display of the search input area. In some embodiments, at least some of the notifications from the notification list are visible on the first user interface at the same time as the search input area. In some embodiments, when the search input area is displayed, notifications from the notification list cease to be displayed or are pushed into a background layer.
In response to detecting a user input while displaying a user interface that includes a second plurality of notifications in the notification list, displaying a user interface that includes a search input area in accordance with a determination that the user input is detected while the end of the notification list is displayed, or displaying a third plurality of notifications in accordance with a determination that the user input is detected while the end of the notification list is not displayed, provides additional control options for displaying user interfaces without displaying additional user interface controls. Providing additional control options without cluttering the user interface with additional displayed controls enhances the operability of the device, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
It should be understood that the particular order in which the operations in figs. 6A-6J have been described is merely an example and is not intended to indicate that the described order is the only order in which the operations could be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In some embodiments, one or more operations of method 6000 and method 6100 are combined with, supplemented by, or replaced with one or more operations of other methods described herein.
The operations described above with reference to figs. 6A-6J are optionally implemented by the components depicted in figs. 1A-1B. For example, display operations 6002 and 6102 and detection operations 6004 and 6104 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether a rotation of the device 100) corresponds to a predefined event or sub-event, such as selection of an object on the user interface, or rotation of the device 100 from one orientation to another. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally uses or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it will be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A-1B.
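The dispatch path just described (event sorter, event recognizer, event handler) can be reduced to a short schematic. The sketch below uses plain Swift types whose names echo the description's components but are illustrative only; it does not reproduce any platform API.

```swift
// Hypothetical stand-in for the event information delivered by the sorter.
struct TouchEventInfo {
    let x: Double
    let y: Double
}

protocol EventRecognizing {
    func matches(_ event: TouchEventInfo) -> Bool   // compare to an event definition
    func activateHandler(for event: TouchEventInfo) // invoke the associated handler
}

struct EventSorter {
    var recognizers: [EventRecognizing]

    // Deliver the event information to the first recognizer whose event
    // definition it matches, analogous to event dispatcher module 174.
    func dispatch(_ event: TouchEventInfo) {
        recognizers.first { $0.matches(event) }?.activateHandler(for: event)
    }
}
```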
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
Furthermore, in a method described herein in which one or more steps are contingent on one or more conditions having been satisfied, it should be understood that the described method can be repeated in multiple repetitions so that, over the course of the repetitions, all of the conditions upon which steps in the method are contingent have been satisfied in different repetitions of the method. For example, if a method requires performing a first step if a condition is satisfied, and a second step if the condition is not satisfied, a person of ordinary skill would appreciate that the stated steps are repeated until the condition has been both satisfied and not satisfied, in no particular order. Thus, a method described with one or more steps that are contingent on one or more conditions having been satisfied could be rewritten as a method that is repeated until each of the conditions described in the method has been satisfied. This, however, is not required of system or computer-readable-medium claims, where the system or computer-readable medium contains instructions for performing the contingent operations based on the satisfaction of the corresponding one or more conditions, and is thus capable of determining whether the contingency has or has not been satisfied without explicitly repeating the steps of the method until all of the conditions upon which steps in the method are contingent have been satisfied. A person having ordinary skill in the art would also understand that, similar to a method with contingent steps, a system or computer-readable storage medium can repeat the steps of the method as many times as needed to ensure that all of the contingent steps have been performed.

Claims (18)

1. A method, comprising:
at an electronic device in communication with a display generation component and one or more input devices: displaying, via the display generating component, a first user interface comprising a first plurality of notifications in a notification list; detecting a first user input while displaying the first user interface, the first user input comprising a first input;
in response to detecting the first user input:
in accordance with a determination that the first input includes a swipe input in a first direction and that an end of the notification list has been reached, displaying a search input area; and
In accordance with a determination that the first input includes the swipe input in the first direction and an end of the notification list has not been reached, a second plurality of notifications is displayed between the first plurality of notifications and the end of the notification list.
2. The method of claim 1, further comprising: in response to detecting the first user input: in accordance with a determination that the first input includes a swipe input in a second direction different from the first direction, displaying a plurality of previously received notifications in the first user interface, wherein the plurality of previously received notifications includes notifications that are not in the first plurality of notifications in the notification list.
3. The method of any one of claims 1 to 2, wherein: the first user interface is a wake screen user interface that is displayed in response to detecting a request to wake the display generating component from a low power mode.
4. A method according to claim 3, wherein:
the wake screen user interface includes a first area to display notifications, the notifications including the first plurality of notifications.
5. The method of any one of claims 3 to 4, wherein: the wake screen user interface includes a second area that displays the stored notification history.
6. The method of any one of claims 3 to 4, wherein: the wakeup screen user interface includes a user interface object that displays a current time.
7. The method of claim 6, further comprising: in response to detecting the first user input: stopping displaying one or more user interface objects of the wakeup screen user interface.
8. The method of any of claims 1 to 7, further comprising: in response to detecting the first user input: in accordance with the determination that the first input includes the swipe input in the first direction and the end of the notification list has not been reached, one or more notifications of the first plurality of notifications are stopped from being displayed in the notification list.
9. The method of any one of claims 1 to 8, further comprising: in response to detecting the first user input: in accordance with the determination that the first input includes the swipe input in the first direction and the end of the notification list having been reached, a keyboard for entering search input into the search input area is displayed while the search input area is displayed.
10. The method of any one of claims 1 to 9, further comprising: in response to detecting the first user input: in accordance with the determining that the first input includes the swipe input in the first direction and the end of the notification list having been reached, one or more suggested searches are displayed while the search input area is displayed.
11. The method of any one of claims 1 to 10, further comprising: detecting a second user input, the second user input comprising a second input that moves across a touch-sensitive surface;
In response to detecting the second user input: in accordance with a determination that the second input includes a swipe input in a second direction different from the first direction, scrolling through notifications in the notification list in accordance with the movement of the second input in the second direction.
12. The method of claim 11, wherein the second user input is detected after the first user input is detected and while the search input area is displayed as a result of the first user input.
13. The method according to any one of claims 1 to 11, comprising: detecting a third user input prior to detecting the first user input, the third user input comprising a third input that moves across a touch-sensitive surface; in response to detecting the third user input, in accordance with a determination that the third input includes a swipe input in a second direction different from the first direction, scrolling through the notification list in accordance with the second direction to present the first plurality of notifications in the first user interface.
14. The method of claim 1, further comprising: detecting a fourth user input while displaying the second plurality of notifications, the fourth user input comprising a fourth input that moves across a touch-sensitive surface;
In response to detecting the fourth user input: in accordance with a determination that the fourth input includes a swipe input in the first direction and an end of the notification list has been reached, displaying a search input area; and
In accordance with a determination that the fourth input includes a swipe input in the first direction and an end of the notification list has not been reached, a third plurality of notifications is displayed between the second plurality of notifications and the end of the notification list.
15. An electronic device, comprising:
a display generation component;
one or more input devices;
one or more processors;
a memory; and
One or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:
Displaying, via the display generating component, a first user interface comprising a first plurality of notifications in a notification list; detecting a first user input while displaying the first user interface, the first user input comprising a first input;
In response to detecting the first user input: in accordance with a determination that the first input includes a swipe input in a first direction and that an end of the notification list has been reached, displaying a search input area; and
In accordance with a determination that the first input includes the swipe input in the first direction and an end of the notification list has not been reached, a second plurality of notifications is displayed between the first plurality of notifications and the end of the notification list.
16. The electronic device of claim 15, wherein the one or more programs include instructions for performing the method of any of claims 1-14.
17. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with a display generation component and one or more input devices, cause the electronic device to: displaying, via the display generating component, a first user interface comprising a first plurality of notifications in a notification list;
detecting a first user input while displaying the first user interface, the first user input comprising a first input;
in response to detecting the first user input:
in accordance with a determination that the first input includes a swipe input in a first direction and that an end of the notification list has been reached, displaying a search input area; and
In accordance with a determination that the first input includes the swipe input in the first direction and has not reached an end of the notification list, a second plurality of notifications is displayed between the first plurality of notifications and the end of the notification list.
18. The computer readable storage medium of claim 17, wherein the one or more programs comprise instructions, which when executed by the electronic device, cause the electronic device to perform the method of any of claims 1-14.