AU2017202587B2 - Portable multifunction device with interface reconfiguration mode - Google Patents

Portable multifunction device with interface reconfiguration mode

Info

Publication number
AU2017202587B2
Authority
AU
Australia
Prior art keywords
icons
area
touch screen
screen display
icon
Prior art date
Legal status
Active
Application number
AU2017202587A
Other versions
AU2017202587A1
Inventor
Imran Chaudhri
Greg Christie
Scott Herz
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority claimed from US11/849,938 (US8619038B2)
Application filed by Apple Inc
Priority to AU2017202587A (AU2017202587B2)
Publication of AU2017202587A1
Priority to AU2019204835A (AU2019204835B2)
Application granted
Publication of AU2017202587B2
Priority to AU2021201687A (AU2021201687B2)
Priority to AU2022224726A (AU2022224726B2)
Priority to AU2024203944A (AU2024203944A1)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object, using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In one aspect of the invention, a computer-implemented method at a computing device with a touch screen display includes: displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; displaying a second plurality of icons in a second area on the touch screen display, wherein the second area is different from the first area; detecting a first finger gesture on the touch screen display; in response to detecting the first finger gesture, initiating a user interface reconfiguration process, and varying positions of one or more icons in the first set of the first plurality of icons about respective average positions.

Description

TECHNICAL FIELD
[0001A] The disclosed embodiments relate generally to portable electronic devices, and more particularly, to user interfaces on portable multifunction devices with touch-sensitive displays that include an interface reconfiguration mode and to creating widgets for displaying specified areas of web pages (i.e., creating web-clip widgets) on portable multifunction devices.
BACKGROUND
[0002] As portable electronic devices become more compact, and the number of functions performed by a given device increases, it has become a significant challenge to design a user interface that allows users to easily interact with a multifunction device. This challenge is particularly significant for handheld portable devices, which have much smaller screens than desktop or laptop computers. This situation is unfortunate because the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features, tools, and functions. Some portable communication devices (e.g., mobile telephones, sometimes called mobile phones, cell phones, cellular telephones, and the like) have resorted to adding more pushbuttons, increasing the density of pushbuttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that must be memorized by the user.
[0003] Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This may prevent a user interface from being configured and/or adapted by either an application running on the portable device or by users. When coupled with the time consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users.
[0004] Some conventional user interfaces can be configured by users, thereby allowing at least partial customization. Unfortunately, the process of modifying such conventional user interfaces is often as cumbersome and complicated as the use of the conventional user interface itself. In particular, the required behaviors during configuration of such conventional user interfaces are often counterintuitive and the corresponding indicators guiding user actions are often difficult to understand. These challenges are often a source of additional frustration for users.
[0005] Accordingly, there is a need for more transparent and intuitive user interfaces for portable devices that enable a user to easily configure the user interface.
[0006] In addition, as a result of the small size of display screens on portable electronic devices, frequently only a portion of a web page of interest to a user can be displayed on the screen at a given time. Furthermore, the scale of display may be too small for comfortable or practical viewing. Users thus will frequently need to scroll and to scale a web page to view a portion of interest each time that they access the web page. However, the limitations of conventional user interfaces can cause this scrolling and scaling to be awkward to perform.
[0007] Accordingly, there is a need for portable multifunction devices with more transparent and intuitive user interfaces for creating widgets for displaying specified areas of web pages (i.e., for creating web-clip widgets) that are easy to use, configure, and/or adapt. In addition, once the web-clip widgets are created, there is a need for transparent and intuitive methods for configuring user interfaces that include icons for activating web-clip widgets.
[0007a] Reference to any prior art in the specification is not an acknowledgment or suggestion that this prior art forms part of the common general knowledge in any jurisdiction or that this prior art could reasonably be expected to be understood, regarded as relevant, and/or combined with other pieces of prior art by a skilled person in the art.
SUMMARY
[0008] The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed portable multifunction device. In some embodiments, the device has a touch-sensitive display (also known as a touch screen) with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer readable storage medium or other computer program product configured for execution by one or more processors.
[0008a] According to a first aspect of the invention there is provided a computer-implemented method, comprising: at a computing device with a touch screen display: displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; displaying a second plurality of icons in a second area on the touch screen display, wherein: the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon; while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area in a first direction; in response to detecting the first finger gesture, replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
[0008b] According to a second aspect of the invention there is provided a computing device, comprising: a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including: instructions for displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; instructions for displaying a second plurality of icons in a second area on the touch screen display, wherein: the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon; instructions for, while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area; and instructions for replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
[0008c] According to a third aspect of the invention there is provided a computer readable storage medium having stored therein instructions, which when executed by a device with a touch screen display, cause the device to: display a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; display a second plurality of icons in a second area on the touch screen display, wherein: the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon; while displaying the first set of the first plurality of icons in the first area of the touch screen display, detect a first finger gesture on the touch screen display in the first area; and replace display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
[0008d] According to a fourth aspect of the invention there is provided a graphical user interface on a computing device with a touch screen display, comprising: a first set of a first plurality of icons displayed in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; and a second plurality of icons displayed in a second area on the touch screen display, wherein: the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon; wherein: while displaying the first set of the first plurality of icons in the first area of the touch screen display and in response to detecting a first finger gesture on the touch screen display in the first area, display of the first set of the first plurality of icons is replaced with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
[0008e] According to a fifth aspect of the invention there is provided a computing device with a touch screen display, comprising: means for displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; means for displaying a second plurality of icons in a second area on the touch screen display, wherein: the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon; means for, while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area; and means for replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
[0009] In one embodiment of the disclosure, a computer-implemented method at a computing device with a touch screen display includes: displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; displaying a second plurality of icons in a second area on the touch screen display, wherein the second area is different from the first area; detecting a first finger gesture on the touch screen display in the first area; and in response to detecting the first finger gesture on the touch screen display in the first area, replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display.
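[Editorial illustration] The following Swift sketch, which is not part of the specification and whose type and property names are invented for this example, models the behaviour of paragraph [0009]: a horizontal finger swipe detected in the first area swaps which set of the first plurality of icons is displayed there, while the second plurality of icons (a tray of icons in the second area) is left untouched.

    // Minimal model of the paged icon area: a horizontal swipe in the first
    // area swaps which set of icons is shown; the tray is never modified.
    struct Icon {
        let name: String
    }

    struct HomeScreen {
        var pages: [[Icon]]          // "first plurality of icons", grouped into sets
        var tray: [Icon]             // "second plurality of icons" in the second area
        var visiblePage: Int = 0

        // Icons currently shown in the first area.
        var visibleIcons: [Icon] { pages[visiblePage] }

        // Interpret a horizontal swipe detected in the first area.
        // A leftward swipe advances to the next set; a rightward swipe goes back.
        mutating func handleSwipe(deltaX: Double) {
            if deltaX < 0, visiblePage < pages.count - 1 {
                visiblePage += 1
            } else if deltaX > 0, visiblePage > 0 {
                visiblePage -= 1
            }
        }
    }

    var screen = HomeScreen(
        pages: [
            [Icon(name: "Photos"), Icon(name: "Calendar"), Icon(name: "Notes")],
            [Icon(name: "Weather"), Icon(name: "Stocks"), Icon(name: "Clock")],
        ],
        tray: [Icon(name: "Phone"), Icon(name: "Mail"), Icon(name: "Browser")]
    )

    screen.handleSwipe(deltaX: -120)                     // swipe left in the first area
    print(screen.visibleIcons.map { $0.name })           // ["Weather", "Stocks", "Clock"]
    print(screen.tray.map { $0.name })                   // tray unchanged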
[0010] In another embodiment of the disclosure, a computer-implemented method at a computing device with a touch screen display includes: displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; displaying a second plurality of icons in a second area on the touch screen display, wherein the second area is different from the first area; detecting a first finger gesture on the touch screen display; in response to detecting the first finger gesture, initiating a user interface reconfiguration process, and varying positions of one or more icons in the first set of the first plurality of icons about respective average positions.
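[Editorial illustration] As a rough sketch of the reconfiguration behaviour in paragraph [0010], the displayed position of each icon can be varied about its respective average position once the reconfiguration process starts. The sinusoidal motion, amplitude, and frequency below are arbitrary choices for the example; the specification does not prescribe any particular motion.

    import Foundation

    struct Point { var x: Double; var y: Double }

    struct WigglingIcon {
        let averagePosition: Point   // the icon's respective average position
        let phase: Double            // per-icon phase so icons do not move in lockstep

        // Position at time t while the user interface reconfiguration mode is active.
        func position(at t: Double, amplitude: Double = 2.0, frequency: Double = 4.0) -> Point {
            let angle = 2.0 * Double.pi * frequency * t + phase
            return Point(x: averagePosition.x + amplitude * cos(angle),
                         y: averagePosition.y + amplitude * sin(angle))
        }
    }

    let icon = WigglingIcon(averagePosition: Point(x: 80, y: 120), phase: 0.7)
    for step in 0..<4 {
        let t = Double(step) * 0.05
        let p = icon.position(at: t)
        print(String(format: "t=%.2fs -> (%.1f, %.1f)", t, p.x, p.y))
    }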
[0011] Thus, interface reconfiguration in accordance with the disclosed embodiments allows a user to reposition displayed icons (e.g., icons for activating applications and/or web-clip widgets) in a simple, intuitive manner with finger gestures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0013] Figures 1A and 1B are block diagrams illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
[0014] Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
[0015] Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.
[0016] Figures 4A-4B illustrate exemplary user interfaces having menus of applications and/or widgets on a portable multifunction device in accordance with some embodiments.
[0017] Figure 4C illustrates an exemplary user interface having a list of user-created widgets on a portable multifunction device in accordance with some embodiments.
[0018] Figures 5A-5K illustrate an exemplary user interface for a browser in accordance with some embodiments.
[0019] Figures 5L and 5M illustrate exemplary user interfaces for displaying web-clip widgets in accordance with some embodiments.
[0020] Figures 6A-6D illustrate an animation for creating and displaying an icon corresponding to a web-clip widget in accordance with some embodiments.
[0021] Figure 6E illustrates an exemplary user interface for activating a web-clip widget in accordance with some embodiments.
[0022] Figures 7A-7E are flow diagrams illustrating processes for creating and using a web-clip widget in accordance with some embodiments.
[0023] Figures 7F-7H are flow diagrams illustrating processes for displaying web-clip widgets in accordance with some embodiments.
[0024] Figures 8A-8D illustrate exemplary user interfaces for displaying icons in accordance with some embodiments.
[0025] Figures 9A and 9B are flow diagrams of an icon display process in accordance with some embodiments.
[0026] Figure 10 is a flow diagram of a position adjustment process for a portable multifunction device in accordance with some embodiments.
[0027] Figures 11A-11OO illustrate exemplary user interfaces during interface reconfiguration in accordance with some embodiments.
[0028] Figures 12A-12F are flow diagrams of icon reconfiguration processes in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0029] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0030] It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms.
These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.
[0031] The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0032] As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
[0033] Embodiments of a portable multifunction device, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.
[0034] The user interface may include a physical click wheel in addition to a touch screen or a virtual click wheel displayed on the touch screen. A click wheel is a user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. Alternatively, breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection. For simplicity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as personal computers and laptop computers, which may include one or more other physical user-interface devices, such as a physical click wheel, a physical keyboard, a mouse and/or a joystick.
[0035] The device supports a variety of applications, such as one or more of the following: a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
[0036] The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. One or more functions of the touch screen as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application.
In this way, a common physical architecture (such as the touch screen) of the device may support the variety of applications with user interfaces that are intuitive and transparent.
[0037] The user interfaces may include one or more soft keyboard embodiments. The soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. Patent Applications 11/459,606, Keyboards For Portable Electronic Devices, filed July 24, 2006, and 11/459,615, Touch Screen Keyboards For Portable Electronic Devices, filed July 24, 2006, the contents of which are hereby incorporated by reference in their entirety. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user. For example, one or more keyboard embodiments may be tailored to a respective user based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.
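[Editorial illustration] One plausible, but assumed rather than specified, way to make a soft keyboard adaptive in the sense described above is to enlarge the effective hit targets of keys that are likely next characters given the user's word-usage history. The Swift sketch below scores candidate keys by distance to the touch, discounted for likely characters; all names and the 0.6 discount factor are invented for the example.

    struct SoftKey {
        let symbol: Character
        let center: (x: Double, y: Double)
    }

    // Characters that follow `prefix` in words the user has typed before.
    func likelyNextCharacters(after prefix: String, history: [String]) -> Set<Character> {
        var result: Set<Character> = []
        for word in history where word.hasPrefix(prefix) && word.count > prefix.count {
            result.insert(word[word.index(word.startIndex, offsetBy: prefix.count)])
        }
        return result
    }

    func selectKey(touch: (x: Double, y: Double),
                   keys: [SoftKey],
                   prefix: String,
                   history: [String]) -> SoftKey? {
        let likely = likelyNextCharacters(after: prefix, history: history)
        // Distance to the touch, discounted when the key is a likely next
        // character, i.e. its effective target is enlarged.
        func score(_ key: SoftKey) -> Double {
            let dx = key.center.x - touch.x
            let dy = key.center.y - touch.y
            let distance = (dx * dx + dy * dy).squareRoot()
            return likely.contains(key.symbol) ? distance * 0.6 : distance
        }
        return keys.min { score($0) < score($1) }
    }

    let keys = [SoftKey(symbol: "o", center: (x: 95, y: 40)),
                SoftKey(symbol: "p", center: (x: 105, y: 40))]
    let chosen = selectKey(touch: (x: 101, y: 41), keys: keys, prefix: "hell", history: ["hello", "help"])
    print(chosen?.symbol ?? "?")   // "o" wins despite "p" being nearer, because "hello" is in the history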
[0038] Attention is now directed towards embodiments of the device. Figures 1A and 1B are block diagrams illustrating portable multifunction devices 100 with touch-sensitive displays 112 in accordance with some embodiments. The touch-sensitive display 112 is sometimes called a touch screen for convenience, and may also be known as or called a touch-sensitive display system. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPU’s) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
[0039] It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in Figures 1A and 1B may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
[0040] Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
[0041] The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
[0042] In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.
[0043] The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
[0044] The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g. 212, Figure 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
[0045] The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button (e.g., 206, Figure 2). A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. Patent Application 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed December 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
[0046] The touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
[0047] A touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 112 and the user corresponds to a finger of the user.
[0048] The touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112.
[0049] A touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, a touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.
[0050] A touch-sensitive display in some embodiments of the touch screen 112 may be as described in the following applications: (1) U.S. Patent Application No. 11/381,313, Multipoint Touch Surface Controller, filed May 2, 2006; (2) U.S. Patent Application No. 10/840,862, Multipoint Touchscreen, filed May 6, 2004; (3) U.S. Patent Application No. 10/903,964, Gestures For Touch Sensitive Input Devices, filed July 30, 2004; (4) U.S. Patent Application No. 11/048,264, Gestures For Touch Sensitive Input Devices, filed January 31, 2005; (5) U.S. Patent Application No. 11/038,590, Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices, filed January 18, 2005; (6) U.S. Patent Application No. 11/228,758, Virtual Input Device Placement On A Touch Screen User Interface, filed September 16, 2005; (7) U.S. Patent Application No. 11/228,700, Operation Of A Computer With A Touch Screen Interface, filed September 16, 2005; (8) U.S. Patent Application No. 11/228,737, Activating Virtual Keys Of A Touch-Screen Virtual Keyboard, filed September 16, 2005; and (9) U.S. Patent Application No. 11/367,749, Multi-Functional Hand-Held Device, filed March 3, 2006. All of these applications are incorporated by reference in their entirety herein.
[0051] The touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
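[Editorial illustration] The translation from a rough finger contact to a precise pointer position is not spelled out in the text. One simple possibility, sketched below in Swift with invented names, is to reduce the sensed contact region to a single pressure-weighted centroid.

    struct TouchSample { let x: Double; let y: Double; let pressure: Double }

    // Collapse a sensed contact region into one precise point by weighting
    // each sample by its pressure, so firmer parts of the contact pull the
    // reported point toward them.
    func precisePoint(from region: [TouchSample]) -> (x: Double, y: Double)? {
        guard !region.isEmpty else { return nil }
        let totalWeight = region.reduce(0) { $0 + $1.pressure }
        guard totalWeight > 0 else { return nil }
        let x = region.reduce(0) { $0 + $1.x * $1.pressure } / totalWeight
        let y = region.reduce(0) { $0 + $1.y * $1.pressure } / totalWeight
        return (x, y)
    }

    let samples = [TouchSample(x: 10, y: 20, pressure: 0.2),
                   TouchSample(x: 12, y: 22, pressure: 0.8)]
    print(precisePoint(from: samples) ?? (x: 0, y: 0))   // (x: 11.6, y: 21.6)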
[0052] In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
[0053] In some embodiments, the device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
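[Editorial illustration] The angular-displacement behaviour described in paragraph [0053] can be illustrated with a short Swift sketch that converts successive points of contact into navigation steps. The 30-degrees-per-item step size and all names are arbitrary choices for the example, not values from the specification.

    import Foundation

    struct ClickWheel {
        let center: (x: Double, y: Double)
        private var lastAngle: Double?

        init(center: (x: Double, y: Double)) { self.center = center }

        // Returns how many items to scroll for a new point of contact, based
        // on angular displacement since the last sample. Positive values mean
        // the contact moved clockwise in screen coordinates (y increasing downward).
        mutating func navigationSteps(contact: (x: Double, y: Double)) -> Int {
            let angle = atan2(contact.y - center.y, contact.x - center.x)
            defer { lastAngle = angle }
            guard let previous = lastAngle else { return 0 }
            var delta = angle - previous
            // Keep the displacement in (-pi, pi] so crossing the wrap point works.
            if delta > .pi { delta -= 2 * .pi }
            if delta < -.pi { delta += 2 * .pi }
            let stepSize = Double.pi / 6                 // 30 degrees per item
            return Int((delta / stepSize).rounded(.towardZero))
        }
    }

    var wheel = ClickWheel(center: (x: 100, y: 100))
    _ = wheel.navigationSteps(contact: (x: 150, y: 100))     // first sample, no displacement yet
    print(wheel.navigationSteps(contact: (x: 100, y: 150)))  // quarter turn -> 3 items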
[0054] The device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
[0055] The device 100 may also include one or more optical sensors 164. Figures 1A and 1B show an optical sensor coupled to an optical sensor controller 158 in I/O subsystem 106. The optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 164 receives light from the environment, projected through one or more lens, and converts the light to data representing an image. In conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.
[0056] The device 100 may also include one or more proximity sensors 166. Figures 1A and 1B show a proximity sensor 166 coupled to the peripherals interface 118. Alternately, the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106. The proximity sensor 166 may perform as described in U.S. Patent Application Nos. 11/241,839, “Proximity Detector In Handheld Device,” filed September 30, 2005; 11/240,788, “Proximity Detector In Handheld Device,” filed September 30, 2005; 11/620,702, “Using Ambient Light Sensor To Augment Proximity Sensor Output”; 11/586,862, “Automated Response To And Sensing Of User Activity In Portable Devices,” filed October 24, 2006; and 11/638,251, “Methods And Systems For Automatic Configuration Of Peripherals,” which are hereby incorporated by reference herein in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
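[Editorial illustration] The proximity behaviour described in paragraph [0056] amounts to simple state logic. The Swift sketch below is one assumed formulation; the type names and conditions are not taken from the specification.

    struct DisplayState {
        var screenOn = true
        var touchEnabled = true
    }

    // When the device reports that it is near the user's ear during a call,
    // turn the screen off and ignore touch input; otherwise restore both.
    func updateForProximity(nearFace: Bool, inCall: Bool, state: inout DisplayState) {
        if nearFace && inCall {
            state.screenOn = false
            state.touchEnabled = false
        } else {
            state.screenOn = true
            state.touchEnabled = true
        }
    }

    var display = DisplayState()
    updateForProximity(nearFace: true, inCall: true, state: &display)
    print(display.screenOn, display.touchEnabled)   // false false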
[0057] The device 100 may also include one or more accelerometers 168. Figures 1A and 1B show an accelerometer 168 coupled to the peripherals interface 118. Alternately, the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. Patent Publication No. 20050190059, “Acceleration-based Theft Detection System for Portable Electronic Devices,” and U.S. Patent Publication No. 20060017692, “Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer,” both of which are incorporated herein by reference in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
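[Editorial illustration] Choosing a portrait or landscape view from accelerometer data, as mentioned in paragraph [0057], can be sketched by comparing the gravity components along the device axes. Real implementations would filter the signal and add hysteresis; this illustrative Swift sketch omits both.

    import Foundation

    enum Orientation { case portrait, landscape, flat }

    func orientation(ax: Double, ay: Double, az: Double) -> Orientation {
        // If gravity is mostly along z, the device is lying flat and the
        // previous orientation would normally be kept.
        if abs(az) > abs(ax) && abs(az) > abs(ay) { return .flat }
        return abs(ay) >= abs(ax) ? .portrait : .landscape
    }

    print(orientation(ax: 0.05, ay: -0.98, az: 0.1))   // portrait (held upright)
    print(orientation(ax: -0.97, ay: 0.04, az: 0.1))   // landscape (turned on its side)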
[0058] In some embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.
[0059] The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
[0060] The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
[0061] The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., multitouch/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
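[Editorial illustration] The kinds of quantities the contact/motion module derives, such as speed, direction of travel, and whether the contact has ended, can be computed from successive samples of the point of contact, as in the following Swift sketch. Field and type names are invented for the example.

    import Foundation

    struct ContactSample {
        let x: Double
        let y: Double
        let timestamp: TimeInterval
        let touching: Bool
    }

    struct ContactMotion {
        let speed: Double              // points per second (magnitude)
        let direction: Double          // radians, direction of travel
        let contactEnded: Bool         // the contact has been broken
    }

    func analyze(previous: ContactSample, current: ContactSample) -> ContactMotion {
        let dt = max(current.timestamp - previous.timestamp, 1e-6)
        let dx = current.x - previous.x
        let dy = current.y - previous.y
        let speed = (dx * dx + dy * dy).squareRoot() / dt
        return ContactMotion(speed: speed,
                             direction: atan2(dy, dx),
                             contactEnded: previous.touching && !current.touching)
    }

    let a = ContactSample(x: 100, y: 200, timestamp: 0.00, touching: true)
    let b = ContactSample(x: 130, y: 240, timestamp: 0.05, touching: true)
    print(analyze(previous: a, current: b).speed)   // 1000 points per second (50 points in 0.05 s)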
[0062] The graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like. An animation in this context is a display of a sequence of images that gives the appearance of movement, and informs the user of an action that has been performed (such as expanding a user-selected web-page portion to fill a browser window). In this context, a respective animation that executes an action, or confirms an action by the user of the device, typically takes a predefined, finite amount of time, typically between 0.2 and 1.0 seconds, and generally less than two seconds.
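[Editorial illustration] A finite-duration animation of the kind characterised in paragraph [0062] can be modelled as interpolating a value, here a zoom scale for a selected web-page portion, over a fixed duration in the 0.2 to 1.0 second range. The ease-in-out curve below is a common choice, not something the text mandates.

    import Foundation

    // Interpolate from `start` to `end` over `duration`, clamping progress to [0, 1].
    func animatedScale(from start: Double, to end: Double,
                       duration: TimeInterval, elapsed: TimeInterval) -> Double {
        let t = min(max(elapsed / duration, 0), 1)
        let eased = t * t * (3 - 2 * t)                  // smoothstep ease-in-out
        return start + (end - start) * eased
    }

    // Expanding a web-clip from 1x to 2.5x over 0.35 seconds.
    for step in 0...5 {
        let elapsed = Double(step) * 0.07
        let scale = animatedScale(from: 1.0, to: 2.5, duration: 0.35, elapsed: elapsed)
        print(String(format: "%.2fs -> %.2fx", elapsed, scale))
    }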
[0063] The text input module 134, which may be a component of graphics module
132, provides soft keyboards for entering text in various applications (e.g., contacts 137, email 140, IM 141, blogging 142, browser 147, and any other application that needs text input).
[0064] The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
[0065] The applications 136 may include the following modules (or sets of 20 instructions), or a subset or superset thereof:
• a contacts module 137 (sometimes called an address book or contact list);
• a telephone module 138;
• a video conferencing module 139;
• an e-mail client module 140;
· an instant messaging (IM) module 141;
• a blogging module 142;
• a camera module 143 for still and/or video images;
• an image management module 144;
• a video player module 145;
· a music player module 146;
• a browser module 147;
• a calendar module 148;
• widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
• widget creator module 150 for making user-created widgets 149-6;
• search module 151;
• video and music player module 152, which merges video player module 145 and music player module 146;
• notes module 153;
• map module 154; and/or
• online video module 155.
[0066] Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights 15 management, voice recognition, and voice replication.
[0067] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), 20 physical address(es) or other information with a name; associating an image with a name;
categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
[0068] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, 25 microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the
conversation is completed. As noted above, the wireless communication may use any of a plurality of communications standards, protocols and technologies.
[0069] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor 5 controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants.
[0070] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the e-mail client 10 module 140 may be used to create, send, receive, and manage e-mail. In conjunction with image management module 144, the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
[0071] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging 15 module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant 20 messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
[0072] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, image management module 144, and browsing module 147, the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog).
[0073] In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
[0074] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, the image 5 management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
[0075] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, and speaker 111, the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on 10 an external, connected display via external port 124).
[0076] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC 15 files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
[0077] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the browser module 147 may be used to browse the Internet, including searching, linking to, 20 receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages. Embodiments of user interfaces and associated processes using browser module 147 are described further below.
[0078] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail 25 module 140, and browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.).
[0079] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets). Embodiments of user interfaces and associated processes using widget modules 149 are described further below.
[0080] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 may be used by a user to create widgets 10 (e.g., turning a user-specified portion of a web page into a web-clip widget). In some embodiments, a web-clip widget comprises a file containing an XML property list that includes a URL for the web page and data indicating the user-specified portion of the web page. In some embodiments, the data indicating the user-specified portion of the web page includes a reference point and a scale factor. In some embodiments, the data indicating the 15 user-specified portion of the web page includes a set of coordinates within the web page or an identification of a structural element within the web page. Alternatively, in some embodiments a web-clip widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. Alternatively, in some embodiments a web-clip widget includes an XML (Extensible Markup Language) file and a JavaScript file.
[0081] In some embodiments a web-clip widget includes an image file (e.g., a png file) of an icon corresponding to the widget. In some embodiments, a web-clip widget corresponds to a folder containing the image file and a file that includes the URL for the web page and data indicating the user-specified portion of the web page. In some embodiments, a web-clip widget corresponds to a folder containing the image file and an executable script.
[0082] Embodiments of user interfaces and associated processes using widget creator module 150 are described further below.
[0083] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the search module 151 may be used to search for text, music, sound, image, video, and/or other files in memory 102 that 30 match one or more search criteria (e.g., one or more user-specified search terms).
[0084] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the notes module 153 may be used to create and manage notes, to do lists, and the like.
[0085] In conjunction with RF circuitry 108, touch screen 112, display system 5 controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data).
[0086] In conjunction with touch screen 112, display system controller 156, contact 10 module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, the online video module 155 allows the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online 15 videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed June 20, 2007, the content of which is 20 hereby incorporated by reference in its entirety.
[0087] Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. For example, video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, Figure 1B). In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.
[0088] In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control
device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
[0089] The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces. In some 5 embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a “menu button.” In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
[0090] Figure 2 illustrates a portable multifunction device 100 having a touch screen
112 in accordance with some embodiments. The touch screen may display one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user may select one or more of the graphics by making contact or touching the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure). In 15 some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100. In some embodiments, inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.
[0091] The device 100 may also include one or more physical buttons, such as home or menu button 204. As described previously, the menu button 204 may be used to 25 navigate to any application 136 in a set of applications that may be executed on the device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI in touch screen 112.
[0092] In one embodiment, the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a head set jack 212, and a docking/charging external port 124. The push button 206 may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 113.
[0093] Attention is now directed towards embodiments of user interfaces (UI) and associated processes that may be implemented on a portable multifunction device 100.
[0094] Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments. In some embodiments, user interface 300 includes the following elements, or a subset or superset thereof:
· Unlock image 302 that is moved with a finger gesture to unlock the device;
• Arrow 304 that provides a visual cue to the unlock gesture;
• Channel 306 that provides additional cues to the unlock gesture;
• Time 308;
• Day 310;
• Date 312; and
• Wallpaper image 314.
[0095] In some embodiments, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302) while the device is in a user-interface lock state. The device moves the unlock image 302 in accordance 20 with the contact. The device transitions to a user-interface unlock state if the detected contact corresponds to a predefined gesture, such as moving the unlock image across channel 306. Conversely, the device maintains the user-interface lock state if the detected contact does not correspond to the predefined gesture. This process saves battery power by ensuring that the device is not accidentally awakened. This process is easy for users to perform, in part because 25 of the visual cue(s) provided on the touch screen.
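As a purely illustrative aid (not part of the specification), the following Swift sketch shows the unlock decision described above under the assumption that the predefined gesture is completed only when the unlock image has been dragged across the full width of the channel; all names and values are hypothetical.

```swift
// Hypothetical sketch of the unlock decision; channelWidth and dragDistance
// are illustrative names, not elements of the patent.
enum LockState { case locked, unlocked }

struct UnlockChannel {
    let width: Double   // horizontal extent of channel 306, in points

    /// New UI state when the finger lifts after dragging the unlock image
    /// a given horizontal distance along the channel.
    func state(afterDragDistance dragDistance: Double) -> LockState {
        // The device unlocks only if the predefined gesture is completed,
        // i.e. the unlock image has been moved across the channel.
        return dragDistance >= width ? .unlocked : .locked
    }
}

let channel = UnlockChannel(width: 220)
print(channel.state(afterDragDistance: 90))   // locked: gesture not completed
print(channel.state(afterDragDistance: 240))  // unlocked
```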
[0096] As noted above, processes that use gestures on the touch screen to unlock the device are described in U.S. Patent Applications 11/322,549, “Unlocking A Device By Performing Gestures On An Unlock Image,” filed December 23, 2005, and 11/322,550, “Indication Of Progress Towards Satisfaction Of A User Input Condition,” filed December 30 23, 2005, which are hereby incorporated by reference in their entirety.
[0097] Figure 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof:
· Signal strength indicator 402 for wireless communication;
• Time 404;
• Battery status indicator 406;
• Tray 408 with icons for frequently used applications, such as:
o Phone 138;
o E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
o Browser 147; and
o Music player 146; and
• Icons for other applications, such as:
o IM 141;
o Image management 144;
o Camera 143;
o Video player 145;
o Weather 149-1;
o Stocks 149-2;
o Blog 142;
o Calendar 148;
o Calculator 149-3;
o Alarm clock 149-4;
o Dictionary 149-5;
o User-created widget 149-6; and
o Other applications (not shown) (e.g., map 154 and online video 155).
[0098] In some embodiments, UI 400A displays all of the available applications 136 on one screen so that there is no need to scroll through a list of applications (e.g., via a scroll bar). In some embodiments, as the number of applications increases, the icons corresponding to the applications may decrease in size so that all applications may be displayed on a single screen without scrolling. In some embodiments, having all applications on one screen and a menu button enables a user to access any desired application with at most two inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger gesture on the icon corresponding to the application).
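The following Swift sketch is illustrative only: it shows one possible way an icon side length could be chosen so that a growing number of application icons still fits on a single screen without scrolling. The grid model, spacing value, and function name are assumptions, not taken from the specification.

```swift
// Illustrative sketch: more applications -> smaller icons, never a scroll bar.
func iconSideLength(iconCount: Int,
                    screenWidth: Double,
                    screenHeight: Double,
                    spacing: Double = 8) -> Double {
    guard iconCount > 0 else { return 0 }
    // Try progressively more columns until the required number of rows fits.
    for columns in 1...iconCount {
        let rows = Int((Double(iconCount) / Double(columns)).rounded(.up))
        let side = (screenWidth - Double(columns + 1) * spacing) / Double(columns)
        let neededHeight = Double(rows) * side + Double(rows + 1) * spacing
        if neededHeight <= screenHeight {
            return side
        }
    }
    return 0
}

print(iconSideLength(iconCount: 16, screenWidth: 320, screenHeight: 480)) // larger icons
print(iconSideLength(iconCount: 36, screenWidth: 320, screenHeight: 480)) // smaller icons
```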
[0099] In some embodiments, UI 400A provides integrated access to both widget-based applications and non-widget-based applications. In some embodiments, all of the widgets, whether user-created or not, are displayed in UI 400A. In other embodiments, activating the icon for user-created widget 149-6 may lead to another UI that displays the user-created widgets or icons corresponding to the user-created widgets. For example, UI 400B (Figure 4B) displays a menu of six icons corresponding to six user-created widgets 149-6-1 through 149-6-6 in accordance with some embodiments. A user may activate a particular widget by gesturing on the corresponding icon. Alternatively, user-created widgets may be displayed in a list. UI 400C (Figure 4C) illustrates a list of names of six user-created widgets 149-6-1 through 149-6-6 along with corresponding icons in accordance with some embodiments. A user may activate a particular widget by gesturing on the corresponding name or icon.
[00100] In some embodiments, a user may rearrange the icons in UI 400A, UI 400B, or UI 400C, e.g., using processes described in U.S. Patent Application No. 11/459,602, Portable Electronic Device With Interface Reconfiguration Mode, filed July 24, 2006, which is hereby incorporated by reference in its entirety. For example, a user may move 25 application icons in and out of tray 408 using finger gestures.
[00101] In some embodiments, UI 400A includes a gauge (not shown) that displays an updated account usage metric for an account associated with usage of the device (e.g., a cellular phone account), as described in U.S. Patent Application 11/322,552, Account Information Display For Portable Communication Device, filed December 23, 2005, which 30 is hereby incorporated by reference in its entirety.
Making and Using Web-Clip Widgets
[00102] Figures 5A-5I illustrate an exemplary user interface for a browser in accordance with some embodiments.
[00103] In some embodiments, user interface 3900A (Figure 5A) includes the following elements, or a subset or superset thereof:
· 402, 404, and 406, as described above;
• Previous page icon 3902 that when activated (e.g., by a finger tap on the icon) initiates display of a previous web page (if any);
• Web page name 3904;
• Next page icon 3906 that when activated (e.g., by a finger tap on the icon) initiates display of a next web page (if any);
• URL (Uniform Resource Locator) entry box 3908 for inputting URLs of web pages;
• Refresh icon 3910 that when activated (e.g., by a finger tap on the icon) initiates a refresh of the web page;
• Web page 3912 or other structured document, which includes a plurality of blocks
3914 of text content and other graphics (e.g., images);
• Settings icon 3916 that when activated (e.g., by a finger tap on the icon) initiates display of a settings menu for the browser;
• Bookmarks icon 3918 that when activated (e.g., by a finger tap on the icon) initiates display of a bookmarks list or menu for the browser;
• Options icon 3920 that when activated (e.g., by a finger tap on the icon) initiates display of a plurality of options, including options for creating a web-clip widget, adding a bookmark, and emailing a link to the displayed web page 3912 (e.g., UI 3900F, Figure 5F, which like other UIs and pages, can be displayed in either portrait or landscape view); and
• New window icon 3922 that when activated (e.g., by a finger tap on the icon) initiates display of a UI for adding new windows to the browser (e.g., UI 3900G, Figure 5G).
[00104] In some embodiments, in response to a predefined gesture by the user on a block 3914 (e.g., a single tap gesture or a double tap gesture), the block is enlarged and centered (or substantially centered) in the web page display. For example, in response to a
single tap gesture 3923 on block 3914-5, the user-selected block 3914-5 may be enlarged and centered in the display, as shown in UI 3900C (Figure 5C). In some embodiments, the width of the user-selected block is scaled to fill the touch screen display. In some embodiments, the width of the user-selected block is scaled to fill the touch screen display with a predefined amount of padding along the sides of the display. In some embodiments, a zooming animation of the user-selected block is displayed during enlargement of the block. Similarly, in response to a single tap gesture 3925 on block 3914-2, block 3914-2 may be enlarged with a zooming animation and two-dimensionally scrolled to the center of the display (not shown).
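A hedged illustration of the scaling just described: the sketch below computes a scale factor that makes a user-selected block fill the display width with a predefined amount of padding, and an offset that centers (or substantially centers) the block. The types, names, and padding value are assumptions, not part of the specification.

```swift
// Illustrative geometry only; Rect and zoomToBlock are hypothetical names.
struct Rect { var x, y, width, height: Double }

func zoomToBlock(block: Rect,
                 displayWidth: Double,
                 displayHeight: Double,
                 sidePadding: Double = 10) -> (scale: Double, offsetX: Double, offsetY: Double) {
    // Scale the user-selected block so its width fills the screen minus padding.
    let scale = (displayWidth - 2 * sidePadding) / block.width
    // Translate so the scaled block is centered (or substantially centered).
    let blockCenterX = (block.x + block.width / 2) * scale
    let blockCenterY = (block.y + block.height / 2) * scale
    return (scale: scale,
            offsetX: displayWidth / 2 - blockCenterX,
            offsetY: displayHeight / 2 - blockCenterY)
}

// Example: a tapped block occupying a 150-point-wide region of the page.
let zoom = zoomToBlock(block: Rect(x: 40, y: 300, width: 150, height: 120),
                       displayWidth: 320, displayHeight: 480)
print(zoom.scale)  // 2.0: the block now spans the screen width minus padding
```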
[00105] In some embodiments, the device analyzes the render tree of the web page 10 3912 to determine the blocks 3914 in the web page. In some embodiments, a block 3914 corresponds to a render node that is: replaced; a block; an inline block; or an inline table.
[00106] In some embodiments, in response to the same predefined gesture by the user on a block 3914 (e.g., a single tap gesture or a double tap gesture) that is already enlarged and centered, the enlargement and/or centering is substantially or completely reversed. For 15 example, in response to a single tap gesture 3929 on block 3914-5 (Figure 5C), the web page image may zoom out and return to UI 3900A (Figure 5A).
[00107] In some embodiments, in response to a predefined gesture (e.g., a single tap gesture or a double tap gesture) by the user on a block 3914 that is already enlarged but not centered, the block is centered (or substantially centered) in the web page display. For 20 example, in response to a single tap gesture 3927 on block 3914-4 (Figure 5C), block 3914-4 may be centered (or substantially centered) in the web page display. Similarly, in response to a single tap gesture 3935 on block 3914-6, block 3914-6 may be centered (or substantially centered) in the web page display. Thus, for a web page display that is already enlarged, in response to a predefined gesture, the device may display in an intuitive manner a series of 25 blocks that the user wants to view. This same gesture may initiate different actions in different contexts (e.g., (1) zooming and/or enlarging in combination with scrolling when the web page is reduced in size, UI 3 900A and (2) reversing the enlargement and/or centering if the block is already centered and enlarged).
[00108] In some embodiments, in response to a multi-touch (3931 and 3933) de-pinching gesture by the user (Figure 5C), the web page may be enlarged. Conversely, in response to a multi-touch pinching gesture by the user, the web page may be reduced.
[00109] In some embodiments, in response to a substantially vertical upward (or downward) swipe gesture by the user, the web page (or, more generally, other electronic documents) may scroll one-dimensionally upward (or downward) in the vertical direction. For example, in response to an upward swipe gesture 3937 by the user that is within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll one-dimensionally upward in the vertical direction.
[00110] Conversely, in some embodiments, in response to a swipe gesture that is not within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll two-dimensionally (i.e., with simultaneous movement in both the vertical and horizontal 10 directions). For example, in response to an upward or diagonal swipe gesture 3939 by the user that is not within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll two-dimensionally along the direction of the swipe 3939.
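The angle test described in the two preceding paragraphs can be illustrated with the following Swift sketch, which classifies a swipe as a one-dimensional vertical scroll when it is within a predetermined angle (27° in the example) of being perfectly vertical, and as a two-dimensional scroll otherwise; the names are illustrative only.

```swift
import Foundation

// Illustrative classification of a swipe vector (dx, dy); names are hypothetical.
enum ScrollBehavior { case oneDimensionalVertical, twoDimensional }

func scrollBehavior(dx: Double, dy: Double, thresholdDegrees: Double = 27) -> ScrollBehavior {
    // Angle between the swipe vector and the vertical axis, in degrees.
    let angleFromVertical = atan2(abs(dx), abs(dy)) * 180 / .pi
    return angleFromVertical <= thresholdDegrees ? .oneDimensionalVertical : .twoDimensional
}

print(scrollBehavior(dx: 5, dy: -120))   // oneDimensionalVertical (nearly straight up)
print(scrollBehavior(dx: 80, dy: -80))   // twoDimensional (diagonal swipe)
```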
[00111] In some embodiments, in response to a multi-touch (3941 and 3943) rotation gesture by the user, the web page may be rotated exactly 90° (UI 3900D, Figure 5D) for 15 landscape viewing, even if the amount of rotation in the multi-touch (3941 and 3943) rotation gesture is substantially different from 90°. Similarly, in response to a multi-touch (3945 and 3947) rotation gesture by the user (UI 3900D, Figure 5D), the web page may be rotated exactly 90° for portrait viewing, even if the amount of rotation in the multi-touch (3945 and 3947) rotation gesture is substantially different from 90°.
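A minimal, purely illustrative sketch of the snapping behavior described above, in which an imprecise rotation gesture produces an exact 90° rotation; the 20° trigger threshold is an assumption, not a value from the specification.

```swift
// Hypothetical sketch: an imprecise two-finger rotation snaps to exactly ±90°.
func snappedRotation(gestureDegrees: Double) -> Double {
    // Any rotation gesture past a small threshold yields exactly ±90°,
    // even if the fingers rotated by a substantially different amount.
    if gestureDegrees > 20 { return 90 }
    if gestureDegrees < -20 { return -90 }
    return 0
}

print(snappedRotation(gestureDegrees: 37))   // 90.0 — imprecise input, precise result
print(snappedRotation(gestureDegrees: -64))  // -90.0
```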
[00112] Thus, in response to imprecise gestures by the user, precise movements of graphics occur. The device behaves in the manner desired by the user despite inaccurate input by the user. Also, note that the gestures described for UI 3900C, which has a portrait view, are also applicable to UIs with a landscape view (e.g., UI 3900D, Figure 5D) so that the user can choose whichever view (portrait or landscape) the user prefers for web browsing.
[00113] In some embodiments, in response to a tap or other predefined user gesture on
URL entry box 3908 (UI 3900A, Figure 5A), the touch screen displays an enlarged entry box 3926 and a keyboard 616 (e.g., UI 3900B, Figure 5B in portrait viewing and UI 3900E, Figure 5E in landscape viewing). In some embodiments, the touch screen also displays:
• Contextual clear icon 3928 that when activated (e.g., by a finger tap on the icon) initiates deletion of all text in entry box 3926;
• a search icon 3930 that when activated (e.g., by a finger tap on the icon) initiates an Internet search using the search terms input in box 3926; and
• Go to URL icon 3932 that when activated (e.g., by a finger tap on the icon) initiates acquisition of the web page at the URL in box 3926.
[00114] Thus, the same entry box 3926 may be used for inputting both search terms and URLs. In some embodiments, whether or not clear icon 3928 is displayed depends on the context.
[00115] UI 3900G (Figure 5G) is a UI for adding new windows to an application, such as the browser 147. UI 3900G displays an application (e.g., the browser 147), which includes a displayed window (e.g., web page 3912-2) and at least one hidden window (e.g., web pages 3912-1 and 3912-3 and possibly other web pages that are completely hidden off-screen). UI 3900G also displays an icon for adding windows to the application (e.g., new window or new page icon 3936). In response to detecting activation of the icon 3936 for adding windows, the browser adds a window to the application (e.g., a new window for a new web page 3912).
[00116] In response to detecting a gesture on the touch screen display, a displayed window in the application is moved off the display and a hidden window is moved onto the display. For example, in response to detecting a tap gesture 3949 on the left side of the screen, the window with web page 3912-2 is moved partially or fully off-screen to the right, the window with web page 3912-3 is moved completely off-screen, partially hidden window 20 with web page 3912-1 is moved to the center of the display, and another completely hidden window (not shown in Figure 5G) with a web page may be moved partially onto the display. Alternatively, detection of a left-to-right swipe gesture 3951 may achieve the same effect.
[00117] Conversely, in response to detecting a tap gesture 3953 on the right side of the screen, the window with web page 3912-2 is moved partially or fully off-screen to the left, 25 the window with web page 3912-1 is moved completely off-screen, partially hidden window with web page 3912-3 is moved to the center of the display, and another completely hidden window (not shown in Figure 5G) with a web page may be moved partially onto the display. Alternatively, detection of a right-to-left swipe gesture 3951 may achieve the same effect.
[00118] In some embodiments, in response to a tap or other predefined gesture on a delete icon 3934 (e.g., 3934-2 or 3934-3), the corresponding window 3912 is deleted. In some embodiments, in response to a tap or other predefined gesture on Done icon 3938, the window in the center of the display (e.g., 3912-2) is enlarged to fill the screen.
[00119] A user may create a web-clip widget in accordance with some embodiments. Activation of the user-created web-clip widget displays a previously specified area in a web 5 page (having a specified URL) at a specified display size or scale factor. In some embodiments, the area in the web page is specified by scaling and/or translating the display of the web page. For example, a specified area in the web page is enlarged and centered. The specified area may be displayed in a browser application (e.g., the browser 147) or other application. For example, activation of the web-clip widget may display a particular block 10 that is of interest to the user within the web page; furthermore, the block may be enlarged.
Activation of the web-clip widget thus enables the user to view the particular block of interest without having to enlarge and center the web page area that is of interest each time the user visits the web page. In some embodiments, after activation of the web-clip widget, the user may manipulate the display to view other portions of the web page by scaling and/or 15 translating the display. Alternatively, in some embodiments, the user may not be permitted to manipulate the display.
[00120] Web-clip widgets provide more functionality than mere bookmarks: activation of a bookmark only displays a specified web page, while activation of a web-clip widget displays a specified area of a web page at a specified display size or scale factor in 20 accordance with some embodiments. Similarly, a web-clip widget is distinguishable from a hyperlink. To view a web page or portion thereof specified by a hyperlink, the user must activate the browser application, navigate to a web page containing the hyperlink, activate the hyperlink, and then potentially scroll and/or scale the resulting web page. In contrast, to view an area of a web page specified by a web-clip widget, the user merely activates the widget.
[00121] In some embodiments, the web-clip widget corresponds to a block or other structural element of the web page. As described in U.S. Patent Application No. 11/620,492, “Selecting and Manipulating Web Content,” filed on January 5, 2007, which application is incorporated by reference herein in its entirety, structural elements that are displayed in a web page may be identified during the web-clip widget creation process. In some embodiments, if the dimensions of a selected structural element change after creation of a web-clip widget, the area that is displayed upon activation of the web-clip widget is changed accordingly.
[00122] In some embodiments, a web-clip widget comprises a URL for the web page and data (e.g., metadata) indicating the user-specified portion of the web page. For example, in some embodiments the web-clip widget comprises a file containing an XML property list that includes the URL and the data indicating the user-specified portion of the web page. In some embodiments, the data indicating the user-specified portion of the web page includes a reference point (e.g., a corner point or center point for the widget) and a scale factor. In some embodiments, the data indicating the user-specified portion of the web page includes a set of coordinates within the web page (e.g., a user-defined rectangle) or an identification of a structural element within the web page. The application for viewing the web-clip widget (e.g., the browser 147) is configured to process the data indicating the user-specified portion of the web page and to display the corresponding portion.
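For illustration only, the following Swift sketch models the kind of data described above for a web-clip widget (the page URL plus a reference point and scale factor identifying the user-specified portion) and serializes it as an XML property list. The Swift types and field names are assumptions, not the format of any shipping product.

```swift
import Foundation

// Hypothetical model of the data a web-clip widget is described as carrying.
struct WebClipWidget: Codable {
    var name: String
    var pageURL: URL
    var referencePoint: [Double]   // e.g., a corner or center point [x, y]
    var scaleFactor: Double
}

let widget = WebClipWidget(name: "Scores",
                           pageURL: URL(string: "https://www.example.com/sports")!,
                           referencePoint: [40, 300],
                           scaleFactor: 2.0)

// Serialize as an XML property list, one of the storage formats mentioned above.
let encoder = PropertyListEncoder()
encoder.outputFormat = .xml
if let plist = try? encoder.encode(widget),
   let text = String(data: plist, encoding: .utf8) {
    print(text)
}
```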
[00123] In some embodiments a web-clip widget comprises an executable script. In some embodiments, the widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, the widget 15 includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo!
Widgets).
[00124] To the extent that any application incorporated by reference herein includes a definition of web-clip widgets that contradicts the definition in the preceding five paragraphs, the definition in the preceding five paragraphs is to be considered controlling for purposes of 20 interpreting the specification and claims of the present application.
[00125] Referring to Figure 5C, in some embodiments, once a user has centered and/or enlarged an area of a web page (e.g., block 3914-5), the user may initiate creation of a webclip widget by activating the options icon 3920. The options icon 3920 is an example of an options icon referenced in operation 706 of process 700 (Figure 7A, below). In some 25 embodiments, the user activates the options icon 3920 by performing a tap or other predefined gesture on the options icon 3920.
[00126] As a result of activating the options icon 3920, a user interface such as UI 3900F (Figure 5F) is displayed (e.g., operation 708, Figure 7A), which includes a plurality of icons 3972. In some embodiments, the plurality of icons 3972 includes an icon 3973 for creating a web-clip widget, an icon 3974 for adding a bookmark (e.g., via UI 3900I, Figure 5I), an icon 3975 for emailing a link corresponding to the displayed web page 3912, and a cancel icon 3976 for returning to the previous UI. If the user activates the create web-clip
widget icon 3973, a web-clip widget corresponding to the centered and/or enlarged area of the web page (e.g., block 3914-5 or the entire displayed portion of the web page 3912), will be created (e.g., operations 710 and 712, Figure 7A). Text and/or graphics displayed for the icon 3973 may vary. In some embodiments, for example, the icon 3973 may be labeled “Add to Home Screen.”
[00127] In some embodiments, in response to user activation of the create web-clip widget icon 3973 (Figure 5F), UI 3900H (Figure 5H) will appear and will prompt the user to enter the widget name in text entry box 3960 using the contextual keyboard 616. In some embodiments, the user can access other keyboards that display other symbols by activating 10 the alternate keyboard selector icon 618. In some embodiments, UI 3900H includes an image
3978 of the selected area of the web page. Once the user has completed entering the widget name in the text entry box 3960, the user activates the add-widget icon 3928 and the widget is created. Alternately, the user may activate the cancel icon 3928 to avoid creating the widget.
[00128] In some embodiments, as a result of activating the create web-clip widget 15 icon 3973, a web-clip widget corresponding to the centered and/or enlarged area of the web page will be created and assigned a name without any further actions by a user. In some embodiments, instead of displaying a user interface such as UI 3900H (Figure 5H) for receiving a name, the newly created web-clip widget may be assigned the same name as the web page name 3904.
[00129] An icon corresponding to the newly created widget may be created and displayed on a menu in a UI such as UI 400A or UI 400B (Figure 4A or 4B). Alternatively, the icon and/or the name of the newly created widget may be listed on a UI such as UI 400C (Figure 4C). Subsequent activation of the newly created widget will launch an application (e.g., the browser 147) that will display the web-clip widget. In some embodiments, the web-clip widget is displayed within the browser UI (e.g., UI 3900C, Figure 5C). In some embodiments, the web-clip widget is displayed without other elements of the browser UI (e.g., without elements 3902, 3906, 3908, and/or 3910), such that the web-clip widget appears to be its own mini-application rather than a portion of a web page displayed in a browser. In some embodiments, the web-clip widget is displayed with decorative features such as a decorative frame or a border resembling a torn page. In some embodiments, the decorative features are user-customizable.
[00130] For example, as described above, a user viewing web page 3912 (Figure 5A) may enlarge and center block 3914-5 by performing a tap gesture 3923 (e.g., a single tap or a double tap) on block 3914-5. As a result, block 3914-5 appears enlarged and centered in the browser window, as shown in Figure 5C. The user then may perform gestures (e.g., taps) on the options icon 3920 and the web-clip widget creation icon 3973 (Figure 5F) to create a widget corresponding to block 3914-5, in accordance with some embodiments. In some embodiments, the user then enters a widget name in the text entry box 3960 (Figure 5H) and activates the add-widget icon 3928. A corresponding icon may be created and displayed on a menu such as in UI 400A or 400B (Figure 4A or 4B) or in a list such as in UI 400C (Figure 4C). In some embodiments, subsequent activation of the newly created widget will launch the browser 147, which will display block 3914-5, as shown in UI 3900C (Figure 5C).
[00131] In some embodiments, instead of or in addition to performing a tap gesture 3923 (Figure 5A) to center and enlarge a block, a user may define the area of a web page to be associated with a widget by performing one or more other gestures. Examples of gestures 15 that may be used to define the area of the web page include a tap gesture 3927 or 3935 (Figure 5C) to center an adjacent enlarged block; a multi-touch depinching gesture (3931 and 3933) (Figure 5C) to enlarge the web page; a multi-touch pinching gesture (not shown) to reduce the web page; swipe gestures such as a substantially vertical swipe 3937 (Figure 5C), an upward or diagonal swipe 3939 (Figure 5C), and/or other swipe gestures (not shown) to 20 scroll the web page; and/or a multi-touch rotation gesture (3941 and 3943) to select a portrait or landscape view (Figure 5C).
[00132] In some embodiments, instead of first defining the area of the web page to be associated with the web-clip widget and then activating the options icon 3920 (e.g., Figure 5C) and the create web-clip widget icon 3973 (Figure 5F), a user may first activate the 25 icons 3920 and 3973 and then define the area by performing gestures that are detected by the touch screen display, such as those described above. Once the area has been selected and/or scaled, the user may make a gesture on the touch screen to indicate that the area of the web page to be associated with the widget has been defined.
[00133] In some embodiments, in response to the user activating the create web-clip widget icon 3973 (Figure 5F), the device displays a user interface (e.g., UI 3900K, Figure 5K) that lets the user define the area of the web page to be associated with the widget. The user may define the area using gestures such as the gestures described above with reference to UIs 3900A, 3900C, and 3900D (Figures 5A, 5C, and 5D). In some embodiments, the user interface may include information 3950 to help guide the user. In some embodiments, the user may activate a cancel icon 3952 to abort the widget creation process and may activate an add widget icon 3954 to complete the widget creation process. In some embodiments, a rotation gesture such as multi-touch rotation gesture (3941 and 3943, Figure 5C) rotates the entire UI 3900K, and not just the defined area, from portrait viewing to landscape viewing or vice versa.
[00134] In some embodiments, in response to the user activating the create web-clip widget icon 3973 (Figure 5F), the device displays a user interface (e.g., UI 3900J, Figure 5J) 10 that lets the user define the area of a web page to be associated with a widget by toggling between frames. The frames are successively overlaid on the web page to frame or highlight successive blocks and other structural elements of the web page. For example, in UI 3900J a frame 3958 frames block 2 3914-2. The user may activate a toggle icon 3956 to toggle between successive blocks. Once a block of interest is framed, the user may activate an add 15 widget icon 3954 to create a widget corresponding to the framed block. The user may activate a cancel icon 3952 to end the widget creation process.
[00135] In some embodiments, creating and displaying an icon corresponding to the newly created web-clip widget includes displaying an animation, as illustrated in Figures 6A-6D in accordance with some embodiments. The animation may be displayed, for example, after activation of the add-widget icon 3928 (Figure 5H) or after activation of the create web-clip widget icon 3973 (Figure 5F). In the animation, the selected area of the web page 3912 corresponding to the newly created web-clip widget (e.g., block 3914-5 in UI 3900C) is displayed, as illustrated in Figure 6A. The displayed image is shrunk down, as illustrated for image 602 (Figure 6B), and displayed over a menu of icons. In some embodiments, the menu of icons includes vacant areas (e.g., 604-1 and 604-2, Figure 6B) in which an icon could be displayed but is not currently displayed. The image 602 may be moved (Figure 6C) into the first available vacancy 604-1, where it is displayed as an icon corresponding to the new web-clip widget 149-6-7 (Figure 6D). In some embodiments, the first available vacancy is the left-most vacancy in the highest row with a vacancy. In other embodiments, the image is moved into another vacancy or is appended to the menu after the last (e.g., lowest and rightmost) vacancy.
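The placement rule described above (the left-most vacancy in the highest row that has one) can be illustrated with the following Swift sketch; the grid representation and names are hypothetical.

```swift
// Illustrative only: `true` marks an occupied slot in a row-major icon grid.
func firstAvailableVacancy(in grid: [[Bool]]) -> (row: Int, column: Int)? {
    for (row, slots) in grid.enumerated() {           // highest row first
        if let column = slots.firstIndex(of: false) { // left-most vacancy
            return (row: row, column: column)
        }
    }
    return nil // no vacancy: append after the last icon instead
}

let occupied = [
    [true,  true,  true,  true],
    [true,  false, true,  false],   // first vacancy is row 1, column 1
    [false, false, false, false],
]
if let slot = firstAvailableVacancy(in: occupied) {
    print("first vacancy at row \(slot.row), column \(slot.column)")
} else {
    print("no vacancy: append to end of the menu")
}
```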
[00136] In some embodiments, instead of displaying an animation, the icon corresponding to the newly created web-clip widget is simply displayed in a first available vacancy in a menu of icons or in another available vacancy in the menu, or is appended to the menu.
[00137] Activation of the icon corresponding to the newly created web-clip widget
149-6-7 (e.g., by a gesture 606 (Figure 6E) on the icon, such as a tap gesture) results in display of the corresponding web-clip widget (e.g., display of block 3914-5, as shown in Figure 6A) in the browser application or in its own mini-application without other elements of the browser UI.
[00138] UI 3900L (Figure 5L) is a UI for displaying a portion of two or more web-clip widgets, in accordance with some embodiments. The displayed portion may include a first web-clip widget (e.g., 149-6-1), and may include all or a portion of additional web-clip widgets (e.g., 149-6-2). The displayed portion is scrolled in response to detecting a gesture on the touch screen display, such as a swipe gesture 3962.
[00139] UI 3900M (Figure 5M) is a UI for displaying a web-clip widget (e.g., 149-6-2) in accordance with some embodiments. In response to detecting a gesture on the touch screen display, display of the web-clip widget is ceased and another web-clip widget is displayed. For example, in response to detecting a downward swipe 3962 or a tap gesture 3964 at the top of the displayed widget 149-6-2, display of the web-clip widget 149-6-2 is 20 ceased and a previous user-created widget 149-6-1 is displayed. In response to detecting an upward swipe 3962 or a tap gesture 3966 at the bottom of the displayed widget 149-6-2, display of the web-clip widget 149-6-2 is ceased and a next user-created widget 149-6-3 is displayed. Alternatively, in response to detecting a substantially horizontal right-to-left swipe 3963 or a tap gesture 3965 at the right side of the displayed widget 149-6-2, display of 25 the web-clip widget 149-6-2 is ceased and a next user-created widget 149-6-3 is displayed.
In response to detecting a substantially horizontal left-to-right swipe 3963 or a tap gesture 3967 at the left side of the displayed widget 149-6-2, display of the web-clip widget 149-6-2 is ceased and a previous user-created widget 149-6-1 is displayed.
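Purely as an illustration of the navigation behavior described for UI 3900M, the following Swift sketch maps each detected gesture to the previous or next user-created widget; the enum cases and function name are assumptions, not part of the specification.

```swift
// Illustrative mapping from a detected gesture to the widget shown next.
enum WidgetGesture {
    case swipeDown, tapTopEdge          // show the previous user-created widget
    case swipeUp, tapBottomEdge         // show the next user-created widget
    case swipeRightToLeft, tapRightEdge // next
    case swipeLeftToRight, tapLeftEdge  // previous
}

func widgetIndex(after gesture: WidgetGesture, current: Int, count: Int) -> Int {
    let delta: Int
    switch gesture {
    case .swipeDown, .tapTopEdge, .swipeLeftToRight, .tapLeftEdge:
        delta = -1   // previous widget (e.g., 149-6-1)
    case .swipeUp, .tapBottomEdge, .swipeRightToLeft, .tapRightEdge:
        delta = +1   // next widget (e.g., 149-6-3)
    }
    return min(max(current + delta, 0), count - 1)
}

print(widgetIndex(after: .swipeUp, current: 1, count: 6))     // 2 -> next widget
print(widgetIndex(after: .tapLeftEdge, current: 1, count: 6)) // 0 -> previous widget
```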
[00140] Figure 7A is a flow diagram illustrating a process 700 for creating a web-clip 30 widget from a web page or portion thereof on a portable multifunction device with a touch screen display in accordance with some embodiments. While the web-clip widget creation process 700 described below includes a number of operations that appear to occur in a
specific order, it should be apparent that the process 700 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed, and/or two or more operations may be combined into a single operation.
[00141] In some embodiments, selection of a web page or portion thereof for display is detected (702). For example, one or more finger gestures are detected on the touch screen display to select the web page or portion thereof. In some embodiments, the one or more finger gestures include one or more finger gestures to scale an area in the web page. In some embodiments, the one or more finger gestures include one or more finger gestures to center 10 an area in the web page. Examples of finger gestures used to select, center, and/or scale an area in the web page include a tap gesture 3923 or 3925 to center and enlarge a block (Figure 5A); a tap gesture 3927 or 3935 to center an adjacent enlarged block; a multi-touch depinching gesture (3931 and 3933) to enlarge the web page; a multi-touch pinching gesture (not shown) to reduce the web page; swipe gestures such as a substantially vertical swipe 15 3937, an upward or diagonal swipe 3939, and/or other swipe gestures (not shown) to translate the web page; and/or a multi-touch rotation gesture (3941 and 3943) to select a portrait or landscape view (Figure 5C).
[00142] The web page or portion thereof is displayed (704) on the touch screen display. In the example of Figure 5C, block 3914-5 is displayed on the touch screen display.
[00143] An activation of an options icon (e.g., icon 3920) is detected (706). In some embodiments, detecting activation of the options icon includes detecting a finger gesture (e.g., a tap gesture) on the options icon.
[00144] In response to detecting activation of the options icon, a plurality of icons (e.g., 3972, Figure 5F) is displayed (708) including a web-clip widget creation icon (e.g., icon 25 3973, Figure 5F). In some embodiments, the web-clip widget creation icon includes text, such as Create Web-Clip Widget or Add to Home Screen.
[00145] An activation of the web-clip widget creation icon (e.g., 3973) is detected (710). In some embodiments, detecting activation of the web-clip widget creation icon includes detecting a finger gesture (e.g., a tap gesture) on the web-clip widget creation icon.
[00146] In response to detecting activation of the web-clip widget creation icon, a web-clip widget is created (712) corresponding to the displayed web page or portion thereof.
[00147] In some embodiments, the web-clip widget corresponds to a structural element of the web page, such as a particular block within the web page. In some embodiments, the web-clip widget corresponds to a user-specified rectangle in the web page.
[00148] In some embodiments, creating the web-clip widget includes (714) requesting 5 a name for the web-clip widget, receiving the name, and storing the name. In some embodiments, requesting the name includes displaying a keyboard to receive input for the name. For example, in UI 3900H (Figure 5H), the user is prompted to enter the widget name in the text entry box 3960 using the keyboard 616.
[00149] In some embodiments, creating the web-clip widget includes creating (716) an 10 icon corresponding to the web-clip widget and displaying (718) the icon corresponding to the web-clip widget in a menu (e.g., UI 400A or 400B, Figure 4A or 4B) or list (e.g., UI 400C, Figure 4C) of icons. In some embodiments, the icon corresponding to the web-clip widget is created in response to detecting an activation of an add-widget icon (e.g., icon 3928, Figure 5H). In some embodiments, the icon corresponding to the web-clip widget is created in 15 response to detecting an activation of the web-clip widget creation icon (e.g., 3973, Figure
5F).
[00150] In some embodiments, the menu or list of icons comprises a menu or list of applications and widgets (e.g., UI 400A, Figure 4A) on the multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of widgets on the 20 multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of user-created widgets (e.g., UI 400B or 400C, Figure 4B or 4C) on the multifunction device.
[00151] In some embodiments, the icon corresponding to the web-clip widget is displayed in a previously vacant area in the menu of icons. In some embodiments, the 25 previously vacant area is a first available vacancy (e.g., 604-1, Figure 6B) in the menu of icons. In some embodiments, an animation is displayed of the icon corresponding to the web-clip widget moving into the previously vacant area. For example, Figures 6A-6D illustrate an animation in which an icon corresponding to the web-clip widget 149-6-7 is created and moved into a previously vacant area in UI 600B.
[00152] In some embodiments, the web-clip widget is stored (720) as a bookmark in a browser application. In some embodiments, as described in U.S. Patent Application No. 11/469,838, “Presenting and Managing Clipped Content,” filed on September 1, 2006, which
application is incorporated by reference herein in its entirety, the web-clip widget is encoded as a URL associated with the bookmark.
[00153] In some embodiments, the web-clip widget is sent (722) to a web server for storage. In some embodiments, the web-clip widget stored on the web server is publicly 5 accessible. Storing a user-created web-clip widget on a publicly accessible server allows the user to share the web-clip widget with other users.
[00154] In some embodiments, as illustrated in Figure 7B, an activation of the icon corresponding to the web-clip widget is detected (724). For example, a finger gesture (e.g., a tap gesture 606, Figure 6E) is detected on the icon. In response, the web-clip widget is 10 displayed (726). For example, in response to detecting the tap gesture 606, block 3914-5 is displayed, as illustrated in Figure 6A in the browser application or, as described above, as its own mini-application without other elements of the browser UI.
[00155] Tn some embodiments, as illustrated in Figure 7C, the web-clip widget is sent (728) to an electronic device external to the portable multifunction device. For example, the 15 web-clip widget may be sent to another portable multifunction device 100. The external electronic device stores (730) the web-clip widget, detects an activation (732) of the web-clip widget, and displays the web-clip widget (734). In some embodiments, the web-clip widget is sent to the external electronic device via email. In some embodiments, the web-clip widget is sent to the external electronic device via instant messaging. As used herein, “instant 20 messaging” refers to both telephony-based messages (e.g., messages sent using Multimedia
Message Service (MMS)) and Internet-based messages (e.g., messages sent using Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), or Instant Messaging and Presence Service (IMPS)). Sending a user-created web-clip widget to another electronic device provides the 25 user with a way to share the web-clip widget with other users. Operations 728-734 of Figure 7C may be performed as part of process 700 or may be performed as an independent process.
[00156] In some embodiments, as illustrated in Figure 7D, an activation of a widget editing icon (e.g., edit widget icon 3970, Figure 5M) is detected (736). In response to detecting the activation of the widget editing icon, one or more settings associated with the web-clip widget are displayed (738). In some embodiments, an animation is displayed (740) of flipping the web-clip widget, to reveal the one or more settings. As described in U.S. Patent Application No. 11/145,561, “Presenting Clips of Content,” filed on June 3, 2005, which application is incorporated by reference herein in its entirety, settings (e.g., preferences) associated with a web-clip widget may be displayed by flipping the widget to reveal a user interface to edit the settings. A change to a setting of the one or more settings is received (742). In some embodiments, one or more finger gestures are detected to refocus (744) an area in the web-clip or portion thereof for use by the web-clip widget. As described in the “Presenting Clips of Content” application, the user interface revealed by flipping the widget may include a refocus preference to allow redefinition of the selected area of the web page for use by the web-clip widget. The change is stored (746) and display of the one or more settings is ceased (748). Operations 736-748 of Figure 7D may be performed as part of process 700 or may be performed as an independent process.
[00157] In some embodiments, each operation of process 700 is performed by a portable multifunction device. In some embodiments, however, one or more operations of process 700 are performed by a server system in communication with a portable multifunction device via a network connection. The portable multifunction device may transmit data associated with the widget creation process to the server system and may receive information corresponding to the widget in return. For example, code (e.g., an HTML file, a CSS file, and/or a JavaScript file, in accordance with some embodiments, or an XML file and/or a JavaScript file, in accordance with some other embodiments) associated with the widget may be generated by the server system and then transmitted to the portable multifunction device. In general, operations in the widget creation process may be performed by the portable multifunction device, by the server system, or by a combination thereof.
[00158] Process 700 creates a widget that allows a user to view a specified area in a web page upon activation of the widget. The user thus is spared from having to enlarge and center the area of the web page that is of interest, such as a particular block of interest, each time the user visits the web page.
[00159] Figure 7E is a flow diagram illustrating a process 750 for creating a web-clip widget from a web page or portion thereof in accordance with some embodiments. While the web-clip widget creation process 750 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 750 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
[00160] On a touch screen display of a portable multifunction device, an activation of an options icon (e.g., icon 3920, Figure 5A) is detected (752). In some embodiments, a finger gesture (e.g., a tap gesture) is detected (753) on the options icon.
[00161] An activation of a web-clip widget creation icon (e.g., icon 3973, Figure 5F) is detected (754). In some embodiments, a finger gesture (e.g., a tap gesture) is detected (756) on the web-clip widget creation icon.
[00162] An area in a web page or portion thereof displayed on the touch screen display is selected (758). In some embodiments, selecting the area includes toggling (760) between frames that are successively overlaid on the displayed web page or portion thereof. For example, in UI 3900J (Figure 5J), a frame 3958 is displayed overlaid on the web page 3912 such that it frames block 2 3914-2. Upon activation of a toggle icon 3956, display of the frame 3958 is ceased and another frame is displayed overlaid on the web page 3912 such that it frames another block (e.g., block 3 3914-3). Thus, in some embodiments, the frames successively highlight blocks and other structural elements of the web page. As described in U.S. Patent Application No. 11/620,492, “Selecting and Manipulating Web Content,” filed on
January 5, 2007, which application is incorporated by reference herein in its entirety, structural elements that are displayed in a web page can be identified during the web-clip widget creation process.
[00163] In some embodiments, selecting the area includes detecting (762) one or more finger gestures to select an area in the web page or portion thereof for use by the web-clip widget. In some embodiments, selecting the area includes detecting (764) one or more finger gestures to scale an area in the web page or portion thereof for display by the web-clip widget. Examples of finger gestures used to select and/or scale an area in the web page or portion thereof include a single tap gesture 3923 or 3925 to center and enlarge a block (Figure 5A); a single tap gesture 3927 or 3935 to center an adjacent enlarged block; a multi-touch depinching gesture (3931 and 3933) to enlarge the web page; a multi-touch pinching gesture (not shown) to reduce the web page; swipe gestures such as a substantially vertical swipe 3937, an upward or diagonal swipe 3939, and/or other swipe gestures (not shown) to scroll the web page; and/or a multi-touch rotation gesture (3941 and 3943) to select a portrait or landscape view (Figure 5C).
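The gesture-to-transform mapping just described can be pictured with a short sketch. The following Python fragment is illustrative only and is not part of the specification; the Viewport fields and function names are assumptions made for this example, showing how taps, pinches, and swipes could adjust the visible area of the web page before it is saved as a web-clip widget.

```python
# Hypothetical sketch (not part of the specification) of mapping the gestures in
# paragraph [00163] onto a viewport transform over the web page.

from dataclasses import dataclass

@dataclass
class Viewport:
    x: float      # left edge of the visible area, in page coordinates
    y: float      # top edge of the visible area, in page coordinates
    scale: float  # zoom factor applied to the page

def center_and_enlarge(block_x: float, block_y: float,
                       block_w: float, view_w: float) -> Viewport:
    """Single tap on a block: show the block and zoom so it fills the view width."""
    return Viewport(x=block_x, y=block_y, scale=view_w / block_w)

def pinch(vp: Viewport, factor: float) -> Viewport:
    """De-pinch (factor > 1) enlarges the web page; pinch (factor < 1) reduces it."""
    return Viewport(x=vp.x, y=vp.y, scale=vp.scale * factor)

def swipe(vp: Viewport, dx: float, dy: float) -> Viewport:
    """Swipe gestures scroll the page by the swiped distance (in view pixels)."""
    return Viewport(x=vp.x - dx / vp.scale, y=vp.y - dy / vp.scale, scale=vp.scale)

if __name__ == "__main__":
    vp = center_and_enlarge(block_x=120, block_y=300, block_w=480, view_w=320)
    vp = pinch(vp, 1.5)            # de-pinch to enlarge further
    vp = swipe(vp, dx=0, dy=-200)  # vertical swipe to scroll down
    print(vp)                      # final area to record for the web-clip widget
```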
[00164] A finishing gesture is detected (766). In some embodiments, a finger gesture (e.g., a tap gesture) on an icon (e.g., add widget icon 3954, Figure 5J or 5K) is detected (768). A web-clip widget is created (770) from the selected area.
[00165] In some embodiments, creating the web-clip widget includes requesting a name for the web-clip widget, receiving the name, and storing the name, in accordance with operation 714 of process 700 (Figure 7A).
[00166] In some embodiments, creating the web-clip widget includes creating an icon corresponding to the web-clip widget, in accordance with operation 716 of process 700. In some embodiments, the icon corresponding to the web-clip widget is displayed in a menu or list of icons, in accordance with operation 718 of process 700. In some embodiments, the menu or list of icons comprises a menu or list of applications and widgets on the multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of widgets on the multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of user-created widgets on the multifunction device.
[00167] In some embodiments, an activation of the icon corresponding to the web-clip widget is detected and the web-clip widget is displayed, in accordance with operations 724 and 726 (Figure 7B) of process 700.
[00168] In some embodiments, settings associated with the web-clip widget are edited, in accordance with operations 736-748 (Figure 7D) of process 700.
[00169] In some embodiments, the web-clip widget is stored as a bookmark in a browser application, in accordance with operation 720 of process 700 (Figure 7A).
[00170] In some embodiments, the web-clip widget is sent to a web server for storage, in accordance with operation 722 of process 700. In some embodiments, the web-clip widget is sent to an external electronic device, in accordance with operations 728-734 (Figure 7C) of process 700.
[00171] In some embodiments, each operation of process 750 is performed by a portable multifunction device. In some embodiments, however, one or more operations of process 750 are performed by a server system in communication with a portable multifunction device via a network connection. The portable multifunction device may transmit data associated with the widget creation process to the server system and may receive information corresponding to the widget in return. For example, code (e.g., an
HTML file, a CSS file, and/or a JavaScript file, in accordance with some embodiments, or an XML file and/or a JavaScript file, in accordance with some other embodiments) associated with the widget may be generated by the server system and then transmitted to the portable multifunction device. In general, operations in the widget creation process may be performed by the portable multifunction device, by the server system, or by a combination thereof.
[00172] Process 750, like process 700, creates a widget that allows a user to view a specified area in a web page upon activation of the widget, thus sparing the user from having to enlarge and center the area of the web page that is of interest each time the user visits the web page.
[00173] Figure 7F is a flow diagram illustrating a process 780 for displaying web-clip widgets in accordance with some embodiments. On a touch screen display on a portable multifunction device, an icon is displayed (781) corresponding to a plurality of widgets, including two or more web-clip widgets. For example, in some embodiments, the icon for user-created widget 149-6 (Figure 4A) corresponds to multiple widgets including multiple web-clip widgets.
[00174] An activation of the icon is detected (782). For example, a finger gesture (e.g., a tap gesture) on the icon is detected.
[00175] In response to detecting the activation, a first portion of the two or more web-clip widgets is displayed (783). For example, UI 3900L (Figure 5L) displays a first portion that includes a first user-created widget 149-6-1 and a portion of a second user-created widget 149-6-2. In another example, UI 3900M (Figure 5M) displays a first portion that includes the second user-created widget 149-6-2 and no other widgets or portions thereof. Thus, in some embodiments, the first portion is a first web-clip widget.
[00176] A gesture is detected (784) on the touch screen display. In some embodiments, the gesture is a scrolling gesture. For example, a swipe gesture 3962 (Figures 5L and 5M) or 3963 (Figure 5M) is detected on the touch screen display.
[00177] In response to detecting the gesture, a second portion of the two or more web-clip widgets is displayed (785). In some embodiments, in response to detecting the gesture, a displayed portion of the two or more web-clip widgets is scrolled from the first portion to the second portion. For example, in response to detecting an upward scroll gesture 3962 in UI 3900L (Figure 5L), a second portion is displayed that includes more or all of the second user-created widget 149-6-2 and less or none of the first user-created widget 149-6-1. In some embodiments, the second portion is a second web-clip widget (e.g., the second user-created widget 149-6-2).
[00178] In some embodiments, the gesture is a de-pinching gesture (e.g., gestures 3931 and 3933, Figure 5C). In response to detecting the de-pinching gesture, a displayed portion of the two or more web-clip widgets is zoomed in from the first portion to the second portion.
[00179] In some embodiments, the gesture is a finger tap on an area within the first portion (e.g., a finger tap analogous to gesture 3923, Figure 5A), and the displayed second portion is centered on the area and is zoomed in with respect to the first portion.
[00180] Figure 7G is a flow diagram illustrating a process 790 for displaying web-clip widgets in accordance with some embodiments. On a touch screen display on a portable multifunction device, an icon is displayed (791) corresponding to a plurality of widgets, including two or more web-clip widgets. For example, in some embodiments, the icon for user-created widget 149-6 (Figure 4A) corresponds to multiple widgets including multiple web-clip widgets.
[00181] An activation of the icon is detected (792). For example, a finger gesture (e.g., a tap gesture) on the icon is detected.
[00182] In response to detecting the activation of the icon, a plurality of icons corresponding to respective widgets in the plurality of widgets is displayed (793). In some embodiments, the plurality of icons is displayed in a menu, or in a list. For example, UI 400B (Figure 4B) displays a menu of icons corresponding to user-created widgets 149-6-1 through 149-6-6, and UI 400C (Figure 4C) displays a list of icons corresponding to user-created widgets 149-6-1 through 149-6-6.
[00183] An activation is detected (794) of a respective icon in the plurality of icons corresponding to a respective web-clip widget. In response to detecting the activation of the respective icon, the respective web-clip widget is displayed (795). For example, in response to detecting an activation of an icon corresponding to user-created widget 149-6-2 in UI 400B or UI 400C, user-created widget 149-6-2 is displayed in UI 3900M (Figure 5M).
[00184] A gesture is detected (796) on the touch screen display. For example, a swipe gesture 3962 or 3963 (Figure 5M) is detected on the touch screen display. Alternately, a tap gesture 3964 at the top or a tap gesture 3966 at the bottom of the displayed widget 149-6-2 is detected. In another example, a tap gesture 3965 at the right side or a tap gesture 3967 at the left side of the displayed widget 149-6-2 is detected.
[00185] In response to detecting the gesture, display of the respective web-clip widget is ceased and another web-clip widget is displayed (797). For example, in response to detecting a downward swipe 3962, a substantially horizontal left-to-right swipe 3963, a tap gesture 3967 at the left side of the displayed widget 149-6-2, or a tap gesture 3964 at the top of the displayed widget 149-6-2, a previous user-created widget 149-6-1 is displayed. In response to detecting an upward swipe 3962, a substantially horizontal right-to-left swipe 3963, a tap gesture 3965 at the right side of the displayed widget 149-6-2, or a tap gesture 3966 at the bottom of the displayed widget 149-6-2, a next user-created widget 149-6-3 is displayed.
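The following Python sketch is an illustrative rendering of the navigation just described; it is not taken from the specification. The gesture labels and the navigate function are assumptions, used only to show how each class of gesture could map to the previous or next web-clip widget.

```python
# Hypothetical sketch of the widget-to-widget navigation in paragraph [00185]:
# some gestures advance to the next web-clip widget, others return to the
# previous one.  Gesture labels are assumptions made for this example.

NEXT_GESTURES = {"swipe_up", "swipe_right_to_left", "tap_right_edge", "tap_bottom"}
PREVIOUS_GESTURES = {"swipe_down", "swipe_left_to_right", "tap_left_edge", "tap_top"}

def navigate(widgets, current_index, gesture):
    """Return the index of the widget to display after the gesture."""
    if gesture in NEXT_GESTURES:
        return min(current_index + 1, len(widgets) - 1)
    if gesture in PREVIOUS_GESTURES:
        return max(current_index - 1, 0)
    return current_index  # unrecognized gesture: keep showing the same widget

if __name__ == "__main__":
    widgets = ["149-6-1", "149-6-2", "149-6-3"]
    i = 1                                   # widget 149-6-2 is displayed
    i = navigate(widgets, i, "swipe_up")    # advance to the next widget
    print(widgets[i])                       # -> 149-6-3
```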
[00186] Processes 780 and 790 thus provide user-friendly ways to view multiple specified areas in web pages without having to surf between successive web pages and without having to enlarge and center an area of interest in each web page.
[00187] Figure 7H is a flow diagram illustrating a process 7000 for displaying a web-clip widget in accordance with some embodiments. On a touch screen display on a portable multifunction device, an icon for a web-clip widget (e.g., 149-6-7, Figure 6E) is displayed (7002). The web-clip widget corresponds to a user-specified area of a web page (e.g., block 3914-5, Figure 6A).
[00188] In some embodiments, the icon is displayed (7004) in a menu or list of icons.
In some embodiments, the menu or list of icons comprises a menu or list of applications and widgets (e.g., UI 400A, Figure 4A) on the multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of widgets on the multifunction device. In some embodiments, the menu or list of icons comprises a menu or list of user-created widgets (e.g., UI 400B or 400C, Figure 4B or 4C) on the multifunction device.
[00189] In some embodiments, the user-specified area was previously selected by translating and scaling (7006) a displayed portion of the web page. In some embodiments, the user-specified area was previously selected by centering and enlarging (7008) a displayed portion of the web page. Examples of finger gestures used to translate, scale, center, and/or enlarge an area in the web page include a tap gesture 3923 or 3925 to center and enlarge a block (Figure 5A); a tap gesture 3927 or 3935 to center an adjacent enlarged block; a multi-touch depinching gesture (3931 and 3933, Figure 5C) to enlarge the web page; a multi-touch pinching gesture (not shown) to reduce the web page; swipe gestures such as a substantially vertical swipe 3937, an upward or diagonal swipe 3939, and/or other swipe gestures (not shown) to translate the web page; and/or a multi-touch rotation gesture (3941 and 3943, Figure 5C) to select a portrait or landscape view.
[00190] An activation of the icon is detected (7010). In some embodiments, a finger gesture (e.g., a tap gesture 606, Figure 6E) is detected (7012) on the icon.
[00191] In response to detecting activation of the icon, the user-specified area of the web page is displayed (7014). For example, in response to activation of the icon for the web-clip widget 149-6-7 (Figure 6E), block 3914-5 is displayed (Figure 6A).
[00192] The process 7000 allows a user to view a specified area in a web page upon activation of the corresponding icon. The user thus is spared from having to enlarge and center the area of the web page that is of interest, such as a particular block of interest, each time the user visits the web page.
Icon Display and Interface Reconfiguration

[00193] Figures 8A-8D illustrate exemplary user interfaces for displaying icons in accordance with some embodiments. Figures 9A and 9B are flow diagrams of an icon display process 900 in accordance with some embodiments. The process is performed by a computing device with a touch screen display (e.g., portable multifunction device 100). The process provides a simple intuitive way for a user to view a large number of icons (e.g., multiple pages of application icons and web-clip widget icons) on a touch screen display.
[00194] The computing device displays (902) a first set of a first plurality of icons in a first area of the touch screen display (e.g., area 802, Figure 8A). The first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display. For example, in Figures 8A-8C, icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, 149-6, 149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5, 149-6-6, 149-6-7, 149-6-8, 149-6-9, 149-6-10, 149-6-11, 149-6-12, 149-6-13, 149-6-14, and 149-6-15 are a first plurality of icons in area 802. Icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, and 149-6 form a first set in area 802 in Figure 8A; icons 149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5, and 149-6-6 form a second set in area 802 in Figure 8B; and icons 149-6-7, 149-6-8, 149-6-9, 149-6-10, 149-6-11, 149-6-12, 149-6-13, 149-6-14, and 149-6-15 form a third set in area 802 in Figure 8C. In this context, “separately displayed” means when one of the sets is displayed, the other sets are not concurrently displayed, except possibly during a brief transition from one set of icons to the next (e.g., an animation). As this example illustrates, the first and second sets of the first plurality of icons are distinct sets of icons.
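A simple data model, shown below as an illustrative Python sketch rather than anything mandated by the specification, captures the relationship described in this paragraph: the first area holds several sets of icons that are displayed one at a time, while the tray (the second area) is always visible. The field names are assumptions.

```python
# Illustrative data model (an assumption, not taken from the specification) for
# the icon layout in paragraph [00194].

home_screen = {
    "first_area_sets": [            # only one set is visible at any moment
        ["141", "148", "144", "143", "155", "149-2", "154",
         "149-1", "149-4", "149-3", "153", "412", "149-6"],
        ["149-6-1", "149-6-2", "149-6-3", "149-6-4", "149-6-5", "149-6-6"],
        ["149-6-7", "149-6-8", "149-6-9", "149-6-10", "149-6-11",
         "149-6-12", "149-6-13", "149-6-14", "149-6-15"],
    ],
    "current_set": 0,                         # set currently shown in area 802
    "tray": ["138", "140", "147", "152"],     # second plurality, always displayed
}

def visible_icons(state):
    """Icons actually on screen: the current set plus the tray."""
    return state["first_area_sets"][state["current_set"]] + state["tray"]

print(visible_icons(home_screen))
```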
[00195] In some embodiments, the first plurality of icons includes a plurality of application launch icons, wherein in response to detecting activation of an application launch icon in the plurality of application icons, an application that corresponds to the activated application icon is launched and displayed. In some embodiments, the applications include a default set of applications, third-party applications, and/or web-clip widget applications. The application launch icons are not for issuing commands or subcommands with an application.
Rather, they are for launching applications. If an application is already launched, then activation of the corresponding application launch icon results in display of the application.
[00196] In some embodiments, the first plurality of icons includes one or more web-clip widget icons (e.g., widget icon 149-6, Figure 8A), wherein in response to detecting activation of a web-clip widget icon, a portion of a web page that corresponds to the activated web-clip widget icon is displayed.
[00197] The computing device displays (904) a second plurality of icons in a second area (e.g., tray 408, Figure 8A) on the touch screen display while displaying icons in the first plurality of icons in the first area. For example, in Figures 8A-8C, application launch icons 138, 140, 147, and 152 are displayed in tray 408. The second area is different (e.g., visually distinct) from the first area. For example, tray 408 is different from area 802 in Figure 8A. In some embodiments, the second plurality of icons correspond to applications or functions that are frequently used by a user.
[00198] In some embodiments, the second plurality of icons includes a plurality of application launch icons, wherein in response to detecting activation of an application icon in the plurality of application icons, an application that corresponds to the activated application icon is launched and/or displayed, as explained above. In some embodiments, the applications include a default set of applications, third-party applications, and/or web-clip widget applications.
[00199] The computing device detects (906) a first finger gesture on the touch screen display in the first area. In some embodiments, the first finger gesture is a swipe gesture (e.g., swipe 808, Figure 8A). In some embodiments, the swipe gesture is a horizontal (or substantially horizontal) swipe gesture on the touch screen display, from left to right or from right to left on the touch screen display. In some embodiments, the swipe gesture is a vertical (or substantially vertical) swipe gesture on the touch screen display.
[00200] In response to detecting the first finger gesture on the touch screen display in the first area, the computing device replaces (908) display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display. For example, in response to swipe 808, UI 800A (Figure 8A) transitions to UI 800B (Figure 8B). The first set of icons (141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, and 149-6 in area 802, Figure 8A) are replaced by a second set of icons (149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5, and 149-6-6 in area 802, Figure 8B) while the display of the second plurality of icons (138, 140, 147, and 152) is maintained.
[00201] In some embodiments, replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display comprises an animation that moves the first set out of the first area and the second set into the first area.
[00202] In some embodiments, the plurality of sets of icons includes a number of sets of icons that are configured to be separately displayed as a sequence of sets of icons in the first area of the touch screen display. In some embodiments, the computing device displays two or more set-sequence-indicia icons (e.g., icons 804-1, 804-2, and 804-3 in Figures 8A-8D). The set-sequence-indicia icons provide information about the number of sets of icons in the plurality of sets of icons and a position of a displayed set of icons in the sequence of sets of icons. In response to detecting the first finger gesture, the computing device updates (910) the information provided by the set-sequence-indicia icons to reflect the replacement of the displayed first set by the second set. For example, set-sequence-indicia icons 804-1, 804-2, and 804-3 in Figures 8A-8D indicate that there are three sets of icons in the plurality of sets of icons. The set-sequence-indicia icons 804-1, 804-2, and 804-3 also indicate a position of a displayed set of icons in the sequence of sets of icons. For example, the set-sequence-indicia icons are displayed in a sequence, with the icon that corresponds to the set that is currently displayed being visually distinguished from the other set-sequence-indicia icons (e.g., icon 804-1 is darkened in Figure 8A when the first set is displayed, icon 804-2 is darkened in Figure 8B when the second set is displayed, and icon 804-3 is darkened in Figure 8C when the third set is displayed).
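The paging behaviour of operations (906)-(910) can be summarized in a short sketch. The Python below is a hypothetical illustration, not the claimed implementation; the handle_first_area_swipe function and the state fields are assumptions. It replaces the displayed set on a swipe, leaves the tray untouched, and updates the set-sequence-indicia so the marker for the displayed set is darkened.

```python
# Hypothetical sketch of operations (906)-(910): a swipe in the first area
# replaces the displayed set of icons and updates the set-sequence-indicia
# icons (cf. icons 804-1 .. 804-3), while the tray is left unchanged.

def handle_first_area_swipe(state, direction):
    """direction is +1 (e.g., right-to-left swipe) or -1 (left-to-right swipe)."""
    n_sets = len(state["first_area_sets"])
    new_index = state["current_set"] + direction
    if 0 <= new_index < n_sets:
        state["current_set"] = new_index          # replace the displayed set
    # One indicia marker per set; the darkened marker tracks the displayed set.
    state["indicia"] = ["dark" if i == state["current_set"] else "light"
                        for i in range(n_sets)]
    return state

state = {"first_area_sets": [["A"], ["B"], ["C"]], "current_set": 0,
         "tray": ["phone", "mail", "browser", "ipod"]}
state = handle_first_area_swipe(state, +1)
print(state["current_set"], state["indicia"])  # -> 1 ['light', 'dark', 'light']
```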
[00203] In some embodiments, the computing device detects (912) a second finger gesture on an icon in the second set of the first plurality of icons. In response to detecting the second finger gesture, the computing device displays (914) an application that corresponds to the icon in the second set upon which the second finger gesture was detected. For example, in response to a finger tap gesture 814 (Figure 8B), user-created widget 149-6-5 is displayed.
[00204] In some embodiments, the computing device detects (916) a third finger gesture on the touch screen display while the second set of the first plurality of icons are displayed. In response to detecting the third finger gesture, the computing device replaces (918) display of the second set of the first plurality of icons with display of a third set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display. For example, in response to detecting swipe 812 (Figure 8B), the computing device replaces (918) display of the second set of the first plurality of icons (icons 149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5, and 149-6-6, Figure 8B) with display of a third set of the first plurality of icons (icons 149-6-7, 149-6-8, 149-6-9, 149-6-10, 149-6-11, 149-6-12, 149-6-13, 149-6-14, and 149-6-15, Figure 8C) in area 802 on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display (icons 138, 140, 147, and 152 in tray 408).
[00205] In some embodiments, the computing device detects (920) a fourth finger gesture on an icon in the third set of the first plurality of icons. In response to detecting the fourth finger gesture, the computing device displays (922) an application that corresponds to the icon in the third set upon which the fourth finger gesture was detected. For example, in response to a finger tap gesture 816 (Figure 8C), user-created widget 149-6-11 is displayed.
[00206] In some embodiments, the first finger gesture is a swipe gesture in a first direction and the computing device detects (924) a second finger swipe gesture on the touch screen display in a direction that is opposite (or substantially opposite) the first direction. In response to detecting the second finger swipe gesture, the computing device replaces (926) display of the first set of the first plurality of icons with a display of information, other than a set in the plurality of sets of icons, customized to a user of the device. In some embodiments, the customized information includes: local time, location, weather, stocks, calendar entries, and/or recent messages for the user. For example, in response to detecting finger swipe gesture 810 (Figure 8A), the computing device replaces (926) display of the first set of the first plurality of icons (icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, and 149-6, Figure 8A) with a display of information, other than a set in the plurality of sets of icons, customized to a user of the device (e.g., local time, location, weather, stocks, calendar entries, and recent messages for the user in area 802, Figure 8D).
[00207] In some embodiments, the first finger gesture is a swipe gesture (e.g., swipe 808, Figure 8A) in a first direction and the computing device detects (924) a second finger swipe gesture (e.g., swipe 810, Figure 8A) on the touch screen display in a direction that is opposite (or substantially opposite) the first direction. In response to detecting the second finger swipe gesture, the computing device replaces (926) display of the first set of the first plurality of icons with a display of information, other than a set in the plurality of sets of icons, customized to a user of the device, and updates (928) the information provided by a customized-information indicia icon (e.g., icon 806, Figures 8A-8D) and the set-sequence-indicia icons (e.g., icons 804) to reflect the replacement of the displayed first set by the information customized to the user (e.g., icon 806 is darkened in Figure 8D and none of the set-sequence-indicia icons 804 are darkened). In some embodiments, the customized-information indicia icon and the set-sequence-indicia icons have the same visual appearance (e.g., all are circles, not shown). In some embodiments, the customized-information indicia icon and the set-sequence-indicia icons are visually distinct (e.g., the customized-information indicia icon 806 is a star and the set-sequence-indicia icons 804 are circles). In some embodiments, the customized-information indicia icon 806 and the set-sequence-indicia icons 804 are adjacent to each other (e.g., as shown in Figures 8A-8D).
[00208] Attention is now directed towards interface reconfiguration. In response to a user initiating an interface reconfiguration mode, positions of one or more icons displayed on the portable device may be varied about respective average positions. The varying of the positions of the one or more icons may include animating the one or more icons to simulate floating of the one or more icons on a surface corresponding to a surface of a display in the portable device. The display may be a touch-sensitive display, which responds to physical contact by a stylus or one or more fingers at one or more contact points. While the following embodiments may be equally applied to other types of displays, a touch-sensitive display is used as an illustrative example.
[00209] The varying of the positions of the one or more icons may intuitively indicate to the user that the positions of the one or more icons may be reconfigured by the user. The user may modify, adapt and/or reconfigure the positions of the one or more icons. In embodiments where the portable device includes a touch-sensitive display, the user may make contact with the touch-sensitive display proximate to a respective icon at a first position. Upon making contact with the touch-sensitive display, the respective icon may cease varying its position. The user may drag the respective icon to a second position. Upon breaking contact with the touch-sensitive display, the respective icon may resume varying its position. In some embodiments, the respective icon can be thrown, so that the final position of the respective icon is different from the point at which the icon is released. In this embodiment, the final position can depend on a variety of factors, such as the speed of the throw, the parameters used in a simulated equation of motion for the throw (e.g., coefficient of friction), and/or the presence of a layout grid with simulated attractive forces. In some embodiments, the display may include two regions. During the interface reconfiguration mode, positions of one or more icons displayed in the first region may be varied while positions of one or more icons displayed in the second region may be stationary.
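As an illustration of the "throw" behaviour, the sketch below integrates a simple constant-friction model until the icon comes to rest. This is an assumed model chosen for the example, not the equation of motion required by the specification; the function name, friction value, and frame rate are all illustrative.

```python
# A sketch, under assumed physics, of the "throw" in paragraph [00209]: when an
# icon is released with some velocity, its final resting position depends on
# that velocity and a simulated coefficient of friction.

import math

def thrown_icon_rest_position(release_pos, release_vel, friction=0.05, dt=1 / 60):
    """Integrate a friction-decelerated motion until the icon effectively stops."""
    x, y = release_pos
    vx, vy = release_vel
    while math.hypot(vx, vy) > 1.0:          # stop once speed is negligible (px/s)
        x += vx * dt
        y += vy * dt
        vx *= (1.0 - friction)               # friction bleeds off velocity each frame
        vy *= (1.0 - friction)
    return (round(x), round(y))

# Icon released at (100, 200) while moving quickly to the right and slightly down.
print(thrown_icon_rest_position((100, 200), (600, 120)))
```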
[00210] The user may similarly modify, adapt and/or reconfigure the positions of additional icons during the interface reconfiguration mode. When the user has completed these changes (at least for the time being), he or she may terminate the interface reconfiguration mode. In response to this user action, the portable device may return to a normal mode of operation and the varying of the displayed positions of the one or more icons will cease.
[00211] The user may initiate or terminate the interface reconfiguration process by selecting one or more appropriate physical buttons on the portable device (e.g., menu button 204, Figure 2), by a gesture (such as making contact and swiping one or more fingers across the touch-sensitive display or making contact and holding for more than a predefined time period) and/or by selecting one or more soft buttons (such as one or more icons that are displayed on the touch-sensitive display). As used herein, a gesture is a motion of the object/appendage making contact with the touch screen display surface. Exemplary gestures include finger tap gestures and finger swipe gestures. In some embodiments, the interface reconfiguration process terminates a pre-defined time after the interface reconfiguration process is initiated, i.e., there is a time out.
[00212] The one or more icons displayed on the portable device may be graphical objects. In some embodiments, the one or more icons may be on-screen representations of controls that may be manipulated by the user, such as bars, buttons and text boxes. In some embodiments, the one or more icons correspond to application programs (email, browser, address book, etc.) and/or web-clip widgets that may be selected by the user by contacting the touch-sensitive display proximate to an icon of interest.
[00213] Figure 10 is a flow diagram of a position adjustment process 1000 for a portable multifunction device in accordance with some embodiments. While the position adjustment process 1000 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1000 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
[00214] In the position adjustment process 1000, a plurality of icons are displayed in a GUI in a touch-sensitive display (1002). A first predefined user action that initiates an interface reconfiguration process is detected (1004). Exemplary predefined user actions include selecting a physical button on the portable device, making a predefined gesture on the touch screen display surface, or selecting a soft button. Position(s) of one or more of the plurality of displayed icons are varied about respective average position(s) (1006). A point of contact with the touch-sensitive display at a first position of a respective icon is detected (1008). Movement of the point of contact to a second position is detected (1010). Movement of the respective icon to the second position is displayed and the respective icon is displayed at the second position (1012).
[00215] If a second predefined user action that terminates the interface reconfiguration process is detected (1014-yes), the position(s) of the one or more icons is fixed (1016). Exemplary predefined user actions include selecting or deselecting a physical button on the portable device (e.g., menu button 204, Figure 2), making another predefined gesture on the touch screen display surface, or selecting or deselecting a soft button. The fixed position(s) may correspond to a respective average position(s) for the one or more icons. If a second pre-defined user action that terminates the interface reconfiguration process is not detected (1014-no), the process may continue when a point of contact proximate to the same or another icon is detected (1008).
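Process 1000 can be read as a small state machine. The Python sketch below is a hypothetical summary of operations 1002-1016, not the specified implementation; the event names and the Icon class are assumptions introduced for the example.

```python
# Hypothetical sketch of position adjustment process 1000: enter reconfiguration
# mode on a predefined user action, let icons wobble about their average
# positions, move an icon the user drags, and fix positions on termination.

import random
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float
    y: float

def run_reconfiguration(icons, events):
    reconfiguring = False
    dragged = None
    for event in events:
        kind = event[0]
        if kind == "long_press":                 # first predefined action (1004)
            reconfiguring = True
        elif not reconfiguring:
            continue
        elif kind == "tick":                     # vary positions about averages (1006)
            for icon in icons:
                if icon is not dragged:
                    icon.x += random.uniform(-1, 1)
                    icon.y += random.uniform(-1, 1)
        elif kind == "touch_down":               # contact at an icon (1008)
            dragged = min(icons, key=lambda i: abs(i.x - event[1]) + abs(i.y - event[2]))
        elif kind == "touch_move" and dragged:   # move to second position (1010-1012)
            dragged.x, dragged.y = event[1], event[2]
        elif kind == "menu_button":              # terminating action (1014-1016)
            reconfiguring = False
            dragged = None                       # positions are now fixed
    return icons

icons = [Icon("stocks", 40, 80), Icon("browser", 120, 80)]
events = [("long_press",), ("tick",), ("touch_down", 40, 80),
          ("touch_move", 200, 160), ("menu_button",)]
print(run_reconfiguration(icons, events))
```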
[00216] Figures 11A-11OO illustrate exemplary user interfaces during interface reconfiguration in accordance with some embodiments.
[00217] In some embodiments, the user interface on the touch screen display 112 is divided into multiple sections or windows. For example, in Figure 11A, a region of UI 1100A may include a tray 408 for holding icons or graphical objects representing functions that are frequently used by the user (e.g., phone 138, mail 140, and browser 147) and a region or area 802 for holding icons or graphical objects representing functions that are used less frequently by the user (e.g., IM 141, calendar 148, image management 144, etc.).
[00218] Figures 11B-11D illustrate the portable multifunction device 100 during the interface reconfiguration mode in accordance with some embodiments. After the interface reconfiguration mode is initiated, the display of one or more of the icons in the area 802 is modified from the previous stationary positions to time-varying positions. As noted previously, the display may include animating one or more of the icons to simulate floating of one or more of the icons on a surface corresponding to the display surface. For example, the animated varying of the positions of one or more of the icons during the interface reconfiguration mode may resemble that of a hockey puck in an air hockey game. The displayed position(s) of a respective icon in the icons may be varied in a region 1104 (Figure 11B) centered on the average position of the respective icon.
[00219] While Figures 11B-11D illustrate movement of one or more of the icons in the area 802, in other embodiments positions of one or more of the icons in another region of the user interface, such as tray 408, may be varied separately or in addition to those of one or more of the icons in area 802.
[00220] The time-varying position(s) of one or more of the icons in area 802 intuitively indicate to the user that the positions of one or more of the icons may be modified. This is illustrated in Figures 11C-11D, which show the repositioning of an icon during the interface reconfiguration mode. The user makes contact with one of the icons that is moving at a position 1108 and moves the point of contact across the display surface. The contact and the motion are detected by the portable multifunction device 100. As a consequence, the displayed icon, in this example corresponding to a stocks application 149-2, is moved accordingly.
[00221] As shown in Figure 11D, the user moves the stocks application icon 149-2 to position 1110 and breaks contact with the display surface. The stocks application icon 149-2 is now displayed at the position 1110. While the displayed position of the stocks application icon 149-2 is shown as stationary in Figure 11D, in some embodiments the position of the stocks application icon 149-2 may be varied once the user breaks contact with the display surface. In some embodiments, only icons displayed in one or more subsections of the user interface are displayed with a varying position during the interface reconfiguration mode.
Thus, if the stocks application icon 149-2 had been dragged to another position in the area 802, it may be displayed with a varying position after the user breaks contact with the display. In some embodiments, the device may provide audio and/or tactile feedback when an icon is moved to a new position, such as an audible chime and/or a vibration.
[00222] Figure 11D also illustrates the optional displacement of the browser icon 147 to position 1112. The browser icon 147 was displaced from its initial position to its new position 1112 due to at least partial overlap with the stocks application icon 149-2, i.e., when the portable multifunction device 100 determined that the user positioned the stocks application icon 149-2 over the browser icon 147, the displayed position of the browser icon 147 was changed.
[00223] In other embodiments, an icon may be evicted or removed from the tray 408 when an additional icon, such as the iPod icon 152, is added to the tray 408. For example, the tray 408 may be configured to accommodate a finite number of icons, such as 4 icons. If an additional icon is added to the tray 408, a nearest icon to the additional icon or an icon that at least partially overlaps the additional icon may be evicted or removed from the tray 408. In some embodiments, the evicted icon floats or zooms from its position in tray 408 to a new position in area 802, where it may join a sorted list of icons. In some embodiments, if the eviction process is not completed (e.g., the additional icon is not added to tray 408), the evicted icon may halt its progress towards its new position in area 802 and return to its position in tray 408.
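The eviction rule can be sketched as follows. This Python fragment is illustrative only; the capacity of 4 matches the example in the paragraph above, but the function name and the tie-breaking choice of which icon to evict are assumptions.

```python
# Hypothetical sketch of the tray-eviction rule in paragraph [00223]: the tray
# holds a finite number of icons, and adding one more evicts the overlapped
# icon (or, failing that, another icon), which then returns to area 802.

TRAY_CAPACITY = 4

def add_to_tray(tray, area_802, new_icon, overlapped=None):
    """Add new_icon to the tray, evicting an existing icon if the tray is full."""
    if len(tray) >= TRAY_CAPACITY:
        evicted = overlapped if overlapped in tray else tray[-1]
        tray.remove(evicted)
        area_802.append(evicted)          # evicted icon floats back into area 802
    tray.append(new_icon)
    return tray, area_802

tray = ["phone", "mail", "browser", "ipod"]
area_802 = ["stocks", "calendar"]
tray, area_802 = add_to_tray(tray, area_802, "camera", overlapped="browser")
print(tray)       # -> ['phone', 'mail', 'ipod', 'camera']
print(area_802)   # -> ['stocks', 'calendar', 'browser']
```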
[00224] Figure 11E illustrates the user interface after the interface reconfiguration mode has been terminated or has terminated (due to a time out) in accordance with some embodiments. The icons in UI 1100E have stationary positions. The stocks application icon 149-2 and the browser icon 147 are displayed in their new positions in the tray 408.
[00225] The animated effects during the interface reconfiguration mode, such as the varying position(s) of one or more of the icons, may be in accordance with corresponding equations of motion for one or more of the icons in a plane substantially coincident with the
touch screen display surface. The equations of motion may have a coefficient of friction less than a threshold allowing the simulation and/or animation of floating or sliding of one or more of the icons. The equation of motion for the respective icon may have a non-zero initial velocity, a non-zero angular velocity, and/or a restoring force about the respective average position of the respective icon such that the position of the respective icon oscillates in the region 1104 (Figure 11B) substantially centered on the respective average position of the respective icon.
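A minimal numerical example of such an equation of motion is given below: a restoring force pulls the icon toward its average position while a small friction term damps the motion, so the icon oscillates within a bounded region. The constants and the explicit integration scheme are illustrative assumptions, not values taken from the specification.

```python
# A numerical sketch of the equation of motion in paragraph [00225]: a damped
# spring about the icon's average position, so the icon oscillates in a region
# centred on that average position.

def simulate_oscillation(average_pos, start_pos, steps=240, dt=1 / 60,
                         stiffness=40.0, friction=0.8):
    """Damped spring about average_pos; returns the largest excursion seen."""
    x, v = start_pos, 0.0
    max_excursion = 0.0
    for _ in range(steps):
        displacement = x - average_pos
        accel = -stiffness * displacement - friction * v   # restoring force + friction
        v += accel * dt
        x += v * dt
        max_excursion = max(max_excursion, abs(x - average_pos))
    return max_excursion

# The icon starts 8 px from its average position and stays within roughly 8 px of it.
print(simulate_oscillation(average_pos=100.0, start_pos=108.0))
```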
[00226] In some embodiments, the position of the respective icon may be varied during the interface reconfiguration mode in such a way that the respective icon rotates about the respective average position of the respective icon while maintaining a fixed orientation with respect to the user interface and the portable electronic device 100. This is illustrated in Figures 11F and 11G. In this example, the position of the online video icon 155 in area 802 is varied in such a way that it maintains a fixed orientation in region 1104. This may make it easier for the user to determine the function of the respective icon during the interface reconfiguration mode.
[00227] Figures 12A-12F are flow diagrams of icon reconfiguration processes 1200 in accordance with some embodiments. The processes are performed by a computing device with a touch screen display (e.g., portable multifunction device 100). While the icon reconfiguration processes 1200 described below include a number of operations that appear to occur in a specific order, it should be apparent that the processes 1200 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
[00228] The computing device displays (1202) a first set of a first plurality of icons in a first area of the touch screen display (e.g., area 802, Figure 11H). The first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display. For example, in Figures 11H-11OO, icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, 152, 149-6-20, 149-6-21, 149-6-22, 149-6-30, 149-6-31, 149-6-32, 149-6-33, 149-6-34, 149-6-35, 149-6-40, 149-6-41, 149-6-42, 149-6-43, 149-6-44, and 149-6-45 are a first plurality of icons in area 802. Icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, 152, 149-6-20, 149-6-21, and 149-6-22 form a first set in area 802 in Figure 11H; icons 149-6-30, 149-6-31, 149-6-32, 149-6-33, 149-6-34 and 149-6-35 form a second set in area 802 in Figure 11Z; and icons 149-6-40, 149-6-41, 149-6-42, 149-6-43, 149-6-44, and 149-6-45 form a third set in area 802 in Figure 11KK. In this context, “separately displayed” means when one of the sets is displayed, the other sets are not concurrently displayed, except possibly during a brief transition from one set of icons to the next (e.g., an animation). As this example illustrates, respective sets in the first plurality of icons are distinct sets of icons, although an icon can be moved from one set to another set during the icon reconfiguration process (e.g., as described below using calculator icon 149-3 as an example).
[00229] In some embodiments, the first plurality of icons includes a plurality of application launch icons, wherein in response to detecting activation of an application icon in the plurality of application icons when the user interface reconfiguration process is not active, an application that corresponds to the activated application icon is launched and displayed. In some embodiments, the applications include a default set of applications, third-party applications, and/or web-clip widget applications. As noted above, the application launch icons are not for issuing commands or subcommands with an application. Rather, they are for launching applications. If an application is already launched, then activation of the corresponding application launch icon results in display of the application.
[00230] In some embodiments, the first plurality of icons includes one or more web-clip widget icons (e.g., web-clip widget icons 149-6-20, 149-6-21, and 149-6-22, Figure 11H), wherein in response to detecting activation of a web-clip widget icon when the user interface reconfiguration process is not active, a portion of a web page that corresponds to the activated web-clip widget icon is displayed.
[00231] The computing device displays (1204) a second plurality of icons in a second area on the touch screen display (e.g., tray 408, Figure 11H) while displaying icons in the first plurality of icons in the first area. The second area is different (e.g., visually distinct) from the first area. For example, tray 408 is different from area 802 in Figure 11H. In some embodiments, the second plurality of icons correspond to applications or functions that are frequently used by a user.
[00232] In some embodiments, the second plurality of icons includes a plurality of application launch icons, wherein in response to detecting activation of an application icon in the plurality of application icons when the user interface reconfiguration process is not active, an application that corresponds to the activated application icon is launched and/or displayed, as explained above. In some embodiments, the applications include a default set of applications, third-party applications, and/or web-clip widget applications.
[00233] The computing device detects (1206) a first finger gesture on the touch screen display. In some embodiments, the first finger gesture is a stationary (or substantially stationary) contact with an icon in the first set of the first plurality of icons (e.g., gesture 1114 on stocks icon 149-2, Figure 11H) for greater than a predetermined time (e.g., 0.5-2.0 seconds). In some embodiments, the first finger gesture is on an edit icon (not shown). In some embodiments, the first finger gesture is on any application icon.
[00234] In response to detecting the first finger gesture, the computing device initiates a user interface reconfiguration process, and varies positions of one or more icons in the first set of the first plurality of icons about respective average positions (1208). In some embodiments, in response to detecting the first finger gesture, the computing device also varies (1210) positions of one or more icons in the second plurality of icons about respective average positions (e.g., UI 1100I, Figure 11I).
[00235] In some embodiments, the varying includes animating the one or more icons to simulate floating of the one or more icons on a surface corresponding to a surface of the touch screen display.
[00236] In some embodiments, the varying position of a respective icon in the one or more icons corresponds to an equation of motion in a plane substantially coincident with the touch screen display, the equation of motion having a coefficient of friction less than a threshold. In some embodiments, the equation of motion for the respective icon has a non-zero initial velocity. In some embodiments, the equation of motion for the respective icon has a restoring force about a respective average position of the respective icon such that the position of the respective icon oscillates in a region substantially centered on the respective average position of the respective icon. In some embodiments, the equation of motion for the respective icon includes a non-zero angular velocity. In some embodiments, the respective icon rotates about the respective average position of the respective icon while maintaining a fixed orientation with respect to the touch screen display.
[00237] In some embodiments, the varying includes randomly varying each icon in the first set of the first plurality of icons about a respective average position.
[00238] In some embodiments, icons displayed in at least one of the first area and the second area include icons that may be deleted by a user and icons that may not be deleted by the user. In some embodiments, the computing device visually distinguishes (1212) the icons that may be deleted by the user from the icons that may not be deleted by the user; detects (1214) one or more finger gestures corresponding to a request to delete an icon that may be deleted by the user; and, in response to detecting the one or more finger gestures corresponding to the request to delete the icon, deletes (1216) the icon. For example, in Figure 11I, only the web clip widgets 149-6 may be deleted, so these icons have circled X deletion icons 1116 next to them to visually indicate that these icons may be deleted. In response to detecting a finger gesture on the deletion icon 1116 (Figure 11I) for icon 149-6-22 (Figure 11I), icon 149-6-22 is deleted (Figure 11J).
[00239] In some embodiments, third party applications and web clip widgets may be deleted, but core or default applications may not be deleted. In some embodiments, if the device is reset, the default applications are displayed in the first set in area 802 and in tray 408, with the third party applications and web clip widgets deleted. In some embodiments, if the device is reset, the default applications are displayed in the first set in area 802 and in tray 408, with the third party applications and web clip widgets displayed after the default applications in the first set in area 802. In some embodiments, if the device is reset, the default applications are displayed in the first set in area 802 and in tray 408, with the third party applications and web clip widgets displayed in a second set in area 802.
[00240] In some embodiments, the computing device detects (1218) a user making a point of contact with the touch screen display at a first position corresponding to a first icon in the first set and detects movement of the point of contact to a second position on the touch screen display. In response to detecting the point of contact and detecting movement of the point of contact, the computing device displays (1220) movement of the first icon to the second position on the touch screen display and displays the first icon at the second position. In some embodiments, the second position is in the first area. For example, in response to detecting point of contact 1118 on stocks icon 149-2 (Figure 11J) and detecting movement of the point of contact, the computing device displays (1220) movement of the stocks icon 149-2 to the second position (Figure 11J) on the touch screen display and displays the stocks icon 149-2 at the second position (Figure 11F).
[00241] In some embodiments, the computing device moves (1222) a second icon from a respective initial position to a respective new position when the second position of the first icon at least partially overlaps with the respective initial position of the second icon. For example, the iPod icon 152, which overlaps with the stocks icon 149-2 (Figure 11J), is moved to a new position (Figures 11K-11L). In some embodiments, the second icon is either in the first area (e.g., area 802) or the second area (e.g., tray 408).
[00242] In some embodiments, the second position is in the first area and the computing device rearranges (1224) icons in the first set other than the first icon to accommodate display of the first icon at the second position in the first area (e.g., as shown in Figures 11K-11L).
[00243] In some embodiments, rearranging (1224) icons in the first set other than the first icon includes compacting (1226) at least some of the icons in the first set other than the first icon to place an icon in the first position, which was previously occupied by the first icon (e.g., as shown in Figures 11K-11L).
[00244] In some embodiments, rearranging (1224) icons in the first set other than the first icon includes snaking (1228) at least some of the icons in the first set other than the first icon to place an icon in the first position, which was previously occupied by the first icon (e.g., as shown in Figures 11K-11L).
[00245] In some embodiments, rearranging (1224) icons in the first set other than the first icon includes moving (1230) an icon in the first set to the first position, which was previously occupied by the first icon, wherein the moved icon was at the second position prior to movement of the first icon (e.g., as shown in Figure 11M). In other words, the icons in the first position and the second position are swapped.
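The difference between the "snaking" and "swapping" rearrangements can be seen in a short sketch. The Python below is an illustrative assumption in which list indices stand in for grid slots; it is not the claimed implementation.

```python
# Hypothetical sketch of the rearrangement strategies in paragraphs
# [00243]-[00245]: snaking the intervening icons along the grid order to fill
# the vacated slot, versus swapping the icons at the first and second positions.

def snake(icons, from_idx, to_idx):
    """Remove the dragged icon and re-insert it; the others shift along in order."""
    moved = icons.pop(from_idx)
    icons.insert(to_idx, moved)
    return icons

def swap(icons, from_idx, to_idx):
    """Exchange the icons occupying the first and second positions."""
    icons[from_idx], icons[to_idx] = icons[to_idx], icons[from_idx]
    return icons

print(snake(["IM", "calendar", "photos", "stocks"], 3, 0))
# -> ['stocks', 'IM', 'calendar', 'photos']  (other icons snake down one slot)
print(swap(["IM", "calendar", "photos", "stocks"], 3, 0))
# -> ['stocks', 'calendar', 'photos', 'IM']  (only the two icons trade places)
```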
[00246] In some embodiments, the computing device fixes (1232) a position of the first icon at the second position and ceases to vary positions of the one or more icons in the first set in response to detecting a predefined user action for terminating the user interface reconfiguration process (e.g., as shown in Figure 11N). In some embodiments, the predefined user action is activation of a physical button (e.g., menu button 204, Figure 11N) or a soft button (e.g., a done icon, not shown).
[00247] In some embodiments, the computing device detects (1234) a user making a first point of contact (e.g., contact 1120, Figure 11O) with the touch screen display at a first position corresponding to a first icon in the first set and detects movement of the first point of contact to a second position in the second area on the touch screen display (e.g., as shown in Figure 11O). In response to detecting the first point of contact and detecting movement of the first point of contact, the computing device displays (1236) movement of the first icon to the second position in the second area on the touch screen display and displays the first icon at the second position (e.g., as shown in Figure 11Q). In some embodiments, icons in the second area are symmetrically distributed about the center of the second area (e.g., as shown in Figure 11R).
[00248] In some embodiments, the computing device moves (1238) a third icon in the second plurality of icons from a respective initial position to a respective new position when the new position of the first icon at least partially overlaps with the respective initial position of the third icon (e.g., as shown in Figures 11P-11R, where mail icon 140 and browser icon 147 move to new positions). In some embodiments, the size of the icons in the second area (e.g., tray 408) is reduced as more icons are added, until a predetermined maximum number (e.g., 6 icons) is reached (e.g., as shown in Figure 11S). In some embodiments, after the maximum is reached, icons must be removed from the second area prior to adding more icons to the second area. In some embodiments, after the maximum is reached, icons in the second area are evicted from the second area when more icons are added to the second area.
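The tray sizing rule described here is illustrated below. The tray width and the helper names are assumptions; only the maximum of 6 icons comes from the example in the text.

```python
# Illustrative sketch (assumed numbers) of the tray behaviour in paragraph
# [00248]: icon size shrinks as icons are added, up to a maximum of 6 icons,
# after which further additions are refused or trigger an eviction.

TRAY_WIDTH = 320          # assumed tray width in pixels
MAX_TRAY_ICONS = 6        # predetermined maximum from the example in the text

def tray_icon_size(count):
    """Icons share the tray width equally, down to the size for 6 icons."""
    count = min(max(count, 1), MAX_TRAY_ICONS)
    return TRAY_WIDTH // count

def can_add_icon(tray):
    return len(tray) < MAX_TRAY_ICONS

tray = ["phone", "mail", "browser", "ipod", "camera"]
print(tray_icon_size(len(tray)))   # -> 64 (icons shrink as the tray fills)
print(can_add_icon(tray))          # -> True; a 7th icon would be refused or evict another
```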
[00249] In some embodiments, the computing device detects (1240) the user making a second point of contact (e.g., contact 1122, Figure 11T) with the touch screen display at a third position corresponding to a second icon in the second plurality of icons in the second region on the touch screen display and detects movement of the second point of contact to a fourth position in the first region on the touch screen display (e.g., as shown in Figure 11T). The computing device responds (1242) to detecting the second point of contact and detecting movement of the second point of contact by displaying movement of the second icon to the fourth position of the touch screen display and displaying the second icon at the fourth position (e.g., as shown in Figures 11T-11V, where mail icon 140 moves from the tray 408 to the first area 802). In some embodiments, the computing device fixes (1244) a position of the second icon at the fourth position and ceases to vary positions of the one or more icons in the first set in response to detecting a predefined user action for terminating the predefined user interface reconfiguration process (e.g., as shown in Figure 11W). In some embodiments, the predefined user action is activation of a physical button (e.g., menu button 204, Figure 11W) or a soft button (e.g., a done icon, not shown).
[00250] In some embodiments, the computing device detects (1246) a second finger gesture on a first icon in the first set on the touch screen display. In response to detecting the second finger gesture, the computing device replaces (1256) display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display, and moves the first icon from the first set to the second set (e.g., as shown in Figures 11X-11Z, where the first set of icons (141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, 152, 149-6-20, 149-6-21, and 149-6-22) is replaced by the second set of icons (149-6-30, 149-6-31, 149-6-32, 149-6-33, 149-6-34 and 149-6-35) and the calculator icon 149-3 is moved from the first set to the second set).
[00251] In some embodiments, detecting the second finger gesture includes: detecting (1248) a user making a first point of contact (e.g., contact 1124, Figure 11X) with the touch screen display at a first position corresponding to the first icon in the first set and detecting movement of the first point of contact to an edge of the first area; and in response to detecting the first point of contact and detecting movement of the first point of contact to the edge of the first area, displaying (1250) movement of the first icon to the edge of the first area (e.g., as shown in Figure 11X for calculator icon 149-3). In some embodiments, the edge of the first area coincides with an edge of the touch screen display. In some embodiments, the plurality of sets form a sequence of sets and going to the right edge results in display of the next set in the sequence of sets and going to the left edge results in display of the previous set in the sequence of sets. In some embodiments, the plurality of sets form a sequence of sets and going to the bottom edge results in display of the next set in the sequence of sets and going to the top edge results in display of the previous set in the sequence of sets.
[00252] In some embodiments, detecting the second finger gesture includes detecting (1252) the first point of contact at the edge of the first area for greater than a predetermined time (e.g., 0.2-1.0 seconds).
[00253] In some embodiments, detecting the second finger gesture includes detecting (1254) movement of the first point of contact away from the edge of the first area and then detecting another movement of the first point of contact back to the edge of the first area (e.g., as shown in Figure 11Y for calculator icon 149-3) within a predetermined time (e.g., 0.2-0.5 seconds).
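The set-replacement behavior of paragraphs [00250]-[00253] might be sketched as follows (Swift, illustrative only; the SetPager type, the 0.5-second dwell threshold, and the identifiers are assumptions): holding a dragged icon at an edge of the first area for longer than a threshold replaces the displayed set with the next or previous set in the sequence and carries the dragged icon into the newly displayed set.

struct SetPager {
    var sets: [[String]]            // the plurality of sets, in sequence
    var currentSet = 0
    let dwellThreshold = 0.5        // seconds; the text suggests 0.2-1.0 s

    enum Edge { case left, right }

    // Called while a dragged icon is held at an edge of the first area.
    // Dwelling past the threshold replaces the displayed set and moves the
    // dragged icon (e.g. calculator icon 149-3) from the old set to the new one.
    mutating func heldAtEdge(_ edge: Edge, dragging icon: String, for seconds: Double) {
        guard seconds > dwellThreshold else { return }
        let next: Int
        switch edge {
        case .right: next = min(currentSet + 1, sets.count - 1)    // next set
        case .left:  next = max(currentSet - 1, 0)                 // previous set
        }
        guard next != currentSet else { return }
        sets[currentSet].removeAll { $0 == icon }
        sets[next].append(icon)
        currentSet = next
    }
}

var pager = SetPager(sets: [["calculator-149-3", "icon-141"], ["icon-149-6-30"]])
pager.heldAtEdge(.right, dragging: "calculator-149-3", for: 0.6)
print(pager.currentSet, pager.sets)   // 1, calculator 149-3 is now in the second set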
[00254] In some embodiments, the computing device detects (1258) a user making a second point of contact with the touch screen display at a second position corresponding to
the first icon in the second set and detects movement of the second point of contact to a third position on the touch screen display. In response to detecting the second point of contact and detecting movement of the second point of contact, the computing device displays (1260) movement of the first icon to the third position on the touch screen display and displays the first icon at the third position (e.g., as shown in Figures 11Z, 11AA, and 11CC-11EE for calculator icon 149-3). In some embodiments, the third position is in the first area. In some embodiments, the first icon is the only icon in the second set (e.g., as shown in Figure 11BB for calculator icon 149-3). In other words, the first icon is added to an otherwise empty first area. In some embodiments, the plurality of sets of icons that are separately displayed in the first area comprise a sequence of sets and, during the reconfiguration process, an empty set is added after the last set of icons in the sequence of sets.
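As a brief illustration of the last sentence (Swift; the function names, and the assumption that empty sets left unused are discarded when reconfiguration ends, are not from the specification):

func beginReconfiguration(sets: [[String]]) -> [[String]] {
    sets + [[]]                     // an empty set is added after the last set
}

func endReconfiguration(sets: [[String]]) -> [[String]] {
    sets.filter { !$0.isEmpty }     // assumed: empty sets left unused are dropped
}

print(beginReconfiguration(sets: [["icon-141"], ["icon-149-6-30"]]))
// [["icon-141"], ["icon-149-6-30"], []]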
[00255] In some embodiments, positions of one or more icons in the second set of the first plurality of icons vary about respective average positions (e.g., as shown in Figures 11Z and 11AA). In some embodiments, positions of all of the icons in the second set vary about respective average positions. In some embodiments, positions of all of the icons in the second set except the first icon vary about respective average positions.
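One way to read "vary about respective average positions" is a small per-frame offset from each icon's average position, as in the following Swift sketch (illustrative only; the Point type, the two-point amplitude, and the use of random offsets are assumptions, and the dragged icon may be excluded as in the last sentence above):

struct Point { var x: Double; var y: Double }

func jitteredPositions(averages: [String: Point],
                       excluding dragged: String? = nil,
                       amplitude: Double = 2.0) -> [String: Point] {
    var positions: [String: Point] = [:]
    for (id, average) in averages {
        if id == dragged {
            positions[id] = average      // the icon being dragged does not vary
        } else {
            positions[id] = Point(x: average.x + Double.random(in: -amplitude...amplitude),
                                  y: average.y + Double.random(in: -amplitude...amplitude))
        }
    }
    return positions
}

let averages = ["calculator-149-3": Point(x: 40, y: 80), "icon-149-6-30": Point(x: 120, y: 80)]
print(jitteredPositions(averages: averages, excluding: "calculator-149-3"))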
[00256] In some embodiments, the third position is in the first area and the computing device rearranges (1262) icons in the second set other than the first icon to accommodate display of the first icon at the third position in the first area (e.g., as shown in Figures 11CC and 11DD). In some embodiments, rearranging icons in the second set other than the first icon includes compacting (1264) at least some of the icons in the first set and the second set other than the first icon (e.g., as shown in Figures 11FF and 11GG, where icon 149-6-30 is compacted into the first set). In some embodiments, rearranging icons in the second set other than the first icon includes snaking (1266) at least some of the icons in the second set other than the first icon (e.g., as shown in Figures 11CC and 11DD).
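The snaking rearrangement might be sketched as a row-major reflow (Swift, illustrative only; the four-column grid, the function name, and the identifiers are assumptions): the dropped icon takes the target slot and the remaining icons flow around it, wrapping from the end of one row to the start of the next.

func snake(icons: [String], inserting dropped: String, at slot: Int,
           columns: Int = 4) -> [[String]] {
    var ordered = icons.filter { $0 != dropped }
    ordered.insert(dropped, at: min(max(slot, 0), ordered.count))
    // Reflow the flat order into rows of `columns`, i.e. the snaked layout.
    return stride(from: 0, to: ordered.count, by: columns).map {
        Array(ordered[$0..<min($0 + columns, ordered.count)])
    }
}

let layout = snake(icons: ["icon-149-6-30", "icon-149-6-31", "icon-149-6-32",
                           "icon-149-6-33", "icon-149-6-34"],
                   inserting: "calculator-149-3", at: 2)
print(layout)
// [["icon-149-6-30", "icon-149-6-31", "calculator-149-3", "icon-149-6-32"],
//  ["icon-149-6-33", "icon-149-6-34"]]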
[00257] In some embodiments, the plurality of sets of icons includes a number of sets of icons that are configured to be separately displayed as a sequence of sets of icons in the first area of the touch screen display. The computing device displays (1268) two or more set-sequence-indicia icons (e.g., icons 804-1, 804-2, and 804-3 in Figures 11H-11OO, which operate in the same manner as the corresponding icons in Figures 8A-8D, described above).
The set-sequence-indicia icons provide information about the number of sets of icons in the plurality of sets of icons and a position of a displayed set of icons in the sequence of sets of
icons. In response to detecting the second finger gesture, the computing device updates (1270) the information provided by the set-sequence-indicia icons to reflect the replacement of the displayed first set by the second set (e.g., icon 804-1 is darkened in Figure 11X when the first set is displayed and icon 804-2 is darkened in Figure 11Z when the second set is displayed).
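A minimal Swift sketch of the set-sequence-indicia behavior follows (illustrative only; the type name, the textual rendering, and the zero-based indexing are assumptions): one indicium is shown per set, and the indicium for the currently displayed set is darkened, as icon 804-1 is when the first set is shown and icon 804-2 when the second set is shown.

struct SetSequenceIndicia {
    var setCount: Int
    var displayedSet: Int        // zero-based index of the displayed set

    // One character per set: a filled dot for the displayed set, hollow otherwise.
    var rendered: String {
        (0..<setCount).map { $0 == displayedSet ? "●" : "○" }.joined(separator: " ")
    }

    // Called after a gesture replaces the displayed set, so the indicia reflect
    // the new position of the displayed set in the sequence of sets.
    mutating func update(displayedSet newIndex: Int) {
        displayedSet = min(max(newIndex, 0), setCount - 1)
    }
}

var indicia = SetSequenceIndicia(setCount: 3, displayedSet: 0)
print(indicia.rendered)             // ● ○ ○  (first set displayed, cf. icon 804-1)
indicia.update(displayedSet: 1)
print(indicia.rendered)             // ○ ● ○  (second set displayed, cf. icon 804-2)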
[00258] In some embodiments, the computing device fixes (1272) a position of the first icon at the third position and ceases to vary positions of the one or more icons in the second set in response to detecting a predefined user action for terminating the user interface reconfiguration process (e.g., as shown in Figures 11EE and 11HH). In some embodiments, the predefined user action is activation of a physical button (e.g., menu button 204, Figure 11EE or 11HH) or a soft button (e.g., a done icon, not shown).
[00259] In some embodiments, the computing device detects (1274) a second finger gesture on a first icon in the first set on the touch screen display. In response to detecting the second finger gesture, the computing device replaces (1276) display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display, and moves the first icon from the first set to the second set (e.g., as shown in Figures 11X-11Z, where the first set of icons (141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153, 412, 152, 149-6-20, 149-6-21, and 149-6-22) is replaced by the second set of icons (149-6-30, 149-6-31, 149-6-32, 149-6-33, 149-6-34 and 149-6-35) and the calculator icon 149-3 is moved from the first set to the second set). The computing device detects (1278) a third finger gesture on the first icon in the second set on the touch screen display. In response to detecting the third finger gesture, the computing device replaces (1280) display of the second set of the first plurality of icons with display of a third set of the first plurality of icons in the first area on the touch screen display, and moves the first icon from the second set to the third set (e.g., as shown in Figures 11II-11KK for calculator icon 149-3). The computing device detects (1282) a user making a second point of contact with the touch screen display at a second position corresponding to the first icon in the third set and detects movement of the second point of contact to a third position on the touch screen display. In response to detecting the second point of contact and detecting movement of the second point of contact, the computing device displays (1284) movement of the first icon to the third position on the touch screen display and displays the first icon at the third position (e.g., as shown in Figures 11LL-11NN for calculator icon 149-3). In some embodiments, the third position is in the first area. In some embodiments, the first icon is the only icon in the
third set. In other words, the first icon is added to an otherwise empty first area. In some embodiments, positions of one or more icons in the third set of the first plurality of icons vary about respective average positions (e.g., as shown in Figure 11NN). In some embodiments, positions of all of the icons in the third set vary about respective average positions. In some embodiments, positions of all of the icons in the third set except the first icon vary about respective average positions.
[00260] In some embodiments, the computing device fixes (1286) a position of the first icon at the third position and ceases to vary positions of the one or more icons in the third set in response to detecting a predefined user action for terminating the user interface reconfiguration process (e.g., as shown in Figure 11OO). In some embodiments, the predefined user action is activation of a physical button (e.g., menu button 204, Figure 11OO) or a soft button (e.g., a done icon, not shown).
[00261] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

What is claimed is:
    1. A computer-implemented method, comprising:
    at a computing device with a touch screen display:
    displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display;
    displaying a second plurality of icons in a second area on the touch screen display, wherein:
    the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon;
    while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area in a first direction;
    in response to detecting the first finger gesture, replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
2. The computer-implemented method of claim 1, wherein the first finger gesture is a swipe gesture.
3. The computer-implemented method of claim 1, wherein replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display comprises an animation that moves the first set out of the first area and the second set into the first area.
4. The computer-implemented method of claim 1, including:
    detecting a second finger gesture on an icon in the second set of the first plurality of icons; and in response to detecting the second finger gesture, displaying an application that corresponds to the icon in the second set upon which the second finger gesture was detected.
5. The computer-implemented method of claim 1, including:
    detecting a third finger gesture on the touch screen display;
    in response to detecting the third finger gesture, replacing display of the second set of the first plurality of icons with display of a third set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display;
    detecting a fourth finger gesture on an icon in the third set of the first plurality of icons; and in response to detecting the fourth finger gesture, displaying an application that corresponds to the icon in the third set upon which the fourth finger gesture was detected.
6. The computer-implemented method of claim 1, wherein the plurality of sets of icons includes a number of sets of icons that are configured to be separately displayed as a sequence of sets of icons in the first area of the touch screen display; and further including:
    displaying two or more set-sequence-indicia icons, wherein the set-sequence-indicia icons provide information about the number of sets of icons in the plurality of sets of icons and a position of a displayed set of icons in the sequence of sets of icons; and in response to detecting the first finger gesture, updating the information provided by the set-sequence-indicia icons to reflect the replacement of the displayed first set by the second set.
7. A computing device, comprising:
    a touch screen display;
    one or more processors;
    memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including:
    instructions for displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display;
    instructions for displaying a second plurality of icons in a second area on the touch screen display, wherein:
    the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon;
    instructions for, while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area; and instructions for replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
8. A computer readable storage medium having stored therein instructions, which when executed by a device with a touch screen display, cause the device to:
    display a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display;
    display a second plurality of icons in a second area on the touch screen display, wherein:
    the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon;
    while displaying the first set of the first plurality of icons in the first area of the touch screen display, detect a first finger gesture on the touch screen display in the first area; and replace display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
9. A graphical user interface on a computing device with a touch screen display, comprising:
    a first set of a first plurality of icons displayed in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display; and a second plurality of icons displayed in a second area on the touch screen display, wherein:
    the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon;
    wherein:
    while displaying the first set of the first plurality of icons in the first area of the touch screen display and in response to detecting a first finger gesture on the touch screen display in the first area, display of the first set of the first plurality of icons is replaced with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
10. A computing device with a touch screen display, comprising:
    means for displaying a first set of a first plurality of icons in a first area of the touch screen display, wherein the first plurality of icons includes a plurality of sets of icons that are separately displayed in the first area of the touch screen display;
    means for displaying a second plurality of icons in a second area on the touch screen display, wherein:
    the second area is different from the first area, and both of the first plurality of icons and the second plurality of icons includes application launch icons, wherein each application launch icon represents a particular application, and activation of a respective application launch icon causes activation of the particular application represented by the activated application launch icon;
    means for, while displaying the first set of the first plurality of icons in the first area of the touch screen display, detecting a first finger gesture on the touch screen display in the first area; and means for replacing display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display while maintaining the display of the second plurality of icons in the second area on the touch screen display, in response to detecting the first finger gesture on the touch screen display in the first area, wherein the second set of the first plurality of icons is distinct from the first set of the first plurality of icons.
AU2017202587A 2007-09-04 2017-04-19 Portable multifunction device with interface reconfiguration mode Active AU2017202587B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
AU2017202587A AU2017202587B2 (en) 2007-09-04 2017-04-19 Portable multifunction device with interface reconfiguration mode
AU2019204835A AU2019204835B2 (en) 2007-09-04 2019-07-04 Portable multifunction device with interface reconfiguration mode
AU2021201687A AU2021201687B2 (en) 2007-09-04 2021-03-17 Portable multifunction device with interface reconfiguration mode
AU2022224726A AU2022224726B2 (en) 2007-09-04 2022-08-30 Portable multifunction device with interface reconfiguration mode
AU2024203944A AU2024203944A1 (en) 2007-09-04 2024-06-11 Portable multifunction device with interface reconfiguration mode

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US11/849,938 US8619038B2 (en) 2007-09-04 2007-09-04 Editing interface
US11/849,938 2007-09-04
AU2012200475A AU2012200475B2 (en) 2007-09-04 2012-01-27 Portable multifunction device with interface reconfiguration mode
AU2015215876A AU2015215876A1 (en) 2007-09-04 2015-08-19 Portable multifunction device with interface reconfiguration mode
AU2017202587A AU2017202587B2 (en) 2007-09-04 2017-04-19 Portable multifunction device with interface reconfiguration mode

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU2015215876A Division AU2015215876A1 (en) 2007-09-04 2015-08-19 Portable multifunction device with interface reconfiguration mode

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2019204835A Division AU2019204835B2 (en) 2007-09-04 2019-07-04 Portable multifunction device with interface reconfiguration mode

Publications (2)

Publication Number Publication Date
AU2017202587A1 AU2017202587A1 (en) 2017-05-11
AU2017202587B2 true AU2017202587B2 (en) 2019-11-21

Family

ID=45812267

Family Applications (5)

Application Number Title Priority Date Filing Date
AU2012200475A Active AU2012200475B2 (en) 2007-09-04 2012-01-27 Portable multifunction device with interface reconfiguration mode
AU2015215876A Abandoned AU2015215876A1 (en) 2007-09-04 2015-08-19 Portable multifunction device with interface reconfiguration mode
AU2017202587A Active AU2017202587B2 (en) 2007-09-04 2017-04-19 Portable multifunction device with interface reconfiguration mode
AU2019204835A Active AU2019204835B2 (en) 2007-09-04 2019-07-04 Portable multifunction device with interface reconfiguration mode
AU2021201687A Active AU2021201687B2 (en) 2007-09-04 2021-03-17 Portable multifunction device with interface reconfiguration mode

Family Applications Before (2)

Application Number Title Priority Date Filing Date
AU2012200475A Active AU2012200475B2 (en) 2007-09-04 2012-01-27 Portable multifunction device with interface reconfiguration mode
AU2015215876A Abandoned AU2015215876A1 (en) 2007-09-04 2015-08-19 Portable multifunction device with interface reconfiguration mode

Family Applications After (2)

Application Number Title Priority Date Filing Date
AU2019204835A Active AU2019204835B2 (en) 2007-09-04 2019-07-04 Portable multifunction device with interface reconfiguration mode
AU2021201687A Active AU2021201687B2 (en) 2007-09-04 2021-03-17 Portable multifunction device with interface reconfiguration mode

Country Status (1)

Country Link
AU (5) AU2012200475B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9881315B2 (en) * 2012-06-11 2018-01-30 Retailmenot, Inc. Systems, methods, and computer-readable media for a customizable redemption header for merchant offers across browser instances
US9558507B2 (en) 2012-06-11 2017-01-31 Retailmenot, Inc. Reminding users of offers

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060197753A1 (en) * 2005-03-04 2006-09-07 Hotelling Steven P Multi-functional hand-held device
US20060277481A1 (en) * 2005-06-03 2006-12-07 Scott Forstall Presenting clips of content

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7233316B2 (en) * 2003-05-01 2007-06-19 Thomson Licensing Multimedia user interface
JP4148187B2 (en) * 2004-06-03 2008-09-10 ソニー株式会社 Portable electronic device, input operation control method and program thereof
DE602005025700D1 (en) * 2005-03-03 2011-02-10 Nokia Corp USER INTERFACE COMPONENT
US7509588B2 (en) * 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US7596761B2 (en) * 2006-01-05 2009-09-29 Apple Inc. Application user interface with navigation bar showing current and prior application contexts

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060197753A1 (en) * 2005-03-04 2006-09-07 Hotelling Steven P Multi-functional hand-held device
US20060277481A1 (en) * 2005-06-03 2006-12-07 Scott Forstall Presenting clips of content

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
APPLE: "iPhone User's Guide" [Viewed on internet on 04 July 2018] Viewed on internet <URL: https://mesnotices.20minutes.fr/014159.php?ID=1168379&k=7637ba37901832711e0d4610438e2f77&q=APPLE%20IPHONE>, Published June 2007 *
JOBS, S.: "iPhone Introduction in 2017 (Complete)" [Viewed on internet on 04 July 2018] Viewed on internet <URL: https://www.youtube.com/watch?v=9hUIxyE2Ns8>, Published June 2007 *

Also Published As

Publication number Publication date
AU2012200475A1 (en) 2012-02-16
AU2021201687A1 (en) 2021-04-08
AU2019204835A1 (en) 2019-07-25
AU2017202587A1 (en) 2017-05-11
AU2015215876A1 (en) 2015-09-24
AU2012200475B2 (en) 2015-09-03
AU2021201687B2 (en) 2022-06-23
AU2019204835B2 (en) 2020-12-17

Similar Documents

Publication Publication Date Title
US20200225843A1 (en) Replacing display of icons in response to a gesture
US11507255B2 (en) Portable multifunction device with animated sliding user interface transitions
US8788954B2 (en) Web-clip widgets on a portable multifunction device
EP2565803B1 (en) Web-clip widgets on a portable multifunction device
WO2008085747A2 (en) Portable electronic device, method and graphical user interface for displaying inline multimedia content
AU2021201687B2 (en) Portable multifunction device with interface reconfiguration mode
AU2022224726B2 (en) Portable multifunction device with interface reconfiguration mode
AU2011101194A4 (en) Portable multifunction device with interface reconfiguration mode
AU2017245373A1 (en) Portable electronic device, method, and graphical user interface for displaying structured electronic documents

Legal Events

Date Code Title Description
ON Decision of a delegate or deputy of the commissioner of patents (result of patent office hearing)

Free format text: (2019) APO 44: DECISION: ALL OF THE CLAIMS ARE NOVEL AND HAVE AN INVENTIVE STEP. I ACCEPT THE PATENT REQUEST AND COMPLETE SPECIFICATION.

Effective date: 20191008

FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired
NA Applications received for extensions of time, section 223

Free format text: AN APPLICATION TO EXTEND THE TIME FROM 28 AUG 2022 TO 28 OCT 2023 IN WHICH TO PAY A RENEWAL FEE HAS BEEN FILED

NB Applications allowed - extensions of time section 223(2)

Free format text: THE TIME IN WHICH TO PAY A RENEWAL FEE HAS BEEN EXTENDED TO 28 OCT 2023