AU2019206101A1 - User interface for manipulating user interface objects - Google Patents

User interface for manipulating user interface objects

Info

Publication number
AU2019206101A1
Authority
AU
Australia
Prior art keywords
icons
icon
touch
display
crown
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU2019206101A
Other versions
AU2019206101B2 (en)
Inventor
Gary Ian Butcher
Imran Chaudhri
Jonathan R. Dascola
Anton M. Davydov
Alan C. Dye
Dylan Ross Edwards
Christopher Patrick Foss
Aurelio GUZMAN
Jonathan P. Ive
Chanaka G. Karunamuni
Zachery KENNEDY
Duncan Robert Kerr
Nicholas V. King
Stephen O. Lemay
Natalia MARIC
Daniel Trent Preston
Christopher Wilson
Eric Lance Wilson
Lawrence Y. Yang
Nicholas Zambetti
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to AU2019206101A priority Critical patent/AU2019206101B2/en
Publication of AU2019206101A1 publication Critical patent/AU2019206101A1/en
Application granted granted Critical
Publication of AU2019206101B2 publication Critical patent/AU2019206101B2/en
Priority to AU2021201748A priority patent/AU2021201748C1/en
Priority to AU2022235585A priority patent/AU2022235585A1/en
Priority to AU2024205135A priority patent/AU2024205135A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Electric Clocks (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Electromechanical Clocks (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

User interface navigation on a personal electronics device based on movements of a crown is disclosed. The device can select an appropriate level of information arranged along a z-axis for display based on crown movement. The navigation can be based on an angular velocity of the crown.

Description

USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS
CROSS-REFERENCE TO RELATED APPLICATIONS [0001] This application claims priority to U.S. Provisional Patent Application Serial No.
61/873,356, filed September 3, 2013, entitled “CROWN INPUT FOR A WEARABLE
ELECTRONIC DEVICE”; U.S. Provisional Patent Application Serial No. 61/873,359, filed September 3, 2013, entitled “USER INTERFACE OBJECT MANIPULATIONS IN A USER INTERFACE”; U.S. Provisional Patent Application Serial No. 61/959,851, filed September 3, 2013, entitled “USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS”;
U.S. Provisional Patent Application Serial No. 61/873,360, filed September 3, 2013, entitled “USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS WITH MAGNETIC PROPERTIES”; and U.S. Non-provisional Patent Application Serial No. 14/476,657, filed September 3, 2014, entitled “USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS WITH MAGNETIC PROPERTIES”. The content of these applications is hereby incorporated by reference in its entirety for all purposes.
[0002] This application is related to co-pending applications U.S. Non-provisional Patent Application filed September 3, 2014, concurrently herewith, entitled “CROWN INPUT FOR A WEARABLE ELECTRONIC DEVICE,” naming Nicholas Zambetti et al. as inventors; U.S. Non-provisional Patent Application filed September 3, 2014, concurrently herewith, entitled “USER INTERFACE OBJECT MANIPULATIONS IN A USER INTERFACE”, naming Nicholas Zambetti et al. as inventors; and U.S. Provisional Patent Application Serial No. 61/747,278, filed December 29, 2012, entitled “Device, Method, and Graphical User Interface for Manipulating User Interface Objects with Visual and/or Haptic Feedback”. The content of these applications is hereby incorporated by reference in its entirety for all purposes.
FIELD [0003] The disclosed embodiments relate generally to user interfaces of electronic devices, including but not limited to user interfaces for electronic watches.
BACKGROUND [0004] Advanced personal electronic devices can have small form factors. Exemplary personal electronic devices include but are not limited to tablets and smart phones. Uses of such personal electronic devices involve manipulation of user interface objects on display screens
WO 2015/034965
PCT/US2014/053957
2019206101 18 Jul 2019

which also have small form factors that complement the design of the personal electronic devices.
[0005] Exemplary manipulations that users can perform on personal electronic devices include navigating a hierarchy, selecting a user interface object, adjusting the position, size, and zoom of user interface objects, or otherwise manipulating user interfaces. Exemplary user interface objects include digital images, video, text, icons, control elements such as buttons, and other graphics.
[0006] Existing methods for manipulating user interface objects on reduced-size personal electronic devices can be inefficient. Further, existing methods generally provide less precision than is preferable.
SUMMARY [0007] In some embodiments, techniques for navigating a user interface on a personal electronics device based on movements of a crown are disclosed. Systems and computer-readable storage media for performing the processes described above are also disclosed.
BRIEF DESCRIPTION OF THE DRAWINGS [0008] FIG. 1 illustrates an exemplary personal electronic device.
[0009] FIG. 2 illustrates an exemplary user interface.
[0010] FIG. 3 illustrates an exemplary user interface.
[0011] FIG. 4 illustrates an exemplary user interface.
[0012] FIG. 5 illustrates an exemplary user interface.
[0013] FIG. 6 illustrates an exemplary user interface.
[0014] FIG. 7 illustrates an exemplary user interface.
[0015] FIG. 8 illustrates an exemplary user interface.
[0016] FIG. 9 illustrates an exemplary logical structure of a user interface.
[0017] FIG. 10 illustrates an exemplary user interface.
[0018] FIG. 11 illustrates an exemplary user interface.
[0019] FIG. 12 illustrates an exemplary user interface.
[0020] FIG. 13 illustrates an exemplary user interface transition.
[0021] FIG. 14 illustrates an exemplary user interface.
[0022] FIG. 15 illustrates an exemplary user interface.
[0023] FIG. 16 illustrates an exemplary user interface transition.
[0024] FIG. 17 illustrates an exemplary user interface.
[0025] FIG. 18 illustrates an exemplary user interface.
[0026] FIG. 19 illustrates an exemplary user interface transition.
[0027] FIG. 20 illustrates an exemplary user interface.
[0028] FIG. 21 illustrates an exemplary user interface.
[0029] FIG. 22 illustrates an exemplary user interface and transition.
[0030] FIG. 23 illustrates an exemplary user interface.
[0031] FIG. 24 illustrates an exemplary user interface and transition.
[0032] FIG. 25A and FIG. 25B illustrate an exemplary user interface.
[0033] FIG. 26 illustrates an exemplary user interface.
[0034] FIG. 27 illustrates an exemplary user interface and transition.
[0035] FIG. 28 illustrates an exemplary user interface.
[0036] FIG. 29 illustrates an exemplary user interface.
[0037] FIG. 30 illustrates an exemplary user interface and transition.
[0038] FIG. 31 illustrates an exemplary user interface.
[0039] FIG. 32 illustrates an exemplary user interface.
[0040] FIG. 33 illustrates an exemplary user interface.
[0041] FIG. 34 illustrates an exemplary user interface.
[0042] FIG. 35 illustrates an exemplary process.
[0043] FIG. 36 illustrates an exemplary computing system.
[0044] FIG. 37 illustrates an exemplary personal electronic device.
[0045] FIG. 38 illustrates an exemplary personal electronic device.
[0046] FIG. 39 illustrates an exemplary personal electronic device.
[0047] FIG. 40 illustrates an exemplary user interface.
[0048] FIG. 41 illustrates an exemplary logical structure of a user interface.
[0049] FIG. 42 illustrates an exemplary user interface.
DETAILED DESCRIPTION [0050] In the following description of the disclosure and examples, reference is made to the accompanying drawings in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be practiced and structural changes can be made without departing from the scope of the disclosure.
[0051] FIG. 1 illustrates exemplary personal electronic device 100. In the illustrated example, device 100 is a watch that generally includes body 102 and strap 104 for affixing device 100 to the body of a user. That is, device 100 is wearable. Body 102 can be designed to couple with strap 104. Device 100 can have touch-sensitive display screen (hereafter touchscreen) 106 and crown 108. In some embodiments, device 100 can have one or more buttons 110, 112, and 114. In some embodiments, device 100 does not have buttons 110, 112, or 114.
[0052] Conventionally, the term “crown,” in the context of a watch, refers to the cap atop a stem for winding the watch. In the context of a personal electronic device, the crown can be a physical component of the electronic device, rather than a virtual crown on a touch-sensitive display. Crown 108 can be mechanical, meaning that it can be connected to a sensor for converting physical movement of the crown into electrical signals. Crown 108 can rotate in two
directions of rotation (e.g., forward and backward). Crown 108 can also be pushed in towards the body of device 100 and/or be pulled away from device 100. Crown 108 can be touch-sensitive, for example, using capacitive touch technologies that can detect whether a user is touching the crown. Moreover, crown 108 can further be rocked in one or more directions or translated along a track along an edge or at least partially around a perimeter of body 102. In some examples, more than one crown 108 can be used. The visual appearance of crown 108 can, but need not, resemble crowns of conventional watches. The examples described herein refer to crown rotations, pushes, pulls, and/or touches, each of which constitutes a physical state of the crown.
[0053] Buttons 110, 112, and 114, if included, can each be a physical or a touch-sensitive button. That is, the buttons may be, for example, physical buttons or capacitive buttons.
Further, body 102, which can include a bezel, may have predetermined regions on the bezel that act as buttons.
[0054] Touchscreen 106 can include a display device, such as a liquid crystal display (LCD), light-emitting diode (LED) display, organic light-emitting diode (OLED) display, or the like, positioned partially or fully behind or in front of a touch sensor panel implemented using any desired touch sensing technology, such as mutual-capacitance touch sensing, self-capacitance touch sensing, resistive touch sensing, projection scan touch sensing, or the like. Touchscreen 106 can allow a user to perform various functions by touching or hovering near the touch sensor panel using one or more fingers or another object.
[0055] In some examples, device 100 can further include one or more pressure sensors (not shown) for detecting a force or pressure applied to the display. The force or pressure applied to touchscreen 106 can be used as an input to device 100 to perform any desired operation, such as making a selection, entering or exiting a menu, causing the display of additional options/actions, or the like. In some examples, different operations can be performed based on the amount of force or pressure being applied to touchscreen 106. The one or more pressure sensors can further be used to determine the position at which the force is being applied to touchscreen 106.
1. Crown-based user interface control [0056] FIGs. 2-7 illustrate exemplary user interfaces that respond to movements of crown
108 (FIG. 1). FIG. 2 shows exemplary screen 200 that can be displayed by device 100. Screen
200 can be, for example, a home screen that appears upon power-on of device 100 or that appears initially when the touchscreen display of device 100 powers-on (including wake up from
a sleep state). Icons 204, 206, and 208 can be displayed in screen 200. In some embodiments, the icons can correspond to applications operable on device 100, meaning that the applications can be installed onto and/or can execute as a service on device 100. A touch (e.g., a finger tap) on an icon causes the corresponding application to launch, meaning that the application runs in the foreground of device 100 and appears on touchscreen 106. In some embodiments, the icons can correspond to text documents, media items, web pages, e-mail messages, or the like.
[0057] Device 100 can select icons 204, 206, and 208 out of a larger set of available icons for display on screen 200 because these icons have information relevant to the user at the current time. For example, icon 204 can correspond to a messaging application in which the user has just received an incoming message, and icon 206 can correspond to a calendar application where the user has an upcoming calendar appointment entry.
[0058] FIG. 3 shows exemplary screen 300, which can be displayed by device 100 in response to a rotation of crown 108 in direction 302 while screen 200 (FIG. 2) is displayed. Screen 300 can show, for example, a user’s favorite icons, selected previously by the user from a larger set of available icons. Also, screen 300 can include icons, selected from the larger set of available icons by device 100, based on a user’s frequency of access of the icons. Exemplary icons 304, 306, 308, 310, and 312 displayed in screen 300 can each correspond to an application operable on device 100. A touch (e.g., a finger tap) on an icon causes the corresponding application to launch.
[0059] FIG. 4 shows exemplary screen 400, which can be displayed by device 100 in response to a rotation of crown 108 in direction 402 while screen 300 (FIG. 3) is displayed. Screen 400 can show, for example, icons corresponding to all of the applications operable on device 100. Because a large number of applications can be operable on device 100, screen 400 can include a large number of icons. When many icons are displayed, the icons can be sized accordingly so that they can fit within touchscreen 106, or sized so that at least a representative number or predetermined percentage of icons can fit visibly within touchscreen 106.
[0060] FIG. 5 shows exemplary screen 500, which can be displayed by device 100 in response to a rotation of crown 108 in direction 502 while screen 400 (FIG. 4) is displayed. Screen 500 can show, for example, icons corresponding to a subset of the applications operable on device 100. Because fewer icons are displayed on screen 500 as compared with screen 400, the icons that are displayed on screen 500, e.g., icon 504, can become larger and can have additional fidelity as compared with the display of icons on screen 400. For example, icons on
screen 500 can have indicia, in the form of text and/or imagery, identifying their corresponding applications. As shown, icon 504 uses the letter “c” to suggest that the name of the corresponding application begins with a “c”, as in clock. In some embodiments, a touch (e.g., a finger tap) on an icon causes the corresponding application to launch.
[0061] FIG. 6 shows exemplary screen 600, which can be displayed by device 100 in response to a rotation of crown 108 in direction 602. Screen 600 can show, for example, a further winnowed subset of icons, as compared with screen 500, that correspond to applications operable on device 100. Because even fewer icons are displayed on screen 600 as compared with screen 500 (FIG. 5), the icons that are displayed (e.g., icon 604) can enlarge further and can have additional fidelity as compared with the display of icons on screens 200, 300, 400, and 500. For example, icon 604 can have the image of a clock that displays the current time. In some embodiments, a touch (e.g., a finger tap) on an icon causes the corresponding application to launch.
[0062] FIGs. 7 and 8 show exemplary screens 700 and 800, respectively, that can be displayed by device 100 in response to a rotation of crown 108 in direction 702 while screen 600 (FIG. 6) is displayed.
[0063] With reference to FIG. 7, in some embodiments, screen 700 can be displayed in response to crown rotation in direction 702 when screen 600 (FIG. 6) is displayed. Because a single icon 704 is displayed on screen 700, icon 704 can have additional fidelity as compared with the previous screens. For example, icon 704 can have the image of a clock that displays day-date information along with the current time. A touch (e.g., a finger tap) on icon 704 causes the corresponding application to launch.
[0064] Turning to FIG. 8, in some embodiments, screen 800 can be displayed in response to crown rotation in direction 802 when screen 600 (FIG. 6) is displayed. Screen 800 shows application 804, which corresponds to icon 704 (FIG. 7), operating in the foreground of device 100. That is, application 804 launched in response to crown rotation in direction 802. Exemplary application 804 can be a clock application that provides alarm features. Also, in some embodiments, screen 800 becomes displayed in response to crown rotation in direction 802 when screen 700 (FIG. 7) is displayed.
[0065] Screens 200-700 (FIGs. 2-7) described above can be logically organized as planes of information along an axis. Under this organization, a given screen of icons can be thought of as a plane, defined by two axes (e.g., x- and y-axes), having icons spatially positioned thereon.
Multiple planes can be organized along a third axis orthogonal to at least one of the x- or y-axes, called the z-axis. (The z-axis can be perpendicular to the plane formed by the x- and y-axes.)

[0066] This logical organization is illustrated by FIG. 9, in which x-axis 902 and y-axis 904 form a plane co-planar with the touchscreen surface of device 100 (FIG. 1) and z-axis 906 is perpendicular to the x/y-plane formed by axes 902 and 904. Plane 908 can correspond to screen 200 (FIG. 2). Plane 910 can correspond to screen 300 (FIG. 3). Plane 912 can represent the collection of icons that represent the operable applications of a personal electronic device. Thus, different viewpoints of plane 912 can correspond to screens 400-700 (FIGs. 4-7). Planes 908 and 910 can be related to plane 912 in that planes 908 and 910 can each include a subset of the icons available on plane 912. The particular plane of information (i.e., screen of icons) that is to be displayed on a personal electronic device can be selected via crown movement, such as crown rotation. That is, crown movement can be used to traverse the planes of information intersecting z-axis 906, or to provide alternative views of a given plane (e.g., plane 912).
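The plane-per-screen organization described above can be sketched as a small data model. The following is an illustrative sketch only, not part of the disclosure; the class names, screen names, and icon contents are assumptions:

```python
# Minimal sketch of the z-axis organization of FIG. 9: each screen of icons
# is a plane, and crown rotation traverses the stack of planes.
# Class names and screen contents are illustrative assumptions.

class Plane:
    """One screen of icons, spatially positioned on an x/y plane."""
    def __init__(self, name, icons):
        self.name = name
        self.icons = icons

class ZStack:
    """Planes arranged along a z-axis; crown rotation selects among them."""
    def __init__(self, planes):
        self.planes = planes
        self.index = 0  # currently displayed plane

    def rotate(self, direction):
        """direction: +1 for one crown direction, -1 for the other."""
        self.index = max(0, min(len(self.planes) - 1, self.index + direction))
        return self.planes[self.index]

stack = ZStack([
    Plane("contextual", ["messages", "calendar"]),        # like screen 200
    Plane("favorites", ["a", "b", "c", "d", "e"]),        # like screen 300
    Plane("all_apps", ["app%d" % i for i in range(30)]),  # like screens 400-700
])
assert stack.rotate(+1).name == "favorites"
assert stack.rotate(-1).name == "contextual"
assert stack.rotate(-1).name == "contextual"  # clamped at the first plane
```

Note that, consistent with paragraph [0068], each plane holds its own icon list; planes need not be zoomed subsets of one another.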
[0067] In some embodiments, when an end of the z-axis (e.g., the top- or bottom-most plane) is reached via crown movement, the displayed information (e.g., screen of icons) produces a rubberband effect to indicate that the end has been reached. Consider the situation in which a user has, through crown input, reached the bottom-most plane of information. As the user provides additional crown input in the same direction, the displayed collection of icons shrinks (to the extent possible) in accordance with the crown movement until the movement stops. When the crown movement stops, the displayed icons return from their shrunken size back to their normal size via on-screen animation, thereby producing the visual effect of rubberbanding.
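The rubberband behavior can be modeled as a clamped mapping from crown overshoot to icon scale. This is a sketch under stated assumptions: the floor scale, the overshoot limit, and the linear mapping are all illustrative choices, not from the disclosure:

```python
# Sketch of the rubberband effect: crown input past the last plane shrinks
# the displayed icons toward a floor scale, and they return to full size
# when the input stops. FLOOR, LIMIT, and linearity are assumptions.

FLOOR = 0.5   # smallest icon scale while overscrolled; illustrative
LIMIT = 1.0   # overshoot (in arbitrary crown units) at which FLOOR is hit

def overscroll_scale(overshoot):
    """Map crown overshoot past the end of the z-axis to an icon scale."""
    t = min(max(overshoot / LIMIT, 0.0), 1.0)  # clamp to [0, 1]
    return 1.0 - (1.0 - FLOOR) * t

assert overscroll_scale(0.0) == 1.0   # no overshoot: normal size
assert overscroll_scale(2.0) == 0.5   # past the limit: fully shrunken
# When crown movement stops, the UI would animate the scale back to 1.0.
```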
[0068] One notable benefit of this logical organization is that different planes of information need not be (but can be) zoomed subsets of one another. That is, for example, planes 908 and 910 can contain entirely different icons out of those available on a personal electronic device, yet the different planes of information can be accessed efficiently by a user.
[0069] Alternatively, screens 200-700 (FIGs. 2-7) can be logically organized as subsets of information belonging to different modal states of a personal electronic device. Under this organization, screens 200 and 300 can correspond to a first and a second modal state of the device, and screens 400-700 can correspond to a third modal state, for example. The personal electronic device can cycle through modal states in response to crown pushes, and can display screens 200 or 300 in the first and second modal states, respectively. In alternative embodiments, modal states may be cycled using buttons 110, 112, or 114. When multiple screens are available within
a particular modal state (e.g., the third modal state), the device can switch from the display of one screen (e.g., 300) to another screen (e.g., 400) based on crown rotation. On-screen user interface elements, such as paging dots, can be used to indicate the availability of additional screens for display within a particular modal state.
[0070] This logical arrangement is illustrated by FIG. 41. As shown, planes 4102 and 4104 can correspond to screens 200 (FIG. 2) and 300 (FIG. 3) respectively. Plane 4106 can represent the collection of icons that represent the operable applications of a personal electronic device. Thus, different viewpoints of plane 4106 can correspond to screens 400-700 (FIGs. 4-7). The particular plane of information (i.e., screen of icons) that is to be displayed on a personal electronic device can be selected via crown movement, such as crown pushes.
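The modal-state arrangement of FIG. 41 can be sketched as follows. State names, screen identifiers, and the cycling order are illustrative assumptions, not from the disclosure:

```python
# Sketch of the modal-state organization of FIG. 41: crown pushes cycle
# among modal states, and crown rotation pages among the screens available
# within the current state. All names here are illustrative.

MODES = {
    "contextual": ["screen200"],
    "favorites": ["screen300"],
    "all_apps": ["screen400", "screen500", "screen600", "screen700"],
}
ORDER = list(MODES)

class ModalUI:
    def __init__(self):
        self.mode_idx = 0
        self.page = 0

    def crown_push(self):
        """Cycle to the next modal state; start at its first screen."""
        self.mode_idx = (self.mode_idx + 1) % len(ORDER)
        self.page = 0

    def crown_rotate(self, direction):
        """Page among the screens available within the current state."""
        screens = MODES[ORDER[self.mode_idx]]
        self.page = max(0, min(len(screens) - 1, self.page + direction))
        return screens[self.page]

ui = ModalUI()
ui.crown_push()
ui.crown_push()                            # reach the third modal state
assert ui.crown_rotate(+1) == "screen500"  # paging dots would show 4 pages
```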
2. Velocity-based crown control [0071] Device 100 (FIG. 1) can consider the angular velocity of rotation of crown 108 (FIG. 1) in determining whether one screen of icons should be replaced with another screen of icons. Specifically, device 100 can require crown 108 to rotate above a predetermined angular velocity before changing the display of one screen of icons to another. In this way, while slow rotations of crown 108 that are unintended by a user can still cause device 100 to receive crown input indicating angular displacement, the displacement need not be interpreted as having sufficient velocity to cause user interface updates that are unintended. The selection of predetermined angular velocities for this purpose can depend on a number of factors, such as the density of icons currently displayed, the visual arrangement of icons currently displayed, and so forth.
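The velocity gate described above can be sketched in a few lines. The threshold value below is an arbitrary placeholder, not a value from the disclosure:

```python
# Sketch of the velocity gate: crown displacement is always received, but a
# screen switch occurs only above a threshold angular velocity, so slow,
# unintended rotations do not change the display. The threshold value is an
# illustrative assumption.

MIN_SWITCH_VELOCITY = 2.0  # radians/second; placeholder value

def should_switch_screens(angular_velocity, threshold=MIN_SWITCH_VELOCITY):
    """Return True when crown rotation is fast enough to change screens."""
    return abs(angular_velocity) >= threshold

assert not should_switch_screens(0.5)  # slow, unintended rotation: ignored
assert should_switch_screens(3.0)      # deliberate rotation: switch screens
```

In practice the threshold could itself be a function of the icon density and arrangement, as the paragraph above notes.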
[0072] In some embodiments, the minimum angular velocity of crown rotation that is necessary to switch between screens of icons corresponds directly to the instantaneous angular velocity of crown 108 (FIG. 1), meaning that the user interface of device 100, in essence, responds when crown 108 reaches a sufficient angular velocity. In some embodiments, the minimum angular velocity of crown rotation necessary for switching between screens of icons is a calculated velocity that is based on, but not directly equal to, the instantaneous (“current”) angular velocity of crown 108. In these embodiments, device 100 can maintain a calculated crown (angular) velocity V in discrete moments in time T according to equation 1:
VT = V(T-1) + ΔVcrown − ΔVdrag (EQ. 1)

[0073] In equation 1, VT represents a calculated crown velocity (speed and direction) at time T, V(T-1) represents the previous velocity (speed and direction) at time T-1, ΔVcrown represents
the change in velocity caused by the force being applied through the rotation of the crown at time T, and ΔVdrag represents the change in velocity due to a drag force. The force being applied, which is reflected through ΔVcrown, can depend on the current velocity of angular rotation of the crown. Thus, ΔVcrown can also depend on the current angular velocity of the crown. In this way, device 100 can provide user interface interactions based not only on instantaneous crown velocity but also based on user input in the form of crown movement over multiple time intervals, even if those intervals are finely divided. Note, typically, in the absence of user input in the form of ΔVcrown, VT will approach (and become) zero based on ΔVdrag in accordance with EQ. 1, but VT would not change signs without user input in the form of crown rotation (ΔVcrown).

[0074] Typically, the greater the velocity of angular rotation of the crown, the greater the value of ΔVcrown will be. However, the actual mapping between the velocity of angular rotation of the crown and ΔVcrown can be varied depending on the desired user interface effect. For example, various linear or non-linear mappings between the velocity of angular rotation of the crown and ΔVcrown can be used. In another example, the mapping can depend on the number of icons and/or icon arrangement currently being displayed.
[0075] Also, ΔVdrag can take on various values. For example, ΔVdrag can depend on the velocity of crown rotation such that at greater velocities, a greater opposing change in velocity (ΔVdrag) can be produced. In another example, ΔVdrag can have a constant value. In yet another example, ΔVdrag can be based on the number of currently displayed icons and/or the currently displayed icon arrangement. It should be appreciated that the above-described behaviors of ΔVcrown and ΔVdrag can be changed to produce desirable user interface effects.
[0076] As can be seen from EQ. 1, the maintained velocity (VT) can continue to increase as long as ΔVcrown is greater than ΔVdrag. Additionally, VT can have non-zero values even when no ΔVcrown input is being received, meaning that user interface screens can continue to change without the user rotating the crown. When this occurs, screens can stop changing based on the maintained velocity at the time the user stops rotating the crown and the ΔVdrag component.
[0077] In some embodiments, when the crown is rotated in a direction corresponding to a rotation direction that is opposite the current user interface changes, the V(T−1) component can be reset to a value of zero, allowing the user to quickly change the direction of the screen changes without having to provide a force sufficient to offset VT.
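The update rule of EQ. 1 can be sketched in code. This is a minimal illustration only: it assumes ΔVcrown is proportional to crown angular velocity and ΔVdrag is proportional to the maintained velocity, and the function and constant names are not from the specification.

```python
# Minimal sketch of EQ. 1: V_T = V_(T-1) + dV_crown - dV_drag.
# Assumed forms: dV_crown proportional to crown angular velocity, dV_drag
# proportional to the maintained velocity, so V_T decays toward zero (but
# never changes sign) in the absence of crown input.

def step_velocity(v_prev, crown_angular_velocity, gain=1.0, drag_coeff=0.1):
    """Advance the maintained UI velocity by one time interval."""
    # Reversal reset: rotation opposing the current motion zeroes V_(T-1),
    # so the user need not supply a force sufficient to offset V_T.
    if crown_angular_velocity * v_prev < 0:
        v_prev = 0.0
    dv_crown = gain * crown_angular_velocity
    dv_drag = drag_coeff * v_prev
    return v_prev + dv_crown - dv_drag
```

With these assumed forms, repeated calls without input decay the velocity toward zero, and an opposing rotation takes effect immediately rather than fighting the accumulated velocity.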
[0078] In other embodiments, different physical crown states other than rotation of the crown are used to navigate through displayed icons.
3. User interface appearance
[0079] Icons can take on various visual appearances. For example, icons can be rectangular in shape, as shown in FIG. 10. As another example, icons can be circular, as shown in FIGs. 2-7.
Further, icons can take on various spatial arrangement schemes, meaning that icons can be arranged along the rows and columns of an invisible grid. Grids can be symmetrical or non-symmetrical. In FIG. 10, a symmetrical grid is used, for example. In FIG. 5, a non-symmetrical grid having x icons arranged on a first row and y icons arranged along a second row is used, for example.
[0080] FIG. 11 illustrates a radial icon arrangement scheme where circular icons are aligned along the circumference of invisible circles 1102 and 1104 of different diameters. Invisible circles 1102 and 1104 are, but need not be, concentric. Icons, such as icon 1106, arranged along different invisible circles can have different sizes. As shown, icons arranged along invisible circle 1102 are closer to the center of device 100 and are larger than those arranged along invisible circle 1104. Also, although not illustrated in FIG. 11, icons in a radial arrangement can be arranged along more than two invisible circles.
[0081] The distance that a particular icon is positioned from the center of the radial icon arrangement can depend on different factors. For example, the distance can be proportional to the frequency of use of the icon; an icon that is used frequently is closer to the center. As another example, the distance can depend on whether an incoming notification has been received for (the application corresponding to) the icon. As another example, the distance can be user-defined, or can be otherwise determined by device 100 (i.e., curated).
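As one hedged illustration of such a scheme, the sketch below places the most frequently used icons on an inner invisible circle and the remainder on an outer one. The ring radii, the number of slots per ring, and the function name are assumptions, not values from the specification.

```python
import math

# Illustrative radial arrangement: more frequently used icons are placed on
# the inner invisible circle, nearer the display center.

def radial_positions(frequencies, inner_r=40, outer_r=80, per_ring=8):
    """Return {icon_index: (x, y)} offsets from the display center."""
    # Most-used icons first, so they claim the inner-ring slots.
    order = sorted(range(len(frequencies)), key=lambda i: -frequencies[i])
    positions = {}
    for rank, icon in enumerate(order):
        ring, slot = divmod(rank, per_ring)
        radius = inner_r if ring == 0 else outer_r  # inner ring = frequent icons
        angle = 2 * math.pi * slot / per_ring
        positions[icon] = (radius * math.cos(angle), radius * math.sin(angle))
    return positions
```

The same skeleton could weight the radius by notification state or a curated relevance score instead of raw frequency.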
[0082] FIG. 25A illustrates an arrangement of icons into icon groups. On grid 2502, four groups of icons, including icon group 2512, are displayed. In response to a touch input, such as a finger tap at touchscreen location 2514 on group 2512, the icons within group 2512 can be displayed in enlarged form. In grid 2506, the icons within group 2512, including icon 2516, are displayed in enlarged form. FIG. 25B illustrates an arrangement of application functionalities into groups. On grid 2508, as discussed above, the four icons of icon group 2512 are displayed. A selection of icon 2516 (e.g., via finger tap 2518) can cause a group of functions 2520 provided by application 2510 (which corresponds to icon 2516) to be displayed.
[0083] The size and shape of icon groups can be organic or defined. Icon groups that are defined, such as icon group 2512 in grid 2502 (FIG. 25A), share a predefined group size and group shape. Organic icon groups, shown in FIG. 42, can be of a user-defined group size and/or group shape. For example, icon groups 4204 and 4206 in grid 4202 are of different user-defined shapes and sizes. In some embodiments, organic icon groups are defined using software running on a computer external to the personal electronic device and downloaded onto the personal electronic device.
[0084] FIG. 30 illustrates an icon arrangement scheme where icons are arranged similar to pages of a rolodex. Pages of exemplary rolodex 3002 can flip in response to crown rotation. For example, page (icon) 3004 can flip downward onto page (icon) 3006 in response to a crown rotation.
[0085] FIG. 31 illustrates an icon arrangement scheme where icons are arranged on the outer circumference of a spinning dial. Exemplary spinning dial 3102 can spin in response to crown rotation. For example, a crown rotation in direction 3104 can cause dial 3102 to spin in the same direction (3106). Also, a crown push (or pull) can change the number of columns in 3102, allowing the icons of the remaining columns to be enlarged and/or to have increased fidelity.
[0086] FIG. 32 illustrates an icon arrangement scheme in the form of a thumbnailed list 3202. Icon 3204 within exemplary thumbnailed list 3202 can have corresponding thumbnail 3206. The icons of thumbnailed list 3202 can be traversed via crown rotation. A specific icon, such as icon
3204, can be selected directly for display by touching corresponding thumbnail 3206.
[0087] FIG. 33 illustrates an arrangement scheme where icons are aligned with the surface of an invisible sphere or polyhedron. Icons on the foreground surface of the invisible sphere, such as icon 3302, can be displayed. Icons on the far side of the invisible sphere’s surface are not displayed. The invisible sphere can rotate in response to crown rotation and/or touchscreen input, thereby changing the specific icons that are displayed.
[0088] During operation, device 100 (FIG. 1) can use one or more of the icon arrangement schemes described above. The particular arrangement(s) used by device 100 can be user-selected and/or system-selected. That is, a user may be permitted to identify one or more preferred arrangements for display. Also, arrangements can be selected by device 100 based on criteria such as the total number of applications installed on the device, the number of frequently accessed icons, and so forth.
[0089] Further, the specific ordering and placement of icons within a particular icon arrangement scheme can be user-selected and/or system-selected. For example, a user can be permitted to specify the position of an icon on a given screen. Also, icon placement can be determined by device 100 (i.e., curated) based on criteria such as the frequency of use of particular icons, a calculated relevance, and so forth.
4. Responses to user input
[0090] Displayed icons can respond to user input. FIGs. 12-14 illustrate a rearrangement of displayed icons in response to crown rotation. In FIG. 12, nine icons are displayed along a 3-by-3 symmetric grid 1202. Icon 1204 is displayed in the top-right position of grid 1202. As discussed above with respect to FIGs. 4-7, a rotation of crown 108 can cause device 100 to reduce the number of displayed icons. For example, a rotation of crown 108 can cause device 100 to display a 2-by-2 grid, thereby reducing the number of displayed icons. FIG. 13 illustrates an exemplary transition to a 2-by-2 grid in response to a crown rotation in direction 1302. As shown, in response to crown rotation 1302, icon 1204 is translated visibly on-screen from its top-right position in the 3-by-3 grid of FIG. 12 to its new position in the 2-by-2 grid to be displayed. Specifically, as shown in FIG. 14, icon 1204 is translated to the lower-left corner of 2-by-2 grid 1402. Further, icons that are to remain displayed in the 2-by-2 grid after the transition from grid 1202 are enlarged and positioned into the 2-by-2 grid 1402.
[0091] FIGs. 15-17 illustrate another rearrangement of icons in response to crown rotation.
In FIG. 15, nine icons are displayed along a 3-by-3 symmetric grid 1502. Icon 1504 is displayed in the top-right position of grid 1502. As shown in FIG. 16, in response to crown rotation 1602, icon 1504 is translated off-screen from its position in grid 1502 (FIG. 15) while it is translated into its new position in the 2-by-2 grid to be displayed. Put another way, during the transition illustrated by FIG. 16, icon 1504 can be split into two portions that are displayed in two separate, non-abutting positions of the touchscreen of device 100. More specifically, while one portion of icon 1504 remains partially displayed in the top-right corner as icon 1504 is translated off-screen, the remaining portion of 1504 is partially displayed in the lower-left corner as it is translated on-screen. As shown in FIG. 17, icon 1504 is translated to the lower-left corner of 2-by-2 grid 1702. Further, icons that are to remain displayed in the 2-by-2 grid after the transition from grid 1502 are enlarged and positioned into the 2-by-2 grid 1702.
[0092] FIGs. 18-20 illustrate another rearrangement of icons in response to crown rotation.
In FIG. 18, nine icons are displayed along a 3-by-3 symmetric grid 1802. As shown in FIG. 19,
in response to crown rotation 1902, the icons along the right and bottom boundaries of grid 1802 (FIG. 18) are removed from display while the remaining icons are enlarged. The remaining icons are displayed enlarged as shown in grid 2002 of FIG. 20.
[0093] It should be noted that in the exemplary screens shown in FIGs. 12-20, the icon displayed in the upper-left corner (i.e., marked “A”) is anchored, meaning that the above-described transitions do not cause the icon to move away from the upper-left corner. It is possible, however, to unanchor such an icon through user input, as discussed below.
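The reading-order relationship in FIGs. 12-17 — the anchored icon “A” stays top-left while the 3-by-3 grid's top-right icon becomes the lower-left icon of the 2-by-2 grid — can be sketched as a re-layout of the leading icons. The function name and list-of-lists representation are assumptions for illustration.

```python
# Sketch of the anchored zoom step: keep the first new_n * new_n icons, in
# reading order, and lay them out on a smaller new_n-by-new_n grid. The
# top-left icon is naturally anchored, and the old top-right icon (reading
# index 2 of a 3-by-3 grid) lands at the lower-left of a 2-by-2 grid.

def zoom_grid(icons, new_n):
    """Re-lay out the leading icons of a flat reading-order list."""
    kept = icons[:new_n * new_n]
    return [kept[r * new_n:(r + 1) * new_n] for r in range(new_n)]
```

Applying this to the nine icons A through I with `new_n=2` keeps A anchored top-left and moves C, the former top-right icon, to the lower-left, matching the FIG. 12-14 transition.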
[0094] FIG. 21 illustrates a rearrangement of icons in response to touchscreen input. As shown, icon 2106 is displayed in the bottom row of 4-by-4 grid 2102. In response to a finger tap 2104 on icon 2106, 3-by-3 grid 2108 is displayed with icon 2106 enlarged in the center.
Notably, the icon marked “A,” which is displayed in grid 2102, is no longer displayed in grid 2108. FIG. 21 also illustrates an update of displayed icons in response to crown rotation. Specifically, in response to crown rotation 2110, icon 2106 is further enlarged and becomes the only icon displayed on-screen.
[0095] FIG. 22 illustrates a rearrangement of icons in response to movement of device 100.
Device movement can be detected using one or more sensors, for example, a gyroscope. As shown, various icons are displayed in grid 2202. In response to tilting of device 100 in direction 2204, the displayed icons are translated in direction 2206, resulting in the display of different icons in grid 2208. Specifically, in response to the leftward tilting of device 100 in direction
2204, the icons of grid 2202 translate in the left direction 2206. In some embodiments, the translation may be incremental such that a single row or column transitions off the display as a single row or column transitions onto the display. Alternatively, a whole screen of icons may transition off as a completely new set of icons transitions onto the display.
[0096] FIG. 23 illustrates a change in icon appearance in response to touchscreen input. As shown, in response to a touch at location 2304, icon 2306 becomes enlarged. Notably, icon 2306 is not located at location 2304; rather, icon 2306 (in its unenlarged state) is in row 2310, above touch location 2304, which is along row 2312. In this way, user visibility of icon 2306 is improved both because the icon is enlarged and because the icon is not blocked from view by the potentially opaque object that is touching device 100. It should be noted that more than one icon can be enlarged in response to a nearby touch. Multiple icons can be enlarged at different levels of magnification, inversely proportional to the distance between each icon being enlarged and the touch location.
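A minimal sketch of that inverse-distance magnification follows. The falloff constant, the maximum scale, and the function name are assumptions; the specification fixes only the qualitative relationship.

```python
import math

# Proximity-based enlargement: an icon's scale factor falls off with its
# distance from the touch location, so nearby icons are magnified most.

def magnification(icon_xy, touch_xy, max_scale=2.0, falloff=50.0):
    """Scale factor for one icon, inversely related to distance from the touch."""
    dist = math.hypot(icon_xy[0] - touch_xy[0], icon_xy[1] - touch_xy[1])
    # At dist == 0 this yields max_scale; it decays smoothly toward 1.0.
    return 1.0 + (max_scale - 1.0) * falloff / (falloff + dist)
```

Applied to every on-screen icon, this produces the fisheye-like effect in which several icons near the touch enlarge at once, each by a different amount.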
[0097] FIG. 40 illustrates icon movements that account for physical interaction between nearby icons. As shown, grid 4002 includes a number of icons arranged in a radial arrangement. In response to a touch input at location 4010, a number of icons are enlarged to different levels of magnification. Notably, the enlarging of icon 4004 can cause adjacent icons 4006 and 4008 to move away from icon 4004 so the icons do not block each other from view.
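That neighbor interaction can be sketched as a repulsion along the line between icon centers: when the enlarged icon would overlap a neighbor, the neighbor is pushed outward just far enough to stay clear. The radii, the padding amount, and the names are assumptions for illustration.

```python
import math

# Sketch of the FIG. 40 neighbor interaction: push an adjacent icon away
# from an enlarged icon along the line between their centers, so neither
# blocks the other from view.

def repel(center, neighbor, enlarged_radius, neighbor_radius, pad=1.0):
    """Return neighbor's new position, moved away from center if they overlap."""
    dx, dy = neighbor[0] - center[0], neighbor[1] - center[1]
    dist = math.hypot(dx, dy) or 1.0          # avoid dividing by zero
    needed = enlarged_radius + neighbor_radius + pad
    if dist >= needed:
        return neighbor                        # already clear; no movement
    scale = needed / dist
    return (center[0] + dx * scale, center[1] + dy * scale)
```

Running this for each neighbor of the enlarged icon reproduces the effect described for icons 4006 and 4008 moving away from icon 4004.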
[0098] FIG. 24 illustrates icon movements that account for interaction between icons and grid boundaries. As shown, a number of icons are displayed according to non-symmetrical grid 2402. The displayed icons include uncompressed icons 2408. In response to touch input in the form of a rightward gesture in direction 2404, icons on the right boundary of grid 2402 can be compressed into compressed icons 2406 so that icons from the left side of grid 2402 are more predominately displayed either in enlarged or unenlarged form. Also, in response to a touch gesture in the leftward direction 2406, icons that are on the left boundary of grid 2402 can be compressed into compressed icons 2412 so that icons from the right side of grid 2402 are more predominately displayed. The above-described interaction allows all, or substantially all, icons to be simultaneously displayed while allowing a user to easily view and select an icon. Note that this compression may occur in a symmetrical grid, although not shown.
[0099] FIG. 34 illustrates icon movements that account for interaction between grid boundaries and nearby icons. In the radial arrangement of FIG. 34, icons are arranged between invisible inner circle 3402 and invisible outer boundary circle 3400. Outer circle 3400 can be sized based on the physical size of the touchscreen of device 100. Inner circle 3402 can be sized based on design and/or user preferences. Inner circle 3402 can also be sized based on user input, such as a crown rotation. Inner circle 3402 can respond to touchscreen input within its surface area. For example, a touch down that occurs within the surface area of inner circle 3402 and subsequent touch movement can be interpreted as panning of inner circle 3402. When inner circle 3402 is panned, the icons that are arranged between inner circle 3402 and outer circle 3400, such as icons 3404 and 3408, can be resized based on the available spacing between inner circle 3402 and outer circle 3400, the number of icons being displayed, and the sizes of adjacent icons. For example, in response to the rightward panning of circle 3402, icon 3404 can increase in size, and the enlarging of icon 3404 can cause icon 3408 to decrease in size.
[00100] Note that, in the absence of user input, displayed icons can be programmed to move on-screen to prevent screen burn-in. Also, icon arrangements can respond to multi-touch gestures. For example, a two-finger downward gesture on the touchscreen of device 100 (FIG. 1) can cause the display of system information such as a status bar. As another example, a two-finger
gesture in which the two fingers move in opposite directions can configure device 100 (FIG. 1) for left-handed or right-handed use.
5. Additional features
[00101] Turning back to FIG. 2, home screen 200 can display system-generated information such as alerts. For example, home screen 200 can display a reminder that the user has sat for an extended duration and that exercise is in order. Also, screen 200 can display a suggestion for rest because the user has a busy calendar for the next morning. Turning back to FIG. 3, screen 300 can be displayed when device 100 is coupled with a dock.
[00102] FIG. 26 illustrates the use of wallpaper 2602 to aid user navigation in a grid of icons.
As shown, grid 2600 has a relatively large number of icons. In response to crown rotation 2604, a subset of the icons from grid 2600 is enlarged and displayed in grid 2606. In addition, the corresponding portion of wallpaper 2602 displayed in the background of the subset is also displayed, meaning that, for example, if icons from the upper-left quadrant of grid 2600 become displayed in grid 2606, then the upper-left quadrant of wallpaper 2602 is also displayed with grid
2606. Also as shown, in response to a touch gesture in leftward direction 2608, device 100 can display another subset of icons from grid 2600. For example, in grid 2610, icons from the upper-right quadrant of grid 2600 are displayed together with the upper-right quadrant of wallpaper 2602. In this way, a user can determine the relationship of a set of currently displayed icons relative to the totality of icons available for display on device 100.
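The quadrant correspondence described for FIG. 26 can be sketched as a crop calculation: the wallpaper region shown behind a subset of icons is the same quadrant of the full wallpaper that the subset occupies in the full grid. The quadrant keys and function name are illustrative assumptions.

```python
# Sketch of the wallpaper navigation aid: return the wallpaper rectangle
# that corresponds to the quadrant of icons currently displayed.

def wallpaper_crop(quadrant, wallpaper_w, wallpaper_h):
    """Return the (left, top, right, bottom) wallpaper rectangle for a quadrant."""
    cols = {"upper-left": 0, "upper-right": 1, "lower-left": 0, "lower-right": 1}
    rows = {"upper-left": 0, "upper-right": 0, "lower-left": 1, "lower-right": 1}
    half_w, half_h = wallpaper_w // 2, wallpaper_h // 2
    left = cols[quadrant] * half_w
    top = rows[quadrant] * half_h
    return (left, top, left + half_w, top + half_h)
```

Because the crop tracks the displayed subset, the background itself tells the user where the current screen sits within the full set of icons.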
[00103] FIG. 27 illustrates an exemplary arrangement of icons where the arrangement provides information, for example current time information, to a user. The arrangement can be displayed in response to crown movement. Also, the arrangement can be displayed after a predetermined period of user input inactivity. For example, screen 2702, which uses icons in small sizes to show the current time, can be displayed after a predetermined period of user input inactivity. Further, in response to a crown rotation, screen 2702 can transition through screens 2704 and 2706 to screen 2708, which shows a grid of icons.
[00104] FIG. 28 illustrates an exemplary arrangement of icons (grid 2802) where the color and/or intensity of displayed icons can change in response to incoming information. For example, icon 2804 corresponding to a messaging application can blink or glow when a new message arrives. In some embodiments, the blink or glow can correspond to the popularity of an application in an application store or frequency of use of the application in a larger ecosystem of
users. Further, the icons of grid 2802 can show icons representing a larger set of applications available in an application store, beyond those applications that are installed.

[00105] FIG. 29 illustrates an exemplary display of a contextual message. A contextual message can be displayed in response to detection of a user’s touch of crown 108. A contextual message indicates the current functionality of crown 108, which can take on different functions depending on the application that is currently operating in the foreground of device 100. For example, when a music application is operating in the foreground of device 100, a touch on crown 108 can result in the display of contextual message 2902 in the form of a volume indicator, which can indicate to a user that the current functionality of crown 108 is volume control.
[00106] FIG. 35 depicts exemplary process 3500 for providing the user interface techniques described above. At block 3510, input based on crown movement and/or crown touch is received. The crown movement can be a rotation, a push, and/or a pull. At block 3520, a decision is made based on the type of crown movement represented by the received input. If the received input represents a crown rotation, processing proceeds to block 3530. If the received input represents a crown push or pull, processing proceeds to block 3550. If the received input represents a crown touch (without a rotation or a push/pull), processing proceeds to block 3560. At block 3530, the currently displayed screen and its corresponding position along z-axis 906 (FIG. 9) can be determined. In addition, an adjacent level of information along the z-axis 906 can be determined. The adjacent level can be determined based on the direction of the crown rotation that is represented by the received input. A corresponding grid of icons, such as those illustrated by each of FIGs. 4-7, can be displayed. At block 3550, a home screen, such as the exemplary screen 200 of FIG. 2, can be displayed. In the alternative, a user-favorites screen, such as the exemplary screen 300 of FIG. 3, can be displayed. At block 3560, a contextual message, such as the exemplary contextual message 2902 of FIG. 29, can be displayed.
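The branching of process 3500 can be sketched as a simple dispatcher. The event representation and handler names are placeholders, not from the specification; each branch comment maps back to the block numbers above.

```python
# Sketch of the process 3500 dispatch (FIG. 35): branch on the kind of
# crown input received at block 3510/3520.

def handle_crown_input(event):
    """Route a crown event to the matching UI response."""
    if event["type"] == "rotation":
        # Block 3530: determine the current z-axis level and its neighbor in
        # the direction of rotation, then display that grid of icons.
        return "show_adjacent_grid"
    if event["type"] in ("push", "pull"):
        # Block 3550: display the home screen (or a user-favorites screen).
        return "show_home_screen"
    if event["type"] == "touch":
        # Block 3560: display a contextual message for the crown's function.
        return "show_contextual_message"
    return "ignore"
```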
[00107] FIG. 36 depicts exemplary computing system 3600 for providing the user interface techniques described above. In some embodiments, computing system 3600 can form device 100. As shown, computing system 3600 can have bus 3602 that connects I/O section 3604, one or more computer processors 3606, and a memory section 3608 together. Memory section 3608 can contain computer-executable instructions and/or data for carrying out the above-described techniques, including process 3500 (FIG. 35). I/O section 3604 can be connected to display 3610, which can have touch-sensitive component 3612. I/O section 3604 can be connected to crown 3614. I/O section 3604 can be connected to input device 3616, which may include
buttons. I/O section 3604 can be connected to communication unit 3618, which can provide WiFi, Bluetooth, and/or cellular features, for example. I/O section 3604 can be connected to sensor pack 3620, which can have a gyroscope, a GPS sensor, a light sensor, an accelerometer, and/or a combination thereof. Note that one or more of the above-described components can be part of a system-on-a-chip.
[00108] Memory section 3608 of computing system 3600 can be a non-transitory computer readable storage medium, for storing computer-executable instructions, which, when executed by one or more computer processors 3606, for example, can cause the computer processors to perform the user interface techniques described above, including process 3500 (FIG. 35). The computer-executable instructions can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For purposes of this document, a “non-transitory computer readable storage medium” can be any medium that can contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based on CD, DVD, or Blu-ray technologies, as well as RAM, ROM,
EPROM, flash memory, and solid-state memory.
[00109] Computing system 3600 is not limited to the components and configuration of FIG. 36, but can include other or additional components in multiple configurations. In some embodiments, system 3600 can form personal electronic device 3700, which is a tablet, as shown in FIG. 37. In some embodiments, computing system 3600 can form personal electronic device
3800, which is a mobile phone, as shown in FIG. 38. In some embodiments, computing system
3600 can form personal electronic device 3900, which is a portable music device, as shown in FIG. 39.
[00110] Although the disclosure and examples have been fully described with reference to the accompanying figures, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the appended claims.


CLAIMS

WHAT IS CLAIMED IS:
1. A computer-implemented method comprising:
displaying a first plurality of icons on a touch-sensitive display of a wearable electronic device;
receiving input based on a movement of a physical crown of the wearable electronic device; and in response to the received input, replacing the first plurality of icons with a second
plurality of icons on the touch-sensitive display, wherein the second plurality of icons is a subset of the first plurality of icons.
2. The method of claim 1, wherein the wearable electronic device is a watch.
3. The method of claim 1, wherein the movement is a push or pull on the physical crown.
4. The method of claim 1, wherein the movement is a rotation of the physical crown.
5. The method of claim 4, wherein the rotation is in a first rotation direction.
6. The method of claim 4, wherein the rotation exceeds a predetermined angular velocity threshold.
7. The method of claim 1, wherein the physical crown comprises a capacitive touch sensor
configured to sense a touch input, and wherein the received input is further based on the touch input on the physical crown.
8. The method of claim 1, wherein a first icon of the first plurality of icons is associated with an application, and a second icon of the second plurality of icons is associated with the
same application, the method further comprising:
displaying the second icon with information regarding the application, when the second plurality of icons is displayed, and displaying the first icon with different information regarding the application, when the first plurality of icons is displayed.
9. The method of claim 8, wherein a first density of information is associated with the first plurality of icons and a second density of information is associated with the second plurality of icons.
10. The method of claim 1, wherein the received input is a first received input, the method further comprising:
receiving a second input based on a second movement of the physical crown; and in response to the second received input, replacing the second plurality of icons with the first plurality of icons on the touch-sensitive display.
11. The method of claim 10, wherein the second movement is a rotation of the physical crown in a second direction opposite the first direction.
12. The method of claim 10, wherein the second movement is a push or pull of the physical crown.
13. The method of claim 1, wherein the received input is a first received input, the method further comprising:
receiving a second input based on a second movement of the physical crown, wherein the 0 second movement is a rotation in the first rotation direction; and in response to the second received input, replacing the display of the second plurality of icons with a third plurality of icons, wherein the third plurality of icons is a subset of the second plurality of icons.
14. The method of claim 1, wherein the received input is a first received input, the method further comprising:
receiving a second input based on a second movement of the physical crown, wherein the second movement is a rotation in the first direction; and in response to the second received input, launching an application associated with an icon of the second plurality of icons.
15. The method of claim 1, the method further comprising:
receiving information representing an activity in an application, wherein the application corresponds to a displayed icon;
in response to the received information, altering the appearance of the displayed icon.
16. The method of claim 15, wherein the altering is one or more of blinking, changing color, and animating.
17. The method of claim 1, wherein the top-right icon of the first plurality of icons is not displayed in the second display.
18. The method of claim 1, wherein the top-right icon of the first plurality of icons is the leftmost icon in the second top-most row in the second display.
19. The method of claim 18, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
translating an icon of the first plurality of icons, from a first position of the touch-sensitive display, to a second position of the touch-sensitive display,
wherein the icon is displayed, in whole, on the touch-sensitive display during the translating.
20. The method of claim 18, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
displaying, at a first position of the touch-sensitive display, only a portion of an icon of the first plurality of icons; and displaying the remaining portion of the icon in a second position of the touch-sensitive display, wherein the first position and the second position are separate.
21. The method of claim 1, wherein the physical crown is a mechanical crown.
22. The method of claim 1, further comprising:
detecting a force applied to the touch-sensitive display; and
replacing the first display based on the detected force.
23. A non-transitory computer-readable storage medium having computer-executable instructions which, when executed by one or more computer processors, cause the one or more computer processors to:
displaying a first plurality of icons on a touch-sensitive display of a wearable electronic device;
receiving input based on a movement of a physical crown of the wearable electronic device; and
in response to the received input, replacing the first plurality of icons with a second plurality of icons on the touch-sensitive display, wherein the second plurality of icons is a subset of the first plurality of icons.
24. The non-transitory computer-readable storage medium of claim 23, wherein the wearable electronic device is a watch.
25. The non-transitory computer-readable storage medium of claim 23, wherein the movement is a push or pull on the physical crown.
26. The non-transitory computer-readable storage medium of claim 23, wherein the movement is a rotation of the physical crown.
27. The non-transitory computer-readable storage medium of claim 26, wherein the rotation is in a first rotation direction.
28. The non-transitory computer-readable storage medium of claim 26, wherein the rotation exceeds a predetermined angular velocity threshold.
29. The non-transitory computer-readable storage medium of claim 23, wherein the physical crown comprises a capacitive touch sensor configured to sense a touch input, and wherein the received input is further based on the touch input on the physical crown.
30. The non-transitory computer-readable storage medium of claim 23, wherein a first icon of the first plurality of icons is associated with an application, and a second icon of the second
plurality of icons is associated with the same application, the non-transitory computer-readable storage medium further comprising computer-executable instructions for:
displaying the second icon with information regarding the application, when the second plurality of icons is displayed, and displaying the first icon with different information regarding the application, when the
first plurality of icons is displayed.
31. The non-transitory computer-readable storage medium of claim 23, wherein a first density of information is associated with the first plurality of icons and a second density of information is associated with the second plurality of icons.
32. The non-transitory computer-readable storage medium of claim 23, wherein the received input is a first received input, the computer-executable instructions further comprising instructions for:
receiving a second input based on a second movement of the physical crown; and
in response to the second received input, replacing the second plurality of icons with the first plurality of icons on the touch-sensitive display.
33. The non-transitory computer-readable storage medium of claim 32, wherein the second movement is a rotation of the physical crown in a second direction opposite the first direction.
34. The non-transitory computer-readable storage medium of claim 32, wherein the second movement is a push or pull of the physical crown.
35. The non-transitory computer-readable storage medium of claim 23, wherein the received input is a first received input, the non-transitory computer-readable storage medium further comprising computer-executable instructions for:
receiving a second input based on a second movement of the physical crown, wherein the second movement is a rotation in the first rotation direction; and
in response to the second received input, replacing the display of the second plurality of icons with a third plurality of icons, wherein the third plurality of icons is a subset of the second plurality of icons.
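Claims 23, 32, 33, and 35 together describe a hierarchy of icon sets in which each crown rotation in one direction replaces the displayed icons with a subset, and rotation in the opposite direction restores the prior set. The following is an illustrative sketch of that behavior only, not the patented implementation; all names (`IconBrowser`, `CLOCKWISE`, the example icon sets) are hypothetical.

```python
# Hypothetical model of crown-driven icon replacement (claims 23, 32-33, 35).
CLOCKWISE = +1
COUNTERCLOCKWISE = -1

class IconBrowser:
    def __init__(self, levels):
        # levels[0] is the full icon set; each later level is a subset
        # of the one before it (a "zoomed-in" plurality of icons).
        self.levels = levels
        self.depth = 0

    def on_crown_rotate(self, direction):
        """Replace the displayed icons based on the rotation direction."""
        if direction == CLOCKWISE and self.depth < len(self.levels) - 1:
            self.depth += 1   # zoom in: display a subset (claims 23, 35)
        elif direction == COUNTERCLOCKWISE and self.depth > 0:
            self.depth -= 1   # zoom out: restore prior set (claims 32-33)
        return self.levels[self.depth]

browser = IconBrowser([
    ["mail", "maps", "music", "photos"],   # first plurality of icons
    ["mail", "maps"],                      # second plurality (a subset)
    ["mail"],                              # third plurality (subset of second)
])
assert browser.on_crown_rotate(CLOCKWISE) == ["mail", "maps"]
assert browser.on_crown_rotate(CLOCKWISE) == ["mail"]
assert browser.on_crown_rotate(COUNTERCLOCKWISE) == ["mail", "maps"]
```

Modeling the icon sets as an explicit stack of levels makes the subset relationship between each successive plurality easy to verify.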
36. The non-transitory computer-readable storage medium of claim 23, wherein the received input is a first received input, the non-transitory computer-readable storage medium further comprising computer-executable instructions for:
receiving a second input based on a second movement of the physical crown, wherein the second movement is a rotation in the first direction; and
in response to the second received input, launching an application associated with an icon of the second plurality of icons.
37. The non-transitory computer-readable storage medium of claim 23, further comprising computer-executable instructions for:
receiving information representing an activity in an application, wherein the application corresponds to a displayed icon;
in response to the received information, altering the appearance of the displayed icon.
38. The non-transitory computer-readable storage medium of claim 37, wherein the altering is one or more of blinking, changing color, and animating.
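Claims 37 and 38 describe altering an icon's appearance (blinking, changing color, or animating) in response to activity in the corresponding application. A minimal sketch of that idea follows; the function and the appearance states are assumptions for illustration, not the claimed implementation.

```python
# Hypothetical sketch of claims 37-38: activity in an application alters
# the appearance of its displayed icon.
ALTERATIONS = {"blink", "color_change", "animate"}

def alter_icon(icons, app, alteration):
    """Return a new icon-state map with the app's icon marked as altered."""
    if alteration not in ALTERATIONS:
        raise ValueError(f"unsupported alteration: {alteration}")
    updated = dict(icons)   # leave the other icons unchanged
    updated[app] = alteration
    return updated

icons = {"mail": "normal", "maps": "normal"}
icons = alter_icon(icons, "mail", "blink")   # activity arrives in the mail app
assert icons["mail"] == "blink" and icons["maps"] == "normal"
```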
39. The non-transitory computer-readable storage medium of claim 23, wherein the top-right icon of the first plurality of icons is not displayed in the second display.
40. The non-transitory computer-readable storage medium of claim 23, wherein the top-right icon of the first plurality of icons is the left-most icon in the second top-most row in the second display.
41. The non-transitory computer-readable storage medium of claim 40, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
translating an icon of the first plurality of icons, from a first position of the touch-sensitive display, to a second position of the touch-sensitive display,
wherein the icon is displayed, in whole, on the touch-sensitive display during the translating.
42. The non-transitory computer-readable storage medium of claim 40, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
displaying, at a first position of the touch-sensitive display, only a portion of an icon of the first plurality of icons; and
displaying the remaining portion of the icon in a second position of the touch-sensitive display,
wherein the first position and the second position are separate.
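Claims 41 and 42 distinguish two transition behaviors: an icon that translates wholly on-screen, and an icon shown split across two separate positions. One plausible reading of the split case is an icon wrapping around a screen edge during the transition; the sketch below models that reading, with the screen width and coordinates chosen purely for illustration.

```python
# Hypothetical geometry for claims 41-42: during the transition between
# icon sets, an icon is either displayed in whole at one position
# (claim 41) or split into two separate on-screen portions (claim 42).
SCREEN_WIDTH = 100  # illustrative display width in points

def visible_portions(icon_left, icon_width):
    """Return the on-screen (left, right) spans occupied by the icon."""
    right = icon_left + icon_width
    if right <= SCREEN_WIDTH:
        # Claim 41 case: the icon is displayed, in whole, at one position.
        return [(icon_left, right)]
    # Claim 42 case: one portion at the right edge, the remaining
    # portion wrapped to a separate position at the left edge.
    return [(icon_left, SCREEN_WIDTH), (0, right - SCREEN_WIDTH)]

assert visible_portions(10, 20) == [(10, 30)]             # wholly on-screen
assert visible_portions(90, 20) == [(90, 100), (0, 10)]   # split display
```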
43. The non-transitory computer-readable storage medium of claim 23, wherein the physical crown is a mechanical crown.
44. The non-transitory computer-readable storage medium of claim 23, further comprising computer-executable instructions for:
detecting a force applied to the touch-sensitive display; and
replacing the first display based on the detected force.
45. An electronic device comprising:
one or more processors;
a physical crown operatively coupled to the one or more processors; and
a touch-sensitive display operatively coupled to the one or more processors,
the one or more processors configured to:
display a first plurality of icons on a touch-sensitive display of a wearable electronic device;
receive input based on a movement of a physical crown of the wearable electronic device; and
in response to the received input, replace the first plurality of icons with a second plurality of icons on the touch-sensitive display, wherein the second plurality of icons is a subset of the first plurality of icons.
46. The device of claim 45, wherein the electronic device is a watch.
47. The device of claim 45, wherein the movement is a push or pull of the physical crown.
48. The device of claim 45, wherein the movement is a rotation of the physical crown.
49. The device of claim 48, wherein the rotation is in a first rotation direction.
50. The device of claim 48, wherein the rotation exceeds a predetermined angular velocity threshold.
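Claim 50 conditions the input on the rotation exceeding a predetermined angular velocity threshold. A minimal sketch of such a gate follows; the threshold value and function names are assumptions for illustration, not values from the patent.

```python
# Hypothetical velocity gate for claim 50: a crown rotation counts as
# input only when its angular velocity exceeds a predetermined threshold.
ANGULAR_VELOCITY_THRESHOLD = 0.5  # radians/second, illustrative value

def crown_rotation_registers(delta_angle, delta_time):
    """Return True if the rotation is fast enough to register as input."""
    if delta_time <= 0:
        return False  # no elapsed time: no measurable velocity
    angular_velocity = abs(delta_angle) / delta_time
    return angular_velocity > ANGULAR_VELOCITY_THRESHOLD

assert crown_rotation_registers(0.6, 1.0)        # fast turn: registers
assert not crown_rotation_registers(0.1, 1.0)    # slow drift: ignored
```

Thresholding on velocity rather than raw angle lets slow, incidental crown movement be ignored while a deliberate flick triggers the display change.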
51. The device of claim 45, wherein the physical crown comprises a capacitive touch sensor configured to sense a touch input, and wherein the received input is further based on the touch input on the physical crown.
WO 2015/034965
PCT/US2014/053957
2019206101 18 Jul 2019
52. The device of claim 45, wherein a first icon of the first plurality of icons is associated with an application, and a second icon of the second plurality of icons is associated with the same application, the one or more processors further configured to:
display the second icon with information regarding the application, when the second plurality of icons is displayed, and
display the first icon with different information regarding the application, when the first plurality of icons is displayed.
53. The device of claim 45, wherein a first density of information is associated with the first plurality of icons and a second density of information is associated with the second plurality of icons.
54. The device of claim 45, wherein the received input is a first received input, the one or more processors further configured to:
receive a second input based on a second movement of the physical crown; and
in response to the second received input, replace the second plurality of icons with the first plurality of icons on the touch-sensitive display.
55. The device of claim 54, wherein the second movement is a rotation of the physical crown in a second direction opposite the first direction.
56. The device of claim 54, wherein the second movement is a push or pull of the physical crown.
57. The device of claim 45, wherein the received input is a first received input, the one or more processors further configured to:
receive a second input based on a second movement of the physical crown, wherein the second movement is a rotation in the first rotation direction; and
in response to the second received input, replace the display of the second plurality of icons with a third plurality of icons, wherein the third plurality of icons is a subset of the second plurality of icons.
58. The device of claim 45, wherein the received input is a first received input, the one or more processors further configured to:
receive a second input based on a second movement of the physical crown, wherein the second movement is a rotation in the first direction; and
in response to the second received input, launch an application associated with an icon of the second plurality of icons.
59. The device of claim 45, the one or more processors further configured to:
receive information representing an activity in an application, wherein the application corresponds to a displayed icon;
in response to the received information, alter the appearance of the displayed icon.
60. The device of claim 59, wherein the altering is one or more of blinking, changing color, and animating.
61. The device of claim 45, wherein the top-right icon of the first plurality of icons is not displayed in the second display.
62. The device of claim 45, wherein the top-right icon of the first plurality of icons is the left-most icon in the second top-most row in the second display.
63. The device of claim 62, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
translating an icon of the first plurality of icons, from a first position of the touch-sensitive display, to a second position of the touch-sensitive display,
wherein the icon is displayed, in whole, on the touch-sensitive display during the translating.
64. The device of claim 62, wherein the replacing of the display of the first plurality of icons to the second plurality of icons comprises:
displaying, at a first position of the touch-sensitive display, only a portion of an icon of
the first plurality of icons; and
displaying the remaining portion of the icon in a second position of the touch-sensitive display,
wherein the first position and the second position are separate.
65. The device of claim 45, wherein the physical crown is a mechanical crown.
66. The device of claim 45, further comprising one or more sensors for detecting a force applied to the touch-sensitive display.
67. The device of claim 45, further comprising a sensor for detecting a touch on the physical crown.
AU2019206101A 2013-09-03 2019-07-18 User interface for manipulating user interface objects Active AU2019206101B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AU2019206101A AU2019206101B2 (en) 2013-09-03 2019-07-18 User interface for manipulating user interface objects
AU2021201748A AU2021201748C1 (en) 2013-09-03 2021-03-19 User interface for manipulating user interface objects
AU2022235585A AU2022235585A1 (en) 2013-09-03 2022-09-21 User interface for manipulating user interface objects
AU2024205135A AU2024205135A1 (en) 2013-09-03 2024-07-26 User interface for manipulating user interface objects

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US201361959851P 2013-09-03 2013-09-03
US201361873360P 2013-09-03 2013-09-03
US201361873356P 2013-09-03 2013-09-03
US201361873359P 2013-09-03 2013-09-03
US61/873,360 2013-09-03
US61/959,851 2013-09-03
US61/873,359 2013-09-03
US61/873,356 2013-09-03
US201414476657A 2014-09-03 2014-09-03
US14/476,657 2014-09-03
PCT/US2014/053957 WO2015034965A1 (en) 2013-09-03 2014-09-03 User interface for manipulating user interface objects
AU2014315324A AU2014315324B2 (en) 2013-09-03 2014-09-03 User interface for manipulating user interface objects
AU2017276285A AU2017276285B2 (en) 2013-09-03 2017-12-14 User interface for manipulating user interface objects
AU2019206101A AU2019206101B2 (en) 2013-09-03 2019-07-18 User interface for manipulating user interface objects

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU2017276285A Division AU2017276285B2 (en) 2013-09-03 2017-12-14 User interface for manipulating user interface objects

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2021201748A Division AU2021201748C1 (en) 2013-09-03 2021-03-19 User interface for manipulating user interface objects

Publications (2)

Publication Number Publication Date
AU2019206101A1 true AU2019206101A1 (en) 2019-08-08
AU2019206101B2 AU2019206101B2 (en) 2020-12-24

Family

ID=51589515

Family Applications (12)

Application Number Title Priority Date Filing Date
AU2014315325A Active AU2014315325B2 (en) 2013-09-03 2014-09-03 User interface object manipulations in a user interface
AU2014315324A Active AU2014315324B2 (en) 2013-09-03 2014-09-03 User interface for manipulating user interface objects
AU2014315319A Active AU2014315319B2 (en) 2013-09-03 2014-09-03 Crown input for a wearable electronic device
AU2017276285A Active AU2017276285B2 (en) 2013-09-03 2017-12-14 User interface for manipulating user interface objects
AU2018200289A Active AU2018200289B2 (en) 2013-09-03 2018-01-12 Crown input for a wearable electronic device
AU2019206101A Active AU2019206101B2 (en) 2013-09-03 2019-07-18 User interface for manipulating user interface objects
AU2019257521A Abandoned AU2019257521A1 (en) 2013-09-03 2019-11-01 Crown input for a wearable electronic device
AU2021201748A Active AU2021201748C1 (en) 2013-09-03 2021-03-19 User interface for manipulating user interface objects
AU2021212114A Active AU2021212114B9 (en) 2013-09-03 2021-08-06 Crown input for a wearable electronic device
AU2022235585A Abandoned AU2022235585A1 (en) 2013-09-03 2022-09-21 User interface for manipulating user interface objects
AU2023237127A Pending AU2023237127A1 (en) 2013-09-03 2023-09-28 Crown input for a wearable electronic device
AU2024205135A Pending AU2024205135A1 (en) 2013-09-03 2024-07-26 User interface for manipulating user interface objects

Family Applications Before (5)

Application Number Title Priority Date Filing Date
AU2014315325A Active AU2014315325B2 (en) 2013-09-03 2014-09-03 User interface object manipulations in a user interface
AU2014315324A Active AU2014315324B2 (en) 2013-09-03 2014-09-03 User interface for manipulating user interface objects
AU2014315319A Active AU2014315319B2 (en) 2013-09-03 2014-09-03 Crown input for a wearable electronic device
AU2017276285A Active AU2017276285B2 (en) 2013-09-03 2017-12-14 User interface for manipulating user interface objects
AU2018200289A Active AU2018200289B2 (en) 2013-09-03 2018-01-12 Crown input for a wearable electronic device

Family Applications After (6)

Application Number Title Priority Date Filing Date
AU2019257521A Abandoned AU2019257521A1 (en) 2013-09-03 2019-11-01 Crown input for a wearable electronic device
AU2021201748A Active AU2021201748C1 (en) 2013-09-03 2021-03-19 User interface for manipulating user interface objects
AU2021212114A Active AU2021212114B9 (en) 2013-09-03 2021-08-06 Crown input for a wearable electronic device
AU2022235585A Abandoned AU2022235585A1 (en) 2013-09-03 2022-09-21 User interface for manipulating user interface objects
AU2023237127A Pending AU2023237127A1 (en) 2013-09-03 2023-09-28 Crown input for a wearable electronic device
AU2024205135A Pending AU2024205135A1 (en) 2013-09-03 2024-07-26 User interface for manipulating user interface objects

Country Status (5)

Country Link
JP (11) JP6333387B2 (en)
KR (12) KR102072614B1 (en)
AU (12) AU2014315325B2 (en)
DK (1) DK179231B1 (en)
WO (3) WO2015034965A1 (en)

Families Citing this family (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9100493B1 (en) * 2011-07-18 2015-08-04 Andrew H B Zhou Wearable personal digital device for facilitating mobile device payments and personal use
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
US9753436B2 (en) 2013-06-11 2017-09-05 Apple Inc. Rotary input mechanism for an electronic device
KR102231031B1 (en) 2013-08-09 2021-03-23 애플 인크. Tactile switch for an electronic device
US10394325B2 (en) 2013-12-10 2019-08-27 Apple Inc. Input friction mechanism for rotary inputs of electronic devices
US10048802B2 (en) 2014-02-12 2018-08-14 Apple Inc. Rejection of false turns of rotary inputs for electronic devices
EP3195096B1 (en) 2014-08-02 2020-08-12 Apple Inc. Context-specific user interfaces
US10599101B2 (en) 2014-09-02 2020-03-24 Apple Inc. Wearable electronic device
US10145712B2 (en) 2014-09-09 2018-12-04 Apple Inc. Optical encoder including diffuser members
US9829350B2 (en) 2014-09-09 2017-11-28 Apple Inc. Magnetically coupled optical encoder
US9651405B1 (en) 2015-03-06 2017-05-16 Apple Inc. Dynamic adjustment of a sampling rate for an optical encoder
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
EP3251139B1 (en) 2015-03-08 2021-04-28 Apple Inc. Compressible seal for rotatable and translatable input mechanisms
ES2890451T3 (en) * 2015-03-27 2022-01-19 Saronikos Trading & Services Unipessoal Lda Electronic wristwatch or pocket watch comprising a rotating crown
WO2016171467A1 (en) 2015-04-23 2016-10-27 Samsung Electronics Co., Ltd. Electronic device including rotary member and display method thereof
KR102406102B1 (en) * 2015-04-24 2022-06-10 삼성전자주식회사 Electronic apparatus and method for displaying object thereof
KR20160131275A (en) * 2015-05-06 2016-11-16 엘지전자 주식회사 Watch type terminal
KR102356449B1 (en) 2015-05-13 2022-01-27 삼성전자주식회사 Apparatus and method for providing additional information according to rotary input
CN105137746B (en) * 2015-07-28 2018-03-27 广东欧珀移动通信有限公司 A kind of receives frequency adjusting method and intelligent watch
CN105068738A (en) * 2015-07-28 2015-11-18 广东欧珀移动通信有限公司 Control method for smartwatch and smartwatch
CN105116996A (en) * 2015-07-28 2015-12-02 广东欧珀移动通信有限公司 Control method for smart watch and smart watch
CN105141755A (en) * 2015-07-28 2015-12-09 广东欧珀移动通信有限公司 Information reply method, smart watch, terminal equipment and system
CN105117002B (en) * 2015-07-28 2017-07-11 广东欧珀移动通信有限公司 The table hat and the operating method of intelligent watch of a kind of intelligent watch
CN105117001B (en) * 2015-07-28 2017-07-11 广东欧珀移动通信有限公司 The table hat and the operating method of intelligent watch of a kind of intelligent watch
CN105138116B (en) * 2015-07-28 2018-07-06 广东欧珀移动通信有限公司 A kind of information displaying method, smartwatch, terminal device and system
CN105137819B (en) * 2015-07-28 2019-07-02 Oppo广东移动通信有限公司 A kind of method and smartwatch of music
CN105117118B (en) * 2015-07-28 2019-02-01 Oppo广东移动通信有限公司 A kind of method and smartwatch controlling video playing
CN105117120B (en) * 2015-07-28 2017-07-11 广东欧珀移动通信有限公司 The table hat and the operating method of intelligent watch of a kind of intelligent watch
CN105116997B (en) * 2015-07-28 2018-05-29 广东欧珀移动通信有限公司 A kind of data encryption, the method for decryption and smartwatch
CN106708379B (en) * 2015-07-28 2020-01-10 Oppo广东移动通信有限公司 Interface operation method and device and smart watch
CN105005479B (en) * 2015-07-28 2018-06-29 广东欧珀移动通信有限公司 A kind of alarm clock method for closing and smartwatch
CN105117143B (en) * 2015-07-28 2020-07-03 Oppo广东移动通信有限公司 Information display method, smart watch, server and system
CN105116998B (en) * 2015-07-28 2019-05-21 Oppo广东移动通信有限公司 A kind of method and smartwatch of fastopen
CN113154645B (en) * 2015-07-28 2022-07-26 Oppo广东移动通信有限公司 Air conditioner control method and smart watch
CN105117121B (en) * 2015-07-28 2019-04-02 Oppo广东移动通信有限公司 A kind of method that smartwatch is sought help and smartwatch
CN105025630A (en) * 2015-07-28 2015-11-04 广东欧珀移动通信有限公司 Brightness adjusting method and intelligent watch
CN105117119B (en) * 2015-07-28 2018-12-11 广东欧珀移动通信有限公司 A kind of method and smartwatch of rotation of screen picture
CN105013175A (en) * 2015-07-28 2015-11-04 广东欧珀移动通信有限公司 Game motion control method and intelligent watch
CN105389074A (en) * 2015-07-28 2016-03-09 广东欧珀移动通信有限公司 Control method for smart watch and smart watch
CN105022947B (en) * 2015-07-28 2019-02-22 Oppo广东移动通信有限公司 A kind of fingerprint identification method and smartwatch of smartwatch
CN105025629B (en) * 2015-07-28 2019-11-29 Oppo广东移动通信有限公司 A kind of control method and smartwatch of smartwatch
EP4327731A3 (en) 2015-08-20 2024-05-15 Apple Inc. Exercise-based watch face
CN105208675B (en) * 2015-08-26 2018-09-04 广东欧珀移动通信有限公司 A kind of wireless connection method and smartwatch based on smartwatch
CN105117010B (en) * 2015-08-26 2018-12-11 广东欧珀移动通信有限公司 A kind of method and smartwatch starting application program
CN105068847B (en) * 2015-08-26 2016-12-28 广东欧珀移动通信有限公司 A kind of application program launching method and intelligent watch
CN105117013B (en) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 The unlocking method and intelligent watch of a kind of intelligent watch
CN105224208B (en) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 The method and smartwatch that a kind of page is shown
CN105224193B (en) * 2015-08-26 2018-05-29 广东欧珀移动通信有限公司 The control method and smartwatch of a kind of smartwatch
CN105204893B (en) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 A kind of application control method and smartwatch
CN105224072B (en) * 2015-08-26 2018-07-06 广东欧珀移动通信有限公司 The control method and smartwatch of a kind of music
CN105068742B (en) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 The control method and intelligent watch of a kind of intelligent watch
CN105117014B (en) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 A kind of friend-making management method and intelligent watch
CN105227201B (en) * 2015-08-26 2018-03-27 广东欧珀移动通信有限公司 A kind of communication information answering method and intelligent watch
CN105117011B (en) * 2015-08-26 2017-08-29 广东欧珀移动通信有限公司 A kind of method for operating application program, device and intelligent watch
CN105068412B (en) * 2015-08-26 2017-10-17 广东欧珀移动通信有限公司 A kind of intelligent watch and operating method
CN105117012B (en) * 2015-08-26 2018-06-29 广东欧珀移动通信有限公司 A kind of display interface method of adjustment and smartwatch
CN105117129A (en) * 2015-08-26 2015-12-02 广东欧珀移动通信有限公司 Interface operation method and device and smart watch
US10503271B2 (en) * 2015-09-30 2019-12-10 Apple Inc. Proximity detection for an input mechanism of an electronic device
US9983029B2 (en) 2015-09-30 2018-05-29 Apple Inc. Integrated optical encoder for tilt able rotatable shaft
KR102204682B1 (en) 2016-01-14 2021-01-19 후아웨이 테크놀러지 컴퍼니 리미티드 Electronic devices and methods of operating such electronic devices
WO2017126727A1 (en) * 2016-01-22 2017-07-27 엘지전자 주식회사 Watch-type mobile terminal and method of operation thereof
US10048837B2 (en) 2016-02-16 2018-08-14 Google Llc Target selection on a small form factor display
WO2017152139A1 (en) * 2016-03-04 2017-09-08 Apple Inc. Input with haptic feedback
CN107203261B (en) * 2016-03-16 2022-05-24 Lg电子株式会社 Watch type mobile terminal and control method thereof
US10025399B2 (en) * 2016-03-16 2018-07-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
US10551798B1 (en) 2016-05-17 2020-02-04 Apple Inc. Rotatable crown for an electronic device
JP6927670B2 (en) * 2016-05-26 2021-09-01 株式会社アイ・オー・データ機器 Operation reception device, program, and operation reception method
US10061399B2 (en) 2016-07-15 2018-08-28 Apple Inc. Capacitive gap sensor ring for an input device
US10019097B2 (en) * 2016-07-25 2018-07-10 Apple Inc. Force-detecting input structure
KR102607562B1 (en) * 2016-08-30 2023-11-30 삼성전자주식회사 Method for providing visual effects according to interaction based on bezel and electronic device for the same
US10324620B2 (en) 2016-09-06 2019-06-18 Apple Inc. Processing capacitive touch gestures implemented on an electronic device
CN114740963B (en) 2016-09-23 2024-06-28 苹果公司 Film watching mode
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
DK179555B1 (en) 2017-05-16 2019-02-13 Apple Inc. User interface for a flashlight mode on an electronic device
US10962935B1 (en) 2017-07-18 2021-03-30 Apple Inc. Tri-axis force sensor
US10203662B1 (en) 2017-09-25 2019-02-12 Apple Inc. Optical position sensor for a crown
DK180078B1 (en) 2018-05-07 2020-03-31 Apple Inc. USER INTERFACE FOR AVATAR CREATION
US12033296B2 (en) 2018-05-07 2024-07-09 Apple Inc. Avatar creation user interface
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11360440B2 (en) 2018-06-25 2022-06-14 Apple Inc. Crown for an electronic watch
US11561515B2 (en) 2018-08-02 2023-01-24 Apple Inc. Crown for an electronic watch
CN211293787U (en) 2018-08-24 2020-08-18 苹果公司 Electronic watch
US11181863B2 (en) 2018-08-24 2021-11-23 Apple Inc. Conductive cap for watch crown
CN209625187U (en) 2018-08-30 2019-11-12 苹果公司 Electronic watch and electronic equipment
US11194299B1 (en) 2019-02-12 2021-12-07 Apple Inc. Variable frictional feedback device for a digital crown of an electronic watch
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
DK180684B1 (en) 2019-09-09 2021-11-25 Apple Inc Techniques for managing display usage
DK181103B1 (en) 2020-05-11 2022-12-15 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
CN115904596B (en) 2020-05-11 2024-02-02 苹果公司 User interface for managing user interface sharing
KR102503135B1 (en) * 2020-05-11 2023-02-23 애플 인크. User interfaces related to time
US11550268B2 (en) 2020-06-02 2023-01-10 Apple Inc. Switch module for electronic crown assembly
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
US12092996B2 (en) 2021-07-16 2024-09-17 Apple Inc. Laser-based rotation sensor for a crown of an electronic watch
US20230236547A1 (en) 2022-01-24 2023-07-27 Apple Inc. User interfaces for indicating time

Family Cites Families (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5530455A (en) * 1994-08-10 1996-06-25 Mouse Systems Corporation Roller mouse for implementing scrolling in windows applications
US6047301A (en) * 1996-05-24 2000-04-04 International Business Machines Corporation Wearable computer
US6266098B1 (en) * 1997-10-22 2001-07-24 Matsushita Electric Corporation Of America Function presentation and selection using a rotatable function menu
JP3673425B2 (en) * 1999-04-16 2005-07-20 松下電器産業株式会社 Program selection execution device and data selection execution device
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
EP1052566A1 (en) * 1999-05-14 2000-11-15 Alcatel Graphical user interface
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US6809724B1 (en) * 2000-01-18 2004-10-26 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US6522347B1 (en) * 2000-01-18 2003-02-18 Seiko Epson Corporation Display apparatus, portable information processing apparatus, information recording medium, and electronic apparatus
US6556222B1 (en) * 2000-06-30 2003-04-29 International Business Machines Corporation Bezel based input mechanism and user interface for a smart watch
US7081905B1 (en) * 2000-06-30 2006-07-25 International Business Machines Corporation Method and apparatus for dynamically controlling scroller speed employed for a user interface of a wearable appliance
JP2002175139A (en) * 2000-12-07 2002-06-21 Sony Corp Information processor, menu display method and program storage medium
JP3762243B2 (en) * 2001-03-26 2006-04-05 陣山 俊一 Information processing method, information processing program, and portable information terminal device
US7312785B2 (en) * 2001-10-22 2007-12-25 Apple Inc. Method and apparatus for accelerated scrolling
JP2003248544A (en) * 2002-02-25 2003-09-05 Sony Corp Graphical user interface, method for operating information processor, the information processor, and program
WO2003075146A1 (en) * 2002-03-05 2003-09-12 Sony Ericsson Mobile Communications Japan, Inc. Image processing device, image processing program, and image processing method
JP3761165B2 (en) * 2002-05-13 2006-03-29 株式会社モバイルコンピューティングテクノロジーズ Display control device, portable information terminal device, program, and display control method
JP2004021522A (en) * 2002-06-14 2004-01-22 Sony Corp Apparatus, method, and program for information processing
JP2004070654A (en) * 2002-08-06 2004-03-04 Matsushita Electric Ind Co Ltd Portable electronic equipment
JP2004184396A (en) * 2002-10-09 2004-07-02 Seiko Epson Corp Display device, clock, method for controlling display device, control program, and recording medium
JP2004178584A (en) 2002-11-26 2004-06-24 Asulab Sa Input method of security code by touch screen for accessing function, device or specific place, and device for executing the method
US20040130581A1 (en) * 2003-01-03 2004-07-08 Microsoft Corporation Interaction model
JP2004259063A (en) * 2003-02-26 2004-09-16 Sony Corp Device and method for display processing for three dimensional object and computer program
US8046705B2 (en) * 2003-05-08 2011-10-25 Hillcrest Laboratories, Inc. Systems and methods for resolution consistent semantic zooming
US20040264301A1 (en) * 2003-06-30 2004-12-30 Microsoft Corporation Calendar user interface
US7454713B2 (en) * 2003-12-01 2008-11-18 Sony Ericsson Mobile Communications Ab Apparatus, methods and computer program products providing menu expansion and organization functions
WO2005055034A1 (en) * 2003-12-01 2005-06-16 Research In Motion Limited Previewing a new event on a small screen device
US8082382B2 (en) * 2004-06-04 2011-12-20 Micron Technology, Inc. Memory device with user configurable density/performance
US7778671B2 (en) * 2004-10-08 2010-08-17 Nokia Corporation Mobile communications terminal having an improved user interface and method therefor
JP2006140990A (en) * 2004-10-13 2006-06-01 Olympus Corp Image display apparatus, camera, display methods of image display apparatus and camera
KR100630154B1 (en) * 2005-08-31 2006-10-02 삼성전자주식회사 Method for controlling display according to declension degree using a terrestrial magnetism sensor and the mobile terminal thereof
US20070063995A1 (en) * 2005-09-22 2007-03-22 Bailey Eric A Graphical user interface for use with a multi-media system
JP2007170995A (en) * 2005-12-22 2007-07-05 Casio Comput Co Ltd Electronic equipment and electronic timepiece
KR100678963B1 (en) * 2005-12-28 2007-02-06 삼성전자주식회사 Portable device and operation method comprising input button to enable revolution
KR100754674B1 (en) * 2006-03-10 2007-09-03 삼성전자주식회사 Method and apparatus for selecting menu in portable terminal
KR100896055B1 (en) * 2007-01-15 2009-05-07 엘지전자 주식회사 Mobile terminal having a rotating input device and display method thereof
KR20080073868A (en) * 2007-02-07 2008-08-12 엘지전자 주식회사 Terminal and method for displaying menu
TW200734916A (en) * 2007-05-03 2007-09-16 Ying-Chu Lee Method of using mouse wheel to operate picture
WO2009053606A2 (en) * 2007-10-12 2009-04-30 France Telecom Device for displaying a plurality of multimedia documents
JP4462338B2 (en) * 2007-11-27 2010-05-12 セイコーエプソン株式会社 Electronic clock, electronic clock time correction method, electronic clock control program
JP5356713B2 (en) * 2008-03-28 2013-12-04 京セラ株式会社 Mobile phone
JP2009265793A (en) 2008-04-23 2009-11-12 Sony Ericsson Mobilecommunications Japan Inc Display and operation device, operation device and program
KR101512041B1 (en) * 2008-07-01 2015-04-14 엘지전자 주식회사 Mobile terminal and control method thereof
KR101546774B1 (en) * 2008-07-29 2015-08-24 엘지전자 주식회사 Mobile terminal and operation control method thereof
KR101555055B1 (en) * 2008-10-10 2015-09-22 엘지전자 주식회사 Mobile terminal and display method thereof
US20110055752A1 (en) * 2009-06-04 2011-03-03 Rubinstein Jonathan J Method and Apparatus for Displaying and Auto-Correcting an Over-Scroll State on a Computing Device
JP5513071B2 (en) * 2009-10-26 2014-06-04 株式会社プロフィールド Information processing apparatus, information processing method, and program
CH701440A2 (en) 2009-07-03 2011-01-14 Comme Le Temps Sa Wrist touch screen and method for displaying on a watch with touch screen.
KR101608764B1 (en) * 2009-07-14 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling display thereof
KR101595384B1 (en) * 2009-07-20 2016-02-18 엘지전자 주식회사 Watch type mobile terminal
JP5333068B2 (en) 2009-08-31 2013-11-06 ソニー株式会社 Information processing apparatus, display method, and display program
KR101649646B1 (en) * 2010-02-11 2016-08-19 엘지전자 주식회사 Portable terminal
US8930841B2 (en) * 2010-02-15 2015-01-06 Motorola Mobility Llc Methods and apparatus for a user interface configured to display event information
CH702862B1 (en) * 2010-03-30 2024-06-14 Smart Communications Sa Wristwatch with electronic display.
US20110252376A1 (en) 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US20110316888A1 (en) * 2010-06-28 2011-12-29 Invensense, Inc. Mobile device user interface combining input from motion sensors and other controls
JP5676952B2 (en) * 2010-07-26 2015-02-25 キヤノン株式会社 Display control apparatus, display control method, program, and storage medium
JP5745241B2 (en) * 2010-09-08 2015-07-08 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US9104211B2 (en) * 2010-11-19 2015-08-11 Google Inc. Temperature controller with model-based time to target calculation and display
JP5762718B2 (en) 2010-10-20 2015-08-12 シャープ株式会社 Image forming apparatus
US9244606B2 (en) * 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
KR101740439B1 (en) * 2010-12-23 2017-05-26 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US9423951B2 (en) * 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
KR101785323B1 (en) * 2011-01-05 2017-10-17 삼성전자주식회사 Method and apparatus for providing a user interface in a portable terminal
TWI441051B (en) * 2011-01-25 2014-06-11 Compal Electronics Inc Electronic device and information display method thereof
US20120246678A1 (en) * 2011-03-24 2012-09-27 Tobe Barksdale Distance Dependent Scalable User Interface
JP2012252384A (en) 2011-05-31 2012-12-20 Camelot:Kk Screen control system, screen control method, and screen control program
JP2013003718A (en) * 2011-06-14 2013-01-07 Mitsubishi Electric Information Systems Corp Information processing device, scroll display method of information processing device, and scroll display program
EP2551784A1 (en) * 2011-07-28 2013-01-30 Roche Diagnostics GmbH Method of controlling the display of a dataset
US20130097566A1 (en) * 2011-10-17 2013-04-18 Carl Fredrik Alexander BERGLUND System and method for displaying items on electronic devices
US20130117698A1 (en) 2011-10-31 2013-05-09 Samsung Electronics Co., Ltd. Display apparatus and method thereof
JP6159078B2 (en) * 2011-11-28 2017-07-05 京セラ株式会社 Apparatus, method, and program
JP2013152693A (en) * 2011-12-27 2013-08-08 Nintendo Co Ltd Information processing program, information processing device, image display method, and image display system
CN103460164B (en) * 2012-02-03 2017-02-08 松下知识产权经营株式会社 Tactile sense presentation device and method for driving tactile sense presentation device
JP2013164700A (en) * 2012-02-10 2013-08-22 Samsung Electronics Co Ltd Scroll method and scroll device for portable terminal
KR20130094054A (en) * 2012-02-15 2013-08-23 삼성전자주식회사 Apparatus and method for managing object in portable electronic device

Also Published As

Publication number Publication date
WO2015034960A1 (en) 2015-03-12
JP6564493B2 (en) 2019-08-21
KR102072614B1 (en) 2020-02-03
JP2018136983A (en) 2018-08-30
AU2021212114B2 (en) 2023-07-20
KR102111452B1 (en) 2020-05-15
AU2014315325A1 (en) 2016-04-21
AU2014315324B2 (en) 2017-10-12
JP6333387B2 (en) 2018-05-30
KR20210070395A (en) 2021-06-14
JP2021177397A (en) 2021-11-11
KR20180054897A (en) 2018-05-24
JP6924802B2 (en) 2021-08-25
JP2023065397A (en) 2023-05-12
JP6397918B2 (en) 2018-09-26
JP6170250B2 (en) 2017-07-26
AU2021201748B2 (en) 2022-07-07
KR20200096999A (en) 2020-08-14
AU2017276285A1 (en) 2018-01-18
AU2021212114A1 (en) 2021-08-26
JP2018142361A (en) 2018-09-13
KR102305362B1 (en) 2021-09-24
DK179231B1 (en) 2018-02-19
JP2016532212A (en) 2016-10-13
KR20180041779A (en) 2018-04-24
AU2021212114B9 (en) 2023-11-23
JP2016532973A (en) 2016-10-20
KR20160048967A (en) 2016-05-04
JP7128153B2 (en) 2022-08-30
JP7223081B2 (en) 2023-02-15
AU2014315319B2 (en) 2017-10-26
AU2014315325B2 (en) 2017-05-04
KR102045111B1 (en) 2019-11-14
KR20190114034A (en) 2019-10-08
KR20190032627A (en) 2019-03-27
AU2021201748C1 (en) 2023-03-16
AU2021201748A1 (en) 2021-04-15
AU2019257521A1 (en) 2019-11-28
JP7532568B2 (en) 2024-08-13
JP2021182426A (en) 2021-11-25
AU2019206101B2 (en) 2020-12-24
JP2016534462A (en) 2016-11-04
AU2024205135A1 (en) 2024-08-15
AU2018200289B2 (en) 2019-08-01
KR102029303B1 (en) 2019-10-07
KR20160048972A (en) 2016-05-04
KR20180122752A (en) 2018-11-13
AU2018200289A1 (en) 2018-02-01
JP7471262B2 (en) 2024-04-19
WO2015034966A1 (en) 2015-03-12
AU2014315324A1 (en) 2016-04-28
KR102143895B1 (en) 2020-08-12
KR20160048955A (en) 2016-05-04
AU2014315319A1 (en) 2016-04-21
KR20210010661A (en) 2021-01-27
AU2017276285B2 (en) 2019-04-18
JP6547039B2 (en) 2019-07-17
KR102263620B1 (en) 2021-06-11
JP2019215891A (en) 2019-12-19
AU2022235585A1 (en) 2022-10-13
AU2023237127A1 (en) 2023-10-19
KR102131228B1 (en) 2020-07-07
WO2015034965A1 (en) 2015-03-12
KR20200084906A (en) 2020-07-13
DK201670117A1 (en) 2016-03-21
JP2019194892A (en) 2019-11-07
JP2023126783A (en) 2023-09-12

Similar Documents

Publication Publication Date Title
AU2021201748B2 (en) User interface for manipulating user interface objects
US10921976B2 (en) User interface for manipulating user interface objects
DK180032B8 (en) User Interface for manipulating user interface objects
US11513675B2 (en) User interface for manipulating user interface objects
US20230024225A1 (en) User interface for manipulating user interface objects

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)