US20210349426A1 - User interfaces with a character having a visual state based on device activity state and an indication of time - Google Patents
- Publication number: US20210349426A1
- Application number: US 17/031,671
- Authority
- US
- United States
- Prior art keywords
- character
- displaying
- computer system
- time
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G04G21/02—Input or output devices integrated in time-pieces: detectors of external physical values, e.g. temperature
- G04G21/025—Detectors of external physical values, e.g. temperature, for measuring physiological data
- G04G21/08—Touch switches specially adapted for time-pieces
- G04G9/0064—Visual time or date indication means in which functions not related to time can be displayed
- G04G9/007—Visual time or date indication means in which functions not related to time can be displayed, combined with a calculator or computing means
- G04G9/0076—Visual time or date indication means in which the time in another time-zone or in another city can be displayed at will
- G04G9/04—Visual time or date indication means by selecting desired characters or indicating elements whose position represents the time, by controlling light sources, e.g. electroluminescent diodes
- G04G9/06—Visual time or date indication means by selecting desired characters or indicating elements whose position represents the time, using light valves, e.g. liquid crystals
- G06F3/0362—Pointing devices displaced or positioned by the user, with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
- G06F3/04842—Interaction techniques based on graphical user interfaces [GUI] for selection of displayed objects or displayed text elements
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0485—Scrolling or panning
- G06F3/0486—Drag-and-drop
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06T11/40—2D image generation: filling a planar surface by adding surface attributes, e.g. colour or texture
- G06T11/60—2D image generation: editing figures and text; combining figures or text
- G06T3/60—Geometric image transformations in the plane of the image: rotation of whole images or parts thereof
- G06T2200/24—Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]
Definitions
- The present disclosure relates generally to computer user interfaces, and more specifically to techniques for managing user interfaces related to time.
- User interfaces can be displayed on an electronic device.
- A user of the electronic device can interact with the electronic device via the displayed user interface.
- User interfaces can enable one or more operations to be performed on the electronic device.
- Some techniques for managing user interfaces related to time using electronic devices are generally cumbersome and inefficient. For example, some existing techniques use a complex and time-consuming user interface, which may include multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.
- The present technique provides devices with faster, more efficient methods and interfaces for managing user interfaces related to time.
- Such methods and interfaces optionally complement or replace other methods for managing user interfaces related to time.
- Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface.
- For battery-operated computing devices, such methods and interfaces conserve power and increase the time between battery charges.
- A method is performed at a computer system that is in communication with a display generation component and one or more input devices.
- The method comprises: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and while the second analog dial …
- A non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the …
- A transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone …
- A computer system comprises a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second …
- A computer system comprises: a display generation component; one or more input devices; and means for displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; means for, after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; means for, in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and means for, while …
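The dual-dial behavior described above can be sketched in code: when the time zone associated with the second dial changes, the dial's orientation relative to the first dial follows from the offset between the two zones. The Swift sketch below is a minimal, hypothetical model; the type name `DualDialWatchFace` and the 15-degrees-per-hour mapping (a 24-hour dial) are assumptions, not details taken from the application.

```swift
import Foundation

// Hypothetical model of the dual-dial watch user interface described above.
// Assumption: the second dial is rotated relative to the first by the hour
// offset between the two time zones, at 15 degrees per hour (24-hour dial).
struct DualDialWatchFace {
    var firstZone: TimeZone
    var secondZone: TimeZone

    /// Orientation of the second analog dial relative to the first, in degrees.
    func secondDialOrientation(at date: Date = Date()) -> Double {
        let offsetSeconds = secondZone.secondsFromGMT(for: date) - firstZone.secondsFromGMT(for: date)
        return Double(offsetSeconds) / 3600.0 * 15.0
    }

    /// Handles the "request to change a time zone associated with the second
    /// analog dial": updating the zone changes the computed orientation.
    mutating func changeSecondZone(to zone: TimeZone) {
        secondZone = zone
    }
}

var face = DualDialWatchFace(
    firstZone: TimeZone(identifier: "America/Los_Angeles")!,
    secondZone: TimeZone(identifier: "America/New_York")!
)
print(face.secondDialOrientation())   // 45.0 (a 3-hour offset)
face.changeSecondZone(to: TimeZone(identifier: "Asia/Tokyo")!)
print(face.secondDialOrientation())   // new orientation for the third time zone
```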
- A method is performed at a computer system that is in communication with a display generation component and one or more input devices.
- The method comprises: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- A non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- A transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- A computer system comprises a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- A computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; means for, while displaying the watch user interface, detecting, via the one or more input devices, a first user input; means for, in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and means for, while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
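The elapsed-time indicator lends itself to a small sketch: aligning the graphical indicator with the clock hand amounts to recording the moment of the first user input, after which the elapsed time is the difference between that moment and the current time. This is an illustrative Swift model under assumed names (`ElapsedTimeIndicator`, `alignWithClockHand`), not the claimed implementation.

```swift
import Foundation

// Hypothetical sketch of the elapsed-time indicator described above: the
// first user input "aligns" the indicator with the clock hand by recording
// the moment of the input, and the face then displays the time elapsed
// from that moment to the current time. Names are assumptions.
struct ElapsedTimeIndicator {
    private(set) var alignedAt: Date?

    /// First user input detected: move the indicator to the clock hand's
    /// current position by recording when the input occurred.
    mutating func alignWithClockHand(at date: Date = Date()) {
        alignedAt = date
    }

    /// The elapsed time to show graphically, in whole seconds; nil until
    /// a first user input has been detected.
    func elapsedSeconds(asOf now: Date = Date()) -> Int? {
        guard let start = alignedAt else { return nil }
        return Int(now.timeIntervalSince(start))
    }
}

var indicator = ElapsedTimeIndicator()
indicator.alignWithClockHand()
// ...later, on each display refresh:
if let seconds = indicator.elapsedSeconds() {
    print("Elapsed: \(seconds)s")
}
```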
- A method performed at a computer system that is in communication with a display generation component comprises: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in …
- A non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of …
- A transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, …
- A computer system comprises a display generation component; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors.
- The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a …
- A computer system comprises: a display generation component; means for, at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and means for, at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance …
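The activity-state branching described above is essentially a mapping from the computer system's activity state to the character's visual state. Below is a minimal Swift sketch; the two concrete states (an actively used state shown animated versus a lower-power state shown static) are assumptions suggested by the surrounding description, since the text only requires that different activity states yield different visual states.

```swift
// Hypothetical mapping from the computer system's activity state to the
// character's visual state. The concrete states below are assumptions;
// the text only requires that distinct activity states map to distinct
// visual states.
enum ActivityState {
    case activelyUsed   // e.g. display fully on, recent interaction
    case lowerPower     // e.g. an always-on, reduced-update state
}

enum VisualState {
    case animated       // character rendered with motion
    case staticPose     // character rendered as a still frame
}

struct CharacterFace {
    var character: String   // identifier of the currently displayed character

    func visualState(for activity: ActivityState) -> VisualState {
        switch activity {
        case .activelyUsed: return .animated
        case .lowerPower:   return .staticPose
        }
    }
}

let characterFace = CharacterFace(character: "octopus")
print(characterFace.visualState(for: .activelyUsed))   // animated
print(characterFace.visualState(for: .lowerPower))     // staticPose
```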
- A method performed at a computer system that is in communication with a display generation component comprises: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of …
- A non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- The one or more programs include instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the …
- A transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- The one or more programs include instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial …
- A computer system comprises a display generation component, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- The one or more programs include instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second …
- A computer system comprises: a display generation component; means for displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; means for, while displaying the representation of the first face, detecting the satisfaction of predetermined criteria for changing an appearance of the time user interface; and means for, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of …
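One way to read the face-swapping behavior is as a cycle over face representations, each pairing a time-indicating feature with a second feature whose visual characteristic differs from face to face. The Swift sketch below is hypothetical; in particular, using an elapsed interval as the "predetermined criteria" is just one example criterion the text allows, and the color-valued characteristic is an assumption.

```swift
import Foundation

// Hypothetical sketch of the time user interface described above: each face
// pairs a time-indicating feature with a second feature that carries a
// visual characteristic (modeled here as a color name). When predetermined
// criteria are satisfied, the current face is replaced by a different one.
// Using an elapsed interval as the criteria is an assumed example.
struct FaceRepresentation {
    let name: String
    let secondFeatureColor: String   // the face's distinguishing visual characteristic
}

struct TimeUserInterface {
    private let faces: [FaceRepresentation]
    private var index = 0
    private var lastChange = Date()
    let changeInterval: TimeInterval = 60 * 60   // assumed criterion: one hour

    init(faces: [FaceRepresentation]) { self.faces = faces }

    var currentFace: FaceRepresentation { faces[index] }

    /// Checks the predetermined criteria; when satisfied, ceases to display
    /// the current face and displays a different one in its place.
    mutating func tick(now: Date = Date()) {
        if now.timeIntervalSince(lastChange) >= changeInterval {
            index = (index + 1) % faces.count
            lastChange = now
        }
    }
}
```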
- A method is performed at a computer system that is in communication with a display generation component and one or more input devices.
- The method comprises: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes.
- A non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of …
- A transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, …
- A computer system comprises a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in …
- A computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; means for, while displaying the editing user interface, detecting, via the one or more input devices, a first user input; means for, in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes.
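The stripe-editing logic reduces to incrementing or decrementing a stripe count depending on the type of input, with a floor so at least one stripe remains. A minimal Swift sketch follows; the gesture interpretation and the floor of one stripe are assumptions, since the passage itself only distinguishes a first and a second type of input.

```swift
// Hypothetical sketch of the striped-background editor described above:
// one type of input increases the stripe count and a different type
// decreases it. The input names and the floor of one stripe are
// assumptions; the text only distinguishes two input types.
enum EditInput {
    case increase   // the assumed "first type of input"
    case decrease   // the assumed "second type of input"
}

struct StripedBackground {
    private(set) var stripeCount = 4   // "a first number of stripes that is greater than one"

    mutating func apply(_ input: EditInput) {
        switch input {
        case .increase: stripeCount += 1
        case .decrease: stripeCount = max(1, stripeCount - 1)
        }
    }
}

var background = StripedBackground()
background.apply(.increase)   // 5 stripes, greater than the first number
background.apply(.decrease)   // back to 4
```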
- a method performed at a computer system that is in communication with a display generation component and one or more input devices.
- the method comprises: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, where
- a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- the one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a
- a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices.
- the one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first
- a computer system comprising a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- the one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is
- a computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; means for, while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and means for, in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, where …
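Purely as a non-limiting sketch of the editing flow recited above (not the claimed implementation), the behavior can be modeled in a few lines: an input directed to a complication region opens a selection user interface that concurrently presents, for each source application, an indication of the application together with previews of the complications that application can supply. Every name below (WatchFaceEditor, ComplicationPreview, and so on) is hypothetical.

```swift
// Hypothetical sketch of the recited editing flow; all names are illustrative only.
struct ComplicationPreview {
    let complicationID: String   // identifies a complication
    let sampleText: String       // preview of the information it would display
}

struct ApplicationEntry {
    let appName: String                  // "indication of a first application"
    let previews: [ComplicationPreview]  // previews of complications the app offers
}

final class WatchFaceEditor {
    private let catalog: [ApplicationEntry]  // candidate complications grouped by app

    init(catalog: [ApplicationEntry]) {
        self.catalog = catalog
    }

    // Called when a first input is directed to one of the complication regions.
    func handleTap(onComplicationRegion regionIndex: Int) {
        showComplicationSelectionUI(forRegion: regionIndex)
    }

    // Concurrently display, for each application, an indication of the app and
    // previews of the complications it could place in the tapped region.
    private func showComplicationSelectionUI(forRegion regionIndex: Int) {
        print("Complication selection for region \(regionIndex):")
        for entry in catalog {
            print("App: \(entry.appName)")
            for preview in entry.previews {
                print("  [\(preview.complicationID)] \(preview.sampleText)")
            }
        }
    }
}
```

A caller would construct the editor with a catalog of application entries and route taps on complication regions to handleTap(onComplicationRegion:).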
- a method performed at a computer system that is in communication with a display generation component comprises: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device …
- a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- the one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device …
- a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component.
- the one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device …
- a computer system comprising a display generation component, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors.
- the one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device …
- a computer system comprises: a display generation component; means for displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; means, while displaying the representation of the watch face user interface, for detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, means for initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device …
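The over-threshold branch of the recitation above is cut off in this text, so the fallback behavior in the sketch below is an assumption; only the under-threshold branch (transmitting the character representations along with the other characteristics) follows the recitation directly. All type and function names are hypothetical.

```swift
// Hypothetical sketch of the threshold-gated sharing process; names illustrative only.
struct CharacterRepresentation { let assetBytes: [UInt8] }

struct WatchFaceConfiguration {
    let characters: [CharacterRepresentation]  // graphical representations of respective characters
    let settings: [String: String]             // other characteristics of the watch face
}

enum SharePayload {
    case withCharacters(settings: [String: String], characters: [CharacterRepresentation])
    case settingsOnly(settings: [String: String])  // assumed behavior above the threshold
}

func makeSharePayload(for face: WatchFaceConfiguration, threshold: Int) -> SharePayload {
    if face.characters.count < threshold {
        // Fewer than the threshold number of characters: transmit the character
        // representations along with the other characteristics, per the recitation.
        return .withCharacters(settings: face.settings, characters: face.characters)
    } else {
        // At or above the threshold, the source text is truncated; omitting the
        // (potentially large) character assets is one plausible reading, assumed here.
        return .settingsOnly(settings: face.settings)
    }
}
```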
- Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are, optionally, included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
- devices are provided with faster, more efficient methods and interfaces for managing user interfaces related to time, thereby increasing the effectiveness, efficiency, and user satisfaction with such computer systems (e.g., electronic devices).
- Such methods and interfaces may complement or replace other methods for managing user interfaces related to time.
- FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
- FIG. 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments.
- FIG. 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
- FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
- FIG. 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments.
- FIG. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display in accordance with some embodiments.
- FIG. 5A illustrates a personal electronic device in accordance with some embodiments.
- FIG. 5B is a block diagram illustrating a personal electronic device in accordance with some embodiments.
- FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments.
- FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments.
- FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments.
- FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments.
- FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments.
- FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying an indication of a current time, in accordance with some embodiments.
- FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments.
- FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments.
- FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface, in accordance with some embodiments.
- FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a user interface, in accordance with some embodiments.
- FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- FIGS. 1A-1B, 2, 3, 4A-4B, and 5A-5B provide a description of exemplary devices for performing the techniques for managing user interfaces related to time.
- FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- the user interfaces in FIGS. 6A-6H are used to illustrate the processes described below, including the processes in FIGS. 7A-7C .
- FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments.
- FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments.
- the user interfaces in FIGS. 8A-8M are used to illustrate the processes described below, including the processes in FIGS. 9A-9B .
- FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments.
- FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments.
- the user interfaces in FIGS. 10A-10AC are used to illustrate the processes described below, including the processes in FIGS. 11A-11H .
- FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments.
- FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying an indication of a current time, in accordance with some embodiments.
- the user interfaces in FIGS. 12A-12G are used to illustrate the processes described below, including the processes in FIGS. 13A-13C .
- FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments.
- FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments.
- the user interfaces in FIGS. 14A-14AD are used to illustrate the processes described below, including the processes in FIGS. 15A-15F .
- FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface, in accordance with some embodiments.
- FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a user interface, in accordance with some embodiments.
- the user interfaces in FIGS. 16A-16AE are used to illustrate the processes described below, including the processes in FIGS. 17A-17D .
- FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- the user interfaces in FIGS. 18A-18J are used to illustrate the processes described below, including the processes in FIGS. 19A-19C .
- for example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments.
- the first touch and the second touch are both touches, but they are not the same touch.
- the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context.
- the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
- the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions.
- portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif.
- Other portable electronic devices such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are, optionally, used.
- the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
- the electronic device is a computer system that is in communication (e.g., via wireless communication, via wired communication) with a display generation component.
- the display generation component is configured to provide visual output, such as display via a CRT display, display via an LED display, or display via image projection.
- the display generation component is integrated with the computer system. In some embodiments, the display generation component is separate from the computer system.
- displaying includes causing to display the content (e.g., video data rendered or decoded by display controller 156 ) by transmitting, via a wired or wireless connection, data (e.g., image data or video data) to an integrated or external display generation component to visually produce the content.
- an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
- the device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
- the various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface.
- One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application.
- a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
- FIG. 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments.
- Touch-sensitive display 112 is sometimes called a “touch screen” for convenience and is sometimes known as or called a “touch-sensitive display system.”
- Device 100 includes memory 102 (which optionally includes one or more computer-readable storage mediums), memory controller 122 , one or more processing units (CPUs) 120 , peripherals interface 118 , RF circuitry 108 , audio circuitry 110 , speaker 111 , microphone 113 , input/output (I/O) subsystem 106 , other input control devices 116 , and external port 124 .
- Device 100 optionally includes one or more optical sensors 164 .
- Device 100 optionally includes one or more contact intensity sensors 165 for detecting intensity of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100 ).
- Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300 ). These components optionally communicate over one or more communication buses or signal lines 103 .
- the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface.
- the intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256).
- Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface.
- force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact.
- a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface.
- the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface.
- the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements).
- the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure).
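As a minimal illustration of this second approach, assuming a simple linear model and hypothetical names throughout (real devices would calibrate a device-specific model), substitute measurements can be folded into an estimated pressure that is then compared against a pressure threshold:

```swift
// Illustrative conversion of substitute measurements (contact area, capacitance
// change) into an estimated pressure compared against a pressure threshold.
// Coefficients and names are assumptions, not the patented method.
struct ContactSample {
    let contactAreaMM2: Double     // size of the detected contact area
    let capacitanceDelta: Double   // change in capacitance near the contact
}

func estimatedPressure(from sample: ContactSample) -> Double {
    // A simple weighted combination standing in for a device-specific model.
    let areaWeight = 0.6, capWeight = 0.4
    return areaWeight * sample.contactAreaMM2 + capWeight * sample.capacitanceDelta
}

func exceedsIntensityThreshold(_ sample: ContactSample,
                               pressureThreshold: Double) -> Bool {
    estimatedPressure(from: sample) >= pressureThreshold
}
```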
- using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
- the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch.
- the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device.
- movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button.
- a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements.
- movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users.
- when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
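A hedged sketch of this correspondence, with entirely illustrative amplitudes and durations (real devices calibrate these per actuator), might map each described perception to a displacement profile:

```swift
// Hypothetical mapping from a described sensory perception to a physical
// displacement profile; all waveform values are illustrative only.
enum TactilePerception { case upClick, downClick, roughness }

struct DisplacementProfile {
    let amplitudeMicrons: Double
    let durationMs: Double
    let cycles: Int   // more than one cycle produces a textured, "rough" feel
}

func profile(for perception: TactilePerception) -> DisplacementProfile {
    switch perception {
    case .upClick:   return DisplacementProfile(amplitudeMicrons: 40, durationMs: 10, cycles: 1)
    case .downClick: return DisplacementProfile(amplitudeMicrons: 60, durationMs: 12, cycles: 1)
    case .roughness: return DisplacementProfile(amplitudeMicrons: 15, durationMs: 80, cycles: 8)
    }
}
```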
- it should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components.
- the various components shown in FIG. 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application-specific integrated circuits.
- Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices.
- Memory controller 122 optionally controls access to memory 102 by other components of device 100 .
- Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102 .
- the one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
- peripherals interface 118 , CPU 120 , and memory controller 122 are, optionally, implemented on a single chip, such as chip 104 . In some other embodiments, they are, optionally, implemented on separate chips.
- RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals.
- RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals.
- RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.
- RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication.
- the RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio.
- the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
- Audio circuitry 110 , speaker 111 , and microphone 113 provide an audio interface between a user and device 100 .
- Audio circuitry 110 receives audio data from peripherals interface 118 , converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111 .
- Speaker 111 converts the electrical signal to human-audible sound waves.
- Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves.
- Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118 .
- audio circuitry 110 also includes a headset jack (e.g., 212 , FIG. 2 ).
- the headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
- I/O subsystem 106 couples input/output peripherals on device 100 , such as touch screen 112 and other input control devices 116 , to peripherals interface 118 .
- I/O subsystem 106 optionally includes display controller 156 , optical sensor controller 158 , depth camera controller 169 , intensity sensor controller 159 , haptic feedback controller 161 , and one or more input controllers 160 for other input or control devices.
- the one or more input controllers 160 receive/send electrical signals from/to other input control devices 116 .
- the other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth.
- input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse.
- the one or more buttons optionally include an up/down button for volume control of speaker 111 and/or microphone 113 .
- the one or more buttons optionally include a push button (e.g., 206 , FIG. 2 ).
- the electronic device is a computer system that is in communication (e.g., via wireless communication, via wired communication) with one or more input devices.
- the one or more input devices include a touch-sensitive surface (e.g., a trackpad, as part of a touch-sensitive display).
- the one or more input devices include one or more camera sensors (e.g., one or more optical sensors 164 and/or one or more depth camera sensors 175 ), such as for tracking a user's gestures (e.g., hand gestures) as input.
- the one or more input devices are integrated with the computer system. In some embodiments, the one or more input devices are separate from the computer system.
- a quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety.
- a longer press of the push button (e.g., 206 ) optionally turns power to device 100 on or off.
- the functionality of one or more of the buttons are, optionally, user-customizable.
- Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
- Touch-sensitive display 112 provides an input interface and an output interface between the device and a user.
- Display controller 156 receives and/or sends electrical signals from/to touch screen 112 .
- Touch screen 112 displays visual output to the user.
- the visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output optionally corresponds to user-interface objects.
- Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact.
- Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102 ) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112 .
- a point of contact between touch screen 112 and the user corresponds to a finger of the user.
- Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments.
- Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112 .
- projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, Calif.
- a touch-sensitive display in some embodiments of touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety.
- touch screen 112 displays visual output from device 100 , whereas touch-sensitive touchpads do not provide visual output.
- a touch-sensitive display in some embodiments of touch screen 112 is described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No.
- Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi.
- the user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth.
- the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen.
- the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
- in addition to the touch screen, device 100 optionally includes a touchpad for activating or deactivating particular functions.
- the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output.
- the touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
- Device 100 also includes power system 162 for powering the various components.
- Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
- Device 100 optionally also includes one or more optical sensors 164 .
- FIG. 1A shows an optical sensor coupled to optical sensor controller 158 in I/O subsystem 106 .
- Optical sensor 164 optionally includes charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors.
- Optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image.
- in conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video.
- an optical sensor is located on the back of device 100 , opposite touch screen display 112 on the front of the device so that the touch screen display is enabled for use as a viewfinder for still and/or video image acquisition.
- an optical sensor is located on the front of the device so that the user's image is, optionally, obtained for video conferencing while the user views the other video conference participants on the touch screen display.
- the position of optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 is used along with the touch screen display for both video conferencing and still and/or video image acquisition.
- Device 100 optionally also includes one or more depth camera sensors 175 .
- FIG. 1A shows a depth camera sensor coupled to depth camera controller 169 in I/O subsystem 106 .
- Depth camera sensor 175 receives data from the environment to create a three dimensional model of an object (e.g., a face) within a scene from a viewpoint (e.g., a depth camera sensor).
- in conjunction with imaging module 143 (also called a camera module), depth camera sensor 175 is optionally used to determine a depth map of different portions of an image captured by the imaging module 143 .
- a depth camera sensor is located on the front of device 100 so that the user's image with depth information is, optionally, obtained for video conferencing while the user views the other video conference participants on the touch screen display and to capture selfies with depth map data.
- the depth camera sensor 175 is located on the back of device 100 , or on the back and the front of the device 100 .
- the position of depth camera sensor 175 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a depth camera sensor 175 is used along with the touch screen display for both video conferencing and still and/or video image acquisition.
- Device 100 optionally also includes one or more contact intensity sensors 165 .
- FIG. 1A shows a contact intensity sensor coupled to intensity sensor controller 159 in I/O subsystem 106 .
- Contact intensity sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface).
- Contact intensity sensor 165 receives contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment.
- At least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112 ). In some embodiments, at least one contact intensity sensor is located on the back of device 100 , opposite touch screen display 112 , which is located on the front of device 100 .
- Device 100 optionally also includes one or more proximity sensors 166 .
- FIG. 1A shows proximity sensor 166 coupled to peripherals interface 118 .
- proximity sensor 166 is, optionally, coupled to input controller 160 in I/O subsystem 106 .
- Proximity sensor 166 optionally performs as described in U.S. patent application Ser. No. 11/241,839, “Proximity Detector In Handheld Device”; Ser. No. 11/240,788, “Proximity Detector In Handheld Device”; Ser. No. 11/620,702, “Using Ambient Light Sensor To Augment Proximity Sensor Output”; Ser. No. 11/586,862, “Automated Response To And Sensing Of User Activity In Portable Devices”; and Ser.
- the proximity sensor turns off and disables touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
- Device 100 optionally also includes one or more tactile output generators 167 .
- FIG. 1A shows a tactile output generator coupled to haptic feedback controller 161 in I/O subsystem 106 .
- Tactile output generator 167 optionally includes one or more electroacoustic devices such as speakers or other audio components and/or electromechanical devices that convert energy into linear motion such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile output generating component (e.g., a component that converts electrical signals into tactile outputs on the device).
- Tactile output generator 167 receives tactile feedback generation instructions from haptic feedback module 133 and generates tactile outputs on device 100 that are capable of being sensed by a user of device 100 .
- At least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112 ) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of a surface of device 100 ) or laterally (e.g., back and forth in the same plane as a surface of device 100 ).
- at least one tactile output generator sensor is located on the back of device 100 , opposite touch screen display 112 , which is located on the front of device 100 .
- Device 100 optionally also includes one or more accelerometers 168 .
- FIG. 1A shows accelerometer 168 coupled to peripherals interface 118 .
- accelerometer 168 is, optionally, coupled to an input controller 160 in I/O subsystem 106 .
- Accelerometer 168 optionally performs as described in U.S. Patent Publication No. 20050190059, “Acceleration-based Theft Detection System for Portable Electronic Devices,” and U.S. Patent Publication No. 20060017692, “Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer,” both of which are incorporated by reference herein in their entirety.
- information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
- Device 100 optionally includes, in addition to accelerometer(s) 168 , a magnetometer and a GPS (or GLONASS or other global navigation system) receiver for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100 .
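A minimal sketch of the portrait/landscape decision described above, assuming gravity components from an accelerometer and a hypothetical tie-break toward portrait (the threshold logic is an assumption, not the patented method):

```swift
// Illustrative portrait/landscape decision from raw accelerometer data.
enum InterfaceOrientation { case portrait, landscape }

func orientation(fromGravityX gx: Double, gravityY gy: Double) -> InterfaceOrientation {
    // When the device is held upright, gravity dominates the y axis; when it is
    // rotated onto its side, gravity dominates the x axis.
    return abs(gy) >= abs(gx) ? .portrait : .landscape
}
```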
- the software components stored in memory 102 include operating system 126 , communication module (or set of instructions) 128 , contact/motion module (or set of instructions) 130 , graphics module (or set of instructions) 132 , text input module (or set of instructions) 134 , Global Positioning System (GPS) module (or set of instructions) 135 , and applications (or sets of instructions) 136 .
- memory 102 ( FIG. 1A ) or 370 ( FIG. 3 ) stores device/global internal state 157 , as shown in FIGS. 1A and 3 .
- Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 112 ; sensor state, including information obtained from the device's various sensors and input control devices 116 ; and location information concerning the device's location and/or attitude.
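For illustration only, the device/global internal state described above could be modeled as a simple record; the field names and types below are assumptions, not the patent's data layout:

```swift
// Minimal sketch of device/global internal state 157; names are hypothetical.
struct DeviceGlobalState {
    var activeApplications: Set<String>      // active application state
    var displayRegions: [String: String]     // display state: region -> occupying view/app
    var sensorReadings: [String: Double]     // sensor state from input control devices
    var location: (latitude: Double, longitude: Double)?  // location/attitude information
}
```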
- Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
- Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124 .
- External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
- the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod® (trademark of Apple Inc.) devices.
- Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156 ) and other touch-sensitive devices (e.g., a touchpad or physical click wheel).
- Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact).
- Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
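A small sketch of the speed and velocity computations named above, from a time-stamped series of contact data points (the types and the zero-time-step guard are assumptions for illustration):

```swift
// Illustrative computation of velocity and speed of a point of contact from
// a series of contact data points, as the contact/motion module is described
// as doing; names are hypothetical.
struct ContactPoint { let x: Double; let y: Double; let t: Double }  // t in seconds

func velocity(from a: ContactPoint, to b: ContactPoint) -> (vx: Double, vy: Double) {
    let dt = max(b.t - a.t, 1e-6)  // guard against a zero time step
    return ((b.x - a.x) / dt, (b.y - a.y) / dt)
}

func speed(from a: ContactPoint, to b: ContactPoint) -> Double {
    let v = velocity(from: a, to: b)
    return (v.vx * v.vx + v.vy * v.vy).squareRoot()  // magnitude of velocity
}
```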
- contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon).
- at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100 ). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware.
- a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
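As an illustrative sketch of such software settings, a single system-level click-intensity parameter might rescale all thresholds at once; the default value and the linear scaling rule here are assumptions:

```swift
// Sketch of software-adjustable intensity thresholds: one system-level
// "click intensity" setting rescales every threshold at once. Hypothetical names.
struct IntensityThresholds {
    var lightPress: Double = 0.3
    var deepPress: Double = 0.7

    // 0.5 is the assumed default; lower values make presses easier to trigger.
    mutating func apply(systemClickIntensity: Double) {
        let scale = systemClickIntensity / 0.5
        lightPress *= scale
        deepPress *= scale
    }
}
```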
- Contact/motion module 130 optionally detects a gesture input by a user.
- Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts).
- a gesture is, optionally, detected by detecting a particular contact pattern.
- detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon).
- detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
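The tap and swipe patterns just described lend themselves to a compact classifier; the event model, the 10-point movement slop, and all names below are assumptions for illustration:

```swift
// Illustrative recognition of a tap versus a swipe from the event pattern the
// text describes: finger-down, optional finger-drags, then finger-up (liftoff).
enum TouchEvent {
    case fingerDown(x: Double, y: Double)
    case fingerDrag(x: Double, y: Double)
    case fingerUp(x: Double, y: Double)
}

enum Gesture { case tap, swipe, none }

func classify(_ events: [TouchEvent], slop: Double = 10.0) -> Gesture {
    guard case let .fingerDown(x0, y0)? = events.first,
          case let .fingerUp(x1, y1)? = events.last else { return .none }
    let moved = ((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0)).squareRoot()
    let hasDrag = events.contains {
        if case .fingerDrag = $0 { return true } else { return false }
    }
    // A tap lifts off at (substantially) the same position as the finger-down;
    // a swipe includes one or more drag events and ends away from its origin.
    if moved <= slop && !hasDrag { return .tap }
    if moved > slop && hasDrag { return .swipe }
    return .none
}
```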
- Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed.
- graphics includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
- graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156 .
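By way of a hypothetical sketch of this code-based pipeline (the types below and the string-based "screen image data" are stand-ins, not the module's actual interfaces):

```swift
// Sketch of the described graphics flow: applications submit codes naming
// stored graphics, plus coordinate data; the module assembles screen data
// for the display controller. Everything here is an illustrative stand-in.
struct Graphic { let code: Int; let bitmapName: String }

struct DrawCommand { let code: Int; let x: Int; let y: Int }

final class GraphicsModuleSketch {
    private let store: [Int: Graphic]  // each graphic assigned a corresponding code

    init(graphics: [Graphic]) {
        store = Dictionary(uniqueKeysWithValues: graphics.map { ($0.code, $0) })
    }

    // Receives codes specifying graphics, with coordinate data, and produces a
    // description of the screen image to hand to the display controller.
    func screenImageData(for commands: [DrawCommand]) -> [String] {
        commands.compactMap { cmd in
            store[cmd.code].map { "\($0.bitmapName) at (\(cmd.x), \(cmd.y))" }
        }
    }
}
```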
- Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100 .
- Text input module 134 which is, optionally, a component of graphics module 132 , provides soft keyboards for entering text in various applications (e.g., contacts 137 , e-mail 140 , IM 141 , browser 147 , and any other application that needs text input).
- GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
- Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: …
- Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
- contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370 ), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138 , video conference module 139 , e-mail 140 , or IM 141 ; and so forth.
- telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137 , modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed.
- the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
- video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
- e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions.
- e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143 .
- the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages.
- transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS).
- instant messaging refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
- workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
- camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102 , modify characteristics of a still image or video, or delete a still image or video from memory 102 .
- image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
- browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
- calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
- widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149 - 1 , stocks widget 149 - 2 , calculator widget 149 - 3 , alarm clock widget 149 - 4 , and dictionary widget 149 - 5 ) or created by the user (e.g., user-created widget 149 - 6 ).
- a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file.
- a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
- the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
- search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
- video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124 ).
- device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
- notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
- map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions.
- online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124 ), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264.
- instant messaging module 141 is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Jun. 20, 2007, and U.S. patent application Ser. No. 11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Dec. 31, 2007, the contents of which are hereby incorporated by reference in their entirety.
- Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein).
- These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments.
- video player module is, optionally, combined with music player module into a single module (e.g., video and music player module 152 , FIG. 1A ).
- memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
- device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad.
- By using a touch screen and/or a touchpad as the primary input control device for operation of device 100 , the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
- the predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces.
- the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100 .
- a “menu button” is implemented using a touchpad.
- the menu button is a physical push button or other physical input control device instead of a touchpad.
- FIG. 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments.
- memory 102 ( FIG. 1A ) or 370 ( FIG. 3 ) includes event sorter 170 (e.g., in operating system 126 ) and a respective application 136 - 1 (e.g., any of the aforementioned applications 137 - 151 , 155 , 380 - 390 ).
- Event sorter 170 receives event information and determines the application 136 - 1 and application view 191 of application 136 - 1 to which to deliver the event information.
- Event sorter 170 includes event monitor 171 and event dispatcher module 174 .
- application 136 - 1 includes application internal state 192 , which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing.
- device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
- application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136 - 1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136 - 1 , a state queue for enabling the user to go back to a prior state or view of application 136 - 1 , and a redo/undo queue of previous actions taken by the user.
- Event monitor 171 receives event information from peripherals interface 118 .
- Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112 , as part of a multi-touch gesture).
- Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166 , accelerometer(s) 168 , and/or microphone 113 (through audio circuitry 110 ).
- Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
- event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
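- As a rough sketch of such a significance filter (in Swift, with hypothetical names and thresholds — the actual filtering criteria are not specified here):

```swift
// A minimal significance filter, assuming hypothetical names and thresholds.
import Foundation

func isSignificantEvent(magnitude: Double,
                        duration: TimeInterval,
                        noiseThreshold: Double = 0.05,
                        minimumDuration: TimeInterval = 0.01) -> Bool {
    // Forward event information only for inputs above the noise threshold
    // that persist for at least the minimum duration.
    magnitude > noiseThreshold && duration >= minimumDuration
}
```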
- event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173 .
- Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
- the application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
- Hit view determination module 172 receives information related to sub-events of a touch-based gesture.
- hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event).
- the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
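- The hit-view search described above can be pictured as a recursive walk of the view hierarchy. The following Swift sketch uses illustrative types (Point, View) rather than the actual event-handling API, and assumes all frames share one coordinate space:

```swift
// Illustrative types only; not the actual framework API. Assumes all
// frames are expressed in a single shared coordinate space.
struct Point { var x: Double; var y: Double }

final class View {
    var frame: (x: Double, y: Double, width: Double, height: Double)
    var subviews: [View] = []
    init(frame: (x: Double, y: Double, width: Double, height: Double)) {
        self.frame = frame
    }
    func contains(_ p: Point) -> Bool {
        p.x >= frame.x && p.x < frame.x + frame.width &&
        p.y >= frame.y && p.y < frame.y + frame.height
    }
}

// Returns the lowest (deepest) view containing the touch point: the hit view.
func hitView(for point: Point, in root: View) -> View? {
    guard root.contains(point) else { return nil }
    // Front-most subviews are searched first; the deepest match wins.
    for subview in root.subviews.reversed() {
        if let hit = hitView(for: point, in: subview) {
            return hit
        }
    }
    return root // No subview contains the point, so this view is the hit view.
}
```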
- Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
- Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180 ). In embodiments including active event recognizer determination module 173 , event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173 . In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver 182 .
- operating system 126 includes event sorter 170 .
- application 136 - 1 includes event sorter 170 .
- event sorter 170 is a stand-alone module, or a part of another module stored in memory 102 , such as contact/motion module 130 .
- application 136 - 1 includes a plurality of event handlers 190 and one or more application views 191 , each of which includes instructions for handling touch events that occur within a respective view of the application's user interface.
- Each application view 191 of the application 136 - 1 includes one or more event recognizers 180 .
- a respective application view 191 includes a plurality of event recognizers 180 .
- one or more of event recognizers 180 are part of a separate module, such as a user interface kit or a higher level object from which application 136 - 1 inherits methods and other properties.
- a respective event handler 190 includes one or more of: data updater 176 , object updater 177 , GUI updater 178 , and/or event data 179 received from event sorter 170 .
- Event handler 190 optionally utilizes or calls data updater 176 , object updater 177 , or GUI updater 178 to update the application internal state 192 .
- one or more of the application views 191 include one or more respective event handlers 190 .
- one or more of data updater 176 , object updater 177 , and GUI updater 178 are included in a respective application view 191 .
- a respective event recognizer 180 receives event information (e.g., event data 179 ) from event sorter 170 and identifies an event from the event information.
- Event recognizer 180 includes event receiver 182 and event comparator 184 .
- event recognizer 180 also includes at least a subset of: metadata 183 , and event delivery instructions 188 (which optionally include sub-event delivery instructions).
- Event receiver 182 receives event information from event sorter 170 .
- the event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
- Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event.
- event comparator 184 includes event definitions 186 .
- Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 ( 187 - 1 ), event 2 ( 187 - 2 ), and others.
- sub-events in an event ( 187 ) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching.
- the definition for event 1 is a double tap on a displayed object.
- the double tap for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase.
- the definition for event 2 is a dragging on a displayed object.
- the dragging for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112 , and liftoff of the touch (touch end).
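- One way to picture event definitions 186 is as named sequences of sub-events that an event comparator matches observed sub-events against. The Swift sketch below is illustrative only; it uses hypothetical type names and omits the "predetermined phase" (timing) constraints described above:

```swift
// Hypothetical types; timing ("predetermined phase") constraints are omitted.
enum SubEvent: Equatable {
    case touchBegin, touchEnd, touchMove, touchCancel
}

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

// Event 1: a double tap is two begin/lift-off pairs on the same object.
let doubleTap = EventDefinition(
    name: "doubleTap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])

// Event 2: a drag is a touch, movement across the display, and lift-off.
let drag = EventDefinition(
    name: "drag",
    sequence: [.touchBegin, .touchMove, .touchEnd])

// A comparator in this model checks whether the sub-events observed so far
// are still consistent with (a prefix of) a definition's sequence.
func stillMatches(_ observed: [SubEvent], _ definition: EventDefinition) -> Bool {
    observed.count <= definition.sequence.count &&
        Array(definition.sequence.prefix(observed.count)) == observed
}
```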
- the event also includes information for one or more associated event handlers 190 .
- event definition 187 includes a definition of an event for a respective user-interface object.
- event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112 , when a touch is detected on touch-sensitive display 112 , event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190 , the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
- the definition for a respective event also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
- When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186 , the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
- a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers.
- metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another.
- metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
- a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized.
- a respective event recognizer 180 delivers event information associated with the event to event handler 190 .
- Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view.
- event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
- event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
- data updater 176 creates and updates data used in application 136 - 1 .
- data updater 176 updates the telephone number used in contacts module 137 , or stores a video file used in video player module.
- object updater 177 creates and updates objects used in application 136 - 1 .
- object updater 177 creates a new user-interface object or updates the position of a user-interface object.
- GUI updater 178 updates the GUI.
- GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
- event handler(s) 190 includes or has access to data updater 176 , object updater 177 , and GUI updater 178 .
- data updater 176 , object updater 177 , and GUI updater 178 are included in a single module of a respective application 136 - 1 or application view 191 . In other embodiments, they are included in two or more software modules.
- The foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens.
- mouse movement and mouse button presses optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
- FIG. 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments.
- the touch screen optionally displays one or more graphics within user interface (UI) 200 .
- a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure).
- selection of one or more graphics occurs when the user breaks contact with the one or more graphics.
- the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100 .
- inadvertent contact with a graphic does not select the graphic.
- a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
- Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204 .
- menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100 .
- the menu button is implemented as a soft key in a GUI displayed on touch screen 112 .
- device 100 includes touch screen 112 , menu button 204 , push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208 , subscriber identity module (SIM) card slot 210 , headset jack 212 , and docking/charging external port 124 .
- Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process.
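- A minimal sketch of that press-duration logic, assuming an illustrative 2-second threshold (the predefined interval is not specified here):

```swift
// Illustrative press classification; the predefined interval is an assumption.
import Foundation

enum PushButtonAction { case powerToggle, lock }

func classifyPress(duration: TimeInterval,
                   predefinedInterval: TimeInterval = 2.0) -> PushButtonAction {
    // Holding past the interval toggles power; releasing earlier locks.
    duration >= predefinedInterval ? .powerToggle : .lock
}
```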
- device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113 .
- Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100 .
- FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
- Device 300 need not be portable.
- device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home or industrial controller).
- Device 300 typically includes one or more processing units (CPUs) 310 , one or more network or other communications interfaces 360 , memory 370 , and one or more communication buses 320 for interconnecting these components.
- Communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
- Device 300 includes input/output (I/O) interface 330 comprising display 340 , which is typically a touch screen display.
- I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355 , tactile output generator 357 for generating tactile outputs on device 300 (e.g., similar to tactile output generator(s) 167 described above with reference to FIG. 1A ), sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to FIG. 1A ).
- Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices remotely located from CPU(s) 310 . In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 ( FIG. 1A ), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100 .
- memory 370 of device 300 optionally stores drawing module 380 , presentation module 382 , word processing module 384 , website creation module 386 , disk authoring module 388 , and/or spreadsheet module 390 , while memory 102 of portable multifunction device 100 ( FIG. 1A ) optionally does not store these modules.
- Each of the above-identified elements in FIG. 3 is, optionally, stored in one or more of the previously mentioned memory devices.
- Each of the above-identified modules corresponds to a set of instructions for performing a function described above.
- the above-identified modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments.
- memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above.
- FIG. 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300 .
- user interface 400 includes the elements illustrated in FIG. 4A , or a subset or superset thereof.
- icon labels illustrated in FIG. 4A are merely exemplary.
- icon 422 for video and music player module 152 is labeled “Music” or “Music Player.”
- Other labels are, optionally, used for various application icons.
- a label for a respective application icon includes a name of an application corresponding to the respective application icon.
- a label for a particular application icon is distinct from a name of an application corresponding to the particular application icon.
- FIG. 4B illustrates an exemplary user interface on a device (e.g., device 300 , FIG. 3 ) with a touch-sensitive surface 451 (e.g., a tablet or touchpad 355 , FIG. 3 ) that is separate from the display 450 (e.g., touch screen display 112 ).
- Device 300 also, optionally, includes one or more contact intensity sensors (e.g., one or more of sensors 359 ) for detecting intensity of contacts on touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of device 300 .
- the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in FIG. 4B .
- the touch-sensitive surface (e.g., 451 in FIG. 4B ) has a primary axis (e.g., 452 in FIG. 4B ) that corresponds to a primary axis (e.g., 453 in FIG. 4B ) on the display (e.g., 450 ).
- the device detects contacts (e.g., 460 and 462 in FIG. 4B ) with the touch-sensitive surface at locations that correspond to respective locations on the display.
- While the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or stylus input).
- a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact).
- a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact).
- When multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
- FIG. 5A illustrates exemplary personal electronic device 500 .
- Device 500 includes body 502 .
- device 500 can include some or all of the features described with respect to devices 100 and 300 (e.g., FIGS. 1A-4B ).
- device 500 has touch-sensitive display screen 504 , hereafter touch screen 504 .
- touch screen 504 optionally includes one or more intensity sensors for detecting intensity of contacts (e.g., touches) being applied.
- the one or more intensity sensors of touch screen 504 (or the touch-sensitive surface) can provide output data that represents the intensity of touches.
- the user interface of device 500 can respond to touches based on their intensity, meaning that touches of different intensities can invoke different user interface operations on device 500 .
- Exemplary techniques for detecting and processing touch intensity are found, for example, in related applications: International Patent Application Serial No. PCT/US2013/040061, titled “Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application,” filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No. PCT/US2013/069483, titled “Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships,” filed Nov. 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is hereby incorporated by reference in their entirety.
- device 500 has one or more input mechanisms 506 and 508 .
- Input mechanisms 506 and 508 can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms.
- device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit device 500 to be worn by a user.
- FIG. 5B depicts exemplary personal electronic device 500 .
- device 500 can include some or all of the components described with respect to FIGS. 1A, 1B , and 3 .
- Device 500 has bus 512 that operatively couples I/O section 514 with one or more computer processors 516 and memory 518 .
- I/O section 514 can be connected to display 504 , which can have touch-sensitive component 522 and, optionally, intensity sensor 524 (e.g., contact intensity sensor).
- I/O section 514 can be connected with communication unit 530 for receiving application and operating system data, using Wi-Fi, Bluetooth, near field communication (NFC), cellular, and/or other wireless communication techniques.
- Device 500 can include input mechanisms 506 and/or 508 .
- Input mechanism 506 is, optionally, a rotatable input device or a depressible and rotatable input device, for example.
- Input mechanism 508 is, optionally, a button, in some examples.
- Input mechanism 508 is, optionally, a microphone, in some examples.
- Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532 , accelerometer 534 , directional sensor 540 (e.g., compass), gyroscope 536 , motion sensor 538 , and/or a combination thereof, all of which can be operatively connected to I/O section 514 .
- Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516 , for example, can cause the computer processors to perform the techniques described below, including processes 700 ( FIGS. 7A-7C ), 900 ( FIGS. 9A-9B ), 1100 ( FIGS. 11A-11H ), 1300 ( FIGS. 13A-13C ), 1500 ( FIGS. 15A-15F ), 1700 ( FIGS. 17A-17D ), and 1900 ( FIGS. 19A-19C ).
- a computer-readable storage medium can be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device.
- the storage medium is a transitory computer-readable storage medium.
- the storage medium is a non-transitory computer-readable storage medium.
- the non-transitory computer-readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storage. Examples of such storage include magnetic disks, optical discs based on CD, DVD, or Blu-ray technologies, as well as persistent solid-state memory such as flash and solid-state drives.
- Personal electronic device 500 is not limited to the components and configuration of FIG. 5B , but can include other or additional components in multiple configurations.
- the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100 , 300 , and/or 500 ( FIGS. 1A, 3, and 5A-5B ).
- Examples of affordances include images (e.g., icons), buttons, and text (e.g., hyperlinks).
- the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting.
- the cursor acts as a “focus selector” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in FIG. 3 or touch-sensitive surface 451 in FIG. 4B ) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input.
- In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in FIG. 1A ) that enables direct interaction with user interface elements on the touch screen display, a detected contact on the touch screen acts as a “focus selector” so that when an input (e.g., a press input by the contact) is detected on the touch screen display at a location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input.
- focus is moved from one region of a user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on a touch screen display (e.g., by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface.
- the focus selector is generally the user interface element (or contact on a touch screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user is intending to interact).
- For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or touch screen) will indicate that the user is intending to activate the respective button (as opposed to other user interface elements shown on a display of the device).
- Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100 , device 300 , or device 500 .
- FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 7A-7C .
- In FIG. 6A , device 600 displays watch user interface 604 A, which includes first analog dial 608 concurrently displayed with second analog dial 606 .
- Hour hand 608 A, minute hand 608 B, and seconds hand 608 C indicate the hour, minute, and second (respectively) of a current time in a first time zone on first analog dial 608 .
- First analog dial 608 represents a period of 12 hours (e.g., hour hand 608 A will make a full rotation every 12 hours).
- Clock hand 608 D indicates a current time in a second time zone on second analog dial 606 .
- Second analog dial 606 represents a period of 24 hours (e.g., clock hand 608 D will make a full rotation every 24 hours).
- Marker 606 C indicates the position of midnight on second analog dial 606 (e.g., clock hand 608 D will point to marker 606 C at midnight in the second time zone).
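- The hand geometry implied by these dials reduces to simple arithmetic: the hour hand sweeps 30 degrees per hour on the 12-hour dial, while clock hand 608 D sweeps 15 degrees per hour on the 24-hour dial. A Swift sketch with illustrative function names:

```swift
// Illustrative names; angles are in degrees, clockwise from the 12 o'clock
// (midnight) position.
func hourHandAngle12(hour: Double, minute: Double) -> Double {
    // A full rotation every 12 hours: 30 degrees per hour.
    (hour.truncatingRemainder(dividingBy: 12) + minute / 60) * 30
}

func clockHandAngle24(hour: Double, minute: Double) -> Double {
    // A full rotation every 24 hours: 15 degrees per hour, so the hand
    // points at the midnight marker at 0:00.
    (hour + minute / 60) * 15
}

// At 6:00 AM: hourHandAngle12(hour: 6, minute: 0) == 180,
// clockHandAngle24(hour: 6, minute: 0) == 90.
```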
- Time zone indicator 608 E displays a textual indication (“LAX”, representing Los Angeles) of the time zone associated with second analog dial 606 (e.g., an abbreviation of a geographic location within the time zone associated with second analog dial 606 ).
- second analog dial 606 is a ring that surrounds first analog dial 608 and has a first orientation relative to first analog dial 608 .
- Second analog dial 606 is oriented such that midnight on second analog dial 606 is aligned with the 12 o'clock hour on first analog dial 608 .
- First analog dial 608 and second analog dial 606 are associated with respective time zones.
- Watch user interface 604 A includes time zone indicator 608 E of the time zone associated with second analog dial 606 (e.g., a location in the time zone associated with the second analog dial 606 ).
- In FIG. 6A , first analog dial 608 and second analog dial 606 are associated with the same time zone, a first time zone, and the time indicator associated with each dial (e.g., hour hand 608 A, minute hand 608 B, and/or seconds hand 608 C for first analog dial 608 , and clock hand 608 D for second analog dial 606 ) indicates the same time (the current time in the first time zone).
- In this example, the first time zone is the Pacific time zone, and the current time in the Pacific time zone is 6:00 AM. Hour hand 608 A and minute hand 608 B indicate 6:00 AM on first analog dial 608 , and clock hand 608 D indicates 6:00 AM on second analog dial 606 .
- second analog dial 606 includes tick marks, representing the positions on second analog dial 606 corresponding to respective hours, and current hour indicator 606 D, which includes a numerical indicator of the hour of the current time in the time zone associated with second analog dial 606 (e.g., second analog dial 606 includes a single numerical indicator only for the hour of the current time).
- current hour indicator 606 D is displayed only if the time zone associated with second analog dial 606 is different from the time zone associated with first analog dial 608 .
- second analog dial 606 includes numerical indicators at all hour positions or at two or more, but less than all, hour positions.
- Second analog dial 606 includes first portion 606 A, which corresponds to nighttime in the time zone associated with the second analog dial, and second portion 606 B (e.g., the portion of second analog dial 606 that is not included in first portion 606 A), which corresponds to daytime in the time zone associated with the second analog dial.
- First portion 606 A and second portion 606 B have different visual characteristics (e.g., different color, brightness, transparency, or pattern).
- The boundary between first portion 606 A and second portion 606 B that is in the clockwise direction from midnight marker 606 C corresponds to the sunrise time (approximately at the 6 o'clock hour position), and the boundary between first portion 606 A and second portion 606 B that is in the counter-clockwise direction from midnight marker 606 C corresponds to the sunset time (approximately at the 8 o'clock hour position).
- the size (e.g., angular extent) of first portion 606 A is smaller than the size of second portion 606 B, which indicates that nighttime is shorter than daytime.
- the size and/or position (e.g., the angular extent and/or angular position) of first portion 606 A and second portion 606 B on second analog dial 606 depends on the time zone, time of year, and/or a geographic location associated with the time zone (e.g., first portion 606 A representing nighttime is smaller when it is summer in a location associated with the selected time zone than when it is winter in the same location).
- first portion 606 A and second portion 606 B are displayed differently when second analog dial 606 is associated with a first location in a first time zone than they are when second analog dial 606 is associated with a second location (e.g., a location different from the first location) in the first time zone (e.g., the same time zone).
- first portion 606 A and second portion 606 B are displayed differently when second analog dial 606 is associated with Cleveland than when second analog dial 606 is associated with New York City (e.g., for Cleveland, first portion 606 A and second portion 606 B are rotated clockwise relative to marker 606 C compared to their position for New York City).
- first portion 606 A and second portion 606 B are displayed differently when second analog dial 606 is associated with Seattle than when second analog dial 606 is associated with San Diego (e.g., during summer in Seattle and San Diego, first portion 606 A has a smaller angular extent and second portion 606 B has a larger angular extent for Seattle as compared to the angular extents for San Diego).
- first portion 606 A and second portion 606 B are displayed accordingly based on the time of year for a particular location (e.g., first portion 606 A representing nighttime has a larger angular extent in winter than in summer, for a particular location).
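- One plausible way to derive the two portions, sketched in Swift with hypothetical names (the document does not specify the computation): treat sunrise and sunset as fractional hours and convert the nighttime span, which wraps through midnight, into a start angle and angular extent on the 24-hour dial.

```swift
// Hypothetical computation; hours are fractional (19.5 == 7:30 PM) and the
// dial spans 360 degrees over 24 hours. Assumes sunrise precedes sunset.
struct DialPortion {
    var startAngle: Double  // degrees clockwise from the midnight marker
    var extent: Double      // angular size in degrees
}

func nighttimePortion(sunriseHour: Double, sunsetHour: Double) -> DialPortion {
    let degreesPerHour = 360.0 / 24.0
    // Night runs from sunset, through midnight, to the next sunrise.
    let nightHours = (24 - sunsetHour) + sunriseHour
    return DialPortion(startAngle: sunsetHour * degreesPerHour,
                       extent: nightHours * degreesPerHour)
}

// For the Los Angeles example (sunrise ~6 AM, sunset ~8 PM): the nighttime
// portion starts at 300 degrees and spans 150 degrees, smaller than the
// 210-degree daytime portion, matching "nighttime is shorter than daytime".
```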
- FIG. 6B illustrates device 600 displaying watch user interface 604 A at a different time (10:09 AM Pacific time) compared to FIG. 6A , as indicated by the position of hour hand 608 A and minute hand 608 B relative to first analog dial 608 , and the position of clock hand 608 D relative to second analog dial 606 .
- Current hour indicator 606 D is displayed at the 10 o'clock hour on second analog dial 606 according to the current time associated with second analog dial 606 , and a tick mark is displayed at the 6 o'clock hour on second analog dial 606 , where current hour indicator 606 D was located in FIG. 6A when the current time was 6:00 AM.
- Device 600 receives (e.g., detects) a request to change the time zone associated with second analog dial 606 .
- the request includes a sequence of one or more inputs (e.g., one or more of inputs 610 , 618 , 620 , or 622 ).
- device 600 receives (e.g., detects) input 610 (e.g., a gesture, a tap on display 602 ).
- input 610 includes a rotation of rotatable input mechanism 603 .
- rotatable input mechanism 603 is physically connected to device 600 (e.g., to a housing of device 600 ).
- rotatable input mechanism 603 has an axis of rotation that is parallel to a surface of display 602 (e.g., rotatable input mechanism 603 is attached to a side of device 600 that is perpendicular to a surface of display 602 ).
- device 600 In response to receiving input 610 , device 600 displays watch user interface 612 A shown in FIG. 6C .
- Watch user interface 612 A provides a user interface for changing the time zone associated with second analog dial 606 .
- second analog dial 606 includes numerical hour indicators at the positions on second analog dial 606 corresponding to respective hours (e.g., the tick marks shown in FIG. 6B are replaced with the numerals shown in FIG. 6C ). Display of marker 606 C is maintained.
- Watch user interface 612 A includes visual indication 614 of the current time in the time zone associated with second analog dial 606 .
- visual indication 614 includes a circle around the respective numerical hour indicator corresponding to the hour of the current time in the time zone associated with second analog dial 606 .
- visual indication 614 includes highlighting of the respective numerical hour indicator and/or display of the respective numerical indicator with a different visual characteristic (e.g., style, color, size, font) than the other numerical hour indicators.
- Watch user interface 612 A includes time zone selection element 616 , which displays a designated time zone option corresponding to the time zone associated with the second analog dial.
- time zone selection element 616 replaces the display of first analog dial 608 (e.g., device 600 ceases display of first analog dial 608 and displays time zone selection element 616 ) and complications 605 A- 605 D are replaced with affordance 607 (e.g., device 600 ceases display of complications 605 A- 605 D and displays affordance 607 ).
- In some embodiments, device 600 displays complications 605 A- 605 D in watch user interface 612 A. In some embodiments, device 600 does not display affordance 607 in watch user interface 612 A.
- time zone selection element 616 includes a list of selectable time zone options arranged according to the difference in time (also referred to as the offset) between the current time in the time zone associated with first analog dial 608 (or the time zone in which device 600 is located) and the respective time zone option.
- the time zone option corresponding to the time zone associated with second analog dial 606 is designated by being visually distinguished (e.g., placed in focus, emphasized, outlined, displayed without displaying other time zone options, highlighted in a different color than other time zone options, displayed brighter than or with less transparency than other time zone options).
- the time zone option corresponding to the time zone associated with second analog dial 606 is visually distinguished by being displayed in the center of time zone selection element 616 and at a larger size than the other time zone options.
- the time zone options show the current time in the corresponding time zone and an identifier of the time zone (referred to as a time zone identifier).
- a time zone identifier For example, in FIG. 6C , the option for the Mountain time zone includes the current time in the Mountain time zone (11:09) and text (DEN) indicating a location (Denver) within the Mountain time zone.
- the style of the time zone identifier can depend on the option.
- If an option corresponds to a particular geographic location, the time zone identifier includes text representing that geographic location; if the option corresponds to the time zone in which device 600 is located, then the time zone identifier includes a “current location” symbol (e.g., the arrow to the left of 10:09 in FIG. 6C ).
- If no geographic location is designated for an option, the time zone identifier includes a numerical indicator of the offset (e.g., since no geographic location is designated for the time zone adjacent to the West of the Pacific time zone, which has a current time of 9:09 corresponding to an offset of one hour behind, the time zone indicator includes the numerical indicator “−1”).
- the time zone identifier indicates the offset of the time zone option compared to Coordinated Universal Time (UTC) or Greenwich Mean Time (GMT).
- While displaying watch user interface 612 A, device 600 receives (e.g., detects) input 618 . In some embodiments, input 618 includes a rotation of rotatable input mechanism 603 ; in some embodiments, input 618 includes a gesture (e.g., a vertical swipe on display 602 ).
- In response to receiving input 618 , device 600 displays watch user interface 612 B shown in FIG. 6D .
- Watch user interface 612 B designates a different time zone option compared to FIG. 6C (e.g., device 600 changes the designated time zone option in response to input 618 ).
- the list of options in time zone selection element 616 has been shifted (e.g., scrolled) compared to FIG. 6C .
- second analog dial 606 is displayed at a different orientation (e.g., rotated) relative to time zone selection element 616 , as compared to FIG. 6C , to correspond to the designated time zone option.
- device 600 displays an animated rotation of second analog dial 606 and/or an animated scrolling or rotation of the list of options in time zone selection element 616 in response to receiving input 618 .
- the change in second analog dial 606 corresponds to the change in time zone selection element 616 such that the hour indicated by visual indication 614 in second analog dial 606 corresponds to the hour of the current time associated with the designated time zone option (DEN 11:09).
- second analog dial 606 is rotated counter-clockwise 1/24 th of a complete rotation (e.g., one hour) such that the hour numeral for the 11 o'clock hour is indicated by visual indication 614 (e.g., visual indication 614 maintains the same position while second analog dial 606 is rotated counter-clockwise).
- second analog dial 606 is rotated around an axis that is normal to a surface of display 602 and passes through the center of second analog dial 606 ; the list of time zone options is displayed such that the time zone options appear to rotate about an axis that is perpendicular to the axis of rotation of second analog dial 606 (e.g., the time zone options appear to rotate about an axis that is parallel to an axis of rotation of rotatable input mechanism 603 ; the time zone options appear to move at least partly in a direction normal to (e.g., toward and away from) a surface of display 602 , in addition to moving vertically on display 602 ).
- device 600 changes the offset by an amount that is based on (e.g., proportional to) a magnitude, speed, and/or direction of input 618 (e.g., an amount of rotation of rotatable input mechanism 603 ; a distance of a gesture). For example, the list of time zone options is scrolled by an amount proportional to the magnitude of input 618 , and second analog dial 606 is rotated by an amount proportional to the magnitude of input 618 .
- device 600 changes the offset based on a direction of input 618 (e.g., a direction of rotation of rotatable input mechanism 603 ; a direction of a gesture). For example, device 600 increases the offset (e.g., moves to a time zone option is that is further ahead in time) in response to an input in a first direction (e.g., a clockwise rotation, an upward gesture), and decreases the offset (e.g., moves to a times zone option that is further behind in time) in response to an input in a second direction (e.g., a direction opposite the first direction, a counter-clockwise rotation, a downward gesture).
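- The input-to-offset mapping described in the preceding paragraphs might be sketched as follows in Swift. The detent granularity, names, and sign conventions are assumptions; the key relationships from the text are one hour of offset per unit of input and a counter-rotation of the 24-hour dial by 1/24 of a turn (15 degrees) per hour:

```swift
// Hypothetical state and granularity; one input detent == one hour of offset.
struct TimeZoneSelectionState {
    var offsetHours: Int      // offset from the time zone of the first dial
    var dialRotation: Double  // degrees applied to the second analog dial
}

func applyCrownInput(_ state: TimeZoneSelectionState,
                     detents: Int) -> TimeZoneSelectionState {
    // Input in the first direction (positive detents) moves ahead in time;
    // input in the second direction (negative detents) moves behind.
    let offset = state.offsetHours + detents
    // One hour is 1/24 of a rotation (15 degrees); the dial counter-rotates
    // as the offset increases, keeping the current-hour numeral under the
    // fixed visual indication.
    return TimeZoneSelectionState(offsetHours: offset,
                                  dialRotation: Double(-offset) * 15)
}

// e.g., stepping from LAX (+0) to DEN (+1) rotates the dial to -15 degrees;
// continuing to LON (+8) rotates it to -120 degrees.
```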
- device 600 receives (e.g., detects) input 620 (e.g., a gesture, a rotation of rotatable input mechanism 603 ).
- input 620 includes a rotation of rotatable input mechanism 603 .
- input 620 is a continuation of input 618 (e.g., further rotation of rotatable input mechanism 603 ).
- In response to receiving input 620 , device 600 displays watch user interface 612 C shown in FIG. 6E .
- Watch user interface 612 C designates the time zone option corresponding to the time zone that is eight hours ahead of the time zone associated with first analog dial 608 (or the time zone in which device 600 is located), corresponding to an offset of +8 hours.
- the designated time zone option corresponds to the time zone in which London (LON) is located, where the current time is 6:09 PM (18:09 in 24-hour time).
- Second analog dial 606 is positioned to correspond to the designated time zone option such that the numerical indicator for the 18 o'clock hour is indicated by visual indication 614 (e.g., visual indication 614 maintains the same position while second analog dial 606 is rotated counter-clockwise from the orientation shown in FIG. 6D ).
- first portion 606 A and second portion 606 B are displayed (e.g., updated) according to the designated option (e.g., to represent daytime and nighttime based on the geographic location and time of year for the selected option, as described above).
- For example, first portion 606 A and second portion 606 B indicate sunrise and sunset times of approximately 6 AM and 8 PM, respectively, for Los Angeles in FIG. 6C , whereas they indicate sunrise and sunset times of 7 AM and 7 PM, respectively, for London in FIG. 6E .
- device 600 receives (e.g., detects) input 622 .
- input 622 includes a tap on an affordance (e.g., “SET” affordance 607 ) on display 602 .
- input 622 includes a press of rotatable and depressible input mechanism 603 .
- input 622 includes a contact on display 602 (e.g., a contact anywhere on display 602 , a contact at a location outside of second analog dial 606 , a tap on time zone selection element 616 ).
- device 600 In response to input 622 , device 600 associates the time zone option designated in FIG. 6E (e.g., the time zone option that is designated at the time of input 622 ) with second analog dial 606 (e.g., in response to input 622 , device 600 sets the time zone associated with second analog dial 606 to the time zone corresponding to the time zone option that is designated at the time of input 622 ).
- device 600 In response to input 622 , device 600 displays an animation, an embodiment of which is illustrated in FIGS. 6F-6G , resulting in display of watch user interface 604 B. In some embodiments, device 600 displays watch user interface 604 B in response to input 622 without the animation illustrated by FIGS. 6F-6G or with an animation different from the animation illustrated by FIGS. 6F-6G .
- second analog dial 606 includes tick marks indicating the positions of respective hours, and marker 606 C, similar to the appearance of second analog dial 606 in FIGS. 6A-6B .
- the numerical hour indicators shown in FIG. 6E fade out and the tick marks shown in FIG. 6F fade in.
- complications 605 A- 605 D are displayed (e.g., all at the same time, one at a time, while the tick marks are displayed, after the tick marks are displayed).
- Watch user interface 604 B is similar to watch user interface 604 A, except that second analog dial 606 is displayed at a different orientation relative to first analog dial 608 , clock hand 608 D indicates, on second analog dial 606 , the current time in the time zone selected in FIGS. 6C-6E , and current hour indicator 606 D indicates the hour of the current time in the time zone selected in FIGS. 6C-6E .
- the orientation of second analog dial 606 relative to first analog dial 608 corresponds to the offset between the time zone associated with second analog dial 606 and the time zone associated with first analog dial 608 .
- time zone indicator 608 E displays a textual indication (“LON”) of the time zone associated with second analog dial 606 (e.g., an abbreviation of a geographic location within the time zone associated with second analog dial 606 ).
- the position of clock hand 608 D relative to first analog dial 608 indicates the current time in the time zone associated with first analog dial 608 , regardless of the orientation of second analog dial 606 relative to first analog dial 608 (e.g., clock hand 608 D indicates the current time in the time zone associated with first analog dial 608 as if first analog dial 608 represented a 24-hour period of time; clock hand 608 D points to the 12 o'clock hour on first analog dial 608 at midnight in the time zone associated with first analog dial 608 and points to the 3 o'clock hour on first analog dial 608 at 6:00 AM in the time zone associated with first analog dial 608 ).
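- The mapping in the preceding paragraph amounts to compressing 24 clock hours onto the 12 markers of first analog dial 608 , so each marker represents two hours. A Swift sketch with an illustrative function name:

```swift
// Illustrative only: read a 24-hour time against the 12 markers of the
// 12-hour dial, so each marker represents two hours.
func twelveDialMarker(forHour hour: Double) -> Double {
    let marker = hour / 2
    // Midnight (0) maps to the 12 o'clock marker; 6 AM to 3; noon to 6;
    // 6 PM (18:00) to 9.
    return marker == 0 ? 12 : marker
}
```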
- In FIG. 6H , watch user interface 604 B is displayed at a different (e.g., later) time compared to FIG. 6G .
- the current time in the time zone associated with first analog dial 608 is 11:00 AM, as indicated by hour hand 608 A and minute hand 608 B.
- the corresponding current time in the time zone associated with second analog dial 606 is 7:00 PM (19:00 in 24-hour time).
- Second analog dial 606 has the same orientation relative to first analog dial 608 as in FIG. 6G (e.g., the orientation of second analog dial 606 relative to first analog dial 608 remains the same (e.g., is maintained) as time advances as long as the time zone associated with second analog dial 606 is not changed).
- Clock hand 608 D indicates the current time in the time zone associated with second analog dial 606 by being positioned at the location on the second analog dial representing 19:00.
- clock hand 608 D is rotated clockwise (e.g., clock hand 608 D advances clockwise at a rate of 1/24 th of a full rotation per hour) and current hour indicator 606 D is displayed at the 19 o'clock position instead of the 18 o'clock position.
- current hour indicator 606 D advances to the next adjacent hour position at the top of an hour (e.g., when the current time changes from 18:59 to 19:00).
- FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments.
- Method 700 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone).
- Some operations in method 700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 700 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system displays ( 702 ), via the display generation component (e.g., 602 ), a watch user interface (e.g., 604 A) (e.g., showing one or more times via an analog clock), wherein displaying the watch user interface includes concurrently displaying a first analog dial (e.g., 608 ) (e.g., a 12-hour dial) and a first time indicator (e.g., 608 A or 608 B) (e.g., an hour hand or an hour hand and a minute hand) that indicates a current time in a first time zone on the first analog dial (e.g., the current time; the time of the current time zone) ( 704 ), and a second analog dial (e.g., 606 ) (e.g., a 24-hour dial) and a second time indicator (e.g., 608 D) (e.g., an hour hand) that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial.
- the same time is indicated on both the first analog dial and the second analog dial.
- the second time indicator is displayed in a different color and/or shape than the first time indicator.
- the second analog dial surrounds the outside of the first analog dial.
- the second analog dial includes a graphical indicator (e.g., 606 C) (e.g., a marker; a triangular marker) of the midnight mark (e.g., the 24-hour mark of the 24-hour dial). Concurrently displaying the first analog dial that indicates the current time in the first time zone and the second analog dial that indicates the current time in the second time zone enables a user to quickly and easily view current times for different time zones with a reduced number of inputs.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system receives ( 710 ), via the one or more input devices, a request (e.g., 610 , 618 , 620 ) to change a time zone associated with the second analog dial (e.g., a time zone that is shown/represented via the second analog dial).
- In response to receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial (e.g., 606 ) ( 716 ), the computer system (e.g., 600 ) changes ( 718 ) the time zone associated with the second analog dial to a third time zone that is different from the first time zone.
- The computer system (e.g., 600 ) displays ( 722 ), via the display generation component (e.g., 602 ), the watch user interface (e.g., 604 A).
- Displaying the watch user interface includes concurrently displaying the first analog dial (e.g., 608 ) and the first time indicator (e.g., 608 A or 608 B) indicating a current time in the first time zone (e.g., the first time; the first time plus the amount of time that has passed since detecting the user input and rotating the second analog dial) on the first analog dial ( 724 ), and the second analog dial (e.g., 606 ) and the second time indicator (e.g., 608 D) indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial (e.g., based on the difference between the first time zone and the third time zone) ( 726 ).
- Displaying the current time in the third time zone on the second analog dial with the second analog dial being displayed at a second orientation relative to the first analog dial enables a user to efficiently view the current time at the third time zone relative to the current time at the first time zone.
- Providing additional features on a user interface without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first analog dial (e.g., 608 ) represents a period of 12 hours and the first time indicator (e.g., 608 A or 608 B) includes at least a first clock hand (e.g., an hour hand) that indicates, on the first analog dial, the current time in the first time zone (e.g., the position of the first clock hand relative to the first analog dial indicates the current time in the first time zone), and the second analog dial (e.g., 606 ) represents a period of 24 hours and the second time indicator (e.g., 608 D) includes a second clock hand (e.g., an alternative hour hand) that indicates, on the second analog dial, the current time in the time zone associated with the second analog dial (e.g., the position of the second clock hand relative to the second analog dial indicates the current time in the time zone associated with the second analog dial).
- Providing the first analog dial that represents a period of 12 hours and the second analog dial that represents a period of 24 hours enables a user to easily distinguish between the two analog dials, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system displays ( 728 ), in the second analog dial, a numerical indication (e.g., 606 D) of an hour of the current time in the third time zone without displaying, in the second analog dial, a numerical indication of any other hour.
- the computer system displays, in the second analog dial, a numerical indication of an hour of the current time in the third time zone and numerical indications of a subset of (e.g., but not all of) other hours (e.g., one or more hours before and/or after the current hour, but not all 24 hours).
- the watch user interface (e.g., 604 A) includes a text indication (e.g., 608 E; a name; an abbreviation of the name) of a location (e.g., city; country; geographic region) associated with the second analog dial (e.g., 606 ) ( 730 ).
- Including the text indication of the location associated with the second analog dial in the watch user interface enables a user to easily identify the time zone displayed via the second analog dial, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the second analog dial (e.g., 606 ) includes ( 732 ) a first portion (e.g., 606 B) that corresponds to daytime in the time zone associated with the second analog dial (e.g., represented by portion 606 B in FIGS. 6A-6B and 6G-6H ) (e.g., the daytime hours; beginning at a point in the second analog dial corresponding to a sunrise time (e.g., a first boundary between portions 606 B and 606 A in FIGS. 6A-6B and 6G-6H ) and ending at a point in the second analog dial corresponding to a sunset time (e.g., a second boundary between portions 606 B and 606 A)), wherein the first portion includes a first visual characteristic (e.g., a first color; a first brightness/dimness level) ( 734 ), and a second portion (e.g., 606 A) (e.g., the remaining portion of the second analog dial other than the first portion) that corresponds to nighttime in the time zone associated with the second analog dial (e.g., represented by portion 606 A in FIGS. 6A-6B and 6G-6H ) (e.g., the nighttime hours; beginning at the point in the second analog dial corresponding to the sunset time and ending at the point in the second analog dial corresponding to the sunrise time), wherein the second portion includes a second visual characteristic different from the first visual characteristic (e.g., a second color; a second brightness/dimness level) ( 736 ).
- Providing the first portion that corresponds to daytime and the second portion that corresponds to nighttime in the time zone associated with the second analog dial provides information about daytime/nighttime hours at the time zone associated with the second analog dial in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- a first position in the second analog dial (e.g., 606 ) (e.g., the point in the second analog dial corresponding to the sunrise time) that corresponds to a beginning point for the first portion (e.g., 606 B) and an ending point for the second portion (e.g., 606 A) and a second position in the second analog dial (e.g., the point in the second analog dial corresponding to the sunset time) that corresponds to an ending point for the first portion and a beginning point for the second portion are determined (e.g., automatically) based on geographic location (e.g., the location (e.g., city; region) corresponding to the respective time zone) and time of year (e.g., the current month; the current season).
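- A minimal sketch of the mapping from sunrise and sunset times to dial portions, assuming the sunrise and sunset hours themselves come from an astronomical calculation for the selected geographic location and time of year (the type and function names are illustrative):

```swift
// An angular portion of the 24-hour dial, measured clockwise from midnight.
struct DialPortion {
    var startDegrees: Double
    var endDegrees: Double
}

// Maps fractional local hours (e.g., 6.5 == 6:30 AM) to the daytime arc;
// the remainder of the dial is the nighttime portion.
func daytimePortion(sunriseHour: Double, sunsetHour: Double) -> DialPortion {
    let degreesPerHour = 360.0 / 24.0
    return DialPortion(startDegrees: sunriseHour * degreesPerHour,
                       endDegrees: sunsetHour * degreesPerHour)
}

// Example: sunrise 6:30 and sunset 20:15 give a daytime arc from 97.5 to
// 303.75 degrees.
```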
- receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial (e.g., 606 ) includes detecting, via the one or more input devices (e.g., a touch-sensitive surface integrated with the display generation component), user input (e.g., 610 ) (e.g., touch input) directed to a location (e.g., the center region) on the watch user interface (e.g., 604 A) ( 712 ).
- the request is received while the computer system (e.g., 600 ) is displaying or causing display of, via the display generation component (e.g., 602 ), the watch user interface, and receiving the request does not require access of a menu or a dedicated editing mode to edit the second analog dial.
- changing (e.g., shifting; rotating) the second analog dial does not cause a change to other aspects or features of the watch user interface (e.g., the first analog dial; the first indication of time; displayed watch complications).
- receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial (e.g., 606 ) includes detecting, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), rotational input (e.g., 618 , 620 ) (e.g., in clockwise direction; in a counter-clockwise direction) of a rotatable input mechanism (e.g., 603 ) ( 714 ).
- changing the time zone associated with the second analog dial (e.g., 606 ) to a third time zone (e.g., the time zone corresponding to “LON” in FIGS. 6E-6H ) that is different from the first time zone (e.g., the current time zone associated with first analog dial 608 in FIGS. 6A-6B ) includes (e.g., in accordance with detecting an input (e.g., 618 , 620 ) directed to rotating the second analog dial (e.g., while detecting the input directed to rotating the second analog dial)) rotating (e.g., where the rotation is displayed (e.g., as an animation) while an input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) is being received), about a first rotational axis, the second analog dial (e.g., 606 ) to a respective orientation relative to the first analog dial (e.g., 608 ) (e.g., while the first analog dial is not rotated) (e.g., from the orientation of the second analog dial relative to the first analog dial shown in FIGS. 6A-6B ).
- the first rotational axis is perpendicular to a surface of the display generation component (e.g., 602 ).
- the first rotational axis goes through a center of the display generation component (e.g., 602 ).
- the first rotational axis is perpendicular to an axis of rotation of the input directed to rotating the second analog dial. Rotating the second analog dial about the first rotational axis, where the first rotational axis is perpendicular to a surface of the display generation component, when changing the time zone associated with the second analog dial provides visual feedback of the time zone being changed in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system (e.g., 600 ) rotates the second analog dial (e.g., 606 ) in the first direction (e.g., the clockwise direction) about a first rotational axis (e.g., a first axis that goes through the center of the watch user interface/display generation component and is perpendicular to the display generation component).
- the computer system (e.g., 600 ) rotates the second analog dial (e.g., 606 ) in the second direction (e.g., the counter-clockwise direction) about the first rotational axis.
- the rotational axis of the detected input (e.g., a rotational input; a touch input (e.g., a two-finger twisting input)) is perpendicular to the first rotational axis for rotation of the second analog dial (e.g., 606 ). In some embodiments, the rotational axis of the detected input (e.g., a rotational input; a touch input) is parallel to the first rotational axis for rotation of the second analog dial.
- the amount of rotation (e.g., amount of angle of rotation) of the second dial corresponds to (e.g., is directly proportional to) a magnitude of the user input (e.g., an angular magnitude of a rotation of the rotatable input device).
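- Read as a hedged sketch (names are illustrative), the proportionality means the dial's rotation tracks the input's angular magnitude directly; a plausible refinement, not stated here, is to quantize the resulting offset to whole hours to match the hour-based dial:

```swift
// Dial rotation is directly proportional to the crown's angular change.
func dialRotationDegrees(forCrownDegrees crownDegrees: Double,
                         gain: Double = 1.0) -> Double {
    crownDegrees * gain
}

// Hypothetical quantization: 15 degrees per hour on a 24-hour dial.
func snappedHourOffset(forDialDegrees dialDegrees: Double) -> Int {
    Int((dialDegrees / 15.0).rounded())
}
```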
- the computer system displays or causes display of, in the second analog dial, numbers corresponding to each time mark (e.g., each hour mark) in the second analog dial.
- changing the time zone associated with the second analog dial (e.g., 606 ) to a third time zone (e.g., the time zone corresponding to “LON” in FIGS. 6E-6H ) that is different from the first time zone (e.g., the current time zone associated with first analog dial 608 in FIGS. 6A-6B ) includes (e.g., in accordance with detecting an input (e.g., 618 , 620 ) directed to rotating a rotatable user interface element (e.g., 616 ) (e.g., while detecting the input directed to rotating the rotatable user interface element)) rotating, about a second rotational axis, the rotatable user interface element (e.g., as shown via rotation of time zone selection element 616 in FIGS. 6C-6E ) (e.g., while concurrently rotating the second analog dial (e.g., 606 ) to reflect the changing time zone), wherein the second rotational axis is parallel with a surface of the display generation component (e.g., 602 ).
- the second rotational axis is perpendicular to the first rotational axis.
- Rotating the rotatable user interface element (e.g., while concurrently rotating the second analog dial to reflect the changing time zone) about the second rotational axis, where the second rotational axis is parallel with a surface of the display generation component, when changing the time zone associated with the second analog dial provides visual feedback of the time zone being changed in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system (e.g., 600 ) rotates the second analog dial in the second direction (e.g., the counter-clockwise direction) about the second rotational axis.
- the rotational input is directed via a rotatable input device (e.g., 603 ) for which the rotational axis is parallel to the second rotational axis for rotation of the rotatable user interface element (e.g., 616 ).
- time zone options that can be selected from the rotatable user interface element include cities/countries/regions (e.g., shown with abbreviations) (e.g., as shown via time zone selection element 616 in FIGS. 6C-6E ).
- time zone options that can be selected from the rotatable user interface element include numerical offsets (e.g., both plus and minus) (e.g., the top two time zone options shown in time zone selection element 616 in FIGS. 6C-6E ), where the offsets indicate the time difference between a respective different time zone and the current time zone (and where the offset is zero if there is no difference between the time zones).
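- A sketch of the signed offset label described above (the helper name is illustrative; whole-hour time zone offsets are assumed, so the label is "+0" when the zones do not differ):

```swift
import Foundation

// Signed hour offset between a candidate zone and the current zone,
// rendered in the "+N"/"-N" style described above.
func offsetLabel(for zone: TimeZone, relativeTo current: TimeZone,
                 at date: Date = Date()) -> String {
    let hours = (zone.secondsFromGMT(for: date) -
                 current.secondsFromGMT(for: date)) / 3600
    return hours < 0 ? "\(hours)" : "+\(hours)"
}
```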
- the one or more input devices include a rotatable input device (e.g., 603 ) (e.g., a rotatable and depressible input device), and wherein changing the time zone associated with the second analog dial (e.g., 606 ) to a third time zone that is different from the first time zone includes changing the time zone associated with the second analog dial to the third time zone in response to detecting, via the rotatable input device, a rotational input (e.g., 618 or 620 ) (e.g., in a clockwise direction or a counter-clockwise direction).
- Changing the time zone associated with the second analog dial in response to detecting, via the rotatable input device, the rotational input provides an intuitive method for a user to navigate through available time zones and select a different time zone.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system adjusts, in the second analog dial, a visual indication of daytime (e.g., 606 B) (e.g., daytime hours; the time between sunrise and sunset) to indicate daytime at the third time zone (e.g., instead of at the second time zone), wherein adjusting the visual indication of daytime to indicate daytime at the third time zone includes transitioning from visually distinguishing (e.g., using a first color; a first shade) a first portion of the second analog dial that corresponds to daytime in the previously-associated time zone to visually distinguishing a respective portion of the second analog dial that corresponds to daytime in the third time zone.
- the visual indication of daytime includes the portion of the second analog dial corresponding to the daytime hours being shown (e.g., colored; brightened or dimmed) with a first visual characteristic while the remaining portion (e.g., 606 A) of the second analog dial that does not correspond to the daytime hours is not shown with the first visual characteristic.
- the portion (e.g., 606 B) of the second analog dial corresponding to the daytime hours is of a first size and the remaining portion (e.g., 606 A) of the second analog dial that does not correspond to the daytime hours is of a second size that is different from the first size.
- Adjusting the visual indication of daytime (e.g., daytime hours; the time between sunrise and sunset) to indicate daytime at the new time zone in the second analog dial when the time zone is changed provides information about the different daytime/nighttime hours at the new time zone in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the portion of the second analog dial corresponding to the daytime hours (e.g., 606 B) and the remaining portion of the second analog dial that does not correspond to the daytime hours (e.g., 606 A) can change (e.g., because different regions/locations within the same time zone can have different daytime hours).
- for a first location (e.g., a first city; a first region) in the time zone, the portion of the second analog dial corresponding to the daytime hours has the first size (e.g., the size of 606 B in FIGS. 6A-6B ) and the remaining portion of the second analog dial that does not correspond to the daytime hours has the second size (e.g., the size of 606 A in FIGS. 6A-6B ) different from the first size; for a second location in the time zone, the portion of the second analog dial corresponding to the daytime hours has a third size different from the first size and the remaining portion of the second analog dial that does not correspond to the daytime hours has a fourth size different from the second size.
- receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial (e.g., 606 ) includes receiving a selection of (e.g., via a (e.g., rotatable) user interface element (e.g., 616 ) displayed in the watch user interface (e.g., 604 A) that includes a plurality of selectable time zone options) a geographic location (e.g., a country; a geographic region) in the third time zone.
- In response to receiving the selection of the geographic location in the third time zone, in accordance with a determination that the geographic location corresponds to a first location in the third time zone (e.g., a first city within the third time zone), the computer system (e.g., 600 ) displays, in the second analog dial (e.g., 606 ), a visual indication (e.g., via a different visual characteristic; via a different shade; via a different color) of daytime (e.g., 606 B in FIG. 6B ) (e.g., daytime hours; the time between sunrise and sunset) at a first position within the second analog dial (which indicates daytime hours at the first location in the third time zone).
- In response to receiving the selection of the geographic location in the third time zone, in accordance with a determination that the geographic location corresponds to a second location in the third time zone (e.g., a second city within the third time zone), the computer system displays, in the second analog dial, the visual indication (e.g., via a different visual characteristic; via a different shade; via a different color) of daytime (e.g., 606 B in FIG. 6D ) (e.g., daytime hours; the time between sunrise and sunset) at a second position within the second analog dial (which indicates daytime hours at the second location in the third time zone).
- the visual indication of daytime at the first location is a different size/length and/or encompasses (e.g., covers) a different portion of the second analog dial than the visual indication of daytime at the second location (e.g., because the amount of daytime is different between the first location and the second location).
- Adjusting the visual indication of daytime to indicate daytime at the new time zone in the second analog dial when the time zone is changed provides information about the different daytime/nighttime hours at the new time zone in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- changing the time zone associated with the second analog dial (e.g., 606 ) to the third time zone includes changing a numerical indicator (e.g., 606 D) (e.g., in the second analog dial) corresponding to the current time indicated by the second time indicator (e.g., 608 D) from a first value (e.g., the hour number for a first hour) corresponding to the current time at the second time zone to a second value (e.g., the hour number for a second hour) corresponding to the current time at the third time zone.
- Changing the numerical indicator corresponding to the current time indicated by the second time indicator to the second value corresponding to the current time at the third time zone enables a user to quickly and easily identify the current time at the third time zone when the time zone is first changed.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In response to receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial (e.g., 606 ), the computer system displays, in the watch user interface (e.g., 604 A) (e.g., inside the second analog dial; in place of the first analog dial), a (e.g., rotatable) user interface element (e.g., 616 ) that includes a plurality of (e.g., a list of; a rotatable list of) selectable time zone options, wherein the plurality of selectable time zone options are arranged (e.g., ordered) based on an amount of time offset (e.g., plus/minus a certain number of hours) between the first time zone and respective time zone options of the plurality of selectable time zone options.
- Displaying the user interface element that includes a plurality of (e.g., list of; a rotatable list of) selectable time zone options, where the plurality of selectable time zone options are arranged (e.g., ordered) based on an amount of time offset enables a user to efficiently navigate (e.g., scroll) through the selectable time zone options as the time zone options are arranged in an intuitive manner.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
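- A minimal sketch of arranging the selectable options by their time offset from the first time zone, as described above (identifiers and helper names are illustrative):

```swift
import Foundation

// Sorts time zone identifiers by their offset from the local (first) zone.
func orderedTimeZoneIdentifiers(_ identifiers: [String],
                                relativeTo localZone: TimeZone,
                                at date: Date = Date()) -> [String] {
    let localOffset = localZone.secondsFromGMT(for: date)
    return identifiers.sorted { a, b in
        let oa = (TimeZone(identifier: a)?.secondsFromGMT(for: date) ?? 0) - localOffset
        let ob = (TimeZone(identifier: b)?.secondsFromGMT(for: date) ?? 0) - localOffset
        return oa < ob
    }
}
```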
- the plurality of selectable time zone options includes a first time zone option corresponding to a designated geographic location (e.g., a first city; a first country; a first geographic region (e.g., a saved time zone; a favorite time zone; a time zone that is selected and/or stored in a world clock application)), and wherein the displayed first time zone option includes a text indication (e.g., an abbreviation) of the designated geographic location, and a second time zone option that does not correspond to a designated geographic location (e.g., a time zone that is not saved, favorited, or otherwise stored or selected in a world clock application or a different application), wherein the displayed second time zone option includes a numerical indication (e.g., a plus or minus number) of a respective amount of time offset (e.g., plus/minus a certain number of hours) between the second time zone and a time zone corresponding to the second time zone option.
- the plurality of selectable time zone options include a third time zone option corresponding to a first geographic location (e.g., a first city; a first country; a first geographic region), wherein the first geographic location corresponds to a first time zone (e.g., a saved time zone; a favorited time zone; a time zone that is selected and/or stored in a world clock application), wherein the displayed third time zone option includes a text indication (e.g., an abbreviation) of the first geographic location, and a fourth time zone option corresponding to a second geographic location different from the first geographic location, wherein the second geographic location corresponds to the first time zone, and wherein the fourth time zone option includes a text indication (e.g., an abbreviation) of the second geographic location.
- In response to receiving the request (e.g., 610 , 618 , 620 ) to change the time zone associated with the second analog dial, the computer system displays, via the display generation component (e.g., 602 ), the watch user interface (e.g., 604 A), wherein displaying the watch user interface includes concurrently displaying a selectable user interface object (e.g., 607 ; a confirmation affordance; a “set” or “done” option) for confirming the change in time zone for the second analog dial (e.g., 606 ).
- the computer system detects, via the one or more input devices (e.g., a touch-sensitive surface integrated with the display generation component), activation (e.g., selection) (e.g., 622 ) of the selectable user interface object.
- In response to detecting the activation of the selectable user interface object, the computer system sets the second analog dial and the second time indicator (e.g., 608 D) to indicate the current time in the third time zone on the second analog dial (e.g., and ceases display of the selectable user interface object).
- method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 700 .
- a watch user interface as described with reference to FIGS. 6A-6H can include and be used to perform a counting operation as described with reference to FIGS. 8A-8M .
- method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 700 .
- a device can use as a watch user interface either a user interface that includes an indication of time and a graphical representation of a character (as described with reference to method 1100 ) or a watch user interface as described with reference to FIGS. 6A-6H .
- method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 700 .
- a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a watch user interface as described with reference to FIGS. 6A-6H .
- method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 700 .
- a background of a watch user interface as described with reference to FIGS. 6A-6H can be created or edited via the process for updating a background as described with reference to FIGS. 14A-14AD .
- method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 700 .
- the process for changing one or more complications of a watch user interface as described with reference to FIGS. 16A-16AE can be used to change one or more complications of a watch user interface as described with reference to FIGS. 6A-6H .
- these details are not repeated below.
- FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 9A-9B .
- FIG. 8A illustrates device 600 displaying watch user interface 800 , which includes analog clock face 804 , hour hand 802 A, minute hand 802 B, and seconds hand 802 C.
- Analog clock face 804 includes bezel 804 A (e.g., a ring representing a 12-hour period of time with respect to hour hand 802 A and a 60-minute period of time with respect to minute hand 802 B) and graphical indicator 806 .
- bezel 804 A includes graphical indicator 806 (e.g., graphical indicator 806 is fixed to a position of bezel 804 A).
- graphical indicator 806 is independent from at least some portion of bezel 804 A (e.g., graphical indicator 806 can be displayed independently from at least some portion of bezel 804 A or change position relative to at least some portion of bezel 804 A).
- minute hand 802 B has a length such that it at least partially overlaps (e.g., extends into) bezel 804 A.
- Bezel 804 A has visual indicators (e.g., tick marks, numerals) around bezel 804 A (e.g., at 12 evenly-spaced positions), including graphical indicator 806 .
- bezel 804 A and graphical indicator 806 are displayed at respective orientations relative to analog clock face 804 .
- the 12 o'clock (or zero minutes) position of bezel 804 A is aligned with the 12 o'clock position of analog clock face 804 (e.g., the position vertically upward from origin 801 ), and graphical indicator 806 is positioned at the 12 o'clock (or zero minutes) position with respect to bezel 804 A and the 12 o'clock position with respect to analog clock face 804 .
- device 600 receives (e.g., detects) input 808 .
- input 808 includes a gesture (e.g., a tap on display 602 ).
- input 808 includes a rotation of rotatable input mechanism 603 or a press of a button (e.g., a press of rotatable and depressible input mechanism 603 or hardware button 613 ).
- In some embodiments, input 808 can be anywhere on display 602 .
- In some embodiments, input 808 must correspond to selection of analog clock face 804 (e.g., a location on display 602 inside the outer boundary of bezel 804 A).
- In some embodiments, in response to an input on analog clock face 804 , device 600 performs a first function (e.g., rotates bezel 804 A and starts counter 810 as described below); and in response to an input that is not on analog clock face 804 , device 600 performs a different function (e.g., if the input is on one of complications 805 A- 805 D, device 600 launches an application corresponding to the selected complication) or no function at all.
- device 600 displays watch user interface 800 as shown in FIGS. 8B-8C .
- device 600 displays counter 810 and, compared to FIG. 8A , the length of minute hand 802 B is shortened (e.g., such that minute hand 802 B does not overlap bezel 804 A), bezel 804 A and graphical indicator 806 are rotated clockwise, and a visual characteristic (e.g., fill color, fill pattern, outline color, brightness, transparency) of hour hand 802 A and minute hand 802 B is changed.
- Counter 810 is an example of a graphical indication of time (e.g., the time that has elapsed since device 600 received input 808 ).
- bezel 804 A and graphical indicator 806 are displayed at positions (e.g., orientations) relative to analog clock face 804 such that graphical indicator 806 is aligned with minute hand 802 B (e.g., graphical indicator 806 snaps into alignment with minute hand 802 B in response to receiving input 808 ), and counter 810 is updated to show that one second has elapsed (e.g., since device 600 received input 808 , since graphical indicator 806 became aligned with minute hand 802 B).
- the length of minute hand 802 B is displayed (e.g., remains) such that minute hand 802 B does not overlap bezel 804 A.
- device 600 automatically aligns graphical indicator 806 with minute hand 802 B in response to receiving input 808 (e.g., a user does not have to provide input to adjust the position of graphical indicator 806 to align it with minute hand 802 B; inputs of different magnitude (e.g., amount of rotation of rotatable input mechanism 603 ; a duration or spatial length of input 808 (e.g., angular extent of a twist gesture)) result in alignment of graphical indicator 806 with minute hand 802 B).
- device 600 aligns graphical indicator 806 with minute hand 802 B (e.g., by rotating bezel 804 A) without further user input.
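- The snap can be sketched as follows (illustrative names, not an implementation from this disclosure): regardless of the input's magnitude, the bezel is rotated so that graphical indicator 806 lands at the minute hand's current angle:

```swift
// Minute-hand angle, clockwise from 12 o'clock: one full turn per 60 minutes,
// i.e., 6 degrees per minute. The bezel is rotated so its indicator sits here.
func bezelRotationToAlignWithMinuteHand(minute: Int, second: Int) -> Double {
    (Double(minute) + Double(second) / 60.0) * 6.0
}
```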
- device 600 generates a tactile output when graphical indicator 806 reaches minute hand 802 B (e.g., in conjunction with graphical indicator 806 snapping into alignment with minute hand 802 B).
- the transition from FIG. 8A to FIG. 8C is animated (e.g., device 600 displays an animation of bezel 804 A rotating until graphical indicator 806 is aligned with minute hand 802 B).
- device 600 displays bezel 804 A in the orientation shown in FIG. 8C , with graphical indicator 806 aligned with minute hand 802 B, in response to receiving input 808 without an animation or without display of the intermediate state illustrated by FIG. 8B .
- As time passes (e.g., without further input), bezel 804 A and graphical indicator 806 remain stationary relative to analog clock face 804 while the hands of clock face 804 progress to indicate the current time and counter 810 continues to update according to the elapsed time.
- device 600 begins counter 810 in response to receiving input 808 .
- In some embodiments, in response to receiving input 808 , device 600 does not start counter 810 (e.g., device 600 aligns graphical indicator 806 with minute hand 802 B and displays counter 810 , but does not start counter 810 (e.g., counter 810 maintains a time of zero) until further input is received).
- device 600 receives (e.g., detects) input 812 .
- input 812 includes a rotation of rotatable input mechanism 603 in a first direction (e.g., clockwise).
- input 812 includes a gesture (e.g., a touch gesture on display 602 ).
- device 600 rotates bezel 804 A relative to clock face 804 and changes the time displayed by counter 810 in accordance with input 812 , as shown in FIG. 8D .
- the direction in which bezel 804 A is rotated is based on the direction of input 812 .
- the amount of rotation of bezel 804 A is based on (e.g., proportional to, directly proportional to) an amount, speed, and/or direction of rotation of input 812 .
- the time displayed by counter 810 is changed based on the change in position of bezel 804 A to correspond to the position of bezel 804 A relative to minute hand 802 B.
- bezel 804 A is rotated counter-clockwise by an amount equivalent to five minutes (where one full rotation of bezel 804 A is equivalent to 60 minutes) and the display of counter 810 is changed to show 5:00.
- bezel 804 A is rotated, and counter 810 is updated accordingly, as input is received (e.g., bezel 804 A and counter 810 are updated continually as rotatable input mechanism 603 is rotated).
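- As a hedged sketch of the relationship between the bezel and the counter (names are illustrative): one full rotation of bezel 804 A is equivalent to 60 minutes, so the counter's minutes correspond to the clockwise angular distance from graphical indicator 806 to minute hand 802 B:

```swift
// Clockwise angular distance from the indicator to the minute hand, in
// minutes (6 degrees of the dial per minute).
func counterMinutes(indicatorDegrees: Double,
                    minuteHandDegrees: Double) -> Double {
    var delta = (minuteHandDegrees - indicatorDegrees)
        .truncatingRemainder(dividingBy: 360.0)
    if delta < 0 { delta += 360.0 }
    return delta / 6.0
}

// Example: a 30-degree counter-clockwise bezel offset from the minute hand
// yields counterMinutes(...) == 5.0, i.e., the 5:00 reading described above.
```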
- device 600 receives (e.g., detects) input 814 corresponding to a rotation of rotatable input mechanism 603 in a direction opposite of the direction of input 812 .
- device 600 moves bezel 804 A such that graphical indicator 806 is in alignment with minute hand 802 B and updates counter 810 accordingly.
- device 600 displays watch user interface 800 as shown in FIG. 8E .
- device 600 displays counter 810 and, similar to as in FIGS. 8B-8D , the length of minute hand 802 B is shortened, bezel 804 A and graphical indicator 806 are rotated clockwise such that, relative to analog clock face 804 , graphical indicator 806 is aligned with minute hand 802 B (e.g., graphical indicator 806 snaps into alignment with minute hand 802 B in response to receiving input 808 ), and a visual characteristic (e.g., fill color, fill pattern, outline color, brightness, transparency) of hour hand 802 A and minute hand 802 B is changed.
- counter 810 does not start in response to receiving input 808 .
- In FIG. 8E , while displaying watch user interface 800 including counter 810 that has not started (e.g., counter 810 maintains a time of zero) and graphical indicator 806 aligned with minute hand 802 B, device 600 receives (e.g., detects) input 816 .
- input 816 includes a gesture (e.g., a touch gesture on display 602 ).
- input 816 includes a press input directed to rotatable input mechanism 603 .
- In response to receiving input 816 , device 600 starts counter 810 .
- In some embodiments, after aligning graphical indicator 806 with minute hand 802 B (e.g., by rotating bezel 804 A) and displaying counter 810 in response to receiving input 808 , if device 600 does not receive further input (e.g., a confirmation input, a tap, a button press) within a threshold amount of time (e.g., a non-zero amount of time, 1 second, 2 seconds, 3 seconds, 5 seconds), device 600 displays (e.g., reverts to) watch user interface 800 as displayed in FIG. 8A (e.g., bezel 804 A and graphical indicator 806 are displayed in the orientation relative to clock face 804 shown in FIG. 8A and counter 810 is not displayed (e.g., device 600 ceases display of counter 810 )).
- watch user interface 800 is displayed at a later time, where 20 minutes and 20 seconds have elapsed, as indicated by counter 810 .
- FIG. 8G illustrates that as minute hand 802 B moves according to the passage of time, device 600 maintains the orientation of bezel 804 A and displays tick marks at the minute positions on bezel 804 A (e.g., between the existing 5-minute interval marks) clockwise from graphical indicator 806 to minute hand 802 B.
- FIG. 8H shows watch user interface 800 at a later time, where 56 minutes and 35 seconds have elapsed, as indicated by counter 810 . At this time, minute hand 802 B has not made a full rotation around clock face 804 relative to the position of graphical indicator 806 .
- At a later time, minute hand 802 B has made more than a full rotation around clock face 804 and passed graphical indicator 806 .
- device 600 removes tick marks from the minute positions on bezel 804 A from graphical indicator 806 to minute hand 802 B. Removing the tick marks after minute hand 802 B has passed graphical indicator 806 indicates to the user that minute hand 802 B has made a full rotation.
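- A sketch of which minute positions receive the per-minute tick marks (illustrative names; this disclosure does not give an implementation): ticks run clockwise from the graphical indicator to the minute hand and are removed once the hand has lapped the indicator:

```swift
// Returns the minute positions (0...59) to mark with ticks, or none after the
// minute hand has made a full rotation past the indicator.
func tickedMinutePositions(indicatorMinute: Int,
                           minuteHandMinute: Int,
                           handHasLappedIndicator: Bool) -> [Int] {
    guard !handHasLappedIndicator else { return [] }
    let span = ((minuteHandMinute - indicatorMinute) % 60 + 60) % 60
    return (0...span).map { (indicatorMinute + $0) % 60 }
}
```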
- device 600 receives (e.g., detects) input 820 .
- input 820 includes a rotation of rotatable input mechanism 603 .
- input 820 includes a gesture (e.g., a touch gesture on display 602 ).
- device 600 rotates bezel 804 A clockwise, until graphical indicator 806 is almost aligned with minute hand 802 B, and updates counter 810 accordingly, as shown in FIG. 8J .
- device 600 maintains display of the tick marks at the minute positions on bezel 804 A between the 5-minute interval marks.
- the time on counter 810 is adjusted by an amount of time that is based on the magnitude, speed, and/or direction of input 820 (e.g., the amount of rotation of rotatable input mechanism 603 ) and the corresponding amount of rotation of bezel 804 A (e.g., device 600 does not reset counter 810 to zero in response to input 820 ).
- device 600 removes tick marks from the minute positions on bezel 804 A in the counter-clockwise direction from graphical indicator 806 to minute hand 802 B.
- device 600 receives (e.g., detects) input 824 .
- input 824 includes a tap gesture on a location of display 602 corresponding to counter 810 .
- input 824 includes a rotation of rotatable input mechanism 603 or a press of a button (e.g., a press of rotatable and depressible input mechanism 603 or hardware button 613 ).
- In some embodiments, input 824 can be anywhere on display 602 .
- In some embodiments, input 824 must correspond to selection of analog clock face 804 (e.g., a location on display 602 inside the outer boundary of bezel 804 A).
- In some embodiments, in response to an input on analog clock face 804 , device 600 performs a first function (e.g., displays watch user interface 826 in FIG. 8K as described below); and in response to an input that is not on analog clock face 804 , device 600 performs a different function (e.g., if the input is on one of complications 805 A- 805 D, device 600 launches an application corresponding to the selected complication) or no function at all.
- Watch user interface 826 includes graphical indication of time 810 A (e.g., an enlarged version of counter 810 ), continue affordance 826 A, and stop affordance 826 B.
- graphical indication of time 810 A shows a static indication of the elapsed time on counter 810 when input 824 was received.
- graphical indication of time 810 A updates to show the currently elapsed time (e.g., graphical indication of time 810 A continues to progress from the time on counter 810 when input 824 was received).
- In some embodiments, device 600 pauses counter 810 in response to receiving input 824 .
- In some embodiments, device 600 continues counter 810 in response to receiving input 824 .
- In some embodiments, in response to receiving input 824 , device 600 ceases display of clock face 804 and/or complications 805 A- 805 D.
- In some embodiments, device 600 displays graphical indication of time 810 A, continue affordance 826 A, and stop affordance 826 B overlaid on watch user interface 800 .
- In some embodiments, in response to receiving input 824 , device 600 at least partially obscures (e.g., blurs or greys out) watch user interface 800 .
- In some embodiments, in response to receiving input 824 , device 600 resets the user interface (e.g., displays watch user interface 800 as shown in FIG. 8A indicating the current time, or resets counter 810 to zero and aligns graphical indicator 806 with the current position of minute hand 802 B).
- In some embodiments, if input 824 is a first type of input (e.g., a single tap on counter 810 ), then device 600 displays watch user interface 826 as shown in FIG. 8K ; and if input 824 is a second type of input (e.g., a double tap on counter 810 ), then device 600 resets the user interface.
- FIG. 8K shows input 828 corresponding to selection of continue affordance 826 A (e.g., a tap at a location on display 602 corresponding to continue affordance 826 A) and input 830 corresponding to selection of stop affordance 826 B (e.g., a tap at a location on display 602 corresponding to stop affordance 826 B).
- In response to receiving input 828 , device 600 returns to the watch user interface that was displayed at the time of receiving input 824 and continues to update counter 810 (e.g., device 600 ceases to display continue affordance 826 A, stop affordance 826 B, and graphical indication of time 810 A (e.g., reduces the enlarged version of counter 810 to its previous size)).
- In response to receiving input 830 , device 600 returns to watch user interface 800 (e.g., device 600 ceases to display continue affordance 826 A, stop affordance 826 B, and graphical indication of time 810 A), in which bezel 804 A and graphical indicator 806 are aligned with the 12 o'clock position of clock face 804 , counter 810 is not displayed, no tick marks are displayed between the 5-minute intervals of bezel 804 A, and hour hand 802 A and minute hand 802 B are displayed with the visual characteristics shown in FIG. 8A (e.g., instead of the visual characteristics shown in FIGS. 8B-8J ).
- FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments.
- Method 900 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone).
- Some operations in method 900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 900 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system displays ( 902 ), via the display generation component (e.g., 602 ), a watch user interface (e.g., 800 ) (e.g., showing a clock with a hour hand and a minute hand), the watch user interface including an analog clock face (e.g., 804 ) that includes a first clock hand (e.g., 802 B) (e.g., the minute hand of the clock) and a graphical indicator (e.g., 806 ) (e.g., a marker (e.g., a triangular marker)), wherein the graphical indicator is displayed at a first position relative to the analog clock face (e.g., along/within a dial region surrounding the clock).
- In some embodiments, the graphical indicator is initially not aligned with the first clock hand along the boundary of the analog clock face.
- In some embodiments, the graphical indicator is initially displayed at the top-center position along the boundary.
- the computer system detects ( 906 ), via the one or more input devices (e.g., via a first input device (e.g., 602 or 603 ) (e.g., a touch-sensitive surface; a touch-sensitive display; a rotatable input device; a rotatable and depressible input device; a mechanical input device)), a first user input (e.g., 808 ).
- the first user input is an input of a first type (e.g., a rotational input on the first input device; a scrolling input on the first input device or a tap input on a touch-sensitive surface such as a touchscreen display).
- the computer system moves ( 912 ) the graphical indicator (e.g., 806 ) to a second position relative to the analog clock face (e.g., 804 ) such that the graphical indicator is aligned with the first clock hand (e.g., 802 B) (e.g., such that the graphical indicator is pointing to or marking the position of the first clock hand; such that the graphical indicator is at the outer end of the first clock hand).
- Moving the graphical indicator to the second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand in response to detecting the first user input provides visual feedback of the initiation of a feature (e.g., initiation of a time counter) and a starting point of the initiated feature (e.g., the starting time for the counter) in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system displays ( 920 ) a graphical indication of a time (e.g., 810 ) (e.g., a time counter; a digital counter) that has elapsed from a time when the first user input (e.g., 808 ) (e.g., the input moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand) was detected to a current time.
- the graphical indication of the time that has elapsed is displayed within the analog clock face in the watch user interface (e.g., 800 ). Displaying the graphical indication of a time that has elapsed from the time when the first user input while the graphical indicator is displayed at the second position relative to the analog clock face enables a user to quickly and easily recognize that the time has been initiated and the time that has elapsed. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
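- A minimal sketch of the elapsed-time counter display (illustrative names), assuming the start timestamp is captured when the first user input is detected:

```swift
import Foundation

// Formats elapsed time as the mm:ss counter text described above.
func counterText(since start: Date, now: Date = Date()) -> String {
    let elapsed = max(0, Int(now.timeIntervalSince(start)))
    return String(format: "%02d:%02d", elapsed / 60, elapsed % 60)
}

// Twenty minutes and twenty seconds after the input, this returns "20:20".
```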
- Initiating a time counter (e.g., displayed via the graphical indication of a time) in response to the first user input enables a user to initiate the time counter in a quick and efficient manner.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, in response to detecting the first user input (e.g., 808 ), the computer system displays or causes display of the graphical indicator (e.g., 806 ) at a second position relative to the analog clock face (e.g., 804 ) (e.g., from the position of 806 in FIG. 8A to the position of 806 in FIG. 8C ) and displays the graphical indication of the time (e.g., 810 ), where the graphical indication of the time is shown at an initial state (e.g., “00:00”) without yet indicating an elapsed time.
- the computer system detects, via the one or more input devices (e.g., via a second input device, such as a touch-sensitive surface that is integrated with the display generation component (e.g., 602 )), a second user input (e.g., corresponding to an activation/selection of the graphical indication of the time).
- the second user input is an input of a second type (e.g., a touch input on a touch-sensitive surface that is integrated with the display generation component) that is different from the first type.
- the computer system displays or causes display of, in the graphical indication of the time, the time that has elapsed from the time when the first user input was detected to the current time.
- the computer system shifts (e.g., rotates) ( 914 ) an analog dial (e.g., 804 A) (e.g., including indications of time positions (e.g., 00:00/12:00 position, 3:00/15:00 position, 6:00/18:00 position, 9:00/21:00 position; 0 minute position, 15 minute position, 30 minute position, 45 minute position)) of the analog clock face (e.g., 804 ) in accordance with the movement of the graphical indicator (e.g., 806 ) (e.g., a marker (e.g., a triangular marker)) such that a scale of the analog dial is aligned to begin at (e.g., the 00:00/12:00 position/0 minute position of the analog dial is aligned to) the second position relative to the analog clock face.
- Shifting (e.g., rotating) the analog dial in accordance with the movement of the graphical indicator such that a scale of the analog dial is aligned to begin at the second position relative to the analog clock face provides visual feedback of the starting position of the time counter in an intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
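The dial-shifting behavior amounts to a small piece of angle arithmetic: rotating the dial by the minute hand's current angle places the scale's zero mark at the second position. The Swift sketch below is a hypothetical illustration; the degree conventions and names are assumptions, not taken from the source.

```swift
// Angles in degrees, measured clockwise from the 12 o'clock position.
func minuteHandAngle(minute: Int, second: Int) -> Double {
    // The minute hand advances 6 degrees per minute and 0.1 degrees per second.
    Double(minute) * 6.0 + Double(second) * 0.1
}

// Rotating the dial by the hand's angle aligns the dial's 0-minute mark with
// the second position, where the graphical indicator now sits.
func dialRotation(alignedToMinute minute: Int, second: Int) -> Double {
    minuteHandAngle(minute: minute, second: second)
}

print(dialRotation(alignedToMinute: 9, second: 0))  // 54.0 degrees at :09:00
```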
- the first user input includes a rotational input detected via the one or more input devices (e.g., a first input device (e.g., 603 ) (e.g., a rotatable input device; a rotatable and depressible input device)) ( 908 ).
- moving the graphical indicator (e.g., 806 ) in response to detecting the first user input includes snapping the graphical indicator to the second position relative to the analog clock face (e.g., 804 ) such that the graphical indicator is aligned with the first clock hand (e.g., 802 B).
- in response to detecting the first user input (e.g., 808 ) ( 910 ), in conjunction with moving the graphical indicator (e.g., 806 ) (e.g., a marker (e.g., a triangular marker)) to the second position relative to the analog clock face (e.g., 804 ) (e.g., when the graphical indicator is moved from the first position to the second position), the computer system (e.g., 600 ) generates ( 916 ) (e.g., via one or more tactile output generators that are in communication with the computer system) a tactile output (e.g., a tactile output sequence that corresponds to moving the graphical indicator to the second position).
- Generating the tactile output in conjunction with moving the graphical indicator (e.g., a marker (e.g., a triangular marker)) to the second position provides additional (e.g., non-visual) feedback that the graphical indicator has been moved into alignment with the first clock hand.
- Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system displays ( 924 ) a movement of the first clock hand (e.g., 802 B) (e.g., rotating within the analog clock face) to indicate the current time (e.g., the “minute” of the current time).
- in accordance with the first clock hand being aligned with (e.g., to point to; to be in line with) the second position of the graphical indicator (e.g., 806 ) (e.g., a marker (e.g., a triangular marker)) within the analog clock face, the computer system generates ( 926 ) (e.g., via one or more tactile output generators that are in communication with the computer system) a tactile output (e.g., a tactile output sequence that corresponds to the first clock hand being aligned with the second position of the graphical indicator).
- the computer system does not move the graphical indicator (e.g., the graphical indicator remains at (e.g., stays fixed to) the second position relative to the analog clock face) while the computer system moves the first clock hand relative to the analog clock face to indicate the current time.
- the computer system detects ( 928 ), via the one or more input devices (e.g., the first input device (e.g., 603 ) (e.g., a rotatable input device; a rotatable and depressible input device)), a second user input (e.g., 812 or 814 ) (e.g., a rotational input on the first input device; a continuation of the first user input (e.g., additional or continued rotation of the rotatable input mechanism)).
- the computer system in response to detecting the second user input ( 930 ), adjusts (e.g., increasing or decreasing) ( 932 ) the graphical indication of the time in accordance with (e.g., based on an amount of, speed of, and/or direction of) the second user input.
- in accordance with the second user input being in a first (e.g., clockwise) direction on the first input device, adjusting the graphical indication of the time includes increasing the displayed time based on the amount and/or speed of the input.
- in accordance with the second user input being in a second (e.g., counter-clockwise) direction, adjusting the graphical indication of the time includes decreasing the displayed time based on the amount and/or speed of the input. Adjusting (e.g., increasing or decreasing) the graphical indication of the time in accordance with (e.g., based on an amount of, speed of, and/or direction of) the second user input while the time counter is running enables a user to adjust the running time counter in a convenient and efficient manner.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
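One plausible model of this adjustment, sketched below in Swift, accumulates a signed offset from the direction and amount of rotation; the one-minute-per-detent mapping is an assumption, as the description only ties the adjustment to the amount, speed, and direction of the input.

```swift
// Hypothetical sketch: adjust a running counter with a rotational input.
struct RunningCounterAdjuster {
    var offsetMinutes = 0

    mutating func apply(rotationDetents: Int, clockwise: Bool) {
        // Clockwise rotation increases the displayed time; counter-clockwise
        // rotation decreases it (one minute per detent in this sketch).
        offsetMinutes += clockwise ? rotationDetents : -rotationDetents
    }
}

var adjuster = RunningCounterAdjuster()
adjuster.apply(rotationDetents: 5, clockwise: true)   // +5 minutes
adjuster.apply(rotationDetents: 2, clockwise: false)  // -2 minutes
print(adjuster.offsetMinutes)  // 3
```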
- the computer system detects a third user input (e.g., 812 or 814 ) (e.g., that is a continuation of the first user input (e.g., in the same rotational direction); that is an input in a different (e.g., rotational) direction from the first user input).
- the computer system in response to detecting the third user input, moves (e.g., slides; rotates) the graphical indicator (e.g., a marker (e.g., a triangular marker)) from the second position relative to the analog clock face (e.g., 804 ) to a third position relative to the analog clock face different from the second position.
- the computer system adjusts the time displayed in the graphical indication of the time (e.g., 810 ) to include an offset from the elapsed time from when the first user input was detected to the current time, wherein the offset corresponds to a difference (e.g., in minutes) between the second position and the third position relative to the analog clock face.
- Adjusting the time displayed in the graphical indication of the time to include the offset from the elapsed time from when the first user input was detected to the current time enables a user to quickly and easily adjust the time displayed in the graphical indication of the time if an adjustment is needed without needing to re-initiate the time displayed in the graphical indication of the time. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- when the offset corresponds to the addition of a first amount of time, the time displayed in the graphical indication of the time includes the elapsed time from when the first user input (e.g., 808 ) was detected to the current time adjusted by the addition of the first amount of time.
- when the offset corresponds to the subtraction of a second amount of time, the time displayed in the graphical indication of the time includes the elapsed time from when the first user input was detected to the current time adjusted by the subtraction of the second amount of time (e.g., which can result in a negative displayed time).
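The offset arithmetic can be illustrated as follows. In this hypothetical sketch, dial positions are expressed in minutes, and the sign convention (positive when the third position is ahead of the second, clockwise) follows the directional description above.

```swift
// Offset implied by moving the indicator from the second to the third position.
func offsetMinutes(secondPosition: Int, thirdPosition: Int) -> Int {
    thirdPosition - secondPosition
}

// Displayed time is the true elapsed time adjusted by the offset; a
// sufficiently negative offset can yield a negative displayed time.
func displayedMinutes(elapsed: Int, offset: Int) -> Int {
    elapsed + offset
}

let offset = offsetMinutes(secondPosition: 9, thirdPosition: 12)
print(displayedMinutes(elapsed: 10, offset: offset))  // 13
```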
- in response to detecting the third user input, in accordance with a determination that the third user input corresponds to an input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a first direction (e.g., a clockwise direction), moving the graphical indicator (e.g., a marker (e.g., a triangular marker)) from the second position to the third position includes moving (e.g., sliding; rotating) the graphical indicator (e.g., 806 ) along (e.g., a dial region of) the analog clock face (e.g., 804 ) in a clockwise direction towards the third position (e.g., where, based on a clockwise direction, the third position is ahead of the second position within the analog clock face) as the third user input (e.g., 812 ) is detected.
- in accordance with a determination that the third user input corresponds to an input in a second direction (e.g., a counter-clockwise direction), moving the graphical indicator from the second position to the third position includes moving (e.g., sliding; rotating) the graphical indicator along (e.g., a dial region of) the analog clock face in a counter-clockwise direction towards the third position (e.g., where, based on a clockwise direction, the third position is behind the second position within the analog clock face) as the third user input (e.g., 814 ) is detected.
- the input (e.g., 812 ) in the first direction corresponds to a rotational input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a first rotational direction (e.g., clockwise direction).
- the input (e.g., 814 ) in the second direction corresponds to a rotational input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a second rotational direction opposite the first rotational direction (e.g., counter-clockwise direction).
- while displaying the graphical indication of the time (e.g., 810 ) (e.g., a time counter; a digital counter) that has elapsed from the time when the first user input (e.g., 808 ) was detected to the current time, the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., a touch-sensitive surface), selection (e.g., 824 ) of (e.g., a touch input on) the graphical indication of the time.
- the computer system in response to detecting the selection of the graphical indication of the time, displays, via the display generation component (e.g., 602 ), a prompt (e.g., 826 ; an alert; a notification) that includes a first option (e.g., 826 A; a first selectable user interface object; a first affordance) that, when selected, causes the computer system to continue counting, via the graphical indication of the time, the time that has elapsed from a time when the first user input was detected to a current time, and a second option (e.g., 826 B; a second selectable user interface object; a second affordance) that, when selected, causes the computer system to cease (e.g., stop) counting, via the graphical indication of the time, the time that has elapsed from a time when the first user input was detected to a current time.
- ceasing counting the time includes ceasing displaying the graphical indication of the time. In some embodiments, ceasing counting the time includes maintaining display of the graphical indication of the time and resetting (e.g., to “00:00”) the time counted via the graphical indication of the time. Displaying the prompt that includes the first option and the second option in response to detecting the selection of the graphical indication of the time enables a user to cause the computer system to continue or cease the counting in an easy and intuitive manner.
- Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
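The two-option prompt reduces to a small branch, as in this hypothetical Swift sketch (names are assumed; the reset-versus-dismiss detail follows the embodiments described above):

```swift
// Hypothetical sketch of handling the prompt's two options.
enum CounterPromptChoice { case continueCounting, ceaseCounting }

func handle(_ choice: CounterPromptChoice, counterRunning: inout Bool) {
    switch choice {
    case .continueCounting:
        break                   // keep counting from the original start time
    case .ceaseCounting:
        counterRunning = false  // stop; some embodiments also reset to "00:00"
    }
}

var running = true
handle(.ceaseCounting, counterRunning: &running)
print(running)  // false
```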
- the computer system in response to detecting the first user input (e.g., 808 ), changes (e.g., modifies) a visual characteristic of (e.g., dims; changes color of (e.g., to be the same color as the graphical indicator and/or as the graphical indication of the time)) the first clock hand (e.g., 802 B) to include a first visual characteristic (e.g., a dimmed color or visual state; the color of the graphical indicator and/or the graphical indication of the time).
- the analog clock face (e.g., 804 ) includes a second clock hand (e.g., 802 A) (e.g., the hour hand of the clock).
- the computer system in response to detecting the first user input, changes (e.g., modifies) the visual characteristic of the second clock hand to include the first visual characteristic.
- Changing the visual characteristic of the first clock hand to include the first visual characteristic in response to detecting the first user input provides visual feedback that an operation (e.g., the counting) has been enabled, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system (e.g., 600 ) detects (e.g., via a touch-sensitive surface of the one or more input devices) an input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to a rotatable input device (e.g., 603 ) of the one or more input devices.
- the computer system in response to detecting the input directed to the rotatable input device, changes (e.g., modifies) the visual characteristic of (e.g., dims; changes the color of (e.g., to be the same color as the graphical indicator and/or as the graphical indication of the time)) the first clock hand (e.g., 802 B) to include the first visual characteristic (e.g., a dimmed color or visual state; the color of the graphical indicator and/or the graphical indication of the time).
- the computer system in response to detecting the first user input (e.g., 808 ), changes (e.g., modifies) a shape of (e.g., changes a feature of; changes the size of; makes smaller; shrinks) the first clock hand (e.g., 802 B) to be a first shape (e.g., a smaller, shrunk clock hand).
- the analog clock face includes a second clock hand (e.g., 802 A) (e.g., the hour hand of the clock).
- the computer system in response to detecting the first user input, changes (e.g., modifies) a shape of (e.g., changes a feature of; changes the size of; makes smaller; shrinks) the second clock hand to be a second shape (e.g., a smaller, shrunk clock hand).
- Changing the shape of the first clock hand to be the first shape in response to detecting the first user input provides visual feedback that an operation (e.g., the counting) has been enabled, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system displays (e.g., continues to display), in the analog clock face (e.g., 804 ), a movement of the first clock hand (e.g., 802 B) to indicate the current time (e.g., the “minute” of the current time).
- the computer system displays, in the analog clock face (e.g., 804 ) (e.g., in a dial region of the analog clock face), visual indicators (e.g., visual markers (e.g., tick marks), as shown in FIGS. 8G-8H ) along a path of movement of (e.g., the tip of) the first clock hand as the first clock hand is moving (e.g., rotating) around the analog clock face (e.g., the visual indicators appear along the path of movement of the first clock hand as the first clock hand is moving circularly within the analog clock face).
- Displaying the visual indicators along the path of movement of (e.g., the tip of) the first clock hand as the first clock hand is moving (e.g., rotating) around the analog clock face provides visual feedback that the counting is on-going, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
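A hypothetical sketch of these trailing markers: as the minute hand sweeps from the start position, a tick mark is shown at each whole-minute position it has already traversed, wrapping past the top of the dial.

```swift
// Minute positions traversed on a 60-minute dial, wrapping past the hour.
func traversedTickMarks(startMinute: Int, currentMinute: Int) -> [Int] {
    let span = (currentMinute - startMinute + 60) % 60
    return (0...span).map { (startMinute + $0) % 60 }
}

print(traversedTickMarks(startMinute: 58, currentMinute: 2))  // [58, 59, 0, 1, 2]
```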
- the computer system removes display of the visual indicators along the path of movement of (e.g., the tip of) the first clock hand (e.g., 802 B) as the first clock hand is moving (e.g., rotating) around the analog clock face (e.g., 804 ) (e.g., as shown in FIG. 8I ).
- the computer system in response to detecting the first user input (e.g., 808 ), moves the graphical indicator (e.g., 806 ) to the second position relative to the analog clock face (e.g., 804 ) such that the graphical indicator is aligned with the first clock hand (e.g., 802 B) (e.g., such that the graphical indicator is pointing to or marking the position of the first clock hand; such that the graphical indicator is at the outer end of the first clock hand) and displays the graphical indication of the time (e.g., 810 ) (e.g., a time counter; a digital counter) but does not automatically initiate a counting of the time using the graphical indication of the time.
- while displaying the graphical indication of the time, the computer system detects (e.g., via a touch-sensitive surface of the one or more input devices) an input (e.g., 816 ; a user's tap input) directed to confirming the initiation of the counting of the time (e.g., user selection of a confirm affordance (e.g., “set” affordance or “done” affordance)).
- the computer system moves the graphical indicator back to its previous position (the first position) relative to the analog clock face.
- method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 900 .
- a watch user interface as described with reference to FIGS. 6A-6H can include and be used to perform a counting operation as described with reference to FIGS. 8A-8M .
- method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 900 .
- a device can use as a watch user interface either a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC or a watch user interface as described with reference to FIGS. 8A-8M .
- method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 900 .
- a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a watch user interface as described with reference to FIGS. 8A-8M .
- method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 900 .
- a background of a watch user interface as described with reference to FIGS. 8A-8M can be created or edited via the process for updating a background as described with reference to FIGS. 14A-14AD .
- method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 900 .
- the process for changing one or more complications of a watch user interface as described with reference to FIGS. 16A-16AE can be used to change one or more complications of a watch user interface as described with reference to FIGS. 8A-8M .
- For brevity, these details are not repeated below.
- FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 11A-11H .
- FIG. 10A illustrates device 600 displaying user interface 1001 that concurrently includes indication of time 1002 and graphical representation 1000 of a first character displayed on background 1004 .
- representation 1000 of the first character corresponds to a graphical representation of a user associated with device 600 (e.g., a representation created or customized by a user).
- device 600 is in a first activity state (e.g., a locked state; a sleep state, a low-power state) in which display 602 is dimmed (e.g., at a lower brightness) compared to a “normal” operating state.
- device 600 displays fewer graphical elements than in the normal operating state (e.g., complication 1005 A and complication 1005 B shown in, e.g., FIG. 10B are not displayed in the first state).
- FIG. 10B illustrates device 600 in a second activity state (e.g., the normal operating state, an active state, a different activity state from the first activity state depicted in FIG. 10A ) in which display 602 is not dimmed.
- user interface 1001 concurrently displays indication of time 1002 and graphical representation 1000 of the first character on background 1004 (e.g., similar to FIG. 10A ), as well as complications 1005 A and 1005 B that provide date and weather information, respectively.
- device 600 displays graphical representation 1000 of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state.
- the second visual state shows the first character with eyes open (e.g., a neutral pose).
- device 600 changes from the user interface in FIG. 10A to the user interface in FIG. 10B (or vice versa) in response to detecting a change in the activity state of device 600 (e.g., in response to detecting a change from the first activity state to the second activity state (or vice versa), respectively).
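A minimal sketch of this state-to-state mapping is shown below. The enum cases and the particular eyes-closed versus neutral poses are illustrative assumptions; the description only requires that distinct activity states map to distinct visual states of the character.

```swift
// Hypothetical mapping from device activity state to character visual state.
enum ActivityState {
    case dimmed  // first activity state (e.g., locked, sleep, low-power)
    case active  // second activity state (normal operating state)
}

enum CharacterVisualState {
    case eyesClosed  // e.g., appears asleep while the display is dimmed
    case neutral     // e.g., eyes open in the normal operating state
}

func visualState(for activity: ActivityState) -> CharacterVisualState {
    switch activity {
    case .dimmed: return .eyesClosed
    case .active: return .neutral
    }
}
```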
- FIGS. 10C-10D illustrate device 600 in the second activity state (e.g., the normal or active activity state) and displaying the first character in a visual state that includes an animation in which representation 1000 of the first character alternates between a first position (e.g., head tilted to the left as depicted in FIG. 10C ) and a second position (e.g., head tilted to the right as depicted in FIG. 10D ).
- representation 1000 alternates between the first position and the second position (e.g., at a periodic rate) to indicate the passing of time (e.g., from the first position to the second position every one second or 0.5 seconds, from the first position to the second position and back to the first position every two seconds or 1 second).
- the animation is based on the character (e.g., different animations are displayed for different characters).
- device 600 displays a gradual transition from a first animation of representation 1000 of the first character to a second (e.g., different) animation (e.g., device 600 interpolates (e.g., based on a last state of the first animation and a first state of the second animation) between the two animations to provide a smooth transition).
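The alternation can be modeled as a pure function of the current time, as in the hypothetical Swift sketch below; the one-second period is one of the example rates mentioned above.

```swift
import Foundation

enum HeadTilt { case left, right }

// Alternate between two head-tilt positions once per period to indicate the
// passing of time (e.g., left on even seconds, right on odd seconds).
func tilt(at now: Date, period: TimeInterval = 1.0) -> HeadTilt {
    Int(now.timeIntervalSince1970 / period) % 2 == 0 ? .left : .right
}
```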
- device 600 receives (e.g., detects) input 1006 (e.g., a tap at a location on display 602 that corresponds to representation 1000 , a wrist raise).
- device 600 displays representation 1000 with the first character in a different visual state (e.g., device 600 changes the visual state of the first character), as illustrated by FIG. 10E .
- device 600 changes the display of visual representation 1000 to change the visual state of the first character in response to input 1006 .
- the first character is shown winking with an open mouth (e.g., a selfie pose), whereas in FIG. 10D the first character had both eyes open and mouth closed.
- device 600 changes the display of visual representation 1000 to change the visual state of the first character without user input (e.g., device 600 changes the visual state in response to time-based criteria being met, device 600 automatically cycles through a set of predetermined visual states (e.g., device 600 displays representation 1000 with a visual state for a predetermined amount of time before changing to another visual state)).
- representation 1000 is displayed in a manner that indicates a change in time.
- indication of time 1002 shows that the time has changed to 10:10 from 10:09 in FIG. 10E .
- the first character looks or glances at indication of time 1002 (e.g., the head and/or eyes of representation 1000 move to appear as though the first character is looking at indication of time 1002 ).
- representation 1000 indicates a change in time in response to a change in the minute of the current time.
- representation 1000 indicates a change in time only in response to a change in the hour of the current time (e.g., from 10:59 to 11:00). In some embodiments, representation 1000 indicates a change in time (e.g., appears to look at indication of time 1002 ) when a predetermined time has been reached (e.g., the hour has changed, a quarter past the hour has been reached, half past the hour has been reached, 45 minutes past the hour has been reached).
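A sketch of such a trigger, assuming the example predetermined times listed above (the hour, and 15, 30, and 45 minutes past it):

```swift
// Hypothetical predicate: the character glances at the time indication when a
// predetermined time is reached.
func shouldGlanceAtTime(minute: Int, second: Int) -> Bool {
    second == 0 && [0, 15, 30, 45].contains(minute)
}

print(shouldGlanceAtTime(minute: 30, second: 0))  // true
print(shouldGlanceAtTime(minute: 10, second: 0))  // false
```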
- FIG. 10G illustrates device 600 in a third activity state (e.g., an inactive unlocked state, a low-power unlocked state) different from the first activity state in FIG. 10A and the second activity state in FIGS. 10B-10F .
- device 600 displays indication of time 1002 , graphical representation 1000 of the first character (e.g., in a visual state having a neutral body expression), and complications 1005 A and 1005 B on background 1004 (similar to the second activity state in, e.g., FIG. 10B ); display 602 is dimmed compared to the second activity state (e.g., an active unlocked state) and brighter compared to the first activity state (e.g., a locked state).
- representation 1000 shows the first character in the same visual state shown in FIG. 10B , where device 600 was in the second activity state (e.g., when device 600 changes from the second activity state to the third activity state, representation 1000 can maintain the visual state of the first character while changing the brightness of display 602 ).
- FIG. 10H illustrates device 600 in a fourth activity state (e.g., a change-in-time state for predetermined intervals) different from the first activity state in FIG. 10A , the second activity state in FIGS. 10B-10F , and the third activity state in FIG. 10G .
- device 600 changes the visual state (e.g., changes the pose, displays a different animation) of the first character in representation 1000 , where changing the visual state includes displaying the first character in representation 1000 to look (e.g., glance) at indication of time 1002 , as illustrated by FIG. 10H .
- device 600 is in the fourth activity state at predetermined time intervals (e.g., every 10 seconds; every 15 seconds; every 30 seconds; every minute; every 5 minutes).
- device 600 receives (e.g., detects) input 1007 (e.g., a touch on display 602 with a duration that exceeds a predetermined threshold, a touch on display 602 with a characteristic intensity that exceeds a predetermined threshold).
- device 600 displays user interface 1008 shown in FIG. 10I .
- user interface 1008 is a user interface of a user interface editing mode (e.g., in response to receiving input 1007 , device 600 enters a user interface editing mode for editing one or more features of user interface 1001 ).
- User interface 1008 displays representation 1001 A of user interface 1001 (e.g., a static, smaller-scale image of user interface 1001 ), share affordance 1010 , and customize affordance 1012 .
- device 600 receives (e.g., detects) input 1014 corresponding to a request to edit user interface 1001 (e.g., a tap at a location on display 602 corresponding to customize affordance 1012 ).
- device 600 displays user interface 1016 A shown in FIG. 10J .
- Paging dots 1044 A- 1044 C indicate that user interface 1016 A is the first in a sequence of three editing user interfaces.
- User interface 1016 A provides the capability to change the character displayed on user interface 1001 (e.g., by swiping up or down on display 602 or rotating rotatable input mechanism 603 ).
- User interface 1016 A displays de-emphasized (e.g., dimmed, greyed, blurred) representations of complications 1005 A and 1005 B, representation 1000 of the currently-selected character (e.g., the first character), character selection element 1046 , and textual identifier 1018 of the currently-selected character.
- Character option selection element 1046 indicates the position of the currently selected option in a sequence of character options.
- device 600 receives input 1020 (e.g., a right-to-left swipe gesture on display 602 ).
- device 600 displays user interface 1016 B, which (as indicated by label 1022 ) provides the capability to change the color of background 1004 of user interface 1001 .
- Paging dots 1044 A- 1044 C are updated to indicate that user interface 1016 B is the second in the sequence of three editing user interfaces.
- User interface 1016 B includes color selection element 1048 , which displays various color options for background 1004 of user interface 1001 . The currently-selected color option is displayed in the middle of color selection element 1048 and at a larger size than the other color options.
- a user can provide an input (e.g., rotation of rotatable input mechanism 603 or a vertical swipe gesture on display 602 ) to select a different color option, and device 600 updates color selection element 1048 and background 1004 accordingly in response to the input.
- device 600 receives (e.g., detects) input 1024 (e.g., a right-to-left swipe gesture on display 602 ).
- device 600 displays user interface 1016 C, which (as indicated by label 1022 ) provides the capability to change the information displayed by complication 1005 A and complication 1005 B.
- Paging dots 1044 A- 1044 C are updated to indicate that user interface 1016 C is the third in the sequence of editing user interfaces.
- a user can select a complication (e.g., by tapping on the complication) and edit the selected complication (e.g., by rotating rotatable input mechanism 603 ).
- Device 600 indicates that the complications can be edited by, e.g., outlining complication 1005 A and complication 1005 B. Upon selection of a complication, device 600 visually distinguishes (e.g., highlights, outlines, increases the brightness of) the selected complication relative to other complications.
- device 600 receives (e.g., detects) input 1030 (e.g., two left-to-right swipes on display 602 , an input with a direction opposite of a direction of input 1024 in FIG. 10K ).
- device 600 displays (e.g., returns to) user interface 1016 A.
- device 600 receives (e.g., detects) input 1032 (e.g., a rotation of rotatable input mechanism 603 ).
- device 600 displays a different character option (e.g., the adjacent option in the sequence of character options) and updates character selection element 1046 accordingly, as shown in FIG. 10N .
- a character option can include only one character or a set of two or more characters.
- the displayed character option includes a set of four characters identified as “Toy Box.”
- device 600 displays the characters of the set individually at different times (e.g., device 600 displays the characters according to a predefined sequence in response to user input (e.g., a wrist raise, a tap on display 602 ) or automatically cycles through the set of characters at predetermined time intervals).
- device 600 receives (e.g., detects) input 1036 (e.g., rotation of rotatable input mechanism 603 , a continuation of input 1032 ).
- device 600 displays a different character option (e.g., the next adjacent option in the sequence of character options) and updates character selection element 1046 accordingly, as shown in FIG. 10O .
- the selected character option corresponds to representation 1040 of an octopus character (as indicated by identifier 1038 ).
- device 600 in response to receiving input 1042 , displays (e.g., returns to) user interface 1008 (shown in FIG. 10I ) with an updated version of representation 1001 A including a representation of the selected character (e.g., representation 1040 ), and then displays user interface 1001 with representation 1040 of the selected character option in response to receiving further input (e.g., a tap on representation 1001 A, a press of rotatable and depressible input mechanism 603 or button 613 while displaying user interface 1008 ).
- FIG. 10Q illustrates an example of representation 1040 of the octopus character in a visual state (e.g., a visual state different from the visual state shown in FIG. 10P ) displayed while device 600 is in the second activity state (e.g., an active, unlocked state).
- representation 1000 of the first character is displayed concurrently with indication of time 1002 at a first time
- a representation of a second character (e.g., representation 1040 of the octopus character or representation 1000 of the first character) is displayed concurrently with indication of time 1002 at a second time different from the first time.
- while device 600 is in one activity state, device 600 displays the representation of the second character in a visual state (e.g., representation 1000 of the first character in the visual state illustrated in FIG. 10B ; representation 1040 of the octopus character in the visual state illustrated in FIG. 10P ; representation 1040 of the octopus character in the visual state illustrated in FIG. 10Q ).
- while device 600 is in a different activity state (e.g., a locked state), device 600 displays the representation of the second character in a different visual state (e.g., representation 1000 of the first character in the state shown in FIG. 10A ; representation 1040 of the octopus character in the visual state illustrated in FIG. 10P , except with eyes closed; representation 1040 of the octopus character in the visual state illustrated in FIG. 10Q , except with eyes closed).
- electronic device 600 is configured to transition between characters in response to detecting a change in the activity state from a third activity state (e.g., a higher-power consumption mode and/or the second activity state) to a fourth activity state (e.g., a lower-power consumption mode and/or the first activity state). For example, when a set of two or more characters is selected for display on user interface 1001 , as shown at FIG. 10N , electronic device 600 displays the characters of the set individually, and in response to a change in the activity state from the third activity state (e.g., a higher-power consumption state, a normal operating state, and/or the second activity state) to the fourth activity state (e.g., a lower-power consumption state, a sleep state, a locked state, and/or the first activity state), transitions from one character in the set to another character in the set.
- electronic device 600 forgoes transitioning between characters in response to detecting a change in the activity state from the fourth activity state (e.g., a lower-power consumption mode) to the third activity state (e.g., a higher-power consumption mode).
- electronic device transitions between characters in response to detecting a change in the activity state from the fourth activity state to the third activity state in addition to, or in lieu of, transitioning between characters in response to detecting a change in the activity state from the third activity state to the fourth activity state.
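The set-cycling behavior could be sketched as below. Type and method names are hypothetical, and the sketch assumes a non-empty character set; it advances on the transition to the lower-power state and, per the forgoing embodiment, keeps the same character on the transition back.

```swift
// Hypothetical sketch of cycling through a selected set of characters on
// activity-state transitions.
struct CharacterCycler {
    let characters: [String]  // e.g., a selected set such as "Toy Box"
    var index = 0

    var current: String { characters[index] }

    mutating func didEnterLowerPowerState() {
        // Advance to the next character when entering the lower-power state.
        index = (index + 1) % characters.count
    }

    mutating func didEnterHigherPowerState() {
        // Forgo transitioning on wake: the character shown in the
        // lower-power state is maintained (per one described embodiment).
    }
}

var cycler = CharacterCycler(characters: ["second character", "third character"])
cycler.didEnterLowerPowerState()
print(cycler.current)  // "third character"
```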
- electronic device 600 is in a third activity state (e.g., the second activity state, a normal operating state, and/or a higher-power consumption state) and displays user interface 1001 with a graphical representation 1050 of a second character (e.g., a character different from the first character corresponding to graphical representation 1000 and the octopus character corresponding to graphical representation 1040 ).
- User interface 1001 also includes time indicator 1002 and complications 1005 A and 1005 B.
- user interface 1001 includes a default color (e.g., black) and background 1004 having one or more colors that are different from the default color (e.g., colors displayed by electronic device 600 in accordance with user inputs while second user interface 1016 B is displayed at FIG. 10K ).
- while user interface 1001 in FIGS. 10B-10F, 10H-10M, and 10O-10Q shows the default color as lighter than background 1004 (e.g., white), user interface 1001 can alternatively display the default color as darker than background 1004 (e.g., black), as shown at FIGS. 10R-10W .
- electronic device 600 displays graphical representation 1050 of the second character in a third visual state (e.g., the second visual state and/or an animated visual state) that corresponds to the third activity state.
- the third visual state includes the second character with eyes and mouth open (e.g., the second character is posing and appears awake (not asleep)).
- FIG. 10S illustrates electronic device 600 in a transition state between the third activity state and a fourth activity state (e.g., the first activity state, a lower-power consumption state, a locked state, a sleep state) in which display 602 begins to dim as compared to FIG. 10R .
- background 1004 and graphical representation 1050 are reduced in size as compared to FIG. 10R as the transition between third activity state and fourth activity state occurs.
- graphical representation 1050 fades out, reduces in brightness, and/or dissolves in the transition between the third activity state and the fourth activity state.
- Electronic device 600 ceases to display complications 1005 A and 1005 B on user interface 1001 .
- electronic device 600 displays time indicator 1002 with a reduced thickness and/or size during the transition between the third activity state and the fourth activity state.
- electronic device 600 is operating in the fourth activity state.
- electronic device 600 displays graphical representation 1052 of a third character, different from the second character. Accordingly, during the transition between the third activity state and the fourth activity state, graphical representation 1050 ceases to be displayed on user interface 1001 and graphical representation 1052 is displayed on user interface 1001 . In some embodiments, graphical representation 1050 fades out and/or dissolves as graphical representation 1052 fades in or is otherwise displayed on user interface 1001 . As set forth above, the second character and the third character are included in the set of characters selected to be displayed on user interface 1001 .
- electronic device 600 transitions from display of the second character to display of the third character.
- graphical representation 1052 displayed while electronic device 600 operates in the fourth activity state is dimmed (e.g., includes a reduced brightness) as compared to graphical representation 1050 displayed while electronic device 600 operates in the third activity state.
- dimming the graphical representation 1052 indicates that electronic device 600 is in the fourth activity state.
- graphical representation 1052 is illustrated in greyscale to indicate that graphical representation 1052 is faded and/or otherwise displayed at a reduced brightness when compared to graphical representation 1050 shown at FIG. 10R .
- Electronic device 600 ceases to display background 1004 on user interface 1001 when electronic device 600 is in the fourth activity state.
- device 600 displays graphical representation 1052 of the third character in a fourth visual state, different from the third visual state, that corresponds to the fourth activity state.
- the fourth visual state shows the third character with eyes open (e.g., a neutral pose).
- the fourth visual state shows the third character with eyes closed such that the third character appears to be asleep.
- the fourth visual state of the third character does not include movement and/or animations of the third character. Accordingly, electronic device 600 does not animate and/or does not cause graphical representation 1052 of the third character to move in response to changes in time (e.g., every minute, every fifteen minutes, every thirty minutes, every hour) and/or in response to user inputs.
- electronic device 600 operates in the third activity state (e.g., electronic device 600 detects a user input and/or a wrist raise gesture causing a transition from the fourth activity state to the third activity state) and displays user interface 1001 with graphical representation 1052 of the third character.
- electronic device 600 does not replace graphical representation 1052 of the third character with a graphical representation of a different character upon transitioning from the fourth activity state to the third activity state.
- electronic device 600 maintains display of the graphical representation 1052 of the third character in response to detecting a change from the fourth activity state to the third activity state.
- electronic device 600 replaces display of graphical representation 1050 with graphical representation 1052 in response to detecting a change from the fourth activity state to the third activity state, but not in response to detecting a change from the third activity state to the fourth activity state.
- user interface 1001 includes background 1004 (e.g., the same background as displayed at FIG. 10R ) and complications 1005 A and 1005 B.
- time indicator 1002 is displayed as having an increased thickness and/or size when compared to time indicator 1002 displayed while electronic device 600 operates in the fourth activity state shown at FIG. 10T .
- electronic device 600 displays graphical representation 1052 of the third character in the third visual state (e.g., the second visual state and/or an animated visual state) that corresponds to the third activity state.
- the third visual state includes the third character with eyes and mouth open (e.g., the third character is posing and appears awake (not asleep)).
- the third visual state of the third character includes periodic movement and/or animations of the third character.
- electronic device 600 can animate and/or cause graphical representation 1052 of the third character to move in response to changes in time (e.g., every minute, every fifteen minutes, every thirty minutes, every hour) and/or in response to user input.
- electronic device 600 in response to detecting a change in the activity state from the third activity state to the fourth activity state, displays user interface 1001 with a fourth character, different from the second character and the third character.
- while electronic device 600 is in the third activity state, electronic device 600 detects user input 1054 (e.g., a tap gesture) on user interface 1001 . In response to detecting user input 1054 , electronic device 600 causes display of graphical representation 1052 of the third character to move (e.g., causes a randomly selected or predetermined animation of graphical representation 1052 ), as shown at FIG. 10V . At FIG. 10V , electronic device 600 displays an enlargement animation (e.g., zooms and/or increases a size) of graphical representation 1052 of the third character. In some embodiments, in response to user input 1054 , electronic device 600 ceases to display a portion of graphical representation 1052 on display 602 . For example, at FIG. 10V , a lower portion of graphical representation 1052 of the third character appears to move off of display 602 and ceases to be displayed by electronic device 600 for a predetermined period of time. Additionally, electronic device 600 causes display of graphical representation 1052 of the third character to cover and/or block at least a portion of complication 1005 B for the predetermined period of time in response to user input 1054 .
- electronic device 600 is configured to fluidly transition between different animations of graphical representation 1052 of the third character in response to user inputs. For example, at FIG. 10V , electronic device 600 detects user input 1056 on user interface 1001 while the lower portion of graphical representation 1052 of the third character is not displayed on display 602 (e.g., while electronic device 600 is causing an enlargement animation of graphical representation 1052 ). In response to detecting user input 1056 , electronic device 600 displays a pose animation of graphical representation 1052 of the third character, as shown at FIG. 10W .
- electronic device 600 displays a randomly selected animation (e.g., another pose animation and/or a different animation than the pose animation) of graphical representation 1052 of the third character in response to detecting user input 1056 .
- electronic device 600 displays graphical representation 1052 of the third character as winking and with an open mouth (e.g., the mouth is open wider than in FIG. 10U ).
- electronic device 600 displays graphical representation 1052 of the third character in the pose depicted in FIG. 10W for a predetermined period of time before returning display of graphical representation 1052 of the third character to the third visual state, as shown at FIG. 10U .
- electronic device 600 displays the animation of graphical representation 1052 in response to detecting user input 1056 after graphical representation 1052 returns to the position shown in FIG. 10U instead of while graphical representation 1052 is positioned as illustrated in FIG. 10V (e.g., while graphical representation 1052 is undergoing enlargement animation caused by user input 1054 ).
- electronic device 600 detects user input 1058 (e.g., a long press gesture) on user interface 1001 .
- electronic device 600 displays user interface 1008 shown at FIG. 10X .
- user interface 1008 is a user interface of a user interface editing mode.
- User interface 1008 displays representation 1060 of user interface 1001 , share affordance 1010 , and customize affordance 1012 (e.g., edit affordance).
- representation 1060 of user interface 1001 includes multiple characters that are included in the set of characters configured to be displayed on user interface 1001 .
- electronic device 600 transitions display of user interface 1001 between individual graphical representations of the set of characters in response to detecting the change from the third activity state to the fourth activity state (and/or in response to detecting the change from the fourth activity state to the third activity state).
- representation 1060 provides an indication that electronic device 600 transitions between displaying the characters in the set of characters when user interface 1001 is selected.
- electronic device 600 receives (e.g., detects) input 1062 corresponding to a request to edit user interface 1001 (e.g., a tap at a location on display 602 corresponding to customize affordance 1012 ).
- electronic device 600 displays user interface 1064 shown at FIG. 10Y .
- User interface 1064 provides the ability to change the character and/or set of characters displayed on user interface 1001 (e.g., by swiping up or down on display 602 or rotating rotatable input mechanism 603 ).
- user interface 1064 includes editing mode indicator 1066 (e.g., “Type”) and additional editing mode user interface object 1068 (e.g., “Color”).
- in response to detecting user input (e.g., a swipe gesture on display 602 ), electronic device 600 adjusts display of user interface 1064 to a second page that provides the ability to change a color of background 1004 .
- user interface 1064 displays representation 1060 of the currently-selected watch face user interface 1001 (e.g., a watch face user interface that displays the set of characters), watch face selection element 1070 , and textual identifier 1072 of the currently-selected set of characters (e.g., “Random Avatar”).
- Watch face option selection element 1070 indicates the position of the currently selected option in a sequence of watch face options.
- electronic device 600 detects rotational input 1074 on rotatable input mechanism 603 .
- electronic device 600 displays user interface 1064 with representation 1076 of a second watch face user interface that includes a second set of characters (e.g., animal-like characters and/or emojis) configured to be displayed on display 602 , as shown at FIG. 10Z .
- user interface 1064 includes textual identifier 1078 (e.g., “Random Emoji”) to reflect representation 1076 of the second watch face user interface that includes the second set of characters. Additionally, electronic device 600 adjusts a position of watch face selection element 1070 in response to rotational input 1074 . At FIG. 10Z , electronic device 600 detects rotational input 1080 on rotatable input mechanism 603 . In response to detecting rotational input 1080 , electronic device 600 displays user interface 1064 with representation 1082 of a third watch face that includes a single character configured to be displayed on display 602 , as shown at FIG. 10AA .
- electronic device 600 displays representation 1060 and representation 1076 with multiple characters to indicate that the corresponding watch face user interface displays individual graphical representations of multiple characters when representation 1060 and/or representation 1076 are selected (e.g., via user input).
- electronic device 600 displays representation 1082 with a single character to indicate that a corresponding watch face user interface displays a graphical representation of a single character when representation 1082 is selected.
- the third watch face user interface does not transition between graphical representations of different characters in response to a change from the third activity state to the fourth activity state, in response to a user input, or after a predetermined amount of time.
- the third watch face user interface maintains display of a graphical representation of the single character, even as electronic device 600 changes from the third activity state to the fourth activity state.
- user interface 1064 also includes textual identifier 1083 (e.g., “Avatar 1”) to identify the third watch face corresponding to representation 1082 .
- electronic device 600 detects user input 1084 (e.g., a tap gesture) corresponding to selection of representation 1076 .
- electronic device 600 displays user interface 1085 , as shown at FIG. 10AB .
- electronic device 600 is in the third activity state (e.g., a normal operating state, a higher-power consumption state) and user interface 1085 includes graphical representation 1086 of a fourth character (e.g., an animal-like emoji, such as a frog) in the third visual state. Additionally, user interface 1085 includes time indicator 1002 , background 1004 , and complications 1005 A and 1005 B.
- electronic device 600 is in the fourth activity state (e.g., a locked state, a sleep state, a lower-power consumption state) and displays user interface 1085 .
- representation 1076 in FIG. 10Z corresponds to a watch face user interface that includes a set of characters that includes more than one character (e.g., as opposed to a single character).
- In response to detecting a change from the third activity state to the fourth activity state, electronic device 600 ceases to display graphical representation 1086 of the fourth character (e.g., a frog character) and displays graphical representation 1088 of a fifth character (e.g., a dog character).
- electronic device 600 also ceases to display background 1004 and complications 1005 A and 1005 B because electronic device 600 operates in the fourth activity state.
- user interface 1085 includes time indicator 1002 having a reduced thickness and/or size as compared to time indicator 1002 displayed at FIG. 10AB .
- FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments.
- Method 1100 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component.
- Some operations in method 1100 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 1100 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system displays ( 1102 ), concurrently in a user interface (e.g., 1001 ) (e.g., a watch face user interface) displayed via the display generation component (e.g., 602 ), an indication of time (e.g., 1002 ) (e.g., the current time; the time set in the systems setting of the computer system) ( 1104 ), and a graphical representation of a first character (e.g., 1000 , 1040 ) (e.g., an animated character; an emoji; an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji; an animated representation of a user of the computer system) ( 1106 ).
- Displaying the graphical representation of the first character includes ( 1106 ), in accordance with a determination that the computer system (e.g., 600 ) is in a first activity state (e.g., an activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a dimmed (e.g., but unlocked) state; a locked state; a time-passing state; a detecting-an-input (e.g., tap input) state; a time-change state), displaying the graphical representation of the first character in a first visual state (e.g., a neutral state; a sleeping state; a selfie state; a time change state; a tick tock state) that corresponds to the first activity state of the computer system ( 1108 ).
- Displaying the graphical representation of the first character includes ( 1106 ), in accordance with a determination that the computer system (e.g., 600 ) is in a second activity state (e.g., an activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a dimmed (e.g., but unlocked) state; a locked state; a time-passing state; a detecting-an-input (e.g., tap input) state; a time-change state) that is different from the first activity state, displaying the graphical representation of the first character in a second visual state (e.g., a neutral state; a sleeping state; a selfie state; a time change state; a tick tock state), different from the first visual state, that corresponds to the second activity state of the computer system ( 1110 ).
- Displaying the graphical representation of the first character in a different visual state based on an activity state of the computer system provides visual feedback about the current activity state of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the activity state of the computer system).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
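- The determination described above amounts to a mapping from device activity states to character visual states. A minimal Swift sketch of that mapping follows; all names here are hypothetical and not from the specification.

```swift
// Hypothetical model of the activity-state-to-visual-state mapping.
enum ActivityState {
    case dimmed, locked, timePassing, inputDetected, timeChanged
}

enum VisualState {
    case neutral, sleeping, tickTock, selfiePose, timeChangePose
}

/// Returns the visual state in which the character is displayed for a
/// given activity state of the computer system.
func visualState(for activity: ActivityState) -> VisualState {
    switch activity {
    case .dimmed:        return .neutral
    case .locked:        return .sleeping
    case .timePassing:   return .tickTock
    case .inputDetected: return .selfiePose
    case .timeChanged:   return .timeChangePose
    }
}
```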
- the computer system displays ( 1112 ), concurrently in the user interface (e.g., 1001 ) the indication of time (e.g., 1002 ) (e.g., the current time; the time set in the systems setting of the computer system) ( 1114 ), and a graphical representation of a second character (e.g., 1000 , 1040 ) (e.g., an animated character; an emoji; an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji; an animated representation of a user of the computer system, the first character, a character different from the first character) ( 1116 ).
- the second character is the same character as the first character.
- the second character is a different character from the first character.
- Displaying the graphical representation of the second character includes ( 1116 ), in accordance with a determination that the computer system (e.g., 600 ) is in the first activity state (e.g., an activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a dimmed (e.g., but unlocked) state; a locked state; a time-passing state; a detecting-an-input (e.g., tap input) state; a time-change state), displaying the graphical representation of the second character in the first visual state (e.g., a neutral state; a sleeping state; a selfie state; a time change state; a tick tock state) that corresponds to the first activity state of the computer system ( 1118 ).
- Displaying the graphical representation of the second character includes ( 1116 ), in accordance with a determination that the computer system (e.g., 600 ) is in the second activity state (e.g., an activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a dimmed (e.g., but unlocked) state; a locked state; a time-passing state; a detecting-an-input (e.g., tap input) state; a time-change state) that is different from the first activity state, displaying the graphical representation of the second character in the second visual state (e.g., a neutral state; a sleeping state; a selfie state; a time change state; a tick tock state), different from the first visual state, that corresponds to the second activity state of the computer system ( 1120 ).
- Displaying the graphical representation of the second character in a different visual state based on an activity state of the computer system provides visual feedback about the current activity state (e.g., or a change in activity state) of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the activity state or a change in activity state of the computer system).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system concurrently displays or causes display of, in the user interface (e.g., 1001 ) (e.g., overlaid on the graphical representation of the first character and/or the graphical representation of the second character), one or more watch complications (e.g., 1005 A, 1005 B).
- the one or more watch complications include a complication indicating a current date.
- the one or more watch complications include a complication that includes text information (e.g., about the weather; about a calendar meeting).
- the user interface also includes an editing tab (e.g., to access an editing page) for editing the one or more watch complications (e.g., changing one or more of the watch complications to a different type).
- the computer system detects (e.g., determines) ( 1122 ) a change in activity state of the computer system from the first activity state to the second activity state (e.g., between activity states in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., the device being locked or unlocked) of the computer system; a change in the current time (e.g., a change in the hour, the minute, or the second of the current time); a change in a state of the computer system due to a detected user input, with the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input).
- displaying the graphical representation of the second character (e.g., 1000 , 1040 ) in the second visual state includes displaying the graphical representation of the second character in the second visual state in response to detecting (e.g., determining) the change in activity state of the computer system from the first activity state (e.g., an activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) to the second activity state (e.g., a different activity state in FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q).
- the second character is the same character as the first character (e.g., 1000 , 1040 ). In some embodiments, the second character is a different character from the first character. Displaying the graphical representation of the second character in the second visual state in response to detecting (e.g., determining) the change in activity state of the computer system from the first activity state to the second activity state provides visual feedback about the change in activity state of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the change in activity state of the computer system).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
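- One way to realize the response to a detected activity-state change is to re-render the character only when the state actually changes. A sketch under the same hypothetical naming (the class and its members are illustrative, not from the specification):

```swift
// Hypothetical sketch: redisplay the character when the activity state changes.
enum Activity { case first, second }
enum Visual { case firstState, secondState }

final class CharacterFace {
    private(set) var activity: Activity = .first

    /// Detecting a change in activity state triggers display of the
    /// corresponding visual state, as described above.
    func activityDidChange(to newActivity: Activity) {
        guard newActivity != activity else { return }
        activity = newActivity
        display(newActivity == .first ? .firstState : .secondState)
    }

    private func display(_ visual: Visual) {
        print("character shown in \(visual)")
    }
}
```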
- the first character is the same character as the second character ( 1124 ). In some embodiments, the first character is a different character from the second character ( 1126 ). In some embodiments, the first visual state or the second visual state is a static (e.g., not moving; not animated; not dynamic) visual state ( 1128 ). In some embodiments, the first visual state or the second visual state is an animated (e.g., moving; dynamic) visual state ( 1130 ).
- the first activity state corresponds to a state in which the user interface (e.g., 1001 ) is displayed at a lower brightness level than a designated brightness level (e.g., as compared to a standard brightness level, a brightness level of an active state), and the first visual state corresponds to a neutral body expression (e.g., a neutral state; a state or animation of the respective character (e.g., the first character and/or the second character) that reflects a neutral stance/image or motion).
- Displaying the representation of a character with the first visual state corresponding to the neutral body expression when/if the first activity state corresponds to a state in which the user interface is displayed at a lower brightness level than a designated brightness level provides visual feedback that the current activity state of the computer system corresponds to the state in which the user interface is displayed at a lower brightness level than a designated brightness level (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first activity state corresponds to a locked state (e.g., where authentication (e.g., biometric authentication; passcode authentication) is required to unlock the computer system (e.g., 600 )), and the first visual state includes a visual appearance that the first character (e.g., 1000 , 1040 ) is asleep (e.g., a sleeping state; a state or motion of the respective character (e.g., the first character and/or the second character) that reflects a sleeping stance/image or motion).
- Displaying the representation of a character with the first visual state including the visual appearance that the first character is asleep when/if the first activity state corresponds to a locked state provides visual feedback that the current activity state of the computer system corresponds to the locked state (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first activity state corresponds to a state in which the indication of time (e.g., the current time; the time set in the systems setting of the computer system) is being displayed (e.g., the passing time is being displayed).
- the first visual state corresponds to a time indication state in which a respective motion (e.g., animation) repeats at a regular frequency (e.g., a state or motion of the respective character (e.g., the first character and/or the second character) indicating that time is passing or that time is ticking by (e.g., a tick tock state; a tick tock animation)), wherein the respective motion corresponds to a nodding motion by the first character (e.g., a back-and-forth motion of a head of the first character representing the nodding motion).
- Displaying the representation of a character with a respective motion (e.g., animation), wherein the respective motion corresponds to a nodding motion by the first character, when/if the first activity state corresponds to a state in which the indication of time is being displayed provides visual feedback that the current activity state of the computer system corresponds to the state in which the indication of time is being displayed (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- displaying the graphical representation of the first character (e.g., 1000 , 1040 ) (e.g., and/or the second character) in the time indication state includes displaying the first character looking at the indication of time at a predetermined time interval (e.g., every 10 seconds; every 15 seconds; every 30 seconds; every minute; every 5 minutes).
- In some embodiments, in accordance with a determination that the first character is of a first character type (e.g., an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji), the displayed glancing animation corresponds to a first type of glancing animation, and in accordance with a determination that the first character is of a second character type, the displayed glancing animation corresponds to a second type of glancing animation (e.g., glancing in a different direction; glancing in a different manner) different from the first type of glancing animation.
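- The glance behavior above can be read as: on a fixed interval, pick a glance animation keyed to the character's type. A sketch follows; the interval and the animation names are assumptions, not from the specification.

```swift
import Foundation

// Hypothetical sketch of the periodic glance at the time indicator.
enum CharacterType { case animalEmoji, avatarEmoji }
enum GlanceAnimation { case glanceTowardTime, glanceAlternate }

/// The glance style depends on the character's type, per the passage above.
func glanceAnimation(for type: CharacterType) -> GlanceAnimation {
    switch type {
    case .animalEmoji: return .glanceTowardTime // first type of glance
    case .avatarEmoji: return .glanceAlternate  // second type, e.g., a different direction
    }
}

/// Whether a glance is due, given a predetermined interval (e.g., 30 seconds).
func glanceIsDue(secondsSinceLastGlance: TimeInterval,
                 interval: TimeInterval = 30) -> Bool {
    secondsSinceLastGlance >= interval
}
```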
- the first activity state corresponds to detecting a touch (e.g., tap) input (e.g., a tap input detected via a touch-sensitive surface integrated with the display generation component), and the first visual state corresponds to a first type of motion state (e.g., static or dynamic) that is indicative of a posing gesture (e.g., posing for a selfie) (e.g., a selfie pose; a pose or motion of the respective character (e.g., the first character and/or the second character) that reflects a pose or motion of taking a selfie).
- Displaying the representation of a character corresponding to a first type of motion state that is indicative of a posing gesture when/if the first activity state corresponds to detecting a touch (e.g., tap) input provides visual feedback that the current activity state of the computer system corresponds to detecting the touch (e.g., tap) input (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first activity state corresponds to detecting that there has been a change in time (e.g., a certain time has been reached (e.g., the hour has changed; a quarter past the hour has been reached; half past the hour has been reached)), and the first visual state corresponds to a second type of motion state (e.g., static or dynamic) that is indicative of the change in time (e.g., a time change pose; a pose or motion of the respective character (e.g., the first character and/or the second character) that reflects a pose or motion indicating or acknowledging that the time has changed).
- Displaying the representation of a character corresponding to a second type of motion state that is indicative of the change in time when/if the first activity state corresponds to the computer system detecting that there has been a change in time provides visual feedback that the current activity state of the computer system corresponds to the computer system detecting that there has been a change in time (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state).
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- displaying the user interface includes displaying, in the user interface, the graphical representation of the first character (e.g., 1000 , 1040 ).
- displaying the user interface includes displaying, in the user interface, a transition (e.g., a gradual transition; a smooth transition) from the graphical representation of the first character to the graphical representation of the second character, wherein the second character is different from the first character.
- displaying the user interface includes displaying, in the user interface, a graphical representation of a third character, wherein the third character is different from the first character and from the second character.
- the computer system displays, via the display generation component (e.g., 602 ), a second user interface that includes a plurality of selectable characters (e.g., 1016 A) (e.g., including a plurality of animated (e.g., 3D) emojis of animal-like characters; a plurality of animated (e.g., 3D) avatar-like emojis).
- the plurality of selectable characters are displayed in a first tab or first screen of the second user interface.
- Displaying the second user interface that includes the plurality of selectable characters enables a user to manage the characters that are displayed in the user interface with the indication of time and thus easily customize the user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs when operating/interacting with the device to customize the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the second user interface, the computer system (e.g., 600 ) detects (e.g., via one or more input devices of the computer system, such as a touch-sensitive surface integrated with the display generation component) a selection of a third character of the plurality of selectable characters.
- In response to detecting the selection of the third character, the computer system displays, via the display generation component, the user interface, wherein the user interface concurrently includes the indication of time (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation of the third character (e.g., different from the first character and from the second character).
- the computer system displays, via the display generation component (e.g., 602 ), a third user interface (e.g., 1016 A) (e.g., the second user interface) that includes a graphical representation of a set of characters that includes two or more characters.
- the computer system detects (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input corresponding to selection of the set of characters.
- In accordance with (e.g., or in response to) detecting the selection of the set of characters, the computer system concurrently displays, in the user interface, the indication of time (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation of a respective character from the set of characters, wherein the respective character changes among the set of characters over time (e.g., one character from the set of characters is (e.g., randomly) selected for display at a time).
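- A plausible reading of the character rotation above is random selection that avoids repeating the character currently on screen; the specification only says one character is (e.g., randomly) selected at a time, so this sketch is purely illustrative.

```swift
// Hypothetical sketch: pick the next character to display from the
// selected set, avoiding an immediate repeat where possible.
func nextCharacter(from set: [String], excluding current: String?) -> String? {
    let candidates = set.filter { $0 != current }
    return candidates.randomElement() ?? current
}

// Example: cycling among a set of characters over time.
let characterSet = ["frog", "dog", "octopus"]
var shown = nextCharacter(from: characterSet, excluding: nil)
shown = nextCharacter(from: characterSet, excluding: shown)
```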
- the representation of the first character corresponds to a graphical representation of (e.g., an animation based on; a graphical representations that animates features of) a user associated (e.g., based on an account to which the computer system is logged into) with the computer system (e.g., 600 ) (e.g., an animated (e.g., 3D) avatar-like representation of the user of the computer system).
- the computer system displays, via the display generation component (e.g., 602 ), a fourth user interface that includes a representation of a selected character (e.g., a selected animated (e.g., 3D) emoji of an animal-like character; a selected animated (e.g., 3D) avatar-like emoji).
- the representation of the selected character is displayed in a second tab or second screen of the second user interface.
- the second tab or second screen of the second user interface enables a user to customize (e.g., change a color of; change a background color of) the representation of the selected character and/or a background associated with the representation of the selected character.
- the computer system detects (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input (e.g., a rotational input on rotatable input device 603 in FIG. 10K ; a scrolling input on a touch-sensitive surface integrated with the display generation component) directed to changing a visual characteristic (e.g., a background color; a background color theme).
- In response to detecting the input directed to changing the visual characteristic, the computer system changes (e.g., by transitioning through a plurality of selectable visual characteristics (e.g., selectable colors)) the visual characteristic (e.g., a color; a background color) from a first visual characteristic (e.g., a first color; a first background color) to a second visual characteristic (e.g., a second color; a second background color) different from the first visual characteristic.
- the computer system displays or causes display of, in the second user interface (e.g., 1016 B; a second tab or second screen of the second user interface), a user interface element (e.g., 1048 ; a rotatable user interface element; a color wheel) for changing the visual characteristic (e.g., a color; a background color).
- In response to (e.g., and while) detecting the input directed to changing the visual characteristic, the computer system displays or causes display of a change in the selected visual characteristic via the user interface element for changing the visual characteristic (e.g., transitioning and/or rotating through selectable colors in the color wheel while the input is being detected).
- the input directed to changing the visual characteristic is a rotational input (e.g., detected/received via a rotatable input device that is in communication with the computer system), and the change in the selected visual characteristic includes scrolling/navigating through a plurality of different colors (e.g., scrolling through the color wheel) of the user interface element.
- the computer system scrolls/navigates the user interface element (e.g., the color wheel) in a first direction in accordance with a determination that the rotational input is in a first direction (e.g., a clockwise direction) and scrolls/navigates the user interface element (e.g., the color wheel) in a second direction in accordance with a determination that the rotational input is in a second direction (e.g., a counter-clockwise direction).
- the computer system detects ( 1132 ) (e.g., determines) a change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state (e.g., a lower power consumption mode) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., device being locked or unlocked) of the computer system; a change in a state of the computer system due to a detected user input and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input).
- In response to detecting ( 1134 ) the change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state, the computer system displays ( 1136 ), in the user interface (e.g., 1001 ), the graphical representation (e.g., 1052 , 1088 ) of the second character (e.g., a transition animation causes the graphical representation of the first character to begin to fade, dissolve, and/or reduce in size and the graphical representation of the second character to begin to be displayed at the same size as the first character) (e.g., the graphical representation of the second character is in the second visual state, such as a neutral state, a static state, and/or a sleeping state); and ceases ( 1138 ) to display, in the user interface (e.g., 1001 ), the graphical representation (e.g., 1050 , 1086 ) of the first character, wherein the second character is different from the first character (e.g., the first character and the second character are different characters).
- the computer system maintains display of the graphical representation (e.g., 1052 , 1088 ) of the second character in response to detecting a change in activity state of the computer system (e.g., 600 ) from the second activity state to the first activity state.
- the computer system transitions between the graphical representation (e.g., 1050 , 1086 ) of the first character and the graphical representation (e.g., 1052 , 1088 ) of the second character in response to detecting a change in the activity state from a lower power consumption mode to a higher power consumption mode, and maintains display of the currently displayed graphical representation (e.g., 1050 , 1086 ) of the first character or the graphical representation (e.g., 1052 , 1088 ) of the second character in response to detecting the transition from the higher power consumption mode to the lower power consumption mode.
- Displaying the graphical representation of the second character and ceasing to display the graphical representation of the first character in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
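- The passage above describes swapping characters on one direction of the activity transition and keeping the current character on the other. A sketch of that asymmetry follows; the trigger direction varies by embodiment, so this is one reading with hypothetical names.

```swift
// Hypothetical sketch: swap characters on one activity transition only.
struct CharacterRotation {
    let characters: [String]
    private(set) var index = 0

    var current: String { characters[index] }

    /// On the transition that triggers a swap, advance to another character.
    mutating func didEnterSwapTransition() {
        index = (index + 1) % characters.count
    }

    /// On the opposite transition, the displayed character is maintained.
    func didEnterOppositeTransition() -> String { current }
}
```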
- the computer system detects ( 1142 ) a change in activity state of the computer system (e.g., 600 ) from the second activity state to the first activity state; and in response to detecting the change in activity state of the computer system (e.g., 600 ) from the second activity state to the first activity state, maintains ( 1144 ) display, in the user interface (e.g., 1001 ), of the graphical representation (e.g., 1052 , 1088 ) of the second character, wherein the graphical representation (e.g., 1052 , 1088 ) of the second character includes an animated visual state (e.g., maintaining display of the graphical representation of the second character, but changing a visual state of the graphical representation of the second character in response to detecting the change in activity state from the second activity state to the first activity state).
- Displaying the graphical representation of the second character in an animated visual state in response to detecting the change in activity state from the second activity state to the first activity state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system detects ( 1146 ) a change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state.
- In response to detecting ( 1148 ) the change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state, the computer system displays ( 1150 ), in the user interface (e.g., 1001 ), a graphical representation of a third character (e.g., a transition animation causes the graphical representation of the second character to begin to fade, dissolve, and/or reduce in size and the graphical representation of the third character to begin to be displayed at the same size as the first character) (e.g., the graphical representation of the third character is in the second visual state, such as a neutral state, a static state, and/or a sleeping state); and ceases ( 1152 ) to display, in the user interface (e.g., 1001 ), the graphical representation (e.g., 1052 , 1088 ) of the second character, wherein the third character is different from the first character and the second character (e.g., the first character, the second character, and the third character are all different characters).
- Displaying the graphical representation of the third character and ceasing to display the graphical representation of the second character in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- displaying, in the user interface (e.g., 1001 ), the graphical representation (e.g., 1050 , 1086 ) of the first character includes displaying a graphical element (e.g., 1004 ) surrounding at least a portion of the first character (e.g., displaying the first character overlaid on the graphical element) (e.g., a background having a ring of color and/or multiple rings of color different from a color of user interface (e.g., a black color)) displayed in the user interface (e.g., 1001 ).
- the computer system detects ( 1132 ) (e.g., determines) a change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state (e.g., a lower power consumption mode) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., the device being locked or unlocked) of the computer system; a change in a state of the computer system due to a detected user input, with the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input).
- In response ( 1134 ) to detecting the change in activity state of the computer system (e.g., 600 ) from the first activity state to the second activity state, the computer system decreases ( 1140 ) a brightness of a portion of the user interface (e.g., 1001 ) that included the graphical element (e.g., 1004 ) (e.g., fading the graphical element or displaying the graphical representation of the second character without the graphical element in the user interface) (e.g., a transition animation causes the graphical element to fade to a color that is closer to or the same as the color of a background portion of the user interface (e.g., black) in response to detecting the change in activity state of the computer system from the first activity state to the second activity state).
- Decreasing the brightness of the portion of the user interface that included the graphical element in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
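- Decreasing the brightness of the region containing the background element can be as simple as scaling its color toward the black background; the linear fade in this sketch is an assumption, as are all names.

```swift
// Hypothetical sketch: fade the ring-shaped background element toward black
// when entering the lower-power activity state.
struct RGB { var r: Double, g: Double, b: Double }

func dimmed(_ color: RGB, by factor: Double) -> RGB {
    let f = max(0, min(1, factor))
    return RGB(r: color.r * f, g: color.g * f, b: color.b * f)
}

// Example: a strong dim leaves the element close to the black background.
let ring = RGB(r: 0.9, g: 0.4, b: 0.2)
let faded = dimmed(ring, by: 0.15)
```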
- While the computer system (e.g., 600 ) is in the first activity state (e.g., a higher power consumption mode), in response to a determination that a predetermined change in time has occurred (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), the computer system displays ( 1154 ) the graphical representation (e.g., 1050 , 1086 ) of the first character in a change-in-time visual state (e.g., a time change pose; a pose or motion of the first character that reflects a pose or motion indicating or acknowledging that the time has changed).
- While the computer system (e.g., 600 ) is in the second activity state (e.g., a lower power consumption mode), the computer system forgoes ( 1156 ) display of the graphical representation (e.g., 1052 , 1088 ) of the second character in the change-in-time visual state when the predetermined change in time has occurred.
- Displaying the graphical representation of the first character in the change-in-time visual state while the computer system is in the first activity state and forgoing display of the graphical representation of the second character in the change-in-time visual state while the computer system is in the second activity state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
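- The behavior above gates the change-in-time animation on the power mode: the indication of time always updates, but the animation plays only in the higher-power state. A minimal sketch (names hypothetical):

```swift
// Hypothetical sketch of gating the change-in-time animation on power mode.
enum PowerMode { case higher, lower }

func handlePredeterminedTimeChange(mode: PowerMode,
                                   updateTime: () -> Void,
                                   playChangeInTimePose: () -> Void) {
    updateTime()                // the representation of time always updates
    if mode == .higher {
        playChangeInTimePose()  // animation only in the first activity state
    }                           // otherwise the animation is forgone
}
```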
- the computer system detects ( 1158 ) a change in time (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), and in response to detecting ( 1160 ) the change in time and in accordance with a determination that the computer system (e.g., 600 ) is in the first activity state (e.g., a higher power consumption mode), updates ( 1162 ) a representation of time (e.g., 1002 ) and displays the graphical representation (e.g., 1050 , 1086 ) of the first character in a first manner (e.g., a visual state that includes animating the graphical representation of the first character in response to detecting the change in time).
- the computer system detects ( 1158 ) a change in time (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), and in response to detecting ( 1160 ) the change in time and in accordance with a determination that the computer system (e.g., 600 ) is in the second activity state (e.g., a lower power consumption mode), updates ( 1164 ) the representation of time (e.g., 1002 ) without displaying the graphical representation (e.g., 1050 , 1086 ) of the first character in the first manner (e.g., displaying the graphical representation of the first character in a second manner (e.g., a static visual state) that is different from the first manner and/or forgoing any change in the graphical representation of the first character in response to detecting the change in time).
- Displaying the graphical representation of the first character in the first manner and forgoing display of the graphical representation of the first character in the first manner depending on an activity state of the computer system provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character, the computer system detects ( 1166 ) an input (e.g., 1054 ) directed to one or more input devices of the computer system (e.g., 600 ) (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode); and in response to detecting the input (e.g., 1054 ), displays ( 1170 ) the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character in a third visual state that includes enlarging the graphical representation of the first character (e.g., increasing a size of the first character with respect to the user interface and/or the display generation component) such that a portion of the graphical representation of the first character extends beyond an edge of the user interface and ceases to be displayed.
- Displaying the graphical representation of the first character in the third visual state provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character, the computer system detects ( 1172 ) a first input (e.g., 1054 ) directed to one or more input devices of the computer system (e.g., 600 ) (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode).
- In response to detecting the first input (e.g., 1054 ), the computer system displays ( 1174 ) the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character in a first animated visual state for a predetermined period of time (e.g., causing an animation of the graphical representation of the first character that lasts for a certain period of time, such as 1 second, 2 seconds, 3 seconds, 4 seconds, or 5 seconds).
- After detecting the first input (e.g., 1054 ), the computer system (e.g., 600 ) detects ( 1176 ) a second input (e.g., 1056 ) directed to one or more input devices of the computer system (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode).
- In response to detecting ( 1178 ) the second input (e.g., 1056 ) and in accordance with a determination that the predetermined period of time has ended (e.g., the animation caused by the first input has ended and the graphical representation of the first character is displayed in a default position), the computer system displays ( 1180 ) the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character in a second animated visual state (e.g., causing an animation of the graphical representation of the first character), wherein the second animated visual state includes movement of the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character starting from a first position (e.g., a default position of the graphical representation of the first character that is displayed when no user input is detected that causes an animation of the graphical representation of the first character).
- In response to detecting ( 1178 ) the second input (e.g., 1056 ) and in accordance with a determination that the predetermined period of time has not ended (e.g., the animation caused by the first input is still occurring, such that the graphical representation of the first character is not in the default position), the computer system displays ( 1182 ) the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character in a third animated visual state (e.g., causing an animation of the graphical representation of the first character) (e.g., the second animated visual state where the graphical representation of the first character starts from a different position), wherein the third animated visual state includes movement of the graphical representation (e.g., 1050 , 1052 , 1086 , 1088 ) of the first character starting from a second position (e.g., a position of the graphical representation of the first character that is not the default position and/or a position resulting from the still-ongoing animation caused by the first input).
- Displaying the graphical representation of the first character in the second animated visual state or the third animated visual state depending on whether the predetermined time period has ended provides improved visual feedback about the current activity state of the computer system.
- Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
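- The second-input behavior above can be modeled by remembering when the first animation ends: a later input starts the animation from the default pose, while an earlier one continues from the character's current position. A sketch; the duration value and names are assumptions.

```swift
import Foundation

// Hypothetical sketch: choose the start position of an input-triggered
// animation based on whether the previous animation has finished.
struct InputAnimator {
    private var animationEndsAt: Date?
    let duration: TimeInterval = 2 // e.g., a 2-second animation

    mutating func handleInput(now: Date = Date(),
                              currentPose: String) -> String {
        let start: String
        if let end = animationEndsAt, now < end {
            start = currentPose     // third animated visual state
        } else {
            start = "defaultPose"   // second animated visual state
        }
        animationEndsAt = now.addingTimeInterval(duration)
        return start
    }
}
```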
- the computer system displays ( 1184 ), via the display generation component (e.g., 602 ), a fifth user interface (e.g., 1064 ) (e.g., the second user interface and/or the third user interface) for selecting between a graphical representation (e.g., 1060 ) of a first set of characters that includes a plurality of user-customizable virtual avatars (e.g., a plurality of avatar-like emojis) and a graphical representation (e.g., 1076 ) of a second set of characters (e.g., a plurality of emojis of animal-like characters) that includes two or more predetermined characters that are not available in the first set of characters.
- While displaying the fifth user interface (e.g., 1064 ), the computer system detects ( 1186 ) (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input (e.g., 1084 ) corresponding to selection of the first set of characters (e.g., 1060 ) or the second set of characters (e.g., 1076 ). In accordance with (e.g., or in response to) a determination that the input corresponds to selection of the first set of characters (e.g., 1060 ), the computer system (e.g., 600 ) concurrently displays ( 1188 ), in the user interface (e.g., 1001 ): the indication of time (e.g., 1002 ) ( 1190 ) (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation (e.g., 1050 , 1052 ) of a currently selected character from the first set of characters ( 1192 ).
- In accordance with (e.g., or in response to) a determination that the input (e.g., 1084 ) corresponds to selection of the second set of characters (e.g., 1076 ), the computer system concurrently displays ( 1194 ), in the user interface (e.g., 1001 ): the indication of time (e.g., 1002 ) ( 1196 ) (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation (e.g., 1086 , 1088 ) of a currently selected character from the second set of characters ( 1198 ).
- Displaying the fifth user interface for selecting between the first set of characters and the second set of characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100 .
- a device can use as a watch user interface either a watch user interface as described with reference to FIGS. 6A-6H or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC .
- method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100 .
- a device can use as a watch user interface either a watch user interface as described with reference to FIGS. 8A-8M or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC .
- method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100 .
- a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC .
- method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100 .
- a device can use as a watch user interface either a user interface that includes a background as described with reference to FIGS. 14A-14AD or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC .
- method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100 .
- one or more characteristics or features of a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE .
- For brevity, these details are not repeated below.
- FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 13A-13C .
- FIG. 12A illustrates device 600 displaying, via display 602 , a time user interface 1204 (e.g., a watch user interface that includes an indication of a current time) that includes a face 1206 (e.g., a representation of a human face or a representation of an anthropomorphic face of a non-human character).
- face 1206 comprises a plurality of facial features, including a first facial feature 1208 (e.g., representing/indicative of the eyes; also referred to as eyes 1208 ), a second facial feature 1210 (e.g., also referred to as nose 1210 ), a third facial feature 1212 (e.g., also referred to as mouth 1212 (e.g., lips)), a fourth facial feature 1214 (e.g., also referred to as hair 1214 ), a fifth facial feature 1216 (e.g., also referred to as facial outline 1216 (e.g., including cheeks and/or jawline)), a sixth facial feature 1218 (e.g., also referred to as neck 1218 ), and a seventh facial feature 1220 (e.g., also referred to as shoulders 1220 ).
- eyes 1208 indicate a current time (e.g., the current time; the time set in the systems setting of device 600 ), where the shape of the eyes corresponds to the current time (e.g., the right eye is represented via a number or numbers that indicate the current hour, and the left eye is represented via numbers that indicate the current minute).
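- Conceptually, this eyes-as-time rendering can be sketched as a small mapping from the current time to two digit strings, one per eye. A minimal Swift sketch follows; the `EyeGlyphs` type and `eyeGlyphs(for:)` function are illustrative names, not from the patent.

```swift
import Foundation

// Sketch: the right eye carries the hour digits and the left eye
// carries the zero-padded minute digits, per the description above.
struct EyeGlyphs {
    let leftEye: String   // minute digits, e.g. "05"
    let rightEye: String  // hour digits, e.g. "9" or "12"
}

func eyeGlyphs(for date: Date, calendar: Calendar = .current) -> EyeGlyphs {
    let hour = calendar.component(.hour, from: date) % 12
    let minute = calendar.component(.minute, from: date)
    // Map hour 0 to 12 for a 12-hour dial; zero-pad the minute so
    // 9:05 reads as "9" / "05".
    return EyeGlyphs(leftEye: String(format: "%02d", minute),
                     rightEye: String(hour == 0 ? 12 : hour))
}
```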
- In some embodiments, device 600 displays an animation (e.g., blinking motion) or a change in visual characteristic (e.g., change in color; change in font; change in style) via eyes 1208 .
- eyes 1208 , nose 1210 , mouth 1212 , hair 1214 , facial outline 1216 , neck 1218 , and shoulders 1220 respectively, have a corresponding visual characteristic (e.g., a respective color (e.g., a respective line color or a respective fill color); a respective shape; a respective position).
- one or more of the facial features 1208 - 1220 have the same corresponding visual characteristic (e.g., the same line or fill colors).
- nose 1210 and mouth 1212 can have the same visual characteristic (e.g., the same color (e.g., the same line color or the same fill color)), while eyes 1208 , hair 1214 , facial outline 1216 , neck 1218 , and shoulders 1220 can have different visual characteristics (e.g., different colors (e.g., different line colors and/or different fill colors)).
- eyes 1208 , mouth 1212 , facial outline 1216 , and shoulders 1220 can have the same visual characteristic (e.g., the same color (e.g., the same line color or the same fill color)) while nose 1210 , hair 1214 , neck 1218 can have different visual characteristics (e.g., different colors (e.g., different line colors and/or different fill colors)).
- a respective visual characteristic for a respective facial feature corresponds to a type of color.
- the type of color is programmatically selected (e.g., determined), without user input, from a plurality of available colors by device 600 .
- an application process selects (e.g., programmatically determines) the color based on a color of device 600 (e.g., a color of a housing or case of device 600 ).
- the application process selects the color based on usage history of a user of device 600 (e.g., based on a previous user-selected color or color scheme).
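- As a rough illustration of this programmatic selection, the sketch below picks a feature color from a palette using the housing color first, then usage history, then a random fallback. All names here (`selectFeatureColor`, `housingColor`, `recentUserColors`) are assumptions, not API from the patent.

```swift
import SwiftUI

// Sketch: choose a facial-feature color without user input.
func selectFeatureColor(from palette: [Color],
                        housingColor: Color?,
                        recentUserColors: [Color]) -> Color {
    // Prefer a palette entry matching the device housing, if one exists.
    if let housing = housingColor, palette.contains(housing) {
        return housing
    }
    // Otherwise fall back to the user's most recently chosen color.
    if let recent = recentUserColors.last, palette.contains(recent) {
        return recent
    }
    // Otherwise pick pseudo-randomly from the available colors.
    return palette.randomElement() ?? .primary
}
```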
- While displaying time user interface 1204 including face 1206 , device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of time user interface 1204 (e.g., a change in the current time; a change in a state of device 600 due to a detected user input (e.g., a tap on display 602 ); detecting a movement of device 600 (e.g., caused by a user movement, such as a wrist-raise movement); a change in state or a change in mode of device 600 (e.g., transitioning to a sleep mode or sleeping state; transitioning from a locked state to an unlocked state)).
- in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204 , device 600 ceases display of face 1206 of FIG. 12A and displays a different type of face (e.g., a face where respective visual characteristics of all facial features have been changed), for example face 1222 in FIG. 12B .
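- The events that can satisfy this predetermined criteria might be modeled as a small enum, as in the hedged Swift sketch below; the `AppearanceChangeTrigger` type and `shouldChangeFace(for:)` function are illustrative names.

```swift
// Sketch: the kinds of events described above that could trigger
// a change in the appearance of the time user interface.
enum AppearanceChangeTrigger {
    case timeElapsed   // e.g., the hour or minute rolled over
    case userTap       // a tap on the display
    case wristRaise    // a detected raise-to-wake movement
    case stateChange   // e.g., a sleep transition or an unlock
}

func shouldChangeFace(for trigger: AppearanceChangeTrigger) -> Bool {
    // In this sketch every trigger satisfies the criteria; a real
    // implementation might gate each case on additional conditions.
    switch trigger {
    case .timeElapsed, .userTap, .wristRaise, .stateChange:
        return true
    }
}
```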
- FIG. 12B illustrates device 600 displaying, via display 602 , time user interface 1204 that includes (e.g., a representation of) face 1222 that is different from face 1206 .
- face 1222 comprises a plurality of facial features, including a first facial feature 1224 (e.g., eyes indicating the current time; also referred to as eyes 1224 ), a second facial feature 1226 (e.g., also referred to as nose 1226 ), a third facial feature 1228 (e.g., also referred to as mouth 1228 (e.g., lips)), a fourth facial feature 1230 (e.g., also referred to as hair 1230 ), a fifth facial feature 1232 (e.g., also referred to as facial outline 1232 (e.g., including cheeks and/or jawline)), a sixth facial feature 1234 (e.g., also referred to as neck 1234 ), and a seventh facial feature 1236 (e.g., also referred to as shoulders 1236 ).
- eyes 1224 indicate a current time, where the shape of the eyes corresponds to the current time.
- facial features 1224 - 1236 of face 1222 have respective visual characteristics (e.g., a respective color (e.g., line color or fill color); a respective shape; a respective position).
- ceasing display of face 1206 as in FIG. 12A and displaying face 1222 as in FIG. 12B includes displaying a gradual transition from face 1206 to face 1222 that includes transitioning a respective facial feature of face 1206 from having the corresponding visual characteristic, as in FIG. 12A , through a plurality of intermediate (e.g., temporary) states to a final state in which a corresponding respective facial feature of face 1222 has the corresponding visual characteristic, as in FIG. 12B , where the corresponding visual characteristic of a respective facial feature in FIG. 12A is different from the corresponding visual characteristic of the counterpart respective facial feature in FIG. 12B (e.g., hair 1214 of face 1206 has a different fill color and/or shape than hair 1230 of face 1222 ).
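- One way to realize such a gradual transition is to interpolate each facial feature's visual characteristic between its start and end values as a progress parameter runs from 0 to 1. The sketch below illustrates the idea under that assumption; `FeatureState` is an invented name, and the midpoint color switch is a crude stand-in for a real crossfade.

```swift
import SwiftUI

// Sketch: a facial feature's visual state, interpolated through
// intermediate states toward a target state.
struct FeatureState {
    var fillColor: Color
    var scale: CGFloat

    func interpolated(toward target: FeatureState,
                      progress t: CGFloat) -> FeatureState {
        // A real renderer would crossfade colors; here we switch at the
        // midpoint as a placeholder, and linearly interpolate the scale.
        FeatureState(fillColor: t < 0.5 ? fillColor : target.fillColor,
                     scale: scale + (target.scale - scale) * t)
    }
}
```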
- FIG. 12C illustrates device 600 displaying, via display 602 , time user interface 1204 that includes face 1222 , where face 1222 in FIG. 12C is different from face 1222 in FIG. 12B (e.g., a different version of the same face).
- changing the appearance of time user interface 1204 includes changing a subset of the facial features of the displayed face without changing all of the facial features of the displayed face.
- While displaying face 1222 as in FIG. 12B , device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204 , device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIG. 12B and displaying face 1222 as in FIG. 12C . In FIG. 12C , the predetermined criteria for changing the appearance of time user interface 1204 (e.g., as shown in the transition of time user interface 1204 from face 1206 in FIG. 12A to face 1222 in FIG. 12B and the transition of time user interface 1204 from face 1222 in FIG. 12B to face 1222 in FIG. 12C ) includes a criterion that is satisfied when a predetermined time has elapsed (e.g., every minute; every 15 minutes; every 30 minutes; every hour).
- In some embodiments, the predetermined criteria for changing the appearance of time user interface 1204 does not include the criterion that is satisfied when the predetermined time has elapsed.
- device 600 changes the appearance of time user interface 1204 (e.g., changes one or more facial features of the respective face in time user interface 1204 ) randomly and not based on when the predetermined time has elapsed.
- face 1222 includes the same visual characteristics for eyes 1224 , mouth 1228 , facial outline 1232 , and neck 1234 as face 1222 of FIG. 12B .
- face 1222 includes different visual characteristics for nose 1226 , hair 1230 , and shoulders 1236 from face 1222 in FIG. 12B (e.g., nose 1226 has a different shape, and hair 1230 has a different fill color in FIG. 12C as compared to FIG. 12B ).
- ceasing display of face 1222 as in FIG. 12B and displaying (e.g., transitioning to) face 1222 as in FIG. 12C includes displaying a gradual transition from face 1222 in FIG. 12B to face 1222 in FIG. 12C that includes transitioning nose 1226 , hair 1230 , and shoulders 1236 from having their respective visual characteristic in FIG. 12B through a plurality of intermediate (e.g., temporary) states to a final state in which nose 1226 , hair 1230 , and shoulders 1236 have their respective visual characteristic in FIG. 12C .
- FIG. 12D illustrates device 600 displaying an animation (e.g., a blinking animation) using eyes 1224 , while displaying face 1222 .
- displaying the animation via eyes 1224 includes ceasing display of at least a portion of eyes 1224 , as shown in FIG. 12D , for a period of time (e.g., a brief moment; a fraction of a second; 1 second), then re-displaying the portion of eyes 1224 (e.g., as previously shown in FIG. 12C ) after the period of time has elapsed.
- the animation is a blinking animation of eyes 1224 that includes a temporary/brief movement or change in shape/form of eyes 1224 such that the first facial feature mimics the movement of a human eye blinking.
- device 600 periodically displays the animation via eyes 1224 based on time (e.g., every 1 second, every 10 seconds, every 15 seconds, every 30 seconds, every 1 minute; every 5 minutes; every 30 minutes; every hour).
- device 600 displays the animation via eyes 1224 non-periodically (e.g., not based on time; not in regular intervals; at random times; not based on a period change in time).
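- Both scheduling strategies (periodic and non-periodic blinking) can be sketched with a single timer helper, as below; the `scheduleBlink` API is an assumption, and the 30-second default interval is arbitrary.

```swift
import Foundation

// Sketch: schedule the blink animation either at a fixed interval
// (periodic) or after a random delay (non-periodic). In the
// non-periodic case the timer fires once and the caller reschedules.
func scheduleBlink(periodic: Bool,
                   interval: TimeInterval = 30,
                   blink: @escaping () -> Void) -> Timer {
    // Jitter the delay for the non-periodic case so blinks do not
    // occur at regular times.
    let delay = periodic ? interval
                         : TimeInterval.random(in: 1...(2 * interval))
    return Timer.scheduledTimer(withTimeInterval: delay,
                                repeats: periodic) { _ in
        blink()  // hide the eyes briefly, then re-display them
    }
}
```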
- While displaying time user interface 1204 including face 1222 as shown in FIGS. 12C-12D , device 600 detects (e.g., determines) the satisfaction of a second predetermined criteria (e.g., a type of input; a change in activity state of device 600 ) for changing an appearance of time user interface 1204 . In response to detecting the satisfaction of the second predetermined criteria for changing an appearance of time user interface 1204 , device 600 ceases display of face 1222 , as shown in FIGS. 12C-12D , and displays face 1222 as shown in FIG. 12E .
- device 600 is in a different state (e.g., a reduced-power state) from FIGS. 12A-12D , in which device 600 changes one or more visual features of a displayed user interface while in the different state (e.g., device 600 dims/darkens the background or reverts from using a respective color to fill in a respective element/region of the user interface to using the respective color as an outline color of the respective element/region of the user interface).
- eyes 1224 (e.g., still) indicate the current time.
- device 600 displays an animation via eyes 1224 (e.g., based on a change in the time or non-periodically).
- nose 1226 has a different visual characteristic than in FIGS. 12C-12D , where the different visual characteristic in FIG. 12E is a visually distinguished outline (e.g., borderline) for nose 1226 , and the visually distinguished outline has a respective color (e.g., line color) that is based on a respective color used to fill nose 1226 in FIGS. 12C-12D (e.g., device 600 applies the color or tone (or a color similar to the color or tone) of the fill color of nose 1226 in FIGS. 12C-12D to the line color of nose 1226 in FIG. 12E ).
- mouth 1228 , hair 1230 , facial outline 1232 , neck 1234 , and shoulders 1236 have different visual characteristics than in FIGS. 12C-12D , where the respective different visual characteristics in FIG. 12E are visually distinguished outlines that have respective colors (e.g., line colors) that are based on (e.g., correspond to) respective colors used to fill (e.g., used as fill colors) mouth 1228 , hair 1230 , facial outline 1232 , neck 1234 , and shoulders 1236 , respectively, in FIGS. 12C-12D (e.g., device 600 applies the color or tone (or a color similar to the color or tone) of the respective fill colors in FIGS. 12C-12D to the corresponding line colors in FIG. 12E ).
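- In other words, the reduced-power state swaps each feature's fill color into its outline color. A minimal sketch of that mapping follows; the `FeatureStyle` and `reducedPowerStyle(from:)` names are invented for illustration.

```swift
import SwiftUI

// Sketch: a feature's rendering style in normal and reduced-power states.
struct FeatureStyle {
    var fill: Color?    // nil when the feature is drawn as an outline only
    var stroke: Color?
}

func reducedPowerStyle(from normal: FeatureStyle) -> FeatureStyle {
    // Reuse the fill color (or a similar tone) as the line color and
    // leave the interior unfilled, darkening the overall interface.
    FeatureStyle(fill: nil, stroke: normal.fill ?? normal.stroke)
}
```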
- While displaying face 1222 as in FIGS. 12C-12D , device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204 , device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIGS. 12C-12D and displaying face 1222 as in FIG. 12F . In FIG. 12F , the predetermined criteria for changing the appearance of time user interface 1204 includes a criterion that is satisfied when a predefined movement (e.g., of device 600 ) has been detected. In some embodiments, device 600 is a wearable device (e.g., a smartwatch), and the predefined movement criteria corresponds to a wrist-raise movement while device 600 is being worn.
- face 1222 includes the same visual characteristics for eyes 1224 , hair 1230 , facial outline 1232 , neck 1234 , and shoulders 1236 as face 1222 of FIGS. 12C-12D .
- face 1222 includes different visual characteristics (e.g., different color and/or different shape) for nose 1226 and mouth 1228 as compared to face 1222 in FIGS. 12C-12D .
- While displaying face 1222 as in FIG. 12F , device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204 , device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIG. 12F and displaying face 1222 as in FIG. 12G .
- the predetermined criteria for changing the appearance of time user interface 1204 includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of device 600 has been detected (e.g., it is determined that device 600 has undergone a change in state).
- the change in state corresponds to device 600 transitioning to a sleep mode or sleeping state.
- the sleep mode or sleep state corresponds to a state in which the display generation component is off.
- the sleep mode or sleep state corresponds to a state in which device 600 is in a low-power state (e.g., in which display 602 is off).
- the change in state corresponds to device 600 transitioning from a locked state to an unlocked state.
- face 1222 includes the same visual characteristics for eyes 1224 , nose 1226 , mouth 1228 , hair 1230 , neck 1234 , and shoulders 1236 as face 1222 of FIG. 12F .
- face 1222 includes a different visual characteristic for facial outline 1232 from face 1222 in FIG. 12F (e.g., facial outline 1232 has a different fill color in FIG. 12G than in FIG. 12F ).
- face 1222 displayed in time user interface 1204 has a primary color scheme (e.g., a predominant color; a most-prevalent color).
- the primary color scheme corresponds to the color of the facial outline 1232 .
- the color of neck 1234 and/or the color of shoulders 1236 are based on the primary color scheme (e.g., neck 1234 is a slightly lighter shade of the color of facial outline 1232 or neck 1234 is a slightly darker shade of the color of facial outline 1232 , as indicated in FIG. 12G ).
- the color of the second facial feature has a predetermined relationship to the color of facial outline 1232 for a plurality of different types of faces (e.g., face 1206 ; face 1222 ) (e.g., the neck is a predetermined amount lighter than the face for a plurality of different types of faces or the neck is a predetermined amount darker than the face for a plurality of different types of faces).
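- This predetermined lighter/darker relationship can be sketched as a fixed brightness offset applied to the facial outline color, as below. The HSB representation and the 0.15 offset are assumptions for illustration, not values from the patent.

```swift
// Sketch: derive the neck (or shoulder) color from the facial outline
// color by a predetermined brightness offset, the same for every face.
struct HSBColor {
    var hue: Double        // 0...1
    var saturation: Double // 0...1
    var brightness: Double // 0...1

    func shifted(brightnessBy delta: Double) -> HSBColor {
        var copy = self
        copy.brightness = min(1, max(0, brightness + delta))
        return copy
    }
}

func neckColor(fromFaceColor face: HSBColor, lighter: Bool) -> HSBColor {
    // The same offset for every face gives the neck a predetermined
    // relationship to the facial outline color.
    face.shifted(brightnessBy: lighter ? 0.15 : -0.15)
}
```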
- FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying a user interface that includes an indication of a current time, in accordance with some embodiments.
- Method 1300 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component.
- Some operations in method 1300 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 1300 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system displays ( 1302 ), via the display generation component (e.g., 602 ), a time user interface (e.g., 1204 ) (e.g., a watch user interface that includes an indication of a current time) that includes a representation of a first face (e.g., 1206 or 1222 ) (e.g., a representation of a human face or a representation of an anthropomorphic face of a non-human character) having a first facial feature (e.g., 1208 , 1224 ) (e.g., eyes) and a second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the first facial feature of the first face indicates a current time (e.g., the current time; the time set in the systems setting of the computer system), and the second facial feature of the first face has a first visual characteristic (e.g., a first color; a first shape; a first position).
- Displaying the time user interface that includes the representation of the first face having the first facial feature and the second facial feature, where the first facial feature of the first face indicates a current time and the second facial feature of the first face has a first visual characteristic provides information about the current time while providing a user interface with features that do not relate to time, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by including time information in an animated user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system detects (e.g., determining) ( 1310 ) the satisfaction of a predetermined criteria for changing an appearance of the time user interface (e.g., 1204 ) (e.g., a change in the current time (e.g., a change in the hour of the current time, a change in the minute of the current time, a change in the second of the current time); a change in a state of the computer system due to a detected user input (e.g., a tap input on the display generation component) and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input; detecting a movement of the computer system (e.g., caused by a user movement, such as a wrist-raise movement); a change in state or a change in mode of the computer system (e.g., transitioning to a sleep mode or sleeping state; transitioning from a locked state to an unlocked state)).
- In response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface (e.g., 1204 ) ( 1318 ), the computer system (e.g., 600 ) ceases ( 1320 ) to display the representation of the first face (e.g., 1206 or 1222 ) and displays ( 1322 ) a representation of a second face (e.g., 1206 , 1222 ) having a first facial feature (e.g., 1208 or 1224 ) (e.g., eyes) and a second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the representation of the second face is different from the representation of the first face ( 1324 ), the first facial feature of the second face indicates a current time ( 1326 ), and the second facial feature of the second face has a second visual characteristic different from the first visual characteristic.
- the computer system displays or causes display of an animation via the first facial feature (e.g., blinking of the displayed time if the first facial feature represents eyes) based on a change in the time or non-periodically.
- Ceasing to display the representation of the first face and displaying the representation of the second face having the first facial feature and the second facial feature provides feedback to a user that a predetermined criteria for changing the appearance of the time user interface has been satisfied.
- Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the first face (e.g., 1206 or 1222 ) has the first visual characteristic and a first additional visual characteristic (e.g., if the first visual characteristic is a first line color, then a first fill color, a first shape, or a first position; if the first visual characteristic is a first fill color, then a first line color, a first shape, or a first position; if the first visual characteristic is a first shape, then a first line color, a first fill color, or a first position; if the first visual characteristic is a first position, then a first line color, a first fill color, or a first shape) different from the first visual characteristic.
- the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the second face (e.g., 1206 or 1222 ) has the second visual characteristic and a second additional visual characteristic (e.g., if the second visual characteristic is a second line color, then a second fill color, a second shape, or a second position; if the second visual characteristic is a second fill color, then a second line color, a second shape, or a second position; if the second visual characteristic is a second shape, then a second line color, a second fill color, or a second position; if the second visual characteristic is a second position, then a second line color, a second fill color, or a second shape) different from the second visual characteristic.
- ceasing display of the representation of the first face (e.g., 1206 , 1222 ) and displaying the representation of the second face (e.g., 1206 , 1222 ) includes displaying a gradual transition from the first face to the second face that includes (e.g., concurrently/simultaneously with transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the second facial feature has the second visual characteristic) transitioning the second facial feature of the first face from having the first additional visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the second facial feature has the second additional visual characteristic.
- Changing a plurality of facial features in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface provides visual feedback that the predetermined criteria for changing an appearance of the time user interface has been satisfied.
- Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first face (e.g., 1206 or 1222 ) has a third facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) (e.g., nose; mouth; hair; facial shape; neck; shoulders) different from the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the first face, wherein the third facial feature for the first face has a third visual characteristic (e.g., a third color (e.g., a third line color or a third fill color); a third shape; a third position).
- the second face (e.g., 1206 or 1222 ) has a third facial feature (e.g., nose; mouth; hair; facial shape; neck; shoulders) different from the second facial feature of the second face, wherein the third facial feature for the second face has a fourth visual characteristic (e.g., a fourth color (e.g., a fourth line color or a fourth fill color); a fourth shape; a fourth position) different from the third visual characteristic.
- ceasing display of the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the third facial feature of the first face from having the third visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the third facial feature has the fourth visual characteristic.
- the predetermined criteria for changing the appearance of the time user interface includes a criterion that is satisfied when a predetermined time has elapsed (e.g., every minute; every 15 minutes; every 30 minutes; every hour) ( 1312 ).
- the predetermined criteria for changing the appearance of the time user interface does not include the criterion that is satisfied when the predetermined time has elapsed.
- the computer system changes the appearance of the time user interface (e.g., changes one or more facial features of the respective face in the time user interface) randomly and not based on when the predetermined time has elapsed.
- the predetermined criteria includes a criterion that is satisfied when a predetermined time has elapsed, which provides visual feedback that the predetermined time has elapsed without requiring user input.
- Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the predetermined criteria for changing the appearance of the time user interface includes a criterion (e.g., a predefined movement criterion) that is satisfied when a predefined movement (e.g., of the computer system) has been detected (e.g., determined to have happened; resulting from a movement of the computer system (e.g., caused by a user of the computer system)) ( 1314 ).
- the computer system is a wearable device (e.g., a smartwatch), and the predefined movement criteria corresponds to a wrist-raise movement while the computer system is being worn.
- the predetermined criteria includes a criterion that is satisfied when a predefined movement (e.g., of the computer system) has been detected, which provides visual feedback that the predefined movement has been detected.
- Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the predetermined criteria for changing the appearance of the time user interface includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of the computer system (e.g., 600 ) has been detected (e.g., it is determined that the computer system has undergone a change in state) ( 1316 ).
- the change in state corresponds to the computer system transitioning to a sleep mode or sleeping state.
- the sleep mode or sleep state corresponds to a state in which the display generation component is off. In some embodiments, the sleep mode or sleep state corresponds to a state in which the computer system is in a low-power state (e.g., in which the display generation component is also off). In some embodiments, the change in state corresponds to the computer system transitioning from a locked state to an unlocked state.
- the predetermined criteria includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of the computer system has been detected, which provides visual feedback that a change in state of the computer system has been detected.
- Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the second face (e.g., 1206 or 1222 ) has the second visual characteristic that is a first color used to fill the second facial feature of the second face (e.g., a background color or base color used to visually fill out the second facial feature of the second face).
- while displaying the representation of the second face, the computer system (e.g., 600 ) detects (e.g., determining) the satisfaction of a second predetermined criteria (e.g., a type of input; a timeout of the computer system) for changing an appearance of the time user interface (e.g., 1204 ).
- in response to detecting the satisfaction of the second predetermined criteria for changing an appearance of the time user interface, the computer system (e.g., 600 ) ceases to display the representation of the second face and displays a representation of a third face having a first facial feature of the third face (e.g., eyes) and a second facial feature of the third face (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the representation of the third face is different from the representation of the second face, the first facial feature of the third face indicates a current time, and the second facial feature of the third face has a third visual characteristic (e.g., a second color (e.g., a second line color or a second fill color); a second shape) different from the second visual characteristic, wherein the third visual characteristic is a visually distinguished outline (e.g., borderline) for the second facial feature of the third face having a respective color that is based on (e.g., the same as; the same tone as; similar to) the first color used to fill the second facial feature of the second face.
- the computer system displays, via the first facial feature of the second face, an animation (e.g., a blinking animation) that includes ceasing display of at least a portion of the first facial feature of the second face for a period of time, and re-displaying the at least a portion of the first facial feature of the second face after the period of time has elapsed.
- the animation is a blinking animation of the first facial feature that includes a temporary/brief movement or change in shape/form of the first facial feature such that the first facial feature mimics the movement of a human eye blinking.
- the computer system periodically, based on time, (e.g., every 1 minute; every 5 minutes; every 30 minutes; every hour) displays the animation (e.g., blinking animation).
- Providing a blinking animation via the first facial feature (e.g., periodically, based on time) provides visual feedback about the change in time in an intuitive manner.
- Providing improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first facial feature (e.g., 1208 , 1224 ) is an indication of a current time and the animation is a blinking animation where the current time is animated to look like blinking eyes (e.g., the hour and minute indicators are compressed vertically and then expand vertically).
- displaying, via the first facial feature (e.g., 1208 , 1224 ) of the second face (e.g., 1206 , 1222 ), the animation includes non-periodically (e.g., not in regular intervals; at random times; not based on a period change in time) displaying, via the first facial feature of the second face, the animation.
- the second face (e.g., 1206 or 1222 ) (e.g., the main face portion of the second face) includes a primary color scheme (e.g., a predominant color; a most-prevalent color).
- the second visual characteristic for the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) (e.g., the neck; the neck and shoulder) of the second face is a second color that is based on (e.g., is the same as; is a similar tone as; is within a range of color variants of) the primary color scheme (e.g., the neck is a slightly lighter shade of the color of the face or the neck is a slightly darker shade of the color of the face) ( 1332 ).
- the color of the second facial feature has a predetermined relationship to the color of the first facial feature for a plurality of different faces (e.g., the neck is a predetermined amount lighter than the face for a plurality of faces or the neck is a predetermined amount darker than the face for a plurality of faces).
- the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the second face (e.g., 1206 or 1222 ) is selected from the group consisting of: hair, facial outline (e.g., including cheeks and/or jawline), nose, eyes, mouth (e.g., lips), neck, and shoulders ( 1334 ).
- the second visual characteristic for the second facial feature (e.g., 1210 , 1212 , 1214 , 1216 , 1218 , 1220 , 1226 , 1228 , 1230 , 1232 , 1234 , or 1236 ) of the first face (e.g., 1206 or 1222 ) is a third color, and the second visual characteristic for the second facial feature of the second face is a fourth color different from the third color.
- the fourth color is programmatically selected (e.g., determined), without user input, from a plurality of available colors by the computer system (e.g., 600 ) ( 1336 ).
- the application process selects (e.g., programmatically determines) the fourth color based on a color of the computer system (e.g., a color of a housing or case of the computer system). In some embodiments, the application process selects (e.g., programmatically determines) the fourth color based on usage history of a user of the computer system (e.g., based on a previous user-selected color or color scheme). Programmatically selecting, without user input, colors for facial features of a displayed face provides a diverse range of characteristics that are displayed via the time user interface without requiring user input to enable the diverse range of characteristics.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300 .
- a device can use as a watch user interface either a watch user interface as described in FIGS. 6A-6H or a time user interface as described in FIGS. 12A-12G .
- method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300 .
- a device can use as a watch user interface either a watch user interface as described in FIGS. 8A-8M or a time user interface as described in FIGS. 12A-12G .
- method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300 .
- a device can use as a watch user interface either a user interface with the indication of time and the graphical representation of a respective character as described in FIGS. 10A-10AC or a time user interface as described in FIGS. 12A-12G .
- method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300 .
- a device can use as a watch user interface either a user interface that includes a background as described in FIGS. 14A-14AD or a time user interface as described in FIGS. 12A-12G .
- method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300 .
- one or more characteristics or features of a time user interface as described with reference to FIGS. 12A-12G can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE .
- For brevity, these details are not repeated below.
- FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 15A-15F .
- FIG. 14A illustrates device 600 displaying, via display 602 , a first page (indicated by paging dot 1410 ) of an editing user interface 1406 for editing a respective user interface that includes content overlaid on the background.
- the respective user interface is available to be used as a watch user interface on device 600 (e.g., a watch face that includes an indication of time and one or more watch complications overlaid on the background).
- the user interface is a watch user interface, and the content includes an indication of the current time or current date.
- editing user interface 1406 includes a plurality of pages that can be navigated, where a respective page enables editing of a different feature of a user interface, as described in greater detail below.
- editing user interface 1406 includes a background 1408 for a respective user interface, where background 1408 comprises a plurality of stripes (e.g., graphical lines across the background in a vertical or horizontal direction) including a stripe 1408 A and a stripe 1408 B.
- Stripe 1408 A has a first visual characteristic (e.g., a first color; a first fill pattern) and stripe 1408 B has a second visual characteristic (e.g., a second color; a second fill pattern) different from the first visual characteristic.
- stripes 1408 A and 1408 B are arranged in a first visual pattern of stripes (e.g., a first type of alternating color pattern, such as a repeating 2-color pattern).
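- A background like this might be modeled as an ordered list of stripes, each carrying its own visual characteristic. The sketch below, with invented `Stripe` and `StripedBackground` types, is reused by the later sketches in this section.

```swift
import SwiftUI

// Sketch: one stripe of the background, here reduced to a color.
struct Stripe: Identifiable {
    let id = UUID()
    var color: Color
}

// Sketch: the striped background as an ordered list of stripes.
struct StripedBackground {
    var stripes: [Stripe]

    // The repeating two-color pattern of FIG. 14A, as an example.
    static func twoColorPattern(_ a: Color, _ b: Color,
                                count: Int) -> StripedBackground {
        StripedBackground(stripes: (0..<count).map {
            Stripe(color: $0 % 2 == 0 ? a : b)
        })
    }
}
```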
- device 600 receives (e.g., detects) an input 1401 for changing the current page of editing user interface 1406 .
- input 1401 includes a gesture (e.g., a horizontal swipe on display 602 in a first direction).
- device 600 displays a second page (indicated by paging dot 1412 ) of editing user interface 1406 , as shown in FIG. 14B , where second page 1412 of editing user interface 1406 can be used to change a number of stripes (e.g., increase the number of stripes; decrease the number of stripes) of background 1408 .
- device 600 receives (e.g., detects) an input 1403 directed to changing (e.g., increasing) the number of stripes of background 1408 , as shown in FIGS. 14B-14E .
- input 1403 is a rotational input in a first direction (e.g., clockwise; up) on rotatable input mechanism 603 shown in FIGS. 14B-14E .
- input 1403 is a touch input such as a swipe or pinch input.
- in response to (e.g., and while) receiving input 1403 , device 600 displays an increase in the number of stripes for background 1408 .
- the new stripes maintain the initial visual pattern of stripes 1408 A and 1408 B (e.g., maintain the initial alternating color pattern).
- device 600 in response to (e.g., and while) receiving input 1403 , includes stripe 1408 C in background 1408 (e.g., below stripe 1408 B), where stripe 1408 C moves onto display 602 from an edge (e.g., bottom edge) of display 602 .
- stripe 1408 C has a same visual characteristic (e.g., color; fill pattern) as stripe 1408 A .
- Device 600 decreases a size of displayed stripes (e.g., decreases the height or width) as a new stripe is added to background 1408 .
- in response to (e.g., and while) continuing to receive input 1403 , device 600 includes stripe 1408 D in background 1408 (e.g., below stripe 1408 C ), where stripe 1408 D moves onto display 602 from the same edge of display 602 as stripe 1408 C .
- stripe 1408 D has a same visual characteristic as stripe 1408 B (e.g., the same color and/or fill pattern as stripe 1408 B ).
- Device 600 automatically maintains the first visual pattern of stripes (e.g., alternating between two colors) as new stripes are added to background 1408 .
- Device 600 continues to decrease the size of displayed stripes as new stripes are added to background 1408 .
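- Continuing the `StripedBackground` sketch above, adding a stripe while maintaining the established pattern can be done by copying the color from one pattern period earlier, and stripe heights can be recomputed from the fixed background height. The `patternPeriod` parameter is an assumption.

```swift
import SwiftUI

extension StripedBackground {
    // Sketch: append a stripe that repeats the color from one full
    // pattern period above it, preserving the alternating pattern.
    mutating func addStripe(patternPeriod: Int = 2) {
        guard stripes.count >= patternPeriod else { return }
        let template = stripes[stripes.count - patternPeriod]
        stripes.append(Stripe(color: template.color))
    }

    // With a fixed background height, each stripe shrinks as stripes
    // are added and grows as they are removed.
    func stripeHeight(totalHeight: CGFloat) -> CGFloat {
        stripes.isEmpty ? 0 : totalHeight / CGFloat(stripes.count)
    }
}
```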
- device 600 continues receiving input 1403 and responds by increasing the number of stripes until twelve stripes 1408 A- 1408 L are included in background 1408 , while maintaining the first visual pattern, as shown in FIG. 14F .
- FIG. 14F illustrates device 600 displaying, in second page 1412 of editing user interface 1406 , background 1408 with stripes 1408 A- 1408 L arranged in the first visual pattern of stripes. While displaying second page 1412 of editing user interface 1406 with background 1408 having stripes 1408 A- 1408 L arranged in the first visual pattern of stripes, device 600 receives (e.g., detects) an input 1405 directed to changing (e.g., decreasing) the number of stripes of background 1408 , as shown in FIG. 14F .
- input 1405 has a direction (e.g., counter-clockwise; down) that is opposite of a direction of input 1403 .
- input 1405 is a rotational input on rotatable input mechanism 603 in a direction opposite the direction of input 1403 .
- input 1405 is a touch input such as a swipe or pinch input.
- In response to receiving input 1405 , device 600 displays, in editing user interface 1406 , a decrease in the number of stripes for background 1408 , where existing stripes move off of display 602 at the edge of display 602 (e.g., at the bottom of display 602 ). Device 600 increases the size of remaining stripes (e.g., increases the height or width) as a stripe is removed from background 1408 .
- device 600 in response to receiving input 1405 , device 600 displays background 1408 with eight stripes 1408 A- 1408 H, where stripes 1408 A- 1408 H maintain the first visual pattern of stripes as in FIG. 14F .
- device 600 receives (e.g., detects) an input 1407 directed to selecting stripe 1408 D.
- input 1407 includes a tap input on stripe 1408 D.
- input 1407 includes a tap-and-hold input on stripe 1408 D.
- device 600 in response to receiving input 1407 , changes the current page in editing user interface 1406 to a third page (indicated by paging dot 1414 ) of editing user interface 1406 , as shown in FIG. 14H .
- Third page 1414 provides an editing mode for changing a visual characteristic, such as a color, of the selected stripe.
- In response to receiving input 1407 (e.g., and while displaying editing user interface 1406 in third page 1414 ), device 600 displays a visual indicator 1416 (e.g., a box) indicating that stripe 1408 D has been selected (via input 1407 ).
- visual indicator 1416 includes an indication 1418 of a current visual characteristic (e.g., the color) applied to the selected stripe.
- device 600 receives (e.g., detects) an input 1409 directed to changing the current visual characteristic applied to stripe 1408 D.
- input 1409 is a rotational input on rotatable input mechanism 603 shown in FIG. 14H .
- input 1409 is a touch input such as a swipe or pinch input.
- device 600 navigates (e.g., scrolls) through a plurality of selectable visual characteristics (e.g., selectable colors). While the selectable visual characteristics are being navigated, different selectable visual characteristics are applied to stripe 1408 D and indicated via indication 1418 of visual indicator 1416 (e.g., the color of stripe 1408 D and indication 1418 are updated during navigation to reflect the currently-selected visual characteristic).
- device 600 in response to (e.g., and while) receiving input 1409 , changes the respective visual characteristic applied to stripe 1408 D to a third visual characteristic (e.g., a third color; a third fill pattern) different from the second visual characteristic and indicates, via indication 1418 of visual indicator 1416 , that the third visual characteristic is the currently-selected visual characteristic.
- device 600 continues detecting input 1409 directed to changing the current visual characteristic applied to stripe 1408 D until device 600 changes the respective visual characteristic applied to stripe 1408 D to a fourth visual characteristic (e.g., a fourth color; a fourth fill pattern), different from the second visual characteristic and the third visual characteristic, and indicates, via indication 1418 of visual indicator 1416 , that the fourth visual characteristic is the currently-selected visual characteristic, as shown in FIG. 14J .
- device 600 receives (e.g., detects) an input 1411 .
- Input 1411 is first detected at a location on display 602 corresponding to stripe 1408 D and is moved towards a location on display 602 corresponding to stripe 1408 G , where stripe 1408 G has a different visual characteristic from stripe 1408 D .
- input 1411 is a touch-and-drag input from stripe 1408 D to stripe 1408 G.
- In response to detecting input 1411 , device 600 displays stripe 1408 G with the visual characteristic of stripe 1408 D (e.g., the visual characteristic from stripe 1408 D is applied to stripe 1408 G ), as shown in FIG. 14K , and moves visual indicator 1416 to stripe 1408 G from stripe 1408 D . As shown in FIG. 14K , visual indicator 1416 indicates that stripe 1408 G has been selected (via input 1411 ) and indication 1418 indicates the visual characteristic of stripe 1408 D has been applied to stripe 1408 G .
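- Continuing the `StripedBackground` sketch above, the drag behavior just described amounts to copying the source stripe's color onto the destination stripe and moving the selection there. The `copyColor(from:to:)` name is illustrative.

```swift
import SwiftUI

extension StripedBackground {
    // Sketch: apply the source stripe's color to the destination
    // stripe and return the new selection index so the visual
    // indicator (e.g., box 1416) can follow the drag.
    @discardableResult
    mutating func copyColor(from sourceIndex: Int,
                            to destinationIndex: Int) -> Int? {
        guard stripes.indices.contains(sourceIndex),
              stripes.indices.contains(destinationIndex) else { return nil }
        stripes[destinationIndex].color = stripes[sourceIndex].color
        return destinationIndex
    }
}
```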
- device 600 receives (e.g., detects) an input 1413 directed to returning editing user interface 1406 to second page 1412 (e.g., the editing mode for changing the number of stripes in the background).
- input 1413 includes a gesture (e.g., a horizontal swipe on display 602 in a direction opposite a direction of input 1407 ).
- device 600 displays second page 1412 of editing user interface 1406 , as shown in FIG. 14L .
- background 1408 includes stripes 1408 A- 1408 H , where stripes 1408 A- 1408 H form a second visual pattern of stripes (e.g., a repeating eight-stripe pattern, where stripes 1408 A , 1408 C , and 1408 E have the first visual characteristic (e.g., the first color; the first fill pattern), stripes 1408 B , 1408 F , and 1408 H have the second visual characteristic (e.g., the second color; the second fill pattern), and stripes 1408 D and 1408 G have the fourth visual characteristic (e.g., the fourth color; the fourth fill pattern)).
- device 600 receives (e.g., detects) an input 1415 directed to changing (e.g., decreasing) the number of stripes in background 1408 .
- input 1415 is a rotational input on rotatable input mechanism 603 shown in FIG. 14L .
- input 1415 is a touch input such as a swipe or pinch input.
- In response to receiving (e.g., detecting) input 1415 directed to decreasing the number of stripes of background 1408 , where input 1415 is in the second direction (e.g., a counter-clockwise direction; a down direction), device 600 displays a decrease in the number of stripes for background 1408 .
- Existing stripes move off of display 602 at the edge of display 602 (e.g., at the bottom of display 602 ).
- Device 600 increases the size of remaining stripes (e.g., increases the height or width) as a stripe is removed from background 1408 .
- In response to (e.g., after) receiving input 1415 , device 600 displays background 1408 with four remaining stripes 1408 A- 1408 D , as shown in FIG. 14M , as stripes 1408 E- 1408 H have been removed from background 1408 by input 1415 .
- background 1408 includes stripes 1408 A- 1408 D , where stripes 1408 A- 1408 D are arranged in a third visual pattern of stripes (e.g., a third type of alternating color pattern, such as a repeating 4-stripe pattern), where stripe 1408 A and stripe 1408 C have the first visual characteristic (e.g., the first color; the first fill pattern), stripe 1408 B has the second visual characteristic, and stripe 1408 D has the fourth visual characteristic.
- device 600 receives (e.g., detects) an input 1417 directed to changing (e.g., increasing) the number of stripes in background 1408 .
- input 1417 is a rotational input on rotatable input mechanism 603 shown in FIG. 14M .
- input 1417 is a touch input such as a swipe or pinch input.
- In response to receiving input 1417, where input 1417 is in the first direction (e.g., a clockwise direction; an up direction), device 600 displays an increase in the number of stripes for background 1408, where stripes are moved onto display 602 from the edge of display 602 (e.g., at the bottom of display 602 ). Device 600 decreases the size of stripes (e.g., decreases the height or width) as a stripe is added to background 1408.
- In response to (e.g., after) receiving input 1417, device 600 displays background 1408 in editing user interface 1406 with eight stripes 1408 A- 1408 H, as shown in FIG. 14N, where stripes 1408 A- 1408 H have the second visual pattern of stripes as first described above with reference to FIG. 14L (e.g., instead of maintaining the four-stripe visual pattern of stripes shown in FIG. 14M ).
- In response to receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M ) after receiving an input directed to decreasing the number of stripes (e.g., input 1415 of FIG. 14L ), device 600 maintains the visual pattern of stripes from when the input directed to decreasing the number of stripes (e.g., input 1415 of FIG. 14L ) was first detected. In some embodiments, in accordance with detecting one or more inputs (e.g., input 1415 in FIG. 14L , then input 1417 in FIG. 14M ) directed to decreasing, then increasing, the number of stripes, device 600 maintains the visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L ) from prior to the one or more inputs being received.
- In response to receiving an input directed to decreasing the number of stripes (e.g., input 1415 in FIG. 14L ), and subsequently receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M ), device 600 re-displays stripes (e.g., stripes 1408 E- 1408 H) in the background to include the same visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L ) from prior to the inputs being received if no other inputs are received by device 600 between receiving the two respective inputs (e.g., between receiving input 1415 and input 1417 ). For example, if there were no intervening operations received by device 600 between displaying background 1408 with the second visual pattern of stripes as in FIG. 14L and receiving input 1417, device 600 re-displays stripes (e.g., stripes 1408 E- 1408 H) in background 1408 to include the same visual pattern of stripes.
- In accordance with receiving an input directed to decreasing the number of stripes (e.g., input 1415 in FIG. 14L ), and subsequently receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M ), device 600 does not re-display stripes (e.g., stripes 1408 E- 1408 H) in the background to include the same visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L ) if an input directed to performing an operation is received by device 600 between receiving the two respective inputs.
- performing the operation includes displaying a user interface different from editing user interface 1406 .
- performing the operation includes editing a different aspect/feature of background 1408 (e.g., in a different page of editing user interface 1406 ) than changing the number of stripes of background 1408 (e.g., editing features of a watch face, such as watch face style or watch complications).
- Instead, device 600 displays stripes 1408 E- 1408 H by extending the third visual pattern of stripes 1408 A- 1408 D as in FIG. 14M (when the number of stripes is decreased) to stripes 1408 A- 1408 H (when the number of stripes is increased).
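- The "pattern memory" behavior above (FIGS. 14L-14N) could be sketched as follows; this is a hypothetical model, not the disclosed implementation, and StripeEditor, remembered, and performOtherOperation are invented names:

```swift
// Hypothetical sketch of remembering removed stripes' visual characteristics.
struct StripeEditor {
    var stripes: [String]          // current stripe colors
    var remembered: [String] = []  // snapshot taken when a decrease begins

    mutating func decrease(to n: Int) {
        if stripes.count > remembered.count { remembered = stripes }
        stripes = Array(stripes.prefix(n))
    }

    mutating func increase(to n: Int) {
        if n <= remembered.count {
            // No intervening operation: restore the removed stripes' colors.
            stripes = Array(remembered.prefix(n))
        } else {
            // Nothing (or not enough) remembered: extend the current pattern.
            let period = max(stripes.count, 1)
            while stripes.count < n { stripes.append(stripes[stripes.count % period]) }
        }
    }

    // An intervening operation (e.g., showing a different page or a different
    // user interface) invalidates the memory, matching the behavior above.
    mutating func performOtherOperation() { remembered = stripes }
}

var editor = StripeEditor(stripes: ["w", "b", "w", "r", "w", "b", "w", "r"])
editor.decrease(to: 4)  // as with input 1415 (FIG. 14M)
editor.increase(to: 8)  // as with input 1417 (FIG. 14N)
print(editor.stripes == ["w", "b", "w", "r", "w", "b", "w", "r"])  // true
```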
- device 600 receives (e.g., detects) an input 1419 directed to changing the current page of editing user interface 1406 to a fourth page (indicated by paging dot 1420 ) (e.g., an editing mode for rotating the background).
- input 1419 includes a gesture (e.g., a horizontal swipe on display 602 ).
- device 600 displays fourth page 1420 of editing user interface 1406 , as shown in FIG. 14O .
- while displaying fourth page 1420, device 600 receives (e.g., detects) an input 1421 directed to rotating background 1408.
- input 1421 is a rotational input on rotatable input mechanism 603 shown in FIGS. 14O-14P .
- input 1421 is a touch input such as a swipe, twist, or pinch input.
- device 600 rotates stripes 1408 A- 1408 H of background 1408 in accordance with input 1421 (e.g., background 1408 is rotated with the center of display 602 as the axis point for rotation).
- if input 1421 is a rotational input in a clockwise direction, stripes 1408 A- 1408 H of background 1408 are rotated in the clockwise direction.
- if input 1421 is a rotational input in a counter-clockwise direction, stripes 1408 A- 1408 H of background 1408 are rotated in the counter-clockwise direction.
- stripes 1408 A- 1408 H of background 1408 maintain a straight shape while being rotated, as shown in FIG. 14P .
- rotating background 1408 includes rotating background 1408 by predefined rotational increments (e.g., by 10 degree increments; by 15 degree increments; by 30 degree increments) with respect to a rotational axis point (e.g., the center of display 602 ).
- rotating background 1408 includes changing (e.g., increasing; decreasing) a characteristic (e.g., thickness; size; area) of stripes 1408 A- 1408 H of background 1408 as the background is being rotated in accordance with the input directed to rotating the stripes (e.g., input 1421 ).
- In response to (e.g., after) detecting input 1421, device 600 displays stripes 1408 A- 1408 H of background 1408 rotated from a horizontal orientation, as in FIG. 14P , to a vertical orientation, as in FIG. 14Q .
- stripes 1408 A- 1408 H can be rotated to an intermediate angle between the horizontal and vertical orientations (e.g., by 1, 2, 5, 10, 15, or 30 degree increments).
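- One way to realize rotation by predefined increments (the disclosure lists several candidate increment sizes but no algorithm) is to round the raw angle to the nearest increment; snappedAngle below is a hypothetical name:

```swift
// Hypothetical sketch: snap a crown-driven angle to a predefined increment.
func snappedAngle(rawDegrees: Double, incrementDegrees: Double = 15) -> Double {
    let snapped = (rawDegrees / incrementDegrees).rounded() * incrementDegrees
    let wrapped = snapped.truncatingRemainder(dividingBy: 360)
    return wrapped < 0 ? wrapped + 360 : wrapped  // normalize into [0, 360)
}

print(snappedAngle(rawDegrees: 41))  // 45.0
print(snappedAngle(rawDegrees: 97))  // 90.0
```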
- device 600 receives (e.g., detects) an input 1423 directed to exiting editing user interface 1406 .
- input 1423 is directed to rotatable input mechanism 603 (e.g., a press input or a press-and-hold input at rotatable input mechanism 603 ), as in FIG. 14Q .
- input 1423 is a touch input (e.g., a tap-and-hold input) on display 602 .
- device 600 displays a user interface 1422 (e.g., a watch user interface) that includes background 1408 with stripes 1408 A- 1408 H as the background of the user interface.
- user interface 1422 is a watch user interface that includes background 1408 with stripes 1408 A- 1408 H as the background of the watch user interface and an indication of time 1424 overlaid on background 1408 .
- electronic device 600 detects user input 1426 (e.g., a tap and hold gesture) on user interface 1422 .
- electronic device 600 displays user interface 1428 , as shown at FIG. 14S .
- user interface 1428 includes representation 1430 of background 1408 , watch user interface type indicator 1432 (e.g., “Stripes”), share affordance 1434 , and edit affordance 1436 .
- Representation 1430 of background 1408 includes stripes 1408 A- 1408 H arranged in the vertical orientation and/or having a fourth visual pattern.
- electronic device 600 is configured to display representations of different backgrounds for user interface 1422 and/or representations of additional user interfaces (e.g., different from user interface 1422 ) in response to detecting rotational input on rotatable input mechanism 603 .
- electronic device 600 detects user input 1438 (e.g., a tap gesture) corresponding to selection of edit affordance 1436 .
- electronic device 600 displays editing user interface 1440 (e.g., a modified version of editing user interface 1406 ), at FIG. 14T .
- a first page of editing user interface 1440 includes representation 1430 of background 1408 , first editing feature indicator 1442 (e.g., “Style”), second editing feature indicator 1444 (e.g., “Color”), and first style indicator 1446 (e.g., “Full Screen”).
- Representation 1430 of background 1408 includes stripes 1408 A- 1408 H in the vertical orientation and/or having the fourth visual pattern.
- First editing feature indicator 1442 corresponds to a currently selected editing feature for background 1408 (e.g., “Style”), as indicated by first editing feature indicator 1442 being centered on display 602 and above representation 1430 .
- the currently selected editing feature relates to a format of a border (e.g., a shape of the border) in which background 1408 will be displayed on user interface 1422 .
- First style indicator 1446 provides a first option for the currently selected editing feature and indicates the option as full screen (e.g., a border having a rectangular shape).
- electronic device 600 displays background 1408 in a full screen mode on display 602 (e.g., background 1408 occupies all or substantially all of display 602 and is displayed within a border having a shape of display 602 , such as a rectangular shape or a square shape).
- electronic device 600 detects rotational input 1448 on rotatable input mechanism 603 .
- electronic device 600 displays the first page of editing user interface 1440 with representation 1450 and second style indicator 1452 (e.g., “Circle”), as shown at FIG. 14U .
- Second style indicator 1452 corresponds to a second option for the currently selected editing feature and indicates the option as a circular mask (e.g., displaying background 1408 within a border having a circular shape). In some embodiments, the circular mask does not occupy the full screen of display 602 .
- In response to detecting selection of the circular mask option (e.g., via a tap gesture or press gesture on rotatable input mechanism 603 ), electronic device 600 displays background 1408 within a circular shaped border on a portion of display 602.
- representation 1450 of background 1408 maintains the vertical orientation of stripes 1408 A- 1408 H in the circular shaped border.
- in response to detecting rotational input 1448, electronic device 600 adjusts a size (e.g., a width and/or a thickness) of stripes 1408 A- 1408 H displayed in representation 1450 of background 1408 (as compared to representation 1430 ) to enable stripes 1408 A- 1408 H to fit within the circular shaped border of representation 1450.
- electronic device 600 reduces the size (e.g., the width and/or the thickness) of stripes 1408 A- 1408 H displayed in representation 1450 (as compared to representation 1430 ) because the circular shaped border of representation 1450 includes a smaller width than the rectangular border of representation 1430 .
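- A minimal sketch of this border-dependent sizing (assuming, hypothetically, that stripe thickness is simply the border's horizontal extent divided by the stripe count) follows; BorderShape and stripeThickness are invented names and the dimensions are illustrative only:

```swift
// Hypothetical sketch: the same number of stripes fits any border, so a
// narrower border (the circle) yields thinner stripes.
enum BorderShape {
    case fullScreen(width: Double, height: Double)
    case circle(diameter: Double)
}

func stripeThickness(for shape: BorderShape, stripeCount: Int) -> Double {
    guard stripeCount > 0 else { return 0 }
    switch shape {
    case .fullScreen(let width, _):
        return width / Double(stripeCount)   // vertical stripes span the width
    case .circle(let diameter):
        return diameter / Double(stripeCount)
    }
}

// e.g., eight vertical stripes:
// stripeThickness(for: .fullScreen(width: 160, height: 200), stripeCount: 8) == 20.0
// stripeThickness(for: .circle(diameter: 140), stripeCount: 8) == 17.5
```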
- electronic device 600 detects user input 1454 (e.g., a swipe gesture) on editing user interface 1440.
- electronic device 600 displays a second page of editing user interface 1440 for editing a second feature of background 1408 , as shown at FIG. 14V .
- electronic device 600 displays the second page of editing user interface 1440 for editing the second feature of background 1408, as indicated by second editing feature indicator 1444 being centered on display 602 above representation 1450.
- electronic device 600 displays third editing feature indicator 1456 (e.g., “Position”) in response to detecting user input 1454 (e.g., electronic device 600 translates first editing feature indicator 1442 , second editing feature indicator 1444 , and third editing feature indicator 1456 in a direction associated with movement of user input 1454 ).
- the second page of editing user interface 1440 corresponds to an ability to adjust a color of one or more stripes 1408 A- 1408 H of background 1408 .
- electronic device 600 displays indication 1416 around stripe 1408 A indicating that stripe 1408 A is selected for editing.
- electronic device 600 displays indication 1418 indicating a current color of stripe 1408 A that is selected for editing (e.g., “White”).
- electronic device 600 adjusts the color of stripe 1408 A in response to detecting rotational input on rotatable input mechanism 603.
- the second page of editing user interface 1440 includes color selection element 1458 , which includes indicators 1458 A- 1458 D corresponding to different colors that may be designated to stripe 1408 A (or another selected stripe 1408 B- 1408 H).
- Electronic device 600 is configured to adjust and/or change a position of indicator 1416 from stripe 1408 A to one of stripes 1408 B- 1408 H in response to detecting a tap gesture on one of stripes 1408 B- 1408 H.
- representation 1450 of background 1408 is rotated when compared to representation 1450 of FIG. 14U so that stripes 1408 A- 1408 H are in a horizontal orientation (e.g., stripes 1408 A- 1408 H extend between the left and right sides of display 602 ).
- displaying representation 1450 such that stripes 1408 A- 1408 H are in the horizontal orientation facilitates a user's ability to accurately select a particular stripe.
- electronic device 600 detects user input 1460 (e.g., a swipe gesture) on editing user interface 1440 .
- electronic device 600 displays a third page of editing user interface 1440 , as shown at FIG. 14W .
- the third page of editing user interface 1440 enables adjustment of an angle and/or position of background 1408 , and thus the angle and/or position of stripes 1408 A- 1408 H of background 1408 .
- electronic device 600 displays third editing feature indicator 1456 as centered on display 602 above representation 1450 to indicate that the third page of editing user interface 1440 enables adjustment of the position of background 1408.
- electronic device 600 displays fourth editing feature indicator 1462 (e.g., “Complications”) in response to detecting user input 1460 (e.g., electronic device 600 translates first editing feature indicator 1442 , second editing feature indicator 1444 , third editing feature indicator 1456 , and fourth editing feature indicator 1462 in a direction associated with movement of user input 1460 ).
- electronic device 600 rotates representation 1450 of background 1408 back to the orientation (e.g., a vertical orientation) of background 1408 prior to displaying the second page (e.g., for editing color) of editing user interface 1440 .
- background 1408 is returned to the previous orientation because the second page of editing user interface 1440 for adjusting colors of stripes 1408 A- 1408 H is no longer displayed (e.g., electronic device 600 does not detect and/or respond to user inputs on individual stripes 1408 A- 1408 H when the second page of editing user interface 1440 is not displayed).
- the third page of editing user interface 1440 enables adjustment of an angle and/or position of background 1408 .
- the third page of editing user interface 1440 includes rotation indicator 1464 that provides a visual indication of an angle of background 1408 with respect to a rotational axis (e.g., the center of display 602 ).
- electronic device 600 detects rotational input 1466 on rotatable input mechanism 603.
- electronic device 600 rotates representation 1450 with respect to the rotational axis, as shown at FIG. 14X .
- electronic device 600 updates rotation indicator 1464 to provide a visual indication of the new angle of background 1408 with respect to the rotational axis (e.g., 45 degrees). While electronic device 600 displays representation 1450 with stripes 1408 A- 1408 H at an angle of 45 degrees with respect to the rotational axis, electronic device 600 can rotate representation 1450 of background 1408 to any suitable angle (e.g., any angle from 0 degrees to 360 degrees) with respect to the rotational axis. In some embodiments, electronic device 600 rotates representation 1450 to a particular angle in accordance with a detected amount of movement associated with rotational input 1466 (e.g., an amount of rotation of representation 1450 is based on an amount of detected movement or rotation associated with rotational input 1466 ).
- electronic device 600 can continuously rotate representation 1450 while continuing to detect rotational input 1466 (e.g., the angle of rotation is selectable by a continuous input, such as continuous rotation of rotatable input mechanism 603 ).
- representation 1450 corresponds to background 1408 being displayed within a border that includes a circular shape.
- electronic device 600 forgoes adjustment of a size (e.g., a thickness and/or a width) of stripes 1408 A- 1408 H of representation 1450 because representation 1450 includes the circular border (e.g., rotating representation 1450 does not cause the lengths or widths of stripes 1408 A- 1408 H to change because the diameter of the circular border remains constant).
- electronic device 600 adjusts the size (e.g., thickness and/or width) of stripes 1408 A- 1408 H in response to rotational input 1466 when background 1408 is displayed within a non-circular border.
- electronic device 600 detects user input 1468 (e.g., two swipe gestures) on editing user interface 1440 .
- electronic device 600 displays the first page of editing user interface 1440 for adjusting the shape of the border in which background 1408 is displayed, as shown at FIG. 14Y .
- electronic device 600 displays representation 1450 with the updated position (e.g., an angle of 45 degrees) caused by rotational input 1466 (e.g., because the second page of editing user interface 1440 is not displayed).
- electronic device 600 detects rotational input 1470 on rotatable input mechanism 603 .
- electronic device 600 displays the first page of editing user interface 1440 with representation 1430 of background 1408 in the rectangular shaped border, as shown at FIG. 14Z .
- representation 1430 maintains the angle of representation 1450 caused by rotational input 1466 (e.g., an angle of 45 degrees).
- electronic device 600 adjusts a size (e.g., thickness and/or width) of stripes 1408 A- 1408 H of representation 1430 when compared to representation 1450 at FIG. 14Y .
- Electronic device 600 adjusts the size (e.g., length, thickness, and/or width) of stripes 1408 A- 1408 H to occupy the entire area defined by the rectangular shaped border, while maintaining the same number of stripes (e.g., and the same width for each stripe).
- the width of the stripes varies with the dimension of background 1408 in the direction perpendicular to the length of the stripes (e.g., the stripes are wider when oriented horizontally than when oriented vertically because the vertical dimension of display 602 is larger than the horizontal dimension of display 602 , and vice versa).
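- One way to formalize this relationship (not stated in the disclosure): for stripes at angle θ from horizontal inside a w × h rectangular border, the border's extent perpendicular to the stripes is w|sin θ| + h|cos θ|, so n equal stripes have thickness

```latex
t(\theta) = \frac{w\,\lvert\sin\theta\rvert + h\,\lvert\cos\theta\rvert}{n},
\qquad t(0^{\circ}) = \frac{h}{n}, \qquad t(90^{\circ}) = \frac{w}{n}.
```

- Since display 602 is taller than it is wide (h > w), horizontal stripes (θ = 0°) come out wider than vertical ones (θ = 90°), matching the statement above.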
- electronic device 600 detects user input 1472 (e.g., two successive swipe gestures) on editing user interface 1440 .
- electronic device 600 displays the third page of editing user interface 1440 for adjusting the position of representation 1430 , as shown at FIG. 14AA .
- electronic device 600 detects rotational input 1474 on rotatable input mechanism 603 .
- electronic device 600 rotates representation 1430 (e.g., about the rotational axis) in accordance with an amount of movement and/or a direction of rotational input 1474 , as shown at FIG. 14AB .
- electronic device 600 displays representation 1430 with stripes 1408 A- 1408 H at an angle of 60 degrees (e.g., relative to horizontal), as indicated by rotation indicator 1464 .
- electronic device 600 reduces a size (e.g., thickness and/or width) of stripes 1408 A- 1408 H in addition to rotating stripes 1408 A- 1408 H about the rotational axis. For example, in response to rotating stripes 1408 A- 1408 H from an angle of 45 degrees to an angle of 60 degrees, electronic device 600 varies the lengths of stripes 1408 A- 1408 H as needed to fit within rectangular border of representation 1430 .
- Electronic device 600 also reduces the size (e.g., thickness and/or width) of stripes 1408 A- 1408 H in order to maintain the same number of stripes 1408 A- 1408 H (e.g., each with the same width) within the rectangular border of representation 1430 .
- electronic device 600 adjusts the size of stripes 1408 A- 1408 H based on a detected amount of movement associated with rotational input 1474 . For example, while electronic device 600 detects rotational input 1474 (e.g., continuously detects rotational input 1474 ), electronic device 600 gradually and/or continuously adjusts the size of stripes 1408 A- 1408 H in response to continuing to detect rotational input 1474 .
- electronic device 600 adjusts the size of stripes 1408 A- 1408 H based on a direction of rotational input 1474 (e.g., clockwise or counter-clockwise). For example, in response to detecting that rotational input 1474 is in a first direction, electronic device 600 reduces the size of stripes 1408 A- 1408 H and, in response to detecting that rotational input 1474 is in a second direction, different from the first direction, electronic device 600 increases the size of stripes 1408 A- 1408 H.
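- A hypothetical sketch of this continuous, direction-sensitive resizing, reusing the t(θ) expression above (RotatableStripes and crownDidRotate are invented names; the dimensions are illustrative):

```swift
import Foundation  // sin, cos

struct RotatableStripes {
    var angleDegrees = 45.0
    let width = 160.0, height = 200.0  // illustrative border dimensions
    let count = 8

    // Border extent perpendicular to the stripes, divided among the stripes.
    var thickness: Double {
        let r = angleDegrees * .pi / 180
        return (width * abs(sin(r)) + height * abs(cos(r))) / Double(count)
    }

    // Positive deltas (e.g., clockwise crown motion) increase the angle,
    // negative deltas decrease it; thickness follows the angle continuously.
    mutating func crownDidRotate(byDegrees delta: Double) {
        angleDegrees = (angleDegrees + delta).truncatingRemainder(dividingBy: 360)
    }
}

var stripes = RotatableStripes()
let before = stripes.thickness         // about 31.8 points at 45 degrees
stripes.crownDidRotate(byDegrees: 15)  // 45 -> 60 degrees, as in FIG. 14AB
print(stripes.thickness < before)      // true: the stripes got thinner
```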
- electronic device 600 detects user input 1476 (e.g., a swipe input) on editing user interface 1440 .
- electronic device 600 displays the second page of editing user interface 1440, as shown at FIG. 14AC .
- the second page of editing user interface 1440 enables adjustment of colors of stripes 1408 A- 1408 H.
- Electronic device 600 detects user input (e.g., a tap gesture) on a respective stripe in order to enable adjustment of the color of the respective stripe.
- electronic device 600 rotates representation 1430 when transitioning from the third page of editing user interface 1440 (shown at FIG. 14AB ) to the second page of editing user interface 1440 (shown at FIG. 14AC ).
- electronic device 600 rotates representation 1430 when transitioning from any page of editing user interface 1440 to the second page of editing user interface 1440 .
- electronic device 600 rotates representation 1430 to include the horizontal orientation of stripes 1408 A- 1408 H.
- when representation 1430 includes the horizontal orientation when displayed in the first page and/or the third page of editing user interface 1440, electronic device 600 maintains display of representation 1430 in the horizontal orientation when transitioning to the second page of editing user interface 1440.
- the horizontal orientation of representation 1430 can facilitate a user's ability to select a particular stripe of stripes 1408 A- 1408 H by providing uniform targets for a user to select (e.g., via a tap gesture). As such, displaying representation 1430 in the horizontal orientation when electronic device 600 displays the second page of editing user interface 1440 can improve a user's ability to select stripes 1408 A- 1408 H and adjust a particular stripe to a desired color.
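- A one-line sketch of this page-dependent orientation rule (EditingPage and displayAngle are hypothetical names; the disclosure describes the behavior, not the code):

```swift
// Hypothetical sketch: the color page always presents the stripes
// horizontally so each stripe is a full-width, equally sized tap target.
enum EditingPage { case style, color, position, complications }

func displayAngle(for page: EditingPage, userChosenAngle: Double) -> Double {
    page == .color ? 0 : userChosenAngle  // other pages keep the user's angle
}

// displayAngle(for: .color, userChosenAngle: 45) == 0     // horizontal
// displayAngle(for: .position, userChosenAngle: 45) == 45 // restored on leaving
```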
- FIG. 14AD illustrates examples of user interface 1422 after electronic device 600 ceases to display editing user interface 1440 .
- FIG. 14AD includes first representation 1478 of user interface 1422 and second representation 1480 of user interface 1422 with background 1408 displayed within a circular border. Additionally, FIG. 14AD shows third representation 1482 of user interface 1422 and fourth representation 1484 of user interface 1422 with background 1408 displayed within a rectangular border (e.g., a full screen border that includes the shape of display 602 ).
- First representation 1478 and second representation 1480 include complications 1486 , 1488 , 1490 , and 1492 positioned in corners of display 602 and outside of background 1408 .
- Complications 1486 , 1488 , 1490 , and 1492 may be selected and/or edited via user input in a fourth page of editing user interface 1440 .
- third representation 1482 and fourth representation 1484 include complications 1494 and 1496 overlaid on background 1408 .
- Complications 1494 and 1496 can also be selected via user input in the fourth page of editing user interface 1440 .
- FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments.
- Method 1500 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone).
- Some operations in method 1500 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 1500 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system displays ( 1502 ), via the display generation component (e.g., 602 ), an editing user interface (e.g., 1406 ) for editing a background (e.g., 1408 ) of a user interface (e.g., a home/main user interface; a wake screen user interface; a lock screen user interface; a watch user interface; a watch face that includes an indication of time and one or more watch complications), wherein the user interface includes content (e.g., an indication of time; watch complications; icons; menus; folders) overlaid on the background ( 1504 ), and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes (e.g., graphical lines across the background in a vertical or horizontal direction) that is greater than one (e.g., two or more stripes; an even number of repeating two stripes of different colors) ( 1506 ).
- the computer system detects ( 1514 ), via the one or more input devices, a first user input (e.g., 1403 , 1405 ) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to detecting the first user input (e.g., 1403 ) ( 1518 ), in accordance with a determination that the first user input corresponds to a first type of input (e.g., an input in a first direction (e.g., a clockwise rotational direction; a first vertical or horizontal direction)), the computer system (e.g., 600 ) displays ( 1522 ), in the user interface, a representation of an updated background (e.g., 1408 ) with a second number of stripes that is greater than the first number of stripes (e.g., add one or more additional stripes to the background (e.g., add one more stripe; add multiple stripes; add an even number of stripes; double the number of stripes); add one or more additional stripes to the background where the added stripes repeat a pattern (e.g., a repeating color pattern) of the original stripes).
- updating the background with the second number of stripes that is greater than the first number of stripes includes moving (e.g., sliding) the new stripes onto the background from an edge of the display.
- In accordance with a determination that the first user input corresponds to a second type of input (e.g., an input in a second direction (e.g., a counter-clockwise rotational direction; a second vertical or horizontal direction)), the computer system displays ( 1524 ), in the user interface, the representation of the updated background (e.g., 1408 ) with a third number of stripes that is less than the first number of stripes (e.g., remove one or more stripes from the background (e.g., remove one stripe; remove multiple stripes); if the first number of stripes have a repeating pattern (e.g., a repeating color pattern), remove one or more stripes such that the pattern is maintained within the remaining stripes; if the first number of stripes do not have a repeating pattern (e.g., a repeating color pattern), remove one or more stripes from the background in one direction (e.g., from one edge of the display)).
- updating the background with the third number of stripes that is less than the first number of stripes includes moving (e.g., sliding) stripes out of the background off of an edge of the display.
- Changing the number of stripes in the background in accordance with the first user input enables a user to change the number of stripes in the background easily and in an intuitive manner.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
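- The two branches of this determination could be sketched as follows (all names are hypothetical, and the supported range 1...12 is purely illustrative):

```swift
// Hypothetical sketch: the first input type adds a stripe, the second removes
// one, with the count clamped to an assumed supported range.
enum CrownDirection { case clockwise, counterClockwise }

func updatedStripeCount(current: Int, direction: CrownDirection,
                        allowed: ClosedRange<Int> = 1...12) -> Int {
    switch direction {
    case .clockwise:        return min(current + 1, allowed.upperBound)
    case .counterClockwise: return max(current - 1, allowed.lowerBound)
    }
}

// updatedStripeCount(current: 8, direction: .clockwise) == 9
// updatedStripeCount(current: 8, direction: .counterClockwise) == 7
```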
- the computer system detects ( 1526 ) (e.g., subsequent to detecting the first input), via the one or more input devices, a second user input (e.g., 1423 ) (e.g., a request to exit or cease display of the user interface for editing the background).
- In response to detecting the second user input (e.g., 1423 ) ( 1528 ), the computer system (e.g., 600 ) displays ( 1530 ), via the display generation component (e.g., 602 ), the user interface with the updated background (e.g., 1408 ).
- the updated background includes the second number of stripes.
- the updated background includes the third number of stripes. Displaying the user interface with the updated background in response to detecting the second user input enables a user to quickly and easily update the background of the current user interface.
- Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the user interface is a watch user interface (e.g., a watch face; a user interface that includes an indication of a current time; a clock user interface for a smartwatch) ( 1508 ).
- the content is an indication of a current time or current date ( 1510 ).
- the computer system displays ( 1516 ), in the editing user interface, a user interface (e.g., a tab (e.g., 1412 ) within the editing user interface) for editing (e.g., increasing or decreasing) a number of stripes of the representation of the background of the user interface, wherein the user interface for editing the number of stripes includes the representation of the background (e.g., 1408 ) of the user interface.
- the first number of stripes are arranged in a first visual pattern of stripes of different colors (e.g., a first type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)), and the second number of stripes are arranged in the first visual pattern of stripes of different colors (e.g., the first type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)) ( 1522 ). Maintaining the first visual pattern of stripes when the number of stripes in the background is increased enables efficient editing of a background that includes the number of stripes.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system detects ( 1534 ), via the one or more input devices, a third user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to detecting the third user input ( 1536 ), the computer system displays ( 1538 ), in the user interface, the representation of the updated background with the first number of stripes, wherein the first number of stripes are arranged in the second visual pattern of stripes of different colors (e.g., a second type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)).
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system detects, via the one or more input devices, a fourth user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input), wherein no other inputs were detected between displaying the representation of the updated background with the third number of stripes and detecting the fourth user input (e.g., there were no intervening operations on the computer system from updating the representation of the updated background to include the third number of stripes to detecting the fourth user input).
- In response to detecting the fourth user input, the computer system displays, in the user interface, the representation of the updated background with the first number of stripes, wherein the first number of stripes are arranged in the third visual pattern of stripes of different colors (e.g., the third type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)).
- the third visual pattern of stripes of different colors is remembered (e.g., the previous visual pattern of stripes is remembered) when the number of stripes was first decreased, then increased via the fourth user input (e.g., and no intervening inputs were detected between the decreasing and increasing of the number of stripes).
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the representation of the updated background (e.g., 1408 ) with the third number of stripes, where the third number of stripes are arranged in the third visual pattern of stripes of different colors, the computer system (e.g., 600 ) detects one or more intervening inputs directed to causing display of a different user interface and/or causing display of a different page than a current page of the editing user interface, then detects the fourth user input.
- In response to detecting the fourth user input, the computer system displays or causes display of, in the user interface, the representation of the updated background with the first number of stripes, where the first number of stripes are still arranged in the third visual pattern of stripes of different colors (e.g., the third type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)).
- the computer system detects, via the one or more input devices, a user input directed to performing an operation that does not include changing the third number of stripes of the representation of the updated background to a different number of stripes.
- performing the operation includes displaying a user interface different from the editing user interface.
- performing the operation includes editing a different aspect/feature of the representation of the updated background than changing or otherwise modifying the stripes within the representation of the updated background (e.g., editing features of a watch face (e.g., watch face style; watch complications) having the updated background as the background).
- In response to detecting the user input directed to performing the operation, the computer system ceases display of the representation of the updated background (e.g., 1408 ) (e.g., and exiting the user interface for editing the number of stripes and displaying (e.g., replacing display of the user interface for editing the number of stripes with) a different user interface for performing the operation that does not include changing the third number of stripes of the representation of the updated background to a different number of stripes).
- the computer system detects, via the one or more input devices, a fifth user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to detecting the fifth user input, the computer system displays, in the user interface, the representation of the updated background (e.g., 1408 ) with the first number of stripes, wherein the first number of stripes are arranged in a fifth visual pattern of stripes of different colors (e.g., a fifth type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)) that is different from the fourth visual pattern of stripes of different colors.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the editing user interface (e.g., 1406 ), the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., a touch-sensitive surface that is integrated with the display generation component (e.g., 602 )), an input (e.g., 1407 ; a press-and-hold input; a touch-and-hold input) directed to a first stripe (e.g., 1408 D; a stripe of the first number of stripes of the representation of the background (e.g., 1408 )).
- In response to detecting the input directed to the first stripe, the computer system displays, in the editing user interface, an indication (e.g., 1416 ) (e.g., a visual indication (e.g., a tab, a box) surrounding or within the selected stripe indicating that the stripe has been selected, and that it can be modified) that the first stripe is selected for editing (e.g., editing for a different visual characteristic (e.g., a different color)). Transitioning through different selectable colors in response to detecting the rotational input enables a user to quickly and easily transition through the different selectable colors.
- Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the indication (e.g., 1416 ) that the first stripe is selected for editing, the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., 1409 ) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to (e.g., and while) detecting the rotational input, the computer system transitions the first stripe from a first color to a second color different from the first color (e.g., such that the second color is now set as the current color for the first stripe).
- the transition from the first color to the second color includes, while detecting the rotational input, transitioning from the first color, through a plurality of different colors, to the second color.
- the first stripe is edited without editing other stripes of the first number of stripes. Displaying the indication that the second stripe is selected for editing in response to detecting the input corresponding to the drag gesture enables efficient editing of a respective stripe of the background.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
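- The stepping-through-colors behavior could be sketched as an index into an ordered color list (ColorPicker and the specific color names are hypothetical):

```swift
// Hypothetical sketch: each detent of crown rotation steps to the adjacent
// color, so the selected stripe passes through every intermediate color.
struct ColorPicker {
    let colors = ["white", "yellow", "orange", "red", "purple", "blue"]
    var index = 0  // current color of the selected stripe

    mutating func rotate(bySteps steps: Int) -> [String] {
        var visited: [String] = []
        let dir = steps >= 0 ? 1 : -1
        for _ in 0..<abs(steps) {
            index = (index + dir + colors.count) % colors.count
            visited.append(colors[index])  // each intermediate color is shown
        }
        return visited
    }
}

var picker = ColorPicker()
print(picker.rotate(bySteps: 3))  // ["yellow", "orange", "red"]
```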
- displaying the editing user interface includes, in accordance with a determination that the editing user interface is in a first editing mode (e.g., an editing mode for changing the number of respective stripes in the background), displaying the representation of the background (e.g., 1408 ) of the user interface with respective stripes in the background having visually distinguishable spaces between the respective stripes.
- displaying the editing user interface includes, in accordance with a determination that the editing user interface is in a second editing mode (e.g., an editing mode for changing a visual characteristic, such as a color, of one or more stripes in the background; an editing mode for rotating the respective stripes in the background) different from the first editing mode, displaying the representation of the background with the respective stripes in the background without visually distinguishable spaces between the respective stripes.
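- A sketch of this mode-dependent presentation (EditingMode, interStripeSpacing, and the 2-point gap are all hypothetical):

```swift
// Hypothetical sketch: gaps make stripes read as discrete, countable elements
// while their number is edited; other editing modes draw them flush.
enum EditingMode { case stripeCount, otherEditing }

func interStripeSpacing(for mode: EditingMode) -> Double {
    mode == .stripeCount ? 2.0 : 0.0  // points of visible gap between stripes
}
```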
- While displaying the editing user interface (e.g., 1406 ), the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., a touch-sensitive surface that is integrated with the display generation component), an input (e.g., 1411 ) on the representation of the background corresponding to a drag gesture (e.g., a finger touch drag gesture), wherein the drag gesture is detected across a plurality of stripes of the first number of stripes, beginning at a first stripe and ending at a second stripe (e.g., and including one or more stripes between the initial stripe and the final stripe).
- In response to detecting the input corresponding to the drag gesture, the computer system displays, in the editing user interface, an indication (e.g., a visual indication (e.g., a tab, a box) surrounding or within the selected stripe indicating that the stripe has been selected, and that it can be modified) that the second stripe (e.g., the stripe that is displayed at a location that corresponds to a location in the user interface at which the drag gesture ended) is selected for editing (e.g., editing for a different visual characteristic (e.g., a different color)).
- Enabling the selection of a second stripe within the background using a drag gesture, where the drag gesture is detected beginning at the first stripe and ending at the second stripe, provides a convenient and intuitive method for selecting a different stripe in the background (e.g., without needing to provide additional controls for enabling selection of the second stripe).
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
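- Hit-testing the end of such a drag could be sketched as follows (selectedStripeIndex is a hypothetical name; the coordinates are illustrative):

```swift
// Hypothetical sketch: map the drag's final touch position to the stripe
// underneath it, for horizontally oriented stripes stacked top to bottom.
func selectedStripeIndex(dragEndY: Double, displayHeight: Double,
                         stripeCount: Int) -> Int {
    let stripeHeight = displayHeight / Double(stripeCount)
    let index = Int(dragEndY / stripeHeight)
    return min(max(index, 0), stripeCount - 1)  // clamp to valid stripes
}

// A drag beginning on one stripe and ending at y = 130 on a 200-point-tall
// display with eight stripes selects index 5 (the sixth stripe):
// selectedStripeIndex(dragEndY: 130, displayHeight: 200, stripeCount: 8) == 5
```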
- the computer system displays, via the display generation component (e.g., 602 ), the editing user interface (e.g., 1406 ) for editing the background of the user interface (e.g., including a respective number of stripes) in a second editing mode (e.g., an editing mode for rotating the stripes in the background; different from the current editing mode for changing the number of stripes in the background).
- While displaying the editing user interface for editing the background of the user interface, the computer system detects, via the one or more input devices (e.g., via a touch-sensitive surface that is integrated with the display generation component), an input (e.g., a swipe input (e.g., a horizontal swipe input)) directed to changing an editing mode.
- In response to detecting the input directed to changing the editing mode, the computer system displays or causes display of the editing user interface in the second editing mode.
- Enabling quick and easy changing of an editing mode for editing a different feature/characteristic of a user interface while maintaining display of the editing user interface (e.g., without needing to exit the editing user interface), enables the editing of user interfaces in an efficient manner and reduces the inputs required to edit the user interface. Reducing the number of inputs needed to perform an operation and providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying, via the display generation component (e.g., 602 ), the editing user interface (e.g., 1406 ) for editing the background (e.g., 1408 ) of the user interface (e.g., including the respective number of stripes) in the second editing mode, the computer system detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to (e.g., and while) detecting the rotational input (e.g., 1415 ), the computer system (e.g., 600 ) rotates the representation of the background (e.g., 1408 ) (e.g., including the respective number of stripes) (e.g., rotating with the center of the display generation component as the axis point) in accordance with the detected rotational input.
- if the rotational input is in a clockwise direction, the stripes within the representation of the background are also rotated in the clockwise direction.
- if the rotational input is in a counter-clockwise direction, the representation of the background is rotated in the counter-clockwise direction.
- the representation of the background, including its respective number of stripes, is rotated with the center of the display generation component as the axis point for the rotation.
- the respective number of stripes of the representation of the background maintain their straight shape (e.g., maintain their straightness as stripes) while they are being rotated about the axis point. Rotating the representation of the background in accordance with the detected rotational input enables efficient editing of a feature/characteristic of the background.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- rotating the representation of the background includes rotating the representation of the background by predefined rotational increments (e.g., 1 degree, 2 degree, 5 degree, by 10 degree increments; by 15 degree increments; by 30 degree increments) with respect to a rotational axis point (e.g., the center of the display generation component (e.g., 602 )).
- rotating the representation of the background includes changing (e.g., increasing; decreasing) a characteristic (e.g., thickness; size; area) of a respective stripe within the representation of the background as the representation of the background is being rotated in accordance with the rotational input (e.g., 1415 ).
- the computer system displays, via the display generation component (e.g., 602 ), the user interface with the updated background (e.g., 1408 ).
- While displaying the user interface with the updated background (e.g., a watch user interface (e.g., watch face) with the updated background; a home user interface or main user interface with the updated background), the computer system detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., 1415 ) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to (e.g., and while) detecting the rotational input, the computer system rotates the updated background (e.g., with the center of the display generation component as the axis point) within the user interface in accordance with the detected rotational input.
- if the rotational input is in a clockwise direction, the stripes within the updated background are also rotated in the clockwise direction.
- if the rotational input is in a counter-clockwise direction, the stripes within the updated background are also rotated in the counter-clockwise direction.
- Enabling the updated background to be rotated based on the rotational input, where the direction of rotation of the updated background is based on the direction of rotation of the input, provides an efficient and intuitive method for editing a feature of the updated background.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the content is a first complication.
- a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications).
- complications provide data obtained from an application.
- a complication includes an affordance that when selected launches a corresponding application.
- a complication is displayed at a fixed, predefined location on the display.
- complications occupy respective locations at particular regions of a watch face (e.g., lower-right, lower-left, upper-right, and/or upper-left).
- the computer system displays the user interface with the updated background (e.g., 1408 ), wherein the first complication includes a primary color (e.g., a color that is most visually prominent in the displayed respective complication) that is selected (by the computer system) based on a first color of a first stripe of a plurality of stripes in the updated background (e.g., based on the color of the first-in-order stripe in the updated background; based on the color of the stripes that are most common in the updated background).
- Automatically applying (e.g., without user input) the primary color for the first complication based on the first color of the first stripe of the updated background provides efficient editing/configuration of features of the user interface.
- Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system displays the user interface with the updated background (e.g., 1408 ), wherein the first complication includes a secondary color (e.g., a color that is second-most visually prominent in the displayed respective complication; a color that is not as visually prominent in the displayed respective complication as the primary color) that is selected (by the computer system) based on a second color from a second stripe, different from the first stripe, of the plurality of stripes in the updated background (e.g., based on the color of the second-in-order stripe; based on the color of the stripe(s) that is not the most common in the updated background).
- Selecting (e.g., automatically, without user input) the secondary color for the first complication based on the second color from the second stripe reduces the number of user inputs needed to create a respective user interface that includes the updated background. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
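The primary/secondary color selection above can be illustrated with a small sketch: the first-in-order stripe supplies the primary color, and the next differently colored stripe supplies the secondary color. The `Stripe` type and string color tokens below are illustrative assumptions; the disclosure also mentions an alternative based on the most-common stripe colors, which is not modeled here.

```swift
import Foundation

// Hypothetical stripe model; `color` is any equatable color token.
struct Stripe { let color: String }

/// Returns a (primary, secondary) color pair for a complication, derived
/// from the background's stripes: the first-in-order stripe supplies the
/// primary color, and the first stripe with a different color supplies
/// the secondary color (falling back to the primary if none differs).
func complicationColors(for stripes: [Stripe]) -> (primary: String, secondary: String)? {
    guard let first = stripes.first else { return nil }
    let secondary = stripes.first { $0.color != first.color }?.color ?? first.color
    return (first.color, secondary)
}

let stripes = [Stripe(color: "teal"), Stripe(color: "white"), Stripe(color: "teal")]
print(complicationColors(for: stripes)!)  // (primary: "teal", secondary: "white")
```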
- rotating the representation (e.g., 1430 ) of the background (e.g., 1408 ) includes changing a thickness (e.g., a width) of the first number of stripes (e.g., 1408 A- 1408 H) within the representation (e.g., 1430 ) of the background (e.g., 1408 ) as the representation (e.g., 1430 ) of the background (e.g., 1408 ) is being rotated in accordance with the rotational input (e.g., 1474 ).
- the thickness of the first number of stripes within the representation of the background is changed uniformly (e.g., each stripe of the first number of stripes changes by the same amount).
- the thickness of the first number of stripes changes based on a length of the longest stripe of the first number of stripes on the representation of the background (e.g., the stripes stretch and reduce in thickness as the length of the longest stripe increases).
- rotating the representation (e.g., 1430 ) of the background (e.g., 1408 ) includes maintaining the first number of stripes (e.g., 1408 A- 1408 H) within the representation (e.g., 1430 ) of the background (e.g., 1408 ) (e.g., the thickness of the stripes changes in order to fit the first number of stripes within the shape of the background without changing the first number of stripes).
- Changing the thickness of the first number of stripes as the representation of the background is being rotated in accordance with the rotational input enables a user to customize and/or adjust the background in an easy and intuitive manner.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
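One plausible geometry behind the uniform thickness change: when a fixed number of stripes must tile a rectangular boundary, the boundary's extent measured perpendicular to the stripes is w·|sin θ| + h·|cos θ|, and dividing that extent by the stripe count gives the per-stripe thickness at each rotation angle. The Swift sketch below models only the fixed-count embodiment; the alternative embodiment keyed to the length of the longest stripe would use a different rule, and all names here are illustrative.

```swift
import Foundation

/// Per-stripe thickness for `count` stripes tiling a w x h rectangular
/// boundary at `angleDegrees`: the rectangle's extent perpendicular to
/// the stripe direction is w*|sin(theta)| + h*|cos(theta)|, divided
/// evenly among the stripes. (An assumed geometry, not the actual
/// implementation.)
func stripeThickness(count: Int,
                     boundaryWidth w: Double,
                     boundaryHeight h: Double,
                     angleDegrees: Double) -> Double {
    let theta = angleDegrees * .pi / 180
    let perpendicularExtent = w * abs(sin(theta)) + h * abs(cos(theta))
    return perpendicularExtent / Double(count)
}

// Eight stripes in a 100x100 boundary: thickness peaks near 45 degrees
// (where the perpendicular extent is the diagonal) and returns at 90.
for angle in stride(from: 0.0, through: 90.0, by: 45.0) {
    print(angle, stripeThickness(count: 8, boundaryWidth: 100,
                                 boundaryHeight: 100, angleDegrees: angle))
}
// 0.0 12.5, 45.0 ~17.68, 90.0 12.5
```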
- the representation (e.g., 1430 ) of the background (e.g., 1408 ) is within a boundary having a first shape (e.g., a rectangle and/or a square).
- the computer system (e.g., 600 ) displays the editing user interface in a third editing mode (e.g., an editing mode for changing the representation of the background from a full screen mode to a partial screen mode (e.g., the partial screen mode displays the representation of the background within a smaller, differently shaped boundary)).
- the computer system while displaying the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface, the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., via a touch-sensitive surface that is integrated with the display generation component), an input (e.g., 1454 , 1460 , 1468 , 1472 , 1476 ) (e.g., a swipe input (e.g., a horizontal swipe input)) directed to changing an editing mode.
- the computer system in response to detecting the input (e.g., 1454 , 1460 , 1468 , 1472 , 1476 ) directed to changing the editing mode, displays or causes display of the editing user interface (e.g., 1440 ) in the second editing mode.
- the computer system while displaying, via the display generation component (e.g., 602 ), the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface (e.g., 1422 ) (e.g., including the respective number of stripes) in the third editing mode, detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), an input (e.g., 1448 , 1470 ) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- the computer system in response to (e.g., and while) detecting the input (e.g., 1448 , 1470 ), displays the representation (e.g., 1430 , 1450 ) of the background (e.g., 1408 ) within a boundary having a second shape that is different from the first shape (e.g., the second shape is a circle, oval, and/or a round shape) and changes a thickness of the first number of stripes (e.g., 1408 A- 1408 H) within the representation (e.g., 1430 , 1450 ) of the background (e.g., 1408 ) (e.g., the first number of stripes is maintained when displaying the representation of the background in the boundary having the second shape, but the thickness of the first number of stripes is changed so that the first number of stripes fit evenly within the boundary having the second shape).
- Displaying the representation of the background within a boundary having a second shape that is different from the first shape in response to detecting the input enables a user to customize and/or adjust the background in an easy and intuitive manner.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system while displaying the user interface (e.g., 1422 ), receives ( 1540 ) a request (e.g., 1426 ) to display a watch face (e.g., a request to turn on the display, a request to switch from one watch face to a stripes watch face, or a request to exit an editing mode) with a first arrangement of stripes (e.g., color, thickness, number, angle).
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed within a first boundary (e.g., a boundary having a first shape and first size), displays ( 1544 ) the first arrangement of stripes with a first width.
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed within a second boundary (e.g., a boundary having a second shape different from the first shape and/or a second size different from the first size) that is different from the first boundary, displays ( 1546 ) the first arrangement of stripes with a second width that is different from the first width.
- Displaying the first arrangement of stripes with the first width or displaying the first arrangement of stripes with the second width based on a boundary of the first arrangement of stripes reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system while displaying the user interface (e.g., 1422 ), receives ( 1540 ) a request (e.g., 1426 ) to display a watch face (e.g., a request to turn on the display, a request to switch from one watch face to a stripes watch face, or a request to exit an editing mode) with a first arrangement of stripes (e.g., color, thickness, number, angle).
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at a first angle within a first boundary (e.g., a boundary having a first shape and a first size), displays ( 1548 ) the first arrangement of stripes with a first width.
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at the first angle within a second boundary (e.g., a boundary having a second shape that is different from the first shape and/or a second size different from the first size) that is different from the first boundary, displays ( 1550 ) the first arrangement of stripes with a second width (e.g., the first width or a width different from the first width).
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at a second angle that is different from the first angle within the first boundary, displays ( 1552 ) the first arrangement of stripes with the first width (e.g., the first boundary includes a circular shape such that the width of the first arrangement of stripes does not change based on an angle of the first arrangement of stripes).
- the computer system in response ( 1542 ) to the request (e.g., 1426 ) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at the second angle within the second boundary, displays ( 1554 ) the first arrangement of stripes with a third width that is different from the second width (e.g., the second boundary includes a non-circular shape such that the width of the first arrangement of stripes changes based on the angle of the first arrangement of stripes to fit the first arrangement of stripes evenly within the non-circular shaped boundary).
- Displaying the first arrangement of stripes with the first width, the second width, or the third width based on the boundary and an angle of the first arrangement of stripes reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the computer system while displaying the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface (e.g., 1422 ) (e.g., including a respective number of stripes) in a fourth editing mode (e.g., the second editing mode, an editing mode for rotating the stripes in the background; different from the editing mode for changing the number of stripes in the background), detects ( 1556 ), via the one or more input devices, an input (e.g., 1466 , 1474 ) (e.g., rotational input on the rotatable input device) corresponding to a request to rotate the representation (e.g., 1430 , 1450 ) of the background (e.g., 1408 ).
- the computer system in response to detecting ( 1558 ) the input (e.g., 1466 , 1474 ) and in accordance with a determination that the representation (e.g., 1450 ) of the background (e.g., 1408 ) is set to be displayed within a boundary of a first shape (e.g., a circle, an oval, and/or a round shape), rotates ( 1560 ) the representation of the background without adjusting a thickness of the first number of stripes (e.g., 1408 A- 1408 H) within the representation (e.g., 1450 ) of the background (e.g., 1408 ) (e.g., rotating the representation of the background when displayed within the boundary having the first shape does not adjust a thickness of the first number of stripes).
- the computer system in response to detecting ( 1558 ) the input (e.g., 1466 , 1474 ) and in accordance with a determination ( 1562 ) that the representation (e.g., 1430 ) of the background (e.g., 1408 ) is set to be displayed within a boundary of a second shape (e.g., a square and/or a rectangle), rotates ( 1564 ) the representation (e.g., 1430 ) of the background (e.g., 1408 ) and adjusts ( 1566 ) (e.g., changing, increasing, decreasing) the thickness of the first number of stripes (e.g., 1408 A- 1408 H) as the representation (e.g., 1430 ) of the background (e.g., 1408 ) is rotated.
- Adjusting the thickness of the first number of stripes or forgoing adjusting the thickness of the first number of stripes based on a shape of the boundary of the background reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
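The circle-versus-rectangle branching above can be expressed as a small dispatch on the boundary shape, reusing the same perpendicular-extent idea as the previous sketch: a circle's extent is its diameter at every angle, so thickness is left untouched, while a rectangle's extent varies with the angle, so thickness is recomputed as the background rotates. The types and field names below are assumptions, not the actual implementation.

```swift
import Foundation

enum Boundary {
    case circle                                      // e.g., a round, partial-screen dial
    case rectangle(width: Double, height: Double)    // e.g., a full-screen face
}

struct EditableBackground {
    var angleDegrees: Double
    var stripeCount: Int
    var boundary: Boundary
    var stripeThickness: Double

    /// Handles a rotational input in the rotation-editing mode: the
    /// background always rotates, but stripe thickness is recomputed only
    /// when the boundary is rectangular, since a circle's perpendicular
    /// extent is the same at every angle.
    mutating func rotate(by delta: Double) {
        angleDegrees += delta
        if case .rectangle(let w, let h) = boundary {
            let t = angleDegrees * .pi / 180
            stripeThickness = (w * abs(sin(t)) + h * abs(cos(t))) / Double(stripeCount)
        }
        // .circle: thickness intentionally left unchanged
    }
}

var bg = EditableBackground(angleDegrees: 0, stripeCount: 8,
                            boundary: .rectangle(width: 100, height: 100),
                            stripeThickness: 12.5)
bg.rotate(by: 45)
print(bg.stripeThickness)  // ~17.68 (would remain 12.5 for .circle)
```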
- the computer system while displaying the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface (e.g., 1422 ) (e.g., in an editing mode for rotating the representation of the background, in an editing mode for adjusting the first number of stripes, in an editing mode for adjusting the shape of the boundary of the representation of the background, and/or in an editing mode that is not for adjusting the color of a respective stripe of the first number of stripes), detects ( 1568 ) an input (e.g., 1454 , 1476 ) corresponding to a request to display the editing user interface for editing the background of the user interface in a fifth editing mode (e.g., an editing mode for changing a color of a respective stripe of the first number of stripes).
- the computer system in response to detecting the input (e.g., 1454 , 1476 ), displays ( 1570 ), via the display generation component (e.g., 602 ), the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface (e.g., 1422 ) (e.g., including a respective number of stripes) in the fifth editing mode (e.g., an editing mode for changing a color of a respective stripe of the first number of stripes), wherein displaying the editing user interface in the fifth editing mode includes the computer system (e.g., 600 ), in accordance with a determination that the representation (e.g., 1430 , 1450 ) of the background is in a first position, displaying the representation of the background in the first position, and, in accordance with a determination that the representation of the background is in the second position, displaying the representation of the background in the second position (e.g., a rotational position and/or an angular position where the first number of stripes are in a predetermined orientation, such as a horizontal orientation).
- Displaying the representation of the background in the second position while the computer system displays the editing user interface for editing the background of the user interface in the fifth editing mode facilitates a user's ability to select a particular stripe of the first number of stripes, which reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the editing user interface (e.g., 1440 ) for editing the background (e.g., 1408 ) of the user interface (e.g., 1422 ) in the fifth editing mode includes, in accordance with a determination that the representation (e.g., 1430 , 1450 ) of the background (e.g., 1408 ) is in a third position (e.g., a rotational position and/or an angular position where the first number of stripes do not extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) (e.g., a position different from the first position and the second position), rotating the representation (e.g., 1430 , 1450 ) of the background (e.g., 1408 ) to the second position (e.g., a rotational position and/or an angular position where the first number of stripes are in a predetermined orientation, such as a horizontal orientation (at a 0 degree angle and/or a 360 degree angle)).
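The fifth-editing-mode behavior above boils down to a snap rule: entering the color-editing mode shows the stripes in the predetermined horizontal orientation so that individual stripes are easy to select, rotating the background back if it is currently at some other angle. A minimal sketch, assuming 0 (or 360) degrees means horizontal stripes; the function name is illustrative.

```swift
import Foundation

/// Angle to display when entering the color-editing mode: keep the current
/// angle if the stripes are already horizontal, otherwise snap to the
/// predetermined horizontal orientation (0 degrees).
func angleForColorEditingMode(currentAngleDegrees angle: Double) -> Double {
    let normalized = angle.truncatingRemainder(dividingBy: 360)
    return normalized == 0 ? angle : 0  // already horizontal? keep; else snap
}

print(angleForColorEditingMode(currentAngleDegrees: 30))  // 0 (rotated back to horizontal)
print(angleForColorEditingMode(currentAngleDegrees: 0))   // 0 (unchanged)
```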
- method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500 .
- a background for a user interface as described in FIGS. 14A-14AD can be used as the background for a watch user interface as described in FIGS. 6A-6H .
- method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500 .
- a background for a user interface as described in FIGS. 14A-14AD can be used as the background for a watch user interface as described in FIGS. 8A-8M .
- method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500 .
- a device can use as a watch user interface either a watch user interface as described in FIGS. 10A-10AC or a user interface with a background as described in FIGS. 14A-14AD .
- method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500 .
- a device can use as a watch user interface either a watch user interface as described in FIGS. 12A-12G or a user interface with a background as described in FIGS. 14A-14AD .
- method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500 .
- one or more characteristics or features of a user interface that includes a background as described in FIGS. 14A-14AD can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE .
- For brevity, these details are not repeated below.
- FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface (e.g., editing a watch user interface), in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 17A-17D .
- FIG. 16A illustrates device 600 displaying, via display 602 , a watch user interface 1606 that includes a time region for displaying a current time (e.g., a dial and clock hands indicate the current time) and one or more complication regions for displaying watch complications on watch user interface 1606 .
- a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications).
- complications provide data obtained from an application.
- a complication includes an affordance that when selected launches a corresponding application.
- a complication is displayed at a fixed, predefined location on display 602 .
- complications occupy respective locations at particular regions of watch user interface 1606 (e.g., lower-right, lower-left, upper-right, and/or upper-left). In some embodiments, the complications are displayed at respective complication regions within watch user interface 1606 .
- watch user interface 1606 includes a complication 1608 corresponding to a contactable users application, a complication 1610 corresponding to a calendar application, a complication 1612 corresponding to a weather application, and a complication 1614 corresponding to a moon phase application.
- device 600 receives (e.g., detects) an input 1601 on watch user interface 1606 .
- input 1601 is a touch input (e.g., touch press input) on display 602 .
- input 1601 is a press-and-hold input on display 602 .
- device 600 displays a user interface 1616 that includes a representation 1618 of watch user interface 1606 and an edit affordance 1620 for initiating a process for editing watch user interface 1606 , as shown in FIG. 16B .
- device 600 receives (e.g., detects) an input 1603 directed to selecting edit affordance 1620 .
- device 600 displays, via display 602 , a first page 1626 (e.g., a style page) of an editing user interface 1622 , as shown in FIG. 16C , where editing user interface 1622 includes a representation 1624 of a layout of watch user interface 1606 .
- first page 1626 of editing user interface 1622 is for editing a style of watch user interface 1606 .
- device 600 receives (e.g., detects) an input 1605 directed to changing the current page of editing user interface 1622 to a second page 1628 (e.g., an editing mode for editing a dial of watch user interface 1606 ).
- input 1605 includes a touch gesture (e.g., a horizontal swipe on display 602 ) or a rotational input on rotatable input mechanism 603 .
- device 600 displays second page 1628 of editing user interface 1622 including representation 1624 of a layout of watch user interface 1606 , as shown in FIG. 16D .
- device 600 receives (e.g., detects) an input 1607 directed to changing the current page of editing user interface 1622 to a third page 1630 (e.g., an editing mode for changing a color (e.g., a background color; a color scheme) of watch user interface 1606 ).
- input 1607 includes a touch gesture (e.g., a horizontal swipe on display 602 ) or a rotational input on rotatable input mechanism 603 .
- device 600 displays third page 1630 of editing user interface 1622 including representation 1624 of a layout of watch user interface 1606 , as shown in FIG. 16E .
- Features of third page 1630 of editing user interface 1622 are described in greater detail below with reference to FIGS. 16V-16X .
- device 600 receives (e.g., detects) an input 1609 directed to changing the current page of editing user interface 1622 to a fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1606 ).
- input 1609 includes a touch gesture (e.g., a horizontal swipe on display 602 ) or a rotational input on rotatable input mechanism 603 .
- device 600 displays fourth page 1632 of editing user interface 1622 , as shown in FIG. 16F .
- device 600 displays, in fourth page 1632 of editing user interface 1622 , complication previews 1634 - 1640 corresponding to complications 1608 - 1614 of watch user interface 1606 , as shown in FIG. 16A .
- Complication preview 1634 corresponds to complication 1608 for the contactable users application
- complication preview 1636 corresponds to complication 1610 for the calendar application.
- Complication preview 1638 corresponds to complication 1612 for the weather application
- complication preview 1640 corresponds to complication 1614 for the moon phase application.
- device 600 receives (e.g., detects) an input 1611 directed to selecting complication preview 1634 corresponding to complication 1608 for the contactable users application.
- input 1611 is a touch input on display 602 .
- device 600 displays, via display 602 , a complication selection user interface 1642 for selecting a complication to be included in watch user interface 1606 (e.g., to replace complication 1608 in watch user interface 1606 ), as shown in FIG. 16G .
- complication selection user interface 1642 includes a first region 1644 corresponding to the contactable users application (e.g., because the selected complication preview corresponds to the contactable users application).
- Region 1644 includes a header/label indicating that the region corresponds to the contactable users application and a group of complication previews 1644 A- 1644 E.
- a respective complication preview corresponds to a respective complication that is configured to display a respective set of information obtained from the respective application (e.g., information based on a feature, operation, and/or characteristic of the respective application).
- the respective complication preview includes a graphical representation of the respective complication displaying the respective set of information (e.g., an exemplary representation of the respective complication with an example of the respective set of information).
- complication selection user interface 1642 , when the respective application is associated with a plurality of available complications, includes a plurality of complication previews corresponding to the plurality of available complications. For example, in accordance with a determination that the plurality of available complications exceeds a predetermined number of available complications (e.g., more than 5 or 6 complications), device 600 displays a plurality of complication previews that correspond to respective complications of the plurality of available complications, along with an affordance for showing one or more additional complication previews of complications in the plurality of available complications (e.g., the plurality of complication previews does not exceed the predetermined number).
- complication previews 1644 A- 1644 E corresponding to the predetermined number of available complications for the respective application (the contactable users application) are displayed along with affordance 1648 (e.g., a “show more” icon or button).
- device 600 displays one or more additional complication previews that were not included in the plurality of complication previews as well as the complication previews that were included in the plurality of complication previews.
- complication selection user interface 1642 in accordance with a determination that the plurality of available complications does not exceed the predetermined number, includes a complication preview for all of the available complications, without displaying the affordance (e.g., affordance 1648 ).
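The grouping rule above (a capped list of previews plus a "show more" affordance only when the application offers more than a predetermined number of complications) can be sketched as follows; the types, the limit of 5, and the tuple shape are illustrative assumptions.

```swift
import Foundation

struct ComplicationPreview { let title: String }

/// Builds the preview rows for one application's group in the selection
/// interface: if the available complications exceed a predetermined limit,
/// only the first `limit` previews are listed together with a "show more"
/// affordance; otherwise all previews are listed and the affordance is
/// omitted.
func groupRows(for previews: [ComplicationPreview],
               limit: Int = 5) -> (shown: [ComplicationPreview], showMoreAffordance: Bool) {
    if previews.count > limit {
        return (Array(previews.prefix(limit)), true)
    }
    return (previews, false)
}

let previews = (1...7).map { ComplicationPreview(title: "Contact \($0)") }
let rows = groupRows(for: previews)
print(rows.shown.count, rows.showMoreAffordance)  // 5 true
```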
- complication selection user interface 1642 includes first region 1644 corresponding to the contactable users application, where the contactable users application is for managing information of a set of contactable users (e.g., user contacts stored in and/or accessible on device 600 ; user contacts stored in and/or accessible from an address book).
- a respective complication corresponding to the contactable users application corresponds to a respective contactable user of the set of contactable users.
- Complication previews 1644 A- 1644 E correspond to respective complications (complication 1608 ) for five respective contactable users of the set of contactable users.
- in accordance with a determination that a first respective contactable user is a candidate contact (e.g., a favorite contact; a frequent contact; a primary contact) and a second respective contactable user is not a candidate contact, device 600 displays the first respective complication preview corresponding to the first respective contactable user prior to the second respective complication preview corresponding to the second respective contactable user in the displayed order of the complication previews.
- in accordance with a determination that the first respective contactable user is not a candidate contact and the second respective contactable user is a candidate contact, device 600 displays the second respective complication preview corresponding to the second respective contactable user prior to the first respective complication preview corresponding to the first respective contactable user in the displayed order of the complication previews.
- all of the maximum number of complication previews that are shown correspond to candidate contacts (e.g., listed in alphabetical order).
- the candidate contacts are shown first (e.g., in alphabetical order) and regular contacts (non-candidate contacts) are shown for the remaining complication previews (e.g., separately in alphabetical order).
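The ordering described above (candidate contacts first, then regular contacts, each alphabetically) corresponds to a simple two-key sort. A sketch with assumed type and field names:

```swift
import Foundation

struct Contact {
    let name: String
    let isCandidate: Bool  // e.g., a favorite, frequent, or primary contact
}

/// Orders contacts for the complication previews: candidate contacts first
/// (alphabetically), then remaining contacts (alphabetically), matching the
/// described display order.
func orderedForPreviews(_ contacts: [Contact]) -> [Contact] {
    contacts.sorted {
        if $0.isCandidate != $1.isCandidate { return $0.isCandidate }
        return $0.name < $1.name
    }
}

let contacts = [
    Contact(name: "Dana", isCandidate: false),
    Contact(name: "Ben", isCandidate: true),
    Contact(name: "Ana", isCandidate: false),
    Contact(name: "Cam", isCandidate: true),
]
print(orderedForPreviews(contacts).map(\.name))  // ["Ben", "Cam", "Ana", "Dana"]
```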
- device 600 displays a visual indication 1646 that complication preview 1644 A corresponds to the currently-selected complication for complication 1608 in watch user interface 1606 (e.g., complication preview 1644 A is highlighted and/or outlined compared to other complication previews).
- device 600 receives (e.g., detects) an input 1613 directed to selecting complication preview 1644 D.
- input 1613 is a touch input on display 602 .
- input 1613 is a press input on rotatable input mechanism 603 after visual indication 1646 is moved to complication preview 1644 D (e.g., via rotation of rotatable input mechanism 603 ).
- device 600 In response to receiving input 1613 , device 600 removes visual indication 1646 from complication preview 1644 A and displays visual indication 1646 for complication preview 1644 D, as shown in FIG. 16H , thereby indicating that the complication corresponding to complication preview 1644 D has been selected to be used as the complication for complication 1608 in watch user interface 1606 .
- device 600 receives (e.g., detects) an input 1615 directed to an affordance 1650 for exiting complication selection user interface 1642 with the newly-selected settings.
- input 1615 is a touch input on display 602 .
- device 600 displays fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1606 ) of editing user interface 1622 , where complication preview 1634 for watch user interface 1606 now corresponds to the contactable user corresponding to complication preview 1644 D (instead of the contactable user corresponding to complication preview 1644 A) in FIGS. 16G-16H , as shown in FIG. 16I .
- device 600 receives (e.g., detects) an input 1617 directed to selecting complication 1634 .
- input 1617 is a touch input on display 602 .
- device 600 displays first region 1644 of complication selection user interface 1642 , as shown in FIG. 16J , where first region 1644 includes complication previews 1644 A- 1644 E corresponding to complications for the contactable users application, as first described above with reference to FIG. 16G .
- first region 1644 of complication selection user interface 1642 includes affordance 1648 that, when selected, causes device 600 to display one or more additional complication previews that were not included in the plurality of complication previews (e.g., in addition to the complication previews that were included in the plurality of complication previews).
- device 600 while displaying first region 1644 corresponding to the contactable users application of complication selection user interface 1642 , device 600 receives (e.g., detects) an input 1619 directed to selecting affordance 1648 .
- input 1619 is a touch input on display 602 .
- device 600 displays a contactable user selection user interface 1652 , as shown in FIG. 16K .
- contactable user selection user interface 1652 includes a first region 1654 for candidate contacts (e.g., favorite contacts; frequent contacts; primary contacts), where first region 1654 includes complication previews 1644 A- 1644 D. Complication previews 1644 A- 1644 D each correspond to a respective contactable user that is designated (e.g., by a user of device 600 ) as a candidate contact.
- contactable user selection user interface 1652 includes a second region 1656 for regular contacts (e.g., non-candidate contacts; non-favorite contacts), where second region 1656 includes complication previews 1644 E and 1656 A that correspond to respective contactable users that are not designated as candidate contacts.
- contactable user selection user interface 1652 can be navigated (e.g., scrolled) to show, in second region 1656 , additional complication previews corresponding to respective contactable users that are not designated as candidate contacts.
- FIG. 16L illustrates device 600 displaying, via display 602 , complication selection user interface 1642 with first region 1644 corresponding to complication previews for the contactable users application, as first described above with reference to FIG. 16G . While displaying first region 1644 of complication selection user interface 1642 , device 600 receives (e.g., detects) an input 1621 directed to navigating (e.g., scrolling) complication selection user interface 1642 .
- input 1621 is a rotational input on rotatable input mechanism 603 shown in FIG. 16L .
- input 1621 is a touch input such as a swipe or pinch input.
- FIGS. 16M-16O illustrate complication selection user interface 1642 being navigated (e.g., scrolled) in response to input 1621 .
- device 600 navigates complication selection user interface 1642 from first region 1644 (corresponding to a complication group for contactable users application complications) to a second region 1658 of complication selection user interface 1642 , where second region 1658 corresponds to a complication group for a first third-party application.
- second region 1658 includes complication previews 1658 A- 1658 E corresponding to respective complications that are configured to display, on watch user interface 1606 , a respective set of information obtained from the first third-party application.
- One or more of complication previews 1658 A- 1658 E can include a respective graphical representation of the respective complication displaying the respective set of information.
- Second region 1658 of complication selection user interface 1642 includes an affordance 1660 that, when selected, causes device 600 to display one or more additional complication previews that were not included in the plurality of complication previews corresponding to the first third-party application in second region 1658 of complication selection user interface 1642 .
- device 600 navigates complication selection user interface 1642 from second region 1658 (corresponding to a complication group for the first third-party application complications) to a third region 1662 and a fourth region 1664 of complication selection user interface 1642 , where third region 1662 corresponds to a complication group for a second third-party application and fourth region 1664 corresponds to a complication group for a fitness application.
- third region 1662 includes complication previews 1662 A- 1662 B corresponding to respective complications that are configured to display a respective set of information obtained from the second third-party application.
- One or more of complication previews 1662 A- 1662 B can include a respective graphical representation of the respective complication displaying the respective set of information.
- third region 1662 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642 , and thus no affordance (e.g., affordance 1648 ; affordance 1660 ) that, when selected, causes device 600 to display one or more additional complication previews for the respective application, is included.
- fourth region 1664 includes complication previews 1664 A- 1664 B corresponding to respective complications that are configured to display a respective set of information obtained from the fitness application.
- One or more of complication previews 1664 A- 1664 B can include a respective graphical representation of the respective complication displaying the respective set of information.
- fourth region 1664 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642 , and thus no affordance (e.g., affordance 1648 ; affordance 1660 ) that, when selected, causes device 600 to display one or more additional complication previews for the respective application, is included.
- device 600 navigates (e.g., scrolls) complication selection user interface 1642 to a fifth region 1666 of complication selection user interface 1642 , where fifth region 1666 corresponds to a complication group for the weather application.
- fifth region 1666 includes complication previews 1666 A- 1666 D corresponding to respective complications that are configured to display, on watch user interface 1606 , a respective set of information obtained from the weather application.
- One or more of complication previews 1666 A- 1666 D can include a respective graphical representation of the respective complication displaying the respective set of information.
- fifth region 1666 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642 , and thus no affordance (e.g., affordance 1648 ; affordance 1660 ) that, when selected, causes device 600 to display one or more additional complication previews for the respective application, is included.
- FIG. 16P illustrates device 600 displaying, via display 602 , a watch user interface 1668 that is different from watch user interface 1606 first described above with reference to FIG. 16A .
- watch user interface 1668 includes a complication 1670 corresponding to an activity application, complication 1672 corresponding to a calendar application, complication 1674 corresponding to a health application, complication 1676 corresponding to a fitness application, complication 1678 corresponding to a time application, complication 1680 corresponding to a weather application, complication 1682 corresponding to the weather application, and complication 1684 corresponding to the calendar application.
- FIG. 16Q illustrates device 600 displaying, via display 602 , fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1668 ) of editing user interface 1622 , including complication preview 1686 corresponding to complication 1670 for the activity application, complication preview 1688 corresponding to complication 1672 for the calendar application, complication preview 1690 corresponding to complication 1674 for the health application, complication preview 1692 corresponding to complication 1676 for the fitness application, complication preview 1694 corresponding to complication 1678 for the time application, complication preview 1696 corresponding to complication 1680 for the weather application, complication preview 1698 corresponding to complication 1682 for the weather application, and complication preview 1699 corresponding to complication 1684 for the calendar application.
- device 600 receives (e.g., detects) an input 1625 directed to selecting complication preview 1688 corresponding to complication 1672 for the calendar application.
- input 1625 is a touch input on display 602 .
- device 600 displays a sixth region 1697 of complication selection user interface 1642 corresponding to a complication group for the calendar application, as shown in FIG. 16R , where sixth region 1697 of complication selection user interface 1642 includes a complication preview 1697 A.
- complication selection user interface 1642 includes complication preview 1697 A in a first shape (e.g., a first layout; a first design; a first outline) that corresponds to how the corresponding complication will be displayed if applied to watch user interface 1668 at the location within watch user interface 1668 corresponding to the current location of complication 1672 .
- a respective complication preview shown in the complication selection user interface includes a graphical representation of the corresponding respective complication in the first shape.
- the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface for which the corresponding respective complication is to be used.
- the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complications within the respective watch user interface for which the respective complication is being used.
- device 600 receives (e.g., detects) an input 1627 directed to selecting complication preview 1698 corresponding to complication 1682 for the weather application.
- input 1627 is a touch input on display 602 .
- device 600 displays a region 1693 of complication selection user interface 1642 corresponding to a complication group for the weather application, as shown in FIG. 16S , where region 1693 of complication selection user interface 1642 includes complication previews 1693 A- 1693 D.
- complication selection user interface 1642 includes complication previews 1693 A- 1693 D in a second shape (e.g., a second layout; a second design; a second outline) that corresponds to how the corresponding complication will be displayed if applied to watch user interface 1668 at the location within watch user interface 1668 corresponding to the current location of complication 1682 .
- corresponding respective complication previews shown in the complication selection user interface include respective graphical representations of the corresponding respective complications in the second shape, different from the first shape.
- the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface for which the corresponding respective complication is to be used.
- the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complications within the respective watch user interface for which the respective complication is being used.
- device 600 receives (e.g., detects) an input 1631 directed to selecting complication preview 1693 C.
- input 1631 is a touch input on display 602 .
- device 600 visually indicates that complication preview 1693 C has been selected, as shown in FIG. 16T (e.g., complication preview 1693 C is outlined, highlighted, etc. compared to other complication previews to visually distinguish complication preview 1693 C from other complication previews).
- device 600 receives (e.g., detects) an input 1633 directed to affordance 1650 for exiting complication selection user interface 1642 with the newly-selected settings.
- input 1633 is a touch input on display 602 .
- device 600 displays fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1668 ) of editing user interface 1622 , as shown in FIG. 16U , where complication preview 1698 for watch user interface 1668 now corresponds to complication preview 1693 C selected in FIGS. 16S-16T .
- device 600 receives (e.g., detects) an input 1635 directed to changing the current page of editing user interface 1622 to third page 1630 (e.g., an editing mode for changing a color of watch user interface 1668 ).
- input 1635 includes a gesture (e.g., a horizontal swipe on display 602 ; a rotational input on rotatable input mechanism 603 ).
- device 600 displays third page 1630 of editing user interface 1622 including representation 1691 of a layout of watch user interface 1668 , as shown in FIG. 16V .
- third page 1630 of editing user interface 1622 includes a navigable (e.g., scrollable) user interface element 1689 that includes a plurality of selectable colors (e.g., to be used as a background color for watch user interface 1668 ; to be applied as a color scheme to watch user interface 1668 ).
- user interface element 1689 includes a color wheel with colors represented in selectable circles.
- device 600 receives (e.g., detects) an input 1637 .
- input 1637 is a rotational input on rotatable input mechanism 603 shown in FIG. 16V .
- input 1637 is a touch input such as a swipe or pinch input.
- device 600 In response to (e.g., and while) receiving input 1637 , device 600 navigates through the plurality of selectable colors in user interface element 1689 . In some embodiments, as the plurality of selectable colors are being navigated via user interface element 1689 , device 600 indicates (e.g., by highlighting; by bolding; by visually emphasizing) the currently-selected color.
- device 600 in response to receiving input 1637 , navigates through the plurality of selectable colors in user interface element 1689 to an end (e.g., top or bottom) of user interface element 1689 , as shown in FIG. 16W .
- user interface element 1689 includes, at the end of user interface element 1689 , an indication 1687 that more colors are available for selection.
- device 600 in response to reaching the end of user interface element 1689 , displays an affordance 1685 that, when selected, causes display of the additional selectable colors.
- device 600 receives (e.g., detects) an input 1639 directed to affordance 1685 .
- input 1639 is a touch input on display 602 .
- device 600 displays an additional color selection user interface 1683 that includes one or more groups (e.g., group 1681 ) of additional selectable colors (e.g., group 1681 including at least additional selectable colors 1681 A- 1681 D), as shown in FIG. 16X .
- additional color selection user interface 1683 can be navigated (e.g., scrolled) for more groups of additional selectable colors.
- a group of colors includes similar colors (e.g., a similar range of colors; colors of a common shade or theme). In some embodiments, a group of colors includes colors from a common period (e.g., a particular season of a particular year). In some embodiments, the plurality of selectable colors included in user interface element 1689 corresponds to common colors and/or frequently used colors. In some embodiments, the plurality of additional selectable colors included in additional color selection user interface 1683 corresponds to less-common colors and/or less-frequently used colors.
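The two-tier color picker described in FIGS. 16V-16X can be summarized as a primary wheel of common/frequently used colors plus grouped, less-common collections (e.g., seasonal groups) behind an affordance shown at the end of the wheel. The sketch below is illustrative; the type and property names are assumptions.

```swift
import Foundation

struct ColorGroup { let title: String; let colors: [String] }

struct ColorPicker {
    let primaryColors: [String]         // shown in the scrollable wheel
    let additionalGroups: [ColorGroup]  // behind the "more colors" affordance

    /// True when the user has navigated to the end of the wheel and the
    /// affordance revealing the additional color groups should be shown.
    func showsMoreColorsAffordance(selectedIndex: Int) -> Bool {
        selectedIndex == primaryColors.count - 1
    }
}

let picker = ColorPicker(
    primaryColors: ["red", "orange", "yellow", "green", "blue"],
    additionalGroups: [ColorGroup(title: "Fall 2020", colors: ["ochre", "rust"])]
)
print(picker.showsMoreColorsAffordance(selectedIndex: 4))  // true
```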
- FIG. 16Y illustrates a second device 600 B (e.g., a smartphone) displaying, via a display 602 B, a first user interface 1679 of a companion application.
- device 600 B is paired with device 600 .
- the companion application on device 600 B can be used to edit, configure, and/or modify settings or features of device 600 and/or applications that are installed on device 600 .
- first user interface 1679 includes a watch user interface representation 1677 corresponding to a representation of a watch user interface (e.g., watch user interface 1668 ; a watch user interface that is currently selected to be used on device 600 ).
- first user interface 1679 includes a colors region 1675 that includes a plurality of selectable colors that can be applied to the watch user interface (e.g., as a background color or for a color scheme). Similar to third page 1630 of editing user interface 1622 of device 600 , a color can be selected from color region 1675 to be applied to the watch user interface.
- first user interface 1679 includes a complications region 1673 that indicates and enables changes to the current complications that are selected for the watch user interface.
- FIG. 16Z illustrates device 600 B displaying, via display 602 B, a second user interface 1671 of the companion application, where second user interface 1671 includes a selectable user interface element 1669 for managing/editing a color(s) of the watch user interface.
- device 600 B receives (e.g., detects) an input 1641 directed to user interface element 1669 .
- input 1641 is a touch input on display 602 B.
- device 600 B displays, via display 602 B, an additional color selection user interface 1667 of the companion application, as shown in FIG. 16AA .
- additional color selection user interface 1667 of FIG. 16AA includes one or more groups (e.g., groups 1665 and 1663 ) of additional selectable colors (e.g., group 1665 including additional selectable colors 1665 A- 1665 F and group 1663 including at least additional selectable colors 1663 A- 1663 D), as shown in FIG. 16AA .
- additional color selection user interface 1667 can be navigated (e.g., scrolled) for more groups of additional selectable colors.
- the plurality of selectable colors included in color region 1675 of first user interface 1679 of the companion application corresponds to common colors and/or frequently used colors.
- the plurality of additional selectable colors included in additional color selection user interface 1667 of the companion application corresponds to less-common colors and/or less-frequently used colors.
- FIGS. 16AB-16AE illustrate device 600 displaying, in region 1693 of complication selection user interface 1642 , the complication previews 1693 A- 1693 D for respective corresponding complications of the weather application, where the shape of each respective complication preview is automatically adjusted or modified.
- complication previews 1693 A- 1693 D corresponding to complications of the weather application are displayed with a first shape (e.g., a first layout; a first design; a first type of outline).
- complication previews 1693 A- 1693 D in the first shape correspond to a first complication region (e.g., the top-left-corner region, thus being the top-left-corner complication) of watch user interface 1668 .
- complication previews 1693 A- 1693 D corresponding to complications of the weather application are displayed, in complication selection user interface 1642 , with a second shape.
- complication previews 1693 A- 1693 D in the second shape, as in FIG. 16AC correspond to a second complication region (e.g., the top-right-corner region, thus being the top-right-corner complication) of watch user interface 1668 .
- complication preview 1693 B corresponding to a complication of the weather application is displayed, in complication selection user interface 1642 , with a third shape.
- complication preview 1693 B in the third shape corresponds to a third complication region (e.g., the top-bezel region, thus being the top-bezel complication) of watch user interface 1668 .
- complication previews 1693 C- 1693 D corresponding to complications of the weather application are displayed, in complication selection user interface 1642 , with a fourth shape.
- complication previews 1693 C- 1693 D in the fourth shape correspond to a fourth complication region (e.g., one of the (e.g., 4 possible) inner-dial regions, thus being one of the inner-dial complications) of watch user interface 1668 .
- complication previews 1693 A- 1693 D corresponding to complications of the weather application are displayed, in complication selection user interface 1642 , with a fifth shape.
- complication previews 1693 A- 1693 D in the fifth shape, as in FIG. 16AE , correspond to the fourth complication region (e.g., one of the inner-dial regions) of watch user interface 1668 .
- the same complication for the same application can be included in a respective watch user interface with different shapes based on the type of the respective watch user interface and/or the respective complication region within the respective watch user interface for which the complication is being used.
- the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface (e.g., watch user interface 1668 ) for which the corresponding respective complication is to be used.
- the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complications within the respective watch user interface for which the respective complication is being used.
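The region-dependent preview shapes of FIGS. 16AB-16AE amount to a mapping from the target complication region to a preview shape. A sketch with hypothetical region and shape names (the disclosure does not name these):

```swift
import Foundation

/// The same complication can take different shapes depending on which
/// complication region of the watch face it will occupy; previews in the
/// selection interface adopt the shape of the target region.
enum ComplicationRegion { case topLeftCorner, topRightCorner, topBezel, innerDial }
enum PreviewShape { case cornerArc, bezelArc, circularDial }

func previewShape(for region: ComplicationRegion) -> PreviewShape {
    switch region {
    case .topLeftCorner, .topRightCorner: return .cornerArc
    case .topBezel:                       return .bezelArc
    case .innerDial:                      return .circularDial
    }
}

print(previewShape(for: .topBezel))  // bezelArc
```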
- FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments.
- Method 1700 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone).
- Some operations in method 1700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 1700 provides an intuitive way for managing user interfaces related to time.
- the method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface.
- the computer system (e.g., 600 ) displays or causes display of the watch user interface (e.g., 1606 , 1668 ).
- the watch user interface includes a dial that indicates a current time.
- the watch user interface includes one or more complications (e.g., 1608 , 1610 , 1612 , 1614 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 , 1684 ) corresponding to respective applications that indicate respective sets of information (e.g., a date; a calendar event; weather; contacts).
- the complications are displayed at respective complication regions within the watch user interface.
- while displaying the watch user interface (e.g., 1606 , 1668 ), the computer system (e.g., 600 ) detects an input (e.g., 1601 , 1603 ) (e.g., a press input; a press-and-hold input) on the watch user interface. In some embodiments, in response to detecting the input on the watch user interface, the computer system displays or causes display of the watch face editing user interface (e.g., 1622 ).
- the computer system displays ( 1702 ), via the display generation component (e.g., 602 ), a watch face editing user interface (e.g., 1622 ), wherein the watch face editing user interface includes a representation of a layout of a watch user interface (e.g., 1624 ) (e.g., a watch face; a user interface for a watch that includes an indication of a time and/or date) including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface.
- a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications).
- complications provide data obtained from an application.
- a complication includes an affordance that, when selected, launches a corresponding application.
- a complication is displayed at a fixed, predefined location on the display.
- complications occupy respective locations at particular regions of a watch face (e.g., lower-right, lower-left, upper-right, and/or upper-left).
- the computer system detects ( 1706 ), via the one or more input devices, a first input (e.g., 1611 , 1617 ) (e.g., a first user selection) directed to a complication region of the one or more complication regions (e.g., regions corresponding to complications 1608 , 1610 , 1612 , 1614 ; regions corresponding to complications 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 , 1684 ) (e.g., a corner region (e.g., top-left, top-right, bottom-left, bottom-right); a bezel region).
- In response to detecting the first input (e.g., 1611 , 1617 ) directed to the complication region of the one or more complication regions ( 1708 ), the computer system (e.g., 600 ) displays ( 1710 ) a complication selection user interface (e.g., 1642 ).
- Displaying the complication selection user interface includes ( 1710 ) concurrently displaying an indication (e.g., label/header of region 1644 , 1658 , 1662 , 1664 , 1666 ) of (e.g., the name of; a graphical indication of; an icon corresponding to; a category of) a first application (e.g., an application that is installed on, can be launched on, and/or is accessible from the computer system) ( 1712 ), a first complication preview (e.g., 1644 A- 1644 E) (e.g., a graphical preview of how the first complication would be displayed in the watch user interface) corresponding to a first complication that is configured to display, on the watch user interface (e.g., 1606 , 1668 ), a first set of information obtained from the first application (e.g., information based on a feature, operation, and/or characteristic of the first application), wherein the first complication preview includes a graphical representation of the first complication ( 1714 ), and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information obtained from the first application ( 1716 ).
- Displaying the complication selection user interface that includes the indication of the first application, the first complication preview, and the second complication preview enables a user to quickly and easily recognize that the first and second complication previews correspond to complications related to the first application, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to view related/associated items in the user interface together without needing to navigate to other portions of the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
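To make the grouping concrete, here is a minimal Swift sketch of the data the complication selection user interface could display: an application indication acting as a header for the complication previews obtained from that application. The structure and names are assumptions for illustration only.

```swift
// Hypothetical sketch of one group in the complication selection user
// interface: an application indication (header) plus its previews.

struct ComplicationPreview {
    let name: String        // e.g. "Temperature"
    let sampleValue: String // stand-in for the rendered graphical preview
}

struct ComplicationGroup {
    let applicationName: String        // the indication/header for the group
    let previews: [ComplicationPreview]
}

let weatherGroup = ComplicationGroup(
    applicationName: "Weather",
    previews: [
        ComplicationPreview(name: "Conditions", sampleValue: "Partly Cloudy"),
        ComplicationPreview(name: "Temperature", sampleValue: "72°"),
    ]
)
```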
- the computer system detects ( 1720 ), via the one or more input devices (e.g., via a rotatable input device (e.g., 603 ); via a touch-sensitive surface), a second input (e.g., 1613 ) directed to selecting a respective complication preview (e.g., 1644 A- 1644 E).
- In response to detecting the second input (e.g., 1613 ) directed to selecting the respective complication preview (e.g., 1644 A- 1644 E) ( 1722 ), the computer system (e.g., 600 ) displays ( 1724 ), via the display generation component (e.g., 602 ), a representation of the watch user interface (e.g., as shown in FIGS. 16F and 16Q ) with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface (e.g., 1606 , 1668 ).
- the first complication is displayed in the first complication region of the watch user interface (e.g., 1606 , 1668 ) ( 1726 ).
- the second complication is displayed in the first complication region of the watch user interface (e.g., 1606 , 1668 ) ( 1728 ).
- Displaying (e.g., automatically, without user input) a respective complication in a respective complication region of the watch user interface based on the selected complication preview enables a user to conveniently and efficiently manage and change complications of the watch user interface.
- Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device.
- while displaying the complication selection user interface (e.g., 1642 ) ( 1730 ), the computer system (e.g., 600 ) detects ( 1732 ), via the one or more input devices (e.g., via a rotatable input device; via a touch-sensitive surface), a third input (e.g., 1621 ) (e.g., a rotational input on the rotatable input device (e.g., 603 ); a touch scrolling input on the touch-sensitive surface).
- in response to detecting the third input ( 1734 ), the computer system navigates (e.g., scrolls) through the complication selection user interface ( 1736 ).
- navigating (e.g., scrolling) through the complication selection user interface (e.g., 1642 ) includes ( 1736 ) concurrently displaying an indication of (e.g., the name of; a graphical indication of; an icon corresponding to; a category of) a second application (e.g., an application that is installed on, can be launched on, and/or is accessible from the computer system) ( 1738 ), a third complication preview (e.g., 1634 , 1636 , 1638 , 1640 , 1686 , 1688 , 1690 , 1692 , 1694 , 1696 , 1698 , 1699 ) (e.g., a graphical preview of how the third complication would be displayed in the watch user interface) corresponding to a third complication that is configured to display, on the watch user interface (e.g., 1606 , 1668 ), a third set of information obtained from the second application (e.g., information based on a feature, operation, and/or characteristic of the second application) ( 1740 ), and a fourth complication preview corresponding to a fourth complication that is configured to display, on the watch user interface, a fourth set of information obtained from the second application ( 1742 ).
- Displaying the indication of the second application, the third complication preview, and the fourth complication preview (e.g., together in the same region of the complication selection user interface; together as a group of complications corresponding to the second application) in accordance with navigating (e.g., scrolling) through the complication selection user interface provides easy and efficient access to different complications that are available for selection, as related complications (complications corresponding to the same application) are grouped together within the complication selection user interface.
- Providing improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to view related/associated items in the user interface together without needing to navigate to other portions of the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- navigating (e.g., scrolling) through the complication selection user interface (e.g., 1642 ) further includes ceasing display of the first complication preview (e.g., 1634 , 1636 , 1638 , 1640 , 1686 , 1688 , 1690 , 1692 , 1694 , 1696 , 1698 , 1699 ) corresponding to the first complication and the second complication preview (e.g., 1634 , 1636 , 1638 , 1640 , 1686 , 1688 , 1690 , 1692 , 1694 , 1696 , 1698 , 1699 ) corresponding to the second complication (e.g., and other complication previews corresponding to respective complications that are configured to display, on the watch user interface (e.g., 1606 , 1668 ) (e.g., watch face), a respective set of information obtained from the first application) ( 1744 ).
- ceasing display of the first complication preview and the second complication preview comprises moving the first complication preview and the second complication preview off of an edge of the display generation component as the complication selection user interface is navigated (e.g., scrolled).
- the indication of the first application, the first complication preview (e.g., 1634 , 1636 , 1638 , 1640 , 1686 , 1688 , 1690 , 1692 , 1694 , 1696 , 1698 , 1699 ), and the second complication preview (e.g., 1634 , 1636 , 1638 , 1640 , 1686 , 1688 , 1690 , 1692 , 1694 , 1696 , 1698 , 1699 ) are displayed in (e.g., grouped together in) a first region (e.g., 1644 , 1658 , 1662 , 1664 , 1666 ) of the complication selection user interface (e.g., 1642 ) (e.g., where the indication of the first application is a header/label for the group), and the indication of the second application, the third complication preview, and the fourth complication preview are displayed in (e.g., grouped together in) a second region of the complication selection user interface, different from the first region.
- Displaying the indication of the first application, the first complication preview, and the second complication preview together in the first region of the complication selection user interface and displaying the indication of the second application, the third complication preview, and the fourth complication preview together in the second region of the complication selection user interface enable a user to view and select from the available complications in an intuitive manner.
- Providing additional control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the first application is associated with a plurality of available complications (e.g., 1608 , 1610 , 1612 , 1614 , 1670 , 1672 , 1674 , 1676 , 1678 , 1680 , 1682 , 1684 ) that are configured to display information obtained from the first application, and the plurality of available complications include the first complication and the second complication.
- displaying the complication selection user interface (e.g., 1642 ) includes, in accordance with a determination that the plurality of available complications that are configured to display information obtained from the first application exceeds a predetermined number (e.g., 5, 6), displaying a plurality of complication previews (e.g., the plurality of complication previews includes a number of complication previews that equals the predetermined number) that each correspond to a complication of the plurality of available complications, where the plurality of complication previews does not exceed the predetermined number, and a first selectable user interface object (e.g., 1648 , 1660 ) (e.g., a first affordance; a “show more” icon/button) that, when selected, causes display of one or more additional complication previews (e.g., 1656 A) that were not included in the plurality of complication previews (e.g., the one or more additional complication previews correspond to complications of the plurality of available complications that were not represented in the displayed plurality of complication previews).
- displaying the complication selection user interface includes, in accordance with a determination that the plurality of available complications that are configured to display information obtained from the first application does not exceed the predetermined number, displaying a second plurality of complication previews (e.g., the second plurality of complication previews includes complication previews for all of the available complications that are configured to display information obtained from the first application) that each correspond to a complication of the plurality of available complications, without displaying the first selectable user interface object.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
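A minimal sketch of the truncation rule described above, assuming a hypothetical limit and helper names: when an application offers more complications than the predetermined number, only that many previews are shown together with a "show more" affordance; otherwise all previews are shown and the affordance is omitted.

```swift
// Hypothetical sketch: cap the number of previews per application and add
// a "show more" affordance only when previews were held back.

func visiblePreviews(from all: [String],
                     predeterminedNumber: Int = 5) -> (shown: [String], showMore: Bool) {
    if all.count > predeterminedNumber {
        return (Array(all.prefix(predeterminedNumber)), true)
    }
    return (all, false) // few enough: show everything, no affordance
}

let (shown, needsShowMore) = visiblePreviews(
    from: ["Conditions", "Temperature", "UV Index", "Wind", "Humidity", "Rain"])
// shown.count == 5, needsShowMore == true
```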
- the first application corresponds to an application (e.g., a contactable users application) for managing information of a set of contactable users (e.g., user contacts stored in and/or accessible on the computer system (e.g., 600 ); user contacts stored in and/or accessible from an address book), the first complication (e.g., 1608 ) corresponds to a first contactable user of the set of contactable users, and the second complication corresponds to a second contactable user of the set of contactable users.
- the first complication preview and the second complication preview are displayed in an order (e.g., a predetermined order; a selected order).
- displaying the complication selection user interface includes, in accordance with a determination that the first contactable user is a user of a first type (e.g., a candidate contact; a favorite contact; a frequent contact) and that the second contactable user is not a user of the first type, displaying the first complication preview prior to the second complication preview in the order.
- displaying the complication selection user interface includes, in accordance with a determination that the first contactable user is not a user of the first type and that the second contactable user is a user of the first type, displaying the second complication preview prior to the first complication preview in the order.
- Displaying a complication preview corresponding to a candidate contact prior to displaying a complication preview corresponding to a non-candidate contact in the complication selection user interface provides a user with quicker and easier access to a respective complication preview corresponding to a candidate contact when navigating the complication selection user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- when the first application is the contactable users application, the computer system (e.g., 600 ) displays up to a maximum number of complication previews for the contactable users application in the complication selection user interface (e.g., 1642 ).
- all of the maximum number of complication previews that are shown correspond to candidate contacts (e.g., listed in alphabetical order).
- the candidate contacts are shown first (e.g., in alphabetical order) and regular contacts are shown for the remaining complication previews (e.g., separately in alphabetical order).
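The ordering rule for contact complications could be sketched as below; the candidate-first, alphabetical, capped behavior follows the description above, while the concrete types and the `isCandidate` flag are assumptions.

```swift
// Hypothetical sketch: candidate (e.g., favorite/frequent) contacts first,
// each group in alphabetical order, capped at a maximum number of previews.

struct Contact {
    let name: String
    let isCandidate: Bool // e.g., a favorite or frequently used contact
}

func orderedContactPreviews(_ contacts: [Contact], maximum: Int) -> [Contact] {
    let candidates = contacts.filter { $0.isCandidate }
                             .sorted { $0.name < $1.name }
    let regular = contacts.filter { !$0.isCandidate }
                          .sorted { $0.name < $1.name }
    return Array((candidates + regular).prefix(maximum))
}
```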
- the first complication preview includes the graphical representation of the first complication in a first shape (e.g., 1693 A- 1693 D in FIG. 16AB ) (e.g., a first layout; a first design; a first outline) and the second complication preview includes the graphical representation of the second complication in the first shape.
- the first complication preview includes the graphical representation of the first complication in a second shape (e.g., 1693 A- 1693 D in FIG. 16AC ) (e.g., a second layout; a second design; a second outline) and the second complication preview includes the graphical representation of the second complication in the second shape, wherein the second shape is different from the first shape.
- the type of shape (e.g., layout; design; outline) for complication previews is (e.g., at least partly) determined based on the layout, design, and/or configuration of the watch face for which the corresponding complications are to be used.
- Displaying complication previews that include graphical representations of a respective complication in a respective shape, where the type of the respective shape is at least partly determined based on the layout, design, and/or configuration of the current watch user interface, enables a user to conveniently preview, before selecting a particular complication for use, how a respective complication would appear when used in the watch user interface.
- Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- in accordance with a determination that the complication region (selected via the first input) of the one or more complication regions corresponds to a first complication region, the first complication preview includes the graphical representation of the first complication in a third shape (e.g., 1693 B in FIG. 16AD ) (e.g., a third layout; a third design; a third outline) and the second complication preview includes the graphical representation of the second complication in the third shape.
- in accordance with a determination that the complication region of the one or more complication regions corresponds to a second complication region different from the first complication region, the first complication preview includes the graphical representation of the first complication in a fourth shape (e.g., 1693 A- 1693 D in FIG. 16AE ) (e.g., a fourth layout; a fourth design; a fourth outline) and the second complication preview includes the graphical representation of the second complication in the fourth shape, wherein the fourth shape is different from the third shape.
- the type of shape (e.g., layout; design; outline) for complication previews is (e.g., at least partly) determined based on the respective complication region of the one or more complications within a watch face for which the respective complication is being used.
- displaying the complication selection user interface further includes displaying the indication of the first application prior to (e.g., above; as a header) the first complication preview and the second complication preview (e.g., prior to all complication previews that are associated with the first application).
- the indication of the first application is indicative of (e.g., represents; is the name for; is the header for) a complication preview group comprising the first complication preview and the second complication preview.
- Displaying the indication of the first application prior to the first complication preview and the second complication preview enables a user to quickly and easily recognize the corresponding application for the displayed complications, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize and categorize the displayed complications) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- while displaying the watch face editing user interface (e.g., 1622 ), the computer system (e.g., 600 ) displays, via the display generation component (e.g., 602 ) (e.g., at a top region of the display generation component), an indication (e.g., “DIAL” or “COLOR” in FIGS. 16V and 16W ; an indication of a color editing user interface; an indication of a dial editing user interface) of an adjacent editing tab corresponding to an adjacent user interface that is different from a user interface for editing one or more complications of the watch user interface.
- the adjacent editing user interface, different from the watch face editing user interface, is configured to edit an aspect/characteristic of the watch face other than the complications of the watch face.
- while displaying the watch face editing user interface, the computer system detects, via the one or more input devices, a fourth input (e.g., a swipe input detected via a touch-sensitive surface that is integrated with the display generation component) directed to navigating to a different editing tab.
- in response to detecting the fourth input, the computer system displays, via the display generation component, the adjacent user interface for editing a characteristic (e.g., a different aspect; a different feature) of the watch user interface different from the one or more complications of the watch user interface.
- Providing, in the watch face editing user interface, adjacent editing tabs for editing different aspects/characteristics of the watch user interface enables a user to quickly and easily access the other editing tabs for editing the different aspects/characteristics (e.g., without needing to exit the watch face editing user interface).
- Providing improved control options and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
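The adjacent-tab navigation could be modeled as below, assuming a hypothetical set of editing tabs (the document mentions style, dial, color, and complication tabs) where a swipe moves one tab left or right and is clamped at the ends; all names are illustrative.

```swift
// Hypothetical sketch of adjacent editing tabs in watch face editing mode.

enum EditingTab: Int, CaseIterable { case style, dial, color, complications }

struct EditingMode {
    var current: EditingTab = .complications

    // A horizontal swipe moves one tab in either direction; swiping past
    // the first or last tab has no effect.
    mutating func swipe(by delta: Int) {
        let clamped = min(max(current.rawValue + delta, 0),
                          EditingTab.allCases.count - 1)
        current = EditingTab(rawValue: clamped)!
    }
}

var mode = EditingMode()
mode.swipe(by: -1) // from the complications tab to the adjacent color tab
// mode.current == .color
```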
- the computer system displays, via the display generation component (e.g., 602 ), a color editing user interface (e.g., 1630 ) (e.g., different from the watch face editing user interface).
- the color editing user interface can be accessed via one or more swipe inputs from the watch face editing user interface (e.g., 1622 ) (e.g., the watch face editing user interface and color editing user interface are different tabs within a watch face editing mode).
- the color editing user interface is accessed while the computer system is in watch face editing mode.
- the color editing user interface is a tab within a plurality of (e.g., adjacent) tabs (e.g., style tab; dial tab; color tab; complication tab) that can be accessed while the computer system is in watch face editing mode.
- the color editing user interface can be accessed via a companion application on a second computer system (e.g., a second electronic device, such as a smartphone) that is paired with the computer system.
- Providing the color editing user interface that can be accessed via one or more swipe inputs from the watch face editing user interface provides quick and easy access for editing colors of a current watch user interface that is being edited (e.g., without needing to exit the watch face editing user interface).
- Providing improved control options and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- the color editing user interface (e.g., 1630 ) includes the representation of the layout of the watch user interface (e.g., 1624 ) displayed in a first color scheme based on a first color, and a first plurality of selectable colors (e.g., 1689 ) (e.g., displayed as navigable list of colors, with each color represented in a selectable circle) for the watch user interface (e.g., 1606 , 1668 ) (e.g., a watch face), including the first color.
- the color editing user interface is used to edit/modify a color/color scheme of (e.g., the background of) the layout of the watch user interface.
- the first color is the currently-selected color.
- if the first color is the currently-selected color, the computer system (e.g., 600 ) indicates (e.g., by highlighting; by bolding; by visually emphasizing), in the first plurality of colors, that the first color is the currently-selected color.
- the computer system detects, via the one or more input devices (e.g., via a rotatable input device (e.g., 603 ); via a touch-sensitive surface), a fifth input (e.g., 1637 ) (e.g., a rotational input on the rotatable input device; a touch scrolling input on the touch-sensitive surface) directed to navigating (e.g., scrolling) through the first plurality of selectable colors (e.g., 1689 ).
- Enabling the plurality of selectable colors to be navigated (e.g., scrolled) via a rotational input on a rotatable input device provides an intuitive method for navigating through and selecting from the plurality of selectable colors.
- Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- in response to detecting the fifth input (e.g., 1637 ), the computer system navigates (e.g., scrolls) through the first plurality of colors (e.g., 1689 ) from the first color to a second color different from the first color.
- the computer system also indicates (e.g., by highlighting; by bolding; by visually emphasizing), in the first plurality of colors, that the second color is now the currently-selected color.
- in response to detecting the fifth input, the computer system displays the representation of the layout of the watch user interface (e.g., 1624 ) in a second color scheme based on the second color.
- Providing a color editing user interface that includes the representation of the layout of the watch user interface, where the displayed presentation of the layout of the watch user interface is adjusted based on a selected color scheme from the color editing user interface enables a quick and easy method for editing the color scheme of the current watch user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
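A compact sketch of the interaction just described, under the assumption that one detent of the rotatable input advances the selection by one color and that re-rendering the layout representation is abstracted into a computed property; all names are hypothetical.

```swift
// Hypothetical sketch: rotational input navigates the selectable colors,
// and the layout representation is redrawn in a scheme based on the
// currently selected color.

struct ColorEditor {
    let colors: [String] // e.g. the first plurality of selectable colors
    var selected = 0     // index of the currently-selected (highlighted) color

    mutating func rotate(by detents: Int) {
        selected = min(max(selected + detents, 0), colors.count - 1)
    }

    // Stand-in for displaying the representation of the layout in a color
    // scheme based on the selected color.
    var currentScheme: String { "color scheme based on \(colors[selected])" }
}

var editor = ColorEditor(colors: ["Black", "Blue", "Red", "Green"])
editor.rotate(by: 2)
// editor.currentScheme == "color scheme based on Red"
```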
- subsequent to detecting the fifth input (e.g., 1637 ), the computer system (e.g., 600 ) detects, via the one or more input devices (e.g., via a rotatable input device (e.g., 603 ); via a touch-sensitive surface), a sixth input (e.g., a continuation of the fifth input) directed to navigating (e.g., scrolling) through the first plurality of selectable colors (e.g., 1689 ).
- in response to detecting the sixth input, the computer system navigates (e.g., scrolls) through the first plurality of colors to display a second selectable user interface object (e.g., 1685 ) (e.g., a second affordance; a “show more” icon/button).
- the second selectable user interface object is displayed with (e.g., with the same shape/layout/design as) other colors in the first plurality of colors.
- the second selectable user interface object is displayed as the last color in the list of the first plurality of colors.
- the computer system detects, via the one or more input devices, an activation (e.g., selection) of the second selectable user interface object.
- in response to detecting the activation of the second selectable user interface object, the computer system displays, via the display generation component, a second plurality of selectable colors for the watch user interface that is different from the first plurality of selectable colors.
- the first plurality of colors include common colors and/or frequently used colors while the second plurality of colors include less-common colors and/or less-frequently used colors.
- Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
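The two-tier palette could be sketched as follows, where the second selectable user interface object is rendered like a color at the end of the common list and, once activated, swaps in the extended list; the palette contents and names are illustrative assumptions.

```swift
// Hypothetical sketch: a common palette ending in a "show more" entry, and
// an extended palette shown after that entry is activated.

enum PaletteEntry { case color(String), showMore }

func paletteEntries(common: [String], extended: [String],
                    expanded: Bool) -> [PaletteEntry] {
    if expanded {
        // After activating "show more": less common / less frequently used colors.
        return extended.map(PaletteEntry.color)
    }
    // The "show more" object is displayed like the other colors, as the
    // last entry in the list.
    return common.map(PaletteEntry.color) + [.showMore]
}
```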
- method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700 .
- a respective complication of a watch user interface as described with reference to FIGS. 6A-6H can be changed to a different complication via the process for managing complications described with reference to FIGS. 16A-16AE .
- method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700 .
- method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700 .
- one or more characteristics or features of a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE .
- method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700 .
- method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700 .
- a respective complication of a watch user interface with a background as described with reference to FIGS. 14A-14AD can be changed to a different complication via the process for managing complications described with reference to FIGS. 16A-16AE .
- these details are not repeated below.
- FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 19A-19C .
- watch face user interface 1800 includes graphical representation 1802 of a character (e.g., a first character in a set of characters configured to be displayed on watch face user interface 1800 ).
- watch face user interface 1800 includes time indicator 1804 and complication 1806 A (e.g., corresponding to a calendar application) and complication 1806 B (e.g., corresponding to weather application).
- Watch face user interface 1800 includes a default color (e.g., black) and background 1808 having colors that are different from the default color (e.g., colors displayed by electronic device 600 in accordance with user inputs while an editing user interface is displayed by electronic device 600 ).
- electronic device 600 detects user input 1850 A (e.g., a long press gesture) on watch face user interface 1800 .
- electronic device 600 displays user interface 1810 , as shown at FIG. 18B .
- user interface 1810 includes first representation 1800 A of watch face user interface 1800 (e.g., corresponding to set or collection of avatar characters configured to be sequentially displayed on watch face user interface 1800 ) and second representation 1800 B of an additional watch face user interface configured to be displayed by electronic device 600 (e.g., a watch face user interface corresponding to a set or collection of animal-like characters and/or emojis configured to be sequentially displayed on the watch face user interface).
- First representation 1800 A of watch face user interface 1800 includes graphical representations of multiple characters (e.g., a collection and/or a set of characters) configured to be displayed on watch face user interface 1800 (e.g., displayed sequentially based on electronic device 600 detecting a change in activity state and/or a user input), as indicated by the multiple characters included on first representation 1800 A.
- User interface 1810 includes watch face indicator 1812 that includes a name associated with watch face user interface 1800 (e.g., “Avatar”).
- User interface 1810 also includes share affordance 1814 and edit affordance 1816 .
- electronic device 600 detects user input 1850 B (e.g., a tap gesture) on share affordance 1814 .
- electronic device 600 displays sharing user interface 1818 , as shown at FIG. 18C .
- sharing user interface 1818 enables selection of a recipient for receiving information associated with watch face user interface 1800 .
- sharing user interface 1818 includes affordances 1820 A- 1820 C corresponding to respective recipients (e.g., contactable users, information for which is stored in electronic device 600 ) for receiving information associated with watch face user interface 1800 .
- while displaying sharing user interface 1818 including affordances 1820 A- 1820 C , electronic device 600 detects user input 1850 C (e.g., a tap gesture) corresponding to selection of affordance 1820 C corresponding to recipient Ann Smith or an external device associated with recipient Ann Smith.
- in response to detecting user input 1850 C , electronic device 600 displays messaging user interface 1822 of a messaging application of electronic device 600 , as shown at FIG. 18D .
- Messaging user interface 1822 includes message 1824 having representation 1826 of watch face user interface 1800 .
- Messaging user interface 1822 includes indicator 1828 that indicates the recipient (e.g., Ann Smith) of message 1824 .
- messaging user interface 1822 includes send affordance 1830 for initiating transmission of message 1824 .
- electronic device 600 detects user input 1850 D (e.g., a tap gesture) corresponding to selection of send affordance 1830 .
- electronic device 600 initiates a process for sending message 1824 to the selected recipient (e.g., external device 1832 (e.g., Ann's Watch)).
- Message 1824 includes representation 1826 of watch face user interface 1800 .
- electronic device 600 transmits data and/or information associated with watch face user interface 1800 to external device 1832 .
- electronic device 600 transmits information associated with a background of watch face user interface 1800 (e.g., color and/or size of background), a font of watch face user interface 1800 (e.g., a font for a date and/or time displayed by watch face user interface 1800 ), a position of a time indicator and/or complications of watch face user interface 1800 , applications corresponding to complications of watch face user interface 1800 , and/or customizations to complications of watch face user interface 1800 (e.g., colors and/or size of complications).
- electronic device 600 transmits information and/or data indicative of graphical representation 1802 of a character of watch face user interface 1800 .
- electronic device 600 transmits information and/or data indicative of (e.g., that defines) graphical representation 1802 of the character of watch face user interface 1800 when watch face user interface 1800 is configured to display a graphical representation of a single character without transitioning between display of graphical representations of multiple characters.
- Electronic device 600 forgoes transmission of information and/or data indicative of graphical representation 1802 of a character of watch face user interface 1800 when watch face user interface 1800 is configured to transition between display of respective graphical representations for multiple characters (e.g., a set of predetermined characters and/or a collection of predetermined characters).
- electronic device 600 transmits information associated with (e.g., that defines) a graphical representation of a character for watch face user interfaces that are configured to display a graphical representation of only a single character.
- electronic device 600 forgoes transmission of information associated with any graphical representation of any character for watch face user interfaces that transition between display of graphical representations of multiple characters (e.g., in response to detecting a change in activity state of electronic device 600 and/or in response to user input).
- while electronic device 600 transmits and/or forgoes transmission of information associated with graphical representations of characters based on a type of watch face user interface (e.g., a single character watch face user interface or a collection of characters watch face user interface), in some embodiments, electronic device 600 transmits other data associated with watch face user interface 1800 (e.g., information related to background, fonts, and/or complications) regardless of whether information associated with a graphical representation of a character is transmitted or not.
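The transmit/forgo behavior could be summarized in code as below: the payload always carries the general characteristics of the face, and carries character information only for a single-character face. The payload structure and field names are assumptions for illustration.

```swift
// Hypothetical sketch of building a share payload for a watch face.

struct CharacterRecipe { let features: [String: String] }

struct WatchFacePayload {
    let background: String
    let font: String
    let complications: [String]
    let character: CharacterRecipe? // nil for multi-character faces
}

func sharePayload(background: String, font: String, complications: [String],
                  characters: [CharacterRecipe]) -> WatchFacePayload {
    WatchFacePayload(
        background: background,
        font: font,
        complications: complications,
        // Transmit character information only when the face displays a
        // single character; forgo it for a collection of characters.
        character: characters.count == 1 ? characters.first : nil
    )
}
```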
- external device 1832 receives message conversation 1824 and representation 1826 of watch face user interface 1800 .
- external device 1832 displays message conversation 1824 and representation 1826 in a messaging user interface 1831 on display 1833 of external device 1832 . Since watch face user interface 1800 includes graphical representations of multiple characters (e.g., watch face user interface 1800 is configured to transition between display of graphical representations of characters included in a collection of characters), external device 1832 does not receive information related to graphical representation 1802 and/or graphical representations of other characters associated with watch face user interface 1800 .
- external device 1832 detects user input 1834 (e.g., a tap gesture) corresponding to selection of representation 1826 . In response to detecting user input 1834 , external device 1832 displays user interface 1836 , as shown at FIG. 18F .
- user interface 1836 includes representation 1838 , watch face indicator 1840 , and add watch face affordance 1842 .
- external device 1832 detects user input 1844 (e.g., a tap gesture) corresponding to selection of add watch face affordance 1842 .
- external device 1832 adds a new watch face user interface to a watch face library of external device 1832 and displays watch face user interface 1846 , as shown at FIG. 18G .
- Watch face user interface 1846 includes time indicator 1804 and complication 1806 A (e.g., corresponding to a calendar application) and complication 1806 B (e.g., corresponding to a weather application). Watch face user interface 1846 further includes a default color (e.g., black) and background 1808 having colors that are different from the default color (e.g., colors displayed by electronic device 600 in response to user inputs while an editing user interface is displayed by electronic device 600 ). As such, watch face user interface 1846 includes features that are the same as watch face user interface 1800 .
- time indicator 1804 and complication 1806 A and complication 1806 B of watch face user interface 1846 include a same position, font, and/or size as watch face user interface 1800 .
- background 1808 of watch face user interface 1846 includes a same color and/or size as watch face user interface 1800 .
- electronic device 600 transmits information related to watch face user interface 1800 to external device 1832 that is not indicative of graphical representation 1802 of watch face user interface 1800 . Because watch face user interface 1800 is associated with a collection of graphical representations of multiple characters, electronic device 600 forgoes transmission of information associated with graphical representation 1802 and information associated with any other graphical representations of characters associated with watch face user interface 1800 .
- watch face user interface 1846 includes graphical representation 1848 of a character that is different from graphical representation 1802 of the character of watch face user interface 1800 (e.g., since information defining the characters of watch face user interface 1800 is not provided).
- watch face user interface 1846 is associated with a collection of graphical representations of characters that are included and/or stored on external device 1832 , or stored in an account associated with external device 1832 .
- watch face user interface 1846 is associated with a collection of graphical representations of characters that are selected randomly from a library of characters (e.g., stored on external device 1832 and/or stored on another external device different from external device 1832 (e.g., a server)).
- electronic device 600 detects user input 1850 E (e.g., a swipe gesture) on user interface 1810 .
- electronic device 600 translates first representation 1800 A of watch face user interface 1800 and second representation 1800 B of a second watch face user interface in a direction corresponding to user input 1850 E, as shown at FIG. 18H .
- electronic device 600 displays third representation 1800 C associated with a third watch face user interface, different from watch face user interface 1800 and second watch face user interface.
- second representation 1800 B of the second watch face user interface includes multiple different characters (e.g., animal-like avatars and/or emojis) to indicate that the second watch face user interface associated with second representation 1800 B is configured to transition between display of graphical representations of multiple characters. Accordingly, in response to detecting user input corresponding to selection of share affordance 1814 , electronic device 600 initiates the process for transmitting data associated with the second watch face without including information associated with graphical representations of characters configured to be displayed on the second watch face user interface.
- electronic device 600 detects user input 1850 F (e.g., a swipe gesture) on user interface 1810 .
- electronic device 600 translates first representation 1800 A, second representation 1800 B, and third representation 1800 C in a direction associated with user input 1850 F, as shown at FIG. 18I .
- electronic device 600 ceases to display first representation 1800 A and displays fourth representation 1800 D associated with a fourth watch face user interface, different from watch face user interface 1800 , second watch face user interface, and third watch face user interface.
- third representation 1800 C includes a graphical representation of a single character, thereby indicating that the third watch face user interface is configured to display a graphical representation of only a single character (e.g., regardless of electronic device 600 detecting a change in activity state and/or a user input).
- electronic device 600 detects user input 1850 G (e.g., a tap gesture) corresponding to selection of share affordance 1814 (e.g., to share third watch face user interface).
- electronic device 600 initiates a process for sharing the third watch face user interface (e.g., because third representation 1800 C is designated, as indicated by being in a center position on user interface 1810 ).
- electronic device 600 displays sharing user interface 1818 in response to detecting user input 1850 G.
- electronic device 600 displays messaging user interface 1822 in response to detecting user input on an affordance associated with an external device of a recipient on sharing user interface 1818 .
- electronic device 600 displays messaging user interface 1822 in response to detecting user input corresponding to selection of send affordance 1830 .
- electronic device 600 initiates a process for transmitting information associated with the third watch face user interface (e.g., a background, a font, and/or complications) as well as information associated with (e.g., that defines) a graphical representation of the character of the third user interface.
- external device 1832 displays watch face user interface 1852 (e.g., in response to receiving the transmission from electronic device 600 and detecting user input corresponding to add watch face affordance 1842 ).
- watch face user interface 1852 includes graphical representation 1854 of a character that is the same character displayed on third representation 1800 C. Since third representation 1800 C corresponds to a watch face user interface of electronic device 600 that is configured to display a graphical representation of a single character, electronic device 600 transmits information corresponding to the graphical representation of the single character to external device 1832 .
- the information corresponding to the graphical representation of the single character includes a recipe that defines the graphical representation of the single character.
- the recipe of the graphical representation of the single character includes information related to features of the character, such as skin color, hair type, hair color, hair length, nose type, nose size, mouth type, mouth size, lip color, eye color, eye type, eye size, eyebrow color, eyebrow size, eyebrow type, and/or accessories of the character (e.g., headwear, eyewear, earrings, nose rings, etc.).
- the recipe of the graphical representation of the single character includes information related to animations that can be performed by the character either automatically (e.g., at predetermined intervals) and/or in response to user inputs.
- the information related to animations may be user defined (e.g., by a user of electronic device 600 ) such that the animations are specific to the character.
- the information corresponding to the graphical representation of the single character includes an image and/or a video of the graphical representation of the character.
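A character recipe of the kind described above might be represented as plain settings data rather than image or video data, e.g. as in the following sketch; every field name here is an assumption based on the features listed above.

```swift
// Hypothetical sketch of a character "recipe": feature settings plus
// optional user-defined animations, with no image or multimedia data.

struct AnimationSpec {
    let trigger: String // e.g. "predetermined interval" or "user input"
    let name: String    // e.g. "wink"
}

struct CharacterRecipe {
    var skinColor: String
    var hairType: String
    var hairColor: String
    var hairLength: String
    var eyeColor: String
    var accessories: [String]       // e.g. ["headwear", "eyewear", "earrings"]
    var animations: [AnimationSpec] // may be user defined, specific to the character
}
```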
- external device 1832 is configured to store and/or add graphical representation 1854 to a character library once watch face user interface 1852 is added to external device 1832 .
- external device 1832 is configured to edit the character associated with graphical representation 1854 after adding watch face user interface 1852 to external device 1832 and/or storing graphical representation 1854 to external device 1832 and/or to the character library of external device 1832 .
- FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments.
- Method 1900 is performed at a computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component (e.g., 602 ) (e.g., a display and/or a touchscreen).
- Some operations in method 1900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- method 1900 provides an intuitive way for sharing a configuration of a user interface with an external device.
- the method reduces the cognitive burden on a user for sharing a configuration of a user interface with an external device, thereby creating a more efficient human-machine interface.
- the computer system displays ( 1902 ), via the display generation component (e.g., 602 ), a representation (e.g., 1800 A- 1800 D) of a watch face user interface (e.g., 1800 ) (e.g., a watch face user interface that displays a single character without transitioning between multiple characters or a watch face user interface that transitions between display of multiple characters in a collection of characters) that is associated with one or more graphical representations (e.g., 1802 ) of respective characters (e.g., predetermined animated characters such as anthropomorphized animals, robots, or other objects or user-generated animated characters such as virtual avatars) (e.g., a recipe for a character that is included in the watch face user interface, the recipe including information related to features of the character, such as hair color, skin color, facial feature information, and/or accessory information) (e.g., a graphical representation of a single character when the watch face user interface is of the first type, or graphical representations of multiple characters when the watch face user interface is of the second type).
- while displaying the representation (e.g., 1800 A- 1800 D) of the watch face user interface (e.g., 1800 ), the computer system detects ( 1904 ) an input (e.g., 1850 A, 1850 B, 1850 C, and/or 1850 D) (e.g., a long press gesture on the display generation component, and optionally, a subsequent tap gesture on a share affordance and/or a contact displayed in response to the long press gesture) corresponding to a request to share the watch face user interface (e.g., 1800 ) with an external device (e.g., 1832 ).
- in response to detecting the input (e.g., 1850 A, 1850 B, 1850 C, and/or 1850 D), the computer system initiates ( 1906 ) a process for sharing the watch face user interface (e.g., 1800 ) with the external device (e.g., 1832 ) and, in accordance with a determination that the watch face user interface (e.g., 1800 ) is associated with less than a threshold number of graphical representations (e.g., 1802 ) of respective characters (e.g., less than two characters, a single character) (e.g., a first watch face that does not transition between multiple characters), the process ( 1908 ) for sharing the watch face user interface (e.g., 1800 ) with the external device (e.g., 1832 ) includes sharing one or more characteristics of the watch face user interface (e.g., 1800 ) (e.g., background color, date/time font, date/time size, date/time placement, and/or complications) and transmitting a representation of one or more of the one or more graphical representations of the respective characters associated with the watch face user interface.
- transmitting the representation of one or more of the one or more graphical representations of the respective characters associated with the watch face user interface includes sending data and/or information (e.g., without image data and/or multimedia data) associated with the one or more of the one or more graphical representations of the respective characters associated with the watch face user interface.
- transmitting the representation of one or more of the one or more graphical representations of the respective characters associated with the watch face includes sending image data and/or multimedia data (e.g., video data) associated with the one or more of the one or more graphical representations of the respective characters associated with the watch face user interface.
- in response to detecting the input (e.g., 1850 A, 1850 B, 1850 C, and/or 1850 D), the computer system initiates ( 1906 ) the process for sharing the watch face user interface (e.g., 1800 ) with the external device (e.g., 1832 ) and, in accordance with a determination that the watch face user interface (e.g., 1800 ) is associated with greater than or equal to the threshold number of graphical representations (e.g., 1802 ) of respective characters (e.g., a collection of characters, two or more characters) (e.g., a second watch face that transitions between display of characters sequentially, and optionally, the transition between characters is in response to meeting a transition criteria (e.g., inactivity of and/or an absence of user inputs detected by the computer system for a predetermined period of time)), the process ( 1910 ) for sharing the watch face user interface (e.g., 1800 ) with the external device (e.g., 1832 ) includes sharing the one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of the respective characters associated with the watch face user interface.
- Sharing one or more characteristics of the watch face user interface, with or without transmitting a representation of one or more graphical representations of respective characters associated with the watch face user interface depending on the number of graphical representations of respective characters associated with the watch face user interface, reduces the amount of data transmitted between the computer system and the external device.
- transmitting multiple representations of one or more graphical representations of respective characters associated with the watch face user interface consumes a relatively large amount of storage on the external device and/or a relatively large amount of processing power of the computer system.
- Reducing the size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
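To make the threshold branch concrete, the sketch below models the decision in Swift. Everything here is illustrative: the type names, the dictionary-based character recipes, and the hard-coded threshold of two (taken from the "less than two characters" example above) are assumptions, not API from the specification.

```swift
import Foundation

// Hypothetical payload types; the specification does not name a concrete API.
enum CharacterPayload {
    case recipe([String: String])   // compact character settings, no image data
    case none                       // receiver relies on characters it already stores
}

struct WatchFaceShare {
    let characteristics: [String: String]   // e.g., background color, time font/size
    let payload: CharacterPayload
}

// Decide what to transmit based on how many character representations the face uses.
func makeSharePayload(characteristics: [String: String],
                      recipes: [[String: String]],
                      threshold: Int = 2) -> WatchFaceShare {
    if recipes.count < threshold {
        // Below the threshold (e.g., a single custom avatar): send its recipe
        // so the receiving device can reconstruct the character.
        return WatchFaceShare(characteristics: characteristics,
                              payload: .recipe(recipes.first ?? [:]))
    } else {
        // At or above the threshold (a collection of predetermined characters):
        // share only face characteristics; no character data travels.
        return WatchFaceShare(characteristics: characteristics, payload: .none)
    }
}
```

In the at-or-above-threshold branch, the receiving device is expected to already store the predetermined characters, which is what makes omitting the character payload safe.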
- the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) includes transmitting information corresponding to one or more settings associated with characteristic features (e.g., settings set by a user of computer system that are associated with (e.g., define) visual characteristics of the respective character corresponding to the graphical representation) of the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) (e.g., without transmitting image data (e.g., an image file) and/or multimedia data (e.g., a video file) associated with the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface).
- Sharing settings associated with characteristic features of the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface without transmitting image data and/or multimedia data reduces the amount of data transmitted between the computer system and the external device. Reducing the size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
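A hedged illustration of why a settings-only payload is small: encoding a character as its selected options rather than as rendered media. The AvatarRecipe fields below are invented for the example.

```swift
import Foundation

// Hypothetical "recipe": the user's selections, not rendered pixels.
struct AvatarRecipe: Codable {
    var hairColor: String
    var skinTone: String
    var accessory: String?
}

let recipe = AvatarRecipe(hairColor: "auburn", skinTone: "medium", accessory: "round glasses")
let encoded = try! JSONEncoder().encode(recipe)
// A few dozen bytes of settings, versus kilobytes or more for image/video data.
print(encoded.count, String(data: encoded, encoding: .utf8) ?? "")
```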
- sharing the one or more characteristics of the watch face user interface (e.g., 1800 ) includes transmitting one or more graphical representation templates (e.g., blank and/or fillable graphical representations that do not correspond to the one or more graphical representations of respective characters associated with the watch face user interface) for one or more second graphical representations (e.g., 1848 ) of respective second characters, different from the one or more graphical representations (e.g., 1802 ) of respective characters of the watch face user interface (e.g., 1800 ) (e.g., the one or more second graphical representations of respective second characters are stored on external device).
- Sharing one or more graphical representation templates instead of sharing the representation of the one or more graphical representations of respective characters associated with the watch face user interface reduces the amount of data transmitted between the computer system and the external device. Reducing the size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
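One plausible reading of template sharing, sketched below: the sender transmits stable template identifiers, and the receiver resolves them against characters it already stores, so no character data crosses the wire. The identifier scheme is an assumption for illustration.

```swift
// Assumed model: templates are referenced by stable identifiers that the
// receiving device resolves against characters it already stores locally.
func resolveTemplates(ids: [String], localLibrary: [String: String]) -> [String] {
    // Each identifier maps to a stored character, so no character data is transmitted.
    ids.compactMap { localLibrary[$0] }
}

let library = ["robot": "Robot character", "octopus": "Octopus character"]
print(resolveTemplates(ids: ["octopus", "robot"], localLibrary: library))
// ["Octopus character", "Robot character"]
```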
- the computer system while displaying the representation (e.g., 1800 A- 1800 D) of the watch face user interface (e.g., 1800 ), detects ( 1912 ) a sequence of one or more inputs (e.g., 1850 A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface (e.g., 1800 ).
- the computer system in response to detecting the sequence of one or more inputs (e.g., 1850 A), displays ( 1914 ), via the display generation component (e.g., 602 ), a first user interface (e.g., 1810 ) for selecting between a first set of characters (e.g., 1800 A) that includes a plurality of user-customizable virtual avatars (e.g., a plurality of avatar-like emojis and/or the respective characters associated with the watch face user interface) and a graphical representation (e.g., 1800 B) of a second set of characters (e.g., a plurality of emojis of animal-like characters) that includes two or more predetermined characters that are not available in the first set of characters.
- the computer system while displaying the first user interface (e.g., 1810 ), detects ( 1916 ) (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) a third input corresponding to selection of the first set of characters (e.g., 1800 A) or the second set of characters (e.g., 1800 B).
- the computer system in accordance with (e.g., or in response to) a determination that the third input corresponds to selection of the first set of characters (e.g., 1800 A), displays ( 1918 ) the representation (e.g., 1800 A) of the watch face user interface (e.g., 1800 ) including a first graphical representation (e.g., 1802 ) of a currently selected character from the first set of characters.
- the computer system in accordance with (e.g., or in response to) a determination that the third input corresponds to selection of the second set of characters (e.g., 1800 B), displays ( 1920 ) the representation of the watch face user interface including a second graphical representation of a currently selected character from the second set of characters.
- Displaying the first user interface for selecting between the first set of characters and the second set of characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
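The selection between the two sets might be modeled as below; the enum cases and string-based characters are illustrative stand-ins for the user-customizable avatar set and the predetermined-character set.

```swift
// Illustrative only: two character sets, one user-customizable and one predetermined.
enum SelectedSet {
    case customAvatars([String])
    case predetermined([String])
}

// Return the representation shown on the face for the current selection.
func currentRepresentation(of selection: SelectedSet, index: Int) -> String? {
    switch selection {
    case .customAvatars(let avatars):
        return avatars.indices.contains(index) ? avatars[index] : nil
    case .predetermined(let characters):
        return characters.indices.contains(index) ? characters[index] : nil
    }
}

print(currentRepresentation(of: .predetermined(["Octopus", "Robot"]), index: 1) ?? "none")
// "Robot"
```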
- the computer system while displaying the representation (e.g., 1800 A- 1800 D) of the watch face user interface (e.g., 1800 ), detects ( 1922 ) a fourth input (e.g., 1850 A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface.
- the computer system (e.g., 100 , 300 , 500 , 600 ), after detecting the fourth input (e.g., 1850 A), displays ( 1924 ), via the display generation component (e.g., 602 ), a second user interface (e.g., 810 ) that includes a plurality of selectable characters (e.g., 1800 A- 1800 D) (e.g., including a plurality of animated (e.g., 3D) emojis of animal-like characters; a plurality of animated (e.g., 3D) avatar-like emojis).
- the plurality of selectable characters are displayed in a first tab or first screen of the second user interface.
- the plurality of selectable characters includes selectable sets of characters.
- the computer system (e.g., 100 , 300 , 500 , 600 ), while displaying the second user interface (e.g., 810 ), detects ( 1926 ) (e.g., via one or more input devices of the computer system, such as a touch-sensitive surface integrated with the display generation component) a selection of a character of the plurality of selectable characters.
- the computer system in accordance with (e.g., or in response to) detecting the selection of the character, updates ( 1928 ) the representation of the watch face user interface to include a third graphical representation of the selected character (e.g., a graphical representation of a single character corresponding to the selected character and/or a graphical representation of a currently selected character from a selected set of characters).
- Displaying the second user interface for selecting between a plurality of selectable characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
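A minimal sketch of this detect-selection/update-representation pair; the names are hypothetical, and the guard models rejecting a tap that falls outside the selectable characters.

```swift
// Selecting one of the selectable characters updates the representation shown
// in the watch face preview (names are illustrative, not from the specification).
struct WatchFacePreview {
    var selectableCharacters: [String]
    var currentCharacter: String

    // Returns false if the chosen character is not among the selectable ones.
    mutating func select(_ character: String) -> Bool {
        guard selectableCharacters.contains(character) else { return false }
        currentCharacter = character
        return true
    }
}

var preview = WatchFacePreview(selectableCharacters: ["Robot", "Octopus", "Avatar"],
                               currentCharacter: "Robot")
_ = preview.select("Octopus")
print(preview.currentCharacter)   // "Octopus"
```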
- the computer system while displaying the representation (e.g., 1800 A- 1800 D) of the watch face user interface (e.g., 1800 ), detects ( 1930 ) a fifth input (e.g., 1850 A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface (e.g., 1800 ).
- the computer system displays ( 1932 ), via the display generation component (e.g., 602 ), a third user interface that includes a fourth graphical representation of a character of the one or more graphical representations of respective characters associated with the watch face user interface (e.g., 1800 ).
- the computer system while displaying the fourth graphical representation of the character, detects ( 1934 ) (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) a sixth input (e.g., a rotational input on a rotatable input device or a rotatable and depressible input device; a scrolling input on a touch-sensitive surface integrated with the display generation component) directed to changing a visual characteristic of the character (e.g., hair color, skin color, facial feature information, and/or accessory information).
- the computer system in response to detecting the sixth input directed to changing the visual characteristic, changes ( 1936 ) (e.g., by transitioning through a plurality of selectable visual characteristics (e.g., selectable features associated with hair color, skin color, facial feature information, and/or accessory information)) the visual characteristic (e.g., hair color, skin color, facial feature information, and/or accessory information) from a first visual characteristic (e.g., a first hair color, a first skin color, a first facial feature, and/or a first accessory) to a second visual characteristic (e.g., a second hair color, a second skin color, a second facial feature, and/or a second accessory) different from the first visual characteristic.
- changing the visual characteristic to the second visual characteristic is performed prior to sharing the watch face user interface and, when the watch face user interface with the second visual characteristic is shared, a representation of the watch face user interface, including the second visual characteristic, is shared.
- Displaying the third user interface for changing the visual characteristic of the character enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
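The rotational editing flow reads naturally as stepping through an option list with wraparound. A sketch under that assumption, with hair color as the edited characteristic and invented option values:

```swift
// Hypothetical editing model: a rotational/scroll step advances through the
// selectable values of one visual characteristic (here, hair color).
struct CharacteristicEditor {
    let options: [String]
    private(set) var index: Int

    init(options: [String]) {
        self.options = options
        self.index = 0
    }

    // Wrap around the option list in either rotation direction.
    mutating func rotate(by steps: Int) {
        let count = options.count
        guard count > 0 else { return }
        index = ((index + steps) % count + count) % count
    }

    var current: String { options[index] }
}

var hairColor = CharacteristicEditor(options: ["black", "auburn", "blonde", "blue"])
hairColor.rotate(by: 2)    // forward two steps -> "blonde"
hairColor.rotate(by: -3)   // backward three steps, wrapping -> "blue"
print(hairColor.current)
```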
- the representation (e.g., 1800 A- 1800 D) of the watch face user interface (e.g., 1800 ) includes a fifth graphical representation (e.g., 1802 ) of a character that corresponds to a graphical representation of (e.g., an animation based on; a graphical representation that animates features of) a user associated (e.g., based on an account to which the computer system is logged into) with the computer system (e.g., 100 , 300 , 500 , 600 ) (e.g., an animated (e.g., 3D) avatar-like representation of the user of the computer system).
- Displaying the representation of the watch face user interface having the fifth graphical representation of a character that corresponds to a graphical representation of the user associated with the computer system provides improved visual feedback related to an identity of the user of the computer system, and in some embodiments, the identity of the user sharing the watch face user interface.
- Providing improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- sharing the one or more characteristics of the watch face user interface (e.g., 1800 ) includes transmitting one or more graphical representation templates (e.g., blank and/or fillable graphical representations that do not correspond to the one or more graphical representations of respective characters associated with the watch face user interface) for one or more second graphical representations (e.g., 1848 ) of respective second characters stored on the external device (e.g., 1832 ), different from the one or more graphical representations of respective characters of the watch face user interface (e.g., 1800 ), wherein the one or more second graphical representations (e.g., 1848 ) of respective second characters stored on the external device are applied to the one or more graphical representation templates.
- Sharing one or more graphical representation templates instead of sharing the representation of the one or more graphical representations of respective characters associated with the watch face user interface reduces the amount of data transmitted between the computer system and the external device. Reducing the size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- transmitting the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) includes initiating a process for storing the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) on the external device (e.g., 1832 ) (e.g., in response to detecting user input corresponding to an add watch face affordance on the external device, the external device stores the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface in a character library and/or an image library of the external device).
- Initiating the process for storing the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface on the external device reduces a number of inputs needed by a user of the external device to store the particular character on the external device.
- the user of the external device may store the representation of one or more of the graphical representations of respective characters associated with the watch face user interface instead of providing a sequence of inputs to create the particular character.
- Reducing the number of inputs needed to store the particular character improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- initiating the process for storing the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) on the external device (e.g., 1832 ) includes enabling, via the external device (e.g., 1832 ), an ability to change one or more visual characteristics (e.g., via an editing user interface) of the representation of one or more of the one or more graphical representations (e.g., 1802 ) of respective characters associated with the watch face user interface (e.g., 1800 ) (e.g., a user of the external device may access the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface (e.g., via a character library, via an image library, via a watch face selection user interface, and/or via a watch face editing user interface) and request to enter an editing mode of the representation, such that the external device may receive user inputs and adjust the one or more visual characteristics of the representation in accordance with the user inputs).
- Enabling an ability on the external device to change one or more visual characteristics of the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface reduces a number of inputs needed by the user of the external device to customize the character.
- the user of the external device may start with the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface instead of creating the representation of the character via a sequence of user inputs.
- Reducing the number of inputs needed to customize the particular character improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
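Receiver-side, the store-then-edit flow could look like the following sketch; the library dictionary, the StoredCharacter fields, and the function names are assumptions for illustration, not the disclosed implementation.

```swift
import Foundation

// Assumed receiver-side flow: a shared character is stored into a local library
// and is immediately editable, so the user never rebuilds it input by input.
struct StoredCharacter: Codable {
    var hairColor: String
    var accessory: String?
}

var characterLibrary: [String: StoredCharacter] = [:]

// Storing: analogous to accepting an "add watch face" action on the receiver.
func store(_ character: StoredCharacter, named name: String) {
    characterLibrary[name] = character
}

// Editing: the receiver adjusts a visual characteristic of the stored copy.
func editHairColor(of name: String, to newColor: String) {
    characterLibrary[name]?.hairColor = newColor
}

store(StoredCharacter(hairColor: "auburn", accessory: nil), named: "Shared Avatar")
editHairColor(of: "Shared Avatar", to: "teal")
print(characterLibrary["Shared Avatar"]?.hairColor ?? "missing")   // "teal"
```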
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Mathematical Physics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Physiology (AREA)
- User Interface Of Digital Computer (AREA)
- Chemical & Material Sciences (AREA)
- Crystallography & Structural Chemistry (AREA)
- Telephone Function (AREA)
Abstract
Description
- This application claims priority to U.S. Provisional Application Ser. No. 63/023,194, filed May 11, 2020, entitled “USER INTERFACES RELATED TO TIME,” and U.S. Provisional Application Ser. No. 63/078,314, filed Sep. 14, 2020, entitled “USER INTERFACES RELATED TO TIME.” Each of these applications is incorporated by reference herein in its entirety.
- The present disclosure relates generally to computer user interfaces, and more specifically to techniques for managing user interfaces related to time.
- User interfaces can be displayed on an electronic device. A user of the electronic device can interact with the electronic device via the displayed user interface. User interfaces can enable one or more operations to be performed on the electronic device.
- Some techniques for managing user interfaces related to time using electronic devices, however, are generally cumbersome and inefficient. For example, some existing techniques use a complex and time-consuming user interface, which may include multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.
- Accordingly, the present technique provides devices with faster, more efficient methods and interfaces for managing user interfaces related to time. Such methods and interfaces optionally complement or replace other methods for managing user interfaces related to time. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated computing devices, such methods and interfaces conserve power and increase the time between battery charges.
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component and one or more input devices is described. The method comprises: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and while the second analog dial is associated with the third time zone, displaying, via the display generation component, the watch user interface, wherein displaying the watch user interface includes concurrently displaying: the first analog dial and the first time indicator indicating a current time in the first time zone on the first analog dial, and the second analog dial and the second time indicator indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and while the second analog dial is associated with the third time zone, displaying, via the display generation component, the watch user interface, wherein displaying the watch user interface includes concurrently displaying: the first analog dial and the first time indicator indicating a current time in the first time zone on the first analog dial, and the second analog dial and the second time indicator indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and while the second analog dial is associated with the third time zone, displaying, via the display generation component, the watch user interface, wherein displaying the watch user interface includes concurrently displaying: the first analog dial and the first time indicator indicating a current time in the first time zone on the first analog dial, and the second analog dial and the second time indicator indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs including instructions for: displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and while the second analog dial is associated with the third time zone, displaying, via the display generation component, the watch user interface, wherein displaying the watch user interface includes concurrently displaying: the first analog dial and the first time indicator indicating a current time in the first time zone on the first analog dial, and the second analog dial and the second time indicator indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; one or more input devices; and means for displaying, via the display generation component, a watch user interface, wherein displaying the watch user interface includes concurrently displaying: a first analog dial and a first time indicator that indicates a current time in a first time zone on the first analog dial, and a second analog dial and a second time indicator that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial; means for, after displaying the watch user interface with the first analog dial and the second analog dial that is displayed at a first orientation relative to the first analog dial, receiving, via the one or more input devices, a request to change a time zone associated with the second analog dial; means for, in response to receiving the request to change the time zone associated with the second analog dial, changing the time zone associated with the second analog dial to a third time zone that is different from the first time zone; and means for, while the second analog dial is associated with the third time zone, displaying, via the display generation component, the watch user interface, wherein displaying the watch user interface includes concurrently displaying: the first analog dial and the first time indicator indicating a current time in the first time zone on the first analog dial, and the second analog dial and the second time indicator indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial.
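One way to read the orientation change described above: if the second dial is a 24-hour ring, its rotation relative to the first dial can encode the zone offset at 15 degrees per hour. The geometry is an assumption; the arithmetic below is just a worked example.

```swift
import Foundation

// Assumed geometry: the second dial is a 24-hour ring whose rotation relative to
// the first dial encodes the offset between the two time zones.
func secondDialRotationDegrees(homeOffsetSeconds: Int, otherOffsetSeconds: Int) -> Double {
    let offsetHours = Double(otherOffsetSeconds - homeOffsetSeconds) / 3600
    return offsetHours / 24 * 360   // 15 degrees per hour of offset
}

// Example: home zone UTC-5, second dial changed to UTC+1 -> 6 h ahead -> 90 degrees.
print(secondDialRotationDegrees(homeOffsetSeconds: -5 * 3600,
                                otherOffsetSeconds: 1 * 3600))   // 90.0
```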
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component and one or more input devices is described. The method comprises: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs including instructions for: displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; while displaying the watch user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, a watch user interface, the watch user interface including an analog clock face that includes a first clock hand and a graphical indicator, wherein the graphical indicator is displayed at a first position relative to the analog clock face; means for, while displaying the watch user interface, detecting, via the one or more input devices, a first user input; means for, in response to detecting the first user input, moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand; and means for, while the graphical indicator is displayed at the second position relative to the analog clock face, displaying a graphical indication of a time that has elapsed from a time when the first user input was detected to a current time.
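The elapsed-time behavior reduces to recording the instant of the first input (when the indicator snaps to the clock hand) and measuring from it. A sketch with illustrative names:

```swift
import Foundation

// Hypothetical model of the elapsed-time counter: the first input records a
// start time; the face then renders the interval from that instant to now.
struct ElapsedTimeIndicator {
    var startedAt: Date? = nil

    // Corresponds to moving the graphical indicator into alignment with the hand.
    mutating func alignWithClockHand(at now: Date = Date()) {
        startedAt = now
    }

    // Seconds from the aligning input to the current time, if counting has started.
    func elapsed(asOf now: Date = Date()) -> TimeInterval? {
        startedAt.map { now.timeIntervalSince($0) }
    }
}

var indicator = ElapsedTimeIndicator()
indicator.alignWithClockHand()
// Later, indicator.elapsed() yields the interval to display on the clock face.
```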
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component is described. The method comprises: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance with a determination that the computer system is in the first activity state, displaying the graphical representation of the second character in the first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in the second activity state that is different from the first activity state, displaying the graphical representation of the second character in the second visual state, different from the first visual state, that corresponds to the second activity state of the computer system.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance with a determination that the computer system is in the first activity state, displaying the graphical representation of the second character in the first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in the second activity state that is different from the first activity state, displaying the graphical representation of the second character in the second visual state, different from the first visual state, that corresponds to the second activity state of the computer system.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance with a determination that the computer system is in the first activity state, displaying the graphical representation of the second character in the first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in the second activity state that is different from the first activity state, displaying the graphical representation of the second character in the second visual state, different from the first visual state, that corresponds to the second activity state of the computer system.
- In accordance with some embodiments, a computer system comprising a display generation component; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs include instructions for: at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance with a determination that the computer system is in the first activity state, displaying the graphical representation of the second character in the first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in the second activity state that is different from the first activity state, displaying the graphical representation of the second character in the second visual state, different from the first visual state, that corresponds to the second activity state of the computer system.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; means for, at a first time, displaying, concurrently in a user interface displayed via the display generation component: an indication of time, and a graphical representation of a first character, wherein displaying the graphical representation of the first character includes: in accordance with a determination that the computer system is in a first activity state, displaying the graphical representation of the first character in a first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in a second activity state that is different from the first activity state, displaying the graphical representation of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state of the computer system; and means for, at a second time, after the first time, displaying, concurrently in the user interface: the indication of time, and a graphical representation of a second character, wherein displaying the graphical representation of the second character includes: in accordance with a determination that the computer system is in the first activity state, displaying the graphical representation of the second character in the first visual state that corresponds to the first activity state of the computer system; and in accordance with a determination that the computer system is in the second activity state that is different from the first activity state, displaying the graphical representation of the second character in the second visual state, different from the first visual state, that corresponds to the second activity state of the computer system.
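The activity-state mapping can be sketched as a pure function shared by all characters, which is what makes the first and second characters behave uniformly. The state and case names below are illustrative, loosely echoing the active and reduced-power behaviors described elsewhere in the disclosure:

```swift
// Illustrative mapping: every character responds uniformly to the device's
// activity state, so the current character is rendered from the same rule.
enum ActivityState { case active, lowerPower }
enum VisualState { case animatedNeutral, staticDimmed }

func visualState(for activity: ActivityState) -> VisualState {
    switch activity {
    case .active:     return .animatedNeutral   // full-motion, neutral expression
    case .lowerPower: return .staticDimmed      // reduced, static appearance
    }
}

// The same function applies whether the first or second character is displayed.
print(visualState(for: .lowerPower))   // staticDimmed
```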
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component is described. The method comprises: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of a predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate states to a final state in which the second facial feature of the second face has the second visual characteristic.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of a predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate states to a final state in which the second facial feature of the second face has the second visual characteristic.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of a predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate states to a final state in which the second facial feature of the second face has the second visual characteristic.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs including instructions for: displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; while displaying the representation of the first face, detecting the satisfaction of a predetermined criteria for changing an appearance of the time user interface; and in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate states to a final state in which the second facial feature of the second face has the second visual characteristic.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; means for displaying, via the display generation component, a time user interface that includes a representation of a first face having a first facial feature and a second facial feature, wherein: the first facial feature of the first face indicates a current time, and the second facial feature of the first face has a first visual characteristic; means for, while displaying the representation of the first face, detecting the satisfaction of a predetermined criteria for changing an appearance of the time user interface; and means for, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface, ceasing to display the representation of the first face and displaying a representation of a second face having a first facial feature and a second facial feature, wherein: the representation of the second face is different from the representation of the first face, the first facial feature of the second face indicates a current time, the second facial feature of the second face has a second visual characteristic different from the first visual characteristic, and ceasing to display the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate states to a final state in which the second facial feature of the second face has the second visual characteristic.
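The gradual transition through "a plurality of intermediate states" is, in the simplest reading, an interpolation of the changing visual characteristic. A sketch with the characteristic modeled as a single number (printed values may carry floating-point noise):

```swift
// A minimal sketch of a gradual transition: interpolate one visual characteristic
// (modeled here as a single numeric value) through intermediate states.
func transitionStates(from start: Double, to end: Double, steps: Int) -> [Double] {
    guard steps > 1 else { return [end] }
    return (0..<steps).map { i in
        start + (end - start) * Double(i) / Double(steps - 1)
    }
}

// Five frames from the first face's characteristic (0.2) to the second's (0.8):
print(transitionStates(from: 0.2, to: 0.8, steps: 5))
// approximately [0.2, 0.35, 0.5, 0.65, 0.8]: first state, three intermediates, final state
```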
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component and one or more input devices is described. The method comprises: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes; detecting, via the one or more input devices, a second user input; and in response to detecting the second user input, displaying, via the display generation component, the user interface with the updated background.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes; detecting, via the one or more input devices, a second user input; and in response to detecting the second user input, displaying, via the display generation component, the user interface with the updated background.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes; detecting, via the one or more input devices, a second user input; and in response to detecting the second user input, displaying, via the display generation component, the user interface with the updated background.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs include instructions for: displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; while displaying the editing user interface, detecting, via the one or more input devices, a first user input; in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes; detecting, via the one or more input devices, a second user input; and in response to detecting the second user input, displaying, via the display generation component, the user interface with the updated background.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, an editing user interface for editing a background of a user interface, wherein: the user interface includes content overlaid on the background, and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes that is greater than one; means for, while displaying the editing user interface, detecting, via the one or more input devices, a first user input; means for, in response to detecting the first user input: in accordance with a determination that the first user input corresponds to a first type of input, displaying, in the user interface, a representation of an updated background with a second number of stripes that is greater than the first number of stripes; and in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input, displaying, in the user interface, the representation of the updated background with a third number of stripes that is less than the first number of stripes; means for detecting, via the one or more input devices, a second user input; and means for, in response to detecting the second user input, displaying, via the display generation component, the user interface with the updated background.
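- As a concrete illustration of the two input types described above, the following Swift sketch increments or decrements a stripe count depending on how an input is classified (for example, opposite rotation directions of a rotatable input mechanism). All names are hypothetical, and the clamping bounds are assumptions: the description above only requires the initial count to be greater than one.

```swift
// Hypothetical classification of background-editor inputs: the first type
// adds a stripe, the second type removes one.
enum StripeEditInput {
    case addStripe
    case removeStripe
}

struct StripedBackground {
    var stripeCount: Int

    // Bounds are illustrative assumptions, not taken from the disclosure.
    let minimumStripes = 2
    let maximumStripes = 12

    // Update the represented background in response to a classified input.
    mutating func apply(_ input: StripeEditInput) {
        switch input {
        case .addStripe:
            stripeCount = min(stripeCount + 1, maximumStripes)
        case .removeStripe:
            stripeCount = max(stripeCount - 1, minimumStripes)
        }
    }
}

// Example: start with four stripes, add one, then remove two.
var background = StripedBackground(stripeCount: 4)
background.apply(.addStripe)      // 5 stripes
background.apply(.removeStripe)   // 4 stripes
background.apply(.removeStripe)   // 3 stripes
```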
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component and one or more input devices is described. The method comprises: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information, and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application, wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information; while displaying the complication selection user interface, detecting, via the one or more input devices, a second input directed to selecting a respective complication preview; and in response to detecting the second input directed to selecting the respective complication preview, displaying, via the display generation component, a representation of the watch user interface with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface, wherein: in accordance with a determination that the respective complication preview is the first complication preview, the first complication is displayed in the first complication region of the watch user interface; and in accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information, and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application, wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information; while displaying the complication selection user interface, detecting, via the one or more input devices, a second input directed to selecting a respective complication preview; and in response to detecting the second input directed to selecting the respective complication preview, displaying, via the display generation component, a representation of the watch user interface with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface, wherein: in accordance with a determination that the respective complication preview is the first complication preview, the first complication is displayed in the first complication region of the watch user interface; and in accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component and one or more input devices is described. The one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information, and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application, wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information; while displaying the complication selection user interface, detecting, via the one or more input devices, a second input directed to selecting a respective complication preview; and in response to detecting the second input directed to selecting the respective complication preview, displaying, via the display generation component, a representation of the watch user interface with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface, wherein: in accordance with a determination that the respective complication preview is the first complication preview, the first complication is displayed in the first complication region of the watch user interface; and in accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more input devices, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs include instructions for: displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information, and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application, wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information; while displaying the complication selection user interface, detecting, via the one or more input devices, a second input directed to selecting a respective complication preview; and in response to detecting the second input directed to selecting the respective complication preview, displaying, via the display generation component, a representation of the watch user interface with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface, wherein: in accordance with a determination that the respective complication preview is the first complication preview, the first complication is displayed in the first complication region of the watch user interface; and in accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; one or more input devices; means for displaying, via the display generation component, a watch face editing user interface, wherein the watch face editing user interface includes a representation of a layout of a watch user interface including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface; means for, while displaying the watch face editing user interface, detecting, via the one or more input devices, a first input directed to a complication region of the one or more complication regions; and means for, in response to detecting the first input directed to the complication region of the one or more complication regions, displaying a complication selection user interface, wherein displaying the complication selection user interface includes concurrently displaying: an indication of a first application, a first complication preview corresponding to a first complication that is configured to display, on the watch user interface, a first set of information obtained from the first application, wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information, and a second complication preview corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application, wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information; means for, while displaying the complication selection user interface, detecting, via the one or more input devices, a second input directed to selecting a respective complication preview; and means for, in response to detecting the second input directed to selecting the respective complication preview, displaying, via the display generation component, a representation of the watch user interface with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface, wherein: in accordance with a determination that the respective complication preview is the first complication preview, the first complication is displayed in the first complication region of the watch user interface; and in accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface.
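- The complication-selection flow above can be sketched as a small data model in which each application contributes one or more previews, and a selection binds the chosen complication to the region that opened the picker. The Swift below is a hypothetical illustration; the type names, identifiers, and region keys do not come from the disclosure.

```swift
// One selectable preview: a complication plus the set of information it
// would display on the watch user interface.
struct ComplicationPreview {
    let id: String        // e.g., "weather.temperature" (illustrative)
    let summary: String   // which of the application's data it shows
}

// Previews are grouped under an indication of the application that
// supplies their data, as in the selection user interface above.
struct ApplicationEntry {
    let applicationName: String
    let previews: [ComplicationPreview]
}

struct WatchFaceLayout {
    // Maps a complication region to the selected complication's id.
    var assignments: [String: String] = [:]

    mutating func select(_ preview: ComplicationPreview, forRegion region: String) {
        assignments[region] = preview.id
    }
}

// Example: one application offering two previews drawn from different
// sets of its data.
let weather = ApplicationEntry(
    applicationName: "Weather",
    previews: [ComplicationPreview(id: "weather.temperature",
                                   summary: "Current temperature"),
               ComplicationPreview(id: "weather.airQuality",
                                   summary: "Air quality index")])

var layout = WatchFaceLayout()
layout.select(weather.previews[1], forRegion: "topLeft")
// layout.assignments["topLeft"] == "weather.airQuality"
```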
- In accordance with some embodiments, a method performed at a computer system that is in communication with a display generation component is described. The method comprises: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of respective characters associated with the watch user interface.
- In accordance with some embodiments, a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of respective characters associated with the watch user interface.
- In accordance with some embodiments, a transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system that is in communication with a display generation component is described. The one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of respective characters associated with the watch user interface.
- In accordance with some embodiments, a computer system comprising a display generation component, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors is described. The one or more programs include instructions for: displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of respective characters associated with the watch user interface.
- In accordance with some embodiments, a computer system is described. The computer system comprises: a display generation component; means for displaying, via the display generation component, a representation of a watch face user interface that is associated with one or more graphical representations of respective characters; means for, while displaying the representation of the watch face user interface, detecting an input corresponding to a request to share the watch face user interface with an external device; and means for, in response to detecting the input, initiating a process for sharing the watch face user interface with the external device, wherein: in accordance with a determination that the watch face user interface is associated with less than a threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface including transmitting a representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface; and in accordance with a determination that the watch face user interface is associated with greater than or equal to the threshold number of graphical representations of respective characters, the process for sharing the watch face user interface with the external device includes sharing one or more characteristics of the watch face user interface without transmitting a representation of the one or more graphical representations of respective characters associated with the watch face user interface.
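- The threshold behavior above reduces to a simple branch when building the share payload: below the threshold, character representations travel with the configuration; at or above it, only the configuration is sent and the receiving device supplies its own character representations. The following Swift sketch is a hypothetical illustration; the payload shape and the threshold parameter are assumptions, since the disclosure does not fix a particular value.

```swift
import Foundation

// Hypothetical share payload for a watch face configuration.
struct WatchFaceSharePayload {
    let configuration: [String: String]  // colors, layout, complications, ...
    let characterArtwork: [Data]?        // nil when artwork is omitted
}

func makeSharePayload(configuration: [String: String],
                      characterArtwork: [Data],
                      characterLimit: Int) -> WatchFaceSharePayload {
    if characterArtwork.count < characterLimit {
        // Fewer associated characters than the threshold: transmit their
        // representations along with the face's characteristics.
        return WatchFaceSharePayload(configuration: configuration,
                                     characterArtwork: characterArtwork)
    } else {
        // At or above the threshold: share the characteristics only.
        return WatchFaceSharePayload(configuration: configuration,
                                     characterArtwork: nil)
    }
}
```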
- Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are, optionally, included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
- Thus, devices are provided with faster, more efficient methods and interfaces for managing user interfaces related to time, thereby increasing the effectiveness, efficiency, and user satisfaction with such computer systems (e.g., electronic devices). Such methods and interfaces may complement or replace other methods for managing user interfaces related to time.
- For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
-
FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments. -
FIG. 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments. -
FIG. 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments. -
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. -
FIG. 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments. -
FIG. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display in accordance with some embodiments. -
FIG. 5A illustrates a personal electronic device in accordance with some embodiments. -
FIG. 5B is a block diagram illustrating a personal electronic device in accordance with some embodiments. -
FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. -
FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. -
FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments. -
FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments. -
FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments. -
FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments. -
FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments. -
FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying an indication of a current time, in accordance with some embodiments. -
FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments. -
FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments. -
FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface, in accordance with some embodiments. -
FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a user interface, in accordance with some embodiments. -
FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments. -
FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments. - The following description sets forth exemplary methods, parameters, and the like. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure but is instead provided as a description of exemplary embodiments.
- There is a need for electronic devices that provide efficient methods and interfaces for managing user interfaces related to time. For example, there is a need for devices that enable an intuitive and efficient method for adjusting and displaying a time zone. For another example, there is a need for devices that enable an intuitive and efficient method for initiating and providing a measurement of time. For another example, there is a need for devices that provide an indication of a current time in a compelling manner. For another example, there is a need for devices that enable adjustments and modifications to a background and/or applications of a user interface in an intuitive and efficient manner. Such techniques can reduce the cognitive burden on a user who accesses user interfaces related to time on a device, thereby enhancing productivity. Further, such techniques can reduce processor and battery power otherwise wasted on redundant user inputs.
- Below,
FIGS. 1A-1B, 2, 3, 4A-4B, and 5A-5B provide a description of exemplary devices for performing the techniques for managing user interfaces related to time. FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. The user interfaces in FIGS. 6A-6H are used to illustrate the processes described below, including the processes in FIGS. 7A-7C. FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments. FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments. The user interfaces in FIGS. 8A-8M are used to illustrate the processes described below, including the processes in FIGS. 9A-9B. FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments. FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments. The user interfaces in FIGS. 10A-10AC are used to illustrate the processes described below, including the processes in FIGS. 11A-11H. FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments. FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying an indication of a current time, in accordance with some embodiments. The user interfaces in FIGS. 12A-12G are used to illustrate the processes described below, including the processes in FIGS. 13A-13C. FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments. FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments. The user interfaces in FIGS. 14A-14AD are used to illustrate the processes described below, including the processes in FIGS. 15A-15F. FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface, in accordance with some embodiments. FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a user interface, in accordance with some embodiments. The user interfaces in FIGS. 16A-16AE are used to illustrate the processes described below, including the processes in FIGS. 17A-17D. FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments. FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments. The user interfaces in FIGS. 18A-18J are used to illustrate the processes described below, including the processes in FIGS. 19A-19C.
- Although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments.
The first touch and the second touch are both touches, but they are not the same touch.
- The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- The term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
- Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad). In some embodiments, the electronic device is a computer system that is in communication (e.g., via wireless communication, via wired communication) with a display generation component. The display generation component is configured to provide visual output, such as display via a CRT display, display via an LED display, or display via image projection. In some embodiments, the display generation component is integrated with the computer system. In some embodiments, the display generation component is separate from the computer system. As used herein, “displaying” content includes causing to display the content (e.g., video data rendered or decoded by display controller 156) by transmitting, via a wired or wireless connection, data (e.g., image data or video data) to an integrated or external display generation component to visually produce the content.
- In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
- The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
- The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
- Attention is now directed toward embodiments of portable devices with touch-sensitive displays.
FIG. 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes called a “touch screen” for convenience and is sometimes known as or called a “touch-sensitive display system.” Device 100 includes memory 102 (which optionally includes one or more computer-readable storage mediums), memory controller 122, one or more processing units (CPUs) 120, peripherals interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more contact intensity sensors 165 for detecting intensity of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
- As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure).
Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
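- To make the weighted-average estimation mentioned above concrete, the following Swift sketch combines per-sensor force readings into an estimated intensity and compares the estimate against a threshold. The types, weighting scheme, and threshold value are hypothetical illustrations rather than the device's actual implementation.

```swift
// Hypothetical reading from one force sensor; the weight might reflect,
// for example, the sensor's proximity to the detected contact.
struct ForceSample {
    let reading: Double
    let weight: Double
}

// Combine the samples into an estimated intensity via a weighted average,
// then test it against an intensity threshold.
func contactExceedsThreshold(_ samples: [ForceSample],
                             intensityThreshold: Double) -> Bool {
    let totalWeight = samples.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return false }
    let weightedSum = samples.reduce(0) { $0 + $1.reading * $1.weight }
    return (weightedSum / totalWeight) >= intensityThreshold
}

// Example: three sensors, the middle one closest to the contact.
// Weighted average = 0.20*0.2 + 0.65*0.6 + 0.30*0.2 = 0.49, so with a
// threshold of 0.5 the contact does not register as a deep press.
let samples = [ForceSample(reading: 0.20, weight: 0.2),
               ForceSample(reading: 0.65, weight: 0.6),
               ForceSample(reading: 0.30, weight: 0.2)]
let pressed = contactExceedsThreshold(samples, intensityThreshold: 0.5)
```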
- As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
- It should be appreciated that
device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in FIG. 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application-specific integrated circuits. -
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100. - Peripherals interface 118 can be used to couple input and output peripherals of the device to
CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data. In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips. - RF (radio frequency)
circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
- Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, FIG. 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
- I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, depth camera controller 169, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, FIG. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206, FIG. 2). In some embodiments, the electronic device is a computer system that is in communication (e.g., via wireless communication, via wired communication) with one or more input devices. In some embodiments, the one or more input devices include a touch-sensitive surface (e.g., a trackpad, as part of a touch-sensitive display). In some embodiments, the one or more input devices include one or more camera sensors (e.g., one or more optical sensors 164 and/or one or more depth camera sensors 175), such as for tracking a user's gestures (e.g., hand gestures) as input. In some embodiments, the one or more input devices are integrated with the computer system. In some embodiments, the one or more input devices are separate from the computer system.
touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons is, optionally, user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
- Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output optionally corresponds to user-interface objects. -
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user. -
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, Calif. - A touch-sensitive display in some embodiments of
touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive touchpads do not provide visual output. - A touch-sensitive display in some embodiments of
touch screen 112 is described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety. -
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
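By way of a non-limiting editorial illustration of the finger-to-pointer translation just described, the Swift sketch below averages sampled contact coordinates into one precise position; the function name and sampling model are assumptions, not part of the disclosure.

```swift
import CoreGraphics

// Illustrative sketch: reduce a finger contact, sampled as several points
// across its contact area, to a single precise cursor position by taking
// the centroid of the samples.
func preciseCursorPosition(contactSamples: [CGPoint]) -> CGPoint? {
    guard !contactSamples.isEmpty else { return nil }
    let sum = contactSamples.reduce(CGPoint.zero) { acc, p in
        CGPoint(x: acc.x + p.x, y: acc.y + p.y)
    }
    let n = CGFloat(contactSamples.count)
    return CGPoint(x: sum.x / n, y: sum.y / n)
}
```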
- In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen. -
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management and distribution of power in portable devices. -
Device 100 optionally also includes one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to optical sensor controller 158 in I/O subsystem 106. Optical sensor 164 optionally includes charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. Optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch screen display 112 on the front of the device, so that the touch screen display is enabled for use as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image is, optionally, obtained for video conferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 is used along with the touch screen display for both video conferencing and still and/or video image acquisition. -
Device 100 optionally also includes one or more depth camera sensors 175. FIG. 1A shows a depth camera sensor coupled to depth camera controller 169 in I/O subsystem 106. Depth camera sensor 175 receives data from the environment to create a three-dimensional model of an object (e.g., a face) within a scene from a viewpoint (e.g., a depth camera sensor). In some embodiments, in conjunction with imaging module 143 (also called a camera module), depth camera sensor 175 is optionally used to determine a depth map of different portions of an image captured by the imaging module 143. In some embodiments, a depth camera sensor is located on the front of device 100 so that the user's image with depth information is, optionally, obtained for video conferencing while the user views the other video conference participants on the touch screen display, and to capture selfies with depth map data. In some embodiments, the depth camera sensor 175 is located on the back of the device, or on both the back and the front of the device 100. In some embodiments, the position of depth camera sensor 175 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a depth camera sensor 175 is used along with the touch screen display for both video conferencing and still and/or video image acquisition. -
Device 100 optionally also includes one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to intensity sensor controller 159 in I/O subsystem 106. Contact intensity sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface). Contact intensity sensor 165 receives contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100. -
Device 100 optionally also includes one or more proximity sensors 166. FIG. 1A shows proximity sensor 166 coupled to peripherals interface 118. Alternately, proximity sensor 166 is, optionally, coupled to input controller 160 in I/O subsystem 106. Proximity sensor 166 optionally performs as described in U.S. patent application Ser. No. 11/241,839, “Proximity Detector In Handheld Device”; Ser. No. 11/240,788, “Proximity Detector In Handheld Device”; Ser. No. 11/620,702, “Using Ambient Light Sensor To Augment Proximity Sensor Output”; Ser. No. 11/586,862, “Automated Response To And Sensing Of User Activity In Portable Devices”; and Ser. No. 11/638,251, “Methods And Systems For Automatic Configuration Of Peripherals,” which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). -
Device 100 optionally also includes one or more tactile output generators 167. FIG. 1A shows a tactile output generator coupled to haptic feedback controller 161 in I/O subsystem 106. Tactile output generator 167 optionally includes one or more electroacoustic devices such as speakers or other audio components and/or electromechanical devices that convert energy into linear motion such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile output generating component (e.g., a component that converts electrical signals into tactile outputs on the device). Tactile output generator 167 receives tactile feedback generation instructions from haptic feedback module 133 and generates tactile outputs on device 100 that are capable of being sensed by a user of device 100. In some embodiments, at least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of a surface of device 100) or laterally (e.g., back and forth in the same plane as a surface of device 100). In some embodiments, at least one tactile output generator is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100. -
Device 100 optionally also includes one or more accelerometers 168. FIG. 1A shows accelerometer 168 coupled to peripherals interface 118. Alternately, accelerometer 168 is, optionally, coupled to an input controller 160 in I/O subsystem 106. Accelerometer 168 optionally performs as described in U.S. Patent Publication No. 20050190059, “Acceleration-based Theft Detection System for Portable Electronic Devices,” and U.S. Patent Publication No. 20060017692, “Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer,” both of which are incorporated by reference herein in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer and a GPS (or GLONASS or other global navigation system) receiver for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100.
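As a hedged, editorial illustration of the portrait/landscape analysis described above, a single gravity reading from an accelerometer can be classified by its dominant axis; the axis conventions, type names, and thresholding below are assumptions, not the disclosure's method.

```swift
// Illustrative sketch: derive a display orientation from one accelerometer
// sample interpreted as a gravity vector in device coordinates.
enum DisplayOrientation { case portrait, portraitUpsideDown, landscapeLeft, landscapeRight }

func orientation(fromGravityX x: Double, y: Double) -> DisplayOrientation {
    // Assumed convention: held upright, gravity pulls along -y; rotated on
    // its side, gravity dominates the x axis.
    if abs(x) > abs(y) {
        return x > 0 ? .landscapeLeft : .landscapeRight
    } else {
        return y < 0 ? .portrait : .portraitUpsideDown
    }
}
```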
- In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 (FIG. 1A) or 370 (FIG. 3) stores device/global internal state 157, as shown in FIGS. 1A and 3. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 112; sensor state, including information obtained from the device's various sensors and input control devices 116; and location information concerning the device's location and/or attitude. - Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
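Purely as an editorial sketch, the device/global internal state 157 described above can be modeled as a simple record with one field per kind of state it enumerates; all field names and types are assumptions for illustration.

```swift
import CoreGraphics

// Illustrative model of device/global internal state: one field for each
// category of state the specification lists.
struct DeviceGlobalState {
    var activeApplications: [String]                       // active application state
    var displayedRegions: [String: CGRect]                 // display state: view name -> screen region
    var sensorReadings: [String: Double]                   // sensor state
    var location: (latitude: Double, longitude: Double)?   // location/attitude information
}
```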
-
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod® (trademark of Apple Inc.) devices. - Contact/
motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad. - In some embodiments, contact/
motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
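The following editorial sketch illustrates the idea of software-defined intensity thresholds: the values live in a settings object rather than in the hardware, so they can be tuned without changing the physical actuator. The names and the normalized 0-1 intensity scale are assumptions.

```swift
// Illustrative sketch: thresholds held as plain software parameters.
struct IntensitySettings {
    var lightPressThreshold: Double = 0.3
    var deepPressThreshold: Double = 0.7
}

enum PressKind { case none, lightPress, deepPress }

// Classify a measured contact intensity against the current settings.
func classifyPress(intensity: Double, settings: IntensitySettings) -> PressKind {
    if intensity >= settings.deepPressThreshold { return .deepPress }
    if intensity >= settings.lightPressThreshold { return .lightPress }
    return .none
}
```

Under this model, a system-level settings screen could simply write new values into the `IntensitySettings` record, with no change to the sensing hardware.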
- Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
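As a non-limiting sketch of this pattern-based detection (the event names and the distance tolerance below are assumptions, not the disclosure's definitions):

```swift
import Foundation
import CoreGraphics

// Illustrative sub-event stream: a tap is finger-down then finger-up near the
// same position; a swipe is finger-down, one or more drags, then finger-up.
enum TouchSubEvent { case fingerDown(CGPoint), fingerDrag(CGPoint), fingerUp(CGPoint) }
enum RecognizedGesture { case tap, swipe, unrecognized }

func recognizeGesture(_ events: [TouchSubEvent], tolerance: CGFloat = 10) -> RecognizedGesture {
    guard case .fingerDown(let start)? = events.first,
          case .fingerUp(let end)? = events.last else { return .unrecognized }
    let distance = hypot(end.x - start.x, end.y - start.y)
    let dragCount = events.filter { event in
        if case .fingerDrag = event { return true }
        return false
    }.count
    if dragCount == 0 && distance <= tolerance { return .tap }
    if dragCount >= 1 && distance > tolerance { return .swipe }
    return .unrecognized
}
```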
- Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like. - In some embodiments,
graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
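A hypothetical sketch of this code-keyed scheme follows; every name is illustrative, and a real implementation would composite pixel data into a frame buffer rather than log the request.

```swift
import CoreGraphics

// Illustrative registry: each drawable asset is assigned a numeric code, and
// drawing is requested by code plus coordinate data.
struct GraphicAsset { let name: String; let pixelData: [UInt8] }

final class GraphicsStore {
    private var assetsByCode: [Int: GraphicAsset] = [:]

    func register(code: Int, asset: GraphicAsset) { assetsByCode[code] = asset }

    func draw(code: Int, at origin: CGPoint) {
        guard let asset = assetsByCode[code] else { return }
        // Stand-in for compositing asset.pixelData at `origin`.
        print("drawing \(asset.name) at \(origin)")
    }
}
```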
- Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100. -
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input). -
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets). -
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: -
- Contacts module 137 (sometimes called an address book or contact list);
-
Telephone module 138; -
Video conference module 139; -
E-mail client module 140; - Instant messaging (IM)
module 141; -
Workout support module 142; -
Camera module 143 for still and/or video images; -
Image management module 144; - Video player module;
- Music player module;
-
Browser module 147; -
Calendar module 148; -
Widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6; -
Widget creator module 150 for making user-created widgets 149-6; -
Search module 151; - Video and
music player module 152, which merges video player module and music player module; -
Notes module 153; -
Map module 154; and/or -
Online video module 155.
- Examples of
other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication. - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference module 139, e-mail 140, or IM 141; and so forth. - In conjunction with
RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies. - In conjunction with
RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephone module 138, video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS). - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data. - In conjunction with
touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102. - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets). - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget). - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions. - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.). - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions. - In conjunction with
touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Jun. 20, 2007, and U.S. patent application Ser. No. 11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Dec. 31, 2007, the contents of which are hereby incorporated by reference in their entirety. - Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. For example, video player module is, optionally, combined with music player module into a single module (e.g., video and
music player module 152, FIG. 1A). In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above. - In some embodiments,
device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced. - The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates
device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad. -
FIG. 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments. In some embodiments, memory 102 (FIG. 1A ) or 370 (FIG. 3 ) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390). -
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information. - In some embodiments, application
internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user. -
Event monitor 171 receives event information fromperipherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such asproximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface. - In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
- In some embodiments,
event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173. - Hit
view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display. - Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
- Hit
view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
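As an editorial illustration of this rule, the sketch below walks a hierarchy and returns the deepest view containing the initiating touch point; the stand-in View type and the assumption of a shared coordinate space are not part of the disclosure.

```swift
import CoreGraphics

// Minimal stand-in view hierarchy for illustrating hit-view determination.
final class View {
    let frame: CGRect
    let subviews: [View]
    init(frame: CGRect, subviews: [View] = []) {
        self.frame = frame
        self.subviews = subviews
    }
}

// Return the lowest (deepest) view whose frame contains the point.
func hitView(in root: View, at point: CGPoint) -> View? {
    guard root.frame.contains(point) else { return nil }
    for sub in root.subviews {
        if let deeper = hitView(in: sub, at: point) { return deeper }
    }
    return root  // no subview contains the point, so this view is the hit view
}
```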
- Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views. -
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores the event information in an event queue, from which it is retrieved by a respective event receiver 182. - In some embodiments,
operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130. - In some embodiments, application 136-1 includes a plurality of
event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191. - A
respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions). -
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device. -
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event (187) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
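The double-tap definition above can be illustrated with the following hedged sketch; the sub-event type, the timestamped tuple shape, and the 0.3-second phase limit are assumptions for illustration, not the disclosure's actual data structures.

```swift
import Foundation

// Illustrative sub-event kinds matching the phases named in the definition.
enum SubEventKind { case touchBegin, touchEnd, touchMove, touchCancel }

struct DoubleTapDefinition {
    var maxPhaseDuration: TimeInterval = 0.3

    // A double tap is touch begin, liftoff, touch begin, liftoff, with each
    // phase (gap between consecutive sub-events) completing within the limit.
    func matches(_ events: [(kind: SubEventKind, timestamp: TimeInterval)]) -> Bool {
        let expected: [SubEventKind] = [.touchBegin, .touchEnd, .touchBegin, .touchEnd]
        guard events.map({ $0.kind }) == expected else { return false }
        for (earlier, later) in zip(events, events.dropFirst())
        where later.timestamp - earlier.timestamp > maxPhaseDuration {
            return false
        }
        return true
    }
}
```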
- In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test. - In some embodiments, the definition for a respective event (187) also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
- When a
respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture. - In some embodiments, a
respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy. - In some embodiments, a
respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process. - In some embodiments,
event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process. - In some embodiments,
data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display. - In some embodiments, event handler(s) 190 includes or has access to
data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules. - It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate
multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized. -
FIG. 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments. The touch screen optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap. -
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112. - In some embodiments,
device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, subscriber identity module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100. -
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. Device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communications interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Device 300 includes input/output (I/O) interface 330 comprising display 340, which is typically a touch screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, tactile output generator 357 for generating tactile outputs on device 300 (e.g., similar to tactile output generator(s) 167 described above with reference to FIG. 1A), and sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to FIG. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices remotely located from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (FIG. 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (FIG. 1A) optionally does not store these modules. - Each of the above-identified elements in
FIG. 3 is, optionally, stored in one or more of the previously mentioned memory devices. Each of the above-identified modules corresponds to a set of instructions for performing a function described above. The above-identified modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above. - Attention is now directed towards embodiments of user interfaces that are, optionally, implemented on, for example,
portable multifunction device 100. -
FIG. 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof: -
- Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
-
Time 404; -
Bluetooth indicator 405; -
Battery status indicator 406; -
Tray 408 with icons for frequently used applications, such as:-
Icon 416 for telephone module 138, labeled “Phone,” which optionally includes an indicator 414 of the number of missed calls or voicemail messages; -
Icon 418 for e-mail client module 140, labeled “Mail,” which optionally includes an indicator 410 of the number of unread e-mails; -
Icon 420 for browser module 147, labeled “Browser;” and -
Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled “iPod;” and
-
- Icons for other applications, such as:
-
Icon 424 for IM module 141, labeled “Messages;” - Icon 426 for
calendar module 148, labeled “Calendar;” -
Icon 428 for image management module 144, labeled “Photos;” -
Icon 430 for camera module 143, labeled “Camera;” -
Icon 432 for online video module 155, labeled “Online Video;” -
Icon 434 for stocks widget 149-2, labeled “Stocks;” -
Icon 436 for map module 154, labeled “Maps;” -
Icon 438 for weather widget 149-1, labeled “Weather;” -
Icon 440 for alarm clock widget 149-4, labeled “Clock;” -
Icon 442 for workout support module 142, labeled “Workout Support;” -
Icon 444 for notes module 153, labeled “Notes;” and -
Icon 446 for a settings application or module, labeled “Settings,” which provides access to settings for device 100 and its various applications 136.
-
- It should be noted that the icon labels illustrated in
FIG. 4A are merely exemplary. For example, icon 422 for video and music player module 152 is labeled “Music” or “Music Player.” Other labels are, optionally, used for various application icons. In some embodiments, a label for a respective application icon includes a name of an application corresponding to the respective application icon. In some embodiments, a label for a particular application icon is distinct from a name of an application corresponding to the particular application icon. -
FIG. 4B illustrates an exemplary user interface on a device (e.g., device 300, FIG. 3) with a touch-sensitive surface 451 (e.g., a tablet or touchpad 355, FIG. 3) that is separate from the display 450 (e.g., touch screen display 112). Device 300 also, optionally, includes one or more contact intensity sensors (e.g., one or more of sensors 359) for detecting intensity of contacts on touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of device 300. - Although some of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in
FIG. 4B . In some embodiments, the touch-sensitive surface (e.g., 451 inFIG. 4B ) has a primary axis (e.g., 452 inFIG. 4B ) that corresponds to a primary axis (e.g., 453 inFIG. 4B ) on the display (e.g., 450). In accordance with these embodiments, the device detects contacts (e.g., 460 and 462 inFIG. 4B ) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (e.g., inFIG. 4B, 460 corresponds to 468 and 462 corresponds to 470). In this way, user inputs (e.g.,contacts FIG. 4B ) are used by the device to manipulate the user interface on the display (e.g., 450 inFIG. 4B ) of the multifunction device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein. - Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
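As an illustration of the axis-correspondence mapping described above, the following Swift sketch shows one way a contact on a separate touch-sensitive surface could be projected onto display coordinates. It is a hypothetical example for clarity, not the implementation disclosed here; the function and parameter names are invented, and it assumes an Apple platform for CoreGraphics types.

    import CoreGraphics

    // Maps a contact on a separate touch-sensitive surface (e.g., 451) to the
    // corresponding location on the display (e.g., 450) by normalizing the
    // contact along each primary axis and re-projecting it. Hypothetical sketch.
    func displayLocation(for contact: CGPoint, surface: CGRect, display: CGRect) -> CGPoint {
        let nx = (contact.x - surface.minX) / surface.width   // 0...1 along the surface axes
        let ny = (contact.y - surface.minY) / surface.height
        return CGPoint(x: display.minX + nx * display.width,
                       y: display.minY + ny * display.height)
    }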
FIG. 5A illustrates exemplary personal electronic device 500. Device 500 includes body 502. In some embodiments, device 500 can include some or all of the features described with respect to devices 100 and 300 (e.g., FIGS. 1A-4B). In some embodiments, device 500 has touch-sensitive display screen 504, hereafter touch screen 504. Alternatively, or in addition to touch screen 504, device 500 has a display and a touch-sensitive surface. As with devices 100 and 300, in some embodiments, device 500 can respond to touches based on their intensity, meaning that touches of different intensities can invoke different user interface operations on device 500.

Exemplary techniques for detecting and processing touch intensity are found, for example, in related applications: International Patent Application Serial No. PCT/US2013/040061, titled “Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application,” filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No. PCT/US2013/069483, titled “Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships,” filed Nov. 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, device 500 has one or more input mechanisms 506 and 508. Input mechanisms 506 and 508, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit device 500 to be worn by a user.
FIG. 5B depicts exemplary personal electronic device 500. In some embodiments, device 500 can include some or all of the components described with respect to FIGS. 1A, 1B, and 3. Device 500 has bus 512 that operatively couples I/O section 514 with one or more computer processors 516 and memory 518. I/O section 514 can be connected to display 504, which can have touch-sensitive component 522 and, optionally, intensity sensor 524 (e.g., contact intensity sensor). In addition, I/O section 514 can be connected with communication unit 530 for receiving application and operating system data, using Wi-Fi, Bluetooth, near field communication (NFC), cellular, and/or other wireless communication techniques. Device 500 can include input mechanisms 506 and/or 508. Input mechanism 506 is, optionally, a rotatable input device or a depressible and rotatable input device, for example. Input mechanism 508 is, optionally, a button, in some examples.
Input mechanism 508 is, optionally, a microphone, in some examples. Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532, accelerometer 534, directional sensor 540 (e.g., compass), gyroscope 536, motion sensor 538, and/or a combination thereof, all of which can be operatively connected to I/O section 514.
Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516, for example, can cause the computer processors to perform the techniques described below, including processes 700 (FIGS. 7A-7C), 900 (FIGS. 9A-9B), 1100 (FIGS. 11A-11H), 1300 (FIGS. 13A-13C), 1500 (FIGS. 15A-15F), 1700 (FIGS. 17A-17D), and 1900 (FIGS. 19A-19C). A computer-readable storage medium can be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based on CD, DVD, or Blu-ray technologies, as well as persistent solid-state memory such as flash, solid-state drives, and the like. Personal electronic device 500 is not limited to the components and configuration of FIG. 5B, but can include other or additional components in multiple configurations.

As used here, the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100, 300, and/or 500 (FIGS. 1A, 3, and 5A-5B). For example, an image (e.g., icon), a button, and text (e.g., hyperlink) each optionally constitute an affordance.

As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g.,
touchpad 355 in FIG. 3 or touch-sensitive surface 451 in FIG. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in FIG. 1A or touch screen 112 in FIG. 4A) that enables direct interaction with user interface elements on the touch screen display, a detected contact on the touch screen acts as a “focus selector” so that when an input (e.g., a press input by the contact) is detected on the touch screen display at a location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of a user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on a touch screen display (e.g., by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface. Without regard to the specific form taken by the focus selector, the focus selector is generally the user interface element (or contact on a touch screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user is intending to interact). For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or touch screen) will indicate that the user is intending to activate the respective button (as opposed to other user interface elements shown on a display of the device).

Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
FIGS. 6A-6H illustrate exemplary user interfaces for displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 7A-7C.
In FIG. 6A, device 600 displays watch user interface 604A, which includes first analog dial 608 concurrently displayed with second analog dial 606. Hour hand 608A, minute hand 608B, and seconds hand 608C indicate the hour, minute, and second (respectively) of a current time in a first time zone on first analog dial 608. First analog dial 608 represents a period of 12 hours (e.g., hour hand 608A will make a full rotation every 12 hours). Clock hand 608D indicates a current time in a second time zone on second analog dial 606. Second analog dial 606 represents a period of 24 hours (e.g., clock hand 608D will make a full rotation every 24 hours). Marker 606C indicates the position of midnight on second analog dial 606 (e.g., clock hand 608D will point to marker 606C at midnight in the second time zone). Time zone indicator 608E displays a textual indication (“LAX”, representing Los Angeles) of the time zone associated with second analog dial 606 (e.g., an abbreviation of a geographic location within the time zone associated with second analog dial 606).
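The rotation rates just described (a full rotation per 12 hours for hour hand 608A, per 24 hours for clock hand 608D) imply a direct mapping from time to hand angle. The following Swift sketch, with hypothetical helper names not taken from this disclosure, illustrates the arithmetic:

    // Angle in degrees, clockwise from the 12 o'clock (or midnight) position.
    func hourHandAngle(hour: Int, minute: Int) -> Double {
        let h = Double(hour % 12) + Double(minute) / 60.0
        return h / 12.0 * 360.0        // full rotation every 12 hours
    }

    func dayHandAngle(hour: Int, minute: Int) -> Double {
        let h = Double(hour % 24) + Double(minute) / 60.0
        return h / 24.0 * 360.0        // full rotation every 24 hours
    }

    // At 6:00 AM: hourHandAngle(hour: 6, minute: 0) == 180.0 (halfway around
    // the 12-hour dial), while dayHandAngle(hour: 6, minute: 0) == 90.0 (a
    // quarter turn from midnight marker 606C), matching FIG. 6A.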
In FIG. 6A, second analog dial 606 is a ring that surrounds first analog dial 608 and has a first orientation relative to first analog dial 608. Second analog dial 606 is oriented such that midnight on second analog dial 606 is aligned with the 12 o'clock hour on first analog dial 608. First analog dial 608 and second analog dial 606 are associated with respective time zones. Watch user interface 604A includes time zone indicator 608E of the time zone associated with second analog dial 606 (e.g., a location in the time zone associated with the second analog dial 606).
In FIG. 6A, first analog dial 608 and second analog dial 606 are associated with the same time zone, a first time zone, and the time indicator associated with each dial (e.g., hour hand 608A, minute hand 608B, and/or seconds hand 608C for first analog dial 608, and clock hand 608D for second analog dial 606) indicates the same time (the current time in the first time zone). In FIG. 6A, the first time zone is the Pacific time zone, and the current time in the Pacific time zone is 6:00 AM. Hour hand 608A and minute hand 608B indicate 6:00 AM on first analog dial 608, and clock hand 608D indicates 6:00 AM on second analog dial 606.
In FIG. 6A, second analog dial 606 includes tick marks, representing the positions on second analog dial 606 corresponding to respective hours, and current hour indicator 606D, which includes a numerical indicator of the hour of the current time in the time zone associated with second analog dial 606 (e.g., second analog dial 606 includes a single numerical indicator only for the hour of the current time). In some embodiments, current hour indicator 606D is displayed only if the time zone associated with second analog dial 606 is different from the time zone associated with first analog dial 608. In some embodiments, second analog dial 606 includes numerical indicators at all hour positions or at two or more, but less than all, hour positions.
Second analog dial 606 includes first portion 606A, which corresponds to nighttime in the time zone associated with the second analog dial, and second portion 606B (e.g., the portion of second analog dial 606 that is not included in first portion 606A), which corresponds to daytime in the time zone associated with the second analog dial. First portion 606A and second portion 606B have different visual characteristics (e.g., different color, brightness, transparency, or pattern). The boundary between first portion 606A and second portion 606B that is in the clockwise direction from midnight marker 606C corresponds to a sunrise time (approximately at the 6 o'clock hour position), and the boundary between first portion 606A and second portion 606B that is in the counter-clockwise direction from midnight marker 606C corresponds to the sunset time (approximately at the 8 o'clock hour position). In FIG. 6A, the size (e.g., angular extent) of first portion 606A is smaller than the size of second portion 606B, which indicates that nighttime is shorter than daytime.
In some embodiments, the size and/or position (e.g., the angular extent and/or angular position) of first portion 606A and second portion 606B on second analog dial 606 depends on the time zone, time of year, and/or a geographic location associated with the time zone (e.g., first portion 606A representing nighttime is smaller when it is summer in a location associated with the selected time zone than when it is winter in the same location). In some embodiments, first portion 606A and second portion 606B are displayed differently when second analog dial 606 is associated with a first location in a first time zone than they are when second analog dial 606 is associated with a second location (e.g., a location different from the first location) in the first time zone (e.g., the same time zone). For example, since sunrise and sunset are later in Cleveland than they are in New York City (due to Cleveland being to the west of New York City, even though they are in the same time zone), first portion 606A and second portion 606B are displayed differently when second analog dial 606 is associated with Cleveland than when second analog dial 606 is associated with New York City (e.g., for Cleveland, first portion 606A and second portion 606B are rotated clockwise relative to marker 606C compared to their position for New York City). Similarly, since daytime is longer (e.g., sunrise is earlier and sunset is later) during the summer in Seattle than in San Diego (due to Seattle being at a higher latitude than San Diego, even though they are in the same time zone), first portion 606A and second portion 606B are displayed differently when second analog dial 606 is associated with Seattle than when second analog dial 606 is associated with San Diego (e.g., during summer in Seattle and San Diego, first portion 606A has a smaller angular extent and second portion 606B has a larger angular extent for Seattle as compared to the angular extents for San Diego). Similarly, first portion 606A and second portion 606B are displayed accordingly based on the time of year for a particular location (e.g., first portion 606A representing nighttime has a larger angular extent in winter than in summer, for a particular location).
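Under the behavior described above, the nighttime portion 606A can be derived from the sunrise and sunset times for the selected location and date. A minimal Swift sketch follows (hypothetical names; sunrise and sunset are taken as given inputs rather than computed):

    // Start angle and angular extent of the nighttime portion, in degrees
    // clockwise from midnight marker 606C, given sunrise/sunset as fractional
    // hours (e.g., 6.0 and 20.0). Night wraps through the midnight position.
    func nightPortion(sunrise: Double, sunset: Double) -> (start: Double, extent: Double) {
        let degreesPerHour = 360.0 / 24.0                   // 15 degrees per hour
        let start = sunset * degreesPerHour                 // night begins at sunset
        let extent = (24.0 - sunset + sunrise) * degreesPerHour
        return (start: start, extent: extent)
    }

    // Sunrise 6:00 and sunset 20:00 yield a 150-degree night portion starting
    // at 300 degrees, consistent with FIG. 6A, where nighttime is shorter
    // than daytime.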
FIG. 6B illustrates device 600 displaying watch user interface 604A at a different time (10:09 AM Pacific time) compared to FIG. 6A, as indicated by the position of hour hand 608A and minute hand 608B relative to first analog dial 608, and the position of clock hand 608D relative to second analog dial 606. Current hour indicator 606D is displayed at the 10 o'clock hour on second analog dial 606 according to the current time associated with second analog dial 606, and a tick mark is displayed at the 6 o'clock hour on second analog dial 606, where current hour indicator 606D was located in FIG. 6A when the current time was 6:00 AM.
Device 600 receives (e.g., detects) a request to change the time zone associated with second analog dial 606. In some embodiments, the request includes a sequence of one or more inputs (e.g., one or more of inputs 610, 618, and 620). In FIG. 6B, device 600 receives (e.g., detects) input 610 (e.g., a gesture, a tap on display 602). In some embodiments, input 610 includes a rotation of rotatable input mechanism 603. In some embodiments, rotatable input mechanism 603 is physically connected to device 600 (e.g., to a housing of device 600). In some embodiments, rotatable input mechanism 603 has an axis of rotation that is parallel to a surface of display 602 (e.g., rotatable input mechanism 603 is attached to a side of device 600 that is perpendicular to a surface of display 602).
In response to receiving input 610, device 600 displays watch user interface 612A shown in FIG. 6C. Watch user interface 612A provides a user interface for changing the time zone associated with second analog dial 606.
In watch user interface 612A, second analog dial 606 includes numerical hour indicators at the positions on second analog dial 606 corresponding to respective hours (e.g., the tick marks shown in FIG. 6B are replaced with the numerals shown in FIG. 6C). Display of marker 606C is maintained. Watch user interface 612A includes visual indication 614 of the current time in the time zone associated with second analog dial 606. In FIG. 6C, visual indication 614 includes a circle around the respective numerical hour indicator corresponding to the hour of the current time in the time zone associated with second analog dial 606. In some embodiments, visual indication 614 includes highlighting of the respective numerical hour indicator and/or display of the respective numerical indicator with a different visual characteristic (e.g., style, color, size, font) than the other numerical hour indicators.
Watch user interface 612A includes time zone selection element 616, which displays a designated time zone option corresponding to the time zone associated with the second analog dial. In the embodiment illustrated in FIGS. 6B-6C, time zone selection element 616 replaces the display of first analog dial 608 (e.g., device 600 ceases display of first analog dial 608 and displays time zone selection element 616) and complications 605A-605D are replaced with affordance 607 (e.g., device 600 ceases display of complications 605A-605D and displays affordance 607). In some embodiments, device 600 displays complications 605A-605D in watch user interface 612A. In some embodiments, device 600 does not display affordance 607 in watch user interface 612A.
In the embodiment illustrated in FIG. 6D, time zone selection element 616 includes a list of selectable time zone options arranged according to the difference in time (also referred to as the offset) between the current time in the time zone associated with first analog dial 608 (or the time zone in which device 600 is located) and the respective time zone option. The time zone option corresponding to the time zone associated with second analog dial 606 is designated by being visually distinguished (e.g., placed in focus, emphasized, outlined, displayed without displaying other time zone options, highlighted in a different color than other time zone options, displayed brighter than or with less transparency than other time zone options). In the embodiment illustrated in FIG. 6D, the time zone option corresponding to the time zone associated with second analog dial 606 is visually distinguished by being displayed in the center of time zone selection element 616 and at a larger size than the other time zone options. In some embodiments, the time zone options show the current time in the corresponding time zone and an identifier of the time zone (referred to as a time zone identifier). For example, in FIG. 6C, the option for the Mountain time zone includes the current time in the Mountain time zone (11:09) and text (DEN) indicating a location (Denver) within the Mountain time zone. The style of the time zone identifier can depend on the option. For example, if a particular geographic location is designated for the option (e.g., via a system setting or by a user), then the time zone identifier includes text representing the particular geographic location; if the option corresponds to the time zone in which device 600 is located, then the time zone identifier includes a “current location” symbol (e.g., the arrow to the left of 10:09 in FIG. 6C); and if no particular geographic location is designated for the time zone option and the time zone option does not correspond to the location of device 600, then the time zone identifier includes a numerical indicator of the offset (e.g., since no geographic location is designated for the time zone adjacent to the West of the Pacific time zone, which has a current time of 9:09 corresponding to an offset of one hour behind, the time zone indicator includes the numerical indicator “−1”). In some embodiments, the time zone identifier indicates the offset of the time zone option compared to Coordinated Universal Time (UTC) or Greenwich Mean Time (GMT).
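The three identifier styles just described amount to a simple conditional. The Swift sketch below is an illustrative rendering of that logic only; the type and parameter names are invented, not taken from this disclosure:

    enum TimeZoneIdentifier {
        case location(String)     // e.g., "DEN" for a designated location
        case currentLocation      // the "current location" symbol (arrow in FIG. 6C)
        case numericOffset(Int)   // e.g., -1 when no location is designated
    }

    func identifier(designatedLocation: String?, isDeviceTimeZone: Bool, offsetHours: Int) -> TimeZoneIdentifier {
        if let name = designatedLocation { return .location(name) }
        if isDeviceTimeZone { return .currentLocation }
        return .numericOffset(offsetHours)
    }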
watch user interface 612A,device 600 receives (e.g., detects)input 618. InFIG. 6C ,input 618 includes a rotation ofrotatable input mechanism 603. In some embodiments,input 618 includes a gesture (e.g., a vertical swipe on display 602). In response to receivinginput 618,device 600 displays watchuser interface 612B shown inFIG. 6D . Watchuser interface 612B designates a different time zone option compared toFIG. 6C (e.g.,device 600 changes the designated time zone option in response to input 618). InFIG. 6D , the list of options in timezone selection element 616 has been shifted (e.g., scrolled) compared toFIG. 6C to designate a different time zone (Mountain time), andsecond analog dial 606 is displayed at a different orientation (e.g., rotated) relative to timezone selection element 616, as compared toFIG. 6C , to correspond to the designated time zone option. In some embodiments,device 600 displays an animated rotation ofsecond analog dial 606 and/or an animated scrolling or rotation of the list of options in timezone selection element 616 in response to receivinginput 618. The change insecond analog dial 606 corresponds to the change in timezone selection element 616 such that the hour indicated byvisual indication 614 insecond analog dial 606 corresponds to the hour of the current time associated with the designated time zone option (DEN 11:09). In FIG. 6D,second analog dial 606 is rotated counter-clockwise 1/24th of a complete rotation (e.g., one hour) such that the hour numeral for the 11 o'clock hour is indicated by visual indication 614 (e.g.,visual indication 614 maintains the same position whilesecond analog dial 606 is rotated counter-clockwise). - In the embodiment illustrated in
In the embodiment illustrated in FIGS. 6C-6D, second analog dial 606 is rotated around an axis that is normal to a surface of display 602 and passes through the center of second analog dial 606; the list of time zone options is displayed such that the time zone options appear to rotate about an axis that is perpendicular to the axis of rotation of second analog dial 606 (e.g., the time zone options appear to rotate about an axis that is parallel to an axis of rotation of rotatable input mechanism 603; the time zone options appear to move at least partly in a direction normal to (e.g., toward and away from) a surface of display 602, in addition to moving vertically on display 602).
In some embodiments, device 600 changes the offset by an amount that is based on (e.g., proportional to) a magnitude, speed, and/or direction of input 618 (e.g., an amount of rotation of rotatable input mechanism 603; a distance of a gesture). For example, the list of time zone options is scrolled by an amount proportional to the magnitude of input 618, and second analog dial 606 is rotated by an amount proportional to the magnitude of input 618.
In some embodiments, device 600 changes the offset based on a direction of input 618 (e.g., a direction of rotation of rotatable input mechanism 603; a direction of a gesture). For example, device 600 increases the offset (e.g., moves to a time zone option that is further ahead in time) in response to an input in a first direction (e.g., a clockwise rotation, an upward gesture), and decreases the offset (e.g., moves to a time zone option that is further behind in time) in response to an input in a second direction (e.g., a direction opposite the first direction, a counter-clockwise rotation, a downward gesture).
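Combining the magnitude- and direction-based behaviors of the two preceding paragraphs, a change in offset could be computed from a crown rotation as in the following Swift sketch. The 30-degree detent size is an assumed value for illustration; the text does not specify one:

    // Hours of offset change for a given crown rotation. Clockwise (positive)
    // rotation moves the designated option ahead in time; counter-clockwise
    // moves it behind. The detent size is an assumption, not from the text.
    func offsetChange(crownRotationDegrees: Double) -> Int {
        let degreesPerHour = 30.0
        return Int((crownRotationDegrees / degreesPerHour).rounded(.towardZero))
    }

    // offsetChange(crownRotationDegrees: 75.0) == 2
    // offsetChange(crownRotationDegrees: -75.0) == -2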
In FIG. 6D, device 600 receives (e.g., detects) input 620 (e.g., a gesture, a rotation of rotatable input mechanism 603). In FIG. 6D, input 620 includes a rotation of rotatable input mechanism 603. In some embodiments, input 620 is a continuation of input 618 (e.g., further rotation of rotatable input mechanism 603). In response to input 620, device 600 displays watch user interface 612C shown in FIG. 6E. Watch user interface 612C designates the time zone option corresponding to the time zone that is eight hours ahead of the time zone associated with first analog dial 608 (or the time zone in which device 600 is located), corresponding to an offset of +8 hours. In the example illustrated in FIG. 6E, the designated time zone option corresponds to the time zone in which London (LON) is located, where the current time is 6:09 PM (18:09 in 24-hour time). Second analog dial 606 is positioned to correspond to the designated time zone option such that the numerical indicator for the 18 o'clock hour is indicated by visual indication 614 (e.g., visual indication 614 maintains the same position while second analog dial 606 is rotated counter-clockwise from the orientation shown in FIG. 6D). As the time zone option is changed, first portion 606A and second portion 606B are displayed (e.g., updated) according to the designated option (e.g., to represent daytime and nighttime based on the geographic location and time of year for the selected option, as described above). For example, first portion 606A and second portion 606B indicate sunrise and sunset times of approximately 6 AM and 8 PM, respectively, for Los Angeles in FIG. 6C, whereas they indicate sunrise and sunset times of 7 AM and 7 PM, respectively, for London in FIG. 6E.
In FIG. 6E, device 600 receives (e.g., detects) input 622. In the embodiment illustrated in FIG. 6E, input 622 includes a tap on an affordance (e.g., “SET” affordance 607) on display 602. In some embodiments, input 622 includes a press of rotatable and depressible input mechanism 603. In some embodiments, input 622 includes a contact on display 602 (e.g., a contact anywhere on display 602, a contact at a location outside of second analog dial 606, a tap on time zone selection element 616).
In response to input 622, device 600 associates the time zone option designated in FIG. 6E (e.g., the time zone option that is designated at the time of input 622) with second analog dial 606 (e.g., in response to input 622, device 600 sets the time zone associated with second analog dial 606 to the time zone corresponding to the time zone option that is designated at the time of input 622).
In response to input 622, device 600 displays an animation, an embodiment of which is illustrated in FIGS. 6F-6G, resulting in display of watch user interface 604B. In some embodiments, device 600 displays watch user interface 604B in response to input 622 without the animation illustrated by FIGS. 6F-6G or with an animation different from the animation illustrated by FIGS. 6F-6G.
As shown in FIG. 6F, device 600 ceases to display affordance 607 and time zone selection element 616, and displays first analog dial 608, hour hand 608A, minute hand 608B, and clock hand 608D. In FIG. 6F, compared to watch user interface 612C, second analog dial 606 includes tick marks indicating the positions of respective hours, and marker 606C, similar to the appearance of second analog dial 606 in FIGS. 6A-6B. In some embodiments, the numerical hour indicators shown in FIG. 6E fade out and the tick marks shown in FIG. 6F fade in. In FIG. 6G, complications 605A-605D are displayed (e.g., all at the same time, one at a time, while the tick marks are displayed, after the tick marks are displayed).
Watch user interface 604B is similar to watch user interface 604A, except that second analog dial 606 is displayed at a different orientation relative to first analog dial 608, clock hand 608D indicates, on second analog dial 606, the current time in the time zone selected in FIGS. 6C-6E, and current hour indicator 606D indicates the hour of the current time in the time zone selected in FIGS. 6C-6E. The orientation of second analog dial 606 relative to first analog dial 608 corresponds to the offset between the time zone associated with second analog dial 606 and the time zone associated with first analog dial 608. In watch user interface 604B, time zone indicator 608E displays a textual indication (“LON”) of the time zone associated with second analog dial 606 (e.g., an abbreviation of a geographic location within the time zone associated with second analog dial 606).
In some embodiments, the position of clock hand 608D relative to first analog dial 608 indicates the current time in the time zone associated with first analog dial 608, regardless of the orientation of second analog dial 606 relative to first analog dial 608 (e.g., clock hand 608D indicates the current time in the time zone associated with first analog dial 608 as if first analog dial 608 represented a 24-hour period of time; clock hand 608D points to the 12 o'clock hour on first analog dial 608 at midnight in the time zone associated with first analog dial 608 and points to the 3 o'clock hour on first analog dial 608 at 6:00 AM in the time zone associated with first analog dial 608).
Turning to FIG. 6H, watch user interface 604B is displayed at a different (e.g., later) time compared to FIG. 6G. In FIG. 6H, the current time in the time zone associated with first analog dial 608 is 11:00 AM, as indicated by hour hand 608A and minute hand 608B. The corresponding current time in the time zone associated with second analog dial 606 is 7:00 PM (19:00 in 24-hour time). Second analog dial 606 has the same orientation relative to first analog dial 608 as in FIG. 6G (e.g., the orientation of second analog dial 606 relative to first analog dial 608 remains the same (e.g., is maintained) as time advances as long as the time zone associated with second analog dial 606 is not changed). Clock hand 608D indicates the current time in the time zone associated with second analog dial 606 by being positioned at the location on the second analog dial representing 19:00. Compared to watch user interface 604B in FIG. 6G, clock hand 608D is rotated clockwise (e.g., clock hand 608D advances clockwise at a rate of 1/24th of a full rotation per hour) and current hour indicator 606D is displayed at the 19 o'clock position instead of the 18 o'clock position. In some embodiments, current hour indicator 606D advances to the next adjacent hour position at the top of an hour (e.g., when the current time changes from 18:59 to 19:00).
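The behavior in FIG. 6H reduces to two rules: the 24-hour hand advances continuously (1/24th of a rotation per hour), while current hour indicator 606D snaps to a new slot only at the top of the hour. A small Swift sketch of the latter rule, with hypothetical names and under the same assumptions as the earlier angle sketch:

    // Hour slot (0-23) occupied by current hour indicator 606D. Minutes are
    // intentionally discarded: at 18:59 the indicator stays in slot 18, and
    // it advances to slot 19 only when the time reaches 19:00.
    func currentHourSlot(hour: Int, minute: Int) -> Int {
        return hour % 24
    }

    // At 19:00 the 24-hour hand sits 19/24 of a full turn (285 degrees) from
    // the midnight marker, and the numeral occupies slot 19, as in FIG. 6H.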
FIGS. 7A-7C are a flow diagram illustrating methods of displaying and enabling an adjustment of a displayed time zone, in accordance with some embodiments. Method 700 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone). Some operations in method 700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.

As described below, method 700 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges.

The computer system (e.g., 600) displays (702), via the display generation component (e.g., 602), a watch user interface (e.g., 604A) (e.g., showing one or more times via an analog clock), wherein displaying the watch user interface includes concurrently displaying a first analog dial (e.g., 608) (e.g., a 12-hour dial) and a first time indicator (e.g., 608A or 608B) (e.g., an hour hand or an hour hand and a minute hand) that indicates a current time in a first time zone on the first analog dial (e.g., the current time; the time of the current time zone) (704), and a second analog dial (e.g., 606) (e.g., a 24-hour dial) and a second time indicator (e.g., 608D) (e.g., an hour hand) that indicates a current time in a second time zone on the second analog dial, wherein the second analog dial is displayed at a first orientation relative to the first analog dial (e.g., based on the difference between the first time zone and the second time zone) (706).
In some embodiments, the same time is indicated on both the first analog dial and the second analog dial. In some embodiments, the second time indicator is displayed in a different color and/or shape than the first time indicator. In some embodiments, the second analog dial surrounds the outside of the first analog dial. In some embodiments, the second analog dial includes a graphical indicator (e.g., 606C) (e.g., a marker; a triangular marker) of the midnight mark (e.g., the 24-hour mark of the 24-hour dial). Concurrently displaying the first analog dial that indicates the current time in the first time zone and the second analog dial that indicates the current time in the second time zone enables a user to quickly and easily view current times for different time zones with a reduced number of inputs. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
After displaying the watch user interface (e.g., 604A) with the first analog dial (e.g., 608) and the second analog dial (e.g., 606) that is displayed at a first orientation relative to the first analog dial (708), the computer system (e.g., 600) receives (710), via the one or more input devices, a request (e.g., 610, 618, 620) to change a time zone associated with the second analog dial (e.g., a time zone that is shown/represented via the second analog dial).
In response to receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial (e.g., 606) (716), the computer system (e.g., 600) changes (718) the time zone associated with the second analog dial to a third time zone that is different from the first time zone.
While the second analog dial (e.g., 606) is associated with (e.g., set to) the third time zone (720), the computer system (e.g., 600) displays (722), via the display generation component (e.g., 602), the watch user interface (e.g., 604A).
Displaying the watch user interface (e.g., 604A) includes concurrently displaying the first analog dial (e.g., 608) and the first time indicator (e.g., 608A or 608B) indicating a current time in the first time zone (e.g., the first time; the first time plus the amount of time that has passed since detecting the user input and rotating the second analog dial) on the first analog dial (724), and the second analog dial (e.g., 606) and the second time indicator (e.g., 608D) indicating a current time in the third time zone on the second analog dial, wherein the second analog dial is displayed at a second orientation relative to the first analog dial (e.g., based on the difference between the first time zone and the third time zone) (726). Displaying the current time in the third time zone on the second analog dial with the second analog dial being displayed at a second orientation relative to the first analog dial enables a user to efficiently view the current time at the third time zone relative to the current time at the first time zone. Providing additional features on a user interface without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the first analog dial (e.g., 608) represents a period of 12 hours, the first time indicator (e.g., 608A or 608B) includes at least a first clock hand (e.g., an hour hand) that indicates, on the first analog dial, the current time in the first time zone (e.g., the position of the first clock hand relative to the first analog dial indicates the current time in the first time zone), the second analog dial (e.g., 606) represents a period of 24 hours, and the second time indicator (e.g., 608D) includes a second clock hand (e.g., an alternative hour hand) that indicates, on the second analog dial, the current time in the time zone associated with the second analog dial (e.g., the position of the second clock hand relative to the second analog dial indicates the current time in the time zone associated with the second analog dial). Providing the first analog dial that represents a period of 12 hours and the second analog dial that represents a period of 24 hours enables a user to easily distinguish between the two analog dials, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while the second analog dial (e.g., 606) is associated with (e.g., set to) the third time zone (720), wherein the third time zone is different from the first time zone (e.g., the first analog dial and the second analog dial are indicating current times at different time zones), the computer system (e.g., 600) displays (728), in the second analog dial, a numerical indication (e.g., 606D) of an hour of the current time in the third time zone without displaying, in the second analog dial, a numerical indication of any other hour. In some embodiments, while the second analog dial is associated with (e.g., set to) the third time zone, wherein the third time zone is different from the first time zone (e.g., the first analog dial and the second analog dial are indicating current times at different time zones), the computer system displays, in the second analog dial, a numerical indication of an hour of the current time in the third time zone and numerical indications of a subset of (e.g., but not all of) other hours (e.g., one or more hours before and/or after the current hour, but not all 24 hours).
In some embodiments, the watch user interface (e.g., 604A) includes a text indication (e.g., 608E; a name; an abbreviation of the name) of a location (e.g., city; country; geographic region) associated with the second analog dial (e.g., 606) (730). Including the text indication of the location associated with the second analog dial in the watch user interface enables a user to easily identify the time zone displayed via the second analog dial, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the second analog dial (e.g., 606) includes (732) a first portion (e.g., 606B) that corresponds to daytime in the time zone (e.g., represented by portion 606B in FIGS. 6A-6B and 6G-6H) associated with the second analog dial (e.g., the daytime hours; beginning at a point in the second analog dial (e.g., a first boundary between portion 606A and portion 606B in FIGS. 6A-6B and 6G-6H) corresponding to a sunrise time and ending at a point in the second analog dial (e.g., a second boundary between portion 606A and portion 606B in FIGS. 6A-6B and 6G-6H) corresponding to the sunset time), wherein the first portion includes a first visual characteristic (e.g., a first color; a first brightness/dimness level) (734), and a second portion (e.g., 606A) (e.g., the remaining portion of the second analog dial other than the first portion) that corresponds to nighttime in the time zone (e.g., represented by portion 606A in FIGS. 6A-6B and 6G-6H) associated with the second analog dial (e.g., the nighttime hours; beginning at the point in the second analog dial corresponding to the sunset time and ending at the point in the second analog dial corresponding to the sunrise time), wherein the second portion includes a second visual characteristic different from the first visual characteristic (e.g., a second color; a second brightness/dimness level) (736). Providing the first portion that corresponds to daytime and the second portion that corresponds to nighttime in the time zone associated with the second analog dial provides information about daytime/nighttime hours at the time zone associated with the second analog dial in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, a first position in the second analog dial (e.g., 606) (e.g., the point in the second analog dial corresponding to the sunrise time) that corresponds to a beginning point for the first portion (e.g., 606B) and an ending point for the second portion (e.g., 606A) and a second position in the second analog dial (e.g., the point in the second analog dial corresponding to the sunset time) that corresponds to an ending point for the first portion and a beginning point for the second portion are determined (e.g., automatically) based on geographic location (e.g., the location (e.g., city; region) corresponding to the respective time zone) and time of year (e.g., the current month; the current season).
In some embodiments, receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial (e.g., 606) includes detecting, via the one or more input devices (e.g., a touch-sensitive surface integrated with the display generation component), user input (e.g., 610) (e.g., touch input) directed to a location (e.g., the center region) on the watch user interface (e.g., 604A) (712). In some embodiments, the request is received while the computer system (e.g., 600) is displaying or causing display of, via the display generation component (e.g., 602), the watch user interface, and receiving the request does not require access of a menu or a dedicated editing mode to edit the second analog dial. In some embodiments, changing (e.g., shifting; rotating) the second analog dial does not cause a change to other aspects or features of the watch user interface (e.g., the first analog dial; the first indication of time; displayed watch complications).
In some embodiments, receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial (e.g., 606) includes detecting, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), rotational input (e.g., 618, 620) (e.g., in a clockwise direction; in a counter-clockwise direction) of a rotatable input mechanism (e.g., 603) (714).

In some embodiments, changing the time zone associated with the second analog dial (e.g., 606) to a third time zone (e.g., the time zone corresponding to “LON” in FIGS. 6E-6H) that is different from the first time zone (e.g., the current time zone associated with first analog dial 608 in FIGS. 6A-6B) includes (e.g., in accordance with detecting an input (e.g., 618, 620) directed to rotating the second analog dial (e.g., while detecting the input directed to rotating the second analog dial)) rotating (e.g., where the rotation is displayed (e.g., as an animation) while an input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) is being received), about a first rotational axis, the second analog dial (e.g., 606) to a respective orientation relative to the first analog dial (e.g., 608) (e.g., while the first analog dial is not rotated) (e.g., from the orientation of the second analog dial relative to the first analog dial as in FIG. 6C to the orientation of the second analog dial relative to the first analog dial as in FIG. 6E), wherein the first rotational axis is perpendicular to a surface of the display generation component (e.g., 602). In some embodiments, the first rotational axis goes through a center of the display generation component (e.g., 602). In some embodiments, the first rotational axis is perpendicular to an axis of rotation of the input directed to rotating the second analog dial. Rotating the second analog dial about the first rotational axis, where the first rotational axis is perpendicular to a surface of the display generation component, when changing the time zone associated with the second analog dial provides visual feedback of the time zone being changed in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, in accordance with a determination that the input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to rotating the second analog dial is in a first direction (e.g., a clockwise direction), the computer system (e.g., 600) rotates the second analog dial (e.g., 606) in the first direction (e.g., the clockwise direction) about a first rotational axis (e.g., a first axis going through the center of the watch user interface/display generation component that is perpendicular to the display generation component).

In some embodiments, in accordance with a determination that the input (e.g., a rotational input on the rotatable input device (e.g., 603); a touch input such as a swipe or pinch input) directed to rotating the second analog dial (e.g., 606) is in a second direction (e.g., a counter-clockwise direction) (e.g., an input that is in the opposite direction to inputs 618 and 620 in FIGS. 6C-6D), the computer system (e.g., 600) rotates the second analog dial (e.g., 606) in the second direction (e.g., the counter-clockwise direction) about the first rotational axis.
In some embodiments, the rotational axis of the detected input (e.g., a rotational input; a touch input (e.g., a two-finger twisting input)) is perpendicular to the first rotational axis for rotation of the second analog dial (e.g., 606). In some embodiments, the rotational axis of the detected input (e.g., a rotational input; a touch input) is parallel to the first rotational axis for rotation of the second analog dial. In some embodiments, the amount of rotation (e.g., amount of angle of rotation) of the second dial corresponds to (e.g., is directly proportional to) a magnitude of the user input (e.g., an angular magnitude of a rotation of the rotatable input device).
In some embodiments, while (e.g., and only while) the second analog dial (e.g., 606) is being rotated, the computer system (e.g., 600) displays or causes display of, in the second analog dial, numbers corresponding to each time mark (e.g., each hour mark) in the second analog dial.
FIGS. 6E-6H ) that is different from the first time zone (e.g., the current time zone associated withfirst analog dial 608 inFIGS. 6A-6B ) includes (e.g., in accordance with detecting an input (e.g., 618, 620) directed to rotating a rotatable user interface element (e.g., 616) (e.g., while detecting the input directed to rotating the rotatable user interface element)) rotating, about a second rotational axis, the rotatable user interface element (e.g., as shown via rotation of timezone selection element 616 inFIGS. 6C-6E ) (e.g., while concurrently rotating the second analog dial (e.g., 606) to reflect the changing time zone), wherein the second rotational axis is parallel with a surface of the display generation component (e.g., 602). In some embodiments, the second rotational axis is perpendicular to the first rotational axis. Rotating the rotatable user interface element (e.g., while concurrently rotating the second analog dial to reflect the changing time zone) about the second rotational axis, where the second rotational axis is parallel with a surface of the display generation component, when changing the time zone associated with the second analog dial provides visual feedback of the time zone being changed in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, in accordance with a determination that the input (e.g., 618, 620) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to rotating the rotatable user interface element (e.g., 616) is in a first direction (e.g., a clockwise direction), the computer system (e.g., 600) rotates the rotatable user interface element in the first direction (e.g., the clockwise direction) about a second rotational axis (e.g., a second axis that is parallel with the display generation component). In some embodiments, in accordance with a determination that the input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to rotating the rotatable user interface element is in a second direction (e.g., counter-clockwise direction), the computer system rotates the second analog dial in the second direction (e.g., the counter-clockwise direction) about the second rotational axis.
In some embodiments, in accordance with a determination that the input (e.g., 618, 620) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to rotating the rotatable user interface element (e.g., 616) is in a first direction (e.g., a clockwise direction), the computer system (e.g., 600) rotates the rotatable user interface element in the first direction (e.g., the clockwise direction) about a second rotational axis (e.g., a second axis that is parallel with the display generation component). In some embodiments, in accordance with a determination that the input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to rotating the rotatable user interface element is in a second direction (e.g., a counter-clockwise direction), the computer system rotates the second analog dial in the second direction (e.g., the counter-clockwise direction) about the second rotational axis.
In some embodiments, the rotational input is directed via a rotatable input device (e.g., 603) for which the rotational axis is parallel to the second rotational axis for rotation of the rotatable user interface element (e.g., 616).

In some embodiments, time zone options that can be selected from the rotatable user interface element (e.g., 616) include cities/countries/regions (e.g., shown with abbreviations) (e.g., as shown via time zone selection element 616 in FIGS. 6C-6E). In some embodiments, time zone options that can be selected from the rotatable user interface element include numerical offsets (e.g., both plus and minus) (e.g., the top two time zone options shown in time zone selection element 616 in FIG. 6C) from the current time zone (e.g., the first time zone) corresponding to the time zone of the physical location of the computer system (e.g., 600) (e.g., the center time zone shown in time zone selection element 616 in FIG. 6C), where the offsets indicate the time difference between a respective different time zone and the current time zone (and where the offset is zero if there is no difference between the time zones).
- In some embodiments, in accordance with changing the time zone associated with the second analog dial (e.g., 606) to a third time zone that is different from the first time zone, the computer system (e.g., 600) adjusts, in the second analog dial, a visual indication of daytime (e.g., 606B) (e.g., daytime hours; the time between sunrise and sunset) to indicate daytime at the third time zone (e.g., instead of at the second time zone), wherein adjusting the visual indication of daytime to indicate daytime at the third time zone includes transitioning from visually distinguishing (e.g., using a first color; a first shade) a first portion of the second analog dial (e.g., 606B in
FIG. 6B ) (from the remaining portion of the second analog dial) to visually distinguishing a second portion of the second analog dial (e.g., 606B inFIG. 6D ) (from the remaining portion of the second analog dial), the second portion of the second analog dial corresponding to the visual indication of daytime at the third time zone. In some embodiments, the visual indication of daytime includes the portion of the second analog dial corresponding to the daytime hours being shown (e.g., colored; brightened or dimmed) with a first visual characteristic while the remaining portion (e.g., 606A) of the second analog dial that does not correspond to the daytime hours is not shown with the first visual characteristic. In some embodiments, the portion (e.g., 606B) of the second analog dial corresponding to the daytime hours is of a first size and the remaining portion (e.g., 606A) of the second analog dial that do not correspond to the daytime hours are of a second size that is different from the first size. Adjusting the visual indication of daytime (e.g., daytime hours; the time between sunrise and sunset) to indicate daytime at the new time zone in the second analog dial when the time zone is changed provides information about the different daytime/nighttime hours at the new time zone in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, even within the same time zone, the portion of the second analog dial corresponding to the daytime hours (e.g., 606B) and the remaining portion of the second analog dial that do not correspond to the daytime hours (e.g., 606A) can change (e.g., because different regions/locations within the same time zone can have different daytime hours). In some embodiments, at a first location (e.g., a first city; a first region) (e.g., “CHI” as shown via time
zone selection element 616 in FIG. 6D) within a respective time zone, the portion of the second analog dial corresponding to the daytime hours has the first size (e.g., size of 606B in FIGS. 6A-6B) and the remaining portion of the second analog dial that does not correspond to the daytime hours has the second size (e.g., size of 606A in FIGS. 6A-6B) different from the first size. In some embodiments, at a second location (e.g., a second city; a second region) (e.g., "DAL" as shown via the rotatable user interface element in FIG. 6D) within the respective time zone, the portion of the second analog dial corresponding to the daytime hours has a third size different from the first size and the remaining portion of the second analog dial that does not correspond to the daytime hours has a fourth size different from the second size. - In some embodiments, receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial (e.g., 606) includes receiving a selection of (e.g., via a (e.g., rotatable) user interface element (e.g., 616) displayed in the watch user interface (e.g., 604A) that includes a plurality of selectable time zone options) a geographic location (e.g., a country; a geographic region) in the third time zone. In some embodiments, in response to receiving the selection of the geographic location in the third time zone, in accordance with a determination that the geographic location corresponds to a first location in the third time zone (e.g., a first city within the third time zone), the computer system (e.g., 600) displays, in the second analog dial (e.g., 606), a visual indication (e.g., via a different visual characteristic; via a different shade; via a different color) of daytime (e.g., 606B in
FIG. 6B) (e.g., daytime hours; the time between sunrise and sunset) at a first position within the second analog dial (which indicates daytime hours at the first location in the third time zone). In some embodiments, in response to receiving the selection of the geographic location in the third time zone, in accordance with a determination that the geographic location corresponds to a second location in the third time zone (e.g., a second city within the third time zone), the computer system displays, in the second analog dial, the visual indication (e.g., via a different visual characteristic; via a different shade; via a different color) of daytime (e.g., 606B in FIG. 6D) (e.g., daytime hours; the time between sunrise and sunset) at a second position within the second analog dial (which indicates daytime hours at the second location in the third time zone). In some embodiments, the visual indication of daytime at the first location is a different size/length and/or encompasses (e.g., covers) a different portion of the second analog dial than the visual indication of daytime at the second location (e.g., because the amount of daytime is different between the first location and the second location). Adjusting the visual indication of daytime (e.g., daytime hours; the time between sunrise and sunset) to indicate daytime at the new time zone in the second analog dial when the time zone is changed provides information about the different daytime/nighttime hours at the new time zone in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, changing the time zone associated with the second analog dial (e.g., 606) to the third time zone includes changing a numerical indicator (e.g., 606D) (e.g., in the second analog dial) corresponding to the current time indicated by the second time indicator (e.g., 608D) from a first value (e.g., the hour number for a first hour) corresponding to the current time at the second time zone to a second value (e.g., the hour number for a second hour) corresponding to the current time at the third time zone. Changing the numerical indicator corresponding to the current time indicated by the second time indicator to the second value corresponding to the current time at the third time zone enables a user to quickly and easily identify the current time at the third time zone when the time zone is first changed. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
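- The geometry behind the visual indication of daytime can be sketched briefly. The Swift snippet below is a minimal illustration, assuming the second analog dial is a 24-hour ring and that sunrise/sunset times for the selected location are already available; the type and function names are assumptions introduced for the example:

```swift
import Foundation

// Minimal sketch, assuming a 24-hour dial and known sunrise/sunset hours
// for the selected location; names are illustrative, not from the patent.
struct DaytimeArc {
    let startDegrees: Double  // where the daytime portion (e.g., 606B) begins
    let sweepDegrees: Double  // angular size of the daytime portion
}

// Maps hours of the day (0..<24) onto a 360-degree ring: 15 degrees per hour.
func daytimeArc(sunriseHour: Double, sunsetHour: Double) -> DaytimeArc {
    let degreesPerHour = 360.0 / 24.0
    // Wrap around midnight so the daylight span is always non-negative.
    let daylightHours = (sunsetHour - sunriseHour + 24)
        .truncatingRemainder(dividingBy: 24)
    return DaytimeArc(startDegrees: sunriseHour * degreesPerHour,
                      sweepDegrees: daylightHours * degreesPerHour)
}

// Two locations in one time zone can yield arcs of different sizes and
// positions (cf. the "CHI" and "DAL" examples above):
let chicago = daytimeArc(sunriseHour: 5.5, sunsetHour: 20.3)
let dallas  = daytimeArc(sunriseHour: 6.3, sunsetHour: 20.6)
print(chicago.sweepDegrees, dallas.sweepDegrees)  // different daytime portions
```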
- In some embodiments, in response to receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial (e.g., 606), the computer system (e.g., 600) displays, in the watch user interface (e.g., 604A) (e.g., inside the second analog dial; in place of the first analog dial), a (e.g., rotatable) user interface element (e.g., 616) that includes a plurality of (e.g., list of; a rotatable list of) selectable time zone options, wherein the plurality of selectable time zone options are arranged (e.g., ordered) based on an amount of time offset (e.g., plus/minus a certain number of hours) between the first time zone and respective time zone options of the plurality of selectable time zone options. Displaying the user interface element that includes a plurality of (e.g., list of; a rotatable list of) selectable time zone options, where the plurality of selectable time zone options are arranged (e.g., ordered) based on an amount of time offset enables a user to efficiently navigate (e.g., scroll) through the selectable time zone options as the time zone options are arranged in an intuitive manner. Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
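- The arrangement by offset amount reduces to a simple sort. The sketch below is illustrative; the option type and its fields are assumptions made for the example, and the ordering places "minus" offsets before zero and "plus" offsets after it:

```swift
import Foundation

// Hypothetical model of one selectable option in the rotatable element;
// the type and field names are assumptions, not the patent's code.
struct TimeZoneOption {
    let zone: TimeZone
    let label: String  // city abbreviation (e.g. "CHI") or numeric offset
}

// Orders options by their signed offset from the current (first) time zone,
// so scrolling moves monotonically from negative offsets through zero to
// positive offsets.
func arranged(_ options: [TimeZoneOption], reference: TimeZone = .current,
              at date: Date = Date()) -> [TimeZoneOption] {
    let base = reference.secondsFromGMT(for: date)
    return options.sorted {
        ($0.zone.secondsFromGMT(for: date) - base) <
            ($1.zone.secondsFromGMT(for: date) - base)
    }
}
```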
- In some embodiments, the plurality of selectable time zone options (e.g., shown via 616) includes a first time zone option corresponding to a designated geographic location (e.g., a first city; a first country; a first geographic region (e.g., a saved time zone; a favorite time zone; a time zone that is selected and/or stored in a world clock application)), and wherein the displayed first time zone option includes a text indication (e.g., an abbreviation) of the designated geographic location, and a second time zone option that does not correspond to a designated geographic location (e.g., a time zone that is not saved, favorited, or otherwise stored or selected in a world clock application or a different application), wherein the displayed second time zone option includes a numerical indication (e.g., a plus or minus number) of a respective amount of time offset (e.g., plus/minus a certain number of hours) between the first time zone and the time zone corresponding to the second time zone option.
- In some embodiments, the plurality of selectable time zone options (e.g., shown via 616) includes a third time zone option corresponding to a first geographic location (e.g., a first city; a first country; a first geographic region), wherein the first geographic location corresponds to a first time zone (e.g., a saved time zone; a favorited time zone; a time zone that is selected and/or stored in a world clock application), wherein the displayed third time zone option includes a text indication (e.g., an abbreviation) of the first geographic location, and a fourth time zone option corresponding to a second geographic location different from the first geographic location, wherein the second geographic location corresponds to the first time zone, and wherein the fourth time zone option includes a text indication (e.g., an abbreviation) of the second geographic location.
- In some embodiments, in response to receiving the request (e.g., 610, 618, 620) to change the time zone associated with the second analog dial, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), the watch user interface (e.g., 604A), wherein displaying the watch user interface includes concurrently displaying a selectable user interface object (e.g., 607; a confirmation affordance; a “set” or “done” option) for confirming the change in time zone for the second analog dial (e.g., 606). In some embodiments, the computer system detects, via the one or more input devices (e.g., a touch-sensitive surface integrated with the display generation component), activation (e.g., selection) (e.g., 622) of the selectable user interface object. In some embodiments, in response to detecting the activation of the selectable user interface object, the computer system sets the second analog dial and the second time indicator (e.g., 608D) to indicate the current time in the third time zone on the second analog dial (e.g., and ceasing display of the selectable user interface object).
- Note that details of the processes described above with respect to method 700 (e.g.,
FIGS. 7A-7C) are also applicable in an analogous manner to the methods described below. For example, method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 700. For example, a watch user interface as described with reference to FIGS. 6A-6H can include and be used to perform a counting operation as described with reference to FIGS. 8A-8M. For another example, method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 700. For example, a device can use as a watch user interface either a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC or a watch user interface as described with reference to FIGS. 6A-6H. For another example, method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 700. For example, a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a watch user interface as described with reference to FIGS. 6A-6H. For another example, method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 700. For example, a background of a watch user interface as described with reference to FIGS. 6A-6H can be created or edited via the process for updating a background as described with reference to FIGS. 14A-14AD. For another example, method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 700. For example, the process for changing one or more complications of a watch user interface as described with reference to FIGS. 16A-16AE can be used to change one or more complications of a watch user interface as described with reference to FIGS. 6A-6H. For brevity, these details are not repeated below. -
FIGS. 8A-8M illustrate exemplary user interfaces for initiating a measurement of time, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 9A-9B. -
FIG. 8A illustrates device 600 displaying watch user interface 800, which includes analog clock face 804, hour hand 802A, minute hand 802B, and seconds hand 802C. Analog clock face 804 includes bezel 804A (e.g., a ring representing a 12-hour period of time with respect to hour hand 802A and a 60-minute period of time with respect to minute hand 802B) and graphical indicator 806. In some embodiments, bezel 804A includes graphical indicator 806 (e.g., graphical indicator 806 is fixed to a position of bezel 804A). In some embodiments, graphical indicator 806 is independent from at least some portion of bezel 804A (e.g., graphical indicator 806 can be displayed independently from at least some portion of bezel 804A or change position relative to at least some portion of bezel 804A). - In
FIG. 8A, minute hand 802B has a length such that it at least partially overlaps (e.g., extends into) bezel 804A. Bezel 804A has visual indicators (e.g., tick marks, numerals) around bezel 804A (e.g., at 12 evenly-spaced positions), including graphical indicator 806. In FIG. 8A, bezel 804A and graphical indicator 806 are displayed at respective orientations relative to analog clock face 804. The 12 o'clock (or zero minutes) position of bezel 804A is aligned with the 12 o'clock position of analog clock face 804 (e.g., the position vertically upward from origin 801), and graphical indicator 806 is positioned at the 12 o'clock (or zero minutes) position with respect to bezel 804A and the 12 o'clock position with respect to analog clock face 804. - In
FIG. 8A, device 600 receives (e.g., detects) input 808. In the embodiment illustrated in FIG. 8A, input 808 includes a gesture (e.g., a tap on display 602). In some embodiments, input 808 includes a rotation of rotatable input mechanism 603 or a press of a button (e.g., a press of rotatable and depressible input mechanism 603 or hardware button 613). In some embodiments, input 808 can be anywhere on display 602. In some embodiments, input 808 must correspond to selection of analog clock face 804 (e.g., a location on display 602 inside the outer boundary of bezel 804A). For example, in response to an input on analog clock face 804, device 600 performs a first function (e.g., rotates bezel 804A and starts counter 810 as described below); and in response to an input that is not on analog clock face 804, device 600 performs a different function (e.g., if the input is on one of complications 805A-805D, device 600 launches an application corresponding to the selected complication) or no function at all. - In response to
input 808, device 600 displays watch user interface 800 as shown in FIGS. 8B-8C. In FIG. 8B, device 600 displays counter 810 and, compared to FIG. 8A, the length of minute hand 802B is shortened (e.g., such that minute hand 802B does not overlap bezel 804A), bezel 804A and graphical indicator 806 are rotated clockwise, and a visual characteristic (e.g., fill color, fill pattern, outline color, brightness, transparency) of hour hand 802A and minute hand 802B is changed. Counter 810 is an example of a graphical indication of time (e.g., the time that has elapsed since device 600 received input 808). - In
FIG. 8C, bezel 804A and graphical indicator 806 are displayed at positions (e.g., orientations) relative to analog clock face 804 such that graphical indicator 806 is aligned with minute hand 802B (e.g., graphical indicator 806 snaps into alignment with minute hand 802B in response to receiving input 808), and counter 810 is updated to show that one second has elapsed (e.g., since device 600 received input 808, since graphical indicator 806 became aligned with minute hand 802B). In FIG. 8C, the length of minute hand 802B is displayed (e.g., remains) such that minute hand 802B does not overlap bezel 804A. - In some embodiments,
device 600 automatically aligns graphical indicator 806 with minute hand 802B in response to receiving input 808 (e.g., a user does not have to provide input to adjust the position of graphical indicator 806 to align it with minute hand 802B; inputs of different magnitude (e.g., amount of rotation of rotatable input mechanism 603; a duration or spatial length of input 808 (e.g., angular extent of a twist gesture)) result in alignment of graphical indicator 806 with minute hand 802B). For example, in response to receiving a single tap on analog clock face 804, device 600 aligns graphical indicator 806 with minute hand 802B (e.g., by rotating bezel 804A) without further user input. In some embodiments, device 600 generates a tactile output when graphical indicator 806 reaches minute hand 802B (e.g., in conjunction with graphical indicator 806 reaching alignment with minute hand 802B).
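- The snap-to-alignment behavior can be captured compactly. The Swift sketch below is a minimal illustration, assuming a simple bezel model and, on watchOS, WatchKit's haptic playback; the type names and the particular haptic are assumptions, not the disclosed implementation:

```swift
import Foundation
#if canImport(WatchKit)
import WatchKit
#endif

// Illustrative sketch; types and the haptic choice are assumptions.
struct BezelState {
    var rotationDegrees: Double = 0  // 0 puts the zero marker at 12 o'clock
}

// Angle of the minute hand, measured clockwise from the 12 o'clock position.
func minuteHandDegrees(for date: Date = Date(),
                       calendar: Calendar = .current) -> Double {
    let minute = Double(calendar.component(.minute, from: date))
    let second = Double(calendar.component(.second, from: date))
    return (minute + second / 60.0) * 6.0  // 360 degrees / 60 minutes
}

// Rotates the bezel so its zero marker sits under the minute hand, without
// further user input, and plays a tactile click once alignment is reached.
func snapBezelToMinuteHand(_ bezel: inout BezelState) {
    bezel.rotationDegrees = minuteHandDegrees()
    #if canImport(WatchKit)
    WKInterfaceDevice.current().play(.click)  // assumed haptic type
    #endif
}
```

- In some embodiments, the transition from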
FIG. 8A to FIG. 8C is animated (e.g., device 600 displays an animation of bezel 804A rotating until graphical indicator 806 is aligned with minute hand 802B). In some embodiments, device 600 displays bezel 804A in the orientation shown in FIG. 8C, with graphical indicator 806 aligned with minute hand 802B in response to receiving input 808, without an animation or without display of the intermediate state illustrated by FIG. 8B. As time passes (e.g., without further input), bezel 804A and graphical indicator 806 remain stationary relative to analog clock face 804 while the hands of clock face 804 progress to indicate the current time and counter 810 continues to update according to the elapsed time. - In the embodiment illustrated in
FIGS. 8A-8C, device 600 begins counter 810 in response to receiving input 808. In some embodiments, in response to receiving input 808, device 600 does not start counter 810 (e.g., device 600 aligns graphical indicator 806 with minute hand 802B and displays counter 810, but does not start counter 810 (e.g., counter 810 maintains a time of zero) until further input is received). - In
FIG. 8C, device 600 receives (e.g., detects) input 812. As shown in FIG. 8C, input 812 includes a rotation of rotatable input mechanism 603 in a first direction (e.g., clockwise). In some embodiments, input 812 includes a gesture (e.g., a touch gesture on display 602). - In response to receiving
input 812, device 600 rotates bezel 804A relative to clock face 804 and changes the time displayed by counter 810 in accordance with input 812, as shown in FIG. 8D. In some embodiments, the direction in which bezel 804A is rotated is based on the direction of input 812. In some embodiments, the amount of rotation of bezel 804A is based on (e.g., proportional to, directly proportional to) an amount, speed, and/or direction of rotation of input 812. The time displayed by counter 810 is changed based on the change in position of bezel 804A to correspond to the position of bezel 804A relative to minute hand 802B. In FIG. 8D, bezel 804A is rotated counter-clockwise by an amount equivalent to five minutes (where one full rotation of bezel 804A is equivalent to 60 minutes) and the display of counter 810 is changed to show 5:00.
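- The relationship between bezel rotation and the counter display is simple proportionality. The Swift sketch below is illustrative; the type is an assumption, with the mapping chosen so that one full bezel turn corresponds to 60 minutes, matching the description above:

```swift
import Foundation

// Illustrative sketch; the type and mapping are assumptions. One full
// bezel rotation (360 degrees) corresponds to 60 minutes on the counter.
struct CounterState {
    var offsetMinutes: Double = 0  // bezel offset relative to the minute hand

    // Formats the counter value (e.g., 5 minutes -> "5:00").
    var display: String {
        let totalSeconds = Int((offsetMinutes * 60).rounded())
        return "\(totalSeconds / 60):" + String(format: "%02d", totalSeconds % 60)
    }
}

// Each degree of bezel rotation is 1/6 of a minute, signed by direction.
func applyBezelRotation(degrees: Double, to counter: inout CounterState) {
    counter.offsetMinutes += degrees / 6.0
}

var counter = CounterState()
applyBezelRotation(degrees: 30, to: &counter)  // five minutes of bezel travel
print(counter.display)  // "5:00"
```

- In some embodiments,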
bezel 804A is rotated, and counter 810 is updated accordingly, as input is received (e.g., bezel 804A and counter 810 are updated continually as rotatable input mechanism 603 is rotated). For example, in FIG. 8D, device 600 receives (e.g., detects) input 814 corresponding to a rotation of rotatable input mechanism 603 in a direction opposite of the direction of input 812. In response to receiving input 814, device 600 moves bezel 804A such that graphical indicator 806 is in alignment with minute hand 802B and updates counter 810 accordingly. - Alternatively, in response to
input 808, device 600 displays watch user interface 800 as shown in FIG. 8E. In FIG. 8E, device 600 displays counter 810 and, similar to FIGS. 8B-8D, the length of minute hand 802B is shortened, bezel 804A and graphical indicator 806 are rotated clockwise such that, relative to analog clock face 804, graphical indicator 806 is aligned with minute hand 802B (e.g., graphical indicator 806 snaps into alignment with minute hand 802B in response to receiving input 808), and a visual characteristic (e.g., fill color, fill pattern, outline color, brightness, transparency) of hour hand 802A and minute hand 802B is changed. In contrast to FIGS. 8B-8D, counter 810 does not start in response to receiving input 808. - In
FIG. 8E, while displaying watch user interface 800 including counter 810 that has not started (e.g., counter 810 maintains a time of zero) and graphical indicator 806 aligned with minute hand 802B, device 600 receives (e.g., detects) an input 816. As shown in FIG. 8E, input 816 includes a gesture (e.g., a touch gesture on display 602). In some embodiments, input 816 includes a press input directed to rotatable input mechanism 603. - In
FIG. 8E, in response to receiving input 816, device 600 starts counter 810. In some embodiments, after aligning graphical indicator 806 with minute hand 802B (e.g., by rotating bezel 804A) and displaying counter 810 in response to receiving input 808, if device 600 does not receive further input (e.g., a confirmation input, a tap, a button press) within a threshold amount of time (e.g., a non-zero amount of time, 1 second, 2 seconds, 3 seconds, 5 seconds), device 600 displays (e.g., reverts to) watch user interface 800 as displayed in FIG. 8A (e.g., bezel 804A and graphical indicator 806 are displayed in the orientation relative to clock face 804 shown in FIG. 8A and counter 810 is not displayed (e.g., device 600 ceases display of counter 810)). - Turning to
FIG. 8G, watch user interface 800 is displayed at a later time, where 20 minutes and 20 seconds have elapsed, as indicated by counter 810. FIG. 8G illustrates that as minute hand 802B moves according to the passage of time, device 600 maintains the orientation of bezel 804A and displays tick marks at the minute positions on bezel 804A (e.g., between the existing 5-minute interval marks) clockwise from graphical indicator 806 to minute hand 802B. FIG. 8H shows watch user interface 800 at a later time, where 56 minutes and 35 seconds have elapsed, as indicated by counter 810. At this time, minute hand 802B has not made a full rotation around clock face 804 relative to the position of graphical indicator 806. In FIG. 8I, one hour, six minutes, and 35 seconds have elapsed (as indicated by counter 810). Minute hand 802B has made more than a full rotation around clock face 804 and passed graphical indicator 806. Once minute hand 802B makes a full rotation and passes graphical indicator 806, device 600 removes tick marks from the minute positions on bezel 804A from graphical indicator 806 to minute hand 802B. Removing the tick marks after minute hand 802B has passed graphical indicator 806 indicates to the user that minute hand 802B has made a full rotation.
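- The accumulate-then-clear tick behavior of FIGS. 8G-8I can be expressed as a small helper. The Swift sketch below illustrates one rule inferred from the description (minute positions are measured clockwise from graphical indicator 806); the function name and the exact rule are assumptions:

```swift
import Foundation

// Illustrative sketch; the rule (accumulate ticks until a full rotation,
// then remove the ticks behind the hand) is inferred from FIGS. 8G-8I.
// Returns minute positions (0-59, clockwise from the indicator) at which
// per-minute tick marks should be drawn for a given elapsed time.
func tickMarkPositions(elapsedMinutes: Int) -> [Int] {
    let withinHour = elapsedMinutes % 60
    if elapsedMinutes < 60 {
        // First rotation: ticks accumulate clockwise from the indicator
        // (position 0) up to the minute hand's current position.
        return Array(0...withinHour)
    } else {
        // After a full rotation the ring is full; ticks between the
        // indicator and the hand are removed as the hand passes them,
        // signaling that a full rotation has elapsed.
        return Array(withinHour..<60)
    }
}

print(tickMarkPositions(elapsedMinutes: 20))  // FIG. 8G: ticks 0 through 20
print(tickMarkPositions(elapsedMinutes: 66))  // FIG. 8I: ticks behind the hand cleared
```

- In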
FIG. 8I, device 600 receives (e.g., detects) input 820. In the embodiment shown in FIG. 8I, input 820 includes a rotation of rotatable input mechanism 603. In some embodiments, input 820 includes a gesture (e.g., a touch gesture on display 602). In response to receiving input 820, device 600 rotates bezel 804A clockwise, until graphical indicator 806 is almost aligned with minute hand 802B, and updates counter 810 accordingly, as shown in FIG. 8J. In response to receiving input 820, device 600 maintains display of the tick marks at the minute positions on bezel 804A between the 5-minute interval marks. The time on counter 810 is adjusted by an amount of time that is based on the magnitude, speed, and/or direction of input 820 (e.g., the amount of rotation of rotatable input mechanism 603) and the corresponding amount of rotation of bezel 804A (e.g., device 600 does not reset counter 810 to zero in response to input 820). In some embodiments, if input 820 causes an amount of clockwise rotation of bezel 804A such that graphical indicator 806 passes minute hand 802B (e.g., the elapsed time or offset between graphical indicator 806 and minute hand 802B is reduced to less than 59 minutes), device 600 removes tick marks from the minute positions on bezel 804A in the counter-clockwise direction from graphical indicator 806 to minute hand 802B. - In
FIG. 8J, device 600 receives (e.g., detects) input 824. In the embodiment illustrated in FIG. 8J, input 824 includes a tap gesture on a location of display 602 corresponding to counter 810. In some embodiments, input 824 includes a rotation of rotatable input mechanism 603 or a press of a button (e.g., a press of rotatable and depressible input mechanism 603 or hardware button 613). In some embodiments, input 824 can be anywhere on display 602. In some embodiments, input 824 must correspond to selection of analog clock face 804 (e.g., a location on display 602 inside the outer boundary of bezel 804A). For example, in response to an input on analog clock face 804, device 600 performs a first function (e.g., displays watch user interface 826 in FIG. 8K as described below); and in response to an input that is not on analog clock face 804, device 600 performs a different function (e.g., if the input is on one of complications 805A-805D, device 600 launches an application corresponding to the selected complication) or no function at all. - In response to receiving
input 824, device 600 displays watch user interface 826 shown in FIG. 8K. Watch user interface 826 includes graphical indication of time 810A (e.g., an enlarged version of counter 810), continue affordance 826A, and stop affordance 826B. In some embodiments, graphical indication of time 810A shows a static indication of the elapsed time on counter 810 when input 824 was received. In some embodiments, graphical indication of time 810A updates to show the currently elapsed time (e.g., graphical indication of time 810A continues to progress from the time on counter 810 when input 824 was received). In some embodiments, device 600 pauses counter 810 in response to receiving input 824. In some embodiments, device 600 continues counter 810 in response to receiving input 824. In some embodiments, in response to receiving input 824, device 600 ceases display of clock face 804 and/or complications 805A-805D. In some embodiments, device 600 displays graphical indication of time 810A, continue affordance 826A, and stop affordance 826B overlaid on watch user interface 800. In some embodiments, in response to receiving input 824, device 600 at least partially obscures (e.g., blurs or greys out) watch user interface 800. - In some embodiments, in response to receiving
input 824, device 600 resets the user interface (e.g., displays watch user interface 800 as shown in FIG. 8A indicating the current time, or resets counter 810 to zero and aligns graphical indicator 806 with the current position of minute hand 802B). In some embodiments, if input 824 is a first type of input (e.g., a single tap on counter 810), then device 600 displays watch user interface 826 as shown in FIG. 8K; and if input 824 is a second type of input (e.g., a double tap on counter 810), then device 600 resets the user interface. -
FIG. 8K shows input 828 corresponding to selection of continue affordance 826A (e.g., a tap at a location on display 602 corresponding to continue affordance 826A) and input 830 corresponding to selection of stop affordance 826B (e.g., a tap at a location on display 602 corresponding to stop affordance 826B). - As shown in
FIG. 8L, in response to receiving input 828, device 600 returns to the watch user interface that was displayed at the time of receiving input 824 and continues to update counter 810 (e.g., device 600 ceases to display continue affordance 826A, stop affordance 826B, and graphical indication of time 810A (e.g., reduces the enlarged version of counter 810 to its previous size)). - As shown in
FIG. 8M, in response to receiving input 830, device 600 returns to watch user interface 800 (e.g., device 600 ceases to display continue affordance 826A, stop affordance 826B, and graphical indication of time 810A), in which bezel 804A and graphical indicator 806 are aligned with the 12 o'clock position of clock face 804, counter 810 is not displayed, no tick marks are displayed between the 5-minute intervals of bezel 804A, and hour hand 802A and minute hand 802B are displayed with the visual characteristics shown in FIG. 8A (e.g., instead of the visual characteristics shown in FIGS. 8B-8J).
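- Taken together, FIGS. 8K-8M describe a small state machine for the counter. The Swift enum below is a hypothetical model of those transitions, introduced only for illustration; none of these names come from the disclosed embodiments:

```swift
import Foundation

// Hypothetical state machine for the prompt behavior of FIGS. 8K-8M.
enum CounterPhase {
    case idle                        // FIG. 8A: no counter shown
    case running(startedAt: Date)    // counter updating on the clock face
    case prompting(startedAt: Date)  // FIG. 8K: continue/stop overlay shown
}

// Tap on the counter (input 824) raises the continue/stop prompt.
func handleTapOnCounter(_ phase: CounterPhase) -> CounterPhase {
    if case let .running(start) = phase { return .prompting(startedAt: start) }
    return phase
}

// Continue affordance (input 828) resumes the running counter (FIG. 8L).
func handleContinue(_ phase: CounterPhase) -> CounterPhase {
    if case let .prompting(start) = phase { return .running(startedAt: start) }
    return phase
}

// Stop affordance (input 830) reverts to the default face (FIG. 8M).
func handleStop(_ phase: CounterPhase) -> CounterPhase {
    if case .prompting = phase { return .idle }
    return phase
}
```

-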
FIGS. 9A-9B are a flow diagram illustrating methods of initiating a measurement of time, in accordance with some embodiments. Method 900 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone). Some operations in method 900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. - As described below,
method 900 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - The computer system (e.g., 600) displays (902), via the display generation component (e.g., 602), a watch user interface (e.g., 800) (e.g., showing a clock with an hour hand and a minute hand), the watch user interface including an analog clock face (e.g., 804) that includes a first clock hand (e.g., 802B) (e.g., the minute hand of the clock) and a graphical indicator (e.g., 806) (e.g., a marker (e.g., a triangular marker)), wherein the graphical indicator is displayed at a first position relative to the analog clock face (e.g., along/within a dial region surrounding the clock). In some embodiments, the graphical indicator is initially not aligned with the first clock hand along the boundary. In some embodiments, the graphical indicator is initially displayed at the top-center position along the boundary.
- While displaying, via the display generation component (e.g., 602), the watch user interface (e.g., 800) (904), the computer system (e.g., 600) detects (906), via the one or more input devices (e.g., via a first input device (e.g., 602 or 603) (e.g., a touch-sensitive surface; a touch-sensitive display; a rotatable input device; a rotatable and depressible input device; a mechanical input device)), a first user input (e.g., 808). In some embodiments, the first user input is an input of a first type (e.g., a rotational input on the first input device; a scrolling input on the first input device or a tap input on a touch-sensitive surface such as a touchscreen display).
- In response to detecting the first user input (e.g., 808) (910), the computer system (e.g., 600) moves (912) the graphical indicator (e.g., 806) to a second position relative to the analog clock face (e.g., 804) such that the graphical indicator is aligned with the first clock hand (e.g., 802B) (e.g., such that the graphical indicator is pointing to or marking the position of the first clock hand; such that the graphical indicator is at the outer end of the first clock hand). Moving the graphical indicator to the second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand in response to detecting the first user input provides visual feedback of the initiation of a feature (e.g., initiation of a time counter) and a starting point of the initiated feature (e.g., the starting time for the counter) in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While the graphical indicator (e.g., 806) is displayed at the second position relative to the analog clock face (e.g., 804) (918), the computer system (e.g., 600) displays (920) a graphical indication of a time (e.g., 810) (e.g., a time counter; a digital counter) that has elapsed from a time when the first user input (e.g., 808) (e.g., the input moving the graphical indicator to a second position relative to the analog clock face such that the graphical indicator is aligned with the first clock hand) was detected to a current time. In some embodiments, the graphical indication of the time that has elapsed is displayed within the analog clock face in the watch user interface (e.g., 800). Displaying the graphical indication of a time that has elapsed from the time when the first user input was detected, while the graphical indicator is displayed at the second position relative to the analog clock face, enables a user to quickly and easily recognize that the time counter has been initiated and the time that has elapsed. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. Initiating a time counter (e.g., displayed via the graphical indication of a time) in response to the first user input enables a user to initiate the time counter in a quick and efficient manner. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- Alternatively, in some embodiments, in response to detecting the first user input (e.g., 808), the computer system (e.g., 600) displays, or causes display of, movement of the graphical indicator (e.g., 806) to a second position (e.g., position of 806 in
FIG. 8C from position of 806 in FIG. 8A) relative to the analog clock face (e.g., 804) and displays the graphical indication of the time (e.g., 810), where the graphical indication of the time is shown at an initial state (e.g., "00:00") without yet indicating an elapsed time. In some embodiments, while the graphical indication of the time is shown at the initial state, the computer system detects, via the one or more input devices (e.g., via a second input device, such as a touch-sensitive surface that is integrated with the display generation component (e.g., 602)), a second user input (e.g., corresponding to an activation/selection of the graphical indication of the time). In some embodiments, the second user input is an input of a second type (e.g., a touch input on a touch-sensitive surface that is integrated with the display generation component) that is different from the first type. In some embodiments, in response to detecting the second user input, the computer system displays or causes display of, in the graphical indication of the time, the time that has elapsed from the time when the first user input was detected to the current time. - In some embodiments, in response to detecting the first user input (e.g., 808) (910), the computer system (e.g., 600) shifts (e.g., rotates) (914) an analog dial (e.g., 804A) (e.g., including indications of time positions (e.g., 00:00/12:00 position, 3:00/15:00 position, 6:00/18:00 position, 9:00/21:00 position; 0 minute position, 15 minute position, 30 minute position, 45 minute position)) of the analog clock face (e.g., 804) in accordance with the movement of the graphical indicator (e.g., 806) (e.g., a marker (e.g., a triangular marker)) such that a scale of the analog dial is aligned to begin at (e.g., the 00:00/12:00 position/0 minute position of the analog dial is aligned to) the second position relative to the analog clock face. Shifting (e.g., rotating) the analog dial in accordance with the movement of the graphical indicator such that a scale of the analog dial is aligned to begin at the second position relative to the analog clock face provides visual feedback of the starting position of the time counter in an intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first user input (e.g., 808) includes a rotational input detected via the one or more input devices (e.g., a first input device (e.g., 603) (e.g., a rotatable input device; a rotatable and depressible input device)) (908). In some embodiments, moving the graphical indicator (e.g., 806) in response to detecting the first user input includes snapping the graphical indicator to the second position relative to the analog clock face (e.g., 804) such that the graphical indicator is aligned with the first clock hand (e.g., 802B).
- In some embodiments, in response to the first user input (e.g., 808) (910), in conjunction with moving the graphical indicator (e.g., 806) (e.g., a marker (e.g., a triangular marker)) to the second position relative to the analog clock face (e.g., 804) (e.g., in response to detecting the first user input; when the graphical indicator is moved from the first position to the second position), the computer system (e.g., 600) generates (916) (e.g., via one or more tactile output generators that is in communication with the computer system) a tactile output (e.g., a tactile output sequence that corresponds to moving the graphical indicator to the second position). Generating the tactile output in conjunction with moving the graphical indicator (e.g., a marker (e.g., a triangular marker)) to the second position relative to the analog clock face provides feedback that the time counter has been initiated. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while displaying the graphical indication of the time (e.g., 810) (e.g., a time counter; a digital counter) that has elapsed from the time when the first user input (e.g., 808) was detected to the current time (922), the computer system (e.g., 600) displays (924) a movement of the first clock hand (e.g., 802B) (e.g., rotating within the analog clock face) to indicate the current time (e.g., the "minute" of the current time). In some embodiments, in accordance with the first clock hand being aligned with (e.g., to point to; to be in line with) the second position of the graphical indicator (e.g., 806) (e.g., a marker (e.g., a triangular marker)) within the analog clock face, the computer system generates (926) (e.g., via one or more tactile output generators that is in communication with the computer system) a tactile output (e.g., a tactile output sequence that corresponds to the first clock hand being aligned with the second position of the graphical indicator). In some embodiments, the computer system does not move the graphical indicator (e.g., the graphical indicator remains at (e.g., stays fixed to) the second position relative to the analog clock face) while the computer system moves the first clock hand relative to the analog clock face to indicate the current time.
- In some embodiments, while displaying the graphical indication of the time (e.g., 810) (e.g., a time counter; a digital counter) that has elapsed from the time when the first user input (e.g., 808) was detected to the current time (922), the computer system (e.g., 600) detects (928), via the one or more input devices (e.g., the first input device (e.g., 603) (e.g., a rotatable input device; a rotatable and depressible input device)), a second user input (e.g., 812 or 814) (e.g., a rotational input on the first input device; a continuation of the first user input (e.g., additional or continued rotation of the rotatable input mechanism)). In some embodiments, in response to detecting the second user input (930), the computer system adjusts (e.g., increasing or decreasing) (932) the graphical indication of the time in accordance with (e.g., based on an amount of, speed of, and/or direction of) the second user input. In some embodiments, in accordance with the second user input being in a first (e.g., clockwise) direction on the first input device, adjusting the graphical indication of the time includes increasing the displayed time based on the amount and/or speed of the input. In some embodiments, in accordance with the second user input being in a second (e.g., counter-clockwise) direction on the first input device, adjusting the graphical indication of the time includes decreasing the displayed time based on the amount and/or speed of the counter-clockwise input. Adjusting (e.g., increasing or decreasing) the graphical indication of the time in accordance with (e.g., based on an amount of, speed of, and/or direction of) the second user input while the time counter is running enables a user to adjust the running time counter in a convenient and efficient manner. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, subsequent to (e.g., immediately after) detecting the first user input (e.g., 808), the computer system (e.g., 600) detects a third user input (e.g., 812 or 814) (e.g., that is a continuation of the first user input (e.g., in the same rotational direction); that is an input in a different (e.g., rotational) direction from the first user input). In some embodiments, in response to detecting the third user input, the computer system moves (e.g., slides; rotates) the graphical indicator (e.g., a marker (e.g., a triangular marker)) from the second position relative to the analog clock face (e.g., 804) to a third position relative to the analog clock face different from the second position. In some embodiments, the computer system adjusts the time displayed in the graphical indication of the time (e.g., 810) to include an offset from the elapsed time from when the first user input was detected to the current time, wherein the offset corresponds to a difference (e.g., in minutes) between the second position and the third position relative to the analog clock face. Adjusting the time displayed in the graphical indication of the time to include the offset from the elapsed time from when the first user input was detected to the current time enables a user to quickly and easily adjust the time displayed in the graphical indication of the time if an adjustment is needed without needing to re-initiate the time displayed in the graphical indication of the time. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
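- The adjusted value described in the two preceding paragraphs reduces to simple arithmetic: the wall-clock time elapsed since the first user input, plus a signed offset accumulated from moving the graphical indicator. The Swift sketch below is illustrative; the type and field names are assumptions:

```swift
import Foundation

// Illustrative sketch; names are assumptions. The displayed time is the
// elapsed wall-clock time plus a signed offset contributed by moving the
// graphical indicator after the count began (the result can be negative).
struct AdjustableCounter {
    let startedAt: Date
    var offset: TimeInterval = 0  // seconds added or subtracted by rotation

    func displayedTime(now: Date = Date()) -> TimeInterval {
        now.timeIntervalSince(startedAt) + offset
    }
}

var counter = AdjustableCounter(startedAt: Date())
counter.offset += 5 * 60   // indicator moved forwards by five minutes
counter.offset -= 10 * 60  // moved backwards; displayed time may go negative
```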
- In some embodiments, when the graphical indicator (e.g., 806) is moved from the second position to the third position, where the difference between the third position and the second position is an addition of (e.g., going forwards in time) a first amount of time (e.g., a first amount of minutes) relative to the analog clock face (e.g., 804), the offset corresponds to the addition of the first amount of time, and the time displayed in the graphical indication of the time includes the elapsed time from when the first user input (e.g., 808) was detected to the current time adjusted by the addition of the first amount of time. In some embodiments, when the graphical indicator (e.g., 806) is moved from the second position to the third position, where the difference between the third position and the second position is a subtraction of (e.g., going backwards in time) a second amount of time (e.g., a second amount of minutes) relative to the analog clock face, the offset corresponds to the subtraction of the second amount of time, and the time displayed in the graphical indication of the time includes the elapsed time from when the first user input was detected to the current time adjusted by the subtraction of the second amount of time (e.g., which can be a negative time).
- In some embodiments, in response to detecting the third input, in accordance with a determination that the third user input corresponds to an input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a first direction (e.g., a clockwise direction), the computer system (e.g., 600) moving the graphical indicator (e.g., a marker (e.g., a triangular marker)) from the second position to the third position includes moving (e.g., sliding; rotating) the graphical indicator (e.g., 806) along (e.g., a dial region of) the analog clock face (e.g., 804) in a clockwise direction towards the third position (e.g., where, based on a clockwise direction, the third position is ahead of the second position within the analog clock face) as the third user input (e.g., 814) is detected. In some embodiments, in response to detecting the third input, in accordance with a determination that the third user input corresponds to an input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a second direction (e.g., a counter-clockwise direction), the computer system moving the graphical indicator from the second position to the third position includes moving (e.g., sliding; rotating) the graphical indicator along (e.g., a dial region of) the analog clock face in a counter-clockwise direction towards the third position (e.g., where, based on a clockwise direction, the third position is behind the second position within the analog clock face) as the third user input is detected.
- In some embodiments, the input (e.g., 812) in the first direction corresponds to a rotational input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a first rotational direction (e.g., clockwise direction). In some embodiments, the input (e.g., 814) in the second direction corresponds to a rotational input (e.g., detected via a rotatable input device; detected via a rotatable and depressible input device) in a second rotational direction opposite the first rotational direction (e.g., counter-clockwise direction).
- In some embodiments, while displaying the graphical indication of the time (e.g., 810) (e.g., a time counter; a digital counter) that has elapsed from the time when the first user input (e.g., 808) was detected to the current time, the computer system (e.g., 600) detects, via the one or more input devices (e.g., a touch-sensitive surface), selection (e.g., 824) of (e.g., touch input on) the graphical indication of the time. In some embodiments, in response to detecting the selection of the graphical indication of the time, the computer system displays, via the display generation component (e.g., 602), a prompt (e.g., 826; an alert; a notification) that includes a first option (e.g., 826A; a first selectable user interface object; a first affordance) that, when selected, causes the computer system to continue counting, via the graphical indication of the time, the time that has elapsed from a time when the first user input was detected to a current time, and a second option (e.g., 826B; a second selectable user interface object; a second affordance) that, when selected, causes the computer system to cease (e.g., stop) counting, via the graphical indication of the time, the time that has elapsed from a time when the first user input was detected to a current time. In some embodiments, ceasing counting the time includes ceasing displaying the graphical indication of the time. In some embodiments, ceasing counting the time includes maintaining display of the graphical indication of the time and resetting (e.g., to "00:00") the time counted via the graphical indication of the time. Displaying the prompt that includes the first option and the second option in response to detecting the selection of the graphical indication of the time enables a user to cause the computer system to continue or cease the counting in an easy and intuitive manner. Providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to more easily read or view displayed content) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, in response to detecting the first user input (e.g., 808), the computer system (e.g., 600) changes (e.g., modifies) a visual characteristic of (e.g., dims; changes color of (e.g., to be the same color as the graphical indicator and/or as the graphical indication of the time)) the first clock hand (e.g., 802B) to include a first visual characteristic (e.g., a dimmed color or visual state; the color of the graphical indicator and/or the graphical indication of the time). In some embodiments, the analog clock face (e.g., 804) includes a second clock hand (e.g., 802A) (e.g., the hour hand of the clock). In some embodiments, in response to detecting the first user input, the computer system changes (e.g., modifies) the visual characteristic of the second clock hand to include the first visual characteristic. Changing the visual characteristic of the first clock hand to include the first visual characteristic in response to detecting the first user input provides visual feedback that an operation (e.g., the counting) has been enabled, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some embodiments, after detecting the first user input (e.g., 808), the computer system (e.g., 600) detects (e.g., via a touch-sensitive surface of the one or more input devices) an input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input) directed to a rotatable input device (e.g., 603) of the one or more input devices. In some embodiments, in response to detecting the input directed to the rotatable input device, the computer system changes (e.g., modifies) the visual characteristic of (e.g., dims; changes the color of (e.g., to be the same color as the graphical indicator and/or as the graphical indication of the time)) the first clock hand (e.g., 802B) to include the first visual characteristic (e.g., a dimmed color or visual state; the color of the graphical indicator and/or the graphical indication of the time).
- In some embodiments, in response to detecting the first user input (e.g., 808), the computer system (e.g., 600) changes (e.g., modifies) a shape of (e.g., changes a feature of; changes the size of; makes smaller; shrinks) the first clock hand (e.g., 802B) to be a first shape (e.g., a smaller, shrunk clock hand). In some embodiments, the analog clock face (e.g., 804) includes a second clock hand (e.g., 802A) (e.g., the hour hand of the clock). In some embodiments, in response to detecting the first user input, the computer system changes (e.g., modifies) a shape of (e.g., changes a feature of; changes the size of; makes smaller; shrinks) the second clock hand to be a second shape (e.g., a smaller, shrunk clock hand). Changing the shape of the first clock hand to be the first shape in response to detecting the first user input provides visual feedback that an operation (e.g., the counting) has been enabled, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while the graphical indicator (e.g., 806) (e.g., a marker (e.g., a triangular marker)) is displayed at the second position relative to the analog clock face, the computer system (e.g., 600) displays (e.g., continues to display), in the analog clock face (e.g., 804), a movement of the first clock hand (e.g., 802B) to indicate the current time (e.g., the “minute” of the current time). In some embodiments, while displaying the movement of the first clock hand, the computer system displays, in the analog clock face (e.g., 804) (e.g., in a dial region of the analog clock face), visual indicators (e.g., visual markers (e.g., tick marks), as shown in
FIGS. 8G-8H ) along a path of movement of (e.g., the tip of) the first clock hand as the first clock hand is moving (e.g., rotating) around the analog clock face (e.g., the visual indicators appear along the path of movement of the first clock hand as the first clock hand is moving circularly within the analog clock face). Displaying the visual indicators along the path of movement of (e.g., the tip of) the first clock hand as the first clock hand is moving (e.g., rotating) around the analog clock face provides visual feedback that the counting is on-going, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize that the operation has been initiated) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, while concurrently displaying the movement of the first clock hand (e.g., 802B) and the visual indicators, in accordance with a determination that the visual indicators are already displayed along a full path of movement of (e.g., the tip of) the first clock hand (e.g., fully around the analog clock face (e.g., fully around a dial region of the analog clock face)), the computer system (e.g., 600) removes display of the visual indicators along the path of movement of (e.g., the tip of) the first clock hand (e.g., 802B) as the first clock hand is moving (e.g., rotating) around the analog clock face (e.g., 804) (e.g., as shown in
FIG. 8I). - In some embodiments, in response to detecting the first user input (e.g., 808), the computer system (e.g., 600) moves the graphical indicator (e.g., 806) to the second position relative to the analog clock face (e.g., 804) such that the graphical indicator is aligned with the first clock hand (e.g., 802B) (e.g., such that the graphical indicator is pointing to or marking the position of the first clock hand; such that the graphical indicator is at the outer end of the first clock hand) and displays the graphical indication of the time (e.g., 810) (e.g., a time counter; a digital counter) but does not automatically initiate a counting of the time using the graphical indication of the time. In some embodiments, while displaying the graphical indication of the time, the computer system detects (e.g., via a touch-sensitive surface of the one or more input devices) an input (e.g., 816; a user's tap input) directed to confirming the initiation of the counting of the time (e.g., user selection of a confirm affordance (e.g., "set" affordance or "done" affordance)). In some embodiments, if the input directed to confirming the initiation of the counting of the time is not detected by the computer system for a predetermined time period (e.g., 5 seconds; 10 seconds; 30 seconds), the computer system moves the graphical indicator back to its previous position (the first position) relative to the analog clock face.
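- The revert-on-timeout behavior in the preceding paragraph can be sketched with a cancellable deadline. The following Swift snippet is a hypothetical illustration using Grand Central Dispatch; the class name is an assumption, and the 5-second default is just one of the example values given above:

```swift
import Foundation

// Hypothetical sketch of the confirmation timeout; names are assumptions.
final class ConfirmationTimeout {
    private var pending: DispatchWorkItem?

    // Called after the indicator snaps to the first clock hand: if no
    // confirming input arrives within the threshold, the UI reverts.
    func arm(threshold: TimeInterval = 5, onExpire revert: @escaping () -> Void) {
        pending?.cancel()
        let work = DispatchWorkItem(block: revert)
        pending = work
        DispatchQueue.main.asyncAfter(deadline: .now() + threshold, execute: work)
    }

    // Called when the confirming input (e.g., a tap on "set") is detected.
    func confirm() {
        pending?.cancel()
        pending = nil
    }
}
```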
- Note that details of the processes described above with respect to method 900 (e.g.,
FIGS. 9A-9B) are also applicable in an analogous manner to the methods described above and below. For example, method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 900. For example, a watch user interface as described with reference to FIGS. 6A-6H can include and be used to perform a counting operation as described with reference to FIGS. 8A-8M. For another example, method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 900. For example, a device can use as a watch user interface either a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC or a watch user interface as described with reference to FIGS. 8A-8M. For another example, method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 900. For example, a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a watch user interface as described with reference to FIGS. 8A-8M. For another example, method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 900. For example, a background of a watch user interface as described with reference to FIGS. 8A-8M can be created or edited via the process for updating a background as described with reference to FIGS. 14A-14AD. For another example, method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 900. For example, the process for changing one or more complications of a watch user interface as described with reference to FIGS. 16A-16AE can be used to change one or more complications of a watch user interface as described with reference to FIGS. 8A-8M. For brevity, these details are not repeated below. -
- FIGS. 10A-10AC illustrate exemplary user interfaces for enabling and displaying a user interface using a character, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 11A-11H.
- FIG. 10A illustrates device 600 displaying user interface 1001 that concurrently includes indication of time 1002 and graphical representation 1000 of a first character displayed on background 1004. In some embodiments, representation 1000 of the first character corresponds to a graphical representation of a user associated with device 600 (e.g., a representation created or customized by a user).
- In FIG. 10A, device 600 is in a first activity state (e.g., a locked state; a sleep state; a low-power state) in which display 602 is dimmed (e.g., at a lower brightness) compared to a "normal" operating state. In the first state depicted in FIG. 10A, device 600 displays fewer graphical elements than in the normal operating state (e.g., complication 1005A and complication 1005B shown in, e.g., FIG. 10B are not displayed in the first state). In accordance with device 600 being in the first activity state, device 600 displays graphical representation 1000 of the first character in a first visual state (e.g., a static visual state or an animated visual state) that corresponds to the first activity state. In the embodiment illustrated in FIG. 10A, the first visual state includes showing the character with eyes shut (e.g., the character appears to be sleeping).
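As an illustrative sketch only (the enum and case names are assumptions, not terminology from the disclosure), the mapping from activity state to visual state in FIGS. 10A-10B can be modeled as a pure function:

```swift
// Hypothetical model of the two activity states and their visual states.
enum ActivityState {
    case locked   // dimmed display, fewer graphical elements (FIG. 10A)
    case active   // normal operating state (FIG. 10B)
}

enum VisualState {
    case sleeping // eyes shut
    case neutral  // eyes open, neutral pose
}

// The first visual state corresponds to the first activity state, and the
// second visual state corresponds to the second activity state.
func visualState(for activity: ActivityState) -> VisualState {
    switch activity {
    case .locked: return .sleeping
    case .active: return .neutral
    }
}
```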
- FIG. 10B illustrates device 600 in a second activity state (e.g., the normal operating state, an active state, a different activity state from the first activity state depicted in FIG. 10A) in which display 602 is not dimmed. In the second activity state, user interface 1001 concurrently displays indication of time 1002 and graphical representation 1000 of the first character on background 1004 (e.g., similar to FIG. 10A), as well as complications 1005A and 1005B. In accordance with device 600 being in the second activity state, device 600 displays graphical representation 1000 of the first character in a second visual state, different from the first visual state, that corresponds to the second activity state. In the embodiment illustrated in FIG. 10B, the second visual state shows the first character with eyes open (e.g., a neutral pose). In some embodiments, device 600 changes from the user interface in FIG. 10A to the user interface in FIG. 10B (or vice versa) in response to detecting a change in the activity state of device 600 (e.g., in response to detecting a change from the first activity state to the second activity state (or vice versa), respectively).
- FIGS. 10C-10D illustrate device 600 in the second activity state (e.g., the normal or active activity state) and displaying the first character in a visual state that includes an animation in which representation 1000 of the first character alternates between a first position (e.g., head tilted to the left as depicted in FIG. 10C) and a second position (e.g., head tilted to the right as depicted in FIG. 10D). In some embodiments, representation 1000 alternates between the first position and the second position (e.g., at a periodic rate) to indicate the passing of time (e.g., from the first position to the second position every one second or 0.5 seconds, from the first position to the second position and back to the first position every two seconds or 1 second). In some embodiments, the animation is based on the character (e.g., different animations are displayed for different characters). In some embodiments, device 600 displays a gradual transition from a first animation of representation 1000 of the first character to a second (e.g., different) animation (e.g., device 600 interpolates (e.g., based on a last state of the first animation and a first state of the second animation) between the two animations to provide a smooth transition).
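One plausible, purely illustrative reading of this alternating animation is a timer that flips the head position at a periodic rate; the names and the one-second default period below are assumptions:

```swift
import Foundation

// Hypothetical "tick tock" animator: the head position alternates at a
// periodic rate to indicate the passing of time.
final class TickTockAnimator {
    enum HeadPosition { case tiltedLeft, tiltedRight }

    private(set) var position: HeadPosition = .tiltedLeft
    private var timer: Timer?

    func start(period: TimeInterval = 1.0) {
        timer = Timer.scheduledTimer(withTimeInterval: period, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.position = (self.position == .tiltedLeft) ? .tiltedRight : .tiltedLeft
            // A renderer would redraw here; per the description, a switch to a
            // different animation could interpolate between the last state of
            // the old animation and the first state of the new one.
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```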
- In FIG. 10D, device 600 receives (e.g., detects) input 1006 (e.g., a tap at a location on display 602 that corresponds to representation 1000, a wrist raise). In response to receiving input 1006, device 600 displays representation 1000 with the first character in a different visual state (e.g., device 600 changes the visual state of the first character), as illustrated by FIG. 10E. For example, device 600 changes the display of visual representation 1000 to change the visual state of the first character in response to input 1006. In FIG. 10E, the first character is shown winking with an open mouth (e.g., a selfie pose), whereas in FIG. 10D the first character had both eyes open and mouth closed. In some embodiments, device 600 changes the display of visual representation 1000 to change the visual state of the first character without user input (e.g., device 600 changes the visual state in response to time-based criteria being met, device 600 automatically cycles through a set of predetermined visual states (e.g., device 600 displays representation 1000 with a visual state for a predetermined amount of time before changing to another visual state)).
- In some embodiments, representation 1000 is displayed in a manner that indicates a change in time. For example, in FIG. 10F, indication of time 1002 shows that the time has changed to 10:10 from 10:09 in FIG. 10E. When (e.g., in response to) the time changing from 10:09 to 10:10, the first character looks or glances at indication of time 1002 (e.g., the head and/or eyes of representation 1000 move to appear as though the first character is looking at indication of time 1002). In some embodiments, representation 1000 indicates a change in time in response to a change in the minute of the current time. In some embodiments, representation 1000 indicates a change in time only in response to a change in the hour of the current time (e.g., from 10:59 to 11:00). In some embodiments, representation 1000 indicates a change in time (e.g., appears to look at indication of time 1002) when a predetermined time has been reached (e.g., the hour has changed, a quarter past the hour has been reached, half past the hour has been reached, 45 minutes past the hour has been reached).
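The several glance triggers just described (every minute change, only on hour changes, or at predetermined marks) can be summarized in one hedged sketch; the GlancePolicy type and function name are illustrative assumptions:

```swift
import Foundation

// Hypothetical policies for when the character glances at the time indication.
enum GlancePolicy {
    case everyMinute
    case everyHour
    case atMinutes(Set<Int>)   // e.g., [0, 15, 30, 45]
}

func shouldGlance(previous: Date, current: Date, policy: GlancePolicy,
                  calendar: Calendar = .current) -> Bool {
    switch policy {
    case .everyMinute:
        return calendar.component(.minute, from: previous)
            != calendar.component(.minute, from: current)
    case .everyHour:
        return calendar.component(.hour, from: previous)
            != calendar.component(.hour, from: current)
    case .atMinutes(let marks):
        let minute = calendar.component(.minute, from: current)
        return marks.contains(minute)
            && calendar.component(.minute, from: previous) != minute
    }
}
```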
- FIG. 10G illustrates device 600 in a third activity state (e.g., an inactive unlocked state, a low-power unlocked state) different from the first activity state in FIG. 10A and the second activity state in FIGS. 10B-10F. In the activity state depicted in FIG. 10G, device 600 displays indication of time 1002, graphical representation 1000 of the first character (e.g., in a visual state having a neutral body expression), and complications 1005A and 1005B (e.g., as in FIG. 10B); display 602 is dimmed compared to the second activity state (e.g., an active unlocked state) and brighter compared to the first activity state (e.g., a locked state). In the embodiment illustrated in FIG. 10G, representation 1000 shows the first character in the same visual state shown in FIG. 10B, where device 600 was in the second activity state (e.g., when device 600 changes from the second activity state to the third activity state, representation 1000 can maintain the visual state of the first character while changing the brightness of display 602).
- FIG. 10H illustrates device 600 in a fourth activity state (e.g., a change-in-time state for predetermined intervals) different from the first activity state in FIG. 10A, the second activity state in FIGS. 10B-10F, and the third activity state in FIG. 10G. In the activity state depicted in FIG. 10H, in response to the time changing from 10:10 to 10:11, device 600 changes the visual state (e.g., changes the pose, displays a different animation) of the first character in representation 1000, where changing the visual state includes displaying the first character in representation 1000 looking (e.g., glancing) at indication of time 1002, as illustrated by FIG. 10H. In some embodiments, device 600 is in the fourth activity state at predetermined time intervals (e.g., every 10 seconds; every 15 seconds; every 30 seconds; every minute; every 5 minutes).
- In FIG. 10H, device 600 receives (e.g., detects) input 1007 (e.g., a touch on display 602 with a duration that exceeds a predetermined threshold, a touch on display 602 with a characteristic intensity that exceeds a predetermined threshold). In response to receiving input 1007, device 600 displays user interface 1008 shown in FIG. 10I. In some embodiments, user interface 1008 is a user interface of a user interface editing mode (e.g., in response to receiving input 1007, device 600 enters a user interface editing mode for editing one or more features of user interface 1001). User interface 1008 displays representation 1001A of user interface 1001 (e.g., a static, smaller-scale image of user interface 1001), share affordance 1010, and customize affordance 1012.
- In FIG. 10I, device 600 receives (e.g., detects) input 1014 corresponding to a request to edit user interface 1001 (e.g., a tap at a location on display 602 corresponding to customize affordance 1012). In response to receiving input 1014, device 600 displays user interface 1016A shown in FIG. 10J. Paging dots 1044A-1044C indicate that user interface 1016A is the first in a sequence of three editing user interfaces. User interface 1016A provides the capability to change the character displayed on user interface 1001 (e.g., by swiping up or down on display 602 or rotating rotatable input mechanism 603). User interface 1016A displays de-emphasized (e.g., dimmed, greyed, blurred) representations of complications 1005A and 1005B, representation 1000 of the currently-selected character (e.g., the first character), character selection element 1046, and textual identifier 1018 of the currently-selected character. Character option selection element 1046 indicates the position of the currently selected option in a sequence of character options.
- In FIG. 10J, device 600 receives input 1020 (e.g., a right-to-left swipe gesture on display 602). In response to receiving input 1020, device 600 displays user interface 1016B, which (as indicated by label 1022) provides the capability to change the color of background 1004 of user interface 1001. Paging dots 1044A-1044C are updated to indicate that user interface 1016B is the second in the sequence of three editing user interfaces. User interface 1016B includes color selection element 1048, which displays various color options for background 1004 of user interface 1001. The currently-selected color option is displayed in the middle of color selection element 1048 and at a larger size than the other color options. In some embodiments, a user can provide an input (e.g., rotation of rotatable input mechanism 603 or a vertical swipe gesture on display 602) to select a different color option, and device 600 updates color selection element 1048 and background 1004 accordingly in response to the input.
- In FIG. 10K, device 600 receives (e.g., detects) input 1024 (e.g., a right-to-left swipe gesture on display 602). In response to receiving input 1024, device 600 displays user interface 1016C, which (as indicated by label 1022) provides the capability to change the information displayed by complication 1005A and complication 1005B. Paging dots 1044A-1044C are updated to indicate that user interface 1016C is the third in the sequence of editing user interfaces. While displaying user interface 1016C, a user can select a complication (e.g., by tapping on the complication) and edit the selected complication (e.g., by rotating rotatable input mechanism 603). Device 600 indicates that the complications can be edited by, e.g., outlining complication 1005A and complication 1005B. Upon selection of a complication, device 600 visually distinguishes (e.g., highlights, outlines, increases the brightness of) the selected complication relative to other complications.
- In FIG. 10L, device 600 receives (e.g., detects) input 1030 (e.g., two left-to-right swipes on display 602, an input with a direction opposite of a direction of input 1024 in FIG. 10K). In response to receiving input 1030, device 600 displays (e.g., returns to) user interface 1016A. While displaying user interface 1016A, device 600 receives (e.g., detects) input 1032 (e.g., a rotation of rotatable input mechanism 603). In response to receiving input 1032, device 600 displays a different character option (e.g., the adjacent option in the sequence of character options) and updates character selection element 1046 accordingly, as shown in FIG. 10N. A character option can include only one character or a set of two or more characters. In FIG. 10N, the displayed character option includes a set of four characters identified as "Toy Box." In some embodiments, when a set of two or more characters is selected for display on user interface 1001, device 600 displays the characters of the set individually at different times (e.g., device 600 displays the characters according to a predefined sequence in response to user input (e.g., a wrist raise, a tap on display 602) or automatically cycles through the set of characters at predetermined time intervals).
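A character option holding one character or a set of characters, shown one member at a time, could be modeled as below. This is a sketch under assumed names (the actual data model is not disclosed):

```swift
// Hypothetical character option: a single character or a set (e.g., "Toy Box").
struct CharacterOption {
    let name: String
    let characters: [String]   // one or more character identifiers
    var index = 0

    // Advance to the next member, e.g., on a wrist raise, a tap, or at a
    // predetermined time interval. A single-character option never changes.
    mutating func nextCharacter() -> String {
        index = (index + 1) % characters.count
        return characters[index]
    }
}

// Usage: cycle through a four-character set (identifiers are placeholders).
var toyBox = CharacterOption(name: "Toy Box",
                             characters: ["char1", "char2", "char3", "char4"])
let shownNext = toyBox.nextCharacter()
```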
- In FIG. 10N, device 600 receives (e.g., detects) input 1036 (e.g., rotation of rotatable input mechanism 603, a continuation of input 1032). In response to receiving input 1036, device 600 displays a different character option (e.g., the next adjacent option in the sequence of character options) and updates character selection element 1046 accordingly, as shown in FIG. 10O. In FIG. 10O, the selected character option corresponds to representation 1040 of an octopus character (as indicated by identifier 1038).
- While representation 1040 is designated as the selected character (e.g., while displaying user interface 1016A, 1016B, or 1016C after designating representation 1040), device 600 receives (e.g., detects) input 1042 corresponding to selection of the currently-displayed character option (e.g., a press of rotatable and depressible input mechanism 603). As shown in FIG. 10P, in response to receiving input 1042, device 600 displays user interface 1001 with a representation of a character different from the first character, and in particular, representation 1040 of the selected character option. In some embodiments, device 600 exits the user interface editing mode in response to receiving input 1042. In some embodiments, in response to receiving input 1042, device 600 displays (e.g., returns to) user interface 1008 (shown in FIG. 10I) with an updated version of representation 1001A including a representation of the selected character (e.g., representation 1040), and then displays user interface 1001 with representation 1040 of the selected character option in response to receiving further input (e.g., a tap on representation 1001A, a press of rotatable and depressible input mechanism 603 or button 613 while displaying user interface 1008).
- FIG. 10Q illustrates an example of representation 1040 of the octopus character in a visual state (e.g., a visual state different from the visual state shown in FIG. 10P) displayed while device 600 is in the second activity state (e.g., an active, unlocked state).
- In some embodiments, representation 1000 of the first character is displayed concurrently with indication of time 1002 at a first time, and a representation of a second character (e.g., representation 1040 of the octopus character or representation 1000 of the first character) is displayed concurrently with indication of time 1002 at a second time different from the first time, where: in accordance with device 600 being in an activity state (e.g., an active state) at the second time, device 600 displays the representation of the second character in a visual state (e.g., representation 1000 of the first character in the visual state illustrated in FIG. 10B; representation 1040 of the octopus character in the visual state illustrated in FIG. 10P; representation 1040 of the octopus character in the visual state illustrated in FIG. 10Q); and in accordance with device 600 being in a different activity state (e.g., a locked state) at the second time, device 600 displays the representation of the second character in a different visual state (e.g., representation 1000 of the first character in the state shown in FIG. 10A; representation 1040 of the octopus character in the visual state illustrated in FIG. 10P, except with eyes closed; representation 1040 of the octopus character in the visual state illustrated in FIG. 10Q, except with eyes closed).
- In some embodiments, electronic device 600 is configured to transition between characters in response to detecting a change in the activity state from a third activity state (e.g., a higher-power consumption mode and/or the second activity state) to a fourth activity state (e.g., a lower-power consumption mode and/or the first activity state). For example, when a set of two or more characters is selected for display on user interface 1001, as shown at FIG. 10N, electronic device 600 displays the characters of the set individually, and in response to a change in the activity state from the third activity state (e.g., a higher-power consumption state, a normal operating state, and/or the second activity state) to the fourth activity state (e.g., a lower-power consumption state, a sleep state, a locked state, and/or the first activity state), transitions from one character in the set to another character in the set. In some embodiments, electronic device 600 forgoes transitioning between characters in response to detecting a change in the activity state from the fourth activity state (e.g., a lower-power consumption mode) to the third activity state (e.g., a higher-power consumption mode). In some embodiments, electronic device 600 transitions between characters in response to detecting a change in the activity state from the fourth activity state to the third activity state in addition to, or in lieu of, transitioning between characters in response to detecting a change in the activity state from the third activity state to the fourth activity state.
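The one-directional swap described in this paragraph amounts to advancing through the set only on a higher-to-lower power transition. A minimal sketch, assuming hypothetical names:

```swift
// Hypothetical power states and character rotation for the transition rule.
enum PowerState { case higherPower, lowerPower }

struct CharacterRotation {
    var characters: [String]
    var index = 0

    // Advance only when moving from the higher-power state to the lower-power
    // state; forgo the swap in the opposite direction (one described variant).
    mutating func activityChanged(from old: PowerState, to new: PowerState) -> String {
        if old == .higherPower && new == .lowerPower {
            index = (index + 1) % characters.count
        }
        return characters[index]
    }
}
```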
- At FIG. 10R, electronic device 600 is in a third activity state (e.g., the second activity state, a normal operating state, and/or a higher-power consumption state) and displays user interface 1001 with graphical representation 1050 of a second character (e.g., a character different from the first character corresponding to graphical representation 1000 and the octopus character corresponding to graphical representation 1040). User interface 1001 also includes time indicator 1002 and complications 1005A and 1005B. In addition, user interface 1001 includes a default color (e.g., black) and background 1004 having one or more colors that are different from the default color (e.g., colors displayed by electronic device 600 in accordance with user inputs while second user interface 1016B is displayed at FIG. 10K). While user interface 1001 in FIGS. 10B-10F, 10H-10M, and 10O-10Q shows the default color as lighter than background 1004 (e.g., white), user interface 1001 in FIGS. 10B-10F, 10H-10M, and 10O-10Q can alternatively display the default color as darker than background 1004 (e.g., black), as shown at FIGS. 10R-10W.
- At FIG. 10R, in accordance with electronic device 600 being in the third activity state, electronic device 600 displays graphical representation 1050 of the second character in a third visual state (e.g., the second visual state and/or an animated visual state) that corresponds to the third activity state. In the embodiment illustrated in FIG. 10R, the third visual state includes the second character with eyes and mouth open (e.g., the second character is posing and appears awake (not asleep)).
- FIG. 10S illustrates electronic device 600 in a transition state between the third activity state and a fourth activity state (e.g., the first activity state, a lower-power consumption state, a locked state, a sleep state) in which display 602 begins to dim as compared to FIG. 10R. At FIG. 10S, background 1004 and graphical representation 1050 are reduced in size as compared to FIG. 10R as the transition between the third activity state and the fourth activity state occurs. In some embodiments, graphical representation 1050 fades out, reduces in brightness, and/or dissolves in the transition between the third activity state and the fourth activity state. Electronic device 600 ceases to display complications 1005A and 1005B on user interface 1001. As shown in FIG. 10S, electronic device 600 displays time indicator 1002 with a reduced thickness and/or size during the transition between the third activity state and the fourth activity state.
- At FIG. 10T, electronic device 600 is operating in the fourth activity state. At FIG. 10T, electronic device 600 displays graphical representation 1052 of a third character, different from the second character. Accordingly, during the transition between the third activity state and the fourth activity state, graphical representation 1050 ceases to be displayed on user interface 1001 and graphical representation 1052 is displayed on user interface 1001. In some embodiments, graphical representation 1050 fades out and/or dissolves as graphical representation 1052 fades in or is otherwise displayed on user interface 1001. As set forth above, the second character and the third character are included in the set of characters selected to be displayed on user interface 1001. In response to detecting the change between the third activity state and the fourth activity state, electronic device 600 transitions from display of the second character to display of the third character. At FIG. 10T, graphical representation 1052 displayed while electronic device 600 operates in the fourth activity state is dimmed (e.g., includes a reduced brightness) as compared to graphical representation 1050 displayed while electronic device 600 operates in the third activity state. In some embodiments, dimming graphical representation 1052 indicates that electronic device 600 is in the fourth activity state. For example, graphical representation 1052 is illustrated in greyscale to indicate that graphical representation 1052 is faded and/or otherwise displayed at a reduced brightness when compared to graphical representation 1050 shown at FIG. 10R. Electronic device 600 ceases to display background 1004 on user interface 1001 when electronic device 600 is in the fourth activity state.
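The dimmed, greyscale presentation described for the lower-power state can be approximated by scaling brightness and blending toward a luma-weighted grey. The sketch below is an assumption for illustration; the factor values and the use of Rec. 709 luma coefficients are not taken from the disclosure:

```swift
// Hypothetical color adjustment for the lower-power (dimmed) presentation.
struct RGB { var r: Double; var g: Double; var b: Double }

func dimmed(_ c: RGB, brightness: Double = 0.5, desaturation: Double = 1.0) -> RGB {
    // Luma-weighted grey (Rec. 709 coefficients), then blend toward grey and dim.
    let grey = 0.2126 * c.r + 0.7152 * c.g + 0.0722 * c.b
    func mix(_ x: Double) -> Double { (x + (grey - x) * desaturation) * brightness }
    return RGB(r: mix(c.r), g: mix(c.g), b: mix(c.b))
}
```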
- In accordance with device 600 being in the fourth activity state, device 600 displays graphical representation 1052 of the third character in a fourth visual state, different from the third visual state, that corresponds to the fourth activity state. In the embodiment illustrated in FIG. 10T, the fourth visual state shows the third character with eyes open (e.g., a neutral pose). In some embodiments, the fourth visual state shows the third character with eyes closed such that the third character appears to be asleep. In some embodiments, the fourth visual state of the third character does not include movement and/or animations of the third character. Accordingly, electronic device 600 does not animate and/or does not cause graphical representation 1052 of the third character to move in response to changes in time (e.g., every minute, every fifteen minutes, every thirty minutes, every hour) and/or in response to user inputs.
- At FIG. 10U, electronic device 600 operates in the third activity state (e.g., electronic device 600 detects a user input and/or a wrist raise gesture causing a transition from the fourth activity state to the third activity state) and displays user interface 1001 with graphical representation 1052 of the third character. As such, electronic device 600 does not replace graphical representation 1052 of the third character with a graphical representation of a different character upon transitioning from the fourth activity state to the third activity state. For example, electronic device 600 maintains display of graphical representation 1052 of the third character in response to detecting a change from the fourth activity state to the third activity state. In some embodiments, electronic device 600 transitions from display of graphical representation 1050 to display of graphical representation 1052 in response to detecting a change from the fourth activity state to the third activity state, but not in response to detecting a change from the third activity state to the fourth activity state. At FIG. 10U, user interface 1001 includes background 1004 (e.g., the same background as displayed at FIG. 10R) and complications 1005A and 1005B. Additionally, time indicator 1002 is displayed as having an increased thickness and/or size when compared to time indicator 1002 displayed while electronic device 600 operates in the fourth activity state shown at FIG. 10T.
- At FIG. 10U, in accordance with electronic device 600 being in the third activity state, electronic device 600 displays graphical representation 1052 of the third character in the third visual state (e.g., the second visual state and/or an animated visual state) that corresponds to the third activity state. In the embodiment illustrated in FIG. 10U, the third visual state includes the third character with eyes and mouth open (e.g., the third character is posing and appears awake (not asleep)). In some embodiments, the third visual state of the third character includes periodic movement and/or animations of the third character. For example, electronic device 600 can animate and/or cause graphical representation 1052 of the third character to move in response to changes in time (e.g., every minute, every fifteen minutes, every thirty minutes, every hour) and/or in response to user input. In some embodiments, in response to detecting a change in the activity state from the third activity state to the fourth activity state, electronic device 600 displays user interface 1001 with a fourth character, different from the second character and the third character.
- At FIG. 10U, while electronic device 600 is in the third activity state, electronic device 600 detects user input 1054 (e.g., a tap gesture) on user interface 1001. In response to detecting user input 1054, electronic device 600 causes display of graphical representation 1052 of the third character to move (e.g., causes a randomly selected or predetermined animation of the graphical representation), as shown at FIG. 10V. At FIG. 10V, electronic device 600 displays an enlargement animation (e.g., zooms and/or increases a size) of graphical representation 1052 of the third character. In some embodiments, in response to user input 1054, electronic device 600 ceases to display a portion of graphical representation 1052 on display 602. For example, at FIG. 10V, a lower portion of graphical representation 1052 of the third character (e.g., the ears and mouth of the third character) appears to move off of display 602 and ceases to be displayed by electronic device 600 for a predetermined period of time. Additionally, electronic device 600 causes display of graphical representation 1052 of the third character to cover and/or block at least a portion of complication 1005B for the predetermined period of time in response to user input 1054.
- In some embodiments, electronic device 600 is configured to fluidly transition between different animations of graphical representation 1052 of the third character in response to user inputs. For example, at FIG. 10V, electronic device 600 detects user input 1056 on user interface 1001 while the lower portion of graphical representation 1052 of the third character is not displayed on display 602 (e.g., while electronic device 600 is causing an enlargement animation of graphical representation 1052). In response to detecting user input 1056, electronic device 600 displays a pose animation of graphical representation 1052 of the third character, as shown at FIG. 10W. In some embodiments, electronic device 600 displays a randomly selected animation (e.g., another pose animation and/or a different animation than the pose animation) of graphical representation 1052 of the third character in response to detecting user input 1056. At FIG. 10W, electronic device 600 displays graphical representation 1052 of the third character as winking and with an open mouth (e.g., the mouth is open wider than in FIG. 10U). In some embodiments, in response to user input 1056, electronic device 600 displays graphical representation 1052 of the third character in the pose depicted in FIG. 10W for a predetermined period of time before returning display of graphical representation 1052 of the third character to the third visual state, as shown at FIG. 10U. In some embodiments, electronic device 600 displays the animation of graphical representation 1052 in response to detecting user input 1056 after graphical representation 1052 returns to the position shown in FIG. 10U instead of while graphical representation 1052 is positioned as illustrated in FIG. 10V (e.g., while graphical representation 1052 is undergoing the enlargement animation caused by user input 1054).
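Both behaviors described in this paragraph (blending into the new animation immediately versus waiting for the in-flight animation to return to rest) fit a small coordinator. The sketch below uses assumed names and is not the disclosed implementation:

```swift
// Hypothetical coordinator for fluid transitions between tap-driven animations.
enum CharacterAnimation { case enlargement, pose, none }

final class AnimationCoordinator {
    private(set) var current: CharacterAnimation = .none
    private var pending: CharacterAnimation?
    var deferUntilRest = false   // true for the "play after returning to rest" variant

    func handleTap(requesting next: CharacterAnimation) {
        if current == .none || !deferUntilRest {
            current = next       // blend from the current pose into the new animation
        } else {
            pending = next       // queue it until the current animation finishes
        }
    }

    func animationDidFinish() {
        current = pending ?? .none
        pending = nil
    }
}
```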
- Turning back to FIG. 10U, electronic device 600 detects user input 1058 (e.g., a long press gesture) on user interface 1001. In response to detecting user input 1058, electronic device 600 displays user interface 1008 shown at FIG. 10X. As set forth above, in some embodiments, user interface 1008 is a user interface of a user interface editing mode. User interface 1008 displays representation 1060 of user interface 1001, share affordance 1010, and customize affordance 1012 (e.g., edit affordance). At FIG. 10X, representation 1060 of user interface 1001 includes multiple characters that are included in the set of characters configured to be displayed on user interface 1001. For example, electronic device 600 transitions display of user interface 1001 between individual graphical representations of the set of characters in response to detecting the change from the third activity state to the fourth activity state (and/or in response to detecting the change from the fourth activity state to the third activity state). As such, representation 1060 provides an indication that electronic device 600 transitions between displaying the characters in the set of characters when user interface 1001 is selected.
- At FIG. 10X, electronic device 600 receives (e.g., detects) input 1062 corresponding to a request to edit user interface 1001 (e.g., a tap at a location on display 602 corresponding to customize affordance 1012). In response to receiving input 1062, electronic device 600 displays user interface 1064 shown at FIG. 10Y. User interface 1064 provides the ability to change the character and/or set of characters displayed on user interface 1001 (e.g., by swiping up or down on display 602 or rotating rotatable input mechanism 603). For example, user interface 1064 includes editing mode indicator 1066 (e.g., "Type") and additional editing mode user interface object 1068 (e.g., "Color"). In response to detecting user input (e.g., a swipe gesture on display 602), electronic device 600 adjusts display of user interface 1064 to a second page that provides the ability to change a color of background 1004. At FIG. 10Y, user interface 1064 displays representation 1060 of the currently-selected watch face user interface 1001 (e.g., a watch face user interface that displays the set of characters), watch face selection element 1070, and textual identifier 1072 of the currently-selected set of characters (e.g., "Random Avatar"). Watch face option selection element 1070 indicates the position of the currently selected option in a sequence of watch face options. At FIG. 10Y, electronic device 600 detects rotational input 1074 on rotatable input mechanism 603. In response to detecting rotational input 1074, electronic device 600 displays user interface 1064 with representation 1076 of a second watch face user interface that includes a second set of characters (e.g., animal-like characters and/or emojis) configured to be displayed on display 602, as shown at FIG. 10Z.
- At FIG. 10Z, user interface 1064 includes textual identifier 1078 (e.g., "Random Emoji") to reflect representation 1076 of the second watch face user interface that includes the second set of characters. Additionally, electronic device 600 adjusts a position of watch face selection element 1070 in response to rotational input 1074. At FIG. 10Z, electronic device 600 detects rotational input 1080 on rotatable input mechanism 603. In response to detecting rotational input 1080, electronic device 600 displays user interface 1064 with representation 1082 of a third watch face that includes a single character configured to be displayed on display 602, as shown at FIG. 10AA. Accordingly, electronic device 600 displays representation 1060 and representation 1076 with multiple characters to indicate that the corresponding watch face user interface displays individual graphical representations of multiple characters when representation 1060 and/or representation 1076 are selected (e.g., via user input). Conversely, electronic device 600 displays representation 1082 with a single character to indicate that a corresponding watch face user interface displays a graphical representation of a single character when representation 1082 is selected. For example, the third watch face user interface does not transition between graphical representations of different characters in response to a change from the third activity state to the fourth activity state, in response to a user input, or after a predetermined amount of time. Rather, the third watch face user interface maintains display of a graphical representation of the single character, even as electronic device 600 changes from the third activity state to the fourth activity state. At FIG. 10AA, user interface 1064 also includes textual identifier 1083 (e.g., "Avatar 1") to identify the third watch face corresponding to representation 1082. Turning back to FIG. 10Z, electronic device 600 detects user input 1084 (e.g., a tap gesture) corresponding to selection of representation 1076. In response to detecting user input 1084, electronic device 600 displays user interface 1085, as shown at FIG. 10AB. At FIG. 10AB, electronic device 600 is in the third activity state (e.g., a normal operating state, a higher-power consumption state) and user interface 1085 includes graphical representation 1086 of a fourth character (e.g., an animal-like emoji, such as a frog) in the third visual state. Additionally, user interface 1085 includes time indicator 1002, background 1004, and complications 1005A and 1005B.
- At FIG. 10AC, electronic device 600 is in the fourth activity state (e.g., a locked state, a sleep state, a lower-power consumption state) and displays user interface 1085. As set forth above, representation 1076 in FIG. 10Z corresponds to a watch face user interface that includes a set of characters that includes more than one character (e.g., as opposed to a single character). Accordingly, in response to detecting a change from the third activity state to the fourth activity state, electronic device 600 ceases to display graphical representation 1086 of the fourth character (e.g., a frog character) and displays graphical representation 1088 of a fifth character (e.g., a dog character). At FIG. 10AC, electronic device 600 also ceases to display background 1004 and complications 1005A and 1005B while electronic device 600 operates in the fourth activity state. Further, at FIG. 10AC, user interface 1085 includes time indicator 1002 having a reduced thickness and/or size as compared to time indicator 1002 displayed at FIG. 10AB.
- FIGS. 11A-11H are a flow diagram illustrating methods of enabling and displaying a user interface using a character, in accordance with some embodiments. Method 1100 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component. Some operations in method 1100 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. - As described below,
method 1100 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - At a first time, the computer system (e.g., 600) displays (1102), concurrently in a user interface (e.g., 1001) (e.g., a watch face user interface) displayed via the display generation component (e.g., 602), an indication of time (e.g., 1002) (e.g., the current time; the time set in the systems setting of the computer system) (1104), and a graphical representation of a first character (e.g., 1000, 1040) (e.g., an animated character; an emoji; an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji; an animated representation of a user of the computer system) (1106).
- Displaying the graphical representation of the first character (e.g., 1000, 1040) includes (1106), in accordance with a determination that the computer system (e.g., 600) is in a first activity state (e.g., activity state in
FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., dimmed (e.g., but unlocked) state; locked state; time-passing state; detecting an input (e.g., tap input) state; time-change state), displaying the graphical representation of the first character in a first visual state (e.g., a neutral state; sleeping state; selfie state; a time change state; a tick tock state) that corresponds to the first activity state of the computer system (1108). - Displaying the graphical representation of the first character (e.g., 1000, 1040) includes (1106), in accordance with a determination that the computer system (e.g., 600) is in a second activity state (e.g., activity state in
FIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., dimmed (e.g., but unlocked) state; locked state; time-passing state; detecting an input (e.g., tap input) state; time-change state) that is different from the first activity state, displaying the graphical representation of the first character in a second visual state (e.g., a neutral state; sleeping state; selfie state; a time change state; a tick tock state), different from the first visual state, that corresponds to the second activity state of the computer system (1110). Displaying the graphical representation of the first character in a different visual state based on an activity state of the computer system provides visual feedback about the current activity state of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the activity state of the computer system). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - At a second time, after the first time, the computer system (e.g., 600) displays (1112), concurrently in the user interface (e.g., 1001) the indication of time (e.g., 1002) (e.g., the current time; the time set in the systems setting of the computer system) (1114), and a graphical representation of a second character (e.g., 1000, 1040) (e.g., an animated character; an emoji; an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji; an animated representation of a user of the computer system, the first character, a character different from the first character) (1116). In some embodiments, the second character is the same character as the first character. In some embodiments, the second character is a different character from the first character.
- Displaying the graphical representation of the second character (e.g., 1000, 1040) includes (1116), in accordance with a determination that the computer system (e.g., 600) is in the first activity state (e.g., activity state in
FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., dimmed (e.g., but unlocked) state; locked state; time-passing state; detecting an input (e.g., tap input) state; time-change state), displaying the graphical representation of the second character in the first visual state (e.g., a neutral state; sleeping state; selfie state; a time change state; a tick tock state) that corresponds to the first activity state of the computer system (1118). - Displaying the graphical representation of the second character (e.g., 1000, 1040) includes (1116), in accordance with a determination that the computer system (e.g., 600) is in the second activity state (e.g., activity state in
FIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., dimmed (e.g., but unlocked) state; locked state; time-passing state; detecting an input (e.g., tap input) state; time-change state) that is different from the first activity state (e.g., activity state inFIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q), displaying the graphical representation of the second character (e.g., 1000, 1040) in the second visual state (e.g., a neutral state; sleeping state; selfie state; a time change state; a tick tock state), different from the first visual state, that corresponds to the second activity state of the computer system (1120). Displaying the graphical representation of the second character in a different visual state based on an activity state of the computer system provides visual feedback about the current activity state (e.g., or a change in activity state) of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the activity state or a change in activity state of the computer system). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, the computer system (e.g., 600) concurrently displays or causes display of, in the user interface (e.g., 1001) (e.g., overlaid on the graphical representation of the first character and/or the graphical representation of the second character), one or more watch complications (e.g., 1005A, 1005B). In some embodiments, the one or more watch complications include a complication indicating a current date. In some embodiments, the one or more watch complications include a complication that includes text information (e.g., about the weather; about a calendar meeting). In some embodiments, the user interface also includes an editing tab (e.g., to access an editing page) for editing the one or more watch complications (e.g., changing one or more of the watch complications to a different type).
- In some embodiments, at the second time (e.g., or immediately prior to the second time), the computer system (e.g., 600) detects (e.g., determines) (1122) a change in activity state of the computer system from the first activity state (e.g., activity state in
FIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) to the second activity state (e.g., activity state inFIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., device being locked or unlocked) of the computer system; a change in the current time (e.g., a change in the hour of the current time, a change in the minute of the current time, a change in the second of the current time); a change in a state of the computer system due to a detected user input and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input). - In some embodiments, displaying the graphical representation of the second character (e.g., 1000, 1040) in the second visual state includes displaying the graphical representation of the second character in the second visual state in response to detecting (e.g., determining) the change in activity state of the computer system from the first activity state (e.g., activity state in
FIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) to the second activity state (e.g., activity state inFIG. 10A, 10B 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q). In some embodiments, the second character is the same character as the first character (e.g., 1000, 1040). In some embodiments, the second character is a different character from the first character. Displaying the graphical representation of the second character in the second visual state in response to detecting (e.g., determining) the change in activity state of the computer system from the first activity state to the second activity state provides visual feedback about the change in activity state of the computer system (e.g., without one or more user inputs directed to causing the computer system to indicate the change in activity state of the computer system). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, the first character is the same character as the second character (1124). In some embodiments, the first character is a different character from the second character (1126). In some embodiments, the first visual state or the second visual state is a static (e.g., not moving; not animated; not dynamic) visual state (1128). In some embodiments, the first visual state or the second visual state is an animated (e.g., moving; dynamic) visual state (1130).
- In some embodiments, the first activity state (e.g., activity state in
FIG. 10A, 10B, 10C, 10D, 10E, 10F, 10G, 10H, 10P, or 10Q) corresponds to a state in which the user interface (e.g., 1001) is displayed at a lower brightness level than a designated brightness level (e.g., as compared to a standard brightness level, a brightness level of an active state), and the first visual state corresponds to a neutral body expression (e.g., a neutral state; a state or animation of the respective character (e.g., the first character and/or the second character) that reflects a neutral stance/image or motion). Displaying the representation of a character with the first visual state corresponding to the neutral body expression when/if the first activity state corresponds to a state in which the user interface is displayed at a lower brightness level than a designated brightness level provides visual feedback that the current activity state of the computer system corresponds to the state in which the user interface is displayed at a lower brightness level than a designated brightness level (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, the first activity state (e.g., activity state of 1000 in
FIG. 10A ) corresponds to a locked state (e.g., where authentication (e.g., biometric authentication; passcode authentication) is required to unlock the computer system (e.g., 600)), and the first visual state includes a visual appearance that the first character (e.g., 1000, 1040) is asleep (e.g., a sleeping state; a state or motion of the respective character (e.g., the first character and/or the second character) that reflects a sleeping stance/image or motion). Displaying the representation of a character with the first visual state including the visual appearance that the first character is asleep when/if first activity state corresponds to a locked state provides visual feedback that the current activity state of the computer system corresponds to the locked state (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, the first activity state (e.g., activity state in
FIG. 10C or 10D ) corresponds to a state in which the indication of time (e.g., the current time; the time set in the systems setting of the computer system) is being displayed (e.g., the passing time is being displayed). In some embodiments, the first visual state corresponds to a respective motion (e.g., animation) repeating at a regular frequency time indication state (e.g., a state or motion of the respective character (e.g., the first character and/or the second character) indicating that time is passing or that time is ticking by (e.g., a tick tock state; a tick tock animation)), wherein the respective motion corresponds to a nodding motion by the first character (e.g., a back-and-forth motion of a head of the first character representing the nodding motion). Displaying the representation of a character corresponding to a respective motion (e.g., animation) repeating at a regular frequency time indication state, where the respective motion corresponds to a nodding motion by the first character, when/if first activity state corresponds to a state in which the indication of time is being displayed provides visual feedback that the current activity state of the computer system corresponds to the state in which the indication of time is being displayed (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, displaying the graphical representation of the first character (e.g., 1000, 1040) (e.g., and/or the second character) in the time indication state includes displaying the first character looking at the indication of time at a predetermined time interval (e.g., every 10 seconds; every 15 seconds; every 30 seconds; every minute; every 5 minutes).
- In some embodiments, in accordance with a determination that the first character (e.g., 1000, 1040) corresponds to a first version (e.g., a first variant) of a first character type (e.g., an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji), the displayed glancing animation corresponds to a first type of glancing animation. In some embodiments, in accordance with a determination that the first character corresponds to a second version (e.g., a second variant) of the first character type (e.g., an animated (e.g., 3D) emoji of an animal-like character; an animated (e.g., 3D) avatar-like emoji) different from the first version, the displayed glancing animation corresponds to a second type of glancing animation (e.g., glancing in a different direction; glancing in a different manner) different from the first type of glancing animation.
- In some embodiments, the first activity state (e.g., activity state in
FIG. 10E ) corresponds to detecting a touch (e.g., tap) input (e.g., a tap input detected via a touch-sensitive surface integrated with the display generation component), and the first visual state corresponds to a first type of motion state (e.g., static or dynamic) that is indicative of a posing gesture (e.g., posing for a selfie) (e.g., a selfie pose; a pose or motion of the respective character (e.g., the first character and/or the second character) that reflects a pose or motion of taking a selfie). Displaying the representation of a character corresponding to a first type of motion state (e.g., static or dynamic) that is indicative of a posing gesture when/if first activity state corresponds to detecting a touch (e.g., tap) input provides visual feedback that the current activity state of the computer system corresponds to detecting the touch (e.g., tap) input (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, the first activity state (e.g., activity state in
FIG. 10F ) corresponds to detecting that there has been a change in time (e.g., a certain time has been reached (e.g., the hour has changed; a quarter past the hour has been reached; half past the hour has been reached)), and the first visual state corresponds to a second type of motion state (e.g., static or dynamic) that is indicative of the change in time (e.g., a time change pose; a pose or motion of the respective character (e.g., the first character and/or the second character) that reflects a pose or motion indicating or acknowledging that the time has changed). Displaying the representation of a character corresponding to a second type of motion state (e.g., static or dynamic) that is indicative of the change in time when/if first activity state corresponds to the computer system detecting that there has been a change in time provides visual feedback that the current activity state of the computer system corresponds to the computer system detecting that there has been a change in time (e.g., without one or more user inputs directed to causing the computer system to indicate the current activity state). Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. - In some embodiments, at the first time (e.g., and prior to the first time), displaying the user interface (e.g., 1001) includes displaying, in the user interface, the graphical representation of the first character (e.g., 1000, 1040). In some embodiments, at the second time after the first time (e.g., and prior to the second time but after the first time), displaying the user interface includes displaying, in the user interface, a transition (e.g., a gradual transition; a smooth transition) from the graphical representation of the first character to the graphical representation of the second character, wherein the second character is different from the first character. In some embodiments, at a third time after the second time (e.g., and prior to the third time but after the second time), displaying the user interface includes displaying, in the user interface, a graphical representation of a third character, wherein the third character is different from the first character and from the second character.
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), a second user interface that includes a plurality of selectable characters (e.g., 1016A) (e.g., including a plurality of animated (e.g., 3D) emojis of animal-like characters; a plurality of animated (e.g., 3D) avatar-like emojis). In some embodiments, the plurality of selectable characters are displayed in a first tab or first screen of the second user interface. Displaying the second user interface that includes the plurality of selectable characters enables a user to manage the characters that are displayed in the user interface with the indication of time and thus easily customize the user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs when operating/interacting with the device to customize the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some embodiments, while displaying the second user interface, the computer system (e.g., 600) detects (e.g., via one or more input devices of the computer system, such as a touch-sensitive surface integrated with the display generation component) a selection of a third character of the plurality of selectable characters. In some embodiments, in accordance with (e.g., or in response to) detecting the selection of the third character, the computer system displays, via the display device, the user interface, wherein the user interface concurrently includes the indication of time (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation of the third character (e.g., different from the first character and from the second character).
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), a third user interface (e.g., 1016A) (e.g., the second user interface) that includes a graphical representation of a set of characters that includes two or more characters. In some embodiments, while displaying the third user interface, the computer system detects (e.g., via one or more input devices that is in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input corresponding to selection of the set of characters. In some embodiments, in accordance with (e.g., or in response to) detecting the selection of the set of characters, the computer system concurrently displays, in the user interface, the indication of time (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation of a respective character from the set of characters, wherein the respective character changes among the set of characters over time (e.g., one character from the subset of characters is (e.g., randomly) selected for display at a time).
- In some embodiments, the representation of the first character (e.g., 1000, 1040) corresponds to a graphical representation of (e.g., an animation based on; a graphical representations that animates features of) a user associated (e.g., based on an account to which the computer system is logged into) with the computer system (e.g., 600) (e.g., an animated (e.g., 3D) avatar-like representation of the user of the computer system).
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), a fourth user interface that includes a representation of a selected character (e.g., a selected animated (e.g., 3D) emoji of an animal-like character; a selected animated (e.g., 3D) avatar-like emoji). In some embodiments, the representation of the selected character is displayed in a second tab or second screen of the second user interface. In some embodiments, the second tab or second screen of the second user interface enables a user to customize (e.g., change a color of; change a background color of) the representation of the selected character and/or a background associated with the representation of the selected character.
- In some embodiments, while displaying the representation of the selected character (e.g., 1000, 1040), the computer system detects (e.g., via one or more input devices that is in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input (e.g., a rotational input on
rotatable input device 603 in FIG. 10K; a scrolling input on a touch-sensitive surface integrated with the display generation component) directed to changing a visual characteristic (e.g., a background color; a background color theme). - In some embodiments, in response to detecting the input directed to changing the visual characteristic, the computer system (e.g., 600) changes (e.g., by transitioning through a plurality of selectable visual characteristics (e.g., selectable colors)) the visual characteristic (e.g., a color; a background color) from a first visual characteristic (e.g., a first color; a first background color) to a second visual characteristic (e.g., a second color; a second background color) different from the first visual characteristic.
- In some embodiments, the computer system (e.g., 600) displays or causes display of, in the second user interface (e.g., 1016B; a second tab or second screen of the second user interface), a user interface element (e.g., 1048; a rotatable user interface element; a color wheel) for changing the visual characteristic (e.g., a color; a background color). In some embodiments, in response to (e.g., and while) detecting the input directed to changing the visual characteristic, the computer system displays or causes display of a change in the selected visual characteristic via the user interface element for changing the visual characteristic (e.g., transitioning and/or rotating through selectable colors in the color wheel while the input is being detected). In some embodiments, the input directed to changing the visual characteristic is a rotational input (e.g., detected/received via a rotatable input device that is in communication with the computer system), and the change in the selected visual characteristic includes scrolling/navigating through a plurality of different colors (e.g., scrolling through the color wheel) of the user interface element. In some embodiments, the computer system scrolls/navigates the user interface element (e.g., the color wheel) in a first direction in accordance with a determination that the rotational input is in a first direction (e.g., clockwise direction) and scrolls/navigates the user interface element (e.g., the color wheel) in a second direction in accordance with a determination that the rotational input is in a second direction (e.g., counter-clockwise direction).
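- As an illustrative sketch of the direction mapping described above (hypothetical names, assuming a simple index-based wheel):

    // A clockwise rotational input scrolls the color wheel forward; a
    // counter-clockwise input scrolls it backward, wrapping at the ends.
    enum RotationDirection { case clockwise, counterClockwise }

    struct ColorWheel {
        var colors: [String]        // stand-in for the selectable colors
        var selectedIndex = 0

        mutating func scroll(_ rotation: RotationDirection) {
            guard !colors.isEmpty else { return }
            switch rotation {
            case .clockwise:
                selectedIndex = (selectedIndex + 1) % colors.count
            case .counterClockwise:
                selectedIndex = (selectedIndex - 1 + colors.count) % colors.count
            }
        }
    }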
- The computer system (e.g., 600), at the second time (e.g., or immediately prior to the second time), detects (1132) (e.g., determines) a change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state (e.g., a lower power consumption mode) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., device being locked or unlocked) of the computer system; a change in a state of the computer system due to a detected user input and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input).
- The computer system (e.g., 600), in response to detecting (1134) the change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state, displays (1136), in the user interface (e.g., 1001), the graphical representation (e.g., 1052, 1088) of the second character (e.g., a transition animation causes the graphical representation of the first character to begin to fade, dissolve, and/or reduce in size and the graphical representation of the second character to begin to be displayed at the same size as the first character) (e.g., the graphical representation of the second character is in the second visual state, such as a neutral state, a static state, and/or a sleeping state); and ceases (1138) to display, in the user interface (e.g., 1001), the graphical representation (e.g., 1050, 1086) of the first character, wherein the second character is different from the first character (e.g., the first character and the second character are different characters and are from a predetermined collection and/or set of characters).
- In some embodiments, the computer system (e.g., 600) maintains display of the graphical representation (e.g., 1052, 1088) of the second character in response to detecting a change in activity state of the computer system (e.g., 600) from the second activity state to the first activity state. In some embodiments, the computer system (e.g., 600) transitions between the graphical representation (e.g., 1050, 1086) of the first character and the graphical representation (e.g., 1052, 1088) of the second character in response to detecting a change in the activity state from a lower power consumption mode to a higher power consumption mode, and maintains display of the currently displayed graphical representation (e.g., 1050, 1086) of the first character or the graphical representation (e.g., 1052, 1088) of the second character in response to detecting the transition from the higher power consumption mode to the lower power consumption mode.
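- The asymmetric transition in the embodiment above (swap on wake, keep on sleep) can be sketched as follows; this is a minimal illustration with hypothetical names, not the disclosed implementation:

    enum PowerMode { case higher, lower }

    struct CharacterRotation {
        let characters: [String]   // predetermined collection of characters
        var index = 0
        var mode = PowerMode.higher

        var displayedCharacter: String { characters[index] }

        // Advance to the next character only on a lower -> higher transition;
        // keep the current character when dropping back to lower power.
        mutating func transition(to newMode: PowerMode) {
            if mode == .lower && newMode == .higher {
                index = (index + 1) % characters.count
            }
            mode = newMode
        }
    }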
- Displaying the graphical representation of the second character and ceasing to display the graphical representation of the first character in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), at a third time (e.g., after the second time and/or immediately prior to the third time), detects (1142) a change in activity state of the computer system (e.g., 600) from the second activity state to the first activity state; and in response to detecting the change in activity state of the computer system (e.g., 600) from the second activity state to the first activity state, maintains (1144) display, in the user interface (e.g., 1001), of the graphical representation (e.g., 1052, 1088) of the second character, wherein the graphical representation (e.g., 1052, 1088) of the second character includes an animated visual state (e.g., maintaining display of the graphical representation of the second character, but changing a visual state of the graphical representation of the second character in response to detecting the change in activity state from the second activity state to the first activity state).
- Displaying the graphical representation of the second character in an animated visual state in response to detecting the change in activity state from the second activity state to the first activity state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), at a fourth time (e.g., after the third time and/or immediately prior to the fourth time), after (or while) displaying the second character in the animated visual state, detects (1146) a change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state.
- The computer system (e.g., 600), in response to detecting (1148) the change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state: displays (1150), in the user interface (e.g., 1001), a graphical representation of a third character (e.g., a transition animation causes the graphical representation of the second character to begin to fade, dissolve, and/or reduce in size and the graphical representation of the third character to begin to be displayed at the same size as the second character) (e.g., the graphical representation of the third character is in the second visual state, such as a neutral state, a static state, and/or a sleeping state); and ceases (1152) to display, in the user interface (e.g., 1001), the graphical representation (e.g., 1052, 1088) of the second character, wherein the third character is different from the first character and the second character (e.g., the first character, the second character, and the third character are different characters and are from a predetermined collection and/or set of characters).
- Displaying the graphical representation of the third character and ceasing to display the graphical representation of the second character in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, at the first time, displaying, in the user interface (e.g., 1001), the graphical representation (e.g., 1050, 1086) of the first character includes displaying a graphical element (e.g., 1004) surrounding at least a portion of the first character (e.g., displaying the first character overlaid on the graphical element) (e.g., a background having a ring of color and/or multiple rings of color different from a color of the user interface (e.g., a black color)) displayed in the user interface (e.g., 1001).
- The computer system (e.g., 600), at the second time (e.g., or immediately prior to the second time), detects (1132) (e.g., determines) a change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state (e.g., a lower power consumption mode) (e.g., a change in a display setting (e.g., getting dimmer; getting brighter) of the computer system; a change in a security state (e.g., device being locked or unlocked) of the computer system; a change in a state of the computer system due to a detected user input and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input).
- The computer system (e.g., 600), in response (1134) to detecting the change in activity state of the computer system (e.g., 600) from the first activity state to the second activity state, decreases (1140) a brightness of a portion of the user interface (e.g., 1001) that included the graphical element (e.g., 1004) (e.g., fading the graphical element or displaying the graphical representation of the second character without the graphical element in the user interface) (e.g., a transition animation causes the graphical element to fade to a color that is closer to or the same as the color of a background portion of the user interface (e.g., black) in response to detecting the change in activity state of the computer system from the first activity state to the second activity state).
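- A minimal sketch of the dimming behavior above, assuming a scalar brightness for the ring-like graphical element (the factor is an illustrative assumption):

    struct RingElement {
        var brightness = 1.0   // 1.0 = fully lit; 0.0 = matches the background
    }

    // On entry to the second (lower power) activity state, fade the ring
    // toward the black background rather than removing it abruptly.
    func enterLowerPowerState(_ ring: inout RingElement, dimFactor: Double = 0.2) {
        ring.brightness = max(0.0, ring.brightness * dimFactor)
    }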
- Decreasing the brightness of the portion of the user interface that included the graphical element in response to detecting the change in activity state from the first activity state to the second activity state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while the computer system (e.g., 600) is in the first activity state (e.g., a higher power consumption mode), in response to a determination that a predetermined change in time has occurred (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), displays (1154) the graphical representation (e.g., 1050, 1086) of the first character in a change-in-time visual state (e.g., time change pose; a pose or motion of the first character that reflects a pose or motion indicating or acknowledging that the time has changed).
- The computer system (e.g., 600), while the computer system (e.g., 600) is in the second activity state (e.g., a lower power consumption mode), forgoes (1156) display of the graphical representation (e.g., 1052, 1088) of the second character in the change-in-time visual state when the predetermined change in time has occurred.
- Displaying the graphical representation of the first character in the change-in-time visual state while the computer system is in the first activity state and forgoing display of the graphical representation of the second character in the change-in-time visual state while the computer system is in the second activity state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600) detects (1158) a change in time (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), and in response to detecting (1160) the change in time and in accordance with a determination that the computer system (e.g., 600) is in the first activity state (e.g., a higher power consumption mode), updates (1162) a representation of time (e.g., 1002) and displays the graphical representation (e.g., 1050, 1086) of the first character in a first manner (e.g., a visual state that includes animating the graphical representation of the first character in response to detecting the change in time). The computer system (e.g., 600) detects (1158) a change in time (e.g., a minute has changed, an hour has changed, 15-minutes past the hour has been reached, 30-minutes past the hour has been reached; 45-minutes past the hour has been reached), and in response to detecting (1160) the change in time and in accordance with a determination that the computer system (e.g., 600) is in the second activity state (e.g., a lower power consumption mode), updates (1164) the representation of time (e.g., 1002) without displaying the graphical representation (e.g., 1050, 1086) of the first character in the first manner (e.g., displaying the graphical representation of the first character in a second manner (e.g., a static visual state) that is different from the first manner and/or forgoing any change in the graphical representation of the first character in response to detecting the change in time).
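- Sketched in Swift (hypothetical names; a simplification of the behavior above): the time indication is always updated, while the character animation plays only in the higher power state:

    enum ActivityLevel { case higherPower, lowerPower }

    struct WatchFace {
        var timeText = "10:09"
        var isCharacterAnimating = false

        mutating func handleTimeChange(to newTime: String, state: ActivityLevel) {
            timeText = newTime                              // always update the time
            switch state {
            case .higherPower: isCharacterAnimating = true  // play the animation
            case .lowerPower:  isCharacterAnimating = false // stay static
            }
        }
    }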
- Displaying the graphical representation of the first character in the first manner and forgoing display of the graphical representation of the first character in the first manner depending on an activity state of the computer system provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while displaying the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character, detects (1166) an input (e.g., 1054) directed to one or more input devices of the computer system (e.g., 600) (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode); and in response to detecting the input (e.g., 1054), displays (1170) the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character in a third visual state that includes enlarging the graphical representation of the first character (e.g., increasing a size of the first character with respect to the user interface and/or the display generation component) such that a portion of the graphical representation of the first character ceases to be displayed in the user interface (e.g., 1001) (e.g., the first character increases in size and/or moves to cause a portion of the first character to appear to move off of the display generation component, such that the portion of the first character ceases to be displayed via the display generation component for a predetermined period of time).
- Displaying the graphical representation of the first character in the third visual state provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while displaying the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character, detects (1172) a first input (e.g., 1054) directed to one or more input devices of the computer system (e.g., 600) (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode).
- The computer system (e.g., 600), in response to detecting the first input (e.g., 1054), displays (1174) the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character in a first animated visual state for a predetermined period of time (e.g., causing an animation of the graphical representation of the first character that lasts for a certain period of time, such as 1 second, 2 seconds, 3 seconds, 4 seconds, or 5 seconds).
- The computer system (e.g., 600), after detecting the first input (e.g., 1054), detects (1176) a second input (e.g., 1056) directed to one or more input devices of the computer system (e.g., 600) (e.g., a touch input while the computer system is in the higher power consumption mode, or a digital crown rotation input while the computer system is in the higher power consumption mode).
- The computer system (e.g., 600), in response to detecting (1178) the second input (e.g., 1056) and in accordance with a determination that the predetermined period of time has ended (e.g., the animation caused by the first input has ended and the graphical representation of the first character is displayed in a default position), displays (1180) the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character in a second animated visual state (e.g., causing an animation of the graphical representation of the first character), wherein the second animated visual state includes movement of the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character starting from a first position (e.g., a default position of the graphical representation of the first character that is displayed when no user input is detected that causes an animation of the graphical representation of the first character).
- The computer system (e.g., 600), in response to detecting (1178) the second input (e.g., 1056) and in accordance with a determination that the predetermined period of time has not ended (e.g., the animation caused by the first input is still occurring, such that the graphical representation of the first character is not in the default position), displays (1182) the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character in a third animated visual state (e.g., causing an animation of the graphical representation of the first character) (e.g., the second animated visual state where the graphical representation of the first character starts from a different position), wherein the third animated visual state includes movement of the graphical representation (e.g., 1050, 1052, 1086, 1088) of the first character starting from a second position (e.g., a position of the graphical representation of the first character that is not the default position and/or a position of the graphical representation of the first character that is along a predetermined path of movement of the first animated visual state), different from the first position.
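- The branch described in the last two paragraphs can be sketched as follows (a minimal illustration with hypothetical names and an assumed two-second duration):

    import Foundation

    struct CharacterAnimator {
        let defaultPosition = 0.0
        var currentPosition = 0.0
        var animationEndTime: Date? = nil
        let animationDuration: TimeInterval = 2.0

        mutating func handleInput(at now: Date = Date()) {
            let stillAnimating = animationEndTime.map { now < $0 } ?? false
            if !stillAnimating {
                // Prior animation ended: restart from the default position
                // (the "second animated visual state").
                currentPosition = defaultPosition
            }
            // Otherwise currentPosition is kept, so the new movement starts
            // from wherever the character is along its path (the "third
            // animated visual state").
            animationEndTime = now.addingTimeInterval(animationDuration)
        }
    }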
- Displaying the graphical representation of the first character in the second animated visual state or the third animated visual state depending on whether the predetermined time period has ended provides improved visual feedback about the current activity state of the computer system. Providing improved visual feedback improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600) displays (1184), via the display generation component (e.g., 602), a fifth user interface (e.g., 1064) (e.g., the second user interface and/or the third user interface) for selecting between a first set of characters (e.g., 1060) that includes a plurality of user-customizable virtual avatars (e.g., a plurality of avatar-like emojis) and a graphical representation (e.g., 1076) of a second set of characters (e.g., a plurality of emojis of animal-like characters) that includes two or more predetermined characters that are not available in the first set of characters.
- The computer system (e.g., 600), while displaying the fifth user interface (e.g., 1064), detects (1186) (e.g., via one or more input devices that is in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input (e.g., 1084) corresponding to selection of the first set of characters (e.g., 1060) or the second set of characters (e.g., 1076), and, in accordance with (e.g., or in response to) a determination that the input corresponds to selection of the first set of characters (e.g., 1060), the computer system (e.g., 600) concurrently displays (1188), in the user interface (e.g., 1001): the indication of time (e.g., 1002) (1190) (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation (e.g., 1050, 1052) (1192) of a currently selected character from the first set of characters (e.g., 1060), wherein the currently selected character is automatically changed between different characters in the first set of characters (e.g., 1060) when predetermined criteria are met (e.g., one character from the subset of characters is (e.g., randomly) selected for display over time, in response to detecting a change in activity state of the computer system, and/or in response to detecting a user gesture, such as a wrist raise and/or a tap gesture).
- The computer system (e.g., 600), while displaying the fifth user interface (e.g., 1064), detects (1186) (e.g., via one or more input devices that is in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) an input (e.g., 1084) corresponding to selection of the first set of characters (e.g., 1060) or the second set of characters (e.g., 1076), and, in accordance with (e.g., or in response to) a determination that the input (e.g., 1084) corresponds to selection of the second set of characters (e.g., 1076), concurrently displays (1194), in the user interface (e.g., 1001): the indication of time (e.g., 1002) (1196) (e.g., the current time; the time set in the systems setting of the computer system), and a graphical representation (e.g., 1086, 1088) (1198) of a currently selected character from the second set of characters (e.g., 1076), wherein the currently selected character is automatically changed between different characters in the second set of characters (e.g., 1076) when the predetermined criteria are met (e.g., one character from the subset of characters is (e.g., randomly) selected for display over time, in response to detecting a change in activity state of the computer system, and/or in response to detecting a user gesture, such as a wrist raise and/or a tap gesture).
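- One way to model the selection logic above (illustrative only; names are hypothetical): whichever set the user picks, the face cycles through that set's characters when the predetermined criteria are met:

    struct CharacterSource {
        let avatarSet: [String]   // user-customizable virtual avatars
        let animalSet: [String]   // predetermined animal-like characters
        var usesAvatarSet = true
        var index = 0

        var activeSet: [String] { usesAvatarSet ? avatarSet : animalSet }
        var currentCharacter: String { activeSet[index % activeSet.count] }

        mutating func select(avatars: Bool) {
            usesAvatarSet = avatars
            index = 0   // start over at the first character of the chosen set
        }

        // Called when the predetermined criteria are met (e.g., a wake
        // transition or a wrist raise): advance to the next character.
        mutating func criteriaMet() {
            index = (index + 1) % activeSet.count
        }
    }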
- Displaying the fifth user interface for selecting between the first set of characters and the second set of characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- Note that details of the processes described above with respect to method 1100 (e.g.,
FIGS. 11A-11H) are also applicable in an analogous manner to the methods described above and below. For example, method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100. For example, a device can use as a watch user interface either a watch user interface as described with reference to FIGS. 6A-6H or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC. For another example, method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100. For example, a device can use as a watch user interface either a watch user interface as described with reference to FIGS. 8A-8M or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC. For another example, method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100. For example, a device can use as a watch user interface either a time user interface as described with reference to FIGS. 12A-12G or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC. For another example, method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100. For example, a device can use as a watch user interface either a user interface that includes a background as described with reference to FIGS. 14A-14AD or a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC. For another example, method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1100. For example, one or more characteristics or features of a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE. For brevity, these details are not repeated below. -
FIGS. 12A-12G illustrate exemplary user interfaces for enabling and displaying an indication of a current time, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 13A-13C. -
FIG. 12A illustrates device 600 displaying, via display 602, a time user interface 1204 (e.g., a watch user interface that includes an indication of a current time) that includes a face 1206 (e.g., a representation of a human face or a representation of an anthropomorphic face of a non-human character). As shown in FIG. 12A, face 1206 comprises a plurality of facial features, including a first facial feature 1208 (e.g., representing/indicative of the eyes; also referred to as eyes 1208), a second facial feature 1210 (e.g., also referred to as nose 1210), a third facial feature 1212 (e.g., also referred to as mouth 1212 (e.g., lips)), a fourth facial feature 1214 (e.g., also referred to as hair 1214), a fifth facial feature 1216 (e.g., also referred to as facial outline 1216 (e.g., including cheeks and/or jawline)), a sixth facial feature 1218 (e.g., also referred to as neck 1218), and a seventh facial feature 1220 (e.g., also referred to as shoulders 1220). - In
FIG. 12A, eyes 1208 indicate a current time (e.g., the current time; the time set in the systems setting of device 600), where the shape of the eyes corresponds to the current time (e.g., the right eye is represented via a number or numbers that indicate the current hour, and the left eye is represented via numbers that indicate the current minute). As described in greater detail below, an animation (e.g., blinking motion) can be applied to eyes 1208 and/or a change in visual characteristic (e.g., change in color; change in font; change in style) can be applied to eyes 1208.
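- As an illustrative sketch of the eyes-as-time layout (formatting choices here are assumptions, not the disclosed design):

    import Foundation

    // The right eye is drawn from the current hour and the left eye from
    // the current minute, per the description above.
    func eyeStrings(for date: Date, calendar: Calendar = .current) -> (leftEye: String, rightEye: String) {
        let hour = calendar.component(.hour, from: date)
        let minute = calendar.component(.minute, from: date)
        return (leftEye: String(format: "%02d", minute), rightEye: String(hour))
    }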
- In FIG. 12A, eyes 1208, nose 1210, mouth 1212, hair 1214, facial outline 1216, neck 1218, and shoulders 1220, respectively, have a corresponding visual characteristic (e.g., a respective color (e.g., a respective line color or a respective fill color); a respective shape; a respective position). In some embodiments, one or more of the facial features 1208-1220 have the same corresponding visual characteristic (e.g., the same line or fill colors). For example, nose 1210 and mouth 1212 can have the same visual characteristic (e.g., the same color (e.g., the same line color or the same fill color)), while eyes 1208, hair 1214, facial outline 1216, neck 1218, and shoulders 1220 can have different visual characteristics (e.g., different colors (e.g., different line colors and/or different fill colors)). For another example, eyes 1208, mouth 1212, facial outline 1216, and shoulders 1220 can have the same visual characteristic (e.g., the same color (e.g., the same line color or the same fill color)) while nose 1210, hair 1214, and neck 1218 can have different visual characteristics (e.g., different colors (e.g., different line colors and/or different fill colors)). - In some embodiments, a respective visual characteristic for a respective facial feature corresponds to a type of color. In some embodiments, the type of color is programmatically selected (e.g., determined), without user input, from a plurality of available colors by
device 600. In some embodiments, an application process selects (e.g., programmatically determines) the color based on a color of device 600 (e.g., a color of a housing or case of device 600). In some embodiments, the application process selects the color based on usage history of a user of device 600 (e.g., based on a previous user-selected color or color scheme).
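- A minimal sketch of this programmatic selection (the palette and fallback rule are illustrative assumptions):

    enum HousingColor { case silver, gold, midnight, starlight }

    // Prefer a previously chosen user color; otherwise fall back to a color
    // matched to the device housing.
    func defaultFaceColor(housing: HousingColor, lastUserChoice: String?) -> String {
        if let chosen = lastUserChoice {
            return chosen               // usage history takes priority
        }
        switch housing {
        case .silver:    return "slate"
        case .gold:      return "warm beige"
        case .midnight:  return "deep blue"
        case .starlight: return "soft cream"
        }
    }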
- While displaying time user interface 1204 including face 1206, device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of time user interface 1204 (e.g., a change in the current time; a change in a state of device 600 due to a detected user input (e.g., a tap on display 602); detecting a movement of device 600 (e.g., caused by a user movement, such as a wrist-raise movement); a change in state or a change in mode of device 600 (e.g., transitioning to a sleep mode or sleeping state; transitioning from a locked state to an unlocked state)). - In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of
time user interface 1204, device 600 ceases display of face 1206 of FIG. 12A and displays a different type of face (e.g., a face where respective visual characteristics of all facial features have been changed), for example, face 1222 in FIG. 12B. -
FIG. 12B illustrates device 600 displaying, via display 602, time user interface 1204 that includes (e.g., a representation of) face 1222 that is different from face 1206. As with face 1206, face 1222 comprises a plurality of facial features, including a first facial feature 1224 (e.g., eyes indicating the current time; also referred to as eyes 1224), a second facial feature 1226 (e.g., also referred to as nose 1226), a third facial feature 1228 (e.g., also referred to as mouth 1228 (e.g., lips)), a fourth facial feature 1230 (e.g., also referred to as hair 1230), a fifth facial feature 1232 (e.g., also referred to as facial outline 1232 (e.g., including cheeks and/or jawline)), a sixth facial feature 1234 (e.g., also referred to as neck 1234), and a seventh facial feature 1236 (e.g., also referred to as shoulders 1236). - In
FIG. 12B, as with eyes 1208 of face 1206, eyes 1224 indicate a current time, where the shape of the eyes corresponds to the current time. In FIG. 12B, facial features 1224-1236 of face 1222 have respective visual characteristics (e.g., a respective color (e.g., line color or fill color); a respective shape; a respective position). - In some embodiments, ceasing display of
face 1206 as in FIG. 12A and displaying face 1222 as in FIG. 12B includes displaying a gradual transition from face 1206 to face 1222 that includes transitioning a respective facial feature of face 1206 from having the corresponding visual characteristic, as in FIG. 12A, through a plurality of intermediate (e.g., temporary) states to a final state in which a corresponding respective facial feature of face 1222 has the corresponding visual characteristic, as in FIG. 12B, where the corresponding visual characteristic of a respective facial feature in FIG. 12A is different from the corresponding visual characteristic of the counterpart respective facial feature in FIG. 12B (e.g., hair 1214 of face 1206 has a different fill color and/or shape than hair 1230 of face 1222). -
FIG. 12C illustrates device 600 displaying, via display 602, time user interface 1204 that includes face 1222, where face 1222 in FIG. 12C is different from face 1222 in FIG. 12B (e.g., a different version of the same face). In some embodiments, changing the appearance of time user interface 1204 includes changing a subset of the facial features of the displayed face without changing all of the facial features of the displayed face. - While displaying
face 1222 as in FIG. 12B, device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204, device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIG. 12B and displaying face 1222 as in FIG. 12C. In FIG. 12C, the predetermined criteria for changing the appearance of time user interface 1204 (e.g., as shown in the transition of time user interface 1204 from face 1206 in FIG. 12A to face 1222 in FIG. 12B and the transition of time user interface 1204 from face 1222 in FIG. 12B to face 1222 in FIG. 12C) includes a criterion that is satisfied when a predetermined time has elapsed (e.g., every minute; every 15 minutes; every 30 minutes; every hour). In some embodiments, the predetermined criteria for changing the appearance of time user interface 1204 (e.g., changing one or more facial features of the respective face in the time user interface) does not include the criterion that is satisfied when the predetermined time has elapsed. In some embodiments, device 600 changes the appearance of time user interface 1204 (e.g., changes one or more facial features of the respective face in time user interface 1204) randomly and not based on when the predetermined time has elapsed. - In
FIG. 12C, face 1222 includes the same visual characteristics for eyes 1224, mouth 1228, facial outline 1232, and neck 1234 as face 1222 of FIG. 12B. In FIG. 12C, face 1222 includes different visual characteristics for nose 1226, hair 1230, and shoulders 1236 from face 1222 in FIG. 12B (e.g., nose 1226 has a different shape, and hair 1230 has a different fill color in FIG. 12C as compared to FIG. 12B). - In some embodiments, ceasing display of
face 1222 as in FIG. 12B and displaying (e.g., transitioning to) face 1222 as in FIG. 12C includes displaying a gradual transition from face 1222 in FIG. 12B to face 1222 in FIG. 12C that includes transitioning nose 1226, hair 1230, and shoulders 1236 from having their respective visual characteristics in FIG. 12B through a plurality of intermediate (e.g., temporary) states to a final state in which nose 1226, hair 1230, and shoulders 1236 have their respective visual characteristics in FIG. 12C.
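- The gradual transition through intermediate states can be sketched as a simple linear blend (an illustrative assumption; the disclosed animation need not be linear):

    struct RGB { var r: Double, g: Double, b: Double }

    // Step a feature's color from its old value to its new value through
    // `steps` intermediate states rather than switching in a single frame.
    func intermediateStates(from a: RGB, to b: RGB, steps: Int) -> [RGB] {
        guard steps > 0 else { return [b] }
        return (1...steps).map { i in
            let t = Double(i) / Double(steps)   // 0 < t <= 1
            return RGB(r: a.r + (b.r - a.r) * t,
                       g: a.g + (b.g - a.g) * t,
                       b: a.b + (b.b - a.b) * t)
        }
    }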
- FIG. 12D illustrates device 600 displaying an animation (e.g., a blinking animation) using eyes 1224, while displaying face 1222. In some embodiments, displaying the animation via eyes 1224 includes ceasing display of at least a portion of eyes 1224, as shown in FIG. 12D, for a period of time (e.g., a brief moment; a fraction of a second; 1 second), then re-displaying the portion of eyes 1224 (e.g., as previously shown in FIG. 12C) after the period of time has elapsed. In some embodiments, the animation is a blinking animation of eyes 1224 that includes a temporary/brief movement or change in shape/form of eyes 1224 such that the first facial feature mimics the movement of a human eye blinking. In some embodiments, device 600 periodically displays the animation via eyes 1224 based on time (e.g., every 1 second, every 10 seconds, every 15 seconds, every 30 seconds, every 1 minute; every 5 minutes; every 30 minutes; every hour). In some embodiments, device 600 displays the animation via eyes 1224 non-periodically (e.g., not based on time; not in regular intervals; at random times; not based on a periodic change in time). - While displaying
time user interface 1204 including face 1222 as shown in FIGS. 12C-12D, device 600 detects (e.g., determines) the satisfaction of a second predetermined criteria (e.g., a type of input; a change in activity state of device 600) for changing an appearance of time user interface 1204. In response to detecting the satisfaction of the second predetermined criteria for changing an appearance of time user interface 1204, device 600 ceases display of face 1222, as shown in FIGS. 12C-12D, and displays face 1222 as shown in FIG. 12E. - In
FIG. 12E, device 600 is in a different state (e.g., a reduced-power state) from FIGS. 12A-12D, in which device 600 changes one or more visual features of a displayed user interface while in the different state (e.g., device 600 dims/darkens the background or reverts from using a respective color to fill in a respective element/region of the user interface to using the respective color as an outline color of the respective element/region of the user interface). - In
FIG. 12E, eyes 1224 (e.g., still) indicate the current time. In some embodiments, device 600 displays an animation via eyes 1224 (e.g., based on a change in the time or non-periodically). - In
FIG. 12E, nose 1226 has a different visual characteristic than in FIGS. 12C-12D, where the different visual characteristic in FIG. 12E is a visually distinguished outline (e.g., borderline) for nose 1226, and the visually distinguished outline has a respective color (e.g., line color) that is based on a respective color used to fill nose 1226 in FIGS. 12C-12D (e.g., device 600 applies the color or tone (or a color similar to the color or tone) of the fill color of nose 1226 in FIGS. 12C-12D to the line color of nose 1226 in FIG. 12E). Similarly, mouth 1228, hair 1230, facial outline 1232, neck 1234, and shoulders 1236, respectively, have different visual characteristics than in FIGS. 12C-12D, where the respective different visual characteristics in FIG. 12E are visually distinguished outlines that have respective colors (e.g., line colors) that are based on (e.g., correspond to) respective colors used to fill (e.g., used as fill colors) mouth 1228, hair 1230, facial outline 1232, neck 1234, and shoulders 1236, respectively, in FIGS. 12C-12D (e.g., device 600 applies the color or tone (or a color similar to the color or tone) of the fill color of mouth 1228, hair 1230, facial outline 1232, neck 1234, and shoulders 1236, respectively, in FIGS. 12C-12D to the line color of mouth 1228, hair 1230, facial outline 1232, neck 1234, and shoulders 1236, respectively, in FIG. 12E).
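- Sketched abstractly (hypothetical types; colors reduced to names), the reduced-power restyling above moves each feature's color from its fill to its outline:

    struct FeatureStyle {
        var fillColor: String?   // nil when the feature is drawn unfilled
        var lineColor: String?   // outline color
    }

    // Reuse the fill color (or a close tone) as the outline color and drop
    // the fill so that only the outline remains lit.
    func reducedPowerStyle(_ style: FeatureStyle) -> FeatureStyle {
        FeatureStyle(fillColor: nil, lineColor: style.fillColor ?? style.lineColor)
    }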
- While displaying face 1222 as in FIGS. 12C-12D, device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204, device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIGS. 12C-12D and displaying face 1222 as in FIG. 12F. In FIG. 12F, the predetermined criteria for changing the appearance of time user interface 1204 includes a criterion that is satisfied when a predefined movement (e.g., of device 600) has been detected. In some embodiments, device 600 is a wearable device (e.g., a smartwatch), and the predefined movement criteria corresponds to a wrist-raise movement while device 600 is being worn. - In
FIG. 12F, face 1222 includes the same visual characteristics for eyes 1224, hair 1230, facial outline 1232, neck 1234, and shoulders 1236 as face 1222 of FIGS. 12C-12D. In FIG. 12F, face 1222 includes different visual characteristics (e.g., different color and/or different shape) for nose 1226 and mouth 1228 as compared to face 1222 in FIGS. 12C-12D. - While displaying
face 1222 as in FIG. 12F, device 600 detects (e.g., determines) the satisfaction of a predetermined criteria for changing an appearance of the time user interface. In some embodiments, in response to detecting the satisfaction of the predetermined criteria for changing an appearance of time user interface 1204, device 600 changes the appearance of time user interface 1204 by ceasing display of face 1222 as in FIG. 12F and displaying face 1222 as in FIG. 12G. In FIG. 12G, the predetermined criteria for changing the appearance of time user interface 1204 includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of device 600 has been detected (e.g., it is determined that device 600 has undergone a change in state). In some embodiments, the change in state corresponds to device 600 transitioning to a sleep mode or sleeping state. In some embodiments, the sleep mode or sleep state corresponds to a state in which the display generation component is off. In some embodiments, the sleep mode or sleep state corresponds to a state in which device 600 is in a low-power state (e.g., in which display 602 is off). In some embodiments, the change in state corresponds to device 600 transitioning from a locked state to an unlocked state. - In
FIG. 12G, face 1222 includes the same visual characteristics for eyes 1224, nose 1226, mouth 1228, hair 1230, neck 1234, and shoulders 1236 as face 1222 of FIG. 12F. In FIG. 12G, face 1222 includes a different visual characteristic for facial outline 1232 from face 1222 in FIG. 12F (e.g., facial outline 1232 has a different fill color in FIG. 12G than in FIG. 12F). - In some embodiments, face 1222 displayed in
time user interface 1204 has a primary color scheme (e.g., a predominant color; a most-prevalent color). In some embodiments, the primary color scheme corresponds to the color of the facial outline 1232. - In some embodiments, the color of
neck 1234 and/or the color of shoulders 1236 are based on the primary color scheme (e.g., neck 1234 is a slightly lighter shade of the color of facial outline 1232 or neck 1234 is a slightly darker shade of the color of facial outline 1232, as indicated in FIG. 12G). In some embodiments, the color of the second facial feature has a predetermined relationship to the color of facial outline 1232 for a plurality of different types of faces (e.g., face 1206; face 1222) (e.g., the neck is a predetermined amount lighter than the face for a plurality of different types of faces or the neck is a predetermined amount darker than the face for a plurality of different types of faces).
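- A minimal sketch of the predetermined neck/face relationship, assuming the neck color is derived by darkening the facial outline color by a fixed fraction (the fraction is an illustrative assumption):

    struct RGBColor { var r: Double, g: Double, b: Double }

    func neckColor(fromFace face: RGBColor, darkenBy fraction: Double = 0.15) -> RGBColor {
        let k = 1.0 - fraction
        return RGBColor(r: face.r * k, g: face.g * k, b: face.b * k)
    }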
- FIGS. 13A-13C are a flow diagram illustrating methods of enabling and displaying a user interface that includes an indication of a current time, in accordance with some embodiments. Method 1300 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component. Some operations in method 1300 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. - As described below,
method 1300 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - The computer system (e.g., 600) displays (1302), via the display generation component (e.g., 602), a time user interface (e.g., 1204) (e.g., a watch user interface that includes an indication of a current time) that includes a representation of a first face (e.g., 1206 or 1222) (e.g., a representation of a human face or a representation of an anthropomorphic face of a non-human character) having a first facial feature (e.g., 1208, 1224) (e.g., eyes) and a second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the first facial feature of the first face indicates a current time (e.g., the current time; the time set in the systems setting of the computer system) (1304), and the second facial feature of the first face has a first visual characteristic (e.g., a first color (e.g., a first line color or a first fill color); a first shape; a first position) (1306). Displaying the time user interface that includes the representation of the first face having the first facial feature and the second facial feature, where the first facial feature of the first face indicates a current time and the second facial feature of the first face has a first visual characteristic provides information about the current time while providing a user interface with features that do not relate to time, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by including time information in an animated user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the representation of the first face (e.g., 1206 or 1222) (1308), the computer system (e.g., 600) detects (e.g., determines) (1310) the satisfaction of a predetermined criteria for changing an appearance of the time user interface (e.g., 1204) (e.g., a change in the current time (e.g., a change in the hour of the current time, a change in the minute of the current time, a change in the second of the current time); a change in a state of the computer system due to a detected user input (e.g., a tap input on the display generation component) and the computer system displaying (or causing display of)/providing a response to the user input and/or performing an operation due to the user input; detecting a movement of the computer system (e.g., caused by a user movement, such as a wrist-raise movement); a change in state or a change in mode of the computer system (e.g., transitioning to a sleep mode or sleeping state; transitioning from a locked state to an unlocked state)).
- In response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface (e.g., 1204) (1318), the computer system (e.g., 600) ceases (1320) to display the representation of the first face (e.g., 1206 or 1222) and displays (1322) a representation of a second face (e.g., 1206, 1222) having a first facial feature (e.g., 1208 or 1224) (e.g., eyes) and a second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the representation of the second face is different from the representation of the first face (1324), the first facial feature of the second face indicates a current time (1326), and the second facial feature of the second face has a second visual characteristic (e.g., a second color (e.g., a second line color or a second fill color); a second shape) different from the first visual characteristic (1328), and ceasing display of the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the second facial feature of the second face has the second visual characteristic (1330). In some embodiments, the computer system displays or causes display of an animation via the first facial feature (e.g., blinking of the displayed time if the first facial feature represents eyes) based on a change in the time or non-periodically. Ceasing to display the representation of the first face and displaying the representation of the second face having the first facial feature and the second facial feature provides feedback to a user that a predetermined criteria for changing the appearance of the time user interface has been satisfied. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the first face (e.g., 1206 or 1222) has the first visual characteristic and a first additional visual characteristic (e.g., if the first visual characteristic is a first line color, then a first fill color, a first shape, or a first position; if the first visual characteristic is a first fill color, then a first line color, a first shape, or a first position; if the first visual characteristic is a first shape, then a first line color, a first fill color, or a first position; if the first visual characteristic is a first position, then a first line color, a first fill color, or a first shape) different from the first visual characteristic. Displaying the second facial feature of the first face to have the first visual characteristic and the first additional visual characteristic different from the first visual characteristic limits burn-in effects on the display generation component (e.g., 602) that may occur when an image with the same visual characteristic is constantly displayed, which in turn enhances the operability of the device and, by reducing display burn-in, increases the lifetime of the display generation component and improves the battery life of the device.
- In some embodiments, the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the second face (e.g., 1206 or 1222) has the second visual characteristic and a second additional visual characteristic (e.g., if the second visual characteristic is a second line color, then a second fill color, a second shape, or a second position; if the second visual characteristic is a second fill color, then a second line color, a second shape, or a second position; if the second visual characteristic is a second shape, then a second line color, a second fill color, or a second position; if the second visual characteristic is a second position, then a second line color, a second fill color, or a second shape) different from the second visual characteristic. Displaying the second facial feature of the second face to have the second visual characteristic and the second additional visual characteristic different from the second visual characteristic limits burn-in effects on the display generation component (e.g., 602) that may occur when an image with the same visual characteristic is constantly displayed, which in turn enhances the operability of the device and, by reducing display burn-in, increases the lifetime of the display generation component and improves the battery life of the device.
- In some embodiments, ceasing display of the representation of the first face (e.g., 1206, 1222) and displaying the representation of the second face (e.g., 1206, 1222) includes displaying a gradual transition from the first face to the second face that includes (e.g., concurrently/simultaneously with transitioning the second facial feature of the first face from having the first visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the second facial feature has the second visual characteristic) transitioning the second facial feature of the first face from having the first additional visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the second facial feature has the second additional visual characteristic. Changing a plurality of facial features (e.g., the first facial feature and the second facial feature) in response to detecting the satisfaction of the predetermined criteria for changing an appearance of the time user interface provides visual feedback that the predetermined criteria for changing an appearance of the time user interface has been satisfied. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first face (e.g., 1206 or 1222) has a third facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) (e.g., nose; mouth; hair; facial shape; neck; shoulders) different from the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the first face, wherein the third facial feature for the first face has a third visual characteristic (e.g., a third color (e.g., a third line color or a third fill color); a third shape; a third position). In some embodiments, the second face (e.g., 1206 or 1222) has a third facial feature (e.g., nose; mouth; hair; facial shape; neck; shoulders) different from the second facial feature of the second face, wherein the third facial feature for the second face has a fourth visual characteristic (e.g., a fourth color (e.g., a fourth line color or a fourth fill color); a fourth shape; a fourth position) different from the third visual characteristic. In some embodiments, ceasing display of the representation of the first face and displaying the representation of the second face includes displaying a gradual transition from the first face to the second face that includes transitioning the third facial feature of the first face from having the third visual characteristic through a plurality of intermediate (e.g., temporary) states to a final state in which the third facial feature has the fourth visual characteristic.
- In some embodiments, the predetermined criteria for changing the appearance of the time user interface (e.g., changing one or more facial features of the respective face in the time user interface) includes a criterion that is satisfied when a predetermined time has elapsed (e.g., every minute; every 15 minutes; every 30 minutes; every hour) (1312). In some embodiments, alternatively, the predetermined criteria for changing the appearance of the time user interface (e.g., 1204) (e.g., changing one or more facial features of the respective face in the time user interface) does not include the criterion that is satisfied when the predetermined time has elapsed. In some embodiments, the computer system (e.g., 600) changes the appearance of the time user interface (e.g., changes one or more facial features of the respective face in the time user interface) randomly and not based on when the predetermined time has elapsed. Ceasing to display the representation of the first face and displaying the representation of the second face having the first facial feature and the second facial feature in response to detecting the satisfaction of the predetermined criteria, where the predetermined criteria includes a criterion that is satisfied when a predetermined time has elapsed, provides visual feedback that the predetermined time has elapsed without requiring user input. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the predetermined criteria for changing the appearance of the time user interface (e.g., 1204) includes a criterion (e.g., a predefined movement criterion) that is satisfied when a predefined movement (e.g., of the computer system) has been detected (e.g., determined to have happened; resulting from a movement of the computer system (e.g., caused by a user of the computer system)) (1314). In some embodiments, the computer system is a wearable device (e.g., a smartwatch), and the predefined movement criterion corresponds to a wrist-raise movement while the computer system is being worn. Ceasing to display the representation of the first face and displaying the representation of the second face having the first facial feature and the second facial feature in response to detecting the satisfaction of the predetermined criteria, where the predetermined criteria includes a criterion that is satisfied when a predefined movement (e.g., of the computer system) has been detected, provides visual feedback that the predefined movement has been detected. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the predetermined criteria for changing the appearance of the time user interface includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of the computer system (e.g., 600) has been detected (e.g., it is determined that the computer system has undergone a change in state) (1316). In some embodiments, the change in state corresponds to the computer system transitioning to a sleep mode or sleeping state. In some embodiments, the sleep mode or sleep state corresponds to a state in which the display generation component is off. In some embodiments, the sleep mode or sleep state corresponds to a state in which the computer system is in a low-power state (e.g., in which the display generation component is also off). In some embodiments, the change in state corresponds to the computer system transitioning from a locked state to an unlocked state. Ceasing to display the representation of the first face and displaying the representation of the second face having the first facial feature and the second facial feature in response to detecting the satisfaction of the predetermined criteria, where the predetermined criteria includes a criterion that is satisfied when a change in state (e.g., a change in mode from one device state/mode to another device state/mode) of the computer system has been detected, provides visual feedback that a change in state of the computer system has been detected. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
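Items (1312), (1314), and (1316) describe three different triggering criteria: elapsed time, predefined movement, and a device state change. A hedged sketch of how such criteria might be evaluated follows; the enum, parameter names, and the choice to treat the criteria as alternatives (any one trigger suffices) are all assumptions on our part, not the patent's prescription.

```swift
import Foundation

/// Hypothetical triggers corresponding to the criteria described above.
enum FaceChangeTrigger {
    case predeterminedTimeElapsed(interval: TimeInterval)  // e.g., every 15 minutes
    case predefinedMovementDetected                        // e.g., a wrist raise
    case deviceStateChanged                                // e.g., waking from sleep, unlocking
}

/// Returns true when any configured criterion is satisfied, at which point the
/// system would cease displaying the first face and display the second face.
func shouldChangeFace(triggers: [FaceChangeTrigger],
                      timeSinceLastChange: TimeInterval,
                      wristRaised: Bool,
                      stateChanged: Bool) -> Bool {
    triggers.contains { trigger in
        switch trigger {
        case .predeterminedTimeElapsed(let interval):
            return timeSinceLastChange >= interval
        case .predefinedMovementDetected:
            return wristRaised
        case .deviceStateChanged:
            return stateChanged
        }
    }
}

// Example: a 15-minute timer plus wrist-raise detection.
let change = shouldChangeFace(
    triggers: [.predeterminedTimeElapsed(interval: 900), .predefinedMovementDetected],
    timeSinceLastChange: 950, wristRaised: false, stateChanged: false)
print(change)  // true: 950 s >= 900 s
```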
- In some embodiments, the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the second face (e.g., 1206 or 1222) has the second visual characteristic that is a first color used to fill the second facial feature of the second face (e.g., a background color or base color used to visually fill out the second facial feature of the second face). In some embodiments, while displaying the representation of the second face, the computer system (e.g., 600) detects (e.g., determining) the satisfaction of a second predetermined criteria (e.g., a type of input; a timeout of the computer system) for changing an appearance of the time user interface (e.g., 1204). In some embodiments, in response to detecting the satisfaction of the second predetermined criteria for changing an appearance of the time user interface, the computer system ceases to display the representation of the second face and displays a representation of a third face having a first facial feature of the third face (e.g., eyes) and a second facial feature of the third face (e.g., nose; mouth; hair; facial shape; neck; shoulders), wherein the representation of the third face is different from the representation of the second face, the first facial feature of the third face indicates a current time, and the second facial feature of the third face has a third visual characteristic (e.g., a third color (e.g., a third line color or a third fill color); a third shape) different from the second visual characteristic, wherein the third visual characteristic is a visually distinguished outline (e.g., borderline) for the second facial feature of the third face having a respective color that is based on (e.g., the same as; the same tone as; similar to) the first color used to fill the second facial feature of the second face. In some embodiments, the computer system displays or causes display of an animation via the first facial feature (e.g., blinking of the displayed time if the first facial feature represents eyes) based on a change in the time or non-periodically.
- In some embodiments, while displaying the representation of the second face (e.g., 1206, or 1222) having the first facial feature (e.g., 1208 or 1224) and the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) in the time user interface (e.g., 1204), the computer system (e.g., 600) displays, via the first facial feature of the second face, an animation (e.g., a blinking animation) that includes ceasing display of at least a portion of the first facial feature of the second face for a period of time, and re-displaying the at least a portion of the first facial feature of the second face after the period of time has elapsed. In some embodiments, the animation is a blinking animation of the first facial feature that includes a temporary/brief movement or change in shape/form of the first facial feature such that the first facial feature mimics the movement of a human eye blinking. In some embodiments, the computer system periodically, based on time, (e.g., every 1 minute; every 5 minutes; every 30 minutes; every hour) displays the animation (e.g., blinking animation). Providing a blinking animation via the first facial feature (e.g., periodically, based on time) provides visual feedback about the change in time in an intuitive manner. Providing improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first facial feature (e.g., 1208, 1224) is an indication of a current time and the animation is a blinking animation where the current time is animated to look like blinking eyes (e.g., the hour and minute indicators are compressed vertically and then expand vertically).
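As a rough illustration of the blink described above (the hour and minute digits compressed vertically, then re-expanded), here is a hypothetical SwiftUI sketch. The view name, timings, and trigger are invented for illustration; the patent does not specify an implementation, and a real one might blink on each minute change or at random intervals instead.

```swift
import SwiftUI

/// A hypothetical sketch of time digits that "blink" by compressing vertically
/// toward their center and then expanding back, mimicking eyelids closing.
struct BlinkingTimeView: View {
    let timeText: String          // e.g., "10:09"
    @State private var blinking = false

    var body: some View {
        Text(timeText)
            .font(.system(size: 72, weight: .bold, design: .rounded))
            // Compress only the vertical axis while blinking.
            .scaleEffect(x: 1.0, y: blinking ? 0.1 : 1.0, anchor: .center)
            .animation(.easeInOut(duration: 0.15), value: blinking)
            .onAppear {
                // Blink once shortly after appearing; assumed timing.
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    blinking = true
                    DispatchQueue.main.asyncAfter(deadline: .now() + 0.15) {
                        blinking = false
                    }
                }
            }
    }
}
```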
- In some embodiments, displaying, via the first facial feature (e.g., 1208, 1224) of the second face (e.g., 1206, 1222), the animation (e.g., blinking) includes non-periodically (e.g., not in regular intervals; at random times; not based on a periodic change in time) displaying, via the first facial feature of the second face, the animation.
- In some embodiments, the second face (e.g., 1206 or 1222) (e.g., the main face portion of the second face) includes a primary color scheme (e.g., a predominant color; a most-prevalent color). In some embodiments, the second visual characteristic for the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) (e.g., the neck; the neck and shoulder) of the second face is a second color that is based on (e.g., is the same as; is a similar tone as; is within a range of color variants of) the primary color scheme (e.g., the neck is a slightly lighter shade of the color of the face or the neck is a slightly darker shade of the color of the face) (1332). In some embodiments, the color of the second facial feature has a predetermined relationship to the color of the first facial feature for a plurality of different faces (e.g., the neck is a predetermined amount lighter than the face for a plurality of faces or the neck is a predetermined amount darker than the face for a plurality of faces).
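The shade relationship described above (e.g., a neck that is a predetermined amount lighter or darker than the face color) can be illustrated with a small brightness shift in HSB space. This is a hypothetical sketch; the HSB type and the 10% offset are assumptions, not values from the patent.

```swift
import Foundation

/// Hypothetical sketch: derive a secondary feature color (e.g., the neck) from
/// a face's primary color scheme by shifting brightness a predetermined amount.
struct HSB {
    var hue, saturation, brightness: Double  // each in 0...1

    /// A slightly lighter (positive amount) or darker (negative amount) shade.
    func shifted(brightnessBy amount: Double) -> HSB {
        HSB(hue: hue,
            saturation: saturation,
            brightness: min(max(brightness + amount, 0), 1))
    }
}

let faceColor = HSB(hue: 0.08, saturation: 0.6, brightness: 0.8)
// Keep a fixed relationship to the face color across faces; here the neck is
// an assumed 10% darker shade of the face's primary color.
let neckColor = faceColor.shifted(brightnessBy: -0.1)
print(neckColor)
```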
- In some embodiments, the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the second face (e.g., 1206 or 1222) is selected from the group consisting of: hair, facial outline (e.g., including cheeks and/or jawline), nose, eyes, mouth (e.g., lips), neck, and shoulders (1334).
- In some embodiments, the second visual characteristic for the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the first face (e.g., 1206 or 1222) is a third color, and the second visual characteristic for the second facial feature (e.g., 1210, 1212, 1214, 1216, 1218, 1220, 1226, 1228, 1230, 1232, 1234, or 1236) of the second face (e.g., 1206 or 1222) is a fourth color different from the third color, wherein the fourth color is programmatically selected (e.g., determined), without user input, from a plurality of available colors by the computer system (e.g., 600) (1336). In some embodiments, the application process selects (e.g., programmatically determines) the fourth color based on a color of the computer system (e.g., a color of a housing or case of the computer system). In some embodiments, the application process selects (e.g., programmatically determines) the fourth color based on usage history of a user of the computer system (e.g., based on a previous user-selected color or color scheme). Programmatically selecting, without user input, colors for facial features of a displayed face provides a diverse range of characteristics that are displayed via the time user interface without requiring user input to enable the diverse range of characteristics. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
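A minimal sketch of the programmatic color selection described above, assuming (as the embodiments suggest) that the system may prefer a color matching the device housing, then a color from the user's usage history, then fall back to a random palette entry. All names and the priority order are our assumptions.

```swift
import Foundation

/// Hypothetical inputs the system might consult when programmatically
/// selecting a feature color without user input.
struct ColorSelectionContext {
    var housingColor: String?             // device case color, if known
    var previouslyChosenColors: [String]  // usage history
    var palette: [String]                 // the plurality of available colors
}

/// Picks a color from the palette: prefer a palette entry matching the device
/// housing, then the user's most recently chosen color, then a random entry.
func selectColor(context: ColorSelectionContext) -> String? {
    if let housing = context.housingColor,
       context.palette.contains(housing) {
        return housing
    }
    if let recent = context.previouslyChosenColors.last,
       context.palette.contains(recent) {
        return recent
    }
    return context.palette.randomElement()
}
```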
- Note that details of the processes described above with respect to method 1300 (e.g., FIGS. 13A-13C) are also applicable in an analogous manner to the methods described above and below. For example, method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300. For example, a device can use as a watch user interface either a watch user interface as described in FIGS. 6A-6H or a time user interface as described in FIGS. 12A-12G. For another example, method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300. For example, a device can use as a watch user interface either a watch user interface as described in FIGS. 8A-8M or a time user interface as described in FIGS. 12A-12G. For another example, method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300. For example, a device can use as a watch user interface either a user interface with the indication of time and the graphical representation of a respective character as described in FIGS. 10A-10AC or a time user interface as described in FIGS. 12A-12G. For another example, method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300. For example, a device can use as a watch user interface either a user interface with a background as described in FIGS. 14A-14AD or a time user interface as described in FIGS. 12A-12G. For another example, method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1300. For example, one or more characteristics or features of a time user interface as described with reference to FIGS. 12A-12G can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE. For brevity, these details are not repeated below.
- FIGS. 14A-14AD illustrate exemplary user interfaces for enabling configuration of a background for a user interface, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 15A-15F.
- FIG. 14A illustrates device 600 displaying, via display 602, a first page (indicated by paging dot 1410) of an editing user interface 1406 for editing a respective user interface that includes content overlaid on the background. In some embodiments, the respective user interface is available to be used as a watch user interface on device 600 (e.g., a watch face that includes an indication of time and one or more watch complications overlaid on the background). In some embodiments, the user interface is a watch user interface, and the content includes an indication of the current time or current date. In some embodiments, editing user interface 1406 includes a plurality of pages that can be navigated, where a respective page enables editing of a different feature of a user interface, as described in greater detail below.
- In FIG. 14A, editing user interface 1406 includes a background 1408 for a respective user interface, where background 1408 comprises a plurality of stripes (e.g., graphical lines across the background in a vertical or horizontal direction) including a stripe 1408A and a stripe 1408B. Stripe 1408A has a first visual characteristic (e.g., a first color; a first fill pattern) and stripe 1408B has a second visual characteristic (e.g., a second color; a second fill pattern) different from the first visual characteristic. In FIG. 14A, stripes 1408A and 1408B are arranged in a first visual pattern of stripes (e.g., a first type of alternating color pattern, such as a repeating 2-color pattern).
- In FIG. 14A, while displaying first page 1410 of editing user interface 1406, device 600 receives (e.g., detects) an input 1401 for changing the current page of editing user interface 1406. In some embodiments, input 1401 includes a gesture (e.g., a horizontal swipe on display 602 in a first direction). In response to receiving input 1401, device 600 displays a second page (indicated by paging dot 1412) of editing user interface 1406, as shown in FIG. 14B, where second page 1412 of editing user interface 1406 can be used to change a number of stripes (e.g., increase the number of stripes; decrease the number of stripes) of background 1408.
- In FIG. 14B, while displaying second page (indicated by paging dot 1412) of editing user interface 1406 with background 1408 having stripes 1408A-1408B arranged in the first visual pattern of stripes, device 600 receives (e.g., detects) an input 1403 directed to changing (e.g., increasing) the number of stripes of background 1408, as shown in FIGS. 14B-14E. In some embodiments, input 1403 is a rotational input in a first direction (e.g., clockwise; up) on rotatable input mechanism 603 shown in FIGS. 14B-14E. In some embodiments, input 1403 is a touch input such as a swipe or pinch input.
- In FIGS. 14B-14E, in response to (e.g., and while) receiving input 1403, device 600 displays an increase in the number of stripes for background 1408. The new stripes maintain the initial visual pattern of stripes.
- In FIG. 14C, in response to (e.g., and while) receiving input 1403, device 600 includes stripe 1408C in background 1408 (e.g., below stripe 1408B), where stripe 1408C moves onto display 602 from an edge (e.g., bottom edge) of display 602. In FIG. 14C, stripe 1408C has a same visual characteristic (e.g., color; fill pattern) as stripe 1408A. Device 600 decreases a size of displayed stripes (e.g., decreases the height or width) as a new stripe is added to background 1408.
- In FIGS. 14D-14E, in response to (e.g., and while) continuing to receive input 1403, device 600 includes stripe 1408D in background 1408 (e.g., below stripe 1408C), where stripe 1408D moves onto display 602 from the same edge of display 602 as stripe 1408C. In FIGS. 14D-14E, stripe 1408D has a same visual characteristic as stripe 1408B (e.g., the same color and/or fill pattern as stripe 1408B). Device 600 automatically maintains the first visual pattern of stripes (e.g., alternating between two colors) as new stripes are added to background 1408. Device 600 continues to decrease the size of displayed stripes as new stripes are added to background 1408.
- After FIG. 14E, device 600 continues receiving input 1403 and responds by increasing the number of stripes until twelve stripes 1408A-1408L are included in background 1408, while maintaining the first visual pattern, as shown in FIG. 14F.
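The add-stripe behavior in FIGS. 14B-14F (new stripes repeat the existing pattern; every stripe shrinks so the set still fills the display) can be modeled compactly. The following Swift sketch is a hypothetical model of that behavior, not code from the patent; the display height is an arbitrary example value.

```swift
import Foundation

/// Hypothetical model of the stripe editor shown in FIGS. 14B-14F. The base
/// pattern (e.g., the repeating 2-color pattern of stripes 1408A-1408B) is
/// extended by modulo indexing as stripes are added, and each stripe's height
/// is the display height divided by the stripe count.
struct StripeBackground {
    var pattern: [String]        // e.g., ["yellow", "blue"]
    var count: Int               // current number of stripes
    let displayHeight: Double    // points; assumed, not from the patent

    /// Color of the stripe at `index`, repeating the base pattern.
    func color(at index: Int) -> String {
        pattern[index % pattern.count]
    }

    /// Each stripe shrinks as stripes are added and grows as they are removed.
    var stripeHeight: Double {
        displayHeight / Double(count)
    }
}

var background = StripeBackground(pattern: ["yellow", "blue"],
                                  count: 2, displayHeight: 200)
background.count = 12             // as in FIG. 14F
print(background.color(at: 11))   // "blue": the 2-color pattern is maintained
print(background.stripeHeight)    // ~16.7: stripes narrowed to fit
```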
- FIG. 14F illustrates device 600 displaying, in second page 1412 of editing user interface 1406, background 1408 with stripes 1408A-1408L arranged in the first visual pattern of stripes. While displaying second page 1412 of editing user interface 1406 with background 1408 having stripes 1408A-1408L arranged in the first visual pattern of stripes, device 600 receives (e.g., detects) an input 1405 directed to changing (e.g., decreasing) the number of stripes of background 1408, as shown in FIG. 14F. In some embodiments, input 1405 has a direction (e.g., counter-clockwise; down) that is opposite of a direction of input 1403. In the embodiment illustrated in FIG. 14F, input 1405 is a rotational input on rotatable input mechanism 603 in a direction opposite the direction of input 1403. In some embodiments, input 1405 is a touch input such as a swipe or pinch input.
- In response to receiving input 1405, device 600 displays, in editing user interface 1406, a decrease in the number of stripes for background 1408, where existing stripes move off of display 602 at the edge of display 602 (e.g., at the bottom of display 602). Device 600 increases the size of remaining stripes (e.g., increases the height or width) as a stripe is removed from background 1408.
- As shown in FIG. 14G, in response to receiving input 1405, device 600 displays background 1408 with eight stripes 1408A-1408H, where stripes 1408A-1408H maintain the first visual pattern of stripes as in FIG. 14F.
- In FIG. 14G, while displaying second page 1412 of editing user interface 1406 with background 1408 having stripes 1408A-1408H arranged in the first visual pattern of stripes, device 600 receives (e.g., detects) an input 1407 directed to selecting stripe 1408D. In some embodiments, input 1407 includes a tap input on stripe 1408D. In some embodiments, input 1407 includes a tap-and-hold input on stripe 1408D.
- In some embodiments, in response to receiving input 1407, device 600 changes the current page in editing user interface 1406 to a third page (indicated by paging dot 1414) of editing user interface 1406, as shown in FIG. 14H. Third page 1414 provides an editing mode for changing a visual characteristic, such as a color, of the selected stripe.
- In response to receiving input 1407 (e.g., and while displaying editing user interface 1406 in third page 1414), device 600 displays a visual indicator 1416 (e.g., a box) indicating that stripe 1408D has been selected (via input 1407). In some embodiments, visual indicator 1416 includes an indication 1418 of a current visual characteristic (e.g., the color) applied to the selected stripe.
- In FIG. 14H, while displaying editing user interface 1406 with visual indicator 1416 indicating that stripe 1408D of background 1408 has been selected, device 600 receives (e.g., detects) an input 1409 directed to changing the current visual characteristic applied to stripe 1408D. In some embodiments, input 1409 is a rotational input on rotatable input mechanism 603 shown in FIG. 14H. In some embodiments, input 1409 is a touch input such as a swipe or pinch input.
- In response to (e.g., and while) receiving input 1409, device 600 navigates (e.g., scrolls) through a plurality of selectable visual characteristics (e.g., selectable colors). While the selectable visual characteristics are being navigated, different selectable visual characteristics are applied to stripe 1408D and indicated via indication 1418 of visual indicator 1416 (e.g., the color of stripe 1408D and indication 1418 are updated during navigation to reflect the currently-selected visual characteristic).
- In FIG. 14I, in response to (e.g., and while) receiving input 1409, device 600 changes the respective visual characteristic applied to stripe 1408D to a third visual characteristic (e.g., a third color; a third fill pattern) different from the second visual characteristic and indicates, via indication 1418 of visual indicator 1416, that the third visual characteristic is the currently-selected visual characteristic.
- After FIG. 14I, device 600 continues detecting input 1409 directed to changing the current visual characteristic applied to stripe 1408D until device 600 changes the respective visual characteristic applied to stripe 1408D to a fourth visual characteristic (e.g., a fourth color; a fourth fill pattern), different from the second visual characteristic and the third visual characteristic, and indicates, via indication 1418 of visual indicator 1416, that the fourth visual characteristic is the currently-selected visual characteristic, as shown in FIG. 14J.
- In FIG. 14J, while displaying stripe 1408D of background 1408 with the fourth visual characteristic applied, device 600 receives (e.g., detects) an input 1411. Input 1411 is first detected at a location on display 602 corresponding to stripe 1408D and is moved towards a location on display 602 corresponding to stripe 1408G, where stripe 1408G has a different visual characteristic from stripe 1408D. In some embodiments, input 1411 is a touch-and-drag input from stripe 1408D to stripe 1408G.
- In response to detecting input 1411, device 600 displays stripe 1408G with the visual characteristic of stripe 1408D (e.g., the visual characteristic from stripe 1408D is applied to stripe 1408G), as shown in FIG. 14K, and moves visual indicator 1416 to stripe 1408G from stripe 1408D. As shown in FIG. 14K, visual indicator 1416 indicates that stripe 1408G has been selected (via input 1411) and indication 1418 indicates that the visual characteristic of stripe 1408D has been applied to stripe 1408G.
- In FIG. 14K, while displaying third page 1414 of the editing user interface, device 600 receives (e.g., detects) an input 1413 directed to returning editing user interface 1406 to second page 1412 (e.g., the editing mode for changing the number of stripes in the background). In some embodiments, input 1413 includes a gesture (e.g., a horizontal swipe on display 602 in a direction opposite a direction of input 1407). In response to receiving input 1413, device 600 displays second page 1412 of editing user interface 1406, as shown in FIG. 14L.
- In FIG. 14L, background 1408 includes stripes 1408A-1408H, where stripes 1408A-1408H form a second visual pattern of stripes (e.g., an eight-color pattern in which each stripe can have an individually designated color, including the edited visual characteristics of stripes 1408D and 1408G).
- In FIG. 14L, while displaying background 1408 with stripes 1408A-1408H, device 600 receives (e.g., detects) an input 1415 directed to changing (e.g., decreasing) the number of stripes in background 1408. In some embodiments, input 1415 is a rotational input on rotatable input mechanism 603 shown in FIG. 14L. In some embodiments, input 1415 is a touch input such as a swipe or pinch input.
- In response to receiving (e.g., detecting) input 1415 directed to decreasing the number of stripes of background 1408, where input 1415 is in the second direction (e.g., a counter-clockwise direction; a down direction), device 600 displays a decrease in the number of stripes for background 1408. Existing stripes move off of display 602 at the edge of display 602 (e.g., at the bottom of display 602). Device 600 increases the size of remaining stripes (e.g., increases the height or width) as a stripe is removed from background 1408.
- In response to (e.g., after) receiving input 1415, device 600 displays background 1408 with four remaining stripes 1408A-1408D, as shown in FIG. 14M, as stripes 1408E-1408H have been removed from background 1408 by input 1415.
- In FIG. 14M, background 1408 includes stripes 1408A-1408D, where stripes 1408A-1408D are arranged in a third visual pattern of stripes (e.g., a third type of alternating color pattern, such as a repeating 4-color pattern), where stripe 1408A and stripe 1408C have the first visual characteristic (e.g., the first color; the first fill pattern), stripe 1408B has the second visual characteristic, and stripe 1408D has the fourth visual characteristic.
- In FIG. 14M, while displaying background 1408 with stripes 1408A-1408D, device 600 receives (e.g., detects) an input 1417 directed to changing (e.g., increasing) the number of stripes in background 1408. In some embodiments, input 1417 is a rotational input on rotatable input mechanism 603 shown in FIG. 14M. In some embodiments, input 1417 is a touch input such as a swipe or pinch input.
- In response to receiving input 1417, where input 1417 is in the first direction (e.g., a clockwise direction; an up direction), device 600 displays an increase in the number of stripes for background 1408, where stripes are moved onto display 602 from the edge of display 602 (e.g., at the bottom of display 602). Device 600 decreases the size of stripes (e.g., decreases the height or width) as a stripe is added to background 1408.
- In response to (e.g., after) receiving input 1417, device 600 displays background 1408 in editing user interface 1406 with eight stripes 1408A-1408H, as shown in FIG. 14N, where stripes 1408A-1408H have the second visual pattern of stripes as first described above with reference to FIG. 14L (e.g., instead of maintaining the four-stripe visual pattern of stripes shown in FIG. 14M).
- In some embodiments, in response to receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M) after receiving an input directed to decreasing the number of stripes (e.g., input 1415 of FIG. 14L), device 600 maintains the visual pattern of stripes from when the input directed to decreasing the number of stripes (e.g., input 1415 of FIG. 14L) was first detected. In some embodiments, in accordance with detecting one or more inputs (e.g., input 1415 in FIG. 14L, then input 1417 in FIG. 14M) directed to decreasing, then increasing, the number of stripes, device 600 maintains the visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L) from prior to the one or more inputs being received.
- In some embodiments, in response to receiving an input directed to decreasing the number of stripes (e.g., input 1415 in FIG. 14L), and subsequently receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M), device 600 re-displays stripes (e.g., stripes 1408E-1408H) in the background to include the same visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L) from prior to the inputs being received if no other inputs are received by device 600 between receiving the two respective inputs (e.g., between receiving input 1415 and input 1417). For example, if there were no intervening operations received by device 600 between displaying background 1408 with the second visual pattern of stripes as in FIG. 14L and receiving the input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M), device 600 re-displays stripes (e.g., stripes 1408E-1408H) in background 1408 to include the same visual pattern of stripes.
- In some embodiments, in accordance with receiving an input directed to decreasing the number of stripes (e.g., input 1415 in FIG. 14L), and subsequently receiving an input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M), device 600 does not re-display stripes (e.g., stripes 1408E-1408H) in the background to include the same visual pattern of stripes (e.g., the second visual pattern of stripes as in FIG. 14L) from prior to the inputs being received (e.g., detected) if another input directed to performing an operation that does not include changing the number of stripes of the background is received by device 600 between receiving the two respective inputs (e.g., between receiving (e.g., detecting) input 1415 and input 1417). For example, if there is an intervening operation received by device 600 between displaying background 1408 with the second visual pattern of stripes as in FIG. 14L and receiving the input directed to increasing the number of stripes (e.g., input 1417 in FIG. 14M), device 600 does not re-display stripes (e.g., stripes 1408E-1408H) in background 1408 to include the same visual pattern of stripes. In some embodiments, performing the operation includes displaying a user interface different from editing user interface 1406. In some embodiments, performing the operation includes editing a different aspect/feature of background 1408 (e.g., in a different page of editing user interface 1406) than changing the number of stripes of background 1408 (e.g., editing features of a watch face, such as watch face style or watch complications).
- In some embodiments, if an input directed to performing an operation that does not include changing the number of stripes of the background is received by device 600 between receiving input 1415 to decrease the number of stripes and, subsequently, receiving input 1417 to increase the number of stripes, device 600 displays stripes 1408E-1408H such that the third visual pattern of stripes 1408A-1408D as in FIG. 14M (when the number of stripes is decreased) is extended to stripes 1408A-1408H (when the number of stripes is increased).
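One plausible way to implement the restore-versus-extend behavior described in the last few paragraphs is to cache the colors of removed stripes and drop that cache whenever an unrelated edit occurs. The Swift sketch below is our own reading of the described behavior, not an implementation from the patent; all names are hypothetical.

```swift
/// Hypothetical sketch: colors of removed stripes are cached so they can be
/// restored if the user immediately adds stripes back (FIG. 14N), but the
/// cache is dropped once an unrelated edit intervenes, in which case adding
/// stripes extends the currently visible pattern instead.
struct StripeEditor {
    var stripes: [String]                 // visible stripe colors
    private var removed: [String] = []    // cached colors of removed stripes
    private var patternLength: Int        // repeat length used when cache is empty

    init(stripes: [String]) {
        self.stripes = stripes
        self.patternLength = stripes.count
    }

    mutating func decreaseCount(to newCount: Int) {
        // Assumes 0 < newCount <= stripes.count.
        removed = Array(stripes[newCount...]) + removed
        stripes.removeSubrange(newCount...)
    }

    mutating func increaseCount(to newCount: Int) {
        while stripes.count < newCount {
            if let restored = removed.first {
                removed.removeFirst()
                stripes.append(restored)   // same pattern as before (FIG. 14N)
            } else {
                // Cache gone: extend the current pattern (as at FIG. 14M's
                // third visual pattern being extended to eight stripes).
                stripes.append(stripes[stripes.count % patternLength])
            }
        }
    }

    /// Any operation other than changing the stripe count drops the cache.
    mutating func performUnrelatedEdit() {
        removed = []
        patternLength = stripes.count
    }
}
```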
- In FIG. 14N, while displaying stripes 1408A-1408H in background 1408 with the second visual pattern of stripes, device 600 receives (e.g., detects) an input 1419 directed to changing the current page of editing user interface 1406 to a fourth page (indicated by paging dot 1420) (e.g., an editing mode for rotating the background). In some embodiments, input 1419 includes a gesture (e.g., a horizontal swipe on display 602). In response to receiving input 1419, device 600 displays fourth page 1420 of editing user interface 1406, as shown in FIG. 14O.
- While displaying fourth page 1420 of editing user interface 1406 with background 1408 including stripes 1408A-1408H arranged in the second visual pattern of stripes, device 600 receives (e.g., detects) an input 1421 directed to rotating the stripes of background 1408. In some embodiments, input 1421 is a rotational input on rotatable input mechanism 603 shown in FIGS. 14O-14P. In some embodiments, input 1421 is a touch input such as a swipe, twist, or pinch input.
- In FIG. 14P, in response to (e.g., and while) receiving input 1421, device 600 rotates stripes 1408A-1408H of background 1408 in accordance with input 1421 (e.g., background 1408 is rotated with the center of display 602 as the axis point for rotation). In some embodiments, if input 1421 is a rotational input in a clockwise direction, stripes 1408A-1408H of background 1408 are rotated in the clockwise direction. In some embodiments, if input 1421 is a rotational input in a counter-clockwise direction, stripes 1408A-1408H of background 1408 are rotated in the counter-clockwise direction. In some embodiments, stripes 1408A-1408H of background 1408 maintain a straight shape while being rotated, as shown in FIG. 14P.
- In some embodiments, rotating background 1408 includes rotating background 1408 by predefined rotational increments (e.g., by 10 degree increments; by 15 degree increments; by 30 degree increments) with respect to a rotational axis point (e.g., the center of display 602). In some embodiments, rotating background 1408 includes changing (e.g., increasing; decreasing) a characteristic (e.g., thickness; size; area) of stripes 1408A-1408H of background 1408 as the background is being rotated in accordance with the input directed to rotating the stripes (e.g., input 1421).
- In response to (e.g., after) detecting input 1421, device 600 displays stripes 1408A-1408H of background 1408 rotated from a horizontal orientation, as in FIG. 14P, to a vertical orientation, as in FIG. 14Q. In some embodiments, stripes 1408A-1408H can be rotated to an intermediary angle between the horizontal and vertical orientations (e.g., by 1 degree increments, 2 degree increments, 5 degree increments, 10 degree increments, 15 degree increments, or 30 degree increments).
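Where rotation proceeds by predefined increments, accumulated crown rotation can simply be snapped to the nearest increment. A hypothetical Swift sketch follows; the function name and the normalization convention are ours, not the patent's.

```swift
import Foundation

/// Hypothetical sketch: map accumulated rotational input to a stripe angle,
/// snapped to a predefined increment and normalized into [0, 360).
func snappedAngle(accumulatedDegrees: Double, increment: Double) -> Double {
    let snapped = (accumulatedDegrees / increment).rounded() * increment
    let normalized = snapped.truncatingRemainder(dividingBy: 360)
    return normalized < 0 ? normalized + 360 : normalized
}

print(snappedAngle(accumulatedDegrees: 47, increment: 15))  // 45.0
print(snappedAngle(accumulatedDegrees: 88, increment: 15))  // 90.0
```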
- In FIG. 14Q, while displaying stripes 1408A-1408H of background 1408 in the vertical orientation, device 600 receives (e.g., detects) an input 1423 directed to exiting editing user interface 1406. In some embodiments, input 1423 is directed to rotatable input mechanism 603 (e.g., a press input or a press-and-hold input at rotatable input mechanism 603), as in FIG. 14Q. In some embodiments, input 1423 is a touch input (e.g., a tap-and-hold input) on display 602.
- In response to receiving input 1423 while displaying background 1408 as in FIG. 14Q, device 600 displays a user interface 1422 (e.g., a watch user interface) that includes background 1408 with stripes 1408A-1408H as the background of the user interface. In some embodiments, user interface 1422 is a watch user interface that includes background 1408 with stripes 1408A-1408H as the background of the watch user interface and an indication of time 1424 overlaid on background 1408.
- At FIG. 14R, electronic device 600 detects user input 1426 (e.g., a tap and hold gesture) on user interface 1422. In response to detecting user input 1426, electronic device 600 displays user interface 1428, as shown at FIG. 14S. At FIG. 14S, user interface 1428 includes representation 1430 of background 1408, watch user interface type indicator 1432 (e.g., "Stripes"), share affordance 1434, and edit affordance 1436. Representation 1430 of background 1408 includes stripes 1408A-1408H arranged in the vertical orientation and/or having a fourth visual pattern. In some embodiments, electronic device 600 is configured to display representations of different backgrounds for user interface 1422 and/or representations of additional user interfaces (e.g., different from user interface 1422) in response to detecting rotational input on rotatable input mechanism 603. At FIG. 14S, electronic device 600 detects user input 1438 (e.g., a tap gesture) corresponding to selection of edit affordance 1436. In response to detecting user input 1438, electronic device 600 displays editing user interface 1440 (e.g., a modified version of editing user interface 1406), at FIG. 14T.
- At FIG. 14T, a first page of editing user interface 1440 includes representation 1430 of background 1408, first editing feature indicator 1442 (e.g., "Style"), second editing feature indicator 1444 (e.g., "Color"), and first style indicator 1446 (e.g., "Full Screen"). Representation 1430 of background 1408 includes stripes 1408A-1408H in the vertical orientation and/or having the fourth visual pattern. First editing feature indicator 1442 corresponds to a currently selected editing feature for background 1408 (e.g., "Style"), as indicated by first editing feature indicator 1442 being centered on display 602 and above representation 1430. At FIG. 14T, the currently selected editing feature relates to a format of a border (e.g., a shape of the border) in which background 1408 will be displayed on user interface 1422. First style indicator 1446 provides a first option for the currently selected editing feature and indicates the option as full screen (e.g., a border having a rectangular shape). In response to detecting selection of the full screen option (e.g., via a tap gesture or press gesture on rotatable input mechanism 603), electronic device 600 displays background 1408 in a full screen mode on display 602 (e.g., background 1408 occupies all or substantially all of display 602 and is displayed within a border having a shape of display 602, such as a rectangular shape or a square shape).
- At FIG. 14T, electronic device 600 detects rotational input 1448 on rotatable input mechanism 603. In response to detecting rotational input 1448, electronic device 600 displays the first page of editing user interface 1440 with representation 1450 and second style indicator 1452 (e.g., "Circle"), as shown at FIG. 14U. Second style indicator 1452 corresponds to a second option for the currently selected editing feature and indicates the option as a circular mask (e.g., displaying background 1408 within a border having a circular shape). In some embodiments, the circular mask does not occupy the full screen of display 602. In response to detecting selection of the circular mask option (e.g., via a tap gesture or press gesture on rotatable input mechanism 603), electronic device 600 displays background 1408 within a circular shaped border on a portion of display 602. At FIG. 14U, representation 1450 of background 1408 maintains the vertical orientation of stripes 1408A-1408H in the circular shaped border. In some embodiments, in response to detecting rotational input 1448, electronic device 600 adjusts a size (e.g., a width and/or a thickness) of stripes 1408A-1408H displayed in representation 1450 of background 1408 (as compared to representation 1430) to enable stripes 1408A-1408H to fit within the circular shaped border of representation 1450. For example, in some embodiments, electronic device 600 reduces the size (e.g., the width and/or the thickness) of stripes 1408A-1408H displayed in representation 1450 (as compared to representation 1430) because the circular shaped border of representation 1450 includes a smaller width than the rectangular border of representation 1430.
- At FIG. 14U, electronic device 600 detects user input 1454 (e.g., a swipe gesture) on editing user interface 1440. In response to detecting user input 1454, electronic device 600 displays a second page of editing user interface 1440 for editing a second feature of background 1408, as shown at FIG. 14V. At FIG. 14V, electronic device 600 displays the second page of editing user interface 1440 for editing the second feature of background 1408, as indicated by second editing feature indicator 1444 being centered on display 602 above representation 1450. Additionally, electronic device 600 displays third editing feature indicator 1456 (e.g., "Position") in response to detecting user input 1454 (e.g., electronic device 600 translates first editing feature indicator 1442, second editing feature indicator 1444, and third editing feature indicator 1456 in a direction associated with movement of user input 1454). The second page of editing user interface 1440 corresponds to an ability to adjust a color of one or more stripes 1408A-1408H of background 1408. At FIG. 14V, electronic device 600 displays indication 1416 around stripe 1408A indicating that stripe 1408A is selected for editing. Additionally, electronic device 600 displays indication 1418 indicating a current color of stripe 1408A that is selected for editing (e.g., "White"). As set forth above, electronic device 600 adjusts the color of stripe 1408A in response to detecting rotational input on rotatable input mechanism 603. For instance, the second page of editing user interface 1440 includes color selection element 1458, which includes indicators 1458A-1458D corresponding to different colors that may be designated to stripe 1408A (or another selected stripe 1408B-1408H).
- Electronic device 600 is configured to adjust and/or change a position of indicator 1416 from stripe 1408A to one of stripes 1408B-1408H in response to detecting a tap gesture on one of stripes 1408B-1408H. At FIG. 14V (e.g., in response to detecting input 1454), representation 1450 of background 1408 is rotated when compared to representation 1450 of FIG. 14U so that stripes 1408A-1408H are in a horizontal orientation (e.g., stripes 1408A-1408H extend between the left and right sides of display 602). As discussed below with reference to FIGS. 14AB and 14AC, in some embodiments, displaying representation 1450 such that stripes 1408A-1408H are in the horizontal orientation facilitates a user's ability to accurately select a particular stripe.
- At FIG. 14V, electronic device 600 detects user input 1460 (e.g., a swipe gesture) on editing user interface 1440. In response to detecting user input 1460, electronic device 600 displays a third page of editing user interface 1440, as shown at FIG. 14W. The third page of editing user interface 1440 enables adjustment of an angle and/or position of background 1408, and thus the angle and/or position of stripes 1408A-1408H of background 1408. At FIG. 14W, electronic device 600 displays third editing feature indicator 1456 as centered on display 602 above representation 1450 to indicate that the third page of editing user interface 1440 enables adjustment of the position of background 1408. Additionally, electronic device 600 displays fourth editing feature indicator 1462 (e.g., "Complications") in response to detecting user input 1460 (e.g., electronic device 600 translates first editing feature indicator 1442, second editing feature indicator 1444, third editing feature indicator 1456, and fourth editing feature indicator 1462 in a direction associated with movement of user input 1460).
- At FIG. 14W (e.g., in response to detecting input 1460), electronic device 600 rotates representation 1450 of background 1408 back to the orientation (e.g., a vertical orientation) of background 1408 prior to displaying the second page (e.g., for editing color) of editing user interface 1440. In some embodiments, background 1408 is returned to the previous orientation because the second page of editing user interface 1440 for adjusting colors of stripes 1408A-1408H is no longer displayed (e.g., electronic device 600 does not detect and/or respond to user inputs on individual stripes 1408A-1408H when the second page of editing user interface 1440 is not displayed).
- As set forth above, the third page of editing user interface 1440 enables adjustment of an angle and/or position of background 1408. The third page of editing user interface 1440 includes rotation indicator 1464 that provides a visual indication of an angle of background 1408 with respect to a rotational axis (e.g., the center of display 602). At FIG. 14W, electronic device 600 detects rotational input 1466 on rotatable input mechanism 603. In response to detecting rotational input 1466 (and while receiving the rotational input), electronic device 600 rotates representation 1450 with respect to the rotational axis, as shown at FIG. 14X.
- At FIG. 14X, electronic device 600 updates rotation indicator 1464 to provide a visual indication of the new angle of background 1408 with respect to the rotational axis (e.g., 45 degrees). While electronic device 600 displays representation 1450 with stripes 1408A-1408H at an angle of 45 degrees with respect to the rotational axis, electronic device 600 can rotate representation 1450 of background 1408 to any suitable angle (e.g., any angle from 0 degrees to 360 degrees) with respect to the rotational axis. In some embodiments, electronic device 600 rotates representation 1450 to a particular angle in accordance with a detected amount of movement associated with rotational input 1466 (e.g., an amount of rotation of representation 1450 is based on an amount of detected movement or rotation associated with rotational input 1466). For example, electronic device 600 can continuously rotate representation 1450 while continuing to detect rotational input 1466 (e.g., the angle of rotation is selectable by a continuous input, such as continuous rotation of rotatable input mechanism 603). As set forth above, representation 1450 corresponds to background 1408 being displayed within a border that includes a circular shape. In response to rotational input 1466, electronic device 600 forgoes adjustment of a size (e.g., a thickness and/or a width) of stripes 1408A-1408H of representation 1450 because representation 1450 includes the circular border (e.g., rotating representation 1450 does not cause the lengths or widths of stripes 1408A-1408H to change because the diameter of the circular border remains constant). As discussed below, in some embodiments, electronic device 600 adjusts the size (e.g., thickness and/or width) of stripes 1408A-1408H in response to rotational input 1466 when background 1408 is displayed within a non-circular border.
- At FIG. 14X, electronic device 600 detects user input 1468 (e.g., two swipe gestures) on editing user interface 1440. In response to detecting user input 1468, electronic device 600 displays the first page of editing user interface 1440 for adjusting the shape of the border in which background 1408 is displayed, as shown at FIG. 14Y. At FIG. 14Y, electronic device 600 displays representation 1450 with the updated position (e.g., an angle of 45 degrees) caused by rotational input 1466 (e.g., because the second page of editing user interface 1440 is not displayed). Additionally, at FIG. 14Y, electronic device 600 detects rotational input 1470 on rotatable input mechanism 603. In response to detecting rotational input 1470, electronic device 600 displays the first page of editing user interface 1440 with representation 1430 of background 1408 in the rectangular shaped border, as shown at FIG. 14Z.
- At FIG. 14Z, representation 1430 maintains the angle of representation 1450 caused by rotational input 1466 (e.g., an angle of 45 degrees). However, electronic device 600 adjusts a size (e.g., thickness and/or width) of stripes 1408A-1408H of representation 1430 when compared to representation 1450 at FIG. 14Y. Electronic device 600 adjusts the size (e.g., length, thickness, and/or width) of stripes 1408A-1408H to occupy the entire area defined by the rectangular shaped border, while maintaining the same number of stripes (e.g., and the same width for each stripe). In general, the width of the stripes varies with the dimension of background 1408 in the direction perpendicular to the length of the stripes (e.g., the stripes are wider when oriented horizontally than when oriented vertically because the vertical dimension of display 602 is larger than the horizontal dimension of display 602, and vice versa).
- At FIG. 14Z, electronic device 600 detects user input 1472 (e.g., two successive swipe gestures) on editing user interface 1440. In response to detecting user input 1472, electronic device 600 displays the third page of editing user interface 1440 for adjusting the position of representation 1430, as shown at FIG. 14AA. At FIG. 14AA, electronic device 600 detects rotational input 1474 on rotatable input mechanism 603. In response to detecting rotational input 1474, electronic device 600 rotates representation 1430 (e.g., about the rotational axis) in accordance with an amount of movement and/or a direction of rotational input 1474, as shown at FIG. 14AB.
- At FIG. 14AB, electronic device 600 displays representation 1430 with stripes 1408A-1408H at an angle of 60 degrees (e.g., relative to horizontal), as indicated by rotation indicator 1464. In response to rotational input 1474, electronic device 600 reduces a size (e.g., thickness and/or width) of stripes 1408A-1408H in addition to rotating stripes 1408A-1408H about the rotational axis. For example, in response to rotating stripes 1408A-1408H from an angle of 45 degrees to an angle of 60 degrees, electronic device 600 varies the lengths of stripes 1408A-1408H as needed to fit within the rectangular border of representation 1430. Electronic device 600 also reduces the size (e.g., thickness and/or width) of stripes 1408A-1408H in order to maintain the same number of stripes 1408A-1408H (e.g., each with the same width) within the rectangular border of representation 1430. In some embodiments, electronic device 600 adjusts the size of stripes 1408A-1408H based on a detected amount of movement associated with rotational input 1474. For example, while electronic device 600 detects rotational input 1474 (e.g., continuously detects rotational input 1474), electronic device 600 gradually and/or continuously adjusts the size of stripes 1408A-1408H in response to continuing to detect rotational input 1474. In some embodiments, electronic device 600 adjusts the size of stripes 1408A-1408H based on a direction of rotational input 1474 (e.g., clockwise or counter-clockwise). For example, in response to detecting that rotational input 1474 is in a first direction, electronic device 600 reduces the size of stripes 1408A-1408H and, in response to detecting that rotational input 1474 is in a second direction, different from the first direction, electronic device 600 increases the size of stripes 1408A-1408H.
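One plausible geometric model (ours, not the patent's) for the width behavior in FIGS. 14X-14AB: equal-width stripes at angle theta must together cover the border's extent measured perpendicular to the stripes, so each stripe's width is that extent divided by the stripe count. For a w-by-h rectangle that extent is w*|sin(theta)| + h*|cos(theta)|, while for a circle it is always the diameter, which is why rotation changes stripe width inside the rectangular border but not inside the circular one. The example dimensions below are arbitrary assumptions.

```swift
import Foundation

/// Width of each stripe when `stripeCount` equal-width stripes at angle
/// `thetaDegrees` (from horizontal) fill a w x h rectangular border.
func stripeWidth(borderWidth w: Double, borderHeight h: Double,
                 thetaDegrees: Double, stripeCount: Int) -> Double {
    let theta = thetaDegrees * .pi / 180
    // Extent of the rectangle along the direction perpendicular to the stripes:
    // h at 0 degrees (horizontal stripes), w at 90 degrees (vertical stripes).
    let extent = w * abs(sin(theta)) + h * abs(cos(theta))
    return extent / Double(stripeCount)
}

/// For a circular border the perpendicular extent is always the diameter,
/// so rotation leaves stripe width unchanged (as at FIG. 14X).
func circularStripeWidth(diameter: Double, stripeCount: Int) -> Double {
    diameter / Double(stripeCount)
}

// Rectangular border with assumed dimensions: rotating from 45 to 60 degrees
// shrinks the perpendicular extent and hence each stripe, as in FIG. 14AB.
print(stripeWidth(borderWidth: 162, borderHeight: 197, thetaDegrees: 45, stripeCount: 8))
print(stripeWidth(borderWidth: 162, borderHeight: 197, thetaDegrees: 60, stripeCount: 8))
```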
- At FIG. 14AB, electronic device 600 detects user input 1476 (e.g., a swipe input) on editing user interface 1440. In response to detecting user input 1476, electronic device 600 displays the second page of editing user interface 1440, as shown at FIG. 14AC. As set forth above, the second page of editing user interface 1440 enables adjustment of colors of stripes 1408A-1408H. Electronic device 600 detects user input (e.g., a tap gesture) on a respective stripe in order to enable adjustment of the color of the respective stripe. As shown in FIG. 14AC (e.g., in response to detecting input 1476), electronic device 600 rotates representation 1430 when transitioning from the third page of editing user interface 1440 (shown at FIG. 14AB) to the second page of editing user interface 1440 (shown at FIG. 14AC). In some embodiments, electronic device 600 rotates representation 1430 when transitioning from any page of editing user interface 1440 to the second page of editing user interface 1440. In particular, electronic device 600 rotates representation 1430 to include the horizontal orientation of stripes 1408A-1408H. In some embodiments, when representation 1430 includes the horizontal orientation when displayed in the first page and/or the third page of editing user interface 1440, electronic device 600 maintains display of representation 1430 in the horizontal orientation when transitioning to the second page of editing user interface 1440. The horizontal orientation of representation 1430 can facilitate a user's ability to select a particular stripe of stripes 1408A-1408H by providing uniform targets for a user to select (e.g., via a tap gesture). As such, displaying representation 1430 in the horizontal orientation when electronic device 600 displays the second page of editing user interface 1440 can improve a user's ability to select stripes 1408A-1408H and adjust a particular stripe to a desired color.
- FIG. 14AD illustrates examples of user interface 1422 after electronic device 600 ceases to display editing user interface 1440. FIG. 14AD includes first representation 1478 of user interface 1422 and second representation 1480 of user interface 1422 with background 1408 displayed within a circular border. Additionally, FIG. 14AD shows third representation 1482 of user interface 1422 and fourth representation 1484 of user interface 1422 with background 1408 displayed within a rectangular border (e.g., a full screen border that includes the shape of display 602). First representation 1478 and second representation 1480 include complications displayed on display 602 and outside of background 1408. These complications are selected and/or edited via editing user interface 1440. Additionally, third representation 1482 and fourth representation 1484 include complications overlaid on background 1408. These complications are likewise selected and/or edited via editing user interface 1440.
- FIGS. 15A-15F are a flow diagram illustrating methods of enabling configuration of a background for a user interface, in accordance with some embodiments. Method 1500 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone). Some operations in method 1500 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. - As described below,
method 1500 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - The computer system (e.g., 600) displays (1502), via the display generation component (e.g., 602), an editing user interface (e.g., 1406) for editing a background (e.g., 1408) of a user interface (e.g., a home/main user interface; a wake screen user interface; a lock screen user interface; a watch user interface; a watch face that includes an indication of time and one or more watch complications), wherein the user interface includes content (e.g., an indication of time; watch complications; icons; menus; folders) overlaid on the background (1504), and the editing user interface includes a representation of the background of the user interface that includes a first number of stripes (e.g., graphical lines across the background in a vertical or horizontal direction) that is greater than one (e.g., two or more stripes; an even number of repeating two stripes of different colors) (1506).
- While displaying the editing user interface (e.g., 1406) (1512), the computer system (e.g., 600) detects (1514), via the one or more input devices, a first user input (e.g., 1403, 1405) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In response to detecting the first user input (e.g., 1403) (1518), in accordance with a determination that the first user input corresponds to a first type of input (e.g., an input in a first direction (e.g., a clockwise rotational direction; a first vertical or horizontal direction)), the computer system (e.g., 600) displays (1522), in the user interface, a representation of an updated background (e.g., 1408) with a second number of stripes that is greater than the first number of stripes (e.g., add one or more additional stripes to the background (e.g., add one more stripe; add multiple stripes; add an even number of stripes; double the number of stripes); add one or more additional stripes to the background where the added stripes repeat a pattern (e.g., a repeating color pattern) of the original stripes). In some embodiments, updating the background with the second number of stripes that is greater than the first number of stripes includes moving (e.g., sliding) the new stripes onto the background from an edge of the display (e.g., 602).
- In response to detecting the first user input (e.g., 1405) (1518), in accordance with a determination that the first user input corresponds to a second type of input different from the first type of input (e.g., an input in a second direction (e.g., a counter-clockwise rotational direction; a second vertical or horizontal direction)), the computer system (e.g., 600) displays (1524), in the user interface, the representation of the updated background (e.g., 1408) with a third number of stripes that is less than the first number of stripes (e.g., remove one or more stripes from the background (e.g., remove one stripe; remove multiple stripes); if the first number of stripes have a repeating pattern (e.g., a repeating color pattern), remove one or more stripes such that the pattern is maintained within the remaining stripes; if the first number of stripes do not have a repeating pattern (e.g., a repeating color pattern), remove one or more stripes from the background in one direction). In some embodiments, updating the background with the third number of stripes that is less than the first number of stripes includes moving (e.g., sliding) stripes out of the background off of an edge of the display. Changing the number of stripes in the background in accordance with the first user input enables a user to change the number of stripes in the background easily and in an intuitive manner. Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
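- A minimal sketch of this direction-dependent behavior, assuming a two-value input direction and stripes modeled as a list of color names (all names here are illustrative, not from this disclosure):

```swift
enum CrownDirection { case clockwise, counterclockwise }

// Sketch: a first-type (clockwise) input adds a stripe that continues the
// repeating color pattern; a second-type input removes the last stripe.
func updatedStripes(_ stripes: [String], direction: CrownDirection,
                    pattern: [String]) -> [String] {
    precondition(!pattern.isEmpty)
    switch direction {
    case .clockwise:
        return stripes + [pattern[stripes.count % pattern.count]]
    case .counterclockwise:
        return stripes.count > 1 ? Array(stripes.dropLast()) : stripes
    }
}

// ["black", "white", "black"] -> ["black", "white", "black", "white"]
print(updatedStripes(["black", "white", "black"],
                     direction: .clockwise, pattern: ["black", "white"]))
```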
- The computer system (e.g., 600) detects (1526) (e.g., subsequent to detecting the first input), via the one or more input devices, a second user input (e.g., 1423) (e.g., a request to exit or cease display of the user interface for editing the background).
- In response to detecting the second user input (e.g., 1423) (1528), the computer system (e.g., 600) displays (1530), via the display generation component (e.g., 602), the user interface with the updated background (e.g., 1408). In some embodiments, the updated background includes the second number of stripes. In some embodiments, the updated background includes the third number of stripes. Displaying the user interface with the updated background in response to detecting the second user input enables a user to quickly and easily update the background of the current user interface. Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the user interface is a watch user interface (e.g., a watch face; a user interface that includes an indication of a current time; a clock user interface for a smartwatch) (1508). In some embodiments, the content is an indication of a current time or current date (1510).
- In some embodiments, while displaying the editing user interface (e.g., 1406) (1512), the computer system (e.g., 600) displays (1516), in the editing user interface, a user interface (e.g., a tab (e.g., 1412) within the editing user interface) for editing (e.g., increasing or decreasing) a number of stripes of the representation of the background of the user interface, wherein the user interface for editing the number of stripes includes the representation of the background (e.g., 1408) of the user interface.
- In some embodiments, the first number of stripes are arranged in a first visual pattern of stripes of different colors (e.g., a first type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)), and the second number of stripes are arranged in the first visual pattern of stripes of different colors (e.g., the first type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)) (1522). Maintaining the first visual pattern of stripes when the number of stripes in the background is increased enables efficient editing of a background that includes the number of stripes. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while displaying the representation of the updated background (e.g., 1408) with the third number of stripes, wherein the third number of stripes are arranged in a second visual pattern of stripes of different colors (e.g., a second type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)) (1532), the computer system (e.g., 600) detects (1534), via the one or more input devices, a third user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input). In some embodiments, in response to detecting the third user input (1536), the computer system displays (1538), in the user interface, the representation of the updated background with the first number of stripes, wherein the first number of stripes are arranged in the second visual pattern of stripes of different colors (e.g., a second type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)). Arranging the first number of stripes in the second visual pattern of stripes of different colors (e.g., remembering the previous visual pattern of stripes) in response to detecting the third user input, where the number of stripes was first decreased, then increased via the third user input, enables efficient editing of a background that includes the number of stripes. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while displaying the representation of the updated background (e.g., 1408) with the third number of stripes, wherein the third number of stripes are arranged in a third visual pattern of stripes of different colors (e.g., a third type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)), the computer system (e.g., 600) detects, via the one or more input devices, a fourth user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input), wherein no other inputs were detected between displaying the representation of the updated background with the third number of stripes and detecting the fourth user input (e.g., there were no intervening operations on the computer system from updating the representation of the updated background to include the third number of stripes to detecting the fourth user input). In some embodiments, in response to detecting the fourth user input, the computer system displays, in the user interface, the representation of the updated background with the first number of stripes, wherein the first number of stripes are arranged in the third visual pattern of stripes of different colors (e.g., the third type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)). Arranging the first number of stripes in the third visual pattern of stripes of different colors (e.g., remembering the previous visual pattern of stripes) in response to detecting the fourth user input, where the number of stripes was first decreased, then increased via the fourth user input (e.g., and no intervening inputs were detected between the decreasing and increasing of the number of stripes), enables efficient editing of a background that includes the number of stripes. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- Alternatively, in some embodiments, while displaying the representation of the updated background (e.g., 1408) with the third number of stripes, where the third number of stripes are arranged in the third visual pattern of stripes of different colors, the computer system (e.g., 600) detects one or more intervening inputs directed to causing display of a different user interface and/or causing display of a different page than a current page of the editing user interface, then detects the fourth user input. In some embodiments, in response to detecting the fourth user input, the computer system displays or causes display of, in the user interface, the representation of the updated background with the first number of stripes, where the first number of stripes are still arranged in the third visual pattern of stripes of different colors (e.g., the third type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)).
- In some embodiments, while displaying the representation of the updated background (e.g., 1408) with the third number of stripes, wherein the third number of stripes are arranged in a fourth visual pattern of stripes of different colors (e.g., a fourth type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)), the computer system (e.g., 600) detects, via the one or more input devices, a user input directed to performing an operation that does not include changing the third number of stripes of the representation of the updated background to a different number of stripes. In some embodiments, performing the operation includes displaying a user interface different from the editing user interface. In some embodiments, performing the operation includes editing a different aspect/feature of the representation of the updated background than changing or otherwise modifying the stripes within the representation of the updated background (e.g., editing features of a watch face (e.g., watch face style; watch complications) having the updated background as the background).
- In some embodiments, in response to detecting the user input directed to performing the operation, the computer system (e.g., 600) ceases display of the representation of the updated background (e.g., 1408) (e.g., and exiting the user interface for editing the number of stripes and displaying (e.g., replacing display of the user interface for editing the number of stripes with) a different user interface for performing the operation that does not include changing the third number of stripes of the representation of the updated background to a different number of stripes).
- In some embodiments, subsequent to ceasing display of the representation of the updated background (e.g., 1408), the computer system (e.g., 600) detects, via the one or more input devices, a fifth user input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In some embodiments, in response to detecting the fifth user input, the computer system (e.g., 600) displays, in the user interface, the representation of the updated background (e.g., 1408) with the first number of stripes, wherein the first number of stripes are arranged in a fifth visual pattern of stripes of different colors (e.g., a fifth type of alternating color pattern (e.g., a repeating 2-color pattern; a repeating 3-color pattern)) that is different from the fourth visual pattern of stripes of different colors. Arranging the first number of stripes with the fifth visual pattern of stripes of different colors that is different from the fourth visual pattern of stripes of different colors in response to detecting the fifth user input, where the number of stripes was first decreased, then increased via the fifth user input, and there were intervening operations between the decreasing and increasing of the number of stripes, enables efficient editing of a background that includes the number of stripes by enabling a user to easily maintain the current visual pattern of stripes. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
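- One way to realize this remember-and-forget behavior is to cache the colors of removed stripes and clear the cache when an operation outside the stripe-count editing flow occurs (as described above, merely switching pages of the editing user interface preserves the memory). The sketch below is an assumption about one possible implementation, not the disclosed one; all names are illustrative.

```swift
// Sketch of "pattern memory": stripes removed by decreasing the count are
// cached so that increasing the count restores their colors, while an
// unrelated operation clears the cache and forgets the old pattern.
struct StripeEditor {
    private(set) var stripes: [String]
    private var removed: [String] = []

    init(stripes: [String]) { self.stripes = stripes }

    mutating func decreaseCount() {
        if stripes.count > 1 { removed.append(stripes.removeLast()) }
    }

    mutating func increaseCount(defaultColor: String = "white") {
        // Restore the most recently removed color if one is remembered.
        stripes.append(removed.popLast() ?? defaultColor)
    }

    mutating func performUnrelatedOperation() {
        removed.removeAll()
    }
}
```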
- In some embodiments, while displaying the editing user interface (e.g., 1406), the computer system (e.g., 600) detects, via the one or more input devices (e.g., a touch-sensitive surface that is integrated with the display generation component (e.g., 602)), an input (e.g., 1407; a press-and-hold input; a touch-and-hold input) directed to a first stripe (e.g., 1408D; a stripe of the first number of stripes of the representation of the background (e.g., 1408)). In some embodiments, in response to detecting the input directed to the first stripe, the computer system displays, in the editing user interface, an indication (e.g., 1416) (e.g., a visual indication (e.g., a tab, a box) surrounding or within the selected stripe indicating that the stripe has been selected, and that it can be modified) that the first stripe is selected for editing (e.g., editing for a different visual characteristic (e.g., a different color)). Displaying the indication that the first stripe is selected for editing in response to detecting the input directed to the first stripe enables a user to quickly and easily identify and edit a particular stripe. Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while displaying the indication (e.g., 1416) that the first stripe is selected for editing, the computer system (e.g., 600) detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., 1409) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input). In some embodiments, in response to (e.g., and while) detecting the rotational input, the computer system transitions the first stripe from a first color to a second color different from the first color (e.g., such that the second color is now set as the current color for the first stripe). In some embodiments, the transition from the first color to the second color includes, while detecting the rotational input, transitioning from the first color, through a plurality of different colors, to the second color. In some embodiments, the first stripe is edited without editing other stripes of the first number of stripes. Transitioning through different selectable colors in response to detecting the rotational input enables a user to quickly and easily transition through the different selectable colors. Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
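- The transition through a plurality of colors can be sketched as stepping through an ordered palette as the rotatable input device turns; the palette contents and function name below are illustrative assumptions, not part of this disclosure.

```swift
// Sketch: each step of the rotational input advances the selected stripe's
// color through an ordered palette, wrapping at the ends so the rotation
// can continue in either direction indefinitely.
let palette = ["red", "orange", "yellow", "green", "blue", "purple"]

func color(after current: String, steps: Int) -> String {
    let i = palette.firstIndex(of: current) ?? 0
    let n = palette.count
    return palette[((i + steps) % n + n) % n] // wraps in both directions
}

print(color(after: "purple", steps: 1))  // "red"
print(color(after: "red", steps: -1))    // "purple"
```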
- In some embodiments, displaying the editing user interface (e.g., 1406) includes, in accordance with a determination that the editing user interface is in a first editing mode (e.g., an editing mode for changing the number of respective stripes in the background), displaying, in the representation of the background (e.g., 1408) of the user interface, the respective stripes in the background with visually distinguishable spaces between the respective stripes. In some embodiments, displaying the editing user interface includes, in accordance with a determination that the editing user interface is in a second editing mode (e.g., an editing mode for changing a visual characteristic, such as a color, of one or more stripes in the background; an editing mode for rotating the respective stripes in the background) different from the first editing mode, displaying, in the representation of the background, the respective stripes in the background without visually distinguishable spaces between the respective stripes.
- In some embodiments, while displaying the editing user interface (e.g., 1406), the computer system (e.g., 600) detects, via the one or more input devices (e.g., a touch-sensitive surface that is integrated with the display generation component), an input (e.g., 1411) on the representation of the background corresponding to a drag gesture (e.g., a finger touch drag gesture), wherein the drag gesture is detected across a plurality of stripes of the first number of stripes, beginning at a first stripe and ending at a second stripe (e.g., and including one or more stripes between the initial stripe and the final stripe). In some embodiments, in response to detecting the input corresponding to the drag gesture, the computer system displays, in the editing user interface, an indication (e.g., a visual indication (e.g., a tab, a box) surrounding or within the selected stripe indicating that the stripe has been selected, and that it can be modified) that the second stripe (e.g., the stripe that is displayed at a location that corresponds to a location in the user interface at which the drag gesture ended) is selected for editing (e.g., editing for a different visual characteristic (e.g., a different color)). Enabling the selection of a second stripe within the background using a drag gesture, where the drag gesture is detected beginning at the first stripe and ending at the second stripe, provides a convenient and intuitive method for selecting a different stripe in the background (e.g., without needing to provide additional controls for enabling selection of the second stripe). Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
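- Resolving which stripe a drag gesture ends on amounts to mapping the final touch coordinate to a stripe index. A sketch for horizontally oriented stripes stacked top to bottom follows; the coordinate convention and names are assumptions for illustration.

```swift
// Sketch: map the vertical coordinate where a drag gesture ends to the
// index of the horizontal stripe under the finger, clamping so a drag
// that ends slightly outside the background still selects an edge stripe.
func stripeIndex(atY y: Double, viewHeight h: Double, count: Int) -> Int {
    precondition(count > 0 && h > 0)
    let raw = Int(y / (h / Double(count)))
    return min(max(raw, 0), count - 1)
}

print(stripeIndex(atY: 230, viewHeight: 448, count: 8)) // 4
```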
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), the editing user interface (e.g., 1406) for editing the background of the user interface (e.g., including a respective number of stripes) in a second editing mode (e.g., an editing mode for rotating the stripes in the background; different from the current editing mode for changing the number of stripes in the background). In some embodiments, while displaying the editing user interface for editing the background of the user interface, the computer system detects, via the one or more input devices (e.g., via a touch-sensitive surface that is integrated with the display generation component), an input (e.g., a swipe input (e.g., a horizontal swipe input)) directed to changing an editing mode. In some embodiments, in response to detecting the input directed to changing the editing mode, the computer system displays or causes display of the editing user interface in the second editing mode. Enabling quick and easy changing of an editing mode for editing a different feature/characteristic of a user interface, while maintaining display of the editing user interface (e.g., without needing to exit the editing user interface), enables the editing of user interfaces in an efficient manner and reduces the inputs required to edit the user interface. Reducing the number of inputs needed to perform an operation and providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, while displaying, via the display generation component (e.g., 602), the editing user interface (e.g., 1406) for editing the background (e.g., 1408) of the user interface (e.g., including the respective number of stripes) in the second editing mode, the computer system detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In some embodiments, in response to (e.g., and while) detecting the rotational input (e.g., 1415), the computer system (e.g., 600) rotates the representation of the background (e.g., 1408) (e.g., including the respective number of stripes) (e.g., rotating with the center of the display generation component as the axis point) in accordance with the detected rotational input. In some embodiments, if the rotational input is in a clockwise direction, the stripes within the representation of the background are also rotated in the clockwise direction. In some embodiments, if the rotational input is in a counter-clockwise direction, the stripes within the representation of the background are also rotated in the counter-clockwise direction. In some embodiments, the representation of the background, including its respective number of stripes, is rotated with the center of the display generation component as the axis point for the rotation. In some embodiments, the respective number of stripes of the representation of the background maintain their straight shape (e.g., maintain their straightness as stripes) while they are being rotated about the axis point. Rotating the representation of the background in accordance with the detected rotational input enables efficient editing of a feature/characteristic of the background. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, rotating the representation of the background (e.g., 1408) includes rotating the representation of the background by predefined rotational increments (e.g., by 1 degree, 2 degree, 5 degree, 10 degree, 15 degree, or 30 degree increments) with respect to a rotational axis point (e.g., the center of the display generation component (e.g., 602)).
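- Rotating by predefined increments is equivalent to snapping a continuous rotation angle to the nearest multiple of the increment, as in this sketch (the 15 degree default is one of the example increments above; the function name is illustrative):

```swift
// Sketch: snap a continuous rotation to the nearest predefined increment.
func snappedAngle(_ degrees: Double, increment: Double = 15) -> Double {
    precondition(increment > 0)
    return (degrees / increment).rounded() * increment
}

print(snappedAngle(52))               // 45
print(snappedAngle(52, increment: 5)) // 50
```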
- In some embodiments, rotating the representation of the background (e.g., 1408) includes changing (e.g., increasing; decreasing) a characteristic (e.g., thickness; size; area) of a respective stripe within the representation of the background as the representation of the background is being rotated in accordance with the rotational input (e.g., 1415).
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), the user interface with the updated background (e.g., 1408). In some embodiments, while displaying the user interface with the updated background (e.g., a watch user interface (e.g., watch face) with the updated background; a home user interface or main user interface with the updated background), the computer system detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), a rotational input (e.g., 1415) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input). In some embodiments, in response to (e.g., and while) detecting the rotational input, the computer system rotates the updated background (e.g., with the center of the display generation component as the axis point) within the user interface in accordance with the detected rotational input. In some embodiments, if the rotational input is in a clockwise direction, the stripes within the updated background are also rotated in the clockwise direction. In some embodiments, if the rotational input is in a counter-clockwise direction, the stripes within the updated background are also rotated in the counter-clockwise direction. Enabling the updated background to be rotated based on the rotational input, where the direction of rotation of the updated background is based on the direction of rotation of the input, provides an efficient and intuitive method for editing a feature of the updated background. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the content is a first complication. In some embodiments, a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications). In some embodiments, complications provide data obtained from an application. In some embodiments, a complication includes an affordance that when selected launches a corresponding application. In some embodiments, a complication is displayed at a fixed, predefined location on the display. In some embodiments, complications occupy respective locations at particular regions of a watch face (e.g., lower-right, lower-left, upper-right, and/or upper-left). In some embodiments, the computer system (e.g., 600) displays the user interface with the updated background (e.g., 1408), wherein the first complication includes a primary color (e.g., a color that is most visually prominent in the displayed respective complication) that is selected (by the computer system) based on a first color of a first stripe of a plurality of stripes in the updated background (e.g., based on the color of the first-in-order stripe in the updated background; based on the color of the stripes that are most common in the updated background). Automatically applying (e.g., without user input) the primary color for the first complication based on the first color of the first stripe of the updated background provides efficient editing/configuration of features of the user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the computer system (e.g., 600) displays the user interface with the updated background (e.g., 1408), wherein the first complication includes a secondary color (e.g., a color that is second-most visually prominent in the displayed respective complication; a color that is not as visually prominent in the displayed respective complication than the primary color) that is selected (by the computer system) based on a second color from a second stripe, different from the first stripe, of the plurality of stripes in the updated background (e.g., based on the color of the second-in-order stripe; based on the color of the stripe(s) that is not the most common in the updated background). Selecting (e.g., automatically, without user input) the secondary color for the first complication based on the second color from the second stripe reduces the number of user inputs needed to create a respective user interface that includes the updated background. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
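- The passages above offer several heuristics for choosing these colors (first-in-order stripe, second-in-order stripe, most common stripe color, and so on). The sketch below implements the most-common/second-most-common variant; it is one possible reading, with illustrative names.

```swift
// Sketch: derive a complication's primary and secondary colors from the
// stripes, ranking stripe colors by how often they occur in the background.
func complicationColors(for stripes: [String]) -> (primary: String, secondary: String?) {
    precondition(!stripes.isEmpty)
    var counts: [String: Int] = [:]
    for color in stripes { counts[color, default: 0] += 1 }
    let ranked = counts.sorted { $0.value > $1.value }.map(\.key)
    return (ranked[0], ranked.count > 1 ? ranked[1] : nil)
}

// Five black stripes and three white ones -> primary black, secondary white.
```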
- In some embodiments, rotating the representation (e.g., 1430) of the background (e.g., 1408) includes changing a thickness (e.g., a width) of the first number of stripes (e.g., 1408A-1408H) within the representation (e.g., 1430) of the background (e.g., 1408) as the representation (e.g., 1430) of the background (e.g., 1408) is being rotated in accordance with the rotational input (e.g., 1474). In some embodiments, the thickness of the first number of stripes within the representation of the background is changed uniformly (e.g., each stripe of the first number of stripes changes by the same amount). In some embodiments, the thickness of the first number of stripes changes based on a length of the longest stripe of the first number of stripes on the representation of the background (e.g., the stripes stretch and reduce in thickness as the length of the longest stripe increases). In some embodiments, rotating the representation (e.g., 1430) of the background (e.g., 1408) includes maintaining the first number of stripes (e.g., 1408A-1408H) within the representation (e.g., 1430) of the background (e.g., 1408) (e.g., the thickness of the stripes changes in order to fit the first number of stripes within the shape of the background without changing the first number of stripes).
- Changing the thickness of the first number of stripes as the representation of the background is being rotated in accordance with the rotational input enables a user to customize and/or adjust the background in an easy and intuitive manner. Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the representation (e.g., 1430) of the background (e.g., 1408) is within a boundary having a first shape (e.g., a rectangle and/or a square). In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., including a respective number of stripes) in a third editing mode (e.g., an editing mode for changing the representation of the background from a full screen mode to a partial screen mode (e.g., the partial screen mode displays the first number of stripes within a boundary having a different shape from a boundary of the full screen mode)). In some embodiments, while displaying the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface, the computer system (e.g., 600) detects, via the one or more input devices (e.g., via a touch-sensitive surface that is integrated with the display generation component), an input (e.g., 1454, 1460, 1468, 1472, 1476) (e.g., a swipe input (e.g., a horizontal swipe input)) directed to changing an editing mode. In some embodiments, in response to detecting the input (e.g., 1454, 1460, 1468, 1472, 1476) directed to changing the editing mode, the computer system (e.g., 600) displays or causes display of the editing user interface (e.g., 1440) in the third editing mode.
- In some embodiments, the computer system (e.g., 600), while displaying, via the display generation component (e.g., 602), the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., including the respective number of stripes) in the third editing mode, detects, via the one or more input devices (e.g., a rotatable input device; a rotatable and depressible input device), an input (e.g., 1448, 1470) (e.g., a rotational input on the rotatable input device; a touch input such as a swipe or pinch input).
- In some embodiments, the computer system (e.g., 600), in response to (e.g., and while) detecting the input (e.g., 1448, 1470), displays the representation (e.g., 1430, 1450) of the background (e.g., 1408) within a boundary having a second shape that is different from the first shape (e.g., the second shape is a circle, oval, and/or a round shape) and changes a thickness of the first number of stripes (e.g., 1408A-1408H) within the representation (e.g., 1430, 1450) of the background (e.g., 1408) (e.g., the first number of stripes is maintained when displaying the representation of the background in the boundary having the second shape, but the thickness of the first number of stripes is changed so that the first number of stripes fit evenly within the boundary having the second shape).
- Displaying the representation of the background within a boundary having a second shape that is different from the first shape in response to detecting the input enables a user to customize and/or adjust the background in an easy and intuitive manner. Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while displaying the user interface (e.g., 1422), receives (1540) a request (e.g., 1426) to display a watch face (e.g., a request to turn on the display, a request to switch from one watch face to a stripes watch face, or a request to exit an editing mode) with a first arrangement of stripes (e.g., color, thickness, number, angle).
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed within a first boundary (e.g., a boundary having a first shape and first size), displays (1544) the first arrangement of stripes with a first width.
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed within a second boundary (e.g., a boundary having a second shape different from the first shape and/or a second size different from the first size) that is different from the first boundary, displays (1546) the first arrangement of stripes with a second width that is different from the first width.
- Displaying the first arrangement of stripes with the first width or displaying the first arrangement of stripes with the second width based on a boundary of the first arrangement of stripes reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while displaying the user interface (e.g., 1422), receives (1540) a request (e.g., 1426) to display a watch face (e.g., a request to turn on the display, a request to switch from one watch face to a stripes watch face, or a request to exit an editing mode) with a first arrangement of stripes (e.g., color, thickness, number, angle).
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at a first angle within a first boundary (e.g., a boundary having a first shape and a first size), displays (1548) the first arrangement of stripes with a first width.
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at the first angle within a second boundary (e.g., a boundary having a second shape that is different from the first shape and/or a second size different from the first size) that is different from the first boundary, displays (1550) the first arrangement of stripes with a second width (e.g., the first width or a width different from the first width).
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at a second angle that is different from the first angle within the first boundary, displays (1552) the first arrangement of stripes with the first width (e.g., the first boundary includes a circular shape such that the width of the first arrangement of stripes do not change based on an angle of the first arrangement of stripes).
- The computer system (e.g., 600), in response (1542) to the request (e.g., 1426) to display the watch face and in accordance with a determination that the first arrangement of stripes is displayed at the second angle within the second boundary, displays (1554) the first arrangement of stripes with a third width that is different from the second width (e.g., the second boundary includes a non-circular shape such that the width of the first arrangement of stripes changes based on the angle of the first arrangement of stripes to fit the first arrangement of stripes evenly within the non-circular shaped boundary).
- Displaying the first arrangement of stripes with the first width, the second width, or the third width based on the boundary and an angle of the first arrangement of stripes reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 600), while displaying the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., including a respective number of stripes) in a fourth editing mode (e.g., the second editing mode, an editing mode for rotating the stripes in the background; different from the editing mode for changing the number of stripes in the background), detects (1556), via the one or more input devices, an input (e.g., 1466, 1474) (e.g., rotational input on the rotatable input device) corresponding to a request to rotate the representation (e.g., 1430, 1450) of the background (e.g., 1408).
- The computer system (e.g., 600), in response to detecting (1558) the input (e.g., 1466, 1474) and in accordance with a determination that the representation (e.g., 1450) of the background (e.g., 1408) is set to be displayed within a boundary of a first shape (e.g., a circle, an oval, and/or a round shape), rotates (1560) the representation of the background without adjusting a thickness of the first number of stripes (e.g., 1408A-1408H) within the representation (e.g., 1450) of the background (e.g., 1408) (e.g., rotating the representation of the background when displayed within the boundary having the first shape does not adjust a thickness of the first number of stripes).
- The computer system (e.g., 600), in response to detecting (1558) the input (e.g., 1466, 1474) and in accordance with a determination (1562) that the representation (e.g., 1430) of the background (e.g., 1408) is set to be displayed within a boundary of a second shape (e.g., a square and/or a rectangle), rotates (1564) the representation (e.g., 1430) of the background (e.g., 1408) and adjusts (1566) (e.g., changing, increasing, decreasing) the thickness of the first number of stripes (e.g., 1408A-1408H) as the representation (e.g., 1430) of the background (e.g., 1408) is rotated.
- Adjusting the thickness of the first number of stripes or forgoing adjusting the thickness of the first number of stripes based on a shape of the boundary of the background reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
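- The shape dependence has a clean geometric explanation: a circle's projection onto any axis equals its diameter, so rotating stripes inside a circular boundary never changes the span they must fill, while a rectangle's projection varies with angle. A sketch extending the earlier width computation follows; the type and function names are illustrative assumptions.

```swift
import Foundation

enum BoundaryShape {
    case circle(diameter: Double)
    case rect(w: Double, h: Double)
}

// Sketch: stripe thickness for `count` stripes at `angleDegrees` inside a
// boundary. A circle's span is its diameter at every angle, so rotation
// leaves the thickness unchanged; a rectangle's span varies with angle.
func stripeThickness(count: Int, angleDegrees: Double, in shape: BoundaryShape) -> Double {
    precondition(count > 0)
    let theta = angleDegrees * .pi / 180
    let span: Double
    switch shape {
    case .circle(let diameter):
        span = diameter
    case .rect(let w, let h):
        span = w * abs(sin(theta)) + h * abs(cos(theta))
    }
    return span / Double(count)
}
```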
- The computer system (e.g., 600), while displaying the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., in an editing mode for rotating the representation of the background, in an editing mode for adjusting the first number of stripes, in an editing mode for adjusting the shape of the boundary of the representation of the background, and/or in an editing mode that is not for adjusting the color of a respective stripe of the first number of stripes), detects (1568) an input (e.g., 1454, 1476) corresponding to a request to display the editing user interface for editing the background of the user interface in a fifth editing mode (e.g., an editing mode for changing a color of a respective stripe of the first number of stripes).
- The computer system (e.g., 600), in response to detecting the input (e.g., 1454, 1476), displays (1570), via the display generation component (e.g., 602), the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., including a respective number of stripes) in the fifth editing mode (e.g., an editing mode for changing a color of a respective stripe of the first number of stripes), wherein displaying the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode includes the computer system (e.g., 600), in accordance with a determination that the representation (e.g., 1430, 1450) of the background (e.g., 1408) is in a first position (e.g., a rotational position and/or an angular position where the first number of stripes do not extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component), rotating (1574) the representation (e.g., 1430, 1450) of the background (e.g., 1408) to a second position (e.g., a rotational position and/or an angular position where the first number of stripes extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) and displaying (1576) the representation (e.g., 1430, 1450) of the background (e.g., 1408) in the second position (e.g., a rotational position and/or an angular position where the first number of stripes extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) in the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode.
- The computer system (e.g., 600), in response to detecting the input (e.g., 1454, 1476), displays (1570), via the display generation component (e.g., 602), the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) (e.g., including a respective number of stripes) in the fifth editing mode (e.g., an editing mode for changing a color of a respective stripe of the first number of stripes), wherein displaying the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode includes the computer system (e.g., 600), in accordance with a determination that the representation (e.g., 1430, 1450) of the background (e.g., 1408) is in the second position (e.g., a rotational position and/or an angular position where the first number of stripes extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component), maintaining (1578) display of the representation (e.g., 1430, 1450) of the background (e.g., 1408) in the second position (e.g., a rotational position and/or an angular position where the first number of stripes extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) in the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode.
- Displaying the representation of the background in the second position while the computer system displays the editing user interface for editing the background of the user interface in the fifth editing mode facilitates a user's ability to select a particular stripe of the first number of stripes, which reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode includes, in accordance with a determination that the representation (e.g., 1430, 1450) of the background (e.g., 1408) is in a third position (e.g., a rotational position and/or an angular position where the first number of stripes do not extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) (e.g., a position different from the first position and the second position), rotating the representation (e.g., 1430, 1450) of the background (e.g., 1408) to the second position (e.g., a rotational position and/or an angular position where the first number of stripes are in a predetermined orientation such as a horizontal orientation (at a 0 degree angle and/or a 360 degree angle), a vertical orientation, and/or another predetermined orientation) and displaying the representation (e.g., 1430, 1450) of the background (e.g., 1408) in the second position (e.g., a rotational position and/or an angular position where the first number of stripes extend horizontally (at a 0 degree angle and/or a 360 degree angle) across the display generation component) in the editing user interface (e.g., 1440) for editing the background (e.g., 1408) of the user interface (e.g., 1422) in the fifth editing mode.
- Displaying the representation of the background in the second position while the computer system displays the editing user interface for editing the background of the user interface in the fifth editing mode facilitates a user's ability to select a particular stripe of the first number of stripes, which reduces a number of inputs needed by the user to customize the background. Reducing the number of inputs needed to customize the background enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
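- Across the starting positions described above, entering the color-editing mode normalizes the stripes to the same horizontal orientation, which can be sketched as a single rule (the function name and tuple result are illustrative assumptions):

```swift
// Sketch: whatever the current angle, entering the color-editing page
// targets the horizontal orientation (0 degrees); an already-horizontal
// background is simply maintained, with no rotation animation needed.
func enterColorEditingMode(currentDegrees: Double) -> (target: Double, animated: Bool) {
    let normalized = currentDegrees.truncatingRemainder(dividingBy: 360)
    return (0, normalized != 0)
}
```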
- Note that details of the processes described above with respect to method 1500 (e.g., FIGS. 15A-15F) are also applicable in an analogous manner to the methods described above and below. For example, method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500. For example, a background for a user interface as described in FIGS. 14A-14AD can be used as the background for a watch user interface as described in FIGS. 6A-6H. For another example, method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500. For example, a background for a user interface as described in FIGS. 14A-14AD can be used as the background for a watch user interface as described in FIGS. 8A-8M. For another example, method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500. For example, a device can use as a watch user interface either a watch user interface as described in FIGS. 10A-10AC or a user interface with a background as described in FIGS. 14A-14AD. For another example, method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500. For example, a device can use as a watch user interface either a watch user interface as described in FIGS. 12A-12G or a user interface with a background as described in FIGS. 14A-14AD. For another example, method 1700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1500. For example, one or more characteristics or features of a user interface that includes a background as described in FIGS. 14A-14AD can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE. For brevity, these details are not repeated below.
- FIGS. 16A-16AE illustrate exemplary user interfaces for enabling configuration of a user interface (e.g., editing a watch user interface), in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 17A-17D.
- FIG. 16A illustrates device 600 displaying, via display 602, a watch user interface 1606 that includes a time region for displaying a current time (e.g., a dial and clock hands indicate the current time) and one or more complication regions for displaying watch complications on watch user interface 1606. In some embodiments, a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications). In some embodiments, complications provide data obtained from an application. In some embodiments, a complication includes an affordance that when selected launches a corresponding application. In some embodiments, a complication is displayed at a fixed, predefined location on display 602. In some embodiments, complications occupy respective locations at particular regions of watch user interface 1606 (e.g., lower-right, lower-left, upper-right, and/or upper-left). In some embodiments, the complications are displayed at respective complication regions within watch user interface 1606.
FIG. 16A, watch user interface 1606 includes a complication 1608 corresponding to a contactable users application, a complication 1610 corresponding to a calendar application, a complication 1612 corresponding to a weather application, and a complication 1614 corresponding to a moon phase application. - In
FIG. 16A, while displaying watch user interface 1606, device 600 receives (e.g., detects) an input 1601 on watch user interface 1606. In some embodiments, input 1601 is a touch input (e.g., a touch press input) on display 602. In some embodiments, input 1601 is a press-and-hold input on display 602. In response to detecting input 1601, device 600 displays a user interface 1616 that includes a representation 1618 of watch user interface 1606 and an edit affordance 1620 for initiating a process for editing watch user interface 1606, as shown in FIG. 16B. - In
FIG. 16B, while displaying user interface 1616, device 600 receives (e.g., detects) an input 1603 directed to selecting edit affordance 1620. In response to detecting input 1603, device 600 displays, via display 602, a first page 1626 (e.g., a style page) of an editing user interface 1622, as shown in FIG. 16C, where editing user interface 1622 includes a representation 1624 of a layout of watch user interface 1606. In some embodiments, first page 1626 of editing user interface 1622 is for editing a style of watch user interface 1606. - In
FIG. 16C, while displaying first page 1626 of editing user interface 1622, device 600 receives (e.g., detects) an input 1605 directed to changing the current page of editing user interface 1622 to a second page 1628 (e.g., an editing mode for editing a dial of watch user interface 1606). In some embodiments, input 1605 includes a touch gesture (e.g., a horizontal swipe on display 602) or a rotational input on rotatable input mechanism 603. In response to detecting input 1605, device 600 displays second page 1628 of editing user interface 1622, including representation 1624 of a layout of watch user interface 1606, as shown in FIG. 16D. - In
FIG. 16D, while displaying second page 1628 of editing user interface 1622, device 600 receives (e.g., detects) an input 1607 directed to changing the current page of editing user interface 1622 to a third page 1630 (e.g., an editing mode for changing a color (e.g., a background color; a color scheme) of watch user interface 1606). In some embodiments, input 1607 includes a touch gesture (e.g., a horizontal swipe on display 602) or a rotational input on rotatable input mechanism 603. In response to detecting input 1607, device 600 displays third page 1630 of editing user interface 1622, including representation 1624 of a layout of watch user interface 1606, as shown in FIG. 16E. Features of third page 1630 of editing user interface 1622 are described in greater detail below with reference to FIGS. 16V-16X. - In
FIG. 16E, while displaying third page 1630 of editing user interface 1622, device 600 receives (e.g., detects) an input 1609 directed to changing the current page of editing user interface 1622 to a fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1606). In some embodiments, input 1609 includes a touch gesture (e.g., a horizontal swipe on display 602) or a rotational input on rotatable input mechanism 603. In response to detecting input 1609, device 600 displays fourth page 1632 of editing user interface 1622, as shown in FIG. 16F.
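- By way of illustration only, the paged editing flow of FIGS. 16C-16F (style, dial, color, complications) can be modeled as a simple ordered-state machine. The following sketch is not part of the disclosure; all type and function names are hypothetical, and the clamping behavior at the first and last pages is an assumption.

```swift
// Illustrative sketch of the paged editing flow (style -> dial -> color ->
// complications). Hypothetical names; clamping at the ends is assumed.
enum EditingPage: Int, CaseIterable {
    case style, dial, color, complications
}

struct EditingSession {
    private(set) var page: EditingPage = .style

    // A horizontal swipe on the display or a rotational input on the
    // rotatable input mechanism moves to an adjacent page.
    mutating func advance(by delta: Int) {
        let clamped = min(max(page.rawValue + delta, 0),
                          EditingPage.allCases.count - 1)
        page = EditingPage(rawValue: clamped)!
    }
}

var session = EditingSession()
session.advance(by: 1)   // style -> dial  (FIG. 16C -> FIG. 16D)
session.advance(by: 1)   // dial  -> color (FIG. 16E)
session.advance(by: 1)   // color -> complications (FIG. 16F)
```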
- In FIG. 16F, device 600 displays, in fourth page 1632 of editing user interface 1622, complication previews 1634-1640 corresponding to complications 1608-1614 of watch user interface 1606, as shown in FIG. 16A. Complication preview 1634 corresponds to complication 1608 for the contactable users application, complication preview 1636 corresponds to complication 1610 for the calendar application, complication preview 1638 corresponds to complication 1612 for the weather application, and complication preview 1640 corresponds to complication 1614 for the moon phase application. - In
FIG. 16F, while displaying complication previews 1634-1640 in editing user interface 1622, device 600 receives (e.g., detects) an input 1611 directed to selecting complication preview 1634 corresponding to complication 1608 for the contactable users application. In some embodiments, input 1611 is a touch input on display 602. In response to detecting input 1611, device 600 displays, via display 602, a complication selection user interface 1642 for selecting a complication to be included in watch user interface 1606 (e.g., to replace complication 1608 in watch user interface 1606), as shown in FIG. 16G. - In
FIG. 16G, complication selection user interface 1642 includes a first region 1644 corresponding to the contactable users application (e.g., because the selected complication preview corresponds to the contactable users application). Region 1644 includes a header/label indicating that the region corresponds to the contactable users application and a group of complication previews 1644A-1644E.
- In some embodiments, a respective complication preview corresponds to a respective complication that is configured to display a respective set of information obtained from the respective application (e.g., information based on a feature, operation, and/or characteristic of the respective application). The respective complication preview includes a graphical representation of the respective complication displaying the respective set of information (e.g., an exemplary representation of the respective complication with an example of the respective set of information).
- In some embodiments, when the respective application is associated with a plurality of available complications, complication selection user interface 1642 includes a plurality of complication previews corresponding to the plurality of available complications. For example, in accordance with a determination that the plurality of available complications exceeds a predetermined number of available complications (e.g., more than 5 or 6 complications), device 600 displays a plurality of complication previews that correspond to respective complications of the plurality of available complications along with an affordance for showing one or more additional complication previews of complications in the plurality of available complications (e.g., the plurality of complication previews does not exceed the predetermined number). In FIG. 16G, complication previews 1644A-1644E corresponding to the predetermined number of available complications for the respective application (the contactable users application) are displayed along with affordance 1648 (e.g., a "show more" icon or button). In response to selection of affordance 1648, device 600 displays one or more additional complication previews that were not included in the plurality of complication previews, as well as the complication previews that were included in the plurality of complication previews. In some embodiments, in accordance with a determination that the plurality of available complications does not exceed the predetermined number, complication selection user interface 1642 includes a complication preview for all of the available complications, without displaying the affordance (e.g., affordance 1648).
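- By way of illustration only, the capping behavior described above (show at most a predetermined number of previews, and add a "show more" affordance only when more exist) can be sketched as follows. All names are hypothetical, and the default limit of 5 merely echoes the example in the text.

```swift
// Illustrative sketch of capping complication previews for one application.
struct PreviewRegion {
    let visiblePreviews: [String]   // previews shown immediately
    let showsMoreAffordance: Bool   // e.g., affordance 1648 ("show more")
}

func makeRegion(allPreviews: [String], limit: Int = 5) -> PreviewRegion {
    if allPreviews.count > limit {
        // Show only the first `limit` previews plus a "show more" affordance.
        return PreviewRegion(visiblePreviews: Array(allPreviews.prefix(limit)),
                             showsMoreAffordance: true)
    }
    // All previews fit; no affordance is displayed.
    return PreviewRegion(visiblePreviews: allPreviews, showsMoreAffordance: false)
}
```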
- As mentioned, in FIG. 16G, complication selection user interface 1642 includes first region 1644 corresponding to the contactable users application, where the contactable users application is for managing information of a set of contactable users (e.g., user contacts stored in and/or accessible on device 600; user contacts stored in and/or accessible from an address book). A respective complication corresponding to the contactable users application corresponds to a respective contactable user of the set of contactable users. Complication previews 1644A-1644E correspond to respective complications (complication 1608) for five respective contactable users of the set of contactable users.
- In some embodiments, in accordance with a determination that a first respective contactable user is a candidate contact (e.g., a favorite contact; a frequent contact; a primary contact) and that a second respective contactable user is not a candidate contact,
device 600 displays a first respective complication preview corresponding to the first respective contactable user prior to a second respective complication preview corresponding to the second respective contactable user in the displayed order of the complication previews. In some embodiments, in accordance with a determination that the first respective contactable user is not a candidate contact and that the second respective contactable user is a candidate contact, device 600 displays the second respective complication preview corresponding to the second respective contactable user prior to the first respective complication preview corresponding to the first respective contactable user in the displayed order of the complication previews.
- In some embodiments, if there are at least as many candidate contacts as the maximum number of complication previews that are concurrently shown in complication
selection user interface 1642 for the contactable users application, as in FIG. 16G, all of the maximum number of complication previews that are shown (1644A-1644E) correspond to candidate contacts (e.g., listed in alphabetical order). In some embodiments, if there are fewer candidate contacts than the maximum number of complication previews that are concurrently shown, the candidate contacts are shown first (e.g., in alphabetical order) and regular contacts (non-candidate contacts) are shown for the remaining complication previews (e.g., separately in alphabetical order).
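- The ordering rule just described (candidate contacts fill the preview slots first, alphabetically; regular contacts fill any remaining slots, alphabetically) admits a compact sketch. The following is illustrative only, with hypothetical names; the alphabetical tie-breaking is taken from the parentheticals above.

```swift
// Illustrative sketch of ordering contact previews: candidates first.
struct Contact {
    let name: String
    let isCandidate: Bool  // e.g., a favorite or frequent contact
}

func orderedPreviewContacts(_ contacts: [Contact],
                            maxPreviews: Int = 5) -> [Contact] {
    let candidates = contacts.filter { $0.isCandidate }.sorted { $0.name < $1.name }
    let regular = contacts.filter { !$0.isCandidate }.sorted { $0.name < $1.name }
    // Candidates occupy the slots first; regular contacts only if slots remain.
    return Array((candidates + regular).prefix(maxPreviews))
}
```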
- In FIG. 16G, device 600 displays a visual indication 1646 that complication preview 1644A corresponds to the currently-selected complication for complication 1608 in watch user interface 1606 (e.g., complication preview 1644A is highlighted and/or outlined compared to other complication previews). While displaying complication selection user interface 1642 with complication preview 1644A selected, device 600 receives (e.g., detects) an input 1613 directed to selecting complication preview 1644D. In some embodiments, input 1613 is a touch input on display 602. In some embodiments, input 1613 is a press input on rotatable input mechanism 603 after visual indication 1646 is moved to complication preview 1644D (e.g., via rotation of rotatable input mechanism 603). - In response to receiving
input 1613, device 600 removes visual indication 1646 from complication preview 1644A and displays visual indication 1646 for complication preview 1644D, as shown in FIG. 16H, thereby indicating that the complication corresponding to complication preview 1644D has been selected to be used as the complication for complication 1608 in watch user interface 1606.
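- A minimal sketch of this single-selection behavior (exactly one preview carries the visual indication, and selecting another preview moves it) follows. Names are hypothetical; the sketch does not assert how device 600 actually implements it.

```swift
// Illustrative sketch: one visual indication tracks the selected preview.
struct SelectionState {
    var previews: [String]
    var selectedIndex: Int  // the preview carrying the visual indication

    // A touch on a preview, or a press of the rotatable input mechanism after
    // rotating the indication onto a preview, commits the new selection.
    mutating func select(_ index: Int) {
        guard previews.indices.contains(index) else { return }
        selectedIndex = index  // the indication leaves the previous preview
    }
}

var state = SelectionState(previews: ["1644A", "1644B", "1644C", "1644D", "1644E"],
                           selectedIndex: 0)
state.select(3)  // FIG. 16H: indication moves from 1644A to 1644D
```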
- In FIG. 16H, while complication preview 1644D is selected, device 600 receives (e.g., detects) an input 1615 directed to an affordance 1650 for exiting complication selection user interface 1642 with the newly-selected settings. In some embodiments, input 1615 is a touch input on display 602. In response to receiving input 1615, device 600 displays fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1606) of editing user interface 1622, where complication preview 1634 for watch user interface 1606 now corresponds to the contactable user corresponding to complication preview 1644D (instead of the contactable user corresponding to complication preview 1644A in FIGS. 16G-16H), as shown in FIG. 16I. - In
FIG. 16I, while displaying fourth page 1632 of editing user interface 1622, device 600 receives (e.g., detects) an input 1617 directed to selecting complication preview 1634. In some embodiments, input 1617 is a touch input on display 602. In response to detecting input 1617, device 600 displays first region 1644 of complication selection user interface 1642, as shown in FIG. 16J, where first region 1644 includes complication previews 1644A-1644E corresponding to complications for the contactable users application, as first described above with reference to FIG. 16G. As mentioned, first region 1644 of complication selection user interface 1642 includes affordance 1648 that, when selected, causes device 600 to display one or more additional complication previews that were not included in the plurality of complication previews (e.g., in addition to the complication previews that were included in the plurality of complication previews). - In
FIG. 16J, while displaying first region 1644 corresponding to the contactable users application of complication selection user interface 1642, device 600 receives (e.g., detects) an input 1619 directed to selecting affordance 1648. In some embodiments, input 1619 is a touch input on display 602. In response to detecting input 1619 directed to affordance 1648, device 600 displays a contactable user selection user interface 1652, as shown in FIG. 16K. - In some embodiments, contactable user
selection user interface 1652 includes a first region 1654 for candidate contacts (e.g., favorite contacts; frequent contacts; primary contacts), where first region 1654 includes complication previews 1644A-1644D. Complication previews 1644A-1644D each correspond to a respective contactable user that is designated (e.g., by a user of device 600) as a candidate contact. In some embodiments, contactable user selection user interface 1652 includes a second region 1656 for regular contacts (e.g., non-candidate contacts; non-favorite contacts), where second region 1656 includes complication previews 1644E and 1656A that correspond to respective contactable users that are not designated as candidate contacts. In some embodiments, contactable user selection user interface 1652 can be navigated (e.g., scrolled) to show, in second region 1656, additional complication previews corresponding to respective contactable users that are not designated as candidate contacts. -
FIG. 16L illustrates device 600 displaying, via display 602, complication selection user interface 1642 with first region 1644 corresponding to complication previews for the contactable users application, as first described above with reference to FIG. 16G. While displaying first region 1644 of complication selection user interface 1642, device 600 receives (e.g., detects) an input 1621 directed to navigating (e.g., scrolling) complication selection user interface 1642. In some embodiments, input 1621 is a rotational input on rotatable input mechanism 603, as shown in FIG. 16L. In some embodiments, input 1621 is a touch input, such as a swipe or pinch input. -
FIGS. 16M-16O illustrate complication selection user interface 1642 being navigated (e.g., scrolled) in response to input 1621. In FIG. 16M, in response to (e.g., and while) receiving input 1621, device 600 navigates complication selection user interface 1642 from first region 1644 (corresponding to a complication group for contactable users application complications) to a second region 1658 of complication selection user interface 1642, where second region 1658 corresponds to a complication group for a first third-party application. - In some embodiments,
second region 1658 includes complication previews 1658A-1658E corresponding to respective complications that are configured to display, on watch user interface 1606, a respective set of information obtained from the first third-party application. One or more of complication previews 1658A-1658E can include a respective graphical representation of the respective complication displaying the respective set of information. Second region 1658 of complication selection user interface 1642 includes an affordance 1660 that, when selected, causes device 600 to display one or more additional complication previews that were not included in the plurality of complication previews corresponding to the first third-party application in second region 1658 of complication selection user interface 1642. - In
FIG. 16N, in response to (e.g., and while) receiving input 1621, device 600 navigates complication selection user interface 1642 from second region 1658 (corresponding to a complication group for the first third-party application complications) to a third region 1662 and a fourth region 1664 of complication selection user interface 1642, where third region 1662 corresponds to a complication group for a second third-party application and fourth region 1664 corresponds to a complication group for a fitness application. - In some embodiments,
third region 1662 includes complication previews 1662A-1662B corresponding to respective complications that are configured to display a respective set of information obtained from the second third-party application. One or more of complication previews 1662A-1662B can include a respective graphical representation of the respective complication displaying the respective set of information. In some embodiments, third region 1662 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642, and thus does not include an affordance (e.g., affordance 1648; affordance 1660) that, when selected, causes device 600 to display one or more additional complication previews for the respective application. - In some embodiments,
fourth region 1664 includes complication previews 1664A-1664B corresponding to respective complications that are configured to display a respective set of information obtained from the fitness application. One or more of complication previews 1664A-1664B can include a respective graphical representation of the respective complication displaying the respective set of information. In some embodiments, fourth region 1664 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642, and thus does not include an affordance (e.g., affordance 1648; affordance 1660) that, when selected, causes device 600 to display one or more additional complication previews for the respective application. - In
FIG. 16O, in response to (e.g., and after) receiving input 1621, device 600 navigates (e.g., scrolls) complication selection user interface 1642 to a fifth region 1666 of complication selection user interface 1642, where fifth region 1666 corresponds to a complication group for the weather application. - In some embodiments,
fifth region 1666 includes complication previews 1666A-1666D corresponding to respective complications that are configured to display, on watch user interface 1606, a respective set of information obtained from the weather application. One or more of complication previews 1666A-1666D can include a respective graphical representation of the respective complication displaying the respective set of information. In some embodiments, fifth region 1666 of complication selection user interface 1642 includes fewer than the predetermined number (e.g., 5 or 6) of complication previews that can be included for a respective region in complication selection user interface 1642, and thus does not include an affordance (e.g., affordance 1648; affordance 1660) that, when selected, causes device 600 to display one or more additional complication previews for the respective application.
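- Taken together, FIGS. 16L-16O describe a scrollable list of per-application groups, each with a header and a capped set of previews. A sketch of that grouped structure follows, reusing the capping rule sketched earlier; all names and example data are hypothetical.

```swift
// Illustrative sketch of the grouped complication selection list.
struct ComplicationGroup {
    let applicationName: String     // header/label for the region
    let previews: [String]          // capped preview list
    let showsMoreAffordance: Bool   // present only when previews were capped
}

func buildSelectionList(_ apps: [(name: String, previews: [String])],
                        limit: Int = 5) -> [ComplicationGroup] {
    apps.map { app in
        ComplicationGroup(applicationName: app.name,
                          previews: Array(app.previews.prefix(limit)),
                          showsMoreAffordance: app.previews.count > limit)
    }
}

let groups = buildSelectionList([
    (name: "Contactable Users", previews: ["A", "B", "C", "D", "E", "F"]),  // capped
    (name: "Weather", previews: ["Conditions", "UV Index", "Wind", "Air Quality"]),
])
```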
- FIG. 16P illustrates device 600 displaying, via display 602, a watch user interface 1668 that is different from watch user interface 1606 first described above with reference to FIG. 16A. In FIG. 16P, watch user interface 1668 includes a complication 1670 corresponding to an activity application, complication 1672 corresponding to a calendar application, complication 1674 corresponding to a health application, complication 1676 corresponding to a fitness application, complication 1678 corresponding to a time application, complication 1680 corresponding to a weather application, complication 1682 corresponding to the weather application, and complication 1684 corresponding to the calendar application. -
FIG. 16Q illustrates device 600 displaying, via display 602, fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1668) of editing user interface 1622, including complication preview 1686 corresponding to complication 1670 for the activity application, complication preview 1688 corresponding to complication 1672 for the calendar application, complication preview 1690 corresponding to complication 1674 for the health application, complication preview 1692 corresponding to complication 1676 for the fitness application, complication preview 1694 corresponding to complication 1678 for the time application, complication preview 1696 corresponding to complication 1680 for the weather application, complication preview 1698 corresponding to complication 1682 for the weather application, and complication preview 1699 corresponding to complication 1684 for the calendar application. - In
FIG. 16Q, while displaying fourth page 1632 of editing user interface 1622 for watch user interface 1668, device 600 receives (e.g., detects) an input 1625 directed to selecting complication preview 1688 corresponding to complication 1672 for the calendar application. In some embodiments, input 1625 is a touch input on display 602. In response to receiving input 1625, device 600 displays a sixth region 1697 of complication selection user interface 1642 corresponding to a complication group for the calendar application, as shown in FIG. 16R, where sixth region 1697 of complication selection user interface 1642 includes a complication preview 1697A. - In
FIG. 16R, complication selection user interface 1642 includes complication preview 1697A in a first shape (e.g., a first layout; a first design; a first outline) that corresponds to how the corresponding complication will be displayed if applied to watch user interface 1668 at the location within watch user interface 1668 corresponding to the current location of complication preview 1688. In some embodiments, in accordance with a determination that the current watch user interface (e.g., watch user interface 1668) is of a first type (e.g., a watch user interface having a first type of layout, design, and/or configuration), a respective complication preview shown in the complication selection user interface (e.g., complication preview 1697A) includes a graphical representation of the corresponding respective complication in the first shape. In some embodiments, the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface for which the corresponding respective complication is to be used. In some embodiments, the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complication regions within the respective watch user interface for which the respective complication is being used. - In
FIG. 16Q, while displaying fourth page 1632 of editing user interface 1622 for watch user interface 1668, device 600 receives (e.g., detects) an input 1627 directed to selecting complication preview 1698 corresponding to complication 1682 for the weather application. In some embodiments, input 1627 is a touch input on display 602. In response to receiving input 1627, device 600 displays a region 1693 of complication selection user interface 1642 corresponding to a complication group for the weather application, as shown in FIG. 16S, where region 1693 of complication selection user interface 1642 includes complication previews 1693A-1693D. - In
FIG. 16S, complication selection user interface 1642 includes complication previews 1693A-1693D in a second shape (e.g., a second layout; a second design; a second outline) that corresponds to how the corresponding complication will be displayed if applied to watch user interface 1668 at the location within watch user interface 1668 corresponding to the current location of complication preview 1698. In some embodiments, in accordance with a determination that the current watch user interface (e.g., watch user interface 1668) is of the first type, corresponding respective complication previews shown in the complication selection user interface (e.g., complication previews 1693A-1693D) include respective graphical representations of the corresponding respective complications in the second shape, different from the first shape. In some embodiments, the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface for which the corresponding respective complication is to be used. In some embodiments, the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complication regions within the respective watch user interface for which the respective complication is being used. - In
FIG. 16S, while displaying region 1693 of complication selection user interface 1642, device 600 receives (e.g., detects) an input 1631 directed to selecting complication preview 1693C. In some embodiments, input 1631 is a touch input on display 602. In response to receiving input 1631, device 600 visually indicates that complication preview 1693C has been selected, as shown in FIG. 16T (e.g., complication preview 1693C is outlined, highlighted, etc., compared to other complication previews to visually distinguish complication preview 1693C from other complication previews). - In
FIG. 16T, while complication preview 1693C is selected, device 600 receives (e.g., detects) an input 1633 directed to affordance 1650 for exiting complication selection user interface 1642 with the newly-selected settings. In some embodiments, input 1633 is a touch input on display 602. In response to receiving input 1633, device 600 displays fourth page 1632 (e.g., an editing mode for changing one or more complications of watch user interface 1668) of editing user interface 1622, as shown in FIG. 16U, where complication preview 1698 for watch user interface 1668 now corresponds to complication preview 1693C selected in FIGS. 16S-16T. - In
FIG. 16U, while displaying fourth page 1632 of editing user interface 1622, device 600 receives (e.g., detects) an input 1635 directed to changing the current page of editing user interface 1622 to third page 1630 (e.g., an editing mode for changing a color of watch user interface 1668). In some embodiments, input 1635 includes a gesture (e.g., a horizontal swipe on display 602; a rotational input on rotatable input mechanism 603). In response to detecting input 1635, device 600 displays third page 1630 of editing user interface 1622, including representation 1691 of a layout of watch user interface 1668, as shown in FIG. 16V. - In
FIG. 16V, third page 1630 of editing user interface 1622 includes a navigable (e.g., scrollable) user interface element 1689 that includes a plurality of selectable colors (e.g., to be used as a background color for watch user interface 1668; to be applied as a color scheme to watch user interface 1668). In some embodiments, user interface element 1689 includes a color wheel with colors represented in selectable circles. - In
FIG. 16V, while displaying third page 1630 of editing user interface 1622 including user interface element 1689, device 600 receives (e.g., detects) an input 1637. In some embodiments, input 1637 is a rotational input on rotatable input mechanism 603, as shown in FIG. 16V. In some embodiments, input 1637 is a touch input, such as a swipe or pinch input. - In response to (e.g., and while) receiving
input 1637, device 600 navigates through the plurality of selectable colors in user interface element 1689. In some embodiments, as the plurality of selectable colors are being navigated via user interface element 1689, device 600 indicates (e.g., by highlighting; by bolding; by visually emphasizing) the currently-selected color. - In some embodiments, in response to receiving
input 1637, device 600 navigates through the plurality of selectable colors in user interface element 1689 to an end (e.g., top or bottom) of user interface element 1689, as shown in FIG. 16W. In some embodiments, user interface element 1689 includes, at the end of user interface element 1689, an indication 1687 that more colors are available for selection. In FIG. 16W, in response to reaching the end of user interface element 1689, device 600 displays an affordance 1685 that, when selected, causes display of the additional selectable colors. - In
FIG. 16W, while displaying affordance 1685, device 600 receives (e.g., detects) an input 1639 directed to affordance 1685. In some embodiments, input 1639 is a touch input on display 602. In response to detecting input 1639, device 600 displays an additional color selection user interface 1683 that includes one or more groups (e.g., group 1681) of additional selectable colors (e.g., group 1681 including at least additional selectable colors 1681A-1681D), as shown in FIG. 16X. In some embodiments, additional color selection user interface 1683 can be navigated (e.g., scrolled) for more groups of additional selectable colors. In some embodiments, a group of colors includes similar colors (e.g., a similar range of colors; colors of a common shade or theme). In some embodiments, a group of colors includes colors from a common period (e.g., a particular season of a particular year). In some embodiments, the plurality of selectable colors included in user interface element 1689 corresponds to common colors and/or frequently used colors. In some embodiments, the plurality of additional selectable colors included in additional color selection user interface 1683 corresponds to less-common colors and/or less-frequently used colors.
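- By way of illustration only, the two-tier color catalog described above (a primary set of common/frequently used colors in user interface element 1689, plus grouped additional colors behind affordance 1685) can be modeled as follows. The names, colors, and group titles are hypothetical examples.

```swift
// Illustrative sketch of the two-tier color picker data model.
struct ColorGroup {
    let title: String       // e.g., a shared shade, theme, or seasonal collection
    let colors: [String]
}

struct ColorCatalog {
    let commonColors: [String]          // shown first in the navigable element
    let additionalGroups: [ColorGroup]  // shown after the "more colors" affordance
}

let catalog = ColorCatalog(
    commonColors: ["Red", "Orange", "Yellow", "Green", "Blue", "Purple"],
    additionalGroups: [
        ColorGroup(title: "Autumn 2020", colors: ["Rust", "Ochre", "Plum"]),
        ColorGroup(title: "Pastels", colors: ["Mint", "Lilac", "Blush"]),
    ]
)
```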
- FIG. 16Y illustrates a second device 600B (e.g., a smartphone) displaying, via a display 602B, a first user interface 1679 of a companion application. In some embodiments, device 600B is paired with device 600. In some embodiments, the companion application on device 600B can be used to edit, configure, and/or modify settings or features of device 600 and/or applications that are installed on device 600. - In some embodiments,
first user interface 1679 includes a watch user interface representation 1677 corresponding to a representation of a watch user interface (e.g., watch user interface 1668; a watch user interface that is currently selected to be used on device 600). In some embodiments, first user interface 1679 includes a colors region 1675 that includes a plurality of selectable colors that can be applied to the watch user interface (e.g., as a background color or for a color scheme). Similar to third page 1630 of editing user interface 1622 of device 600, a color can be selected from colors region 1675 to be applied to the watch user interface. In some embodiments, first user interface 1679 includes a complications region 1673 that indicates and enables changes to the current complications that are selected for the watch user interface. -
FIG. 16Z illustrates device 600B displaying, via display 602B, a second user interface 1671 of the companion application, where second user interface 1671 includes a selectable user interface element 1669 for managing/editing one or more colors of the watch user interface. In FIG. 16Z, while displaying second user interface 1671 of the companion application, device 600B receives (e.g., detects) an input 1641 directed to user interface element 1669. In some embodiments, input 1641 is a touch input on display 602B. In response to receiving (e.g., detecting) input 1641, device 600B displays, via display 602B, an additional color selection user interface 1667 of the companion application, as shown in FIG. 16AA. - Similar to additional color
selection user interface 1683 described above with reference to FIG. 16X, additional color selection user interface 1667 of FIG. 16AA includes one or more groups (e.g., groups 1665 and 1663) of additional selectable colors (e.g., group 1665 including additional selectable colors 1665A-1665F and group 1663 including at least additional selectable colors 1663A-1663D), as shown in FIG. 16AA. In some embodiments, additional color selection user interface 1667 can be navigated (e.g., scrolled) for more groups of additional selectable colors. In some embodiments, the plurality of selectable colors included in colors region 1675 of first user interface 1679 of the companion application corresponds to common colors and/or frequently used colors. In some embodiments, the plurality of additional selectable colors included in additional color selection user interface 1667 of the companion application corresponds to less-common colors and/or less-frequently used colors. -
FIGS. 16AB-16AE, as described below, illustrate device 600 displaying, in region 1693 of complication selection user interface 1642, the complication previews 1693A-1693D for respective corresponding complications of the weather application, where the shape of each respective complication preview is automatically adjusted or modified. - In
FIG. 16AB, complication previews 1693A-1693D corresponding to complications of the weather application are displayed with a first shape (e.g., a first layout; a first design; a first type of outline). In some embodiments, complication previews 1693A-1693D in the first shape, as in FIG. 16AB, correspond to a first complication region (e.g., the top-left-corner region, thus being the top-left-corner complication) of watch user interface 1668. - In
FIG. 16AC, complication previews 1693A-1693D corresponding to complications of the weather application are displayed, in complication selection user interface 1642, with a second shape. In some embodiments, complication previews 1693A-1693D in the second shape, as in FIG. 16AC, correspond to a second complication region (e.g., the top-right-corner region, thus being the top-right-corner complication) of watch user interface 1668. - In
FIG. 16AD, complication preview 1693B corresponding to a complication of the weather application is displayed, in complication selection user interface 1642, with a third shape. In some embodiments, complication preview 1693B in the third shape, as in FIG. 16AD, corresponds to a third complication region (e.g., the top-bezel region, thus being the top-bezel complication) of watch user interface 1668. - In
FIG. 16AD, complication previews 1693C-1693D corresponding to complications of the weather application are displayed, in complication selection user interface 1642, with a fourth shape. In some embodiments, complication previews 1693C-1693D in the fourth shape, as in FIG. 16AD, correspond to a fourth complication region (e.g., one of the (e.g., 4 possible) inner-dial regions, thus being one of the inner-dial complications) of watch user interface 1668. - In
FIG. 16AE, complication previews 1693A-1693D corresponding to complications of the weather application are displayed, in complication selection user interface 1642, with the fourth shape. In some embodiments, complication previews 1693A-1693D in the fourth shape, as in FIG. 16AE, correspond to the fourth complication region (e.g., one of the inner-dial regions) of watch user interface 1668. In some embodiments, as shown by complication preview 1693B in FIG. 16AD and complication preview 1693B in FIG. 16AE, the same complication for the same application can be included in a respective watch user interface with different shapes based on the type of the respective watch user interface and/or the respective complication region within the respective watch user interface for which the complication is being used.
- As mentioned above, in some embodiments, the shape for the respective complication preview is (e.g., at least partly) determined based on the layout, design, and/or configuration of the respective watch user interface (e.g., watch user interface 1668) for which the corresponding respective complication is to be used. As also mentioned above, in some embodiments, the shape for a respective complication preview is (e.g., at least partly) determined based on the respective complication region of the one or more complication regions within the respective watch user interface for which the respective complication is being used. A sketch of this shape-selection rule follows.
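- The following sketch treats the preview shape as a function of watch face type and complication region, consistent with FIGS. 16AB-16AE. It is illustrative only: the enum cases and the particular mapping are hypothetical, and the disclosure does not prescribe this implementation.

```swift
// Illustrative sketch: preview shape as a function of face type and region.
enum FaceType { case first, second }
enum ComplicationRegion { case topLeftCorner, topRightCorner, topBezel, innerDial }
enum PreviewShape { case corner, bezelArc, circularInset }

func previewShape(on face: FaceType, at region: ComplicationRegion) -> PreviewShape {
    switch (face, region) {
    case (.first, .topLeftCorner), (.first, .topRightCorner):
        return .corner          // e.g., previews 1693A-1693D in FIGS. 16AB-16AC
    case (.first, .topBezel):
        return .bezelArc        // e.g., preview 1693B in FIG. 16AD
    case (.first, .innerDial), (.second, _):
        return .circularInset   // e.g., inner-dial previews in FIGS. 16AD-16AE;
                                // the second face type uses circular previews
                                // everywhere in this hypothetical mapping
    }
}
```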
-
FIGS. 17A-17D are a flow diagram illustrating methods of enabling configuration of a user interface (e.g., editing a watch user interface), in accordance with some embodiments. Method 1700 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component and one or more input devices (e.g., including a touch-sensitive surface that is integrated with the display generation component; a mechanical input device; a rotatable input device; a rotatable and depressible input device; a microphone). Some operations in method 1700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. - As described below,
method 1700 provides an intuitive way for managing user interfaces related to time. The method reduces the cognitive burden on a user for managing user interfaces related to time, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - In some embodiments, prior to displaying the watch face editing user interface (e.g., 1622), the computer system (e.g., 600) displays or causes display of the watch user interface (e.g., 1606, 1668) (e.g., a watch face). In some embodiments, the watch user interface includes a dial that indicates a current time. In some embodiments, the watch user interface includes one or more complications (e.g., 1608, 1610, 1612, 1614, 1670, 1672, 1674, 1676, 1678, 1680, 1682, 1684) corresponding to respective applications that indicate respective sets of information (e.g., a date; a calendar event; weather; contacts). In some embodiments, the complications are displayed at respective complication regions within the watch user interface.
- In some embodiments, while displaying the watch user interface (e.g., 1606, 1668), the computer system (e.g., 600) detects an input (e.g., 1601, 1603) (e.g., a press input; a press-and-hold input) on the watch user interface. In some embodiments, in response to detecting the input on the watch user interface, the computer system displays or causes display of the watch face editing user interface (e.g., 1622).
- The computer system (e.g., 600) displays (1702), via the display generation component (e.g., 602), a watch face editing user interface (e.g., 1622), wherein the watch face editing user interface includes a representation of a layout of a watch user interface (e.g., 1624) (e.g., a watch face; a user interface for a watch that includes an indication of a time and/or date) including a time region for displaying a current time and one or more complication regions for displaying complications on the watch user interface. In some embodiments, a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications). In some embodiments, complications provide data obtained from an application. In some embodiments, a complication includes an affordance that, when selected, launches a corresponding application. In some embodiments, a complication is displayed at a fixed, predefined location on the display. In some embodiments, complications occupy respective locations at particular regions of a watch face (e.g., lower-right, lower-left, upper-right, and/or upper-left).
- While displaying the watch face editing user interface (e.g., 1622) (1704), the computer system (e.g., 600) detects (1706), via the one or more input devices, a first input (e.g., 1611, 1617) (e.g., a first user selection) directed to a complication region of the one or more complication regions (e.g., regions corresponding to
complications complications - In response to detecting the first input (e.g., 1611, 1617) directed to the complication region of the one or more complication regions (1708), the computer system (e.g., 600) displays (1710) a complication selection user interface (e.g., 1642).
- Displaying the complication selection user interface (e.g., 1642) includes (1710) concurrently displaying an indication (e.g., label/header of region 1644, 1658, 1662, 1664, 1666) of (e.g., the name of; a graphical indication of; an icon corresponding to; a category of) a first application (e.g., an application that is installed on, can be launched on, and/or is accessible from the computer system) (1712), a first complication preview (e.g., 1644A-1644E) (e.g., a graphical preview of how the first complication would be displayed in the watch user interface) corresponding to a first complication that is configured to display, on the watch user interface (e.g., 1606, 1668), a first set of information obtained from the first application (e.g., information based on a feature, operation, and/or characteristic of the first application), wherein the first complication preview includes a graphical representation of the first complication displaying the first set of information (e.g., an exemplary representation of the first complication with an example of the first set of information) (1714), and a second complication preview (e.g., a graphical preview of how the second complication would be displayed in the watch user interface) corresponding to a second complication that is configured to display, on the watch user interface, a second set of information, different from the first set of information, obtained from the first application (e.g., information based on a feature, operation, and/or characteristic of the first application), wherein the second complication preview includes a graphical representation of the second complication displaying the second set of information (e.g., an exemplary representation of the second complication with an example of the second set of information) (1716). Displaying the complication selection user interface that includes the indication of the first application, the first complication preview, and the second complication preview (e.g., together in the same region of the complication selection user interface, displays as a group) enables a user to quickly and easily recognize that the first and second complication previews correspond to complications related to the first application, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to view related/associated items in the user interface together without needing to navigate to other portions of the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- While displaying the complication selection user interface (e.g., 1642) (1718), the computer system (e.g., 600) detects (1720), via the one or more input devices (e.g., via a rotatable input device (e.g., 603); via a touch-sensitive surface), a second input (e.g., 1613) directed to selecting a respective complication preview (e.g., 1644A-1644E).
- In response to detecting the second input (e.g., 1613) directed to selecting the respective complication preview (e.g., 1644A-1644E) (1722), the computer system (e.g., 600) displays (1724), via the display generation component (e.g., 602), a representation of the watch user interface (e.g., as shown in
FIGS. 16F and 16Q) with a representation of a selected complication corresponding to the respective complication preview displayed at the first complication region of the watch user interface (e.g., 1606, 1668).
- In accordance with a determination that the respective complication preview is the second complication preview, the second complication is displayed in the first complication region of the watch user interface (e.g., 1606, 1668) (1728). Displaying (e.g., automatically, without user input) a respective complication in a respective complication region of the watch user interface based on the selected complication preview enables a user to conveniently and efficiently manage and change complications of the watch user interface. Providing improved control options without cluttering the UI with additional displayed controls enhances the operability of the device.
- In some embodiments, while displaying the complication selection user interface (e.g., 1642) (1730), the computer system (e.g., 600) detects (1732), via the one or more input devices (e.g., via a rotatable input device; via a touch-sensitive surface), a third input (e.g., 1621) (e.g., a rotational input on the rotatable input device (e.g., 603); a touch scrolling input on the touch-sensitive surface). In some embodiments, in response to detecting the third input (1734), the computer system navigates (e.g., scrolls) through the complication selection user interface (1736).
- In some embodiments, navigating (e.g., scrolling) through the complication selection user interface (e.g., 1642) includes (1736) concurrently displaying an indication of (e.g., the name of; a graphical indication of; an icon corresponding to; a category of) a second application (e.g., an application that is installed on, can be launched on, and/or is accessible from the computer system) (1728), a third complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699) (e.g., a graphical preview of how the third complication would be displayed in the watch user interface) corresponding to a third complication that is configured to display, on the watch user interface (e.g., 1606, 1668), a third set of information obtained from the second application (e.g., information based on a feature, operation, and/or characteristic of the second application), wherein the third complication preview includes a graphical representation of the third complication displaying the third set of information (e.g., an exemplary representation of the third complication with an example of the third set of information) (1740), and a fourth complication preview (e.g., a graphical preview of how the fourth complication would be displayed in the watch user interface) corresponding to a fourth complication that is configured to display, on the watch user interface, a fourth set of information, different from the third set of information, obtained from the second application (e.g., information based on a feature, operation, and/or characteristic of the second application), wherein the fourth complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699) includes a graphical representation of the fourth complication displaying the fourth set of information (e.g., an exemplary representation of the fourth complication with an example of the fourth set of information) (1742). Displaying the indication of the second application, the third complication preview, and the fourth complication preview (e.g., together in the same region of the complication selection user interface; together as a group of complications corresponding to the second application) in accordance with navigating (e.g., scrolling) through the complication selection user interface provides easy and efficient access to different complications that are available for selection, as related complications (complications corresponding to the same application) are grouped together within the complication selection user interface. Providing improved visual feedback enhances the operability of the device enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to view related/associated items in the user interface together without needing to navigate to other portions of the user interface) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, navigating (e.g., scrolling) through the complication selection user interface (e.g., 1642) further includes ceasing display of the first complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699) corresponding to the first complication and the second complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699) corresponding to the second complication (e.g., and other complication previews corresponding to respective complications that are configured to display, on the watch user interface (e.g., 1606, 1668) (e.g., watch face), a respective set of information obtained from the first application) (1744). In some embodiments, ceasing display of the first complication preview and the second complication preview comprises moving the first complication preview and the second complication preview off of an edge of the display generation component as the complication selection user interface is navigated (e.g., scrolled).
- In some embodiments, the indication of the first application, the first complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699), and the second complication preview (e.g., 1634, 1636, 1638, 1640, 1686, 1688, 1690, 1692, 1694, 1696, 1698, 1699) are displayed in (e.g., grouped together in) a first region (e.g., 1644, 1658, 1662, 1664, 1666) of the complication selection user interface (e.g., 1642) (e.g., where the indication of the first application is a header/label for the group), and the indication of the second application, the third complication preview, and the fourth complication preview are displayed in (e.g., grouped together in) a second region of the complication selection user interface different from the first region (e.g., where the indication of the second application is a header/label for the group) (1746). Displaying the indication of the first application, the first complication preview, and the second complication preview together in the first region of the complication selection user interface and displaying the indication of the second application, the third complication preview, and the fourth complication preview together in the second region of the complication selection user interface enable a user to view and select from the available complications in an intuitive manner. Providing additional control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first application is associated with a plurality of available complications (e.g., 1608, 1610, 1612, 1614, 1670, 1672, 1674, 1676, 1678, 1680, 1682, 1684) that are configured to display information obtained from the first application, and the plurality of available complications include the first complication and the second complication. In some embodiments, displaying the complication selection user interface includes (e.g., 1642), in accordance with a determination that the plurality of available complications that are configured to display information obtained from the first application exceeds a predetermined number (e.g., 5, 6), the computer system (e.g., 600) displays a plurality of complication previews (e.g., the plurality of complication previews includes a number of complication previews that equals the predetermined number) that each correspond to a complication of the plurality of available complication, where the plurality of complication previews does not exceed the predetermined number, and a first selectable user interface object (e.g., 1648, 1660) (e.g., a first affordance; a “show more” icon/button) that, when selected, causes display of one or more additional complication previews (e.g., 1656A) that were not included in the plurality of complication previews (e.g., the one or more additional complication previews includes previews for all of the available complications that were not included in the plurality of complication previews). In some embodiments, displaying the complication selection user interface includes, in accordance with a determination that the plurality of available complications that are configured to display information obtained from the first application does not exceed the predetermined number, displaying a second plurality of complication previews (e.g., the second plurality of complication previews includes complication previews for all of the available complications that are configured to display information obtained from the first application) that each correspond to a complication of the plurality of available complication without displaying the first selectable user interface object. Displaying the plurality of complication previews that each correspond to a complication of the plurality of available complication, where the plurality of complication previews does not exceed the predetermined number, prevents cluttering of the complication selection user interface, thereby enabling a user to access the available complications in a quicker and more efficient manner. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first application corresponds to an application (e.g., a contactable users application) for managing information of a set of contactable users (e.g., user contacts stored in and/or accessible on the computer system (e.g., 600); user contacts stored in and/or accessible from an address book), the first complication (e.g., 1608) corresponds to a first contactable user of the set of contactable users, the second complication corresponds to a second contactable user of the set of contactable users, the first complication preview and the second complication preview are displayed in an order (e.g., a predetermined order; a selected order).
- In some embodiments, displaying the complication selection user interface (e.g., 1642) includes, in accordance with a determination that the first contactable user is a user of a first type (e.g., a candidate contact, a favorite contact; a frequent contact) and that the second contactable user is not a user of the first type, the computer system (e.g., 600) displays the first complication preview prior to the second complication preview in the order. In some embodiments, displaying the complication selection user interface includes, in accordance with a determination that the first contactable user is not a user of the first type and that the second contactable user is a user of the first type, displaying the second complication preview prior to the first complication preview in the order. Displaying a complication preview corresponding to a candidate contact prior to displaying a complication preview corresponding to a non-candidate contact in the complication selection user interface provides a user with quicker and easier access to a respective complication preview corresponding to a candidate contact when navigating the complication selection user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the first application is the contactable users application, and the computer system (e.g., 600) displays or causes display of a maximum number of complication previews for the contactable users application in the complication selection user interface (e.g., 1642). In some embodiments, if there are as many or more candidate contacts (e.g., favorite contacts; frequent contacts) than the maximum number of complication previews that are concurrently shown in the complication section user interface for the contactable users application, all of the maximum number of complication previews that are shown correspond to candidate contacts (e.g., listed in alphabetical order). In some embodiments, if there are fewer candidate contacts than the maximum number of complication previews that are concurrently shown, the candidate contacts are shown first (e.g., in alphabetical order) and regular contacts are shown for the remaining complication previews (e.g., separately in alphabetical order).
- In some embodiments, in accordance with a determination that the watch user interface (e.g., 1606, 1668) is of a first type (e.g., a watch face having a first type of layout, design, and/or configuration), the first complication preview includes the graphical representation of the first complication in a first shape (e.g., 1693A-1693D in FIG. 16AB) (e.g., a first layout; a first design; a first outline) and the second complication preview includes the graphical representation of the second complication in the first shape. In some embodiments, in accordance with a determination that the watch user interface is of a second type (e.g., a watch face having a second type of layout, design, and/or configuration), the first complication preview includes the graphical representation of the first complication in a second shape (e.g., 1693A-1693D in FIG. 16AC) (e.g., a second layout; a second design; a second outline) and the second complication preview includes the graphical representation of the second complication in the second shape, wherein the second shape is different from the first shape. In some embodiments, the type of shape (e.g., layout; design; outline) for complication previews is (e.g., at least partly) determined based on the layout, design, and/or configuration of the watch face for which the corresponding complications are to be used. Including, in the complication selection user interface, complication previews that include graphical representations of a respective complication in a respective shape, where the type of the respective shape is at least partly determined based on the layout, design, and/or configuration of the current watch user interface, enables a user to conveniently preview, before selecting a particular complication for use, how a respective complication would appear when used in the watch user interface. Providing improved visual feedback and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, in accordance with a determination that the complication region (selected via the first input) of the one or more complication regions corresponds to a first complication region, the first complication preview includes the graphical representation of the first complication in a third shape (e.g., 1693B in FIG. 16AD) (e.g., a third layout; a third design; a third outline) and the second complication preview includes the graphical representation of the second complication in the third shape. In some embodiments, in accordance with a determination that the complication region (selected via the first input) of the one or more complication regions corresponds to a second complication region different from the first complication region, the first complication preview includes the graphical representation of the first complication in a fourth shape (e.g., 1693A-1693D in FIG. 16AE) (e.g., a fourth layout; a fourth design; a fourth outline) and the second complication preview includes the graphical representation of the second complication in the fourth shape, wherein the fourth shape is different from the third shape. In some embodiments, the type of shape (e.g., layout; design; outline) for complication previews is (e.g., at least partly) determined based on the respective complication region, of the one or more complication regions, within a watch face for which the respective complication is being used.
- In some embodiments, displaying the complication selection user interface (e.g., 1642) further includes displaying the indication of the first application prior to (e.g., above; as a header) the first complication preview and the second complication preview (e.g., prior to all complication previews that are associated with the first application). In some embodiments, the indication of the first application is indicative of (e.g., represents; is the name for; is the header for) a complication preview group comprising the first complication preview and the second complication preview. Displaying the indication of the first application prior to the first complication preview and the second complication preview enables a user to quickly and easily recognize the corresponding application for the displayed complications, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to more easily recognize and categorize the displayed complications) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
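- The shape-selection behavior described in the two preceding paragraphs can be pictured as a pure function from watch face type and complication region to a preview shape. The Swift sketch below is illustrative only; the enum cases and shape names are invented for the example and not taken from the disclosure:

```swift
// Hypothetical mapping: every preview in the selection UI shares the shape
// implied by the current face layout and the region being edited, so the
// user sees how a complication would actually render before choosing it.
enum WatchFaceType { case firstType, secondType }
enum ComplicationRegion { case corner, bezel, inline }
enum PreviewShape { case arc, circular, rectangular }

func previewShape(for faceType: WatchFaceType,
                  region: ComplicationRegion) -> PreviewShape {
    switch (faceType, region) {
    case (.firstType, .corner): return .arc          // assumed pairing
    case (.firstType, _):       return .circular     // assumed pairing
    case (.secondType, _):      return .rectangular  // assumed pairing
    }
}
```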
- In some embodiments, while displaying the watch face editing user interface (e.g., 1622), the computer system (e.g., 600) displays, via the display generation component (e.g., 602) (e.g., at a top region of the display generation component), an indication (e.g., “DIAL” or “COLOR” in FIGS. 16V and 16W; an indication of a color editing user interface; an indication of a dial editing user interface) of an adjacent editing tab corresponding to an adjacent user interface that is different from a user interface for editing one or more complications of the watch user interface. In some embodiments, the editing user interface different from the watch face editing user interface is configured to edit a different aspect/characteristic of the watch face other than the complications of the watch face. In some embodiments, while displaying the watch face editing user interface, the computer system detects, via the one or more input devices, a fourth input (e.g., a swipe input detected via a touch-sensitive surface that is integrated with the display generation component) directed to navigating to a different editing tab. In some embodiments, while displaying the watch face editing user interface, in response to detecting the fourth input, the computer system displays, via the display generation component, the adjacent user interface, where the adjacent user interface is for editing a characteristic (e.g., a different aspect; a different feature) of the watch user interface different from the one or more complications of the watch user interface. Providing, in the watch face editing user interface, adjacent editing tabs for editing different aspects/characteristics of the watch user interface enables a user to quickly and easily access the other editing tabs for editing the different aspects/characteristics (e.g., without needing to exit the watch face editing user interface). Providing improved control options and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the computer system (e.g., 600) displays, via the display generation component (e.g., 602), a color editing user interface (e.g., 1630) (e.g., different from the watch face editing user interface). In some embodiments, the color editing user interface can be accessed via one or more swipe inputs from the watch face editing user interface (e.g., 1622) (e.g., the watch face editing user interface and color editing user interface are different tabs within a watch face editing mode). In some embodiments, the color editing user interface is accessed while the computer system is in watch face editing mode. In some embodiments, the color editing user interface is a tab within a plurality of (e.g., adjacent) tabs (e.g., style tab; dial tab; color tab; complication tab) that can be accessed while the computer system is in watch face editing mode. In some embodiments, the color editing user interface can be accessed via a companion application on a second computer system (e.g., a second electronic device, such as a smartphone) that is paired with the computer system. Providing the color editing user interface that can be accessed via one or more swipe inputs from the watch face editing user interface provides quick and easy access for editing colors of a current watch user interface that is being edited (e.g., without needing to exit the watch face editing user interface).
Providing improved control options and reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the color editing user interface (e.g., 1630) includes the representation of the layout of the watch user interface (e.g., 1624) displayed in a first color scheme based on a first color, and a first plurality of selectable colors (e.g., 1689) (e.g., displayed as a navigable list of colors, with each color represented in a selectable circle) for the watch user interface (e.g., 1606, 1668) (e.g., a watch face), including the first color. Providing the representation of the layout of the watch user interface in the color editing user interface enables a user to easily view changes in color that are applied to the current watch user interface, thereby enhancing the operability of the device and making the color editing process more efficient (e.g., by enabling the user to more easily view the changes that are being made) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently. In some embodiments, the color editing user interface is used to edit/modify a color/color scheme of (e.g., of the background of) the layout of the watch user interface. In some embodiments, the first color is the currently-selected color. In some embodiments, if the first color is the currently-selected color, the computer system (e.g., 600) indicates (e.g., by highlighting; by bolding; by visually emphasizing), in the first plurality of colors, that the first color is the currently-selected color.
- In some embodiments, the computer system (e.g., 600) detects, via the one or more input devices (e.g., via a rotatable input device (e.g., 603); via a touch-sensitive surface), a fifth input (e.g., 1637) (e.g., a rotational input on the rotatable input device; a touch scrolling input on the touch-sensitive surface) directed to navigating (e.g., scrolling) through the first plurality of selectable colors (e.g., 1689). Enabling the plurality of selectable colors to be navigated (e.g., scrolled) via a rotational input on a rotatable input device provides an intuitive method for navigating through and selecting from the plurality of selectable colors. Providing improved control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, in response to detecting the fifth input (e.g., 1637), the computer system (e.g., 600) navigates (e.g., scrolls) through the first plurality of colors (e.g., 1689) from the first color to a second color different from the first color. In some embodiments, the computer system also indicates (e.g., by highlighting; by bolding; by visually emphasizing), in the first plurality of colors, that the second color is now the currently-selected color. In some embodiments, in response to detecting the fifth input, the computer system displays the representation of the layout of the watch user interface (e.g., 1624) in a second color scheme based on the second color. Providing a color editing user interface that includes the representation of the layout of the watch user interface, where the displayed representation of the layout of the watch user interface is adjusted based on a selected color scheme from the color editing user interface, enables a quick and easy method for editing the color scheme of the current watch user interface. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, subsequent to detecting the fifth input (e.g., 1637), the computer system (e.g., 600) detects, via the one or more input devices (e.g., via a rotatable input device (e.g., 603); via a touch-sensitive surface), a sixth input (e.g., a continuation of the fifth input) directed to navigating (e.g., scrolling) through the first plurality of selectable colors (e.g., 1689). In some embodiments, in response to detecting the sixth input, the computer system navigates (e.g., scrolls) through the first plurality of colors to display a second selectable user interface object (e.g., 1685) (e.g., a second affordance; a “show more” icon/button). In some embodiments, the second selectable user interface object is displayed with (e.g., with the same shape/layout/design as) other colors in the first plurality of colors. In some embodiments, the second selectable user interface object is displayed as the last color in the list of the first plurality of colors. In some embodiments, the computer system detects, via the one or more input devices, an activation (e.g., selection) of the second selectable user interface object. In some embodiments, in response to detecting the activation of the second selectable user interface object, the computer system displays, via the display generation component, a second plurality of selectable colors for the watch user interface that is different from the first plurality of selectable colors. In some embodiments, the first plurality of colors includes common colors and/or frequently used colors while the second plurality of colors includes less-common colors and/or less-frequently used colors. Providing the second selectable user interface object which, when activated, causes display of the second plurality of selectable colors prevents cluttering of the first plurality of selectable colors while also enabling a user to easily access additional selectable colors that were not included in the first plurality of selectable colors. Providing additional control options without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
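- One way to picture the color list described above is as a sequence of swatch entries terminated by a "show more" entry that swaps in an extended palette. The Swift sketch below is an assumption-laden illustration, not the claimed implementation:

```swift
// A first page of common colors ends in a "show more" entry, rendered like
// any other swatch; activating it swaps in the less-common extended palette.
enum ColorListEntry {
    case color(String)  // a selectable color, named for simplicity
    case showMore       // the second selectable user interface object
}

func colorEntries(common: [String],
                  extended: [String],
                  showingExtended: Bool) -> [ColorListEntry] {
    if showingExtended {
        // After the "show more" affordance is activated, display the
        // less-common and/or less-frequently used colors.
        return extended.map(ColorListEntry.color)
    }
    // The "show more" entry is displayed as the last item in the list.
    return common.map(ColorListEntry.color) + [.showMore]
}
```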
- Note that details of the processes described above with respect to method 1700 (e.g., FIGS. 17A-17D) are also applicable in an analogous manner to the methods described above. For example, method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700. For example, a respective complication of a watch user interface as described with reference to FIGS. 6A-6H can be changed to a different complication via the process for managing complications described with reference to FIGS. 16A-16AE. For another example, method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700. For example, a respective complication of a watch user interface as described with reference to FIGS. 8A-8M can be changed to a different complication via the process for managing complications described with reference to FIGS. 16A-16AE. For another example, method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700. For example, one or more characteristics or features of a user interface that includes an indication of time and a graphical representation of a character as described with reference to FIGS. 10A-10AC can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE. For another example, method 1300 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700. For example, one or more characteristics or features of a time user interface as described with reference to FIGS. 12A-12G can be edited via the process for editing characteristics or features of a watch user interface as described with reference to FIGS. 16A-16AE. For another example, method 1500 optionally includes one or more of the characteristics of the various methods described above with reference to method 1700. For example, a respective complication of a watch user interface with a background as described with reference to FIGS. 14A-14AD can be changed to a different complication via the process for managing complications described with reference to FIGS. 16A-16AE. For brevity, these details are not repeated below.
- FIGS. 18A-18J illustrate exemplary user interfaces for sharing a configuration of a user interface with an external device, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGS. 19A-19C.
- At FIG. 18A, electronic device 600 (e.g., "Jane's Watch") displays watch face user interface 1800 on display 602. Watch face user interface 1800 includes graphical representation 1802 of a character (e.g., a first character in a set of characters configured to be displayed on watch face user interface 1800). In FIG. 18A, watch face user interface 1800 includes time indicator 1804 and complication 1806A (e.g., corresponding to a calendar application) and complication 1806B (e.g., corresponding to a weather application). Watch face user interface 1800 includes a default color (e.g., black) and background 1808 having colors that are different from the default color (e.g., colors displayed by electronic device 600 in accordance with user inputs while an editing user interface is displayed by electronic device 600). At FIG. 18A, electronic device 600 detects user input 1850A (e.g., a long press gesture) on watch face user interface 1800. In response to detecting user input 1850A, electronic device 600 displays user interface 1810, as shown at FIG. 18B.
- At FIG. 18B, user interface 1810 includes first representation 1800A of watch face user interface 1800 (e.g., corresponding to a set or collection of avatar characters configured to be sequentially displayed on watch face user interface 1800) and second representation 1800B of an additional watch face user interface configured to be displayed by electronic device 600 (e.g., a watch face user interface corresponding to a set or collection of animal-like characters and/or emojis configured to be sequentially displayed on the watch face user interface). First representation 1800A of watch face user interface 1800 includes graphical representations of multiple characters (e.g., a collection and/or a set of characters) configured to be displayed on watch face user interface 1800 (e.g., displayed sequentially based on electronic device 600 detecting a change in activity state and/or a user input), as indicated by the multiple characters included on first representation 1800A. User interface 1810 includes watch face indicator 1812 that includes a name associated with watch face user interface 1800 (e.g., “Avatar”). User interface 1810 also includes share affordance 1814 and edit affordance 1816. At FIG. 18B, electronic device 600 detects user input 1850B (e.g., a tap gesture) on share affordance 1814. In response to detecting user input 1850B, electronic device 600 displays sharing user interface 1818, as shown at FIG. 18C.
- At FIG. 18C, sharing user interface 1818 enables selection of a recipient for receiving information associated with watch face user interface 1800. For example, sharing user interface 1818 includes affordances 1820A-1820C corresponding to respective recipients (e.g., contactable users, information for which is stored in electronic device 600) for receiving information associated with watch face user interface 1800. At FIG. 18C, while electronic device 600 displays sharing user interface 1818 including affordances 1820A-1820C, electronic device 600 detects user input 1850C (e.g., a tap gesture) corresponding to selection of affordance 1820C corresponding to recipient Ann Smith or an external device associated with recipient Ann Smith. In response to detecting user input 1850C, electronic device 600 displays messaging user interface 1822 of a messaging application of electronic device 600, as shown at FIG. 18D.
- At FIG. 18D, messaging user interface 1822 includes a message 1824 having representation 1826 of watch face user interface 1800. Messaging user interface 1822 includes indicator 1828 that indicates the recipient (e.g., Ann Smith) of message 1824. Additionally, messaging user interface 1822 includes send affordance 1830 for initiating transmission of message 1824. At FIG. 18D, electronic device 600 detects user input 1850D (e.g., a tap gesture) corresponding to selection of send affordance 1830. In response to detecting user input 1850D, electronic device 600 initiates a process for sending message 1824 to the selected recipient (e.g., external device 1832 (e.g., Ann's Watch)). Message 1824 includes representation 1826 of watch face user interface 1800. In addition to transmitting message 1824 and representation 1826, electronic device 600 also transmits data and/or information associated with watch face user interface 1800 to external device 1832. For example, electronic device 600 transmits information associated with a background of watch face user interface 1800 (e.g., color and/or size of background), a font of watch face user interface 1800 (e.g., a font for a date and/or time displayed by watch face user interface 1800), a position of a time indicator and/or complications of watch face user interface 1800, applications corresponding to complications of watch face user interface 1800, and/or customizations to complications of watch face user interface 1800 (e.g., colors and/or size of complications).
- As discussed below, in some embodiments, electronic device 600 transmits information and/or data indicative of graphical representation 1802 of a character of watch face user interface 1800. In particular, electronic device 600 transmits information and/or data indicative of (e.g., that defines) graphical representation 1802 of the character of watch face user interface 1800 when watch face user interface 1800 is configured to display a graphical representation of a single character without transitioning between display of graphical representations of multiple characters. Electronic device 600 forgoes transmission of information and/or data indicative of graphical representation 1802 of a character of watch face user interface 1800 when watch face user interface 1800 is configured to transition between display of respective graphical representations for multiple characters (e.g., a set of predetermined characters and/or a collection of predetermined characters). For example, electronic device 600 transmits information associated with (e.g., that defines) a graphical representation of a character for watch face user interfaces that are configured to display a graphical representation of only a single character. In some embodiments, electronic device 600 forgoes transmission of information associated with any graphical representation of any character for watch face user interfaces that transition between display of graphical representations of multiple characters (e.g., in response to detecting a change in activity state of electronic device 600 and/or in response to user input). While electronic device 600 transmits and/or forgoes transmission of information associated with graphical representations of characters based on a type of watch face user interface (e.g., a single character watch face user interface or a collection of characters watch face user interface), in some embodiments, electronic device 600 transmits other data associated with watch face user interface 1800 (e.g., information related to background, fonts, and/or complications) regardless of whether information associated with a graphical representation of a character is transmitted or not.
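- The sharing rule just described can be summarized as: always send the face's non-character characteristics, and send character data only when the face is of the single-character type. A minimal Swift sketch, assuming a face counts as single-character when it has fewer than a threshold number of character recipes (all type and parameter names here are hypothetical):

```swift
// Non-character characteristics are always included in the payload; the
// character recipes travel only for single-character faces.
struct CharacterRecipe { var features: [String: String] }

struct WatchFacePayload {
    let background: String
    let font: String
    let complications: [String]
    let characterRecipes: [CharacterRecipe]?  // nil when not shared
}

func sharePayload(recipes: [CharacterRecipe],
                  background: String,
                  font: String,
                  complications: [String],
                  threshold: Int = 2) -> WatchFacePayload {
    // Fewer recipes than the threshold: treat as a single-character face
    // and include the character data in the transmission.
    let includeCharacters = recipes.count < threshold
    return WatchFacePayload(background: background,
                            font: font,
                            complications: complications,
                            characterRecipes: includeCharacters ? recipes : nil)
}
```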
- At FIG. 18E, external device 1832 (e.g., Ann's Watch) receives message 1824 and representation 1826 of watch face user interface 1800. For example, external device 1832 displays message 1824 and representation 1826 in a messaging user interface 1831 on display 1833 of external device 1832. Since watch face user interface 1800 includes graphical representations of multiple characters (e.g., watch face user interface 1800 is configured to transition between display of graphical representations of characters included in a collection of characters), external device 1832 does not receive information related to graphical representation 1802 and/or graphical representations of other characters associated with watch face user interface 1800. At FIG. 18E, external device 1832 detects user input 1834 (e.g., a tap gesture) corresponding to selection of representation 1826. In response to detecting user input 1834, external device 1832 displays user interface 1836, as shown at FIG. 18F.
- At FIG. 18F, user interface 1836 includes representation 1838, watch face indicator 1840, and add watch face affordance 1842. At FIG. 18F, external device 1832 detects user input 1844 (e.g., a tap gesture) corresponding to selection of add watch face affordance 1842. In response to detecting user input 1844, external device 1832 adds a new watch face user interface to a watch face library of external device 1832 and displays watch face user interface 1846, as shown at FIG. 18G.
- At FIG. 18G, external device 1832 displays watch face user interface 1846. Watch face user interface 1846 includes time indicator 1804 and complication 1806A (e.g., corresponding to a calendar application) and complication 1806B (e.g., corresponding to a weather application). Watch face user interface 1846 further includes a default color (e.g., black) and background 1808 having colors that are different from the default color (e.g., colors displayed by electronic device 600 in response to user inputs while an editing user interface is displayed by electronic device 600). As such, watch face user interface 1846 includes features that are the same as watch face user interface 1800. At FIG. 18G, time indicator 1804 and complication 1806A and complication 1806B of watch face user interface 1846 include a same position, font, and/or size as watch face user interface 1800. Additionally, background 1808 of watch face user interface 1846 includes a same color and/or size as watch face user interface 1800. As such, electronic device 600 transmits information related to watch face user interface 1800 to external device 1832 that is not indicative of graphical representation 1802 of watch face user interface 1800. Because watch face user interface 1800 is associated with a collection of graphical representations of multiple characters, electronic device 600 forgoes transmission of information associated with graphical representation 1802 and information associated with any other graphical representations of characters associated with watch face user interface 1800.
- At FIG. 18G, watch face user interface 1846 includes graphical representation 1848 of a character that is different from graphical representation 1802 of the character of watch face user interface 1800 (e.g., since information defining the characters of watch face user interface 1800 is not provided). In some embodiments, watch face user interface 1846 is associated with a collection of graphical representations of characters that are included and/or stored on external device 1832, or stored in an account associated with external device 1832. In some embodiments, watch face user interface 1846 is associated with a collection of graphical representations of characters that are selected randomly from a library of characters (e.g., stored on external device 1832 and/or stored on another external device different from external device 1832 (e.g., a server)).
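- On the receiving side, the behavior described above amounts to a fallback: use the shared character data if present, otherwise substitute characters from the local library (randomly, in some embodiments). A hedged Swift sketch with invented names:

```swift
// If the shared payload carried character data, use it; otherwise fill the
// face from the receiving device's own character library.
struct AvatarCharacter { let name: String }

func charactersForReceivedFace(sharedRecipes: [AvatarCharacter]?,
                               localLibrary: [AvatarCharacter],
                               count: Int = 4) -> [AvatarCharacter] {
    if let shared = sharedRecipes {
        return shared  // single-character face: display the shared character
    }
    // Multi-character face: pick local characters at random as a substitute.
    return Array(localLibrary.shuffled().prefix(count))
}
```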
- Turning back to FIG. 18B, electronic device 600 detects user input 1850E (e.g., a swipe gesture) on user interface 1810. In response to detecting user input 1850E, electronic device 600 translates first representation 1800A of watch face user interface 1800 and second representation 1800B of a second watch face user interface in a direction corresponding to user input 1850E, as shown at FIG. 18H. As a result of translating first representation 1800A and second representation 1800B, electronic device 600 displays third representation 1800C associated with a third watch face user interface, different from watch face user interface 1800 and the second watch face user interface.
- At FIG. 18H, second representation 1800B of the second watch face user interface includes multiple different characters (e.g., animal-like avatars and/or emojis) to indicate that the second watch face user interface associated with second representation 1800B is configured to transition between display of graphical representations of multiple characters. Accordingly, in response to detecting user input corresponding to selection of share affordance 1814, electronic device 600 initiates the process for transmitting data associated with the second watch face user interface without including information associated with graphical representations of characters configured to be displayed on the second watch face user interface.
- At FIG. 18H, electronic device 600 detects user input 1850F (e.g., a swipe gesture) on user interface 1810. In response to detecting user input 1850F, electronic device 600 translates first representation 1800A, second representation 1800B, and third representation 1800C in a direction associated with user input 1850F, as shown at FIG. 18I. As a result of translating first representation 1800A, second representation 1800B, and third representation 1800C, electronic device 600 ceases to display first representation 1800A and displays fourth representation 1800D associated with a fourth watch face user interface, different from watch face user interface 1800, the second watch face user interface, and the third watch face user interface.
- At FIG. 18I, third representation 1800C includes a graphical representation of a single character, thereby indicating that the third watch face user interface is configured to display a graphical representation of only a single character (e.g., regardless of electronic device 600 detecting a change in activity state and/or a user input). At FIG. 18I, electronic device 600 detects user input 1850G (e.g., a tap gesture) corresponding to selection of share affordance 1814 (e.g., to share the third watch face user interface). In response to detecting user input 1850G, electronic device 600 initiates a process for sharing the third watch face user interface (e.g., because third representation 1800C is designated, as indicated by being in a center position on user interface 1810). For example, in response to detecting user input 1850G, electronic device 600 displays sharing user interface 1818. In response to detecting user input on an affordance associated with an external device of a recipient on sharing user interface 1818, electronic device 600 displays messaging user interface 1822. In response to detecting user input corresponding to selection of send affordance 1830, electronic device 600 initiates a process for transmitting information associated with the third watch face user interface (e.g., a background, a font, and/or complications) as well as information associated with (e.g., that defines) a graphical representation of the character of the third watch face user interface.
- At FIG. 18J, external device 1832 displays watch face user interface 1852 (e.g., in response to receiving the transmission from electronic device 600 and detecting user input corresponding to add watch face affordance 1842). At FIG. 18J, watch face user interface 1852 includes graphical representation 1854 of a character that is the same character displayed on third representation 1800C. Since third representation 1800C corresponds to a watch face user interface of electronic device 600 that is configured to display a graphical representation of a single character, electronic device 600 transmits information corresponding to the graphical representation of the single character to external device 1832. In some embodiments, the information corresponding to the graphical representation of the single character includes a recipe that defines the graphical representation of the single character. In some embodiments, the recipe of the graphical representation of the single character includes information related to features of the character, such as skin color, hair type, hair color, hair length, nose type, nose size, mouth type, mouth size, lip color, eye color, eye type, eye size, eyebrow color, eyebrow size, eyebrow type, and/or accessories of the character (e.g., headwear, eyewear, earrings, nose rings, etc.). In some embodiments, the recipe of the graphical representation of the single character includes information related to animations that can be performed by the character either automatically (e.g., at predetermined intervals) and/or in response to user inputs. The information related to animations may be user defined (e.g., by a user of electronic device 600) such that the animations are specific to the character. In some embodiments, the information corresponding to the graphical representation of the single character includes an image and/or a video of the graphical representation of the character. In some embodiments, external device 1832 is configured to store and/or add graphical representation 1854 to a character library once watch face user interface 1852 is added to external device 1832. In some embodiments, external device 1832 is configured to edit the character associated with graphical representation 1854 after adding watch face user interface 1852 to external device 1832 and/or storing graphical representation 1854 to external device 1832 and/or to the character library of external device 1832.
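- A character "recipe" as described above can be pictured as a small, serializable record of feature settings rather than image or video data. The Swift sketch below mirrors the features named in the text; the field types and the Codable encoding are assumptions made for illustration:

```swift
// Hypothetical shape of a character recipe: compact feature settings that
// let the recipient reconstruct and later edit the character locally.
struct AvatarRecipe: Codable {
    var skinColor: String
    var hairType: String
    var hairColor: String
    var hairLength: String
    var noseType: String
    var eyeColor: String
    var eyebrowType: String
    var accessories: [String]           // e.g., headwear, eyewear, earrings
    var animationIdentifiers: [String]  // user-defined animations, if any
}
```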
- FIGS. 19A-19C are a flow diagram illustrating methods for sharing a configuration of a user interface with an external device, in accordance with some embodiments. Method 1900 is performed at a computer system (e.g., 100, 300, 500, 600) (e.g., a smart device, such as a smartphone or a smartwatch; a mobile device) that is in communication with a display generation component (e.g., 602) (e.g., a display and/or a touchscreen). Some operations in method 1900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
- As described below, method 1900 provides an intuitive way for sharing a configuration of a user interface with an external device. The method reduces the cognitive burden on a user for sharing a configuration of a user interface with an external device, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to manage user interfaces related to time faster and more efficiently conserves power and increases the time between battery charges. - The computer system (e.g., 100, 300, 500, 600) displays (1902), via the display generation component (e.g., 602), a representation (e.g., 1800A-1800D) of a watch face user interface (e.g., 1800) (e.g., a watch face user interface that displays a single character without transitioning between multiple characters or a watch face user interface that transitions between display of multiple characters in a collection of characters) that is associated with one or more graphical representations (e.g., 1802) of respective characters (e.g., predetermined animated characters such as anthropomorphized animals, robots, or other objects or user-generated animated characters such as virtual avatars) (e.g., a recipe for a character that is included in the watch face user interface, the recipe including information related to features of the character, such as hair color, skin color, facial feature information, and/or accessory information) (e.g., a graphical representation of a single character when the watch face user interface is of a first type and graphical representations of a collection of characters when the watch face user interface is of a second type).
- The computer system (e.g., 100, 300, 500, 600), while displaying the representation (e.g., 1800A-1800D) of the watch face user interface (e.g., 1800), detects (1904) an input (e.g., 1850A, 1850B, 1850C, and/or 1850D) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on a share affordance and/or a contact displayed in response to the long press gesture) corresponding to a request to share the watch face user interface (e.g., 1800) with an external device (e.g., 1832).
- The computer system (e.g., 100, 300, 500, 600), in response to detecting the input (e.g., 1850A, 1850B, 1850C, and/or 1850D), initiates (1906) a process for sharing the watch face user interface (e.g., 1800) with the external device (e.g., 1832) and, in accordance with a determination that the watch face user interface (e.g., 1800) is associated with less than a threshold number of graphical representations (e.g., 1802) of respective characters (e.g., less than two characters, a single character) (e.g., a first watch face that does not transition between multiple characters), the process (1908) for sharing the watch face user interface (e.g., 1800) with the external device (e.g., 1832) includes sharing one or more characteristics of the watch face user interface (e.g., 1800) (e.g., background color, date/time font, date/time size, date/time placement, complication placement, complication type, and/or complication color) including transmitting a representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., preparing and/or sending an electronic message to an address associated with the external device that includes a recipe for the respective characters of the watch face user interface, which enables a recipient associated with the external device to display graphical representations of the respective characters). In some embodiments, transmitting the representation of one or more of the one or more graphical representations of the respective characters associated with the watch face user interface includes sending data and/or information (e.g., without image data and/or multimedia data) associated with the one or more of the one or more graphical representations of the respective characters associated with the watch face user interface. In some embodiments, transmitting the representation of one or more of the one or more graphical representations of the respective characters associated with the watch face includes sending image data and/or multimedia data (e.g., video data) associated with the one or more of the one or more graphical representations of the respective characters associated with the watch face user interface.
- The computer system (e.g., 100, 300, 500, 600), in response to detecting the input (e.g., 1850A, 1850B, 1850C, and/or 1850D), initiates (1906) a process for sharing the watch face user interface (e.g., 1800) with the external device (e.g., 1832) and, in accordance with a determination that the watch face user interface (e.g., 1800) is associated with greater than or equal to the threshold number of graphical representations (e.g., 1802) of respective characters (e.g., a collection of characters, two or more characters) (e.g., a second watch face that transitions between display of characters sequentially, and optionally, the transition between characters is in response to meeting a transition criterion (e.g., inactivity and/or an absence of user inputs detected by the computer system for a predetermined period of time)), the process (1910) for sharing the watch face user interface (e.g., 1800) with the external device (e.g., 1832) includes sharing one or more characteristics of the watch face user interface (e.g., 1800) (e.g., background color, date/time font, date/time size, date/time placement, complication placement, complication type, and/or complication color) without transmitting a representation of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., preparing and/or sending an electronic message to an address associated with the external device that includes data associated with features of the watch face user interface other than the representation of the one or more graphical representations of the respective characters, such that the external device is configured to display graphical representations of one or more second characters, different from the graphical representations of respective characters of the watch face user interface).
- Sharing one or more characteristics of the watch face user interface with or without transmitting a representation of one or more graphical representations of respective characters associated with the watch face user interface depending on a number of graphical representations of respective characters associated with the watch face user interface reduces an amount of data transmitted between the computer system and the external device. In particular, transmitting multiple representations of one or more graphical representations of respective characters associated with the watch face user interface consumes a relatively large amount of storage on the external device and/or a relatively large amount of processing power of the computer system. Reducing a size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, transmitting the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., in accordance with the determination that the watch face user interface is associated with less than the threshold number of graphical representations of respective characters) includes transmitting information corresponding to one or more settings associated with characteristic features (e.g., settings set by a user of the computer system that are associated with (e.g., define) visual characteristics of the respective character corresponding to the graphical representation) of the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., without transmitting image data (e.g., an image file) and/or multimedia data (e.g., a video file) associated with the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface).
- Sharing settings associated with characteristic features of the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface without transmitting image data and/or multimedia data reduces an amount of data transmitted between the computer system and the external device. Reducing a size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, sharing the one or more characteristics of the watch face user interface (e.g., 1800) (e.g., background color, date/time font, date/time size, date/time placement, complication placement, complication type, and/or complication color) without transmitting a representation of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) includes transmitting one or more graphical representation templates (e.g., blank and/or fillable graphical representations that do not correspond to the one or more graphical representations of respective characters associated with the watch face user interface) for one or more second graphical representations (e.g., 1848) of respective second characters, different from the one or more graphical representations (e.g., 1802) of respective characters of the watch face user interface (e.g., 1800) (e.g., the one or more second graphical representations of respective second characters are stored on the external device).
- Sharing one or more graphical representation templates instead of sharing the representation of the one or more graphical representations of respective characters associated with the watch face user interface reduces an amount of data transmitted between the computer system and the external device. Reducing a size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 100, 300, 500, 600), while displaying the representation (e.g., 1800A-1800D) of the watch face user interface (e.g., 1800), detects (1912) a sequence of one or more inputs (e.g., 1850A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface (e.g., 1800).
- The computer system (e.g., 100, 300, 500, 600), in response to detecting the sequence of one or more inputs (e.g., 1850A), displays (1914), via the display generation component (e.g., 602), a first user interface (e.g., 1810) for selecting between a first set of characters (e.g., 1800A) that includes a plurality of user-customizable virtual avatars (e.g., a plurality of avatar-like emojis and/or the respective characters associated with the watch face user interface) and a graphical representation (e.g., 1800B) of a second set of characters (e.g., a plurality of emojis of animal-like characters) that includes two or more predetermined characters that are not available in the first set of characters.
- The computer system (e.g., 100, 300, 500, 600), while displaying the first user interface (e.g., 1810), detects (1916) (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) a third input corresponding to selection of the first set of characters (e.g., 1800A) or the second set of characters (e.g., 1800B).
- The computer system (e.g., 100, 300, 500, 600), in accordance with (e.g., or in response to) a determination that the third input corresponds to selection of the first set of characters (e.g., 1800A), displays (1918) the representation (e.g., 1800A) of the watch face user interface (e.g., 1800) including a first graphical representation (e.g., 1802) of a currently selected character from the first set of characters.
- The computer system (e.g., 100, 300, 500, 600), in accordance with (e.g., or in response to) a determination that the third input corresponds to selection of the second set of characters (e.g., 1800B), displays (1920) the representation of the watch face user interface including a second graphical representation of a currently selected character from the second set of characters.
- Displaying the first user interface for selecting between the first set of characters and the second set of characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 100, 300, 500, 600), while displaying the representation (e.g., 1800A-1800D) of the watch face user interface (e.g., 1800), detects (1922) a fourth input (e.g., 1850A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface.
- The computer system (e.g., 100, 300, 500, 600), after detecting the fourth input (e.g., 1850A), displays (1924), via the display generation component (e.g., 602), a second user interface (e.g., 1810) that includes a plurality of selectable characters (e.g., 1800A-1800D) (e.g., including a plurality of animated (e.g., 3D) emojis of animal-like characters; a plurality of animated (e.g., 3D) avatar-like emojis). In some embodiments, the plurality of selectable characters are displayed in a first tab or first screen of the second user interface. In some embodiments, the plurality of selectable characters includes selectable sets of characters.
- The computer system (e.g., 100, 300, 500, 600), while displaying the second user interface (e.g., 1810), detects (1926) (e.g., via one or more input devices of the computer system, such as a touch-sensitive surface integrated with the display generation component) a selection of a character of the plurality of selectable characters.
- The computer system (e.g., 100, 300, 500, 600), in accordance with (e.g., or in response to) detecting the selection of the character, updates (1928) the representation of the watch face user interface to include a third graphical representation of the selected character (e.g., a graphical representation of a single character corresponding to the selected character and/or a graphical representation of a currently selected character from a selected set of characters).
- Displaying the second user interface for selecting between a plurality of selectable characters enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- The computer system (e.g., 100, 300, 500, 600), while displaying the representation (e.g., 1800A-1800D) of the watch face user interface (e.g., 1800), detects (1930) a fifth input (e.g., 1850A) (e.g., a long press gesture on display generation component, and optionally, a subsequent tap gesture on an edit affordance) corresponding to a request to edit the watch face user interface (e.g., 1800).
- The computer system (e.g., 100, 300, 500, 600) displays (1932), via the display generation component (e.g., 602), a third user interface that includes a fourth graphical representation of a character of the one or more graphical representations of respective characters associated with the watch face user interface (e.g., 1800).
- The computer system (e.g., 100, 300, 500, 600), while displaying the fourth graphical representation of the character, detects (1934) (e.g., via one or more input devices that are in communication with the computer system, such as a touch-sensitive surface integrated with the display generation component) a sixth input (e.g., a rotational input on a rotatable input device or a rotatable and depressible input device; a scrolling input on a touch-sensitive surface integrated with the display generation component) directed to changing a visual characteristic of the character (e.g., hair color, skin color, facial feature information, and/or accessory information).
- The computer system (e.g., 100, 300, 500, 600), in response to detecting the sixth input directed to changing the visual characteristic, changes (1936) (e.g., by transitioning through a plurality of selectable visual characteristics (e.g., selectable features associated with hair color, skin color, facial feature information, and/or accessory information)) the visual characteristic (e.g., hair color, skin color, facial feature information, and/or accessory information) from a first visual characteristic (e.g., a first hair color, a first skin color, a first facial feature, and/or a first accessory) to a second visual characteristic (e.g., a second hair color, a second skin color, a second facial feature, and/or a second accessory) different from the first visual characteristic. In some embodiments, changing the visual characteristic to the second visual characteristic is performed prior to sharing the watch face user interface and, when the watch face user interface with the second visual characteristic is shared, a representation of the watch face user interface, including the second visual characteristic, is shared.
- Displaying the third user interface for changing the visual characteristic of the character enables a user to easily customize the watch face user interface, thereby enhancing the operability of the device and making the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, the representation (e.g., 1800A-1800D) of the watch face user interface (e.g., 1800) includes a fifth graphical representation (e.g., 1802) of a character that corresponds to a graphical representation of (e.g., an animation based on; a graphical representation that animates features of) a user associated (e.g., based on an account into which the computer system is logged) with the computer system (e.g., 100, 300, 500, 600) (e.g., an animated (e.g., 3D) avatar-like representation of the user of the computer system).
- Displaying the representation of the watch face user interface having the fifth graphical representation of a character that corresponds to a graphical representation of the user associated with the computer system provides improved visual feedback related to an identity of the user of the computer system, and in some embodiments, the identity of the user sharing the watch face user interface. Providing improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
- In some embodiments, sharing the one or more characteristics of the watch face user interface (e.g., 1800) (e.g., background color, date/time font, date/time size, date/time placement, complication placement, complication type, and/or complication color) without transmitting a representation of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) includes transmitting one or more graphical representation templates (e.g., blank and/or fillable graphical representations that do not correspond to the one or more graphical representations of respective characters associated with the watch face user interface) for one or more second graphical representations (e.g., 1848) of respective second characters stored on the external device (e.g., 1832), different from the one or more graphical representations of respective characters of the watch face user interface (e.g., 1800), wherein the one or more second graphical representations (e.g., 1848) of respective second characters stored on the external device (e.g., 1832) includes a sixth graphical representation (e.g., 1848) of a character that includes one or more visual characteristics set by a user of the external device (e.g., 1832) (e.g., the one or more second graphical representations of respective second characters are stored on the external device and include customized visual characteristics set by a user of the external device). In some embodiments, the one or more characteristics of the watch face user interface are based on settings of the computer system and displayed on the external device despite the one or more second graphical representations of respective second characters being stored on the external device.
- Sharing one or more graphical representation templates instead of sharing the representation of the one or more graphical representations of respective characters associated with the watch face user interface reduces the amount of data transmitted between the computer system and the external device. Reducing the size of a transmission improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
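- As a concrete illustration of the template-based sharing described above (a minimal sketch, not part of the patent disclosure; the type and field names are assumptions), a share payload might carry the watch face characteristics plus empty template slots while omitting the character data entirely:

```swift
import Foundation

// Hypothetical share payload: watch face characteristics travel with the
// payload, but the sender's character representations do not. Each
// template slot is a blank placeholder that the receiving device fills
// with a character already stored locally.
struct WatchFaceSharePayload: Codable {
    var backgroundColor: String      // e.g., "#1C1C1E"
    var timeFont: String             // e.g., "rounded"
    var complicationTypes: [String]  // complication identifiers
    var characterTemplateSlots: Int  // blank slots, no visual data
}

let payload = WatchFaceSharePayload(
    backgroundColor: "#1C1C1E",
    timeFont: "rounded",
    complicationTypes: ["heartRate", "weather"],
    characterTemplateSlots: 1
)

// Encoding shows the transmission stays small because no character
// geometry or visual-characteristic data is included.
let data = try! JSONEncoder().encode(payload)
print("share payload is \(data.count) bytes")
```

This mirrors the design choice in the paragraph above: the receiver substitutes its own stored second characters into the template slots, so only lightweight settings cross the wire.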
- In some embodiments, transmitting the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., in accordance with the determination that the watch face user interface is associated with less than the threshold number of graphical representations of respective characters) includes initiating a process for storing the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) on the external device (e.g., 1832) (e.g., in response to detecting user input corresponding to an add watch face affordance on the external device, the external device stores the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface in a character library and/or an image library of the external device).
- Initiating the process for storing the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface on the external device reduces a number of inputs needed by a user of the external device to store the particular character on the external device. In particular, the user of the external device may store the representation of one or more of the graphical representations of respective characters associated with the watch face user interface instead of providing a sequence of inputs to create the particular character. Reducing the number of inputs needed to store the particular character improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
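- A minimal sketch of this storing step (illustrative only; `CharacterRepresentation`, `CharacterLibrary`, and their APIs are assumptions, not the disclosed implementation) shows how a single "add watch face" input could place the received representation directly into a character library:

```swift
import Foundation

// Hypothetical received character representation.
struct CharacterRepresentation: Codable {
    var skinTone: String
    var hairColor: String
    var accessories: [String]
}

// Hypothetical character library on the external device.
final class CharacterLibrary {
    private var characters: [UUID: CharacterRepresentation] = [:]

    // Stores a shared character in one step and returns its identifier.
    @discardableResult
    func add(_ character: CharacterRepresentation) -> UUID {
        let id = UUID()
        characters[id] = character
        return id
    }

    var count: Int { characters.count }
}

let library = CharacterLibrary()
let received = CharacterRepresentation(skinTone: "medium",
                                       hairColor: "brown",
                                       accessories: ["glasses"])

// One input ("add watch face") stores the character; no creation flow.
library.add(received)
print("library now holds \(library.count) character(s)")
```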
- In some embodiments, initiating the process for storing the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) on the external device (e.g., 1832) includes enabling, via the external device (e.g., 1832), an ability to change one or more visual characteristics (e.g., via an editing user interface) of the representation of one or more of the one or more graphical representations (e.g., 1802) of respective characters associated with the watch face user interface (e.g., 1800) (e.g., a user of the external device may access the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface (e.g., via a character library, via an image library, via a watch face selection user interface, and/or via a watch face editing user interface) and request to enter an editing mode of the representation, such that the external device may receive user inputs and adjust visual characteristics of the representation based on the user inputs (e.g., the external device updates visual characteristics of the representation that were shared to the external device via the computer system)).
- Enabling an ability on the external device to change one or more visual characteristics of the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface reduces a number of inputs needed by the user of the external device to customize the character. In particular, the user of the external device may start with the representation of one or more of the one or more graphical representations of respective characters associated with the watch face user interface instead of creating the representation of the character via a sequence of user inputs. Reducing the number of inputs needed to customize the particular character improves the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
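- This editing capability can be pictured with a short sketch (again an assumption-laden illustration, not the disclosed code): the external device starts from the shared representation and applies per-attribute edits, each corresponding to one user input in an editing user interface:

```swift
// Hypothetical editable character; the field names are illustrative.
struct EditableCharacter {
    var skinTone: String
    var hairColor: String
    var headwear: String?
}

// One visual-characteristic change, as produced by a single user input.
enum CharacterEdit {
    case setHairColor(String)
    case setHeadwear(String?)
}

// Applies an edit in place, updating only the targeted characteristic.
func apply(_ edit: CharacterEdit, to character: inout EditableCharacter) {
    switch edit {
    case .setHairColor(let color): character.hairColor = color
    case .setHeadwear(let item):   character.headwear = item
    }
}

// Starting from the shared character, two inputs finish the
// customization, instead of the longer sequence needed to build the
// character from scratch.
var shared = EditableCharacter(skinTone: "medium", hairColor: "brown", headwear: nil)
apply(.setHairColor("blue"), to: &shared)
apply(.setHeadwear("cap"), to: &shared)
print(shared)
```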
- Note that details of the processes described above with respect to method 1900 (e.g., FIGS. 19A-19C) are also applicable in an analogous manner to the methods described above. For brevity, these details are not repeated below.
- The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical applications. Others skilled in the art are thereby enabled to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
- Although the disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the claims.
Claims (30)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/031,671 US20210349426A1 (en) | 2020-05-11 | 2020-09-24 | User interfaces with a character having a visual state based on device activity state and an indication of time |
CN202310124087.2A CN116010013B (en) | 2020-05-11 | 2021-05-07 | Time-dependent user interface |
CN202180046818.5A CN115836254A (en) | 2020-05-11 | 2021-05-07 | Time-dependent user interface |
PCT/US2021/031212 WO2021231193A1 (en) | 2020-05-11 | 2021-05-07 | User interfaces related to time |
EP21728746.5A EP4133337A1 (en) | 2020-05-11 | 2021-05-07 | User interfaces related to time |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063023194P | 2020-05-11 | 2020-05-11 | |
US202063078314P | 2020-09-14 | 2020-09-14 | |
US17/031,671 US20210349426A1 (en) | 2020-05-11 | 2020-09-24 | User interfaces with a character having a visual state based on device activity state and an indication of time |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210349426A1 true US20210349426A1 (en) | 2021-11-11 |
Family
ID=76764388
Family Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/031,654 Active US11061372B1 (en) | 2020-05-11 | 2020-09-24 | User interfaces related to time |
US17/031,671 Pending US20210349426A1 (en) | 2020-05-11 | 2020-09-24 | User interfaces with a character having a visual state based on device activity state and an indication of time |
US17/031,765 Active US12008230B2 (en) | 2020-05-11 | 2020-09-24 | User interfaces related to time with an editable background |
US17/373,163 Active US11442414B2 (en) | 2020-05-11 | 2021-07-12 | User interfaces related to time |
US17/941,962 Active US11822778B2 (en) | 2020-05-11 | 2022-09-09 | User interfaces related to time |
US18/220,715 Pending US20230350564A1 (en) | 2020-05-11 | 2023-07-11 | User interfaces related to time |
Family Applications Before (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/031,654 Active US11061372B1 (en) | 2020-05-11 | 2020-09-24 | User interfaces related to time |
Family Applications After (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/031,765 Active US12008230B2 (en) | 2020-05-11 | 2020-09-24 | User interfaces related to time with an editable background |
US17/373,163 Active US11442414B2 (en) | 2020-05-11 | 2021-07-12 | User interfaces related to time |
US17/941,962 Active US11822778B2 (en) | 2020-05-11 | 2022-09-09 | User interfaces related to time |
US18/220,715 Pending US20230350564A1 (en) | 2020-05-11 | 2023-07-11 | User interfaces related to time |
Country Status (2)
Country | Link |
---|---|
US (6) | US11061372B1 (en) |
DK (3) | DK202070625A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11301130B2 (en) | 2019-05-06 | 2022-04-12 | Apple Inc. | Restricted operation of an electronic device |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
US11327634B2 (en) | 2017-05-12 | 2022-05-10 | Apple Inc. | Context-specific user interfaces |
US11340757B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Clock faces for an electronic device |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11526256B2 (en) | 2020-05-11 | 2022-12-13 | Apple Inc. | User interfaces for managing user interface sharing |
US11550465B2 (en) | 2014-08-15 | 2023-01-10 | Apple Inc. | Weather user interface |
US11580867B2 (en) | 2015-08-20 | 2023-02-14 | Apple Inc. | Exercised-based watch face and complications |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US11740776B2 (en) | 2014-08-02 | 2023-08-29 | Apple Inc. | Context-specific user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US20230367460A1 (en) * | 2022-05-10 | 2023-11-16 | Apple Inc. | Devices, Methods, and Graphical User Interfaces for Providing Notifications and Application Information |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US12019862B2 (en) | 2015-03-08 | 2024-06-25 | Apple Inc. | Sharing user-configurable graphical constructs |
US12033296B2 (en) | 2023-04-24 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2741176A3 (en) * | 2012-12-10 | 2017-03-08 | Samsung Electronics Co., Ltd | Mobile device of bangle type, control method thereof, and UI display method |
USD762693S1 (en) * | 2014-09-03 | 2016-08-02 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
USD898755S1 (en) * | 2018-09-11 | 2020-10-13 | Apple Inc. | Electronic device with graphical user interface |
USD937293S1 (en) * | 2019-05-29 | 2021-11-30 | Apple Inc. | Electronic device with graphical user interface |
USD949179S1 (en) * | 2019-09-06 | 2022-04-19 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
USD936681S1 (en) * | 2019-11-22 | 2021-11-23 | Honor Device Co., Ltd. | Electronic display for a wearable device presenting a graphical user interface |
USD936684S1 (en) * | 2019-11-22 | 2021-11-23 | Honor Device Co., Ltd. | Electronic display for a wearable device presenting a graphical user interface |
USD936683S1 (en) * | 2019-11-22 | 2021-11-23 | Honor Device Co., Ltd. | Electronic display for a wearable device presenting a graphical user interface |
USD936685S1 (en) * | 2019-11-25 | 2021-11-23 | Honor Device Co., Ltd. | Electronic display for a wearable device presenting a graphical user interface |
USD941859S1 (en) * | 2019-11-25 | 2022-01-25 | Honor Device Co., Ltd. | Electronic display for a wearable device presenting a graphical user interface |
JP7173075B2 (en) * | 2020-03-23 | 2022-11-16 | カシオ計算機株式会社 | ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, ELECTRONIC DEVICE CONTROL PROGRAM |
US11503440B2 (en) | 2020-04-16 | 2022-11-15 | Avaya Management L.P. | Methods and systems for providing enterprise services to wearable and mobile devices |
US11582419B2 (en) | 2020-04-16 | 2023-02-14 | Avaya Management L.P. | Methods and systems for processing call content to devices using a distributed communication controller |
USD962254S1 (en) * | 2020-06-19 | 2022-08-30 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD960920S1 (en) | 2020-06-19 | 2022-08-16 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD957429S1 (en) * | 2020-09-14 | 2022-07-12 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD957430S1 (en) | 2020-09-14 | 2022-07-12 | Apple Inc. | Display screen or portion thereof with graphical user interface |
KR20220109240A (en) * | 2021-01-28 | 2022-08-04 | 삼성전자주식회사 | Wearable device and method for generating signal for controlling operation of electronic device |
US11868596B2 (en) * | 2021-07-28 | 2024-01-09 | Capital One Services, Llc | Color-based system for generating notifications |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030135769A1 (en) * | 2001-03-28 | 2003-07-17 | Loughran Stephen A. | Power management in computing applications |
US20030140309A1 (en) * | 2001-12-13 | 2003-07-24 | Mari Saito | Information processing apparatus, information processing method, storage medium, and program |
US7716057B2 (en) * | 1999-05-17 | 2010-05-11 | Microsoft Corporation | Controlling the listening horizon of an automatic speech recognition system for use in handsfree conversational dialogue |
US20110119610A1 (en) * | 2009-11-13 | 2011-05-19 | Hackborn Dianne K | Live wallpaper |
US20150042571A1 (en) * | 2012-10-30 | 2015-02-12 | Motorola Mobility Llc | Method and apparatus for action indication selection |
US20150062052A1 (en) * | 2012-05-09 | 2015-03-05 | Apple Inc. | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture |
US20150082446A1 (en) * | 2013-09-16 | 2015-03-19 | Motorola Mobility Llc | Method and apparatus for displaying potentially private information |
US9625987B1 (en) * | 2015-04-17 | 2017-04-18 | Google Inc. | Updating and displaying information in different power modes |
US20190050045A1 (en) * | 2017-08-14 | 2019-02-14 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device thereof |
US20200089302A1 (en) * | 2017-05-17 | 2020-03-19 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device therefor |
Family Cites Families (1215)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US872200A (en) | 1901-05-09 | 1907-11-26 | Edward Rowe | Tower-clock. |
US3148500A (en) | 1963-01-29 | 1964-09-15 | Hayes Thomas | Animated clock |
DE1933049B2 (en) | 1969-06-30 | 1971-09-30 | Durowe Gmbh, 7530 Pforzheim | WORLD TIME CLOCK |
JPS49134364A (en) | 1973-04-25 | 1974-12-24 | ||
JPS5331170A (en) | 1976-09-03 | 1978-03-24 | Seiko Epson Corp | Electronic watch |
US4205628A (en) | 1978-10-24 | 1980-06-03 | Null Robert L | Animal conditioner |
JPS56621A (en) | 1979-06-15 | 1981-01-07 | Citizen Watch Co Ltd | Digital watch with universal time |
CH629064B (en) | 1979-06-28 | | Ebauches Sa | ELECTRONIC WATCH WITH DIGITAL AUXILIARY DISPLAY. |
JPS5650138A (en) | 1979-09-27 | 1981-05-07 | Nippon Telegr & Teleph Corp <Ntt> | Manufacture of optical fiber |
US4597674A (en) | 1984-03-30 | 1986-07-01 | Thompson Iii William H | Multiplex digital clock |
US4847819A (en) | 1988-04-07 | 1989-07-11 | Hong Kuo Hui | Universal clock having means for indicating zonal time in other global time zones |
DE3832514C1 (en) | 1988-09-24 | 1989-11-02 | Iwc International Watch Co. Ag, Schaffhausen, Ch | |
JPH02116783A (en) | 1988-10-27 | 1990-05-01 | Seikosha Co Ltd | Time signalling timepiece |
US5208790A (en) | 1989-05-29 | 1993-05-04 | Casio Computer Co., Ltd. | Astronomical data indicating device |
JP3062531B2 (en) | 1990-12-04 | 2000-07-10 | 株式会社レイ | Time display device |
CH682034B5 (en) | 1991-10-14 | 1994-01-14 | Eta S.A. Fabriques D'ebauches | Timepiece including a chronograph module adapted on a motor module. |
CH684619B5 (en) | 1992-07-17 | 1995-05-15 | Longines Montres Comp D | Timepiece universal time display. |
US5659693A (en) | 1992-08-27 | 1997-08-19 | Starfish Software, Inc. | User interface with individually configurable panel interface for use in a computer system |
CH685967B5 (en) | 1993-11-26 | 1996-05-31 | Asulab Sa | Piece watch digital signage. |
CH686808B5 (en) | 1994-01-12 | 1997-01-15 | Ebauchesfabrik Eta Ag | Piece watch indicating the part of the visible Earth from the moon. |
CH685659B5 (en) | 1994-03-04 | 1996-03-15 | Asulab Sa | Watch indicating a meteorological forecast. |
US5682469A (en) | 1994-07-08 | 1997-10-28 | Microsoft Corporation | Software platform having a real world interface with animated characters |
JP3007616U (en) | 1994-08-08 | 1995-02-21 | 翼システム株式会社 | Clock with display panel color change mechanism |
US5825353A (en) | 1995-04-18 | 1998-10-20 | Will; Craig Alexander | Control of miniature personal digital assistant using menu and thumbwheel |
JPH08339172A (en) | 1995-06-09 | 1996-12-24 | Sony Corp | Display control device |
JPH099072A (en) | 1995-06-26 | 1997-01-10 | Aibitsukusu Kk | Information communication system |
CH687494B5 (en) | 1995-07-18 | 1997-06-30 | Utc Service Ag | Clock with two ads for two different local times. |
US5845257A (en) | 1996-02-29 | 1998-12-01 | Starfish Software, Inc. | System and methods for scheduling and tracking events across multiple time zones |
JPH09251084A (en) | 1996-03-15 | 1997-09-22 | Citizen Watch Co Ltd | Electronic watch |
US6043818A (en) | 1996-04-30 | 2000-03-28 | Sony Corporation | Background image with a continuously rotating and functional 3D icon |
US5870683A (en) | 1996-09-18 | 1999-02-09 | Nokia Mobile Phones Limited | Mobile station having method and apparatus for displaying user-selectable animation sequence |
US6128012A (en) | 1996-09-19 | 2000-10-03 | Microsoft Corporation | User interface for a portable data management device with limited size and processing capability |
JPH10143636A (en) | 1996-11-14 | 1998-05-29 | Casio Comput Co Ltd | Picture processor |
US6621524B1 (en) | 1997-01-10 | 2003-09-16 | Casio Computer Co., Ltd. | Image pickup apparatus and method for processing images obtained by means of same |
JPH10293860A (en) | 1997-02-24 | 1998-11-04 | Nippon Telegr & Teleph Corp <Ntt> | Person image display method and device using voice drive |
JP2957507B2 (en) | 1997-02-24 | 1999-10-04 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Small information processing equipment |
US5982710A (en) | 1997-03-14 | 1999-11-09 | Rawat; Prem P. | Method and apparatus for providing time using cartesian coordinates |
US5999195A (en) | 1997-03-28 | 1999-12-07 | Silicon Graphics, Inc. | Automatic generation of transitions between motion cycles in an animation |
US6806893B1 (en) | 1997-08-04 | 2004-10-19 | Parasoft Corporation | System and method for displaying simulated three dimensional buttons in a graphical user interface |
JPH11109066A (en) | 1997-09-30 | 1999-04-23 | Bandai Co Ltd | Display device |
DE19747879A1 (en) | 1997-10-21 | 1999-04-22 | Volker Prof Dr Hepp | User-friendly computer controlled clock with additional functions |
US5986655A (en) | 1997-10-28 | 1999-11-16 | Xerox Corporation | Method and system for indexing and controlling the playback of multimedia documents |
JP3058852B2 (en) | 1997-11-25 | 2000-07-04 | 株式会社プレ・ステージ | Electronic clock |
US6359839B1 (en) | 1997-12-23 | 2002-03-19 | Thomas C. Schenk | Watch with a 24-hour watch face |
US6466213B2 (en) | 1998-02-13 | 2002-10-15 | Xerox Corporation | Method and apparatus for creating personal autonomous avatars |
JPH11232013A (en) | 1998-02-18 | 1999-08-27 | Seiko Epson Corp | Portable information processor, control method and recording medium |
US6084598A (en) * | 1998-04-23 | 2000-07-04 | Chekerylla; James | Apparatus for modifying graphic images |
JP2000076460A (en) | 1998-06-18 | 2000-03-14 | Minolta Co Ltd | Monitor display device |
JP3123990B2 (en) | 1998-10-05 | 2001-01-15 | 埼玉日本電気株式会社 | Portable wireless terminal |
JP2000162349A (en) | 1998-11-30 | 2000-06-16 | Casio Comput Co Ltd | Image display control device and method |
US6353449B1 (en) | 1998-12-10 | 2002-03-05 | International Business Machines Corporation | Communicating screen saver |
US6279018B1 (en) | 1998-12-21 | 2001-08-21 | Kudrollis Software Inventions Pvt. Ltd. | Abbreviating and compacting text to cope with display space constraint in computer software |
US6441824B2 (en) | 1999-01-25 | 2002-08-27 | Datarover Mobile Systems, Inc. | Method and apparatus for dynamic text resizing |
US6160767A (en) | 1999-03-12 | 2000-12-12 | Leona Lighting Design Ltd. | Clock |
US6549218B1 (en) | 1999-03-31 | 2003-04-15 | Microsoft Corporation | Dynamic effects for computer display windows |
US8065155B1 (en) | 1999-06-10 | 2011-11-22 | Gazdzinski Robert F | Adaptive advertising apparatus and methods |
US6452597B1 (en) | 1999-08-24 | 2002-09-17 | Microsoft Corporation | Displaying text on a limited-area display surface |
JP2001119453A (en) | 1999-10-18 | 2001-04-27 | Japan Radio Co Ltd | Character display control method |
JP3379101B2 (en) | 1999-11-18 | 2003-02-17 | 日本電気株式会社 | Mobile phone character display system and method |
JP2001147282A (en) | 1999-11-22 | 2001-05-29 | Bazu Corporation:Kk | Time indicator |
US6968449B1 (en) | 1999-12-15 | 2005-11-22 | Microsoft Corporation | Methods and arrangements for providing a mark-up language based graphical user interface for user identification to an operating system |
KR20010056965A (en) | 1999-12-17 | 2001-07-04 | 박희완 | Method for creating human characters by partial image synthesis |
US6809724B1 (en) | 2000-01-18 | 2004-10-26 | Seiko Epson Corporation | Display apparatus and portable information processing apparatus |
US6539343B2 (en) | 2000-02-03 | 2003-03-25 | Xerox Corporation | Methods for condition monitoring and system-level diagnosis of electro-mechanical systems with multiple actuating components operating in multiple regimes |
AU2001255186A1 (en) | 2000-03-21 | 2001-10-03 | Bjorn Kartomten | Automatic location-detecting combination analog and digital wristwatch |
US20020054066A1 (en) | 2000-04-27 | 2002-05-09 | Dan Kikinis | Method and system for inputting time in a video environment |
JP4431918B2 (en) | 2000-05-01 | 2010-03-17 | ソニー株式会社 | Information processing apparatus, information processing method, and recording medium |
JP2001318852A (en) | 2000-05-12 | 2001-11-16 | Noboru Someya | Electronic data distributing system and video game and wrist watch to be used for the same system |
JP3813579B2 (en) | 2000-05-31 | 2006-08-23 | シャープ株式会社 | Moving picture editing apparatus, moving picture editing program, computer-readable recording medium |
JP3989194B2 (en) | 2000-06-12 | 2007-10-10 | 株式会社Qript | Communications system |
US6525997B1 (en) | 2000-06-30 | 2003-02-25 | International Business Machines Corporation | Efficient use of display real estate in a wrist watch display |
US6477117B1 (en) | 2000-06-30 | 2002-11-05 | International Business Machines Corporation | Alarm interface for a smart watch |
US6556222B1 (en) | 2000-06-30 | 2003-04-29 | International Business Machines Corporation | Bezel based input mechanism and user interface for a smart watch |
TW498240B (en) | 2000-06-30 | 2002-08-11 | Shiue-Ping Gan | On-line personalized image integration method and system |
US7657916B2 (en) | 2000-07-31 | 2010-02-02 | Cisco Technology, Inc. | Digital subscriber television networks with local physical storage devices and virtual storage |
US20050195173A1 (en) | 2001-08-30 | 2005-09-08 | Mckay Brent | User Interface for Large-Format Interactive Display Systems |
US6496780B1 (en) | 2000-09-12 | 2002-12-17 | Wsi Corporation | Systems and methods for conveying weather reports |
CA2356232A1 (en) * | 2000-09-14 | 2002-03-14 | George A. Hansen | Dynamically resizable display elements |
US8234218B2 (en) | 2000-10-10 | 2012-07-31 | AddnClick, Inc | Method of inserting/overlaying markers, data packets and objects relative to viewable content and enabling live social networking, N-dimensional virtual environments and/or other value derivable from the content |
JP2002202389A (en) | 2000-10-31 | 2002-07-19 | Sony Corp | Clock information distribution processing system, information distribution device, information distribution system, portable terminal device, information recording medium and information processing method |
US6639875B2 (en) | 2000-11-07 | 2003-10-28 | Alfred E. Hall | Time piece with changable color face |
KR100369646B1 (en) | 2000-11-23 | 2003-01-30 | 삼성전자 주식회사 | User interface method for portable terminal |
KR100377936B1 (en) | 2000-12-16 | 2003-03-29 | 삼성전자주식회사 | Method for inputting emotion icon in mobile telecommunication terminal |
JP2002257955A (en) | 2000-12-25 | 2002-09-11 | Seiko Epson Corp | Wristwatch device with communication function, method of displaying information, control program and recording medium |
WO2002054157A1 (en) | 2001-01-08 | 2002-07-11 | Firmaet Berit Johannsen | Device for displaying time |
JP2001273064A (en) | 2001-01-24 | 2001-10-05 | Casio Comput Co Ltd | Device for controlling image display and method therefor |
US6728533B2 (en) | 2001-01-25 | 2004-04-27 | Sharp Laboratories Of America, Inc. | Clock for mobile phones |
JP2002251238A (en) | 2001-02-22 | 2002-09-06 | Ddl:Kk | Method for displaying picture on desk top |
US6855169B2 (en) | 2001-02-28 | 2005-02-15 | Synthes (Usa) | Demineralized bone-derived implants |
US6601988B2 (en) | 2001-03-19 | 2003-08-05 | International Business Machines Corporation | Simplified method for setting time using a graphical representation of an analog clock face |
US6697072B2 (en) | 2001-03-26 | 2004-02-24 | Intel Corporation | Method and system for controlling an avatar using computer vision |
US7930624B2 (en) | 2001-04-20 | 2011-04-19 | Avid Technology, Inc. | Editing time-based media with enhanced content |
JP2003009404A (en) | 2001-06-25 | 2003-01-10 | Sharp Corp | Residual power notice method and residual power notice device |
US6714486B2 (en) | 2001-06-29 | 2004-03-30 | Kevin Biggs | System and method for customized time display |
US7674169B2 (en) | 2001-07-06 | 2010-03-09 | Scientific Games International, Inc. | Random animated lottery system |
WO2003009164A2 (en) | 2001-07-16 | 2003-01-30 | America Online Incorporated | Method and apparatus for calendaring reminders |
CN1397904A (en) | 2001-07-18 | 2003-02-19 | 张煌东 | Control system for playing interactive game according to parameters generated by movements |
CN1337638A (en) | 2001-09-13 | 2002-02-27 | 杜凤祥 | Practial interactive multimedia management and administration system for building development business |
US7313617B2 (en) | 2001-09-28 | 2007-12-25 | Dale Malik | Methods and systems for a communications and information resource manager |
US20030067497A1 (en) | 2001-10-09 | 2003-04-10 | Pichon Olivier Francis | Method and device for modifying a pre-existing graphical user interface |
JP2003121568A (en) | 2001-10-09 | 2003-04-23 | Sony Corp | Apparatus, method, and program for displaying time information |
US7167832B2 (en) | 2001-10-15 | 2007-01-23 | At&T Corp. | Method for dialog management |
US20040083474A1 (en) | 2001-10-18 | 2004-04-29 | Mckinlay Eric | System, method and computer program product for initiating a software download |
US7251812B1 (en) | 2001-10-31 | 2007-07-31 | Microsoft Corporation | Dynamic software update |
US7203380B2 (en) | 2001-11-16 | 2007-04-10 | Fuji Xerox Co., Ltd. | Video production and compaction with collage picture frame user interface |
US6754139B2 (en) | 2001-11-29 | 2004-06-22 | Timefoundry, Llc | Animated timepiece |
US20030107603A1 (en) | 2001-12-12 | 2003-06-12 | Intel Corporation | Scroll notification system and method |
TW546942B (en) | 2001-12-19 | 2003-08-11 | Inventec Multimedia & Telecom | Battery status voice alert method for wireless communication equipment |
JP2003219217A (en) | 2002-01-22 | 2003-07-31 | Fuji Photo Film Co Ltd | Imaging apparatus, image pick up method and program |
US7036025B2 (en) | 2002-02-07 | 2006-04-25 | Intel Corporation | Method and apparatus to reduce power consumption of a computer system display screen |
JP2003233616A (en) | 2002-02-13 | 2003-08-22 | Matsushita Electric Ind Co Ltd | Provided information presentation device and information providing device |
US20030169306A1 (en) | 2002-03-07 | 2003-09-11 | Nokia Corporation | Creating a screen saver from downloadable applications on mobile devices |
JP2003296246A (en) | 2002-04-01 | 2003-10-17 | Toshiba Corp | Electronic mail terminal device |
NL1020299C2 (en) | 2002-04-04 | 2003-10-13 | Albert Van Selst | Clock and watch fitted with such a clock. |
US7899915B2 (en) | 2002-05-10 | 2011-03-01 | Richard Reisman | Method and apparatus for browsing using multiple coordinated device sets |
US20030214885A1 (en) | 2002-05-17 | 2003-11-20 | Summer Powell | Electronic time-telling device |
JP2004028918A (en) | 2002-06-27 | 2004-01-29 | Aplix Corp | Wrist watch |
US7546548B2 (en) | 2002-06-28 | 2009-06-09 | Microsoft Corporation | Method and system for presenting menu commands for selection |
US7227976B1 (en) | 2002-07-08 | 2007-06-05 | Videomining Corporation | Method and system for real-time facial image enhancement |
US6871076B2 (en) | 2002-07-11 | 2005-03-22 | International Business Machines Corporation | Method and system for automatically adjusting location based system information in a mobile computer |
US6839542B2 (en) | 2002-07-22 | 2005-01-04 | Motorola, Inc. | Virtual dynamic cellular infrastructure based on coordinate information |
US20040017733A1 (en) | 2002-07-24 | 2004-01-29 | Sullivan Brian E. | Custom designed virtual time piece |
US7461346B2 (en) | 2002-07-30 | 2008-12-02 | Sap Ag | Editing browser documents |
AU2002950502A0 (en) | 2002-07-31 | 2002-09-12 | E-Clips Intelligent Agent Technologies Pty Ltd | Animated messaging |
DE60319638T2 (en) | 2002-08-07 | 2009-04-02 | Seiko Epson Corp. | Portable information device |
US7180524B1 (en) | 2002-09-30 | 2007-02-20 | Dale Axelrod | Artists' color display system |
US20040066710A1 (en) | 2002-10-03 | 2004-04-08 | Yuen Wai Man | Voice-commanded alarm clock system, and associated methods |
US20040075699A1 (en) * | 2002-10-04 | 2004-04-22 | Creo Inc. | Method and apparatus for highlighting graphical objects |
JP2004184396A (en) | 2002-10-09 | 2004-07-02 | Seiko Epson Corp | Display device, clock, method for controlling display device, control program, and recording medium |
US20040075700A1 (en) | 2002-10-16 | 2004-04-22 | Catherine Liu | Functional idle mode display |
US7515903B1 (en) | 2002-10-28 | 2009-04-07 | At&T Mobility Ii Llc | Speech to message processing |
US7773460B2 (en) | 2002-11-04 | 2010-08-10 | Lindsay Holt | Medication regimen communicator apparatus and method |
US6690623B1 (en) | 2002-11-08 | 2004-02-10 | Arnold K. Maano | Multi-functional time indicating device with a multi-colored fiber optic display |
CN2602404Y (en) | 2002-11-20 | 2004-02-04 | 张培柱 | Universal timekeeper |
KR100471594B1 (en) | 2002-11-26 | 2005-03-10 | 엔에이치엔(주) | Method for Providing Data Communication Service in Computer Network by using User-Defined Emoticon Image and Computer-Readable Storage Medium for storing Application Program therefor |
EP1429291A1 (en) | 2002-12-12 | 2004-06-16 | Sony Ericsson Mobile Communications AB | System and method for implementing avatars in a mobile environment |
US7113809B2 (en) | 2002-12-19 | 2006-09-26 | Nokia Corporation | Apparatus and a method for providing information to a user |
US7185315B2 (en) | 2003-02-25 | 2007-02-27 | Sheet Dynamics, Ltd. | Graphical feedback of disparities in target designs in graphical development environment |
US20070113181A1 (en) | 2003-03-03 | 2007-05-17 | Blattner Patrick D | Using avatars to communicate real-time information |
US7908554B1 (en) | 2003-03-03 | 2011-03-15 | Aol Inc. | Modifying avatar behavior based on user action or mood |
EP1599862A2 (en) | 2003-03-03 | 2005-11-30 | America Online, Inc. | Using avatars to communicate |
US20040179037A1 (en) | 2003-03-03 | 2004-09-16 | Blattner Patrick D. | Using avatars to communicate context out-of-band |
US7577934B2 (en) | 2003-03-12 | 2009-08-18 | Microsoft Corporation | Framework for modeling and providing runtime behavior for business software applications |
CN1536511A (en) | 2003-04-04 | 2004-10-13 | 干学平 | Method for on-line customizing object containing personal mark |
US7035170B2 (en) | 2003-04-29 | 2006-04-25 | International Business Machines Corporation | Device for displaying variable data for small screens |
US20040225966A1 (en) | 2003-05-09 | 2004-11-11 | Motorola, Inc. | Method and device for automatically displaying appointments |
JP4161814B2 (en) | 2003-06-16 | 2008-10-08 | ソニー株式会社 | Input method and input device |
US20050041667A1 (en) | 2003-06-30 | 2005-02-24 | Microsoft Corporation | Calendar channel |
US7433714B2 (en) | 2003-06-30 | 2008-10-07 | Microsoft Corporation | Alert mechanism interface |
US7580033B2 (en) | 2003-07-16 | 2009-08-25 | Honeywood Technologies, Llc | Spatial-based power savings |
US7257254B2 (en) | 2003-07-24 | 2007-08-14 | Sap Ag | Method and system for recognizing time |
TW200512616A (en) | 2003-09-17 | 2005-04-01 | Chi-Hung Su | Interactive mechanism allowing internet users to link database and self-configure dynamic 360-degree object-browsing webpage content |
JP4168338B2 (en) | 2003-09-18 | 2008-10-22 | ブラザー工業株式会社 | Installation program, computer-readable recording medium, and installation method |
US7500127B2 (en) | 2003-09-18 | 2009-03-03 | Vulcan Portals Inc. | Method and apparatus for operating an electronic device in a low power mode |
US20060008256A1 (en) | 2003-10-01 | 2006-01-12 | Khedouri Robert K | Audio visual player apparatus and system and method of content distribution using the same |
US7302650B1 (en) | 2003-10-31 | 2007-11-27 | Microsoft Corporation | Intuitive tools for manipulating objects in a display |
US7218575B2 (en) | 2003-10-31 | 2007-05-15 | Rosevear John M | Angular twilight clock |
US8645336B2 (en) | 2003-11-07 | 2014-02-04 | Magnaforte, Llc | Digital interactive phrasing system and method |
TWI254202B (en) | 2003-12-05 | 2006-05-01 | Mediatek Inc | Portable electronic apparatus and power management method thereof |
US20050122543A1 (en) * | 2003-12-05 | 2005-06-09 | Eric Walker | System and method for custom color design |
TWI236162B (en) | 2003-12-26 | 2005-07-11 | Ind Tech Res Inst | Light emitting diode |
US20050198319A1 (en) | 2004-01-15 | 2005-09-08 | Yahoo! Inc. | Techniques for parental control of internet access including a guest mode |
US8171084B2 (en) | 2004-01-20 | 2012-05-01 | Microsoft Corporation | Custom emoticons |
US7707520B2 (en) | 2004-01-30 | 2010-04-27 | Yahoo! Inc. | Method and apparatus for providing flash-based avatars |
CN1961333A (en) | 2004-02-12 | 2007-05-09 | 贝斯简·阿利万迪 | System and method for producing merchandise from a virtual environment |
IL160429A0 (en) | 2004-02-16 | 2005-11-20 | Home Comfort Technologies Ltd | Environmental control system |
US7637204B2 (en) | 2004-02-26 | 2009-12-29 | Sunbeam Products, Inc. | Brewing device with time-since-brew indicator |
US20050190653A1 (en) | 2004-02-27 | 2005-09-01 | Chen Chih Y. | Method of displaying world time with automatic correction of daylight saving time in a movement |
US7697960B2 (en) | 2004-04-23 | 2010-04-13 | Samsung Electronics Co., Ltd. | Method for displaying status information on a mobile terminal |
US7555717B2 (en) | 2004-04-30 | 2009-06-30 | Samsung Electronics Co., Ltd. | Method for displaying screen image on mobile terminal |
JP2005339017A (en) | 2004-05-25 | 2005-12-08 | Mitsubishi Electric Corp | Electronic device |
US20050278757A1 (en) | 2004-05-28 | 2005-12-15 | Microsoft Corporation | Downloadable watch faces |
WO2005119682A1 (en) | 2004-06-02 | 2005-12-15 | Koninklijke Philips Electronics N.V. | Clock-based user interface for hdd time-shift buffer navigation |
US7490295B2 (en) | 2004-06-25 | 2009-02-10 | Apple Inc. | Layer for accessing user interface elements |
US7761800B2 (en) | 2004-06-25 | 2010-07-20 | Apple Inc. | Unified interest layer for user interface |
US8453065B2 (en) | 2004-06-25 | 2013-05-28 | Apple Inc. | Preview and installation of user interface elements in a display environment |
US20060007785A1 (en) | 2004-07-08 | 2006-01-12 | Fernandez Juan C | Method and system for displaying appointments |
US20060020904A1 (en) * | 2004-07-09 | 2006-01-26 | Antti Aaltonen | Stripe user interface |
US20060035628A1 (en) | 2004-07-30 | 2006-02-16 | Microsoft Corporation | Weather channel |
US7411590B1 (en) | 2004-08-09 | 2008-08-12 | Apple Inc. | Multimedia file format |
US7619615B1 (en) | 2004-08-31 | 2009-11-17 | Sun Microsystems, Inc. | Method and apparatus for soft keys of an electronic device |
JP2006071582A (en) | 2004-09-06 | 2006-03-16 | Terumo Corp | Wrist watch with ultraviolet ray measuring function |
US7593755B2 (en) | 2004-09-15 | 2009-09-22 | Microsoft Corporation | Display of wireless data |
JP4636845B2 (en) | 2004-10-07 | 2011-02-23 | 任天堂株式会社 | GAME DEVICE AND GAME PROGRAM |
US7519923B2 (en) | 2004-10-20 | 2009-04-14 | International Business Machines Corporation | Method for generating a tree view of elements in a graphical user interface (GUI) |
US7614011B2 (en) | 2004-10-21 | 2009-11-03 | International Business Machines Corporation | Apparatus and method for display power saving |
US20060092770A1 (en) | 2004-10-30 | 2006-05-04 | Demas Theodore J | Information displays and methods associated therewith |
US7336280B2 (en) | 2004-11-18 | 2008-02-26 | Microsoft Corporation | Coordinating animations and media in computer display output |
US7671845B2 (en) | 2004-11-30 | 2010-03-02 | Microsoft Corporation | Directional input device and display orientation control |
JP4449723B2 (en) | 2004-12-08 | 2010-04-14 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
KR100663277B1 (en) | 2004-12-20 | 2007-01-02 | 삼성전자주식회사 | Device and?method for processing system-related event in wireless terminal |
US7619616B2 (en) | 2004-12-21 | 2009-11-17 | Microsoft Corporation | Pressure sensitive controls |
US20060142944A1 (en) | 2004-12-23 | 2006-06-29 | France Telecom | Technique for creating, directing, storing, and automatically delivering a message to an intended recipient based on climatic conditions |
KR100511210B1 (en) | 2004-12-27 | 2005-08-30 | 주식회사지앤지커머스 | Method for converting 2d image into pseudo 3d image and user-adapted total coordination method in use artificial intelligence, and service besiness method thereof |
US8488023B2 (en) | 2009-05-20 | 2013-07-16 | DigitalOptics Corporation Europe Limited | Identifying facial expressions in acquired digital images |
JP4580783B2 (en) | 2005-03-02 | 2010-11-17 | 日本電信電話株式会社 | Information display apparatus and method and information display program thereof |
JP4943031B2 (en) | 2005-03-16 | 2012-05-30 | 京セラミタ株式会社 | Operation panel and display control method of operation panel |
US7751285B1 (en) | 2005-03-28 | 2010-07-06 | Nano Time, LLC | Customizable and wearable device with electronic images |
JP2006295514A (en) | 2005-04-11 | 2006-10-26 | Hitachi Ltd | Apparatus and method for displaying contents information |
KR20060109708A (en) | 2005-04-18 | 2006-10-23 | 어윤형 | Universal timepiece which can display day-and-night |
US7607582B2 (en) | 2005-04-22 | 2009-10-27 | Microsoft Corporation | Aggregation and synchronization of nearby media |
GB0509259D0 (en) | 2005-05-06 | 2005-06-15 | Beswick Andrew E | Device for dispensing paste |
KR100740189B1 (en) | 2005-05-13 | 2007-07-16 | 노키아 코포레이션 | Device with a graphical user interface |
US7672875B2 (en) | 2005-06-06 | 2010-03-02 | International Business Machines Corporation | Presenting an alternative product package offer from a web vendor |
JP2008542942A (en) | 2005-06-10 | 2008-11-27 | ノキア コーポレイション | Reconfiguration of electronic device standby screen |
US7685530B2 (en) | 2005-06-10 | 2010-03-23 | T-Mobile Usa, Inc. | Preferred contact group centric interface |
KR100716288B1 (en) | 2005-06-17 | 2007-05-09 | 삼성전자주식회사 | Display apparatus and control method thereof |
US20060294465A1 (en) | 2005-06-22 | 2006-12-28 | Comverse, Inc. | Method and system for creating and distributing mobile avatars |
US7861099B2 (en) | 2006-06-30 | 2010-12-28 | Intel Corporation | Method and apparatus for user-activity-based dynamic power management and policy creation for mobile platforms |
US20070004451A1 (en) | 2005-06-30 | 2007-01-04 | C Anderson Eric | Controlling functions of a handheld multifunction device |
US7659836B2 (en) | 2005-07-20 | 2010-02-09 | Astrazeneca Ab | Device for communicating with a voice-disabled person |
US8384763B2 (en) | 2005-07-26 | 2013-02-26 | Her Majesty the Queen in right of Canada as represented by the Minster of Industry, Through the Communications Research Centre Canada | Generating a depth map from a two-dimensional source image for stereoscopic and multiview imaging |
JP2007041385A (en) | 2005-08-04 | 2007-02-15 | Seiko Epson Corp | Display device and method for controlling the same |
WO2007018881A2 (en) | 2005-08-05 | 2007-02-15 | Walker Digital, Llc | Efficient customized media creation through pre-encoding of common elements |
US7760269B2 (en) * | 2005-08-22 | 2010-07-20 | Hewlett-Packard Development Company, L.P. | Method and apparatus for sizing an image on a display |
KR20070025292A (en) | 2005-09-01 | 2007-03-08 | 삼성전자주식회사 | Display apparatus |
US20070055947A1 (en) | 2005-09-02 | 2007-03-08 | Microsoft Corporation | Animations and transitions |
WO2007030503A2 (en) | 2005-09-06 | 2007-03-15 | Pattern Intelligence, Inc. | Graphical user interfaces |
KR100802615B1 (en) | 2005-09-09 | 2008-02-13 | 엘지전자 주식회사 | Event display apparatus for mobile terminal and method thereof |
US20070057775A1 (en) | 2005-09-10 | 2007-03-15 | O'reilly Mike R | Unpredictable alarm clock |
US7378954B2 (en) | 2005-10-21 | 2008-05-27 | Barry Myron Wendt | Safety indicator and method |
US20070101279A1 (en) | 2005-10-27 | 2007-05-03 | Chaudhri Imran A | Selection of user interface elements for unified display in a display environment |
JP2007163294A (en) | 2005-12-14 | 2007-06-28 | Sony Corp | Wrist watch, display method of wrist watch, and program |
CN101385071B (en) | 2005-12-22 | 2011-01-26 | 捷讯研究有限公司 | Method and apparatus for reducing power consumption in a display for an electronic device |
WO2007095257A2 (en) * | 2006-02-10 | 2007-08-23 | Freedom Scientific, Inc. | System-wide content-sensitive text stylization and replacement |
ES2284376B1 (en) | 2006-02-21 | 2008-09-16 | Io Think Future, Sl | ELECTRONIC WATCH WITH SIMPLIFIED ELECTRONICS. |
WO2007100767A2 (en) * | 2006-02-24 | 2007-09-07 | Visan Industries | Systems and methods for dynamically designing a product with digital content |
US8280979B2 (en) | 2006-02-27 | 2012-10-02 | Microsoft Corporation | Persistent public machine setting |
US7898542B1 (en) | 2006-03-01 | 2011-03-01 | Adobe Systems Incorporated | Creating animation effects |
US20070266239A1 (en) | 2006-03-08 | 2007-11-15 | David Vismans | Method for providing a cryptographically signed command |
KR100754674B1 (en) | 2006-03-10 | 2007-09-03 | 삼성전자주식회사 | Method and apparatus for selecting menu in portable terminal |
US7836400B2 (en) | 2006-03-31 | 2010-11-16 | Research In Motion Limited | Snooze support for event reminders |
US9395905B2 (en) | 2006-04-05 | 2016-07-19 | Synaptics Incorporated | Graphical scroll wheel |
KR20070100598A (en) | 2006-04-07 | 2007-10-11 | 삼성전자주식회사 | Information recording medium, reproducing method and reproducing apparatus |
US7924657B2 (en) | 2006-05-03 | 2011-04-12 | Liebowitz Daniel | Apparatus and method for time management and instruction |
US8620038B2 (en) | 2006-05-05 | 2013-12-31 | Parham Aarabi | Method, system and computer program product for automatic and semi-automatic modification of digital images of faces |
EP2194509A1 (en) | 2006-05-07 | 2010-06-09 | Sony Computer Entertainment Inc. | Method for providing affective characteristics to computer generated avatar during gameplay |
KR100679412B1 (en) | 2006-05-11 | 2007-02-07 | 삼성전자주식회사 | Method and apparatus for controlling alarm function of a mobile terminal with a inertial sensor |
US20070261537A1 (en) | 2006-05-12 | 2007-11-15 | Nokia Corporation | Creating and sharing variations of a music file |
EP1857953B1 (en) | 2006-05-16 | 2008-12-03 | EM Microelectronic-Marin SA | Method and system for authentication and secure exchange of data between a personalised chip and a dedicated server |
US20070271513A1 (en) | 2006-05-22 | 2007-11-22 | Nike, Inc. | User Interface for Remotely Controlling a Digital Music Player |
KR200425314Y1 (en) | 2006-06-16 | 2006-09-11 | 신상열 | Multi-function LCD Clock |
EP2050082A1 (en) | 2006-07-13 | 2009-04-22 | Partygaming Ia Limited | Networked gaming system |
JP5076388B2 (en) | 2006-07-28 | 2012-11-21 | 富士通セミコンダクター株式会社 | Semiconductor device and manufacturing method thereof |
US20080052242A1 (en) | 2006-08-23 | 2008-02-28 | Gofigure! Llc | Systems and methods for exchanging graphics between communication devices |
US8078036B2 (en) | 2006-08-23 | 2011-12-13 | Sony Corporation | Custom content compilation using digital chapter marks |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US7771320B2 (en) | 2006-09-07 | 2010-08-10 | Nike, Inc. | Athletic performance sensing and/or tracking systems and methods |
US9544196B2 (en) | 2006-09-20 | 2017-01-10 | At&T Intellectual Property I, L.P. | Methods, systems and computer program products for determining installation status of SMS packages |
US7536645B2 (en) * | 2006-10-23 | 2009-05-19 | Research In Motion, Ltd | System and method for customizing layer based themes |
US8971667B2 (en) * | 2006-10-23 | 2015-03-03 | Hewlett-Packard Development Company, L.P. | Digital image auto-resizing |
US7518959B2 (en) | 2006-12-01 | 2009-04-14 | Seiko Epson Corporation | Display device and display method |
KR100912877B1 (en) | 2006-12-02 | 2009-08-18 | 한국전자통신연구원 | A mobile communication terminal having a function of the creating 3d avata model and the method thereof |
US7515509B2 (en) | 2006-12-08 | 2009-04-07 | Jennifer Klein | Teaching clock |
US8179388B2 (en) | 2006-12-15 | 2012-05-15 | Nvidia Corporation | System, method and computer program product for adjusting a refresh rate of a display for power savings |
US20080215240A1 (en) | 2006-12-18 | 2008-09-04 | Damian Howard | Integrating User Interfaces |
JP5157328B2 (en) | 2006-12-21 | 2013-03-06 | セイコーエプソン株式会社 | Pointer type display device |
US7940604B2 (en) | 2006-12-21 | 2011-05-10 | Seiko Epson Corporation | Dial indicator display device |
US7656275B2 (en) | 2006-12-22 | 2010-02-02 | Research In Motion Limited | System and method for controlling an alarm for an electronic device |
US8041968B2 (en) | 2007-01-04 | 2011-10-18 | Apple Inc. | Power management for driving display with baseband portion when application portion is in low power mode |
US7957762B2 (en) | 2007-01-07 | 2011-06-07 | Apple Inc. | Using ambient light sensor to augment proximity sensor output |
US8607167B2 (en) | 2007-01-07 | 2013-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for providing maps and directions |
US20080174606A1 (en) | 2007-01-23 | 2008-07-24 | Srikanth Rengarajan | Method and apparatus for low power refresh of a display device |
KR101239797B1 (en) | 2007-02-07 | 2013-03-06 | 엘지전자 주식회사 | Electronic Device With Touch Screen And Method Of Providing Analog Clock Using Same |
KR100896711B1 (en) | 2007-02-08 | 2009-05-11 | 삼성전자주식회사 | Method for executing function according to tap in mobile terminal with touch screen |
KR100801650B1 (en) | 2007-02-13 | 2008-02-05 | 삼성전자주식회사 | Method for executing function in idle screen of mobile terminal |
US7752188B2 (en) | 2007-02-16 | 2010-07-06 | Sony Ericsson Mobile Communications Ab | Weather information in a calendar |
CA2578927C (en) | 2007-02-19 | 2011-09-27 | Ray Arbesman | Precut adhesive body support articles and support system |
GB0703276D0 (en) | 2007-02-20 | 2007-03-28 | Skype Ltd | Instant messaging activity notification |
JPWO2008114491A1 (en) | 2007-03-20 | 2010-07-01 | 株式会社Access | Terminal having application update management function, application update management program, and system |
US20100084996A1 (en) | 2007-03-29 | 2010-04-08 | Koninklijke Philips Electronics N.V. | Natural daylight mimicking system and user interface |
KR101390103B1 (en) | 2007-04-03 | 2014-04-28 | 엘지전자 주식회사 | Controlling image and mobile terminal |
US8868053B2 (en) | 2007-04-20 | 2014-10-21 | Raphael A. Thompson | Communication delivery filter for mobile device |
US7735019B2 (en) | 2007-04-25 | 2010-06-08 | International Business Machines Corporation | Method for providing functional context within an actively scrolling view pane |
AT505245B1 (en) | 2007-05-25 | 2011-02-15 | Krieger Martin Mag | ELECTRONICALLY CONTROLLED CLOCK |
US8253770B2 (en) | 2007-05-31 | 2012-08-28 | Eastman Kodak Company | Residential video communication system |
CN100492288C (en) | 2007-06-14 | 2009-05-27 | 腾讯科技(深圳)有限公司 | Application program interface processing method and system |
KR20090002176A (en) | 2007-06-20 | 2009-01-09 | 엔에이치엔(주) | System for providing ranking of game-avatar in network and method thereof |
US8171432B2 (en) | 2008-01-06 | 2012-05-01 | Apple Inc. | Touch screen device, method, and graphical user interface for displaying and selecting application options |
US7720855B2 (en) | 2007-07-02 | 2010-05-18 | Brown Stephen J | Social network for affecting personal behavior |
GB2450757A (en) | 2007-07-06 | 2009-01-07 | Sony Comp Entertainment Europe | Avatar customisation, transmission and reception |
JP5063227B2 (en) | 2007-07-09 | 2012-10-31 | キヤノン株式会社 | Imaging control device, control method therefor, and program |
US20090016168A1 (en) | 2007-07-12 | 2009-01-15 | Emily Smith | Timepiece Device |
KR20090008976A (en) | 2007-07-19 | 2009-01-22 | 삼성전자주식회사 | Map scrolling method in navigation terminal and the navigation terminal thereof |
US8726194B2 (en) | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
US8422550B2 (en) | 2007-07-27 | 2013-04-16 | Lagavulin Limited | Apparatuses, methods, and systems for a portable, automated contractual image dealer and transmitter |
US8146005B2 (en) | 2007-08-07 | 2012-03-27 | International Business Machines Corporation | Creating a customized avatar that reflects a user's distinguishable attributes |
US8900731B2 (en) | 2007-08-24 | 2014-12-02 | Motorola Solutions, Inc. | Charger system for communication devices using a charger circuit to communicate a charge status to a portable host device |
US7778118B2 (en) | 2007-08-28 | 2010-08-17 | Garmin Ltd. | Watch device having touch-bezel user interface |
US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
US20090068984A1 (en) | 2007-09-06 | 2009-03-12 | Burnett R Alan | Method, apparatus, and system for controlling mobile device use |
US8390628B2 (en) | 2007-09-11 | 2013-03-05 | Sony Computer Entertainment America Llc | Facial animation using motion capture data |
JP4466702B2 (en) | 2007-09-12 | 2010-05-26 | カシオ計算機株式会社 | Imaging apparatus and imaging control program |
KR100929236B1 (en) * | 2007-09-18 | 2009-12-01 | 엘지전자 주식회사 | Portable terminal with touch screen and operation control method thereof |
TW200915698A (en) | 2007-09-29 | 2009-04-01 | Acer Inc | Device to change the efficacy of charging or discharging based on the status of battery |
WO2009053775A1 (en) | 2007-10-23 | 2009-04-30 | Mitchell Foy | A system and apparatus for displaying local time without numeration |
KR100864578B1 (en) | 2007-10-31 | 2008-10-20 | 주식회사 엘지텔레콤 | Method and system for providing mibile widget service having quick link function |
US8600457B2 (en) | 2007-11-30 | 2013-12-03 | Microsoft Corporation | Sleep mode for mobile communication device |
US8892999B2 (en) | 2007-11-30 | 2014-11-18 | Nike, Inc. | Interactive avatar for social network services |
US20090146962A1 (en) | 2007-12-05 | 2009-06-11 | Nokia Corporation | Mobile communication terminal and method |
US8140335B2 (en) | 2007-12-11 | 2012-03-20 | Voicebox Technologies, Inc. | System and method for providing a natural language voice user interface in an integrated voice navigation services environment |
US8965787B2 (en) | 2007-12-17 | 2015-02-24 | Smooth Productions Inc. | Communications system and method for serving electronic content |
JP2009147889A (en) | 2007-12-18 | 2009-07-02 | Cybird Holdings Co Ltd | Image management system |
US20090164923A1 (en) | 2007-12-21 | 2009-06-25 | Nokia Corporation | Method, apparatus and computer program product for providing an adaptive icon |
US8327277B2 (en) | 2008-01-14 | 2012-12-04 | Microsoft Corporation | Techniques to automatically manage overlapping objects |
AU2009209018B2 (en) | 2008-01-30 | 2014-03-20 | Google Llc | Notification of mobile device events |
US8677285B2 (en) | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
US20090198581A1 (en) | 2008-02-06 | 2009-08-06 | Lidestri James M | Digital creative works widgets |
US20090201297A1 (en) | 2008-02-07 | 2009-08-13 | Johansson Carolina S M | Electronic device with animated character and method |
EP2263190A2 (en) | 2008-02-13 | 2010-12-22 | Ubisoft Entertainment S.A. | Live-action image capture |
US20110230986A1 (en) | 2008-02-20 | 2011-09-22 | Nike, Inc. | Systems and Methods for Storing and Analyzing Golf Data, Including Community and Individual Golf Data Collection and Storage at a Central Hub |
US8156060B2 (en) | 2008-02-27 | 2012-04-10 | Inteliwise Sp Z.O.O. | Systems and methods for generating and implementing an interactive man-machine web interface based on natural language processing and avatar virtual agent based character |
US20090254862A1 (en) | 2008-03-03 | 2009-10-08 | Kidzui, Inc | Method and apparatus for user interface for child oriented computer network |
JP2009217612A (en) | 2008-03-11 | 2009-09-24 | Toshiba Corp | Portable information terminal device |
US9513704B2 (en) | 2008-03-12 | 2016-12-06 | Immersion Corporation | Haptically enabled user interface |
US20100211899A1 (en) | 2009-02-17 | 2010-08-19 | Robb Fujioka | Virtual Marketplace Accessible To Widgetized Avatars |
US8634796B2 (en) | 2008-03-14 | 2014-01-21 | William J. Johnson | System and method for location based exchanges of data facilitating distributed location applications |
JP5266820B2 (en) | 2008-03-19 | 2013-08-21 | セイコーエプソン株式会社 | Satellite signal receiving device and control method of satellite signal receiving device |
US8169438B1 (en) | 2008-03-31 | 2012-05-01 | Pixar | Temporally coherent hair deformation |
WO2009122684A1 (en) | 2008-04-01 | 2009-10-08 | Yanase Takatoshi | Display system, display method, program, and recording medium |
US20090251484A1 (en) | 2008-04-03 | 2009-10-08 | Motorola, Inc. | Avatar for a portable device |
US8832552B2 (en) | 2008-04-03 | 2014-09-09 | Nokia Corporation | Automated selection of avatar characteristics for groups |
KR100977385B1 (en) | 2008-04-10 | 2010-08-20 | 주식회사 팬택 | Mobile terminal able to control widget type wallpaper and method for wallpaper control using the same |
JP5643746B2 (en) | 2008-04-16 | 2014-12-17 | ナイキ イノベイト セー. フェー. | User interface for athletic performance for mobile devices |
US8976007B2 (en) | 2008-08-09 | 2015-03-10 | Brian M. Dugan | Systems and methods for providing biofeedback information to a cellular telephone and for using such information |
US20110026368A1 (en) | 2008-04-22 | 2011-02-03 | Relyea Gregg F | Graphic display programmable wristwatch |
KR101526967B1 (en) | 2008-04-23 | 2015-06-11 | 엘지전자 주식회사 | Apparatus for transmitting software in cable broadcast, apparatus and method for downloading software and receiving in cable broadcast |
KR101687689B1 (en) | 2008-04-30 | 2016-12-19 | 가부시키가이샤 아크로디아 | Character display data generation device and method |
ES2388704T3 (en) | 2008-05-11 | 2012-10-17 | Research In Motion Limited | Electronic device and method that provide enhanced processing of a predetermined clock event during operation of a rest time mode |
ES2387527T3 (en) | 2008-05-11 | 2012-09-25 | Research In Motion Limited | Electronic device and method that provide an improved indication that an alarm clock is in an activated state |
EP2120116B1 (en) | 2008-05-11 | 2011-12-07 | Research In Motion Limited | Electronic device and method providing improved alarm clock feature and facilitated alarm |
CN101667091A (en) | 2008-05-15 | 2010-03-10 | 杭州惠道科技有限公司 | Human-computer interface for predicting user input in real time |
US8620641B2 (en) | 2008-05-16 | 2013-12-31 | Blackberry Limited | Intelligent elision |
KR101488726B1 (en) | 2008-05-27 | 2015-02-06 | 삼성전자주식회사 | Display apparatus for displaying a widget window and display system including the display apparatus and method for displaying thereof |
US8401284B2 (en) | 2008-05-28 | 2013-03-19 | Apple Inc. | Color correcting method and apparatus |
CN105327509B (en) | 2008-06-02 | 2019-04-19 | 耐克创新有限合伙公司 | System and method for creating an avatar |
JP2009293960A (en) | 2008-06-02 | 2009-12-17 | Sony Ericsson Mobile Communications Japan Inc | Display apparatus, portable terminal apparatus, and display method |
US20090307616A1 (en) | 2008-06-04 | 2009-12-10 | Nokia Corporation | User interface, device and method for an improved operating mode |
US9516116B2 (en) | 2008-06-06 | 2016-12-06 | Apple Inc. | Managing notification service connections |
US8135392B2 (en) | 2008-06-06 | 2012-03-13 | Apple Inc. | Managing notification service connections and displaying icon badges |
US8249660B2 (en) | 2008-06-11 | 2012-08-21 | At&T Intellectual Property I, Lp | System and method for display timeout on mobile communication devices |
US8010479B2 (en) | 2008-06-18 | 2011-08-30 | International Business Machines Corporation | Simplifying the creation of user-defined custom elements for use in a graphical modeling application |
US20090327897A1 (en) | 2008-06-26 | 2009-12-31 | Flypaper Studio, Inc. | System and Method For An Interactive Presentation System |
US20090327886A1 (en) | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Use of secondary factors to analyze user intention in gui element activation |
WO2010000300A1 (en) | 2008-06-30 | 2010-01-07 | Accenture Global Services Gmbh | Gaming system |
US8446414B2 (en) | 2008-07-14 | 2013-05-21 | Microsoft Corporation | Programming APIs for an extensible avatar system |
US10983665B2 (en) | 2008-08-01 | 2021-04-20 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for implementing user interface |
US8221125B2 (en) | 2008-08-14 | 2012-07-17 | World View Time Inc. | Electronic presentation of world time zones |
KR101179026B1 (en) | 2008-08-28 | 2012-09-03 | 에스케이플래닛 주식회사 | Apparatus and method for providing idle screen with mobile widget service |
KR101215175B1 (en) | 2008-08-28 | 2012-12-24 | 에스케이플래닛 주식회사 | System and method for providing multi-idle screen |
US20100064255A1 (en) | 2008-09-05 | 2010-03-11 | Apple Inc. | Contextual menus in an electronic device |
US8512211B2 (en) | 2008-09-05 | 2013-08-20 | Apple Inc. | Method for quickstart workout generation and calibration |
US8341557B2 (en) | 2008-09-05 | 2012-12-25 | Apple Inc. | Portable touch screen device, method, and graphical user interface for providing workout support |
US8385822B2 (en) | 2008-09-26 | 2013-02-26 | Hewlett-Packard Development Company, L.P. | Orientation and presence detection for use in configuring operations of computing devices in docked environments |
KR101546782B1 (en) | 2008-10-02 | 2015-08-25 | 삼성전자주식회사 | Apparatus and method for composing idle screen in a portable terminal |
US8872646B2 (en) | 2008-10-08 | 2014-10-28 | Dp Technologies, Inc. | Method and system for waking up a device due to motion |
EP2175343A1 (en) | 2008-10-08 | 2010-04-14 | Research in Motion Limited | A method and handheld electronic device having a graphical user interface which arranges icons dynamically |
US8941642B2 (en) | 2008-10-17 | 2015-01-27 | Kabushiki Kaisha Square Enix | System for the creation and editing of three dimensional models |
KR101510738B1 (en) | 2008-10-20 | 2015-04-10 | 삼성전자주식회사 | Apparatus and method for composing idle screen in a portable terminal |
US20100107100A1 (en) | 2008-10-23 | 2010-04-29 | Schneekloth Jason S | Mobile Device Style Abstraction |
DE102008054113A1 (en) * | 2008-10-31 | 2010-05-06 | Deutsche Telekom Ag | Method for adapting the background image on a screen |
WO2010051493A2 (en) | 2008-10-31 | 2010-05-06 | Nettoons, Inc. | Web-based real-time animation visualization, creation, and distribution |
US20100124152A1 (en) | 2008-11-18 | 2010-05-20 | Gilbert Kye Lee | Image Clock |
US8493408B2 (en) | 2008-11-19 | 2013-07-23 | Apple Inc. | Techniques for manipulating panoramas |
JP4752900B2 (en) | 2008-11-19 | 2011-08-17 | ソニー株式会社 | Image processing apparatus, image display method, and image display program |
KR101450580B1 (en) | 2008-11-19 | 2014-10-14 | 삼성전자주식회사 | Method and Apparatus for composing images |
JP5256001B2 (en) | 2008-11-20 | 2013-08-07 | 京セラドキュメントソリューションズ株式会社 | Color adjustment apparatus, method and program |
PL2194378T3 (en) | 2008-12-02 | 2013-08-30 | Hoffmann La Roche | Hand tool for measuring the analyte concentration in a body fluid sample |
US9197738B2 (en) | 2008-12-04 | 2015-11-24 | Microsoft Technology Licensing, Llc | Providing selected data through a locked display |
US20100146437A1 (en) | 2008-12-04 | 2010-06-10 | Microsoft Corporation | Glanceable animated notifications on a locked device |
KR101050642B1 (en) | 2008-12-04 | 2011-07-19 | 삼성전자주식회사 | Watch phone and method of conducting call in watch phone |
US20100149573A1 (en) | 2008-12-17 | 2010-06-17 | Xerox Corporation | System and method of providing image forming machine power up status information |
US20100153847A1 (en) | 2008-12-17 | 2010-06-17 | Sony Computer Entertainment America Inc. | User deformation of movie character images |
US8522163B2 (en) | 2008-12-19 | 2013-08-27 | Verizon Patent And Licensing Inc. | Systems and methods for radial display of time based information |
US8788655B2 (en) | 2008-12-19 | 2014-07-22 | Openpeak Inc. | Systems for accepting and approving applications and methods of operation of same |
KR101545880B1 (en) | 2008-12-22 | 2015-08-21 | 삼성전자주식회사 | Terminal having touch screen and method for displaying data thereof |
US8229411B2 (en) | 2008-12-30 | 2012-07-24 | Verizon Patent And Licensing Inc. | Graphical user interface for mobile device |
EP2204702B1 (en) | 2008-12-30 | 2014-04-23 | Vodafone Holding GmbH | Clock |
US20120001922A1 (en) | 2009-01-26 | 2012-01-05 | Escher Marc | System and method for creating and sharing personalized fonts on a client/server architecture |
JP2010176170A (en) | 2009-01-27 | 2010-08-12 | Sony Ericsson Mobile Communications Japan Inc | Display apparatus, display control method, and display control program |
EP2214087B1 (en) | 2009-01-30 | 2015-07-08 | BlackBerry Limited | A handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
US8295546B2 (en) | 2009-01-30 | 2012-10-23 | Microsoft Corporation | Pose tracking pipeline |
US10175848B2 (en) | 2009-02-09 | 2019-01-08 | Nokia Technologies Oy | Displaying a display portion including an icon enabling an item to be added to a list |
US8386957B2 (en) * | 2009-02-25 | 2013-02-26 | Hewlett-Packard Development Company, L.P. | Method for dynamically scaling an original background layout |
US20100223563A1 (en) | 2009-03-02 | 2010-09-02 | Apple Inc. | Remotely defining a user interface for a handheld device |
US20100226213A1 (en) | 2009-03-04 | 2010-09-09 | Brian Robert Drugge | User Customizable Timepiece |
CN101505320B (en) | 2009-03-09 | 2013-01-16 | 腾讯科技(深圳)有限公司 | Graphic user interface sharing method, system and tool |
US8756534B2 (en) | 2009-03-16 | 2014-06-17 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100245268A1 (en) | 2009-03-30 | 2010-09-30 | Stg Interactive S.A. | User-friendly process for interacting with informational content on touchscreen devices |
US8238876B2 (en) | 2009-03-30 | 2012-08-07 | Microsoft Corporation | Notifications |
US8167127B2 (en) | 2009-03-31 | 2012-05-01 | Marware Inc. | Protective carrying case for a portable electronic device |
KR20100111563A (en) * | 2009-04-07 | 2010-10-15 | 삼성전자주식회사 | Method for composing display in mobile terminal |
DE102009018165A1 (en) | 2009-04-18 | 2010-10-21 | Schreiber & Friends | Method for displaying an animated object |
JP2010257051A (en) | 2009-04-22 | 2010-11-11 | Funai Electric Co Ltd | Rotary input device and electronic equipment |
EP2425302B1 (en) | 2009-04-26 | 2019-03-13 | NIKE Innovate C.V. | Athletic watch |
US8601389B2 (en) | 2009-04-30 | 2013-12-03 | Apple Inc. | Scrollable menus and toolbars |
US20100277470A1 (en) | 2009-05-01 | 2010-11-04 | Microsoft Corporation | Systems And Methods For Applying Model Tracking To Motion Capture |
US9377857B2 (en) | 2009-05-01 | 2016-06-28 | Microsoft Technology Licensing, Llc | Show body position |
US20100289723A1 (en) | 2009-05-16 | 2010-11-18 | David London | Teleidoscopic display device |
US8713459B2 (en) | 2009-05-29 | 2014-04-29 | Jason Philip Yanchar | Graphical planner |
US8464182B2 (en) | 2009-06-07 | 2013-06-11 | Apple Inc. | Device, method, and graphical user interface for providing maps, directions, and location-based information |
KR101598335B1 (en) | 2009-06-11 | 2016-02-29 | 엘지전자 주식회사 | Operating a Mobile Terminal |
US8446398B2 (en) | 2009-06-16 | 2013-05-21 | Intel Corporation | Power conservation for mobile device displays |
US8251294B2 (en) | 2009-07-02 | 2012-08-28 | Mastercard International, Inc. | Payment device having appeal for status consumers |
CH701440A2 (en) | 2009-07-03 | 2011-01-14 | Comme Le Temps SA | Touch-screen wristwatch and method for displaying on a watch with a touch screen |
US9513403B2 (en) | 2009-07-27 | 2016-12-06 | Peck Labs, Inc | Methods and systems for displaying customized icons |
SE534980C2 (en) | 2009-08-26 | 2012-03-06 | Svenska Utvecklings Entreprenoeren Susen Ab | Method of waking a sleepy motor vehicle driver |
JP5333068B2 (en) | 2009-08-31 | 2013-11-06 | ソニー株式会社 | Information processing apparatus, display method, and display program |
GB2475669A (en) | 2009-09-03 | 2011-06-01 | Tapisodes Ltd | Animated progress indicator for smartphone |
KR101390957B1 (en) | 2009-09-04 | 2014-05-02 | 나이키 인터내셔널 엘티디. | Monitoring and tracking athletic activity |
US8966375B2 (en) | 2009-09-07 | 2015-02-24 | Apple Inc. | Management of application programs on a portable electronic device |
CN101692681A (en) | 2009-09-17 | 2010-04-07 | 杭州聚贝软件科技有限公司 | Method and system for realizing virtual image interactive interface on phone set terminal |
EP3260969B1 (en) | 2009-09-22 | 2021-03-03 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8832585B2 (en) | 2009-09-25 | 2014-09-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
TWI420332B (en) | 2009-09-29 | 2013-12-21 | Htc Corp | Method and apparatus for displaying weather condition and computer product using the same |
US20110074807A1 (en) | 2009-09-30 | 2011-03-31 | Hitachi, Ltd. | Method of color customization of content screen |
US8405663B2 (en) | 2009-10-01 | 2013-03-26 | Research In Motion Limited | Simulated resolution of stopwatch |
US8457651B2 (en) | 2009-10-02 | 2013-06-04 | Qualcomm Incorporated | Device movement user interface gestures for file sharing functionality |
US9176542B2 (en) | 2009-11-06 | 2015-11-03 | Sony Corporation | Accelerometer-based touchscreen user interface |
US8432367B2 (en) | 2009-11-19 | 2013-04-30 | Google Inc. | Translating user interaction with a touch screen into input commands |
CN101702112A (en) | 2009-11-19 | 2010-05-05 | 宇龙计算机通信科技(深圳)有限公司 | Setting method for standby graphical interfaces and electronic equipment |
US8364855B2 (en) | 2009-11-20 | 2013-01-29 | Apple Inc. | Dynamic interpretation of user input in a portable electronic device |
US8799816B2 (en) | 2009-12-07 | 2014-08-05 | Motorola Mobility Llc | Display interface and method for displaying multiple items arranged in a sequence |
JP5818806B2 (en) | 2009-12-09 | 2015-11-18 | ナイキ イノベイト シーブイ | Exercise performance monitoring system using heart rate information |
KR101626621B1 (en) | 2009-12-30 | 2016-06-01 | 엘지전자 주식회사 | Method for controlling data in mobile terminal having circle-type display unit and mobile terminal thereof |
US8438504B2 (en) | 2010-01-06 | 2013-05-07 | Apple Inc. | Device, method, and graphical user interface for navigating through multiple viewing areas |
US20110166777A1 (en) | 2010-01-07 | 2011-07-07 | Anand Kumar Chavakula | Navigation Application |
US20110173221A1 (en) | 2010-01-13 | 2011-07-14 | Microsoft Corporation | Calendar expand grid |
US20110179372A1 (en) | 2010-01-15 | 2011-07-21 | Bradford Allen Moore | Automatic Keyboard Layout Determination |
US20110181521A1 (en) | 2010-01-26 | 2011-07-28 | Apple Inc. | Techniques for controlling z-ordering in a user interface |
JP5286301B2 (en) | 2010-02-02 | 2013-09-11 | 光彌 齋藤 | Automatic pattern generation device, automatic generation method, and automatic generation program |
US20110197165A1 (en) | 2010-02-05 | 2011-08-11 | Vasily Filippov | Methods and apparatus for organizing a collection of widgets on a mobile device display |
KR101600549B1 (en) | 2010-02-11 | 2016-03-07 | 삼성전자주식회사 | Method and apparatus for providing history of information associated to time information |
KR20110093729A (en) | 2010-02-12 | 2011-08-18 | 삼성전자주식회사 | Method and apparatus of providing widget |
US9417787B2 (en) | 2010-02-12 | 2016-08-16 | Microsoft Technology Licensing, Llc | Distortion effects to indicate location in a movable data collection |
US20110205851A1 (en) | 2010-02-23 | 2011-08-25 | Jared Harris | E-Watch |
US20120028707A1 (en) | 2010-02-24 | 2012-02-02 | Valve Corporation | Game animations with multi-dimensional video game data |
US20110218765A1 (en) | 2010-03-02 | 2011-09-08 | Rogers Janice L | Illustrating and Displaying Time and The Expiration Thereof |
US20110221755A1 (en) | 2010-03-12 | 2011-09-15 | Kevin Geisner | Bionic motion |
CN101819486B (en) | 2010-03-23 | 2012-06-13 | 宇龙计算机通信科技(深圳)有限公司 | Method and device for monitoring touch screen and mobile terminal |
US8614560B2 (en) | 2010-03-26 | 2013-12-24 | Nokia Corporation | Method and apparatus for determining interaction mode |
US20110239115A1 (en) | 2010-03-26 | 2011-09-29 | Motorola, Inc. | Selecting an avatar on a display screen of a mobile device |
US8798610B2 (en) | 2010-03-26 | 2014-08-05 | Intel Corporation | Method and apparatus for bearer and server independent parental control on smartphone, managed by the smartphone |
JP2011209887A (en) | 2010-03-29 | 2011-10-20 | Sannetto KK | Method and program for creating avatar, and network service system |
JP5397699B2 (en) | 2010-03-31 | 2014-01-22 | 日本電気株式会社 | Mobile communication terminal and function restriction control method thereof |
FR2958487A1 (en) | 2010-04-06 | 2011-10-07 | Alcatel Lucent | Method of real-time distortion of a real entity recorded in a video sequence |
US8451994B2 (en) | 2010-04-07 | 2013-05-28 | Apple Inc. | Switching cameras during a video conference of a multi-camera mobile device |
TWI439960B (en) | 2010-04-07 | 2014-06-01 | Apple Inc | Avatar editing environment |
US9542038B2 (en) | 2010-04-07 | 2017-01-10 | Apple Inc. | Personalizing colors of user interfaces |
KR101642725B1 (en) | 2010-04-14 | 2016-08-11 | 삼성전자 주식회사 | Method and apparatus for managing lock function in mobile terminal |
US20110261079A1 (en) | 2010-04-21 | 2011-10-27 | Apple Inc. | Automatic adjustment of a user interface composition |
FI20105493A0 (en) | 2010-05-07 | 2010-05-07 | Polar Electro Oy | Power transmission |
US20120254804A1 (en) | 2010-05-21 | 2012-10-04 | Sheha Michael A | Personal wireless navigation system |
US8694899B2 (en) | 2010-06-01 | 2014-04-08 | Apple Inc. | Avatars reflecting user states |
US9245177B2 (en) | 2010-06-02 | 2016-01-26 | Microsoft Technology Licensing, Llc | Limiting avatar gesture display |
US10628985B2 (en) | 2017-12-01 | 2020-04-21 | Affectiva, Inc. | Avatar image animation using translation vectors |
US8749557B2 (en) | 2010-06-11 | 2014-06-10 | Microsoft Corporation | Interacting with user interface via avatar |
JP2011258160A (en) | 2010-06-11 | 2011-12-22 | Namco Bandai Games Inc | Program, information storage medium and image generation system |
US20130044080A1 (en) | 2010-06-16 | 2013-02-21 | Holy Stone Enterprise Co., Ltd. | Dual-view display device operating method |
US20110316858A1 (en) | 2010-06-24 | 2011-12-29 | Mediatek Inc. | Apparatuses and Methods for Real Time Widget Interactions |
US8484562B2 (en) | 2010-06-25 | 2013-07-09 | Apple Inc. | Dynamic text adjustment in a user interface element |
JP5134653B2 (en) | 2010-07-08 | 2013-01-30 | 株式会社バンダイナムコゲームス | Program and user terminal |
WO2012008628A1 (en) | 2010-07-13 | 2012-01-19 | 엘지전자 주식회사 | Mobile terminal and configuration method for standby screen thereof |
KR20120007686A (en) | 2010-07-15 | 2012-01-25 | 삼성전자주식회사 | Method and apparatus for controlling function in a touch device |
US9110589B1 (en) | 2010-07-21 | 2015-08-18 | Google Inc. | Tab bar control for mobile devices |
JP4635109B1 (en) | 2010-07-30 | 2011-02-23 | 日本テクノ株式会社 | Clock with a time display dial having a display function across its entire surface |
US8630392B2 (en) | 2010-07-30 | 2014-01-14 | Mitel Networks Corporation | World clock enabling time zone sensitive applications |
KR20120013727A (en) | 2010-08-06 | 2012-02-15 | 삼성전자주식회사 | Display apparatus and control method thereof |
US10572721B2 (en) | 2010-08-09 | 2020-02-25 | Nike, Inc. | Monitoring fitness using a mobile device |
KR101817048B1 (en) | 2010-08-09 | 2018-01-09 | 나이키 이노베이트 씨.브이. | Monitoring fitness using a mobile device |
US20120047447A1 (en) | 2010-08-23 | 2012-02-23 | Saad Ul Haq | Emotion based messaging system and statistical research tool |
KR101660746B1 (en) | 2010-08-24 | 2016-10-10 | 엘지전자 주식회사 | Mobile terminal and Method for setting application indicator thereof |
JP2012053642A (en) | 2010-08-31 | 2012-03-15 | Brother Ind Ltd | Communication device, communication system, communication control method, and communication control program |
US8854318B2 (en) | 2010-09-01 | 2014-10-07 | Nokia Corporation | Mode switching |
US8620850B2 (en) | 2010-09-07 | 2013-12-31 | Blackberry Limited | Dynamically manipulating an emoticon or avatar |
EP2426902A1 (en) | 2010-09-07 | 2012-03-07 | Research In Motion Limited | Dynamically manipulating an emoticon or avatar |
US20120062470A1 (en) | 2010-09-10 | 2012-03-15 | Chang Ray L | Power Management |
JP5230705B2 (en) * | 2010-09-10 | 2013-07-10 | 株式会社沖データ | Image processing device |
US8462997B2 (en) | 2010-09-15 | 2013-06-11 | Microsoft Corporation | User-specific attribute customization |
US20120069028A1 (en) | 2010-09-20 | 2012-03-22 | Yahoo! Inc. | Real-time animations of emoticons using facial recognition during a video chat |
US8558844B2 (en) | 2010-09-28 | 2013-10-15 | Apple Inc. | Systems, methods, and computer-readable media for changing colors of displayed assets |
US8830226B2 (en) | 2010-09-28 | 2014-09-09 | Apple Inc. | Systems, methods, and computer-readable media for integrating a three-dimensional asset with a three-dimensional model |
JP5249297B2 (en) | 2010-09-28 | 2013-07-31 | シャープ株式会社 | Image editing device |
US9483167B2 (en) | 2010-09-29 | 2016-11-01 | Adobe Systems Incorporated | User interface for a touch enabled device |
KR20120032888A (en) | 2010-09-29 | 2012-04-06 | 삼성전자주식회사 | Method and apparatus for reducing power consumption of mobile device |
US8768648B2 (en) | 2010-09-30 | 2014-07-01 | Fitbit, Inc. | Selection of display power mode based on sensor data |
US8781791B2 (en) | 2010-09-30 | 2014-07-15 | Fitbit, Inc. | Touchscreen with dynamically-defined areas having different scanning modes |
US8954290B2 (en) | 2010-09-30 | 2015-02-10 | Fitbit, Inc. | Motion-activated display of messages on an activity monitoring device |
TWI467462B (en) | 2010-10-01 | 2015-01-01 | Univ Nat Taiwan Science Tech | Active browsing method |
US9122053B2 (en) | 2010-10-15 | 2015-09-01 | Microsoft Technology Licensing, Llc | Realistic occlusion for a head mounted augmented reality display |
US20120113762A1 (en) | 2010-10-23 | 2012-05-10 | Frost Productions LLC | Electronic timepiece apparatus with random number and phrase generating functionality |
US8635475B2 (en) | 2010-10-27 | 2014-01-21 | Microsoft Corporation | Application-specific power management |
US9011292B2 (en) | 2010-11-01 | 2015-04-21 | Nike, Inc. | Wearable device assembly having athletic functionality |
US9195637B2 (en) | 2010-11-03 | 2015-11-24 | Microsoft Technology Licensing, Llc | Proportional font scaling |
TW201222405A (en) | 2010-11-16 | 2012-06-01 | Hon Hai Prec Ind Co Ltd | Method for configuring view of city in weather forecast application |
JP5622535B2 (en) | 2010-11-17 | 2014-11-12 | オリンパスイメージング株式会社 | Imaging device |
US8988214B2 (en) | 2010-12-10 | 2015-03-24 | Qualcomm Incorporated | System, method, apparatus, or computer program product for exercise and personal security |
EP2649782A2 (en) | 2010-12-10 | 2013-10-16 | Yota Devices IPR Ltd | Mobile device with user interface |
AU2010249319A1 (en) | 2010-12-13 | 2012-06-28 | Canon Kabushiki Kaisha | Conditional optimised paths in animated state machines |
CN102075727A (en) | 2010-12-30 | 2011-05-25 | 中兴通讯股份有限公司 | Method and device for processing images in videophone |
US9519418B2 (en) | 2011-01-18 | 2016-12-13 | Nokia Technologies Oy | Method and apparatus for providing a multi-stage device transition mechanism initiated based on a touch gesture |
TW201232486A (en) | 2011-01-26 | 2012-08-01 | Tomtom Int Bv | Navigation apparatus and method of providing weather condition information |
CN102142149A (en) | 2011-01-26 | 2011-08-03 | 深圳市同洲电子股份有限公司 | Method and device for obtaining contact image |
US8825362B2 (en) | 2011-01-27 | 2014-09-02 | Honda Motor Co., Ltd. | Calendar sharing for the vehicle environment using a connected cell phone |
US8635549B2 (en) * | 2011-02-10 | 2014-01-21 | Microsoft Corporation | Directly assigning desktop backgrounds |
GB201102794D0 (en) | 2011-02-17 | 2011-03-30 | Metail Ltd | Online retail system |
US8896652B2 (en) | 2011-02-28 | 2014-11-25 | Soryn Technologies Llc | System and method for real-time video communications |
US20130063383A1 (en) | 2011-02-28 | 2013-03-14 | Research In Motion Limited | Electronic device and method of displaying information in response to detecting a gesture |
JP5885185B2 (en) | 2011-03-07 | 2016-03-15 | 京セラ株式会社 | Mobile terminal device |
JP5749524B2 (en) | 2011-03-09 | 2015-07-15 | 京セラ株式会社 | Mobile terminal, mobile terminal control program, and mobile terminal control method |
JP3168099U (en) | 2011-03-11 | 2011-06-02 | 株式会社ホリウチ電子設計 | Clock using GPS time |
US20140043329A1 (en) | 2011-03-21 | 2014-02-13 | Peng Wang | Method of augmented makeover with 3d face modeling and landmark alignment |
JP5644622B2 (en) | 2011-03-24 | 2014-12-24 | 日本電気株式会社 | Display system, aggregation server, portable terminal, display method |
TW201239869A (en) | 2011-03-24 | 2012-10-01 | Hon Hai Prec Ind Co Ltd | System and method for adjusting font size on screen |
JP2012203832A (en) | 2011-03-28 | 2012-10-22 | Canon Inc | Display control device and control method thereof |
US9298287B2 (en) | 2011-03-31 | 2016-03-29 | Microsoft Technology Licensing, Llc | Combined activation for natural user interface systems |
US9239605B1 (en) | 2011-04-04 | 2016-01-19 | Google Inc. | Computing device power state transitions |
CN202217134U (en) | 2011-04-06 | 2012-05-09 | 金纬亘 | Dual-system clock for synchronously displaying world zone time or individual zone times |
US8643680B2 (en) | 2011-04-08 | 2014-02-04 | Amazon Technologies, Inc. | Gaze-based content display |
US9171268B1 (en) | 2011-04-22 | 2015-10-27 | Angel A. Penilla | Methods and systems for setting and transferring user profiles to vehicles and temporary sharing of user profiles to shared-use vehicles |
CN102750070A (en) | 2011-04-22 | 2012-10-24 | 上海三旗通信科技股份有限公司 | Interactive wallpaper mode for mobile data-related functions |
US8687840B2 (en) | 2011-05-10 | 2014-04-01 | Qualcomm Incorporated | Smart backlights to minimize display power consumption based on desktop configurations and user eye gaze |
JP2012236166A (en) | 2011-05-12 | 2012-12-06 | Mitsubishi Heavy Ind Ltd | CO2 recovery device and CO2 recovery method |
US20120297346A1 (en) | 2011-05-16 | 2012-11-22 | Encelium Holdings, Inc. | Three dimensional building control system and method |
US8665345B2 (en) | 2011-05-18 | 2014-03-04 | Intellectual Ventures Fund 83 Llc | Video summary including a feature of interest |
KR101891803B1 (en) | 2011-05-23 | 2018-08-27 | 삼성전자주식회사 | Method and apparatus for editing screen of mobile terminal comprising touch screen |
EP2527968B1 (en) | 2011-05-24 | 2017-07-05 | LG Electronics Inc. | Mobile terminal |
KR101892638B1 (en) | 2012-03-27 | 2018-08-28 | 엘지전자 주식회사 | Mobile terminal |
KR20120132134A (en) | 2011-05-27 | 2012-12-05 | 윤일 | Multinational 24-hour clock display |
KR101678271B1 (en) | 2011-06-05 | 2016-11-21 | 애플 인크. | Systems and methods for displaying notifications received from multiple applications |
US9013489B2 (en) | 2011-06-06 | 2015-04-21 | Microsoft Technology Licensing, Llc | Generation of avatar reflecting player appearance |
JP5765070B2 (en) | 2011-06-13 | 2015-08-19 | ソニー株式会社 | Display switching device, display switching method, display switching program |
JP5915000B2 (en) | 2011-06-13 | 2016-05-11 | ソニー株式会社 | Information processing apparatus and program |
JP6031735B2 (en) | 2011-06-13 | 2016-11-24 | ソニー株式会社 | Information processing apparatus, information processing method, and computer program |
US10083047B2 (en) | 2011-06-14 | 2018-09-25 | Samsung Electronics Co., Ltd. | System and method for executing multiple tasks in a mobile device |
WO2012174435A1 (en) | 2011-06-16 | 2012-12-20 | Richard Tao | Systems and methods for a virtual watch |
US8832284B1 (en) | 2011-06-16 | 2014-09-09 | Google Inc. | Virtual socializing |
US20120323933A1 (en) | 2011-06-20 | 2012-12-20 | Microsoft Corporation | Displaying notifications based on importance to the user |
US9153031B2 (en) | 2011-06-22 | 2015-10-06 | Microsoft Technology Licensing, Llc | Modifying video regions using mobile device input |
US9411506B1 (en) | 2011-06-28 | 2016-08-09 | Google Inc. | Providing additional functionality for a group messaging application |
US20130019175A1 (en) | 2011-07-14 | 2013-01-17 | Microsoft Corporation | Submenus for context based menu system |
US8854299B2 (en) | 2011-07-22 | 2014-10-07 | Blackberry Limited | Orientation based application launch system |
US8823318B2 (en) | 2011-07-25 | 2014-09-02 | ConvenientPower HK Ltd. | System and method for operating a mobile device |
JP5757815B2 (en) | 2011-07-27 | 2015-08-05 | 京セラ株式会社 | Electronic device, text editing method and control program |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
KR101832959B1 (en) | 2011-08-10 | 2018-02-28 | 엘지전자 주식회사 | Mobile device and control method for the same |
GB2493709A (en) | 2011-08-12 | 2013-02-20 | Siine Ltd | Faster input of text in graphical user interfaces |
KR101955976B1 (en) | 2011-08-25 | 2019-03-08 | 엘지전자 주식회사 | Activation of limited user interface capability device |
US8806369B2 (en) | 2011-08-26 | 2014-08-12 | Apple Inc. | Device, method, and graphical user interface for managing and interacting with concurrently open software applications |
US20130055147A1 (en) | 2011-08-29 | 2013-02-28 | Salesforce.Com, Inc. | Configuration, generation, and presentation of custom graphical user interface components for a virtual cloud-based application |
CN102968978B (en) | 2011-08-31 | 2016-01-27 | 联想(北京)有限公司 | Display refresh rate control method and device |
CN102298797A (en) | 2011-08-31 | 2011-12-28 | 深圳市美丽同盟科技有限公司 | Three-dimensional virtual fitting method, device and system |
US8890886B2 (en) * | 2011-09-02 | 2014-11-18 | Microsoft Corporation | User interface with color themes based on input image data |
CN102271241A (en) | 2011-09-02 | 2011-12-07 | 北京邮电大学 | Image communication method and system based on facial expression/action recognition |
US20130069893A1 (en) | 2011-09-15 | 2013-03-21 | Htc Corporation | Electronic device, controlling method thereof and computer program product |
US20130076757A1 (en) | 2011-09-27 | 2013-03-28 | Microsoft Corporation | Portioning data frame animation representations |
US9678578B2 (en) | 2011-10-03 | 2017-06-13 | Furuno Electric Co., Ltd. | Device having touch panel, display control program and display control method |
JP5983983B2 (en) | 2011-10-03 | 2016-09-06 | ソニー株式会社 | Information processing apparatus and method, and program |
US9619139B2 (en) | 2011-10-03 | 2017-04-11 | Kyocera Corporation | Device, method, and storage medium storing program |
US9804678B2 (en) | 2011-10-18 | 2017-10-31 | Slyde Watch Sa | Method and circuit for switching a wristwatch from a first power mode to a second power mode |
US9253282B2 (en) | 2011-10-18 | 2016-02-02 | Qualcomm Incorporated | Method and apparatus for generating, using, or updating an enriched user profile |
US9448708B1 (en) | 2011-10-19 | 2016-09-20 | Google Inc. | Theming for virtual collaboration |
US8467270B2 (en) | 2011-10-26 | 2013-06-18 | Google Inc. | Smart-watch with user interface features |
JP2013092989A (en) | 2011-10-27 | 2013-05-16 | Kyocera Corp | Device, method, and program |
US9477517B2 (en) | 2011-10-28 | 2016-10-25 | Qualcomm Incorporated | Service broker systems, methods, and apparatus |
US20130111579A1 (en) | 2011-10-31 | 2013-05-02 | Nokia Corporation | Electronic device mode, associated apparatus and methods |
JP2013097760A (en) | 2011-11-07 | 2013-05-20 | Toshiba Corp | Authentication system, terminal device, authentication program, and authentication method |
US9551980B2 (en) | 2011-11-09 | 2017-01-24 | Lonestar Inventions, L.P. | Solar timer using GPS technology |
JP2013101528A (en) | 2011-11-09 | 2013-05-23 | Sony Corp | Information processing apparatus, display control method, and program |
CN105653031B (en) | 2011-11-23 | 2019-10-01 | 英特尔公司 | Gesture input with multiple views, displays and physics |
EP2600316A1 (en) | 2011-11-29 | 2013-06-05 | Inria Institut National de Recherche en Informatique et en Automatique | Method, system and software program for shooting and editing a film comprising at least one image of a 3D computer-generated animation |
US8941707B2 (en) | 2011-12-01 | 2015-01-27 | Tangome, Inc. | Video messaging |
US8767034B2 (en) | 2011-12-01 | 2014-07-01 | Tangome, Inc. | Augmenting a video conference |
US20130141371A1 (en) | 2011-12-01 | 2013-06-06 | Research In Motion Limited | Electronic device and method of displaying information in response to a gesture |
US9154901B2 (en) | 2011-12-03 | 2015-10-06 | Location Labs, Inc. | System and method for disabling and enabling mobile device functional components |
US20130147933A1 (en) | 2011-12-09 | 2013-06-13 | Charles J. Kulas | User image insertion into a text message |
US9830049B2 (en) | 2011-12-12 | 2017-11-28 | Nokia Technologies Oy | Apparatus and method for providing a visual transition between screens |
US9207837B2 (en) | 2011-12-20 | 2015-12-08 | Nokia Technologies Oy | Method, apparatus and computer program product for providing multiple levels of interaction with a program |
US20130159900A1 (en) | 2011-12-20 | 2013-06-20 | Nokia Corporation | Method, apparatus and computer program product for graphically enhancing the user interface of a device |
EP2795440A4 (en) | 2011-12-21 | 2015-08-19 | Nokia Technologies Oy | Apparatus and method for collating application events with contacts of an electronic device |
US20130225152A1 (en) | 2011-12-23 | 2013-08-29 | Microsoft Corporation | Automatically quieting mobile devices |
US9398262B2 (en) | 2011-12-29 | 2016-07-19 | Intel Corporation | Communication using avatar |
US9274683B2 (en) | 2011-12-30 | 2016-03-01 | Google Inc. | Interactive answer boxes for user search queries |
TWI494802B (en) | 2012-01-04 | 2015-08-01 | Asustek Comp Inc | Operating method and portable electronic device using the same |
CN104159508B (en) | 2012-01-04 | 2018-01-30 | 耐克创新有限合伙公司 | Sports watch |
US20130191785A1 (en) | 2012-01-23 | 2013-07-25 | Microsoft Corporation | Confident item selection using direct manipulation |
KR101907136B1 (en) | 2012-01-27 | 2018-10-11 | 라인 가부시키가이샤 | System and method for avatar service through cable and wireless web |
US9204099B2 (en) | 2012-02-01 | 2015-12-01 | Magor Communications Corporation | Videoconferencing system providing virtual physical context |
US8723796B2 (en) | 2012-02-02 | 2014-05-13 | Kodak Alaris Inc. | Multi-user interactive display system |
US9524272B2 (en) | 2012-02-05 | 2016-12-20 | Apple Inc. | Navigating among content items in a browser using an array mode |
WO2013120851A1 (en) | 2012-02-13 | 2013-08-22 | Mach-3D Sàrl | Method for sharing emotions through the creation of three-dimensional avatars and their interaction through a cloud-based platform |
CN103294965B (en) | 2012-02-16 | 2016-06-15 | 克利特股份有限公司 | Parent-child guidance support for social networks |
KR101873413B1 (en) | 2012-02-17 | 2018-07-02 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
US8988349B2 (en) | 2012-02-28 | 2015-03-24 | Google Technology Holdings LLC | Methods and apparatuses for operating a display in an electronic device |
KR101872865B1 (en) | 2012-03-05 | 2018-08-02 | 엘지전자 주식회사 | Electronic Device And Method Of Controlling The Same |
US9747495B2 (en) | 2012-03-06 | 2017-08-29 | Adobe Systems Incorporated | Systems and methods for creating and distributing modifiable animated video messages |
US20130239031A1 (en) | 2012-03-06 | 2013-09-12 | Apple Inc. | Application for viewing images |
KR101374385B1 (en) | 2012-03-07 | 2014-03-14 | 주식회사 팬택 | Method and apparatus for providing short-cut icon and portable device including the apparatus |
KR102030754B1 (en) * | 2012-03-08 | 2019-10-10 | 삼성전자주식회사 | Image editing apparatus and method for selecting region of interest |
GB2500375A (en) | 2012-03-13 | 2013-09-25 | Nec Corp | Input commands to a computer device using patterns of taps |
DE102012020817A1 (en) | 2012-03-13 | 2013-09-19 | Hannes Bonhoff | Method for entering a password and computer program product |
US20130254705A1 (en) | 2012-03-20 | 2013-09-26 | Wimm Labs, Inc. | Multi-axis user interface for a touch-screen enabled wearable device |
CN102681648A (en) | 2012-03-28 | 2012-09-19 | 中兴通讯股份有限公司 | Large screen terminal power saving method and device |
US9264660B1 (en) | 2012-03-30 | 2016-02-16 | Google Inc. | Presenter control during a video conference |
US20160345131A9 (en) | 2012-04-04 | 2016-11-24 | Port Nexus Corporation | Mobile device tracking monitoring system and device for enforcing organizational policies and no distracted driving protocols |
WO2013152454A1 (en) | 2012-04-09 | 2013-10-17 | Intel Corporation | System and method for avatar management and selection |
US20140198121A1 (en) | 2012-04-09 | 2014-07-17 | Xiaofeng Tong | System and method for avatar generation, rendering and animation |
WO2013152453A1 (en) | 2012-04-09 | 2013-10-17 | Intel Corporation | Communication using interactive avatars |
CN102622085A (en) | 2012-04-11 | 2012-08-01 | 北京航空航天大学 | Multidimensional sense man-machine interaction system and method |
US20130286161A1 (en) | 2012-04-25 | 2013-10-31 | Futurewei Technologies, Inc. | Three-dimensional face recognition for mobile devices |
US8847903B2 (en) | 2012-04-26 | 2014-09-30 | Motorola Mobility Llc | Unlocking an electronic device |
US9785883B2 (en) | 2012-04-27 | 2017-10-10 | Excalibur Ip, Llc | Avatars for use with personalized generalized content recommendations |
US9653056B2 (en) | 2012-04-30 | 2017-05-16 | Nokia Technologies Oy | Evaluation of beats, chords and downbeats from a musical audio signal |
US20130293686A1 (en) | 2012-05-03 | 2013-11-07 | Qualcomm Incorporated | 3d reconstruction of human subject using a mobile device |
EP2849004A4 (en) | 2012-05-07 | 2016-06-22 | Convex Corp Ltd | Relative time display device and relative time display program |
US9173052B2 (en) | 2012-05-08 | 2015-10-27 | ConnecteDevice Limited | Bluetooth low energy watch with event indicators and activation |
WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
EP2847657B1 (en) | 2012-05-09 | 2016-08-10 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
EP3264252B1 (en) | 2012-05-09 | 2019-11-27 | Apple Inc. | Device, method, and graphical user interface for performing an operation in accordance with a selected mode of operation |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
KR101868352B1 (en) * | 2012-05-14 | 2018-06-19 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US8966612B2 (en) | 2012-05-16 | 2015-02-24 | Ebay Inc. | Lockable widgets on a mobile device |
US9361021B2 (en) | 2012-05-22 | 2016-06-07 | Irobot Corporation | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US20130318437A1 (en) | 2012-05-22 | 2013-11-28 | Samsung Electronics Co., Ltd. | Method for providing ui and portable apparatus applying the same |
US9927952B2 (en) | 2012-05-23 | 2018-03-27 | Microsoft Technology Licensing, Llc | Utilizing a ribbon to access an application user interface |
US8718716B2 (en) | 2012-05-23 | 2014-05-06 | Steven Earl Kader | Method of displaying images while charging a smartphone |
CN103425399A (en) | 2012-05-25 | 2013-12-04 | 鸿富锦精密工业(深圳)有限公司 | Portable electronic device unlocking system and unlocking mode setting method therefor |
US20130322218A1 (en) | 2012-05-29 | 2013-12-05 | Wolfgang Burkhardt | World Time Timepiece |
US9756172B2 (en) | 2012-06-05 | 2017-09-05 | Apple Inc. | Methods and apparatus for determining environmental factors to modify hardware or system operation |
US8965696B2 (en) | 2012-06-05 | 2015-02-24 | Apple Inc. | Providing navigation instructions while operating navigation application in background |
WO2013184744A2 (en) | 2012-06-05 | 2013-12-12 | Nike International Ltd. | Multi-activity platform and interface |
US9348607B2 (en) | 2012-06-07 | 2016-05-24 | Apple Inc. | Quiet hours for notifications |
US20130332840A1 (en) | 2012-06-10 | 2013-12-12 | Apple Inc. | Image application for creating and sharing image streams |
KR101911133B1 (en) | 2012-06-21 | 2018-10-23 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Avatar construction using depth camera |
US8948832B2 (en) | 2012-06-22 | 2015-02-03 | Fitbit, Inc. | Wearable heart rate monitor |
US9042971B2 (en) | 2012-06-22 | 2015-05-26 | Fitbit, Inc. | Biometric monitoring device with heart rate measurement activated by a single user-gesture |
EP2680615B1 (en) | 2012-06-25 | 2018-08-08 | LG Electronics Inc. | Mobile terminal and audio zooming method thereof |
US9489471B2 (en) | 2012-06-29 | 2016-11-08 | Dell Products L.P. | Flash redirection with caching |
US9069932B2 (en) | 2012-07-06 | 2015-06-30 | Blackberry Limited | User-rotatable three-dimensionally rendered object for unlocking a computing device |
US20140022183A1 (en) | 2012-07-19 | 2014-01-23 | General Instrument Corporation | Sending and receiving information |
JP5922522B2 (en) | 2012-07-24 | 2016-05-24 | 京セラ株式会社 | Mobile device |
KR101892233B1 (en) | 2012-08-03 | 2018-08-27 | 삼성전자주식회사 | Method and apparatus for alarm service using context aware in portable terminal |
JP6309185B2 (en) | 2012-08-07 | 2018-04-11 | 任天堂株式会社 | Image display program, image display apparatus, image display system, and image display method |
JP2014035766A (en) | 2012-08-09 | 2014-02-24 | Keishi Hattori | Kaleidoscope image generation program |
US8910082B2 (en) * | 2012-08-10 | 2014-12-09 | Modiface Inc. | Method and system for modification of digital images through rotational cascading-effect interface |
CN102819400A (en) * | 2012-08-14 | 2012-12-12 | 北京小米科技有限责任公司 | Desktop system, interface interaction method and interface interaction device of mobile terminal |
US20140055495A1 (en) | 2012-08-22 | 2014-02-27 | Lg Cns Co., Ltd. | Responsive user interface engine for display devices |
KR20140026027A (en) | 2012-08-24 | 2014-03-05 | 삼성전자주식회사 | Method for running application and mobile device |
US9230076B2 (en) | 2012-08-30 | 2016-01-05 | Microsoft Technology Licensing, Llc | Mobile device child share |
US10553002B2 (en) | 2012-08-31 | 2020-02-04 | Apple, Inc. | Information display using electronic diffusers |
KR101955979B1 (en) | 2012-09-04 | 2019-03-08 | 엘지전자 주식회사 | Mobile terminal and application icon moving method thereof |
US9936165B2 (en) | 2012-09-06 | 2018-04-03 | Intel Corporation | System and method for avatar creation and synchronization |
US9602559B1 (en) | 2012-09-07 | 2017-03-21 | Mindmeld, Inc. | Collaborative communication system with real-time anticipatory computing |
US20140074570A1 (en) | 2012-09-10 | 2014-03-13 | Super Transcon Ip, Llc | Commerce System and Method of Controlling the Commerce System by Presenting Contextual Advertisements on a Computer System |
US20140173439A1 (en) | 2012-09-12 | 2014-06-19 | ACCO Brands Corporation | User interface for object tracking |
US20140078144A1 (en) | 2012-09-14 | 2014-03-20 | Squee, Inc. | Systems and methods for avatar creation |
US20140082533A1 (en) | 2012-09-20 | 2014-03-20 | Adobe Systems Incorporated | Navigation Interface for Electronic Content |
KR102017845B1 (en) | 2012-09-20 | 2019-09-03 | 삼성전자주식회사 | Method and apparatus for displaying missed calls on mobile terminal |
US20150113468A1 (en) | 2012-09-24 | 2015-04-23 | Richard Lawrence Clark | System and method of inputting time on an electronic device having a touch screen |
US20140086123A1 (en) | 2012-09-27 | 2014-03-27 | Apple Inc. | Disabling a low power mode to improve the reception of high priority messages |
KR20140042427A (en) | 2012-09-28 | 2014-04-07 | 삼성전자주식회사 | Device for creating animated emoticon and method for controlling thereof |
RU2523040C2 (en) | 2012-10-02 | 2014-07-20 | ЭлДжи ЭЛЕКТРОНИКС ИНК. | Screen brightness control for mobile device |
US20140098296A1 (en) | 2012-10-04 | 2014-04-10 | Ati Technologies Ulc | Method and apparatus for changing a perspective of a video |
US20180122263A9 (en) | 2012-10-05 | 2018-05-03 | GlobalMe, LLC | Creating a workout routine in online and mobile networking environments |
US8701032B1 (en) | 2012-10-16 | 2014-04-15 | Google Inc. | Incremental multi-word recognition |
KR102004287B1 (en) | 2012-10-17 | 2019-07-26 | 에스케이플래닛 주식회사 | Apparatus and methods of making user emoticon |
US8832606B2 (en) | 2012-10-19 | 2014-09-09 | Google Inc. | Wallpaper assignment for multi-user mobile device |
KR101390228B1 (en) | 2012-10-22 | 2014-05-07 | (주)카카오 | Device and method of displaying image on chat area, and server for managing chat data |
US9152211B2 (en) | 2012-10-30 | 2015-10-06 | Google Technology Holdings LLC | Electronic device with enhanced notifications |
US9582156B2 (en) | 2012-11-02 | 2017-02-28 | Amazon Technologies, Inc. | Electronic publishing mechanisms |
CH707163A2 (en) | 2012-11-06 | 2014-05-15 | Montres Breguet Sa | Display mechanism for displaying day and lunar phase of e.g. Earth, in astronomic watch, has three-dimensional display unit displaying day and phase of star, where display unit is formed by mobile part that is driven by wheel |
US9948589B2 (en) | 2012-11-14 | 2018-04-17 | invi Labs, Inc. | System for and method of organizing contacts for chat sessions on an electronic device |
US9696898B2 (en) | 2012-11-14 | 2017-07-04 | Facebook, Inc. | Scrolling through a series of content items |
US9606695B2 (en) | 2012-11-14 | 2017-03-28 | Facebook, Inc. | Event notification |
US9235321B2 (en) | 2012-11-14 | 2016-01-12 | Facebook, Inc. | Animation sequence associated with content item |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
WO2014080064A1 (en) | 2012-11-20 | 2014-05-30 | Jolla Oy | A method, an apparatus and a computer program product for creating a user interface view |
US9448685B1 (en) | 2012-11-20 | 2016-09-20 | Amazon Technologies, Inc. | Preemptive event notification for media experience |
US8994827B2 (en) | 2012-11-20 | 2015-03-31 | Samsung Electronics Co., Ltd | Wearable electronic device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
JP2014109881A (en) | 2012-11-30 | 2014-06-12 | Toshiba Corp | Information processing device, information processing method, and program |
JP6338318B2 (en) | 2012-11-30 | 2018-06-06 | キヤノン株式会社 | Operating device, image forming apparatus, and computer program |
US9141270B1 (en) | 2012-12-01 | 2015-09-22 | Allscripts Software, Llc | Smart scroller user interface element |
KR102141044B1 (en) | 2012-12-03 | 2020-08-04 | 삼성전자주식회사 | Apparatus having a plurality of touch screens and method for sound output thereof |
KR102206044B1 (en) | 2012-12-10 | 2021-01-21 | 삼성전자주식회사 | Mobile device of bangle type, and methods for controlling and displaying UI thereof |
US20140164907A1 (en) | 2012-12-12 | 2014-06-12 | Lg Electronics Inc. | Mobile terminal and method of controlling the mobile terminal |
US20140171132A1 (en) | 2012-12-14 | 2014-06-19 | Apple Inc. | Method and Apparatus for Automatically Repeating Alarms and Notifications in Response to Device Motion |
KR102037416B1 (en) | 2012-12-17 | 2019-10-28 | 삼성전자주식회사 | Method for managing of external devices, method for operating of an external device, host device, management server, and the external device |
US9466142B2 (en) | 2012-12-17 | 2016-10-11 | Intel Corporation | Facial movement based avatar animation |
CH707412A2 (en) | 2012-12-19 | 2014-06-30 | Eduardo Santana | Method for displaying rotation time of earth, involves obtaining relative motion between three-dimensional earth model and annular point field, from axis of earth and from superimposed motion of earth model along orbit of sun |
JP5874625B2 (en) | 2012-12-20 | 2016-03-02 | カシオ計算機株式会社 | INPUT DEVICE, INPUT OPERATION METHOD, CONTROL PROGRAM, AND ELECTRONIC DEVICE |
US9071923B2 (en) | 2012-12-20 | 2015-06-30 | Cellco Partnership | Automatic archiving of an application on a mobile device |
US20140189584A1 (en) | 2012-12-27 | 2014-07-03 | Compal Communications, Inc. | Method for switching applications in user interface and electronic apparatus using the same |
CN103902165B (en) | 2012-12-27 | 2017-08-01 | 北京新媒传信科技有限公司 | Method and apparatus for realizing menu background |
US10317977B2 (en) | 2012-12-28 | 2019-06-11 | Intel Corporation | Displaying area adjustment |
WO2014105275A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
WO2015034966A1 (en) | 2013-09-03 | 2015-03-12 | Apple Inc. | User interface object manipulations in a user interface |
EP3564806B1 (en) | 2012-12-29 | 2024-02-21 | Apple Inc. | Device, method and graphical user interface for determining whether to scroll or select contents |
CN109375853A (en) | 2012-12-29 | 2019-02-22 | 苹果公司 | Device, method and graphical user interface for navigating user interface hierarchies |
CN103914238B (en) | 2012-12-30 | 2017-02-08 | 杭州网易云音乐科技有限公司 | Method and device for achieving integration of controls in interface |
GB201300031D0 (en) | 2013-01-02 | 2013-02-13 | Canonical Ltd | Ubuntu UX innovations |
US20140195476A1 (en) | 2013-01-10 | 2014-07-10 | Sap Ag | Generating notification from database update |
CN103927190A (en) | 2013-01-11 | 2014-07-16 | 腾讯科技(深圳)有限公司 | Network emoticon downloading method and device |
US20140201655A1 (en) | 2013-01-16 | 2014-07-17 | Lookout, Inc. | Method and system for managing and displaying activity icons on a mobile device |
US9295413B2 (en) | 2013-01-17 | 2016-03-29 | Garmin Switzerland Gmbh | Fitness monitor |
JP5572726B2 (en) | 2013-01-24 | 2014-08-13 | デジタルア−ツ株式会社 | Program and information processing method |
US9933846B2 (en) | 2013-01-28 | 2018-04-03 | Samsung Electronics Co., Ltd. | Electronic system with display mode mechanism and method of operation thereof |
WO2014120210A1 (en) | 2013-01-31 | 2014-08-07 | Hewlett-Packard Development Company L.P. | Selection feature for adjusting values on a computing device |
CN103093490B (en) | 2013-02-02 | 2015-08-26 | 浙江大学 | Real-time face animation method based on a single video camera |
KR20140101242A (en) | 2013-02-08 | 2014-08-19 | 삼성전자주식회사 | Mobile terminal and its operating method |
US10708545B2 (en) | 2018-01-17 | 2020-07-07 | Duelight Llc | System, method, and computer program for transmitting face models based on face data points |
US9682281B2 (en) | 2013-02-22 | 2017-06-20 | Nike, Inc. | Activity monitoring, tracking and synchronization |
US9031783B2 (en) | 2013-02-28 | 2015-05-12 | Blackberry Limited | Repositionable graphical current location indicator |
KR102188097B1 (en) | 2013-03-04 | 2020-12-07 | 삼성전자주식회사 | Method for operating page and electronic device thereof |
US9094576B1 (en) | 2013-03-12 | 2015-07-28 | Amazon Technologies, Inc. | Rendered audiovisual communication |
US9280844B2 (en) | 2013-03-12 | 2016-03-08 | Comcast Cable Communications, Llc | Animation |
EP2972816A4 (en) | 2013-03-13 | 2016-11-09 | Owaves Inc | Lifestyle management system |
US9087234B2 (en) | 2013-03-15 | 2015-07-21 | Nike, Inc. | Monitoring fitness using a mobile device |
US9792014B2 (en) | 2013-03-15 | 2017-10-17 | Microsoft Technology Licensing, Llc | In-place contextual menu for handling actions for a listing of items |
US20140282207A1 (en) | 2013-03-15 | 2014-09-18 | Rita H. Wouhaybi | Integration for applications and containers |
US10692096B2 (en) | 2013-03-15 | 2020-06-23 | Thermodynamic Design, Llc | Customizable data management system |
US20140267618A1 (en) | 2013-03-15 | 2014-09-18 | Google Inc. | Capturing and Refocusing Imagery |
US20140282103A1 (en) | 2013-03-16 | 2014-09-18 | Jerry Alan Crandall | Data sharing |
KR20140115731A (en) | 2013-03-22 | 2014-10-01 | 삼성전자주식회사 | Method for converting object in portable terminal and device thereof |
US20140331314A1 (en) | 2013-03-28 | 2014-11-06 | Fuhu Holdings, Inc. | Time and Sleep Control System and Method |
US20140293755A1 (en) | 2013-03-28 | 2014-10-02 | Meta Watch Oy | Device with functional display and method for time management |
KR20140120488A (en) | 2013-04-03 | 2014-10-14 | 엘지전자 주식회사 | Portable device and controlling method thereof |
JP5603452B1 (en) | 2013-04-11 | 2014-10-08 | 株式会社スクウェア・エニックス | Video game processing apparatus and video game processing program |
US9479922B2 (en) | 2013-04-12 | 2016-10-25 | Google Inc. | Provisioning a plurality of computing devices |
JP5630676B2 (en) | 2013-04-15 | 2014-11-26 | 東京自動機工株式会社 | Variable transmission |
KR101495257B1 (en) | 2013-04-18 | 2015-02-25 | 주식회사 팬택 | Apparatus and method for controlling terminal icon |
US9594354B1 (en) | 2013-04-19 | 2017-03-14 | Dp Technologies, Inc. | Smart watch extended system |
KR102171444B1 (en) | 2013-04-22 | 2020-10-29 | 엘지전자 주식회사 | Smart watch and method for controlling thereof |
CN103279261B (en) | 2013-04-23 | 2016-06-29 | 惠州TCL移动通信有限公司 | Wireless telecommunications system and widget adding method thereof |
JP6092702B2 (en) | 2013-04-25 | 2017-03-08 | 京セラ株式会社 | Communication terminal and information transmission method |
US20140325408A1 (en) | 2013-04-26 | 2014-10-30 | Nokia Corporation | Apparatus and method for providing musical content based on graphical user inputs |
IL226047A (en) | 2013-04-29 | 2017-12-31 | Hershkovitz Reshef May | Method and system for providing personal emoticons |
US9354613B2 (en) | 2013-05-01 | 2016-05-31 | Rajendra Serber | Proportional hour time display |
EP2992692B1 (en) | 2013-05-04 | 2018-08-29 | DECHARMS, Christopher | Mobile security technology |
US9721166B2 (en) | 2013-05-05 | 2017-08-01 | Qognify Ltd. | System and method for identifying a particular human in images using an artificial image composite or avatar |
US10805861B2 (en) | 2013-05-08 | 2020-10-13 | Cellcontrol, Inc. | Context-aware mobile device management |
GB2515266B (en) | 2013-05-09 | 2018-02-28 | Disney Entpr Inc | Manufacturing Process for 3D Printed Objects |
KR20140133363A (en) | 2013-05-10 | 2014-11-19 | 삼성전자주식회사 | Display apparatus and Method for controlling the display apparatus thereof |
JP2014222439A (en) | 2013-05-14 | 2014-11-27 | ソニー株式会社 | Information processing apparatus, part generating and using method, and program |
US9904575B2 (en) | 2013-05-15 | 2018-02-27 | Apple Inc. | System and method for selective timer rate limiting |
US9069458B2 (en) | 2013-05-16 | 2015-06-30 | Barnes & Noble College Booksellers, Llc | Kid mode user interface with application-specific configurability |
KR102070174B1 (en) | 2013-05-16 | 2020-01-29 | 인텔 코포레이션 | Automatically adjusting display areas to reduce power consumption |
US9649555B2 (en) | 2013-05-17 | 2017-05-16 | Brain Enterprises, LLC | System and process for a puzzle game |
KR20140136633A (en) | 2013-05-21 | 2014-12-01 | 삼성전자주식회사 | Method and apparatus for executing application in portable electronic device |
KR102144763B1 (en) | 2013-05-22 | 2020-08-28 | 삼성전자주식회사 | Method and apparatus for displaying schedule on wearable device |
CN104184760B (en) | 2013-05-22 | 2018-08-07 | 阿里巴巴集团控股有限公司 | Information interaction method in communication process, client and server |
US9282368B2 (en) | 2013-05-30 | 2016-03-08 | Verizon Patent And Licensing Inc. | Parental control system using more restrictive setting for media clients based on occurrence of an event |
US10021180B2 (en) | 2013-06-04 | 2018-07-10 | Kingston Digital, Inc. | Universal environment extender |
US9589357B2 (en) | 2013-06-04 | 2017-03-07 | Intel Corporation | Avatar-based video encoding |
US9378576B2 (en) | 2013-06-07 | 2016-06-28 | Faceshift Ag | Online modeling for real-time facial animation |
US10168882B2 (en) | 2013-06-09 | 2019-01-01 | Apple Inc. | Device, method, and graphical user interface for switching between camera interfaces |
US9626087B2 (en) | 2013-06-09 | 2017-04-18 | Apple Inc. | Device, method, and graphical user interface for navigating between user interfaces |
US9542907B2 (en) | 2013-06-09 | 2017-01-10 | Apple Inc. | Content adjustment in graphical user interface based on background content |
EP2992490B1 (en) | 2013-06-09 | 2021-02-24 | Apple Inc. | Device, method, and graphical user interface for sharing content from a respective application |
US9753436B2 (en) | 2013-06-11 | 2017-09-05 | Apple Inc. | Rotary input mechanism for an electronic device |
AU2014302623A1 (en) | 2013-06-24 | 2016-01-21 | Cimpress Schweiz Gmbh | System, method and user interface for designing customizable products from a mobile device |
CN103309618A (en) | 2013-07-02 | 2013-09-18 | 姜洪明 | Mobile operating system |
KR20150008996A (en) | 2013-07-04 | 2015-01-26 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
KR102044701B1 (en) | 2013-07-10 | 2019-11-14 | 엘지전자 주식회사 | Mobile terminal |
US8725842B1 (en) | 2013-07-11 | 2014-05-13 | Khalid Al-Nasser | Smart watch |
US20150019982A1 (en) | 2013-07-12 | 2015-01-15 | Felix Houston Petitt, JR. | System, devices, and platform for security |
US9304667B2 (en) | 2013-07-12 | 2016-04-05 | Felix Houston Petitt, JR. | System, devices, and platform for education, entertainment |
KR102179812B1 (en) | 2013-07-18 | 2020-11-17 | 엘지전자 주식회사 | Watch type mobile terminal |
KR102163684B1 (en) | 2013-07-19 | 2020-10-12 | 삼성전자주식회사 | Method and apparatus for constructing a home screen of the device |
US20150033192A1 (en) | 2013-07-23 | 2015-01-29 | 3M Innovative Properties Company | Method for creating effective interactive advertising content |
JP2013232230A (en) | 2013-07-25 | 2013-11-14 | Sharp Corp | Display device, television receiver, display method, program, and recording medium |
JP6132260B2 (en) * | 2013-07-30 | 2017-05-24 | ブラザー工業株式会社 | Print data editing program |
US9923953B2 (en) | 2013-07-31 | 2018-03-20 | Adenda Media Inc. | Extending mobile applications to the lock screen of a mobile device |
CA3231419A1 (en) | 2013-08-02 | 2015-02-05 | Soul Machines Limited | System for neurobehavioural animation |
CN110413054B (en) | 2013-08-12 | 2023-04-28 | 苹果公司 | Context-sensitive actions in response to touch input |
KR102090923B1 (en) | 2013-08-13 | 2020-03-19 | 이베이 인크. | Applications for wearable devices |
US9959011B2 (en) | 2013-08-14 | 2018-05-01 | Vizbii Technologies, Inc. | Methods, apparatuses, and computer program products for quantifying a subjective experience |
US9568891B2 (en) | 2013-08-15 | 2017-02-14 | I.Am.Plus, Llc | Multi-media wireless watch |
US10289265B2 (en) | 2013-08-15 | 2019-05-14 | Excalibur Ip, Llc | Capture and retrieval of a personalized mood icon |
KR102101741B1 (en) | 2013-08-16 | 2020-05-29 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
PL3036930T3 (en) | 2013-08-19 | 2020-08-10 | Estimote Polska Sp. Z O.O. | Method for distributing notifications |
CN103399480A (en) | 2013-08-19 | 2013-11-20 | 百度在线网络技术(北京)有限公司 | Smart watch, and control device and control method of smart watch |
US9881645B2 (en) | 2013-08-20 | 2018-01-30 | Google Llc | Systems, methods, and media for editing video during playback via gestures |
KR20150021311A (en) | 2013-08-20 | 2015-03-02 | 삼성전자주식회사 | Method and apparatus for saving battery of portable terminal |
US10075598B2 (en) | 2013-08-21 | 2018-09-11 | The Neat Company, Inc. | Sheet scanner with swipe screen interface with links to multiple storage destinations for scanned items |
US9804760B2 (en) | 2013-08-22 | 2017-10-31 | Apple Inc. | Scrollable in-line camera for capturing and sharing content |
US8775844B1 (en) | 2013-08-29 | 2014-07-08 | Google Inc. | Dynamic information adaptation for a computing device having multiple power modes |
EP3041247B1 (en) | 2013-08-29 | 2019-03-06 | Panasonic Intellectual Property Management Co., Ltd. | Broadcast image output device, download server, and control methods therefor |
US20150062130A1 (en) | 2013-08-30 | 2015-03-05 | Blackberry Limited | Low power design for autonomous animation |
US10001817B2 (en) | 2013-09-03 | 2018-06-19 | Apple Inc. | User interface for manipulating user interface objects with magnetic properties |
US20150061988A1 (en) | 2013-09-05 | 2015-03-05 | Texas Instruments Incorporated | Adaptive Power Savings on a Device Display |
KR102077108B1 (en) | 2013-09-13 | 2020-02-14 | 한국전자통신연구원 | Apparatus and method for providing contents experience service |
CA2863748C (en) * | 2013-09-19 | 2023-06-27 | Prinova, Inc. | System and method for variant content navigation |
KR102223504B1 (en) | 2013-09-25 | 2021-03-04 | 삼성전자주식회사 | Quantum dot-resin nanocomposite and method of preparing same |
US20150100621A1 (en) | 2013-10-03 | 2015-04-09 | Yang Pan | User Interface for a System Including Smart Phone and Smart Watch |
US20150302624A1 (en) * | 2013-10-09 | 2015-10-22 | Genue, Inc. | Pattern based design application |
US9794397B2 (en) | 2013-10-16 | 2017-10-17 | Lg Electronics Inc. | Watch type mobile terminal and method for controlling the same |
US20150112700A1 (en) | 2013-10-17 | 2015-04-23 | General Electric Company | Systems and methods to provide a kpi dashboard and answer high value questions |
KR102169952B1 (en) | 2013-10-18 | 2020-10-26 | 엘지전자 주식회사 | Wearable device and method of controlling thereof |
US20150143234A1 (en) | 2013-10-22 | 2015-05-21 | Forbes Holten Norris, III | Ergonomic micro user interface display and editing |
US9082314B2 (en) | 2013-10-30 | 2015-07-14 | Titanium Marketing, Inc. | Time teaching watch and method |
KR102129594B1 (en) | 2013-10-30 | 2020-07-03 | 애플 인크. | Displaying relevant user interface objects |
CN103544920A (en) | 2013-10-31 | 2014-01-29 | 海信集团有限公司 | Screen display method, device, and electronic device
US20150128042A1 (en) | 2013-11-04 | 2015-05-07 | Microsoft Corporation | Multitasking experiences with interactive picture-in-picture |
KR102097639B1 (en) | 2013-11-05 | 2020-04-06 | 엘지전자 주식회사 | Mobile terminal and control method of the same |
CN103607660A (en) | 2013-11-22 | 2014-02-26 | 乐视致新电子科技(天津)有限公司 | Intelligent television interface switching control method and control apparatus |
EP2876537B1 (en) | 2013-11-22 | 2016-06-01 | Creoir Oy | Power-save mode in electronic apparatus |
US9246961B2 (en) | 2013-11-27 | 2016-01-26 | Facebook, Inc. | Communication user interface systems and methods |
AU2013406178A1 (en) | 2013-11-27 | 2016-05-12 | Facebook, Inc. | Communication user interface systems and methods |
US20150205509A1 (en) | 2013-12-02 | 2015-07-23 | Daydials, Inc. | User interface using graphical dials to represent user activity |
EP3077899A1 (en) | 2013-12-02 | 2016-10-12 | Daydials, Inc. | User interface using graphical dials to represent user activity |
US20160019360A1 (en) | 2013-12-04 | 2016-01-21 | Apple Inc. | Wellness aggregator |
WO2015083969A1 (en) | 2013-12-05 | 2015-06-11 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9430758B2 (en) | 2013-12-05 | 2016-08-30 | Cisco Technology, Inc. | User interface component with a radial clock and integrated schedule |
US9301082B2 (en) | 2013-12-06 | 2016-03-29 | Apple Inc. | Mobile device sensor data subscribing and sharing |
KR102131829B1 (en) | 2013-12-17 | 2020-08-05 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
US20150185703A1 (en) | 2013-12-27 | 2015-07-02 | Kabushiki Kaisha Toshiba | Electronic device and method for displaying watch object |
US9519408B2 (en) | 2013-12-31 | 2016-12-13 | Google Inc. | Systems and methods for guided user actions |
CN104753762B (en) | 2013-12-31 | 2018-07-27 | 北京发现角科技有限公司 | Method and system for adding ornaments to an avatar icon in instant messaging
KR20150081140A (en) | 2014-01-03 | 2015-07-13 | 엘지전자 주식회사 | Wearable device and operation method thereof |
US9293119B2 (en) | 2014-01-06 | 2016-03-22 | Nvidia Corporation | Method and apparatus for optimizing display updates on an interactive display device |
US8938394B1 (en) | 2014-01-09 | 2015-01-20 | Google Inc. | Audio triggers based on context |
WO2015114950A1 (en) | 2014-01-30 | 2015-08-06 | コニカミノルタ株式会社 | Organ image capturing device |
EP3821795A1 (en) | 2014-02-03 | 2021-05-19 | NIKE Innovate C.V. | Visualization of activity points |
KR102304082B1 (en) | 2014-02-06 | 2021-09-24 | 삼성전자주식회사 | Apparatus and method for controlling displays |
US9804635B2 (en) | 2014-02-06 | 2017-10-31 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling displays |
JP2015148946A (en) | 2014-02-06 | 2015-08-20 | ソニー株式会社 | Information processing device, information processing method, and program |
KR102170246B1 (en) | 2014-02-07 | 2020-10-26 | 삼성전자주식회사 | Electronic device and method for displaying image information |
WO2015126095A1 (en) | 2014-02-21 | 2015-08-27 | 삼성전자 주식회사 | Electronic device |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
US9519273B2 (en) | 2014-03-06 | 2016-12-13 | Seiko Epson Corporation | Electronic timepiece and movement |
CN104915161A (en) | 2014-03-10 | 2015-09-16 | 联想(北京)有限公司 | Information processing method and electronic equipment |
US20150253736A1 (en) | 2014-03-10 | 2015-09-10 | Icon Health & Fitness, Inc. | Watch with Multiple Sections for Tracking Multiple Parameters |
KR102208620B1 (en) | 2014-03-12 | 2021-01-28 | 삼성전자 주식회사 | Method for saving a power and portable electronic device supporting the same |
CN106133796B (en) | 2014-03-25 | 2019-07-16 | 苹果公司 | For indicating the method and system of virtual objects in the view of true environment |
US9798378B2 (en) | 2014-03-31 | 2017-10-24 | Google Technology Holdings LLC | Apparatus and method for awakening a primary processor out of sleep mode |
JP6173961B2 (en) | 2014-04-01 | 2017-08-02 | ビッグローブ株式会社 | Communication terminal, display control method and program |
GB201406167D0 (en) | 2014-04-04 | 2014-05-21 | Acticheck Ltd | Wearable apparatus and network for communication therewith |
US20150286391A1 (en) | 2014-04-08 | 2015-10-08 | Olio Devices, Inc. | System and method for smart watch navigation |
KR102244856B1 (en) | 2014-04-22 | 2021-04-27 | 삼성전자 주식회사 | Method for providing user interaction with wearable device and wearable device implenenting thereof |
US20150301506A1 (en) | 2014-04-22 | 2015-10-22 | Fahad Koumaiha | Transparent capacitive touchscreen device overlying a mechanical component |
CN203773233U (en) | 2014-04-23 | 2014-08-13 | 漳州海博工贸有限公司 | Pointer disc time-travelling colorful clock |
WO2015163536A1 (en) | 2014-04-24 | 2015-10-29 | Lg Electronics Inc. | Display device and method for controlling the same |
JP2015210587A (en) | 2014-04-24 | 2015-11-24 | 株式会社Nttドコモ | Information processing device, program, and information output method |
US10845982B2 (en) | 2014-04-28 | 2020-11-24 | Facebook, Inc. | Providing intelligent transcriptions of sound messages in a messaging application |
US9606788B2 (en) | 2014-04-30 | 2017-03-28 | Microsoft Technology Licensing, Llc | Dynamic update installer for customized software |
US20150317945A1 (en) * | 2014-04-30 | 2015-11-05 | Yahoo! Inc. | Systems and methods for generating tinted glass effect for interface controls and elements |
KR102173110B1 (en) | 2014-05-07 | 2020-11-02 | 삼성전자주식회사 | Wearable device and controlling method thereof |
JP2017526078A (en) | 2014-05-09 | 2017-09-07 | グーグル インコーポレイテッド | System and method for biomechanics-based ocular signals for interacting with real and virtual objects |
WO2015175240A1 (en) | 2014-05-15 | 2015-11-19 | Narvii Inc. | Systems and methods implementing user interface objects |
US20150331589A1 (en) | 2014-05-15 | 2015-11-19 | Todd KAWAKITA | Circular interface for navigating applications and an authentication mechanism on a wearable device |
US20150339261A1 (en) | 2014-05-23 | 2015-11-26 | Samsung Electronics Co., Ltd. | System and method for data transfer among the devices |
CN103973899B (en) | 2014-05-23 | 2015-12-02 | 努比亚技术有限公司 | Mobile terminal and configuration sharing method thereof
US20150346824A1 (en) | 2014-05-27 | 2015-12-03 | Apple Inc. | Electronic Devices with Low Power Motion Sensing and Gesture Recognition Circuitry |
WO2015183567A1 (en) | 2014-05-28 | 2015-12-03 | Polyera Corporation | Low power display updates |
WO2015183755A1 (en) | 2014-05-31 | 2015-12-03 | Apple Inc. | Message user interfaces for capture and transmittal of media and location content
US9628416B2 (en) | 2014-05-30 | 2017-04-18 | Cisco Technology, Inc. | Photo avatars |
US9185062B1 (en) | 2014-05-31 | 2015-11-10 | Apple Inc. | Message user interfaces for capture and transmittal of media and location content |
US9377762B2 (en) | 2014-06-02 | 2016-06-28 | Google Technology Holdings LLC | Displaying notifications on a watchface |
EP3761633A1 (en) | 2014-06-04 | 2021-01-06 | Apple Inc. | Dynamic display of video communication data |
KR20150140212A (en) | 2014-06-05 | 2015-12-15 | 삼성전자주식회사 | A wearable device, main unit of the wearable device, fixing unit of the wearable device and control method thereof |
CN105204931B (en) | 2014-06-11 | 2019-03-15 | 联发科技(新加坡)私人有限公司 | Low-power-consumption wearable device and multi-operating-system switching, communication and management method thereof
US10775875B2 (en) | 2014-06-11 | 2020-09-15 | Mediatek Singapore Pte. Ltd. | Devices and methods for switching and communication among multiple operating systems and application management methods thereof |
KR102281133B1 (en) | 2014-06-12 | 2021-07-26 | 엘지전자 주식회사 | Watch type terminal and control method thereof |
US10478127B2 (en) | 2014-06-23 | 2019-11-19 | Sherlock Solutions, LLC | Apparatuses, methods, processes, and systems related to significant detrimental changes in health parameters and activating lifesaving measures |
CN104063280B (en) | 2014-06-25 | 2017-09-12 | 华为技术有限公司 | Control method for an intelligent terminal
AU2015279544B2 (en) | 2014-06-27 | 2018-03-15 | Apple Inc. | Electronic device with rotatable input mechanism for navigating calendar application |
EP2960750B1 (en) | 2014-06-27 | 2019-02-20 | Samsung Electronics Co., Ltd | Portable terminal and display method thereof |
JP2016013151A (en) | 2014-06-30 | 2016-01-28 | 株式会社バンダイナムコエンターテインメント | Server system, game device, and program |
US9589362B2 (en) | 2014-07-01 | 2017-03-07 | Qualcomm Incorporated | System and method of three-dimensional model generation |
US20160004393A1 (en) | 2014-07-01 | 2016-01-07 | Google Inc. | Wearable device user interface control |
JP5807094B1 (en) | 2014-07-01 | 2015-11-10 | 株式会社 ディー・エヌ・エー | System, method and program enabling voice chat |
KR20160004770A (en) | 2014-07-04 | 2016-01-13 | 엘지전자 주식회사 | Watch-type mobile terminal |
US9615787B2 (en) | 2014-07-24 | 2017-04-11 | Lenovo (Singapore) Pte. Ltd. | Determining whether to change a time at which an alarm is to occur based at least in part on sleep data
US20160134840A1 (en) | 2014-07-28 | 2016-05-12 | Alexa Margaret McCulloch | Avatar-Mediated Telepresence Systems with Enhanced Filtering |
WO2016017987A1 (en) * | 2014-07-31 | 2016-02-04 | Samsung Electronics Co., Ltd. | Method and device for providing image |
US9561444B2 (en) | 2014-08-01 | 2017-02-07 | Electronic Arts Inc. | Resolving graphical conflicts between objects |
US20160261675A1 (en) | 2014-08-02 | 2016-09-08 | Apple Inc. | Sharing user-configurable graphical constructs |
KR20230042141A (en) | 2014-08-02 | 2023-03-27 | 애플 인크. | Context-specific user interfaces |
CN114115460A (en) | 2014-08-06 | 2022-03-01 | 苹果公司 | Reduced size user interface for battery management |
US10045180B2 (en) | 2014-08-06 | 2018-08-07 | Sony Interactive Entertainment America Llc | Method and apparatus for beacon messaging point of sale messaging and delivery system |
US9640100B2 (en) | 2014-08-15 | 2017-05-02 | Google Technology Holdings LLC | Displaying always on display-related content |
US10452253B2 (en) | 2014-08-15 | 2019-10-22 | Apple Inc. | Weather user interface |
CN106575150B (en) | 2014-08-16 | 2020-03-03 | 谷歌有限责任公司 | Method for recognizing gestures using motion data and wearable computing device |
KR20160023232A (en) | 2014-08-21 | 2016-03-03 | 삼성전자주식회사 | Wearable device for displaying schedule and method thereof |
US9230355B1 (en) | 2014-08-21 | 2016-01-05 | Glu Mobile Inc. | Methods and systems for images with interactive filters |
KR102418119B1 (en) | 2014-08-25 | 2022-07-07 | 삼성전자 주식회사 | Method for organizing a clock frame and an wearable electronic device implementing the same |
KR102258579B1 (en) | 2014-08-29 | 2021-05-31 | 엘지전자 주식회사 | Watch type terminal |
KR102326200B1 (en) | 2014-08-29 | 2021-11-15 | 삼성전자 주식회사 | Electronic device and method for providing notification thereof |
KR102326154B1 (en) | 2014-08-29 | 2021-11-15 | 삼성전자 주식회사 | Method for displaying of low power mode and electronic device supporting the same |
JP2017527033A (en) | 2014-09-02 | 2017-09-14 | アップル インコーポレイテッド | User interface for receiving user input |
DE202015006141U1 (en) | 2014-09-02 | 2015-12-14 | Apple Inc. | Electronic touch communication |
JP6667233B2 (en) | 2014-09-02 | 2020-03-18 | ナイキ イノベイト シーブイ | Monitoring health using mobile devices |
WO2016036546A2 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Reduced-size notification interface |
EP3373122B1 (en) | 2014-09-02 | 2022-04-06 | Apple Inc. | Reduced-size interfaces for managing alerts |
WO2016036427A1 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Electronic device with rotatable input mechanism |
KR102367550B1 (en) | 2014-09-02 | 2022-02-28 | 삼성전자 주식회사 | Controlling a camera module based on physiological signals |
KR101776098B1 (en) | 2014-09-02 | 2017-09-07 | 애플 인크. | Physical activity and workout monitor |
US9547419B2 (en) | 2014-09-02 | 2017-01-17 | Apple Inc. | Reduced size configuration interface |
US10254948B2 (en) | 2014-09-02 | 2019-04-09 | Apple Inc. | Reduced-size user interfaces for dynamically updated application overviews |
KR102230267B1 (en) | 2014-09-04 | 2021-03-19 | 삼성전자주식회사 | Apparatus and method of displaying images |
KR101540544B1 (en) | 2014-09-05 | 2015-07-30 | 서용창 | Message service method using character, user device for performing the method, message application comprising the method |
US10212111B2 (en) | 2014-09-12 | 2019-02-19 | Google Llc | System and interface that facilitate selecting videos to share in a messaging application |
CN104182741A (en) | 2014-09-15 | 2014-12-03 | 联想(北京)有限公司 | Image acquisition prompting method, device, and electronic device
JP6747292B2 (en) | 2014-09-19 | 2020-08-26 | 日本電気株式会社 | Image processing apparatus, image processing method, and program |
CN106201161B (en) | 2014-09-23 | 2021-09-03 | 北京三星通信技术研究有限公司 | Display method and system of electronic equipment |
US20160085397A1 (en) | 2014-09-23 | 2016-03-24 | Qualcomm Incorporated | Smart Watch Notification Manager |
US9633463B2 (en) | 2014-09-24 | 2017-04-25 | Intel Corporation | User gesture driven avatar apparatus and method |
US9785123B2 (en) | 2014-09-26 | 2017-10-10 | Intel Corporation | Digital analog display with rotating bezel |
US10361986B2 (en) | 2014-09-29 | 2019-07-23 | Disney Enterprises, Inc. | Gameplay in a chat thread |
US10572103B2 (en) | 2014-09-30 | 2020-02-25 | Apple Inc. | Timeline view of recently opened documents |
KR102188267B1 (en) | 2014-10-02 | 2020-12-08 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US11435887B1 (en) | 2014-10-05 | 2022-09-06 | Turbopatent Inc. | Machine display operation systems and methods |
WO2016057062A1 (en) | 2014-10-10 | 2016-04-14 | Simplify and Go, LLC | World watch |
CN105631372B (en) | 2014-10-13 | 2020-06-12 | 麦克斯韦尔福瑞斯特私人有限公司 | Proximity monitoring apparatus and method |
KR102240302B1 (en) | 2014-10-21 | 2021-04-14 | 삼성전자주식회사 | Apparatus and Method for virtual fitting thereof |
KR102251483B1 (en) | 2014-10-23 | 2021-05-14 | 삼성전자주식회사 | Electronic device and method for processing image |
KR101930657B1 (en) | 2014-10-24 | 2018-12-18 | 유센스, 인코퍼레이티드 | System and method for immersive and interactive multimedia generation |
CN104376160A (en) | 2014-11-07 | 2015-02-25 | 薛景 | Real-person simulation personalized ornament matching system
KR102354763B1 (en) | 2014-11-17 | 2022-01-25 | 삼성전자주식회사 | Electronic device for identifying peripheral apparatus and method thereof |
US9906772B2 (en) | 2014-11-24 | 2018-02-27 | Mediatek Inc. | Method for performing multi-camera capturing control of an electronic device, and associated apparatus |
KR101997500B1 (en) | 2014-11-25 | 2019-07-08 | 삼성전자주식회사 | Method and apparatus for generating personalized 3d face model |
KR101639894B1 (en) | 2014-11-26 | 2016-07-14 | 홍익대학교세종캠퍼스산학협력단 | Methods customizing avatar on touch screen |
KR20160066951A (en) | 2014-12-03 | 2016-06-13 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US10764424B2 (en) | 2014-12-05 | 2020-09-01 | Microsoft Technology Licensing, Llc | Intelligent digital assistant alarm system for application collaboration with notification presentation |
MA41117A (en) | 2014-12-05 | 2017-10-10 | Myfiziq Ltd | Imaging of a body
WO2016094634A1 (en) | 2014-12-10 | 2016-06-16 | Button Inc. | Switching to second application to perform action |
CN104484796B (en) | 2014-12-18 | 2018-03-27 | 天津三星通信技术研究有限公司 | Portable terminal and schedule management method thereof
CN104501043A (en) | 2014-12-19 | 2015-04-08 | 广东普加福光电科技有限公司 | Long-service-life quantum dot fluorescent composite thin film and preparation method thereof |
US10591955B2 (en) | 2014-12-23 | 2020-03-17 | Intel Corporation | Analog clock display with time events |
WO2016101124A1 (en) | 2014-12-23 | 2016-06-30 | Intel Corporation | Sketch selection for rendering 3d model avatar |
CN107004288B (en) | 2014-12-23 | 2022-03-01 | 英特尔公司 | Facial motion driven animation of non-facial features |
US9830728B2 (en) | 2014-12-23 | 2017-11-28 | Intel Corporation | Augmented facial animation |
US20160191511A1 (en) | 2014-12-24 | 2016-06-30 | Paypal Inc. | Wearable device authentication |
US10198594B2 (en) | 2014-12-30 | 2019-02-05 | Xiaomi Inc. | Method and device for displaying notification information |
US20160187995A1 (en) | 2014-12-30 | 2016-06-30 | Tyco Fire & Security Gmbh | Contextual Based Gesture Recognition And Control |
US10048856B2 (en) | 2014-12-30 | 2018-08-14 | Microsoft Technology Licensing, Llc | Configuring a user interface based on an experience mode transition |
US20160197773A1 (en) | 2015-01-07 | 2016-07-07 | Kii, Inc. | Techniques for sharing applications |
US9794402B2 (en) | 2015-01-12 | 2017-10-17 | Apple Inc. | Updating device behavior based on user behavior |
JP6152125B2 (en) | 2015-01-23 | 2017-06-21 | 任天堂株式会社 | Program, information processing apparatus, information processing system, and avatar image generation method |
GB2534847A (en) | 2015-01-28 | 2016-08-10 | Sony Computer Entertainment Europe Ltd | Display |
JP6525611B2 (en) | 2015-01-29 | 2019-06-05 | キヤノン株式会社 | Image processing apparatus and control method thereof |
JP6525617B2 (en) | 2015-02-03 | 2019-06-05 | キヤノン株式会社 | Image processing apparatus and control method thereof |
KR20160099399A (en) | 2015-02-12 | 2016-08-22 | 엘지전자 주식회사 | Watch type terminal |
KR102227262B1 (en) | 2015-02-17 | 2021-03-15 | 삼성전자주식회사 | Method for transferring profile and electronic device supporting thereof |
US10663925B2 (en) | 2015-03-01 | 2020-05-26 | Andrey Abramov | Hybrid smart watch with multiple sources of time, multiple power sources, and multiple time indicator mechanisms
US10379497B2 (en) | 2015-03-07 | 2019-08-13 | Apple Inc. | Obtaining and displaying time-related data on an electronic watch |
WO2016144385A1 (en) | 2015-03-08 | 2016-09-15 | Apple Inc. | Sharing user-configurable graphical constructs |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
WO2016144977A1 (en) | 2015-03-08 | 2016-09-15 | Apple Inc. | Sharing user-configurable graphical constructs |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
EP3250997A1 (en) | 2015-03-08 | 2017-12-06 | Apple Inc. | User interface using a rotatable input mechanism |
CA2979218A1 (en) | 2015-03-09 | 2016-09-15 | Ventana 3D, Llc | Avatar control system |
JP6492794B2 (en) | 2015-03-09 | 2019-04-03 | セイコーエプソン株式会社 | Electronic device, time correction method, and time correction program |
US9959623B2 (en) | 2015-03-09 | 2018-05-01 | Here Global B.V. | Display of an annotation representation |
KR101587115B1 (en) | 2015-03-11 | 2016-01-21 | 이영규 | System for avatar messenger service |
KR101688162B1 (en) | 2015-03-23 | 2016-12-20 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
WO2016153207A1 (en) | 2015-03-25 | 2016-09-29 | 엘지전자 주식회사 | Mobile terminal and control method therefor |
US9852543B2 (en) | 2015-03-27 | 2017-12-26 | Snap Inc. | Automated three dimensional model generation |
US9369537B1 (en) | 2015-03-31 | 2016-06-14 | Lock2Learn, LLC | Systems and methods for regulating device usage |
CN107430429B (en) | 2015-04-07 | 2022-02-18 | 英特尔公司 | Avatar keyboard |
US10019599B1 (en) | 2015-04-08 | 2018-07-10 | Comigo Ltd. | Limiting applications execution time |
EP3283927B1 (en) | 2015-04-12 | 2021-06-16 | Andrey Abramov | A wearable smart watch with a control ring and a user feedback mechanism |
US9667710B2 (en) | 2015-04-20 | 2017-05-30 | Agverdict, Inc. | Systems and methods for cloud-based agricultural data processing and management |
KR20160126446A (en) | 2015-04-23 | 2016-11-02 | 엘지전자 주식회사 | Wearable device and method for controlling the same |
KR20160128120A (en) | 2015-04-28 | 2016-11-07 | 엘지전자 주식회사 | Watch type terminal and control method thereof |
KR20160131275A (en) * | 2015-05-06 | 2016-11-16 | 엘지전자 주식회사 | Watch type terminal |
US20160327915A1 (en) | 2015-05-08 | 2016-11-10 | Garmin Switzerland Gmbh | Smart watch |
US10459887B1 (en) | 2015-05-12 | 2019-10-29 | Apple Inc. | Predictive application pre-launch |
US9907998B2 (en) | 2015-05-15 | 2018-03-06 | Polar Electro Oy | Wrist device having heart activity circuitry |
US20160342327A1 (en) | 2015-05-22 | 2016-11-24 | Lg Electronics Inc. | Watch-type mobile terminal and method of controlling therefor |
CN106303690A (en) | 2015-05-27 | 2017-01-04 | 腾讯科技(深圳)有限公司 | Video processing method and device
KR20160142527A (en) | 2015-06-03 | 2016-12-13 | 엘지전자 주식회사 | Display apparatus and controlling method thereof |
US20160357354A1 (en) | 2015-06-04 | 2016-12-08 | Apple Inc. | Condition-based activation of a user interface |
KR20160143338A (en) | 2015-06-05 | 2016-12-14 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US10175866B2 (en) | 2015-06-05 | 2019-01-08 | Apple Inc. | Providing complications on an electronic watch |
US10572571B2 (en) | 2015-06-05 | 2020-02-25 | Apple Inc. | API for specifying display of complication on an electronic watch |
KR20160143429A (en) | 2015-06-05 | 2016-12-14 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9916075B2 (en) | 2015-06-05 | 2018-03-13 | Apple Inc. | Formatting content for a reduced-size user interface |
DE102016110903A1 (en) | 2015-06-14 | 2016-12-15 | Facense Ltd. | Head-mounted devices for measuring physiological reactions |
US20160370974A1 (en) | 2015-06-22 | 2016-12-22 | Here Global B.V. | Causation of Expansion of a Supplemental Content Overlay |
US10237280B2 (en) | 2015-06-25 | 2019-03-19 | Websafety, Inc. | Management and control of mobile computing device using local and remote software agents |
KR102348666B1 (en) | 2015-06-30 | 2022-01-07 | 엘지디스플레이 주식회사 | Display device and mobile terminal using the same |
US10628014B2 (en) | 2015-07-01 | 2020-04-21 | Lg Electronics Inc. | Mobile terminal and control method therefor |
KR20170006761A (en) | 2015-07-09 | 2017-01-18 | 엘지전자 주식회사 | Smart watch and method for contolling the same |
CN105100462A (en) | 2015-07-10 | 2015-11-25 | 广州市久邦数码科技有限公司 | Short message system having custom theme editing function
US20170018289A1 (en) | 2015-07-15 | 2017-01-19 | String Theory, Inc. | Emoji as facetracking video masks |
US10068179B2 (en) | 2015-07-29 | 2018-09-04 | Adobe Systems Incorporated | Positioning text in digital designs based on an underlying image |
KR20170016262A (en) | 2015-08-03 | 2017-02-13 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
CN106448614A (en) | 2015-08-04 | 2017-02-22 | 联发科技(新加坡)私人有限公司 | Electronic device and picture refresh rate control method |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
KR102430941B1 (en) | 2015-08-11 | 2022-08-10 | 삼성전자주식회사 | Method for providing physiological state information and electronic device for supporting the same |
KR20170019081A (en) | 2015-08-11 | 2017-02-21 | 삼성전자주식회사 | Portable apparatus and method for displaying a screen |
US10796480B2 (en) | 2015-08-14 | 2020-10-06 | Metail Limited | Methods of generating personalized 3D head models or 3D body models |
EP4321088A3 (en) | 2015-08-20 | 2024-04-24 | Apple Inc. | Exercise-based watch face |
KR102371906B1 (en) | 2015-08-24 | 2022-03-11 | 삼성디스플레이 주식회사 | Display device, mobile device having the same, and method of operating display device |
US9639945B2 (en) | 2015-08-27 | 2017-05-02 | Lytro, Inc. | Depth-based application of image effects |
KR20170025570A (en) | 2015-08-28 | 2017-03-08 | 엘지전자 주식회사 | Watch-type mobile terminal and operating method thereof
JP6628354B2 (en) | 2015-09-07 | 2020-01-08 | 株式会社ハイスピードボーイズ | Server device, program, and communication system |
US20170075316A1 (en) | 2015-09-11 | 2017-03-16 | Motorola Mobility Llc | Smart Watch with Power Saving Timekeeping Only Functionality and Methods Therefor |
KR20170033062A (en) | 2015-09-16 | 2017-03-24 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9349414B1 (en) | 2015-09-18 | 2016-05-24 | Odile Aimee Furment | System and method for simultaneous capture of two video streams |
US20170083086A1 (en) | 2015-09-18 | 2017-03-23 | Kai Mazur | Human-Computer Interface |
AU2016336463A1 (en) | 2015-10-06 | 2018-05-24 | Raymond A. Berardinelli | Smartwatch device and method |
KR102358025B1 (en) | 2015-10-07 | 2022-02-04 | 삼성전자주식회사 | Electronic device and music visualization method thereof |
US9686497B1 (en) | 2015-10-29 | 2017-06-20 | Crater Group Co. | Video annotation and dynamic video call display for multi-camera devices |
US11237717B2 (en) | 2015-11-04 | 2022-02-01 | Sony Corporation | Information processing device and information processing method |
AU2015252123A1 (en) | 2015-11-05 | 2017-05-25 | Duffell, Emily MRS | Digital Clock |
CN105302468B (en) | 2015-11-10 | 2018-05-01 | 北京奇虎科技有限公司 | Smart watch and main-interface switching control method thereof
KR101748669B1 (en) | 2015-11-12 | 2017-06-19 | 엘지전자 주식회사 | Watch type terminal and method for controlling the same |
US10025972B2 (en) | 2015-11-16 | 2018-07-17 | Facebook, Inc. | Systems and methods for dynamically generating emojis based on image analysis of facial features |
KR102256052B1 (en) | 2015-12-18 | 2021-05-25 | 삼성전자주식회사 | A wearable electronic device and an operating method thereof |
JP6323440B2 (en) | 2015-12-18 | 2018-05-16 | カシオ計算機株式会社 | Time display device, time display method and program |
US20170178287A1 (en) | 2015-12-21 | 2017-06-22 | Glen J. Anderson | Identity obfuscation |
KR20170076452A (en) | 2015-12-24 | 2017-07-04 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
JP6292219B2 (en) | 2015-12-28 | 2018-03-14 | カシオ計算機株式会社 | Electronic device, display control method and program |
CN105611215A (en) | 2015-12-30 | 2016-05-25 | 掌赢信息科技(上海)有限公司 | Video call method and device |
KR20170081391A (en) | 2016-01-04 | 2017-07-12 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR20170082698A (en) | 2016-01-06 | 2017-07-17 | 엘지전자 주식회사 | Watch type terminal and method for controlling the same |
US10664741B2 (en) | 2016-01-14 | 2020-05-26 | Samsung Electronics Co., Ltd. | Selecting a behavior of a virtual agent |
CN105607858A (en) | 2016-01-21 | 2016-05-25 | 钟林 | Method and device for controlling intelligent watch or intelligent glasses through direction gestures |
US10062133B1 (en) * | 2016-01-26 | 2018-08-28 | Google Llc | Image retrieval for computing devices |
KR102451469B1 (en) | 2016-02-03 | 2022-10-06 | 삼성전자주식회사 | Method and electronic device for controlling an external electronic device |
KR20170092877A (en) | 2016-02-04 | 2017-08-14 | 삼성전자주식회사 | Sharing Method for function and electronic device supporting the same |
EP3207823A1 (en) | 2016-02-16 | 2017-08-23 | Braun GmbH | Interactive system setup concept |
US20170243508A1 (en) | 2016-02-19 | 2017-08-24 | Fitbit, Inc. | Generation of sedentary time information by activity tracking device |
US9760252B1 (en) | 2016-02-23 | 2017-09-12 | Pebble Technology, Corp. | Controlling and optimizing actions on notifications for a mobile device |
GB2548154A (en) | 2016-03-11 | 2017-09-13 | Sony Computer Entertainment Europe Ltd | Virtual reality |
US10025399B2 (en) | 2016-03-16 | 2018-07-17 | Lg Electronics Inc. | Watch type mobile terminal and method for controlling the same |
JP6327276B2 (en) | 2016-03-23 | 2018-05-23 | カシオ計算機株式会社 | Electronic device and time display control method |
US20170285916A1 (en) | 2016-03-30 | 2017-10-05 | Yan Xu | Camera effects for photo story generation |
KR20170112406A (en) | 2016-03-31 | 2017-10-12 | 한국전자통신연구원 | Apparatus and method for taking a picture with avatar in augmented reality |
KR102279063B1 (en) | 2016-03-31 | 2021-07-20 | 삼성전자주식회사 | Method for composing image and an electronic device thereof |
US10152947B2 (en) | 2016-04-06 | 2018-12-11 | Microsoft Technology Licensing, Llc | Display brightness updating |
US10546501B2 (en) | 2016-04-11 | 2020-01-28 | Magnus Berggren | Method and apparatus for fleet management of equipment |
KR102518477B1 (en) | 2016-05-03 | 2023-04-06 | 삼성전자주식회사 | Method and electronic device for outputting screen |
US10481635B2 (en) | 2016-05-04 | 2019-11-19 | Verizon Patent And Licensing Inc. | Configuring a user interface layout of a user device via a configuration device |
US20170329477A1 (en) | 2016-05-16 | 2017-11-16 | Microsoft Technology Licensing, Llc. | Interactive glanceable information |
US10852935B2 (en) | 2016-05-18 | 2020-12-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
DK179831B1 (en) | 2016-05-18 | 2019-07-22 | Apple Inc. | Devices, methods and graphical user interfaces for messaging |
WO2017201326A1 (en) | 2016-05-18 | 2017-11-23 | Apple Inc. | Applying acknowledgement options in a graphical messaging user interface |
US10332111B2 (en) | 2016-05-19 | 2019-06-25 | Visa International Service Association | Authentication with smartwatch |
KR20170138667A (en) | 2016-06-08 | 2017-12-18 | 삼성전자주식회사 | Method for activating application and electronic device supporting the same |
US20170357427A1 (en) | 2016-06-10 | 2017-12-14 | Apple Inc. | Context-specific user interfaces |
US9869973B2 (en) | 2016-06-10 | 2018-01-16 | Apple Inc. | Scheduling device for customizable electronic notifications |
US10520979B2 (en) | 2016-06-10 | 2019-12-31 | Apple Inc. | Enhanced application preview mode |
US10725761B2 (en) | 2016-06-10 | 2020-07-28 | Apple Inc. | Providing updated application data for previewing applications on a display |
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
AU2017100667A4 (en) | 2016-06-11 | 2017-07-06 | Apple Inc. | Activity and workout updates |
WO2017213937A1 (en) | 2016-06-11 | 2017-12-14 | Apple Inc. | Configuring context-specific user interfaces |
WO2017218069A1 (en) | 2016-06-12 | 2017-12-21 | Apple Inc. | Devices, methods, and graphical user interfaces for processing intensity information associated with touch inputs |
US9912860B2 (en) | 2016-06-12 | 2018-03-06 | Apple Inc. | User interface for camera effects |
US10114440B2 (en) | 2016-06-22 | 2018-10-30 | Razer (Asia-Pacific) Pte. Ltd. | Applying power management based on a target time |
WO2017223530A1 (en) | 2016-06-23 | 2017-12-28 | LoomAi, Inc. | Systems and methods for generating computer ready animation models of a human head from captured data images |
US10905956B2 (en) | 2016-06-28 | 2021-02-02 | Rec Room Inc. | Systems and methods providing temporary decoupling of user avatar synchronicity for presence enhancing experiences |
EP3264251B1 (en) | 2016-06-29 | 2019-09-04 | Dassault Systèmes | Generation of a color of an object displayed on a gui |
JP6763216B2 (en) | 2016-06-30 | 2020-09-30 | カシオ計算機株式会社 | Information display device, information display method and program |
CN106056848B (en) | 2016-06-30 | 2018-11-06 | 深圳先进技术研究院 | Fall-prevention warning device and method achieving low power consumption
US10360708B2 (en) | 2016-06-30 | 2019-07-23 | Snap Inc. | Avatar based ideogram generation |
US10101711B2 (en) | 2016-07-06 | 2018-10-16 | Barbara Carey Stackowski | Past and future time visualization device |
JP6866584B2 (en) | 2016-07-21 | 2021-04-28 | カシオ計算機株式会社 | Display device, display control method and program |
KR102510708B1 (en) | 2016-07-25 | 2023-03-16 | 삼성전자주식회사 | Electronic device and method for displaying image
US10572005B2 (en) | 2016-07-29 | 2020-02-25 | Microsoft Technology Licensing, Llc | Private communication with gazing |
AU2017100879B4 (en) | 2016-07-29 | 2017-09-28 | Apple Inc. | Systems, devices, and methods for dynamically providing user interface controls at touch-sensitive secondary display |
US9955061B2 (en) | 2016-08-03 | 2018-04-24 | International Business Machines Corporation | Obtaining camera device image data representing an event |
WO2018049430A2 (en) | 2016-08-11 | 2018-03-15 | Integem Inc. | An intelligent interactive and augmented reality based user interface platform |
US20180047200A1 (en) | 2016-08-11 | 2018-02-15 | Jibjab Media Inc. | Combining user images and computer-generated illustrations to produce personalized animated digital avatars |
KR20180020386A (en) | 2016-08-18 | 2018-02-28 | 엘지전자 주식회사 | Mobile terminal and operating method thereof |
US10514822B2 (en) * | 2016-08-24 | 2019-12-24 | Motorola Solutions, Inc. | Systems and methods for text entry for multi-user text-based communication |
KR102549463B1 (en) | 2016-08-30 | 2023-06-30 | 삼성전자주식회사 | Method for Processing Image and the Electronic Device supporting the same |
US10466891B2 (en) | 2016-09-12 | 2019-11-05 | Apple Inc. | Special lock mode user interface |
EP3296819B1 (en) | 2016-09-14 | 2019-11-06 | Nxp B.V. | User interface activation |
DK179471B1 (en) | 2016-09-23 | 2018-11-26 | Apple Inc. | Image data for enhanced user interactions |
JP6698216B2 (en) | 2016-09-23 | 2020-05-27 | Apple Inc. | Creating and editing avatars
JP6758590B2 (en) | 2016-09-23 | 2020-09-23 | Apple Inc. | Watch theater mode
JP6680165B2 (en) | 2016-09-23 | 2020-04-15 | カシオ計算機株式会社 | Image display device, image display method, and program |
US10928881B2 (en) | 2016-09-23 | 2021-02-23 | Apple Inc. | Low power touch sensing during a sleep state of an electronic device |
US20180096506A1 (en) | 2016-10-04 | 2018-04-05 | Facebook, Inc. | Controls and Interfaces for User Interactions in Virtual Spaces |
KR20180037844A (en) | 2016-10-05 | 2018-04-13 | 엘지전자 주식회사 | Mobile terminal |
KR101902864B1 (en) | 2016-10-19 | 2018-10-01 | 주식회사 앱포스터 | Method for generating watch screen design of smart watch and apparatus thereof |
CN108604266A (en) | 2016-10-21 | 2018-09-28 | 华为技术有限公司 | Security verification method and device
US10432559B2 (en) | 2016-10-24 | 2019-10-01 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
KR101973680B1 (en) | 2016-10-27 | 2019-04-29 | (주)유에스티21 | Tide clock device and method of displaying tide using the same
KR102641940B1 (en) * | 2016-11-03 | 2024-02-29 | 삼성전자주식회사 | Display apparatus and control method thereof |
CN112738408B (en) | 2016-11-07 | 2022-09-16 | 斯纳普公司 | Selective identification and ordering of image modifiers |
US10379726B2 (en) | 2016-11-16 | 2019-08-13 | Xerox Corporation | Re-ordering pages within an image preview |
US20180150443A1 (en) | 2016-11-25 | 2018-05-31 | Google Inc. | Application program interface for managing complication data |
US20180157452A1 (en) | 2016-12-07 | 2018-06-07 | Google Inc. | Decomposition of dynamic graphical user interfaces |
JP6266736B1 (en) | 2016-12-07 | 2018-01-24 | 株式会社コロプラ | Method for communicating via virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program |
CN106598201B (en) | 2016-12-13 | 2020-07-24 | 联想(北京)有限公司 | Interface control method and device |
US10380968B2 (en) | 2016-12-19 | 2019-08-13 | Mediatek Singapore Pte. Ltd. | Method for adjusting the adaptive screen-refresh rate and device thereof |
US20180181381A1 (en) | 2016-12-23 | 2018-06-28 | Microsoft Technology Licensing, Llc | Application program package pre-installation user interface |
JP6240301B1 (en) | 2016-12-26 | 2017-11-29 | 株式会社コロプラ | Method for communicating via virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program |
WO2018119574A1 (en) | 2016-12-26 | 2018-07-05 | 深圳市柔宇科技有限公司 | Display screen control method and apparatus |
JP6825366B2 (en) | 2016-12-28 | 2021-02-03 | カシオ計算機株式会社 | Clock, clock display control method and program |
CN106782268B (en) | 2017-01-04 | 2020-07-24 | 京东方科技集团股份有限公司 | Display system and driving method for display panel |
JP6786403B2 (en) | 2017-01-10 | 2020-11-18 | 京セラ株式会社 | Communication systems, communication equipment, control methods and programs |
WO2018135841A1 (en) | 2017-01-17 | 2018-07-26 | Samsung Electronics Co., Ltd. | Message generation method and wearable electronic device for supporting the same |
US20180246635A1 (en) | 2017-02-24 | 2018-08-30 | Microsoft Technology Licensing, Llc | Generating user interfaces combining foreground and background of an image with user interface elements |
KR102638911B1 (en) | 2017-02-24 | 2024-02-22 | 삼성전자 주식회사 | Method and apparatus for controlling a plurality of internet of things devices |
CN106782431B (en) | 2017-03-10 | 2020-02-07 | Oppo广东移动通信有限公司 | Screen backlight brightness adjusting method and device and mobile terminal |
KR102334213B1 (en) | 2017-03-13 | 2021-12-02 | 삼성전자주식회사 | Method and electronic apparatus for displaying information |
US10438393B2 (en) | 2017-03-16 | 2019-10-08 | Linden Research, Inc. | Virtual reality presentation of body postures of avatars |
KR102309296B1 (en) | 2017-03-17 | 2021-10-06 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
JP6784204B2 (en) | 2017-03-22 | 2020-11-11 | カシオ計算機株式会社 | Information processing equipment, information processing methods and programs |
US10643246B1 (en) | 2017-03-29 | 2020-05-05 | Amazon Technologies, Inc. | Methods and systems for customization of user profiles |
US10111063B1 (en) | 2017-03-31 | 2018-10-23 | Verizon Patent And Licensing Inc. | System and method for EUICC personalization and network provisioning |
CN206638967U (en) | 2017-04-26 | 2017-11-14 | 京东方科技集团股份有限公司 | Electronic installation |
DK179412B1 (en) | 2017-05-12 | 2018-06-06 | Apple Inc | Context-Specific User Interfaces |
KR20230144661A (en) | 2017-05-16 | 2023-10-16 | 애플 인크. | Emoji recording and sending |
WO2018213451A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects |
DK179948B1 (en) | 2017-05-16 | 2019-10-22 | Apple Inc. | Recording and sending Emoji |
CN110490093B (en) | 2017-05-16 | 2020-10-16 | 苹果公司 | Emoticon recording and transmission |
KR20180128178A (en) | 2017-05-23 | 2018-12-03 | 삼성전자주식회사 | Method for displaying contents and electronic device thereof |
US11269393B2 (en) | 2017-06-02 | 2022-03-08 | Apple Inc. | Techniques for adjusting computing device sleep states |
US11144845B2 (en) | 2017-06-02 | 2021-10-12 | Stitch Fix, Inc. | Using artificial intelligence to design a product |
US11671250B2 (en) | 2017-06-04 | 2023-06-06 | Apple Inc. | Migration for wearable to new companion device |
JP2019007751A (en) | 2017-06-21 | 2019-01-17 | セイコーエプソン株式会社 | Wearable device and method for controlling the same |
JP2019020558A (en) | 2017-07-14 | 2019-02-07 | セイコーエプソン株式会社 | Portable electronic apparatus, control method, and program |
CN108305317B (en) | 2017-08-04 | 2020-03-17 | 腾讯科技(深圳)有限公司 | Image processing method, device and storage medium |
KR102423175B1 (en) | 2017-08-18 | 2022-07-21 | 삼성전자주식회사 | An apparatus for editing images using depth map and a method thereof |
KR102338576B1 (en) | 2017-08-22 | 2021-12-14 | 삼성전자주식회사 | Electronic device which stores depth information associating with image in accordance with Property of depth information acquired using image and the controlling method thereof |
KR102463281B1 (en) | 2017-08-25 | 2022-11-07 | 삼성전자주식회사 | Electronic apparatus for providing mode switching and storage medium |
US10444820B2 (en) | 2017-09-11 | 2019-10-15 | Apple Inc. | Low power touch detection |
US10372298B2 (en) | 2017-09-29 | 2019-08-06 | Apple Inc. | User interface for multi-user communication session |
US10845767B2 (en) | 2017-10-25 | 2020-11-24 | Lenovo (Singapore) Pte. Ltd. | Watch face selection |
CN107561904A (en) | 2017-10-30 | 2018-01-09 | 广东乐芯智能科技有限公司 | Method for switching the time-zone indicator of a smart watch
US10657695B2 (en) | 2017-10-30 | 2020-05-19 | Snap Inc. | Animated chat presence |
US10684592B2 (en) | 2017-11-27 | 2020-06-16 | Lg Electronics Inc. | Watch type terminal |
US20190163142A1 (en) | 2017-11-27 | 2019-05-30 | Lg Electronics Inc. | Watch type terminal |
CA2986980A1 (en) | 2017-11-29 | 2019-05-29 | Qiqi WANG | Display with low voltage feature |
US20190180221A1 (en) | 2017-12-07 | 2019-06-13 | International Business Machines Corporation | Transmitting an inventory-based notification to a mobile device |
KR102460922B1 (en) | 2017-12-14 | 2022-11-01 | 엘지디스플레이 주식회사 | Display Device and Driving Method thereof |
KR102521734B1 (en) | 2018-01-08 | 2023-04-17 | 삼성전자주식회사 | Wearable device for executing a plurality of applications and method of operating the same |
US20190237003A1 (en) | 2018-01-26 | 2019-08-01 | Mobvoi Information Technology Co., Ltd. | Display device, electronic device and method of controlling screen display |
US11009833B2 (en) | 2018-02-20 | 2021-05-18 | Timex Group Usa, Inc. | Electronic device with simulated analog indicator interaction with digital information/images |
US10374994B1 (en) | 2018-02-21 | 2019-08-06 | King.Com Ltd. | Messaging system |
KR102661019B1 (en) | 2018-02-23 | 2024-04-26 | 삼성전자주식회사 | Electronic device providing image including 3D avatar in which motion of face is reflected by using 3D avatar corresponding to face, and operating method thereof
KR102082417B1 (en) | 2018-03-12 | 2020-04-23 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US10725623B2 (en) | 2018-03-28 | 2020-07-28 | International Business Machines Corporation | Browsing applications on mobile device via a wearable device |
JP6680307B2 (en) | 2018-04-11 | 2020-04-15 | カシオ計算機株式会社 | Time display device, time display method, and program |
WO2019200350A1 (en) | 2018-04-13 | 2019-10-17 | Li2Ei Llc | System for reward-based device control |
US10789753B2 (en) | 2018-04-23 | 2020-09-29 | Magic Leap, Inc. | Avatar facial expression representation in multidimensional space |
CN111936941A (en) | 2018-04-24 | 2020-11-13 | 谷歌有限责任公司 | User interface visualization in a hybrid smart watch |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
DK201870374A1 (en) | 2018-05-07 | 2019-12-04 | Apple Inc. | Avatar creation user interface |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
CN111488193A (en) | 2018-05-07 | 2020-08-04 | 苹果公司 | Avatar creation user interface |
US10609208B2 (en) | 2018-05-08 | 2020-03-31 | Apple Inc. | Managing device usage |
CN114047856B (en) | 2018-05-08 | 2023-02-17 | 苹果公司 | User interface for controlling or presenting device usage on an electronic device |
US10558546B2 (en) | 2018-05-08 | 2020-02-11 | Apple Inc. | User interfaces for controlling or presenting device usage on an electronic device |
US10482583B1 (en) | 2018-05-10 | 2019-11-19 | Google Llc | Generating and displaying blur in images |
JP2020031316A (en) * | 2018-08-22 | 2020-02-27 | シャープ株式会社 | Image forming apparatus, image color changing method, and image color changing program |
JP6427711B1 (en) | 2018-08-27 | 2018-11-21 | 京セラ株式会社 | Electronic device, method and program |
US11726324B2 (en) | 2018-08-31 | 2023-08-15 | Apple Inc. | Display system |
KR102076727B1 (en) | 2018-09-20 | 2020-02-12 | 주식회사 앱포스터 | Method for generating watch screen design of smart watch and apparatus thereof |
JP2020056745A (en) | 2018-10-04 | 2020-04-09 | カシオ計算機株式会社 | Electronic apparatus, method for processing information, and information processing program |
US10878255B2 (en) | 2018-10-05 | 2020-12-29 | International Business Machines Corporation | Providing automatic responsive actions to biometrically detected events |
US10902661B1 (en) | 2018-11-28 | 2021-01-26 | Snap Inc. | Dynamic composite user identifier |
US11250399B2 (en) | 2018-11-29 | 2022-02-15 | Watch Skins Corporation | Watch skins selection application with blockchain token |
US10505726B1 (en) | 2018-12-07 | 2019-12-10 | Nike, Inc. | System and method for providing cryptographically secured digital assets |
US20200228646A1 (en) | 2019-01-10 | 2020-07-16 | Automatic Labs Inc. | Systems and methods for distracted driving prevention |
US11288880B2 (en) | 2019-01-18 | 2022-03-29 | Snap Inc. | Template-based generation of personalized videos |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
US10817981B1 (en) * | 2019-02-04 | 2020-10-27 | Facebook, Inc. | Color sampling selection for displaying content items using machine learning |
KR102633572B1 (en) | 2019-02-19 | 2024-02-06 | 삼성전자주식회사 | Method for determining watch face image and electronic device thereof |
JP6939838B2 (en) | 2019-04-02 | 2021-09-22 | カシオ計算機株式会社 | Electronic clock, information update control method and program |
US11093659B2 (en) | 2019-04-25 | 2021-08-17 | Motorola Mobility Llc | Controlling content visibility on a computing device based on wearable device proximity |
US10992619B2 (en) | 2019-04-30 | 2021-04-27 | Snap Inc. | Messaging system with avatar generation |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US11131967B2 (en) | 2019-05-06 | 2021-09-28 | Apple Inc. | Clock faces for an electronic device |
US11340778B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Restricted operation of an electronic device |
DK201970530A1 (en) | 2019-05-06 | 2021-01-28 | Apple Inc | Avatar integration with multiple applications |
CN113795810A (en) | 2019-05-06 | 2021-12-14 | 苹果公司 | Standalone wearable device configuration and interface |
US11468197B2 (en) | 2019-05-06 | 2022-10-11 | Apple Inc. | Configuring context-based restrictions for a computing device |
US20200380768A1 (en) | 2019-06-02 | 2020-12-03 | Apple Inc. | Parameterized generation of two-dimensional images from a three-dimensional model |
US11074753B2 (en) | 2019-06-02 | 2021-07-27 | Apple Inc. | Multi-pass object rendering using a three-dimensional geometric constraint
US11481100B2 (en) | 2019-06-25 | 2022-10-25 | Apple Inc. | User interfaces for a compass application |
US11189098B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | 3D object camera customization system |
US20200412975A1 (en) | 2019-06-28 | 2020-12-31 | Snap Inc. | Content capture with audio input feedback |
KR102241153B1 (en) | 2019-07-01 | 2021-04-19 | 주식회사 시어스랩 | Method, apparatus, and system for generating a 3D avatar from a 2D image
US11488359B2 (en) | 2019-08-28 | 2022-11-01 | Snap Inc. | Providing 3D data for messages in a messaging system |
US11189104B2 (en) | 2019-08-28 | 2021-11-30 | Snap Inc. | Generating 3D data in a messaging system |
US10852905B1 (en) | 2019-09-09 | 2020-12-01 | Apple Inc. | Techniques for managing display usage |
AU2020309093B2 (en) | 2019-09-09 | 2021-07-22 | Apple Inc. | Techniques for managing display usage |
CN110662083B (en) | 2019-09-30 | 2022-04-22 | 北京达佳互联信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
US11252274B2 (en) | 2019-09-30 | 2022-02-15 | Snap Inc. | Messaging application sticker extensions |
KR102280391B1 (en) | 2019-10-31 | 2021-07-22 | 주식회사 앱포스터 | Apparatus and method for providing screen setting data of a plurality of devices
CN112860428A (en) | 2019-11-28 | 2021-05-28 | 华为技术有限公司 | Energy-efficient display processing method and device
US11276340B2 (en) | 2019-12-31 | 2022-03-15 | Micron Technology, Inc. | Intelligent adjustment of screen refresh rate |
JP2021145209A (en) | 2020-03-11 | 2021-09-24 | キヤノン株式会社 | Electronic apparatus |
US20210287274A1 (en) | 2020-03-13 | 2021-09-16 | Hai Viet Nguyen | Methods and systems for an all-in-one personal fashion coaching and assistance using artificial intelligence and peer-to-peer network databases
US11372520B2 (en) | 2020-04-06 | 2022-06-28 | Kyocera Document Solutions Inc. | Display input apparatus and image forming apparatus capable of moving plurality of icons from one page to another on display device and displaying moved icons thereon |
CN115552375A (en) | 2020-05-11 | 2022-12-30 | 苹果公司 | User interface for managing user interface sharing |
DK202070625A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
KR102536234B1 (en) | 2020-05-11 | 2023-05-30 | 애플 인크. | User interfaces related to time |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
CN111610847B (en) | 2020-05-29 | 2022-05-17 | Oppo广东移动通信有限公司 | Page display method and device for a third-party application program, and electronic device
CN111695471B (en) | 2020-06-02 | 2023-06-27 | 北京百度网讯科技有限公司 | Avatar generation method, apparatus, device and storage medium |
US11538437B2 (en) | 2020-06-27 | 2022-12-27 | Intel Corporation | Low power refresh during semi-active workloads |
US20220265143A1 (en) | 2020-12-07 | 2022-08-25 | Beta Bionics, Inc. | Ambulatory medicament pumps with selective alarm muting |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US20220342514A1 (en) | 2021-04-27 | 2022-10-27 | Apple Inc. | Techniques for managing display usage |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US20230236547A1 (en) | 2022-01-24 | 2023-07-27 | Apple Inc. | User interfaces for indicating time |
US20240077937A1 (en) | 2022-09-06 | 2024-03-07 | Apple Inc. | Devices, methods, and graphical user interfaces for controlling avatars within three-dimensional environments |
2020
- 2020-09-22: DK application DKPA202070625A published as DK202070625A1 (Application Discontinuation; not active)
- 2020-09-22: DK application DKPA202070623A published as DK181103B1 (IP Right Grant; active)
- 2020-09-22: DK application DKPA202070624A published as DK202070624A1 (Application Discontinuation; not active)
- 2020-09-24: US application US17/031,654 published as US11061372B1 (Active)
- 2020-09-24: US application US17/031,671 published as US20210349426A1 (Pending)
- 2020-09-24: US application US17/031,765 published as US12008230B2 (Active)

2021
- 2021-07-12: US application US17/373,163 published as US11442414B2 (Active)

2022
- 2022-09-09: US application US17/941,962 published as US11822778B2 (Active)

2023
- 2023-07-11: US application US18/220,715 published as US20230350564A1 (Pending)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7716057B2 (en) * | 1999-05-17 | 2010-05-11 | Microsoft Corporation | Controlling the listening horizon of an automatic speech recognition system for use in handsfree conversational dialogue |
US20030135769A1 (en) * | 2001-03-28 | 2003-07-17 | Loughran Stephen A. | Power management in computing applications |
US20030140309A1 (en) * | 2001-12-13 | 2003-07-24 | Mari Saito | Information processing apparatus, information processing method, storage medium, and program |
US20110119610A1 (en) * | 2009-11-13 | 2011-05-19 | Hackborn Dianne K | Live wallpaper |
US20150062052A1 (en) * | 2012-05-09 | 2015-03-05 | Apple Inc. | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture |
US20150042571A1 (en) * | 2012-10-30 | 2015-02-12 | Motorola Mobility Llc | Method and apparatus for action indication selection |
US20150082446A1 (en) * | 2013-09-16 | 2015-03-19 | Motorola Mobility Llc | Method and apparatus for displaying potentially private information |
US9625987B1 (en) * | 2015-04-17 | 2017-04-18 | Google Inc. | Updating and displaying information in different power modes |
US20200089302A1 (en) * | 2017-05-17 | 2020-03-19 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device therefor |
US20190050045A1 (en) * | 2017-08-14 | 2019-02-14 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device thereof |
Non-Patent Citations (1)
Title |
---|
Droid Life, "20+ Galaxy S9, S9+ Tips and Tricks," Mar. 22, 2018, pp. 1-33 (Year: 2018) *
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11481988B2 (en) | 2010-04-07 | 2022-10-25 | Apple Inc. | Avatar editing environment |
US11869165B2 (en) | 2010-04-07 | 2024-01-09 | Apple Inc. | Avatar editing environment |
US11740776B2 (en) | 2014-08-02 | 2023-08-29 | Apple Inc. | Context-specific user interfaces |
US11550465B2 (en) | 2014-08-15 | 2023-01-10 | Apple Inc. | Weather user interface |
US11922004B2 (en) | 2014-08-15 | 2024-03-05 | Apple Inc. | Weather user interface |
US12019862B2 (en) | 2015-03-08 | 2024-06-25 | Apple Inc. | Sharing user-configurable graphical constructs |
US11580867B2 (en) | 2015-08-20 | 2023-02-14 | Apple Inc. | Exercised-based watch face and complications |
US11908343B2 (en) | 2015-08-20 | 2024-02-20 | Apple Inc. | Exercised-based watch face and complications |
US11327634B2 (en) | 2017-05-12 | 2022-05-10 | Apple Inc. | Context-specific user interfaces |
US11775141B2 (en) | 2017-05-12 | 2023-10-03 | Apple Inc. | Context-specific user interfaces |
US11682182B2 (en) | 2018-05-07 | 2023-06-20 | Apple Inc. | Avatar creation user interface |
US11380077B2 (en) | 2018-05-07 | 2022-07-05 | Apple Inc. | Avatar creation user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11977411B2 (en) | 2018-05-07 | 2024-05-07 | Apple Inc. | Methods and systems for adding respective complications on a user interface |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
US11301130B2 (en) | 2019-05-06 | 2022-04-12 | Apple Inc. | Restricted operation of an electronic device |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US11340778B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Restricted operation of an electronic device |
US11340757B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Clock faces for an electronic device |
US12008230B2 (en) | 2020-05-11 | 2024-06-11 | Apple Inc. | User interfaces related to time with an editable background |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11842032B2 (en) | 2020-05-11 | 2023-12-12 | Apple Inc. | User interfaces for managing user interface sharing |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11526256B2 (en) | 2020-05-11 | 2022-12-13 | Apple Inc. | User interfaces for managing user interface sharing |
US11822778B2 (en) | 2020-05-11 | 2023-11-21 | Apple Inc. | User interfaces related to time |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US11893231B2 (en) | 2022-05-10 | 2024-02-06 | Apple Inc. | Devices, methods, and graphical user interfaces for providing notifications and application information |
US11868601B2 (en) | 2022-05-10 | 2024-01-09 | Apple Inc. | Devices, methods, and graphical user interfaces for providing notifications and application information |
US20230367460A1 (en) * | 2022-05-10 | 2023-11-16 | Apple Inc. | Devices, Methods, and Graphical User Interfaces for Providing Notifications and Application Information |
US12033296B2 (en) | 2023-04-24 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
Also Published As
Publication number | Publication date |
---|---|
US20210349611A1 (en) | 2021-11-11 |
DK202070624A1 (en) | 2022-01-04 |
US11061372B1 (en) | 2021-07-13 |
US12008230B2 (en) | 2024-06-11 |
DK202070625A1 (en) | 2022-01-04 |
US11442414B2 (en) | 2022-09-13 |
US20230004270A1 (en) | 2023-01-05 |
US11822778B2 (en) | 2023-11-21 |
DK181103B1 (en) | 2022-12-15 |
US20210349427A1 (en) | 2021-11-11 |
US20230350564A1 (en) | 2023-11-02 |
DK202070623A1 (en) | 2021-12-21 |
Similar Documents
Publication | Title |
---|---|
US11442414B2 (en) | User interfaces related to time |
US11733656B2 (en) | Configuring context-specific user interfaces |
US10908559B1 (en) | Techniques for managing display usage |
AU2022220279B2 (en) | User interfaces related to time |
CN118295558A (en) | Time-dependent user interface |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: APPLE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, KEVIN WILL;BARLIER, GUILLAUME PIERRE ANDRE;FORSSELL, LISA K.;AND OTHERS;SIGNING DATES FROM 20210122 TO 20210304;REEL/FRAME:055524/0228 |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
STCV | Information on status: appeal procedure | Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
STCV | Information on status: appeal procedure | Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
STCV | Information on status: appeal procedure | Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |