CN116382557A - Navigating a user interface using hand gestures - Google Patents

Navigating a user interface using hand gestures

Info

Publication number
CN116382557A
CN116382557A CN202310358775.5A CN202310358775A
Authority
CN
China
Prior art keywords
user interface
gesture
computer system
type
interface object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310358775.5A
Other languages
Chinese (zh)
Inventor
T·K·阮
J·N·卡特赖特
E·C·克兰菲尔
C·B·弗莱扎克
J·R·福特
J·R·约翰逊
C·马鲁夫
H·赛义德穆萨维
H·涅托
J·D·巴顿
S·R·史高丽
I·G·尤瑟夫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority claimed from PCT/US2022/030021 external-priority patent/WO2022246060A1/en
Publication of CN116382557A publication Critical patent/CN116382557A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning

Abstract

The present disclosure relates generally to navigating a user interface using hand gestures.

Description

Navigating a user interface using hand gestures
The present application is a divisional application of the invention patent application with an international filing date of May 19, 2022, Chinese national application number 202280005772.7 (international application number PCT/US2022/030021), entitled "Navigating a user interface using hand gestures".
Cross-Reference to Related Applications
U.S. provisional patent application Ser. No. 63/190,783, entitled "NAVIGATING USER INTERFACES USING HAND GESTURES," filed May 19, 2021; U.S. provisional patent application Ser. No. 63/221,331, entitled "NAVIGATING USER INTERFACES USING HAND GESTURES," filed July 13, 2021; and U.S. patent application Ser. No. 17/747,613, entitled "NAVIGATING USER INTERFACES USING HAND GESTURES," filed May 18, 2022. The contents of these patent applications are hereby incorporated by reference in their entirety.
Technical Field
The present disclosure relates generally to computer user interfaces, and more particularly to techniques for navigating a user interface using hand gestures.
Background
Users of smartphones and other personal electronic devices are using their devices more frequently. Some prior art techniques allow users to navigate user interfaces on their devices.
Disclosure of Invention
However, some techniques for navigating a user interface using hand gestures with an electronic device are cumbersome and inefficient. For example, some existing techniques use complex and time-consuming user interfaces, which may require multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-powered devices.
Thus, the present technology provides a faster, more efficient method and interface for navigating a user interface using hand gestures for a user of an electronic device. Such methods and interfaces optionally supplement or replace other methods for navigating a user interface using hand gestures. Such methods and interfaces reduce the cognitive burden on the user and result in a more efficient human-machine interface. For battery-powered computing devices, such methods and interfaces conserve power and increase the time interval between battery charges.
According to some embodiments, a method is described. The method is performed at a computer system in communication with the display generating component and the optical sensor. The method comprises the following steps: displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; detecting, via at least the optical sensor, a hand gesture while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; and in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
According to some embodiments, a non-transitory computer readable storage medium is described. The non-transitory computer readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system in communication with the display generation component and the optical sensor, the one or more programs including instructions for: displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; detecting, via at least the optical sensor, a hand gesture while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; and in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
According to some embodiments, a transitory computer readable storage medium is described. The transitory computer readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system in communication with the display generation component and the optical sensor, the one or more programs comprising instructions for: displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; detecting, via at least the optical sensor, a hand gesture while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; and in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
According to some embodiments, a computer system is described. The computer system includes: a display generation section; an optical sensor; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; detecting, via at least the optical sensor, a hand gesture while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; and in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
According to some embodiments, a computer system is described. The computer system includes: a display generation section; an optical sensor; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: means for displaying, via the display generating means, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object was selected; means for detecting a hand gesture via at least the optical sensor while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; means for: in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
According to some embodiments, a computer program product is described. The computer program product includes one or more programs configured to be executed by one or more processors of a computer system in communication with the display generation component and the optical sensor, the one or more programs including instructions for: displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; detecting, via at least the optical sensor, a hand gesture while displaying a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected; and in response to detecting the hand gesture via at least the optical sensor: in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, displaying, via the display generating component, an indication that the third user interface object is selected.
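The embodiments above all recite the same core behavior: a detected hand gesture is classified by type, and the indication of which user interface object is selected moves to the second or third object accordingly. The following Swift sketch is illustrative only and is not the claimed implementation; every name in it (HandGesture, SelectionModel, the example object labels) is an assumption introduced here for clarity.

```swift
// Illustrative sketch of the gesture-type branching described above.
// All identifiers are hypothetical; they are not from the patent.
enum HandGesture {
    case firstType      // one detected gesture type
    case secondType     // a different detected gesture type
    case unrecognized
}

struct SelectionModel {
    var objects: [String]       // first, second, and third user interface objects
    var selectedIndex: Int = 0  // which object currently shows the selection indication

    // In response to detecting a hand gesture, update the selection indication.
    mutating func handle(_ gesture: HandGesture) {
        switch gesture {
        case .firstType where objects.count > 1:
            selectedIndex = 1   // indicate that the second object is selected
        case .secondType where objects.count > 2:
            selectedIndex = 2   // indicate that the third object is selected
        default:
            break               // other gestures leave the indication unchanged
        }
    }
}

// Usage: a gesture of the first type moves the indication to the second object.
var model = SelectionModel(objects: ["Object A", "Object B", "Object C"])
model.handle(.firstType)
print(model.objects[model.selectedIndex])   // prints "Object B"
```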
According to some embodiments, a method is described. The method is performed at a computer system in communication with a display generation component. The method comprises the following steps: displaying, via a display generating component, a user interface comprising selectable user interface objects, a cursor displayed at a first location on the user interface; detecting a request to move a cursor from a first location to a second location on a user interface while displaying a selectable user interface object and the cursor at the first location on the user interface; and in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
According to some embodiments, a non-transitory computer readable storage medium is described. The non-transitory computer readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system in communication with the display generation component, the one or more programs including instructions for: displaying, via a display generating component, a user interface comprising selectable user interface objects, a cursor displayed at a first location on the user interface; detecting a request to move a cursor from a first location to a second location on a user interface while displaying a selectable user interface object and the cursor at the first location on the user interface; and in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
According to some embodiments, a transitory computer readable storage medium is described. The transitory computer readable storage medium stores one or more programs configured to be executed by one or more processors of a computer system in communication with a display generation component, the one or more programs comprising instructions for: displaying, via a display generating component, a user interface comprising selectable user interface objects, a cursor displayed at a first location on the user interface; detecting a request to move a cursor from a first location to a second location on a user interface while displaying a selectable user interface object and the cursor at the first location on the user interface; and in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
According to some embodiments, a computer system is described. The computer system includes: a display generation section; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, via a display generating component, a user interface comprising selectable user interface objects, a cursor displayed at a first location on the user interface; detecting a request to move a cursor from a first location to a second location on a user interface while displaying a selectable user interface object and the cursor at the first location on the user interface; and in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
According to some embodiments, a computer system is described. The computer system includes: a display generation section; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: means for displaying, via the display generating means, a user interface comprising selectable user interface objects and a cursor displayed at a first location on the user interface; means for detecting a request to move a cursor from a first location to a second location on the user interface while the selectable user interface object and the cursor at the first location on the user interface are displayed; means for: in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
According to some embodiments, a computer program product is described. The computer program product includes one or more programs configured to be executed by one or more processors of a computer system in communication with a display generation component, the one or more programs including instructions for: displaying, via a display generating component, a user interface comprising selectable user interface objects, a cursor displayed at a first location on the user interface; detecting a request to move a cursor from a first location to a second location on a user interface while displaying a selectable user interface object and the cursor at the first location on the user interface; and in response to detecting a request to move the cursor from the first position to the second position: a cursor displayed at a second location; in accordance with determining that the second position corresponds to a position of the selectable user interface object, displaying an animation that provides a visual indication of how long the cursor needs to be positioned at the second position to perform the operation, wherein the visual indication is updated over a period of time; and in accordance with a determination that the second location does not correspond to a location of the selectable user interface object, forgoing displaying the animation.
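Across these embodiments the cursor behavior is the same: when the cursor is moved onto a selectable user interface object, an animation indicates how long the cursor must remain there before the operation is performed, and no such animation is shown elsewhere. The sketch below is a minimal illustration under assumed names and an assumed one-second dwell time; it is not Apple's implementation.

```swift
import Foundation

// Hypothetical dwell tracker; the names and the 1.0 s dwell duration are
// assumptions used only to illustrate the behavior described above.
struct SelectableObject {
    let minX, minY, maxX, maxY: Double
    func contains(x: Double, y: Double) -> Bool {
        x >= minX && x <= maxX && y >= minY && y <= maxY
    }
}

final class DwellController {
    let dwellDuration: TimeInterval = 1.0   // assumed time the cursor must stay put
    private var dwellStart: Date?

    /// Called when the cursor moves to a new position. Returns the dwell
    /// progress (0...1) used to drive the visual indication over time, or
    /// nil when the cursor is not over a selectable object (no animation).
    func cursorMoved(toX x: Double, y: Double,
                     over objects: [SelectableObject],
                     now: Date = Date()) -> Double? {
        guard objects.contains(where: { $0.contains(x: x, y: y) }) else {
            dwellStart = nil                // forgo displaying the animation
            return nil
        }
        let start = dwellStart ?? now       // start timing on first frame over the object
        dwellStart = start
        return min(now.timeIntervalSince(start) / dwellDuration, 1.0)
    }

    /// True once the cursor has remained on the object long enough that
    /// the operation should be performed.
    func dwellComplete(now: Date = Date()) -> Bool {
        guard let start = dwellStart else { return false }
        return now.timeIntervalSince(start) >= dwellDuration
    }
}
```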
Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are optionally included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
Thus, faster, more efficient methods and interfaces for navigating a user interface using hand gestures are provided for devices, thereby improving the effectiveness, efficiency, and user satisfaction of such devices. Such methods and interfaces may supplement or replace other methods for navigating a user interface using hand gestures.
Drawings
For a better understanding of the various described embodiments, reference should be made to the following detailed description taken in conjunction with the following drawings, in which like reference numerals designate corresponding parts throughout the several views.
Fig. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments.
Fig. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
Fig. 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments.
Fig. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface separate from a display in accordance with some embodiments.
Fig. 5A illustrates a personal electronic device according to some embodiments.
Fig. 5B is a block diagram illustrating a personal electronic device, according to some embodiments.
FIG. 6 illustrates an exemplary set of hand gestures according to some embodiments.
Figs. 7A-7AA illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments.
Figs. 8A-8J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments.
Figs. 9A-9H illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments.
Figs. 10A-10F illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments.
Figs. 11A-11H illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments.
Figs. 12A-12J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments.
Figs. 13A-13G illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments.
FIG. 14 illustrates an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments.
FIG. 15 is a flowchart illustrating a method for navigating a user interface using hand gestures, according to some embodiments.
FIG. 16 is a flow chart illustrating a method for navigating a user interface using hand gestures, according to some embodiments.
Detailed Description
The following description sets forth exemplary methods, parameters, and the like. However, it should be recognized that such description is not intended as a limitation on the scope of the present disclosure, but is instead provided as a description of exemplary embodiments.
There is a need for an electronic device that provides an efficient method and interface for navigating a user interface using hand gestures. For example, a user may want to navigate a user interface without touching the display of his device. Such techniques may reduce the cognitive and/or physical burden on a user navigating the user interface, thereby increasing productivity. Further, such techniques may reduce processor power and battery power that would otherwise be wasted on redundant user inputs.
Figs. 1A-1B, 2, 3, 4A-4B, and 5A-5B below provide a description of an exemplary device for performing techniques for navigating a user interface using hand gestures. FIG. 6 illustrates an exemplary set of hand gestures according to some embodiments. Figs. 7A-7AA illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. Figs. 8A-8J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. Figs. 9A-9H illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. Figs. 10A-10F illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. Figs. 11A-11H illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. Figs. 12A-12J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. Figs. 13A-13G illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. FIG. 14 illustrates an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. FIG. 15 is a flowchart illustrating a method for navigating a user interface using hand gestures, according to some embodiments. FIG. 16 is a flowchart illustrating a method for navigating a user interface using hand gestures, according to some embodiments. The user interfaces in Figs. 6, 7A-7AA, 8A-8J, 9A-9H, 10A-10F, 11A-11H, 12A-12J, 13A-13G, and 14 are used to illustrate the processes described below, including the processes in Figs. 15 and 16.
Furthermore, in methods described herein in which one or more steps are contingent upon one or more conditions having been met, it should be understood that the described method can be repeated in multiple repetitions so that, over the course of the repetitions, all of the conditions upon which steps in the method are contingent have been met in different repetitions of the method. For example, if a method requires performing a first step if a condition is satisfied, and a second step if the condition is not satisfied, a person of ordinary skill would appreciate that the stated steps are repeated until the condition has been both satisfied and not satisfied, in no particular order. Thus, a method described with steps that are contingent upon one or more conditions having been met could be rewritten as a method that is repeated until each of the conditions described in the method has been met. This, however, is not required of system or computer-readable-medium claims where the system or computer-readable medium contains instructions for performing the contingent operations based on the satisfaction of the corresponding one or more conditions and thus is capable of determining whether the contingency has or has not been satisfied without explicitly repeating the steps of the method until all of the conditions upon which steps in the method are contingent have been met. A person having ordinary skill in the art would also understand that, similar to a method with contingent steps, a system or computer-readable storage medium can repeat the steps of a method as many times as needed to ensure that all of the contingent steps have been performed.
Although the following description uses the terms "first," "second," etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another element. For example, a first touch may be named a second touch and similarly a second touch may be named a first touch without departing from the scope of the various described embodiments. Both the first touch and the second touch are touches, but they are not the same touch.
The terminology used in the description of the various illustrated embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and in the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Depending on the context, the term "if" is, optionally, interpreted to mean "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, interpreted to mean "upon determining," "in response to determining," "upon detecting [the stated condition or event]," or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and related processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also includes other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as a laptop or tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad), are optionally used. It should also be appreciated that, in some embodiments, the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In some embodiments, the electronic device is a computer system in communication (e.g., via wireless communication, via wired communication) with a display generation component. The display generation component is configured to provide visual output, such as display via a CRT display, display via an LED display, or display via image projection. In some embodiments, the display generation component is integrated with the computer system. In some embodiments, the display generation component is separate from the computer system. As used herein, "displaying" content includes displaying content (e.g., video data rendered or decoded by display controller 156) by transmitting data (e.g., image data or video data) to an integrated or external display generation component via a wired or wireless connection to visually produce the content.
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device typically supports various applications such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk editing applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, fitness support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
The various applications executing on the device optionally use at least one generic physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or changed for different applications and/or within the respective applications. In this way, the common physical architecture of the devices (such as the touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and transparent to the user.
Attention is now directed to embodiments of a portable device having a touch-sensitive display. Fig. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes referred to as a "touch screen" for convenience and is sometimes known as or referred to as a "touch-sensitive display system". Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input control devices 116, and external ports 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more contact intensity sensors 165 for detecting the intensity of a contact on device 100 (e.g., a touch-sensitive surface, such as the touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 (e.g., generating tactile output on a touch-sensitive surface, such as touch-sensitive display system 112 of device 100 or touch pad 355 of device 300) for generating tactile output on device 100. These components optionally communicate via one or more communication buses or signal lines 103.
As used in this specification and the claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of the contact on the touch-sensitive surface (e.g., finger contact), or to an alternative to the force or pressure of the contact on the touch-sensitive surface (surrogate). The intensity of the contact has a range of values that includes at least four different values and more typically includes hundreds of different values (e.g., at least 256). The intensity of the contact is optionally determined (or measured) using various methods and various sensors or combinations of sensors. For example, one or more force sensors below or adjacent to the touch-sensitive surface are optionally used to measure forces at different points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., weighted average) to determine an estimated contact force. Similarly, the pressure sensitive tip of the stylus is optionally used to determine the pressure of the stylus on the touch sensitive surface. Alternatively, the size of the contact area and/or its variation detected on the touch-sensitive surface, the capacitance of the touch-sensitive surface and/or its variation in the vicinity of the contact and/or the resistance of the touch-sensitive surface and/or its variation in the vicinity of the contact are optionally used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, surrogate measurements of contact force or pressure are directly used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to surrogate measurements). In some implementations, surrogate measurements of contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). The intensity of the contact is used as an attribute of the user input, allowing the user to access additional device functions that are not otherwise accessible to the user on a smaller sized device of limited real estate for displaying affordances and/or receiving user input (e.g., via a touch-sensitive display, touch-sensitive surface, or physical/mechanical control, such as a knob or button).
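As a concrete illustration of the surrogate measurement described above, the following sketch combines readings from several force sensors into a weighted-average estimate and compares it against an intensity threshold expressed in the same units as the estimate. The struct and function names and the weighting scheme are assumptions for illustration, not the device's actual firmware.

```swift
// Hypothetical intensity estimation: weighted average of per-sensor force
// readings, compared against a threshold in the same units as the estimate.
struct ForceSample {
    let reading: Double   // force reported by one sensor near the contact
    let weight: Double    // contribution of this sensor to the estimate
}

/// Combines individual sensor readings into an estimated contact force.
func estimatedContactForce(_ samples: [ForceSample]) -> Double {
    let totalWeight = samples.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = samples.reduce(0) { $0 + $1.reading * $1.weight }
    return weightedSum / totalWeight
}

/// Returns true when the estimated intensity exceeds the given threshold.
func exceedsIntensityThreshold(_ samples: [ForceSample], threshold: Double) -> Bool {
    estimatedContactForce(samples) > threshold
}

// Usage: two sensors adjacent to the contact, one weighted more heavily.
let samples = [ForceSample(reading: 0.8, weight: 0.7),
               ForceSample(reading: 0.5, weight: 0.3)]
let pressed = exceedsIntensityThreshold(samples, threshold: 0.6)   // true: estimate is 0.71
```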
As used in this specification and in the claims, the term "haptic output" refers to a physical displacement of a device relative to a previous position of the device, a physical displacement of a component of the device (e.g., a touch sensitive surface) relative to another component of the device (e.g., a housing), or a displacement of a component relative to a centroid of the device, to be detected by a user with a user's feel. For example, in the case where the device or component of the device is in contact with a touch-sensitive surface of the user (e.g., a finger, palm, or other portion of the user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a haptic sensation corresponding to a perceived change in a physical characteristic of the device or component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or touch pad) is optionally interpreted by a user as a "press click" or "click-down" of a physically actuated button. In some cases, the user will feel a tactile sensation, such as "press click" or "click down", even when the physical actuation button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movement is not moved. As another example, movement of the touch-sensitive surface may optionally be interpreted or sensed by a user as "roughness" of the touch-sensitive surface, even when the smoothness of the touch-sensitive surface is unchanged. While such interpretation of touches by a user will be limited by the user's individualized sensory perception, many sensory perceptions of touches are common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., "click down," "click up," "roughness"), unless stated otherwise, the haptic output generated corresponds to a physical displacement of the device or component thereof that would generate that sensory perception of a typical (or ordinary) user.
It should be understood that the device 100 is merely one example of a portable multifunction device, and that the device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in fig. 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripheral interface 118 may be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs, such as computer programs (e.g., including instructions), and/or sets of instructions stored in the memory 102 to perform various functions of the device 100 and process data. In some embodiments, peripheral interface 118, CPU 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and the like. RF circuitry 108 optionally communicates via wireless communication with networks such as the internet (also known as the World Wide Web (WWW)), intranets, and/or wireless networks such as cellular telephone networks, wireless Local Area Networks (LANs), and/or Metropolitan Area Networks (MANs), and other devices. The RF circuitry 108 optionally includes well-known circuitry for detecting a Near Field Communication (NFC) field, such as by a short-range communication radio. Wireless communications optionally use any of a variety of communication standards, protocols, and technologies including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), Voice over Internet Protocol (VoIP), Wi-MAX, email protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. Audio circuitry 110 receives audio data from peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to speaker 111. The speaker 111 converts electrical signals into sound waves that are audible to humans. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuitry 110 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 118 for processing. The audio data is optionally retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 108 by the peripheral interface 118. In some embodiments, the audio circuit 110 also includes a headset jack (e.g., 212 in fig. 2). The headset jack provides an interface between the audio circuit 110 and removable audio input/output peripherals such as output-only headphones or a headset having both an output (e.g., a monaural or binaural) and an input (e.g., a microphone).
I/O subsystem 106 couples input/output peripheral devices on device 100, such as touch screen 112 and other input control devices 116, to peripheral interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, a depth camera controller 169, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive electrical signals from/transmit electrical signals to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In some implementations, the input controller 160 is optionally coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. One or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2). In some embodiments, the electronic device is a computer system that communicates (e.g., via wireless communication, via wired communication) with one or more input devices. In some implementations, the one or more input devices include a touch-sensitive surface (e.g., a touch pad as part of a touch-sensitive display). In some implementations, the one or more input devices include one or more camera sensors (e.g., one or more optical sensors 164 and/or one or more depth camera sensors 175), such as for tracking gestures (e.g., hand gestures and/or air gestures) of the user as input. In some embodiments, one or more input devices are integrated with the computer system. In some embodiments, one or more input devices are separate from the computer system. In some embodiments, an air gesture is a gesture that is detected without the user touching an input element that is part of the device (or independently of an input element that is part of the device) and that is based on a detected movement of a portion of the user's body through the air, including a movement of the user's body relative to an absolute reference (e.g., an angle of the user's arm relative to the ground or a distance of the user's hand relative to the ground), a movement relative to another portion of the user's body (e.g., a movement of the user's hand relative to the user's shoulder, a movement of one of the user's hands relative to the user's other hand, and/or a movement of one of the user's fingers relative to another finger or part of the user's hand), and/or an absolute movement of a portion of the user's body (e.g., a flick gesture that includes movement of the hand by a predetermined amount and/or at a predetermined speed while the hand is in a predetermined pose, or a shake gesture that includes a predetermined speed or amount of rotation of a portion of the user's body).
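As a non-authoritative illustration of the flick/shake distinction described above, the following Swift sketch classifies a sampled hand motion; the sample fields, pose flag, threshold values, and gesture names are assumptions introduced for illustration and do not come from the disclosed embodiments.

// Minimal sketch: classify an air gesture from one sampled hand motion.
// Field names, units, and thresholds are hypothetical.
struct HandMotionSample {
    let translationSpeed: Double   // speed of hand movement through the air (m/s)
    let rotationSpeed: Double      // rotation speed of a body part, e.g., the wrist (rad/s)
    let isInPredeterminedPose: Bool
}

enum AirGesture { case flick, shake, none }

func classify(_ sample: HandMotionSample,
              flickSpeedThreshold: Double = 0.8,
              shakeRotationThreshold: Double = 6.0) -> AirGesture {
    if sample.isInPredeterminedPose && sample.translationSpeed >= flickSpeedThreshold {
        return .flick   // predetermined amount/speed of hand movement in a predetermined pose
    }
    if sample.rotationSpeed >= shakeRotationThreshold {
        return .shake   // predetermined speed or amount of rotation of a body part
    }
    return .none
}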
A quick press of the push button optionally disengages the lock of the touch screen 112 or optionally begins the process of unlocking the device using gestures on the touch screen, as described in U.S. patent application Ser. No. 11/322,549 (i.e., U.S. patent No. 7,657,849), entitled "Unlocking a Device by Performing Gestures on an Unlock Image," filed December 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the button (e.g., 206) optionally causes the device 100 to power on or off. The function of the one or more buttons is optionally customizable by the user. Touch screen 112 is used to implement virtual buttons or soft buttons and one or more soft keyboards.
The touch sensitive display 112 provides an input interface and an output interface between the device and the user. Display controller 156 receives electrical signals from touch screen 112 and/or transmits electrical signals to touch screen 112. Touch screen 112 displays visual output to a user. Visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively, "graphics"). In some embodiments, some or all of the visual output optionally corresponds to a user interface object.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that receives input from a user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or interruption of the contact) on touch screen 112 and translate the detected contact into interactions with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on touch screen 112. In an exemplary embodiment, the point of contact between touch screen 112 and the user corresponds to a user's finger.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, but in other embodiments other display technologies are used. Touch screen 112 and display controller 156 optionally detect contact and any movement or interruption thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as the technology used in the iPhone and iPod touch (trademarks of Apple Inc., Cupertino, California).
The touch sensitive display in some implementations of touch screen 112 is optionally similar to the multi-touch sensitive touch pad described in the following U.S. patents: 6,323,846 (Westerman et al), 6,570,557 (Westerman et al) and/or 6,677,932 (Westerman et al) and/or U.S. patent publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, while touch sensitive touchpads do not provide visual output.
Touch sensitive displays in some implementations of touch screen 112 are described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, "Multipoint Touch Surface Controller", filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen", filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices", filed July 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, "Gestures For Touch Sensitive Input Devices", filed January 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices", filed January 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface", filed September 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface", filed September 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard", filed September 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Device," filed in March 2006. All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some implementations, the touch screen has a video resolution of about 160 dpi. The user optionally uses any suitable object or appendage, such as a stylus, finger, or the like, to make contact with touch screen 112. In some embodiments, the user interface is designed to work primarily through finger-based contact and gestures, which may not be as accurate as stylus-based input due to the large contact area of the finger on the touch screen. In some embodiments, the device translates the finger-based coarse input into a precise pointer/cursor position or command for performing the action desired by the user.
In some embodiments, the device 100 optionally includes a touch pad for activating or deactivating a particular function in addition to the touch screen. In some embodiments, the touch pad is a touch sensitive area of the device that, unlike the touch screen, does not display visual output. The touch pad is optionally a touch sensitive surface separate from the touch screen 112 or an extension of the touch sensitive surface formed by the touch screen.
The apparatus 100 also includes a power system 162 for powering the various components. The power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in the portable device.
The apparatus 100 optionally further comprises one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 optionally includes a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The optical sensor 164 receives light projected through one or more lenses from the environment and converts the light into data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video. In some embodiments, the optical sensor is located on the rear of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for still image and/or video image acquisition. In some embodiments, the optical sensor is located on the front of the device such that the user's image is optionally acquired for video conferencing while viewing other video conference participants on the touch screen display. In some implementations, the position of the optical sensor 164 may be changed by the user (e.g., by rotating a lens and sensor in the device housing) such that a single optical sensor 164 is used with the touch screen display for both video conferencing and still image and/or video image acquisition.
The device 100 optionally further includes one or more depth camera sensors 175. FIG. 1A shows a depth camera sensor coupled to a depth camera controller 169 in the I/O subsystem 106. The depth camera sensor 175 receives data from the environment to create a three-dimensional model of objects (e.g., faces) within the scene from a point of view (e.g., depth camera sensor). In some implementations, in conjunction with the imaging module 143 (also referred to as a camera module), the depth camera sensor 175 is optionally used to determine a depth map of different portions of the image captured by the imaging module 143. In some embodiments, a depth camera sensor is located at the front of the device 100 such that a user image with depth information is optionally acquired for a video conference while the user views other video conference participants on a touch screen display, and a self-photograph with depth map data is captured. In some embodiments, the depth camera sensor 175 is located at the back of the device, or at the back and front of the device 100. In some implementations, the position of the depth camera sensor 175 can be changed by the user (e.g., by rotating a lens and sensor in the device housing) such that the depth camera sensor 175 is used with a touch screen display for both video conferencing and still image and/or video image acquisition.
The apparatus 100 optionally further comprises one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The contact strength sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other strength sensors (e.g., sensors for measuring force (or pressure) of a contact on a touch-sensitive surface). The contact strength sensor 165 receives contact strength information (e.g., pressure information or a surrogate for pressure information) from the environment. In some implementations, at least one contact intensity sensor is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the rear of the device 100, opposite the touch screen display 112 located on the front of the device 100.
The device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is optionally coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 optionally performs as described in the following U.S. patent applications: no.11/241,839, entitled "Proximity Detector In Handheld Device"; no.11/240,788, entitled "Proximity Detector In Handheld Device"; no.11/620,702, entitled "Using Ambient Light Sensor To Augment Proximity Sensor Output"; no.11/586,862, entitled "Automated Response To And Sensing Of User Activity In Portable Devices"; and No.11/638,251, entitled "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor is turned off and the touch screen 112 is disabled when the multifunction device is placed near the user's ear (e.g., when the user is making a telephone call).
The device 100 optionally further comprises one or more tactile output generators 167. FIG. 1A shows a haptic output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. The tactile output generator 167 optionally includes one or more electroacoustic devices such as speakers or other audio components; and/or electromechanical devices for converting energy into linear motion such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other tactile output generating means (e.g., means for converting an electrical signal into a tactile output on a device). The tactile output generator 167 receives haptic feedback generation instructions from the haptic feedback module 133 and generates a haptic output on the device 100 that can be perceived by a user of the device 100. In some embodiments, at least one tactile output generator is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates tactile output by moving the touch-sensitive surface vertically (e.g., inward/outward of the surface of device 100) or laterally (e.g., backward and forward in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the rear of the device 100, opposite the touch screen display 112 located on the front of the device 100.
The device 100 optionally further includes one or more accelerometers 168. Fig. 1A shows accelerometer 168 coupled to peripheral interface 118. Alternatively, accelerometer 168 is optionally coupled to input controller 160 in I/O subsystem 106. Accelerometer 168 optionally performs as described in the following U.S. patent publications: U.S. patent publication No. 20050190059, entitled "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. patent publication No. 20060017692, entitled "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some implementations, information is displayed in a portrait view or a landscape view on the touch screen display based on analysis of data received from the one or more accelerometers. The device 100 optionally includes a magnetometer and a GPS (or GLONASS or other global navigation system) receiver in addition to the accelerometer 168 for obtaining information about the location and orientation (e.g., portrait or landscape) of the device 100.
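As a hedged illustration of how a portrait/landscape choice from accelerometer data might look (this is not the method of the cited publications), the sketch below compares the gravity components along two device axes; the axis names and tie-breaking rule are assumptions.

// Illustrative only: choose a display orientation from one accelerometer reading.
struct AccelerationReading {
    let x: Double   // gravity component along the device's short (horizontal) axis
    let y: Double   // gravity component along the device's long (vertical) axis
}

enum DisplayOrientation { case portrait, landscape }

func orientation(for reading: AccelerationReading) -> DisplayOrientation {
    // When most of gravity lies along the long axis, the device is held upright.
    return abs(reading.y) >= abs(reading.x) ? .portrait : .landscape
}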
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and an application program (or instruction set) 136. Furthermore, in some embodiments, memory 102 (fig. 1A) or 370 (fig. 3) stores device/global internal state 157, as shown in fig. 1A and 3. The device/global internal state 157 includes one or more of the following: an active application state indicating which applications (if any) are currently active; display status, indicating what applications, views, or other information occupy various areas of the touch screen display 112; sensor status, including information obtained from the various sensors of the device and the input control device 116; and location information relating to the device location and/or pose.
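For illustration only, the kinds of fields enumerated above for the device/global internal state could be grouped into a single record along the following lines; the field names and types are hypothetical and not part of the disclosure.

// Sketch of a device/global internal state record; names and types are assumed.
struct DeviceGlobalState {
    var activeApplications: [String]            // which applications, if any, are currently active
    var displayState: [String: String]          // which application, view, or other information occupies each display area
    var sensorState: [String: Double]           // latest information obtained from sensors and input control devices
    var locationAndPose: (latitude: Double, longitude: Double, heading: Double)?
}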
Operating system 126 (e.g., darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or embedded operating systems such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between the various hardware components and software components.
The communication module 128 facilitates communication with other devices through one or more external ports 124 and also includes various software components for processing data received by the RF circuitry 108 and/or the external ports 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted to be coupled directly to other devices or indirectly via a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
The contact/motion module 130 optionally detects contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to contact detection, such as determining whether a contact has occurred (e.g., detecting a finger press event), determining the strength of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking movement across the touch-sensitive surface (e.g., detecting one or more finger drag events), and determining whether the contact has ceased (e.g., detecting a finger lift event or a contact break). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single point contacts (e.g., single finger contacts) or simultaneous multi-point contacts (e.g., "multi-touch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on the touch pad.
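The speed/velocity determination described above can be sketched as follows; the sample type and field names are illustrative assumptions rather than the module's actual data structures.

// Derive velocity (magnitude and direction) and speed (magnitude) of a contact
// point from two successive contact samples.
struct ContactSample {
    let x: Double
    let y: Double
    let timestamp: Double   // seconds
}

struct ContactMotion {
    let velocityX: Double
    let velocityY: Double
    var speed: Double { (velocityX * velocityX + velocityY * velocityY).squareRoot() }
}

func motion(from previous: ContactSample, to current: ContactSample) -> ContactMotion? {
    let dt = current.timestamp - previous.timestamp
    guard dt > 0 else { return nil }   // samples must be time-ordered
    return ContactMotion(velocityX: (current.x - previous.x) / dt,
                         velocityY: (current.y - previous.y) / dt)
}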
In some implementations, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether the user has "clicked" on an icon). In some implementations, at least a subset of the intensity thresholds are determined according to software parameters (e.g., the intensity thresholds are not determined by activation thresholds of particular physical actuators and may be adjusted without changing the physical hardware of the device 100). For example, without changing the touchpad or touch screen display hardware, the mouse "click" threshold of the touchpad or touch screen may be set to any of a wide range of predefined thresholds. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more intensity thresholds in a set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting multiple intensity thresholds at once with a system-level click on an "intensity" parameter).
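A minimal sketch of a software-defined intensity threshold follows; the normalized intensity scale and the default values are assumptions made for illustration.

// A "click" threshold held in software settings rather than fixed by hardware.
struct IntensitySettings {
    var clickThreshold: Double = 0.3       // normalized contact intensity, 0...1 (assumed scale)
    var deepPressThreshold: Double = 0.75
}

func didClick(contactIntensity: Double, settings: IntensitySettings) -> Bool {
    // The threshold can be adjusted (e.g., via a user-facing setting) without
    // changing the physical hardware of the device.
    return contactIntensity >= settings.clickThreshold
}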
The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different movements, timings, and/or intensities of the detected contacts). Thus, gestures are optionally detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger press event, and then detecting a finger lift (lift off) event at the same location (or substantially the same location) as the finger press event (e.g., at the location of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then detecting a finger-up (lift-off) event.
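The tap and swipe patterns described above can be sketched as a check over a recorded sequence of contact events; the event names, tolerance value, and overall structure are illustrative assumptions, not the contact/motion module's implementation.

// Recognize a tap versus a swipe from a finger-down ... finger-up sequence.
enum ContactEvent {
    case fingerDown(x: Double, y: Double)
    case fingerDrag(x: Double, y: Double)
    case fingerUp(x: Double, y: Double)
}

enum RecognizedGesture { case tap, swipe, none }

func recognize(_ events: [ContactEvent], tapTolerance: Double = 10.0) -> RecognizedGesture {
    guard case .fingerDown(let startX, let startY)? = events.first,
          case .fingerUp(let endX, let endY)? = events.last else { return .none }
    // A swipe contains one or more finger-drag events between down and up.
    let dragged = events.dropFirst().dropLast().contains {
        if case .fingerDrag = $0 { return true } else { return false }
    }
    let dx = endX - startX, dy = endY - startY
    let distance = (dx * dx + dy * dy).squareRoot()
    // A tap lifts off at (substantially) the same location as the press.
    return (dragged || distance > tapTolerance) ? .swipe : .tap
}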
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual attribute) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including but not limited to text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. The graphic module 132 receives one or more codes for designating graphics to be displayed from an application program or the like, and also receives coordinate data and other graphic attribute data together if necessary, and then generates screen image data to output to the display controller 156.
Haptic feedback module 133 includes various software components for generating instructions used by haptic output generator 167 to generate haptic output at one or more locations on device 100 in response to user interaction with device 100.
Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to the phone 138 for use in location-based dialing, to the camera 143 as picture/video metadata, and to applications that provide location-based services, such as weather gadgets, local page gadgets, and map/navigation gadgets).
The application 136 optionally includes the following modules (or sets of instructions) or a subset or superset thereof:
contact module 137 (sometimes referred to as an address book or contact list);
a telephone module 138;
video conferencing module 139;
email client module 140;
an Instant Messaging (IM) module 141;
a fitness support module 142;
a camera module 143 for still and/or video images;
an image management module 144;
a video player module;
a music player module;
browser module 147;
Calendar module 148;
a gadget module 149, optionally comprising one or more of: weather gadgets 149-1, stock gadgets 149-2, calculator gadget 149-3, alarm gadget 149-4, dictionary gadget 149-5, and other gadgets obtained by the user, and user-created gadgets 149-6;
a gadget creator module 150 for forming a user-created gadget 149-6;
search module 151;
a video and music player module 152 that incorporates the video player module and the music player module;
a note module 153;
map module 154; and/or
An online video module 155.
Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is optionally used to manage an address book or contact list (e.g., in application internal state 192 of contacts module 137 stored in memory 102 or memory 370), including: adding one or more names to the address book; deleting one or more names from the address book; associating a telephone number, email address, physical address, or other information with a name; associating an image with a name; sorting and categorizing names; providing a telephone number or email address to initiate and/or facilitate communications through telephone 138, video conferencing module 139, email 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is optionally used to input a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contact module 137, modify the entered telephone number, dial the corresponding telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As described above, wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephony module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured by the camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, instant message module 141 includes executable instructions for: inputting a character sequence corresponding to an instant message, modifying previously inputted characters, transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving an instant message, and viewing the received instant message. In some embodiments, the transmitted and/or received instant message optionally includes graphics, photographs, audio files, video files, and/or other attachments supported in an MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions for creating workouts (e.g., with time, distance, and/or calorie-burning targets); communicating with fitness sensors (sports devices); receiving fitness sensor data; calibrating sensors used to monitor fitness; selecting and playing music for a workout; and displaying, storing, and transmitting fitness data.
In conjunction with touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for: capturing still images or videos (including video streams) and storing them in the memory 102, modifying features of still images or videos, or deleting still images or videos from the memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet according to user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do items, etc.) according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, gadget module 149 is a mini-application that is optionally downloaded and used by a user (e.g., weather gadget 149-1, stock gadget 149-2, calculator gadget 149-3, alarm gadget 149-4, and dictionary gadget 149-5) or created by a user (e.g., user-created gadget 149-6). In some embodiments, gadgets include HTML (hypertext markup language) files, CSS (cascading style sheet) files, and JavaScript files. In some embodiments, gadgets include XML (extensible markup language) files and JavaScript files (e.g., Yahoo! gadgets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, gadget creator module 150 is optionally used by a user to create gadgets (e.g., to transform user-specified portions of a web page into gadgets).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching memory 102 for text, music, sound, images, video, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms) according to user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuit 110, speaker 111, RF circuit 108, and browser module 147, video and music player module 152 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch screen 112 or on an external display connected via external port 124). In some embodiments, the device 100 optionally includes the functionality of an MP3 player such as an iPod (trademark of Apple inc.).
In conjunction with the touch screen 112, the display controller 156, the contact/motion module 130, the graphics module 132, and the text input module 134, the notes module 153 includes executable instructions for creating and managing notes, to-do lists, and the like according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is optionally configured to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data related to shops and other points of interest at or near a particular location, and other location-based data) according to user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuit 110, speaker 111, RF circuit 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes instructions for: allowing a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on a touch screen or on an external display connected via external port 124), send an email with a link to a particular online video, and otherwise manage online video in one or more file formats such as H.264. In some embodiments, the instant messaging module 141 is used to send links to particular online videos instead of the email client module 140. Additional description of online video applications can be found in U.S. provisional patent application Ser. No. 60/936,562, titled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed June 20, 2007, and U.S. patent application Ser. No. 11/968,067, titled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed December 31, 2007, the contents of both of which are hereby incorporated by reference in their entirety.
Each of the modules and applications described above corresponds to a set of executable instructions for performing one or more of the functions described above, as well as the methods described in this patent application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented in a separate software program, such as a computer program (e.g., including instructions), process, or module, and thus the various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. For example, the video player module is optionally combined with the music player module into a single module (e.g., video and music player module 152 in fig. 1A). In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device in which the operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or touch pad. By using a touch screen and/or a touch pad as the primary input control device for operating the device 100, the number of physical input control devices (e.g., push buttons, dials, etc.) on the device 100 is optionally reduced.
A predefined set of functions performed solely by the touch screen and/or touch pad optionally includes navigation between user interfaces. In some embodiments, the touchpad, when touched by a user, navigates the device 100 from any user interface displayed on the device 100 to a main menu, home menu, or root menu. In such implementations, a touch pad is used to implement a "menu button". In some other embodiments, the menu buttons are physical push buttons or other physical input control devices, rather than touch pads.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (FIG. 1A) or memory 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and corresponding applications 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
The event classifier 170 receives the event information and determines the application view 191 of the application 136-1 and the application 136-1 to which the event information is to be delivered. The event sorter 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates one or more current application views that are displayed on the touch-sensitive display 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application(s) are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some implementations, the application internal state 192 includes additional information, such as one or more of the following: restoration information to be used when the application 136-1 resumes execution, user interface state information indicating that the information is being displayed or ready for display by the application 136-1, a state queue for enabling the user to return to a previous state or view of the application 136-1, and a repeat/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about sub-events (e.g., user touches on the touch sensitive display 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as a proximity sensor 166, one or more accelerometers 168, and/or microphone 113 (via audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 transmits event information. In other embodiments, the peripheral interface 118 transmits event information only if there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or receiving an input exceeding a predetermined duration).
In some implementations, the event classifier 170 also includes a hit view determination module 172 and/or an active event identifier determination module 173.
When the touch sensitive display 112 displays more than one view, the hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. The view is made up of controls and other elements that the user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a level of programming within the application's programming or view hierarchy. For example, the lowest horizontal view in which a touch is detected is optionally referred to as a hit view, and the set of events that are recognized as correct inputs is optionally determined based at least in part on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of the touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should process sub-events. In most cases, the hit view is the lowest level view in which the initiating sub-event (e.g., the first sub-event in a sequence of sub-events that form an event or potential event) occurs. Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as a hit view.
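A minimal sketch of the hit-view walk described above follows; the View type, its fields, and the containment test are assumptions for illustration and not the framework's actual classes.

// Return the lowest (deepest) view in the hierarchy whose frame contains the
// location of the initiating sub-event; nil if the point is outside the root.
final class View {
    let name: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
    let subviews: [View]

    init(name: String,
         frame: (x: Double, y: Double, width: Double, height: Double),
         subviews: [View] = []) {
        self.name = name
        self.frame = frame
        self.subviews = subviews
    }

    func contains(_ point: (x: Double, y: Double)) -> Bool {
        point.x >= frame.x && point.x <= frame.x + frame.width &&
        point.y >= frame.y && point.y <= frame.y + frame.height
    }
}

func hitView(in root: View, at point: (x: Double, y: Double)) -> View? {
    guard root.contains(point) else { return nil }
    for subview in root.subviews {
        if let deeper = hitView(in: subview, at: point) { return deeper }
    }
    return root   // no subview contains the point, so this view is the hit view
}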
The activity event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some implementations, the active event identifier determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the activity event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively engaged views, and thus determines that all actively engaged views should receive a particular sequence of sub-events. In other embodiments, even if the touch sub-event is completely localized to an area associated with one particular view, the higher view in the hierarchy will remain the actively engaged view.
The event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue that is retrieved by the corresponding event receiver 182.
In some embodiments, the operating system 126 includes an event classifier 170. Alternatively, the application 136-1 includes an event classifier 170. In yet another embodiment, the event classifier 170 is a stand-alone module or part of another module stored in the memory 102, such as the contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the user interface of the application. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, the respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module that is a higher level object from which methods and other properties are inherited, such as the user interface toolkit or application 136-1. In some implementations, the respective event handlers 190 include one or more of the following: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or invokes data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of application views 191 include one or more corresponding event handlers 190. Additionally, in some implementations, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
The corresponding event identifier 180 receives event information (e.g., event data 179) from the event classifier 170 and identifies events based on the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 further includes at least a subset of metadata 183 and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about sub-events such as touches or touch movements. The event information also includes additional information, such as the location of the sub-event, according to the sub-event. When a sub-event relates to movement of a touch, the event information optionally also includes the rate and direction of the sub-event. In some embodiments, the event includes rotation of the device from one orientation to another orientation (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about a current orientation of the device (also referred to as a device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and determines an event or sub-event or determines or updates the state of the event or sub-event based on the comparison. In some embodiments, event comparator 184 includes event definition 186. Event definition 186 includes definitions of events (e.g., a predefined sequence of sub-events), such as event 1 (187-1), event 2 (187-2), and others. In some implementations, sub-events in the event (187) include, for example, touch start, touch end, touch move, touch cancel, and multi-touch. In one example, the definition of event 1 (187-1) is a double click on the displayed object. For example, a double click includes a first touch on the displayed object for a predetermined length of time (touch start), a first lift-off on the displayed object for a predetermined length of time (touch end), a second touch on the displayed object for a predetermined length of time (touch start), and a second lift-off on the displayed object for a predetermined length of time (touch end). In another example, the definition of event 2 (187-2) is a drag on the displayed object. For example, dragging includes touching (or contacting) on the displayed object for a predetermined period of time, movement of the touch on the touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
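For illustration, the double-tap and drag definitions above can be represented as predefined sub-event sequences that are compared against what was received; timing and location checks are omitted, and all names are assumptions rather than the event comparator's actual structures.

// Compare a received sub-event sequence against predefined event definitions.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

let doubleTap = EventDefinition(name: "double tap",
                                sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])
let drag = EventDefinition(name: "drag",
                           sequence: [.touchBegin, .touchMove, .touchEnd])

func matches(_ received: [SubEvent], _ definition: EventDefinition) -> Bool {
    // A full comparator would also check the predetermined durations and that the
    // sub-events occur on the same displayed object.
    return received == definition.sequence
}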
In some implementations, the event definitions 187 include definitions of events for respective user interface objects. In some implementations, the event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object that triggered the hit test.
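A sketch of the hit test described above, returning the handler of the displayed object that contains the touch location; the object type and handler representation are illustrative assumptions.

// Find which displayed object the touch landed on and return its handler.
struct DisplayedObject {
    let identifier: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
    let handleEvent: (String) -> Void   // stand-in for an associated event handler 190
}

func handlerForTouch(at point: (x: Double, y: Double),
                     among objects: [DisplayedObject]) -> ((String) -> Void)? {
    let hit = objects.first { object in
        point.x >= object.frame.x && point.x <= object.frame.x + object.frame.width &&
        point.y >= object.frame.y && point.y <= object.frame.y + object.frame.height
    }
    return hit?.handleEvent
}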
In some embodiments, the definition of the respective event (187) further includes a delay action that delays delivery of the event information until it has been determined that the sequence of sub-events does or does not correspond to an event type of the event recognizer.
When the respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in the event definition 186, the respective event recognizer 180 enters an event impossible, event failed, or event end state after which subsequent sub-events of the touch-based gesture are ignored. In this case, the other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to the actively engaged event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers interact or are able to interact with each other. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to different levels in a view or programmatic hierarchy.
In some embodiments, when one or more particular sub-events of an event are identified, the corresponding event recognizer 180 activates an event handler 190 associated with the event. In some implementations, the respective event identifier 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is distinct from sending (and deferring the sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some implementations, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about the sub-event without activating the event handler. Instead, the sub-event delivery instructions deliver the event information to an event handler associated with the sub-event sequence or to an actively engaged view. Event handlers associated with the sequence of sub-events or with the actively engaged views receive the event information and perform a predetermined process.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates a telephone number used in the contact module 137 or stores a video file used in the video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, the object updater 177 creates a new user interface object or updates the location of the user interface object. GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares the display information and sends the display information to the graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It should be appreciated that the above discussion regarding event handling of user touches on a touch sensitive display also applies to other forms of user inputs that utilize an input device to operate the multifunction device 100, not all of which are initiated on a touch screen. For example, mouse movements and mouse button presses, optionally in conjunction with single or multiple keyboard presses or holds; contact movements on the touchpad, such as taps, drags, scrolls, and the like; stylus inputs; movement of the device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally used as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 with a touch screen 112 in accordance with some embodiments. The touch screen optionally displays one or more graphics within a User Interface (UI) 200. In this and other embodiments described below, a user can select one or more of these graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figures) or one or more styluses 203 (not drawn to scale in the figures). In some embodiments, selection of one or more graphics will occur when a user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (left to right, right to left, up and/or down), and/or scrolling of a finger that has been in contact with the device 100 (right to left, left to right, up and/or down). In some implementations or in some cases, inadvertent contact with the graphic does not select the graphic. For example, when the gesture corresponding to the selection is a tap, a swipe gesture that swipes over an application icon optionally does not select the corresponding application.
The device 100 optionally also includes one or more physical buttons, such as a "home" or menu button 204. As previously described, menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on touch screen 112.
In some embodiments, the device 100 includes a touch screen 112, menu buttons 204, a press button 206 for powering the device on/off and for locking the device, one or more volume adjustment buttons 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. Pressing button 206 is optionally used to turn on/off the device by pressing the button and holding the button in the pressed state for a predefined time interval; locking the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or unlock the device or initiate an unlocking process. In an alternative embodiment, the device 100 also accepts voice input through the microphone 113 for activating or deactivating certain functions. The device 100 also optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on the touch screen 112, and/or one or more haptic output generators 167 for generating haptic outputs for a user of the device 100.
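The press-and-hold versus press-and-release behavior of button 206 described above can be illustrated with a small sketch; the interval value and names are hypothetical.

// Distinguish the two button behaviors by how long the button is held.
enum ButtonAction { case togglePower, lockDevice }

func action(forPressDuration seconds: Double,
            predefinedInterval: Double = 2.0) -> ButtonAction {
    // Holding past the predefined interval turns the device on/off;
    // releasing before it elapses locks the device.
    return seconds >= predefinedInterval ? .togglePower : .lockDevice
}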
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, the device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child learning toy), a gaming system, or a control device (e.g., a home controller or an industrial controller). The device 300 generally includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication bus 320 optionally includes circuitry (sometimes referred to as a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output (I/O) interface 330 with a display 340, typically a touch screen display. The I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350, a touchpad 355, a tactile output generator 357 for generating tactile outputs on the device 300 (e.g., similar to the tactile output generator 167 described above with reference to fig. 1A), and sensors 359 (e.g., optical sensors, acceleration sensors, proximity sensors, touch-sensitive sensors, and/or contact intensity sensors similar to the contact intensity sensor 165 described above with reference to fig. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices located remotely from CPU 310. In some embodiments, memory 370 stores programs, modules, and data structures, or a subset thereof, similar to those stored in memory 102 of portable multifunction device 100 (fig. 1A). Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk editing module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (fig. 1A) optionally does not store these modules.
Each of the above elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above-described modules corresponds to a set of instructions for performing the above-described functions. The above-described modules or computer programs (e.g., sets of instructions) need not be implemented as separate software programs (such as computer programs (e.g., instructions)), procedures, or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed to embodiments of user interfaces optionally implemented on, for example, portable multifunction device 100.
Fig. 4A illustrates an exemplary user interface of an application menu on the portable multifunction device 100 in accordance with some embodiments. A similar user interface is optionally implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
Signal strength indicators 402 for wireless communications such as cellular signals and Wi-Fi signals;
time 404;
Bluetooth indicator 405;
battery status indicator 406;
tray 408 with icons for commonly used applications, such as:
an icon 416 labeled "phone" of the phone module 138, the icon 416 optionally including an indicator 414 of the number of missed calls or voice mails;
an icon 418 labeled "mail" of the email client module 140, the icon 418 optionally including an indicator 410 of the number of unread emails;
an icon 420 labeled "browser" of the browser module 147; and
an icon 422 labeled "iPod" of the video and music player module 152 (also known as iPod (trademark of Apple Inc.) module 152); and
icons of other applications, such as:
an icon 424 labeled "message" of the IM module 141;
an icon 426 labeled "calendar" of the calendar module 148;
an icon 428 labeled "photo" of the image management module 144;
an icon 430 labeled "camera" of the camera module 143;
an icon 432 labeled "online video" of the online video module 155;
an icon 434 labeled "stock market" of the stock market widget 149-2;
an icon 436 labeled "map" of the map module 154;
an icon 438 labeled "weather" of the weather widget 149-1;
an icon 440 labeled "clock" of the alarm clock widget 149-4;
an icon 442 labeled "fitness support" of the fitness support module 142;
an icon 444 labeled "note" of the note module 153; and
an icon 446 labeled "settings" for a settings application or module that provides access to the settings of device 100 and its various applications 136.
It should be noted that the icon labels shown in fig. 4A are merely exemplary. For example, the icon 422 of the video and music player module 152 is, optionally, labeled "music" or "music player". Other labels are optionally used for various application icons. In some embodiments, the label of the respective application icon includes a name of the application corresponding to the respective application icon. In some embodiments, the label of a particular application icon is different from the name of the application corresponding to the particular application icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 of fig. 3) having a touch-sensitive surface 451 (e.g., tablet or touchpad 355 of fig. 3) separate from a display 450 (e.g., touch screen display 112). The device 300 also optionally includes one or more contact intensity sensors (e.g., one or more of the sensors 359) for detecting the intensity of the contact on the touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of the device 300.
While some of the examples below will be given with reference to inputs on touch screen display 112 (where the touch sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4B. In some implementations, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary axis (e.g., 452 in fig. 4B) that corresponds to the primary axis (e.g., 453 in fig. 4B) on the display (e.g., 450). According to these embodiments, the device detects contact (e.g., 460 and 462 in fig. 4B) with the touch-sensitive surface 451 at a location corresponding to a respective location on the display (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4B). In this way, when the touch-sensitive surface (e.g., 451 in FIG. 4B) is separated from the display (e.g., 450 in FIG. 4B) of the multifunction device, user inputs (e.g., contacts 460 and 462 and movement thereof) detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display. It should be appreciated that similar approaches are optionally used for other user interfaces described herein.
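As a non-limiting illustration of the correspondence described above (and not a description of any claimed implementation), the following Swift sketch maps a contact location detected on a separate touch-sensitive surface to the corresponding location on the display by normalizing along each primary axis; the type and function names are assumptions introduced only for this example.

```swift
import Foundation

// Hypothetical stand-ins for touch-sensitive surface 451 and display 450.
struct TouchSurface { let size: CGSize }
struct DisplayArea { let size: CGSize }

/// Maps a contact detected on the touch-sensitive surface to the corresponding
/// location on the display by scaling along each primary axis.
func displayLocation(for contact: CGPoint,
                     on surface: TouchSurface,
                     to display: DisplayArea) -> CGPoint {
    let normalizedX = contact.x / surface.size.width   // position along the surface's primary axis
    let normalizedY = contact.y / surface.size.height  // position along the secondary axis
    return CGPoint(x: normalizedX * display.size.width,
                   y: normalizedY * display.size.height)
}

// Example: a contact at (45, 30) on a 90 x 60 surface corresponds to the center of a 450 x 300 display.
let center = displayLocation(for: CGPoint(x: 45, y: 30),
                             on: TouchSurface(size: CGSize(width: 90, height: 60)),
                             to: DisplayArea(size: CGSize(width: 450, height: 300)))
```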
Additionally, while the following examples are primarily given with reference to finger inputs (e.g., finger contacts, single-finger tap gestures, finger swipe gestures), it should be understood that in some embodiments one or more of these finger inputs are replaced by input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., rather than a contact), followed by movement of the cursor along the path of the swipe (e.g., rather than movement of the contact). As another example, a tap gesture is optionally replaced by a mouse click while the cursor is over the location of the tap gesture (e.g., instead of detection of contact, followed by ceasing to detect contact). Similarly, when multiple user inputs are detected simultaneously, it should be appreciated that multiple computer mice are optionally used simultaneously, or that mouse and finger contacts are optionally used simultaneously.
Fig. 5A illustrates an exemplary personal electronic device 500. The device 500 includes a body 502. In some embodiments, device 500 may include some or all of the features described with respect to devices 100 and 300 (e.g., fig. 1A-4B). In some implementations, the device 500 has a touch sensitive display 504, hereinafter referred to as a touch screen 504. In addition to or in lieu of touch screen 504, device 500 has a display and a touch-sensitive surface. As with devices 100 and 300, in some implementations, touch screen 504 (or touch-sensitive surface) optionally includes one or more intensity sensors for detecting the intensity of an applied contact (e.g., touch). One or more intensity sensors of the touch screen 504 (or touch sensitive surface) may provide output data representative of the intensity of the touch. The user interface of the device 500 may respond to touches based on the intensity of the touches, meaning that touches of different intensities may invoke different user interface operations on the device 500.
Exemplary techniques for detecting and processing touch intensity are found, for example, in the following related patent applications: International Patent Application Serial No. PCT/US2013/040061, filed May 8, 2013, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application", published as WIPO Patent Publication No. WO/2013/169849; and International Patent Application Serial No. PCT/US2013/069483, filed November 11, 2013, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships", published as WIPO Patent Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, the device 500 has one or more input mechanisms 506 and 508. The input mechanisms 506 and 508 (if included) may be in physical form. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, the device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, may allow for attachment of the device 500 with, for example, a hat, glasses, earrings, a necklace, a shirt, a jacket, a bracelet, a watchband, a chain, pants, a belt, a shoe, a purse, a backpack, or the like. These attachment mechanisms allow the user to wear the device 500.
Fig. 5B depicts an exemplary personal electronic device 500. In some embodiments, the device 500 may include some or all of the components described with reference to fig. 1A, 1B, and 3. The device 500 has a bus 512 that operatively couples an I/O section 514 with one or more computer processors 516 and memory 518. The I/O section 514 may be connected to a display 504, which may have a touch-sensitive component 522 and optionally an intensity sensor 524 (e.g., a contact intensity sensor). In addition, the I/O section 514 may be connected to a communication unit 530 for receiving application and operating system data using Wi-Fi, Bluetooth, Near Field Communication (NFC), cellular, and/or other wireless communication technologies. The device 500 may include input mechanisms 506 and/or 508. For example, the input mechanism 506 is optionally a rotatable input device or a depressible and rotatable input device. In some examples, the input mechanism 508 is optionally a button.
In some examples, the input mechanism 508 is optionally a microphone. Personal electronic device 500 optionally includes various sensors, such as a GPS sensor 532, an accelerometer 534, an orientation sensor 540 (e.g., compass), a gyroscope 536, a motion sensor 538, and/or combinations thereof, all of which are operatively connected to I/O section 514.
The memory 518 of the personal electronic device 500 may include one or more non-transitory computer-readable storage media for storing computer-executable instructions that, when executed by the one or more computer processors 516, for example, may cause the computer processors to perform the techniques described below, including processes 1500 and 1600 (fig. 15-16). A computer-readable storage medium may be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with an instruction execution system, apparatus, and device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer readable storage medium may include, but is not limited to, magnetic storage devices, optical storage devices, and/or semiconductor storage devices. Examples of such storage devices include magnetic disks, optical disks based on CD, DVD, or blu-ray technology, and persistent solid state memories such as flash memory, solid state drives, etc. The personal electronic device 500 is not limited to the components and configuration of fig. 5B, but may include other components or additional components in a variety of configurations.
As used herein, the term "affordance" refers to a user-interactive graphical user interface object that is optionally displayed on a display screen of device 100, 300, and/or 500 (fig. 1A, 3, and 5A-5B). For example, an image (e.g., an icon), a button, and text (e.g., a hyperlink) optionally each constitute an affordance.
As used herein, the term "focus selector" refers to an input element for indicating the current portion of a user interface with which a user is interacting. In some implementations that include a cursor or other position marker, the cursor acts as a "focus selector" such that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touch pad 355 in fig. 3 or touch-sensitive surface 451 in fig. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted according to the detected input. In some implementations including a touch screen display (e.g., touch sensitive display system 112 in fig. 1A or touch screen 112 in fig. 4A) that enables direct interaction with user interface elements on the touch screen display, the contact detected on the touch screen acts as a "focus selector" such that when an input (e.g., a press input by a contact) is detected on the touch screen display at the location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, the focus is moved from one area of the user interface to another area of the user interface without a corresponding movement of the cursor or movement of contact on the touch screen display (e.g., by moving the focus from one button to another using a tab key or arrow key); in these implementations, the focus selector moves according to movement of the focus between different areas of the user interface. Regardless of the particular form that the focus selector takes, the focus selector is generally the user interface element (or contact on the touch screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user desires to interact). For example, upon detection of a press input on a touch-sensitive surface (e.g., a touchpad or touch screen), the position of a focus selector (e.g., a cursor, contact, or selection box) over a respective button will indicate that the user desires to activate the respective button (rather than other user interface elements shown on the device display).
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on a plurality of intensity samples. The characteristic intensity is optionally based on a predefined number of intensity samples or a set of intensity samples acquired during a predetermined period of time (e.g., 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, 10 seconds) relative to a predefined event (e.g., after detection of the contact, before or after detection of lift-off of the contact, before or after detection of the start of movement of the contact, before or after detection of the end of the contact, and/or before or after detection of a decrease in intensity of the contact). The characteristic intensity of the contact is optionally based on one or more of: a maximum value of the intensity of the contact, an average value of the intensity of the contact, a top 10 percentile value of the intensity of the contact, a value at half the maximum of the intensity of the contact, a value at 90 percent of the maximum of the intensity of the contact, and the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether the user has performed an operation. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some implementations, a comparison between the characteristic intensity and the one or more thresholds is used to determine whether to perform one or more operations (e.g., whether to perform or forgo performing a respective operation), rather than to determine whether to perform a first operation or a second operation.
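The threshold comparison in the example above can be sketched as follows. This is an illustrative sketch only, assuming the mean of the samples as the characteristic intensity; the names and the choice of characterization are assumptions and are not taken from the embodiments.

```swift
enum ContactOperation { case first, second, third }

/// One possible characterization: the average of the collected intensity samples.
func characteristicIntensity(of samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    return samples.reduce(0, +) / Double(samples.count)
}

/// Selects an operation by comparing the characteristic intensity against two thresholds.
func operation(for samples: [Double],
               firstThreshold: Double,
               secondThreshold: Double) -> ContactOperation {
    let intensity = characteristicIntensity(of: samples)
    if intensity > secondThreshold { return .third }   // exceeds the second threshold
    if intensity > firstThreshold { return .second }   // exceeds the first but not the second
    return .first                                      // does not exceed the first threshold
}
```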
FIG. 6 illustrates an exemplary set of hand gestures according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
FIG. 6 illustrates an exemplary set of hand gestures that may be detected by computer system 600. Computer system 600 includes an input mechanism 602a (e.g., a crown of a wristwatch), an input mechanism 602b (e.g., a side key), and a display 602c. In some embodiments, computer system 600 includes one or more components of devices 100, 300, and/or 500. In some embodiments, computer system 600 is a wearable device, such as a watch. In some embodiments, input mechanisms 602a and 602b may include one or more of the components and/or features of input mechanisms 506 and 508 described above with respect to fig. 5A. In some implementations, the display screen 602c is a touch-sensitive display and includes one or more of the components and/or features described above with respect to the touch screen 504.
As shown in fig. 6, computer system 600 is worn around the wrist of a user and away from hand 622. The computer system 600 includes one or more accelerometers, gyroscopes, and/or biometric sensors to detect various hand gestures and/or movements (e.g., such as tilting) of the computer system 600. At fig. 6, one or more of the biometric sensors include an optical sensor and/or heart rate sensor that computer system 600 uses to detect various hand gestures. In some embodiments, computer system 600 may detect hand gestures using one or more sensors other than optical sensors/heart rate sensors.
FIG. 6 illustrates exemplary hand gestures that may be detected by computer system 600 (e.g., via an optical sensor/heart rate sensor). As shown in fig. 6, the hand gestures include a neutral gesture/position 610, a grasping gesture 620, a double grasping gesture 630, a pinch gesture 640, and a double pinch gesture 650. Although these hand gestures (e.g., 610, 620, 630, 640, and 650) are discussed throughout this application, other hand gestures (e.g., multi-finger tap hand gestures and/or triple (or quadruple, or quintuple) grasping/pinch gestures) and one or more combinations of these hand gestures are contemplated as being detectable by the computer system 600 (e.g., via an optical sensor/heart rate sensor) and/or usable for performing one or more operations as described below. Thus, the hand gestures provided herein are for exemplary purposes only, and the embodiments described herein are not limited to these particular hand gestures. Further, hand gestures described herein are not directed to (e.g., are not captured by) one or more cameras of computer system 600. Further, hand gestures described herein do not contact the display 602c and/or are not performed in front of the display 602c (e.g., in order for the hand gestures to be detected by the computer system 600).
As shown in fig. 6, the neutral hand gesture/position 610 is a hand gesture/position in which none of the fingertips of the hand 622 touch any portion of the hand 622. The grasping gesture 620 is a hand gesture in which one or more of the fingers of the hand 622 are touching another portion of the user's hand, such as the fingers touching the palm of the user's hand when the user forms a fist. The double grasping gesture 630 is a combination (and/or sequence) of the neutral gesture/position 610 and the grasping gesture 620, wherein the fingers of the hand 622 are closed to form a first grasping gesture (e.g., the first portion 630a), opened to form a neutral gesture/position (e.g., the second portion 630b), and closed again to form a second grasping gesture (e.g., the third portion 630c). Thus, the double grasping gesture is a gesture that includes multiple (e.g., two) instances (e.g., the first portion 630a and the third portion 630c) of the grasping gesture detected within a predetermined period of time (e.g., 0 seconds-2 seconds). The pinch gesture 640 is a hand gesture in which one or more of the fingers of the hand 622 are touching each other (e.g., two fingers). The pinch gesture is different from the grasping gesture in that a fist is not formed when the pinch gesture is made, whereas a fist is formed when the grasping gesture is made. The double pinch gesture 650 is a combination (and/or sequence) of the neutral gesture/position 610 and the pinch gesture 640, wherein the fingers of the hand 622 touch to form a first pinch gesture (e.g., the first portion 650a), open to form a neutral gesture/position (e.g., the second portion 650b), and close again to form a second pinch gesture (e.g., the third portion 650c). Thus, a double pinch gesture is a gesture that includes multiple (e.g., two) instances (e.g., the first portion 650a and the third portion 650c) of the pinch gesture detected within a predetermined period of time (e.g., 0 seconds-2 seconds). As shown in fig. 6, the double grasping gesture 630 and the grasping gesture 620 do not include multiple instances of the pinch gesture 640, and the double pinch gesture 650 and the pinch gesture 640 do not include multiple instances of the grasping gesture 620. In some embodiments, when two pinch gestures are detected within a predetermined period of time, computer system 600 registers (or detects) the two gestures as a double pinch gesture. In some embodiments, when two pinch gestures are not detected within a predetermined period of time, the computer system 600 registers (or detects) the two gestures as two separate pinch gestures (e.g., does not register a double pinch gesture). In some embodiments, when two grasping gestures are detected within a predetermined period of time, the computer system 600 registers (or detects) the two gestures as a double grasping gesture. In some embodiments, when two grasping gestures are not detected within a predetermined period of time, the computer system 600 registers (or detects) the two gestures as two separate grasping gestures (e.g., does not register a double grasping gesture).
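Purely as an illustration of the timing described above (and not as the claimed detection logic), the following Swift sketch registers two detected instances of a gesture as a single double gesture when the second instance occurs within a predetermined window of the first; the 1.5-second window is an assumed value inside the 0-2 second range mentioned above.

```swift
import Foundation

/// Returns true when two detected grasping (or pinch) gestures should be registered
/// as one double gesture, i.e., when the second occurs within the window of the first.
func registersAsDoubleGesture(first: Date, second: Date,
                              window: TimeInterval = 1.5) -> Bool {
    let interval = second.timeIntervalSince(first)
    return interval >= 0 && interval <= window
}

// Example: two pinch gestures 0.6 s apart register as a double pinch gesture,
// while two pinch gestures 3 s apart register as two separate pinch gestures.
let firstPinch = Date()
let doublePinch = registersAsDoubleGesture(first: firstPinch,
                                           second: firstPinch.addingTimeInterval(0.6)) // true
let separatePinches = registersAsDoubleGesture(first: firstPinch,
                                               second: firstPinch.addingTimeInterval(3.0)) // false
```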
Fig. 7A-7AA illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
Fig. 7A-7G illustrate an exemplary user interface for responding to an incoming alert (e.g., a timer alert) using hand gestures. FIG. 7A shows computer system 600 displaying a clock user interface 710 on display screen 602c that includes the current time (e.g., 10:09). When the computer system 600 is displaying the clock user interface 710, the hand 622 is in a neutral position (e.g., 610 in fig. 6). At fig. 7A, computer system 600 receives an incoming alert from a timer application (e.g., installed on computer system 600). As shown in FIG. 7B, in response to receiving the incoming alert from the timer application, computer system 600 displays a timer user interface 712. Timer user interface 712 includes a stop control 712a (e.g., which, when activated, causes computer system 600 to stop outputting audible sounds associated with the incoming alert), a repeat control 712b (e.g., which, when activated, causes computer system 600 to repeat the timer associated with the incoming alert), and an indication that the timer has completed (e.g., "timer complete"). At fig. 7C, computer system 600 detects a pinch gesture 750c. However, as shown in fig. 7D, computer system 600 maintains the display of timer user interface 712 and does not perform any operations in response to pinch gesture 750c because computer system 600 is not operating in a hand gesture navigation mode (e.g., and/or an accessibility mode). In some embodiments, the computer system 600 does not detect the pinch gesture 750c because the computer system 600 is not operating in the hand gesture navigation mode.
At fig. 7D, the computer system 600 detects a double grasping gesture 750d. At fig. 7E, in response to detecting the double grasping gesture 750d, the computer system 600 begins operating in the hand gesture navigation mode. At fig. 7E, computer system 600 detects a pinch gesture 750e (e.g., a second pinch gesture). As shown in fig. 7F, in response to detecting pinch gesture 750e, computer system 600 displays a focus indicator around stop control 712a because computer system 600 is operating in the hand gesture navigation mode. Computer system 600 displays the focus indicator around stop control 712a to indicate that stop control 712a may be activated in response to computer system 600 detecting a particular hand gesture. In contrast, computer system 600 does not display a focus indicator around repeat control 712b in FIG. 7F. Thus, the repeat control 712b cannot be activated in response to the computer system 600 detecting a particular hand gesture (e.g., until the repeat control 712b is displayed with the focus indicator). At fig. 7F, the computer system 600 detects a grasping gesture 750f. At fig. 7G, in response to detecting the grasping gesture 750f, the computer system 600 activates the stop control 712a. Upon activation of the stop control 712a, the computer system 600 stops displaying the timer user interface 712, stops outputting audible sounds associated with the incoming alert, and redisplays the clock user interface 710 (e.g., as shown in fig. 7G).
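The mapping between hand gestures and navigation actions described in fig. 7C-7G can be sketched roughly as below. This is an illustrative model only, assuming a simple ordered list of controls; the type names and the activation behavior are assumptions, not the claimed implementation.

```swift
enum HandGesture { case pinch, doublePinch, grasp, doubleGrasp }

final class HandGestureNavigator {
    private(set) var isNavigationModeEnabled = false
    private(set) var focusedIndex: Int?                       // control surrounded by the focus indicator
    var controls = ["stop control 712a", "repeat control 712b"]

    func handle(_ gesture: HandGesture) {
        guard isNavigationModeEnabled else {
            // Outside the hand gesture navigation mode, only a double grasping gesture is acted
            // upon here: it enables the mode (FIGS. 7C-7E); other hand gestures are ignored.
            if gesture == .doubleGrasp { isNavigationModeEnabled = true }
            return
        }
        switch gesture {
        case .pinch:                                          // move focus to the next control (FIG. 7F)
            focusedIndex = ((focusedIndex ?? -1) + 1) % controls.count
        case .doublePinch:                                    // move focus to the previous control
            focusedIndex = ((focusedIndex ?? 0) - 1 + controls.count) % controls.count
        case .grasp:                                          // activate the control in focus (FIG. 7G)
            if let index = focusedIndex { print("Activated \(controls[index])") }
        case .doubleGrasp:
            break                                             // e.g., display menu 732 (FIG. 7M); omitted here
        }
    }
}
```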
Fig. 7G-7J illustrate exemplary user interfaces for responding to an incoming alert (e.g., an incoming call) using a hand gesture. As discussed above, FIG. 7G illustrates the computer system 600 displaying the clock user interface 710 while operating in the hand gesture navigation mode. At fig. 7G, computer system 600 receives an alert corresponding to an incoming telephone call. As shown in fig. 7H, in response to receiving the alert, computer system 600 displays a telephone user interface 714 that includes a call identifier 714a (e.g., "John Appleseed Incoming Call") indicating that computer system 600 is receiving an incoming call from John Appleseed. Telephone user interface 714 also includes a rejection control 714b (e.g., which, when activated, causes computer system 600 to reject the telephone call), an answer control 714c (e.g., which, when activated, causes computer system 600 to answer the telephone call), and an additional options control 714d (e.g., which, when activated, causes computer system 600 to display additional options for responding to the incoming telephone call). As shown in fig. 7H, the computer system 600 displays a hand gesture notification 716 indicating that the user can answer the phone call by providing a double grasping gesture (e.g., instead of providing multiple gestures, such as one or more pinch gestures to navigate to the answer control 714c and a grasping gesture to activate the answer control 714c, using one or more of the techniques discussed with respect to fig. 7C-7G). In some implementations, the computer system 600 displays the hand gesture notification 716 because it is determined that the computer system 600 is operating in the hand gesture navigation mode and/or the computer system 600 is operating in a use context where certain types of operations (e.g., answering a phone call, stopping an alarm, and/or replying to a text message) may be accomplished via a single hand gesture (e.g., a predetermined hand gesture). In some embodiments, computer system 600 displays other and/or different hand gesture notifications (e.g., different from hand gesture notification 716) to inform the user that one or more other gestures may be used to perform an operation.
At fig. 7I, the computer system 600 detects a double grasping gesture 750i. As shown in fig. 7J, in response to detecting the double grasping gesture 750i, the computer system 600 replaces the call identifier 714a with an elapsed time indicator 714e (which indicates that the incoming telephone call has been answered) and displays a volume control 714f (which, when activated, causes the volume level of one or more speakers of the computer system 600 to be adjusted, for example). In other words, in response to detecting the double grasping gesture 750i, the computer system 600 answers the incoming telephone call. In some embodiments, in response to detecting a double grasping gesture when an incoming alert is not being received (and/or has not been received within a predetermined period of time), computer system 600 performs different operations (e.g., as discussed with respect to fig. 7D-7E and 7L-7M) than the operations performed when an incoming alert is being received (e.g., and/or has been received within a predetermined period of time).
Fig. 7K-7U illustrate exemplary user interfaces for navigating the user interface using hand gestures and a moving cursor. In particular, fig. 7K-7U illustrate an exemplary scenario for ending a fitness tracker using hand gestures and a moving cursor. Fig. 7K illustrates computer system 600 displaying an exercise user interface 718 that includes a list of exercise metrics. While the exercise user interface 718 is displayed, the computer system 600 detects the double grasping gesture 750l at fig. 7L. As shown in fig. 7M, in response to detecting the double grasping gesture 750l, the computer system 600 displays a menu 732 including a digital hardware operation control 732a, a move cursor control 732b, an interaction control 732c, and an additional options control 732d. Menu 732 includes a control identifier 708 ("digital crown") that identifies the control currently in focus (e.g., a black box around digital hardware operation control 732a in fig. 7M). In response to detecting activation of the digital hardware operation control 732a, the computer system 600 begins operating in a digital hardware operation mode (e.g., a different hand gesture navigation mode). In response to detecting activation of interaction control 732c, computer system 600 replaces menu 732 with a menu having controls that, when activated, cause computer system 600 to perform different operations corresponding to various gestures detected at a location on computer system 600 (e.g., as discussed below with respect to fig. 7Z). In response to detecting activation of the additional options control 732d, the computer system 600 replaces the menu 732 with a menu that includes additional controls (e.g., as discussed below with respect to fig. 7Z).
As shown in fig. 7M, in response to detecting the double grasping gesture 750l, the computer system 600 displays a focus indicator around the digital hardware operation control 732a to indicate that the digital hardware operation control 732a may be activated in response to the computer system 600 detecting one or more hand gestures. At fig. 7M, computer system 600 detects a pinch gesture 750m. As shown in fig. 7N, in response to detecting the pinch gesture 750m, the computer system 600 moves the focus indicator to the right such that the focus indicator surrounds the move cursor control 732b and no longer surrounds the digital hardware operation control 732a. Notably, in response to detecting the pinch gesture 750m, the computer system 600 moves the focus indicator from one control to the next (e.g., in terms of location) on the user interface. At fig. 7N, the computer system 600 detects the grasping gesture 750n while displaying the focus indicator around the move cursor control 732b.
In response to detecting the grasping gesture 750n, the computer system 600 stops displaying the menu 732 and displays a cursor 742 at a position on the exercise user interface 718, as shown in fig. 7O. At fig. 7O, in response to detecting the grasping gesture 750n, the computer system 600 begins operating in a moving cursor mode of operation (e.g., another hand gesture mode of operation). At fig. 7P, computer system 600 detects (e.g., via one or more accelerometers and/or gyroscopes, via sensors other than the optical/heart rate sensors, and/or via sensors other than the sensors that detect one or more hand gestures) that computer system 600 is tilting (e.g., moving) to the left (e.g., 750p). At fig. 7P, in response to detecting that computer system 600 is tilting to the left, computer system 600 moves cursor 742 to the left based on the amount of tilt and/or the tilt speed detected by computer system 600. As shown in fig. 7P, cursor 742 moves toward the left edge of exercise user interface 718. At fig. 7P, it is determined that cursor 742 is positioned at (e.g., and/or near) the left edge of exercise user interface 718 (e.g., for a predetermined period of time). As shown in fig. 7Q-7R, because it is determined that cursor 742 is positioned on the left edge of exercise user interface 718, computer system 600 displays an animation that slides exercise user interface 718 off the right edge of display screen 602c and slides exercise control user interface 720 from the left edge of the display screen onto display screen 602c (e.g., to the right). Thus, because it is determined that cursor 742 is positioned at (e.g., and/or near) an edge of the display, computer system 600 replaces the currently displayed user interface with another user interface.
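A minimal Swift sketch of the tilt-driven cursor behavior described above, provided only for illustration; the display size, gain, and edge margin are assumed values, and reaching a horizontal edge is simply reported so that a caller could swap in an adjacent user interface (fig. 7Q-7R).

```swift
import Foundation

struct MotionCursor {
    var position = CGPoint(x: 100, y: 100)
    let displaySize = CGSize(width: 200, height: 200)
    let edgeMargin: CGFloat = 4                       // how close to an edge counts as "at the edge"

    /// Moves the cursor based on the detected tilt; larger tilt values move the cursor farther.
    mutating func applyTilt(tiltX: Double, tiltY: Double, gain: CGFloat = 2.0) {
        position.x = min(max(position.x + CGFloat(tiltX) * gain, 0), displaySize.width)
        position.y = min(max(position.y + CGFloat(tiltY) * gain, 0), displaySize.height)
    }

    /// True when the cursor has reached the left or right edge, at which point the
    /// currently displayed user interface would be replaced with an adjacent one.
    var isAtHorizontalEdge: Bool {
        position.x <= edgeMargin || position.x >= displaySize.width - edgeMargin
    }
}

// Example: a sustained tilt to the left moves the cursor to the left edge.
var cursor = MotionCursor()
cursor.applyTilt(tiltX: -60, tiltY: 0)
let shouldSwapUserInterface = cursor.isAtHorizontalEdge   // true
```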
As shown in fig. 7R, exercise control user interface 720 includes a lock control 720a (e.g., which, when activated, causes computer system 600 to ignore touch input detected on computer system 600), a new control 720b (e.g., which, when activated, causes computer system 600 to initiate a new fitness tracker), an end control 720c (e.g., which, when activated, causes computer system 600 to stop the fitness tracker currently tracking exercise activity), and a pause control 720d (e.g., which, when activated, causes computer system 600 to pause the fitness tracker currently tracking exercise activity). As shown in fig. 7R, hand 622 is in a neutral position (e.g., 610 in fig. 6) and cursor 742 is positioned over end control 720c. At fig. 7S, a period of time has elapsed since the cursor 742 was first positioned over the end control 720c (and/or at the current position of the cursor 742) (e.g., and the hand 622 has remained in the neutral position). As shown in fig. 7S, computer system 600 displays cursor 742 with an indication 744. Indication 744 indicates an amount of time before an operation (e.g., an activation operation) corresponding to the position of cursor 742 will be performed, such as activating end control 720c and/or displaying a menu (e.g., menu 732 of fig. 7M) that includes one or more controls for performing an operation corresponding to the position of cursor 742. Thus, when the cursor 742 is displayed at a location corresponding to a user interface object, such as a control, the computer system 600 displays an animation in which indication 744 grows in size and/or fills cursor 742 over a period of time (e.g., a period of time indicating the time remaining before the operation will be performed). At fig. 7S, indication 744 filling approximately half of cursor 742 indicates that approximately half of the time has elapsed before the computer system 600 can perform an operation corresponding to the position of the cursor 742 (e.g., since the cursor 742 has been positioned over the end control 720c). Referring back to fig. 7Q-7R, computer system 600 does not display indication 744 with cursor 742 while cursor 742 is not displayed over a user interface object and/or is at the edge of the display, regardless of how long cursor 742 is displayed at a particular location. Thus, in some embodiments, computer system 600 displays an animation that includes indication 744 only when cursor 742 is positioned over a user interface object (e.g., a selectable and/or activatable user interface object, such as a user interface object that is activatable via one or more gestures detected on display 602c).
At fig. 7T, more time has elapsed since cursor 742 was first positioned over end control 720c (e.g., and hand 622 has remained in the neutral position). As shown in fig. 7T, computer system 600 has updated indication 744 such that indication 744 has filled all of cursor 742. At FIG. 7T, it is determined that cursor 742 has been positioned over end control 720c for a predetermined period of time (e.g., 1 second-5 seconds). At fig. 7U, because it is determined that cursor 742 has been positioned over end control 720c for the predetermined period of time, computer system 600 activates end control 720c. Upon activation of end control 720c, computer system 600 displays a completion user interface 770 indicating that the fitness tracker has ended.
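The dwell behavior of indication 744 (fig. 7S-7U) can be sketched as follows, for illustration only; the 3-second dwell interval is an assumed value inside the 1-5 second range described above, and the names are not taken from the embodiments.

```swift
import Foundation

struct DwellTimer {
    let dwellInterval: TimeInterval = 3.0
    private var dwellStart: Date?

    /// Call whenever the cursor position is evaluated. `overActivatableObject` is true only
    /// when the cursor is positioned over a selectable object such as end control 720c.
    /// Returns true once the cursor has dwelled long enough for the operation to be performed.
    mutating func update(overActivatableObject: Bool, now: Date = Date()) -> Bool {
        guard overActivatableObject else {
            dwellStart = nil                          // no fill animation over empty areas or screen edges
            return false
        }
        let start = dwellStart ?? now
        dwellStart = start
        return now.timeIntervalSince(start) >= dwellInterval
    }

    /// Fraction of indication 744 that should currently be filled (0.0 ... 1.0).
    func fillFraction(now: Date = Date()) -> Double {
        guard let start = dwellStart else { return 0 }
        return min(now.timeIntervalSince(start) / dwellInterval, 1.0)
    }
}
```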
Fig. 7U-7AA illustrate exemplary user interfaces for navigating the user interface using hand gestures and a moving cursor. In particular, fig. 7U-7AA illustrate an exemplary scenario in which computer system 600 enters the moving cursor mode of operation in response to detecting a shake gesture (e.g., instead of activating move cursor control 732b on menu 732 via one or more hand gestures, as described above with respect to fig. 7N). FIG. 7U illustrates computer system 600 displaying the clock user interface 710 that includes the current time (e.g., 10:09). At fig. 7W, computer system 600 detects that it is being (or has been) shaken (e.g., via one or more accelerometers and/or gyroscopes) (e.g., 750w). At fig. 7X, in response to detecting that the computer system 600 is being (or has been) shaken, the computer system 600 begins operating in the moving cursor mode (e.g., transitions from not operating in the moving cursor mode to operating in the moving cursor mode). In response to detecting that computer system 600 is being (or has been) shaken, computer system 600 displays a cursor 742. At fig. 7Y, computer system 600 detects that computer system 600 is tilting (e.g., moving) to the right (e.g., 750y). At fig. 7Y, in response to detecting that computer system 600 is tilting to the right, computer system 600 moves cursor 742 to the right based on the amount of tilt and/or the tilt speed detected by computer system 600. As shown in fig. 7Y, cursor 742 moves toward the right edge of clock user interface 710. At fig. 7Y, it is determined that cursor 742 is positioned at (e.g., and/or near) the right edge of clock user interface 710. As shown in fig. 7Z-7AA, because it is determined that cursor 742 is positioned on the right edge of clock user interface 710, computer system 600 displays an animation that slides the clock user interface off the left edge of display screen 602c (e.g., away from the edge on which cursor 742 is positioned) and slides analog clock user interface 780 onto display screen 602c from the right side, moving toward the left edge of display screen 602c.
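For illustration only, a shake of the kind that enters the moving cursor mode might be recognized along the lines of the sketch below, by counting acceleration peaks within a short window; the threshold, window, and peak count are assumptions and are not taken from the embodiments.

```swift
import Foundation

struct ShakeDetector {
    let threshold = 2.5                    // acceleration magnitude, in g (assumed value)
    let window: TimeInterval = 1.0         // how long detected peaks are remembered
    let requiredPeaks = 3                  // how many peaks constitute a shake
    private var peaks: [Date] = []

    /// Feed accelerometer magnitudes; returns true when a shake is registered,
    /// at which point the moving cursor mode would be entered (FIGS. 7W-7X).
    mutating func addSample(magnitude: Double, at time: Date = Date()) -> Bool {
        if magnitude >= threshold { peaks.append(time) }
        peaks.removeAll { time.timeIntervalSince($0) > window }
        return peaks.count >= requiredPeaks
    }
}
```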
Fig. 8A-8J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
In particular, fig. 8A-8J illustrate an exemplary scenario in which computer system 600 displays menu 732 (e.g., instead of activating a user interface object, as described above with respect to fig. 7I-7U) in response to cursor 742 being positioned (e.g., displayed) over a user interface object. FIG. 8A illustrates computer system 600 displaying a clock user interface 810 that includes application controls 810a-810d (e.g., which, when activated, each cause the computer system 600 to launch an application). At fig. 8A, computer system 600 detects that it is being (or has been) shaken (e.g., 850a) (e.g., using one or more techniques as described above with respect to fig. 7U-7AA). At fig. 8B, in response to detecting that computer system 600 is being (or has been) shaken, computer system 600 begins operating in the moving cursor mode and displays a cursor 742 on clock user interface 810.
At fig. 8C, the cursor 742 has been displayed at the same location (e.g., the location at which it was displayed in fig. 8B) for a predetermined period of time. However, at fig. 8C, computer system 600 does not display an animation of cursor 742 being filled in because cursor 742 is not located at a position that corresponds to a user interface object (e.g., a user interface object that is selectable and/or that can be activated when computer system 600 detects one or more inputs on the user interface object and, in some embodiments, while the computer system is operating in a normal operating mode and/or is not detecting a hand gesture) (e.g., using one or more techniques described with respect to fig. 7I-7U). At fig. 8C, computer system 600 detects that it is tilting downward (e.g., 850c) (e.g., using one or more of the techniques described with respect to fig. 7I-7U). As shown in fig. 8D, in response to detecting that computer system 600 is tilting downward, computer system 600 moves cursor 742 over application icon 810d.
As shown in fig. 8E-8G, because cursor 742 is displayed over application icon 810d (e.g., a selectable user interface object), computer system 600 displays an animation in which indication 744 fills cursor 742 over a period of time (e.g., using one or more techniques discussed above with respect to fig. 7I-7U). As shown in fig. 8G, the indication 744 has completely filled the cursor 742. At fig. 8G, it is determined that the cursor 742 has been positioned over the application icon 810d for a predetermined period of time (e.g., 1 second-5 seconds).
As shown in fig. 8H, because it is determined that the cursor 742 has been positioned over the application icon 810d for the predetermined period of time, the computer system 600 displays the menu 732 (e.g., and in some embodiments, selects the application icon 810d and/or the location of the application icon 810d (e.g., such that an operation may be performed using the application icon 810d and/or the location of the application icon 810d) without activating the application icon 810d). Further, at FIG. 8H, computer system 600 displays cursor 742 positioned over interaction control 732c. In some embodiments, one or more settings control whether computer system 600 activates a user interface object when it is determined that cursor 742 has been positioned over the respective user interface object for a predetermined period of time, whether computer system 600 displays a menu when it is determined that cursor 742 has been positioned over the respective user interface object for a predetermined period of time, or both. In some embodiments, computer system 600 displays menu 732 at a location on the display that is away from the location where cursor 742 is positioned over application icon 810d (e.g., so as not to overlay the user interface object over which cursor 742 is hovering).
As shown in fig. 8I, computer system 600 displays indication 744 having filled cursor 742 (e.g., using one or more techniques as discussed with respect to fig. 8E-8G). At fig. 8I, it is determined that cursor 742 has been detected over interaction control 732c for a predetermined period of time. At fig. 8J, because it is determined that cursor 742 has been detected over interaction control 732c for the predetermined period of time, the computer system 600 activates the interaction control 732c. In response to activation of interaction control 732c, computer system 600 displays additional controls, including tap control 832a. Notably, when it is determined that the cursor 742 has been detected over a selectable user interface object for the predetermined time while the menu 732 is displayed, the computer system 600 activates the user interface object (and does not redisplay the menu 732, regardless of the one or more settings of the computer system 600).
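The setting-dependent dwell behavior described above might be modeled as in the sketch below; this is illustrative only, and the enum and names are assumptions rather than elements of the embodiments.

```swift
enum DwellAction { case activateObject, showMenu }

struct DwellSettings {
    /// What dwelling over a selectable object does by default (FIG. 8H shows the menu case).
    var preferredAction: DwellAction = .showMenu

    /// While menu 732 is displayed, dwelling over a control activates it regardless of the setting.
    func resolvedAction(isMenuDisplayed: Bool) -> DwellAction {
        isMenuDisplayed ? .activateObject : preferredAction
    }
}
```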
At fig. 8J, computer system 600 displays cursor 742 over tap control 832a. In some implementations, the computer system 600 activates the tap control 832a when it is determined that the cursor 742 has been detected over the tap control 832a for the predetermined period of time. In some embodiments, in response to detecting activation of the tap control 832a, the computer system 600 detects and/or performs the operation that would be performed if a tap gesture were detected at the location where the cursor 742 was displayed when the menu 732 was displayed (e.g., in fig. 8G-8I). For example, at fig. 8J, in response to detecting activation of tap control 832a, computer system 600 launches an application corresponding to application icon 810d (e.g., the user interface of fig. 10F). In some embodiments, other gesture controls may be displayed and activated, wherein computer system 600 performs the operation that would be performed if a corresponding gesture (e.g., a long press gesture, a drag gesture) were detected at the location where cursor 742 was displayed when menu 732 was displayed (e.g., an operation different from the operation of a tap gesture, such as a gesture that causes a clock face menu to be displayed, a gesture that causes an application icon to be moved from one location to another location on clock user interface 810, or a gesture that causes an application icon to be deleted from clock user interface 810).
Fig. 9A-9H illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
In particular, fig. 9A-9H illustrate exemplary scenarios in which computer system 600 navigates through a user interface in response to detecting a hand gesture (e.g., using similar techniques as described above with respect to fig. 7A-7G). FIG. 9A illustrates the computer system 600 displaying the media user interface 910 when not operating in the hand gesture navigation mode. Media user interface 910 includes a back control 910a (e.g., which when activated causes computer system 600 to display a previously displayed user interface), a reverse control 910b (e.g., which when activated causes computer system 600 to be configured to play back a previous media item in the list of media items), a pause control 910c (e.g., which when activated causes computer system 600 to pause playback of a media item), a forward control 910d (e.g., which when activated causes computer system 600 to be configured to play back a next media item in the list of media items), an additional option control 910e (e.g., which when activated causes computer system 600 to display one or more additional controls not currently displayed in FIG. 9A), a queue control 910f (e.g., which when activated causes computer system 600 to display a list of queued media items), and a connect control 910g (e.g., which when activated causes computer system 600 to display a user interface for connecting computer system 600 to one or more external systems). At fig. 9A, the computer system 600 detects a double grasping gesture 950a.
At fig. 9B, in response to detecting the double grasping gesture 950a, the computer system 600 begins operating in the hand gesture navigation mode (e.g., using one or more techniques as described above with respect to fig. 7D-7E). As shown in fig. 9B, in response to detecting the double grasping gesture 950a, the computer system 600 displays a focus indicator around the forward control 910d (e.g., using one or more techniques as discussed above with respect to fig. 7D). In some embodiments, the computer system 600 displays a focus indicator around a control on the media user interface 910 that is different from and/or not the forward control 910d (e.g., controls 910a-910c or 910e-910g) in response to detecting the double grasping gesture 950a.
At fig. 9B, the computer system 600 detects a double pinch gesture 950b. As shown in fig. 9C, in response to detecting the double pinch gesture 950b, the computer system 600 moves the focus indicator to the left such that the focus indicator is displayed around the pause control 910c and not around the forward control 910d. At fig. 9C, in response to detecting the double pinch gesture 950b, the computer system 600 moves the focus indicator from the forward control 910d to the control that the computer system 600 has determined to be the previous control on the media user interface 910. In some implementations, the computer system 600 determines the previous control by identifying a control on the media user interface 910 when traveling from the position of the forward control 910d (e.g., the control previously surrounded by the focus indicator) toward a starting row of the media user interface 910 (e.g., a position on a row near and/or at the top of the media user interface 910 and/or a position on a row having selectable user interface objects (e.g., controls) furthest from the left/right side of the media user interface 910 and near the top). In some embodiments, at fig. 9C, the pause control 910c is determined to be the previous control to the forward control 910d because the pause control 910c is adjacent to the forward control 910d (e.g., next to it, with no other control between the two controls) and is closer to the starting row of the media user interface 910 than the forward control 910d (or, in some embodiments, than any of the other adjacent controls (e.g., queue control 910f, additional option control 910e)). At fig. 9C, the computer system 600 detects a grasping gesture 950c.
At fig. 9D, in response to detecting the grasping gesture 950c, the computer system 600 activates the pause control 910c and pauses playback of the media (e.g., "WHAT MATTERS" by a "popular singer"). As shown in fig. 9D, in response to detecting the grasping gesture 950c, the pause control 910c is replaced with the play control 910h (e.g., which, when activated, initiates playback of the media item). In response to detecting the grasping gesture 950c, the computer system 600 maintains the focus indicator in the same position on the media user interface 910, as shown in fig. 9D. Thus, in some embodiments, the computer system 600 does not move the focus indicator when the grasping gesture 950c is detected and/or when a control is activated in response to detecting the hand gesture, regardless of whether the control is replaced with another control. At fig. 9E, computer system 600 detects a pinch gesture 950e.
As shown in fig. 9F, in response to detecting the pinch gesture 950e, the computer system 600 moves the focus indicator to the right such that the focus indicator is displayed around the forward control 910d and not around the play control 910h. At fig. 9F, in response to detecting the pinch gesture 950e, the computer system 600 moves the focus indicator from the play control 910h to the control that the computer system 600 has determined to be the next control on the media user interface 910. In some implementations, the computer system 600 determines the next control by identifying a control on the media user interface 910 when traveling from the position of the play control 910h (e.g., the control previously surrounded by the focus indicator) toward an ending row of the media user interface 910 (e.g., a position on a row near and/or at the bottom of the media user interface 910 and/or a position on a row having selectable user interface objects (e.g., controls) furthest from the left/right side of the media user interface 910 and near the bottom). In some embodiments, at fig. 9F, the forward control 910d is determined to be the next control to the play control 910h because the forward control 910d is adjacent to the play control 910h (e.g., next to it, with no other control between the two controls) and is closer to the ending row of the media user interface 910 than the play control 910h.
Notably, the double pinch gesture 950b of fig. 9B includes multiple instances of the pinch gesture 950e of fig. 9E. Further, computer system 600 performs an operation (e.g., navigating to a previous control) in response to detecting double pinch gesture 950b that is opposite to the operation performed by computer system 600 in response to detecting pinch gesture 950e (e.g., navigating to a next control). Thus, in some embodiments, computer system 600 performs opposite operations in response to detecting gestures that include multiple instances of each other. In some implementations, opposite gestures help a user to more easily navigate a user interface due to the relatedness of the different gestures (e.g., the similarity between the different gestures). Returning to fig. 9F, computer system 600 detects a pinch gesture 950f.
In response to detecting the pinch gesture 950f, the computer system 600 moves the focus indicator downward such that the focus indicator is displayed around the additional option control 910e and not around the forward control 910d, as shown in fig. 9G. Here, computer system 600 moves the focus indicator downward because additional option control 910e is the next control. At fig. 9G, computer system 600 does not display a focus indicator around connection control 910g because connection control 910g is not adjacent to forward control 910d (e.g., so connection control 910g is not determined to be the next control). In some implementations, the computer system 600 moves focus around the user interface in a pattern based on a particular layout of controls on a particular user interface. Referring back to fig. 9A, in some embodiments, computer system 600 moves the focus indicator around media user interface 910 in the order of identification of the controls (910a, 910b, 910c, 910d, 910e, 910f, and 910g), sequentially (or in reverse). In some implementations, when the focus indicator is around the last control (e.g., 910g), the computer system 600 moves the focus indicator to surround the first control (e.g., 910a) in response to detecting a pinch gesture (or a gesture to move to the next control). In some implementations, when the focus indicator is around the first control (e.g., 910a), the computer system 600 moves the focus indicator to surround the last control (e.g., 910g) in response to detecting a double pinch gesture (or a gesture to move to the previous control). Returning to fig. 9G, the computer system 600 detects the grasping gesture 950g while the focus indicator is around the additional option control 910e.
In response to detecting the grasping gesture 950g, the computer system 600 displays the additional options user interface 920 and stops displaying the media user interface 910, as shown in fig. 9H. As shown in fig. 9H, in response to detecting the grasping gesture 950g, the computer system 600 displays a focus indicator around a control of the additional options user interface 920 (e.g., the control at a position on the starting row). In some implementations, in response to detecting one or more hand gestures (e.g., pinch and/or double pinch gestures) that move the focus indicator around the additional options user interface 920, the computer system 600 moves the focus indicator around the different controls included in the additional options user interface 920. In some embodiments, computer system 600 moves the focus indicator around the different controls included in additional options user interface 920 in a different pattern (e.g., a vertical pattern) than the pattern (e.g., a serpentine pattern) in which computer system 600 moves the focus indicator around the controls included in media user interface 910 (e.g., as described above with respect to fig. 9A-9G).
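The wrapping focus traversal described above can be sketched as follows, purely as an illustration; the control list and names stand in for controls 910a-910g and are assumptions.

```swift
struct FocusRing {
    let controls = ["910a", "910b", "910c", "910d", "910e", "910f", "910g"]
    var focusedIndex = 3                     // e.g., the forward control 910d

    /// Pinch gesture: focus the next control, wrapping from the last control to the first.
    mutating func focusNext() {
        focusedIndex = (focusedIndex + 1) % controls.count
    }

    /// Double pinch gesture: focus the previous control, wrapping from the first control to the last.
    mutating func focusPrevious() {
        focusedIndex = (focusedIndex - 1 + controls.count) % controls.count
    }
}
```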
Fig. 10A-10F illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
In particular, fig. 10A-10F illustrate exemplary scenarios in which computer system 600 navigates through a user interface in response to detecting a hand gesture (e.g., using similar techniques as described above with respect to fig. 7A-7G). FIG. 10A illustrates the computer system 600 displaying a media user interface 1010. The media user interface 1010 includes application icons 1010a-1010d (e.g., which, when activated, each cause the computer system 600 to display a user interface for the respective application icon that was activated). At fig. 10A, computer system 600 is operating in a hand gesture navigation mode (e.g., as discussed above with respect to fig. 7A-7G and 9A-9H). In some embodiments, computer system 600 navigates through media user interface 1010 using one or more of the techniques described above with respect to fig. 7A-7G and 9A-9H. At fig. 10B, the computer system 600 detects a double grasping gesture 1050b.
As shown in fig. 10C, in response to detecting the double grasping gesture 1050b, the computer system 600 displays a menu 732 (e.g., as discussed above with respect to fig. 7D) including controls 732a-732d. Menu 732 includes a control identifier 708 ("digital crown") that identifies the control currently in focus (e.g., indicated by a black box around digital hardware operation control 732a in fig. 10C). As shown in fig. 10C, the computer system 600 displays a focus indicator around the digital hardware operation control 732a and the application icon 1010c because the application icon 1010c was selected before the double grasping gesture 1050b was detected and/or because the computer system 600 may perform an operation at the location of the application icon 1010c (e.g., activation of the application icon 1010c, as discussed above with respect to fig. 8J). At fig. 10C, the computer system 600 detects a double grasping gesture 1050c.
As shown in fig. 10D, in response to detecting the double grasping gesture 1050c, the computer system 600 stops displaying the menu 732 (e.g., while the computer system 600 continues to operate in the hand gesture navigation mode). At fig. 10D, computer system 600 detects a pinch gesture 1050d. As shown in fig. 10E, in response to detecting the pinch gesture 1050d, the computer system 600 moves the focus indicator downward and to the left such that the focus indicator is displayed around the application icon 1010d and not around the application icon 1010c. The computer system 600 displays a focus indicator around the application icon 1010d because the application icon 1010d is determined to be the next control (e.g., in response to detecting the pinch gesture 1050d). In some embodiments, this determination is made using one or more of the techniques discussed above with respect to fig. 9A-9H. At fig. 10E, the computer system 600 detects the grasping gesture 1050e. At fig. 10F, in response to detecting the grasping gesture 1050e, the computer system 600 activates the application icon 1010d and displays an application (e.g., a breathing application) corresponding to the application icon 1010d (e.g., a breathing application icon).
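The scenario in fig. 10A-10F pairs each of the four hand gestures with one navigation action. As a hypothetical sketch only (the enum and function names are assumptions, not the patent's or Apple's API), that mapping could be expressed as a simple dispatch:

    // Illustrative mapping of the gestures exercised in FIGS. 10A-10F.
    enum HandGesture { case pinch, doublePinch, grasp, doubleGrasp }

    enum NavigationAction {
        case moveFocusToNextControl
        case moveFocusToPreviousControl
        case activateFocusedControl
        case toggleActionMenu          // show or dismiss menu 732
    }

    func action(for gesture: HandGesture) -> NavigationAction {
        switch gesture {
        case .pinch:       return .moveFocusToNextControl      // FIG. 10D-10E
        case .doublePinch: return .moveFocusToPreviousControl
        case .grasp:       return .activateFocusedControl      // FIG. 10E-10F
        case .doubleGrasp: return .toggleActionMenu            // FIG. 10B-10D
        }
    }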
Fig. 11A-11H illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in figs. 15-16. In particular, fig. 11A-11H illustrate exemplary scenarios in which computer system 600 automatically scrolls through a user interface and performs different operations in response to detecting one or more hand gestures. In some embodiments, computer system 600 automatically scrolls through the user interface using one or more techniques as described with respect to fig. 11A-11H when the computer system operates in a digital hardware mode of operation (e.g., a hand gesture navigation mode). In some embodiments, computer system 600 begins operating in the digital hardware mode of operation in response to activation of digital hardware operation control 732a. In some embodiments, computer system 600 automatically scrolls through the user interface and/or performs operations similar to the operations performed in response to computer system 600 detecting an input (e.g., a rotational input) on an input mechanism. In some embodiments, computer system 600 automatically scrolls through the user interface while the computer system is operating in an auto-scroll mode (e.g., as discussed below with respect to 1412).
Fig. 11A illustrates computer system 600 displaying a text message user interface 1110. The text message user interface 1110 includes a text message area 1120 and one or more controls, including an animated image control 1110a. As shown in fig. 11A, text message area 1120 includes text message 1120a ("Hey, where are you?"). As shown in fig. 11A, computer system 600 is displaying a focus indicator around animated image control 1110a, which is displayed at the bottom of text message user interface 1110. At fig. 11A, the computer system 600 detects a double grasping gesture 1150a.
As shown in fig. 11B, in response to detecting the double grasping gesture 1150a, the computer system 600 displays a menu 732 including digital hardware operation control 732a. At fig. 11B, computer system 600 displays menu 732 at the top of text message user interface 1110. In some embodiments, computer system 600 displays menu 732 at the top of text message user interface 1110 because the focus indicator is displayed around animated image control 1110a and animated image control 1110a is displayed at the bottom of text message user interface 1110. In some implementations, the computer system 600 displays the menu 732 at the bottom (or in another area of the user interface) in response to determining that the selected control (e.g., the control having a focus indicator around it) is not displayed at the bottom of the text message user interface 1110. At fig. 11B, the computer system 600 detects the double grasping gesture 1150b while the focus indicator is displayed around the digital hardware operation control 732a.
As shown in fig. 11C, in response to detecting the double grasping gesture 1150b, the computer system 600 begins operating in a digital hardware mode of operation (e.g., and/or an automatic scroll mode). When operating in the digital hardware mode of operation, the computer system 600 performs operations as if one or more inputs (e.g., a rotational input, a press input, a slide input) were received at an input mechanism of the computer system 600. At some time after the user interface of fig. 11C is displayed, computer system 600 performs an operation consistent with the detection of one or more inputs on input mechanism 602a without detecting an input and/or a hand gesture (e.g., hand 622 is in a neutral position). As shown in fig. 11D, when performing the operation, computer system 600 moves the focus indicator from around animated image control 1110a to around language control 1110b (e.g., without an input being detected on input mechanism 602a), which is an operation consistent with an input being detected on input mechanism 602a. At some time after the user interface of fig. 11D is displayed, computer system 600 performs another operation consistent with detecting one or more inputs on input mechanism 602a without detecting an input and/or a hand gesture (e.g., hand 622 is in a neutral position). As shown in fig. 11E, when performing the operation, computer system 600 scrolls text message user interface 1110 and moves the focus indicator from around language control 1110b to around reply control 1110c. At some time after the user interface of fig. 11E is displayed, computer system 600 performs another operation consistent with detecting one or more inputs on input mechanism 602a without detecting an input and/or a hand gesture (e.g., hand 622 is in a neutral position); as part of this operation, computer system 600 scrolls text message user interface 1110 (e.g., displays a new control) and moves the focus indicator from around reply control 1110c to around reply control 1110e. At fig. 11F, the computer system 600 detects the pinch gesture 1150f while the focus indicator is displayed around the reply control 1110e. As shown in fig. 11G, in response to detecting the pinch gesture 1150f, the computer system 600 ceases to automatically perform operations consistent with one or more inputs detected on the input mechanism 602a (e.g., and/or pauses scrolling). In some embodiments, in response to detecting an additional grip gesture, computer system 600 resumes performing operations consistent with the one or more inputs detected on input mechanism 602a.
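The digital hardware mode described above amounts to a periodically fired, input-mechanism-equivalent operation that a pinch can pause. The following is a minimal sketch with assumed names and behavior, not taken from the patent text:

    // Illustrative model: a timer-driven mode that performs the same operation
    // the system would perform for input on the input mechanism (advancing
    // focus / scrolling) until a pinch pauses it.
    final class DigitalHardwareMode {
        private(set) var isPaused = false
        var performCrownEquivalentOperation: () -> Void = { print("advance focus / scroll") }

        func tick() {                      // called on a timer while the mode is active
            guard !isPaused else { return }
            performCrownEquivalentOperation()
        }

        func handlePinch()  { isPaused = true }    // as in FIG. 11G: pinch pauses auto operations
        func handleResume() { isPaused = false }   // a further gesture could resume them
    }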
At fig. 11G, the computer system 600 detects the grasping gesture 1150g while a focus indicator is displayed around the reply control 1110e. As shown in fig. 11G, in response to detecting the grasping gesture 1150g, the reply control 1110e is activated. At fig. 11G, in response to detecting the grasping gesture 1150g, the computer system 600 inserts a reply message 1120b ("On my way") corresponding to the reply control 1110e in the text message region 1120. In some embodiments, computer system 600 sends reply message 1120b to one or more external computer systems that are part of the text message session.
Fig. 12A-12J illustrate an exemplary user interface for navigating the user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
In particular, fig. 12A-12J illustrate exemplary scenarios in which computer system 600 automatically scrolls through a user interface and performs different operations in response to detecting one or more hand gestures. In some embodiments, computer system 600 automatically scrolls through the user interface using one or more techniques as described with respect to fig. 12A-12J when the computer system operates in a digital hardware mode of operation (e.g., a hand gesture navigation mode). In some embodiments, computer system 600 begins operating in the digital hardware mode of operation in response to activation of digital hardware operation control 732a. In some embodiments, computer system 600 automatically scrolls through the user interface and/or performs operations similar to the operations performed in response to computer system 600 detecting an input (e.g., a rotational input) on an input mechanism. In some embodiments, computer system 600 automatically scrolls through the user interface while the computer system is operating in an auto-scroll mode (e.g., as discussed below with respect to 1412).
FIG. 12A illustrates computer system 600 displaying a clock face user interface 1210. The clock face user interface 1210 includes selectable controls 1210a-1210e. At FIG. 12A, computer system 600 is operating in an auto-scroll mode. As shown in fig. 12A-12C, computer system 600 automatically moves a focus indicator between selectable controls 1210a-1210c (e.g., at a first speed) without detecting an input (e.g., as indicated by hand 622 being in a neutral position). At fig. 12C, computer system 600 detects pinch gesture 1250c while a focus indicator is displayed around selectable control 1210c. As shown in fig. 12D, in response to detecting the pinch gesture 1250c, the computer system 600 ceases automatically moving the focus indicator between the selectable controls and continues to display the focus indicator around the selectable control 1210c (e.g., while hand 622 remains in the neutral position). At fig. 12E, computer system 600 detects pinch gesture 1250e while the focus indicator is displayed around selectable control 1210c. As shown in fig. 12F-12G, in response to detecting the pinch gesture 1250e, the computer system 600 resumes moving the focus indicator (e.g., around the selectable control 1210d and then around the selectable control 1210e).
At fig. 12H, computer system 600 moves the focus indicator to the left such that the focus indicator is displayed around selectable control 1210d and not around selectable control 1210e. At fig. 12H, computer system 600 moves the focus indicator from selectable control 1210e to selectable control 1210d because selectable control 1210e was determined to be the last selectable control on clock face user interface 1210. Thus, at FIG. 12H, computer system 600 reverses the direction of movement of the focus indicator around the selectable user interface objects of clock face user interface 1210. In some embodiments, computer system 600 displays a focus indicator around selectable control 1210c when a predetermined amount of time has elapsed. At fig. 12H, computer system 600 detects a double pinch gesture 1250h. At fig. 12I, in response to detecting the double pinch gesture 1250h, the computer system 600 reverses the direction of movement of the focus indicator around the selectable user interface objects of the clock face user interface 1210. As shown in fig. 12I, in response to detecting the double pinch gesture 1250h, computer system 600 displays a focus indicator around selectable control 1210e (e.g., instead of displaying a focus indicator around selectable control 1210c, around which computer system 600 would have displayed the focus indicator if double pinch gesture 1250h had not been detected). At fig. 12I, the computer system 600 detects the grasping gesture 1250i while displaying a focus indicator around the selectable control 1210e. As shown in fig. 12J, in response to detecting the grasping gesture 1250i, the computer system 600 activates the selectable control 1210e. In response to detecting the grasping gesture 1250i, the computer system 600 displays a calendar application user interface 1220 that corresponds to the selectable control 1210e, as shown in fig. 12J.
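An assumed-structure sketch of this auto-scroll behavior (illustrative only, not taken from the patent): focus advances on a fixed cadence, a pinch pauses or resumes the advance, a double pinch reverses direction, and hitting either end of the control sequence also reverses direction instead of stopping.

    // Illustrative auto-scroller with pause/resume and direction reversal.
    struct AutoScroller {
        let controls: [String]       // e.g. ["1210a", "1210b", "1210c", "1210d", "1210e"]
        var index = 0
        var step = 1                 // +1 forward, -1 reverse
        var isPaused = false

        mutating func tick() {       // called once per scroll interval
            guard controls.count > 1, !isPaused else { return }
            let next = index + step
            if next < 0 || next >= controls.count {
                step = -step         // end of sequence: reverse, as in FIG. 12H
            }
            index += step
        }

        mutating func pinch()       { isPaused.toggle() }  // pause / resume (FIGS. 12C-12F)
        mutating func doublePinch() { step = -step }       // reverse direction (FIGS. 12H-12I)
    }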
Fig. 13A-13G illustrate exemplary user interfaces for navigating a user interface using hand gestures, according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 15-16.
In particular, fig. 13A-13G illustrate one or more example settings for controlling one or more hand gesture navigation modes (e.g., as described above with respect to fig. 7A-7 AA, 8A-8J, 9A-9H, 10A-10F, 11A-11H, and 12A-12J). FIG. 13A illustrates computer system 600 displaying one or more settings including accessibility setting controls 1312a, 1312b, and 1312 c. In some embodiments, in response to detecting an input on accessibility setting control 1312a, computer system 600 displays one or more of accessibility settings 1322a-1322f (e.g., as shown in FIG. 13B).
As shown in FIG. 13B, the accessibility settings of FIG. 13B include a hand gesture navigation control 1322h. In some implementations, when the hand gesture navigation control 1322h is off (e.g., inactive), the computer system 600 does not perform an operation in response to detecting a hand gesture. In some implementations, when the hand gesture navigation control 1322h is on (e.g., active), the computer system 600 performs an operation in response to detecting a hand gesture. In some implementations, in response to detecting a gesture directed to the hand gesture navigation control 1322h, the computer system 600 displays the user interface of fig. 13C.
As shown in FIG. 13C, computer system 600 displays hand gesture navigation controls 1314a-1314h. The hand gesture navigation controls include a main setting control 1314a, a hand gesture control 1314b, a mobile cursor control 1314c, a movement style control 1314d, one or more appearance setting controls 1314e-1314g, and a hand gesture confirmation control 1314h. In response to detecting input for the main setting control 1314a, the computer system 600 turns the hand gesture navigation mode on/off (e.g., as described above with respect to the hand gesture navigation control 1322h). In response to detecting input for the hand gesture control 1314b, the computer system 600 toggles the hand gesture control 1314b on/off. When the hand gesture control 1314b is on, the computer system 600 is configured to perform one or more operations in response to detecting a hand gesture. When the hand gesture control 1314b is off, the computer system 600 is not configured to perform one or more operations in response to detecting a hand gesture. In response to detecting input for the mobile cursor control 1314c, the computer system 600 toggles the mobile cursor control on/off. When the mobile cursor control 1314c is on, the computer system 600 may be configured to operate in a mobile cursor mode (e.g., as described above with respect to fig. 7I-7U and 8A-8J). When the mobile cursor control 1314c is off, the computer system 600 cannot be configured to operate in the mobile cursor mode (e.g., in response to a shake gesture and/or in response to detecting an input on the mobile cursor control 732b of menu 732 in fig. 7N). In response to detecting an input for the movement style control 1314d, the computer system 600 switches the movement style control 1314d between different movement options. The different movement options include one or more of automatic movement and manual movement (e.g., of a focus indicator on a displayed user interface). In response to detecting input for one or more of the appearance setting controls 1314e-1314g, computer system 600 changes one or more aspects of the appearance (e.g., a color) of one or more displayed user interface objects, such as menu 732 (e.g., as described with respect to fig. 7N), cursor 742 (e.g., as described with respect to fig. 7A-7AA), and indication 744 (e.g., as described with respect to fig. 7A-7AA). In response to detecting input for the hand gesture confirmation control 1314h, the computer system 600 toggles the hand gesture confirmation control 1314h on/off. When the hand gesture confirmation control 1314h is on, the computer system 600 is configured to confirm one or more operations (e.g., payment transactions) in response to detecting one or more hand gestures. When the hand gesture confirmation control 1314h is off, the computer system 600 is not configured to confirm one or more operations in response to detecting one or more hand gestures.
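As a purely illustrative model (the property names are assumptions layered on the reference numbers above, not an actual API), the toggles described for FIG. 13C could be represented as follows:

    // Illustrative settings model for the controls in FIG. 13C.
    struct HandGestureNavigationSettings {
        var navigationModeEnabled = false        // main setting 1314a (see also 1322h)
        var handGesturesEnabled = false          // 1314b: perform operations on detected gestures
        var mobileCursorAvailable = false        // 1314c: allow the mobile cursor mode
        var movementStyle: MovementStyle = .manual   // 1314d: automatic vs. manual movement
        var appearanceColor = "default"          // appearance settings 1314e-1314g
        var gestureConfirmationEnabled = false   // 1314h: confirm e.g. payment transactions

        enum MovementStyle { case automatic, manual }
    }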
As shown in fig. 13D, computer system 600 displays one or more setting controls (e.g., setting controls 1316a-1316e) for changing one or more operations performed by computer system 600 in response to detecting a corresponding hand gesture. In some embodiments, in response to detecting one or more gestures with respect to learning control 1316e, computer system 600 displays the user interface of fig. 13E.
As shown in fig. 13E, computer system 600 displays user interface 1318. The user interface 1318 may include one or more instructions (e.g., with a graphical representation) that indicate how the user may perform one or more gestures. In some embodiments, while user interface 1318 is displayed, computer system 600 provides feedback as to whether the user performed a gesture correctly and/or incorrectly during a learning mode session. In some implementations, at fig. 13C, computer system 600 displays the user interface of fig. 13F in response to detecting a gesture directed to the mobile cursor control 1314c.
As shown in fig. 13F, computer system 600 displays one or more controls for controlling one or more aspects of the mobile cursor mode, such as sensitivity (e.g., 1322b) (e.g., the manner and/or amount by which the cursor moves in response to detecting movement (e.g., tilting) of computer system 600), activity time (e.g., 1322c) (e.g., the predetermined amount of time the cursor must be over a user interface object in order to perform an operation and/or the amount of time indicated by an indication (e.g., indication 744 of fig. 7N)), movement tolerance (e.g., 1322d) (e.g., the amount of movement that needs to occur before indication 744 is reset and/or animated to cease to be displayed after the cursor has been moved away from a position corresponding to the user interface object), and dwell control (e.g., 1322e) (e.g., the predetermined amount of time the cursor must be over the user interface object in order to perform an operation and/or whether the computer system can begin operating in the mobile cursor mode via one or more shake inputs). At fig. 13G, computer system 600 displays one or more settings (e.g., 1324a-1324c) for changing the color of the cursor.
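A minimal sketch of how these options could interact, with assumed names and default values (nothing below is from the patent): a dwell check fires the focused operation once the cursor has stayed within the movement tolerance for the configured activity time.

    import Foundation

    // Illustrative mobile-cursor options corresponding loosely to 1322b-1322e.
    struct MobileCursorSettings {
        var sensitivity: Double = 1.0          // how far the cursor moves per unit of tilt
        var activityTime: TimeInterval = 1.5   // dwell time before the operation fires
        var movementTolerance: Double = 8.0    // points of drift allowed without resetting
        var dwellControlEnabled = true         // e.g. allow entering the mode via shake input
    }

    // Returns true when the cursor has dwelled long enough, without drifting
    // farther than the tolerance, to trigger the focused operation.
    func shouldActivate(dwellDuration: TimeInterval,
                        drift: Double,
                        settings: MobileCursorSettings) -> Bool {
        drift <= settings.movementTolerance && dwellDuration >= settings.activityTime
    }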
Fig. 14 illustrates a plurality of menu controls (e.g., as described above with respect to fig. 7N) that may be displayed on menu 732. Menu 732 may include a hierarchy of controls, such as the control hierarchy described with respect to fig. 14. In some implementations, in response to detecting input for interaction control 1410, computer system 600 displays one or more of tap gesture control 1410a (e.g., as described above with respect to 832a), mobile cursor control 1410b (e.g., as described above with respect to mobile cursor control 732b), and digital hardware operation control 1410c (e.g., as described above with respect to digital hardware operation control 732a). In some embodiments, in response to detecting input for system control 1414, computer system 600 displays one or more of icon grid 1414a (e.g., which, when activated, displays a grid comprising a plurality of application icons), control center 1414b (e.g., which, when activated, displays one or more device controls, such as a Wi-Fi control for switching Wi-Fi settings, a Bluetooth control for switching Bluetooth settings, and a mute control for switching sound output on/off), settings 1414c (e.g., which, when activated, displays one or more settings for configuring the computer system, as described above with respect to fig. 13A-13G), and base control 1414d (e.g., which, when activated, displays one or more controls for navigating to one or more applications (e.g., applications that are open and/or running in the background)). In some embodiments, in response to detecting input for the additional options control 1416 (e.g., 732d), the computer system 600 displays a wallet control 1416a (e.g., which, when activated, causes the computer system 600 to display one or more user interface objects for initiating payment transactions), a side key control 1416b (e.g., which, when activated, causes the computer system 600 to perform one or more operations consistent with the input mechanism 602b being pressed, such as turning off the computer system 600), and a gesture mode control 1416c (e.g., which, when activated, causes the computer system 600 to begin operating in a gesture mode). In some embodiments, in response to detecting an input for the auto-scroll control 1412, the computer system 600 begins operating in an auto-scroll mode (e.g., as described above with respect to fig. 11A-11H and 12A-12J). In some embodiments, in response to detecting an input for exit control 1418, computer system 600 stops displaying menu 732 (e.g., as described above with respect to fig. 7N). In some embodiments, menu 732 includes one or more controls not shown in fig. 14. In some embodiments, the controls of menu 732 are in a different hierarchical structure than that depicted in fig. 14.
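The hierarchy summarized in fig. 14 can be pictured as a small tree. The following is an illustrative data-structure sketch only; the tree type and labels are assumptions layered on the reference numbers above, not Apple's data model.

    // Illustrative tree of menu 732 as summarized in FIG. 14.
    indirect enum MenuItem {
        case control(name: String)
        case submenu(name: String, children: [MenuItem])
    }

    let menu732: MenuItem = .submenu(name: "Menu 732", children: [
        .submenu(name: "Interaction 1410", children: [
            .control(name: "Tap gesture 1410a"),
            .control(name: "Mobile cursor 1410b"),
            .control(name: "Digital hardware operation 1410c"),
        ]),
        .control(name: "Auto-scroll 1412"),
        .submenu(name: "System 1414", children: [
            .control(name: "Icon grid 1414a"),
            .control(name: "Control center 1414b"),
            .control(name: "Settings 1414c"),
            .control(name: "Base control 1414d"),
        ]),
        .submenu(name: "Additional options 1416", children: [
            .control(name: "Wallet 1416a"),
            .control(name: "Side key 1416b"),
            .control(name: "Gesture mode 1416c"),
        ]),
        .control(name: "Exit 1418"),
    ])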
Fig. 15 is a flow chart illustrating method 1500 for navigating a user interface using hand gestures with a computer system, according to some embodiments. The method 1500 is performed at a computer system (e.g., 100, 300, 500) (e.g., a wearable device (e.g., a smart watch)) that is in communication with a display generation component (e.g., a display controller, a touch-sensitive display system) and an optical sensor (e.g., a heart rate sensor). In some embodiments, the computer system communicates with one or more sensors (e.g., one or more biometric sensors (e.g., optical sensors (e.g., a heart rate sensor) and/or cameras), gyroscopes, and/or accelerometers). In some implementations, the computer system communicates with one or more input devices (e.g., a touch-sensitive surface, a microphone). In some embodiments, the computer system communicates with one or more output devices (e.g., one or more speakers, one or more microphones). Some operations in method 1500 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1500 provides an intuitive way of navigating a user interface using hand gestures. The method reduces the cognitive burden on a user who navigates the user interface using hand gestures, thereby creating a more efficient human-machine interface. For battery-powered computing devices, enabling a user to navigate the user interface faster and more efficiently conserves power and increases the time between battery charges.
The computer system displays (1502), via the display generating component, a user interface including a first user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), a second user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), a third user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), and an indication that the first user interface object is selected (e.g., a focus indicator, as discussed around 712a-712b, 910a-910h, or 1010a-1010d) (e.g., a visual indication (e.g., highlighting (e.g., a boundary of) the first user interface object, displaying text of the first user interface object, and/or magnifying the first user interface object)) (e.g., without indicating that the second user interface object and the third user interface object are selected). In some embodiments, the first user interface object, the second user interface object, and the third user interface object are different from one another. In some embodiments, the indication that the first user interface object is selected is displayed around the first user interface object.
In some embodiments, while displaying a user interface including the first user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), the second user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), the third user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d), and the indication that the first user interface object is selected (e.g., a focus indicator, as discussed around 712a-712b, 910a-910h, or 1010a-1010d) (e.g., and while the computer system is in a first mode of operation (e.g., a first accessibility mode)), the computer system detects (1504), at least via (e.g., using) the optical sensor (e.g., and/or one or more other sensors, such as gyroscopes and/or accelerometers), a hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) (e.g., a first hand gesture) (e.g., a hand gesture that does not contact the device (e.g., that is not directed to an object on a user interface of the device and/or any device in communication with the device)) (e.g., a hand gesture that is not detected by one or more cameras of the device) (e.g., a hand gesture that occurs when a user's wrist is located at a single location and/or is not moving, a hand gesture that is detected while the computer system is worn on a wrist (e.g., the user's wrist)).
In some embodiments, in response to (1506) detecting the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) via at least the optical sensor (and while a user interface including the first user interface object, the second user interface object, and the third user interface object is displayed) (e.g., and while the computer system is operating in the first mode of operation), and in accordance with a determination that the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) is a first type of gesture (e.g., 610, 620, 630, 640, 650) (e.g., a type of hand gesture) (e.g., a finger tap/pinch gesture (e.g., a gesture in which two or more fingers touch each other)) (and in some embodiments, the first type of gesture is a non-finger tap/pinch gesture (e.g., a fist-making gesture), a multi-clenching gesture (e.g., a gesture including multiple clenching gestures), a finger stretch gesture (e.g., a gesture in which one or more fingers do not touch a portion of the hand), a multi-finger stretch gesture (e.g., a gesture including multiple stretch gestures), and/or any combination thereof), the computer system displays (1508), via the display generating component, an indication that the second user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d) is selected (e.g., a newly selected user interface object) (e.g., a focus indicator, as discussed around 712a-712b, 910a-910h, or 1010a-1010d) (e.g., a visual indication (e.g., highlighting (e.g., a boundary of) the second user interface object, displaying text of the second user interface object, and/or zooming in on the second user interface object)) (e.g., ceasing to display the indication that the first user interface object was selected and/or not displaying an indication that the third user interface object was selected) (e.g., without indicating that the first user interface object and the third user interface object are selected). In some implementations, the indication that the second user interface object is selected was not previously displayed before the indication that the first user interface object is selected was displayed. In some implementations, in accordance with a determination that the hand gesture is a gesture of the first type, the computer system moves the indication from the first user interface object (e.g., from being displayed around or near (e.g., above, below, beside) the first user interface object) (e.g., the indication that the first user interface object was selected) to the second user interface object (e.g., to being displayed around or near (e.g., above, below, beside) the second user interface object) (e.g., an indication that the second user interface object was selected).
In some embodiments, in response to (1506) detecting the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) via at least the optical sensor (and while a user interface including the first user interface object, the second user interface object, and the third user interface object is displayed) (e.g., and while the computer system is operating in the first mode of operation), and in accordance with a determination that the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) is a second type of gesture (e.g., 610, 620, 630, 640, 650) different from the first type of gesture (e.g., a type of hand gesture) (e.g., a multi-finger tap/pinch gesture (e.g., a gesture in which two or more fingers touch each other multiple times)) (and in some embodiments, the second type of gesture is a non-finger tap/pinch gesture (e.g., a gesture that makes a fist), a multi-grasping gesture (e.g., a gesture that includes multiple grasping gestures), a finger stretch gesture (e.g., a gesture in which one or more fingers do not touch a portion of the hand), a multi-finger stretch gesture (e.g., a gesture that includes multiple stretch gestures), and/or any combination thereof), the computer system displays (1510), via the display generating component, an indication that the third user interface object (e.g., 712a-712b, 910a-910h, or 1010a-1010d) is selected (e.g., a previously selected user interface object) (e.g., a focus indicator, as discussed around 712a-712b, 910a-910h, or 1010a-1010d) (e.g., a visual indication (e.g., highlighting (e.g., a boundary of) the third user interface object, displaying text of the third user interface object, and/or zooming in on the third user interface object)) (e.g., without indicating that the first user interface object and the second user interface object are selected) (e.g., ceasing to display the indication that the first user interface object was selected and/or not displaying an indication that the second user interface object was selected). In some implementations, the indication that the third user interface object is selected was previously displayed before the indication that the first user interface object is selected was displayed. In some implementations, in accordance with a determination that the hand gesture is a gesture of the second type, the computer system moves the indication from the first user interface object (e.g., from being displayed around or near (e.g., above, below, beside) the first user interface object) (e.g., the indication that the first user interface object was selected) to the third user interface object (e.g., to being displayed around or near (e.g., above, below, beside) the third user interface object) (e.g., an indication that the third user interface object was selected). In some embodiments, at least two of the indication that the first user interface object is selected, the indication that the second user interface object is selected, and/or the indication that the third user interface object is selected have different sizes. Selecting whether to display an indication that the second user interface object is selected or an indication that the third user interface object is selected based on the type of hand gesture detected provides the user with more control over the system and helps the user navigate the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
In some implementations, the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) is detected based on heart rate data determined using data detected via the optical sensor. In some embodiments, the second type of gesture (e.g., 630 or 650) is a gesture type that is multiple instances of the first type of gesture (e.g., 620 or 640) (e.g., a multi-finger tap/pinch, a multi-finger stretch gesture, and/or a multi-grasping gesture). In some implementations, the second type of gesture includes at least one instance of the first type of gesture. Selecting whether to display an indication that the second user interface object is selected or an indication that the third user interface object is selected based on the type of hand gesture detected (where the second type of gesture is a gesture type that is multiple instances of the first type of gesture) provides the user with more control over the system and helps the user navigate the user interface to perform similar actions differently (e.g., selecting an object to the right versus selecting an object to the left) using similar hand gestures (e.g., one hand gesture including another hand gesture) without touching the computer system, which may result in more efficient control of the user interface for some users.
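One way to picture a "second type" gesture being multiple instances of the "first type" is to group individual pinch events by their timing. The sketch below is an assumption-laden illustration (the 0.4-second pairing window and the event source are invented for the example), not the patent's detection algorithm:

    import Foundation

    // Illustrative grouping of pinch events into single vs. double pinches.
    enum ClassifiedGesture { case pinch, doublePinch }

    func classify(pinchTimestamps: [TimeInterval],
                  pairingWindow: TimeInterval = 0.4) -> [ClassifiedGesture] {
        var result: [ClassifiedGesture] = []
        var i = 0
        while i < pinchTimestamps.count {
            if i + 1 < pinchTimestamps.count,
               pinchTimestamps[i + 1] - pinchTimestamps[i] <= pairingWindow {
                result.append(.doublePinch)   // two pinches close together
                i += 2
            } else {
                result.append(.pinch)         // isolated pinch
                i += 1
            }
        }
        return result
    }

    // Example: classify(pinchTimestamps: [0.0, 0.3, 2.0]) yields [.doublePinch, .pinch]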
In some implementations, in response to detecting a hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) via at least the optical sensor and in accordance with a determination that the hand gesture is a third type of gesture (e.g., 620, 630, 640, or 650) that is different from the first type of gesture and the second type of gesture (e.g., a type of hand gesture), the computer system performs (e.g., via the display generating component) an operation (e.g., an action) corresponding to a selection of the first user interface object (e.g., selecting a play/pause button, selecting an interaction on a menu, selecting a cancel button, selecting an answer/reject button for an incoming phone call) (e.g., without displaying a menu including one or more selectable options). In some implementations, in response to detecting a hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) via at least the optical sensor and in accordance with a determination that the hand gesture is a fourth type of gesture (e.g., 620, 630, 640, or 650) that is different from the first type of gesture, the second type of gesture, and the third type of gesture (e.g., a type of hand gesture), the computer system displays, via the display generation component, a menu (e.g., as described below with respect to method 1600) that includes one or more selectable options (e.g., without performing an operation corresponding to a selection of the first user interface object). Selecting whether to perform an operation corresponding to the selection of the first user interface object or to display a menu including one or more selectable options based on the type of hand gesture detected provides the user with more control over the system and assists the user in navigating the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
In some implementations, prior to detecting the hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e), the computer system (e.g., 600) is in a first mode of operation (e.g., a mode of operation in which the computer system performs the operations described above (e.g., as described above with respect to fig. 7A-7I) in response to detecting one or more of the first type of gesture, the second type of gesture, the third type of gesture, and/or the fourth type of gesture). In some embodiments, as part of performing an operation corresponding to selection of the first user interface object (e.g., 1412 or 736a) (e.g., as described above with respect to fig. 11A-11H and 12A-12J), the computer system transitions the computer system from the first mode of operation to a second mode of operation (e.g., as described above with respect to fig. 7I-7U) (e.g., a mode in which a cursor automatically moves on the display, in which the user interface automatically scrolls in a direction (e.g., up, down, right, left, tilted), and/or in which detection of one or more of the first type of gesture, the second type of gesture, the third type of gesture, and/or the fourth type of gesture causes the computer system to perform operations that are different from those performed when the computer system is operating in the first mode of operation).
In some implementations, when the computer system (e.g., 600) is in the second mode of operation, the computer system detects a second hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) via at least the optical sensor. In some embodiments, in response to detecting the second hand gesture via at least the optical sensor (e.g., when the computer system is in the second mode of operation and not the first mode of operation), and in accordance with a determination that the second hand gesture is a gesture of the first type (e.g., 620, 630, 640, or 650), the computer system switches a first automatic scroll operation (e.g., as discussed above with respect to fig. 12C-12E) between an active state (e.g., a resumed state) and an inactive state (e.g., a paused state) (e.g., without reversing the direction of the automatic scroll operation) (e.g., the automatic scroll operation being an operation in which one or more user interface objects are automatically selected in sequence (e.g., without user input)) (e.g., switching the automatic scroll operation between the active state and the inactive state is a different operation than displaying an indication that the second user interface object was selected). In some implementations, in accordance with a determination that the hand gesture is a gesture of the first type and in accordance with a determination that the scroll operation is in the inactive state (e.g., a paused state, a scroll-off state), the computer system transitions the automatic scroll operation from the inactive state to the active state (e.g., resumes the scroll operation). In some implementations, in accordance with a determination that the hand gesture is a gesture of the second type and in accordance with a determination that the scroll operation is in the active state (e.g., a scroll-on state, a resumed state), the computer system transitions the automatic scroll operation from the active state to the inactive state (e.g., pauses the scroll operation). In some embodiments, in accordance with a determination that the automatic scroll operation is active, the computer system displays a notification indicating that the automatic scroll operation is active. In some embodiments, in accordance with a determination that the automatic scroll operation is inactive, the computer system displays a notification indicating that the automatic scroll operation is inactive and/or ceases to display a notification indicating that the automatic scroll operation is active. In some embodiments, in response to detecting the second hand gesture via at least the optical sensor (e.g., when the computer system is in the second mode of operation and not the first mode of operation) and in accordance with a determination that the second hand gesture is a gesture of the second type, the computer system reverses a first direction of the first automatic scroll operation (e.g., without switching the automatic scroll operation between the active state and the inactive state) (e.g., reversing the first direction of the automatic scroll operation is a different operation than displaying an indication that the third user interface object was selected). In some embodiments, as part of reversing the direction of the automatic scroll operation, the computer system selects user interface objects in an order opposite to the order in which the user interface objects were previously selected and/or reverses the direction in which the user interface scrolls.
Selecting whether to switch the first automatic scrolling operation between the active state and the inactive state or to reverse the first direction of the automatic scrolling operation based on the type of hand gesture detected provides the user with more control over the system and helps the user navigate the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
In some implementations, in response to detecting the second hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050e) at least via the optical sensor: in accordance with a determination that the second hand gesture is a third type of gesture (e.g., 610, 620, 630, 640, or 650), the computer system performs a second operation (e.g., changes a speed of the automatic scroll operation); and in accordance with a determination that the second hand gesture is a fourth type of gesture (e.g., 610, 620, 630, 640, or 650), the computer system performs a third operation that is different from the second operation (e.g., selects the user interface object that is highlighted by the scroll operation at the time the second hand gesture is detected, ending the auto-scroll mode). Selecting whether to perform the second operation or the third operation based on the type of hand gesture detected provides the user with more control over the system and helps the user navigate the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
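Taken together with the preceding paragraphs, the second operating mode maps the four gesture types onto auto-scroll operations. The sketch below is illustrative only; the pairing of the third and fourth types with a speed change and an activate-and-exit operation follows the parenthetical examples given here and is not a definitive mapping.

    // Illustrative dispatch for the second (auto-scroll) mode of operation.
    enum GestureType { case first, second, third, fourth }

    enum SecondModeOperation {
        case toggleAutoScrollActive      // first type: pause / resume scrolling
        case reverseAutoScrollDirection  // second type
        case changeAutoScrollSpeed       // third type (one example of the "second operation")
        case activateFocusedAndExit      // fourth type (one example of the "third operation")
    }

    func secondModeOperation(for type: GestureType) -> SecondModeOperation {
        switch type {
        case .first:  return .toggleAutoScrollActive
        case .second: return .reverseAutoScrollDirection
        case .third:  return .changeAutoScrollSpeed
        case .fourth: return .activateFocusedAndExit
        }
    }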
In some embodiments, when the computer system (e.g., 600) is in the second mode of operation, the computer system performs a second automatic scroll operation (e.g., as discussed above with respect to fig. 12G-12H) that scrolls through a sequence of a plurality of user interface objects in a second direction. In some embodiments, when performing the second automatic scroll operation, the computer system detects an end of the sequence of the plurality of user interface objects (e.g., detects that the scroll position is at or near a last user interface object in the sequence and/or detects a boundary of the user interface) (e.g., as discussed above with respect to fig. 12G-12H). In some embodiments, in response to detecting the end of the sequence of the plurality of user interface objects, the computer system performs a third automatic scroll operation that scrolls through the sequence of the plurality of user interface objects in a third direction different from (e.g., opposite to) the second direction (e.g., as discussed above with respect to fig. 12G). Performing, in response to detecting the end of the sequence of the plurality of user interface objects, a third automatic scroll operation that scrolls through the sequence in a third direction different from the second direction allows the computer system to provide a way for the user to automatically navigate the user interface without requiring the user to provide additional input to restart/reset the automatic navigation when the automatic navigation has cycled through the user interface objects on the user interface.
In some implementations, in response to detecting a second hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050 e) via at least the optical sensor and in accordance with a determination that the second hand gesture is a fourth type of gesture (e.g., 610, 620, 630, 640, or 650), the computer system transitions the computer system from the second mode of operation to a third mode of operation (and/or ends/exits the second mode of operation). In some embodiments, the third mode of operation is the first mode of operation. Transitioning the computer system from the second mode of operation to the third mode of operation in accordance with determining that the second hand gesture is a fourth type of gesture provides the user with more control over the system and helps the user navigate the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
In some implementations, in accordance with a determination that the second hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050 e) is a gesture of the first type (e.g., 610, 620, 630, 640, or 650), the computer system displays a notification (e.g., 716) indicating a status of the automatic scroll operation (e.g., scroll pause, scroll stop). Displaying a notification indicating a state of the automatic scroll operation in accordance with a determination that the second hand gesture is a first type of gesture provides visual feedback to the user in response to detecting that the hand gesture has performed an operation, which may prevent the user from erroneously performing multiple gestures that may cause the computer system to perform unnecessary operations and/or unintended operations.
In some embodiments, the third type of gesture (e.g., 610, 620, 630, 640, or 650) is a gesture type of multiple instances (e.g., multi-finger tap/pinch, multi-finger stretch gesture, and/or multi-grasping gesture) of the fourth type of gesture (e.g., 610, 620, 630, 640, or 650). In some implementations, the third type of gesture includes at least one instance of a fourth type of gesture. In some implementations, the fourth type of gesture (e.g., 610, 620, 630, 640, or 650) does not include multiple instances of the first type of gesture (e.g., 610, 620, 630, 640, or 650) (and/or the second type of gesture), and the third type of gesture (e.g., 610, 620, 630, 640, or 650) does not include multiple instances of the second type of gesture (e.g., 610, 620, 630, 640, or 650) (and/or the first type of gesture) (e.g., as discussed above with respect to fig. 6). In some embodiments, the fourth type of gesture does not include the first type of gesture and/or the second type of gesture. In some embodiments, the third type of gesture does not include the first type of gesture and/or the second type of gesture.
In some embodiments, in response to detecting at least a second hand gesture (e.g., 750e, 750f, 750i, 750l, 750m, 750n, 950a, 950b, 950c, 950e, 950f, 950g, 1050b, 1050c, 1050d, or 1050 e) via the optical sensor and in accordance with determining that a notification (e.g., a notification corresponding to a phone call, email, text message, voicemail message) is received within a threshold period of time and in accordance with determining that the second hand gesture is a fourth type of gesture, the computer system performs an action related to the notification (e.g., answers the phone call, responds/opens a text message and/or email corresponding to the notification, plays a voicemail message) (e.g., as discussed above with respect to fig. 7H). Performing the action related to the notification in accordance with determining that the notification was received within the threshold period of time and in accordance with determining that the second hand gesture is of the fourth type provides the user with more control over the system and helps the user navigate the user interface without touching the computer system (e.g., causes the computer system to perform the action related to the notification), which may result in more efficient control of the user interface for some users.
In some embodiments, the one or more selectable options (e.g., 732a-732d, 832a, or 1410-1418) include a first selectable user interface object (e.g., 732a-732d, 832a, or 1410-1418) for changing the operation that the one or more hand gestures may cause the computer system to perform. In some embodiments, selection of the first selectable user interface object causes the computer system to display a plurality of settings. In some embodiments, each setting controls an operation that the one or more hand gestures may cause the computer system to perform when the one or more hand gestures are detected by the computer system (e.g., as discussed above with respect to fig. 8A-8J). Displaying a menu including a first selectable user interface object for changing what operations the one or more hand gestures may cause the computer system to perform allows the computer system to provide the user with the option to perform the operations without requiring more complex hand gestures, which reduces the number of hand gestures for performing the operations and allows the user to customize the hand gestures that cause the operations to be performed.
In some embodiments, the one or more selectable options (e.g., 732a-732d, 832a, or 1410-1418) include a second selectable option for transitioning the computer system to a fourth mode of operation and a third selectable option for transitioning the computer system to a fifth mode of operation that is different from the fourth mode of operation. In some embodiments, selection of the second selectable option (e.g., 732a-732d, 832a, or 1410-1418) causes the computer system to transition to the fourth mode of operation (e.g., an auto-scroll mode, a cursor mode (e.g., as discussed herein with respect to method 1600)). In some embodiments, selection of the third selectable option (e.g., 732a-732d, 832a, or 1410-1418) causes the computer system to transition to the fifth mode of operation (e.g., an auto-scroll mode, a cursor mode (e.g., as discussed herein with respect to method 1600)). Displaying a menu including the second selectable option for transitioning the computer system to the fourth mode of operation and the third selectable option for transitioning the computer system to a fifth mode of operation that is different from the fourth mode of operation provides the user with more control over the computer system by allowing the user to transition to the different modes of operation (e.g., without providing more complex hand gestures), and also reduces the number of hand gestures used to perform the operation.
In some embodiments, the one or more selectable options (e.g., 732a-732d, 832a, or 1410-1418) include a fourth selectable option (e.g., 1416) for displaying one or more additional selectable options (e.g., an option corresponding to an option to perform an operation corresponding to an operation performed when an input is detected at a location (e.g., a second location) of the selectable user interface object, an option to perform an operation, such as turning on/off the computer system, displaying a different menu and/or user interface, an option corresponding to an operation performed in response to detection of an input via one or more input devices in communication with the computer system. In some embodiments, selection of the fourth selectable option causes the computer system (e.g., 600) to display one or more additional selectable options that were not previously displayed prior to selection of the fourth selectable option (e.g., when selected, each additional selectable option causes the computer system to perform one or more operations (e.g., different operations)). In some embodiments, in response to detecting the fourth selectable option, the computer system stops displaying the one or more selectable options displayed prior to detecting the selection of the fourth selectable option. Displaying a menu including a fourth selectable option for displaying one or more additional options provides the user with more control over the computer system by allowing the user to select additional selectable options that are not previously displayed, reducing the number of hand gestures for performing additional operations, and reducing the number of selectable user interface options displayed when the menu is initially displayed, which clears the clutter of the user interface.
In some embodiments, in accordance with a determination that a respective user interface object (e.g., a selected user interface object, a user interface object surrounded by a focus indicator) is at a first location on the user interface, the menu is displayed at the first location (e.g., as discussed above with respect to fig. 11B); and in accordance with a determination that the respective user interface object is not at the first location on the user interface, the menu is displayed at a second location different from the first location (e.g., as discussed above with respect to fig. 11B). Automatically selecting the location at which to display the menu based on the location at which the corresponding user interface object is displayed allows the computer system to avoid displaying the menu over a user interface object that may be of interest to the user (e.g., the selected user interface object), and also reduces the number of inputs that the user would need to make to move the menu without affecting the display of the user interface object.
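A minimal sketch of the placement rule discussed for fig. 11B, where menu 732 is placed away from the currently focused control so it does not cover it. The names and the two-region simplification are illustrative assumptions, not the patent's layout logic.

    // Illustrative menu placement: keep the menu out of the focused control's region.
    enum ScreenRegion { case top, bottom }

    func menuPlacement(focusedControlRegion: ScreenRegion) -> ScreenRegion {
        focusedControlRegion == .bottom ? .top : .bottom
    }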
In some embodiments, when a third menu (e.g., 732) comprising one or more selectable options (e.g., 732a-732d, 832a, or 1410-1418) is displayed (e.g., a menu comprising one or more selectable options, a second menu comprising one or more selectable options), the computer system detects a third hand gesture via at least the optical sensor (e.g., as discussed above with respect to fig. 10A-10E). In some embodiments, in response to detecting the third hand gesture (e.g., as discussed above with respect to fig. 10A-10E) via at least the optical sensor and in accordance with a determination that the third hand gesture is a fourth type of gesture, the computer system stops displaying the third menu comprising the one or more selectable options. Ceasing to display the menu including one or more selectable options in accordance with a determination that the third hand gesture is a fourth type of gesture provides the user with more control over the system and helps the user navigate the user interface without touching the computer system, which may result in more efficient control of the user interface for some users.
In some embodiments, after displaying the user interface including the first user interface object (e.g., 732a-732d, 832a, or 1410-1418), the computer system detects the fourth hand gesture via at least the optical sensor. In some embodiments, in response to detecting the fourth hand gesture via at least the optical sensor and in accordance with a determination that the fourth hand gesture is a fifth type of gesture (e.g., and in some embodiments, the fifth type of gesture is the same gesture as the fourth type of gesture) that is different from the first type of gesture and the second type of gesture (and/or the first type of gesture), the computer system transitions the computer system from an inactive state (e.g., a sleep state, a dormant state, a reduced power mode state, a state in which the display generating component is less bright than the display generating component in the active state) to an active (e.g., awake state, full power state) state (or transitions the computer system from the active state to the inactive state) (e.g., as discussed above with respect to fig. 14 (e.g., 1416 b)). Transitioning the computer system from the inactive state to the active state in accordance with determining that the fourth hand gesture is a fifth type of gesture provides the user with more control over the system without touching the computer system, which may result in more efficient control of the user interface for some users.
In some embodiments, the computer system detects a fifth hand gesture via at least the optical sensor while displaying, via the display generating component, the indication that the second user interface object is selected. In some implementations, in response to detecting the fifth hand gesture via at least the optical sensor: in accordance with a determination that the fifth hand gesture is a gesture of the first type, the computer system displays, via the display generation component, an indication that a fourth user interface object (e.g., a selected next user interface object) is selected; and in accordance with a determination that the fifth hand gesture is a gesture of the second type, the computer system displays, via the display generating component, an indication that the first user interface object (e.g., the selected previous user interface object) is selected. In some embodiments, upon displaying, via the display generating component, an indication that the third user interface object was selected, the computer system detects, via at least the optical sensor, a respective hand gesture. In some implementations, in response to detecting the respective hand gesture and in accordance with a determination that the respective hand gesture is a gesture of the first type, the computer system displays an indication that a fourth user interface object, different from the first user interface object, the second user interface object, and the third user interface object, is selected. In some embodiments, in response to detecting the respective hand gesture and in accordance with a determination that the respective hand gesture is a gesture of the second type, the computer system displays an indication that the first user interface object is selected. Selecting whether to display the indication that the fourth user interface object is selected or to display the indication that the first user interface object is selected based on the type of hand gesture detected while the second user interface object is selected provides the user with more control over the system and helps the user navigate the user interface without touching the computer system (e.g., consistently navigating the user interface in the same manner), which may result in more efficient control of the user interface for some users.
In some embodiments, upon displaying a user interface comprising a first user interface object, the computer system detects a request to transition the computer system from a first mode of operation to a fourth mode of operation (e.g., as discussed above with respect to figs. 13A-13G and 14) (e.g., a mode of operation in which the computer system does not perform an operation in response to detecting a hand gesture and/or a mode in which the computer system does not detect one or more hand gestures). In some embodiments, in response to detecting the request to transition the computer system from the first mode of operation to the fourth mode of operation, the computer system transitions the computer system from the first mode of operation to the fourth mode of operation (e.g., as described above with respect to figs. 7A-7C). In some embodiments, the computer system detects a sixth hand gesture (e.g., as described above with respect to figs. 7A-7C) via at least the optical sensor when the computer system is in the fourth mode of operation and while the user interface including the first user interface object, the second user interface object, the third user interface object, and the indication that the first user interface object is selected is displayed. In some implementations, in response to detecting the sixth hand gesture (e.g., and while the computer system is in the fourth mode of operation) via at least the optical sensor (e.g., as described above with respect to figs. 7A-7C): in accordance with a determination that the hand gesture is a gesture of the first type, the computer system continues to display the indication that the first user interface object is selected (without displaying an indication that the second user interface object is selected and/or an indication that the third user interface object is selected, and/or without performing an operation via the display generation component) (e.g., as described above with respect to figs. 7A-7C); and in accordance with a determination that the hand gesture is a gesture of the second type, the computer system continues to display the indication that the first user interface object is selected (without displaying an indication that the third user interface object is selected and/or an indication that the second user interface object is selected, and/or without performing an operation via the display generation component) (e.g., as described above with respect to figs. 7A-7C). Continuing to display the indication that the first user interface object is selected in accordance with a determination that the hand gesture is a gesture of the first type or a determination that the hand gesture is a gesture of the second type (e.g., and when the computer system is in the fourth mode of operation) provides the user with more control over the system by allowing the user to control whether the computer system detects hand gestures and/or performs operations in response to detecting hand gestures.
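A hedged sketch of the mode gating described in the preceding paragraph, assuming two illustrative modes (names are hypothetical): when the computer system is in a mode in which hand gestures are ignored, a detected gesture leaves the selection indication unchanged.

```swift
// Mode and gesture names are illustrative, not the patent's terminology.
enum GestureMode { case navigation, gesturesIgnored }
enum Gesture { case firstType, secondType }

// Returns the index of the object whose selection indication should be shown
// after the gesture; in the gestures-ignored ("fourth") mode nothing changes.
func updatedSelection(current: Int, gesture: Gesture,
                      objectCount: Int, mode: GestureMode) -> Int {
    guard mode == .navigation else { return current }  // gesture performs no operation
    switch gesture {
    case .firstType:  return (current + 1) % objectCount
    case .secondType: return (current - 1 + objectCount) % objectCount
    }
}

print(updatedSelection(current: 0, gesture: .firstType, objectCount: 3, mode: .gesturesIgnored)) // 0
print(updatedSelection(current: 0, gesture: .firstType, objectCount: 3, mode: .navigation))      // 1
```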
It is noted that the details of the process described above with respect to method 1500 (e.g., fig. 15) also apply in a similar manner to the methods described herein. For example, method 1500 optionally includes one or more of the features of the various methods described herein with reference to method 1600. For example, method 1500 may be performed when a computer system switches from operating in an operational mode in which the computer system operates using the techniques of method 1600. For the sake of brevity, these details are not repeated hereinafter.
FIG. 16 is a flow chart illustrating a method for navigating a user interface using a computer system, according to some embodiments. Method 1600 is performed at a computer system (e.g., 100, 300, 500) (e.g., a wearable device (e.g., a smart watch)) in communication with a display generation component (e.g., a display controller, a touch sensitive display system). In some embodiments, the computer system communicates with one or more sensors (e.g., one or more biometric sensors (e.g., heart rate sensor, camera), gyroscopes, accelerometers). In some implementations, the computer system communicates with one or more input devices (e.g., touch-sensitive surface, microphone). In some embodiments, the computer system communicates with one or more output devices (e.g., one or more speakers, one or more microphones). Some operations in method 1600 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1600 provides an intuitive way of navigating a user interface using hand gestures. The method reduces the cognitive burden on a user to navigate the user interface using hand gestures, thereby creating a more efficient human-machine interface. For battery-powered computing devices, enabling a user to navigate the user interface faster and more efficiently conserves power and increases the time between battery charges.
The computer system displays (1602) a user interface (e.g., 710, 720, 810, or 780) via a display generation component, the user interface including selectable user interface objects (e.g., 720c, 732c, 810d, or 832 a) and a cursor (e.g., 742) displayed at a first location on the user interface (e.g., a location different from the location of the user interface objects) (e.g., a user interface object, a non-selectable user interface object).
When a selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and a cursor (e.g., 742) are displayed at a first location on a user interface (e.g., 710, 720, 810, or 780), the computer system detects (1604) a request to move the cursor (e.g., 742) from the first location to a second location (e.g., a location different from the first location) on the user interface (e.g., via a hand gesture (e.g., a tilt of a wrist and/or one or more other hand gestures as described above with respect to method 1500)). In some embodiments, the request to move the cursor is detected based on movement of the computer system, movement of one or more devices (e.g., a mouse) associated with the computer system, and/or one or more inputs/gestures (e.g., swipe gestures) detected on a display generating component of the computer system.
In response to (1606) detecting a request to move the cursor from the first position to the second position (e.g., 750p, 850c, 850d or tilting in fig. 7Y or fig. 7Z), the computer system displays (1608) the cursor at the second position (e.g., moves the cursor from the first position to the second position). In some embodiments, the computer system moves the cursor from the first position to the second position as part of displaying the cursor at the second position. In some embodiments, the computer system moves the cursor according to the direction in which the computer system is moving.
In response to (1606) detecting a request to move the cursor from the first position to the second position (e.g., 750p, 850c, 850d or a tilt in fig. 7Y or fig. 7Z), in accordance with a determination that the second position corresponds to a position of the selectable user interface object (e.g., 720c, 732c, 810d or 832 a) (and/or corresponds to an area (e.g., edge) of the display), the computer system displays (1610) an animation (e.g., a visual indication (e.g., an object) that fills, changes color, blinks, counts down/up, increases/decreases in size, and/or fades in/out) that provides a visual indication (e.g., 744) of how long the cursor (e.g., 742) needs to remain at the second position (e.g., a period of time less than or equal to a threshold period of time) before an operation (e.g., an operation that includes selecting the selectable user interface object) is performed. In some embodiments, when the animation is displayed (e.g., for less than a period of time), the computer system detects a second request to move the cursor and, in response to detecting the second request, the computer system stops displaying the animation. In some embodiments, after displaying the animation, the computer system redisplays the cursor. In some embodiments, the cursor is and/or includes a visual indication.
In response to (1606) detecting a request to move the cursor from the first position to the second position (e.g., 750p, 850c, 850d or tilting in fig. 7Y or fig. 7Z), in accordance with a determination that the second position does not correspond to the position of the selectable user interface object (e.g., 720c, 732c, 810d or 832 a) (and/or does not correspond to the region (e.g., edge) of the display), the computer system forgoes (1612) displaying the animation that provides the visual indication (e.g., 744) (and continues displaying the cursor) (e.g., does not perform an action including selecting the selectable user interface object). In some embodiments, the computer system replaces the display of the cursor with the visual indication as part of displaying the animation that provides the visual indication. In some embodiments, when displaying the animation, the computer system detects a request to move the cursor from the second position to a third position, and in response to detecting the request to move the cursor from the second position to the third position, displays the cursor at the third position (e.g., does not display a visual indication and/or replaces the visual indication with the cursor). In some embodiments, the third location is different from the second location and/or the first location.
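The branch at (1610)/(1612) amounts to a hit test: show the dwell animation only when the cursor lands on a selectable object, and otherwise keep showing the plain cursor. A minimal sketch, with simple value types standing in for real geometry and identifiers (all names are illustrative):

```swift
// Simple value types stand in for real geometry and object identifiers.
struct Point { var x: Double; var y: Double }
struct Rect {
    var x: Double, y: Double, width: Double, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= x && p.x <= x + width && p.y >= y && p.y <= y + height
    }
}
struct Target { let id: String; let frame: Rect }

enum CursorFeedback {
    case showDwellIndicator(over: String)  // animate how long the cursor must remain (1610)
    case plainCursorOnly                   // forgo the animation, keep showing the cursor (1612)
}

func feedback(forCursorAt point: Point, over targets: [Target]) -> CursorFeedback {
    if let hit = targets.first(where: { $0.frame.contains(point) }) {
        return .showDwellIndicator(over: hit.id)
    }
    return .plainCursorOnly
}

let icons = [Target(id: "workoutIcon", frame: Rect(x: 0, y: 0, width: 40, height: 40))]
print(feedback(forCursorAt: Point(x: 10, y: 12), over: icons))  // showDwellIndicator(over: "workoutIcon")
```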
In some embodiments, in response to detecting a request to move a cursor (e.g., 742) from a first position to a second position (e.g., 750p, 850c, 850d or a tilt in fig. 7Y or fig. 7Z) and in accordance with a determination that the second position corresponds to a position of a selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and the cursor (e.g., 742) is displayed at the second position for more than a first threshold period of time (e.g., a non-zero period (e.g., 1 second-5 seconds)) (e.g., a period of time longer than or equal to a time that an animation is displayed), the computer system performs an operation (e.g., as discussed with respect to fig. 7I-7U, as discussed above with respect to fig. 8A-8J). In some embodiments, as part of performing the operation, the computer system selects a selectable user interface object (e.g., as discussed above with respect to fig. 8A-8J, as discussed with respect to fig. 7I-7U) (e.g., highlights the selectable user interface object, launches a menu corresponding to the selectable user interface object, launches an application corresponding to the selectable user interface object). Automatically performing operations including selecting the selectable user interface object in accordance with determining that the second position corresponds to a position of the selectable user interface object and that the cursor is displayed at the second position for more than a first threshold period of time allows the computer system to automatically select the user interface object when a prescribed condition is met. Performing operations including selecting the selectable user interface object in accordance with determining that the second position corresponds to a position of the selectable user interface object and that the cursor is displayed at the second position for more than a first threshold period of time provides the user with more control over the computer system by giving the user the ability to select the user interface object when the cursor has been positioned at a certain position (e.g., by the user) for a certain period of time.
In some embodiments, in response to detecting a request to move a cursor (e.g., 742) from a first position to a second position and in accordance with a determination that the second position corresponds to a position of a selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and the cursor (e.g., 742) is displayed at the second position for more than a second threshold period of time (e.g., a non-zero period of time (e.g., 1 second-5 seconds)) (e.g., a period of time longer than or equal to a time that an animation is displayed), the computer system performs an operation (e.g., as discussed with respect to fig. 7I-7U). In some embodiments, performing the operation includes launching an application corresponding to the selectable user interface object (e.g., as discussed with respect to fig. 7I-7U). Automatically performing operations including launching an application corresponding to the selectable user interface object in response to determining that the second location corresponds to the location of the selectable user interface object and that the cursor is displayed at the second location for more than the first threshold period of time allows the computer system to automatically launch the application corresponding to the selectable user interface object when the prescribed condition is met. Performing operations including launching an application corresponding to the selectable user interface object in accordance with determining that the second position corresponds to the position of the selectable user interface object and the cursor is displayed at the second position for more than a first threshold period of time provides the user with more control over the computer system by giving the user the ability to launch the application corresponding to the selectable user interface object when the cursor has been positioned (e.g., by the user) at a certain position for a certain period of time.
In some embodiments, in response to detecting a request to move the cursor from the first position to the second position and in accordance with a determination that the second position does not correspond to a location of the selectable user interface object (e.g., 720c, 732c, 810d, or 832 a), the computer system forgoes performing the operation (e.g., as described above with respect to fig. 8C) (e.g., forgoes displaying, via the display generation component, a menu including one or more user interface objects to perform the one or more operations at the second position and/or forgoes launching an application corresponding to the selectable user interface object). In some embodiments, in accordance with a determination that the second location does not correspond to a location of the selectable user interface object and the cursor is displayed and/or not displayed for more than a threshold period of time, the computer system relinquishes performing the operation. Selecting not to perform an operation in accordance with determining that the second location does not correspond to a location of the selectable user interface object gives the computer system the ability to not perform an operation when the prescribed condition is not met and helps the computer system avoid performing unnecessary and/or unexpected operations.
In some embodiments, in response to detecting a request to move a cursor (e.g., 742) from a first position to a second position and in accordance with a determination that the cursor (e.g., 742) is not displayed at the second position for more than a third threshold period of time (e.g., a non-zero period of time (e.g., 1 second-5 seconds)) (e.g., a period of time longer than or equal to a time when an animation is displayed), the computer system relinquishes performing the operation (e.g., as shown in fig. 8D). In some embodiments, in accordance with a determination that the second position corresponds to and/or does not correspond to a position of the selectable user interface object and the cursor is not displayed at the second position for more than a threshold period of time, the computer system relinquishes performing the operation. Selecting not to perform an operation in accordance with a determination that the cursor is not displayed at the second location for more than a third threshold period of time gives the computer system the ability to not perform an operation when the prescribed condition is not met and helps the computer system avoid performing unnecessary and/or unexpected operations.
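The threshold behavior in the preceding paragraphs can be modeled as a dwell timer that resets whenever the cursor leaves the target or moves to a different one. A sketch under the assumption that the cursor position is re-evaluated periodically (the threshold value and the polling model are illustrative, not the patent's):

```swift
import Foundation

// Illustrative sketch (assumed timing model): the operation is performed only if
// the cursor has remained over the same target for at least the threshold duration.
struct DwellTracker {
    let threshold: TimeInterval        // e.g., a non-zero period such as 1-5 seconds
    var dwellStart: Date? = nil
    var currentTargetID: String? = nil

    // Call whenever the cursor position is re-evaluated; returns true when the
    // dwell threshold has been met and the operation should be performed.
    mutating func update(hoveredTargetID: String?, now: Date = Date()) -> Bool {
        guard let target = hoveredTargetID else {
            // Cursor is not over any selectable object: reset and forgo the operation.
            dwellStart = nil
            currentTargetID = nil
            return false
        }
        if target != currentTargetID {
            // Cursor moved to a different object: restart the dwell period/animation.
            currentTargetID = target
            dwellStart = now
            return false
        }
        guard let start = dwellStart else { return false }
        return now.timeIntervalSince(start) >= threshold
    }
}

var tracker = DwellTracker(threshold: 2.0)
_ = tracker.update(hoveredTargetID: "messagesIcon")                // starts the dwell period
let fire = tracker.update(hoveredTargetID: "messagesIcon",
                          now: Date().addingTimeInterval(2.5))     // true: perform the operation
print(fire)
```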
In some embodiments, in response to detecting a request to move a cursor (e.g., 742) from a first position to a second position and in accordance with a determination that the second position corresponds to (e.g., is) a position in an area of the display generation component (e.g., 742 in fig. 7Q or 742 in fig. 7Y) (e.g., edge, left edge, right edge) and the cursor is displayed at the second position for more than a fourth threshold period (e.g., non-zero time period) (e.g., 1 second-5 seconds) (e.g., a period of time longer than or equal to a time that an animation is displayed), the computer system performs a second operation (e.g., an operation other than selecting a user interface object) that includes displaying a second user interface (e.g., 780 or 720) that is different from the user interface (e.g., 710 or 718) (and ceasing to display the user interface) (while continuing to display the cursor). In some embodiments, as part of displaying a second user interface that is different from the user interface, the computer system displays an animation (e.g., a swipe animation and/or a fade animation) of the second user interface replacing the user interface. In some embodiments, the animation includes sliding the user interface off the screen toward the region and/or sliding the second user interface onto the screen from the region. In some embodiments, in accordance with a determination that the second position corresponds to a position in an area of the display and the cursor is not displayed at the second position for more than a threshold period of time, the computer system forgoes performing the second operation that includes displaying the second user interface and/or continues displaying the user interface. In some embodiments, the second threshold time period is the same time period as the threshold time period. In some embodiments, the second threshold time period is a different threshold time period than the threshold time period. In some embodiments, the region does not include one or more selectable user interface objects. In some embodiments, in accordance with a determination that the second location corresponds to (e.g., is) a location in the region, the computer system displays the animation. In some embodiments, in accordance with a determination that the second location corresponds to (e.g., is) a location in the region, the computer system forgoes displaying the animation. In some embodiments, in accordance with a determination that the second location does not correspond to (e.g., is not) a location in the region, the computer system displays the animation. In some embodiments, in accordance with a determination that the second location does not correspond to (e.g., is not) a location in the region, the computer system forgoes displaying the animation. In accordance with a determination that the second position corresponds to a position in an area of the display generating component and that the cursor is displayed at the second position for more than a fourth threshold period of time, automatically performing a second operation that includes displaying a second user interface that is different from the user interface allows the computer system to display another user interface while the cursor is in a particular area of the display generating component.
Performing a second operation comprising displaying a second user interface different from the user interface in accordance with determining that the second position corresponds to a position in the area of the display generating component and that the cursor is displayed at the second position for more than a fourth threshold period of time provides the user with more control over the computer system by giving the user the ability to display another user interface when the prescribed condition is met.
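A sketch of the edge-region case, assuming an edge band of fixed width and placeholder destination names; the actual regions, widths, and destination user interfaces are not specified here:

```swift
// Edge width and destination names are placeholders, not values from the patent.
enum DwellRegion { case leftEdge, rightEdge, content }

func region(forCursorX x: Double, displayWidth: Double, edgeWidth: Double = 12) -> DwellRegion {
    if x <= edgeWidth { return .leftEdge }
    if x >= displayWidth - edgeWidth { return .rightEdge }
    return .content
}

// After the cursor has dwelled in an edge region past the threshold, the current
// user interface could be replaced (e.g., with a swipe or fade animation).
func destination(after dwellRegion: DwellRegion, current: String) -> String {
    switch dwellRegion {
    case .leftEdge:  return "previousUserInterface"  // placeholder identifier
    case .rightEdge: return "nextUserInterface"      // placeholder identifier
    case .content:   return current                  // no navigation occurs
    }
}

print(region(forCursorX: 195, displayWidth: 198))    // rightEdge
```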
In some embodiments, in accordance with a determination that the second position corresponds to a position of the selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and the cursor (e.g., 742) is displayed at the second position for more than a fifth threshold period (e.g., a non-zero period) (e.g., 1 second-5 seconds) (e.g., when the cursor at the second position is displayed) (e.g., a period of time greater than or equal to a time that the animation is displayed), the computer system performs an operation. In some embodiments, performing the operation includes displaying, via the display generating component, a menu (e.g., 732) including one or more selectable options (e.g., 732a-732d and/or 1410-1418) (e.g., user interface objects) to perform one or more operations (e.g., one or more different operations) at the second location (e.g., operations performed when a tap input (e.g., and/or a first type of input) is detected at the second location (e.g., operations of launching an application program), operations performed when a press and hold input (e.g., and/or a second type of input that is different from the first type of input) is detected at the second location (e.g., operations of calling out a menu for deleting the user interface object), operations performed when a swipe input (e.g., a third type of input that is different from the first type of input and the second type of input) is detected at the second location, and operations performed when an automatic scroll operation is initiated at the second location). In some embodiments, the menu is displayed simultaneously with the selectable user interface object and/or the cursor. In some embodiments, when the menu is displayed, one or more portions of the user interface (e.g., excluding the selectable user interface object and/or cursor) cease to be displayed. In some embodiments, the menu is displayed at a location (e.g., top/bottom) on the user interface that does not include a selectable user interface object and/or cursor. In some embodiments, in accordance with a determination that the selectable user interface object and/or cursor is displayed at a first respective location, the menu is displayed at a second respective location (e.g., on a side of the display opposite the first respective location) (and is not displayed at the first respective location). In some embodiments, in accordance with a determination that the selectable user interface object and/or cursor is not displayed at the first respective location, the menu is displayed at the first respective location (and is not displayed at the second respective location). Automatically performing an operation including displaying a menu including one or more selectable options to perform one or more operations at the second location in response to determining that the second location corresponds to a location of the selectable user interface object and that the cursor is displayed at the second location for more than the threshold period of time allows the computer system to automatically display the menu when a prescribed condition is met.
Performing operations including displaying a menu including one or more selectable options to perform one or more operations in accordance with determining that the second position corresponds to a position of the selectable user interface object and the cursor being displayed at the second position for more than a first threshold period of time provides the user with more control over the computer system by giving the user the ability to display the menu to perform one or more operations after the cursor has been positioned at a certain position (e.g., by the user) for a certain period of time.
In some embodiments, the one or more selectable options (e.g., 732a-732 d) include a first selectable option (e.g., 832a, 1410b, 1410c, or 1412) for performing a selection operation, and wherein selection of the first selectable option causes the computer system to perform the selection operation (e.g., an action performed when a tap is received at the second location) at the second location (e.g., on the selectable user interface object). In some embodiments, in response to detecting the selection of the first selectable option, the computer system performs a selection operation including launching an application associated with the selectable user interface object. In some embodiments, in response to detecting the selection of the first selectable option, the computer system performs a selection operation that includes emphasizing (e.g., highlighting) the selectable user interface object displayed at the second location. Displaying a menu including a first selectable option for performing a selection operation provides the user with more control over the computer system by allowing the user to cause the computer system to perform the selection operation (e.g., not provide more complex hand gestures), and also reduces the number of hand gestures for performing the operation.
In some embodiments, the one or more selectable options include a second selectable option (e.g., 832 a) for performing an operation (e.g., launching an application in response to an input (e.g., tap input, press and hold input, swipe input), moving a selectable user interface object displayed at the second location in response to an input (e.g., tap input, press and hold input, swipe input), displaying a notification (e.g., pop-up) corresponding to the selectable user interface object displayed at the second location), the operation corresponding to (e.g., as) an operation performed (and/or to be performed) in response to an input (e.g., tap input, press and hold input, swipe input) detected at the second location (and/or gesture) by the computer system, and wherein selection of the second selectable option causes the computer system to perform an operation corresponding to an operation performed in response to the detection of the input at the second location (e.g., no input detected at the second location). Displaying a menu including second selectable options for performing operations corresponding to operations performed in response to detecting the input at the second location provides the user with more control over the computer system by allowing the user to cause the computer system to perform operations performed in response to detecting the input at the second location (e.g., without providing more complex hand gestures and without the user touching the computer system and/or display generating components), and also reduces the number of hand gestures for performing the operations.
In some embodiments, the one or more selectable options include a third selectable option (e.g., 732a, 1410c, or 1412) for performing an automatic scroll operation (e.g., as described with respect to method 1500), and wherein selection of the third selectable option causes the computer system to automatically scroll through a plurality of selectable user interface objects including the selectable user interface object. In some embodiments, the computer system initiates automatic scrolling from the second location (e.g., from the selectable user interface object displayed at the second location) as part of automatically scrolling through a plurality of selectable user interface objects including the selectable user interface object. In some embodiments, as part of automatically scrolling through a plurality of selectable user interface objects including selectable user interface objects, the computer system sequentially highlights (e.g., highlights) one or more selectable user interface objects (e.g., as described with respect to method 1500). Displaying a menu including a third selectable option for performing an automatic scroll operation provides the user with more control over the computer system by allowing the user to automatically scroll the computer system through a plurality of selectable user interface objects including the selectable user interface objects (e.g., does not provide more complex hand gestures), and also reduces the number of hand gestures for performing the operation.
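The automatic scroll option can be sketched as a ticker that sequentially highlights objects and, per the related description of reversing direction at the end of a sequence (cf. claim 8), flips direction when it runs out of objects. The tick source, counts, and direction toggling below are illustrative assumptions:

```swift
// Illustrative sketch of an automatic scroll that sequentially highlights objects.
struct AutoScroller {
    let count: Int
    var index = 0
    var step = 1   // +1 scrolls forward, -1 scrolls backward

    mutating func reverseDirection() { step = -step }  // e.g., on a second-type gesture

    // Advances the highlighted object by one position per tick of the scroll timer.
    mutating func tick() -> Int {
        let next = index + step
        if next < 0 || next >= count {
            // End of the sequence reached: reverse and keep scrolling.
            step = -step
        }
        index = max(0, min(count - 1, index + step))
        return index
    }
}

var scroller = AutoScroller(count: 3)
print((0..<6).map { _ in scroller.tick() })  // [1, 2, 1, 0, 1, 2]
```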
In some embodiments, the one or more selectable options include a fourth selectable option (e.g., 1416) for displaying one or more additional options (e.g., an option to perform an operation corresponding to an operation performed when an input is detected at a location (e.g., the second location) of the selectable user interface object, an option to perform an operation such as turning on/off the computer system and/or displaying a different menu and/or user interface, and/or an option corresponding to an operation performed in response to detecting an input via one or more input devices in communication with the computer system). In some embodiments, selection of the fourth selectable option causes the computer system to display one or more additional options (e.g., 832a and the menus shown in fig. 8J) that were not previously displayed before selection of the fourth selectable option was detected. In some embodiments, in response to detecting selection of the fourth selectable option, the computer system stops displaying the one or more selectable options that were displayed prior to detecting the selection of the fourth selectable option. Displaying a menu including a fourth selectable option for displaying one or more additional options provides the user with more control over the computer system by allowing the user to access additional selectable options that were not previously displayed before selection of the fourth selectable option was detected, and also reduces the number of hand gestures needed to perform additional operations.
In some embodiments, the one or more additional options (e.g., 832a and the menus shown in fig. 8J) that were not previously displayed prior to detecting the selection of the fourth selectable option (e.g., 1414 or 1416) include a respective additional option (e.g., 1414a-1414d or 1416a-1416 c), wherein selection of the respective additional option causes the computer system to perform an operation (e.g., an exit operation, a close menu operation) that is not performed at the second location. In some embodiments, the one or more selectable options do not include a selectable option that, when selected, causes the computer system to perform operations (e.g., any operations) that are not performed at the second location (e.g., are not initiated at the second location and/or are not related to the second location). Displaying one or more additional options, not previously displayed prior to detecting the selection of the fourth selectable option, that include a respective additional option provides the user with more control over the computer system by allowing the user to select an option to perform an operation that is not performed at the second location, and reduces the number of hand gestures needed to perform an operation that is not performed at the second location.
In some embodiments, a computer system (e.g., 600) communicates with one or more input devices (e.g., 602a-602 c). In some embodiments, the one or more selectable options include a fifth selectable option (e.g., 732a, 1410c, or 1416 b) for performing an operation corresponding to (e.g., as) an operation (e.g., a press (e.g., single and/or multiple presses) on a hardware input device to display a menu, initiate a payment operation, initiate a shutdown of a computer system, initiate use of voice assistance, rotate a hardware input device to move a cursor from one location to another location, and/or rotate a hardware input device to zoom, scroll, and/or adjust one or more components of a user interface), the operation being (and/or to be) performed in response to detecting an input via one or more input devices (e.g., a hardware button (e.g., a crown of a smart watch, a rotatable hardware button, a hardware button that can be pressed/de-pressed), a hardware slider). In some implementations, selection of the fifth selectable option causes the computer system to perform operations corresponding to operations performed in response to detecting input via the one or more input devices (e.g., no input at the one or more input devices is detected). Displaying a menu including fifth selectable options for performing operations corresponding to operations performed in response to detecting input via the one or more input devices provides the user with more control over the computer system by allowing the user to select to perform operations corresponding to operations performed in response to detecting input via the one or more input devices (e.g., without providing more complex hand gestures and without the user touching the one or more input devices), and also reduces the number of hand gestures for performing operations.
In some embodiments, the one or more selectable options include a sixth selectable option (e.g., 732a-732d or 1410-1418) and a seventh selectable option (e.g., 732a-732d or 1410-1418) (e.g., different from the sixth selectable option). In some implementations, when the menu (e.g., 732) is displayed, the computer system detects a request to move the cursor from the second position to a third position on the user interface (e.g., 750p, 850c, 850d or a tilt in fig. 7Y or fig. 7Z). In some embodiments, the second request to move the cursor is detected based on movement of the computer system, movement of one or more devices (e.g., a mouse) associated with the computer system, and/or one or more inputs/gestures (e.g., swipe gestures) detected on a display generating component of the computer system. In some embodiments, in response to detecting a request to move the cursor from the second position to the third position on the user interface, the computer system: in accordance with a determination that the third position corresponds to a position of the sixth selectable option and the cursor is displayed at the third position for more than a sixth threshold period of time (e.g., a non-zero period of time (e.g., 1 second-5 seconds)) (e.g., a period of time longer than or equal to a time when the animation is displayed), performing a first operation (e.g., not performing a second operation) corresponding to the sixth selectable option; and in accordance with a determination that the third position corresponds to a position of the seventh selectable option and the cursor is displayed at the third position for more than a sixth threshold period of time, performing a second operation (e.g., not performing the first operation) that is different from the first operation. In some embodiments, in accordance with a determination that the third location corresponds to a location of the sixth selectable option and the cursor is not displayed at the third location for more than a third threshold period of time, the computer system refrains from performing the first operation. In some embodiments, in accordance with a determination that the third position corresponds to the position of the seventh selectable option and the cursor is not displayed at the third position for more than a third threshold period of time, the computer system refrains from performing the first operation. In some embodiments, the third threshold time period is the same time period as the threshold time period. In some embodiments, the third threshold time period is a different threshold time period than the threshold time period. Performing different operations based on the position of the cursor being displayed for a threshold period of time provides the user with more control over the computer system to perform the different operations based on the position of the cursor.
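Taken together, the menu described in the preceding paragraphs behaves like a small action dispatcher, with dwelling over an option triggering its operation. The option names below are placeholders, not the patent's reference numerals:

```swift
// Option names are placeholders, not the patent's reference numerals.
enum MenuOption {
    case select          // acts like a tap at the cursor's current position
    case autoScroll      // begins sequentially highlighting selectable objects
    case hardwarePress   // mirrors an operation of a hardware input device
    case moreOptions     // reveals additional options not previously displayed
}

// Dwelling the cursor over a menu option for the threshold period triggers it.
func perform(_ option: MenuOption) -> String {
    switch option {
    case .select:        return "perform the selection operation at the cursor position"
    case .autoScroll:    return "start automatic scrolling from the cursor position"
    case .hardwarePress: return "perform the operation mapped to the hardware input"
    case .moreOptions:   return "display additional, previously hidden options"
    }
}

print(perform(.autoScroll))
```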
In some implementations (e.g., when the cursor at the second location is displayed and the second location corresponds to the selectable user interface object), the one or more operations include an operation of causing the computer system (e.g., 600) to display a user interface of an application corresponding to the selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) (e.g., as discussed above with respect to fig. 8J) (e.g., without exiting the first mode).
In some embodiments (e.g., when a cursor at a second location is displayed and the second location corresponds to a selectable user interface object), the one or more operations include an operation (e.g., as discussed above with respect to 732a and/or 1412) of transitioning the computer system from the first mode of operation to the second mode of operation (e.g., a mode of operation in which an operation (e.g., an automatic scroll operation, a motion pointer operation) is initiated at (and/or begins from) the second location) (e.g., without launching an application corresponding to the selectable user interface object).
In some embodiments, after displaying a menu (e.g., 732) comprising one or more selectable options (e.g., when displaying the menu, one or more menu objects) the computer system detects a request to display a second user interface that is different from the user interface (e.g., as described above with respect to fig. 7I-7U and fig. 8A-8J). In some embodiments, in response to detecting a request to display the second user interface, the computer system displays the second user interface (and when operating in the first mode of operation). In some embodiments, the second user interface does not include a menu and includes a second selectable user interface object (e.g., 732a-732d or 1410-1418) (e.g., which is different from the selectable user interface object) and a cursor (e.g., 744) (e.g., as described above with respect to fig. 7I-7U and fig. 8A-8J). In some embodiments, when the second selectable user interface object and the cursor are displayed, the computer system detects a request to move the cursor from a fourth position on the second user interface to a fifth position on the second user interface (e.g., as described above with respect to fig. 7I-7U and fig. 8A-8J). In some implementations, the third request to move the cursor is detected based on movement of the computer system, movement of one or more devices (e.g., a mouse) associated with the computer system, and/or one or more inputs/gestures (e.g., swipe gestures) detected on a display generating component of the computer system. In some embodiments, in response to detecting a request to move the cursor from a fourth location on the second user interface to a fifth location on the second user interface, the computer system displays the cursor at the fifth location on the second user interface (e.g., as described above with respect to fig. 7I-7U and fig. 8A-8J). In some embodiments, in response to detecting a request to move the cursor from a fourth location on the second user interface to a fifth location on the second user interface, the computer system redisplays a menu (e.g., 732) comprising one or more selectable options via the display generating component (e.g., as described above with respect to fig. 7I-7U and 8A-8J) (e.g., when the menu is displayed) in accordance with determining that the fifth location corresponds to a location of the second selectable user interface object and the cursor is displayed at the second location for more than a seventh threshold period of time (e.g., a non-zero period of time (e.g., 1-5 seconds)) (e.g., a period of time longer than or equal to a time that the animation is displayed). In some embodiments, in accordance with a determination that the fifth location does not correspond to a location of the second selectable user interface object and/or the cursor is not displayed at the second location for more than a seventh threshold period of time, the display of the menu is relinquished. In accordance with a determination that the fifth location corresponds to a location of the second selectable user interface object and the cursor is displayed at the second location for more than a seventh threshold period of time, automatically selecting to redisplay the menu allows the computer system to provide a consistent menu to the user when the same set of prescribed conditions are met for different selectable user interface objects.
In some embodiments, the request to move the cursor (e.g., 742) from the first position to the second position is detected when it is determined that the computer system (e.g., 600) has been tilted in a respective direction (e.g., 750p, 850c, 850d or the tilt in fig. 7Y or fig. 7Z) (e.g., one side (e.g., right side, left side) of the computer system is higher/lower than the other side of the computer system). In some embodiments, the determination is made using data detected via one or more gyroscopes in communication with the computer system. In some embodiments, the respective direction is a direction from a higher side of the computer system to a lower side of the computer system. In some implementations, the cursor (e.g., 742) moves in a direction corresponding to the respective direction.
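A sketch of tilt-to-cursor mapping under assumed units, gain, and dead-zone values; the actual sensor fusion and sign conventions are not specified in the text above:

```swift
// Illustrative tilt-to-cursor mapping; units, gain, and dead zone are assumptions.
struct Tilt { var roll: Double; var pitch: Double }   // assumed to be in radians

func cursorDelta(for tilt: Tilt, gain: Double = 120, deadZone: Double = 0.05) -> (dx: Double, dy: Double) {
    // Ignore small tilts so the cursor does not drift while the wrist is held level.
    let dx = abs(tilt.roll)  > deadZone ? tilt.roll  * gain : 0
    let dy = abs(tilt.pitch) > deadZone ? tilt.pitch * gain : 0
    // The cursor moves toward the side of the device that is tilted downward.
    return (dx, dy)
}

print(cursorDelta(for: Tilt(roll: 0.25, pitch: 0.0)))  // (dx: 30.0, dy: 0.0)
```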
In some embodiments, the computer system displays a third user interface prior to displaying the user interface including the selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and the cursor (e.g., 742) displayed at the first location on the user interface. In some embodiments, when a third user interface (e.g., 710, 720, 810, or 780) is displayed (and in some embodiments, the user interface includes a selectable user interface object), the computer system detects a request to enter a first mode of operation (e.g., 750w, 850 a). In some implementations, in response to detecting a request to enter a first mode of operation, the computer system transitions the computer system (e.g., 600) from a second mode of operation (e.g., different from the first mode of operation) to the first mode of operation. In some embodiments, in response to detecting a request to enter the first mode of operation, the computer system displays a cursor (e.g., at a first location) (e.g., and/or a user interface including a selectable user interface object and a cursor displayed at the first location on the user interface). Displaying a cursor in response to detecting a request to enter a first mode of operation provides feedback to a user that the computer system is operating in the first mode of operation (e.g., or one or more user actions may cause the computer system to perform one or more particular operations because the computer system is in the first mode of operation).
In some embodiments, as part of detecting the request to enter the first mode of operation, the computer system detects that the computer system has been (or is being) shaken (e.g., 750w, 850 a) (e.g., while the computer system is in the second mode of operation). Transitioning the computer system from the second mode of operation to the first mode of operation in response to detecting that the computer system has been shaken provides the user with more control over the computer system by allowing the user to switch between modes by shaking the computer system (e.g., without providing input on a display generating component of the device).
In some embodiments, when the computer system is in the first mode, the computer system detects that the computer system (e.g., 600) has been shaken (e.g., 750w, 850 a). In some embodiments, in response to detecting that the computer system has been shaken, the computer system transitions the computer system from the first mode of operation to a third mode of operation (and, in some embodiments, the third mode of operation is the second mode of operation) (e.g., the first mode of operation as described above with respect to method 1500). Transitioning the computer system from the first mode of operation to the third mode of operation in response to detecting that the computer system has been shaken provides the user with more control over the computer system by allowing the user to switch between modes by shaking the computer system (e.g., without providing input on a display generating component of the device).
In some embodiments, as part of transitioning the computer system (e.g., 600) from the first mode of operation to the third mode of operation, the computer system stops displaying the cursor (e.g., 742). In some embodiments, in response to detecting that the computer system has been shaken (e.g., when the computer system is in the first mode), the computer system stops displaying the cursor. Stopping displaying the cursor in response to detecting that the computer system has been shaken provides feedback to the user that the computer system is not operating in the first mode of operation (e.g., and/or that one or more user actions will not cause the computer system to perform one or more particular operations because the computer system is not in the first mode of operation).
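The shake-driven mode switching in the preceding paragraphs can be sketched as a two-state toggle that also controls cursor visibility; the mode names are hypothetical:

```swift
// Mode names are hypothetical; a shake toggles between cursor and gesture navigation.
enum NavigationMode { case cursor, gestures }

struct ModeController {
    var mode: NavigationMode = .gestures
    var cursorVisible = false

    mutating func deviceShaken() {
        switch mode {
        case .gestures:
            mode = .cursor
            cursorVisible = true      // entering the cursor mode displays the cursor
        case .cursor:
            mode = .gestures
            cursorVisible = false     // leaving the cursor mode stops displaying it
        }
    }
}

var controller = ModeController()
controller.deviceShaken()
print(controller.mode, controller.cursorVisible)   // cursor true
```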
In some embodiments, the computer system detects a request to move the cursor from a sixth position to a seventh position when the computer system is in the third mode of operation and when the cursor (e.g., 742) is displayed. In some embodiments, in response to detecting the request to move the cursor from the sixth position (e.g., 750p, 850c, 850d, or the tilt in fig. 7Y or fig. 7Z) to the seventh position (e.g., when the computer system is in the third mode of operation), the computer system displays the cursor at the seventh position. In some embodiments, in response to detecting the request to move the cursor from the sixth position (e.g., 750p, 850c, 850d, or the tilt in fig. 7Y or fig. 7Z) to the seventh position (e.g., when the computer system is in the third mode of operation), and in accordance with a determination that the seventh position corresponds to the position of the selectable user interface object and the cursor is displayed at the seventh position for more than an eighth threshold period of time (e.g., a non-zero period of time (e.g., 1 second-5 seconds)) (e.g., a period of time longer than or equal to the time that the animation is displayed), the computer system relinquishes performing the operation. Forgoing performing the operation in accordance with a determination that the seventh position corresponds to a position of the selectable user interface object and the cursor is displayed at the seventh position for more than an eighth threshold period of time (e.g., when the computer system is in the third mode of operation) provides the user with more control over the computer system to control when certain hand gestures will perform certain operations.
In some implementations, in response to detecting a request to move the cursor (e.g., 742) from the first position to the second position and in accordance with a determination that the second position corresponds to a position of the selectable user interface object (e.g., 720c, 732c, 810d, or 832 a) and the cursor (e.g., 742) is displayed at the second position for more than a ninth threshold period (e.g., a non-zero period) (e.g., 1 second-5 seconds) (e.g., when the cursor at the second position is displayed) (e.g., a period of time longer than a time to display an animation), the computer system performs an operation. In some embodiments, in accordance with a determination that one or more settings (e.g., user settings) of the computer system are in a first state, the computer system displays, as part of performing the operation, a second menu including one or more selectable options (e.g., user interface objects) to perform one or more operations (e.g., one or more different operations) at the second location without launching a second application corresponding to the selectable user interface object. In some embodiments, in accordance with a determination that the one or more settings (e.g., user settings) of the computer system are in a second state that is different from the first state, the computer system launches, as part of performing the operation, the second application corresponding to the selectable user interface object without displaying the second menu including one or more selectable options (e.g., user interface objects) to perform one or more operations (e.g., one or more different operations) at the second location. Selecting whether to launch the second application and/or display the second menu when the prescribed condition is met provides the user with more control over the computer system to change whether one or more actions will result in launching the application and/or causing the menu to be displayed.
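Finally, the setting-dependent branch can be sketched as a single decision function; the setting name and return values are assumptions for illustration:

```swift
// Setting and case names are assumptions, not the patent's terminology.
enum DwellResult { case showMenu, launchApplication }

func resultOfDwell(onSelectableObject: Bool,
                   dwelledPastThreshold: Bool,
                   menuOnDwellEnabled: Bool) -> DwellResult? {
    // If the cursor is not over a selectable object, or has not dwelled long
    // enough, the computer system forgoes performing the operation.
    guard onSelectableObject, dwelledPastThreshold else { return nil }
    return menuOnDwellEnabled ? .showMenu : .launchApplication
}

if let result = resultOfDwell(onSelectableObject: true, dwelledPastThreshold: true,
                              menuOnDwellEnabled: false) {
    print(result)   // launchApplication
}
```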
It is noted that the details of the process described above with respect to method 1600 (e.g., fig. 16) also apply in a similar manner to the methods described herein. For example, method 1600 optionally includes one or more of the features of the various methods described herein with reference to method 1500. For example, method 1600 may be performed when a computer system switches from operating in an operational mode in which the computer system operates using the techniques of method 1500. For the sake of brevity, these details are not repeated hereinafter.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical applications. Those skilled in the art will be able to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
While the present disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. It should be understood that such variations and modifications are considered to be included within the scope of the disclosure and examples as defined by the claims.
As described above, one aspect of the present technology is to collect and use data from various sources to improve detection of hand gestures. The present disclosure contemplates that in some examples, such collected data may include personal information data that uniquely identifies or may be used to contact or locate a particular person. Such personal information data may include demographic data, location-based data, telephone numbers, email addresses, tweet IDs, home addresses, data or records related to the user's health or fitness level (e.g., vital sign measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that such personal information data can be used in the present technology to benefit users. For example, the personal information data may be used to improve detection of a user's hand gestures. Accordingly, the use of such personal information data gives users control over data that can improve the detection of hand gestures. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user. For example, health and fitness data may be used to provide insight into the overall health of a user, or may be used as positive feedback to individuals using technology to pursue health goals.
The present disclosure contemplates that entities responsible for collecting, analyzing, disclosing, transmitting, storing, or otherwise using such personal information data will adhere to established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices that are generally recognized as meeting or exceeding industry or government requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible to users and should be updated as the collection and/or use of the data changes. Personal information from users should be collected for legitimate and reasonable uses by the entity and not shared or sold outside of those legitimate uses. In addition, such collection/sharing should occur only after receiving the informed consent of the users. Additionally, such entities should consider taking any steps needed to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted to the particular types of personal information data being collected and/or accessed and to applicable laws and standards, including jurisdiction-specific considerations. For example, in the United States, the collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Notwithstanding the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, with respect to user interface management (e.g., including navigation), the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data at any time during or after registration with a service. In another example, users can choose not to provide heart rate data and/or other data that may be used to improve detection of hand gestures. In yet another example, users can choose to limit the length of time heart rate data and/or other data is maintained. In addition to providing "opt in" and "opt out" options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application that their personal information data will be accessed, and then reminded again just before the personal information data is accessed by the application.
Further, it is the intent of the present disclosure that personal information data should be managed and handled in a way that minimizes risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth), controlling the amount or specificity of stored data (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers the use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need to access such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, hand gestures can be detected based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to a data detection service, or publicly available information.

Claims (65)

1. A method, the method comprising:
at a computer system in communication with the display generating component and the optical sensor:
displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected;
detecting a hand gesture via at least the optical sensor while displaying the user interface including the first user interface object, the second user interface object, the third user interface object, and the indication that the first user interface object is selected; and
In response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and
in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, an indication that the third user interface object is selected is displayed via the display generation component.
2. The method of claim 1, wherein the hand gesture is detected based on heart rate data determined using data detected via the optical sensor.
3. The method of any of claims 1-2, wherein the second type of gesture is a gesture type that is multiple instances of the first type of gesture.
4. The method of any of claims 1-2, further comprising:
in response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a third type of gesture different from the first type of gesture and the second type of gesture, performing an operation corresponding to a selection of the first user interface object; and
In accordance with a determination that the hand gesture is a fourth type of gesture different from the first type of gesture, the second type of gesture, and the third type of gesture, a menu including one or more selectable options is displayed via the display generation component.
5. The method according to claim 4, wherein:
before detecting the hand gesture, the computer system is in a first mode of operation; and is also provided with
Performing the operation corresponding to the selection of the first user interface object includes transitioning the computer system from the first mode of operation to a second mode of operation.
6. The method of claim 5, further comprising:
detecting, via at least the optical sensor, a second hand gesture when the computer system is in the second mode of operation; and
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is the first type of gesture, switching a first automatic scrolling operation between an active state and an inactive state; and
in accordance with a determination that the second hand gesture is the second type of gesture, a first direction of the first automatic scrolling operation is reversed.
7. The method of claim 6, further comprising:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is of the third type, performing a second operation; and
in accordance with a determination that the second hand gesture is of the fourth type, a third operation that is different from the second operation is performed.
8. The method of claim 6, further comprising:
performing a second automatic scrolling operation when the computer system is in the second mode of operation, the second automatic scrolling operation scrolling a sequence of a plurality of user interface objects in a second direction;
detecting an end of the sequence of the plurality of user interface objects while performing the second automatic scrolling operation; and
in response to detecting the end of the sequence of the plurality of user interface objects, a third automatic scrolling operation is performed that scrolls the sequence of the plurality of user interface objects in a third direction different from the second direction.
9. The method of claim 6, further comprising:
in response to detecting the second hand gesture via at least the optical sensor:
In accordance with a determination that the second hand gesture is the fourth type of gesture, the computer system is transitioned from the second mode of operation to a third mode of operation.
10. The method of claim 6, further comprising:
in accordance with a determination that the second hand gesture is the first type of gesture, a notification is displayed indicating a state of the automatic scroll operation.
11. The method of claim 4, wherein the third type of gesture is a gesture type that is multiple instances of the fourth type of gesture.
12. The method of claim 4, wherein the fourth type of gesture does not include multiple instances of the first type of gesture and the third type of gesture does not include multiple instances of the second type of gesture.
13. The method of claim 4, further comprising:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that a notification is received within a threshold period of time and in accordance with a determination that the second hand gesture is a gesture of the fourth type, an action related to the notification is performed.
14. The method of claim 4, wherein the one or more selectable options include a first selectable user interface object for changing the operations that one or more hand gestures cause the computer system to perform, wherein selection of the first selectable user interface object causes the computer system to display a plurality of settings, and wherein each setting controls an operation that is performed by the computer system when the one or more hand gestures are detected by the computer system.
15. The method according to claim 4, wherein:
the one or more selectable options include a second selectable option for transitioning the computer system to a fourth mode of operation and a third selectable option for transitioning the computer system to a fifth mode of operation that is different from the fourth mode of operation;
selection of the second selectable option causes the computer system to transition to the fourth mode of operation; and is also provided with
Selection of the third selectable option causes the computer system to transition to the fifth mode of operation.
16. The method of any of claims 1-2, wherein the one or more selectable options include a fourth selectable option for displaying one or more additional selectable options, and wherein selection of the fourth selectable option causes the computer system to display the one or more additional selectable options that were not previously displayed prior to selection of the fourth selectable option.
17. The method of any one of claims 1 to 2, wherein the menu:
in accordance with a determination that a respective user interface object is at a first location on the user interface, displayed at the first location; and
in accordance with a determination that the respective user interface object is not at the first location on the user interface, is displayed at a second location different from the first location.
18. The method of any of claims 1-2, further comprising:
detecting a third hand gesture via at least the optical sensor while displaying the menu comprising one or more selectable options; and
in response to detecting the third hand gesture via at least the optical sensor:
in accordance with a determination that the third hand gesture is of the fourth type, the display of the menu including the one or more selectable options is stopped.
19. The method of any of claims 1-2, further comprising:
detecting a fourth hand gesture via at least the optical sensor after displaying the user interface including the first user interface object; and
in response to detecting the fourth hand gesture via at least the optical sensor:
in accordance with a determination that the fourth hand gesture is a fifth type of gesture that is different from the first type of gesture and the second type of gesture, the computer system is transitioned from an inactive state to an active state.
20. The method of any of claims 1-2, further comprising:
detecting a fifth hand gesture via at least the optical sensor while displaying the indication that the second user interface object is selected via the display generating component; and
in response to detecting the fifth hand gesture via at least the optical sensor:
in accordance with a determination that the fifth hand gesture is the first type of gesture, displaying, via the display generating component, an indication that a fourth user interface object is selected; and
in accordance with a determination that the fifth hand gesture is the second type of gesture, the indication that the first user interface object is selected is displayed via the display generation component.
21. The method of any of claims 1-2, further comprising:
detecting a request to transition the computer system from a first mode of operation to a fourth mode of operation while displaying the user interface including the first user interface object;
responsive to detecting the request to transition the computer system from the first mode of operation to the fourth mode of operation, transitioning the computer system from the first mode of operation to the fourth mode of operation;
Detecting a sixth hand gesture via at least the optical sensor while the computer system is in the fourth mode of operation and while displaying the user interface including the first user interface object, the second user interface object, and the third user interface object, and the indication that the first user interface object is selected;
in response to detecting the sixth hand gesture via at least the optical sensor:
in accordance with a determination that the sixth hand gesture is the first type of gesture, continuing to display the indication that the first user interface object is selected; and
in accordance with a determination that the sixth hand gesture is the second type of gesture, continuing to display the indication that the first user interface object is selected.
22. A computer system in communication with a display generating component and an optical sensor, the computer system comprising:
one or more processors; and
a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions for:
displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected;
Detecting a hand gesture via at least the optical sensor while displaying the user interface including the first user interface object, the second user interface object, the third user interface object, and the indication that the first user interface object is selected; and
in response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and
in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, an indication that the third user interface object is selected is displayed via the display generation component.
23. The computer system of claim 22, wherein the hand gesture is detected based on heart rate data determined using data detected via the optical sensor.
24. The computer system of any of claims 22-23, wherein the second type of gesture is a gesture type that is multiple instances of the first type of gesture.
25. The computer system of any of claims 22 to 23, wherein the one or more programs further comprise instructions for:
In response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a third type of gesture different from the first type of gesture and the second type of gesture, performing an operation corresponding to a selection of the first user interface object; and
in accordance with a determination that the hand gesture is a fourth type of gesture different from the first type of gesture, the second type of gesture, and the third type of gesture, a menu including one or more selectable options is displayed via the display generation component.
26. The computer system of claim 25, wherein:
before detecting the hand gesture, the computer system is in a first mode of operation; and is also provided with
Performing the operation corresponding to the selection of the first user interface object includes transitioning the computer system from the first mode of operation to a second mode of operation.
27. The computer system of claim 26, wherein the one or more programs further comprise instructions for:
detecting, via at least the optical sensor, a second hand gesture when the computer system is in the second mode of operation; and
In response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is the first type of gesture, switching a first automatic scrolling operation between an active state and an inactive state; and
in accordance with a determination that the second hand gesture is the second type of gesture, a first direction of the first automatic scrolling operation is reversed.
28. The computer system of claim 27, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is of the third type, performing a second operation; and
in accordance with a determination that the second hand gesture is of the fourth type, a third operation that is different from the second operation is performed.
29. The computer system of claim 27, wherein the one or more programs further comprise instructions for:
performing a second automatic scrolling operation when the computer system is in the second mode of operation, the second automatic scrolling operation scrolling a sequence of a plurality of user interface objects in a second direction;
detecting an end of the sequence of the plurality of user interface objects while performing the second automatic scrolling operation; and
in response to detecting the end of the sequence of the plurality of user interface objects, a third automatic scrolling operation is performed that scrolls the sequence of the plurality of user interface objects in a third direction different from the second direction.
30. The computer system of claim 27, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is the fourth type of gesture, the computer system is transitioned from the second mode of operation to a third mode of operation.
31. The computer system of claim 27, wherein the one or more programs further comprise instructions for:
in accordance with a determination that the second hand gesture is the first type of gesture, a notification is displayed indicating a state of the automatic scroll operation.
32. The computer system of claim 25, wherein the third type of gesture is a gesture type that is multiple instances of the fourth type of gesture.
33. The computer system of claim 25, wherein the fourth type of gesture does not include multiple instances of the first type of gesture and the third type of gesture does not include multiple instances of the second type of gesture.
34. The computer system of claim 25, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that a notification is received within a threshold period of time and in accordance with a determination that the second hand gesture is a gesture of the fourth type, an action related to the notification is performed.
35. The computer system of claim 25, wherein the one or more selectable options include a first selectable user interface object for changing the operations that one or more hand gestures cause the computer system to perform, wherein selection of the first selectable user interface object causes the computer system to display a plurality of settings, and wherein each setting controls an operation that is performed by the computer system when the one or more hand gestures are detected by the computer system.
36. The computer system of claim 25, wherein:
the one or more selectable options include a second selectable option for transitioning the computer system to a fourth mode of operation and a third selectable option for transitioning the computer system to a fifth mode of operation that is different from the fourth mode of operation;
selection of the second selectable option causes the computer system to transition to the fourth mode of operation; and is also provided with
Selection of the third selectable option causes the computer system to transition to the fifth mode of operation.
37. The computer system of any of claims 22-23, wherein the one or more selectable options include a fourth selectable option for displaying one or more additional selectable options, and wherein selection of the fourth selectable option causes the computer system to display the one or more additional selectable options that were not previously displayed prior to selection of the fourth selectable option.
38. The computer system of any of claims 22 to 23, wherein the menu:
in accordance with a determination that a respective user interface object is at a first location on the user interface, displayed at the first location; and
in accordance with a determination that the respective user interface object is not at the first location on the user interface, is displayed at a second location different from the first location.
39. The computer system of any of claims 22 to 23, wherein the one or more programs further comprise instructions for:
detecting a third hand gesture via at least the optical sensor while displaying the menu comprising one or more selectable options; and
in response to detecting the third hand gesture via at least the optical sensor:
in accordance with a determination that the third hand gesture is of the fourth type, the display of the menu including the one or more selectable options is stopped.
40. The computer system of any of claims 22 to 23, wherein the one or more programs further comprise instructions for:
detecting a fourth hand gesture via at least the optical sensor after displaying the user interface including the first user interface object; and
in response to detecting the fourth hand gesture via at least the optical sensor:
in accordance with a determination that the fourth hand gesture is a fifth type of gesture that is different from the first type of gesture and the second type of gesture, the computer system is transitioned from an inactive state to an active state.
41. The computer system of any of claims 22 to 23, wherein the one or more programs further comprise instructions for:
detecting a fifth hand gesture via at least the optical sensor while displaying the indication that the second user interface object is selected via the display generating component; and
in response to detecting the fifth hand gesture via at least the optical sensor:
in accordance with a determination that the fifth hand gesture is the first type of gesture, displaying, via the display generating component, an indication that a fourth user interface object is selected; and
in accordance with a determination that the fifth hand gesture is the second type of gesture, the indication that the first user interface object is selected is displayed via the display generation component.
42. The computer system of any of claims 22 to 23, wherein the one or more programs further comprise instructions for:
detecting a request to transition the computer system from a first mode of operation to a fourth mode of operation while displaying the user interface including the first user interface object;
responsive to detecting the request to transition the computer system from the first mode of operation to the fourth mode of operation, transitioning the computer system from the first mode of operation to the fourth mode of operation;
Detecting a sixth hand gesture via at least the optical sensor while the computer system is in the fourth mode of operation and while displaying the user interface including the first user interface object, the second user interface object, and the third user interface object, and the indication that the first user interface object is selected;
in response to detecting the sixth hand gesture via at least the optical sensor:
in accordance with a determination that the sixth hand gesture is the first type of gesture, continuing to display the indication that the first user interface object is selected; and
in accordance with a determination that the sixth hand gesture is the second type of gesture, continuing to display the indication that the first user interface object is selected.
43. A computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system in communication with a display generation component and an optical sensor, the one or more programs comprising instructions for:
displaying, via the display generating component, a user interface comprising a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected;
Detecting a hand gesture via at least the optical sensor while displaying the user interface including the first user interface object, the second user interface object, the third user interface object, and the indication that the first user interface object is selected; and
in response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a gesture of a first type, displaying, via the display generating component, an indication that the second user interface object is selected; and
in accordance with a determination that the hand gesture is a gesture of a second type different from the first type of gesture, an indication that the third user interface object is selected is displayed via the display generation component.
44. The computer readable storage medium of claim 43, wherein the hand gesture is detected based on heart rate data determined using data detected via the optical sensor.
45. The computer-readable storage medium of any of claims 43-44, wherein the second type of gesture is a gesture type that is multiple instances of the first type of gesture.
46. The computer readable storage medium of any one of claims 43 to 44, wherein the one or more programs further comprise instructions for:
in response to detecting the hand gesture via at least the optical sensor:
in accordance with a determination that the hand gesture is a third type of gesture different from the first type of gesture and the second type of gesture, performing an operation corresponding to a selection of the first user interface object; and
in accordance with a determination that the hand gesture is a fourth type of gesture different from the first type of gesture, the second type of gesture, and the third type of gesture, a menu including one or more selectable options is displayed via the display generation component.
47. The computer readable storage medium of claim 46, wherein:
before detecting the hand gesture, the computer system is in a first mode of operation; and is also provided with
Performing the operation corresponding to the selection of the first user interface object includes transitioning the computer system from the first mode of operation to a second mode of operation.
48. The computer readable storage medium of claim 47, wherein the one or more programs further comprise instructions for:
Detecting, via at least the optical sensor, a second hand gesture when the computer system is in the second mode of operation; and
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is the first type of gesture, switching a first automatic scrolling operation between an active state and an inactive state; and
in accordance with a determination that the second hand gesture is the second type of gesture, a first direction of the first automatic scrolling operation is reversed.
49. The computer readable storage medium of claim 48, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is of the third type, performing a second operation; and
in accordance with a determination that the second hand gesture is of the fourth type, a third operation that is different from the second operation is performed.
50. The computer readable storage medium of claim 48, wherein the one or more programs further comprise instructions for:
performing a second automatic scrolling operation when the computer system is in the second mode of operation, the second automatic scrolling operation scrolling a sequence of a plurality of user interface objects in a second direction;
detecting an end of the sequence of the plurality of user interface objects while performing the second automatic scrolling operation; and
in response to detecting the end of the sequence of the plurality of user interface objects, a third automatic scrolling operation is performed that scrolls the sequence of the plurality of user interface objects in a third direction different from the second direction.
51. The computer readable storage medium of claim 48, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that the second hand gesture is the fourth type of gesture, the computer system is transitioned from the second mode of operation to a third mode of operation.
52. The computer readable storage medium of claim 48, wherein the one or more programs further comprise instructions for:
In accordance with a determination that the second hand gesture is the first type of gesture, a notification is displayed indicating a state of the automatic scroll operation.
53. The computer-readable storage medium of claim 46, wherein the third type of gesture is a gesture type that is multiple instances of the fourth type of gesture.
54. The computer-readable storage medium of claim 46, wherein the fourth type of gesture does not include multiple instances of the first type of gesture and the third type of gesture does not include multiple instances of the second type of gesture.
55. The computer readable storage medium of claim 46, wherein the one or more programs further comprise instructions for:
in response to detecting the second hand gesture via at least the optical sensor:
in accordance with a determination that a notification is received within a threshold period of time and in accordance with a determination that the second hand gesture is a gesture of the fourth type, an action related to the notification is performed.
56. The computer-readable storage medium of claim 46, wherein the one or more selectable options include a first selectable user interface object for changing the operations that one or more hand gestures cause the computer system to perform, wherein selection of the first selectable user interface object causes the computer system to display a plurality of settings, and wherein each setting controls an operation that is performed by the computer system when the one or more hand gestures are detected by the computer system.
57. The computer readable storage medium of claim 46, wherein:
the one or more selectable options include a second selectable option for transitioning the computer system to a fourth mode of operation and a third selectable option for transitioning the computer system to a fifth mode of operation that is different from the fourth mode of operation;
selection of the second selectable option causes the computer system to transition to the fourth mode of operation; and is also provided with
Selection of the third selectable option causes the computer system to transition to the fifth mode of operation.
58. The computer-readable storage medium of any one of claims 43 to 44, wherein the one or more selectable options include a fourth selectable option for displaying one or more additional selectable options, and wherein selection of the fourth selectable option causes the computer system to display the one or more additional selectable options that were not previously displayed prior to selection of the fourth selectable option.
59. The computer readable storage medium of any one of claims 43 to 44, wherein the menu:
in accordance with a determination that a respective user interface object is at a first location on the user interface, displayed at the first location; and
in accordance with a determination that the respective user interface object is not at the first location on the user interface, is displayed at a second location different from the first location.
60. The computer readable storage medium of any one of claims 43 to 44, wherein the one or more programs further comprise instructions for:
detecting a third hand gesture via at least the optical sensor while displaying the menu comprising one or more selectable options; and
in response to detecting the third hand gesture via at least the optical sensor:
in accordance with a determination that the third hand gesture is of the fourth type, the display of the menu including the one or more selectable options is stopped.
61. The computer readable storage medium of any one of claims 43 to 44, wherein the one or more programs further comprise instructions for:
detecting a fourth hand gesture via at least the optical sensor after displaying the user interface including the first user interface object; and
in response to detecting the fourth hand gesture via at least the optical sensor:
In accordance with a determination that the fourth hand gesture is a fifth type of gesture that is different from the first type of gesture and the second type of gesture, the computer system is transitioned from an inactive state to an active state.
62. The computer readable storage medium of any one of claims 43 to 44, wherein the one or more programs further comprise instructions for:
detecting a fifth hand gesture via at least the optical sensor while displaying the indication that the second user interface object is selected via the display generating component; and
in response to detecting the fifth hand gesture via at least the optical sensor:
in accordance with a determination that the fifth hand gesture is the first type of gesture, displaying, via the display generating component, an indication that a fourth user interface object is selected; and
in accordance with a determination that the fifth hand gesture is the second type of gesture, the indication that the first user interface object is selected is displayed via the display generation component.
63. The computer readable storage medium of any one of claims 43 to 44, wherein the one or more programs further comprise instructions for:
Detecting a request to transition the computer system from a first mode of operation to a fourth mode of operation while displaying the user interface including the first user interface object;
responsive to detecting the request to transition the computer system from the first mode of operation to the fourth mode of operation, transitioning the computer system from the first mode of operation to the fourth mode of operation;
detecting a sixth hand gesture via at least the optical sensor while the computer system is in the fourth mode of operation and while displaying the user interface including the first user interface object, the second user interface object, and the third user interface object, and the indication that the first user interface object is selected;
in response to detecting the sixth hand gesture via at least the optical sensor:
in accordance with a determination that the sixth hand gesture is the first type of gesture, continuing to display the indication that the first user interface object is selected; and
in accordance with a determination that the sixth hand gesture is the second type of gesture, continuing to display the indication that the first user interface object is selected.
64. A computer system in communication with a display generating component and an optical sensor, the computer system comprising:
apparatus for performing the method of claim 1 or 2.
65. A computer program product comprising one or more programs configured to be executed by one or more processors of a computer system in communication with a display generating component and an optical sensor, the one or more programs comprising instructions for performing the method of claim 1 or 2.
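
As an illustrative, non-limiting sketch of the behavior recited in claims 1, 4, 6, and 8, the following Swift outline shows how a classified hand gesture might either move the selection indication to another user interface object, perform the selection operation, display a menu, or, while in the automatic scrolling mode, pause, resume, or reverse the scroll. The type names (HandGesture, GestureNavigator), the mapping of the first and second gesture types onto "next" and "previous", and the example gestures in the comments are assumptions made only for illustration; they are not taken from the claims and do not describe any actual implementation.

// Illustrative sketch only; names and gesture mappings are assumptions, not the claimed implementation.

enum HandGesture {
    case firstType    // assumed here to move the selection indication forward
    case secondType   // assumed here to move the selection indication backward
    case thirdType    // performs an operation corresponding to selection (claim 4)
    case fourthType   // displays a menu including one or more selectable options (claim 4)
}

enum OperationMode {
    case navigation          // first mode of operation: gestures step the selection
    case automaticScrolling  // second mode of operation: the indication advances on a timer
}

struct GestureNavigator {
    var objects: [String]                  // the displayed user interface objects
    var selectedIndex = 0                  // which object's "selected" indication is shown
    var mode = OperationMode.navigation
    var scrollingActive = true             // active/inactive state of the automatic scrolling
    var scrollDirection = 1                // +1 forward, -1 reversed

    // Claims 1 and 4: respond to a hand gesture detected while in the navigation mode.
    mutating func handleNavigationGesture(_ gesture: HandGesture) {
        switch gesture {
        case .firstType:
            selectedIndex = (selectedIndex + 1) % objects.count                   // indicate another object
        case .secondType:
            selectedIndex = (selectedIndex - 1 + objects.count) % objects.count   // indicate a different object
        case .thirdType:
            mode = .automaticScrolling     // claim 5: the selection operation transitions the mode
        case .fourthType:
            print("menu including one or more selectable options displayed")
        }
    }

    // Claim 6: while automatically scrolling, the first gesture type toggles the scroll
    // between active and inactive, and the second gesture type reverses its direction.
    mutating func handleAutoScrollGesture(_ gesture: HandGesture) {
        switch gesture {
        case .firstType:  scrollingActive.toggle()
        case .secondType: scrollDirection = -scrollDirection
        default:          break            // third/fourth types would map to other operations (claims 7, 9)
        }
    }

    // Claim 8: each tick advances the indication; at the end of the sequence the scrolling
    // continues in a different direction rather than stopping.
    mutating func autoScrollTick() {
        guard mode == .automaticScrolling, scrollingActive else { return }
        let next = selectedIndex + scrollDirection
        if next < 0 || next >= objects.count {
            scrollDirection = -scrollDirection
        }
        selectedIndex = max(0, min(objects.count - 1, selectedIndex + scrollDirection))
    }
}

// Usage example with three user interface objects.
var navigator = GestureNavigator(objects: ["Object A", "Object B", "Object C"])
navigator.handleNavigationGesture(.firstType)    // indication moves to "Object B"
navigator.handleNavigationGesture(.secondType)   // indication moves back to "Object A"
navigator.handleNavigationGesture(.thirdType)    // enters the automatic scrolling mode
navigator.autoScrollTick()                       // indication advances to "Object B"
print(navigator.objects[navigator.selectedIndex])

In an actual wrist-worn device, the gesture classification would be derived from data detected via the optical sensor (for example, heart rate data, per claims 2, 23, and 44), and the displayed indication would be updated via the display generating component rather than by printing.
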
CN202310358775.5A 2021-05-19 2022-05-19 Navigating a user interface using hand gestures Pending CN116382557A (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US202163190783P 2021-05-19 2021-05-19
US63/190,783 2021-05-19
US202163221331P 2021-07-13 2021-07-13
US63/221,331 2021-07-13
US17/747,613 US20220374085A1 (en) 2021-05-19 2022-05-18 Navigating user interfaces using hand gestures
US17/747,613 2022-05-18
PCT/US2022/030021 WO2022246060A1 (en) 2021-05-19 2022-05-19 Navigating user interfaces using hand gestures
CN202280005772.7A CN115968464A (en) 2021-05-19 2022-05-19 Navigating a user interface using hand gestures

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202280005772.7A Division CN115968464A (en) 2021-05-19 2022-05-19 Navigating a user interface using hand gestures

Publications (1)

Publication Number Publication Date
CN116382557A true CN116382557A (en) 2023-07-04

Family

ID=84102770

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202280005772.7A Pending CN115968464A (en) 2021-05-19 2022-05-19 Navigating a user interface using hand gestures
CN202310358775.5A Pending CN116382557A (en) 2021-05-19 2022-05-19 Navigating a user interface using hand gestures

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202280005772.7A Pending CN115968464A (en) 2021-05-19 2022-05-19 Navigating a user interface using hand gestures

Country Status (4)

Country Link
US (2) US20220374085A1 (en)
EP (1) EP4189520A1 (en)
JP (1) JP2023539880A (en)
CN (2) CN115968464A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10558278B2 (en) 2017-07-11 2020-02-11 Apple Inc. Interacting with an electronic device through physical movement
US20220374085A1 (en) * 2021-05-19 2022-11-24 Apple Inc. Navigating user interfaces using hand gestures
US11954266B2 (en) * 2021-12-20 2024-04-09 Htc Corporation Method for interacting with virtual world, host, and computer readable storage medium

Family Cites Families (109)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6128010A (en) * 1997-08-05 2000-10-03 Assistive Technology, Inc. Action bins for computer user interface
US6031531A (en) * 1998-04-06 2000-02-29 International Business Machines Corporation Method and system in a graphical user interface for facilitating cursor object movement for physically challenged computer users
US20060107226A1 (en) * 2004-11-16 2006-05-18 Microsoft Corporation Sidebar autohide to desktop
US8341537B2 (en) * 2006-02-28 2012-12-25 Microsoft Corporation Indication of delayed content output in a user interface
US7852315B2 (en) * 2006-04-07 2010-12-14 Microsoft Corporation Camera and acceleration based interface for presentations
US8508472B1 (en) * 2006-11-28 2013-08-13 James W. Wieder Wearable remote control with a single control button
US7844915B2 (en) * 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
JP4855970B2 (en) * 2007-02-20 2012-01-18 株式会社エヌ・ティ・ティ・ドコモ Terminal device and program
US7873906B2 (en) * 2007-06-22 2011-01-18 International Business Machines Corporation Method and system for presenting a visual notification and delaying an action responsive to an onscreen selection
US20110210931A1 (en) * 2007-08-19 2011-09-01 Ringbow Ltd. Finger-worn device and interaction methods and communication methods
JP4670860B2 (en) * 2007-11-22 2011-04-13 ソニー株式会社 Recording / playback device
JP2009245239A (en) * 2008-03-31 2009-10-22 Sony Corp Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus
US10489747B2 (en) * 2008-10-03 2019-11-26 Leaf Group Ltd. System and methods to facilitate social media
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
US20140078318A1 (en) * 2009-05-22 2014-03-20 Motorola Mobility Llc Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures
US20110153435A1 (en) * 2009-09-17 2011-06-23 Lexos Media Inc. System and method of cursor-based content delivery
US8593576B2 (en) * 2009-10-15 2013-11-26 At&T Intellectual Property I, L.P. Gesture-based remote control
US8819576B2 (en) * 2011-05-09 2014-08-26 Blackberry Limited Systems and methods for facilitating an input to an electronic device
US9513799B2 (en) * 2011-06-05 2016-12-06 Apple Inc. Devices, methods, and graphical user interfaces for providing control of a touch-based user interface absent physical touch capabilities
US9367860B2 (en) * 2011-08-05 2016-06-14 Sean McKirdy Barcode generation and implementation method and system for processing information
US9477320B2 (en) * 2011-08-16 2016-10-25 Argotext, Inc. Input device
US9442517B2 (en) * 2011-11-30 2016-09-13 Blackberry Limited Input gestures using device movement
US20160011724A1 (en) * 2012-01-06 2016-01-14 Google Inc. Hands-Free Selection Using a Ring-Based User-Interface
US20150220149A1 (en) * 2012-02-14 2015-08-06 Google Inc. Systems and methods for a virtual grasping user interface
US9891718B2 (en) * 2015-04-22 2018-02-13 Medibotics Llc Devices for measuring finger motion and recognizing hand gestures
US20150261310A1 (en) * 2012-08-01 2015-09-17 Whirlscape, Inc. One-dimensional input system and method
CN104871117A (en) * 2012-11-07 2015-08-26 达维德·佩佩 Input device, particularly for computers or the like, and corresponding graphical user interface system
US11157436B2 (en) * 2012-11-20 2021-10-26 Samsung Electronics Company, Ltd. Services associated with wearable electronic device
US10185416B2 (en) * 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US11237719B2 (en) * 2012-11-20 2022-02-01 Samsung Electronics Company, Ltd. Controlling remote electronic device with wearable electronic device
US8994827B2 (en) * 2012-11-20 2015-03-31 Samsung Electronics Co., Ltd Wearable electronic device
US20140293755A1 (en) * 2013-03-28 2014-10-02 Meta Watch Oy Device with functional display and method for time management
DE102013007250A1 (en) * 2013-04-26 2014-10-30 Inodyn Newmedia Gmbh Procedure for gesture control
KR102135092B1 (en) * 2013-06-03 2020-07-17 엘지전자 주식회사 Operating Method for Image Display apparatus
KR101824921B1 (en) * 2013-06-11 2018-02-05 삼성전자주식회사 Method And Apparatus For Performing Communication Service Based On Gesture
US9435641B2 (en) * 2013-06-20 2016-09-06 Analog Devices, Inc. Optical angle measurement
US10884493B2 (en) * 2013-06-20 2021-01-05 Uday Parshionikar Gesture based user interfaces, apparatuses and systems using eye tracking, head tracking, hand tracking, facial expressions and other user actions
US9606721B2 (en) * 2013-07-22 2017-03-28 Lg Electronics Inc. Mobile terminal and control method thereof
KR102034587B1 (en) * 2013-08-29 2019-10-21 엘지전자 주식회사 Mobile terminal and controlling method thereof
US20150156028A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for sharing information between users of augmented reality devices
US9632655B2 (en) * 2013-12-13 2017-04-25 Amazon Technologies, Inc. No-touch cursor for item selection
JP2015127870A (en) * 2013-12-27 2015-07-09 ソニー株式会社 Controller, control method, program, and electronic apparatus
US20150199780A1 (en) * 2014-01-16 2015-07-16 Alex Beyk Methods and systems for digital agreement establishment, signing, centralized management, and a storefront using head mounted displays and networks
US20150205358A1 (en) * 2014-01-20 2015-07-23 Philip Scott Lyren Electronic Device with Touchless User Interface
US20220050425A1 (en) * 2019-07-19 2022-02-17 Medibotics Llc Wrist-Worn Device with a Flip-Up or Pop-Up Component and One or More Cameras
US20150242083A1 (en) * 2014-02-27 2015-08-27 Nokia Corporation Circumferential span region of a virtual screen
US10691332B2 (en) * 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
US9804753B2 (en) * 2014-03-20 2017-10-31 Microsoft Technology Licensing, Llc Selection using eye gaze evaluation over time
US9414115B1 (en) * 2014-03-28 2016-08-09 Aquifi, Inc. Use of natural user interface realtime feedback to customize user viewable ads presented on broadcast media
CA2851611A1 (en) * 2014-05-09 2015-11-09 Reza Chaji Touch screen accessibility and functionality enhancement
US20160328108A1 (en) * 2014-05-10 2016-11-10 Chian Chiu Li Systems And Methods for Displaying Information
US9977566B2 (en) * 2014-06-24 2018-05-22 Google Llc Computerized systems and methods for rendering an animation of an object in response to user input
KR20160015719A (en) * 2014-07-31 2016-02-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6447917B2 (en) * 2014-08-06 2019-01-09 パナソニックIpマネジメント株式会社 Wrist-mounted input device
KR20160034065A (en) * 2014-09-19 2016-03-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR102271434B1 (en) * 2014-09-26 2021-07-01 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR102188268B1 (en) * 2014-10-08 2020-12-08 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR101636460B1 (en) * 2014-11-05 2016-07-05 삼성전자주식회사 Electronic device and method for controlling the same
US9685005B2 (en) * 2015-01-02 2017-06-20 Eon Reality, Inc. Virtual lasers for interacting with augmented reality environments
US10656720B1 (en) * 2015-01-16 2020-05-19 Ultrahaptics IP Two Limited Mode switching for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
JP2016194755A (en) * 2015-03-31 2016-11-17 ソニー株式会社 Information processing device, information processing method, and program
KR101927323B1 (en) * 2015-04-03 2018-12-10 엘지전자 주식회사 Mobile terminal and method for controlling the same
WO2017039044A1 (en) * 2015-09-04 2017-03-09 엘지전자 주식회사 Watch-type mobile terminal
US11106273B2 (en) * 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
GB2551927B (en) * 2015-11-09 2020-07-01 Sky Cp Ltd Television user interface
US10459597B2 (en) * 2016-02-03 2019-10-29 Salesforce.Com, Inc. System and method to navigate 3D data on mobile and desktop
KR102481632B1 (en) * 2016-04-26 2022-12-28 삼성전자주식회사 Electronic device and method for inputting adaptive touch using display in the electronic device
US11327566B2 (en) * 2019-03-29 2022-05-10 Facebook Technologies, Llc Methods and apparatuses for low latency body state prediction based on neuromuscular data
US11331045B1 (en) * 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US20180036469A1 (en) * 2016-08-05 2018-02-08 Fresenius Medical Care Holdings, Inc. Remote User Interfaces for Dialysis Systems
US10110272B2 (en) * 2016-08-24 2018-10-23 Centurylink Intellectual Property Llc Wearable gesture control device and method
US10318034B1 (en) * 2016-09-23 2019-06-11 Apple Inc. Devices, methods, and user interfaces for interacting with user interface objects via proximity-based and contact-based inputs
CN107454947A (en) * 2016-09-26 2017-12-08 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) control method, wear-type show glasses and system
US10162422B2 (en) * 2016-10-10 2018-12-25 Deere & Company Control of machines through detection of gestures by optical and muscle sensors
KR102038120B1 (en) * 2016-12-02 2019-10-30 피손 테크놀로지, 인크. Detection and Use of Body Tissue Electrical Signals
US10528214B2 (en) * 2016-12-28 2020-01-07 Microsoft Technology Licensing, Llc Positioning mechanism for bubble as a custom tooltip
US10261595B1 (en) * 2017-05-19 2019-04-16 Facebook Technologies, Llc High resolution tracking and response to hand gestures through three dimensions
US10496162B2 (en) * 2017-07-26 2019-12-03 Microsoft Technology Licensing, Llc Controlling a computer using eyegaze and dwell
JP6981106B2 (en) * 2017-08-29 2021-12-15 株式会社リコー Image pickup device, image display system, operation method, program
WO2019059623A1 (en) * 2017-09-20 2019-03-28 Samsung Electronics Co., Ltd. Wearable device with bezel ring to enable motion in multiple degrees of freedom
WO2019079790A1 (en) * 2017-10-21 2019-04-25 Eyecam, Inc Adaptive graphic user interfacing system
US11048334B2 (en) * 2017-12-22 2021-06-29 Butterfly Network, Inc. Methods and apparatuses for identifying gestures based on ultrasound data
US20230072423A1 (en) * 2018-01-25 2023-03-09 Meta Platforms Technologies, Llc Wearable electronic devices and extended reality systems including neuromuscular sensors
US11493993B2 (en) * 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US20200310541A1 (en) * 2019-03-29 2020-10-01 Facebook Technologies, Llc Systems and methods for control schemes based on neuromuscular data
US11150730B1 (en) * 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US10970936B2 (en) * 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
JP2019185531A (en) * 2018-04-13 2019-10-24 セイコーエプソン株式会社 Transmission type head-mounted display, display control method, and computer program
WO2020033110A1 (en) * 2018-08-05 2020-02-13 Pison Technology, Inc. User interface control of responsive devices
US10802598B2 (en) * 2018-08-05 2020-10-13 Pison Technology, Inc. User interface control of responsive devices
EP3843617B1 (en) * 2018-08-31 2023-10-04 Facebook Technologies, LLC. Camera-guided interpretation of neuromuscular signals
WO2020061451A1 (en) * 2018-09-20 2020-03-26 Ctrl-Labs Corporation Neuromuscular text entry, writing and drawing in augmented reality systems
JP2022500729A (en) * 2018-09-20 2022-01-04 フェイスブック・テクノロジーズ・リミテッド・ライアビリティ・カンパニーFacebook Technologies, Llc Neuromuscular control of augmented reality system
US20200150772A1 (en) * 2018-11-09 2020-05-14 Google Llc Sensing Hand Gestures Using Optical Sensors
TWI689859B (en) * 2019-03-19 2020-04-01 國立臺灣科技大學 System for recognizing user gestures according to mechanomyogram detected from user's wrist and method thereof
US11422623B2 (en) * 2019-10-23 2022-08-23 Interlake Research, Llc Wrist worn computing device control systems and methods
US11199908B2 (en) * 2020-01-28 2021-12-14 Pison Technology, Inc. Wrist-worn device-based inputs for an operating system
US11256336B2 (en) * 2020-06-29 2022-02-22 Facebook Technologies, Llc Integration of artificial reality interaction modes
US11416917B2 (en) * 2020-08-04 2022-08-16 Contextlogic, Inc. Content carousel
US11899845B2 (en) * 2020-08-04 2024-02-13 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture and method for operating the same
US20220156353A1 (en) * 2020-11-19 2022-05-19 Jvckenwood Corporation Biometric authentication through vascular monitoring
US20220253146A1 (en) * 2021-02-09 2022-08-11 Finch Technologies Ltd. Combine Inputs from Different Devices to Control a Computing Device
US20220291753A1 (en) * 2021-03-10 2022-09-15 Finch Technologies Ltd. Spatial Gesture Recognition using Inputs from Different Devices to Control a Computing Device
US20220374085A1 (en) * 2021-05-19 2022-11-24 Apple Inc. Navigating user interfaces using hand gestures
US11347320B1 (en) * 2021-06-29 2022-05-31 Google Llc Gesture calibration for devices
WO2023059458A1 (en) * 2021-10-08 2023-04-13 Meta Platforms Technologies, Llc Apparatus, system, and method for detecting user input via hand gestures and arm movements
TWI802123B (en) * 2021-12-02 2023-05-11 財團法人工業技術研究院 Recognition method of 3d vein pattern and recognition device thereof
US20230090410A1 (en) * 2021-12-14 2023-03-23 Meta Platforms Technologies, Llc Artificial Reality Input Models
US20230270363A1 (en) * 2022-02-25 2023-08-31 Meta Platforms Technologies, Llc Smart Electrodes for Sensing Signals and Processing Signals Using Internally-Housed Signal-Processing Components at Wearable Devices and Wearable Devices Incorporating the Smart Electrodes

Also Published As

Publication number Publication date
US20220374085A1 (en) 2022-11-24
US20230195237A1 (en) 2023-06-22
CN115968464A (en) 2023-04-14
JP2023539880A (en) 2023-09-20
EP4189520A1 (en) 2023-06-07

Similar Documents

Publication Publication Date Title
JP7119179B2 (en) Context specific user interface
CN112368674B (en) Device control using gaze information
US10953307B2 (en) Swim tracking and notifications for wearable devices
CN113093970B (en) Controlling system zoom magnification using a rotatable input mechanism
US11847378B2 (en) User interfaces for audio routing
US11921998B2 (en) Editing features of an avatar
US11520416B2 (en) Interacting with an electronic device through physical movement
CN112711375A (en) Device configuration user interface
US20210312404A1 (en) Device, Method, and Graphical User Interface for Changing the Time of a Calendar Event
US11947784B2 (en) User interface for initiating a telephone call
US20220374085A1 (en) Navigating user interfaces using hand gestures
CN112437912A (en) Turbo scrolling and selection
US11800001B2 (en) User interfaces for presenting indications of incoming calls
US20230393616A1 (en) Displaying application views
US11696017B2 (en) User interface for managing audible descriptions for visual media
US11416136B2 (en) User interfaces for assigning and responding to user inputs
CN113906725B (en) Method for volume control, electronic device, and computer-readable storage medium
US20220392589A1 (en) User interfaces related to clinical data
US20230376193A1 (en) User interfaces for device controls
US20240053953A1 (en) User interfaces for audio routing
WO2022246060A1 (en) Navigating user interfaces using hand gestures
CN117441156A (en) User interface for audio routing
WO2023225012A1 (en) User interfaces for device controls
CN116324698A (en) User interface for controlling insertion of a marker

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination