WO2024058802A1 - Swivel gesture functionality on computing devices - Google Patents

Swivel gesture functionality on computing devices

Info

Publication number
WO2024058802A1
Authority
WO
WIPO (PCT)
Prior art keywords
graphical element
setting
computing device
updated
swivel gesture
Prior art date
Application number
PCT/US2022/076287
Other languages
French (fr)
Inventor
Arthur KIM
Original Assignee
Google Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google Llc filed Critical Google Llc
Priority to PCT/US2022/076287 priority Critical patent/WO2024058802A1/en
Publication of WO2024058802A1 publication Critical patent/WO2024058802A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • a computing device may include a display device that displays content from an application executing at the computing device, such as textual or graphical content.
  • a user may interact with a graphical user interface (GUI) of the application using a presence-sensitive screen (e.g., touchscreen) of the computing device.
  • this disclosure describes techniques that enable a computing device to perform an action based on movement of two or more graphical elements relative to each other. For instance, a user may perform a swivel gesture to cause one graphical element to visually rotate around another graphical element. The rotation of the one graphical element around the other graphical element may cause the computing device to perform various actions, including, but not limited to, controlling media playback, changing media playback settings, and changing application settings.
  • the techniques of this disclosure may enable more compact GUIs, which may be more suitable for smaller screens, and enable additional functionality of the computing device.
  • the techniques may reduce the number of user inputs required to perform a task. This may enable the device to operate more efficiently (e.g., in terms of processor usage, memory consumed, memory bus bandwidth utilized, etc.).
  • a method includes: outputting, by one or more processors of a computing device and for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receiving, by the one or more processors, an indication of a user input having an input point starting at the initial location; determining, by the one or more processors, whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, performing, by the one or more processors, an action associated with the swivel gesture.
  • a device includes: one or more processors; and a memory that stores instructions that, when executed by the one or more processors, cause the one or more processors to: output for display a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receive an indication of a user input having an input point starting at the initial location; determine whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, perform an action associated with the swivel gesture.
  • a non-transitory computer-readable storage medium stores instructions that, when executed, cause one or more processors of a computing device to: output for display a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receive an indication of a user input having an input point starting at the initial location; determine whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, perform an action associated with the swivel gesture.
  • an apparatus includes: means for outputting, for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; means for receiving an indication of a user input having an input point starting at the initial location; means for determining whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and means for performing an action associated with the swivel gesture in response to determining that the user input corresponds to the swivel gesture.
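The claim elements above can be illustrated with a short sketch. The code below is purely hypothetical and not part of the disclosure: the class name, the `min_sweep` threshold, and the callback parameter are assumptions introduced for illustration. It models a second element orbiting a first element (using a y-up coordinate convention) and performs an associated action when a drag sweeps the second element to a new angular position.

```python
import math

class SwivelControl:
    """Hypothetical sketch: a second graphical element orbits a first
    graphical element; rotating it to a new angular position around
    the first element triggers an associated action."""

    def __init__(self, center, radius, on_swivel):
        self.center = center        # location of the first graphical element
        self.radius = radius        # orbit radius of the second element
        self.angle = 0.0            # initial angular position, in degrees
        self.on_swivel = on_swivel  # action associated with the swivel gesture

    def element_location(self):
        """Current on-screen location of the second element (y-up)."""
        rad = math.radians(self.angle)
        return (self.center[0] + self.radius * math.cos(rad),
                self.center[1] + self.radius * math.sin(rad))

    def handle_drag(self, path, min_sweep=45.0):
        """`path` is a sequence of touch points starting at the second
        element's initial location. If the input point sweeps at least
        `min_sweep` degrees around the center, treat the input as a
        swivel gesture, move the element, and perform the action."""
        def ang(p):
            return math.degrees(math.atan2(p[1] - self.center[1],
                                           p[0] - self.center[0])) % 360
        start, end = ang(path[0]), ang(path[-1])
        sweep = abs((end - start + 180) % 360 - 180)  # shortest angular diff
        if sweep >= min_sweep:
            self.angle = end
            self.on_swivel()
            return True
        return False
```

A drag from directly right of the center down to directly below it (0 to 270 degrees, as in the example of FIG. 1) would register as a swivel gesture; a short nudge would not.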
  • FIG. 1 is a conceptual diagram illustrating an example computing device configured to perform an action in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
  • FIG. 2 is a block diagram illustrating an example computing device configured to perform an action in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
  • FIG. 3 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
  • FIG. 4 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
  • FIG. 5 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
  • FIG. 6 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
  • FIG. 7 is a flowchart illustrating example operations for performing actions in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
  • FIG. 1 is a conceptual diagram illustrating an example computing device 102 configured to output a GUI in accordance with one or more techniques of this disclosure.
  • computing device 102 is a mobile computing device (e.g., a mobile phone).
  • computing device 102 may be a tablet computer, a laptop computer, a desktop computer, a gaming system, a media player, an e-book reader, a television platform, an automobile navigation system, a wearable computing device (e.g., a computerized watch, computerized headset, computerized eyewear, a computerized glove), or any other type of mobile or non-mobile computing device.
  • Computing device 102 includes a user interface device (UID) 104
  • UID 104 of computing device 102 may function as an input device for computing device 102 and as an output device for computing device 102.
  • UID 104 may be implemented using various technologies. For instance, UID 104 may function as an input device using a presence-sensitive input screen, such as a resistive touchscreen, a surface acoustic wave touchscreen, a capacitive touchscreen, a projective capacitive touchscreen, a pressure sensitive screen, an acoustic pulse recognition touchscreen, radar, ultrawide band, or another presence-sensitive display technology.
  • UID 104 may function as an output (e.g., display) device using any one or more display devices, such as a liquid crystal display (LCD), dot matrix display, light emitting diode (LED) display, microLED, organic light-emitting diode (OLED) display, e-ink, or similar monochrome or color display capable of outputting visible information to a user of computing device 102.
  • UID 104 of computing device 102 may include a presence-sensitive display that may receive input from a user of computing device 102.
  • UID 104 may receive indications of the input by detecting one or more gestures from a user of computing device 102 (e.g., the user touching or pointing to one or more locations of UID 104 with a finger or a stylus pen).
  • UID 104 may present output to a user, for instance at a presence-sensitive display.
  • UID 104 may present the output as a graphical user interface, which may be associated with functionality provided by computing device 102.
  • UID 104 may present various user interfaces of components of a computing platform, operating system, one or more applications 120, or services executing at or accessible by computing device 102 (e.g., an electronic message application, an Internet browser application, a mobile operating system, etc.). A user may interact with a respective user interface to cause computing device 102 to perform operations relating to an action.
  • Computing device 102 includes UI module 106, which manages user interactions with UID 104 and other components of computing device 102.
  • UI module 106 may act as an intermediary between various components of computing device 102 to make determinations based on user input detected by UID 104 and generate output at UID 104 in response to the user input.
  • UI module 106 may receive instructions from applications 120, a service, a platform, or another module of computing device 102 to cause UID 104 to output a user interface, such as GUIs.
  • UI module 106 may manage inputs received by computing device 102 as a user views and interacts with the user interface presented at UID 104.
  • GUI 108 may be an initial GUI
  • GUI 108B may be an updated GUI in accordance with techniques of this disclosure.
  • a GUI displayed by UID 104 may include one or more graphical elements.
  • GUIs 108A-108B may each include a first graphical element 110 and a second graphical element 112.
  • a user may interact with graphical elements to cause computing device 102 to perform corresponding actions.
  • each graphical element may occupy valuable space.
  • it may be advantageous to design graphical elements to be more space efficient while being easy to use. This may allow producers of computing devices to design computing devices with smaller form factors, reducing material consumption and in turn costs.
  • computing device 102 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112. For instance, a user may perform a swivel gesture by rotating second graphical element 112 along a path 114 (which may or may not be visible) around first graphical element 110. As shown in FIG. 1, the space occupied by first graphical element 110 and second graphical element 112 may be relatively compact (because, e.g., first graphical element 110 and second graphical element 112 are close together).
  • First graphical element 110 and second graphical element 112 being close together may enable easy user interaction. For example, instead of providing multiple inputs to open a menu and select an action included in the menu, a user may provide fewer inputs by performing a swivel gesture to control computing device 102. Performing the swivel gesture may be more efficient (e.g., in terms of time and number of user inputs) and reliable. In this way, GUIs 108 in accordance with techniques of this disclosure may improve space-efficiency while being easy to use.
  • UI module 106 of computing device 102 may output (e.g., by UID 104) GUI 108A of application 120 (e.g., a main screen of a camera application) that includes first graphical element 110 and second graphical element 112.
  • Computing device 102 may perform an action when a user interacts with first graphical element 110 and/or second graphical element 112 and may perform different actions depending on the particular user input.
  • computing device 102 may perform an action in response to a tap input detected at first graphical element 110 (e.g., opening a photo library of a camera application) but not in response to a swivel gesture (e.g., a rotating of an input point along a curved path) detected proximate to, centered at, or otherwise determined to be associated with first graphical element 110.
  • computing device 102 may perform an action (e.g., changing an access setting to a photo library of a camera application) in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 but not in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112.
  • UI module 106 may receive (e.g., by UID 104) a user input, such as a user input having an input point starting at an initial location of second graphical element 112.
  • UI module 106 may determine an input point of the user input based on the part of UID 104 with which a user initially made contact.
  • the initial location of second graphical element 112 may be at a first angular position relative to first graphical element 110.
  • second graphical element 112 is at a first angular position of 0 degrees relative to first graphical element 110 (e.g., as indicated by second graphical element 112 being directly right of first graphical element 110).
  • the user input may further include rotating (or otherwise moving) the input point along path 114 from the initial location of second graphical element 112 to an updated location of second graphical element 112 relative to first graphical element 110.
  • the updated location may be at a second angular position relative to first graphical element 110.
  • second graphical element 112 is at a second angular position of 270 degrees relative to first graphical element 110 (e.g., as indicated by second graphical element 112 being directly below first graphical element 110).
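The angular positions described above place 0 degrees directly right of first graphical element 110 and 270 degrees directly below it. Since screen coordinates typically grow downward in y, a small helper might convert touch coordinates into that convention. This is an illustrative sketch only; the function name and coordinate assumptions are not taken from the disclosure.

```python
import math

def screen_angular_position(center, point):
    """Angle of `point` around `center` in degrees, in [0, 360),
    for screen coordinates where y grows downward: 0 degrees is
    directly right of center and 270 degrees is directly below,
    matching the example angular positions above."""
    dx = point[0] - center[0]
    dy = point[1] - center[1]
    # Negate dy so that "below on screen" maps to 270 degrees.
    return math.degrees(math.atan2(-dy, dx)) % 360
```

With this convention, a point directly right of the center reads 0 degrees and a point directly below it reads 270 degrees, as in the example.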
  • a swivel gesture module 118 may determine whether a user input corresponds to a swivel gesture.
  • Swivel gesture module 118 may determine that a user input is a swivel gesture if an input point is rotated (or otherwise moved) along a curved path. Consequently, swivel gesture module 118 may determine that a user input that includes rotating an input point from an initial location (e.g., a first angular position of 0 degrees) of second graphical element 112 to an updated location (e.g., a second angular position of 270 degrees) of second graphical element 112 along path 114 (which is curved) is a swivel gesture.
  • swivel gesture module 118 may cause computing device 102 to perform an action. For instance, swivel gesture module 118 may instruct applications 120 to update a first graphical element setting in response to determining that a user input detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 is a swivel gesture.
  • the first graphical element setting may be an unlocked setting of applications 120 that permits access to content of applications 120.
  • In the example of FIG. 1, the first graphical element setting enables a first action associated with first graphical element 110, such as opening a photo library of a camera application in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 110.
  • applications 120 may update the first graphical element setting to an updated graphical element setting.
  • the updated graphical element setting may be a locked setting of applications 120 that denies access to content of applications 120.
  • the updated graphical element setting enables a second action associated with first graphical element 110, such as not opening the photo library of the camera application, outputting a notification that access is disabled, etc.
  • first graphical element 110 may have an appearance indicative of the first graphical element setting.
  • first graphical element 110 may have an initial first graphical element appearance indicative of a first graphical element setting.
  • first graphical element 110 may have an initial first graphical element appearance of a photo (e.g., the most recent photo) in the photo library, as shown in GUI 108A.
  • UI module 106 may output, for display by UID 104, first graphical element 110 having an updated first graphical element appearance.
  • the updated first graphical element appearance may be indicative of the updated first graphical element setting.
  • first graphical element 110 may have an updated first graphical element appearance of a black background, as shown in GUI 108B.
  • second graphical element 112 may visually indicate the first graphical element setting (which may help a user when deciding whether to apply a swivel gesture to second graphical element 112 to update the first graphical element setting).
  • second graphical element 112 may have an appearance indicative of a setting of computing device 102 (e.g., applications 120 of computing device 102). For example, if a first graphical element setting allows access to a photo library, second graphical element 112 may have an initial second graphical element appearance of an open lock, as shown in GUI 108A. Responsive to swivel gesture module 118 updating the setting to an updated setting, UI module 106 may output, for display, second graphical element 112 having an updated second graphical element appearance. The updated second graphical element appearance may be indicative of the updated setting. For example, if an updated first graphical element setting denies access to the photo library, second graphical element 112 may have an updated second graphical element appearance of a closed lock, as shown in GUI 108B. In this way, second graphical element 112 may visually indicate the setting of computing device 102.
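The relationship between the access setting and the appearances of the two graphical elements can be sketched as a small state object. This is an illustrative assumption, not the disclosed implementation; the class name and appearance labels are invented for the sketch.

```python
class PhotoLibraryLock:
    """Hypothetical model of the lock setting described above: the
    swivel gesture toggles an access setting, and both graphical
    elements' appearances mirror the current setting."""

    def __init__(self):
        self.unlocked = True  # initial first graphical element setting

    def on_swivel(self):
        """A detected swivel gesture toggles the access setting."""
        self.unlocked = not self.unlocked

    def first_element_appearance(self):
        # Most recent photo when unlocked; black background when locked.
        return "most_recent_photo" if self.unlocked else "black_background"

    def second_element_appearance(self):
        # Open lock when access is permitted; closed lock when denied.
        return "open_lock" if self.unlocked else "closed_lock"
```

Performing the swivel gesture twice would return both elements to their initial appearances, matching the revert behavior described below.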
  • UI module 106 may output, for display (e.g., by UID 104), second graphical element 112 at the initial location of second graphical element 112 after completion of the swivel gesture.
  • a user may perform a swivel gesture along path 114 (shown in FIG. 1A) to change a setting of computing device 102 (e.g., from the unlocked setting to the locked setting and vice versa).
  • UI module 106 may output, for display (e.g., by UID 104), second graphical element 112 at the updated location of second graphical element 112 in response to determining that the user input corresponds to the swivel gesture.
  • An example of the updated location may be location 115 shown in FIG. 1B.
  • a user may perform a swivel gesture along path 114 from the initial location of second graphical element 112 to the updated location of second graphical element 112 to change a setting of computing device 102 (e.g., from the unlocked setting to the locked setting).
  • a user may perform a swivel gesture along path 114 from the updated location (e.g., location 115) of second graphical element 112 to the initial location of second graphical element 112 to revert the change to the setting (e.g., from the locked setting to the unlocked setting).
  • a swivel gesture includes rotating (or otherwise moving) an input point along a curved path
  • simply moving an input point along a substantially non-curved path does not correspond to a swivel gesture.
  • UID 104 may receive a user input having an input point starting at an initial location of second graphical element 112.
  • the user input may further include moving the input point in a line from the initial location of second graphical element 112 to an updated location of second graphical element 112.
  • Computing device 102 may determine that the user input is not a swivel gesture due to the input point not moving from the initial location to the updated location along a curved path (e.g., path 114) and thus not perform an action.
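One possible way to distinguish a curved path from a substantially non-curved one, offered here only as an illustrative sketch (the deviation-ratio threshold and function name are assumptions, not taken from the disclosure), is to compare each sampled point's perpendicular distance from the straight chord between the path's endpoints:

```python
import math

def follows_curved_path(path, min_deviation_ratio=0.2):
    """Return True if the path bends away from the straight chord
    between its endpoints by a meaningful fraction of the chord
    length; a substantially straight drag returns False."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    if chord == 0:
        return False
    # Perpendicular distance of each sampled point from the chord line.
    max_dev = max(abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / chord
                  for x, y in path)
    return max_dev / chord >= min_deviation_ratio
```

Under this sketch, an arc-like drag would qualify as curved while a straight drag between the same two locations would not, so the straight drag would not be treated as a swivel gesture.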
  • the swivel gesture is primarily described above as being a single-finger gesture, it should be understood that the swivel gesture may be a multi-finger gesture.
  • UID 104 may detect a user input.
  • UI module 106 may determine the characteristics of the user input, such as whether multiple input points (e.g., one for each finger) are detected by UID 104.
  • UI module 106 may provide the characteristics of the user input to swivel gesture module 118.
  • Swivel gesture module 118 may determine whether the user input is a swivel gesture based on the user input characteristics. For example, if swivel gesture module 118 determines that a first input point is substantially fixed in location (e.g., stationary), swivel gesture module 118 may determine that the first input point is an anchor point.
  • if swivel gesture module 118 determines that a second input point rotated along a curved path from an initial location (e.g., a first angular position of 0 degrees) relative to the first input point to an updated location (e.g., a second angular position of 270 degrees) relative to the first input point, swivel gesture module 118 may determine that the user input is a swivel gesture.
  • Swivel gesture module 118 may determine which, if any, GUI element (e.g., first graphical element 110, second graphical element 112, etc.) is located proximate to the anchor point of the swivel gesture.
  • Swivel gesture module 118 may provide information about the swivel gesture input and the proximate GUI element to an operating system of computing device 102 or one of applications 120. The operating system or one of applications 120 may then determine what action to take based on the characteristics of the swivel gesture and the GUI element proximate to the anchor point and cause computing device 102 to perform that action.
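The two-finger variant described above might be sketched as follows. The anchor tolerance and minimum sweep are invented thresholds, and the function name is hypothetical; the sketch simply treats the nearly stationary input point as the anchor and measures how far the other point sweeps around it.

```python
import math

def classify_two_finger_swivel(path_a, path_b,
                               anchor_tolerance=5.0, min_sweep=45.0):
    """If one input point stays substantially fixed (the anchor point)
    while the other rotates around it by at least `min_sweep` degrees,
    return (anchor_point, sweep_degrees); otherwise return None."""
    def span(path):
        xs = [p[0] for p in path]
        ys = [p[1] for p in path]
        return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

    # The input point that barely moves is treated as the anchor.
    if span(path_a) <= anchor_tolerance:
        anchor, moving = path_a[0], path_b
    elif span(path_b) <= anchor_tolerance:
        anchor, moving = path_b[0], path_a
    else:
        return None

    def ang(p):
        return math.degrees(math.atan2(p[1] - anchor[1],
                                       p[0] - anchor[0])) % 360

    start, end = ang(moving[0]), ang(moving[-1])
    sweep = abs((end - start + 180) % 360 - 180)  # shortest angular diff
    return (anchor, sweep) if sweep >= min_sweep else None
```

A gesture module could then look up which GUI element lies nearest the returned anchor point and hand both pieces of information to the operating system or application.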
  • FIG . 2 is a block diagram illustrating an example computing device 202, in accordance with one or more aspects of the present disclosure.
  • Computing device 202 may be substantially similar to computing device 102 shown in FIG. 1.
  • Computing device 202 is only one particular example, and many other examples of computing device 202 may be used in other instances.
  • computing device 202 may be a wearable computing device, a mobile computing device (e.g., a smartphone), or any other computing device.
  • Computing device 202 may include a subset of the components shown in FIG. 2 or may include additional components not shown in FIG. 2.
  • computing device 202 includes user interface device 204 (“UID 204”), one or more processors 240, one or more input devices 242, one or more communication units 244, one or more output devices 246, and one or more storage devices 248.
  • Storage devices 248 of computing device 202 also include operating system 254 and UI module 206.
  • Communication channels 250 may interconnect each of the components 240, 242, 244, 246, 248, and 204 for inter-component communications (physically, communicatively, and/or operatively).
  • communication channels 250 may include a system bus, a network connection, an inter-process communication data structure, or any other method for communicating data.
  • One or more input devices 242 of computing device 202 may be configured to receive input. Examples of input are tactile, audio, and video input. Input devices 242 of computing device 202, in one example, include a presence-sensitive display, touch-sensitive screen, mouse, keyboard, voice responsive system, video camera, microphone, or any other type of device for detecting input from a human or machine.
  • One or more output devices 246 of computing device 202 may be configured to generate output. Examples of output are tactile, audio, and video output.
  • Output devices 246 of computing device 202, in one example, include a presence-sensitive display, sound card, video graphics adapter card, speaker, cathode ray tube (CRT) monitor, liquid crystal display (LCD), or any other type of device for generating output to a human or machine.
  • One or more communication units 244 of computing device 202 may be configured to communicate with external devices via one or more wired and/or wireless networks by transmitting and/or receiving network signals on the one or more networks.
  • Examples of communication unit 244 include a network interface card (e.g., such as an Ethernet card), an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that can send and/or receive information.
  • Other examples of communication units 244 may include short wave radios, cellular data radios, wireless network radios, as well as universal serial bus (USB) controllers.
  • One or more storage devices 248 within computing device 202 may store information for processing during operation of computing device 202.
  • storage device 248 is a temporary memory, meaning that a primary purpose of storage device 248 is not long-term storage.
  • Storage devices 248 on computing device 202 may be configured for short-term storage of information as volatile memory and therefore not retain stored contents if powered off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.
  • Storage devices 248 may be configured to store larger amounts of information than volatile memory.
  • Storage devices 248 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles. Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
  • Storage devices 248 may store program instructions and/or information (e.g., data) associated with UI module 206, swivel gesture module 218, one or more applications 220, and operating system 254.
  • processors 240 may implement functionality and/or execute instructions within computing device 202.
  • processors 240 on computing device 202 may receive and execute instructions stored by storage devices 248 that execute the functionality of UI module 206 and swivel gesture module 218. These instructions executed by processors 240 may cause UI module 206 of computing device 202 to provide GUIs as described herein.
  • UID 204 of computing device 202 may include functionality of input devices 242 and/or output devices 246.
  • UID 204 may be or may include a presence-sensitive input device.
  • a presence-sensitive input device may detect an object at and/or near a screen.
  • a presence-sensitive input device may detect an object, such as a finger or stylus, that is within 2 inches or less of the screen.
  • the presence-sensitive input device may determine a location (e.g., an (x,y) coordinate) of a screen at which the object was detected.
  • a presence-sensitive input device may detect an object six inches or less from the screen, and other ranges are also possible. The presence-sensitive input device may determine the location of the screen selected by a user’s finger using capacitive, inductive, and/or optical recognition techniques.
  • a presence-sensitive input device also provides output to a user using tactile, audio, or video stimuli as described with respect to output devices 246, e.g., at a display.
  • UID 204 may present a user interface.
  • While illustrated as an internal component of computing device 202, UID 204 also represents an external component that shares a data path with computing device 202 for transmitting and/or receiving input and output.
  • UID 204 represents a built-in component of computing device 202 located within and physically connected to the external packaging of computing device 202 (e.g., a screen on a mobile phone).
  • UID 204 represents an external component of computing device 202 located outside and physically separated from the packaging of computing device 202 (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer).
  • UI module 206 may include all functionality of UI module 106 of computing device 102 of FIG. 1.
  • UI module 206 may perform similar operations as UI module 106 for managing a user interface (e.g., GUIs 108) that computing device 202 provides at UID 204.
  • UI module 206 of computing device 202 may output GUIs 108 associated with one or more of applications 220, as shown in FIG. 1.
  • Swivel gesture module 218 may cause computing device 202 to perform an action in response to a swivel gesture as described above with respect to FIG. 1.
  • Example actions that computing device 202 may perform in response to a swivel gesture are described below.
  • FIG. 3 is a conceptual diagram showing example GUIs 308A-308B (collectively, “GUIs 308”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 308 are described within the context of computing device 202 of FIG. 2.
  • GUIs 308 may be associated with one or more applications 220, such as a media playback application.
  • application 220 is a video playback application.
  • GUI 308A may show a state and/or setting of application 220 prior to receipt of a swivel gesture.
  • GUI 308B, which may be an updated GUI, may show a state and/or setting of application 220 following receipt of a swivel gesture.
  • GUIs 308 may include media content 316.
  • media content 316 is video footage of three vehicles recorded by a drone ascending in the air.
  • Computing device 202 may perform an action when a user interacts with first graphical element 310 and/or second graphical element 312. For example, application 220 may pause or play media content 316 in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 310.
  • the appearance of first graphical element 310 may indicate which action application 220 performs in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 310.
  • the appearance of first graphical element 310 in GUI 308A may indicate that tapping first graphical element 310 will cause application 220 to play media content 316.
  • the appearance of first graphical element 310 in GUI 308B may indicate that tapping first graphical element 310 will cause application 220 to pause media content 316.
  • Computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 312.
  • swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316.
  • swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316 by an amount proportional to a length of the swivel gesture.
  • a swivel gesture that rotates an input point from an initial location of 0 degrees to an updated location of 135 degrees (i.e., an angular displacement equivalent to 50% of the angular range) may adjust the playback position to a time that is 50% of the total length of the media content.
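  • The proportional mapping in the preceding example can be sketched as follows. This is an illustrative sketch, not the implementation disclosed in this application; the 270-degree usable angular range, the function name, and its parameters are assumptions chosen so that a 135-degree displacement maps to 50% of the media length.

```python
def playback_position(initial_deg, updated_deg, media_length_s,
                      angular_range_deg=270.0):
    """Map a swivel gesture's angular displacement to a playback time.

    Assumes the gesture's usable range spans `angular_range_deg`
    (270 degrees here, so a 135-degree swivel lands at 50%).
    """
    # Angular displacement, normalized to [0, 360)
    displacement = (updated_deg - initial_deg) % 360.0
    # Fraction of the usable range, clamped at 100%
    fraction = min(displacement / angular_range_deg, 1.0)
    return fraction * media_length_s
```

  • For a 10-minute (600-second) video, a swivel from 0 degrees to 135 degrees under these assumptions would seek to the 5-minute mark.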
  • UI module 206 may output, for display by UID 204, second graphical element 312 at the updated location of second graphical element 312 in response to determining that the user input corresponds to the swivel gesture.
  • swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316 by an amount proportional to a duration of the swivel gesture. For instance, application 220 may progressively advance (e.g., fast forward, skip, etc.) the playback position through the media content in response to commencement of a swivel gesture and until the release of the swivel gesture (e.g., when a user lifts the user’s finger from UID 204).
  • UI module 206 may output, for display by UID 204, second graphical element 312 at the initial location of second graphical element 312 in response to determining that the user input corresponds to the swivel gesture.
  • first graphical element 310, second graphical element 312, and path 314 may be relatively compact.
  • first graphical element 310, second graphical element 312, and path 314 (if visible) may obscure less of media content 316.
  • compactness of first graphical element 310, second graphical element 312, and path 314 may enable a user to interact with first graphical element 310 and second graphical element 312 with one hand, which may be convenient in various circumstances.
  • FIG. 4 is a conceptual diagram showing example GUIs 408A-408B (collectively, “GUIs 408”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 408 are described within the context of computing device 202 of FIG. 2.
  • GUIs 408 may be associated with one or more applications 220, such as a media playback application.
  • application 220 is a music streaming application.
  • GUI 408A may show a state and/or setting of the media playback application prior to receipt of a swivel gesture.
  • GUI 408B, which may be an updated GUI, may show a state and/or setting of the media playback application following receipt of a swivel gesture.
  • GUIs 408 may include media content 416.
  • media content 416 may be a song entitled “Let’s Swivel, U & I” by the artist CPG.
  • Computing device 202 may perform an action when a user interacts with first graphical element 410 and/or second graphical element 412. For example, application 220 may pause or play media content 416 in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 410. Additionally, computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 412. For example, swivel gesture module 218 may instruct application 220 to update a setting of application 220 to an updated setting of application 220.
  • application 220 may update a first volume setting (e.g., a volume setting of 1) of a media playback application to a second volume setting (e.g., a volume setting of 5).
  • UI module 206 may output, for display by UID 204, second graphical element 412 at the updated location of second graphical element 412 in response to determining that the user input corresponds to the swivel gesture.
  • UI module 206 may update an appearance of second graphical element 412 in response to and based on the swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 412.
  • FIG. 5 is a conceptual diagram showing example GUIs 508A-508B (collectively, “GUIs 508”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 508 are described within the context of computing device 202 of FIG. 2.
  • UI module 206 may output, by UID 204, GUIs 508 that include a first graphical element 510 and a second graphical element 512.
  • GUIs 508 may be associated with one or more applications 220, such as a camera application, as shown in FIG. 5.
  • GUI 508A may show a state and/or setting of computing device 202 prior to receipt of a swivel gesture.
  • GUI 508B, which may be an updated GUI, may show a state and/or setting of computing device 202 following receipt of a swivel gesture.
  • Computing device 202 may perform an action when a user interacts with first graphical element 510 and/or second graphical element 512.
  • application 220 may capture a photograph (e.g., a photograph of a person long jumping) in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 510.
  • computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 512.
  • swivel gesture module 218 may instruct application 220 to update a setting of application 220 to an updated setting of application 220.
  • application 220 may update a first shutter speed setting (e.g., a shutter speed of 1/4) of a camera application to a second shutter speed setting (e.g., a shutter speed of 1/1000).
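  • A mapping from a swivel gesture to a discrete camera setting such as shutter speed could be sketched as below. This is a hypothetical helper, not the disclosed implementation; the list of intermediate stops and the function name are illustrative (the description above names only the 1/4 and 1/1000 endpoints).

```python
# Discrete shutter-speed stops, slow to fast (illustrative values; only
# the 1/4 and 1/1000 endpoints appear in the description above).
SHUTTER_SPEEDS = [1/4, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000]

def shutter_for_fraction(fraction):
    """Pick a shutter-speed stop for a swivel fraction in [0.0, 1.0]."""
    # Scale the fraction into a list index, clamping at the last stop.
    index = min(int(fraction * len(SHUTTER_SPEEDS)), len(SHUTTER_SPEEDS) - 1)
    return SHUTTER_SPEEDS[index]
```

  • Under these assumptions, a full swivel selects the fastest stop (1/1000) and no rotation leaves the slowest (1/4).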
  • UI module 206 may output, for display by UID 204, second graphical element 512 at the updated location of second graphical element 512 in response to determining that the user input corresponds to the swivel gesture. As further shown in FIG. 5, UI module 206 may update an appearance of second graphical element 512 in response to and based on the swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 512.
  • FIG. 6 is a conceptual diagram showing example GUIs 608A-608B (collectively, “GUIs 608”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 608 are described within the context of computing device 202 of FIG. 2.
  • GUI 608A may be associated with a home screen of computing device 202.
  • GUI 608B may be associated with one or more applications 220, such as a camera application, as shown in FIG. 6.
  • GUI 608A may show a state and/or setting of computing device 202 prior to receipt of a swivel gesture.
  • GUI 608B, which may be an updated GUI, may show a state and/or setting of computing device 202 following receipt of a swivel gesture.
  • Computing device 202 may perform an action when a user interacts with first graphical element 610 and/or second graphical element 612. For example, with respect to GUI 608A, computing device 202 may authenticate a user in response to the user pressing a finger on first graphical element 610. With respect to GUI 608B, application 220 may capture a photograph in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 610.
  • computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 612. For instance, with respect to GUI 608A, swivel gesture module 218 may instruct computing device 202 to launch a specific application (e.g., based on a computing device setting). As an example, computing device 202 may launch application 220, a camera application.
  • a computing device may perform a variety of actions in response to a swivel gesture.
  • the examples described herein are provided for purposes of explanation and are not intended to be limiting.
  • Other example actions that a computing device may perform in response to a swivel gesture are contemplated by this disclosure.
  • FIG. 7 is a flowchart illustrating example operations for performing actions in response to a swivel gesture, in accordance with one or more techniques of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1.
  • Computing device 102 may output a GUI of a screen of an application (702).
  • UI module 106 may cause UID 104 to display GUI 108A (e.g., a main screen of a camera application).
  • Computing device 102 may monitor for receipt of an indication of a user input (704). For instance, UID 104 may generate (e.g., via a touch or presence-sensitive screen) user input data. UI module 106 may process the user input data. Responsive to not receiving an indication of a user input (“No” branch of 704), UI module 106 may continue outputting, for display by UID 104, a GUI of a screen of application 120 (702).
  • swivel gesture module 118 may determine whether the user input is a swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 (706). Swivel gesture module 118 may determine that the user input is a swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 in response to the user input data indicating the user input includes rotating of an input point from an initial location of second graphical element 112 to an updated location of second graphical element 112 (e.g., along path 114).
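  • One way the determination at (706) could be made is sketched below. This is an illustrative heuristic under stated assumptions, not the disclosed classifier: the function names, the rotation threshold, and the radial-drift tolerance are all hypothetical, and the touch trail is assumed to be a list of (x, y) screen coordinates.

```python
import math

def angular_position(center, point):
    """Angle of `point` around `center`, in degrees within [0, 360)."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def is_swivel_gesture(center, touch_points,
                      min_rotation_deg=30.0, max_radius_drift=0.25):
    """Classify a trail of touch coordinates as a swivel gesture.

    Illustrative heuristic: the input point must rotate around `center`
    (e.g., the location of a first graphical element) by at least
    `min_rotation_deg` while staying at a roughly constant radius.
    """
    if len(touch_points) < 2:
        return False
    radii = [math.dist(center, p) for p in touch_points]
    # Large radial drift suggests a drag toward/away from the center,
    # not a rotation around it.
    if max(radii) - min(radii) > max_radius_drift * max(radii):
        return False
    start = angular_position(center, touch_points[0])
    end = angular_position(center, touch_points[-1])
    rotation = (end - start) % 360.0
    # Accept rotation in either direction.
    return min(rotation, 360.0 - rotation) >= min_rotation_deg
```

  • A trail that arcs a quarter turn around the center at a steady radius would classify as a swivel, while a straight drag away from the center would not.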
  • swivel gesture module 118 may instruct computing device 102 to perform an action associated with the swivel gesture (708). For example, swivel gesture module 118 may instruct application 120 to change a setting of application 120 to deny access to a photo library. Additionally, UI module 106 may cause UID 104 to output, for display, an updated GUI including second graphical element 112 (710).
  • UI module 106 may cause UID 104 to output, for display, GUI 108B with second graphical element 112 at the updated location of second graphical element 112 (e.g., the second angular position relative to first graphical element 110) in response to determining that the user input corresponds to the swivel gesture.
  • UI module 106 may cause UID 104 to output, for display, GUI 108B with second graphical element 112 at the initial location of second graphical element 112 (e.g., the first angular position relative to first graphical element 110) in response to determining that the user input corresponds to the swivel gesture.
  • swivel gesture module 118 may not instruct computing device 102 to perform any action associated with a swivel gesture. For example, UID 104 may continue displaying GUI 108 A (702).
  • Example 1 A method includes outputting, by one or more processors of a computing device and for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receiving, by the one or more processors, an indication of a user input having an input point starting at the initial location; determining, by the one or more processors, whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, performing, by the one or more processors, an action associated with the swivel gesture.
  • Example 2 The method of example 1, further including, responsive to determining that the user input corresponds to the swivel gesture, outputting, by the one or more processors and for display, the second graphical element at the updated location.
  • Example 3 The method of example 1, further including, responsive to performing the action associated with the swivel gesture, outputting, by the one or more processors and for display, the second graphical element located at the initial location.
  • Example 4 The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes updating a setting of the computing device to an updated setting of the computing device.
  • Example 5 The method of example 4, wherein the setting is a first volume setting of a media playback application, and wherein the updated setting is a second volume setting of the media playback application, wherein the second volume setting is different from the first volume setting.
  • Example 6 The method of example 4, wherein the setting is a first shutter speed setting of a camera application, and wherein the updated setting is a second shutter speed setting of the camera application, wherein the second shutter speed setting is different from the first shutter speed setting.
  • Example 7 The method of example 4, wherein the setting is an unlocked setting of an application, and wherein the updated setting is a locked setting of the application, wherein the unlocked setting permits access to content of the application, and wherein the locked setting denies access to the content of the application.
  • Example 8 The method of example 4, wherein the setting is a first graphical element setting, and wherein the updated setting is a second graphical element setting.
  • Example 9 The method of example 8, wherein the first graphical element has an initial first graphical element appearance indicative of the first graphical element setting, further including, responsive to updating the first graphical element setting to the updated first graphical element setting, outputting, by the one or more processors and for display, the first graphical element having an updated first graphical element appearance indicative of the updated first graphical element setting.
  • Example 10 The method of any of examples 1 to 9, wherein a first graphical element setting enables a first action associated with the first graphical element, and wherein an updated first graphical element setting enables a second action associated with the first graphical element.
  • Example 11 The method of any of examples 1 to 10, wherein the second graphical element has an initial second graphical element appearance indicative of a setting of the computing device, further including, responsive to updating the setting to an updated setting, outputting, by the one or more processors and for display, the second graphical element having an updated second graphical element appearance indicative of an updated setting.
  • Example 12 The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content.
  • Example 13 The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content by an amount proportional to at least one of a length of the swivel gesture or a duration of the swivel gesture.
  • Example 14 A computing device including means for performing any of the methods of examples 1-13.
  • Example 15 A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a computing device, cause the one or more processors to perform any of the methods of examples 1-13.
  • Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
  • computer-readable media generally may correspond to (1) tangible computer- readable storage media, which is non-transitory or (2) a communication medium such as a signal or carrier wave.
  • Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
  • a computer program product may include a computer-readable medium.
  • such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • processors may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
  • the functionality described herein may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • the techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set).
  • Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.


Abstract

A computing device outputs for display a first graphical element and a second graphical element. The second graphical element is located at an initial location relative to the first graphical element, where the initial location is at a first angular position relative to the first graphical element. The computing device receives an indication of a user input having an input point starting at the initial location. The computing device determines whether the user input corresponds to a swivel gesture. Responsive to determining that the user input corresponds to the swivel gesture, the computing device performs an action associated with the swivel gesture.

Description

SWIVEL GESTURE FUNCTIONALITY ON COMPUTING DEVICES
BACKGROUND
[0001] A computing device may include a display device that displays content from an application executing at the computing device, such as textual or graphical content. In some examples, a user may interact with a graphical user interface (GUI) of the application using a presence-sensitive screen (e.g., touchscreen) of the computing device. User interactions with one or more graphical elements included in the GUI may cause the computing device to perform various actions.
SUMMARY
[0002] In general, this disclosure describes techniques that enable a computing device to perform an action based on movement of two or more graphical elements relative to each other. For instance, a user may perform a swivel gesture to cause one graphical element to visually rotate around another graphical element. The rotation of the one graphical element around the other graphical element may cause the computing device to perform various actions, including, but not limited to, controlling media playback, changing media playback settings, and changing application settings. As a result, the techniques of this disclosure may enable more compact GUIs, which may be more suitable for smaller screens, and enable additional functionality of the computing device. Furthermore, the techniques may reduce the number of user inputs required to perform a task. This may enable the device to operate more efficiently (e.g., in terms of processor usage, memory consumed, memory bus bandwidth utilized, etc.).
[0003] In some examples, a method includes: outputting, by one or more processors of a computing device and for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receiving, by the one or more processors, an indication of a user input having an input point starting at the initial location; determining, by the one or more processors, whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, performing, by the one or more processors, an action associated with the swivel gesture.
[0004] In some examples, a device includes: one or more processors; and a memory that stores instructions that, when executed by the one or more processors, cause the one or more processors to: output for display a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receive an indication of a user input having an input point starting at the initial location; determine whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, perform an action associated with the swivel gesture.
[0005] In some examples, a non-transitory computer-readable storage medium stores instructions that, when executed, cause one or more processors of a computing device to: output for display a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receive an indication of a user input having an input point starting at the initial location; determine whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, perform an action associated with the swivel gesture.
[0006] In some examples, an apparatus includes: means for outputting, for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; means for receiving an indication of a user input having an input point starting at the initial location; means for determining whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and means for performing an action associated with the swivel gesture in response to determining that the user input corresponds to the swivel gesture.
[0007] The details of one or more examples of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0008] FIG. 1 is a conceptual diagram illustrating an example computing device configured to perform an action in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
[0009] FIG.2 is a block diagram illustrating an example computing device configured to perform an action in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
[0010] FIG. 3 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
[0011] FIG. 4 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
[0012] FIG. 5 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
[0013] FIG. 6 is a conceptual diagram illustrating example graphical user interfaces, in accordance with one or more aspects of the present disclosure.
[0014] FIG. 7 is a flowchart illustrating example operations for performing actions in response to a swivel gesture, in accordance with one or more aspects of the present disclosure.
DETAILED DESCRIPTION
[0015] FIG. 1 is a conceptual diagram illustrating an example computing device 102 configured to output a GUI in accordance with one or more techniques of this disclosure. As shown in FIG. 1, computing device 102 is a mobile computing device (e.g., a mobile phone).
However, in other examples, computing device 102 may be a tablet computer, a laptop computer, a desktop computer, a gaming system, a media player, an e-book reader, a television platform, an automobile navigation system, a wearable computing device (e.g., a computerized watch, computerized headset, computerized eyewear, a computerized glove), or any other type of mobile or non-mobile computing device.
[0016] Computing device 102 includes a user interface device (UID) 104. UID 104 of computing device 102 may function as an input device for computing device 102 and as an output device for computing device 102. UID 104 may be implemented using various technologies. For instance, UID 104 may function as an input device using a presence-sensitive input screen, such as a resistive touchscreen, a surface acoustic wave touchscreen, a capacitive touchscreen, a projective capacitive touchscreen, a pressure-sensitive screen, an acoustic pulse recognition touchscreen, radar, ultrawide band, or another presence-sensitive display technology. UID 104 may function as an output (e.g., display) device using any one or more display devices, such as a liquid crystal display (LCD), dot matrix display, light emitting diode (LED) display, microLED, organic light-emitting diode (OLED) display, e-ink, or similar monochrome or color display capable of outputting visible information to a user of computing device 102.
[0017] UID 104 of computing device 102 may include a presence-sensitive display that may receive input from a user of computing device 102. UID 104 may receive indications of the input by detecting one or more gestures from a user of computing device 102 (e.g., the user touching or pointing to one or more locations of UID 104 with a finger or a stylus pen). UID 104 may present output to a user, for instance at a presence-sensitive display. UID 104 may present the output as a graphical user interface, which may be associated with functionality provided by computing device 102. For example, UID 104 may present various user interfaces of components of a computing platform, operating system, one or more applications 120, or services executing at or accessible by computing device 102 (e.g., an electronic message application, an Internet browser application, a mobile operating system, etc.). A user may interact with a respective user interface to cause computing device 102 to perform operations relating to an action.
[0018] Computing device 102 includes UI module 106, which manages user interactions with UID 104 and other components of computing device 102. In other words, UI module 106 may act as an intermediary between various components of computing device 102 to make determinations based on user input detected by UID 104 and generate output at UID 104 in response to the user input. UI module 106 may receive instructions from applications 120, services, platforms, or other modules of computing device 102 to cause UID 104 to output a user interface, such as GUIs. UI module 106 may manage inputs received by computing device 102 as a user views and interacts with the user interface presented at UID 104. UI module 106 may update the user interface in response to receiving additional instructions from applications 120, services, platforms, or other modules of computing device 102 that are processing the user input. As such, UI module 106 may cause UID 104 to display GUIs, such as GUIs 108A-108B (collectively “GUIs 108”). GUI 108A may be an initial GUI, and GUI 108B may be an updated GUI in accordance with techniques of this disclosure.
[0019] A GUI displayed by UID 104 may include one or more graphical elements. For instance, as shown in FIG. 1, GUIs 108A-108B may each include a first graphical element 110 and a second graphical element 112. In general, a user may interact with graphical elements to cause computing device 102 to perform corresponding actions. However, given the limited size of UID 104, each graphical element may occupy valuable space. Thus, it may be advantageous to design graphical elements to be more space efficient while being easy to use. This may allow producers of computing devices to design computing devices with smaller form factors, reducing material consumption and in turn costs.
[0020] In accordance with techniques of this disclosure, computing device 102 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112. For instance, a user may perform a swivel gesture by rotating second graphical element 112 along a path 114 (which may or may not be visible) around first graphical element 110. As shown in FIG. 1, the space occupied by first graphical element 110 and second graphical element 112 may be relatively compact (because, e.g., first graphical element 110 and second graphical element 112 are close together).
[0021] First graphical element 110 and second graphical element 112 being close together may enable easy user interaction. For example, instead of providing multiple inputs to open a menu and select an action included in the menu, a user may provide fewer inputs by performing a swivel gesture to control computing device 102. Performing the swivel gesture may be more efficient (e.g., in terms of time and number of user inputs) and reliable. In this way, GUIs 108 in accordance with techniques of this disclosure may improve space-efficiency while being easy to use.
[0022] UI module 106 of computing device 102 may output (e.g., by UID 104) GUI 108A of application 120 (e.g., a main screen of a camera application) that includes first graphical element 110 and second graphical element 112. Computing device 102 may perform an action when a user interacts with first graphical element 110 and/or second graphical element 112 and may perform different actions depending on the particular user input. For example, computing device 102 may perform an action in response to a tap input detected at first graphical element 110 (e.g., opening a photo library of a camera application) but not in response to a swivel gesture (e.g., a rotating of an input point along a curved path) detected proximate to, centered at, or otherwise determined to be associated with first graphical element 110. On the other hand, computing device 102 may perform an action (e.g., changing an access setting to a photo library of a camera application) in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 but not in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112.
[0023] UI module 106 may receive (e.g., by UID 104) a user input, such as a user input having an input point starting at an initial location of second graphical element 112. In some examples, UI module 106 may determine an input point of the user input based on the part of UID 104 with which a user initially made contact. The initial location of second graphical element 112 may be at a first angular position relative to first graphical element 110. In the example of FIG. 1, second graphical element 112 is at a first angular position of 0 degrees relative to first graphical element 110 (e.g., as indicated by second graphical element 112 being directly right of first graphical element 110).
[0024] The user input may further include rotating (or otherwise moving) the input point along path 114 from the initial location of second graphical element 112 to an updated location of second graphical element 112 relative to first graphical element 110. The updated location may be at a second angular position relative to first graphical element 110. In the example of FIG. 1, second graphical element 112 is at a second angular position of 270 degrees relative to first graphical element 110 (e.g., as indicated by second graphical element 112 being directly below first graphical element 110).
[0025] A swivel gesture module 118 may determine whether a user input corresponds to a swivel gesture. Swivel gesture module 118 may determine that a user input is a swivel gesture if an input point is rotated (or otherwise moved) along a curved path. Consequently, swivel gesture module 118 may determine that a user input that includes rotating an input point from an initial location (e.g., a first angular position of 0 degrees) of second graphical element 112 to an updated location (e.g., a second angular position of 270 degrees) of second graphical element 112 along path 114 (which is curved) is a swivel gesture.
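By way of illustration only, the curved-path determination described in this paragraph can be sketched as follows. This is a minimal Python sketch, not part of the disclosure: the function names, the sampled-path representation, and the thresholds are assumptions chosen for illustration.

```python
import math

def angular_position(anchor, point):
    """Angle of `point` around `anchor` in degrees in [0, 360),
    with 0 directly right of the anchor and 270 directly below it
    (screen y coordinates grow downward)."""
    dx = point[0] - anchor[0]
    dy = anchor[1] - point[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def is_swivel(anchor, path_points, min_sweep_deg=30.0, radius_tol=0.25):
    """Heuristic swivel check: the input point stays at a roughly constant
    radius from the anchor (i.e., follows a curved path around it) while
    its angular position changes by at least `min_sweep_deg` degrees."""
    radii = [math.dist(anchor, p) for p in path_points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    # A straight drag toward or away from the anchor varies the radius
    # substantially, so it is rejected here.
    if any(abs(r - mean_r) > radius_tol * mean_r for r in radii):
        return False
    start = angular_position(anchor, path_points[0])
    end = angular_position(anchor, path_points[-1])
    sweep = abs((end - start + 180) % 360 - 180)
    return sweep >= min_sweep_deg
```

Under these assumptions, a quarter-circle path from directly right of the anchor (0 degrees) to directly below it (270 degrees) passes the check, while a straight drag away from the anchor does not.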
[0026] Responsive to determining that a user input detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 is a swivel gesture, swivel gesture module 118 may cause computing device 102 to perform an action. For instance, swivel gesture module 118 may instruct applications 120 to update a first graphical element setting in response to determining that a user input detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 is a swivel gesture. In some examples, the first graphical element setting may be an unlocked setting of applications 120 that permits access to content of applications 120. In the example of FIG. 1, the first graphical element setting enables a first action associated with first graphical element 110, such as opening a photo library of a camera application in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 110.
[0027] Responsive to instructions from swivel gesture module 118, applications 120 may update the first graphical element setting to an updated graphical element setting. In some examples, the updated graphical element setting may be a locked setting of applications 120 that denies access to content of applications 120. In the example of FIG. 1, the updated graphical element setting enables a second action associated with first graphical element 110, such as not opening the photo library of the camera application, outputting a notification that access is disabled, etc.
[0028] In some examples, first graphical element 110 may have an appearance indicative of the first graphical element setting. For instance, first graphical element 110 may have an initial first graphical element appearance indicative of a first graphical element setting. For example, if the first graphical element setting allows access to a photo library, first graphical element 110 may have an initial first graphical element appearance of a photo (e.g., the most recent photo) in the photo library, as shown in GUI 108A. Responsive to swivel gesture module 118 updating (e.g., via applications 120) the first graphical element setting to an updated first graphical element setting (e.g., because of a swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112), UI module 106 may output, for display by UID 104, first graphical element 110 having an updated first graphical element appearance. The updated first graphical element appearance may be indicative of the updated first graphical element setting. For example, if the updated first graphical element setting denies access to the photo library, first graphical element 110 may have an updated first graphical element appearance of a black background, as shown in GUI 108B. In this way, first graphical element 110 may visually indicate the first graphical element setting (which may help a user when deciding whether to apply a swivel gesture to second graphical element 112 to update the first graphical element setting).
[0029] In some examples, second graphical element 112 may have an appearance indicative of a setting of computing device 102 (e.g., applications 120 of computing device 102). For example, if a first graphical element setting allows access to a photo library, second graphical element 112 may have an initial second graphical element appearance of an open lock, as shown in GUI 108A. Responsive to swivel gesture module 118 updating the setting to an updated setting, UI module 106 may output, for display, second graphical element 112 having an updated second graphical element appearance. The updated second graphical element appearance may be indicative of the updated setting. For example, if an updated first graphical element setting denies access to the photo library, second graphical element 112 may have an updated second graphical element appearance of a closed lock, as shown in GUI 108B. In this way, second graphical element 112 may visually indicate the setting of computing device 102.
[0030] As shown in FIG. 1B, UI module 106 may output, for display (e.g., by UID 104), second graphical element 112 at the initial location of second graphical element 112 after completion of the swivel gesture. In such examples, a user may perform a swivel gesture along path 114 (shown in FIG. 1A) to change a setting of computing device 102 (e.g., from the unlocked setting to the locked setting and vice versa). Alternatively, UI module 106 may output, for display (e.g., by UID 104), second graphical element 112 at the updated location of second graphical element 112 in response to determining that the user input corresponds to the swivel gesture. An example of the updated location may be location 115 shown in FIG. 1B. In such examples, a user may perform a swivel gesture along path 114 from the initial location of second graphical element 112 to the updated location of second graphical element 112 to change a setting of computing device 102 (e.g., from the unlocked setting to the locked setting). In the same examples, a user may perform a swivel gesture along path 114 from the updated location (e.g., location 115) of second graphical element 112 to the initial location of second graphical element 112 to revert the change to the setting (e.g., from the locked setting to the unlocked setting).
[0031] Because a swivel gesture includes rotating (or otherwise moving) an input point along a curved path, simply moving an input point along a substantially non-curved path does not correspond to a swivel gesture. For instance, UID 104 may receive a user input having an input point starting at an initial location of second graphical element 112. The user input may further include moving the input point in a line from the initial location of second graphical element 112 to an updated location of second graphical element 112. Computing device 102 may determine that the user input is not a swivel gesture due to the input point not moving from the initial location to the updated location along a curved path (e.g., path 114) and thus not perform an action.
[0032] Although the swivel gesture is primarily described above as being a single-finger gesture, it should be understood that the swivel gesture may be a multi-finger gesture. In such examples, UID 104 may detect a user input. UI module 106 may determine the characteristics of the user input, such as whether multiple input points (e.g., one for each finger) are detected by UID 104. UI module 106 may provide the characteristics of the user input to swivel gesture module 118.
[0033] Swivel gesture module 118 may determine whether the user input is a swivel gesture based on the user input characteristics. For example, if swivel gesture module 118 determines that a first input point is substantially fixed in location (e.g., stationary), swivel gesture module 118 may determine that the first input point is an anchor point. If swivel gesture module 118 determines that a second input point rotated along a curved path from an initial location (e.g., a first angular position of 0 degrees) relative to the first input point to an updated location (e.g., a second angular position of 270 degrees) relative to the first input point, swivel gesture module 118 may determine that the user input is a swivel gesture.
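The two-input-point determination above can likewise be sketched, purely for illustration; the function name, tolerances, and sampled-path representation are hypothetical and not part of the disclosure:

```python
import math

def swivel_anchor(path_a, path_b, stationary_tol=10.0, min_sweep_deg=30.0):
    """Return the anchor location of a two-finger swivel gesture, or None.

    One input point must remain substantially fixed (the anchor point)
    while the other rotates around it by at least `min_sweep_deg` degrees."""
    def angle(anchor, p):
        return math.degrees(math.atan2(anchor[1] - p[1], p[0] - anchor[0])) % 360

    for anchor_path, moving_path in ((path_a, path_b), (path_b, path_a)):
        anchor = anchor_path[0]
        # First input point: substantially fixed in location.
        if not all(math.dist(anchor, p) <= stationary_tol for p in anchor_path):
            continue
        # Second input point: rotated about the first by a sufficient sweep.
        start = angle(anchor, moving_path[0])
        end = angle(anchor, moving_path[-1])
        sweep = abs((end - start + 180) % 360 - 180)
        if sweep >= min_sweep_deg:
            return anchor
    return None
```

Trying both orderings of the two paths means either finger may serve as the anchor, which matches the description of determining which input point is substantially fixed.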
[0034] Swivel gesture module 118 may determine which, if any, GUI element (e.g., first graphical element 110, second graphical element 112, etc.) is located proximate to the anchor point of the swivel gesture. Swivel gesture module 118 may provide information about the swivel gesture input and the proximate GUI element to an operating system of computing device 102 or one of applications 120. The operating system or one of applications 120 may then determine what action to take based on the characteristics of the swivel gesture and the GUI element proximate to the anchor point and cause computing device 102 to perform that action.
[0035] FIG. 2 is a block diagram illustrating an example computing device 202, in accordance with one or more aspects of the present disclosure. Computing device 202 may be substantially similar to computing device 102 shown in FIG. 1. Computing device 202 is only one particular example, and many other examples of computing device 202 may be used in other instances. In the example of FIG. 2, computing device 202 may be a wearable computing device, a mobile computing device (e.g., a smartphone), or any other computing device. Computing device 202 may include a subset of the components included in example computing device 202 or may include additional components not shown in FIG. 2.
[0036] As shown in the example of FIG. 2, computing device 202 includes user interface device 204 (“UID 204”), one or more processors 240, one or more input devices 242, one or more communication units 244, one or more output devices 246, and one or more storage devices 248. Storage devices 248 of computing device 202 also include operating system 254 and UI module 206. [0037] Communication channels 250 may interconnect each of the components 240, 242, 244, 246, 248, and 204 for inter-component communications (physically, communicatively, and/or operatively). In some examples, communication channels 250 may include a system bus, a network connection, an inter-process communication data structure, or any other method for communicating data.
[0038] One or more input devices 242 of computing device 202 may be configured to receive input. Examples of input are tactile, audio, and video input. Input devices 242 of computing device 202, in one example, include a presence-sensitive display, touch-sensitive screen, mouse, keyboard, voice responsive system, video camera, microphone, or any other type of device for detecting input from a human or machine.
[0039] One or more output devices 246 of computing device 202 may be configured to generate output. Examples of output are tactile, audio, and video output. Output devices 246 of computing device 202, in one example, includes a presence-sensitive display, sound card, video graphics adapter card, speaker, cathode ray tube (CRT) monitor, liquid crystal display (LCD), or any other type of device for generating output to a human or machine.
[0040] One or more communication units 244 of computing device 202 may be configured to communicate with external devices via one or more wired and/or wireless networks by transmitting and/or receiving network signals on the one or more networks. Examples of communication unit 244 include a network interface card (e.g., such as an Ethernet card), an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that can send and/or receive information. Other examples of communication units 244 may include short wave radios, cellular data radios, wireless network radios, as well as universal serial bus (USB) controllers.
[0041] One or more storage devices 248 within computing device 202 may store information for processing during operation of computing device 202. In some examples, storage device 248 is a temporary memory, meaning that a primary purpose of storage device 248 is not long-term storage. Storage devices 248 on computing device 202 may be configured for short-term storage of information as volatile memory and therefore not retain stored contents if powered off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.
[0042] Storage devices 248, in some examples, also include one or more computer-readable storage media. Storage devices 248 may be configured to store larger amounts of information than volatile memory. Storage devices 248 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles. Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. Storage devices 248 may store program instructions and/or information (e.g., data) associated with UI module 206, swivel gesture module 218, one or more applications 220, and operating system 254.
[0043] One or more processors 240 may implement functionality and/or execute instructions within computing device 202. For example, processors 240 on computing device 202 may receive and execute instructions stored by storage devices 248 that execute the functionality of UI module 206 and swivel gesture module 218. These instructions executed by processors 240 may cause UI module 206 of computing device 202 to provide GUIs as described herein.
[0044] In some examples, UID 204 of computing device 202 may include functionality of input devices 242 and/or output devices 246. In the example of FIG. 2, UID 204 may be or may include a presence-sensitive input device. In some examples, a presence-sensitive input device may detect an object at and/or near a screen. As one example range, a presence-sensitive input device may detect an object, such as a finger or stylus, that is within 2 inches or less of the screen. The presence-sensitive input device may determine a location (e.g., an (x,y) coordinate) of a screen at which the object was detected. In another example range, a presence-sensitive input device may detect an object six inches or less from the screen, and other ranges are also possible. The presence-sensitive input device may determine the location of the screen selected by a user’s finger using capacitive, inductive, and/or optical recognition techniques. In some examples, a presence-sensitive input device also provides output to a user using tactile, audio, or video stimuli as described with respect to output devices 246, e.g., at a display. In the example of FIG. 2, UID 204 may present a user interface.
[0045] While illustrated as an internal component of computing device 202, UID 204 also represents an external component that shares a data path with computing device 202 for transmitting and/or receiving input and output. For instance, in one example, UID 204 represents a built-in component of computing device 202 located within and physically connected to the external packaging of computing device 202 (e.g., a screen on a mobile phone). In another example, UID 204 represents an external component of computing device 202 located outside and physically separated from the packaging of computing device 202 (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer).
[0046] UI module 206 may include all functionality of UI module 106 of computing device 102 of FIG. 1 and may perform similar operations as UI module 106 for managing a user interface (e.g., GUIs 108) that computing device 202 provides at UID 204. For example, UI module 206 of computing device 202 may output GUIs 108 associated with one or more of applications 220, as shown in FIG. 1. Swivel gesture module 218 may cause computing device 202 to perform an action in response to a swivel gesture as described above with respect to FIG. 1. Example actions that computing device 202 may perform in response to a swivel gesture are described below.
[0047] FIG. 3 is a conceptual diagram showing example GUIs 308A-308B (collectively, “GUIs 308”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 308 are described within the context of computing device 202 of FIG. 2.
[0048] UI module 206 may output, by UID 204, GUIs 308 that include a first graphical element 310 and a second graphical element 312. GUIs 308 may be associated with one or more applications 220, such as a media playback application. In the example of FIG. 3, application 220 is a video playback application. GUI 308A may show a state and/or setting of application 220 prior to receipt of a swivel gesture. GUI 308B, which may be an updated GUI, may show a state and/or setting of application 220 following receipt of a swivel gesture. GUIs 308 may include media content 316. In the example of FIG. 3, media content 316 is video footage of three vehicles recorded by a drone ascending in the air.
[0049] Computing device 202 may perform an action when a user interacts with first graphical element 310 and/or second graphical element 312. For example, application 220 may pause or play media content 316 in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 310. The appearance of first graphical element 310 may indicate which action application 220 performs in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 310. For instance, the appearance of first graphical element 310 in GUI 308A may indicate that tapping first graphical element 310 will cause application 220 to play media content 316. The appearance of first graphical element 310 in GUI 308B may indicate that tapping first graphical element 310 will cause application 220 to pause media content 316.
[0050] Computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 312. For example, swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316. In some examples, swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316 by an amount proportional to a length of the swivel gesture. For instance, if the full angular range of a path 314 is from 0 degrees to 270 degrees, then a swivel gesture that rotates an input point from an initial location of 0 degrees to an updated location of 135 degrees (i.e., an angular displacement equivalent to 50% of the angular range) may adjust the playback position to a time that is 50% of the total length of the media content. As shown in FIG. 3, UI module 206 may output, for display by UID 204, second graphical element 312 at the updated location of second graphical element 312 in response to determining that the user input corresponds to the swivel gesture.
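The proportional mapping in this example (an angular displacement of 50% of the range mapping to 50% of the media length) can be sketched as follows, purely for illustration; the function name and the clamping behavior are assumptions, not part of the disclosure:

```python
def playback_position(sweep_deg, media_length_s, full_range_deg=270.0):
    """Map an angular displacement along the path to a playback position,
    proportionally: sweeping the full angular range scrubs through the
    full length of the media content. Out-of-range sweeps are clamped."""
    fraction = max(0.0, min(sweep_deg / full_range_deg, 1.0))
    return fraction * media_length_s
```

For a 600-second video and the 0-to-270-degree path of FIG. 3, a 135-degree swivel yields a playback position of 300 seconds, i.e., 50% of the total length.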
[0051] In other examples, swivel gesture module 218 may instruct application 220 to adjust a playback position of media content 316 by an amount proportional to a duration of the swivel gesture. For instance, application 220 may progressively advance (e.g., fast forward, skip, etc.) the playback position through the media content in response to commencement of a swivel gesture and until the release of the swivel gesture (e.g., when a user lifts the user’s finger from UID 204). Although not shown in FIG. 3, UI module 206 may output, for display by UID 204, second graphical element 312 at the initial location of second graphical element 312 in response to determining that the user input corresponds to the swivel gesture.
[0052] The space occupied by first graphical element 310, second graphical element 312, and path 314 may be relatively compact. As a result, first graphical element 310, second graphical element 312, and path 314 (if visible) may obscure less of media content 316. Moreover, the compactness of first graphical element 310, second graphical element 312, and path 314 may enable a user to interact with first graphical element 310 and second graphical element 312 with one hand, which may be convenient in various circumstances.
[0053] FIG. 4 is a conceptual diagram showing example GUIs 408A-408B (collectively, “GUIs 408”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 408 are described within the context of computing device 202 of FIG. 2.
[0054] UI module 206 may output, by UID 204, GUIs 408 that include a first graphical element 410 and a second graphical element 412. GUIs 408 may be associated with one or more applications 220, such as a media playback application. In the example of FIG. 4, application 220 is a music streaming application. GUI 408A may show a state and/or setting of the media playback application prior to receipt of a swivel gesture. GUI 408B, which may be an updated GUI, may show a state and/or setting of the media playback application following receipt of a swivel gesture. GUIs 408 may include media content 416. In the example of FIG. 4, media content 416 may be a song entitled “Let’s Swivel, U & I” by the artist CPG.
[0055] Computing device 202 may perform an action when a user interacts with first graphical element 410 and/or second graphical element 412. For example, application 220 may pause or play media content 416 in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 410. Additionally, computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 412. For example, swivel gesture module 218 may instruct application 220 to update a setting of application 220 to an updated setting of application 220. As an example, application 220 may update a first volume setting (e.g., a volume setting of 1) of a media playback application to a second volume setting (e.g., a volume setting of 5).
[0056] As shown in FIG. 4, UI module 206 may output, for display by UID 204, second graphical element 412 at the updated location of second graphical element 412 in response to determining that the user input corresponds to the swivel gesture. As further shown in FIG. 4, UI module 206 may update an appearance of second graphical element 412 in response to and based on the swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 412.
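A discrete setting update of this kind (e.g., selecting among volume settings by swivel) can be sketched by dividing the angular range of the path into equal segments. This is an illustrative sketch only; the function name and equal-segment scheme are assumptions, not part of the disclosure:

```python
def setting_from_sweep(settings, sweep_deg, full_range_deg=270.0):
    """Select a discrete setting value from the angular displacement of a
    swivel gesture by dividing the path into equal angular segments."""
    fraction = max(0.0, min(sweep_deg / full_range_deg, 1.0))
    # Clamp the index so a full sweep selects the last setting.
    index = min(int(fraction * len(settings)), len(settings) - 1)
    return settings[index]
```

Under these assumptions, with volume settings 1 through 10, no sweep leaves the first setting selected and a sweep through the full angular range selects the last.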
[0057] FIG. 5 is a conceptual diagram showing example GUIs 508A-508B (collectively, “GUIs 508”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 508 are described within the context of computing device 202 of FIG. 2.
[0058] UI module 206 may output, by UID 204, GUIs 508 that include a first graphical element 510 and a second graphical element 512. GUIs 508 may be associated with one or more applications 220, such as a camera application, as shown in FIG. 5. GUI 508A may show a state and/or setting of computing device 202 prior to receipt of a swivel gesture. GUI 508B, which may be an updated GUI, may show a state and/or setting of computing device 202 following receipt of a swivel gesture.
[0059] Computing device 202 may perform an action when a user interacts with first graphical element 510 and/or second graphical element 512. For example, application 220 may capture a photograph (e.g., a photograph of a person long jumping) in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 510. Additionally, computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 512. For instance, swivel gesture module 218 may instruct application 220 to update a setting of application 220 to an updated setting of application 220. As an example, application 220 may update a first shutter speed setting (e.g., a shutter speed of 1/4) of a camera application to a second shutter speed setting (e.g., a shutter speed of 1/1000).
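Because shutter speeds come in discrete stops (1/4, 1/8, ... 1/1000), a sweep-to-setting mapping for this example would step through a table rather than a continuous range. The sketch below is illustrative only; the stop table, the 15-degrees-per-stop granularity, and the clamping behavior are assumptions, not taken from the disclosure.

```python
# Standard full-stop shutter speeds, as fractions of a second (illustrative).
SHUTTER_STOPS = [1/4, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000]


def updated_shutter_stop(current_index, sweep_deg, degrees_per_stop=15.0):
    """Advance through the discrete stops by the swivel gesture's sweep angle.

    Positive sweeps move toward faster shutter speeds; the index is clamped
    to the ends of the table.
    """
    steps = round(sweep_deg / degrees_per_stop)
    new_index = max(0, min(len(SHUTTER_STOPS) - 1, current_index + steps))
    return new_index, SHUTTER_STOPS[new_index]
```

With these assumed values, a 120-degree sweep from the 1/4 stop lands on 1/1000, matching the example update in the paragraph above.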
[0060] As shown in FIG. 5, UI module 206 may output, for display by UID 204, second graphical element 512 at the updated location of second graphical element 512 in response to determining that the user input corresponds to the swivel gesture. As further shown in FIG. 5, UI module 206 may update an appearance of second graphical element 512 in response to and based on the swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 512.
[0061] FIG. 6 is a conceptual diagram showing example GUIs 608A-608B (collectively, “GUIs 608”), in accordance with techniques of this disclosure. For purposes of illustration only, GUIs 608 are described within the context of computing device 202 of FIG. 2.
[0062] UI module 206 may output, by UID 204, GUIs 608 that include a first graphical element 610 and a second graphical element 612. GUI 608A may be associated with a home screen of computing device 202. GUI 608B may be associated with one or more applications 220, such as a camera application, as shown in FIG. 6. GUI 608A may show a state and/or setting of computing device 202 prior to receipt of a swivel gesture. GUI 608B, which may be an updated GUI, may show a state and/or setting of computing device 202 following receipt of a swivel gesture.
[0063] Computing device 202 may perform an action when a user interacts with first graphical element 610 and/or second graphical element 612. For example, with respect to GUI 608A, computing device 202 may authenticate a user in response to the user pressing a finger on first graphical element 610. With respect to GUI 608B, application 220 may capture a photograph in response to a tap input being detected proximate to, centered at, or otherwise determined to be associated with first graphical element 610.
[0064] Additionally, computing device 202 may perform an action in response to a swivel gesture being detected proximate to, centered at, or otherwise determined to be associated with second graphical element 612. For instance, with respect to GUI 608A, swivel gesture module 218 may instruct computing device 202 to launch a specific application (e.g., based on a computing device setting). As an example, computing device 202 may launch application 220, a camera application.
[0065] As demonstrated by at least the above examples, a computing device may perform a variety of actions in response to a swivel gesture. However, it should be understood that the examples described herein are provided for purposes of explanation and are not intended to be limiting. Other example actions that a computing device may perform in response to a swivel gesture are contemplated by this disclosure.
[0066] FIG. 7 is a flowchart illustrating example operations performing actions in response to a swivel gesture, in accordance with one or more techniques of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1.
[0067] Computing device 102 may output a GUI of a screen of an application (702). For instance, UI module 106 may cause UID 104 to display GUI 108A (e.g., a main screen of a camera application).
[0068] Computing device 102 may monitor for receipt of an indication of a user input (704). For instance, UID 104 may generate (e.g., via a touch- or presence-sensitive screen) user input data. UI module 106 may process the user input data. Responsive to not receiving an indication of a user input (“No” branch of 704), UI module 106 may continue outputting, for display by UID 104, a GUI of a screen of application 120 (702).
[0069] Responsive to receiving an indication of a user input (“Yes” branch of 704), swivel gesture module 118 may determine whether the user input is a swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 (706). Swivel gesture module 118 may determine that the user input is a swivel gesture detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 in response to the user input data indicating a user input includes rotating of an input point from an initial location of second graphical element 112 to an updated location of second graphical element 112 (e.g., along path 114).
[0070] Responsive to determining that a user input detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 is a swivel gesture (“Yes” branch of 706), swivel gesture module 118 may instruct computing device 102 to perform an action associated with the swivel gesture (708). For example, swivel gesture module 118 may instruct application 120 to change a setting of application 120 to deny access to a photo library. Additionally, UI module 106 may cause UID 104 to output, for display, an updated GUI including second graphical element 112 (710). For instance, UI module 106 may cause UID 104 to output, for display, GUI 108B with second graphical element 112 at the updated location of second graphical element 112 (e.g., the second angular position relative to first graphical element 110) in response to determining that the user input corresponds to the swivel gesture. Alternatively, UI module 106 may cause UID 104 to output, for display, GUI 108B with second graphical element 112 at the initial location of second graphical element 112 (e.g., the first angular position relative to first graphical element 110) in response to determining that the user input corresponds to the swivel gesture.
[0071] Responsive to determining that a user input detected proximate to, centered at, or otherwise determined to be associated with second graphical element 112 is not a swivel gesture (“No” branch of 706), swivel gesture module 118 may not instruct computing device 102 to perform any action associated with a swivel gesture. For example, UID 104 may continue displaying GUI 108A (702).
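The decision at block 706 — is the touch path a rotation of the input point about the first graphical element — can be sketched as a simple heuristic: the path should stay at roughly constant distance from the pivot while sweeping through a meaningful angle. This sketch is illustrative only; the function name, the sweep threshold, and the radius tolerance are assumptions, not the disclosed classifier.

```python
import math


def is_swivel_gesture(points, center, min_sweep_deg=20.0, radius_tolerance=0.25):
    """Heuristically classify a touch path as a swivel gesture around `center`.

    `points` is the sequence of (x, y) touch samples, starting at the second
    graphical element's initial location; `center` stands in for the first
    graphical element. The path must keep a roughly constant radius from the
    pivot and sweep at least `min_sweep_deg` degrees.
    """
    if len(points) < 2:
        return False
    radii = [math.hypot(x - center[0], y - center[1]) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0 or any(abs(r - mean_r) / mean_r > radius_tolerance for r in radii):
        return False  # path drifted toward/away from the pivot: not a swivel
    start = math.degrees(math.atan2(points[0][1] - center[1],
                                    points[0][0] - center[0]))
    end = math.degrees(math.atan2(points[-1][1] - center[1],
                                  points[-1][0] - center[0]))
    # Normalize the sweep to (-180, 180] before comparing to the threshold.
    sweep = (end - start + 180.0) % 360.0 - 180.0
    return abs(sweep) >= min_sweep_deg
```

A path that arcs a quarter turn around the pivot at constant radius would pass the check (the “Yes” branch of 706), while a straight drag away from the pivot would fail it (the “No” branch).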
[0072] The following numbered examples may illustrate one or more aspects of this disclosure:
[0073] Example 1: A method includes outputting, by one or more processors of a computing device and for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receiving, by the one or more processors, an indication of a user input having an input point starting at the initial location; determining, by the one or more processors, whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, performing, by the one or more processors, an action associated with the swivel gesture.
[0074] Example 2: The method of example 1, further including, responsive to determining that the user input corresponds to the swivel gesture, outputting, by the one or more processors and for display, the second graphical element at the updated location.
[0075] Example 3: The method of example 1, further including, responsive to performing the action associated with the swivel gesture, outputting, by the one or more processors and for display, the second graphical element located at the initial location.
[0076] Example 4: The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes updating a setting of the computing device to an updated setting of the computing device.
[0077] Example 5: The method of example 4, wherein the setting is a first volume setting of a media playback application, and wherein the updated setting is a second volume setting of the media playback application, wherein the second volume setting is different from the first volume setting.
[0078] Example 6: The method of example 4, wherein the setting is a first shutter speed setting of a camera application, and wherein the updated setting is a second shutter speed setting of the camera application, wherein the second shutter speed setting is different from the first shutter speed setting.
[0079] Example 7: The method of example 4, wherein the setting is an unlocked setting of an application, and wherein the updated setting is a locked setting of the application, wherein the unlocked setting permits access to content of the application, and wherein the locked setting denies access to the content of the application.
[0080] Example 8: The method of example 4, wherein the setting is a first graphical element setting, and wherein the updated setting is a second graphical element setting.
[0081] Example 9: The method of example 8, wherein the first graphical element has an initial first graphical element appearance indicative of the first graphical element setting, further including, responsive to updating the first graphical element setting to the updated first graphical element setting, outputting, by the one or more processors and for display, the first graphical element having an updated first graphical element appearance indicative of the updated first graphical element setting.
[0082] Example 10: The method of any of examples 1 to 9, wherein a first graphical element setting enables a first action associated with the first graphical element, and wherein an updated first graphical element setting enables a second action associated with the first graphical element.
[0083] Example 11: The method of any of examples 1 to 10, wherein the second graphical element has an initial second graphical element appearance indicative of a setting of the computing device, further including, responsive to updating the setting to an updated setting, outputting, by the one or more processors and for display, the second graphical element having an updated second graphical element appearance indicative of an updated setting.
[0084] Example 12: The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content.
[0085] Example 13: The method of any of examples 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content by an amount proportional to at least one of a length of the swivel gesture or a duration of the swivel gesture.
[0086] Example 14: A computing device including means for performing any of the methods of examples 1-13.
[0087] Example 15: A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a computing device, cause the one or more processors to perform any of the methods of examples 1-13.
[0088] In one or more examples, the actions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the actions may be stored on or transmitted over, as one or more instructions or code, a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory, or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
[0089] By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
[0090] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
In addition, in some aspects, the functionality described herein may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0091] The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
[0092] Various examples of the disclosure have been described. Any combination of the described systems, operations, or actions is contemplated. These and other examples are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method comprising: outputting, by one or more processors of a computing device and for display, a first graphical element and a second graphical element, wherein the second graphical element is located at an initial location relative to the first graphical element, and wherein the initial location is at a first angular position relative to the first graphical element; receiving, by the one or more processors, an indication of a user input having an input point starting at the initial location; determining, by the one or more processors, whether the user input corresponds to a swivel gesture, wherein the swivel gesture includes a rotating of the input point from the initial location to an updated location of the second graphical element relative to the first graphical element, and wherein the updated location is at a second angular position relative to the first graphical element; and responsive to determining that the user input corresponds to the swivel gesture, performing, by the one or more processors, an action associated with the swivel gesture.
2. The method of claim 1, further comprising, responsive to determining that the user input corresponds to the swivel gesture, outputting, by the one or more processors and for display, the second graphical element at the updated location.
3. The method of claim 1, further comprising, responsive to performing the action associated with the swivel gesture, outputting, by the one or more processors and for display, the second graphical element located at the initial location.
4. The method of any of claims 1 to 3, wherein the action associated with the swivel gesture includes updating a setting of the computing device to an updated setting of the computing device.
5. The method of claim 4, wherein the setting is a first volume setting of a media playback application, and wherein the updated setting is a second volume setting of the media playback application, wherein the second volume setting is different from the first volume setting.
6. The method of claim 4, wherein the setting is a first shutter speed setting of a camera application, and wherein the updated setting is a second shutter speed setting of the camera application, wherein the second shutter speed setting is different from the first shutter speed setting.
7. The method of claim 4, wherein the setting is an unlocked setting of an application, and wherein the updated setting is a locked setting of the application, wherein the unlocked setting permits access to content of the application, and wherein the locked setting denies access to the content of the application.
8. The method of claim 4, wherein the setting is a first graphical element setting, and wherein the updated setting is a second graphical element setting.
9. The method of claim 8, wherein the first graphical element has an initial first graphical element appearance indicative of the first graphical element setting, further comprising, responsive to updating the first graphical element setting to the updated first graphical element setting, outputting, by the one or more processors and for display, the first graphical element having an updated first graphical element appearance indicative of the updated first graphical element setting.
10. The method of any of claims 1 to 9, wherein a first graphical element setting enables a first action associated with the first graphical element, and wherein an updated first graphical element setting enables a second action associated with the first graphical element.
11. The method of any of claims 1 to 10, wherein the second graphical element has an initial second graphical element appearance indicative of a setting of the computing device, further comprising, responsive to updating the setting to an updated setting, outputting, by the one or more processors and for display, the second graphical element having an updated second graphical element appearance indicative of an updated setting.
12. The method of any of claims 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content.
13. The method of any of claims 1 to 3, wherein the action associated with the swivel gesture includes adjusting a playback position of media content by an amount proportional to at least one of a length of the swivel gesture or a duration of the swivel gesture.
14. A computing device comprising means for performing any of the methods of claims 1-13.
15. A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a computing device, cause the one or more processors to perform any of the methods of claims 1-13.
PCT/US2022/076287 2022-09-12 2022-09-12 Swivel gesture functionality on computing devices WO2024058802A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/076287 WO2024058802A1 (en) 2022-09-12 2022-09-12 Swivel gesture functionality on computing devices

Publications (1)

Publication Number Publication Date
WO2024058802A1 true WO2024058802A1 (en) 2024-03-21

Family

ID=83691413


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100037185A1 (en) * 2008-08-07 2010-02-11 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Input method for communication device
US20180188925A1 (en) * 2015-07-01 2018-07-05 Lg Electronics Inc. Mobile terminal and control method therefor
