US20130014050A1 - Automatically hiding controls - Google Patents

Automatically hiding controls

Info

Publication number
US20130014050A1
Authority
US
United States
Prior art keywords
mobile device
user input
control region
control bar
proximity
Legal status
Abandoned
Application number
US13/249,648
Inventor
Jean Baptiste Maurice Queru
Current Assignee
Google LLC
Original Assignee
Google LLC
Application filed by Google LLC
Priority to US13/249,648 (US20130014050A1)
Priority to DE202012002486U (DE202012002486U1)
Priority to NL2008449A (NL2008449C2)
Priority to AU2012100269A (AU2012100269A4)
Publication of US20130014050A1

Classifications

    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/04108 Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface, without distance measurement in the Z direction

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for automatically hiding controls. In one aspect, a method includes displaying at least a portion of a content region of a user interface and at least a portion of a control region of the user interface, wherein the control region is peripheral to the content region; and determining that the portion of the control region of the user interface has been displayed for a predetermined period of time, then automatically removing the portion of the control region from display.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 13/216,466, filed Aug. 24, 2011, titled “AUTOMATICALLY HIDING CONTROLS,” which claims priority from U.S. Provisional Application Ser. No. 61/451,879, titled “AUTOMATICALLY HIDING CONTROLS,” filed Mar. 11, 2011. The entire contents of the prior applications are hereby incorporated by reference.
  • BACKGROUND
  • User interfaces may include content regions that display content, and control regions that display controls. For example, an email application may have a user interface that includes a content region that displays the content of an email, as well as a control region that includes reply, forward, and delete buttons. A web browser may display the content of a web page in a content region, and may display, in a control region, a navigation bar for allowing the user to navigate web pages or to determine information about a particular web page.
  • Applications typically display controls adjacent to content. When a user interface includes a large number of controls, the display of one or more control regions may overwhelm the small screen of a mobile device. To overcome this problem, traditional mobile device applications require the user of the mobile device to interact with the mobile device application before controls are shown. This approach, however, requires the user to be aware that the control region exists. Furthermore, once the control region is displayed, the small screen of a mobile device may become, and may remain, similarly overwhelmed.
  • SUMMARY
  • According to one general implementation, a control window or region, such as a region of a user interface that includes a toolbar, is displayed for a limited amount of time, then is automatically hidden to allow a larger portion of a content region to be displayed. By initially displaying the control region, the user is made aware of the existence of the controls, while eventually allowing a greater portion of a display to be used to display content. The user may, but is not required to, manually remove the control region from the display.
  • In general, another aspect of the subject matter described in this specification may be embodied in methods that include the actions of providing a control region for display adjacent to a content region, and determining that the control region has been displayed for a predetermined period of time. The process also includes removing the control region from display in response to determining that the control region has been displayed for the predetermined period of time.
  • In general, another aspect of the subject matter described in this specification may be embodied in methods that include the actions of displaying a content region, receiving a flinging gesture on the content region, and displaying a control region above or below the content region, in response to receiving the flinging gesture. The actions also include detecting a timer trigger event, and automatically hiding the control region based on detecting the timer trigger event.
  • In general, another aspect of the subject matter described in this specification may be embodied in methods that include the action of displaying at least a portion of a content region of a user interface and at least a portion of a control region of the user interface, where the control region is peripheral to the content region. The method also includes determining that the portion of the control region of the user interface has been displayed for a predetermined period of time, then automatically removing the portion of the control region from display.
  • Other embodiments of these aspects include corresponding systems, apparatus, and computer programs, configured to perform the actions of the methods, encoded on computer storage devices.
  • These and other embodiments may each optionally include one or more of the following features. For instance, the actions include receiving a user input through the user interface, where the portion of a content region and the portion of the control region are simultaneously displayed in response to receiving the user input; the user input is a flinging gesture initiated in the content region of the user interface; automatically removing the portion of the control region from display includes displaying a same or different portion of the content region only; automatically removing the portion of the control region includes scrolling the portion of the control region from display; and/or the actions include receiving a user input through the user interface after removing the portion of the control region from display, then displaying a same or different portion of the control region.
  • In some instances, determining that the portion of the control region has been displayed for a predetermined period of time includes determining when any portion of the control region has been displayed, then starting a timer, and determining when the timer indicates that the predetermined period of time has elapsed; the control region is disposed adjacent to, and above or below the content region; and/or the actions include determining that a flinging gesture input by the user through the user interface has caused or will cause the user interface to scroll past an edge of the content region, where the portion of a content region and the portion of the control region are simultaneously displayed in response to determining that the flinging gesture has caused or will cause the user interface to scroll past an edge of the content region.
  • In some instances, the actions include receiving a user input, and determining that the user input will cause only a portion of the control region of the interface to be displayed, where simultaneously displaying at least a portion of a content region of a user interface and at least a portion of a control region of the user interface further includes simultaneously displaying the portion of the content region and all of the control region based on determining that the user input will cause only a portion of the control region to be displayed; the portion of the control region includes at least a portion of a title bar, a menu bar, a tool bar, an address bar, a status bar, and/or a progress bar; and/or the user interface includes a web browser user interface, or an email application user interface.
  • The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other potential features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1 and 2 are diagrams of example systems that may be used for displaying user interfaces.
  • FIG. 3 is a flowchart of an example process for automatically hiding controls.
  • FIG. 4 illustrates the removing of a control region from an example user interface.
  • FIG. 5 illustrates the displaying of a control region on an example user interface.
  • Like reference numbers represent corresponding parts throughout.
  • DETAILED DESCRIPTION
  • FIG. 1 is a diagram of an example system 100 that may be used for displaying a user interface on a presence-sensitive interface, e.g., a touch-screen display. The system 100 includes a mobile device 102, e.g., a device that includes a processor 104 and a computer readable medium 106. The computer readable medium 106 includes one or more applications 108, such as a web browser.
  • For mobile device applications that have user interfaces that include both content regions and control regions, an application may briefly display a control region before automatically hiding the control region, for example by scrolling or “snapping” to the outer edge of the content region. Once hidden, the user of the mobile device may drag or scroll the user interface to redisplay the control region. Because the control region is initially displayed to the user, the user is made aware of the controls displayed in the control region. By automatically hiding the control region, however, the user interface is able to display a larger amount of the content region. Such an approach is particularly useful when the user “flings” to, i.e., uses a flinging gesture to access, the top or bottom of content, but overshoots the top or bottom edge of the content region and is shown a control region.
  • Because the mobile device 102 includes a small display, the web browser may initially display a viewable portion 110 of a user interface 122, where the viewable portion 110 includes a control region 112 and a content region 114. The control region 112 includes a title bar 116, a menu bar 118, and a tool bar 120. The user of the mobile device 102 may drag or scroll the viewable portion 110 of the user interface 122 to see additional portions, or to adjust the viewable portion, of the content region 114.
  • When the content region 114 is sized to be readable by the user, the entire user interface 122 may be too large to be displayed on the mobile device 102 at any one point in time. The viewable portion 110 of the user interface 122 may thus be a viewport to the user interface 122, where a user's scrolling within the viewable portion 110 of the user interface 122 adjusts the viewport to allow the user to see different portions of the user interface 122. The user interface 122 may include, for example, the control region 112 (e.g., as illustrated by a control region 125) and a bottom status bar control region 126. The content region 114 displayed in the viewable portion 110 of the user interface 122 may be a portion of a larger content region 128 included in the user interface 122.
  • In a state “A”, which may correspond to a start-up state of the web browser or a state in which a user has “flung” to the top of the user interface 122, the viewable portion 110 of the user interface 122 displays a portion 124 of the user interface 122, where the portion 124 includes a portion of the content region 128 and the control region 125. In some implementations, in state “A”, the viewable portion 110 of the user interface 122 includes a control region corresponding to the control region 126 (e.g., a status bar may be displayed at the bottom of the viewable portion 110 of the user interface 122).
  • By displaying the control region 112 (and possibly a bottom status bar control region 126), the application 108 communicates the existence of the control regions to the user. However, displaying control regions uses space on the viewable portion 110 of the user interface 122 and takes away space that may be otherwise used for displaying content in the content region 114. For example, as shown in the viewable portion 110 of the user interface 122, the control region 112 occupies approximately one third of the viewable portion 110 of the user interface 122. If the control region 112 were not shown on the viewable portion 110 of the user interface 122, a gain of one third of the viewable portion 110 of the user interface 122 for display of content may be achieved. However, as mentioned, if the control region 112 is not displayed, the user may not be made aware that the control region 112 exists.
  • To achieve balance between a goal of communicating the existence and location of the control region 112 and a goal of increasing space for display of content, the control region 112 may be displayed for a brief, predetermined period of time and then may be removed from the viewable portion 110 of the user interface 122. In other words, the control region 112 may be shown initially and momentarily, so that the user is made aware of the control region 112, and then the control region 112 may be hidden to increase the amount of the content region 114 that is shown on the display.
  • For example, in a state “B”, a predetermined interval (e.g., one second, three seconds, one minute) of a timer 132 may expire and may trigger a timer event. In response to the timer event, the control region 112 may be removed from the viewable portion 110 of the user interface 122, as illustrated by a viewable portion 134 of the user interface 122, with or without an intervening selection or input by the user. A result of removing the control region 112 is that the viewable portion 134 of the user interface 122 includes a larger area for display of content than the viewable portion 110 of the user interface 122. The viewable portion 134 of the user interface 122 is displaying a portion 136 of the user interface 122, where the portion 136 includes the top portion of the content region 128.
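The transition from state “A” to state “B”, in which the control region is shown, a timer is started, and the region is hidden once the predetermined interval elapses, can be sketched as follows. This is an illustrative model only; the class name, the polling approach, and the default delay are assumptions, not details from the patent:

```python
import time

HIDE_DELAY_SECONDS = 3.0  # predetermined interval (e.g., one or three seconds)

class AutoHidingControlRegion:
    """Hypothetical model of a control region that hides itself on a timer."""

    def __init__(self, hide_delay=HIDE_DELAY_SECONDS, clock=time.monotonic):
        self.hide_delay = hide_delay
        self.clock = clock        # injectable clock, useful for testing
        self.visible = False
        self._shown_at = None

    def show(self):
        """Display the control region and start the hide timer (state "A")."""
        self.visible = True
        self._shown_at = self.clock()

    def tick(self):
        """Poll the timer; hide the region once the interval elapses (state "B")."""
        if self.visible and self.clock() - self._shown_at >= self.hide_delay:
            self.visible = False  # automatically removed from display
            self._shown_at = None
        return self.visible
```

With an injected fake clock, showing the region and then advancing past the delay causes `tick()` to report the region as hidden, without any intervening user input.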
  • In a state “C”, the user drags or scrolls the content displayed in the viewable portion 134 of the user interface 122. For example, as illustrated in a viewable portion 137 of the user interface 122, the user may scroll using a finger 138. After scrolling, a portion 142 of the user interface 122 is displayed in the viewable portion 137 of the user interface 122. Scrolling may result in all or a portion of the control region 125 scrolling into view (e.g., as illustrated by a partial control region 140). The user may scroll intentionally, for example, to gain access to the control region 125. The previous showing of the control region 112 (e.g., in the state “A”) provided the user with an indication of the location of the control region 125 and also indicated that scrolling may be used to bring the control region 125 back into view.
  • In some implementations, when scrolling results in some but not all of the control region 125 being scrolled into view, the portion of the control region 125 (e.g., the partial control region 140) remains displayed for a predetermined period of time (e.g., five seconds), and then disappears when the predetermined period of time has elapsed. The predetermined period of time for displaying a partial control region may be the same amount or a different amount than the predetermined period of time associated with the timer 132.
  • In some implementations, if scrolling results in some but not all of the control region 125 being scrolled into view, the entire control region 125 may be shown, or, in other words, may “snap” into view (e.g., as illustrated by the control region 112). After being “snapped” into view, the control region 125 may remain visible until the user scrolls the control region 125 from display. Similarly, if the user scrolls the viewable portion 137 of the user interface 122 so that the entire control region 125 is visible, the control region 125 may remain visible indefinitely (e.g., until the user scrolls the control region 125 from display).
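The “snap” behavior described above reduces to a simple rule: a partially visible control region is expanded to full visibility, while a fully hidden or fully visible region is left unchanged. The sketch below is hypothetical; the function name and the pixel-height representation are assumptions:

```python
def resolve_control_visibility(visible_height, full_height):
    """Return the control-region height to display after a scroll settles.

    A partially scrolled-in control region "snaps" to full visibility;
    a fully hidden or fully visible region is left unchanged.
    Heights are hypothetical pixel values.
    """
    if 0 < visible_height < full_height:
        return full_height  # snap the entire control region into view
    return visible_height   # fully hidden (0) or already fully visible
```

For example, if a scroll leaves 10 of a 48-pixel control bar visible, the rule expands it to the full 48 pixels.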
  • As another example, FIG. 2 is a diagram of an example system 200 that may be used for displaying a user interface 202. The user interface 202 includes a control region 204, a content region 206, and a control region 208. The control region 204 includes a title bar 210, a menu bar 212, and a tool bar 214. A viewable portion of the user interface 202 may be displayed, such as in a user interface of a browser application, on a mobile device.
  • For example, in a state “A”, a viewable portion 216 of the user interface 202 is displayed, showing a portion 218 of the user interface 202. The portion 218 includes a portion of the content region 206. In the state “A”, the user scrolls the content displayed in the viewable portion 216 of the user interface 202, such as by using a hand 220. For example, the user may perform a “flinging” gesture, which is a gesture that may be performed with a varying amount of velocity that results in a scroll amount that is proportionate to the velocity.
  • In a state “B”, in response to the flinging gesture, the browser scrolls a viewable portion 222 of the user interface 202 to display, during the scrolling, different portions of the user interface 202. When the scrolling stops, a particular portion of the user interface 202 is displayed in the viewable portion 222 of the user interface 202. For example, if the velocity of the fling is sufficient such that the scrolling reaches the bottom of the user interface 202, a portion 224 of the user interface 202 may be displayed in the viewable portion 222 of the user interface 202. For example, the viewable portion 222 of the user interface 202 includes a control region 226 that corresponds to the control region 208 located at the bottom of the portion 224.
  • As described above, the showing of the control region 226 in the viewable portion 222 of the user interface 202 helps the user discover the control region, but uses a portion of the viewable portion 222 of the user interface 202 that may otherwise be used to display content. In a state “C”, the control region 226 is removed from the viewable portion 222 of the user interface 202 after a predetermined period of time elapses (e.g., as illustrated by a timer 228). The removal of the control region 226 from the viewable portion 222 of the user interface 202 is illustrated by a viewable portion 230 of the user interface 202. The viewable portion 230 of the user interface 202 is displaying a portion 232 of the user interface 202 (e.g., the bottommost portion of the content region 206). As shown in FIG. 2, the viewable portion 230 of the user interface 202 is capable of displaying a larger amount of content than the viewable portion 222 of the user interface 202.
  • In a state “D”, the user scrolls the viewable portion 230 of the user interface 202 (such as by using a hand 232, as illustrated in a viewable portion 234 of the user interface 202). The user may scroll, for example, to access the control region 208. In response to the scrolling, the control region 208 is displayed (e.g., as illustrated by a control region 236). The control region 236 may be displayed indefinitely, since it may be determined that the user intentionally scrolled to view the control region 236. The control region 236 may be removed from the viewable portion 234 of the user interface 202 if, for example, the user scrolls the control region 236 out of view.
  • FIG. 3 is a flowchart of an example process 300 for automatically hiding controls. Briefly, the process 300 includes: (optionally) detecting a user input such as a flinging gesture; simultaneously displaying at least a portion of a content region of a user interface and at least a portion of a control region of the user interface, where the control region is peripheral to the content region. The process 300 also includes determining that the portion of the control region of the user interface has been displayed for a predetermined period of time; and automatically removing the portion of the control region from display.
  • In further detail, when the process 300 begins (301), a flinging gesture or other user input may be detected (302). As indicated by the dashed outline of the flowchart shape for operation 302, operation 302 is optional. A flinging gesture or other user input may be detected on a user interface, such as the user interface of a mobile device or of another type of device, such as a personal computer. A velocity associated with the flinging gesture may be detected. The velocity may determine an amount of scrolling to perform on the user interface. For example, it may be determined that the flinging gesture input has caused or will cause the user interface to scroll past an edge of a content region included in the user interface.
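The velocity-to-scroll relationship in operation 302 might be modeled as a linear mapping from fling velocity to scroll distance, with an overshoot check against the content edge. This sketch is an illustration under stated assumptions: the FLING_SCALE factor, the function name, and the pixel units are all invented here, not taken from the patent:

```python
FLING_SCALE = 0.5  # assumed pixels of scroll per unit of fling velocity

def fling_scroll(offset, velocity, content_height, viewport_height):
    """Return (new_offset, overshoots_edge) for a fling at the given velocity.

    The scroll amount is proportional to the fling velocity; a target
    offset beyond either edge of the content region reports an overshoot,
    the case in which a peripheral control region would be revealed.
    """
    max_offset = max(0, content_height - viewport_height)
    target = offset + velocity * FLING_SCALE
    overshoots = target < 0 or target > max_offset
    return min(max(target, 0), max_offset), overshoots
```

A fling that would carry the viewport past the bottom of a 2000-pixel content region clamps to the edge and flags the overshoot, at which point the process would display the control region.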
  • The content region may be peripheral to one or more control regions. For example, a control region may be at the bottom, at the top, at the left side, or at the right side of the content region. A control region may include one or more of a title bar, a menu bar, a tool bar, an address bar, a status bar, and a progress bar, to name a few examples.
  • Continuing with FIG. 3, at least a portion of the content region of the user interface and at least a portion of a control region of the user interface are simultaneously displayed (304). The simultaneous display may occur, for example, upon startup of the user interface or in response to a user input (e.g., in response to operation 302). If the simultaneous display occurs in response to user input (e.g., in response to operation 302), the portion of the content region and all or a portion of the control region may be simultaneously displayed based on determining that the user input will cause at least a portion of the control region to be displayed. In other words, the user input may cause the user interface to scroll causing at least a portion of the control region to be displayed. In some implementations, the portion of the content region and all of the control region are simultaneously displayed based on determining that the user input will cause only a portion of the control region to be displayed. In other words, if the user input causes the user interface to scroll such that a portion of the control region is displayed, the entire control region may be displayed, or may “snap” into view.
  • It is determined that the portion of the control region of the user interface has been displayed for a predetermined period of time (306). For example, a timer may be started when it is determined that any portion of the control region has been displayed and it may be subsequently determined when the timer indicates that the predetermined period of time has elapsed. Alternatively, the timer may be started when a user input is received at some point after any portion of the control region has been displayed. For example, a timer event may be received when the predetermined period of time has elapsed. In some implementations, the predetermined period of time is application specific. In some implementations, the predetermined period of time is user configurable.
  • The portion of the control region is automatically removed from display (308), thereby ending the process 300 (309). All of the control region may be removed from display at the same time, or the control region may be removed by scrolling the control region from display. The control region may be scrolled from display to provide a visual indication to the user, indicating to the user that scrolling the user interface towards the former location of the control region is an action that may cause the control region to reappear.
  • Removing the control region may result in an increase in the size of the content region, which may result in the display of different content in the content region. For example, if the control region is a title bar located above the content region, removing the title bar from display may result in the display of additional content in the content region. For example, content currently being displayed may move upward in the content region and previously undisplayed content may be displayed at the bottom of the content region.
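As a concrete illustration of this reflow, hiding a control bar above a line-based content region exposes additional lines at the bottom of the viewport. The line-based layout and every name below are simplifying assumptions for illustration only:

```python
def visible_lines(content, viewport_lines, control_bar_lines):
    """Return the content lines visible when a control bar of the given
    height (measured in lines) occupies part of the viewport."""
    return content[: viewport_lines - control_bar_lines]

content = [f"line {i}" for i in range(20)]

# With a 3-line title bar shown, a 10-line viewport shows 7 content lines;
# hiding the bar exposes 3 previously undisplayed lines at the bottom.
shown_with_bar = visible_lines(content, viewport_lines=10, control_bar_lines=3)
shown_without_bar = visible_lines(content, viewport_lines=10, control_bar_lines=0)
```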
  • FIG. 4 illustrates the removing of a control region 402 from an example user interface 404. The user interface 404 includes the control region 402 and a content region 406. The content region 406 is displaying content 407. The control region 402 may be displayed in the user interface 404, for example, upon startup of the user interface 404. As another example, the control region 402 may be displayed in the user interface 404 in response to a user scrolling to the bottom of the user interface 404.
  • In response to the display of the control region 402, a timer 408 may be started. The timer 408 may be started based upon the display of the control region, or based on receiving a user input after the control region has been displayed. The timer 408 may be configured such that a timer event fires after a predetermined period of time has elapsed. In response to the timer event, the control region 402 may be removed from the user interface 404. In some implementations, when the control region 402 is removed it may appear to the user that the entire control region 402 is hidden instantaneously. For example, the user interface 404 may appear to change to a user interface 410 instantaneously.
  • In some implementations, the control region 402 may be removed by scrolling the control region 402 from display. For example and as illustrated in a region 412 of a user interface 414, the control region 402 may appear to scroll or move off of the bottom of the user interface 414. The control region 402 may be scrolled from display to provide a visual indication to the user, indicating that scrolling the user interface 414 towards the former location of the control region 402 is an action that may cause the control region 402 to reappear.
  • Content may also appear to scroll, or move. Content 416 corresponds to the content 407, and movement of the content 416 to a lower location (as illustrated by content 418) illustrates that content in a content region 420 may move downward in response to the removal of the control region 402. Previously undisplayed content may appear at the top of the content region 420.
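The scroll-off animation described in the preceding two paragraphs can be sketched as below. This is an assumed illustration (frame count and bar height are arbitrary): over successive animation frames the control bar slides past the bottom edge of the viewport, and content shifts downward by the same amount of freed space.

```python
def hide_animation_offsets(bar_height, frames):
    """Yield (bar_offset, content_shift) pairs, one per animation frame.

    bar_offset is how far the control bar has moved past the viewport edge;
    content_shift is how far the content has moved into the freed space.
    """
    for frame in range(1, frames + 1):
        offset = bar_height * frame // frames
        yield offset, offset

steps = list(hide_animation_offsets(bar_height=40, frames=4))
# steps == [(10, 10), (20, 20), (30, 30), (40, 40)]
# On the final frame the bar is fully off-screen and the content region
# has grown by the bar's full height.
```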
  • In some implementations, an indicator may be displayed to the user that provides a visual indication of the existence of a control region and an indication of an action that may cause the control region to be displayed. For example, the user interfaces 410 and 414 include indicators 422 and 424, respectively, where the indicators 422 and 424 indicate that a control region is located at the bottom of the respective user interface 410 or 414. The indicators 422 and 424 indicate that the user may be able to scroll off the bottom of the user interface 410 or 414, respectively, to cause the control region 402 to be displayed.
  • FIG. 5 illustrates the display of a control region 502 on an example user interface 504. The user interface 504 includes the control region 502 and a content region 505. The control region 502 is a portion of a larger control region 506 illustrated in a user interface 508. The control region 502 may be displayed in response to a user input, such as a flinging gesture. For example, the user may have performed a flinging gesture to cause the contents of the user interface 504 to scroll so that the control region 502 (e.g., some but not all of the control region 506) is displayed in the user interface 504.
  • A velocity of the flinging gesture may be detected, and the user interface 504 may be scrolled using a simulation of friction: while the content is being scrolled, larger distances are scrolled in earlier time intervals and smaller distances in later time intervals, until the distance scrolled is reduced to zero and the scrolling stops. For example, scrolled distances associated with time points T1, T2, . . . , T7 grow progressively smaller. Suppose that the scrolling stops at the time point T7. In some implementations, in response to the scrolling stopping while some but not all of the control region 506 has been displayed, the remainder of the control region 506 may automatically be displayed (e.g., the control region 502 may be replaced with the control region 506), such as at a time point T8, as illustrated in the user interface 508.
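The friction simulation and snap-to-reveal behavior above can be sketched as follows. The decay factor, pixel values, and function names here are illustrative assumptions, not values from the patent: an initial fling velocity decays each time interval, so earlier intervals scroll larger distances, and if scrolling stops with only part of the control region visible, the remainder is revealed.

```python
def fling_distances(velocity, friction=0.5, min_step=1):
    """Per-interval scroll distances (T1, T2, ...) until motion stops."""
    distances = []
    while velocity >= min_step:
        distances.append(velocity)
        velocity = int(velocity * friction)  # friction shrinks later intervals
    return distances

def final_bar_visibility(scrolled, bar_start, bar_height):
    """If the scroll ends with part of the bar shown, reveal all of it."""
    shown = max(0, min(bar_height, scrolled - bar_start))
    # Partially visible: automatically display the remainder of the bar.
    return bar_height if 0 < shown < bar_height else shown

d = fling_distances(64)   # distances shrink: [64, 32, 16, 8, 4, 2, 1]
total = sum(d)            # 127 pixels scrolled in total
# Suppose the bar begins 100 px past the start: 27 px of a 40 px bar is
# shown when scrolling stops, so the remainder is snapped into view.
assert final_bar_visibility(total, bar_start=100, bar_height=40) == 40
```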
  • A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with operations re-ordered, added, or removed.
  • Embodiments of the invention and all of the functional operations described in this specification may be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention may be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus may include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
  • A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, and it may be deployed in any form, including as a stand alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification may be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows may also be performed by, and apparatus may also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer may be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, aspects may be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user may provide input to the computer. Other kinds of devices may be used to provide for interaction with a user as well; for example, feedback provided to the user may be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
  • Aspects may be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user may interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components. The components of the system may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular aspects. Certain features that are described in this specification in the context of separate aspects may also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single aspect may also be implemented in multiple aspects separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products.
  • In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other types of files. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
  • Thus, particular implementations have been described. Other implementations are within the scope of the following claims. For example, the actions recited in the claims may be performed in a different order and still achieve desirable results.

Claims (23)

1. A computer implemented method comprising:
receiving, at a proximity-sensitive display of a mobile device, a user input;
in response to receiving the user input, scrolling a content window, and scrolling a control bar into view on the proximity-sensitive display of the mobile device, wherein the content window and the control bar are scrolled in a common direction;
determining, at the mobile device, whether the user input corresponds to a flinging gesture or a dragging gesture; and
in response to determining that the user input corresponds to the flinging gesture and after the control bar has been in view on the proximity-sensitive display of the mobile device for a predetermined period of time, without receiving a further user input, scrolling the content window and scrolling the control bar out of view on the proximity-sensitive display of the mobile device.
2-20. (canceled)
21. The method of claim 40, wherein the user input is initiated in the content window.
22. The method of claim 40, wherein the predetermined period of time commences when any portion of the control bar comes into view on the proximity-sensitive display of the mobile device.
23. The method of claim 40, wherein the control bar is displayed adjacent to, and above or below the content window.
24. The method of claim 40, wherein the control bar comprises at least one of a title bar or a status bar.
25. (canceled)
26. The method of claim 40, wherein the content window comprises a content window of a web browser.
27-39. (canceled)
40. The method of claim 1, further comprising:
in response to determining that the user input corresponds to the dragging gesture, causing the control bar to remain in view on the proximity-sensitive display of the mobile device until receiving an additional user input.
41. A mobile device comprising:
a proximity-sensitive display;
one or more processors; and
one or more storage devices storing instructions that when executed by the one or more processors, cause the one or more processors to perform operations comprising:
receiving, at the proximity-sensitive display of the mobile device, a user input;
in response to receiving the user input, scrolling a content window, and scrolling a control bar into view on the proximity-sensitive display of the mobile device, wherein the content window and the control bar are scrolled in a common direction;
determining whether the user input corresponds to a flinging gesture or a dragging gesture; and
in response to determining that the user input corresponds to the flinging gesture and after the control bar has been in view on the proximity-sensitive display of the mobile device for a predetermined period of time, without receiving a further user input, scrolling the content window and scrolling the control bar out of view on the proximity-sensitive display.
42. The mobile device of claim 41, further comprising:
in response to determining that the user input corresponds to the dragging gesture, causing the control bar to remain in view on the proximity-sensitive display of the mobile device until receiving an additional user input.
43. The mobile device of claim 42, wherein the user input is initiated in the content window.
44. The mobile device of claim 42, wherein the predetermined period of time commences when any portion of the control bar comes into view on the proximity-sensitive display of the mobile device.
45. The mobile device of claim 42, wherein the control bar is displayed adjacent to, and above or below the content window.
46. The mobile device of claim 42, wherein the control bar comprises at least one of a title bar or a status bar.
47. The mobile device of claim 42, wherein the content window comprises a content window of a web browser.
48. A computer storage medium encoded with a computer program, the program comprising instructions that if executed by one or more computers cause the one or more computers to perform operations comprising:
receiving, at a proximity-sensitive display of a mobile device, a user input;
in response to receiving the user input, scrolling a content window, and scrolling a control bar into view on the proximity-sensitive display of the mobile device, wherein the content window and the control bar are scrolled in a common direction;
determining whether the user input corresponds to a flinging gesture or a dragging gesture; and
in response to determining that the user input corresponds to the flinging gesture and after the control bar has been in view on the proximity-sensitive display of the mobile device for a predetermined period of time, without receiving a further user input, scrolling the content window and scrolling the control bar out of view on the proximity-sensitive display of the mobile device.
49. The computer storage medium of claim 48, further comprising:
in response to determining that the user input corresponds to the dragging gesture, causing the control bar to remain in view on the proximity-sensitive display of the mobile device until receiving an additional user input.
50. The computer storage medium of claim 49, wherein the user input is initiated in the content window.
51. The computer storage medium of claim 49, wherein the predetermined period of time commences when any portion of the control bar comes into view on the proximity-sensitive display of the mobile device.
52. The computer storage medium of claim 49, wherein the control bar is displayed adjacent to, and above or below the content window.
53. The computer storage medium of claim 49, wherein the control bar comprises at least one of a title bar or a status bar.
US13/249,648 2011-03-11 2011-09-30 Automatically hiding controls Abandoned US20130014050A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US13/249,648 US20130014050A1 (en) 2011-03-11 2011-09-30 Automatically hiding controls
DE202012002486U DE202012002486U1 (en) 2011-03-11 2012-03-09 Automatic hiding of controls
NL2008449A NL2008449C2 (en) 2011-03-11 2012-03-09 Automatically hiding controls.
AU2012100269A AU2012100269A4 (en) 2011-03-11 2012-03-12 Automatically hiding controls

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161451879P 2011-03-11 2011-03-11
US13/216,466 US8904305B2 (en) 2011-03-11 2011-08-24 Automatically hiding controls
US13/249,648 US20130014050A1 (en) 2011-03-11 2011-09-30 Automatically hiding controls

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/216,466 Continuation US8904305B2 (en) 2011-03-11 2011-08-24 Automatically hiding controls

Publications (1)

Publication Number Publication Date
US20130014050A1 true US20130014050A1 (en) 2013-01-10

Family

ID=46051898

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/216,466 Expired - Fee Related US8904305B2 (en) 2011-03-11 2011-08-24 Automatically hiding controls
US13/249,648 Abandoned US20130014050A1 (en) 2011-03-11 2011-09-30 Automatically hiding controls

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/216,466 Expired - Fee Related US8904305B2 (en) 2011-03-11 2011-08-24 Automatically hiding controls

Country Status (4)

Country Link
US (2) US8904305B2 (en)
AU (1) AU2012100269A4 (en)
DE (1) DE202012002486U1 (en)
NL (1) NL2008449C2 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103713847A (en) * 2013-12-25 2014-04-09 华为终端有限公司 System bar control method of user equipment and user equipment
US20140298258A1 (en) * 2013-03-28 2014-10-02 Microsoft Corporation Switch List Interactions
US20150052454A1 (en) * 2012-12-06 2015-02-19 Huizhou Tcl Mobile Communication Co., Ltd File sharing method and handheld apparatus
US9329761B2 (en) 2014-04-01 2016-05-03 Microsoft Technology Licensing, Llc Command user interface for displaying and scaling selectable controls and commands
US9367490B2 (en) 2014-06-13 2016-06-14 Microsoft Technology Licensing, Llc Reversible connector for accessory devices
US9384335B2 (en) 2014-05-12 2016-07-05 Microsoft Technology Licensing, Llc Content delivery prioritization in managed wireless distribution networks
US9384334B2 (en) 2014-05-12 2016-07-05 Microsoft Technology Licensing, Llc Content discovery in managed wireless distribution networks
US9430667B2 (en) 2014-05-12 2016-08-30 Microsoft Technology Licensing, Llc Managed wireless distribution network
US9614724B2 (en) 2014-04-21 2017-04-04 Microsoft Technology Licensing, Llc Session-based device configuration
US9874914B2 (en) 2014-05-19 2018-01-23 Microsoft Technology Licensing, Llc Power management contracts for accessory devices
US10042655B2 (en) 2015-01-21 2018-08-07 Microsoft Technology Licensing, Llc. Adaptable user interface display
US10111099B2 (en) 2014-05-12 2018-10-23 Microsoft Technology Licensing, Llc Distributing content in managed wireless distribution networks
US10209849B2 (en) 2015-01-21 2019-02-19 Microsoft Technology Licensing, Llc Adaptive user interface pane objects
US10402034B2 (en) 2014-04-02 2019-09-03 Microsoft Technology Licensing, Llc Adaptive user interface pane manager
US10691445B2 (en) 2014-06-03 2020-06-23 Microsoft Technology Licensing, Llc Isolating a portion of an online computing service for testing
US11366571B2 (en) * 2018-05-04 2022-06-21 Dentma, LLC Visualization components including sliding bars

Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US8984431B2 (en) 2009-03-16 2015-03-17 Apple Inc. Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate
US20120304081A1 (en) * 2011-05-27 2012-11-29 Mirko Mandic Navigation User Interface in Support of Page-Focused, Touch- or Gesture-based Browsing Experience
US9244584B2 (en) * 2011-08-26 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigating and previewing content items
US20130067408A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Contextually applicable commands
US9454186B2 (en) * 2011-09-30 2016-09-27 Nokia Technologies Oy User interface
US9582236B2 (en) 2011-09-30 2017-02-28 Nokia Technologies Oy User interface
WO2013155590A1 (en) * 2012-04-18 2013-10-24 Research In Motion Limited Systems and methods for displaying information or a feature in overscroll regions on electronic devices
CN103376977B (en) * 2012-04-23 2016-07-06 腾讯科技(深圳)有限公司 The display packing of browser and display device
BR112014028774B1 (en) * 2012-05-18 2022-05-10 Apple Inc Method, electronic device, computer readable storage medium and information processing apparatus
TW201349090A (en) * 2012-05-31 2013-12-01 Pegatron Corp User interface, method for displaying the same and electrical device
CN102799382A (en) * 2012-07-16 2012-11-28 华为终端有限公司 Control method for system bar of user equipment, and user equipment
US9158440B1 (en) * 2012-08-01 2015-10-13 Google Inc. Display of information areas in a view of a graphical interface
KR101957173B1 (en) 2012-09-24 2019-03-12 삼성전자 주식회사 Method and apparatus for providing multi-window at a touch device
US10713846B2 (en) 2012-10-05 2020-07-14 Elwha Llc Systems and methods for sharing augmentation data
US10180715B2 (en) 2012-10-05 2019-01-15 Elwha Llc Correlating user reaction with at least an aspect associated with an augmentation of an augmented view
US9141188B2 (en) 2012-10-05 2015-09-22 Elwha Llc Presenting an augmented view in response to acquisition of data inferring user activity
US9077647B2 (en) 2012-10-05 2015-07-07 Elwha Llc Correlating user reactions with augmentations displayed through augmented views
US10269179B2 (en) 2012-10-05 2019-04-23 Elwha Llc Displaying second augmentations that are based on registered first augmentations
US9111383B2 (en) 2012-10-05 2015-08-18 Elwha Llc Systems and methods for obtaining and using augmentation data and for sharing usage data
CN103729113B (en) * 2012-10-16 2017-03-22 中兴通讯股份有限公司 Method and device for controlling switching of virtual navigation bars
CN103076966B (en) * 2012-11-02 2015-07-29 网易(杭州)网络有限公司 The method and apparatus of menu is unlocked by performing gesture on the touchscreen
US20140136960A1 (en) * 2012-11-13 2014-05-15 Microsoft Corporation Content-Aware Scrolling
CN104037929B (en) 2013-03-06 2017-06-27 华为技术有限公司 A kind of method of supplying power to and device
US9477381B2 (en) 2013-03-12 2016-10-25 Hexagon Technology Center Gmbh User interface for toolbar navigation
US10025486B2 (en) 2013-03-15 2018-07-17 Elwha Llc Cross-reality select, drag, and drop for augmented reality systems
US10109075B2 (en) 2013-03-15 2018-10-23 Elwha Llc Temporal element restoration in augmented reality systems
US9639964B2 (en) * 2013-03-15 2017-05-02 Elwha Llc Dynamically preserving scene elements in augmented reality systems
US20140359501A1 (en) * 2013-06-04 2014-12-04 Tencent Technology (Shenzhen) Company Limited Method and device for configuring visualization of a web browser
US9727212B2 (en) 2013-07-09 2017-08-08 Google Inc. Full screen content viewing interface entry
US11068128B2 (en) 2013-09-03 2021-07-20 Apple Inc. User interface object manipulations in a user interface
EP3047359B1 (en) 2013-09-03 2020-01-01 Apple Inc. User interface for manipulating user interface objects
CN104423794A (en) * 2013-09-11 2015-03-18 上海帛茂信息科技有限公司 Intelligent mobile equipment with double-window display function
US9268671B2 (en) * 2013-12-06 2016-02-23 Testfire, Inc. Embedded test management for mobile applications
US20150253984A1 (en) * 2014-03-06 2015-09-10 Henry Zhihui Zhang Smart frame toggling
GB2524781A (en) * 2014-04-02 2015-10-07 Mark Hawkins Hidden user interface for a mobile computing device
KR102201095B1 (en) 2014-05-30 2021-01-08 애플 인크. Transition from use of one device to another
US20160378967A1 (en) * 2014-06-25 2016-12-29 Chian Chiu Li System and Method for Accessing Application Program
CN116243841A (en) 2014-06-27 2023-06-09 苹果公司 Reduced size user interface
WO2016036416A1 (en) 2014-09-02 2016-03-10 Apple Inc. Button functionality
WO2016036509A1 (en) 2014-09-02 2016-03-10 Apple Inc. Electronic mail user interface
CN110072131A (en) 2014-09-02 2019-07-30 苹果公司 Music user interface
CN104267886B (en) * 2014-09-10 2018-07-10 北京金山安全软件有限公司 Method and device for displaying browser page
KR20160032880A (en) * 2014-09-17 2016-03-25 삼성전자주식회사 Display apparatus and method for controlling thereof
WO2016185806A1 (en) * 2015-05-19 2016-11-24 京セラドキュメントソリューションズ株式会社 Display device and display control method
US10229717B1 (en) 2015-07-24 2019-03-12 Snap, Inc. Interactive presentation of video content and associated information
US10992795B2 (en) 2017-05-16 2021-04-27 Apple Inc. Methods and interfaces for home media control
US11431836B2 (en) 2017-05-02 2022-08-30 Apple Inc. Methods and interfaces for initiating media playback
US10928980B2 (en) 2017-05-12 2021-02-23 Apple Inc. User interfaces for playing and managing audio items
US20180329586A1 (en) * 2017-05-15 2018-11-15 Apple Inc. Displaying a set of application views
US20220279063A1 (en) 2017-05-16 2022-09-01 Apple Inc. Methods and interfaces for home media control
CN111343060B (en) 2017-05-16 2022-02-11 苹果公司 Method and interface for home media control
JP2017188169A (en) * 2017-07-19 2017-10-12 Kddi株式会社 Portable terminal equipment, tool screen control method, and computer program
US11435830B2 (en) 2018-09-11 2022-09-06 Apple Inc. Content-based tactile outputs
US11010121B2 (en) 2019-05-31 2021-05-18 Apple Inc. User interfaces for audio media control
WO2020243691A1 (en) 2019-05-31 2020-12-03 Apple Inc. User interfaces for audio media control
US11675563B2 (en) * 2019-06-01 2023-06-13 Apple Inc. User interfaces for content applications
CN110929054B (en) * 2019-11-20 2022-08-05 北京小米移动软件有限公司 Multimedia information application interface display method and device, terminal and medium
CN112882823A (en) * 2019-11-29 2021-06-01 华为技术有限公司 Screen display method and electronic equipment
CN111273992B (en) * 2020-01-21 2024-04-19 维沃移动通信有限公司 Icon display method and electronic equipment
US11392291B2 (en) 2020-09-25 2022-07-19 Apple Inc. Methods and interfaces for media control with dynamic feedback
CN117597682A (en) 2021-01-29 2024-02-23 苹果公司 User interface and associated system and method for sharing content item portions
US11907605B2 (en) 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
US20220368548A1 (en) 2021-05-15 2022-11-17 Apple Inc. Shared-content session user interfaces
US11954303B2 (en) * 2021-05-17 2024-04-09 Apple Inc. Ephemeral navigation bar

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5910802A (en) * 1997-06-11 1999-06-08 Microsoft Corporation Operating system for handheld computing device having taskbar auto hide
US8130205B2 (en) * 2007-01-07 2012-03-06 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic lists and documents

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5473745A (en) * 1994-12-14 1995-12-05 International Business Machines Corporation Exposing and hiding a title bar behind its window using a visual cue
KR20000016918A (en) * 1998-08-27 2000-03-25 에토 요지 Digitizer with out-of-bounds tracking
US6448986B1 (en) * 1999-09-07 2002-09-10 Spotware Technologies Llc Method and system for displaying graphical objects on a display screen
US20050183017A1 (en) 2001-01-31 2005-08-18 Microsoft Corporation Seekbar in taskbar player visualization mode
US20060107226A1 (en) 2004-11-16 2006-05-18 Microsoft Corporation Sidebar autohide to desktop
US20080238938A1 (en) * 2005-08-29 2008-10-02 Eklund Don Effects for interactive graphic data in disc authoring
EP2318897B1 (en) * 2008-05-28 2018-05-16 Google LLC Motion-controlled views on mobile computing devices
KR20110069946A (en) * 2009-12-18 2011-06-24 삼성전자주식회사 Portable device including a project module and operation method thereof
US8856682B2 (en) * 2010-05-11 2014-10-07 AI Squared Displaying a user interface in a dedicated display area


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
David Pogue, "iPhone: The Missing Manual, Fourth Edition," August 2010 *
Wayne Pan, "JavaScript Pull to Refresh," July 2010, http://waynepan.com/2010/07/30/javascript-pull-to-refresh/ *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150052454A1 (en) * 2012-12-06 2015-02-19 Huizhou Tcl Mobile Communication Co., Ltd File sharing method and handheld apparatus
US20140298258A1 (en) * 2013-03-28 2014-10-02 Microsoft Corporation Switch List Interactions
CN103713847A (en) * 2013-12-25 2014-04-09 华为终端有限公司 System bar control method of user equipment and user equipment
US9329761B2 (en) 2014-04-01 2016-05-03 Microsoft Technology Licensing, Llc Command user interface for displaying and scaling selectable controls and commands
US10019145B2 (en) 2014-04-01 2018-07-10 Microsoft Technology Licensing, Llc Command user interface for displaying and scaling selectable controls and commands
US10402034B2 (en) 2014-04-02 2019-09-03 Microsoft Technology Licensing, Llc Adaptive user interface pane manager
US9614724B2 (en) 2014-04-21 2017-04-04 Microsoft Technology Licensing, Llc Session-based device configuration
US9430667B2 (en) 2014-05-12 2016-08-30 Microsoft Technology Licensing, Llc Managed wireless distribution network
US9384334B2 (en) 2014-05-12 2016-07-05 Microsoft Technology Licensing, Llc Content discovery in managed wireless distribution networks
US10111099B2 (en) 2014-05-12 2018-10-23 Microsoft Technology Licensing, Llc Distributing content in managed wireless distribution networks
US9384335B2 (en) 2014-05-12 2016-07-05 Microsoft Technology Licensing, Llc Content delivery prioritization in managed wireless distribution networks
US9874914B2 (en) 2014-05-19 2018-01-23 Microsoft Technology Licensing, Llc Power management contracts for accessory devices
US10691445B2 (en) 2014-06-03 2020-06-23 Microsoft Technology Licensing, Llc Isolating a portion of an online computing service for testing
US9477625B2 (en) 2014-06-13 2016-10-25 Microsoft Technology Licensing, Llc Reversible connector for accessory devices
US9367490B2 (en) 2014-06-13 2016-06-14 Microsoft Technology Licensing, Llc Reversible connector for accessory devices
US10209849B2 (en) 2015-01-21 2019-02-19 Microsoft Technology Licensing, Llc Adaptive user interface pane objects
US10042655B2 (en) 2015-01-21 2018-08-07 Microsoft Technology Licensing, Llc. Adaptable user interface display
US11366571B2 (en) * 2018-05-04 2022-06-21 Dentma, LLC Visualization components including sliding bars

Also Published As

Publication number Publication date
NL2008449A (en) 2012-09-12
AU2012100269A4 (en) 2012-04-12
US8904305B2 (en) 2014-12-02
DE202012002486U1 (en) 2012-05-29
US20120304111A1 (en) 2012-11-29
NL2008449C2 (en) 2013-04-23

Similar Documents

Publication Publication Date Title
US8904305B2 (en) Automatically hiding controls
US10133466B2 (en) User interface for editing a value in place
US9411499B2 (en) Jump to top/jump to bottom scroll widgets
KR102197538B1 (en) Automatically switching between input modes for a user interface
US9870578B2 (en) Scrolling interstitial advertisements
CN107003803B (en) Scroll bar for dynamic content
US20160092071A1 (en) Generate preview of content
US8643616B1 (en) Cursor positioning on a touch-sensitive display screen
KR20140126687A (en) Organizing graphical representations on computing devices
US9507791B2 (en) Storage system user interface with floating file collection
US9348498B2 (en) Wrapped content interaction
US9563710B2 (en) Smooth navigation between content oriented pages
EP2699998A2 (en) Compact control menu for touch-enabled command execution
US9201925B2 (en) Search result previews
CN111832271A (en) Data presentation method and device, electronic equipment and storage medium
US9513770B1 (en) Item selection
US9582180B2 (en) Automated touch screen zoom
JP6453943B2 (en) Display device, display method, and program
TW201428658A (en) Device and method for positioning a scroll view to the best display position automatically in a stock quoting software
US20200097151A1 (en) Systems and methods for generating a card view interface on a mobile computing device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION