US9639257B2 - System and method for selecting interface elements within a scrolling frame
- Publication number: US9639257B2
- Authority: US (United States)
- Prior art keywords: interface elements, input, axis, along, subset
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- This disclosure relates generally to the technical fields of software and/or hardware technology and, in one example embodiment, to a system and method to select interface elements within a scrolling frame.
- Computing devices and consumer electronic devices such as personal computers (PCs), tablet computers, smartphones, personal digital assistants (PDAs), media players, and the like, commonly include a graphical user interface (GUI).
- the GUI may facilitate operation of the devices by allowing users to scroll among and select interface elements, such as icons, images, text, links, and the like, that may start applications, display images for viewing, copy text, open web pages, and so forth.
- Such devices may incorporate a user input device, such as a touchscreen, a mouse, a trackball, a visual motion detector, and the like, that allows the user to scroll and select interface elements.
- FIG. 1 is a block diagram of a system, in accordance with an example embodiment
- FIG. 2 is an abstract image of a graphical user interface (GUI) as displayed on a visual display, in accordance with an example embodiment
- FIGS. 3A and 3B are images of a frame that illustrates scrolling in a frame without selecting interface elements or otherwise changing the configuration of the interface elements, in an example embodiment
- FIGS. 4A and 4B are images of a frame that illustrates selecting individual interface elements, in an example embodiment
- FIGS. 5A-5F are images of a frame that illustrates switching a configuration of multiple interface elements between an unselected configuration and a selected configuration, in an example embodiment
- FIGS. 6A-6F are images of a frame that illustrates switching a configuration of multiple interface elements between an unselected configuration and a selected configuration, in an example embodiment
- FIG. 7 is a flow chart illustrating a method, in accordance with an example embodiment, for switching an interface element between an unselected configuration and a selected configuration
- FIG. 8 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and implement one or more of the example methodologies described herein.
- Certain user input devices include separate mechanisms for initiating motion within a GUI and selecting interface elements.
- a mouse conventionally includes a mechanism, such as a light sensor or rollerball, for translating the motion of the mouse into movement within the GUI, such as by moving a cursor within the GUI, and a button for selecting interface items in relation to the cursor.
- various alternative user input devices, such as a touchscreen or a visual motion detector, may tend to rely on a single mechanism for user interaction, such as a physical touch in the case of a touchscreen.
- the touchscreen may then seek to differentiate between types of touches. For instance, a swipe or drag of a finger along the touchscreen may be interpreted as movement within the GUI while a tap or other quick touch and release action may be interpreted as a selection of an interface element.
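The tap-versus-swipe differentiation described above can be sketched as a displacement-and-duration test. This is only an illustrative sketch, not the patent's implementation; the function name and both thresholds are assumptions:

```python
# Illustrative thresholds (assumptions, not values from the disclosure).
TAP_MAX_DISTANCE = 10.0  # pixels of movement still counted as a tap
TAP_MAX_DURATION = 0.3   # seconds of contact still counted as a tap

def gesture_type(down, up, duration):
    """Classify a touch as a 'tap' (quick touch and release) or a 'swipe'
    (a drag), based on how far and how long the contact moved."""
    dx, dy = up[0] - down[0], up[1] - down[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= TAP_MAX_DISTANCE and duration <= TAP_MAX_DURATION:
        return "tap"
    return "swipe"
```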
- a system, device, and method have been developed that may facilitate the selection of multiple interface elements.
- scrolling occurs along one axis of a frame of the GUI but not in the other (i.e., a frame may scroll up-and-down but not side-to-side or vice versa). Consequently, the swipe action may be useful in the axis of scrolling but essentially unutilized in the axis orthogonal to the scrolling axis as an input to the GUI.
- the system, device, and method may interpret a user input to a user input device along an axis orthogonal to the scrolling axis as a selection of interface elements or, more generally, of switching a selection configuration of the interface elements between an unselected condition and a selected condition, depending on a starting condition of the interface element.
- a subsequent user input along the scrolling axis may be interpreted as a continuation of the selection of interface elements, allowing multiple interface elements along both axes to be selected without individually selecting, such as through tapping, each interface element.
- the scrolling function may be maintained, allowing for concurrent selection and scrolling based on movement along the scrolling axis.
- FIG. 1 is a block diagram of a system 10 , in accordance with an example embodiment.
- the system 10 includes a user input device 12 and a visual display 14 .
- the user input device 12 may be any of a variety of suitable devices, including, but not limited to, a touchscreen, a mouse, a touchpad, a trackball, a motion detector, keys or a keyboard, and the like.
- the visual display 14 may be any of a variety of suitable displays, including, but not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a cathode ray tube (CRT), and the like.
- the system 10 further optionally includes a processor 16 and a memory 18 .
- the processor 16 may be a conventional processor 16 utilized in computing devices and/or consumer electronic devices, including a microprocessor, a microcontroller, a custom-designed circuit, and the like.
- the processor 16 may include multiple processors, either within a single package or distributed over multiple locations.
- the memory 18 may incorporate one or more types of memory and/or storage devices, including random access memory (RAM), read-only memory (ROM), flash memory and other electrically erasable programmable read-only memory (EEPROM), hard disks, compact discs (CDs), DVDs, and the like.
- the system 10 may be a single device, such as a tablet computer, laptop computer, smartphone, media player, PDA, and the like.
- the system 10 may be a local system, such as a conventional desktop computer or workstation arrangement with a user input device 12 and visual display 14 separate from but locally connected to the processor 16 and memory 18 .
- the system 10 may be a remote system, with the user input device 12 and the visual display 14 providing interaction with a remotely located processor 16 and memory 18 , such as through the remote interaction of a workstation with a server, mainframe computer, or network-based “cloud” resources. It is to be understood that the system 10 may include multiple ones of the above modes concurrently, and thus a tablet computer may have both an integral processor 16 and memory 18 while also accessing and utilizing cloud-based processors 16 and memory 18 .
- the system 10 may include various additional optional components such as may be included in computing devices and/or consumer electronic devices. Such components may be ancillary or otherwise unrelated to the function of the components 12 , 14 , 16 , 18 and the methods described herein. Alternatively, certain such components, such as a camera, may be utilized as part of or in supplement to the functions and methods disclosed herein. In an example, the camera may be or may be a component of the user input device 12 , such as where the user input device is a motion detection system.
- FIG. 2 is an abstract image of a graphical user interface (GUI) 20 as displayed on the visual display 14 , in accordance with an example embodiment.
- the GUI 20 includes one or more frames 22 in which multiple interface elements 24 may be displayed.
- a single frame 22 may be coextensive with the GUI 20 itself, may occupy a sub-portion of the GUI 20 , or multiple frames 22 may be incorporated within the GUI 20 .
- Interface elements 24 may optionally be included in the GUI 20 outside of any given frame 22 .
- the interface elements 24 may be visual elements such as icons, images, text characters, Internet links, spreadsheet cells, graphs or components of a graph, and the like.
- an interface element 24 may be any of a variety of visual objects that may be selected via the GUI 20 , such as to select or start a related application, select or display a related image, select or play a related media file, select or copy a related image, media file or text, access various system functions, such as copy, paste, create a new folder, a context menu, select or move the interface element 24 , and so forth.
- Interface elements 24 may also pertain to file structures, in which the selection of an interface element 24 may result in the display of additional interface elements 24 that are stored in a file structure associated with the first interface element 24 .
- the interface elements 24 may correspond to an unselected configuration 24 A.
- the unselected configuration 24 A may correspond to a base state of the interface elements 24 .
- the visual display 14 may display the selected interface elements 24 in a selected configuration 24 B.
- the selected configuration 24 B may change the appearance of the interface element 24 , such as by highlighting or otherwise marking the interface element 24 , or by executing an application or function related to the interface element 24 , such as by displaying a related image or starting a related application or media file.
- the frame 22 optionally includes a scroll indicator 26 to indicate a visible portion of the frame 22 in relation to the entire frame 22 .
- the scroll indicator 26 is a sliding indicator that shows at a relatively high granularity the visible portion of the frame 22 . It is to be understood that the scroll indicator 26 may be any of a variety of formats, including a selector for discrete sub-frames or pages of the frame 22 in contrast to the sliding indicator. In various examples, the scroll indicator 26 may be utilized to scroll within the frame 22 , such as by selecting the scroll indicator 26 via the user input device 12 and manipulating the scroll indicator 26 .
- the user input device 12 may be utilized to interact with the GUI 20 .
- the user input device 12 may interact with the GUI 20 by selecting interface elements 24 and scrolling the frame 22 based on a single interface mode.
- the single interface mode of a touchscreen or touchpad may be the various types of touches the user applies to the user input device 12 .
- a mouse may have a motion detector and a button.
- the mouse may be configured to both select and scroll without using the button.
- multi-mode user input devices 12 may be utilized in a single mode per the systems and methods disclosed herein.
- the GUI 20 includes a first axis 28 and a second axis 30 orthogonal to the first axis 28 .
- the scroll of the frame 22 is along or substantially along the first axis 28 .
- the first axis 28 is vertical relative to the illustrated orientation of the GUI 20 , though it is to be understood that the first axis 28 may, in various examples, be any of a variety of alternative orientations, such as horizontal or diagonal. In such examples, the frame 22 would thus scroll horizontally or diagonally, as appropriate.
- an input along the first axis 28 may cause the visual display 14 to display the frame 22 scrolling in the direction of the input.
- a touch and drag motion in substantially the downward or upward direction (i.e., substantially along the first axis 28 ) may cause the frame 22 to scroll in the corresponding direction, while a touch and drag motion substantially sideways (i.e., substantially along the second axis 30 ) would not cause the frame 22 to scroll.
- a user input may be determined to be substantially along one of the axes 28 , 30 based on various margins of error.
- an input is substantially along an axis 28 , 30 if the input indicates movement within thirty (30) degrees of the axis 28 , 30 .
- the remaining thirty (30) degrees may not indicate a command or may be allocated to an alternative command, such as selection of an interface element 24 .
- An alternative example would include movement within ten (10) degrees of the axes 28 , 30 indicating scrolling or selection.
- any input that indicates movement may be determined to be substantially along the axis 28 , 30 with the greatest component of motion (e.g., the largest vector), and the input may be interpreted as being substantially along that axis 28 , 30 .
- one axis 28 , 30 may be favored over the other. For instance, movement within sixty (60) degrees of the vertical axis 28 may cause scrolling while movement within thirty (30) degrees of the horizontal axis 30 may cause selection of interface elements 24 . In such an example, then, scrolling may be favored over selection, with a user needing to be relatively close to the horizontal axis 30 to cause selection. It is to be recognized that many configurations are possible, and that selection may be favored over scrolling in various configurations. In various examples, the various angles to the axes 28 , 30 are separately selectable and configurable to tune selection and scrolling, such as to a user preference or other factor.
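The angle-based classification above, including the asymmetric sixty/thirty-degree example that favors scrolling over selection, might be sketched as follows. The function name and threshold constants are illustrative assumptions, not part of the disclosure:

```python
import math

# Illustrative thresholds mirroring the 60/30-degree example above, in which
# scrolling is favored over selection. Both are independently configurable.
SCROLL_ANGLE = 60.0  # degrees from the vertical first axis that cause scrolling
SELECT_ANGLE = 30.0  # degrees from the horizontal second axis that cause selection

def classify_drag(dx: float, dy: float) -> str:
    """Classify a drag vector (dx, dy) as 'scroll', 'select', or 'none'."""
    if dx == 0 and dy == 0:
        return "none"
    # Angle of the drag measured from the vertical (first) axis, in [0, 90].
    angle_from_vertical = math.degrees(math.atan2(abs(dx), abs(dy)))
    if angle_from_vertical <= SCROLL_ANGLE:
        return "scroll"
    if (90.0 - angle_from_vertical) <= SELECT_ANGle if False else SELECT_ANGLE:
        return "select"
    return "none"
```

With these thresholds a perfectly diagonal drag (45 degrees) falls inside the scrolling band, reflecting the "scrolling favored over selection" configuration described above.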
- an input received by the user input device 12 that does not indicate movement may be a selection of an interface element 24 in relation to the input (for instance, with respect to a touchscreen, a touch and release or tap on the touchscreen in relation to a particular interface element 24 ).
- a tap on the touchscreen may be in relation to a particular interface element 24 if the tap touches the interface element 24 , is within a particular range of a center 32 of an interface element 24 with or without necessarily touching the interface element 24 , or is closer to the interface element 24 or the center 32 of the interface element 24 than any other interface element.
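The tap-to-element association rules above (touching the element, proximity to its center 32 , or being nearer to it than to any other element) could be sketched as a nearest-center lookup with a radius. The names and the radius parameter are illustrative assumptions:

```python
def element_for_tap(tap, elements, radius):
    """Return the id of the element whose center is nearest the tap point,
    or None if no center lies within `radius`. `elements` maps element ids
    to (x, y) center coordinates."""
    best_id, best_dist = None, None
    for elem_id, (cx, cy) in elements.items():
        d = ((tap[0] - cx) ** 2 + (tap[1] - cy) ** 2) ** 0.5
        if best_dist is None or d < best_dist:
            best_id, best_dist = elem_id, d
    return best_id if best_dist is not None and best_dist <= radius else None
```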
- While the operations for selecting one or more interface elements 24 may affirmatively select the related interface elements 24 , the same operation when applied to an interface element in the selected configuration 24 B may place the interface element in the unselected configuration 24 A.
- FIGS. 3A-6F illustrate various concepts related generally to switching between selected and unselected configurations of interface elements 24 and scrolling within the frame 22 , according to various example embodiments.
- the examples of FIGS. 3A-6F will be discussed with respect to the user input device 12 being configured to detect touchscreen or touchpad inputs, such as taps and swipes.
- the inputs may be through alternative inputs.
- a mouse or a trackball may be used to pause a cursor over an interface element 24 to perform the equivalent of a tap on a touchscreen, while a relatively substantial movement of the mouse may be the equivalent to a swipe on a touchscreen.
- a motion sensor may detect a pause or a certain gesture as equivalent to a tap and a sweep of an arm horizontally or vertically as the equivalent to a swipe. It is to be recognized that alternative user input devices 12 are contemplated with related inputs. As noted above, while the examples of FIGS. 3A-6F show a vertical first axis 28 , along which the frame scrolls, and a horizontal second axis 30 , the arrangement may be reversed, with a horizontal first axis 28 and a vertical second axis 30 .
- FIGS. 3A and 3B illustrate scrolling in a frame without selecting interface elements 24 or otherwise changing the configurations 24 A, 24 B of the interface elements, as discussed herein, in an example embodiment.
- the user input device 12 detects the placing of a detectable object, such as a finger, on a location 34 .
- the user input device 12 detects the movement of the detectable object to a second location 36 along a path 38 substantially along the first axis 28 .
- the frame 22 scrolls along the first axis 28 , as illustrated in the displacement of the interface elements 24 and the change of the scroll indicator 26 . Because the movement is along the first axis 28 without first having been along the second axis, the interface elements 24 along the path 38 are not placed in the selected configuration 24 B.
- FIGS. 4A and 4B illustrate selecting individual interface elements 24 , in an example embodiment.
- a first interface element 24 ′ is selected when the user input device 12 detects a tap on a location 40 on the first interface element 24 ′.
- a tap executes a function of the first interface element 24 ′, such as by executing a related application.
- a tap selects the interface element 24 ′ without executing a function of the interface element 24 ′.
- the tap places the interface element 24 ′ in the selected configuration 24 B.
- a second interface element 24 ′′ is selected when the user interface device 12 detects a tap in a second location 42 on the second interface element 24 ′′.
- the tap is a double tap, which executes the function of both the first, already selected interface element 24 ′ and the second interface element 24 ′′. While the example of FIGS. 4A and 4B is with respect to selecting individual interface elements 24 , it is to be understood that the principles apply equally well to switching interface elements 24 from the selected configuration 24 B to the unselected configuration 24 A.
- FIGS. 5A-5F illustrate switching a configuration of multiple interface elements 24 between the unselected configuration 24 A and the selected configuration 24 B, in an example embodiment.
- the user input device 12 detects a swipe from a first location 44 to a second location 46 along a first path 48 substantially along the second axis 30 .
- a first subset 50 of interface elements 24 in relation to the first path 48 (e.g., as touched by the first path 48 or in proximity to the first path 48 ) is displayed by the visual display 14 in the selected configuration 24 B.
- the scroll indicator 26 indicates that the visible portion of the frame 22 is approximately in the middle of the frame 22 .
- the user input device 12 detects a swipe from or near the second location 46 to a third location 52 on the bottom row 54 along a second path 56 along the first axis 28 .
- the first subset 50 is expanded to a second subset 58 to include the interface elements 24 within a rectangular shape defined by the first and second paths 48 , 56 .
- the visual display 14 displays the interface elements 24 in the second subset 58 in the selected configuration 24 B.
- the swipe along the second path 56 is based on a continuous input with the user input device 12 along the first and second paths 48 , 56 (e.g., a user's finger is in continuous contact with a touchscreen from the first location 44 to the third location 52 ).
- the input may be discontinuous (e.g., any subsequent swipe on the touchscreen results in the selection of the related interface elements 24 ).
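The rectangular expansion above, in which a horizontal path and a subsequent vertical path together span a rectangle of selected elements, might be sketched as follows, with each interface element represented by its center point. The function name and arguments are hypothetical:

```python
def rect_subset(elements, x0, x1, y0, y1):
    """Return the ids of elements whose centers fall inside the axis-aligned
    rectangle spanned by a horizontal path (x0 -> x1) and a vertical path
    (y0 -> y1). `elements` maps element ids to (x, y) center coordinates."""
    xmin, xmax = min(x0, x1), max(x0, x1)
    ymin, ymax = min(y0, y1), max(y0, y1)
    return {eid for eid, (x, y) in elements.items()
            if xmin <= x <= xmax and ymin <= y <= ymax}
```

Called repeatedly as the second path grows, this yields the expansion of the first subset into the second subset described above.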
- in FIG. 5C , after the input holds at the third location 52 on the original bottom row 54 after traversing along the second path 56 , the frame 22 scrolls down and a new bottom row 60 of interface elements 24 is displayed.
- the scroll indicator 26 updates to indicate that, as illustrated, the bottom of the frame 22 has been reached.
- Some of the interface elements 24 of the new bottom row 60 are included in the second subset 58 based on their relation to the first and second paths 48 , 56 .
- the input as detected by the user input device 12 ceases and the interface elements 24 remain in the selected configuration 24 B.
- the interface elements 24 of the first and second subsets 50 , 58 may optionally be held in the unselected configuration 24 A until the input ceases, such as is the case in FIG. 5D and is optionally the case in FIG. 5B , whereupon the interface elements 24 of the first and second subsets 50 , 58 are displayed by the visual display in the selected configuration 24 B.
- the user input device 12 detects an input from a fourth location 62 to a fifth location 64 along a third path 66 along the second axis 30 to form a third subset 68 .
- the selected and unselected configuration of the interface elements 24 in relation to the third path are switched to the opposing configuration.
- interface elements 24 are placed in the unselected state 24 A if the interface element 24 started in the selected state 24 B and in the selected state if the interface element 24 started in the unselected state 24 A.
- the user input device 12 detects a swipe from or near the fifth location 64 to a sixth location 70 along a fourth path 72 along the first axis 28 .
- the third subset 68 is expanded to a fourth subset 74 to include the interface elements 24 within a rectangular shape defined by the third and fourth paths 66 , 72 .
- the visual display 14 displays each of the interface elements 24 in the fourth subset 74 in a configuration 24 A, 24 B that is switched from the individual interface elements' 24 original configuration 24 A, 24 B.
- interface elements 24 are placed in the unselected state 24 A if the interface element 24 started in the selected state 24 B and in the selected state if the interface element 24 started in the unselected state 24 A. Interface elements that were originally placed in the third subset 68 are not changed again.
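The switching behavior above, including the rule that elements already switched during the gesture are not changed again as the region grows, could be sketched with a per-gesture "toggled" set. The names are illustrative assumptions:

```python
def update_toggles(selected, toggled, in_region):
    """Switch the configuration of elements newly covered by the gesture
    region: selected elements become unselected and vice versa. Elements
    in `toggled` were already switched during this gesture and are skipped."""
    for eid in in_region - toggled:
        if eid in selected:
            selected.discard(eid)  # was selected 24B -> now unselected 24A
        else:
            selected.add(eid)      # was unselected 24A -> now selected 24B
        toggled.add(eid)
```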
- the swipe along the fourth path 72 is based on a continuous input with the user input device 12 along the third and fourth paths 66 , 72 (e.g., a user's finger is in continuous contact with a touchscreen from the fourth location 62 to the sixth location 70 ).
- the input may be discontinuous (e.g., any subsequent swipe on the touchscreen results in the ultimate change of configuration 24 A, 24 B of the related interface elements 24 ).
- the interface elements in relation to the third path 66 may be held in their respective original configurations 24 A, 24 B and then switched to their changed configuration 24 A, 24 B, as appropriate. It is to be understood that the example above is exemplary and that the additional movement along the first axis 28 is optional.
- the configuration of interface elements 24 may be switched between the unselected and selected configurations 24 A, 24 B simply on the basis of a swipe along the second axis 30 . As noted above, execution of the interface elements 24 in the selected configuration 24 B may occur automatically following the selection of the interface elements 24 or may be on the basis of an additional input.
- the locations 44 , 46 , 52 , 62 , 64 , 70 represent a farthest-point of movement along the respective axes 28 , 30 .
- the user interaction may be thought of as applying “ink”; the interface items 24 at the farthest extent that the user traveled along each axis 28 , 30 become selected even if and after the user interaction retreats away from the interface items 24 .
- the locations 44 , 46 , 52 , 62 , 64 , 70 are definitionally the farthest extent of the user interaction along each axis 28 , 30 .
- the user may overshoot the locations 44 , 46 , 52 , 62 , 64 , 70 but ultimately return to the locations 44 , 46 , 52 , 62 , 64 , 70 .
- the user may proceed past location 46 along the first path 48 but ultimately return to the location 46 before proceeding along the second path 56 to show the selected configuration as illustrated.
- the locations 44 , 46 , 52 , 62 , 64 , 70 define starting points and ending points and the interface items 24 affected by the user interaction are redefined over the course of the user interaction.
- interface items 24 that were once selected 24 B may revert to the unselected state 24 A, or vice versa. It is to be recognized and understood that while the discussion of this and the preceding paragraph is discussed in particular with respect to FIGS. 5A-5F , the principles discussed are applicable throughout this disclosure.
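The "ink" interpretation above, where the farthest extent reached along each axis defines the affected elements even after the interaction retreats, might be tracked as a pair of running maxima. This is a sketch; the class and method names are assumptions:

```python
class InkSelection:
    """'Ink' mode: the selection extends to the farthest point reached along
    each axis during the gesture and never retreats when the input moves back
    toward its origin."""

    def __init__(self, origin):
        self.ox, self.oy = origin
        self.max_dx = 0.0  # farthest displacement reached along the second axis
        self.max_dy = 0.0  # farthest displacement reached along the first axis

    def move_to(self, x, y):
        """Record a new input position, keeping the per-axis maxima."""
        self.max_dx = max(self.max_dx, abs(x - self.ox))
        self.max_dy = max(self.max_dy, abs(y - self.oy))

    def extent(self):
        """Return the farthest (dx, dy) extent reached so far."""
        return self.max_dx, self.max_dy
```

Under the alternative interpretation described above, the extent would instead be recomputed from the current position, allowing elements to revert as the input retreats.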
- while the interface elements 24 of the example of FIGS. 5A-5F are organized into rows 76 , it is to be understood that the interface items 24 may be positioned irregularly. Individual interface elements 24 that meet the position requirements of the various locations and paths may have their configuration 24 A, 24 B switched without respect to their positional relationship to other interface elements 24 . As such, a user may reposition interface elements 24 as desired while retaining the ability to operate according to the example of FIGS. 5A-5F .
- FIGS. 6A-6F illustrate switching a configuration of multiple interface elements 24 between the unselected configuration 24 A and the selected configuration 24 B, in an example embodiment. As will be illustrated below, in contrast to the example of FIGS. 5A-5F , the example of FIGS. 6A-6F operates based on the organization and positioning of the interface elements 24 in rows 76 .
- the user input device 12 detects a swipe from a first location 78 to a second location 80 along a first path 82 substantially along the second axis 30 .
- a first subset 84 of interface elements 24 in relation to the first path 82 (e.g., as touched by the first path 82 or in proximity to the first path 82 ) is displayed by the visual display 14 in the selected configuration 24 B.
- the user input device 12 detects a swipe from or near the second location 80 to a third location 86 on the bottom row 88 along a second path 90 along the first axis 28 .
- the first subset 84 is expanded to a second subset 92 to include the interface elements 24 within a rectangular shape defined by the first and second paths 82 , 90 , as well as other interface elements 24 within the rows 76 included from the second location 80 to the third location 86 , as illustrated.
- the visual display 14 displays the interface elements 24 in the second subset 92 in the selected configuration 24 B.
- the swipe along the second path 90 is based on a continuous input with the user input device 12 along the first and second paths 82 , 90 (e.g., a user's finger is in continuous contact with a touchscreen from the first location 78 to the third location 86 ).
- the input may be discontinuous (e.g., any subsequent swipe on the touchscreen results in the selection of the related interface elements 24 ).
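The row-based expansion of FIGS. 6A-6F resembles text selection: a partial first row, a partial last row, and every intervening row in full. A minimal sketch under that reading, with rows represented as a list of lists of element ids (the function name and argument layout are illustrative assumptions):

```python
def row_range_subset(rows, start_row, start_col, end_row, end_col):
    """Row-based selection, like selecting text: take the first row from
    `start_col` onward, the last row up to `end_col`, and every row in
    between in full. `rows` is a list of lists of element ids."""
    selected = []
    for r in range(start_row, end_row + 1):
        row = rows[r]
        lo = start_col if r == start_row else 0
        hi = end_col if r == end_row else len(row) - 1
        selected.extend(row[lo:hi + 1])
    return selected
```

This contrasts with the purely rectangular subset of FIGS. 5A-5F , in which only elements inside the rectangle spanned by the two paths are affected.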
- in FIG. 6C , after the input holds at the third location 86 on the original bottom row 88 after traversing along the second path 90 , the frame 22 scrolls down and the new bottom row 94 of interface elements 24 is displayed. Some of the interface elements 24 of the new bottom row 94 are included in the second subset 92 based on their relation to the first and second paths 82 , 90 .
- the input as detected by the user input device 12 ceases and the interface elements 24 remain in the selected configuration 24 B.
- the interface elements 24 of the first and second subsets 84 , 92 may optionally be held in the unselected configuration 24 A until the input ceases, such as is the case in FIG. 6D and is optionally the case in FIG. 6B , whereupon the interface elements 24 of the first and second subsets 84 , 92 are displayed by the visual display in the selected configuration 24 B.
- the user input device 12 detects an input from a fourth location 96 to a fifth location 98 along a third path 100 along the second axis 30 to form a third subset 102 .
- the selected and unselected configuration of the interface elements 24 in relation to the third path 100 are switched to the opposing configuration. As illustrated, interface elements 24 are placed in the unselected state 24 A if the interface element 24 started in the selected state 24 B and in the selected state if the interface element 24 started in the unselected state 24 A.
- the user input device 12 detects a swipe from or near the fifth location 98 to a sixth location 104 along a fourth path 106 along the first axis 28 .
- the third subset 102 is expanded to a fourth subset 108 to include the interface elements 24 within a rectangular shape defined by the third and fourth paths 100 , 106 , as well as other interface elements 24 within the rows 76 included from the fifth location 98 to the sixth location 104 , as illustrated.
- the visual display 14 displays each of the interface elements 24 in the fourth subset 108 in a configuration 24 A, 24 B that is switched from the individual interface elements' 24 original configuration 24 A, 24 B.
- interface elements 24 are placed in the unselected state 24 A if the interface element 24 started in the selected state 24 B and in the selected state if the interface element 24 started in the unselected state 24 A. Interface elements that were originally placed in the third subset 102 are not changed again.
- the swipe along the fourth path 106 is based on a continuous input with the user input device 12 along the third and fourth paths 100 , 106 (e.g., a user's finger is in continuous contact with a touchscreen from the fourth location 96 to the sixth location 104 ).
- the input may be discontinuous (e.g., any subsequent swipe on the touchscreen results in the ultimate change of configuration 24 A, 24 B of the related interface elements 24 ).
- the interface elements in relation to the third path 100 may be held in their respective original configurations 24 A, 24 B and then switched to their changed configurations 24 A, 24 B, as appropriate. It is to be understood that the example above is illustrative and that the additional movement along the first axis 28 is optional.
- the configuration of interface elements 24 may be switched between the unselected and selected configurations 24 A, 24 B simply on the basis of a swipe along the second axis 30 . As noted above, execution of the interface elements 24 in the selected configuration 24 B may occur automatically following the selection of the interface elements 24 or may be on the basis of an additional input.
- FIG. 7 is a flow chart 78 illustrating a method, in accordance with an example embodiment, for switching an interface element between an unselected configuration and a selected configuration.
- the method may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as software run on a general-purpose computer system programmed to perform particular functions pursuant to instructions from program software, or on a dedicated machine), or a combination of both.
- the processing logic may reside in the processor 16 and memory 18 shown in FIG. 1 .
- Alternative embodiments may comprise more, less, or functionally equivalent modules or engines in various combinations. While certain modules or engines are described as performing certain operations, these descriptions are merely examples and the operations may be performed by other components or systems.
- interface elements are displayed in a frame of the visual display, the frame being configured to scroll along a first axis based on an input as received by a user input device being substantially along the first axis.
- the display of interface elements of a subset of the interface elements is switched between an unselected configuration and a selected configuration in response to a first input as received by the user input device being substantially along a second axis orthogonal to the first axis.
- switching the display of the interface elements of the subset is further based on a proximity of the first input to the interface elements.
- the user input device is a touchscreen positioned with respect to the visual display, and switching the display of the interface elements is based on the first input indicating a touching of a location on the touchscreen that corresponds with the interface elements of the first subset.
- the first input is substantially along the second axis based on being within thirty (30) degrees of the second axis.
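The thirty-degree tolerance can be checked directly from the swipe's displacement vector. The axis orientation here (second axis horizontal, first axis vertical) is an assumption for illustration; the patent only requires the two axes to be orthogonal.

```typescript
// Returns true when the swipe (dx, dy) is within 30 degrees of the
// second axis (taken here as horizontal). Axis orientation is assumed.
function isAlongSecondAxis(dx: number, dy: number): boolean {
  if (dx === 0 && dy === 0) return false; // no movement, no direction
  // angle between the swipe and the horizontal axis, in [0, 90] degrees
  const degrees = Math.atan2(Math.abs(dy), Math.abs(dx)) * (180 / Math.PI);
  return degrees <= 30;
}
```

A swipe of (10, 5), about 26.6 degrees off horizontal, would qualify; (5, 10), about 63.4 degrees off, would not.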
- switching between the unselected configuration and the selected configuration comprises changing an interface element in an initial unselected configuration to the selected configuration.
- the display of interface elements of an expanded subset of the interface elements, the expanded subset being based on the subset, is switched between the unselected configuration and the selected configuration in response to a second input, substantially along the first axis, as received by the user input device following the first input.
- the visual display does not switch the display of the interface elements between the unselected configuration and the selected configuration based on the input along the first axis without first receiving a first input along the second axis.
- the expanded subset comprises interface elements within a geometric shape defined, at least in part, by the first input and the second input.
- the interface elements are organized into a plurality of rows, and the expanded subset further comprises interface elements not within the geometric shape but within one of the plurality of rows corresponding to at least one of the interface elements within the expanded subset.
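For elements laid out in rows, the expanded subset can be computed by first finding the elements inside the rectangle defined by the two inputs and then widening to the full rows that rectangle touches. The grid model below (`GridItem` with integer row/column indices) is an assumption for illustration.

```typescript
// Sketch: rectangle defined by the two input paths, widened to whole rows.
// The grid model is an illustrative assumption.
interface GridItem {
  id: number;
  row: number;
  col: number;
}

function expandedSubset(
  items: GridItem[],
  rowA: number, rowB: number, // row extent spanned by the second-axis input
  colA: number, colB: number  // column extent spanned by the first-axis input
): GridItem[] {
  const [r0, r1] = [Math.min(rowA, rowB), Math.max(rowA, rowB)];
  const [c0, c1] = [Math.min(colA, colB), Math.max(colA, colB)];
  // elements inside the rectangular shape defined by the two paths
  const inRect = items.filter(
    (i) => i.row >= r0 && i.row <= r1 && i.col >= c0 && i.col <= c1
  );
  // widen to every element in a row the rectangle touches
  const rows = new Set(inRect.map((i) => i.row));
  return items.filter((i) => rows.has(i.row));
}
```

On a 3x3 grid, a rectangle covering only the first column of rows 0 and 1 expands to all six elements of those two rows, leaving row 2 untouched.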
- the frame is scrolled along the first axis based on the second input.
- the scrolling of the frame may be concurrent or substantially concurrent with switching the interface elements of the expanded subset.
- in operation 88 , inputs from the user input device are received by the processor and provided to the visual display. Operation 88 may be concurrent with operations 82 - 86 .
- FIG. 8 is a block diagram illustrating components of a machine 800 , according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
- FIG. 8 shows a diagrammatic representation of the machine 800 in the example form of a computer system and within which instructions 824 (e.g., software) for causing the machine 800 to perform any one or more of the methodologies discussed herein may be executed.
- the machine 800 operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine 800 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine 800 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 824 , sequentially or otherwise, that specify actions to be taken by that machine.
- the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 824 to perform any one or more of the methodologies discussed herein.
- the machine 800 includes a processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 804 , and a static memory 806 , which are configured to communicate with each other via a bus 808 .
- the machine 800 may further include a graphics display 810 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)).
- the machine 800 may also include an alphanumeric input device 812 (e.g., a keyboard), a cursor control device 814 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 816 (e.g., a disk drive), a signal generation device 818 (e.g., a speaker), and a network interface device 820 .
- the storage unit 816 includes a machine-readable medium 822 on which is stored the instructions 824 (e.g., software) embodying any one or more of the methodologies or functions described herein.
- the instructions 824 may also reside, completely or at least partially, within the main memory 804 , within the processor 802 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 800 . Accordingly, the main memory 804 and the processor 802 may be considered as machine-readable media.
- the instructions 824 may be transmitted or received over a network 826 via the network interface device 820 .
- the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 822 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions.
- machine-readable medium shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., software) for execution by a machine (e.g., machine 800 ), such that the instructions, when executed by one or more processors of the machine (e.g., processor 802 ), cause the machine to perform any one or more of the methodologies described herein.
- a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices.
- the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (18)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/021,343 US9639257B2 (en) | 2013-09-09 | 2013-09-09 | System and method for selecting interface elements within a scrolling frame |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20150074590A1 US20150074590A1 (en) | 2015-03-12 |
| US9639257B2 true US9639257B2 (en) | 2017-05-02 |
Family
ID=52626812
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/021,343 Active 2034-01-06 US9639257B2 (en) | 2013-09-09 | 2013-09-09 | System and method for selecting interface elements within a scrolling frame |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US9639257B2 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102213897B1 (en) * | 2014-10-31 | 2021-02-08 | 삼성전자주식회사 (Samsung Electronics Co., Ltd.) | A method for selecting one or more items according to an user input and an electronic device therefor |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080122796A1 (en) * | 2006-09-06 | 2008-05-29 | Jobs Steven P | Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics |
| US20080297482A1 (en) * | 2007-05-30 | 2008-12-04 | Microsoft Corporation | Recognizing selection regions from multiple simultaneous inputs |
| US20090282332A1 (en) * | 2008-05-12 | 2009-11-12 | Nokia Corporation | Apparatus, method and computer program product for selecting multiple items using multi-touch |
| US20140137039A1 (en) * | 2012-03-30 | 2014-05-15 | Google Inc. | Systems and Methods for Object Selection on Presence Sensitive Devices |
- 2013-09-09: US application US14/021,343 filed; granted as US9639257B2; status: Active
Also Published As
| Publication number | Publication date |
|---|---|
| US20150074590A1 (en) | 2015-03-12 |
Similar Documents
| Publication | Title |
|---|---|
| US9015639B2 (en) | Methods and systems for navigating a list with gestures |
| US10152948B2 (en) | Information display apparatus having at least two touch screens and information display method thereof |
| AU2017200737B2 (en) | Multi-application environment |
| CN103098011B (en) | Item display method and equipment |
| US9104440B2 (en) | Multi-application environment |
| US10775971B2 (en) | Pinch gestures in a tile-based user interface |
| JP6141300B2 (en) | Indirect user interface interaction |
| CN103314373B (en) | Efficient Processing of Large Datasets on Mobile Devices |
| US8760474B2 (en) | Virtualized data presentation in a carousel panel |
| CN114467078A (en) | User interface adaptation based on inferred content occlusion and user intent |
| US20140068500A1 (en) | System and method for navigation of a multimedia container |
| US20140082533A1 (en) | Navigation Interface for Electronic Content |
| US11429272B2 (en) | Multi-factor probabilistic model for evaluating user input |
| US20110199386A1 (en) | Overlay feature to provide user assistance in a multi-touch interactive display environment |
| US20160088060A1 (en) | Gesture navigation for secondary user interface |
| CN105593812A (en) | Pan and selection gesture detection |
| US20220155948A1 (en) | Offset touch screen editing |
| US20190317658A1 (en) | Interaction method and device for a flexible display screen |
| US20140040789A1 (en) | Tool configuration history in a user interface |
| US20130111382A1 (en) | Data collection interaction using customized layouts |
| KR20130097491A (en) | Device and method for changing size of display window on screen |
| US20140351749A1 (en) | Methods, apparatuses and computer program products for merging areas in views of user interfaces |
| US20160103573A1 (en) | Scalable and tabbed user interface |
| CN103460174B (en) | Increase user interface element |
| US9213555B2 (en) | Off-screen window controls |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RAHN, ANDREW BENJAMIN;REEL/FRAME:031165/0638 Effective date: 20130909 |
|
| FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| AS | Assignment |
Owner name: ADOBE INC., CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:ADOBE SYSTEMS INCORPORATED;REEL/FRAME:048867/0882 Effective date: 20181008 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |