CN107111423B - Selecting actionable items in a graphical user interface of a mobile computer system - Google Patents

Selecting actionable items in a graphical user interface of a mobile computer system

Info

Publication number
CN107111423B
CN107111423B CN201580072555.XA
Authority
CN
China
Prior art keywords
gesture
actionable
detecting
touch screen
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201580072555.XA
Other languages
Chinese (zh)
Other versions
CN107111423A (en)
Inventor
M-G. Lavoie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oracle International Corp
Original Assignee
Oracle International Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oracle International Corp
Publication of CN107111423A
Application granted
Publication of CN107111423B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Abstract

A computer-based method is provided for selecting actionable items displayed in a graphical user interface on a mobile computer system having a processor and a touch screen. The method includes detecting a first gesture ending at a first location in a lower portion of the touch screen; activating a one-handed operation mode in response to detecting the first gesture; determining a sequence of actionable items for a foreground application executing on the processor; detecting a second gesture on the touch screen, the second gesture beginning at the first location and ending at a second location; in response to detecting the second gesture, highlighting one of the actionable items of the sequence based on the second location; detecting a third gesture on the touch screen; and, in response to detecting a release at the second location as the third gesture, selecting the highlighted actionable item for processing by the foreground application and disabling the one-handed operation mode.

Description

Selecting actionable items in a graphical user interface of a mobile computer system
Technical Field
One embodiment relates generally to mobile computer systems, and more particularly to graphical user interfaces for mobile computer systems.
Background
As the size of mobile computer system touch screens increases, one-handed operation of these devices becomes increasingly difficult. For example, a large screen smart phone as shown in FIG. 1 (such as, for example, the LG G3 Android, Samsung Galaxy S5, HTC One Max Android, Apple iPhone 6 Plus, etc.) may include a touch screen 2 having a height of about 5 inches and a width of about 3 inches. As depicted in FIG. 1, gripping the large screen smartphone 1 in the right hand alone allows the thumb 5 of the average user a range of movement that provides access to the home button 3 and a pie-shaped area of the touch screen 2, which is typically defined by a semicircular arc 4, the bottom of the touch screen 2, and a portion of the right side of the touch screen 2. A user with a larger hand and longer thumb will be able to access a larger area of the touch screen 2, while a user with a smaller hand and shorter thumb will be able to access a smaller area of the touch screen 2. Unfortunately, selectable elements of a graphical user interface displayed within the upper portion 6 of the touch screen 2 for applications executing in the foreground (such as, for example, icons in a home screen, graphical control elements in a browser application, etc.) are not accessible using the thumb 5 and a one-handed grip.
Several smart phone manufacturers have attempted to alleviate the difficulties associated with one-handed operation of large screen smart phones. In one attempt, the user may switch the smartphone 1 to a small screen mode that reduces the graphical user interface to a smaller size for display in the lower portion of the touch screen 2. Reducing the size of the graphical user interface so that the user can access all of its elements often defeats the original purpose of having a large touch screen, and may make certain elements of the graphical user interface too small to be selected with the thumb 5, too small to be read, etc. In another attempt, the user may activate a reachability mode that slides the upper portion of the graphical user interface down to the lower portion of the touch screen 2, while removing the lower portion of the graphical user interface from the touch screen 2 entirely. Once the reachability mode is activated, only the elements of the graphical user interface visible in the lower portion of the touch screen 2 may be selected by the user, which may make certain gestures impossible to perform, etc.
Disclosure of Invention
Embodiments of the present invention provide a computer-based method for selecting actionable items displayed in a graphical user interface on a mobile computer system having a processor and a touch screen. The method includes detecting a first gesture ending at a first location in a lower portion of the touch screen; activating a one-handed operation mode in response to detecting the first gesture; determining a sequence of actionable items for a foreground application executing on the processor; detecting a second gesture on the touch screen, the second gesture beginning at the first location and ending at a second location; in response to detecting the second gesture, highlighting one of the actionable items of the sequence based on the second location; detecting a third gesture on the touch screen; and, in response to detecting a release at the second location as the third gesture, selecting the highlighted actionable item for processing by the foreground application and disabling the one-handed operation mode.
Drawings
FIG. 1 is a diagram depicting a mobile computer system held in a user's hand.
FIG. 2 is a system block diagram of a mobile computer system according to an embodiment of the present invention.
FIG. 3 is a diagram depicting a mobile computer system held in a user's hand according to an embodiment of the invention.
FIGS. 4A and 4B present a flow diagram depicting at least some of the functionality of the selection module of FIG. 2, according to an embodiment of the invention.
FIGS. 5A-5J are diagrams of a mobile computer system depicting a one-handed mode of operation, according to embodiments of the present invention.
Detailed Description
Embodiments of the present invention will now be described with reference to the drawings, wherein like reference numerals refer to like parts throughout.
Embodiments of the present invention advantageously provide a short-range, one-handed mode of operation on a mobile computer system. In one embodiment, in response to detecting a first gesture on the touch screen, a sequence of actionable items for a foreground application executing on the mobile computer system is determined; in response to detecting a second gesture on the touch screen, one of the actionable items of the sequence is highlighted; and in response to detecting a third gesture, the highlighted actionable item is selected for processing by the foreground application and the one-handed operation mode is disabled.
In one embodiment, the first gesture is a horizontal swipe using the thumb, which may begin at a side edge of the lower portion of the touch screen. In another embodiment, the second gesture is a vertical swipe using the thumb. The vertical swipe need not remain parallel to the side edges of the touch screen and may trace an arc generally similar to arc 4 of FIG. 1; the vertical component of the second gesture is used to determine which actionable item should be highlighted. In yet another embodiment, the third gesture is a thumb lift (release). In some embodiments, the first gesture and the second gesture are performed as one continuous motion, i.e., without lifting the thumb from the touch screen. Advantageously, all three gestures can be accomplished with the user's thumb without readjusting the user's grip on the mobile computer system, requiring less than two inches of thumb movement in many cases. These gestures may be performed by the user significantly faster than other methods of controlling a graphical user interface.
In some embodiments, the actionable item sequence is determined by identifying all actionable items displayed on the touch screen by the foreground application and then arranging the actionable items into a sequence in a predetermined order. In one embodiment, the predetermined order begins at the top left corner of the graphical user interface and ends at the bottom right corner of the graphical user interface. After the one-handed mode of operation is activated by the first gesture, the second gesture allows the user to travel through the sequence of actionable items, highlighting each item in turn until the desired actionable item is highlighted. The third gesture selects the highlighted actionable item for processing by the foreground application and disables the one-handed operation mode. Advantageously, during the one-handed mode of operation, the actionable items are displayed in their original positions in the graphical user interface.
In some embodiments, the foreground application may be the home screen(s) of the operating system, which display one or more pages of icons representing applications stored on the mobile computer system. The icons are displayed in a grid and, in one embodiment, the sequence of actionable items includes a list of icons that begins with the first icon at the upper left of the grid and proceeds to the last icon at the lower right of the grid. Other icon list sequences are also contemplated. When the user performs the second gesture (such as, for example, a vertical swipe or an arc), an icon is highlighted based on the vertical movement of the thumb during the second gesture. The icons are highlighted in order as the thumb moves upward during the second gesture, and in the reverse order as the thumb moves downward. Highlighting may include changing the color of the icon, adding a border around the icon, and the like. When the third gesture (such as a release) is detected, the icon highlighted at the current location is selected for processing by the operating system (e.g., launching the highlighted application), and the one-handed mode of operation is disabled.
If the second gesture continues until the last icon in the sequence is highlighted, then additional vertical movement of the thumb causes the last icon to be de-highlighted, and a release at this point deactivates the one-handed mode of operation without selecting an icon for processing by the operating system. Alternatively, a horizontal swipe may be used instead of a release. This may occur at either end of the sequence, i.e., at the beginning of the list or at the end of the list. For example, continuation of the second gesture in the upward direction will highlight each icon in the list until the last icon in the list is de-highlighted. A release or horizontal swipe at this point will deactivate the one-handed operation mode. If the user reverses direction during the continuation of the second gesture, the icons will be highlighted in reverse order until the first icon in the list is de-highlighted. A release or horizontal swipe at this point will also deactivate the one-handed operation mode.
The method of the present invention may be applied to other foreground applications executing on a mobile computer system. In these cases, the actionable items can include selectable elements of a graphical user interface for the application, such as, for example, icons, graphical control elements, widgets, tabs, and so forth. For example, if a text box is highlighted and then selected by the user, the foreground application may process the selection by presenting a cursor within the text box and a keyboard in a lower portion of the touch screen to receive user input.
FIG. 2 is a system block diagram of mobile computer system 10 according to an embodiment of the present invention.
The mobile computer system 10 includes a rechargeable battery 11, a bus 12 or other communication mechanism for communicating information, and a System-on-Chip ("SoC") 21 coupled to the bus 12 for processing information. The SoC 21 is an integrated circuit that integrates the major components of the mobile computer system 10 onto a single chip. The SoC 21 may include a multi-core processor 22 having two or more processor cores, which may be any type of general purpose or special purpose processor cores. The number of processor cores may be 2, 4, etc. The SoC 21 may also include a multi-core graphics processor 23 having two or more graphics cores, which may be any type of general purpose or special purpose graphics cores. The number of graphics cores may be 2, 4, etc. The SoC 21 includes shared memory 24, such as an L1 cache for each processor and graphics core, and L2 and L3 caches accessible by the multi-core processor 22 and the multi-core graphics processor 23, and so on. In alternative embodiments, the SoC 21 may be replaced by two or more single-core processors and supporting components, combinations of single-core and multi-core processors and supporting circuitry, and the like.
The mobile computer system 10 also includes a memory 14 and a file system 17, where the memory 14 is used to store instructions and information to be executed by the multi-core processor 22 and the multi-core graphics processor 23. The memory 14 may be comprised of any combination of storage devices, such as, for example, volatile memory 14-v, including random access memory ("RAM"), etc., and non-volatile memory 14-n, including NAND flash memory, read only memory ("ROM"), etc. Computer readable media can be any available media that can be accessed by the components of the SoC 21, including volatile memory 14-v and non-volatile memory 14-n, removable and non-removable media, and communication media. Communication media may include computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
The mobile computer system 10 also includes a communication interface 20 coupled to the bus 12 to provide access to various wireless networks, such as a cellular network 30, a Wi-Fi network 32, a Bluetooth network 34, and/or various Bluetooth devices, etc., as well as one or more wired networks or devices, such as a universal serial bus ("USB"), via a USB connector 25. The mobile computer system 10 also includes a liquid crystal display ("LCD") touch screen 26 coupled to the bus 12 and one or more sensors, buttons, and/or switches 28, such as, for example, a touch pad, a home button, a touch ID sensor/home button combination, a programmable mute switch, a volume rocker switch, etc., to enable a user to interface with the mobile computer system 10.
In one embodiment, the memory 14 stores software modules that provide functionality when executed by the multi-core processor 22 and the multi-core graphics processor 23. These modules include an operating system 15 that provides operating system functionality for the system 10. These modules also include a selection module 16 for selecting actionable items displayed in the graphical user interface on the LCD touch screen 26, as well as all other functions disclosed herein. The system 10 may include one or more additional functional modules 18, such as, for example, applications, add-ons, and the like. Alternatively, the selection module 16 may be included within the operating system 15 and/or the functional modules 18. The file system 17 provides, among other things, centralized storage for the operating system 15, the selection module 16, and the functional modules 18.
FIG. 3 is a diagram depicting a mobile computer system 10, such as the Apple iPhone 6 Plus, held in a user's hand, and will be used to illustrate certain aspects of the present invention. When held in the user's right hand, the thumb 5 can access the touch ID sensor/home button 28 and a pie-shaped area of the touch screen 26, where the pie-shaped area of the touch screen 26 is generally defined by the semicircular arc 4, the bottom of the touch screen 26, and a portion of the right side of the touch screen 26. Although the mobile computer system 10 is depicted as being held in a portrait orientation, a user may also hold the mobile computer system 10 in a landscape orientation. Furthermore, embodiments of the present invention are not limited to large screen mobile devices.
The operating system 15, such as Apple's iOS 8, displays to the user one or more home screens 40 on which application icons and application folder icons 50 are displayed. In this example, up to twenty-four (24) icons 50 may be displayed in each home screen 40, in six rows of four icons each. A dock 42 located at the lower portion of each home screen displays the same set of application icons on each home screen 40. In this example, up to four (4) application icons may be displayed in a single row in the dock 42. Dots (not shown for clarity) above the dock 42 indicate how many home screens 40 are available to the user and which home screen 40 is currently being viewed. A status bar (not shown for clarity) at the top of the touch screen 26 displays various icons including, for example, a cell signal icon, an airplane mode icon, a network icon, a Wi-Fi icon, a battery icon, and the like.
In this example, the operating system 15 is a foreground application, and each application icon is an actionable item that, when selected by a user, causes the operating system 15 to launch a particular application.
FIGS. 4A and 4B present a flow diagram 100 depicting at least some of the functionality of the selection module 16 of FIG. 2, according to an embodiment of the invention. In some embodiments, the functionality of the flow diagram 100 may be implemented by software stored in a memory or other computer-readable tangible, non-transitory medium and executed by the multi-core processor 22, the multi-core graphics processor 23, or the like. In other embodiments, the functionality may be performed by other SoC 21 components, such as, for example, an application specific integrated circuit ("ASIC"), a programmable gate array ("PGA"), a field programmable gate array ("FPGA"), etc., or by any combination of hardware and software.
More specifically, the flow diagram 100 depicts a method for selecting actionable items displayed in a graphical user interface on the mobile computer system 10 having the touch screen 26, in accordance with an embodiment of the present invention. FIGS. 5A-5J are diagrams depicting the mobile computer system 10 in a one-handed mode of operation, which will be used to illustrate certain aspects of the present invention. FIG. 5A depicts the operating system 15 as the foreground application. For convenience, the home screen 40 includes twenty-four icons numbered 50.1 through 50.24, while the dock 42 includes four icons numbered 50.25 through 50.28. An icon folder may also be displayed on the home screen 40. The touch screen 26 includes a bottom edge 27 and side edges 29.
The method depicted in flowchart 100 begins by detecting (110) a gesture and, in response, activating (120) a single-handed operation ("SHO") mode. In one embodiment, the SHO mode may be implemented using a simple add-on controller, TapRemote. In the -viewDidAppear message of a conventional view controller, a simple class method can be used to initialize the inventive gesture handler, such as, for example:
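A minimal sketch of such a call, assuming the TapRemote interface declared below:

// Install the SHO gesture handler for the current view controller.
[TapRemote tapRemoteForController:self];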
The inventive functionality may be deactivated by sending a null (nil) controller to the TapRemote class:
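Again as a minimal sketch under the same assumption:

// Passing nil detaches the gesture handler, deactivating the SHO functionality.
[TapRemote tapRemoteForController:nil];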
In another embodiment, for Apple iOS, the SHO mode can be implemented using several Objective-C protocols and interfaces for the Cocoa Touch framework:

// Protocol adopted by the target view controller to filter (remove or add)
// the actionable views captured from the graphical user interface.
@protocol TapRemoteControllerFeedback <NSObject>
@optional
- (NSArray *)filterTapRemoteTargetViews:(NSArray *)views;
@end

// Add-on controller that installs the SHO-mode gesture handling for a given
// view controller, or detaches it when passed nil.
@interface TapRemote : NSObject <UIGestureRecognizerDelegate>
+ (void)tapRemoteForController:(UIViewController *)viewController;
@end
The gesture used to activate the SHO mode should be unique with respect to the foreground application, in order to prevent interference with the foreground application's gesture recognition; likewise, while the SHO mode is active, the foreground application should be prevented from interpreting gestures made by the user.
In one embodiment, the gesture is a horizontal swipe of the thumb 5 on the lower portion of the touch screen 26, starting at a side edge of the touch screen 26 and proceeding toward the center of the touch screen 26. The height of the lower portion of the touch screen 26 on which the SHO mode activation gesture is made depends on the gesture recognition provided by the foreground application. For example, when the operating system 15 is the foreground application, a horizontal swipe across the home screen 40 is recognized and results in switching between home screens 40. However, a horizontal swipe across the dock 42 is not recognized by the operating system 15, and thus the height of the lower portion of the touch screen 26 on which the SHO mode activation gesture is made may be set accordingly. In one embodiment, the lower portion has a predetermined height "h" above the bottom edge 27 of the touch screen 26, wherein the predetermined height "h" is about 15% of the height "H" of the touch screen 26.
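As a sketch, the activation region can be derived from the screen bounds (the 15% ratio follows the embodiment above; the variable names are assumptions):

// Lower portion of the touch screen that accepts the SHO activation swipe.
CGFloat screenHeight = CGRectGetHeight([UIScreen mainScreen].bounds);  // height "H"
CGFloat lowerPortionHeight = 0.15f * screenHeight;                     // predetermined height "h"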
FIG. 5B depicts a horizontal swipe 112 that activates the SHO mode when the operating system 15 is the foreground application. The horizontal swipe 112 begins at the right side edge 29 of the touch screen 26, ends at location 114, and may be performed by the user's right thumb 5. Alternatively, the swipe 112 may begin at the left side edge 29, or the user may hold the mobile computer system 10 in the left hand and swipe from either side edge using the left thumb.
The method continues by determining (130) a sequence of actionable items for the foreground application. In some embodiments, the actionable item sequence is determined by identifying all actionable items displayed on the touch screen 26 by the foreground application, creating an initial list of actionable items, and then arranging the initial list into an ordered list. In one embodiment, actionable items are captured from the graphical user interface and then sent to a target view controller for filtering. Based on the state of the foreground application, actionable items may be removed from, or added to, the initial list, which advantageously provides an opportunity to include graphical user interface elements (such as icons, images, etc.) that are not otherwise captured from the graphical user interface as actionable items.
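As a minimal sketch (assuming UIKit views and the TapRemoteControllerFeedback protocol above; bannerImageView is a hypothetical element added by the view controller), the filtering hook might look like:

// Optional hook from TapRemoteControllerFeedback: drop hidden or disabled
// views, then append an image view that was not captured automatically.
- (NSArray *)filterTapRemoteTargetViews:(NSArray *)views {
    NSMutableArray *filtered = [NSMutableArray array];
    for (UIView *view in views) {
        if (view.hidden) continue;  // invisible elements are not actionable
        if ([view isKindOfClass:[UIControl class]] &&
            ![(UIControl *)view isEnabled]) continue;  // skip disabled controls
        [filtered addObject:view];
    }
    [filtered addObject:self.bannerImageView];  // hypothetical added actionable image
    return filtered;
}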
In one embodiment, the predetermined order begins at the top left corner of the graphical user interface and ends at the bottom right corner of the graphical user interface. For example, when the operating system 15 is the foreground application, the actionable items are the icons 50, and these icons 50 are identified and then arranged in an ordered sequence starting at icon 50.1 and ending at icon 50.28. In this example, all icons from the home screen 40 and the dock 42 are included in the sequence. Other ordered sequences are also contemplated by the present invention, such as, for example, a sequence that begins in the upper right corner and ends at the lower left corner, or a boustrophedon sequence that begins at the leftmost icon of the upper row, proceeds to the rightmost icon of that row, then descends to the rightmost icon of the next row, proceeds to the leftmost icon of that row, and so on. The ordered sequence may also depend on the orientation of the mobile computer system 10 in the user's hand, i.e., portrait or landscape.
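A minimal sketch of the default upper-left to lower-right ordering, assuming the captured actionable items are UIKit views (variable names are assumptions):

// Sort actionable views top-to-bottom, then left-to-right, in window coordinates.
NSArray *ordered = [views sortedArrayUsingComparator:^NSComparisonResult(id obj1, id obj2) {
    UIView *a = (UIView *)obj1;
    UIView *b = (UIView *)obj2;
    CGPoint pa = [a.superview convertPoint:a.frame.origin toView:nil];
    CGPoint pb = [b.superview convertPoint:b.frame.origin toView:nil];
    if (pa.y != pb.y) return pa.y < pb.y ? NSOrderedAscending : NSOrderedDescending;
    if (pa.x != pb.x) return pa.x < pb.x ? NSOrderedAscending : NSOrderedDescending;
    return NSOrderedSame;
}];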
The method continues by detecting (140) another gesture on the touch screen 26 and, in response, highlighting (150) an actionable item based on the gesture. In one embodiment, the gesture is a vertical swipe 132 starting at location 114 and ending at location 134, as depicted in FIG. 5C. The vertical swipe 132 may be performed using the same thumb 5 that performed the activation gesture, without lifting the thumb between gestures. The vertical swipe 132 need not remain parallel to the side edges of the touch screen and may trace an arc, because only the vertical component of the gesture is used to determine the location 134. In this example, the location 134 corresponds to the tenth actionable item in the sequence, icon 50.10, which is highlighted with a border 152. In other embodiments, the color of the icon 50.10 may be changed to indicate highlighting, such as, for example, with a tint color. In one embodiment, as the vertical swipe 132 moves from location 114 to location 134, the first nine icons in the sequence are successively highlighted and de-highlighted until the thumb 5 pauses at location 134, resulting in the highlighting of icon 50.10. In other words, icon 50.1 is first highlighted and de-highlighted, then icon 50.2 is highlighted and de-highlighted, and so on; at any one time, only a single icon is highlighted.
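A minimal sketch of mapping the vertical travel of the swipe to an index into the sequence (the step height, the sequence property, and the method name are assumptions):

// Upward travel advances through the sequence; downward travel moves back.
// Past either end of the sequence, no item is highlighted.
- (NSInteger)indexForGestureStart:(CGPoint)start current:(CGPoint)current {
    const CGFloat step = 12.0;  // assumed points of vertical travel per item
    NSInteger idx = (NSInteger)((start.y - current.y) / step);
    if (idx < 0 || idx >= (NSInteger)self.sequence.count) return NSNotFound;
    return idx;  // index of the actionable item to highlight
}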
The method continues by detecting (160) a further gesture on the touch screen 26.
In response to detecting the release (170), the method continues by selecting (180) the highlighted actionable item for processing by the foreground application and disabling (190) the SHO mode. In this example, as shown in FIG. 5D, after the thumb 5 is lifted from the touch screen 26 at location 134, icon 50.10 is identified to the operating system 15 and the SHO mode is deactivated, allowing the operating system 15 to resume gesture processing. The operating system 15 then launches the application associated with icon 50.10 for execution in the foreground. For other applications, if a graphical user interface control element, image, etc. is selected (180), a tap gesture at the location of the element, image, etc. is provided to the foreground application.
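As a minimal sketch of delivering such a selection (selected is an assumed reference to the chosen view; forwarding a touch-up event is one possible dispatch strategy, not necessarily the one used by the embodiment):

// For a standard control, forward the selection as a touch-up-inside event,
// which the foreground application handles as if the element had been tapped.
if ([selected isKindOfClass:[UIControl class]]) {
    [(UIControl *)selected sendActionsForControlEvents:UIControlEventTouchUpInside];
}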
In response to the horizontal swipe (200), the method continues by disabling (190) the SHO mode. In one embodiment, the horizontal swipe 202 begins at location 134 and ends at the side edge 29 of the touch screen 26, as depicted in FIG. 5E.
In response to continuing the vertical swipe (210), the method continues by determining (220) whether the end of the sequence has been reached. In one embodiment, a vertical swipe 212 begins at location 134 and ends at location 214, as depicted in FIG. 5F. The vertical swipe 212 may be performed using the same thumb 5 that performed the previous gesture, without lifting the thumb between gestures. The vertical swipe 212 may pause before the last icon 50.28 is highlighted, allowing the user to select that icon with a release. However, if the vertical swipe 212 continues until the last icon 50.28 in the sequence is highlighted, then additional vertical movement of the thumb 5 causes the last icon 50.28 to be de-highlighted, as depicted in FIG. 5F. In this example, all of the icons 50.1 through 50.28 have been successively highlighted and de-highlighted, and the end of the sequence has been reached.
The method continues by detecting (230) a further gesture. In response to the release (240) or horizontal swipe (250), the SHO mode is deactivated (190) without selecting an icon for processing by the operating system 15. As depicted in FIG. 5G, the release 170 is detected at location 214, which causes the SHO mode to be deactivated without selecting an icon for processing. Similarly, as depicted in FIG. 5H, a horizontal swipe 202 is detected starting at location 214 and continuing to the side edge 29, which causes the SHO mode to be deactivated without selecting an icon for processing.
In response to a vertical swipe in the downward direction, the method continues by highlighting (150) actionable items based on the vertical swipe. In one embodiment, a vertical swipe 222 begins at location 214 and ends at location 224, as depicted in FIG. 5I. The same thumb 5 that performed the other gestures may be used to perform the vertical swipe 222, without lifting the thumb between gestures. In this example, location 224 corresponds to the thirteenth actionable item in the sequence, namely icon 50.13, which is highlighted with the border 152. In response to detecting the release (170), the method continues by selecting (180) the highlighted actionable item for processing by the foreground application and disabling (190) the SHO mode. In this example, as depicted in FIG. 5J, after the thumb 5 is lifted from the touch screen 26 at location 224, icon 50.13 is identified to the operating system 15 and the SHO mode is deactivated, allowing the operating system 15 to resume gesture processing. The operating system 15 then launches the application associated with icon 50.13 for execution in the foreground.
If the beginning of the sequence is reached by the vertical swipe 222, then a release or horizontal swipe will disable the SHO mode without selecting an icon for processing by the operating system 15.
Embodiments of the present invention advantageously improve the operation of large screen mobile computer systems by allowing one-handed operation of certain aspects of the graphical user interface using a combination of touch screen gestures, without reducing the size of the graphical user interface, removing a portion of the graphical user interface, and the like. The present invention may be made and used in many mobile computer system industries, such as, for example, large screen smart phones, small tablet computers, and the like.
In one embodiment, in response to detecting a first gesture on the touch screen, a sequence of actionable items for a foreground application executing on the mobile computer system is determined; in response to detecting a second gesture on the touch screen, one of the actionable items of the sequence is highlighted; and in response to detecting a third gesture, the highlighted actionable item is selected for processing by the foreground application and the one-handed operation mode is disabled.
The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.

Claims (21)

1. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, cause the processor to select an actionable item displayed in a graphical user interface on a mobile computer system having a touch screen, the selecting comprising:
detecting a first gesture on a lower portion of the touchscreen, the first gesture ending at a first location, the lower portion having a predetermined height above a bottom edge of the touchscreen;
activating a one-handed mode of operation in response to detecting the first gesture;
organizing existing actionable items for a foreground application executing on the processor into a sequence of actionable items without adding additional representations of the existing actionable items to the graphical user interface;
detecting a second gesture on the touch screen, the second gesture beginning at the first location and ending at a second location;
in response to detecting the second gesture, highlighting one of the actionable items of the sequence based on the second location;
detecting a third gesture on the touch screen; and
in response to detecting a release at the second location as the third gesture, selecting the highlighted actionable item for processing by the foreground application and disabling the one-handed operation mode.
2. The non-transitory computer-readable medium of claim 1, wherein the first gesture is a horizontal swipe starting at one of the side edges of the touchscreen.
3. The non-transitory computer-readable medium of claim 2, wherein the second gesture is a vertical swipe or arc.
4. The non-transitory computer readable medium of claim 3, wherein selecting an actionable item displayed in a graphical user interface on a mobile computer system with a touch screen further comprises:
disabling the one-handed operation mode in response to detecting a horizontal swipe starting at the second position and ending at one of the side edges of the touch screen as the third gesture.
5. The non-transitory computer readable medium of claim 3, wherein selecting an actionable item displayed in a graphical user interface on a mobile computer system with a touch screen further comprises:
successively highlighting actionable items of the sequence as the second gesture moves between the first location and the second location, a previous actionable item being de-highlighted before a next actionable item is highlighted.
6. The non-transitory computer readable medium of claim 5, wherein selecting an actionable item displayed in a graphical user interface on a mobile computer system with a touch screen further comprises:
in response to detecting the second gesture continuing from the second location to a third location as the third gesture, successively highlighting actionable items of the sequence as the third gesture moves between the second location and the third location, a previous actionable item being de-highlighted before a next actionable item is highlighted;
determining that the end of the actionable item sequence has been reached; and
deactivating the one-handed operation mode in response to detecting a fourth gesture on the touch screen, the fourth gesture being a release or a horizontal swipe starting at the third location and ending at one of the side edges of the touch screen.
7. The non-transitory computer readable medium of claim 1, wherein the predetermined height is about 15% of a height of the touch screen.
8. The non-transitory computer-readable medium of claim 1, wherein the sequence maps actionable items in the graphical user interface that start in an upper left corner of the graphical user interface and end in a lower right corner of the graphical user interface.
9. The non-transitory computer readable medium of claim 1, wherein the actionable items are a plurality of icons arranged in a grid pattern, and highlighting the actionable items comprises changing a color of the icons.
10. The non-transitory computer readable medium of claim 1, wherein the actionable items are a plurality of graphical control elements, and highlighting the actionable items comprises adding a border around the graphical control elements.
11. A computer-based method for selecting actionable items displayed in a graphical user interface on a mobile computer system having a processor and a touch screen, the method comprising:
detecting a first gesture on a lower portion of the touchscreen, the first gesture ending at a first location, the lower portion having a predetermined height above a bottom edge of the touchscreen;
activating a one-handed mode of operation in response to detecting the first gesture;
organizing existing actionable items for a foreground application executing on the processor into a sequence of actionable items without adding additional representations of the existing actionable items to the graphical user interface;
detecting a second gesture on the touch screen, the second gesture beginning at the first location and ending at a second location;
in response to detecting the second gesture, highlighting one of the actionable items of the sequence based on the second location;
detecting a third gesture on the touch screen; and
in response to detecting a release at the second location as the third gesture, selecting the highlighted actionable item for processing by the foreground application and disabling the one-handed operation mode.
12. The method of claim 11, wherein the first gesture is a horizontal swipe starting at one of the side edges of the touch screen and the second gesture is a vertical swipe or an arc.
13. The method of claim 12, further comprising:
in response to detecting a horizontal swipe starting at the second location and ending at one of the side edges of the touch screen as the third gesture, disabling the one-handed operation mode;
successively highlighting actionable items of the sequence as the second gesture moves between the first location and the second location, a previous actionable item being de-highlighted before a next actionable item is highlighted; and
in response to detecting that the second gesture continues from the second location to a third location as the third gesture:
successively highlighting actionable items of the sequence as the third gesture moves between the second location and the third location, a previous actionable item being de-highlighted before a next actionable item is highlighted;
determining that the end of the actionable item sequence has been reached; and
deactivating the one-handed operation mode in response to detecting a fourth gesture on the touch screen, the fourth gesture being a release or a horizontal swipe starting at the third location and ending at one of the side edges of the touch screen.
14. The method of claim 11, wherein the predetermined height is about 15% of a touchscreen height, and the sequence maps actionable items in the graphical user interface that start in an upper left corner of the graphical user interface and end in a lower right corner of the graphical user interface.
15. The method of claim 11, wherein the actionable items are a plurality of icons arranged in a grid pattern and highlighting the actionable items comprises changing a color of the icons, or the actionable items are a plurality of graphical control elements and highlighting the actionable items comprises adding a border around the graphical control elements.
16. A mobile computer system, comprising:
a touch screen;
a memory; and
a processor coupled to the touch screen and the memory, the processor configured to select an actionable item displayed in a graphical user interface on a mobile computer system having a processor and a touch screen, the selecting comprising:
detecting a first gesture on a lower portion of the touchscreen, the first gesture ending at a first location, the lower portion having a predetermined height above a bottom edge of the touchscreen;
activating a one-handed mode of operation in response to detecting the first gesture;
organizing existing actionable items for a foreground application executing on the processor into a sequence of actionable items without adding additional representations of the existing actionable items to the graphical user interface;
detecting a second gesture on the touch screen, the second gesture beginning at the first location and ending at a second location;
in response to detecting the second gesture, highlighting one of the actionable items of the sequence based on the second location;
detecting a third gesture on the touch screen; and
in response to detecting a release at the second location as the third gesture, selecting the highlighted actionable item for processing by the foreground application and disabling the one-handed operation mode.
17. The system of claim 16, wherein the first gesture is a horizontal swipe starting at one of the side edges of the touchscreen and the second gesture is a vertical swipe or an arc.
18. The system of claim 17, wherein the processor is further configured to:
successively highlighting actionable items of the sequence as the second gesture moves between the first location and the second location, a previous actionable item being de-highlighted before a next actionable item is highlighted;
in response to detecting a horizontal swipe starting at the second location and ending at one of the side edges of the touch screen as the third gesture, disabling the one-handed operation mode; and
in response to detecting that the second gesture continues from the second location to a third location as the third gesture:
successively highlighting actionable items of the sequence as the third gesture moves between the second location and the third location, a previous actionable item being de-highlighted before a next actionable item is highlighted;
determining that the end of the actionable item sequence has been reached; and
deactivating the one-handed operation mode in response to detecting a fourth gesture on the touch screen, the fourth gesture being a release or a horizontal swipe starting at the third location and ending at one of the side edges of the touch screen.
19. The system of claim 16, wherein the predetermined height is about 15% of a touchscreen height, and the sequence maps actionable items in the graphical user interface that begin in an upper left corner of the graphical user interface and end in a lower right corner of the graphical user interface.
20. The system of claim 16, wherein the actionable items are a plurality of icons arranged in a grid pattern and highlighting the actionable items comprises changing a color of the icons, or the actionable items are a plurality of graphical control elements and highlighting the actionable items comprises adding a border around the graphical control elements.
21. An apparatus for selecting an actionable item displayed in a graphical user interface on a mobile computer system with a processor and a touchscreen, the apparatus comprising means for performing the method of any of claims 11-15.
CN201580072555.XA 2015-01-06 2015-12-16 Selecting actionable items in a graphical user interface of a mobile computer system Active CN107111423B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201562100330P 2015-01-06 2015-01-06
US62/100,330 2015-01-06
US14/617,772 2015-02-09
US14/617,772 US9690463B2 (en) 2015-01-06 2015-02-09 Selecting actionable items in a graphical user interface of a mobile computer system
PCT/US2015/066173 WO2016111821A1 (en) 2015-01-06 2015-12-16 Selecting actionable items in a graphical user interface of a mobile computer system

Publications (2)

Publication Number Publication Date
CN107111423A CN107111423A (en) 2017-08-29
CN107111423B true CN107111423B (en) 2021-03-02

Family

ID=56286537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580072555.XA Active CN107111423B (en) 2015-01-06 2015-12-16 Selecting actionable items in a graphical user interface of a mobile computer system

Country Status (5)

Country Link
US (1) US9690463B2 (en)
EP (1) EP3243127B1 (en)
JP (1) JP6549726B2 (en)
CN (1) CN107111423B (en)
WO (1) WO2016111821A1 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9055870B2 (en) 2012-04-05 2015-06-16 Welch Allyn, Inc. Physiological parameter measuring platform device supporting multiple workflows
USD916713S1 (en) 2012-04-05 2021-04-20 Welch Allyn, Inc. Display screen with graphical user interface for patient central monitoring station
US9235682B2 (en) 2012-04-05 2016-01-12 Welch Allyn, Inc. Combined episodic and continuous parameter monitoring
US10226200B2 (en) 2012-04-05 2019-03-12 Welch Allyn, Inc. User interface enhancements for physiological parameter monitoring platform devices
US9690463B2 (en) * 2015-01-06 2017-06-27 Oracle International Corporation Selecting actionable items in a graphical user interface of a mobile computer system
US9883007B2 (en) * 2015-01-20 2018-01-30 Microsoft Technology Licensing, Llc Downloading an application to an apparatus
US10430073B2 (en) 2015-07-17 2019-10-01 Crown Equipment Corporation Processing device having a graphical user interface for industrial vehicle
USD862505S1 (en) 2015-10-02 2019-10-08 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD854557S1 (en) 2015-10-02 2019-07-23 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
CN105468280A (en) * 2015-11-13 2016-04-06 Xiaomi Technology Co., Ltd. Method and device for switching keyboard styles
BR112019006440A2 (en) 2016-11-22 2019-06-25 Crown Equip Corp display and processing device for an industrial vehicle, and, industrial vehicle
US20200097096A1 (en) * 2017-06-16 2020-03-26 Hewlett-Packard Development Company, L.P. Displaying images from multiple devices
US10871851B2 (en) * 2017-08-22 2020-12-22 Blackberry Limited Electronic device and method for one-handed operation
CN107831938A (en) * 2017-11-08 2018-03-23 Chipsea Technologies (Shenzhen) Co., Ltd. An electronic device interaction method for a touch screen
CA184008S (en) * 2018-06-08 2019-07-17 Beijing Microlive Vision Tech Co Ltd Display screen with graphical user interface
EP3814881A1 (en) * 2018-07-26 2021-05-05 Patmos, Unipessoal Lda Enhanced touch sensitive selection
USD903709S1 (en) * 2018-10-01 2020-12-01 Butterfly Network, Inc. Display panel or portion thereof with graphical user interface
USD889487S1 (en) * 2018-10-29 2020-07-07 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
USD874504S1 (en) * 2018-10-29 2020-02-04 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
JP2020177074A (en) * 2019-04-16 2020-10-29 Denso Corporation Device for vehicle, and method for controlling device for vehicle
CN111147660B (en) * 2019-12-04 2021-06-15 Huawei Technologies Co., Ltd. Control operation method and electronic equipment
CN111290691A (en) * 2020-01-16 2020-06-16 Beijing Jingdong Zhenshi Information Technology Co., Ltd. Method and device for operating page, computer equipment and readable storage medium
US11954307B2 (en) * 2020-12-04 2024-04-09 Samsung Electronics Co., Ltd. Visual selector for application activities
US11385770B1 (en) * 2021-04-21 2022-07-12 Qualcomm Incorporated User interfaces for single-handed mobile device control
CN113778276A (en) * 2021-08-27 2021-12-10 Suzhou Samsung Electronics Computer Co., Ltd. Touch terminal desktop icon control method and touch terminal


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4254573B2 (en) * 2004-02-27 2009-04-15 Hitachi, Ltd. Display method and display device
US8819569B2 (en) 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming
US20080256454A1 (en) 2007-04-13 2008-10-16 Sap Ag Selection of list item using invariant focus location
US20100008031A1 (en) 2008-07-08 2010-01-14 Emblaze Mobile Ltd Ergonomic handheld device
GB0817805D0 (en) * 2008-09-29 2008-11-05 Symbian Software Ltd Method and system for receiving and displaying unsolicited content on a device
US8464180B1 (en) * 2012-06-15 2013-06-11 Google Inc. Organizing graphical representations on computing devices
KR102036240B1 (en) * 2012-12-18 2019-10-25 Samsung Display Co., Ltd. Touch Screen Panel and Display Device Having the Same
US8976202B2 (en) * 2013-01-28 2015-03-10 Dave CAISSY Method for controlling the display of a portable computing device
US8769431B1 (en) 2013-02-28 2014-07-01 Roy Varada Prasad Method of single-handed software operation of large form factor mobile electronic devices
CN104301497A (en) * 2013-07-15 2015-01-21 ZTE Corporation Method and apparatus for displaying incoming call interface
US9690463B2 (en) * 2015-01-06 2017-06-27 Oracle International Corporation Selecting actionable items in a graphical user interface of a mobile computer system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040135823A1 (en) * 2002-07-30 2004-07-15 Nokia Corporation User input device
CN102625931A (en) * 2009-07-20 2012-08-01 Hewlett-Packard Development Company, L.P. User interface for initiating activities in an electronic device
CN102203715B (en) * 2011-05-23 2013-03-20 Huawei Device Co., Ltd. An input method, an input device and a terminal device
US20140089829A1 (en) * 2012-09-26 2014-03-27 Samsung Electronics Co., Ltd. System supporting manual user interface based control of an electronic device

Also Published As

Publication number Publication date
WO2016111821A1 (en) 2016-07-14
CN107111423A (en) 2017-08-29
EP3243127B1 (en) 2020-03-18
JP2018503209A (en) 2018-02-01
EP3243127A1 (en) 2017-11-15
US9690463B2 (en) 2017-06-27
US20160196041A1 (en) 2016-07-07
JP6549726B2 (en) 2019-07-24

Similar Documents

Publication Publication Date Title
CN107111423B (en) Selecting actionable items in a graphical user interface of a mobile computer system
US11079908B2 (en) Method and apparatus for adding icon to interface of android system, and mobile terminal
US20160202887A1 (en) Method for managing application icon and terminal
US11199942B2 (en) Method and system for sorting desktop objects
US10996786B2 (en) Method and apparatus for controlling multi window display in interface
US10423290B2 (en) Information processing apparatus
KR101972924B1 (en) Method and apparatus for designating entire area using partial area touch in a portable equipment
EP2698708A1 (en) Method for providing user interface having multi-tasking function, mobile communication device, and computer readable recording medium for providing the same
US20150185987A1 (en) Method, apparatus and computer readable medium for zooming and operating screen frame
KR20110063410A (en) Method of moving content between applications and apparatus for the same
US10514839B2 (en) Display device and display control method
WO2014196639A1 (en) Information processing apparatus and control program
EP2790096A2 (en) Object display method and apparatus of portable electronic device
CN103793137A (en) Display method and electronic device
US10788950B2 (en) Input/output controller and input/output control program
US20180088966A1 (en) Electronic device and method thereof for managing applications
JP6026363B2 (en) Information processing apparatus and control program
KR20230031970A (en) How to organize icons, icon organizers and electronic devices
US9280235B2 (en) Portable electronic device
US10318222B2 (en) Apparatus and method for screen display control in electronic device
US20150012856A1 (en) Electronic device and method for displaying user interface for one handed operation
JP2013238934A (en) Information processing device, control method and control program for information processing device, and recording medium
CN106681582A (en) Desktop icon adjusting method and device
US10691234B2 (en) Receiving input from multiple touch sensors
CN110502165B (en) Method for rapidly moving multiple APP icons

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant