US20150317048A1 - Application quick launch extension - Google Patents
Application quick launch extension
- Publication number
- US20150317048A1 (U.S. application Ser. No. 14/797,979)
- Authority
- US
- United States
- Prior art keywords
- switch
- menu
- computing device
- handheld computing
- time period
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the process starts 405 with a determination of whether one of the application switches (or buttons), e.g., calendar 510 b , has been pressed and held 410 for greater than a predetermined length of time.
- if the application switch 510 b is pressed and immediately released, the application assigned to that switch, i.e., the calendar application in this example, immediately launches and the process exits 435.
- if instead the switch is held for the predetermined time period, the event manager 315 detects additional activity corresponding to menu generation.
- the event manager 315 looks up (or accesses or notifies) 415 the preferences manager 320 to perform a menu lookup by sending a switch identifier that identifies the corresponding switch, i.e., the calendar switch 510 b in this example, that was depressed and held.
- the preferences manager 320 receives the switch identifier and in its database determines which applications correspond with the calendar application switch 510 b .
- the preferences manager 320 retrieves one or more application identifiers corresponding to the calendar application switch 510 b from its database.
- Each application identifier is linked to an executable for a corresponding application.
- the linking may include a pointer to the application executable or the executable itself.
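- By way of illustration only, the following Python sketch shows one way such a preferences lookup could be organized: a switch identifier keys a small table whose entries pair an application identifier with a launch target, standing in for a pointer to the application executable or the executable itself. The table contents follow the example above; all names are hypothetical and not part of the disclosure.

```python
# Hypothetical preferences table: a switch identifier maps to application
# identifiers, each paired with a launch target standing in for "a pointer
# to the application executable or the executable itself".
PREFERENCES_DB = {
    "calendar-switch": [
        ("tasks",      "/apps/todo"),
        ("calculator", "/apps/calc"),
        ("alarm",      "/apps/alarm"),
        ("sync",       "/apps/sync"),
    ],
}

def lookup(switch_id):
    """Return the (application identifier, launch target) pairs for a switch."""
    return PREFERENCES_DB.get(switch_id, [])

if __name__ == "__main__":
    for app_id, target in lookup("calendar-switch"):
        print(app_id, "->", target)
```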
- the preferences manager 320 instructs the program resource manager 325 to provide (or serve or supply) an icon corresponding to each application or application identifier.
- the program resource manager 325 retrieves the icon information from its database and forwards it back to the preferences manager 320 or onto the menu engine 330 .
- the menu engine receives the application identifier information from the preferences manager 320 as well as the icons information from the program resource manager 325 .
- the menu engine 330 either retrieves or is sent the icon for integration into the menu that will be displayed on the screen 130 . In particular, the menu engine 330 configures a particular layout for a menu.
- the layout includes a bounded area that is less than a size of the screen 130 and will be rendered on top or in front of a presently active display (e.g., one with which a user is interacting on the screen 130 of a handheld computing device 110 ).
- Examples of a menu layout include a pie (e.g., square, rectangle, round, or oval) with “slices” or a racetrack (oblong) with segments.
- the menu engine 330 displays (or renders) 420 that menu on the screen 130 on top or in front of the presently active display.
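- As a rough illustration of the layout step, the sketch below places four menu items inside a bounded area centered on, and smaller than, the screen. The screen and menu dimensions and slot offsets are assumptions chosen only for the example.

```python
# Rough layout sketch: four items placed in a bounded area centered on the
# screen; the dimensions and slot offsets are illustrative assumptions.
SCREEN_W, SCREEN_H = 320, 320
MENU_W, MENU_H = 160, 160          # bounded area smaller than the screen

def slice_positions(assignments):
    """assignments maps a slot ("up", "right", "down", "left") to an item name;
    returns each item's (x, y) center within the pop-up menu area."""
    cx, cy = SCREEN_W // 2, SCREEN_H // 2
    offsets = {"up": (0, -MENU_H // 4), "right": (MENU_W // 4, 0),
               "down": (0, MENU_H // 4), "left": (-MENU_W // 4, 0)}
    positions = {}
    for slot, item in assignments.items():
        dx, dy = offsets[slot]
        positions[item] = (cx + dx, cy + dy)
    return positions

if __name__ == "__main__":
    print(slice_positions({"up": "tasks", "right": "calculator",
                           "down": "alarm", "left": "sync"}))
```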
- FIG. 5 b illustrates one example of the displayed pie menu 525 corresponding to the pressed and held (for a predetermined time period greater than initial time period) calendar application button 510 b .
- the rendered menu shows each menu item 535 a - d , e.g., the slice or segment, as a selectable item.
- the selectable items include a task/to do application 535 a , a calculator application 535 b , an alarm application 535 c , and a synchronization application 535 d .
- FIG. 6 illustrates the served menu rendered on the screen 230 of the handheld computing device 110 against a context of a calendar application 610 . In one example context, this configuration may appear where a user seeks to access a task application corresponding to a particular calendar entry and desires to directly launch the application rather than search for it in nested menu lists.
- the rendered menu 525 may be displayed with one of the menu items pre-selected (e.g., a different shade for the menu item, a bolded border of the menu item, etc.). By actuating the center button on the navigation ring 520 , that selected application can be immediately launched (or executed). If another selection is desired rather than the pre-selected option, the navigation ring 520 can be used to traverse between the menu items. For example, if the task application 535 a is pre-selected and the alarm application 535 c is desired, a right portion (or part) 545 b of the navigation ring 520 can be triggered (pressed and immediately released) twice to move the selection to the alarm application 535 c via the calculator entry 535 b . Alternatively, a down (or bottom) portion 545 c of the navigation ring 520 can be selected to jump directly from the task application 535 a to the alarm application 535 c.
- each move highlights the current selection that would be executed if the center button of the navigation ring 520 were triggered.
- the process can be configured so that pressing a corresponding direction, e.g., on the navigation ring, immediately executes (or launches) the application assigned to that direction. Allowing execution by navigating directly to the appropriate application identifier within the menu further reduces the number of steps needed to execute an application.
- this process of selecting the alarm application 535 c corresponds in FIG. 4 to detection 425 of the selection activity and execution (or launch) 430 of the selected menu item before exiting 435 the process.
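- The sketch below illustrates one reading of the selection behavior described above: left and right step the highlight around the ring, up and down jump straight to the item at that slot, and the center button launches whatever is highlighted. The step-versus-jump interpretation, the item order, and the names are assumptions for illustration, not part of the disclosure.

```python
# One reading of the traversal described above; the step/jump behavior,
# item order, and names are assumptions for illustration.
RING = ["tasks", "calculator", "alarm", "sync"]   # up, right, down, left slots

class MenuSelection:
    def __init__(self):
        self.index = 0                            # pre-selected item at the "up" slot

    def press(self, direction):
        """Right/left step the highlight around the ring; up/down jump to that slot."""
        if direction == "right":
            self.index = (self.index + 1) % len(RING)
        elif direction == "left":
            self.index = (self.index - 1) % len(RING)
        elif direction == "up":
            self.index = 0
        elif direction == "down":
            self.index = 2
        return RING[self.index]

    def launch(self):
        """Center button: execute the currently highlighted item."""
        return f"launch {RING[self.index]}"

if __name__ == "__main__":
    selection = MenuSelection()
    print(selection.press("right"))   # calculator
    print(selection.press("right"))   # alarm, reached via the calculator
    print(selection.launch())
```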
- a device may be configured to include a function key that toggles keys in a keyboard, which has in one row the Q-W-E-R-T-Y keys, into a selection and navigation configuration.
- depressing and holding a function key for at least a predetermined time period may have an “*” key or ‘#’ key serve as an application button.
- the ‘D’ key (left), the ‘R’ key (up or top), the ‘G’ key (right) and the ‘C’ key (down or bottom) would function as directional elements and the ‘F’ key or an ‘enter’ key may function as launch (or execute) keys.
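- A minimal sketch of that keyboard fallback follows: the letter-to-direction assignments come from the text, while the routing function and the stand-in selection object are hypothetical and illustrative only.

```python
# Hypothetical routing of the keyboard fallback: the key assignments follow
# the text, while the handler and the stand-in selection object are assumed.
NAV_KEYS = {"D": "left", "R": "up", "G": "right", "C": "down"}
LAUNCH_KEYS = {"F", "ENTER"}

def handle_key(key, selection):
    """Translate a key press into a menu action while navigation mode is active."""
    key = key.upper()
    if key in NAV_KEYS:
        return selection.press(NAV_KEYS[key])
    if key in LAUNCH_KEYS:
        return selection.launch()
    return "ignored"

class _EchoSelection:
    """Tiny stand-in used only to demonstrate the routing."""
    def press(self, direction):
        return f"highlight moves {direction}"

    def launch(self):
        return "launch highlighted item"

if __name__ == "__main__":
    demo = _EchoSelection()
    for key in ("R", "G", "F", "Q"):
        print(key, "->", handle_key(key, demo))
```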
- the disclosed configuration increases the number of applications available for immediate launch without having to search for those applications within nested menu lists. Moreover, the availability of the applications for immediate execution is brought directly to the user's attention without having to search groupings of applications to find the particular application to launch. Overall, this multiplies the reach of dedicated user buttons. For example, one calendar application button 510 b now becomes one calendar application button 510 b plus four additional applications 535 a - d with a single action of pressing and holding the calendar button 510 b.
- the menu is configured to “mirror” or correspond with the navigation mechanism.
- the pie menu 525 is illustrated in comparison with the five-way navigation mechanism (navigation ring) 520 .
- each menu item 535 a - d of the menu 525 in this example corresponds to a location, e.g., 545 a - d , on the navigation ring 520.
- This configuration further eases use by making navigation among the menu items intuitive and quick. For example, it decreases the number of steps necessary to directly launch an application from the menu 525 : the corresponding portion of the navigation ring 520 is simply selected before the center of the navigation ring 520 is pressed and released to launch that application.
- although the menu items are disclosed as mapping to physical keys or switches, still additional alternative embodiments of an extended menu system may be implemented using the principles disclosed herein.
- for example, the menu items, i.e., the application identifiers, may be rendered on the screen 130 , which may be an enhanced touch sensitive screen that provides haptic feedback.
- the menu items themselves will have or provide a tactile feel or response so that the user can “feel” each item as a button.
- Each of these tactile buttons may be configured so that it accepts a selection from a user (e.g., the user can depress or press on the button) and in response the selected application will be launched.
- the menu boundary area may be structured similar to the navigation area so that the rendered application buttons on the screen 130 are located in areas similar to the physical application buttons.
- four application buttons may be rendered on the screen 130 , each above one of the four physical application buttons, e.g., 510 a - 510 d.
- a benefit of the haptic enabled alternative embodiment is that it provides physical-type feel for a user when that user interacts with the device.
- the number of physical application buttons in this configuration can be extended without a need for additional device real estate for physical buttons. This increases access and availability to applications within the handheld computing device.
- FIGS. 7 a - 7 e illustrate one embodiment of displayed screen shots used to configure a menu to serve and render on a screen of a mobile computing device.
- a starter application applet is configured to include a location on the menu 710 , a list of applications 715 , and soft application buttons 720 that correspond to the physical application switches, e.g., 147 ( 510 a - 510 d ), on the handheld computing device 110.
- the pie menu 525 has four menu items locations identified as “Nav Up”, “Nav Right”, “Nav Down”, and “Nav Left” (collectively 710 ) and the user selects one, e.g., “Nav Up” 710 a . It is noted that this example illustrates how the pie menu is configured to closely reflect positioning on the navigation ring, e.g., 520 , in that the menu locations correspond with the navigation ring 520 locations.
- the user then selects an application from the list 715 , e.g., “To Do List”, to display in the pie menu, e.g., 525 , at the selected location or “Nav Up” 710 a .
- the user in this example then assigns an application button on the handheld computing device 110 .
- the user selects the soft application button, e.g., 720 a or 720 b , corresponding to the physical button, e.g., calendar application button 510 b or a message application button 510 d , to which to associate the menu.
- the user selects the “Cal” soft button 720 a to assign to the calendar application button 510 b on the handheld computing device 110 .
- the menu pops up as shown in FIG. 7 c , providing a preview of what will show for display on the screen 130 of the device 110 when the calendar application button 510 b is depressed and held for a predetermined time period.
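- One way to picture the data behind this applet is a per-button map from menu location to assigned application, as in the hedged sketch below; the structure and function names are assumptions for illustration, not the applet's actual implementation.

```python
# Assumed data model for the configuration applet: each physical button owns
# a map from menu location to the application assigned there.
def assign(config, button, location, app):
    """Record that `app` appears at `location` in the menu for `button`."""
    config.setdefault(button, {})[location] = app

def preview(config, button):
    """Return the menu that would pop up when `button` is pressed and held."""
    return {loc: app for loc, app in config.get(button, {}).items() if app}

if __name__ == "__main__":
    config = {}
    assign(config, "Cal", "Nav Up", "To Do List")
    assign(config, "Cal", "Nav Right", "Calculator")
    print(preview(config, "Cal"))   # preview of the configured calendar-button menu
    print(preview(config, "SMS"))   # nothing assigned yet
```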
- FIGS. 7 c and 7 d also can be used to describe an embodiment in which a menu associated with a physical key on the handheld computing device 110 can be previewed.
- a user selects the application soft key 720 a or 720 b corresponding to the physical button the user seeks to preview, and the menu and options for that physical button are displayed.
- the “Cal” soft application button 720 a shows a first menu with a first set of menu items that correspond to executable applications available through the physical calendar application button 510 b .
- the “SMS” soft application button 720 b shows a second menu with a second set of menu items that correspond to executable applications available through the physical messaging application button 510 d .
- the selection of applications to populate the menu items may be accomplished in a ‘semi-automatic’ manner.
- menu items presented as a result of press and hold of the messaging button 510 d correspond to applications that are messaging in nature, e.g., electronic mail (e-mail), chat, voice (e.g., voice over IP), and blogging applications.
- if the messaging application button 510 d were re-assigned by the user to another category, e.g., multimedia, the four pie menu options change to four applications corresponding to that category, e.g., voice memo, camera (still and/or motion), audio player, and video player.
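- As an illustration of that 'semi-automatic' population, the sketch below fills the four menu slots from a category's application list; the category contents follow the examples in the text, and everything else is assumed.

```python
# Assumed sketch of 'semi-automatic' population: a category fills the four
# menu slots from its application list (category contents follow the text).
CATEGORY_APPS = {
    "messaging": ["e-mail", "chat", "voice over IP", "blogging"],
    "multimedia": ["voice memo", "camera", "audio player", "video player"],
}
SLOTS = ["Nav Up", "Nav Right", "Nav Down", "Nav Left"]

def populate(category):
    """Return up to four menu items for a button assigned to `category`."""
    return dict(zip(SLOTS, CATEGORY_APPS.get(category, [])[:4]))

if __name__ == "__main__":
    print(populate("messaging"))
    print(populate("multimedia"))
```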
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the terms “coupled” and “connected” along with their derivatives. It should be understood that these terms are not intended as synonyms for each other. For example, some embodiments may be described using the term “connected” to indicate that two or more elements are in direct physical or electrical contact with each other. In another example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- The present application is a continuation of co-pending, commonly assigned, U.S. patent application Ser. No. 11/694,068 filed Mar. 30, 2007 entitled APPLICATION QUICK LAUNCH EXTENSION, the disclosure of which is hereby incorporated herein by reference.
- 1. Field of Art
- The disclosure generally relates to the field of menus for mobile computing devices, for example, mobile telephones.
- 2. Description of the Related Art
- Mobile devices, specifically mobile phone devices, today are ubiquitous. As each new generation of mobile phone devices comes to market more and more functionality is incorporated into that device. To access and navigate the functionality incorporated into a mobile phone device, conventional mobile phones have incorporated linear menus.
- Generally, a linear menu is presented with a list of entries corresponding to a function. A user scrolls down through the list of entries, or taps on the item on screen (for touch screens), to access a function while minimizing the number of UI interactions. Scrolling or tapping on the screen requires more interaction from the user and requires the user to rely on visual feedback to ensure they are scrolling or tapping in the appropriate place. However, linear menus have significant drawbacks. For example, a user is required to perform a number of physical interactions with their device in order to access a particular menu entry. Another drawback is that desired menu entries may be nested, therefore increasing the number of interactions necessary to reach a particular menu entry. Moreover, such an entry may be difficult to locate among the myriad of menus available to the user.
- To alleviate some of the drawbacks of conventional linear menus, device manufacturers have incorporated dedicated physical buttons on their devices. When selected, the dedicated button executes the functionality associated with that dedicated button. Such hard wired solutions provide immediate access to common functional tasks, thereby reducing the number of interactions and easing access to such functions. However, this configuration too has a number of drawbacks. For example, as devices become smaller in form factor, the amount of available real estate on the device in which to put most commonly accessed functionality is significantly reduced.
- Moreover, as users become more sophisticated, the number of dedicated functional buttons a user desires increases, so that the limited real estate results in omissions that ultimately decrease user productivity. Further, increased user sophistication also results in different users having different views on the functionality that they find most useful, and thus, most commonly accessed. To provide each different user with a different set of dedicated application buttons causes a significant increase in manufacturing cost and complexity such that it is impractical.
- Thus, the art lacks a mechanism to provide quick access and flexible customization to access a variety of functional operations in devices, particularly those having small form factors.
- A disclosed system (and method) extends quick application launch functionality on a mobile computing device, e.g., a handheld computing device such as a mobile phone, a smartphone or the like. In one embodiment the system renders menus on a screen of the mobile computing device that correlate with a navigation mechanism.
- In one embodiment, the mobile computing device includes one or more application switches and one or more navigation switches. The application switches (or buttons) are configured so that, when depressed and immediately released (e.g., within an initial time period), they immediately launch (or execute) an application. When an application switch is depressed for a time period greater than the initial time period, an event manager detects the depression of that application switch for that predetermined time period and generates a switch identifier corresponding to the application switch.
- A preferences manager receives the switch identifier and, in response, retrieves a plurality of application identifiers corresponding to that application switch. Each application identifier is linked (e.g., by a pointer or a batch command) to an executable for a corresponding application. A menu engine displays each application identifier in a menu on the screen. Each application identifier is selectable for execution so that when it is selected, e.g., by the navigation mechanism, it launches the corresponding application.
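- For illustration, the following Python sketch shows the press-versus-hold dispatch described above; the threshold value and all names are assumptions rather than anything specified in the disclosure.

```python
# Illustrative press-versus-hold dispatch; the threshold and names are assumed.
HOLD_THRESHOLD_MS = 500  # stand-in for the "predetermined time period"

def on_switch_released(switch_id, held_ms, launch, show_menu):
    """Dispatch a completed button press.

    launch(app_id) immediately starts the directly mapped application;
    show_menu(switch_id) passes a switch identifier on for menu generation.
    """
    if held_ms < HOLD_THRESHOLD_MS:
        launch(f"default-app-for-{switch_id}")   # short press: immediate launch
    else:
        show_menu(switch_id)                     # press and hold: extended menu

if __name__ == "__main__":
    on_switch_released("calendar", 120,
                       launch=lambda app: print("launch", app),
                       show_menu=lambda sw: print("menu for", sw))
    on_switch_released("calendar", 800,
                       launch=lambda app: print("launch", app),
                       show_menu=lambda sw: print("menu for", sw))
```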
- In one example embodiment, a mobile computing device (e.g., a personal digital assistant or a smart phone) has a form factor that allows for holding within a palm of a user. The mobile computing device in this example embodiment integrates at least one application button and a five-way navigation mechanism. Each application button is directly mapped to a specific application. For example, the application button may be assigned to a calendar function so that when it is depressed and immediately released (e.g., an initial time period) the calendar application immediately launches (executes).
- By depressing and holding the calendar application button for a longer period of time (e.g., a predetermined time period that is greater than the initial time period), a menu is rendered that allows a user to select additional applications for execution (e.g., a calculator, a task list, an alarm, and a help application). In one example embodiment, the menu visually maps with the five-way navigation mechanism as a pop-up that graphically shows a relationship as a two-dimensional map.
- In this configuration, the four additional applications are shown as four menu items arranged in a circular or pie configuration. In this example configuration, the calculator is an item in a “north” or “up” (or “top”) location, the task list an item in the “east” or “right” location, the alarm an item in the “west” or “left” location, and the help an item in the “south” or “down” (or “bottom”) location. By mapping the menu in this configuration with the five-way navigation, a user can, with a finger or thumb, select each entry by selecting the corresponding location on the five-way navigation and then launch or execute the selected item by pressing the appropriate direction on the navigation or pressing and releasing a center button within the five-way navigation mechanism.
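- The mapping in this example can be pictured as a simple direction-to-application table, as in the sketch below; the application names follow the example, and the rest is assumed.

```python
# Direction-to-application table mirroring the example pie menu; assumed names.
PIE_MENU = {
    "up": "calculator",     # "north" location
    "right": "task list",   # "east" location
    "left": "alarm",        # "west" location
    "down": "help",         # "south" location
}

def item_at(direction):
    """Return the menu item sitting at the pressed direction of the five-way ring."""
    return PIE_MENU[direction]

if __name__ == "__main__":
    for direction in ("up", "right", "down", "left"):
        print(direction, "->", item_at(direction))
```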
- The disclosed example configuration has a number of advantages. For example, it eases access to a wider range of applications without a need to traverse multiple menus. The configuration also provides for ease of navigation within a menu. Moreover, it is customizable so that users can assign menu options corresponding to the applications (or functions) they access most frequently.
- The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter.
- The disclosed embodiments have other advantages and features which will be more readily apparent from the detailed description, the appended claims, and the accompanying drawings, in which:
- FIGS. 1 a-1 d illustrate one embodiment of a mobile computing device with telephonic functionality, e.g., a handheld computing device such as a smart phone.
- FIG. 2 illustrates one embodiment of an architecture of a mobile computing device.
- FIG. 3 illustrates one embodiment of a menu serving architecture, for example, in a mobile computing device.
- FIG. 4 illustrates one embodiment of a process for serving a menu and executing a menu item, for example, in a mobile computing device.
- FIGS. 5 a and 5 b illustrate one embodiment of a navigation mechanism on a mobile computing device and one embodiment of a corresponding software navigation menu.
- FIG. 6 illustrates one embodiment of a served menu rendered on a screen of a mobile computing device.
- FIGS. 7 a-7 e illustrate one embodiment of displayed screen shots used to configure a menu to serve and render on a screen of a mobile computing device.
- The Figures (FIGS.) and the following description relate to preferred embodiments by way of illustration only. It should be noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles disclosed herein.
- Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles disclosed herein.
- FIGS. 1 a-1 d illustrate one embodiment of a mobile computing device with telephonic functionality, e.g., a mobile phone or a smartphone. The computing device is configured to host and execute a phone application for placing and receiving telephone calls. It is noted that for ease of understanding the principles disclosed herein are described in an example context of a mobile computing device with telephonic functionality operating in a mobile telecommunications network. However, the principles disclosed herein may be applied in other duplex (or multiplex) telephonic contexts such as devices with telephonic functionality configured to directly interface with public switched telephone networks (PSTN) or data networks having voice over internet protocol (VoIP) functionality.
- FIGS. 1 a through 1 c illustrate embodiments of a mobile computing device 110 in accordance with the present invention. As illustrated through a front face view in FIG. 1 a, the mobile computing device 110 is configured to be of a form factor that is convenient to hold in a user's hand, for example, a personal digital assistant (PDA) or a smart phone form factor. For example, the mobile computing device 110 can have dimensions that range from 3 to 6 inches by 2 to 5 inches by 0.25 to 0.85 inches and weigh between 2 and 8 ounces.
- The mobile computing device 110 includes a speaker 120, a screen 130, a navigation area 140, a keypad area 150, and a microphone 160. The mobile computing device 110 also may include one or more switches 170 a-c (generally 170). The one or more switches 170 may be buttons, sliders, or rocker switches and can be mechanical or solid state (e.g., a touch sensitive solid state switch).
- The screen 130 of the mobile computing device 110 is, for example, a 240×240, a 320×320, or a 320×480 transflective TFT color display that includes touch screen or inductive pen support. The navigation area 140 is configured to control functions of an application executing in the mobile computing device 110 and visible through the screen 130. For example, the navigation area includes an x-way (x is, e.g., 5) navigation ring 145 that provides cursor control, selection, and the like. In addition, the navigation area 140 may include selection buttons 143 to select functions viewed just above the buttons on the screen 130. In addition, the navigation area also may include dedicated function buttons 147, e.g., for functions such as calendar or home screen. In this example, the navigation ring 145 may be implemented through mechanical switches, solid state switches, dials, or a combination thereof. The keypad area 150 may be a numeric keypad (e.g., a dialpad) or a numeric keypad integrated with an alpha or alphanumeric keypad 160 (e.g., a keyboard with consecutive keys of QWERTY, AZERTY, or another equivalent set of keys on a keyboard).
- Although not illustrated, it is noted that the mobile computing device 110 also may include an expansion slot 125. The expansion slot 125 is configured to receive and support expansion cards (or media cards), which may include memory cards such as CompactFlash™ cards, SD cards, XD cards, Memory Sticks™, MultiMediaCard™, SDIO, and the like.
- FIG. 1 b illustrates a rear-view of the example mobile computing device 110. The rear-view illustrates additional features of the mobile computing device, including a stylus housing (for holding a stylus) 182, a second speaker (e.g., for speaker phone functionality) 184, and a camera 186. Also illustrated is another switch 170 that can be used to control functions, such as the ones further described below, of the mobile computing device 110.
- In addition, FIG. 1 c illustrates a perspective view of the mobile computing device 110. FIG. 1 d illustrates a side view of the mobile computing device 110. The views in FIGS. 1 c and 1 d provide additional visual details of example switches 170 for use with applications running on the mobile computing device 110. For example, one switch can be a rocker switch 170 a while another can be a push button switch 170 b. Again, it is noted that these switches can be configured as mechanical switches or solid state switches.
- Referring next to FIG. 2, a block diagram illustrates one embodiment of an architecture 205 of a mobile computing device, e.g., 110, with telephonic functionality. By way of example, the architecture illustrated in FIG. 2 will be described with respect to the mobile computing device of FIGS. 1 a-1 d. The computing device 200 includes a central processor 220, a power module 240, and a radio subsystem 250. The central processor 220 communicates with: audio system 210, camera 212, flash memory 214, RAM memory 216, and short range radio module 218 (e.g., a Bluetooth or Wireless Fidelity (WiFi) component). The power module 240 powers the central processor 220 and the radio subsystem 250. The power module 240 may correspond to a battery pack (e.g., rechargeable) or a powerline connection or component. Other components that communicate with the processor 220 and which are powered by the power module 240 include a screen driver 230 (which may be contact or inductive-sensitive) and one or more input/output (I/O) mechanisms 245 (e.g., buttons 147, keyboards 160, slider switches, rocker switches 170 a, push button switches 170 b, touch sensitive switches, photo switches, etc.).
- In one embodiment, the I/O mechanisms 245 are configured (or structured) to receive input corresponding to user activity, for example, one or more switch mechanisms through triggering (or toggling) of a switch from one state to another through actuation, induction, photo or optical sensing, pressure sensing, or the like. The I/O mechanism 245 detects, for example, switch activity associated with the navigation area 140 such as the selection buttons 143, dedicated function buttons 147, or the navigation ring 145. The I/O mechanism 245 includes switches for at least one of the dedicated function buttons 147, which may be dedicated to immediately launch an application, e.g., a calendar program or an email program, upon triggering (e.g., actuation or other state change). Another I/O mechanism 245 includes the navigation ring 145, which can be structured as a single switch with four points where the switch may be toggled (e.g., left side of ring, right side of ring, top of ring, or bottom of ring) or it could be structured to include four separate switches (e.g., also left side of ring, right side of ring, top (or up) of ring, or bottom (or down) of ring). Note that the navigation ring 145 can be circular, oblong, oval, square, or rectangular with respect to its structure.
- The radio subsystem 250 includes a radio processor 260, a radio memory 262, and a receiver (Rx)/transmitter (Tx) 264. The receiver (Rx)/transmitter (Tx) 264 may be two separate components or a single component. In either instance, it also may be referenced as a transceiver 264. The receiver portion of the transceiver 264 communicatively couples with a radio signal input of the device 110, e.g., an antenna, where communication signals are received from an established call (e.g., a connected or on-going call). The received communication signals include voice (or other sound signals) received from the call and processed by the radio processor 260 for output through the speaker 120 (or 184). The transmitter portion of the transceiver 264 communicatively couples with a radio signal output of the device 110, e.g., the antenna, where communication signals are transmitted to an established (e.g., a connected (or coupled) or active) call. The communication signals for transmission include voice, e.g., received through the microphone 160 of the device 110 (or other sound signals), that is processed by the radio processor 260 for transmission through the transmitter of the transceiver 264 to the established call.
- In one embodiment, communications using the described radio communications may be over a voice or data network. Examples of voice networks include the Global System for Mobile (GSM) communications system, a Code Division Multiple Access (CDMA) system, and a Universal Mobile Telecommunications System (UMTS). Examples of data networks include General Packet Radio Service (GPRS), third-generation (3G) mobile, High Speed Downlink Packet Access (HSDPA), and Worldwide Interoperability for Microwave Access (WiMAX).
- While other components may be provided with the radio subsystem 250, the basic components shown provide the ability for the mobile computing device to perform radio-frequency communications, including telephonic communications. In an embodiment, many, if not all, of the components under the control of the central processor 220 are not required by the radio subsystem 250 when a telephone call is established, e.g., connected or ongoing. The radio processor 260 may communicate with central processor 220 using a serial line 278.
- In one embodiment, central processor 220 executes logic (by way of programming, code, instructions) corresponding to executing applications interfaced through, for example, the navigation area 140 or switches 170. It is noted that numerous other components and variations are possible to the hardware architecture of the computing device 200, thus an embodiment such as shown by FIG. 2 is just illustrative of one implementation for an embodiment.
FIG. 3 illustrates one embodiment of a menu serving architecture, for example, in a handheld computing device 110. The architecture includes an event manager 315, a preferences manager and database (or file) 320, a program resource manager and database (or file) 325, and a menu engine 330. These components may be embodied in software, hardware, or a combination thereof. In this configuration, these components communicatively couple with the device components, including one or more I/O mechanisms 245 (e.g., one or more switch detection mechanisms 310 a-n (generally 310)), the screen 230, and the central processor 220. The central processor 220 is configured to execute instructions for software embodiments corresponding to the components within the system. A data bus 335 communicatively couples the menu serving architectural components and the other device components, e.g., the processor 220 or I/O mechanism 245.
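- For illustration only, the cooperation among these components can be sketched in a few lines of Python. The class and method names below (EventManager, PreferencesManager, ProgramResourceManager, MenuEngine, on_press_and_hold) are assumptions introduced for this sketch and are not part of the disclosure.

```python
# Hypothetical sketch of the menu serving architecture of FIG. 3.
# Component and method names are assumptions for illustration only.

class ProgramResourceManager:
    """Stores an icon for each application identifier."""
    def __init__(self, icons):
        self._icons = dict(icons)                     # app_id -> icon name/path

    def icon_for(self, app_id):
        return self._icons.get(app_id, "default-icon")


class PreferencesManager:
    """Maps a switch identifier to the application identifiers shown in its menu."""
    def __init__(self, switch_to_apps):
        self._switch_to_apps = dict(switch_to_apps)   # switch_id -> [app_id, ...]

    def apps_for_switch(self, switch_id):
        return list(self._switch_to_apps.get(switch_id, []))


class MenuEngine:
    """Builds a menu description (here, just a dict) for a list of applications."""
    def __init__(self, resources):
        self._resources = resources

    def render(self, app_ids):
        items = [(app, self._resources.icon_for(app)) for app in app_ids]
        return {"layout": "pie", "items": items}


class EventManager:
    """Receives switch activity and drives the other components."""
    def __init__(self, preferences, menu_engine):
        self._preferences = preferences
        self._menu_engine = menu_engine

    def on_press_and_hold(self, switch_id):
        app_ids = self._preferences.apps_for_switch(switch_id)
        return self._menu_engine.render(app_ids)


# Example wiring: a "calendar" switch that serves four quick-launch applications.
resources = ProgramResourceManager({"todo": "todo.png", "calc": "calc.png",
                                    "alarm": "alarm.png", "sync": "sync.png"})
prefs = PreferencesManager({"calendar": ["todo", "calc", "alarm", "sync"]})
events = EventManager(prefs, MenuEngine(resources))
print(events.on_press_and_hold("calendar"))
```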
- The event manager 315 is configured to detect depression of one of the application switches for a predetermined time period and configured to generate a switch identifier corresponding to one or more I/O mechanisms 245, e.g., an application switch. A preferences manager 320 is configured to receive the switch identifier from the event manager 315. In response, the preferences manager 320 retrieves one or more application identifiers corresponding to the application switch from its database. Each application identifier is linked to an executable for a corresponding application. The linking may include a pointer or the executable itself.
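- The linking between an application identifier and its executable can be modeled either as a direct reference (the executable itself) or as an indirect pointer to it. The following is a minimal sketch under that assumption; the names (AppIdentifier, launch, preferences_db) and the file path are hypothetical.

```python
# Hypothetical sketch: an application identifier linked to its executable either
# directly (the callable itself) or indirectly (a path/pointer to be resolved).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AppIdentifier:
    name: str
    executable: Optional[Callable[[], None]] = None   # "the executable itself"
    executable_path: Optional[str] = None              # "a pointer" to the executable

    def launch(self):
        if self.executable is not None:
            self.executable()                                              # run directly
        else:
            print(f"resolving and launching {self.executable_path}")       # resolve the pointer

# One switch identifier maps to one or more application identifiers.
preferences_db = {
    "calendar-switch": [
        AppIdentifier("todo", executable=lambda: print("to-do list started")),
        AppIdentifier("alarm", executable_path="/apps/alarm.prc"),
    ],
}

for app in preferences_db["calendar-switch"]:
    app.launch()
```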
- Once the application identifiers are retrieved by the preferences manager 320, the menu engine 330 is configured to render a menu with the application identifiers on the screen 230. In one embodiment, the menu engine 330 is configured to display each application identifier in a menu shaped in a "pie" configuration. Each menu item (e.g., portion) of the menu corresponds with each application identifier and is selectable. Selection of a particular menu item allows for direct execution of the application associated with the application identifier. In addition, in one embodiment the program resource manager 325 is configured to store an icon for each application identifier/corresponding application. The menu engine 330 either retrieves or is sent the icon for integration into the menu that will be displayed on the screen 230.
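- Viewed as data, the rendered pie is a list of selectable slices, each bound to one application identifier and its icon, where selecting a slice launches the application directly. A minimal sketch with assumed names (PieMenuItem, select, launch):

```python
# Hypothetical sketch: a pie menu whose slices are selectable menu items,
# each bound to an application identifier and an icon.

class PieMenuItem:
    def __init__(self, app_id, icon, launcher):
        self.app_id = app_id
        self.icon = icon
        self._launcher = launcher        # callable that executes the application

    def select(self):
        # Selecting a slice directly executes the associated application.
        self._launcher(self.app_id)


def launch(app_id):
    print(f"launching {app_id}")

menu = [PieMenuItem(a, f"{a}.png", launch) for a in ("todo", "calc", "alarm", "sync")]
menu[2].select()   # e.g., the user picks the third slice -> "launching alarm"
```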
- FIG. 4 illustrates one embodiment of a process for serving a menu and executing a menu item, for example, in a mobile computing device. For ease of discussion and illustration, this example embodiment of the process is described in the context of the handheld computing device 110 and corresponding architecture 205. In addition, reference also is made to FIGS. 5 a and 5 b. -
FIG. 5 a illustrates one embodiment of the navigation area (or navigation mechanisms on, e.g., the handheld computing device). The application switches (or buttons) are functionally similar to the application switches 147 described previously. In this example the application buttons are for a phone application 510 a, a calendar application 510 b, a home or applications menu 510 c, and a messaging (e.g., electronic mail or short messaging service (SMS)) application 510 d. Also illustrated is a navigation ring 520 similar to the navigation ring 145 described above. In this example, the navigation ring 520 allows for movement in a left, right, up, and down direction within a display on the screen 130. The center of the navigation ring 520 includes a button that can be used to select a particular entry that may be highlighted within a display on the screen 130. In one embodiment, the navigation ring mechanism may be referred to as a five-way navigation mechanism. - Referring back to
FIG. 4, the process starts 405 with a determination of whether one of the application switches (or buttons), e.g., calendar 510 b, has been pressed and held 410 for greater than a predetermined length of time. Note that if the user simply presses and immediately releases (an initial time period) the application switch 510 b, the application assigned to that switch, i.e., the calendar application in this example, will immediately launch so the process exits 435. However, when the application switch 510 b is held longer than this initial time period, the event manager 315 detects additional activity corresponding to menu generation.
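- One plausible way to implement this press-duration decision is to compare the time between switch press and release against the predetermined threshold. The threshold value and the function name below are assumptions for illustration, not values taken from the disclosure.

```python
# Hypothetical sketch of the press-and-hold decision in FIG. 4: a short press
# launches the assigned application immediately; holding past a predetermined
# time period serves the extended menu instead.

HOLD_THRESHOLD_S = 0.5   # assumed value for the "predetermined time period"

def on_switch_release(held_seconds, assigned_app, menu_apps):
    if held_seconds < HOLD_THRESHOLD_S:
        return f"launch {assigned_app}"            # press-and-release path (exit 435)
    return f"show menu: {', '.join(menu_apps)}"    # press-and-hold path (menu generation)

print(on_switch_release(0.1, "calendar", ["todo", "calc", "alarm", "sync"]))
print(on_switch_release(0.8, "calendar", ["todo", "calc", "alarm", "sync"]))
```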
- The event manager 315 looks up (or accesses or notifies) 415 the preferences manager 320 to perform a menu lookup by sending a switch identifier that identifies the corresponding switch, i.e., the calendar switch 510 b in this example, that was depressed and held. The preferences manager 320 receives the switch identifier and in its database determines which applications correspond with the calendar application switch 510 b. Specifically, the preferences manager 320 retrieves one or more application identifiers corresponding to the calendar application switch 510 b from its database. Each application identifier is linked to an executable for a corresponding application. The linking may include a pointer to the application executable or the executable itself. - In addition, optionally, the
preferences manager 320 instructs the program resource manager 325 to provide (or serve or supply) an icon corresponding to each application or application identifier. The program resource manager 325 retrieves the icon information from its database and forwards it back to the preferences manager 320 or on to the menu engine 330. The menu engine receives the application identifier information from the preferences manager 320 as well as the icon information from the program resource manager 325. The menu engine 330 either retrieves or is sent the icon for integration into the menu that will be displayed on the screen 130. In particular, the menu engine 330 configures a particular layout for a menu. The layout includes a bounded area that is less than a size of the screen 130 and will be rendered on top of or in front of a presently active display (e.g., one a user is interacting with on the screen 130 of a handheld computing device 110). Examples of a menu layout include a pie (e.g., square, rectangle, round, or oval) with "slices" or a racetrack (oblong) with segments. The menu engine 330 displays (or renders) 420 that menu on the screen 130 on top of or in front of the presently active display.
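- The layout step can be reduced to choosing a bounded region smaller than the screen and dividing it into one slice per menu item. The following geometry sketch is illustrative only; the screen dimensions, the scale factor, and the function name are assumptions.

```python
# Hypothetical sketch: compute a pie layout bounded within a region smaller than
# the screen, one slice per menu item, to be drawn over the active display.
import math

def pie_layout(items, screen_w=320, screen_h=320, scale=0.6):
    size = int(min(screen_w, screen_h) * scale)      # bounded area < screen size
    cx, cy, radius = screen_w // 2, screen_h // 2, size // 2
    step = 360 / len(items)
    slices = []
    for i, item in enumerate(items):
        start = i * step - 90                        # first slice starts at the top
        mid = math.radians(start + step / 2)
        label_x = cx + int(radius * 0.6 * math.cos(mid))
        label_y = cy + int(radius * 0.6 * math.sin(mid))
        slices.append({"item": item, "start_deg": start, "sweep_deg": step,
                       "label_at": (label_x, label_y)})
    return {"center": (cx, cy), "radius": radius, "slices": slices}

for s in pie_layout(["todo", "calc", "alarm", "sync"])["slices"]:
    print(s["item"], s["start_deg"], s["label_at"])
```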
- FIG. 5 b illustrates one example of the displayed pie menu 525 corresponding to the pressed and held (for a predetermined time period greater than initial time period) calendar application button 510 b. The rendered menu shows each menu item 535 a-d, e.g., the slice or segment, as a selectable item. The selectable items include a task/to do application 535 a, a calculator application 535 b, an alarm application 535 c, and a synchronization application 535 d. FIG. 6 illustrates the served menu rendered on the screen 230 of the handheld computing device 110 against a context of a calendar application 610. In one example context, this configuration may appear where a user seeks to access a task application corresponding to a particular calendar entry and desires to directly launch the application rather than search for it in nested menu lists. - In one embodiment, the rendered
menu 525 may be displayed with one of the menu items pre-selected (e.g., different shade for menu item, bolded border of menu item, etc.). By actuating the center button on the navigation ring 520, that selected application can be immediately launched (or executed). If another selection is desired rather than the pre-selected option, the navigation ring 520 can be used to traverse between the menu items. For example, if the task application 535 a is pre-selected and the alarm application 535 c is desired, a right portion (or part) 545 b of the navigation ring 520 can be triggered (pressed and immediately released) twice to move the selection to the alarm application 535 c via the calculator entry 535 b. Alternatively, a down (or bottom) portion 545 c of the navigation ring 520 can be selected to jump directly from the task application 535 a to the alarm application 535 c. - Note that each move highlights the current selection that would be executed if the center button of the
navigation ring 520 were triggered. Alternatively, the process can be configured so that pressing a corresponding direction, e.g., on the navigation ring, immediately executes (or launches) the application assigned to that direction. Allowing execution by directly navigating to the appropriate application identifier within the menu further reduces the number of steps needed to execute an application.
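- Both selection modes can be captured with a small direction-to-slice mapping: in one mode a direction press moves a highlight that the center button then confirms, and in the other a direction press launches its slice immediately. The traversal rule below (right/left step around the pie, up/down jump to the top or bottom slice) is one plausible reading of the example above, and all names are assumptions.

```python
# Hypothetical sketch of the two selection modes described above, under one
# plausible traversal rule: "right"/"left" step the highlight around the pie,
# while "up"/"down" jump straight to the top or bottom slice.

ITEMS = ["todo", "calc", "alarm", "sync"]   # assumed order: up, right, down, left

def traverse(index, direction):
    """Mode 1: a direction press moves the highlighted slice."""
    if direction == "right":
        return (index + 1) % len(ITEMS)     # one slice clockwise
    if direction == "left":
        return (index - 1) % len(ITEMS)     # one slice counter-clockwise
    if direction == "up":
        return 0                            # jump to the top slice
    if direction == "down":
        return 2                            # jump to the bottom slice
    return index

def confirm(index):
    """Mode 1: the center button launches the highlighted item."""
    return f"launch {ITEMS[index]}"

def direct_launch(direction):
    """Mode 2: pressing a direction immediately launches the mirrored slice."""
    position = {"up": 0, "right": 1, "down": 2, "left": 3}[direction]
    return f"launch {ITEMS[position]}"

# Example from the description: starting on the task item (up), two presses of
# "right" reach the alarm item via the calculator item; "down" jumps there directly.
selection = 0
selection = traverse(selection, "right")            # todo -> calc
selection = traverse(selection, "right")            # calc -> alarm
print(confirm(selection))                           # launch alarm
print(confirm(traverse(0, "down")))                 # launch alarm (single jump)
print(direct_launch("down"))                        # launch alarm (mode 2)
```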
- Once at the alarm application 535 c, it will be highlighted and the center button of the navigation ring 520 can be triggered to execute the alarm application 535 c. By way of example, this process of selecting the alarm application 535 c corresponds in FIG. 4 to detection 425 of the selection activity and execution (or launch) 430 of the selected menu item before exiting 435 the process. - Although the disclosure describes operation in the context of an application button, e.g., 510 b, and a navigation mechanism, e.g., 520, the principles disclosed need not be limited to these particular embodiments. For example, a device may be configured to include a function key that toggles keys in a keyboard, which has in one row the Q-W-E-R-T-Y keys, into a selection and navigation configuration. In such an example, depressing and releasing a function key for at least a predetermined time period may cause an “*” key or ‘#’ key to serve as an application button. In addition, in this example, the ‘D’ key (left), the ‘R’ key (up or top), the ‘G’ key (right) and the ‘C’ key (down or bottom) would function as directional elements and the ‘F’ key or an ‘enter’ key may function as a launch (or execute) key.
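- The keyboard-remapping variant amounts to a lookup table that is consulted only while the function-key mode is active. The sketch below simply restates the key assignments from the preceding paragraph; the dictionary layout and function name are assumptions.

```python
# Hypothetical sketch: while the function-key mode is active, ordinary keys are
# remapped to the roles described above for the extended-menu interaction.

FUNCTION_MODE_MAP = {
    "*": "application-button",   # serves as an application button
    "#": "application-button",
    "D": "left",
    "R": "up",
    "G": "right",
    "C": "down",
    "F": "launch",
    "ENTER": "launch",
}

def interpret_key(key, function_mode_active):
    """Return the remapped role of a key, or None if it keeps its normal meaning."""
    if not function_mode_active:
        return None
    return FUNCTION_MODE_MAP.get(key.upper())

print(interpret_key("d", True))    # -> left
print(interpret_key("f", True))    # -> launch
print(interpret_key("d", False))   # -> None (normal 'd' keystroke)
```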
- In one embodiment the disclosed configuration increases the number of applications available for immediate launch without having to search for those applications within nested menu lists. Moreover, availability of the applications for immediate execution is brought directly to the user's attention without having to search groupings of applications to find the particular application to launch. Overall, this multiplies the reach of dedicated user buttons. For example, one
calendar application button 510 b now becomes one calendar application button 510 b plus four additional applications 535 a-d with a single action of pressing and holding the calendar button 510 b. - To further help ease navigation, in one embodiment, the menu is configured to "mirror" or correspond with the navigation mechanism. For example, referring back to
FIG. 5 b, the pie menu 525 is illustrated in comparison with the five-way navigation mechanism (navigation ring) 520. The menu items 535 a-d of the menu 525 in this example correspond to corresponding locations, e.g., 545 a-d, on the navigation ring 520. This configuration further eases use by making navigation among the menu items intuitive and quick. For example, it decreases the number of steps necessary to directly launch an application from the menu 525 when the corresponding portion of the navigation ring 520 is simply selected before the center of the navigation ring 520 is pressed and released to launch that application. - Although in one embodiment the menu items (i.e., the application identifiers) are disclosed as mapping to physical keys or switches, still additional alternative embodiments of an extended menu system may be implemented using the principles disclosed herein. For example, rather than map to physical keys or switches, the menu items (i.e., the application identifiers) may themselves take on physical characteristics. In one example embodiment, the screen 120 (230) may be an enhanced touch sensitive screen that provides haptic feedback. Thus, in this embodiment when the menu is rendered, the menu items themselves will have or provide a tactile feel or response so that the user can "feel" each item as a button. Each of these tactile buttons may be configured so that it accepts a selection from a user (e.g., the user can depress or press on the button) and in response the selected application will be launched. To help ease interactions in such embodiments, the menu boundary area may be structured similarly to the navigation area so that the rendered application buttons on the
screen 130 are located in areas similar to the physical application buttons. By way of example, four application buttons may be rendered on the screen 130, each above one of the four physical application buttons, e.g., 510 a-510 d. - A benefit of the haptic-enabled alternative embodiment is that it provides a physical-type feel for a user when that user interacts with the device. Thus, the number of physical application buttons in this configuration can be extended without a need for additional device real estate for physical buttons. This increases access and availability to applications within the handheld computing device.
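- Aligning the rendered buttons with the hardware can be done by reusing the physical buttons' positions when computing the on-screen rectangles. The coordinates and names in the following sketch are assumptions for illustration.

```python
# Hypothetical sketch: place each rendered (haptic) application button directly
# above its physical counterpart along the bottom of the screen.

# Assumed x-centers of the four physical application buttons below the screen.
PHYSICAL_BUTTON_X = {"phone": 40, "calendar": 120, "home": 200, "messaging": 280}
SCREEN_HEIGHT = 320        # assumed screen height in pixels

def on_screen_buttons(width=60, height=40):
    """Rectangles (x, y, w, h) for soft buttons rendered just above the physical row."""
    return {name: (x - width // 2, SCREEN_HEIGHT - height, width, height)
            for name, x in PHYSICAL_BUTTON_X.items()}

print(on_screen_buttons()["calendar"])    # the soft calendar button sits above 510 b
```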
- The disclosed embodiments for an extended menu system are beneficially configurable.
FIGS. 7 a-7 e illustrate one embodiment of displayed screen shots used to configure a menu to serve and render on a screen of a mobile computing device. By way of example, reference will be made to the menu described in FIGS. 5 a, 5 b, and 6. In FIGS. 7 a-7 d, a starter application applet is configured to include a location on the menu 710, a list of applications 715, and soft application buttons 720 that correspond to the application switches, e.g., 147 (510 a-510 d), on the handheld computing device 110. - To start the assignment process, in
FIG. 7 a, a user would select a location on the menu where an application will be assigned. In this example, the pie menu 525 has four menu item locations identified as "Nav Up", "Nav Right", "Nav Down", and "Nav Left" (collectively 710) and the user selects one, e.g., "Nav Up" 710 a. It is noted that this example illustrates how the pie menu is configured to closely reflect positioning on the navigation ring, e.g., 520, in that the menu locations correspond with the navigation ring 520 locations. - In
FIG. 7 b, the user then selects an application from the list 715, e.g., "To Do List", to display in the pie menu, e.g., 525, at the selected location or "Nav Up" 710 a. The user in this example then assigns an application button on the handheld computing device 110. Specifically, the user selects the soft application button, e.g., 720 a or 720 b, corresponding to the physical button, e.g., the calendar application button 510 b or a messaging application button 510 d, to which to associate the menu. In this example, the user selects the "Cal" soft button 720 a to assign to the calendar application button 510 b on the handheld computing device 110. In one embodiment of this process, when this is done the menu pops up as shown in FIG. 7 c, providing a preview of what will be displayed on the screen 130 of the device 110 when the calendar application button 510 b is depressed and held for a predetermined time period.
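- The assignment flow of FIGS. 7 a-7 c amounts to writing a (button, location) -> application entry into a preferences store and reading it back for the preview. A minimal sketch, assuming a hypothetical in-memory store and function names (assign, preview):

```python
# Hypothetical sketch of the assignment flow: pick a menu location, pick an
# application, pick the physical button to own the menu, then preview it.

LOCATIONS = ["Nav Up", "Nav Right", "Nav Down", "Nav Left"]

preferences = {}    # (button, location) -> application name

def assign(button, location, application):
    if location not in LOCATIONS:
        raise ValueError(f"unknown menu location: {location}")
    preferences[(button, location)] = application

def preview(button):
    """Return the menu that would pop up for a press-and-hold of this button."""
    return {loc: preferences.get((button, loc), "<unassigned>") for loc in LOCATIONS}

assign("Cal", "Nav Up", "To Do List")        # the example assignment from FIG. 7b
assign("Cal", "Nav Right", "Calculator")
print(preview("Cal"))
print(preview("SMS"))                        # a different button previews its own menu
```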
- FIGS. 7 c and 7 d also can be used to describe an embodiment in which a menu associated with a physical key on the handheld computing device 110 can be previewed. In particular, a user selects one of the application soft keys. The "Cal" soft application button 720 a shows a first menu with a first set of menu items that correspond to executable applications available through the physical calendar application button 510 b. The "SMS" soft application button 720 b shows a second menu with a second set of menu items that correspond to executable applications available through the physical messaging application button 510 d. Once the keys have been assigned, a user can elect to activate or deactivate the menu by selecting or deselecting a corresponding selection box 740 as shown in FIG. 7 e. - It is noted that in one embodiment, the selection of applications to populate the menu items, e.g., 535 a-535 d, may be accomplished in a ‘semi-automatic’ manner.
For example, menu items presented as a result of a press and hold of the messaging button 510 d (the ‘’ key) correspond to applications that are messaging in nature, e.g., electronic mail (e-mail), chat, voice (e.g., voice over IP), and blogging applications. Alternatively, if the messaging application button 510 d is re-assigned by the user to another category, e.g., multimedia, the four pie menu options change to four applications corresponding to that category, e.g., voice memo, camera (still and/or motion), audio player, and video player (a sketch of this category-based population appears below). - Some portions of the above description describe the embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
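- The semi-automatic, category-based population noted above can be expressed as a lookup from a button's assigned category to the applications in that category. The category names and data layout below are assumptions beyond the examples given in the text.

```python
# Hypothetical sketch of semi-automatic menu population: each button is assigned
# a category, and the pie menu is filled with applications from that category.

CATEGORY_APPS = {
    "messaging":  ["e-mail", "chat", "voice over IP", "blogging"],
    "multimedia": ["voice memo", "camera", "audio player", "video player"],
}

button_category = {"messaging-button": "messaging"}

def menu_for(button):
    return CATEGORY_APPS[button_category[button]]

print(menu_for("messaging-button"))                   # messaging-style applications
button_category["messaging-button"] = "multimedia"    # user re-assigns the category
print(menu_for("messaging-button"))                   # menu now shows multimedia applications
```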
- As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. It should be understood that these terms are not intended as synonyms for each other. For example, some embodiments may be described using the term “connected” to indicate that two or more elements are in direct physical or electrical contact with each other. In another example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
- As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- In addition, the terms "a" or "an" are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
- Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for an apparatus and a process for a quick launch extension through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
Claims (37)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/797,979 US20150317048A1 (en) | 2007-03-30 | 2015-07-13 | Application quick launch extension |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/694,068 US9122396B2 (en) | 2007-03-30 | 2007-03-30 | Application quick launch extension |
US14/797,979 US20150317048A1 (en) | 2007-03-30 | 2015-07-13 | Application quick launch extension |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/694,068 Continuation US9122396B2 (en) | 2007-03-30 | 2007-03-30 | Application quick launch extension |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150317048A1 true US20150317048A1 (en) | 2015-11-05 |
Family
ID=39796473
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/694,068 Expired - Fee Related US9122396B2 (en) | 2007-03-30 | 2007-03-30 | Application quick launch extension |
US14/797,979 Abandoned US20150317048A1 (en) | 2007-03-30 | 2015-07-13 | Application quick launch extension |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/694,068 Expired - Fee Related US9122396B2 (en) | 2007-03-30 | 2007-03-30 | Application quick launch extension |
Country Status (1)
Country | Link |
---|---|
US (2) | US9122396B2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120023406A1 (en) * | 2010-07-21 | 2012-01-26 | Yamaha Corporation | Audio mixing console |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080270949A1 (en) * | 2007-04-25 | 2008-10-30 | Liang Younger L | Methods and Systems for Navigation and Selection of Items within User Interfaces with a Segmented Cursor |
US20090113333A1 (en) * | 2007-10-26 | 2009-04-30 | Palm, Inc. | Extendable Toolbar for Navigation and Execution of Operational Functions |
TWI420890B (en) * | 2007-12-26 | 2013-12-21 | Htc Corp | Handheld electronic device and method for switching user interface thereof |
TW200930023A (en) * | 2007-12-31 | 2009-07-01 | Htc Corp | Method for providing a menu using an end key and mobile communication device using the same |
TWI365411B (en) * | 2008-06-06 | 2012-06-01 | Asustek Comp Inc | Computer management system to speed up executing application program and method thereof |
US20090319893A1 (en) * | 2008-06-24 | 2009-12-24 | Nokia Corporation | Method and Apparatus for Assigning a Tactile Cue |
US8659555B2 (en) * | 2008-06-24 | 2014-02-25 | Nokia Corporation | Method and apparatus for executing a feature using a tactile cue |
US8086275B2 (en) | 2008-10-23 | 2011-12-27 | Microsoft Corporation | Alternative inputs of a mobile communications device |
US8175653B2 (en) | 2009-03-30 | 2012-05-08 | Microsoft Corporation | Chromeless user interface |
EP2237140B1 (en) * | 2009-03-31 | 2018-12-26 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9417754B2 (en) * | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US10114445B2 (en) * | 2012-10-29 | 2018-10-30 | Facebook, Inc. | Screen timeout duration |
US9665279B2 (en) * | 2014-03-21 | 2017-05-30 | Blackberry Limited | Electronic device and method for previewing content associated with an application |
US20150317721A1 (en) * | 2014-04-30 | 2015-11-05 | Mahesh Kumar T J | Enterprise mobile application for managing sales activites |
USD785018S1 (en) * | 2014-06-02 | 2017-04-25 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
US20150379160A1 (en) * | 2014-06-29 | 2015-12-31 | Yaniv AVRAHAM | Apparatus and methods for transferring focus control and related return content within a graphical user interface |
US10235043B2 (en) * | 2014-09-02 | 2019-03-19 | Google Llc | Keyboard for use with a computing device |
US11188143B2 (en) * | 2016-01-04 | 2021-11-30 | Microsoft Technology Licensing, Llc | Three-dimensional object tracking to augment display area |
EP3346412B1 (en) * | 2017-01-05 | 2020-09-09 | Tata Consultancy Services Limited | System and method for consent centric data compliance checking |
USD846576S1 (en) * | 2017-03-01 | 2019-04-23 | United Services Automobile Association (Usaa) | Display screen with wheel of recognition graphical user interface |
CN107515703B (en) * | 2017-07-19 | 2021-02-19 | 联想(北京)有限公司 | State prompting method and electronic equipment |
USD924246S1 (en) | 2018-03-22 | 2021-07-06 | Leica Microsystems Cms Gmbh | Microscope display screen with graphical user interface |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5706448A (en) | 1992-12-18 | 1998-01-06 | International Business Machines Corporation | Method and system for manipulating data through a graphic user interface within a data processing system |
CA2157623C (en) * | 1994-09-20 | 1999-12-21 | Lars Stig Sorensen | Method and apparatus for dynamic radio communication menu |
US6014351A (en) * | 1995-03-14 | 2000-01-11 | International Business Machines Corporation | Controlling method and apparatus for integral personal computer and CD-player |
US5798760A (en) | 1995-06-07 | 1998-08-25 | Vayda; Mark | Radial graphical menuing system with concentric region menuing |
US6144378A (en) | 1997-02-11 | 2000-11-07 | Microsoft Corporation | Symbol entry system and methods |
US5977969A (en) | 1997-11-03 | 1999-11-02 | International Business Machines Corporation | Universal resource locator dialog with extended top level domains |
US5940076A (en) | 1997-12-01 | 1999-08-17 | Motorola, Inc. | Graphical user interface for an electronic device and method therefor |
WO1999035590A1 (en) * | 1998-01-07 | 1999-07-15 | Microsoft Corporation | Fast start voice recording on a hand held digital device |
US6448987B1 (en) | 1998-04-03 | 2002-09-10 | Intertainer, Inc. | Graphic user interface for a digital content delivery system using circular menus |
US6463304B2 (en) * | 1999-03-04 | 2002-10-08 | Openwave Systems Inc. | Application launcher for a two-way mobile communications device |
US6706448B1 (en) * | 1999-08-30 | 2004-03-16 | Georgia Tech Research Corp. | Method and apparatus for lithiating alloys |
US7546548B2 (en) * | 2002-06-28 | 2009-06-09 | Microsoft Corporation | Method and system for presenting menu commands for selection |
US8019388B2 (en) * | 2003-02-06 | 2011-09-13 | Flextronics Ap, Llc | Main menu navigation principle for mobile phone user |
US20040155909A1 (en) * | 2003-02-07 | 2004-08-12 | Sun Microsystems, Inc. | Scroll tray mechanism for cellular telephone |
US7668824B2 (en) * | 2004-03-01 | 2010-02-23 | Denso Corporation | Interface device, inferring system, and visual expression method |
JP4396590B2 (en) * | 2005-05-13 | 2010-01-13 | ソニー株式会社 | Playback apparatus, playback method, and playback program |
US7487467B1 (en) * | 2005-06-23 | 2009-02-03 | Sun Microsystems, Inc. | Visual representation and other effects for application management on a device with a small screen |
US8539374B2 (en) * | 2005-09-23 | 2013-09-17 | Disney Enterprises, Inc. | Graphical user interface for electronic devices |
WO2008032486A1 (en) * | 2006-09-12 | 2008-03-20 | Sharp Kabushiki Kaisha | Portable terminal, display method, display mode determining program and computer readable recording medium |
-
2007
- 2007-03-30 US US11/694,068 patent/US9122396B2/en not_active Expired - Fee Related
-
2015
- 2015-07-13 US US14/797,979 patent/US20150317048A1/en not_active Abandoned
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120023406A1 (en) * | 2010-07-21 | 2012-01-26 | Yamaha Corporation | Audio mixing console |
US9564981B2 (en) * | 2010-07-21 | 2017-02-07 | Yamaha Corporation | Audio mixing console |
Also Published As
Publication number | Publication date |
---|---|
US9122396B2 (en) | 2015-09-01 |
US20080244447A1 (en) | 2008-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9122396B2 (en) | Application quick launch extension | |
US20210124311A1 (en) | System, method and graphical user interface for inputting date and time information on a portable multifunction device | |
US8537117B2 (en) | Handheld wireless communication device that selectively generates a menu in response to received commands | |
US7978176B2 (en) | Portrait-landscape rotation heuristics for a portable multifunction device | |
US8477139B2 (en) | Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects | |
CA2572574C (en) | Method and arrangement for a primary action on a handheld electronic device | |
EP1803057B1 (en) | Mobile communications terminal having an improved user interface and method therefor | |
EP2132622B1 (en) | Transparent layer application | |
US9128597B2 (en) | Method for switching user interface, electronic device and recording medium using the same | |
US8116807B2 (en) | Airplane mode indicator on a portable multifunction device | |
US7966578B2 (en) | Portable multifunction device, method, and graphical user interface for translating displayed content | |
US8091045B2 (en) | System and method for managing lists | |
US20150012885A1 (en) | Two-mode access linear ui | |
US20080098331A1 (en) | Portable Multifunction Device with Soft Keyboards | |
US20100214218A1 (en) | Virtual mouse | |
US20080165145A1 (en) | Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Swipe Gesture | |
US20090187840A1 (en) | Side-bar menu and menu on a display screen of a handheld electronic device | |
US20110115722A1 (en) | System and method of entering symbols in a touch input device | |
US20100269068A1 (en) | Changing selection focus on an electronic device | |
EP1815313B1 (en) | A hand-held electronic appliance and method of displaying a tool-tip | |
EP1820084A1 (en) | Displaying information in an interactive computing device | |
EP1803053A1 (en) | A hand-held electronic appliance and method of entering a selection of a menu item | |
KR101505197B1 (en) | Method For Executing Application In Portable Terminal And Portable Terminal Performing The Same | |
EP2081110A1 (en) | Side-bar menu and menu on a display screen of a handheld electronic device | |
CA2691164C (en) | Changing selection focus on an electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PALM, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAGAR, RICHARD BRYAN;REEL/FRAME:038721/0617 Effective date: 20070330 Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:038815/0349 Effective date: 20131218 Owner name: QUALCOMM INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:038815/0549 Effective date: 20140123 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |