WO2017019028A1 - Application launch state determination - Google Patents

Application launch state determination

Info

Publication number
WO2017019028A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
user interface
interface element
parameter
gesture action
Prior art date
Application number
PCT/US2015/042375
Other languages
French (fr)
Inventor
Shlomi MASURI
Eli REVACH
Amos NESHER
Original Assignee
Hewlett Packard Enterprise Development Lp
Priority date
Filing date
Publication date
Application filed by Hewlett Packard Enterprise Development Lp filed Critical Hewlett Packard Enterprise Development Lp
Priority to PCT/US2015/042375 priority Critical patent/WO2017019028A1/en
Publication of WO2017019028A1 publication Critical patent/WO2017019028A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones

Definitions

  • Gesture action instructions 608 may receive a variety of gesture actions. For example, gesture action instructions may display a first set of user interface elements within a viewport of a display of the computing device. As used herein, the term "viewport" refers to a visible region of a display where user interface elements are rendered. Each user interface element in the first set of user interface elements may visually represent an option corresponding to the application user interface element. Gesture action instructions 608 may receive a selection of a first user interface element from the first set of user interface elements. The first user interface element may represent a first option. Gesture action instructions 608 may display only the application user interface element and the first set of user interface elements within a viewport of a display of the computing device. Receiving a selection may further include identifying that the user has dragged the application user interface element to the first user interface element.
  • Gesture action instructions 608 may display a second set of user interface elements within the viewport of the display of the computing device. Each user interface element in the second set of user interface elements may correspond to the first option. The second set of user interface elements may be displayed upon receiving the selection of the first user interface element. The first set of user interface elements may be removed and/or no longer displayed within the viewport of the display.
  • Example patterns may include dragging an application icon in a certain direction (up, down, left, right, up-left, down-right, etc.), certain locations (top of the display, bottom of the display, etc.) and shapes (circle, square, etc.). Each pattern (direction, shape, etc.) may correspond to a functionality, a value or a functionality/value combination of the application.
  • Gesture action instructions 608 may interpret a pattern created by the user with the application user interface element and determine a parameter associated with the pattern.
  • Parameter provide instructions 610, when executed by a processor (e.g., 602), may cause system 600 to provide the parameter to the application upon launching the application.
  • Functionality execute instructions 612, when executed by a processor (e.g., 602), may cause system 600 to automatically execute the functionality using the value defined by the parameter.
  • the foregoing disclosure describes a number of examples for application launch state determination.
  • the disclosed examples may include systems, devices, computer-readable storage media, and methods for application launch state determination.
  • certain examples are described with reference to the components illustrated in FIGS. 1 -6.
  • the functionality of the illustrated components may overlap, however, and may be present in a fewer or greater number of elements and components. Further, all or part of the functionality of illustrated elements may co-exist or be distributed among several geographically dispersed locations. Further, the disclosed examples may be implemented in various environments and are not limited to the illustrated examples. [0055] Further, the sequence of operations described in connection with FIGS. 1 -

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In one example in accordance with the present disclosure, a method for determining an application launch state includes receiving a gesture action performed on a user interface element representing an application. The method also includes identifying the received gesture action from a set of possible gesture actions. Each gesture action in the set corresponds to a different parameter for use by the application. The method also includes determining a parameter corresponding to the received gesture action, providing the parameter to the application upon launching the application and launching the application in a state defined by the parameter.

Description

APPLICATION LAUNCH STATE DETERMINATION
BACKGROUND
[0001] Various applications may be installed on a computing device. Each application is typically visually represented by a user interface element (such as an icon) on a display of the computing device. These applications may provide a variety of different functionalities to a user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The following detailed description references the drawings, wherein:
[0003] FIG. 1 is a block diagram of an example system for application launch state determination;
[0004] FIG. 2 is a series of example viewports of a display of a computing device illustrating an exemplary gesture action;
[0005] FIG. 3 is a flowchart of an example method for application launch state determination;
[0006] FIG. 4 is a flowchart of an example method for application launch state determination;
[0007] FIG. 5 is a block diagram of an example system for application launch state determination; and
[0008] FIG. 6 is a block diagram of an example system for application launch state determination.
DETAILED DESCRIPTION
[0009] Although many applications provide a variety of features, they may require numerous gesture actions, such as clicks and selections, to perform the desired functionality of common activities. Constantly repeating the gesture controls to perform these common activities may lead to user annoyance and frustration.
[0010] Example systems for determining an application launch state may reduce the number of gesture controls to perform common activities in an application. The common activities may involve functionalities of the application and values used by the application. Different functionalities of the application, values used by the application as well as functionality/value combinations may be mapped to a variety of gesture actions that can be performed on a user interface element representing the application. Performing the gesture action on the user interface element passes a parameter containing the corresponding functionality, value or functionality/value combination to the application upon launching the application. The application may then be launched in a state performing the functionality using the value.
[0011] For example, a user of a mobile mapping application may commonly use the application to navigate to the user's home. Traditionally, to perform this activity, a user may have to launch the application, open a navigation panel, select the address, select a route and select a navigation user interface element. In this example, the functionality of the activity is a navigation functionality and the value is the user's home. Example systems for determining an application launch state allow a user to perform a gesture action on the user interface element corresponding to the navigation functionality and the home value. A parameter identifying the navigation functionality and the home value is provided to the application upon launch, and the application automatically begins performing the navigation functionality using the home value.
[0012] A variety of gesture actions may be used with the example systems for determining an application launch state. For example, when a user long presses a user interface element, such as an application icon, several options may appear on the screen. A "long press" refers to holding the user interface element for a certain period of time. Each option may represent a functionality, a value or a functionality/value combination used by the application. The user may drag the application icon to an option corresponding to navigate to home. As used herein, "drag" may refer to the action of selecting a user interface element, such as an application icon, and moving the user interface element in a particular direction. Upon receiving the selection, the application will launch and begin performing the navigation functionality using the home value.
[0013] As another example, a user may long press a user interface element and create a pattern using the user interface element, such as dragging an application icon in a certain direction, creating a shape, etc. Certain patterns may correspond to a functionality, a value or a functionality/value combination used by the application. Upon completing the pattern, a parameter containing the corresponding functionality, value or functionality/value combination is passed to the application. The application will then launch and begin performing the functionality on any value included in the parameter.
[0014] An example method for determining an application launch state may include receiving a gesture action performed on a user interface element representing an application and identifying the received gesture action from a set of possible gesture actions. Each gesture action in the set may correspond to a different parameter for use by the application. The method may also include determining a parameter corresponding to the received gesture action, providing the parameter to the application upon launching the application and launching the application in a state defined by the parameter.
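A minimal sketch of this kind of method is given below. It is purely illustrative and not part of the disclosure: the gesture identifiers, the registry name GESTURE_PARAMETERS, and the stubbed launch_application function are all hypothetical.

```python
# Illustrative sketch only: a launcher-side registry that maps gesture actions
# performed on an application icon to launch parameters, each naming a
# functionality and/or a value. All identifiers here are hypothetical.

GESTURE_PARAMETERS = {
    "drag_to_home_option": {"functionality": "navigate", "value": "home"},
    "drag_up":             {"functionality": "navigate", "value": None},
    "draw_circle":         {"functionality": "search_nearby", "value": "fuel"},
}

def identify_gesture(received_gesture):
    """Identify the received gesture action from the set of possible gesture actions."""
    return received_gesture if received_gesture in GESTURE_PARAMETERS else None

def determine_parameter(gesture):
    """Determine the parameter corresponding to the identified gesture action."""
    return GESTURE_PARAMETERS[gesture]

def launch_application(app_id, parameter=None):
    """Launch the application in a state defined by the parameter (stubbed here)."""
    print(f"launching {app_id} with parameter {parameter}")

def on_gesture(app_id, received_gesture):
    gesture = identify_gesture(received_gesture)
    if gesture is None:
        launch_application(app_id)                      # ordinary launch, no extra state
    else:
        launch_application(app_id, determine_parameter(gesture))

on_gesture("mapping_app", "drag_to_home_option")        # launches straight into navigation to "home"
```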
[0015] FIG. 1 is a block diagram of an example system 100 for application launch state determination. In the example shown in FIG. 1, system 100 may comprise various components, including a user interface element receiver 110, a gesture action state initiator 112, a gesture action determiner 114, an option displayer 116, an option receiver 118, a pattern interpreter 120, a parameter provider 122, an application launcher 124 and/or other components. According to various implementations, application launch state determination system 100 may be implemented in hardware and/or a combination of hardware and programming that configures hardware. Furthermore, in FIG. 1 and other Figures described herein, different numbers of components or entities than depicted may be used. As is illustrated with respect to FIG. 6, the hardware of the various components of application launch state determination system 100, for example, may include one or both of a processor and a machine-readable storage medium, while the instructions are code stored on the machine-readable storage medium and executable by the processor to perform the designated function.
[0016] User interface element receiver 110 may receive a selection of a user interface element, such as an application icon. The selection may be made by holding the user interface element for a certain period of time. As discussed above, a device may have numerous applications installed and each installed application may be visually represented by a user interface element, such as an application icon. Application icons may be displayed during an initial state of the computing device, such as a home screen, application menu, etc.
[0017] Gesture action state initiator 112 may initiate a gesture action state in response to receiving the selected user interface element. In some examples, the gesture action state may be visually distinct from the initial state. For example, only the selected user interface element may be displayed. In some examples, the gesture action state may be visually similar to the initial state. The gesture action state can recognize a gesture action performed on the user interface element. As used herein, a "gesture action" refers to any predefined motion to interact with a user interface element. Gesture actions may be performed with any of a variety of input devices, such as a finger on a touch screen interface, a mouse, a keyboard, a stylus, etc. Exemplary gesture actions include receiving an option corresponding to a selected user interface element and receiving a pattern performed on a selected user interface element. These exemplary gesture actions are discussed in greater detail below.
[0018] Gesture action interpreter 114 may interpret a parameter for the application from the gesture action. The parameter may identify a functionality of the application, a value used by the application or a combination of the two. Gesture action interpreter 114 may determine a functionality, a value or a functionality/value combination corresponding to the gesture action and include the functionality, the value or the functionality/value combination in the parameter. The parameter may also define an action to be performed by the application upon startup, a screen to be displayed by the application upon startup, etc.
[0019] In one example, a gesture action may be a selection of at least one option corresponding to a selected user interface element. This example is discussed in further detail below in reference to FIG. 2. After the gesture action state is initiated (e.g., as discussed herein with respect to gesture action state initiator 112), an option displayer 116 may display a first set of options corresponding to the selected user interface element (e.g., as discussed herein with respect to user interface element receiver 110) within a viewport of the display of the computing device. As used herein, the term "viewport" refers to a visible region of a display where user interface elements are rendered. Each option in the first set of options may be visually represented by a user interface element. The user interface elements for the first set of options may be displayed within a viewport of a display of the computing device. In some examples, a user may be able to set a functionality, a value or a functionality/value combination to correspond to an option. In some examples, the options may be preset.
[0020] An option receiver 118 may receive a selection of a user interface element representing an option from the set of options. For example, the option receiver 118 may identify that the selected user interface element has been dragged to the user interface element representing the option. The user interface element may be dragged by, for example, a user of the computing device.
[0021] In some examples, option displayer 116 may display a second set of options relating to the selected option in response to receiving the selection of the user interface element. Each option in the second set of options may be visually represented by a user interface element. The user interface elements for the second set of options may be displayed within a viewport of a display of the computing device. When the user interface elements for the second set of options are displayed within the viewport, user interface elements for the first set of options may be removed from the viewport or no longer be displayed within the viewport of the display of the computing device.
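The two-level option flow described in paragraphs [0019]-[0021] could be sketched as follows. The option lists and helper names below are assumptions made for illustration only; in practice the options could be preset or configured by the user, and a real system would render them within the viewport.

```python
# Sketch of the option-selection gesture flow (hypothetical names and data):
# a long press shows a first set of options; dragging the icon to a
# functionality option reveals a second set of value options; dragging to a
# value yields the parameter passed to the application at launch.

FIRST_OPTIONS = ["navigate", "share", "search_nearby"]          # functionalities
SECOND_OPTIONS = {"navigate": ["home", "work", "recent"]}       # values per functionality

def on_long_press(app_id):
    """Enter the gesture action state and show the first set of options."""
    return FIRST_OPTIONS

def on_drag_to_option(option):
    """Selecting a functionality option reveals a second, related set of options."""
    return SECOND_OPTIONS.get(option, [])

def on_drag_to_value(functionality, value):
    """Dragging to a value option produces the launch parameter."""
    return {"functionality": functionality, "value": value}

first = on_long_press("mapping_app")             # ["navigate", "share", "search_nearby"]
second = on_drag_to_option("navigate")           # ["home", "work", "recent"]
parameter = on_drag_to_value("navigate", "home")
print(parameter)                                 # {'functionality': 'navigate', 'value': 'home'}
```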
[0022] In another example, a gesture action may be a pattern performed on a selected user interface element. After the gesture action state is initiated (e.g., as discussed herein with respect to gesture action state initiator 112), a pattern interpreter 120 may interpret a pattern created by the user with the user interface element and determine the parameter associated with the pattern. Example patterns may include certain directions (up, down, left, right, up-left, down-right, etc.), certain locations (top of the display, bottom of the display, etc.) and shapes (circle, square, etc.). Each direction, shape, etc. may correspond to a functionality, a value or a functionality/value combination of the application. In one example, a user may be able to set an option to correspond to a functionality, a value or a functionality/value combination. In some examples, the options may be preset.
[0023] Parameter provider 122 may provide the parameter (e.g., as discussed herein with respect to gesture action state initiator 112) to the application upon launching the application. Application launcher 124 may launch the application in a state defined by the parameter. The state may define, for example, an action to be performed by the application upon startup, a screen to be displayed by the application upon startup, a functionality to be performed by the application upon startup, a value to be used by the functionality or application upon startup, etc.
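For the directional case of the pattern interpretation described in paragraph [0022], one assumed approach is to reduce the drag path to its dominant direction and look that direction up in a preset or user-configured table. The direction names and parameter mappings in the sketch below are hypothetical.

```python
# Sketch of a pattern interpreter (assumed logic, not the claimed implementation):
# classify a drag path into a dominant direction and map it to a parameter.

PATTERN_PARAMETERS = {
    "up":    {"functionality": "navigate", "value": "home"},
    "right": {"functionality": "navigate", "value": "work"},
    "down":  {"functionality": "search_nearby", "value": "fuel"},
}

def classify_direction(path):
    """Reduce a drag path (a list of (x, y) points) to its dominant direction."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"            # screen y typically grows downward

def parameter_for_pattern(path):
    """Look up the parameter mapped to the interpreted pattern, if any."""
    return PATTERN_PARAMETERS.get(classify_direction(path))

print(parameter_for_pattern([(100, 400), (102, 380), (105, 250)]))  # upward drag -> navigate home
```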
[0024] FIG. 2 is a series of example viewports 200, 210 and 220 of a display of a computing device illustrating an exemplary gesture action that may be useful in conjunction with an application launch state determination system, such as the example system 100 discussed above. Viewport 200 illustrates an example initial state screen of a computing device displaying several user interface elements 202 and a user interface element selector 204. The initial state screen may be, for example, a home screen of a computing device, an application selection screen, etc. User interface element selector 204 may be used to select a user interface element displayed within the viewport 200. Although the user interface element selector 204 is depicted as a mouse cursor, any of a variety of interfaces may be used for selecting user interface elements, such as a keyboard, a touch screen interface, a scrolling device, a voice interface, etc.
[0025] After the selection of a user interface element is received (e.g., as discussed herein with respect to user interface element receiver 110), a gesture action state may be initiated (e.g., as discussed herein with respect to gesture action state initiator 112). Viewport 210 illustrates an exemplary gesture action state 210 including a selected user interface element 208 and a set of options 212 corresponding to the selected user interface element 208 (e.g., as discussed herein with respect to option displayer 116). The options 212 may correspond to a functionality of the application, a value used by the application or a combination of the two. For example, if the application is a mapping application, one of the options 212 may be a navigation functionality, a home value, a combination of the navigation functionality and the home value, etc. Each option in the set of options may be visually represented by a user interface element 212.
[0026] The initial state illustrated in viewport 200 and the gesture action state illustrated in viewport 210 may be visually distinct. The initial state illustrated in viewport 200 includes six rows of user interface elements, while the gesture action state illustrated in viewport 210 includes the selected user interface element and eight options 212 corresponding to the selected user interface element. In the example viewport 210, elements other than the selected user interface element and the options 212, such as the user interface elements 202 that were not selected, may no longer be displayed.
[0027] Although the set of options may be depicted as user interface elements in a circular pattern around the selected user interface element 208, any of a variety of techniques may be used to display the set of options. In some examples, the user interface elements that were not selected may be removed from the display and no longer displayed within the viewport of the display.
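Purely as an illustration of one such layout technique, the positions of the option icons in a circular arrangement could be computed as in the sketch below; the center, radius and option count are arbitrary example values, not values taken from the disclosure.

```python
# Sketch: compute positions for N option icons arranged evenly on a circle
# around the selected icon (one of many possible layouts; illustrative only).

import math

def circular_layout(center_x, center_y, radius, n_options):
    """Return (x, y) positions for n_options placed evenly on a circle."""
    positions = []
    for i in range(n_options):
        angle = 2 * math.pi * i / n_options
        positions.append((center_x + radius * math.cos(angle),
                          center_y + radius * math.sin(angle)))
    return positions

print(circular_layout(200, 300, 120, 8))   # eight option positions, as in viewport 210
```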
[0028] Viewport 220 illustrates the selected user interface element 208 being dragged towards a selected option 222. The direction in which the selected user interface element 208 is being dragged may be represented by arrow 224. Once the selection of option 222 is received (e.g., as discussed herein with respect to option receiver 118), a corresponding parameter may be interpreted (e.g., as discussed herein with respect to gesture action interpreter 114) and provided to the application upon launch (e.g., as discussed herein with respect to parameter provider 122).
[0029] In some examples, a second set of options may be displayed in response to receiving a selection from the first set of options. The second set of options may be related to the selected option 222. For example, if the selected option was a navigation functionality of a mapping application, the second set of options may be values used by the navigation functionality, such as addresses or locations.
[0030] FIG. 3 is a flowchart of an example method 300 for determining an application launch state. Method 300 may be described below as being executed or performed by a system, for example, system 100 of FIG. 1, system 500 of FIG. 5 or system 600 of FIG. 6. Other suitable systems and/or computing devices may be used as well. Method 300 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of the system and executed by at least one processor of the system. Alternatively or in addition, method 300 may be implemented in the form of electronic circuitry (e.g., hardware). In alternate examples of the present disclosure, at least one step of method 300 may be executed substantially concurrently or in a different order than shown in FIG. 3. In alternate examples of the present disclosure, method 300 may include more or fewer steps than are shown in FIG. 3. In some examples, at least one of the steps of method 300 may, at certain times, be ongoing and/or may repeat.
[0031] Method 300 may start at step 302 and continue to step 304, where the method may include receiving a gesture action performed on a user interface element representing an application. At step 306, the method may include identifying the received gesture action from a set of possible gesture actions. Each gesture action in the set of possible gesture actions may correspond to a different parameter for use by the application. The different parameters may include a functionality, a value, etc. for the application corresponding to the gesture action. The parameter may also define an action to be performed by the application upon startup or a screen to be displayed by the application upon startup. At step 308, the method may include determining a first parameter corresponding to the received gesture action. At step 310, the method may include providing the first parameter to the application upon launching the application. At step 312, the method may include launching the application in a state defined by the first parameter. Method 300 may eventually continue to step 314, where method 300 may stop.
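On the application side, a launch state defined by such a parameter could be handled roughly as sketched below. The screen names and the functionality dispatch are illustrative assumptions, not the claimed implementation.

```python
# Sketch of the application side (assumed behaviour): on startup the
# application reads the provided parameter and enters the corresponding state,
# e.g. opening a specific screen or running a functionality with the value.

def start_application(parameter=None):
    """Enter the launch state defined by the parameter (illustrative dispatch)."""
    if not parameter:
        return show_screen("main")                       # ordinary launch
    if "screen" in parameter:
        return show_screen(parameter["screen"])          # open a specific screen on startup
    functionality = parameter.get("functionality")
    if functionality:
        return run_functionality(functionality, parameter.get("value"))
    return show_screen("main")

def show_screen(name):
    print(f"showing screen: {name}")

def run_functionality(name, value):
    print(f"running {name} with value {value}")          # e.g. navigate to "home"

start_application({"functionality": "navigate", "value": "home"})
```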
[0032] FIG. 4 is a flowchart of an example method 400 for determining an application launch state. Method 400 may be described below as being executed or performed by a system, for example, system 100 of FIG. 1, system 500 of FIG. 5 or system 600 of FIG. 6. Other suitable systems and/or computing devices may be used as well. Method 400 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of the system and executed by at least one processor of the system. Alternatively or in addition, method 400 may be implemented in the form of electronic circuitry (e.g., hardware). In alternate examples of the present disclosure, at least one step of method 400 may be executed substantially concurrently or in a different order than shown in FIG. 4. In alternate examples of the present disclosure, method 400 may include more or fewer steps than are shown in FIG. 4. In some examples, at least one of the steps of method 400 may, at certain times, be ongoing and/or may repeat.
[0033] Method 400 may start at step 402 and continue to step 404, where the method may include receiving a selection of a user interface element. The user interface element may visually represent an application installed on a computer device. At step 406, the method may include displaying a set of options corresponding to the user interface element. Each option in the set of options may correspond to a different parameter for use by the application. Each option in the set of options may be visually represented by a user interface element. At step 408, the method may include receiving a selection of a first option from the set of options.
[0034] At step 410, the method may include determining that a user has dragged the user interface element to the first option and determining a first parameter corresponding to the first option. At step 412, the method may include displaying a second set of options relating to the first option. The method may perform step 412 in response to receiving the selection of the first option. Method 400 may eventually continue to step 414, where method 400 may stop.
[0035] FIG. 5 is a block diagram of an example application launch state determination system 500. System 500 may be similar to system 100 of FIG. 1, for example. In FIG. 5, system 500 includes user interface element receiver 502, gesture action state initiator 504, gesture action determiner 506, parameter provider 508 and application launcher 510.
[0036] User interface element receiver 502 may receive a selection of a user interface element corresponding to an application installed on a computer device. User interface element receiver 502 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of system 500 and executed by at least one processor of system 500. Alternatively or in addition, user interface element receiver 502 may be implemented in the form of at least one hardware device including electronic circuitry for implementing the functionality of user interface element receiver 502.
[0037] Gesture action state initiator 504 may initiate a gesture action state. The gesture action state may recognize a gesture action from a set of gesture actions defining different parameters for use by the application. Gesture action state initiator 504 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of system 500 and executed by at least one processor of system 500. Alternatively or in addition, gesture action state initiator 504 may be implemented in the form of at least one hardware device including electronic circuitry for implementing the functionality of gesture action state initiator 504.
[0038] Gesture action determiner 506 may determine a first parameter corresponding to a received gesture action. Gesture action determiner 506 may determine a functionality and a value for the application from the gesture action. The functionality and the value may be included in the parameter. The parameter may define an action to be performed by the application upon startup of the application, a specific screen to be displayed by the application upon startup of the application, etc.
[0039] Gesture action determiner 506 may interpret a variety of gesture actions. For example, the gesture action determiner 506 may display a set of options corresponding to the selection of the user interface element. Each option in the set of options may be visually represented. Gesture action determiner 506 may display only the user interface element and the set of options within a viewport of a display of the computing device.
[0040] Gesture action determiner 506 may receive a selection of a first option from the set of options. Gesture action determiner 506 may identify that a user has dragged the user interface element to the first option. Gesture action determiner 506 may also display a second set of options related to the first option in response to receiving the selection of the first option. Gesture action determiner 506 may display only the user interface element and the second set of options within a viewport of a display of the computing device. Gesture action determiner 506 may remove and/or no longer display the first set of options corresponding to the selection of the user interface element.
[0041] Another example of a gesture action that may be interpreted by gesture action determiner 506 is a pattern. Gesture action determiner 506 may determine a pattern created by the user with the user interface element and determine a parameter associated with the pattern. The pattern may be a shape, a dragging pattern (up, down, left, right, etc.), etc.
[0042] Gesture action determiner 506 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of system 500 and executed by at least one processor of system 500. Alternatively or in addition, gesture action determiner 506 may be implemented in the form of at least one hardware device including electronic circuitry for implementing the functionality of gesture action determiner 506.
[0043] Parameter provider 508 may provide the parameter to the application upon launching the application. Parameter provider 508 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of system 500 and executed by at least one processor of system 500. Alternatively or in addition, parameter provider 508 may be implemented in the form of at least one hardware device including electronic circuitry for implementing the functionality of parameter provider 508.
[0044] Application launcher 510 may launch the application in a state defined by the parameter. Application launcher 510 may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of system 500 and executed by at least one processor of system 500. Alternatively or in addition, application launcher 510 may be implemented in the form of at least one hardware device including electronic circuitry for implementing the functionality of application launcher 510.
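Assuming, for illustration only, an Android-style computing device, the parameter could be provided to the application as launch intent extras; the extra keys in this Kotlin sketch are hypothetical and would need to be agreed upon with the application.

// Sketch assuming an Android environment; the extra keys are illustrative only.
import android.content.Context
import android.content.Intent

fun launchWithParameter(context: Context, packageName: String, functionality: String, value: String?) {
    // Build the normal launch intent for the application, then attach the parameter.
    val intent = context.packageManager.getLaunchIntentForPackage(packageName) ?: return
    intent.putExtra("launch_functionality", functionality)    // hypothetical key
    if (value != null) intent.putExtra("launch_value", value) // hypothetical key
    intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(intent)
}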
[0045] FIG. 6 is a block diagram of an example system 600 for application launch state determination. System 600 may be similar to system 100 of FIG. 1. In the example illustrated in FIG. 6, system 600 includes a processor 602 and a machine-readable storage medium 604. Although the following descriptions refer to a single processor and a single machine-readable storage medium, the descriptions may also apply to a system with multiple processors and multiple machine-readable storage mediums. In such examples, the instructions may be distributed (e.g., stored) across multiple machine-readable storage mediums and the instructions may be distributed (e.g., executed by) across multiple processors.
[0046] Processor 602 may be one or more central processing units (CPUs), microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 604. In the example illustrated in FIG. 6, processor 602 may fetch, decode, and execute instructions 606, 608, 610 and 612 to perform application launch state determination. As an alternative or in addition to retrieving and executing instructions, processor 602 may include one or more electronic circuits comprising a number of electronic components for performing the functionality of at least one of the instructions in machine-readable storage medium 604. With respect to the executable instruction representations (e.g., boxes) described and shown herein, it should be understood that part or all of the executable instructions and/or electronic circuits included within one box may, in alternate examples, be included in a different box shown in the figures or in a different box not shown.
[0047] Machine-readable storage medium 604 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, machine-readable storage medium 604 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. Machine-readable storage medium 604 may be disposed within system 600, as shown in FIG. 6. In this situation, the executable instructions may be "installed" on the system 600. Alternatively, machine-readable storage medium 604 may be a portable, external or remote storage medium, for example, that allows system 600 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an "installation package". As described herein, machine-readable storage medium 604 may be encoded with executable instructions for application launch state determination.
[0048] Referring to FIG. 6, application user interface element selection instructions 606, when executed by a processor (e.g., 602), may cause system 600 to receive a selection of an application user interface element corresponding to an application installed on a computing device. Gesture action instructions 608, when executed by a processor (e.g., 602), may receive a gesture action performed on the application user interface element. The gesture action corresponds to one of a plurality of possible parameters defining a functionality performed by the application and a value used by the application. The parameter may define an action to be performed by the application upon startup of the application, a specific screen to be displayed by the application upon startup of the application, etc.
[0049] Gesture action instructions 608 may receive a variety of gesture actions. For example, gesture action instructions 608 may display a first set of user interface elements within a viewport of a display of the computing device. As used herein, the term "viewport" refers to a visible region of a display where user interface elements are rendered. Each user interface element in the first set of user interface elements may visually represent an option corresponding to the application user interface element. Gesture action instructions 608 may receive a selection of a first user interface element from the first set of user interface elements. The first user interface element may represent a first option. Gesture action instructions 608 may display only the application user interface element and the first set of user interface elements within the viewport of the display of the computing device. Receiving a selection may further include identifying that the user has dragged the application user interface element to the first user interface element.
[0050] Gesture action instructions 608 may display a second set of user interface elements within the viewport of the display of the computing device. Each user interface element in the second set of user interface elements may correspond to the first option. The second set of user interface elements may be displayed upon receiving the selection of the first user interface element. The first set of user interface elements may be removed and/or no longer displayed within the viewport of the display.
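A minimal Kotlin sketch of this two-level flow, assuming the option sets are simple labeled lists, might replace the first set of user interface elements with the second set once a first option is selected; the labels are invented for illustration.

// Sketch of the two-level option flow: the first set is replaced by a second set
// related to whichever first option was selected. Labels are hypothetical.
class OptionMenuState(
    firstOptions: List<String>,
    private val secondOptions: Map<String, List<String>>
) {
    var visibleOptions: List<String> = firstOptions
        private set

    // Called when the user drags the application icon onto a first-level option.
    fun selectFirstOption(option: String) {
        // The first set is no longer displayed; only the related second set remains.
        visibleOptions = secondOptions[option] ?: emptyList()
    }
}

fun main() {
    val state = OptionMenuState(
        firstOptions = listOf("Compose", "Search"),
        secondOptions = mapOf("Compose" to listOf("To Alice", "To Bob"))
    )
    state.selectFirstOption("Compose")
    println(state.visibleOptions) // prints [To Alice, To Bob]
}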
[0051] Another example of a gesture action is a pattern. Example patterns may include dragging an application icon in a certain direction (up, down, left, right, up-left, down-right, etc.), to certain locations (top of the display, bottom of the display, etc.), and in shapes (circle, square, etc.). Each pattern (direction, shape, etc.) may correspond to a functionality, a value or a functionality/value combination of the application.
[0052] Gesture action instructions 608 may interpret a pattern created by the user with the application user interface element and determine a parameter associated with the pattern.
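One possible Kotlin sketch of interpreting a dragging pattern, assuming the gesture is reduced to a start point and an end point in screen coordinates, classifies the drag by its dominant direction and looks up a parameter for it; the direction-to-parameter pairing is illustrative only.

// Sketch: classify a drag by its dominant direction and look up a parameter for it.
enum class DragDirection { UP, DOWN, LEFT, RIGHT }

fun classifyDrag(startX: Float, startY: Float, endX: Float, endY: Float): DragDirection {
    val dx = endX - startX
    val dy = endY - startY
    return if (kotlin.math.abs(dx) >= kotlin.math.abs(dy)) {
        if (dx >= 0) DragDirection.RIGHT else DragDirection.LEFT
    } else {
        // Screen coordinates typically grow downward, so a negative dy is an upward drag.
        if (dy < 0) DragDirection.UP else DragDirection.DOWN
    }
}

// Hypothetical mapping from a drag direction to a launch parameter.
val directionParameters = mapOf(
    DragDirection.UP to "open_screen:inbox",
    DragDirection.DOWN to "open_screen:settings"
)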
[0053] Parameter provide instructions 610, when executed by a processor (e.g., 602), may cause system 600 to provide the parameter to the application upon launching the application. Functionality execute instructions 612, when executed by a processor (e.g., 602), may cause system 600 to automatically execute the functionality using the value defined by the parameter.
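On the application side, a hedged Kotlin sketch of automatically executing the functionality with the value might read the provided parameter at startup and dispatch on it; the activity, extra keys, and screen names below are assumptions that mirror the hypothetical keys used in the earlier launch sketch.

// Sketch assuming an Android activity; the extra keys and screens are hypothetical.
import android.app.Activity
import android.os.Bundle

class MainActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val functionality = intent.getStringExtra("launch_functionality")
        val value = intent.getStringExtra("launch_value")
        when (functionality) {
            "open_screen" -> showScreen(value)        // jump straight to a named screen
            "compose_message" -> startCompose(value)  // begin a new message immediately
            else -> showDefaultScreen()               // normal launch when no parameter
        }
    }

    private fun showScreen(name: String?) { /* navigate to the named screen */ }
    private fun startCompose(recipient: String?) { /* open a compose view */ }
    private fun showDefaultScreen() { /* default start state */ }
}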
[0054] The foregoing disclosure describes a number of examples for application launch state determination. The disclosed examples may include systems, devices, computer-readable storage media, and methods for application launch state determination. For purposes of explanation, certain examples are described with reference to the components illustrated in FIGS. 1-6. The functionality of the illustrated components may overlap, however, and may be present in a fewer or greater number of elements and components. Further, all or part of the functionality of illustrated elements may co-exist or be distributed among several geographically dispersed locations. Further, the disclosed examples may be implemented in various environments and are not limited to the illustrated examples.

[0055] Further, the sequence of operations described in connection with FIGS. 1-6 are examples and are not intended to be limiting. Additional or fewer operations or combinations of operations may be used or may vary without departing from the scope of the disclosed examples. Furthermore, implementations consistent with the disclosed examples need not perform the sequence of operations in any particular order. Thus, the present disclosure merely sets forth possible examples of implementations, and many variations and modifications may be made to the described examples.

Claims

1) A method for application launch state determination, the method comprising: receiving a gesture action performed on a user interface element representing an application;
identifying the received gesture action from a set of possible gesture actions, wherein each gesture action in the set corresponds to a different parameter for use by the application;
determining a first parameter corresponding to the received gesture action; providing the first parameter to the application upon launching the application; and
launching the application in a state defined by the first parameter.
2) The method of claim 1, wherein receiving the gesture action further comprises: receiving a selection of the user interface element;
displaying a set of options corresponding to the user interface element, wherein each option in the set of options corresponds to the different parameter for use by the application; and
receiving a selection of a first option from the set of options.
3) The method of claim 2, wherein receiving the selection further comprises:
determining that a user has dragged the user interface element to the first option; and
determining the first parameter corresponding to the first option.
4) The method of claim 3, further comprising:
displaying, in response to receiving the selection of the first option, a second set of options relating to the first option.
5) The method of claim 1, wherein the state defined by the first parameter is one of an action to be performed by the application upon startup or a screen to be displayed by the application upon startup.
6) The method of claim 1, wherein receiving the gesture action further includes: interpreting a pattern created with the user interface element; and determining the first parameter corresponding to the pattern.
7) A system for application launch state determination, the system comprising: a user interface element receiver to receive a selection of a user interface element corresponding to an application installed on a computer device;
a gesture action state initiator to initiate a gesture action state, wherein the gesture action state can recognize a gesture action from a set of gesture actions defining different parameters for use by the application;
a gesture action determiner to determine a first parameter corresponding to a received gesture action;
a parameter provider to provide the first parameter to the application upon launching the application; and
an application launcher to launch the application in a state defined by the first parameter.
8) The system of claim 7, wherein the first parameter further defines one of: an action to be performed by the application upon startup or a screen to be displayed by the application upon startup.
9) The system of claim 7, further comprising:
an option displayer to display a set of options corresponding to the user interface element, wherein each option in the set of options corresponds to one of the different parameters for use by the application; and
an option receiver to identify that a user has dragged the user interface element to a first option.
10) The system of claim 7, further comprising:
a pattern interpreter to interpret a pattern created by a user with the user interface element; and
determine the first parameter associated with the pattern.

11) A non-transitory machine-readable storage medium comprising instructions executable by a processor of a computing device for application launch state determination, the machine-readable storage medium comprising instructions to:
receive a selection of an application user interface element corresponding to an application installed on a computer device;
receive a gesture action performed on the application user interface element, wherein the gesture action corresponds to one of a plurality of possible parameters defining a functionality performed by the application and a value used by the application;
provide the parameter to the application upon launching the application; and automatically execute the functionality using the value defined by the parameter.
12) The non-transitory machine-readable storage medium of claim 11, wherein the instructions executable by the processor of the system further cause the system to: display a first set of user interface elements within a viewport of a display of the computing device, wherein each user interface element in the first set of user interface elements visually represents an option corresponding to the application user interface element; and
receive a selection of a first option.
13) The non-transitory machine-readable storage medium of claim 12, wherein the instructions executable by the processor of the system further cause the system to: display only the application user interface element and the first set of user interface elements within the viewport of the display of the computing device.
14) The non-transitory machine-readable storage medium of claim 12, wherein the instructions executable by the processor of the system further cause the system to: display, upon receiving the selection of the first user interface element, a second set of user interface elements within the viewport of the display of the computing device, wherein each user interface element in the second set of user interface elements corresponds to the first option; and
remove the first set of user interface elements from the viewport of the display.

15) The non-transitory machine-readable storage medium of claim 11, wherein the instructions executable by the processor of the system further cause the system to: interpret a pattern created by a user with the application user interface element; and
determine the parameter associated with the pattern.
PCT/US2015/042375 2015-07-28 2015-07-28 Application launch state determination WO2017019028A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2015/042375 WO2017019028A1 (en) 2015-07-28 2015-07-28 Application launch state determination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2015/042375 WO2017019028A1 (en) 2015-07-28 2015-07-28 Application launch state determination

Publications (1)

Publication Number Publication Date
WO2017019028A1 true WO2017019028A1 (en) 2017-02-02

Family

ID=57884865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/042375 WO2017019028A1 (en) 2015-07-28 2015-07-28 Application launch state determination

Country Status (1)

Country Link
WO (1) WO2017019028A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080036743A1 (en) * 1998-01-26 2008-02-14 Apple Computer, Inc. Gesturing with a multipoint sensing device
US20080074399A1 (en) * 2006-09-27 2008-03-27 Lg Electronic Inc. Mobile communication terminal and method of selecting menu and item
US20090265670A1 (en) * 2007-08-30 2009-10-22 Kim Joo Min User interface for a mobile device using a user's gesture in the proximity of an electronic device
US20110273388A1 (en) * 2010-05-10 2011-11-10 Samsung Electronics Co., Ltd. Apparatus and method for receiving gesture-based input in a mobile device
WO2013062883A1 (en) * 2011-10-25 2013-05-02 Google Inc. Gesture-based search

Similar Documents

Publication Publication Date Title
US10445132B2 (en) Method and apparatus for switching applications
EP2656193B1 (en) Application-launching interface for multiple modes
US8707211B2 (en) Radial graphical user interface
US9063644B2 (en) Adjustment mechanisms for virtual knobs on a touchscreen interface
CN104423789B (en) A kind of information processing method and electronic equipment
RU2018146112A (en) DEVICES, METHODS AND GRAPHIC USER INTERFACES FOR MANAGING USER INTERFACE OBJECTS WITH VISUAL AND / OR HAPTIC FEEDBACK
US9933922B2 (en) Child container control of parent container of a user interface
KR102265126B1 (en) Organizing user interface elements
US11099723B2 (en) Interaction method for user interfaces
CN107596688B (en) Skill release control method and device, storage medium, processor and terminal
US9870122B2 (en) Graphical user interface for rearranging icons
US10761717B2 (en) Controlling application launch
RU2016136361A (en) AUTOMATIC CREATION AND PERFORMANCE OF THE CUSTOMIZABLE USER INTERFACE
US9588661B1 (en) Graphical user interface widget to select multiple items from a fixed domain
CN103279304B (en) Method and device for displaying selected icon and mobile device
US20150143289A1 (en) Automatic check box interaction
CN111282264B (en) Virtual object control method and device
CN105573610A (en) Spreadsheet operation methods and apparatuses
EP2711805A1 (en) Method for handling a gesture-based user interface
WO2017019028A1 (en) Application launch state determination
US9635170B2 (en) Apparatus and method for controlling terminal to expand available display region to a virtual display space
JP6662861B2 (en) Hit test to determine whether to enable direct operation in response to user action
CN105404439B (en) Folder creating method and device
CN107463315B (en) Information processing method and electronic equipment
RU2020100889A (en) FORMATION OF USER INTERFACES BASED ON THE RULES

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15899821

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15899821

Country of ref document: EP

Kind code of ref document: A1