US20130044141A1 - Cross-slide Gesture to Select and Rearrange - Google Patents
- Publication number
- US20130044141A1 (application Ser. No. 13/656,354)
- Authority
- US
- United States
- Prior art keywords
- panning
- gesture
- drag
- different
- action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- a mode can be thought of as an action, initiated by a user, that is not necessarily related to manipulating an item directly. For example, a mode can be entered by clicking on a particular user interface button to then be exposed to functionality that can be performed relative to an item or object. In the described embodiments, modes can be avoided by eliminating, in at least some instances, user interface elements to access drag functionality.
- an example environment is first described that is operable to employ the gesture techniques described herein.
- Example illustrations of the gestures and procedures are then described, which may be employed in the example environment, as well as in other environments. Accordingly, the example environment is not limited to performing the example gestures and the gestures are not limited to implementation in the example environment.
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ cross-slide gestures as described herein.
- the illustrated environment 100 includes an example of a computing device 102 that may be configured in a variety of ways.
- the computing device 102 may be configured as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile station, an entertainment appliance, a set-top box communicatively coupled to a television, a wireless phone, a netbook, a game console, a handheld device, and so forth as further described in relation to FIG. 2 .
- the computing device 102 may range from full resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles).
- the computing device 102 also includes software that causes the computing device 102 to perform one or more operations as described below.
- Computing device 102 includes a gesture module 104 that is operational to provide gesture functionality as described in this document.
- the gesture module can be implemented in connection with any suitable type of hardware, software, firmware or combination thereof.
- the gesture module is implemented in software that resides on some type of tangible, computer-readable storage medium examples of which are provided below.
- Gesture module 104 is representative of functionality that recognizes gestures, including cross-slide gestures that can be performed by one or more fingers, and causes operations to be performed that correspond to the gestures.
- the gestures may be recognized by module 104 in a variety of different ways.
- the gesture module 104 may be configured to recognize a touch input, such as a finger of a user's hand 106 a as proximal to display device 108 of the computing device 102 using touchscreen functionality.
- gesture module 104 can recognize cross slide gestures that can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- a pan or scroll direction is shown as being in the vertical direction, as indicated by the arrows.
- a cross slide gesture can be performed by dragging an item or object in a direction that is different from, e.g. orthogonal to, the panning or scrolling direction.
- the different-direction drag can be mapped to additional actions or functionality.
- a vertical direction can be considered, in at least some instances, as a direction that is generally parallel to one side of a display device, and a horizontal direction can be considered as a direction that is generally orthogonal to the vertical direction.
- Although the orientation of a computing device may change, the verticality or horizontality of a particular cross slide gesture can remain standard as defined relative to and along the display device.
- a finger of the user's hand 106 a is illustrated as selecting 110 an image 112 displayed by the display device 108 .
- Selection 110 of the image 112 and subsequent movement of the finger of the user's hand 106 a in a direction that is different from the pan or scroll direction, e.g., generally orthogonal relative to the pan or scroll direction, may be recognized by the gesture module 104 .
- the gesture module 104 may then identify this recognized movement, by the nature and character of the movement, as indicating a “drag and drop” operation to change a location of the image 112 to a point in the display at which the finger of the user's hand 106 a is lifted away from the display device 108 .
- recognition of the touch input that describes selection of the image, movement of the selection point to another location, and then lifting of the finger of the user's hand 106 a may be used to identify a gesture (e.g., drag-and-drop gesture) that is to initiate the drag-and-drop operation.
- gesture module 104 can be utilized to recognize single-finger gestures and bezel gestures, multiple-finger/same-hand gestures and bezel gestures, and/or multiple-finger/different-hand gestures and bezel gestures.
- the computing device 102 may be configured to detect and differentiate between a touch input (e.g., provided by one or more fingers of the user's hand 106 a ) and a stylus input (e.g., provided by a stylus 116 ).
- the differentiation may be performed in a variety of ways, such as by detecting an amount of the display device 108 that is contacted by the finger of the user's hand 106 versus an amount of the display device 108 that is contacted by the stylus 116 .
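- as a concrete illustration of this differentiation, consider the following sketch. It uses the W3C Pointer Events API, which reports the input source via pointerType and, on hardware that measures it, the contact patch via width/height; the area cutoff is an assumed tuning value, not something the disclosure specifies.

```typescript
// A sketch of one way to perform this differentiation using the W3C
// Pointer Events API: pointerType reports the source directly, and
// width/height give the contact patch in CSS pixels for hardware that
// measures it. The 4-square-pixel cutoff is an assumed tuning value.
function classifyInput(e: PointerEvent): "stylus" | "touch" | "other" {
  if (e.pointerType === "pen") return "stylus";
  if (e.pointerType === "touch") return "touch";
  if (e.pointerType === "mouse") return "other";
  // Fallback: a stylus tip contacts far less of the display than a finger.
  return e.width * e.height < 4 ? "stylus" : "touch";
}

document.addEventListener("pointerdown", (e) =>
  console.log("input classified as:", classifyInput(e)));
```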
- the gesture module 104 may support a variety of different gesture techniques through recognition and leverage of a division between stylus and touch inputs, as well as different types of touch inputs.
- FIG. 2 illustrates an example system showing the gesture module 104 as being implemented in an environment where multiple devices are interconnected through a central computing device.
- the central computing device may be local to the multiple devices or may be located remotely from the multiple devices.
- the central computing device is a “cloud” server farm, which comprises one or more server computers that are connected to the multiple devices through a network or the Internet or other means.
- this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to the user of the multiple devices.
- Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices.
- a “class” of target device is created and experiences are tailored to the generic class of devices.
- a class of device may be defined by physical features or usage or other common characteristics of the devices.
- the computing device 102 may be configured in a variety of different ways, such as for mobile 202 , computer 204 , and television 206 uses.
- Each of these configurations has a generally corresponding screen size and thus the computing device 102 may be configured as one of these device classes in this example system 200 .
- the computing device 102 may assume the mobile 202 class of device which includes mobile telephones, music players, game devices, and so on.
- the computing device 102 may also assume a computer 204 class of device that includes personal computers, laptop computers, netbooks, and so on.
- the television 206 configuration includes configurations of device that involve display in a casual environment, e.g., televisions, set-top boxes, game consoles, and so on.
- the techniques described herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples described in the following sections.
- Cloud 208 is illustrated as including a platform 210 for web services 212 .
- the platform 210 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 208 and thus may act as a “cloud operating system.”
- the platform 210 may abstract resources to connect the computing device 102 with other computing devices.
- the platform 210 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the web services 212 that are implemented via the platform 210 .
- a variety of other examples are also contemplated, such as load balancing of servers in a server farm, protection against malicious parties (e.g., spam, viruses, and other malware), and so on.
- the cloud 208 is included as a part of the strategy that pertains to software and hardware resources that are made available to the computing device 102 via the Internet or other networks.
- the gesture module 104 may be implemented in part on the computing device 102 as well as via a platform 210 that supports web services 212 .
- the gesture techniques supported by the gesture module may be detected using touchscreen functionality in the mobile configuration 202 , track pad functionality of the computer 204 configuration, detected by a camera as part of support of a natural user interface (NUI) that does not involve contact with a specific input device, and so on. Further, performance of the operations to detect and recognize the inputs to identify a particular gesture may be distributed throughout the system 200 , such as by the computing device 102 and/or the web services 212 supported by the platform 210 of the cloud 208 .
- any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations.
- the terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof.
- the module, functionality, or logic represents program code that performs specified tasks when executed on or by a processor (e.g., CPU or CPUs).
- the program code can be stored in one or more computer readable memory devices.
- a section entitled “Method/Gesture for Cross-Slide Relative to Panning Direction” describes a cross-slide gesture that can be executed relative to a panning direction in accordance with one or more embodiments.
- a section entitled “Method/Gesture for Re-arranging Items in a Pannable List” describes how items can be arranged and rearranged utilizing a cross-slide gesture in accordance with one or more embodiments.
- a section entitled “Detecting Cross-Slide Gestures” describes how cross-slide gestures can be detected in accordance with one or more embodiments.
- a section entitled “Combining Multiple Interactions” describes how multiple interactions can be combined in conjunction with cross-slide gestures in accordance with one or more embodiments.
- a section entitled “Direct Manipulation to Facilitate Threshold Discernability” describes how direct manipulation feedback can be provided to enable a user to become aware of various thresholds in accordance with one or more embodiments.
- a section entitled “Interaction Feedback” describes embodiments in which feedback can be provided to a user in accordance with one or more embodiments.
- a section entitled “Example Device” describes aspects of an example device that can be utilized to implement one or more embodiments.
- a cross slide gesture can be performed, to cause an object-related action to be performed, by dragging an item or object in a direction that is different from, e.g. orthogonal to, a scrolling or panning direction.
- computing device 302 includes a display device 308 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headed arrow 304 , and as suggested by scroll bar 305 .
- Display device 308 has displayed, thereon, multiple different objects or items 310 , 312 , 314 , 316 , 318 , 320 , 322 , 324 which are shown in their entireties, and partial objects or items 326 , 328 .
- a user can effect scrolling or panning in the horizontal direction by using a swipe gesture on the display device 308 in the horizontal direction.
- a user can cause an object-related action to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- a user's hand 306 a has touched over item 312 and moved it in a direction that is different from the scrolling or panning direction.
- in this example, the different direction is generally orthogonal to the scrolling or panning direction, i.e. in a downward direction.
- the object can be moved downward and upward or, more generally, bi-directionally, to access the same or different object-related actions.
- Any suitable type of object-related action can be performed.
- one type of object-related action can include, by way of example and not limitation, object selection.
- the selected item is directly manipulated, and visual feedback is provided to the user, who can observe the object move responsive to her engagement.
- the object-related action is performed without showing additional user interface elements, such as a button to enable a command selection.
- Other object-related actions can be performed such as object delete and other object manipulation actions.
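- this direct-manipulation feedback can be sketched as follows; the element wiring and the vertical-only tracking are illustrative assumptions for a horizontally pannable list.

```typescript
// A sketch of direct-manipulation feedback: the engaged element tracks
// the finger so the user sees the object move. Only the vertical
// (cross-slide) component is applied here, assuming a horizontally
// pannable list; the wiring is illustrative, not from the disclosure.
function enableCrossSlideFeedback(item: HTMLElement): void {
  let startY = 0;

  item.addEventListener("pointerdown", (e) => {
    startY = e.clientY;
    item.setPointerCapture(e.pointerId); // keep receiving move events
  });

  item.addEventListener("pointermove", (e) => {
    if (!item.hasPointerCapture(e.pointerId)) return;
    item.style.transform = `translateY(${e.clientY - startY}px)`;
  });

  item.addEventListener("pointerup", (e) => {
    item.releasePointerCapture(e.pointerId);
    item.style.transform = ""; // snap back; commit logic would go here
  });
}
```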
- computing device 402 includes a display device 408 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headed arrow 404 , and as suggested by scroll bar 405 .
- Display device 408 has displayed, thereon, multiple different objects or items 410 , 412 , 414 , 416 , 418 , 420 , 422 , 424 which are shown in their entireties.
- a user can effect scrolling or panning in the vertical direction by using a swipe gesture on the display device 408 in the vertical direction.
- a user can cause an object-related action to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- a user's hand 406 a has touched over item 412 and moved it in a direction that is different from the scrolling or panning direction.
- the different direction is generally orthogonal to the scrolling or panning direction.
- Any suitable type of object-related action can be performed, examples of which are provided below.
- one type of object-related action can include, by way of example and not limitation, object selection. It is to be appreciated and understood that functionality that is accessible through cross slide gestures can be accessed in connection with moving the object or item any suitable threshold distance to invoke the object-related action. In at least some embodiments, there may be no threshold distance to invoke the object-related action. In these instances, movement in a direction other than the pan or scroll direction may be used to invoke the object-related action.
- FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 500 detects a gestural slide input relative to a display device associated with a computing device.
- Step 502 ascertains whether the direction of the gestural slide input is different from a panning direction. If the direction is not different from the panning direction, step 504 pans content in the direction of the gestural slide input. If, on the other hand, the direction of the gestural slide input is different from the panning direction, step 506 performs an object-related action. Any suitable type of object-related action can be performed, examples of which are provided below.
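- a minimal sketch of this flow, assuming a dominant-axis test for step 502 and caller-supplied handlers for steps 504 and 506 (the disclosure does not prescribe either):

```typescript
// A sketch of the FIG. 5 flow. Step 502 is implemented as a dominant-axis
// test against the panning direction; the handler parameters stand in for
// steps 504 and 506 and are assumptions, not named in the disclosure.
type PanAxis = "horizontal" | "vertical";

function onGestureSlide(
  dx: number,
  dy: number,
  panAxis: PanAxis,
  pan: (dx: number, dy: number) => void, // step 504
  performObjectAction: () => void,       // step 506
): void {
  const alongPanAxis = panAxis === "horizontal"
    ? Math.abs(dx) >= Math.abs(dy)
    : Math.abs(dy) >= Math.abs(dx);
  if (alongPanAxis) {
    pan(dx, dy);           // same direction as panning: pan the content
  } else {
    performObjectAction(); // different direction: e.g. select the object
  }
}

// On a horizontally pannable list, a mostly vertical slide triggers the
// object-related action instead of panning:
onGestureSlide(4, 30, "horizontal",
  () => console.log("pan"),
  () => console.log("object-related action"));
```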
- a cross slide gesture can be performed effective to cause an object-related action, in the form of an object-rearrangement action, to be performed by dragging an item or object in a direction that is different from, e.g. orthogonal to or generally not in, the scrolling or panning direction.
- computing device 602 includes a display device 608 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headed arrow 604 , and as suggested by scroll bar 605 .
- Display device 608 has displayed, thereon, multiple different objects or items 610 , 612 , 614 , 616 , 618 , 620 , 622 , 624 which are shown in their entireties, and partial objects or items 626 , 628 .
- a user can effect scrolling or panning in the horizontal direction by using a swipe gesture on the display device 608 in the horizontal direction.
- a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- a user's hand 606 a has touched display device 608 over object 612 and dragged the object in a first direction that is generally orthogonal to the scrolling or panning direction, and then in a second direction toward the left bottom corner of display device 608 .
- the first direction is a generally vertical direction. Dragging the object in the first direction indicates to the gesture module that an object is to be rearranged.
- in the illustration of computing device 602 , the user's hand 606 a has dragged object 612 to its illustrated position and dropped it in place. Subsequently, the user's hand touched display device 608 over object 618 and dragged the object in a first direction that is generally orthogonal to the scrolling or panning direction, and then in a second direction toward the middle portion of the display device.
- the first direction is a generally vertical direction.
- computing device 702 includes a display device 708 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headed arrow 704 , and as suggested by scroll bar 705 .
- Display device 708 has displayed, thereon, multiple different objects or items 710 , 712 , 714 , 716 , 718 , 720 , 722 , 724 .
- a user can effect scrolling or panning in the vertical direction by using a swipe gesture on the display device 708 in the vertical direction.
- a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- a user's hand 706 a has touched display device 708 over object 712 and dragged the object in a direction that is generally orthogonal to the scrolling or panning direction.
- the direction is a generally horizontal direction. Dragging an object in this direction indicates to the gesture module that the object is to be rearranged.
- in the illustration of computing device 702 , the user's hand 706 a has dragged object 712 to its illustrated position and dropped it in place. Subsequently, the user's hand touched display device 708 over object 710 and dragged the object in a direction that is generally orthogonal to the scrolling or panning direction. Here, the direction is a generally horizontal direction. Once the user's hand is lifted from display device 708 , object 710 will be dropped in its illustrated place.
- FIG. 8 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 800 detects a drag direction associated with a drag operation relative to a display device associated with a computing device.
- Step 802 ascertains whether the drag direction is different from a panning direction. If the drag direction is not different from the panning direction, step 804 pans content in the dragging direction. If, on the other hand, the drag direction is different from the panning direction, step 806 performs an object-rearrangement action. Examples of how this can be done are provided above. In one or more embodiments, rearrangement can occur in any suitable direction.
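- the rearrangement action of step 806 might look like the following sketch; the array-based list model and index arguments are assumptions for illustration.

```typescript
// A sketch of the rearrangement action of step 806, assuming the pannable
// list is modeled as an array and the drop position is already known.
function rearrange<T>(items: T[], fromIndex: number, toIndex: number): T[] {
  const result = items.slice();
  const [moved] = result.splice(fromIndex, 1); // lift the dragged object out
  result.splice(toIndex, 0, moved);            // drop it at the new position
  return result;
}

// e.g. object 612 (index 1) dragged and dropped at the front of its list:
console.log(rearrange(["610", "612", "614", "616"], 1, 0));
// -> ["612", "610", "614", "616"]
```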
- Cross-slide gestures can be detected in any suitable way. As but one example of how cross-slide gestures can be detected, consider the following in connection with FIG. 9 . In one or more embodiments, to detect if a user is panning or cross-sliding, region detection logic can be applied as graphically illustrated in FIG. 9 .
- region detection can be employed to ascertain the outcome of the drag. For example, in a situation in which there is a drag into one of regions 902 , the content will be panned in a corresponding direction. However, a drag into one of regions 904 will be recognized as a cross-slide gesture and, accordingly, the functionality associated with the cross-slide gesture can be implemented.
- regions 902 and 904 are generally similarly sized. However, based on the scenario, certain actions can be prioritized by changing the entrance angle or range of angles, e.g. angles a and b, of the different regions. For example, by making angles a larger, thereby increasing their range (and angles b smaller, thereby decreasing their range), it is easier to start panning without accidentally performing a cross-slide gesture, and vice versa.
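- the region test of FIG. 9 can be sketched as an angle comparison against the panning axis; the 45-degree default split is an assumption, and widening or narrowing it biases the outcome toward panning or cross-sliding as described above.

```typescript
// A sketch of the FIG. 9 region test, assuming a horizontal panning axis.
// Angle `a` bounds the panning regions (902); drags outside it fall in the
// cross-slide regions (904). The 45-degree default is an assumption; the
// text notes the split can be biased either way.
function classifyDrag(
  dx: number,
  dy: number,
  panAngleDeg = 45, // widen to favor panning, narrow to favor cross-slide
): "pan" | "cross-slide" {
  const deg = Math.abs(Math.atan2(dy, dx)) * (180 / Math.PI);
  const fromPanAxis = Math.min(deg, 180 - deg); // angle from the pan axis
  return fromPanAxis <= panAngleDeg ? "pan" : "cross-slide";
}

console.log(classifyDrag(20, 3)); // -> "pan"
console.log(classifyDrag(3, 20)); // -> "cross-slide"
```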
- FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 1000 defines one or more regions associated with a panning gesture. Any suitable region geometry can be utilized, an example of which is provided above.
- Step 1002 defines one or more regions associated with a cross slide gesture. Again, any suitable region geometry can be utilized, an example of which is provided above.
- the region geometries are generally triangular in shape and converge at a point associated with a touch input. Other geometries can be utilized without departing from the spirit and scope of the claimed subject matter.
- Step 1004 detects a drag operation. This step can be performed by detecting gestural input in the form of a touch gesture, such as a swipe.
- Step 1006 ascertains an associated region within which the drag operation occurs. If, at step 1008 the region is associated with a panning gesture, step 1010 pans content in an associated direction. If, on the other hand, the region is not associated with a panning gesture, step 1012 performs an operation associated with a cross slide gesture. Any suitable object-related action can be performed, including, by way of example and not limitation, object selection, object deletion, object rearrangement, and the like.
- in one or more embodiments, a threshold can be utilized to lock into an object-related action, such as a drag threshold that allows locking into a drag direction.
- Any suitable type of threshold can be utilized including, by way of example and not limitation, distance thresholds, velocity thresholds, directionality thresholds, any combination of the aforementioned thresholds, as well as other thresholds.
- a combination of distance and velocity thresholds can be used to mitigate what might otherwise constitute an accidental or unintended action. For example, when a particular threshold is reached, the velocity of finger movement might be ascertained. If the velocity is below a particular threshold, then a drag action might be invoked. If it is above a particular threshold, then perhaps an object select action is performed.
- in one or more embodiments, box 906 is defined. While the user's finger is within box 906 or, alternatively, within the boundary of circle 900 , the corresponding gesture can be in an “undecided” state. Once the finger crosses outside the boundary of the box (or circle), a decision as to the gesture can be made. In practice, this can be handled in a couple of different ways. First, neither a pan operation nor cross-slide functionality can be implemented until the finger has crossed the boundary of box 906 . Alternately, both pan and cross-slide operations can be implemented simultaneously while the finger is within the boundary of box 906 . As soon as the finger crosses the boundary of the box, the operation associated with that particular region can be maintained, while the other operation can be canceled.
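- a sketch of that “undecided” state, assuming a square box and a horizontal panning axis; the box size is an illustrative tuning value.

```typescript
// A sketch of the "undecided" state around the touch point. While the
// finger stays inside box 906 no gesture is chosen; crossing the boundary
// locks the decision. A square box, a horizontal panning axis, and the
// 12-pixel half-width are all illustrative assumptions.
type GestureState = "undecided" | "panning" | "cross-sliding";

class CrossSlideRecognizer {
  private state: GestureState = "undecided";
  constructor(private readonly boxHalfWidth = 12) {}

  // dx/dy are the finger's total displacement from the touch-down point.
  update(dx: number, dy: number): GestureState {
    if (this.state !== "undecided") return this.state; // decision is locked
    if (Math.abs(dx) <= this.boxHalfWidth && Math.abs(dy) <= this.boxHalfWidth) {
      return this.state; // still inside box 906: remain undecided
    }
    // On exit, the dominant component picks the region (see FIG. 9).
    this.state = Math.abs(dx) >= Math.abs(dy) ? "panning" : "cross-sliding";
    return this.state;
  }
}
```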
- as an example, consider FIG. 11 . There, an object or item 1100 is shown. Various distances are shown and are indicated at 1102 and 1104 . The distances show the travel distance of object 1100 .
- the first distance 1102 is a threshold which, when passed, results in an action potentially being committed. In this particular example, passing this distance threshold while performing a drag operation causes object 1100 to be selected. To commit this action, the user would lift her finger, and the dragged object would slide back to its original position and change its state to be selected. The area beneath that corresponding to distance 1102 , before the threshold of distance 1104 is reached, can be treated as a buffer. Thus, releasing the object within this area will still result in object selection.
- once the object has been dragged past the threshold corresponding to distance 1104 , the next object-related action associated with the cross-slide gesture can be committed.
- the object-related action can break the object out of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction.
- if the object reaches line 1106 , this can trigger yet additional object-related actions. For example, crossing this line with the object might trigger additional visual feedback to make it clear to the user that the drag and drop threshold has been reached.
- a first threshold might be defined by the boundary of the illustrated circle within object 1100 , a second threshold by the distance 1102 , and a third threshold by the distance 1104 .
- movement outside of the first threshold can lock the associated object in the associated movement direction.
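- the release behavior around distances 1102 and 1104 can be sketched as follows; the pixel values are assumptions, since the disclosure does not fix particular distances.

```typescript
// A sketch of the release behavior for FIG. 11. Releasing past distance
// 1102 (but short of 1104) falls in the buffer and still selects; past
// 1104 the object breaks free for drag and drop. Pixel values are assumed.
const SELECT_THRESHOLD = 40;     // distance 1102 (assumed value)
const BREAK_OUT_THRESHOLD = 120; // distance 1104 (assumed value)

function actionOnRelease(
  crossSlideDistance: number,
): "none" | "select" | "drag-and-drop" {
  if (crossSlideDistance >= BREAK_OUT_THRESHOLD) return "drag-and-drop";
  if (crossSlideDistance >= SELECT_THRESHOLD) return "select"; // buffer zone
  return "none"; // released early: the object slides back, nothing commits
}
```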
- FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 1200 defines one or more thresholds. This step can be performed in any suitable way utilizing any suitable type of threshold. For example, in the embodiment described above, distance thresholds were employed. It is to be appreciated and understood, however, that other types of thresholds and/or combinations thereof can be utilized without departing from the spirit and scope of the claimed subject matter.
- the defined thresholds can be utilized in connection with a cross-slide gesture as described above.
- Step 1202 detects a cross-slide gesture. Examples of how this can be done are provided above.
- Step 1204 detects one or more threshold triggers. For example, once a user has touch-engaged an object, they can move the object in a particular direction. This step detects when the object has been moved sufficiently to trigger one or more thresholds. In embodiments where thresholds are defined in terms of distances, the step can be performed by detecting when an object has been moved a particular distance.
- Step 1206 detects whether a user action indicates that an object-related action is to be committed.
- This step can be performed in any suitable way.
- a user action might include lifting their finger off a particular object.
- if the user action indicates that an object-related action is not to be committed, step 1208 does not commit the object-related action.
- the user might terminate the cross-slide gesture in a particular way such that no action is to be committed.
- the cross-slide gesture can be reversible, i.e. if the user starts dragging an object down, then she can, at any time while still holding the object, slide it back to its original position. By doing this, no cross-slide actions will be taken.
- one or more thresholds might be crossed, without a user having yet indicated that the object-related action is to be committed.
- the method would continue to monitor for threshold triggers as by returning to step 1204 .
- if, on the other hand, the user action indicates that the object-related action is to be committed, step 1210 commits the object-related action associated with a last-triggered threshold.
- This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
- the multiple different directions used for cross-slide functionality can either result in the same object-related actions being performed, or different object-related actions being performed. For example, object selection might occur when an object is dragged downward, while a drag and drop action might be performed when the object is dragged upward.
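- a sketch of such a direction-dependent mapping, using the example mapping from the text (the mapping itself is a design choice, not a fixed rule):

```typescript
// A sketch of a direction-dependent mapping on the cross-slide axis,
// using the example from the text: down selects, up starts drag and drop.
function actionForCrossSlide(dy: number): "select" | "drag-and-drop" {
  return dy > 0 ? "select" : "drag-and-drop"; // screen y grows downward
}
```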
- direct manipulation can provide visual feedback so that a user can visually observe an object move and, in accordance with object movement, can be provided with visual affordances to facilitate threshold discernability.
- Any suitable type of visual affordance can be employed including, by way of example and not limitation, tool tips, icons, glyphs, and the like.
- so-called speed bumps can be used to provide a user with an understanding or awareness of the various thresholds that might be present. As an example, consider FIG. 13 .
- an object or item 1300 is shown.
- Various distances are shown and are indicated at 1302 , 1304 , and 1306 .
- the distances show the travel distance of object 1300 or distances through which the object can travel.
- the first distance 1302 is a threshold which, when passed, results in an action potentially being committed.
- passing this distance threshold while performing a drag operation causes object 1300 to be selected.
- the user would lift her finger, and the dragged object would slide back to its original position and change its state to be selected.
- the area beneath that corresponding to distance 1302 , before the region corresponding to distance 1306 is reached, can be treated as a buffer. Thus, releasing the object within this area will still result in object selection.
- Distance 1306 corresponds to a speed bump region. Movement of object 1300 within the speed bump region is slower than movement of the finger. This presents a visual cue or indication that a new threshold is about to be reached, thus making it easier for the user to commit a particular action without accidentally moving into and over a next distance threshold. For example, within a speed bump region, a user may drag her finger 50 pixels in length, while the corresponding object may move five pixels in distance. Releasing the object within this speed bump region will result in an associated action being committed. In this example, the associated action is an object selection.
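- the speed bump's damped tracking can be sketched as a piecewise mapping from finger travel to object travel; the region bounds are assumptions, and the 0.1 factor mirrors the 50-pixel-finger/5-pixel-object example above.

```typescript
// A sketch of a speed bump as a piecewise mapping from finger travel to
// object travel. Inside the bump the object moves at a fraction of finger
// speed (0.1 mirrors the 50-pixel finger / 5-pixel object example); the
// region bounds are assumed values.
const BUMP_START = 40;   // where the speed bump region begins
const BUMP_END = 90;     // where it ends
const BUMP_FACTOR = 0.1; // object speed relative to finger speed in-region

function objectOffsetForFinger(fingerDistance: number): number {
  if (fingerDistance <= BUMP_START) return fingerDistance; // 1:1 tracking
  const inBump = Math.min(fingerDistance, BUMP_END) - BUMP_START;
  const pastBump = Math.max(fingerDistance - BUMP_END, 0);
  return BUMP_START + inBump * BUMP_FACTOR + pastBump; // damped, then 1:1
}

console.log(objectOffsetForFinger(90)); // -> 45: 50 px of finger motion
// inside the bump moved the object only 5 px
```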
- the object-related action can break the object out of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction.
- if the object reaches line 1308 , this can trigger yet additional object-related actions. For example, crossing this line with the object might trigger additional visual feedback to make it clear to the user that the drag and drop threshold has been reached.
- multiple speed bumps can be utilized in connection with the distance thresholds.
- any suitable number of distance thresholds and speed bumps can be employed and can be associated with object related actions.
- other visual indicia can be utilized to indicate thresholds or threshold changes. For example, while dragging an object, one or more lines can be rendered to indicate thresholds and thus, the distance that an object should be dragged to commit different actions. Visuals can also be drawn on the object itself as it gets closer to or crosses a threshold.
- FIG. 14 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 1400 defines one or more distance thresholds including one or more speed bumps.
- This step can be performed in any suitable way.
- the defined thresholds and speed bumps can be utilized in connection with a cross-slide gesture as described above.
- Step 1402 detects a cross-slide gesture. Examples of how this can be done are provided above.
- Step 1404 detects a speed bump crossing. For example, once a user has touch-engaged an object, they can move the object in a particular direction. This step detects when the object has been moved sufficiently to cross a boundary associated with a speed bump.
- Step 1406 modifies a user experience within the speed bump region.
- Any suitable modification of the user experience can be provided.
- modification of the user experience can entail modifying the user's visual experience.
- the user's finger may move faster than the underlying object.
- other experience modifications can take place including, by way of example and not limitation, providing audible or haptic feedback to indicate presence within a particular speed bump region.
- Step 1408 detects whether a user action indicates that an object-related action is to be committed.
- This step can be performed in any suitable way.
- a user action might include lifting their finger off a particular object.
- if the user action indicates that an object-related action is not to be committed, step 1410 does not commit the object-related action.
- the user might terminate the cross-slide gesture in a particular way such that no action is to be committed.
- the cross-slide gesture can be reversible, i.e. if the user starts dragging an object down, then she can, at any time while still holding the object, slide it back to its original position. By doing this, no cross-slide actions will be taken.
- if, on the other hand, the user action indicates that the object-related action is to be committed, step 1412 commits the object-related action associated with a last-crossed threshold. This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
- the multiple different directions used for cross-slide functionality can either result in the same object-related actions being performed, or different object-related actions being performed. For example, object selection might occur when an object is dragged downward, while a drag and drop action might be performed when the object is dragged upward.
- visual feedback can be provided to a user to inform the user of a particular object-related action that will be committed, responsive to the detected cross-slide gesture.
- visual indicia can be provided to inform the user of a particular action that will be committed by releasing the object.
- visual indicia can further be provided on a particular object-related action that might be next if object dragging continues.
- in FIG. 15 , an object in the form of a picture is shown generally at 1500 .
- a user has touched the object to initiate a drag operation.
- visual indicia 1504 can be presented as by beginning to emerge from underneath the picture.
- the visual indicia resides in the form of a check box that gradually emerges from underneath the object.
- Any suitable type of visual indicia can be utilized without departing from the spirit and scope of the claimed subject matter.
- the visual indicia might be presented in the form of a line, beneath the picture, and to which the picture is to be dragged to commit a particular action.
- in this example, the object-related action comprises an object selection.
- the fully exposed check box can indicate that the action is completed.
- FIG. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method can be performed by a suitably-configured gesture module, such as the one described above.
- Step 1600 detects a drag operation associated with an object. Examples of how this can be done are provided above. For example, the drag operation can be detected in conjunction with a cross-slide gesture. Step 1602 presents a partial portion of visual indicia associated with committing an object-related action. Examples of how this can be done are provided above. Step 1604 ascertains whether the drag operation continues. If the drag operation has not continued, the method can, in the event that the user has not terminated the drag operation, return to step 1602 . In the event that the user has terminated the drag operation, as by returning the object to its original position, the method can terminate. On the other hand, if the drag operation continues, step 1606 ascertains whether a distance threshold associated with an object-related action has been reached.
- if the distance threshold has been reached, step 1608 presents a complete visual indicia associated with committing the object-related action. By doing so, the visual indicia can inform the user, visually, that the object-related action can be committed as by the user removing her finger from the object.
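- a sketch of this progressive reveal, assuming the check box is a DOM element whose opacity tracks drag progress (the disclosure does not prescribe a representation):

```typescript
// A sketch of the progressive reveal, assuming the check box is a DOM
// element whose opacity tracks progress toward the distance threshold.
function updateIndicia(
  checkBox: HTMLElement,
  dragDistance: number,
  threshold: number,
): void {
  // Steps 1602/1606: partial reveal in proportion to drag progress.
  const progress = Math.min(dragDistance / threshold, 1);
  checkBox.style.opacity = String(progress);
  if (progress === 1) {
    // Step 1608: the indicia is complete; releasing the object now
    // commits the action (object selection in the FIG. 15 example).
    checkBox.dataset.state = "ready-to-commit";
  }
}
```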
- FIG. 17 illustrates various components of an example device 1700 that can be implemented as any type of portable and/or computer device as described with reference to FIGS. 1 and 2 to implement embodiments of the gesture techniques described herein.
- Device 1700 includes communication devices 1702 that enable wired and/or wireless communication of device data 1704 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.).
- the device data 1704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device.
- Media content stored on device 1700 can include any type of audio, video, and/or image data.
- Device 1700 includes one or more data inputs 1706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
- Device 1700 also includes communication interfaces 1708 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface.
- the communication interfaces 1708 provide a connection and/or communication links between device 1700 and a communication network by which other electronic, computing, and communication devices communicate data with device 1700 .
- Device 1700 includes one or more processors 1710 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of device 1700 and to implement the gesture embodiments described above.
- device 1700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 1712 .
- device 1700 can include a system bus or data transfer system that couples the various components within the device.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- Device 1700 also includes computer-readable media 1714 , such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device.
- a disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like.
- Device 1700 can also include a mass storage media device 1716 .
- Computer-readable media 1714 provides data storage mechanisms to store the device data 1704 , as well as various device applications 1718 and any other types of information and/or data related to operational aspects of device 1700 .
- an operating system 1720 can be maintained as a computer application within the computer-readable media 1714 and executed on processors 1710 .
- the device applications 1718 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.), as well as other applications that can include web browsers, image processing applications, communication applications such as instant messaging applications, word processing applications, and a variety of other different applications.
- the device applications 1718 also include any system components or modules to implement embodiments of the gesture techniques described herein.
- the device applications 1718 include an interface application 1722 and a gesture-capture driver 1724 that are shown as software modules and/or computer applications.
- the gesture-capture driver 1724 is representative of software that is used to provide an interface with a device configured to capture a gesture, such as a touchscreen, track pad, camera, and so on.
- the interface application 1722 and the gesture-capture driver 1724 can be implemented as hardware, software, firmware, or any combination thereof.
- Device 1700 also includes an audio and/or video input-output system 1726 that provides audio data to an audio system 1728 and/or provides video data to a display system 1730 .
- the audio system 1728 and/or the display system 1730 can include any devices that process, display, and/or otherwise render audio, video, and image data.
- Video signals and audio signals can be communicated from device 1700 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link.
- the audio system 1728 and/or the display system 1730 are implemented as external components to device 1700 .
- the audio system 1728 and/or the display system 1730 are implemented as integrated components of example device 1700 .
- cross slide gestures for touch displays are described.
- cross slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- a cross slide gesture can be performed by dragging an item or object in a direction that is different from a scrolling direction.
- the different-direction drag can be mapped to additional actions or functionality.
- one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality.
- so-called speed bumps can be used to provide a user with an understanding or awareness of the thresholds.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Telephone Function (AREA)
Abstract
Cross slide gestures for touch displays are described. In at least some embodiments, cross slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like. In one or more embodiments, a cross slide gesture can be performed by dragging an item or object in a direction that is different from a scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality.
Description
- This application is a continuation of and claims priority under 35 U.S.C. §120 to U.S. patent application Ser. No. 13/196,272, filed on Aug. 2, 2011, the disclosure of which is incorporated by reference herein in its entirety.
- One of the challenges that continues to face designers of devices having user-engageable displays, such as touch displays, pertains to providing enhanced functionality for users, through gestures that can be employed with the devices. This is so, not only with devices having larger or multiple screens, but also in the context of devices having a smaller footprint, such as tablet PCs, hand-held devices, smaller multi-screen devices and the like.
- One challenge with gesture-based input is that of providing secondary actions. For example, in touch interfaces today, it is common to tap on an item to launch the item. This makes it difficult to provide secondary functionality such as an ability to select items. Further, certain challenges exist with so-called pannable surfaces, i.e. surfaces that can be panned and have their content moved. For example, a pannable surface typically reacts to a finger drag and moves the content in the direction of the user's finger. If the surface contains objects that a user might want to re-arrange, it is difficult to differentiate when the user wants to pan the surface or re-arrange the content.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Cross slide gestures for touch displays are described. In at least some embodiments, cross slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- In one or more embodiments, a cross slide gesture can be performed by dragging an item or object in a direction that is different from a panning or scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality.
- In at least some embodiments, so-called speed bumps, or other perceptible indicia such as visual indicia, can be used to provide a user with an understanding or awareness of the thresholds.
- The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
-
FIG. 1 is an illustration of an environment in an example implementation in accordance with one or more embodiments. -
FIG. 2 is an illustration of a system in an example implementation showingFIG. 1 in greater detail. -
FIG. 3 illustrates an example computing device in accordance with one or more embodiments. -
FIG. 4 illustrates an example computing device in accordance with one or more embodiments. -
FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 6 illustrates an example computing device in accordance with one or more embodiments. -
FIG. 7 illustrates an example computing device in accordance with one or more embodiments. -
FIG. 8 is a flow diagram that describes the steps in a method in accordance with one or more embodiments. -
FIG. 9 illustrates a cross-slide detection example in accordance with one or more embodiments. -
FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 11 illustrates distance thresholds in accordance with one or more embodiments. -
FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 13 illustrates distance thresholds in accordance with one or more embodiments. -
FIG. 14 is a flow diagram that describes the steps in a method in accordance with one or more embodiments. -
FIG. 15 illustrates a cross-slide gesture in accordance with one or more embodiments. -
FIG. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 17 illustrates an example computing device that can be utilized to implement various embodiments described herein.
- Overview
- Cross slide gestures for touch displays are described. In at least some embodiments, cross slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- In one or more embodiments, a cross slide gesture can be performed by dragging an item or object in a direction that is different, e.g. orthogonal, from a panning or scrolling direction. Dragging can be performed via a touch-related drag (e.g., with a finger, stylus, or pen), via a mouse/trackpad drag, and the like. In the examples described in this document, touch-related dragging is used. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality. For example, in the context of a horizontally-scrollable list, dragging an object vertically a short distance and releasing it may mark the object as selected, while dragging the object a larger distance vertically may break the object free from its associated list so that it can be dropped somewhere else.
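By way of a non-authoritative illustration, the distance-threshold idea above can be sketched as a small lookup from drag distance to action. The following TypeScript fragment is a sketch under stated assumptions — the threshold values, the action names, and the function itself are invented for illustration and are not taken from the described embodiments:

```typescript
// Hypothetical mapping from cross-slide drag distance to the action that
// would be committed on release. Values and names are illustrative only.
type CrossSlideAction = "none" | "select" | "breakFree";

const SELECT_THRESHOLD = 40;      // assumed: a short vertical drag selects
const BREAK_FREE_THRESHOLD = 120; // assumed: a longer drag breaks the object free

function actionForDistance(crossDistance: number): CrossSlideAction {
  if (crossDistance >= BREAK_FREE_THRESHOLD) return "breakFree";
  if (crossDistance >= SELECT_THRESHOLD) return "select";
  return "none";
}
```

Under these assumptions, releasing between the two thresholds still selects, mirroring the buffer behavior described later in this document.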
- In at least some embodiments, so-called speed bumps, or other perceptible indicia such as visual indicia, can be used to provide a user with an understanding or awareness of the thresholds.
- Various embodiments described herein enable an item to be dragged without necessarily entering a mode. A mode can be thought of as an action that is initiated by a user that is not necessarily related to manipulating an item directly. For example, a mode can be entered by clicking on a particular user interface button to then be exposed to functionality that can be performed relative to an item or object. In the described embodiments, modes can be avoided by eliminating, in at least some instances, user interface elements to access drag functionality.
- In the following discussion, an example environment is first described that is operable to employ the gesture techniques described herein. Example illustrations of the gestures and procedures are then described, which may be employed in the example environment, as well as in other environments. Accordingly, the example environment is not limited to performing the example gestures and the gestures are not limited to implementation in the example environment.
- Example Environment
-
FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ cross-slide gestures as described herein. The illustrated environment 100 includes an example of a computing device 102 that may be configured in a variety of ways. For example, the computing device 102 may be configured as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile station, an entertainment appliance, a set-top box communicatively coupled to a television, a wireless phone, a netbook, a game console, a handheld device, and so forth as further described in relation to FIG. 2. Thus, the computing device 102 may range from full resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to a low-resource device with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles). The computing device 102 also includes software that causes the computing device 102 to perform one or more operations as described below. -
Computing device 102 includes a gesture module 104 that is operational to provide gesture functionality as described in this document. The gesture module can be implemented in connection with any suitable type of hardware, software, firmware or combination thereof. In at least some embodiments, the gesture module is implemented in software that resides on some type of tangible, computer-readable storage medium, examples of which are provided below. -
Gesture module 104 is representative of functionality that recognizes gestures, including cross-slide gestures that can be performed by one or more fingers, and causes operations to be performed that correspond to the gestures. The gestures may be recognized by module 104 in a variety of different ways. For example, the gesture module 104 may be configured to recognize a touch input, such as a finger of a user's hand 106a as proximal to display device 108 of the computing device 102 using touchscreen functionality. In particular, gesture module 104 can recognize cross slide gestures that can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- For instance, in the illustrated example, a pan or scroll direction is shown as being in the vertical direction, as indicated by the arrows. In one or more embodiments, a cross slide gesture can be performed by dragging an item or object in a direction that is different, e.g. orthogonal, from the panning or scrolling direction. The different-direction drag can be mapped to additional actions or functionality. With respect to whether a direction is vertical or horizontal, a vertical direction can be considered, in at least some instances, as a direction that is generally parallel to one side of a display device, and a horizontal direction can be considered as a direction that is generally orthogonal to the vertical direction. Hence, while the orientation of a computing device may change, the verticality or horizontality of a particular cross slide gesture can remain standard as defined relative to and along the display device.
- For example, a finger of the user's hand 106a is illustrated as selecting 110 an image 112 displayed by the display device 108. Selection 110 of the image 112 and subsequent movement of the finger of the user's hand 106a in a direction that is different from the pan or scroll direction, e.g., generally orthogonal relative to the pan or scroll direction, may be recognized by the gesture module 104. The gesture module 104 may then identify this recognized movement, by the nature and character of the movement, as indicating a "drag and drop" operation to change a location of the image 112 to a point in the display at which the finger of the user's hand 106a is lifted away from the display device 108. Thus, recognition of the touch input that describes selection of the image, movement of the selection point to another location, and then lifting of the finger of the user's hand 106a may be used to identify a gesture (e.g., drag-and-drop gesture) that is to initiate the drag-and-drop operation.
- Although cross-slide gestures are primarily discussed in this document, it is to be appreciated and understood that a variety of different types of gestures may be recognized by the gesture module 104 including, by way of example and not limitation, gestures that are recognized from a single type of input (e.g., touch gestures such as the previously described drag-and-drop gesture) as well as gestures involving multiple types of inputs. For example, module 104 can be utilized to recognize single-finger gestures and bezel gestures, multiple-finger/same-hand gestures and bezel gestures, and/or multiple-finger/different-hand gestures and bezel gestures.
- For example, the computing device 102 may be configured to detect and differentiate between a touch input (e.g., provided by one or more fingers of the user's hand 106a) and a stylus input (e.g., provided by a stylus 116). The differentiation may be performed in a variety of ways, such as by detecting an amount of the display device 108 that is contacted by the finger of the user's hand 106 versus an amount of the display device 108 that is contacted by the stylus 116.
- Thus, the gesture module 104 may support a variety of different gesture techniques through recognition and leverage of a division between stylus and touch inputs, as well as different types of touch inputs. -
FIG. 2 illustrates an example system showing the gesture module 104 as being implemented in an environment where multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device is a "cloud" server farm, which comprises one or more server computers that are connected to the multiple devices through a network or the Internet or other means.
- In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to the user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a "class" of target device is created and experiences are tailored to the generic class of devices. A class of device may be defined by physical features or usage or other common characteristics of the devices. For example, as previously described the computing device 102 may be configured in a variety of different ways, such as for mobile 202, computer 204, and television 206 uses. Each of these configurations has a generally corresponding screen size and thus the computing device 102 may be configured as one of these device classes in this example system 200. For instance, the computing device 102 may assume the mobile 202 class of device which includes mobile telephones, music players, game devices, and so on. The computing device 102 may also assume a computer 204 class of device that includes personal computers, laptop computers, netbooks, and so on. The television 206 configuration includes configurations of device that involve display in a casual environment, e.g., televisions, set-top boxes, game consoles, and so on. Thus, the techniques described herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples described in the following sections.
- Cloud 208 is illustrated as including a platform 210 for web services 212. The platform 210 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 208 and thus may act as a "cloud operating system." For example, the platform 210 may abstract resources to connect the computing device 102 with other computing devices. The platform 210 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the web services 212 that are implemented via the platform 210. A variety of other examples are also contemplated, such as load balancing of servers in a server farm, protection against malicious parties (e.g., spam, viruses, and other malware), and so on.
- Thus, the cloud 208 is included as a part of the strategy that pertains to software and hardware resources that are made available to the computing device 102 via the Internet or other networks. For example, the gesture module 104 may be implemented in part on the computing device 102 as well as via a platform 210 that supports web services 212.
- For example, the gesture techniques supported by the gesture module may be detected using touchscreen functionality in the mobile configuration 202, track pad functionality of the computer 204 configuration, detected by a camera as part of support of a natural user interface (NUI) that does not involve contact with a specific input device, and so on. Further, performance of the operations to detect and recognize the inputs to identify a particular gesture may be distributed throughout the system 200, such as by the computing device 102 and/or the web services 212 supported by the platform 210 of the cloud 208.
- Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms "module," "functionality," and "logic" as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on or by a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the gesture techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- In the discussion that follows, various sections describe example cross-slide gestures including re-arrange gestures. A section entitled "Method/Gesture for Cross-Slide Relative to Panning Direction" describes a cross-slide gesture that can be executed relative to a panning direction in accordance with one or more embodiments. Next, a section entitled "Method/Gesture for Re-arranging Items in a Pannable List" describes how items can be arranged and rearranged utilizing a cross-slide gesture in accordance with one or more embodiments. Following this, a section entitled "Detecting Cross-Slide Gestures" describes how cross-slide gestures can be detected in accordance with one or more embodiments. Next, a section entitled "Combining Multiple Interactions" describes how multiple interactions can be combined in conjunction with cross-slide gestures in accordance with one or more embodiments. Following this, a section entitled "Direct Manipulation to Facilitate Threshold Discernability" describes how direct manipulation feedback can be provided to enable a user to become aware of various thresholds in accordance with one or more embodiments. Next, a section entitled "Interaction Feedback" describes embodiments in which feedback can be provided to a user in accordance with one or more embodiments. Last, a section entitled "Example Device" describes aspects of an example device that can be utilized to implement one or more embodiments.
- Method/Gesture for Cross-Slide Relative to Panning Direction
- In one or more embodiments, a cross slide gesture can be performed for causing an object-related action to be performed by dragging an item or object in a direction that is different, e.g. orthogonal, from a scrolling or panning direction.
- As an example, consider
FIG. 3 which illustrates an environment 300 in accordance with one or more embodiments. Here, computing device 302 includes a display device 308 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headed arrow 304, and as suggested by scroll bar 305. Display device 308 has displayed, thereon, multiple different objects or items which can be panned or scrolled on display device 308 in the horizontal direction. Alternately, a user can cause an object-related action to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- As an example, consider the bottom-most illustration of computing device 302. There, a user's hand 306a has touched over item 312 and moved it in a direction that is different from the scrolling or panning direction. In this particular example, the different direction is generally orthogonal to the scrolling or panning direction, in a downward direction. It is to be appreciated and understood that, in at least some embodiments, the object can be moved downward and upward or, more generally, bi-directionally, to access the same or different object-related actions. Any suitable type of object-related action can be performed. For example, one type of object-related action can include, by way of example and not limitation, object selection. Notice, in this example, that the selected item is directly manipulated and visual feedback is provided to the user by being able to observe the object move responsive to the user's engagement. Notice also that, in this embodiment and the ones described below, the object-related action is performed without showing additional user interface elements, such as a button to enable a command selection. Other object-related actions can be performed, such as object deletion and other object manipulation actions.
- As another example, consider FIG. 4 which illustrates an environment 400 in accordance with one or more embodiments. Here, computing device 402 includes a display device 408 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headed arrow 404, and as suggested by scroll bar 405. Display device 408 has displayed, thereon, multiple different objects or items which can be panned or scrolled on display device 408 in the vertical direction. Alternately, a user can cause an object-related action to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction.
- As an example, consider the bottom-most illustration of computing device 402. There, a user's hand 406a has touched over item 412 and moved it in a direction that is different from the scrolling or panning direction. In this particular example, the different direction is generally orthogonal to the scrolling or panning direction. Any suitable type of object-related action can be performed, examples of which are provided below. For example, one type of object-related action can include, by way of example and not limitation, object selection. It is to be appreciated and understood that functionality that is accessible through cross slide gestures can be accessed in connection with moving the object or item any suitable threshold distance to invoke the object-related action. In at least some embodiments, there may be no threshold distance to invoke the object-related action. In these instances, movement in a direction other than the pan or scroll direction may be used to invoke the object-related action. -
FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. - Step 500 detects a gestural slide input relative to a display device associated with a computing device. Step 502 ascertains whether the direction of the gestural slide input is different from a panning direction. If the direction is not different from the panning direction, step 504 pans content in the direction of the gestural slide input. If, on the other hand, the direction of the gestural slide input is different from the panning direction,
step 506 performs an object-related action. Any suitable type of object-related action can be performed, examples of which are provided below. - Method/Gesture for Re-Arranging Items in a Pannable List
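A minimal sketch of this decision, assuming TypeScript, a horizontal panning axis, and a simple displacement vector for the slide input; the helper names (`panContent`, `performObjectRelatedAction`) are hypothetical stand-ins for host-application behavior, not functions from the described embodiments:

```typescript
// Decide whether a gestural slide input pans content (step 504) or
// performs an object-related action (step 506), per FIG. 5.
interface SlideInput { dx: number; dy: number; }

// Assumed hooks into the host application, declared for self-containment.
declare function panContent(dx: number): void;
declare function performObjectRelatedAction(): void;

function handleSlide(input: SlideInput): void {
  // A mostly-horizontal drag matches the panning direction; a mostly-
  // vertical (generally orthogonal) drag is treated as a cross slide.
  const isPanning = Math.abs(input.dx) >= Math.abs(input.dy);
  if (isPanning) {
    panContent(input.dx);           // step 504: pan in the drag direction
  } else {
    performObjectRelatedAction();   // step 506: e.g., object selection
  }
}
```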
- In one or more embodiments, a cross slide gesture can be performed effective to cause an object-related action, in the form of an object-rearrangement action, to be performed by dragging an item or object in a direction that is different, e.g. orthogonal to or generally not in the direction associated with a scrolling or panning direction.
- As an example, consider
FIG. 6 which illustrates anenvironment 600 in accordance with one or more embodiments. Here,computing device 602 includes adisplay device 608 whose content can be scrolled or panned in the horizontal direction, as indicated by the double-headedarrow 604, and as suggested byscroll bar 605.Display device 608 has displayed, thereon, multiple different objects oritems items display device 608 in the horizontal direction. Alternately, a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction. For example, in the topmost illustration, a user'shand 606 a has toucheddisplay device 608 overobject 612 and dragged the object in a first direction that is generally orthogonal to the scrolling or panning direction, and then in a second direction toward the left bottom corner ofdisplay device 608. Here, the first direction is a generally vertical direction. Dragging the object in the first direction indicates to the gesture module that an object is to be rearranged. - Consider now the bottom-most illustration of
computing device 602. There, the user'shand 606 a has draggedobject 612 to its illustrated position and dropped it in place. Subsequently, the user's hand toucheddisplay device 608 overobject 618 and dragged the object in a first direction that is generally orthogonal to the scrolling or pending direction, and then in a second direction toward the middle portion of the display device. Here, the first direction is a generally vertical direction. Once the user's hand is lifted from the toucheddisplay device 608,object 618 will be dropped in its illustrated place. - As another example, consider
FIG. 7 which illustrates anenvironment 700 in accordance with one or more embodiments. Here,computing device 702 includes adisplay device 708 whose content can be scrolled or panned in the vertical direction, as indicated by the double-headedarrow 704, and as suggested byscroll bar 705.Display device 708 has displayed, thereon, multiple different objects oritems display device 708 in the vertical direction. Alternately, a user can cause an object-related action, in the form of a rearrangement action, to be performed by performing a cross slide gesture, relative to one of the objects or items, in a direction that is different from the scrolling or panning direction. For example, in the topmost illustration, a user'shand 706 a has toucheddisplay device 708 overobject 712 and dragged the object in a direction that is generally orthogonal to the scrolling or panning direction. Here, the direction is a generally horizontal direction. Dragging an object in this direction indicates to the gesture module that the object is to be rearranged. - Consider now the bottom-most illustration of
computing device 702. There, the user'shand 706 a has draggedobject 712 to its illustrated position and dropped it in place. Subsequently, the user's hand toucheddisplay device 708 overobject 710 and dragged the object in a direction that is generally orthogonal to the scrolling or pending direction. Here, the direction is a generally horizontal direction. Once the user's hand is lifted from the toucheddisplay device 708,object 710 will be dropped in its illustrated place. -
FIG. 8 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. - Step 800 detects a drag direction associated with a drag operation relative to a display device associated with a computing device. Step 802 ascertains whether the drag direction is different from a panning direction. If the drag direction is not different from the panning direction, step 804 pans content in the dragging direction. If, on the other hand, the drag direction is different from the panning direction,
step 806 performs an object-rearrangement action. Examples of how this can be done are provided above. In one or more embodiments, rearrangement can occur in any suitable direction. - Detecting Cross-slide Gestures
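As an illustrative sketch only — the array model and function below are assumptions, not the described implementation — the rearrangement action of step 806 can be pictured as removing the dragged item from its list and reinserting it at the drop position:

```typescript
// Move an item within a pannable list, per the rearrangement action of
// step 806. The list is modeled here as a plain array for illustration.
function rearrange<T>(items: T[], fromIndex: number, toIndex: number): T[] {
  const result = items.slice();                // copy; leave the input intact
  const [moved] = result.splice(fromIndex, 1); // lift the dragged object out
  result.splice(toIndex, 0, moved);            // drop it at its new position
  return result;
}

// Example: dragging the first item and dropping it in the third slot.
rearrange(["a", "b", "c", "d"], 0, 2); // ["b", "c", "a", "d"]
```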
- Cross-slide gestures can be detected in any suitable way. As but one example of how cross-slide gestures can be detected, consider the following in connection with
FIG. 9 . In one or more embodiments, to detect if a user is panning or cross-sliding, region detection logic can be applied as graphically illustrated inFIG. 9 . - In this example, consider that the user has displayed a horizontally pannable list of items. When the user puts her finger down on an object, as within the illustrated
circle 900, and starts to drag her finger outside the boundary of the circle, region detection can be employed to ascertain the outcome of the drag. For example, in a situation in which there is a drag into one ofregion 902, the content will be panned in a corresponding direction. However, a drag into one ofregion 904 will be recognized as a cross-slide gesture and, accordingly, the functionality associated with the cross-slide gesture can be implemented. - In the illustrated example,
regions -
FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. -
Step 1000 defines one or more regions associated with a panning gesture. Any suitable region geometry can be utilized, an example of which is provided above. Step 1002 defines one or more regions associated with a cross slide gesture. Again, any suitable region geometry can be utilized, an example of which is provided above. In the FIG. 9 example, the region geometries are generally triangular in shape and converge at a point associated with a touch input. Other geometries can be utilized without departing from the spirit and scope of the claimed subject matter. -
Step 1004 detects a drag operation. This step can be performed by detecting gestural input in the form of a touch gesture, such as a swipe. Step 1006 ascertains an associated region within which the drag operation occurs. If, at step 1008, the region is associated with a panning gesture, step 1010 pans content in an associated direction. If, on the other hand, the region is not associated with a panning gesture, step 1012 performs an operation associated with a cross slide gesture. Any suitable object-related action can be performed, including, by way of example and not limitation, object selection, object deletion, object rearrangement, and the like.
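One plausible realization of this region test is a sketch under the assumptions that the list pans horizontally, that the undecided circle has a fixed radius, and that the regions are separated at 45 degrees; none of these values are specified by the embodiments above:

```typescript
// Classify a drag vector, per FIG. 9 and FIG. 10, as panning, cross-slide,
// or still undecided inside the initial touch circle.
type DragRegion = "undecided" | "pan" | "crossSlide";

const DEAD_ZONE_RADIUS = 10;      // assumed radius of circle 900, in pixels
const CROSS_SLIDE_MIN_ANGLE = 45; // assumed region boundary, in degrees

function classifyDrag(dx: number, dy: number): DragRegion {
  if (Math.hypot(dx, dy) < DEAD_ZONE_RADIUS) return "undecided";

  // Angle between the drag vector and the horizontal (panning) axis.
  const degrees = Math.abs(Math.atan2(dy, dx)) * (180 / Math.PI);
  const fromPanAxis = Math.min(degrees, 180 - degrees); // fold into [0, 90]

  // Shallow drags fall into panning regions 902; steep drags into
  // cross-slide regions 904.
  return fromPanAxis < CROSS_SLIDE_MIN_ANGLE ? "pan" : "crossSlide";
}
```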
- Combining Multiple Interactions
- In some cases, it can be desirable to have a threshold that can be utilized to lock into an object-related action, such as a drag threshold that allows locking into a drag direction. Any suitable type of threshold can be utilized including, by way of example and not limitation, distance thresholds, velocity thresholds, directionality thresholds, any combination of the aforementioned thresholds, as well as other thresholds. For example, a combination of distance and velocity threshold can be used to mitigate what might otherwise constitute an accidental or unintended action. For example, when a particular threshold is reached, the velocity of finger movement might be ascertained. If the velocity is below a particular threshold, then a drag action might be invoked. If it is above a particular threshold, then perhaps an object select action is performed.
- This makes it possible for the user to be a less precise at the beginning of their gesture. For example, returning to the
FIG. 9 example, notice thatbox 906 is defined. While the user's finger is withinbox 906 or, alternatively, within the boundary ofcircle 900, the corresponding gesture can be in an “undecided” state. Once the finger crosses outside the boundary of the box (or circle), a decision as to the gesture can be made. In practice, this can be handled in a couple of different ways. First, neither a pan operation nor cross-slide functionality can be implemented until the finger has crossed the boundary ofbox 906. Alternately, both pan and cross-slide operations can be implemented simultaneously while the finger is within the boundary ofbox 906. As soon as the finger crosses the boundary of the box, the operation associated with that particular region can be maintained, while the other operation can be canceled. - Once a cross-slide gesture has been detected, different thresholds can be utilized to implement object-related actions. As an example, consider
FIG. 11 . There, an object oritem 1100 is shown. Various distances are shown and are indicated at 1102 and 1104. The distances show the travel distance ofobject 1100. In one or more embodiments, thefirst distance 1102 is a threshold which, when passed, results in an action potentially being committed. In this particular example, passing this distance threshold while performing a drag operation causesobject 1100 to be selected. To commit this action, the user would lift her finger, and the dragged object would slide back to its original position and change its state to be selected. The area beneath that corresponding to distance 1102 before the threshold ofdistance 1104 is reached can be treated as a buffer. Thus, releasing the object within this area will still result in object selection. - Once the dragged object (dragged along the solid line or, any other suitable direction such as along the dashed line) reaches
distance 1104 and crosses its threshold, the next object-related action on the cross-slide gesture can be committed. In this particular example, the object-related action can break the object out of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction. In one or more embodiments, if the object reachesline 1106, such can trigger yet additional object-related actions. For example, crossing this line with the object might trigger additional visual feedback to make it clear to the user that the drag and drop threshold has been reached. - It is to be appreciated and understood that any suitable number of distance thresholds can be employed and can be associated with object related actions. For example, a first threshold might be defined by the boundary of the illustrated circle within
object 1100, a second threshold by thedistance 1102, and a third threshold by thedistance 1104. In one or more embodiments, movement outside of the first threshold can lock the associated object in the associated movement direction. -
FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. - Step 1200 defines one or more thresholds. This step can be performed in any suitable way utilizing any suitable type of threshold. For example, in the embodiment described above, distance thresholds were employed. It is to be appreciated and understood, however, that other types of thresholds and/or combinations thereof can be utilized without departing from the spirit and scope of the claimed subject matter.
- In the illustrated and described embodiment, the defined thresholds can be utilized in connection with a cross-slide gesture as described above. Step 1202 detects a cross-slide gesture. Examples of how this can be done are provided above. Step 1204 detects one or more threshold triggers. For example, once a user has touch-engaged an object, they can move the object in a particular direction. This step detects when the object has been moved sufficiently to trigger one or more thresholds. In embodiments where thresholds are defined in terms of distances, the step can be performed by detecting when an object has been moved a particular distance.
- Step 1206 detects whether a user action indicates that an object-related action is to be committed. This step can be performed in any suitable way. For example, a user action might include lifting their finger off a particular object. If a user action does not indicate that an object-related action is to be committed, step 1208 does not commit the object-related action. For example, the user might terminate the cross-slide gesture in a particular way such that no action is to be committed. For example, the cross-slide gesture can be reversible, i.e. if the user starts dragging an object down, then she can, at any time while still holding the object, slide it back to its original position. By doing this, no cross-slide actions will be taken. Alternately, one or more thresholds might be crossed, without a user having yet indicated that the object-related action is to be committed. In this case, if the cross-slide gesture is ongoing, the method would continue to monitor for threshold triggers as by returning to step 1204. If, on the other hand, a user action indicates that an object-related action is to be committed, step 1210 commits the object-related action associated with a last-triggered threshold. This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
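The commit-on-release flow of FIG. 12 can be sketched as follows, again only as a hypothetical TypeScript reading of the steps above; the threshold table, class, and method names are assumptions:

```typescript
// Monitor threshold triggers during a cross slide (step 1204) and commit
// the action for the furthest threshold crossed when the user releases
// (steps 1206-1210). Distances and action names are illustrative.
interface Threshold { distance: number; action: string; }

const THRESHOLDS: Threshold[] = [
  { distance: 40, action: "select" },       // cf. a first distance threshold
  { distance: 120, action: "dragAndDrop" }, // cf. a second, larger threshold
];

class CrossSlideTracker {
  private crossDistance = 0;

  onMove(crossDistance: number): void {
    this.crossDistance = crossDistance; // the gesture stays reversible:
  }                                     // sliding back shrinks this value

  // On release, return the committed action, or null if the object was
  // slid back toward its original position so that no action is taken.
  onRelease(): string | null {
    let committed: string | null = null;
    for (const t of THRESHOLDS) {
      if (this.crossDistance >= t.distance) committed = t.action;
    }
    return committed;
  }
}
```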
- In one or more embodiments, the multiple different directions used for cross-slide functionality can either result in the same object-related actions being performed, or different object-related actions being performed. For example, object selection might occur when an object is dragged downward, while a drag and drop action might be performed when the object is dragged upward.
- Having considered the use of various drag thresholds and associated object-related actions, consider now an additional example that employs thresholds along with indicia to provide feedback of direct object manipulation.
- Direct Manipulation to Facilitate Threshold Discernability
- In at least some embodiments, direct manipulation can provide visual feedback so that a user can visually observe an object move and, in accordance with object movement, can be provided with visual affordances to facilitate threshold discernability. Any suitable type of visual affordance can be employed including, by way of example and not limitation, tool tips, icons, glyphs, and the like. In the example described just below, so-called speed bumps can be used to provide a user with an understanding or awareness of the various thresholds that might be present. As an example, consider
FIG. 13 . - There, an object or
item 1300 is shown. Various distances are shown and are indicated at 1302, 1304, and 1306. The distances show the travel distance ofobject 1300 or distances through which the object can travel. In one or more embodiments, thefirst distance 1302 is a threshold which, when passed, results in an action potentially being committed. In this particular example, passing this distance threshold while performing a drag operation causesobject 1300 to be selected. To commit this action, the user would lift her finger, and the dragged object would slide back to its original position and change its state to be selected. The area beneath that corresponding to distance 1302 before the region corresponding to distance 1306 is reached can be treated as a buffer. Thus, releasing the object within this area will still result in object selection. -
Distance 1306 corresponds to a speed bump region. Movement ofobject 1300 within the speed bump region is slower than movement of the finger. This presents a visual cue or indication that a new threshold is about to be reached, thus making it easier for the user to commit a particular action without accidentally moving into and over a next distance threshold. For example, within a speed bump region, a user may drag her finger 50 pixels in length, while the corresponding object may move five pixels in distance. Releasing the object within this speed bump region will result in an associated action being committed. In this example, the associated action is an object selection. - Once the dragged object proceeds through the speed bump region corresponding to distance 1306, and reaches
distance 1304 and crosses its threshold, the next object-related action on the cross-slide gesture can be committed. In this particular example, the object-related action can break the object out of its associated list or position on the display device, and thus enable the user to drag and drop the object in any direction. In one or more embodiments, if the object reachesline 1308, such can trigger yet additional object-related actions. For example, crossing this line with the object might trigger additional visual feedback to make it clear to the user that the drag and drop threshold has been reached. In addition, multiple speed bumps can be utilized in connection with the distance thresholds. - It is to be appreciated and understood that any suitable number of distance thresholds and speed bumps can be employed and can be associated with object related actions. Alternately or additionally, other visual indicia can be utilized to indicate thresholds or threshold changes. For example, while dragging an object, one or more lines can be rendered to indicate thresholds and thus, the distance that an object should be dragged to commit different actions. Visuals can also be drawn on the object itself as it gets closer to or crosses a threshold.
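The speed-bump behavior lends itself to a damped distance mapping. The sketch below is an assumption-based illustration: the region bounds are invented, and the 0.1 damping factor simply echoes the 50-pixel finger drag that moves the object five pixels in the example above:

```typescript
// Map finger travel to object travel, slowing the object inside an
// assumed speed-bump region (cf. the region at distance 1306).
const BUMP_START = 40;  // object distance where the bump begins (assumed)
const BUMP_END = 60;    // object distance where the bump ends (assumed)
const DAMPING = 0.1;    // inside the bump the object moves at 1/10 speed

function objectDistanceFor(fingerDistance: number): number {
  const bumpWidth = BUMP_END - BUMP_START;     // bump width in object space
  const fingerBumpWidth = bumpWidth / DAMPING; // finger travel to cross it

  if (fingerDistance <= BUMP_START) {
    return fingerDistance; // 1:1 tracking before the bump
  }
  if (fingerDistance <= BUMP_START + fingerBumpWidth) {
    // Inside the bump: e.g., 50 pixels of finger travel moves the object
    // only five pixels, cueing the user that a threshold is near.
    return BUMP_START + (fingerDistance - BUMP_START) * DAMPING;
  }
  // Past the bump, 1:1 tracking resumes.
  return BUMP_END + (fingerDistance - BUMP_START - fingerBumpWidth);
}
```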
-
FIG. 14 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. -
Step 1400 defines one or more distance thresholds including one or more speed bumps. This step can be performed in any suitable way. In the illustrated and described embodiment, the defined thresholds and speed bumps can be utilized in connection with a cross-slide gesture as described above. Step 1402 detects a cross-slide gesture. Examples of how this can be done are provided above. Step 1404 detects a speed bump crossing. For example, once a user has touch-engaged an object, they can move the object in a particular direction. This step detects when the object has been moved sufficiently to cross a boundary associated with a speed bump. -
Step 1406 modifies a user experience within the speed bump region. Any suitable modification of the user experience can be provided. For example, in at least some embodiments, modification of the user experience can entail modifying the user's visual experience. For example, and as noted above, the user's finger may move faster than the underlying object. Alternately or additionally, other experience modifications can take place including, by way of example and not limitation, providing audible or haptic feedback to indicate presence within a particular speed bump region. -
Step 1408 detects whether a user action indicates that an object-related action is to be committed. This step can be performed in any suitable way. For example, a user action might include lifting their finger off a particular object. If a user action does not indicate that an object-related action is to be committed, step 1410 does not commit the object-related action. For example, the user might terminate the cross-slide gesture in a particular way such that no action is to be committed. For example, the cross-slide gesture can be reversible, i.e. if the user starts dragging an object down, then she can, at any time while still holding the object, slide it back to its original position. By doing this, no cross-slide actions will be taken. Alternately, one or more thresholds and one or more speed bump regions might be crossed, without a user having yet indicated that the object-related action is to be committed. In this case, if the cross-slide gesture is ongoing, the method would continue to monitor for threshold crossings and additional speed bumps, as appropriate. If, on the other hand, a user action indicates that an object-related action is to be committed, step 1412 commits the object-related action associated with a last-crossed threshold. This step can be performed in any suitable way and can include any suitable object-related action, examples of which are provided above.
- Interaction Feedback
- In one or more embodiments, visual feedback can be provided to a user to inform the user of a particular object-related action that will be committed, responsive to the detected cross-slide gesture. For example, as a particular object passes different distance thresholds, visual indicia can be provided to inform the user of a particular action that will be committed by releasing the object. Alternately or additionally, visual indicia can further be provided on a particular object-related action that might be next if object dragging continues.
- As an example, consider
FIG. 15 . There, an object in the form of a picture is shown generally at 1500. In the top-most part of the figure, a user has touched the object to initiate a drag operation. As the user drags the object downward, as shown at 1502,visual indicia 1504 can be presented as by beginning to emerge from underneath the picture. In this particular example, the visual indicia resides in the form of a check box that gradually emerges from underneath the object. Any suitable type of visual indicia can be utilized without departing from the spirit and scope of the claimed subject matter. For example, the visual indicia might be presented in the form of a line, beneath the picture, and to which the picture is to be dragged to commit a particular action. Once the object has been dragged a particular distance, as shown at 1506, the visual indicia—here, the check box, can be fully exposed thus informing the user that she can release the object to commit an object-related action. In this particular example, the object-related action comprises an object selection. Thus, the fully exposed check box can indicate that the action is completed. -
FIG. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be performed by a suitably-configured gesture module, such as the one described above. -
Step 1600 detects a drag operation associated with an object. Examples of how this can be done are provided above. For example, the drag operation can be detected in conjunction with a cross-slide gesture. Step 1602 presents a partial portion of visual indicia associated with committing an object-related action. Examples of how this can be done are provided above. Step 1604 ascertains whether the drag operation continues. If the drag operation has not continued, the method can, in an event that a user has not terminated the drag operation, return to step 1602. In the event that the user has terminated the drag operation, as by returning the object to its original position, the method can terminate. On the other hand, if the drag operation continues, step 1606 ascertains whether a distance threshold associated with an object-related action has been reached. If not, the method can return to step 1602. By doing so, more of the visual indicia can be exposed in accordance with the distance that the object has been dragged. If, on the other hand, a distance threshold associated with an object-related action has been reached, step 1608 presents a complete visual indicia associated with committing the object-related action. By doing so, the visual indicia can inform the user, visually, that the object-related action can be committed as by the user removing her finger from the object.
- Example Device
-
FIG. 17 illustrates various components of an example device 1700 that can be implemented as any type of portable and/or computer device as described with reference to FIGS. 1 and 2 to implement embodiments of the gesture techniques described herein. Device 1700 includes communication devices 1702 that enable wired and/or wireless communication of device data 1704 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 1704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 1700 can include any type of audio, video, and/or image data. Device 1700 includes one or more data inputs 1706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
- Device 1700 also includes communication interfaces 1708 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 1708 provide a connection and/or communication links between device 1700 and a communication network by which other electronic, computing, and communication devices communicate data with device 1700.
- Device 1700 includes one or more processors 1710 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of device 1700 and to implement the gesture embodiments described above. Alternatively or in addition, device 1700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 1712. Although not shown, device 1700 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- Device 1700 also includes computer-readable media 1714, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Device 1700 can also include a mass storage media device 1716.
- Computer-readable media 1714 provides data storage mechanisms to store the device data 1704, as well as various device applications 1718 and any other types of information and/or data related to operational aspects of device 1700. For example, an operating system 1720 can be maintained as a computer application with the computer-readable media 1714 and executed on processors 1710. The device applications 1718 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.), as well as other applications that can include web browsers, image processing applications, communication applications such as instant messaging applications, word processing applications and a variety of other different applications. The device applications 1718 also include any system components or modules to implement embodiments of the gesture techniques described herein. In this example, the device applications 1718 include an interface application 1722 and a gesture-capture driver 1724 that are shown as software modules and/or computer applications. The gesture-capture driver 1724 is representative of software that is used to provide an interface with a device configured to capture a gesture, such as a touchscreen, track pad, camera, and so on. Alternatively or in addition, the interface application 1722 and the gesture-capture driver 1724 can be implemented as hardware, software, firmware, or any combination thereof.
- Device 1700 also includes an audio and/or video input-output system 1726 that provides audio data to an audio system 1728 and/or provides video data to a display system 1730. The audio system 1728 and/or the display system 1730 can include any devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from device 1700 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In an embodiment, the audio system 1728 and/or the display system 1730 are implemented as external components to device 1700. Alternatively, the audio system 1728 and/or the display system 1730 are implemented as integrated components of example device 1700. -
- Cross slide gestures for touch displays are described. In at least some embodiments, cross slide gestures can be used on content that pans or scrolls in one direction, to enable additional actions, such as content selection, drag and drop operations, and the like.
- In one or more embodiments, a cross slide gesture can be performed by dragging an item or object in a direction that is different from a scrolling direction. The different-direction drag can be mapped to additional actions or functionality. In one or more embodiments, one or more thresholds can be utilized, such as a distance threshold, in combination with the different-direction drag, to map to additional actions or functionality.
- In at least some embodiments, so-called speed bumps can be used to provide a user with an understanding or awareness of the thresholds.
- Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.
Claims (20)
1. A method comprising:
detecting a gesture slide input relative to an object displayed on a display device of a computing device;
ascertaining whether a gestural slide input direction is different from a panning or scrolling direction by using region detection logic in which gestural slide input that occurs relative to the object and within one region is ascertained to be associated with panning or scrolling and gestural slide input that occurs relative to the object and within a different region is ascertained to not be associated with panning or scrolling; and
responsive to the gestural slide input being in a direction that is different from the panning or scrolling direction and within the different region, performing an object-related action.
2. The method of claim 1, wherein the panning or scrolling direction is generally vertical along the display device.
3. The method of claim 1, wherein the panning or scrolling direction is generally horizontal along the display device.
4. The method of claim 1, wherein the object-related action comprises an object selection.
5. The method of claim 1, wherein the object-related action comprises a re-arrangement action.
6. The method of claim 1, wherein performing an object-related action comprises performing one of a plurality of object-related actions which are accessible via the gestural slide input being in a direction that is different from the panning or scrolling direction.
7. The method of claim 1, wherein the direction that is different from the panning or scrolling direction comprises a direction that is generally orthogonal relative to the panning or scrolling direction.
8. The method of claim 1, wherein said performing an object-related action is performed responsive to detecting a threshold trigger associated with the gesture slide input.
9. One or more computer readable storage media embodying computer readable instructions which, when executed, implement a method comprising:
detecting a drag direction associated with a drag operation;
ascertaining whether the drag direction is different from a panning direction by using region detection logic in which drag directions that occur relative to an object and within one region are ascertained to be associated with panning and drag directions that occur relative to the object and within a different region are ascertained to not be associated with panning; and
responsive to the drag direction being different than the panning direction and occurring within the different region, performing an object-rearrangement action associated with the object.
10. The one or more computer-readable storage media of claim 9, wherein detecting a drag direction is performed by detecting a drag direction associated with a touch gesture.
11. The one or more computer-readable storage media of claim 9, wherein the panning direction is one of generally vertical or generally horizontal along a display device, and the drag direction is generally orthogonal relative to the panning direction.
12. The one or more computer-readable storage media of claim 9, wherein the one region and the different region are defined by one or more angle ranges.
13. The one or more computer-readable storage media of claim 9, wherein said performing an object-rearrangement action is performed responsive to detecting a threshold trigger associated with the drag direction.
14. The one or more computer-readable storage media of claim 9, wherein said performing an object-rearrangement action is performed responsive to detecting a threshold trigger associated with the drag direction, the threshold trigger being associated with a distance.
15. The one or more computer-readable storage media of claim 9, wherein said performing an object-rearrangement action is performed responsive to detecting a threshold trigger associated with the drag direction, the threshold trigger being associated with a distance, and wherein at least one other action is configured to be performed based on at least one other respective threshold trigger that is different from the first-mentioned threshold trigger.
16. A system comprising:
one or more computer-readable storage media;
an application embodied on the one or more computer-readable storage media, the application being configured to implement a method comprising:
detecting a cross-slide gesture in a direction that is different from a panning direction;
responsive to said detecting, detecting one or more threshold triggers associated with the cross-slide gesture, individual threshold triggers being associated with an individual object-related action;
detecting whether a user action indicates that an object-related action is to be committed; and
responsive to the user action indicating that an object-related action is to be committed, committing an object-related action associated with a last-triggered threshold.
17. The system of claim 16, wherein the one or more threshold triggers comprise distance threshold triggers.
18. The system of claim 16, further comprising providing visual feedback, responsive to detecting the cross-slide gesture, to facilitate threshold discernibility.
19. The system of claim 16, further comprising providing visual feedback, responsive to detecting the cross-slide gesture, to facilitate threshold discernibility, the visual feedback being provided within a region in which an object that is a subject of the cross-slide gesture moves slower than a finger executing the cross-slide gesture.
20. The system of claim 16, further comprising, responsive to detecting the one or more threshold triggers, modifying a user experience associated with a region within which the cross-slide gesture occurs.
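As an illustration of the multi-threshold behavior recited in claims 13 through 16, the sketch below arms successive distance thresholds while the cross-slide progresses and, on the committing user action (here, release), commits the action bound to the last-triggered threshold. The trigger table, distances, and actions are assumptions for the example, not language from the claims.

```typescript
// Illustrative sketch only; distances and actions are invented.

interface ThresholdTrigger {
  distance: number;   // cross-slide distance (px) that arms this trigger
  action: () => void; // object-related action bound to this threshold
}

// Successive distance thresholds along the cross-slide, each bound to its
// own object-related action; assumed sorted by ascending distance.
const TRIGGERS: ThresholdTrigger[] = [
  { distance: 40, action: () => console.log("select object") },
  { distance: 120, action: () => console.log("enter rearrange mode") },
];

// The last threshold the drag has passed, if any.
let lastTriggered: ThresholdTrigger | null = null;

// Recompute on every move so backing off below a threshold disarms it.
function onCrossSlideMove(crossSlideDistance: number): void {
  lastTriggered = null;
  for (const trigger of TRIGGERS) {
    if (crossSlideDistance >= trigger.distance) lastTriggered = trigger;
  }
}

// On release (the committing user action), commit the action associated
// with the last-triggered threshold, if any.
function onRelease(): void {
  lastTriggered?.action();
  lastTriggered = null;
}
```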
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/656,354 US20130044141A1 (en) | 2011-08-02 | 2012-10-19 | Cross-slide Gesture to Select and Rearrange |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/196,272 US8687023B2 (en) | 2011-08-02 | 2011-08-02 | Cross-slide gesture to select and rearrange |
US13/656,354 US20130044141A1 (en) | 2011-08-02 | 2012-10-19 | Cross-slide Gesture to Select and Rearrange |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/196,272 Continuation US8687023B2 (en) | 2011-08-02 | 2011-08-02 | Cross-slide gesture to select and rearrange |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130044141A1 true US20130044141A1 (en) | 2013-02-21 |
Family
ID=47626692
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/196,272 Active US8687023B2 (en) | 2011-08-02 | 2011-08-02 | Cross-slide gesture to select and rearrange |
US13/656,354 Abandoned US20130044141A1 (en) | 2011-08-02 | 2012-10-19 | Cross-slide Gesture to Select and Rearrange |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/196,272 Active US8687023B2 (en) | 2011-08-02 | 2011-08-02 | Cross-slide gesture to select and rearrange |
Country Status (17)
Country | Link |
---|---|
US (2) | US8687023B2 (en) |
EP (1) | EP2740022B1 (en) |
JP (1) | JP5980924B2 (en) |
KR (1) | KR102052771B1 (en) |
CN (1) | CN103907087B (en) |
AU (1) | AU2012290559B2 (en) |
BR (1) | BR112014002379B1 (en) |
CA (1) | CA2843607C (en) |
CL (1) | CL2014000244A1 (en) |
CO (1) | CO6890079A2 (en) |
HK (1) | HK1199520A1 (en) |
IL (1) | IL230724A0 (en) |
IN (1) | IN2014CN00621A (en) |
MX (1) | MX338046B (en) |
MY (1) | MY167640A (en) |
RU (1) | RU2623198C2 (en) |
WO (1) | WO2013019404A1 (en) |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090106696A1 (en) * | 2001-09-06 | 2009-04-23 | Matias Duarte | Loop menu navigation apparatus and method |
CN103246449A (en) * | 2013-04-16 | 2013-08-14 | 广东欧珀移动通信有限公司 | Mobile terminal and screen unlocking method of same |
US8548431B2 (en) | 2009-03-30 | 2013-10-01 | Microsoft Corporation | Notifications |
US8560959B2 (en) | 2010-12-23 | 2013-10-15 | Microsoft Corporation | Presenting an application change through a tile |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8687023B2 (en) | 2011-08-02 | 2014-04-01 | Microsoft Corporation | Cross-slide gesture to select and rearrange |
US8814683B2 (en) | 2013-01-22 | 2014-08-26 | Wms Gaming Inc. | Gaming system and methods adapted to utilize recorded player gestures |
US8830270B2 (en) | 2011-09-10 | 2014-09-09 | Microsoft Corporation | Progressively indicating new content in an application-selectable user interface |
WO2014150611A1 (en) * | 2013-03-15 | 2014-09-25 | Google Inc. | Document scale and position optimization |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US20140340335A1 (en) * | 2013-05-17 | 2014-11-20 | Elektrobit Automotive Gmbh | System and method for data selection by means of a touch-sensitive surface |
US8922575B2 (en) | 2011-09-09 | 2014-12-30 | Microsoft Corporation | Tile cache |
US8933952B2 (en) | 2011-09-10 | 2015-01-13 | Microsoft Corporation | Pre-rendering new content for an application-selectable user interface |
US8935631B2 (en) | 2011-09-01 | 2015-01-13 | Microsoft Corporation | Arranging tiles |
US20150026639A1 (en) * | 2013-07-19 | 2015-01-22 | Fuji Xerox Co., Ltd. | Information processing apparatus and method, and non-transitory computer readable medium |
US8970499B2 (en) | 2008-10-23 | 2015-03-03 | Microsoft Technology Licensing, Llc | Alternative inputs of a mobile communications device |
US8990733B2 (en) | 2010-12-20 | 2015-03-24 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes |
US9052820B2 (en) | 2011-05-27 | 2015-06-09 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9104440B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9128605B2 (en) | 2012-02-16 | 2015-09-08 | Microsoft Technology Licensing, Llc | Thumbnail-image selection of applications |
US9158445B2 (en) | 2011-05-27 | 2015-10-13 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US9176657B2 (en) | 2013-09-14 | 2015-11-03 | Changwat TUMWATTANA | Gesture-based selection and manipulation method |
US9223472B2 (en) | 2011-12-22 | 2015-12-29 | Microsoft Technology Licensing, Llc | Closing applications |
US9244802B2 (en) | 2011-09-10 | 2016-01-26 | Microsoft Technology Licensing, Llc | Resource user interface |
US9323424B2 (en) | 2008-10-23 | 2016-04-26 | Microsoft Corporation | Column organization of content |
US9383917B2 (en) | 2011-03-28 | 2016-07-05 | Microsoft Technology Licensing, Llc | Predictive tiling |
US9423951B2 (en) | 2010-12-31 | 2016-08-23 | Microsoft Technology Licensing, Llc | Content-based snap point |
US9430130B2 (en) | 2010-12-20 | 2016-08-30 | Microsoft Technology Licensing, Llc | Customization of an immersive environment |
US9451822B2 (en) | 2014-04-10 | 2016-09-27 | Microsoft Technology Licensing, Llc | Collapsible shell cover for computing device |
US9557909B2 (en) | 2011-09-09 | 2017-01-31 | Microsoft Technology Licensing, Llc | Semantic zoom linguistic helpers |
US9658766B2 (en) | 2011-05-27 | 2017-05-23 | Microsoft Technology Licensing, Llc | Edge gesture |
US9665384B2 (en) | 2005-08-30 | 2017-05-30 | Microsoft Technology Licensing, Llc | Aggregation of computing device settings |
US9674335B2 (en) | 2014-10-30 | 2017-06-06 | Microsoft Technology Licensing, Llc | Multi-configuration input device |
US9767076B2 (en) | 2013-03-15 | 2017-09-19 | Google Inc. | Document scale and position optimization |
US9769293B2 (en) | 2014-04-10 | 2017-09-19 | Microsoft Technology Licensing, Llc | Slider cover for computing device |
US9841874B2 (en) | 2014-04-04 | 2017-12-12 | Microsoft Technology Licensing, Llc | Expandable application representation |
US9977575B2 (en) | 2009-03-30 | 2018-05-22 | Microsoft Technology Licensing, Llc | Chromeless user interface |
US20190065027A1 (en) * | 2017-08-31 | 2019-02-28 | Apple Inc. | Systems, Methods, and Graphical User Interfaces for Interacting with Augmented and Virtual Reality Environments |
US10254942B2 (en) | 2014-07-31 | 2019-04-09 | Microsoft Technology Licensing, Llc | Adaptive sizing and positioning of application windows |
US10353575B2 (en) * | 2015-10-06 | 2019-07-16 | Canon Kabushiki Kaisha | Display control apparatus, method for controlling the same, and recording medium |
US10353566B2 (en) | 2011-09-09 | 2019-07-16 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US10592080B2 (en) | 2014-07-31 | 2020-03-17 | Microsoft Technology Licensing, Llc | Assisted presentation of application windows |
CN110971976A (en) * | 2019-11-22 | 2020-04-07 | 中国联合网络通信集团有限公司 | Audio and video file analysis method and device |
US10642365B2 (en) | 2014-09-09 | 2020-05-05 | Microsoft Technology Licensing, Llc | Parametric inertia and APIs |
US10678412B2 (en) | 2014-07-31 | 2020-06-09 | Microsoft Technology Licensing, Llc | Dynamic joint dividers for application windows |
US11099707B2 (en) | 2018-01-24 | 2021-08-24 | Apple Inc. | Devices, methods, and graphical user interfaces for system-wide behavior for 3D models |
US11287967B2 (en) | 2016-11-03 | 2022-03-29 | Microsoft Technology Licensing, Llc | Graphical user interface list content density adjustment |
US11740727B1 (en) | 2011-08-05 | 2023-08-29 | P4Tents1 Llc | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
Families Citing this family (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8780130B2 (en) | 2010-11-30 | 2014-07-15 | Sitting Man, Llc | Methods, systems, and computer program products for binding attributes between visual components |
US9715332B1 (en) | 2010-08-26 | 2017-07-25 | Cypress Lake Software, Inc. | Methods, systems, and computer program products for navigating between visual components |
US10397639B1 (en) | 2010-01-29 | 2019-08-27 | Sitting Man, Llc | Hot key systems and methods |
US9679404B2 (en) | 2010-12-23 | 2017-06-13 | Microsoft Technology Licensing, Llc | Techniques for dynamic layout of presentation tiles on a grid |
US9436685B2 (en) | 2010-12-23 | 2016-09-06 | Microsoft Technology Licensing, Llc | Techniques for electronic aggregation of information |
US9471145B2 (en) | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9465440B2 (en) * | 2011-01-06 | 2016-10-11 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9423878B2 (en) | 2011-01-06 | 2016-08-23 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9715485B2 (en) * | 2011-03-28 | 2017-07-25 | Microsoft Technology Licensing, Llc | Techniques for electronic aggregation of information |
US20120272180A1 (en) * | 2011-04-20 | 2012-10-25 | Nokia Corporation | Method and apparatus for providing content flipping based on a scrolling operation |
US8922584B2 (en) * | 2011-09-30 | 2014-12-30 | Frederic Sigal | Method of creating, displaying, and interfacing an infinite navigable media wall |
EP2791790B1 (en) * | 2011-12-14 | 2019-08-14 | Intel Corporation | Gaze activated content transfer system |
KR101882724B1 (en) * | 2011-12-21 | 2018-08-27 | 삼성전자 주식회사 | Category Search Method And Portable Device supporting the same |
JP2013152566A (en) * | 2012-01-24 | 2013-08-08 | Funai Electric Co Ltd | Remote control device |
KR20130097266A (en) * | 2012-02-24 | 2013-09-03 | 삼성전자주식회사 | Method and apparatus for editing contents view in mobile terminal |
JP6055734B2 (en) * | 2012-09-26 | 2016-12-27 | 京セラドキュメントソリューションズ株式会社 | Display input device and image forming apparatus having the same |
US9891781B2 (en) * | 2012-10-05 | 2018-02-13 | Htc Corporation | Mobile communications device, non-transitory computer-readable medium and method of navigating between a plurality of different views of home screen of mobile communications device |
US9335913B2 (en) | 2012-11-12 | 2016-05-10 | Microsoft Technology Licensing, Llc | Cross slide gesture |
CN105074780B (en) | 2013-02-23 | 2020-11-10 | 高通股份有限公司 | System and method for interactive image caricature generation by an electronic device |
US10120540B2 (en) * | 2013-03-14 | 2018-11-06 | Samsung Electronics Co., Ltd. | Visual feedback for user interface navigation on television system |
US10025459B2 (en) * | 2013-03-14 | 2018-07-17 | Airwatch Llc | Gesture-based workflow progression |
US20140298258A1 (en) * | 2013-03-28 | 2014-10-02 | Microsoft Corporation | Switch List Interactions |
US20140298219A1 (en) * | 2013-03-29 | 2014-10-02 | Microsoft Corporation | Visual Selection and Grouping |
US9450952B2 (en) | 2013-05-29 | 2016-09-20 | Microsoft Technology Licensing, Llc | Live tiles without application-code execution |
KR102120651B1 (en) * | 2013-05-30 | 2020-06-09 | 삼성전자 주식회사 | Method and apparatus for displaying a seen in a device comprising a touch screen |
US20140372923A1 (en) * | 2013-06-14 | 2014-12-18 | Microsoft Corporation | High Performance Touch Drag and Drop |
JP6218451B2 (en) * | 2013-06-18 | 2017-10-25 | シャープ株式会社 | Program execution device |
USD732561S1 (en) * | 2013-06-25 | 2015-06-23 | Microsoft Corporation | Display screen with graphical user interface |
JP5505550B1 (en) * | 2013-08-06 | 2014-05-28 | 富士ゼロックス株式会社 | Image display apparatus and program |
EP3043250A4 (en) * | 2013-09-02 | 2017-04-12 | Sony Corporation | Information processing device, information processing method, and program |
WO2015057634A2 (en) * | 2013-10-18 | 2015-04-23 | Citrix Systems, Inc. | Providing enhanced message management user interfaces |
US20150128095A1 (en) * | 2013-11-07 | 2015-05-07 | Tencent Technology (Shenzhen) Company Limited | Method, device and computer system for performing operations on objects in an object list |
USD767590S1 (en) * | 2013-12-30 | 2016-09-27 | Nikolai Joukov | Display screen or portion thereof with graphical user interface for displaying software cells |
JP5924554B2 (en) * | 2014-01-06 | 2016-05-25 | コニカミノルタ株式会社 | Object stop position control method, operation display device, and program |
US20150286391A1 (en) * | 2014-04-08 | 2015-10-08 | Olio Devices, Inc. | System and method for smart watch navigation |
CN106170747A (en) * | 2014-04-14 | 2016-11-30 | 夏普株式会社 | Input equipment and the control method of input equipment |
US10089346B2 (en) | 2014-04-25 | 2018-10-02 | Dropbox, Inc. | Techniques for collapsing views of content items in a graphical user interface |
US9891794B2 (en) | 2014-04-25 | 2018-02-13 | Dropbox, Inc. | Browsing and selecting content items based on user gestures |
US9547433B1 (en) * | 2014-05-07 | 2017-01-17 | Google Inc. | Systems and methods for changing control functions during an input gesture |
US10656784B2 (en) * | 2014-06-16 | 2020-05-19 | Samsung Electronics Co., Ltd. | Method of arranging icon and electronic device supporting the same |
KR101631966B1 (en) * | 2014-06-19 | 2016-06-20 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US20160070460A1 (en) * | 2014-09-04 | 2016-03-10 | Adobe Systems Incorporated | In situ assignment of image asset attributes |
CN104238944B (en) * | 2014-09-10 | 2015-08-26 | 腾讯科技(深圳)有限公司 | A kind of document handling method, device and terminal device |
US20160147381A1 (en) * | 2014-11-26 | 2016-05-26 | Blackberry Limited | Electronic device and method of controlling display of information |
US10120848B2 (en) * | 2014-12-09 | 2018-11-06 | Salesforce.Com, Inc. | Methods and systems for applying responsive design to subframes on a web page |
US9883007B2 (en) * | 2015-01-20 | 2018-01-30 | Microsoft Technology Licensing, Llc | Downloading an application to an apparatus |
CN104571871A (en) * | 2015-01-26 | 2015-04-29 | 深圳市中兴移动通信有限公司 | Method and system for selecting files |
US10409465B2 (en) * | 2015-12-08 | 2019-09-10 | International Business Machines Corporation | Selecting areas of content on a touch screen |
JP6624972B2 (en) * | 2016-02-26 | 2019-12-25 | キヤノン株式会社 | Method, apparatus, and program for controlling display |
US11543936B2 (en) | 2016-06-16 | 2023-01-03 | Airwatch Llc | Taking bulk actions on items in a user interface |
US10386933B2 (en) | 2016-08-30 | 2019-08-20 | International Business Machines Corporation | Controlling navigation of a visual aid during a presentation |
US10802125B2 (en) * | 2016-10-03 | 2020-10-13 | FLIR Belgium BVBA | Touch-gesture control for side-looking sonar systems |
US11209912B2 (en) * | 2016-12-06 | 2021-12-28 | Rohde & Schwarz Gmbh & Co. Kg | Measuring device and configuration method |
US10579740B2 (en) | 2016-12-28 | 2020-03-03 | Motorola Solutions, Inc. | System and method for content presentation selection |
KR102316024B1 (en) * | 2017-03-02 | 2021-10-26 | 삼성전자주식회사 | Display apparatus and user interface displaying method thereof |
US10345957B2 (en) | 2017-06-21 | 2019-07-09 | Microsoft Technology Licensing, Llc | Proximity selector |
CN107358213B (en) * | 2017-07-20 | 2020-02-21 | 湖南科乐坊教育科技股份有限公司 | Method and device for detecting reading habits of children |
CN109388322B (en) * | 2017-08-02 | 2022-11-04 | 腾讯科技(深圳)有限公司 | Method and apparatus for displaying data, storage medium, and electronic medium |
JP7119408B2 (en) * | 2018-02-15 | 2022-08-17 | コニカミノルタ株式会社 | Image processing device, screen handling method, and computer program |
JP7064173B2 (en) * | 2018-05-11 | 2022-05-10 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
US11681415B2 (en) * | 2018-10-31 | 2023-06-20 | Apple Inc. | Near-viewing notification techniques |
US10936281B2 (en) | 2018-12-19 | 2021-03-02 | International Business Machines Corporation | Automatic slide page progression based on verbal and visual cues |
US11237716B2 (en) | 2019-10-14 | 2022-02-01 | Sling TV L.L.C. | Devices, systems and processes for facilitating user adaptive progressions through content |
US11099729B1 (en) | 2020-05-29 | 2021-08-24 | Capital One Services, Llc | Methods and systems for displaying content based on a scroll pattern |
JP7501159B2 (en) * | 2020-07-01 | 2024-06-18 | コニカミノルタ株式会社 | Information processing device, method for controlling information processing device, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110055773A1 (en) * | 2009-08-25 | 2011-03-03 | Google Inc. | Direct manipulation gestures |
US20110074719A1 (en) * | 2009-09-30 | 2011-03-31 | Higgstec Inc. | Gesture detecting method for touch panel |
US20110157027A1 (en) * | 2009-12-30 | 2011-06-30 | Nokia Corporation | Method and Apparatus for Performing an Operation on a User Interface Object |
Family Cites Families (661)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4823283A (en) | 1986-10-14 | 1989-04-18 | Tektronix, Inc. | Status driven menu system |
US5189732A (en) | 1987-11-18 | 1993-02-23 | Hitachi, Ltd. | Touch panel input apparatus |
JPH01147647A (en) | 1987-12-03 | 1989-06-09 | Mitsubishi Electric Corp | Data processor |
US5046001A (en) | 1988-06-30 | 1991-09-03 | Ibm Corporation | Method for accessing selected windows in a multi-tasking system |
US5321750A (en) | 1989-02-07 | 1994-06-14 | Market Data Corporation | Restricted information distribution system apparatus and methods |
US5339392A (en) | 1989-07-27 | 1994-08-16 | Risberg Jeffrey S | Apparatus and method for creation of a user definable video displayed document showing changes in real time data |
US5526034A (en) | 1990-09-28 | 1996-06-11 | Ictv, Inc. | Interactive home information system with signal assignment |
US5297032A (en) | 1991-02-01 | 1994-03-22 | Merrill Lynch, Pierce, Fenner & Smith Incorporated | Securities trading workstation |
FR2693810B1 (en) | 1991-06-03 | 1997-01-10 | Apple Computer | USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA. |
US5258748A (en) | 1991-08-28 | 1993-11-02 | Hewlett-Packard Company | Accessing and selecting multiple key functions with minimum keystrokes |
JP3341290B2 (en) | 1991-09-10 | 2002-11-05 | ソニー株式会社 | Video display device |
JP2654283B2 (en) | 1991-09-30 | 1997-09-17 | 株式会社東芝 | Icon display method |
JP2827612B2 (en) | 1991-10-07 | 1998-11-25 | 富士通株式会社 | A touch panel device and a method for displaying an object on the touch panel device. |
US6061062A (en) | 1991-12-20 | 2000-05-09 | Apple Computer, Inc. | Zooming controller |
US5640176A (en) | 1992-01-24 | 1997-06-17 | Compaq Computer Corporation | User interface for easily setting computer speaker volume and power conservation levels |
JPH07306955A (en) | 1992-07-24 | 1995-11-21 | Walt Disney Co:The | Method and system for generation of three-dimensional illusion |
US5508717A (en) * | 1992-07-28 | 1996-04-16 | Sony Corporation | Computer pointing device with dynamic sensitivity |
US5432932A (en) | 1992-10-23 | 1995-07-11 | International Business Machines Corporation | System and method for dynamically controlling remote processes from a performance monitor |
US5463725A (en) | 1992-12-31 | 1995-10-31 | International Business Machines Corp. | Data processing system graphical user interface which emulates printed material |
EP0626635B1 (en) | 1993-05-24 | 2003-03-05 | Sun Microsystems, Inc. | Improved graphical user interface with method for interfacing to remote devices |
US5598523A (en) | 1994-03-31 | 1997-01-28 | Panasonic Technologies, Inc. | Method and system for displayed menu activation using a matching distinctive arrangement of keypad actuators |
US5914720A (en) | 1994-04-21 | 1999-06-22 | Sandia Corporation | Method of using multiple perceptual channels to increase user absorption of an N-dimensional presentation environment |
US5495566A (en) | 1994-11-22 | 1996-02-27 | Microsoft Corporation | Scrolling contents of a window |
US5623613A (en) | 1994-11-29 | 1997-04-22 | Microsoft Corporation | System for displaying programming information |
US5611060A (en) | 1995-02-22 | 1997-03-11 | Microsoft Corporation | Auto-scrolling during a drag and drop operation |
US5819284A (en) | 1995-03-24 | 1998-10-06 | At&T Corp. | Personalized real time information display as a portion of a screen saver |
US5793415A (en) | 1995-05-15 | 1998-08-11 | Imagetel International Inc. | Videoconferencing and multimedia system |
US6807558B1 (en) | 1995-06-12 | 2004-10-19 | Pointcast, Inc. | Utilization of information “push” technology |
US5860073A (en) | 1995-07-17 | 1999-01-12 | Microsoft Corporation | Style sheets for publishing system |
US5687331A (en) | 1995-08-03 | 1997-11-11 | Microsoft Corporation | Method and system for displaying an animated focus item |
US5712995A (en) | 1995-09-20 | 1998-01-27 | Galileo Frames, Inc. | Non-overlapping tiling apparatus and method for multiple window displays |
US5574836A (en) | 1996-01-22 | 1996-11-12 | Broemmelsiek; Raymond M. | Interactive display apparatus and method with viewer position compensation |
US6008816A (en) | 1996-04-25 | 1999-12-28 | Microsoft Corporation | Method and system for managing color specification using attachable palettes and palettes that refer to other palettes |
US5675329A (en) | 1996-05-09 | 1997-10-07 | International Business Machines Corporation | Method of obtaining a second function from keys on a keyboard using pressure differentiation |
US5771042A (en) | 1996-07-17 | 1998-06-23 | International Business Machines Corporation | Multi-size control for multiple adjacent workspaces |
US5963204A (en) | 1996-09-20 | 1999-10-05 | Nikon Corporation | Electronic camera with reproduction and display of images at the same timing |
US6064383A (en) | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
US6057839A (en) | 1996-11-26 | 2000-05-02 | International Business Machines Corporation | Visualization tool for graphically displaying trace data produced by a parallel processing computer |
US5905492A (en) | 1996-12-06 | 1999-05-18 | Microsoft Corporation | Dynamically updating themes for an operating system shell |
US5959621A (en) | 1996-12-06 | 1999-09-28 | Microsoft Corporation | System and method for displaying data items in a ticker display pane on a client computer |
US6216141B1 (en) | 1996-12-06 | 2001-04-10 | Microsoft Corporation | System and method for integrating a document into a desktop window on a client computer |
US6211921B1 (en) | 1996-12-20 | 2001-04-03 | Philips Electronics North America Corporation | User interface for television |
US6009519A (en) | 1997-04-04 | 1999-12-28 | Andrea Electronics, Corp. | Method and apparatus for providing audio utility software for use in windows applications |
US6028600A (en) | 1997-06-02 | 2000-02-22 | Sony Corporation | Rotary menu wheel interface |
US6166736A (en) | 1997-08-22 | 2000-12-26 | Natrificial Llc | Method and apparatus for simultaneously resizing and relocating windows within a graphical display |
KR100300972B1 (en) | 1997-09-19 | 2001-09-03 | 윤종용 | Texture mapping system and texture cache access method |
US6008809A (en) | 1997-09-22 | 1999-12-28 | International Business Machines Corporation | Apparatus and method for viewing multiple windows within a dynamic window |
US6470386B1 (en) | 1997-09-26 | 2002-10-22 | Worldcom, Inc. | Integrated proxy interface for web based telecommunications management tools |
US6266098B1 (en) | 1997-10-22 | 2001-07-24 | Matsushita Electric Corporation Of America | Function presentation and selection using a rotatable function menu |
US5940076A (en) | 1997-12-01 | 1999-08-17 | Motorola, Inc. | Graphical user interface for an electronic device and method therefor |
US6311058B1 (en) | 1998-06-30 | 2001-10-30 | Microsoft Corporation | System for delivering data content over a low bit rate transmission channel |
US6449638B1 (en) | 1998-01-07 | 2002-09-10 | Microsoft Corporation | Channel definition architecture extension |
US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US20070177804A1 (en) * | 2006-01-30 | 2007-08-02 | Apple Computer, Inc. | Multi-touch gesture dictionary |
US6011542A (en) | 1998-02-13 | 2000-01-04 | Sony Corporation | Graphical text entry wheel |
US6278448B1 (en) | 1998-02-17 | 2001-08-21 | Microsoft Corporation | Composite Web page built from any web content |
EP1062613A1 (en) | 1998-03-13 | 2000-12-27 | Aspen Technology, Inc. | Computer method and apparatus for automatic execution of software applications |
US6108003A (en) | 1998-03-18 | 2000-08-22 | International Business Machines Corporation | Maintaining visibility and status indication of docked applications and application bars |
FR2776415A1 (en) | 1998-03-20 | 1999-09-24 | Philips Consumer Communication | ELECTRONIC APPARATUS HAVING A SCREEN AND METHOD FOR DISPLAYING GRAPHICS |
US6784925B1 (en) | 1998-03-24 | 2004-08-31 | Canon Kabushiki Kaisha | System to manage digital camera images |
US6448987B1 (en) | 1998-04-03 | 2002-09-10 | Intertainer, Inc. | Graphic user interface for a digital content delivery system using circular menus |
US6104418A (en) | 1998-04-06 | 2000-08-15 | Silicon Magic Corporation | Method and system for improved memory interface during image rendering |
JPH11298572A (en) | 1998-04-07 | 1999-10-29 | Nec Shizuoka Ltd | Receiver and method for displaying received information |
GB0027260D0 (en) | 2000-11-08 | 2000-12-27 | Koninl Philips Electronics Nv | An image control system |
US6212564B1 (en) | 1998-07-01 | 2001-04-03 | International Business Machines Corporation | Distributed application launcher for optimizing desktops based on client characteristics information |
US6611272B1 (en) | 1998-07-02 | 2003-08-26 | Microsoft Corporation | Method and apparatus for rasterizing in a hierarchical tile order |
AR020608A1 (en) | 1998-07-17 | 2002-05-22 | United Video Properties Inc | A METHOD AND A PROVISION TO SUPPLY A USER REMOTE ACCESS TO AN INTERACTIVE PROGRAMMING GUIDE BY A REMOTE ACCESS LINK |
US6369837B1 (en) | 1998-07-17 | 2002-04-09 | International Business Machines Corporation | GUI selector control |
US6832355B1 (en) | 1998-07-28 | 2004-12-14 | Microsoft Corporation | Web page display system |
US6188405B1 (en) | 1998-09-14 | 2001-02-13 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory, to objects |
US20020018051A1 (en) | 1998-09-15 | 2002-02-14 | Mona Singh | Apparatus and method for moving objects on a touchscreen display |
US6510553B1 (en) | 1998-10-26 | 2003-01-21 | Intel Corporation | Method of streaming video from multiple sources over a network |
JP3956553B2 (en) | 1998-11-04 | 2007-08-08 | 富士ゼロックス株式会社 | Icon display processing device |
US6597374B1 (en) | 1998-11-12 | 2003-07-22 | Microsoft Corporation | Activity based remote control unit |
US6337698B1 (en) | 1998-11-20 | 2002-01-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US6510466B1 (en) | 1998-12-14 | 2003-01-21 | International Business Machines Corporation | Methods, systems and computer program products for centralized management of application programs on a network |
US6577350B1 (en) | 1998-12-21 | 2003-06-10 | Sony Corporation | Method and apparatus for displaying an electronic program guide |
US6396963B2 (en) | 1998-12-29 | 2002-05-28 | Eastman Kodak Company | Photocollage generation and modification |
US7469381B2 (en) | 2007-01-07 | 2008-12-23 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US6628309B1 (en) | 1999-02-05 | 2003-09-30 | International Business Machines Corporation | Workspace drag and drop |
US6707890B1 (en) | 2002-09-03 | 2004-03-16 | Bell South Intellectual Property Corporation | Voice mail notification using instant messaging |
US7283620B2 (en) | 1999-02-26 | 2007-10-16 | At&T Bls Intellectual Property, Inc. | Systems and methods for originating and sending a voice mail message to an instant messaging platform |
US6463304B2 (en) | 1999-03-04 | 2002-10-08 | Openwave Systems Inc. | Application launcher for a two-way mobile communications device |
US6281940B1 (en) | 1999-03-31 | 2001-08-28 | Sony Corporation | Display of previewed channels with rotation of multiple previewed channels along an arc |
EP1052565A3 (en) | 1999-05-13 | 2005-05-11 | Sony Corporation | Information processing method and apparatus |
US6505243B1 (en) | 1999-06-02 | 2003-01-07 | Intel Corporation | Automatic web-based detection and display of product installation help information |
US6456334B1 (en) | 1999-06-29 | 2002-09-24 | Ati International Srl | Method and apparatus for displaying video in a data processing system |
US6426753B1 (en) | 1999-07-01 | 2002-07-30 | Microsoft Corporation | Cache memory for high latency and out-of-order return of texture data |
US6577323B1 (en) | 1999-07-01 | 2003-06-10 | Honeywell Inc. | Multivariable process trend display and methods regarding same |
US6971067B1 (en) | 1999-08-23 | 2005-11-29 | Sentillion, Inc. | Application launchpad |
US6976210B1 (en) | 1999-08-31 | 2005-12-13 | Lucent Technologies Inc. | Method and apparatus for web-site-independent personalization from multiple sites having user-determined extraction functionality |
US6424338B1 (en) | 1999-09-30 | 2002-07-23 | Gateway, Inc. | Speed zone touchpad |
ATE365944T1 (en) | 1999-10-26 | 2007-07-15 | Iontas Ltd | MONITORING COMPUTER USE |
US7028264B2 (en) | 1999-10-29 | 2006-04-11 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US7987431B2 (en) | 1999-10-29 | 2011-07-26 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US6724403B1 (en) | 1999-10-29 | 2004-04-20 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US6697825B1 (en) | 1999-11-05 | 2004-02-24 | Decentrix Inc. | Method and apparatus for generating and modifying multiple instances of element of a web site |
US6510144B1 (en) | 1999-12-07 | 2003-01-21 | Cisco Technology, Inc. | Network layer support to enhance the transport layer performance in mobile and wireless environments |
US6820111B1 (en) | 1999-12-07 | 2004-11-16 | Microsoft Corporation | Computer user interface architecture that saves a user's non-linear navigation history and intelligently maintains that history |
US6801203B1 (en) | 1999-12-22 | 2004-10-05 | Microsoft Corporation | Efficient graphics pipeline with a pixel cache and data pre-fetching |
US6433789B1 (en) | 2000-02-18 | 2002-08-13 | Neomagic Corp. | Steaming prefetching texture cache for level of detail maps in a 3D-graphics engine |
JP3720230B2 (en) | 2000-02-18 | 2005-11-24 | シャープ株式会社 | Expression data control system, expression data control apparatus constituting the same, and recording medium on which the program is recorded |
KR100460105B1 (en) | 2000-02-22 | 2004-12-03 | 엘지전자 주식회사 | Method for searching a menu in a mobile communication terminal |
US20020152305A1 (en) | 2000-03-03 | 2002-10-17 | Jackson Gregory J. | Systems and methods for resource utilization analysis in information management environments |
US20030046396A1 (en) | 2000-03-03 | 2003-03-06 | Richter Roger K. | Systems and methods for managing resource utilization in information management environments |
US6721958B1 (en) | 2000-03-08 | 2004-04-13 | Opentv, Inc. | Optional verification of interactive television content |
US8701027B2 (en) | 2000-03-16 | 2014-04-15 | Microsoft Corporation | Scope user interface for displaying the priorities and properties of multiple informational items |
US6507643B1 (en) | 2000-03-16 | 2003-01-14 | Breveon Incorporated | Speech recognition system and method for converting voice mail messages to electronic mail messages |
US6636246B1 (en) | 2000-03-17 | 2003-10-21 | Vizible.Com Inc. | Three dimensional spatial user interface |
GB2360658B (en) | 2000-03-20 | 2004-09-08 | Hewlett Packard Co | Camera with user identity data |
US7155729B1 (en) | 2000-03-28 | 2006-12-26 | Microsoft Corporation | Method and system for displaying transient notifications |
US7249326B2 (en) | 2000-04-06 | 2007-07-24 | Microsoft Corporation | Method and system for reducing notification area clutter |
JP4325075B2 (en) | 2000-04-21 | 2009-09-02 | ソニー株式会社 | Data object management device |
KR100363619B1 (en) | 2000-04-21 | 2002-12-05 | 배동훈 | Contents structure with a spiral donut and contents display system |
JP4730571B2 (en) | 2000-05-01 | 2011-07-20 | ソニー株式会社 | Information processing apparatus and method, and program storage medium |
US20020133554A1 (en) | 2000-05-25 | 2002-09-19 | Daniel Checkoway | E-mail answering agent |
US7210099B2 (en) | 2000-06-12 | 2007-04-24 | Softview Llc | Resolution independent vector display of internet content |
JP2003536177A (en) | 2000-06-22 | 2003-12-02 | インテル コーポレイション | Method and system for transferring objects between users or applications |
JP2002014661A (en) | 2000-06-29 | 2002-01-18 | Toshiba Corp | Liquid crystal display device and electronic equipment provided therewith |
US6966034B2 (en) | 2000-06-30 | 2005-11-15 | Microsoft Corporation | Supplemental request header for applications or devices using web browsers |
US6662023B1 (en) | 2000-07-06 | 2003-12-09 | Nokia Mobile Phones Ltd. | Method and apparatus for controlling and securing mobile phones that are lost, stolen or misused |
US6907273B1 (en) | 2000-07-07 | 2005-06-14 | Openwave Systems Inc. | Method and system for processing overloaded keys of a mobile device |
GB0017793D0 (en) * | 2000-07-21 | 2000-09-06 | Secr Defence | Human computer interface |
US6707449B2 (en) | 2000-08-30 | 2004-03-16 | Microsoft Corporation | Manual controlled scrolling |
US7043690B1 (en) | 2000-09-11 | 2006-05-09 | International Business Machines Corporation | Method, system, and program for checking contact information |
SE524595C2 (en) | 2000-09-26 | 2004-08-31 | Hapax Information Systems Ab | Procedure and computer program for normalization of style throws |
US7263668B1 (en) | 2000-11-09 | 2007-08-28 | International Business Machines Corporation | Display interface to a computer controlled display system with variable comprehensiveness levels of menu items dependent upon size of variable display screen available for menu item display |
WO2002041190A2 (en) | 2000-11-15 | 2002-05-23 | Holbrook David M | Apparatus and method for organizing and/or presenting data |
US6907574B2 (en) | 2000-11-29 | 2005-06-14 | Ictv, Inc. | System and method of hyperlink navigation between frames |
US7058955B2 (en) | 2000-12-06 | 2006-06-06 | Microsoft Corporation | Method and system for passing messages between threads |
CA2328795A1 (en) | 2000-12-19 | 2002-06-19 | Advanced Numerical Methods Ltd. | Applications and performance enhancements for detail-in-context viewing technology |
US6983310B2 (en) | 2000-12-29 | 2006-01-03 | International Business Machines Corporation | System and method for providing search capabilties on a wireless device |
US7133859B1 (en) | 2001-01-05 | 2006-11-07 | Palm, Inc. | Category specific sort and display instructions for an electronic device |
US20020097264A1 (en) | 2001-01-19 | 2002-07-25 | Ibm Corporation | Apparatus and methods for management of temporal parameters to provide enhanced accessibility to computer programs |
US7069207B2 (en) | 2001-01-26 | 2006-06-27 | Microsoft Corporation | Linguistically intelligent text compression |
US6938101B2 (en) | 2001-01-29 | 2005-08-30 | Universal Electronics Inc. | Hand held device having a browser application |
SE519884C2 (en) | 2001-02-02 | 2003-04-22 | Scalado Ab | Method for zooming and producing a zoomable image |
US7735021B2 (en) | 2001-02-16 | 2010-06-08 | Microsoft Corporation | Shortcut system for use in a mobile electronic device and method thereof |
US6798421B2 (en) | 2001-02-28 | 2004-09-28 | 3D Labs, Inc. Ltd. | Same tile method |
US20020129061A1 (en) | 2001-03-07 | 2002-09-12 | Swart Stacey J. | Method and apparatus for creating files that are suitable for hardcopy printing and for on-line use |
CA2375844C (en) | 2001-03-09 | 2008-12-30 | Research In Motion Limited | Advanced voice and data operations in a mobile data communication device |
US7017119B1 (en) | 2001-03-15 | 2006-03-21 | Vaultus Mobile Technologies, Inc. | System and method for display notification in a tabbed window setting |
US6972776B2 (en) | 2001-03-20 | 2005-12-06 | Agilent Technologies, Inc. | Scrolling method using screen pointing device |
US6904597B2 (en) | 2001-03-30 | 2005-06-07 | Intel Corporation | Inter-thread communications between different components using double buffer |
US7734285B2 (en) | 2001-04-03 | 2010-06-08 | Qualcomm Incorporated | Method and apparatus for network initiated uninstallation of application program over wireless network |
US6778192B2 (en) | 2001-04-05 | 2004-08-17 | International Business Machines Corporation | System and method for creating markers on scroll bars of a graphical user interface |
US6990638B2 (en) | 2001-04-19 | 2006-01-24 | International Business Machines Corporation | System and method for using shading layers and highlighting to navigate a tree view display |
US20020161634A1 (en) | 2001-04-27 | 2002-10-31 | Koninklijke Philips Electronics N.V. | Electronic document with an automatically updated portion |
US7013431B2 (en) | 2001-04-30 | 2006-03-14 | Broadband Graphics, Llc | Cell based EUI methods and apparatus |
US6907447B1 (en) | 2001-04-30 | 2005-06-14 | Microsoft Corporation | Method and apparatus for providing an instant message notification |
US20020186251A1 (en) | 2001-06-07 | 2002-12-12 | International Business Machines Corporation | Method, apparatus and computer program product for context-sensitive scrolling |
EP1271896B1 (en) | 2001-06-18 | 2004-07-28 | Swisscom Mobile AG | Method and system for mobile IP Nodes in heterogeneous networks |
JP2003009244A (en) | 2001-06-25 | 2003-01-10 | Fuji Photo Film Co Ltd | Image data transmitter and controlling method thereof |
US6975836B2 (en) | 2001-06-28 | 2005-12-13 | Kabushiki Kaisha Toshiba | Data broadcasting system, receiving terminal device, contents providing server, and contents providing method |
KR100420280B1 (en) | 2001-07-09 | 2004-03-02 | 삼성전자주식회사 | Menu display method of mobile terminal |
US6876312B2 (en) | 2001-07-10 | 2005-04-05 | Behavior Tech Computer Corporation | Keyboard with multi-function keys |
US6987991B2 (en) | 2001-08-17 | 2006-01-17 | Wildseed Ltd. | Emoticon input method and apparatus |
FR2828970B1 (en) | 2001-08-27 | 2003-12-19 | Cit Alcatel | INTEROPERABILITY SYSTEM BETWEEN MMS MESSAGES AND SMS / EMS MESSAGES AND RELATED EXCHANGE METHOD |
US20030096604A1 (en) | 2001-08-29 | 2003-05-22 | Jorg Vollandt | Method of operating an electronic device, in particular a mobile telephone |
US6690365B2 (en) | 2001-08-29 | 2004-02-10 | Microsoft Corporation | Automatic scrolling |
US7093201B2 (en) | 2001-09-06 | 2006-08-15 | Danger, Inc. | Loop menu navigation apparatus and method |
US6912695B2 (en) | 2001-09-13 | 2005-06-28 | Pixia Corp. | Data storage and retrieval system and method |
US7036090B1 (en) | 2001-09-24 | 2006-04-25 | Digeo, Inc. | Concentric polygonal menus for a graphical user interface |
US20030073414A1 (en) | 2001-10-15 | 2003-04-17 | Stephen P. Capps | Textual and telephony dual input device |
US6857104B1 (en) | 2001-10-17 | 2005-02-15 | At&T Corp | Organizing graphical user interfaces to reveal hidden areas |
US7487262B2 (en) | 2001-11-16 | 2009-02-03 | At & T Mobility Ii, Llc | Methods and systems for routing messages through a communications network based on message content |
JP2003162355A (en) | 2001-11-26 | 2003-06-06 | Sony Corp | Display switching method of task, portable equipment, and portable communication equipment |
WO2003048960A1 (en) | 2001-11-30 | 2003-06-12 | A New Voice, Inc. | Method and system for contextual prioritization of unified messages |
US20030135582A1 (en) | 2001-12-21 | 2003-07-17 | Docomo Communications Laboratories Usa, Inc. | Context aware search service |
US6690387B2 (en) | 2001-12-28 | 2004-02-10 | Koninklijke Philips Electronics N.V. | Touch-screen image scrolling system and method |
US7139800B2 (en) | 2002-01-16 | 2006-11-21 | Xerox Corporation | User interface for a message-based system having embedded information management capabilities |
FI116425B (en) | 2002-01-18 | 2005-11-15 | Nokia Corp | Method and apparatus for integrating an extensive keyboard into a small apparatus |
EP1469374A4 (en) | 2002-01-22 | 2009-11-11 | Fujitsu Ltd | Menu element selecting device and method |
WO2003062976A1 (en) | 2002-01-22 | 2003-07-31 | Fujitsu Limited | Menu element selecting device and method |
US7019757B2 (en) | 2002-01-28 | 2006-03-28 | International Business Machines Corporation | Changing the alpha levels of an application window to indicate a status of a computing task |
US7146573B2 (en) | 2002-01-28 | 2006-12-05 | International Business Machines Corporation | Automatic window representation adjustment |
US20040078299A1 (en) | 2002-01-31 | 2004-04-22 | Kathleen Down-Logan | Portable color and style analysis, match and management system |
US7333092B2 (en) | 2002-02-25 | 2008-02-19 | Apple Computer, Inc. | Touch pad for handheld device |
US7031977B2 (en) | 2002-02-28 | 2006-04-18 | Plumtree Software, Inc. | Efficiently storing indented threads in a threaded discussion application |
US6952207B1 (en) | 2002-03-11 | 2005-10-04 | Microsoft Corporation | Efficient scenery object rendering |
US7610563B2 (en) | 2002-03-22 | 2009-10-27 | Fuji Xerox Co., Ltd. | System and method for controlling the display of non-uniform graphical objects |
US7127685B2 (en) | 2002-04-30 | 2006-10-24 | America Online, Inc. | Instant messaging interface having a tear-off element |
US7779076B2 (en) | 2002-05-31 | 2010-08-17 | Aol Inc. | Instant messaging personalization |
US7689649B2 (en) | 2002-05-31 | 2010-03-30 | Aol Inc. | Rendering destination instant messaging personalization items before communicating with destination |
US20080048986A1 (en) | 2002-06-10 | 2008-02-28 | Khoo Soon H | Compound Computing Device with Dual Portion Keyboards Controlled by a Single Processing Element |
AU2002311525A1 (en) | 2002-06-21 | 2004-01-06 | Nokia Corporation | Mobile communication device having music player navigation function and method of operation thereof |
US6873329B2 (en) | 2002-07-05 | 2005-03-29 | Spatial Data Technologies, Inc. | System and method for caching and rendering images |
US7302648B1 (en) | 2002-07-10 | 2007-11-27 | Apple Inc. | Method and apparatus for resizing buffered windows |
US7216588B2 (en) | 2002-07-12 | 2007-05-15 | Dana Suess | Modified-qwerty letter layout for rapid data entry |
US7658562B2 (en) | 2002-07-12 | 2010-02-09 | Dana Suess | Modified-QWERTY letter layout for rapid data entry |
US7111044B2 (en) | 2002-07-17 | 2006-09-19 | Fastmobile, Inc. | Method and system for displaying group chat sessions on wireless mobile terminals |
US7089507B2 (en) | 2002-08-12 | 2006-08-08 | International Business Machines Corporation | System and method for display views using a single stroke control |
US7065385B2 (en) | 2002-09-12 | 2006-06-20 | Sony Ericsson Mobile Communications Ab | Apparatus, methods, and computer program products for dialing telephone numbers using alphabetic selections |
US20040068543A1 (en) | 2002-10-03 | 2004-04-08 | Ralph Seifert | Method and apparatus for processing e-mail |
US7913183B2 (en) | 2002-10-08 | 2011-03-22 | Microsoft Corporation | System and method for managing software applications in a graphical user interface |
JP2004133733A (en) | 2002-10-11 | 2004-04-30 | Sony Corp | Display device, display method, and program |
KR200303655Y1 (en) | 2002-11-19 | 2003-02-14 | 강성윤 | Folder-type Mobile phone which is convenient for character message transmission |
CA2414378A1 (en) | 2002-12-09 | 2004-06-09 | Corel Corporation | System and method for controlling user interface features of a web application |
US7600234B2 (en) | 2002-12-10 | 2009-10-06 | Fisher-Rosemount Systems, Inc. | Method for launching applications |
AU2002953555A0 (en) | 2002-12-23 | 2003-01-16 | Canon Kabushiki Kaisha | Method for presenting hierarchical data |
US7321824B1 (en) | 2002-12-30 | 2008-01-22 | Aol Llc | Presenting a travel route using more than one presentation style |
JP2004227393A (en) | 2003-01-24 | 2004-08-12 | Sony Corp | Icon drawing system, icon drawing method and electronic device |
US7158123B2 (en) | 2003-01-31 | 2007-01-02 | Xerox Corporation | Secondary touch contextual sub-menu navigation for touch screen interface |
US6885974B2 (en) | 2003-01-31 | 2005-04-26 | Microsoft Corporation | Dynamic power control apparatus, systems and methods |
US7606714B2 (en) | 2003-02-11 | 2009-10-20 | Microsoft Corporation | Natural language classification within an automated response system |
US20040185883A1 (en) | 2003-03-04 | 2004-09-23 | Jason Rukman | System and method for threading short message service (SMS) messages with multimedia messaging service (MMS) messages |
US7075535B2 (en) | 2003-03-05 | 2006-07-11 | Sand Codex | System and method for exact rendering in a zooming user interface |
US7313764B1 (en) | 2003-03-06 | 2007-12-25 | Apple Inc. | Method and apparatus to accelerate scrolling for buffered windows |
US7480872B1 (en) | 2003-04-06 | 2009-01-20 | Apple Inc. | Method and apparatus for dynamically resizing windows |
US6865297B2 (en) | 2003-04-15 | 2005-03-08 | Eastman Kodak Company | Method for automatically classifying images into events in a multimedia authoring application |
GB2411551B (en) | 2003-04-22 | 2006-05-03 | Spinvox Ltd | A method of providing voicemails to a wireless information device |
US7102626B2 (en) | 2003-04-25 | 2006-09-05 | Hewlett-Packard Development Company, L.P. | Multi-function pointing device |
US7388579B2 (en) | 2003-05-01 | 2008-06-17 | Motorola, Inc. | Reduced power consumption for a graphics accelerator and display |
US8555165B2 (en) | 2003-05-08 | 2013-10-08 | Hillcrest Laboratories, Inc. | Methods and systems for generating a zoomable graphical user interface |
US7173623B2 (en) | 2003-05-09 | 2007-02-06 | Microsoft Corporation | System supporting animation of graphical display elements through animation object instances |
JP4177713B2 (en) | 2003-05-30 | 2008-11-05 | 京セラ株式会社 | Imaging device |
JP2005004396A (en) | 2003-06-11 | 2005-01-06 | Sony Corp | Information display method, information display unit, and computer program |
GB2404630B (en) | 2003-08-07 | 2006-09-27 | Research In Motion Ltd | Cover plate for a mobile device having a push-through dial keypad |
US7669140B2 (en) | 2003-08-21 | 2010-02-23 | Microsoft Corporation | System and method for providing rich minimized applications |
US7308288B2 (en) | 2003-08-22 | 2007-12-11 | Sbc Knowledge Ventures, Lp. | System and method for prioritized interface design |
US7725419B2 (en) | 2003-09-05 | 2010-05-25 | Samsung Electronics Co., Ltd | Proactive user interface including emotional agent |
KR100566122B1 (en) | 2003-09-15 | 2006-03-30 | (주) 멀티비아 | Method of compressing still pictures for mobile devices |
US7433920B2 (en) | 2003-10-10 | 2008-10-07 | Microsoft Corporation | Contact sidebar tile |
US7231231B2 (en) | 2003-10-14 | 2007-06-12 | Nokia Corporation | Method and apparatus for locking a mobile telephone touch screen |
US7224963B2 (en) | 2003-10-17 | 2007-05-29 | Sony Ericsson Mobile Communications Ab | System method and computer program product for managing themes in a mobile phone |
US20050085215A1 (en) | 2003-10-21 | 2005-04-21 | Nokia Corporation | Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state |
US20050090239A1 (en) | 2003-10-22 | 2005-04-28 | Chang-Hung Lee | Text message based mobile phone configuration system |
US7644376B2 (en) | 2003-10-23 | 2010-01-05 | Microsoft Corporation | Flexible architecture for notifying applications of state changes |
US7461151B2 (en) | 2003-11-13 | 2008-12-02 | International Business Machines Corporation | System and method enabling future messaging directives based on past participation via a history monitor |
US7370284B2 (en) | 2003-11-18 | 2008-05-06 | Laszlo Systems, Inc. | User interface for displaying multiple applications |
US7814419B2 (en) | 2003-11-26 | 2010-10-12 | Nokia Corporation | Changing an orientation of a user interface via a course of motion |
US7454713B2 (en) | 2003-12-01 | 2008-11-18 | Sony Ericsson Mobile Communications Ab | Apparatus, methods and computer program products providing menu expansion and organization functions |
WO2005055034A1 (en) | 2003-12-01 | 2005-06-16 | Research In Motion Limited | Previewing a new event on a small screen device |
EP1538536A1 (en) | 2003-12-05 | 2005-06-08 | Sony International (Europe) GmbH | Visualization and control techniques for multimedia digital content |
US7103388B2 (en) | 2003-12-16 | 2006-09-05 | Research In Motion Limited | Expedited communication graphical user interface system and method |
EP1557837A1 (en) | 2004-01-26 | 2005-07-27 | Sony International (Europe) GmbH | Redundancy elimination in a content-adaptive video preview system |
US20050164688A1 (en) | 2004-01-27 | 2005-07-28 | Kyocera Corporation | Mobile terminal, method for controlling mobile telephone terminal, and mobile telephone terminal |
US20050198584A1 (en) | 2004-01-27 | 2005-09-08 | Matthews David A. | System and method for controlling manipulation of tiles within a sidebar |
US7403191B2 (en) | 2004-01-28 | 2008-07-22 | Microsoft Corporation | Tactile overlay for an imaging display |
US7296184B2 (en) | 2004-01-28 | 2007-11-13 | Microsoft Corporation | Method and system for masking dynamic regions in a user interface to enable testing of user interface consistency |
US8001120B2 (en) | 2004-02-12 | 2011-08-16 | Microsoft Corporation | Recent contacts and items |
US20050183021A1 (en) | 2004-02-13 | 2005-08-18 | Allen Joel E. | Method for electronically packaging a user's personal computing environment on a computer or device, and mobilizing it for transfer over a network |
JP4071726B2 (en) | 2004-02-25 | 2008-04-02 | シャープ株式会社 | Portable information device, character display method in portable information device, and program for realizing the method |
US20050198159A1 (en) | 2004-03-08 | 2005-09-08 | Kirsch Steven T. | Method and system for categorizing and processing e-mails based upon information in the message header and SMTP session |
WO2005089286A2 (en) | 2004-03-15 | 2005-09-29 | America Online, Inc. | Sharing social network information |
US7599790B2 (en) | 2004-03-23 | 2009-10-06 | Google Inc. | Generating and serving tiles in a digital mapping system |
GB0406451D0 (en) | 2004-03-23 | 2004-04-28 | Patel Sanjay | Keyboards |
FI20040446A (en) | 2004-03-24 | 2005-09-25 | Nokia Corp | Procedure for administering application hardware, electronic device and computer software product |
US7289806B2 (en) | 2004-03-30 | 2007-10-30 | Intel Corporation | Method and apparatus for context enabled search |
US7912904B2 (en) | 2004-03-31 | 2011-03-22 | Google Inc. | Email system with conversation-centric user interface |
US8027276B2 (en) | 2004-04-14 | 2011-09-27 | Siemens Enterprise Communications, Inc. | Mixed mode conferencing |
US8448083B1 (en) | 2004-04-16 | 2013-05-21 | Apple Inc. | Gesture control of multimedia editing applications |
EP1589444A3 (en) | 2004-04-21 | 2008-03-12 | Samsung Electronics Co., Ltd. | Method, medium, and apparatus for detecting situation change of digital photos and method, medium, and apparatus for situation-based photo clustering in digital photo album |
WO2005109644A1 (en) | 2004-04-27 | 2005-11-17 | Wildseed Ltd. | Reduced keypad for predictive input |
US8707209B2 (en) | 2004-04-29 | 2014-04-22 | Microsoft Corporation | Save preview representation of files being created |
EP1596613A1 (en) | 2004-05-10 | 2005-11-16 | Dialog Semiconductor GmbH | Data and voice transmission within the same mobile phone call |
US7386807B2 (en) | 2004-05-17 | 2008-06-10 | Microsoft Corporation | System and method for monitoring application response and providing visual treatment |
US7353466B2 (en) | 2004-05-28 | 2008-04-01 | Microsoft Corporation | System and method for generating message notification objects on dynamically scaled timeline |
AU2005253600B2 (en) | 2004-06-04 | 2011-01-27 | Benjamin Firooz Ghassabian | Systems to enhance data entry in mobile and fixed environment |
US7434058B2 (en) | 2004-06-07 | 2008-10-07 | Reconnex Corporation | Generating signatures over a document |
US7469380B2 (en) | 2004-06-15 | 2008-12-23 | Microsoft Corporation | Dynamic document and template previews |
US7761800B2 (en) | 2004-06-25 | 2010-07-20 | Apple Inc. | Unified interest layer for user interface |
US7464110B2 (en) | 2004-06-30 | 2008-12-09 | Nokia Corporation | Automated grouping of image and other user data |
US7388578B2 (en) | 2004-07-01 | 2008-06-17 | Nokia Corporation | Touch display PDA phone with slide keypad |
US7669135B2 (en) | 2004-07-15 | 2010-02-23 | At&T Mobility Ii Llc | Using emoticons, such as for wireless devices |
US20060015726A1 (en) | 2004-07-19 | 2006-01-19 | Callas Jonathan D | Apparatus for partial authentication of messages |
JP2006042171A (en) | 2004-07-29 | 2006-02-09 | Olympus Corp | Camera, reproducing apparatus and album registration method |
US7958115B2 (en) | 2004-07-29 | 2011-06-07 | Yahoo! Inc. | Search systems and methods using in-line contextual queries |
US7653883B2 (en) | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
US7178111B2 (en) | 2004-08-03 | 2007-02-13 | Microsoft Corporation | Multi-planar three-dimensional user interface |
US7181373B2 (en) | 2004-08-13 | 2007-02-20 | Agilent Technologies, Inc. | System and methods for navigating and visualizing multi-dimensional biological data |
US7559053B2 (en) | 2004-08-24 | 2009-07-07 | Microsoft Corporation | Program and system performance data correlation |
KR20060019198 (en) | 2004-08-27 | 2006-03-03 | Seo Dong-hwi | Method and device for transmitting and receiving graphic emoticons, and method for mapping graphic emoticons |
US7434173B2 (en) | 2004-08-30 | 2008-10-07 | Microsoft Corporation | Scrolling web pages using direct interaction |
US7619615B1 (en) | 2004-08-31 | 2009-11-17 | Sun Microsystems, Inc. | Method and apparatus for soft keys of an electronic device |
KR100854333B1 (en) | 2004-09-02 | 2008-09-02 | RealNetworks Asia Pacific Co., Ltd. | Method for processing call establishment by using character string |
US8473848B2 (en) | 2004-09-15 | 2013-06-25 | Research In Motion Limited | Palette-based color selection within a user interface theme |
US20070061488A1 (en) | 2004-09-20 | 2007-03-15 | Trilibis Inc. | System and method for flexible user interfaces |
US8510657B2 (en) | 2004-09-30 | 2013-08-13 | Microsoft Corporation | Editing the text of an arbitrary graphic via a hierarchical list |
US20060074735A1 (en) | 2004-10-01 | 2006-04-06 | Microsoft Corporation | Ink-enabled workflow authoring |
US20060075360A1 (en) | 2004-10-04 | 2006-04-06 | Edwards Systems Technology, Inc. | Dynamic highlight prompting apparatus and method |
KR100738069B1 (en) | 2004-10-04 | 2007-07-10 | Samsung Electronics Co., Ltd. | Method and apparatus for category-based photo clustering in digital photo album |
US7512966B2 (en) | 2004-10-14 | 2009-03-31 | International Business Machines Corporation | System and method for visually rendering resource policy usage information |
KR100597670B1 (en) | 2004-10-18 | 2006-07-07 | NeoMTel Co., Ltd. | Mobile communication terminal capable of reproducing and updating multimedia content, and method for reproducing the same |
US7657842B2 (en) | 2004-11-12 | 2010-02-02 | Microsoft Corporation | Sidebar tile free-arrangement |
US20060103623A1 (en) | 2004-11-15 | 2006-05-18 | Nokia Corporation | Method and apparatus to enter text in a phone dialer entry field |
KR100703690B1 (en) | 2004-11-19 | 2007-04-05 | Samsung Electronics Co., Ltd. | User interface and method for managing icon by grouping using skin image |
US7581034B2 (en) | 2004-11-23 | 2009-08-25 | Microsoft Corporation | Sending notifications to auxiliary displays |
EP1662760A1 (en) | 2004-11-30 | 2006-05-31 | Sony Ericsson Mobile Communications AB | Method for providing alerts in a mobile device and mobile device therefor |
KR100809585B1 (en) | 2004-12-21 | 2008-03-07 | Samsung Electronics Co., Ltd. | Device and method for processing schedule-related event in wireless terminal |
US7073908B1 (en) | 2005-01-11 | 2006-07-11 | Anthony Italo Provitola | Enhancement of depth perception |
US7478326B2 (en) | 2005-01-18 | 2009-01-13 | Microsoft Corporation | Window information switching system |
US7317907B2 (en) | 2005-01-31 | 2008-01-08 | Research In Motion Limited | Synchronizing server and device data using device data schema |
US7571189B2 (en) | 2005-02-02 | 2009-08-04 | Lightsurf Technologies, Inc. | Method and apparatus to implement themes for a handheld device |
US20060184901A1 (en) | 2005-02-15 | 2006-08-17 | Microsoft Corporation | Computer content navigation tools |
US8819569B2 (en) | 2005-02-18 | 2014-08-26 | Zumobi, Inc. | Single-handed approach for navigation of application tiles using panning and zooming |
US20060212806A1 (en) | 2005-03-18 | 2006-09-21 | Microsoft Corporation | Application of presentation styles to items on a web page |
US20060218234A1 (en) | 2005-03-24 | 2006-09-28 | Li Deng | Scheme of sending email to mobile devices |
US7725837B2 (en) | 2005-03-31 | 2010-05-25 | Microsoft Corporation | Digital image browser |
US20060223593A1 (en) | 2005-04-01 | 2006-10-05 | Ixi Mobile (R&D) Ltd. | Content delivery system and method for a mobile communication device |
US9141402B2 (en) | 2005-04-25 | 2015-09-22 | AOL Inc. | Providing a user interface |
US20060246955A1 (en) | 2005-05-02 | 2006-11-02 | Mikko Nirhamo | Mobile communication device and method therefor |
US7949542B2 (en) | 2005-05-05 | 2011-05-24 | Ionosoft, Inc. | System, method and computer program product for graphically illustrating entities and generating a text-based report therefrom |
US8769433B2 (en) | 2005-05-13 | 2014-07-01 | Entrust, Inc. | Method and apparatus for protecting communication of information through a graphical user interface |
US20070024646A1 (en) | 2005-05-23 | 2007-02-01 | Kalle Saarinen | Portable electronic apparatus and associated method |
US7797641B2 (en) | 2005-05-27 | 2010-09-14 | Nokia Corporation | Mobile communications terminal and method therefore |
US20060271520A1 (en) | 2005-05-27 | 2006-11-30 | Ragan Gene Z | Content-based implicit search query |
US7953448B2 (en) | 2006-05-31 | 2011-05-31 | Research In Motion Limited | Keyboard for mobile device |
US7685530B2 (en) | 2005-06-10 | 2010-03-23 | T-Mobile Usa, Inc. | Preferred contact group centric interface |
US7684791B2 (en) | 2005-06-13 | 2010-03-23 | Research In Motion Limited | Multiple keyboard context sensitivity for application usage |
KR100627799B1 (en) | 2005-06-15 | 2006-09-25 | SK Telecom Co., Ltd. | Method and mobile communication terminal for providing integrated management of short message service |
US7487467B1 (en) | 2005-06-23 | 2009-02-03 | Sun Microsystems, Inc. | Visual representation and other effects for application management on a device with a small screen |
US7720834B2 (en) | 2005-06-23 | 2010-05-18 | Microsoft Corporation | Application launching via indexed data |
US20060294396A1 (en) | 2005-06-24 | 2006-12-28 | Robert Witman | Multiplatform synchronized data access from mobile devices of dynamically aggregated content |
US7730142B2 (en) | 2005-07-01 | 2010-06-01 | 0733660 B.C. Ltd. | Electronic mail system with functionality to include both private and public messages in a communication |
US20070011610A1 (en) | 2005-07-11 | 2007-01-11 | Onskreen Inc. | Customized Mobile Device Interface System And Method |
US20070015532A1 (en) | 2005-07-15 | 2007-01-18 | Tom Deelman | Multi-function key for electronic devices |
US7577918B2 (en) | 2005-07-15 | 2009-08-18 | Microsoft Corporation | Visual expression of a state of an application window |
WO2007016704A2 (en) | 2005-08-02 | 2007-02-08 | Ipifini, Inc. | Input device having multifunctional keys |
US7925973B2 (en) | 2005-08-12 | 2011-04-12 | Brightcove, Inc. | Distribution of content |
CN100501647C (en) | 2005-08-12 | 2009-06-17 | Shenzhen Huawei Communication Technologies Co., Ltd. | Keypad of cell phone and use thereof |
KR100757867B1 (en) | 2005-08-30 | 2007-09-11 | Samsung Electronics Co., Ltd. | Apparatus and method of interface in multitasking system |
US8225231B2 (en) | 2005-08-30 | 2012-07-17 | Microsoft Corporation | Aggregation of PC settings |
KR100714700B1 (en) | 2005-09-06 | 2007-05-07 | Samsung Electronics Co., Ltd. | Mobile communication terminal and method for outputting a short message thereof |
US20070061714A1 (en) | 2005-09-09 | 2007-03-15 | Microsoft Corporation | Quick styles for formatting of documents |
US20070073718A1 (en) | 2005-09-14 | 2007-03-29 | Jorey Ramer | Mobile search service instant activation |
US7873356B2 (en) | 2005-09-16 | 2011-01-18 | Microsoft Corporation | Search interface for mobile devices |
US7933632B2 (en) | 2005-09-16 | 2011-04-26 | Microsoft Corporation | Tile space user interface for mobile devices |
US20070063995A1 (en) | 2005-09-22 | 2007-03-22 | Bailey Eric A | Graphical user interface for use with a multi-media system |
US8539374B2 (en) | 2005-09-23 | 2013-09-17 | Disney Enterprises, Inc. | Graphical user interface for electronic devices |
US8860748B2 (en) | 2005-10-03 | 2014-10-14 | Gary Lynn Campbell | Computerized, personal-color analysis system |
US8689147B2 (en) | 2005-10-07 | 2014-04-01 | Blackberry Limited | System and method for using navigational and other commands on a mobile communication device |
US20070083821A1 (en) | 2005-10-07 | 2007-04-12 | International Business Machines Corporation | Creating viewports from selected regions of windows |
US7869832B2 (en) | 2005-10-07 | 2011-01-11 | Research In Motion Limited | Device, system, and method for informing users of functions and characters associated with telephone keys |
US7280097B2 (en) | 2005-10-11 | 2007-10-09 | Zeetoo, Inc. | Human interface input acceleration system |
JP2007148927A (en) | 2005-11-29 | 2007-06-14 | Alps Electric Co Ltd | Input device and scrolling control method using the same |
US7412663B2 (en) | 2005-11-30 | 2008-08-12 | Microsoft Corporation | Dynamic reflective highlighting of a glass appearance window frame |
US8924889B2 (en) | 2005-12-02 | 2014-12-30 | Hillcrest Laboratories, Inc. | Scene transitions in a zoomable user interface using a zoomable markup language |
KR100785067B1 (en) | 2005-12-06 | 2007-12-12 | Samsung Electronics Co., Ltd. | Device and method for displaying screen image in wireless terminal |
US9069877B2 (en) | 2005-12-07 | 2015-06-30 | Ziilabs Inc., Ltd. | User interface with variable sized icons |
US7664067B2 (en) | 2005-12-15 | 2010-02-16 | Microsoft Corporation | Preserving socket connections over a wireless network |
CN100488177C (en) | 2005-12-22 | 2009-05-13 | Huawei Technologies Co., Ltd. | Method and device for realizing packet transmission message service |
US7480870B2 (en) | 2005-12-23 | 2009-01-20 | Apple Inc. | Indication of progress towards satisfaction of a user input condition |
US7657849B2 (en) | 2005-12-23 | 2010-02-02 | Apple Inc. | Unlocking a device by performing gestures on an unlock image |
EP1804153A1 (en) | 2005-12-27 | 2007-07-04 | Amadeus s.a.s | User customizable drop-down control list for GUI software applications |
WO2007079425A2 (en) | 2005-12-30 | 2007-07-12 | Apple Inc. | Portable electronic device with multi-touch input |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US7895309B2 (en) | 2006-01-11 | 2011-02-22 | Microsoft Corporation | Network event notification and delivery |
US7657603B1 (en) | 2006-01-23 | 2010-02-02 | Clearwell Systems, Inc. | Methods and systems of electronic message derivation |
WO2007130716A2 (en) | 2006-01-31 | 2007-11-15 | Intellext, Inc. | Methods and apparatus for computerized searching |
US20070198420A1 (en) | 2006-02-03 | 2007-08-23 | Leonid Goldstein | Method and a system for outbound content security in computer networks |
US7536654B2 (en) | 2006-02-06 | 2009-05-19 | Microsoft Corporation | Photo browse and zoom |
JP4844814B2 (en) | 2006-02-13 | 2011-12-28 | Sony Corporation | Imaging apparatus and method, and program |
US8537117B2 (en) | 2006-02-13 | 2013-09-17 | Blackberry Limited | Handheld wireless communication device that selectively generates a menu in response to received commands |
WO2007094268A1 (en) | 2006-02-13 | 2007-08-23 | International Business Machines Corporation | Control device, control program, and control method for controlling display of display device for displaying superimposed windows |
JP2007219830A (en) | 2006-02-16 | 2007-08-30 | Fanuc Ltd | Numerical controller |
US20070197196A1 (en) | 2006-02-22 | 2007-08-23 | Michael Shenfield | Apparatus, and associated method, for facilitating delivery and processing of push content |
US20070208840A1 (en) | 2006-03-03 | 2007-09-06 | Nortel Networks Limited | Graphical user interface for network management |
US20070214429A1 (en) | 2006-03-13 | 2007-09-13 | Olga Lyudovyk | System and method for managing application alerts |
TWI300184B (en) | 2006-03-17 | 2008-08-21 | HTC Corp | Information navigation methods, and machine readable medium thereof |
US7595810B2 (en) | 2006-03-22 | 2009-09-29 | Apple Inc. | Methods of manipulating a screen space of a display device |
US8244757B2 (en) | 2006-03-30 | 2012-08-14 | Microsoft Corporation | Facet-based interface for mobile search |
US20070236468A1 (en) | 2006-03-30 | 2007-10-11 | Apaar Tuli | Gesture based device activation |
US8111243B2 (en) * | 2006-03-30 | 2012-02-07 | Cypress Semiconductor Corporation | Apparatus and method for recognizing a tap gesture on a touch sensing device |
US20070238488A1 (en) | 2006-03-31 | 2007-10-11 | Research In Motion Limited | Primary actions menu for a mobile communication device |
US8744056B2 (en) | 2006-04-04 | 2014-06-03 | Sony Corporation | Communication identifier list configuration |
US8255473B2 (en) | 2006-04-04 | 2012-08-28 | International Business Machines Corporation | Caching message fragments during real-time messaging conversations |
KR20070113018 (en) | 2006-05-24 | 2007-11-28 | LG Electronics Inc. | Apparatus and operating method of touch screen |
US8077153B2 (en) | 2006-04-19 | 2011-12-13 | Microsoft Corporation | Precise selection techniques for multi-touch screens |
US8296684B2 (en) | 2008-05-23 | 2012-10-23 | Hewlett-Packard Development Company, L.P. | Navigating among activities in a computing device |
US8683362B2 (en) * | 2008-05-23 | 2014-03-25 | Qualcomm Incorporated | Card metaphor for activities in a computing device |
US8156187B2 (en) | 2006-04-20 | 2012-04-10 | Research In Motion Limited | Searching for electronic mail (email) messages with attachments at a wireless communication device |
EP2010999A4 (en) | 2006-04-21 | 2012-11-21 | Google Inc | System for organizing and visualizing display objects |
US7636779B2 (en) | 2006-04-28 | 2009-12-22 | Yahoo! Inc. | Contextual mobile local search based on social network vitality information |
US20070256029A1 (en) | 2006-05-01 | 2007-11-01 | RPO Pty Limited | Systems And Methods For Interfacing A User With A Touch-Screen |
US20070260674A1 (en) | 2006-05-02 | 2007-11-08 | Research In Motion Limited | Push framework for delivery of dynamic mobile content |
US7646392B2 (en) | 2006-05-03 | 2010-01-12 | Research In Motion Limited | Dynamic theme color palette generation |
US20070257891A1 (en) | 2006-05-03 | 2007-11-08 | Esenther Alan W | Method and system for emulating a mouse on a multi-touch sensitive surface |
US9063647B2 (en) | 2006-05-12 | 2015-06-23 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
EP2036372A1 (en) | 2006-05-23 | 2009-03-18 | Nokia Corporation | Mobile communication terminal with enhanced phonebook management |
TW200805131A (en) | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and method of selecting files thereon |
KR101188083B1 (en) | 2006-05-24 | 2012-10-05 | Samsung Electronics Co., Ltd. | Method for providing idle screen layer given a visual effect and method of providing idle screen |
US8571580B2 (en) | 2006-06-01 | 2013-10-29 | Loopt LLC | Displaying the location of individuals on an interactive map display on a mobile communication device |
US8594634B2 (en) | 2006-06-02 | 2013-11-26 | International Business Machines Corporation | Missed call integration with voicemail and granular access to voicemail |
US7640518B2 (en) | 2006-06-14 | 2009-12-29 | Mitsubishi Electric Research Laboratories, Inc. | Method and system for switching between absolute and relative pointing with direct input devices |
KR20070120368A (en) | 2006-06-19 | 2007-12-24 | LG Electronics Inc. | Method and apparatus for controlling a menu icon |
US7880728B2 (en) | 2006-06-29 | 2011-02-01 | Microsoft Corporation | Application switching via a touch screen interface |
US20080040692A1 (en) * | 2006-06-29 | 2008-02-14 | Microsoft Corporation | Gesture input |
US7779370B2 (en) | 2006-06-30 | 2010-08-17 | Google Inc. | User interface for mobile devices |
IL176673A0 (en) | 2006-07-03 | 2007-07-04 | Fermon Israel | A variably displayable mobile device keyboard |
WO2008014408A1 (en) | 2006-07-28 | 2008-01-31 | Blue Lava Technologies | Method and system for displaying multimedia content |
US20080032681A1 (en) | 2006-08-01 | 2008-02-07 | Sony Ericsson Mobile Communications Ab | Click-hold Operations of Mobile Device Input Keys |
US7996487B2 (en) | 2006-08-23 | 2011-08-09 | Oracle International Corporation | Managing searches on mobile devices |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8014760B2 (en) | 2006-09-06 | 2011-09-06 | Apple Inc. | Missed telephone call management for a portable multifunction device |
US7941760B2 (en) | 2006-09-06 | 2011-05-10 | Apple Inc. | Soft keyboard display for a portable multifunction device |
WO2008031871A1 (en) | 2006-09-13 | 2008-03-20 | Imencro Software SA | Method for automatically classifying communication between a sender and a recipient |
US7702683B1 (en) | 2006-09-18 | 2010-04-20 | Hewlett-Packard Development Company, L.P. | Estimating similarity between two collections of information |
US20080076472A1 (en) | 2006-09-22 | 2008-03-27 | Sony Ericsson Mobile Communications Ab | Intelligent Predictive Text Entry |
WO2008035831A1 (en) | 2006-09-22 | 2008-03-27 | GT Telecom Co., Ltd. | Cellular phones having a function of dialing with a searched name |
KR100774927B1 (en) | 2006-09-27 | 2007-11-09 | LG Electronics Inc. | Mobile communication terminal, menu and item selection method using the same |
SG141289A1 (en) | 2006-09-29 | 2008-04-28 | Wireless Intellect Labs Pte Ltd | An event update management system |
US8756510B2 (en) | 2006-10-17 | 2014-06-17 | Cooliris, Inc. | Method and system for displaying photos, videos, RSS and other media content in full-screen immersive view and grid-view using a browser feature |
US8891455B2 (en) | 2006-10-23 | 2014-11-18 | Samsung Electronics Co., Ltd. | Synchronous spectrum sharing by dedicated networks using OFDM/OFDMA signaling |
US20080102863A1 (en) | 2006-10-31 | 2008-05-01 | Research In Motion Limited | System, method, and user interface for searching for messages associated with a message service on a mobile device |
US8942739B2 (en) | 2006-11-06 | 2015-01-27 | Qualcomm Incorporated | Methods and apparatus for communication of notifications |
US20080113656A1 (en) | 2006-11-15 | 2008-05-15 | LG Telecom Ltd. | System and method for updating contents |
US8117555B2 (en) | 2006-12-07 | 2012-02-14 | Sap Ag | Cooperating widgets |
US9003296B2 (en) | 2006-12-20 | 2015-04-07 | Yahoo! Inc. | Browser renderable toolbar |
US20080163104A1 (en) | 2006-12-30 | 2008-07-03 | Tobias Haug | Multiple window handler on display screen |
US7921176B2 (en) | 2007-01-03 | 2011-04-05 | Madnani Rajkumar R | Mechanism for generating a composite email |
US7956847B2 (en) | 2007-01-05 | 2011-06-07 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
US7924271B2 (en) | 2007-01-05 | 2011-04-12 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
US7907125B2 (en) | 2007-01-05 | 2011-03-15 | Microsoft Corporation | Recognizing multiple input point gestures |
US7877707B2 (en) | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US8689132B2 (en) | 2007-01-07 | 2014-04-01 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying electronic documents and lists |
US8091045B2 (en) | 2007-01-07 | 2012-01-03 | Apple Inc. | System and method for managing lists |
US8082523B2 (en) | 2007-01-07 | 2011-12-20 | Apple Inc. | Portable electronic device with graphical user interface supporting application switching |
US20080222545A1 (en) | 2007-01-07 | 2008-09-11 | Lemay Stephen O | Portable Electronic Device with a Global Setting User Interface |
US7671756B2 (en) | 2007-01-07 | 2010-03-02 | Apple Inc. | Portable electronic device with alert silencing |
US20080168382A1 (en) | 2007-01-07 | 2008-07-10 | Louch John O | Dashboards, Widgets and Devices |
US20080168402A1 (en) | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
US7791598B2 (en) | 2007-01-10 | 2010-09-07 | Microsoft Corporation | Hybrid pen mouse user input device |
US20080172609A1 (en) | 2007-01-11 | 2008-07-17 | Nokia Corporation | Multiple application handling |
US20080182628A1 (en) | 2007-01-26 | 2008-07-31 | Matthew Lee | System and method for previewing themes |
US20080180399A1 (en) | 2007-01-31 | 2008-07-31 | Tung Wan Cheng | Flexible Multi-touch Screen |
US8601370B2 (en) | 2007-01-31 | 2013-12-03 | Blackberry Limited | System and method for organizing icons for applications on a mobile device |
KR20080073868A (en) | 2007-02-07 | 2008-08-12 | LG Electronics Inc. | Terminal and method for displaying menu |
US7737979B2 (en) | 2007-02-12 | 2010-06-15 | Microsoft Corporation | Animated transitions for data visualization |
KR101426718B1 (en) | 2007-02-15 | 2014-08-05 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying information according to touch event in a portable terminal |
US7853240B2 (en) | 2007-02-15 | 2010-12-14 | Research In Motion Limited | Emergency number selection for mobile communications device |
US8078969B2 (en) | 2007-03-05 | 2011-12-13 | Shutterfly, Inc. | User interface for creating image collage |
US20080222273A1 (en) | 2007-03-07 | 2008-09-11 | Microsoft Corporation | Adaptive rendering of web pages on mobile devices using imaging technology |
US8352881B2 (en) | 2007-03-08 | 2013-01-08 | International Business Machines Corporation | Method, apparatus and program storage device for providing customizable, immediate and radiating menus for accessing applications and actions |
US8255812B1 (en) | 2007-03-15 | 2012-08-28 | Google Inc. | Embedding user-selected content feed items in a webpage |
US20080242362A1 (en) | 2007-03-26 | 2008-10-02 | Helio, Llc | Rapid Content Association Methods |
US7884805B2 (en) | 2007-04-17 | 2011-02-08 | Sony Ericsson Mobile Communications Ab | Using touches to transfer information between devices |
KR101344265B1 (en) | 2007-04-17 | 2013-12-24 | Samsung Electronics Co., Ltd. | Method for displaying human relations and mobile terminal thereof |
TWI418200B (en) | 2007-04-20 | 2013-12-01 | LG Electronics Inc. | Mobile terminal and screen displaying method thereof |
US20080301104A1 (en) | 2007-06-01 | 2008-12-04 | Kendall Gregory Lockhart | System and method for implementing enhanced search functionality |
US8381122B2 (en) | 2007-06-08 | 2013-02-19 | Apple Inc. | Multi-dimensional application environment |
US9740386B2 (en) | 2007-06-13 | 2017-08-22 | Apple Inc. | Speed/positional mode translations |
US8923507B2 (en) | 2007-06-20 | 2014-12-30 | Microsoft Corporation | Alpha character support and translation in dialer |
US8059101B2 (en) * | 2007-06-22 | 2011-11-15 | Apple Inc. | Swipe gestures for touch screen keyboards |
US20080316177A1 (en) | 2007-06-22 | 2008-12-25 | Kuo-Hwa Tseng | Mouse-type mobile phone |
US8171432B2 (en) | 2008-01-06 | 2012-05-01 | Apple Inc. | Touch screen device, method, and graphical user interface for displaying and selecting application options |
US8065628B2 (en) | 2007-06-25 | 2011-11-22 | Microsoft Corporation | Dynamic user interface for previewing live content |
MY168177A (en) | 2007-06-27 | 2018-10-11 | Karen Knowles Enterprises Pty Ltd | Communication method, system and products |
JP5133001B2 (en) | 2007-06-28 | 2013-01-30 | Kyocera Corporation | Portable electronic device and display method in the same device |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US8762880B2 (en) | 2007-06-29 | 2014-06-24 | Microsoft Corporation | Exposing non-authoring features through document status information in an out-space user interface |
US7707205B2 (en) | 2007-07-05 | 2010-04-27 | Sony Ericsson Mobile Communications Ab | Apparatus and method for locating a target item in a list |
US20120229473A1 (en) | 2007-07-17 | 2012-09-13 | Airgini Group, Inc. | Dynamic Animation in a Mobile Device |
KR20090011314A (en) | 2007-07-25 | 2009-02-02 | Samsung Electronics Co., Ltd. | Mobile terminal and SIM card displaying method thereof |
US9489216B2 (en) | 2007-07-26 | 2016-11-08 | Sap Se | Active tiled user interface |
US7783597B2 (en) | 2007-08-02 | 2010-08-24 | Abaca Technology Corporation | Email filtering using recipient reputation |
JP5046158B2 (en) | 2007-08-10 | 2012-10-10 | International Business Machines Corporation | Apparatus and method for detecting characteristics of an e-mail message |
US20080301046A1 (en) | 2007-08-10 | 2008-12-04 | Christian John Martinez | Methods and systems for making a payment and/or a donation via a network, such as the Internet, using a drag and drop user interface |
US7877687B2 (en) | 2007-08-16 | 2011-01-25 | Yahoo! Inc. | Persistent visual media player |
KR101430445B1 (en) | 2007-08-20 | 2014-08-14 | LG Electronics Inc. | Terminal having function for controlling screen size and program recording medium |
US20090051671A1 (en) | 2007-08-22 | 2009-02-26 | Jason Antony Konstas | Recognizing the motion of two or more touches on a touch-sensing surface |
US8306509B2 (en) | 2007-08-31 | 2012-11-06 | AT&T Mobility II LLC | Enhanced messaging with language translation feature |
US9477395B2 (en) | 2007-09-04 | 2016-10-25 | Apple Inc. | Audio file interface |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US20090070673A1 (en) | 2007-09-06 | 2009-03-12 | Guy Barkan | System and method for presenting multimedia content and application interface |
US20090077649A1 (en) | 2007-09-13 | 2009-03-19 | Soft Trust, Inc. | Secure messaging system and method |
US9177317B2 (en) | 2007-09-28 | 2015-11-03 | Bank Of America Corporation | System and method for consumer protection |
US8094105B2 (en) | 2007-09-28 | 2012-01-10 | Motorola Mobility, Inc. | Navigation for a non-traditionally shaped liquid crystal display for mobile handset devices |
US8098235B2 (en) | 2007-09-28 | 2012-01-17 | Immersion Corporation | Multi-touch device having dynamic haptic effects |
DE202008018283U1 (en) | 2007-10-04 | 2012-07-17 | LG Electronics Inc. | Menu display for a mobile communication terminal |
US8482521B2 (en) * | 2007-10-05 | 2013-07-09 | Gvbb Holdings S.A.R.L. | Pointer controlling apparatus |
US7983718B1 (en) | 2007-10-11 | 2011-07-19 | Sprint Spectrum L.P. | Wireless phones with keys displaying image files |
US20090109243A1 (en) | 2007-10-25 | 2009-04-30 | Nokia Corporation | Apparatus and method for zooming objects on a display |
US8275398B2 (en) | 2007-11-02 | 2012-09-25 | Hewlett-Packard Development Company, L.P. | Message addressing techniques for a mobile computing device |
US7992104B2 (en) | 2007-11-13 | 2011-08-02 | Microsoft Corporation | Viewing data |
US8745513B2 (en) | 2007-11-29 | 2014-06-03 | Sony Corporation | Method and apparatus for use in accessing content |
US8020780B2 (en) | 2007-11-30 | 2011-09-20 | Honeywell International Inc. | Thermostatic control system having a configurable lock |
US20090140986A1 (en) | 2007-11-30 | 2009-06-04 | Nokia Corporation | Method, apparatus and computer program product for transferring files between devices via drag and drop |
US20090146962A1 (en) | 2007-12-05 | 2009-06-11 | Nokia Corporation | Mobile communication terminal and method |
US8212784B2 (en) | 2007-12-13 | 2012-07-03 | Microsoft Corporation | Selection and display of media associated with a geographic area based on gesture input |
US20090164888A1 (en) | 2007-12-19 | 2009-06-25 | Thomas Phan | Automated Content-Based Adjustment of Formatting and Application Behavior |
JP4605478B2 (en) | 2007-12-19 | 2011-01-05 | Sony Corporation | Information processing apparatus, display control method, and display control program |
KR20090066368A (en) | 2007-12-20 | 2009-06-24 | Samsung Electronics Co., Ltd. | Portable terminal having touch screen and method for performing function thereof |
US20090164928A1 (en) | 2007-12-21 | 2009-06-25 | Nokia Corporation | Method, apparatus and computer program product for providing an improved user interface |
US8515397B2 (en) | 2007-12-24 | 2013-08-20 | Qualcomm Incorporated | Time and location based theme of mobile telephones |
US9372576B2 (en) * | 2008-01-04 | 2016-06-21 | Apple Inc. | Image jaggedness filter for determining whether to perform baseline calculations |
US20090182788A1 (en) | 2008-01-14 | 2009-07-16 | Zenbe, Inc. | Apparatus and method for customized email and data management |
US20090184939A1 (en) | 2008-01-23 | 2009-07-23 | N-Trig Ltd. | Graphical object manipulation with a touch sensitive screen |
US8677285B2 (en) | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
US8356258B2 (en) | 2008-02-01 | 2013-01-15 | Microsoft Corporation | Arranging display areas utilizing enhanced window states |
US9612847B2 (en) | 2008-02-05 | 2017-04-04 | Microsoft Technology Licensing, Llc | Destination list associated with an application launcher |
US8910299B2 (en) | 2008-02-08 | 2014-12-09 | Steven Charles Michalske | Emergency information access on portable electronic devices |
US8205157B2 (en) | 2008-03-04 | 2012-06-19 | Apple Inc. | Methods and graphical user interfaces for conducting searches on a portable multifunction device |
US9772689B2 (en) | 2008-03-04 | 2017-09-26 | Qualcomm Incorporated | Enhanced gesture-based image manipulation |
JP2009245423A (en) | 2008-03-13 | 2009-10-22 | Panasonic Corp | Information device and window display method |
US8327286B2 (en) | 2008-03-13 | 2012-12-04 | Microsoft Corporation | Unifying application launchers and switchers |
US9269059B2 (en) | 2008-03-25 | 2016-02-23 | Qualcomm Incorporated | Apparatus and methods for transport optimization for widget content delivery |
US20090249257A1 (en) | 2008-03-31 | 2009-10-01 | Nokia Corporation | Cursor navigation assistance |
TWI381304B (en) | 2008-04-22 | 2013-01-01 | HTC Corp | Method and apparatus for adjusting display area of user interface and recording medium using the same |
JP4171770B1 (en) * | 2008-04-24 | 2008-10-29 | Nintendo Co., Ltd. | Object display order changing program and apparatus |
US8174503B2 (en) | 2008-05-17 | 2012-05-08 | David H. Cain | Touch-based authentication of a mobile device through user generated pattern creation |
US8296670B2 (en) | 2008-05-19 | 2012-10-23 | Microsoft Corporation | Accessing a menu utilizing a drag-operation |
US8375336B2 (en) * | 2008-05-23 | 2013-02-12 | Microsoft Corporation | Panning content utilizing a drag operation |
CN102187694A (en) | 2008-05-28 | 2011-09-14 | Google Inc. | Motion-controlled views on mobile computing devices |
EP2129090B1 (en) | 2008-05-29 | 2016-06-15 | LG Electronics Inc. | Mobile terminal and display control method thereof |
JP5164675B2 (en) | 2008-06-04 | 2013-03-21 | Canon Inc. | User interface control method, information processing apparatus, and program |
US8135392B2 (en) | 2008-06-06 | 2012-03-13 | Apple Inc. | Managing notification service connections and displaying icon badges |
US8099332B2 (en) | 2008-06-06 | 2012-01-17 | Apple Inc. | User interface for application management for a mobile device |
US8477139B2 (en) | 2008-06-09 | 2013-07-02 | Apple Inc. | Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects |
KR101477743B1 (en) * | 2008-06-16 | 2014-12-31 | Samsung Electronics Co., Ltd. | Terminal and method for performing function thereof |
US9092053B2 (en) | 2008-06-17 | 2015-07-28 | Apple Inc. | Systems and methods for adjusting a display based on the user's position |
GB0811196D0 (en) | 2008-06-18 | 2008-07-23 | Skype Ltd | Searching method and apparatus |
JP2010003098A (en) | 2008-06-20 | 2010-01-07 | Konica Minolta Business Technologies Inc | Input device, operation acceptance method and operation acceptance program |
US8154524B2 (en) | 2008-06-24 | 2012-04-10 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US20090322760A1 (en) | 2008-06-26 | 2009-12-31 | Microsoft Corporation | Dynamic animation scheduling |
US20090327969A1 (en) | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Semantic zoom in a virtual three-dimensional graphical user interface |
US8150017B2 (en) | 2008-07-11 | 2012-04-03 | Verizon Patent And Licensing Inc. | Phone dialer with advanced search feature and associated method of searching a directory |
TW201005599A (en) | 2008-07-18 | 2010-02-01 | Asustek Comp Inc | Touch-type mobile computing device and control method of the same |
KR20100010072 (en) | 2008-07-22 | 2010-02-01 | LG Electronics Inc. | Controlling method of user interface for multitasking of mobile devices |
US8390577B2 (en) | 2008-07-25 | 2013-03-05 | Intuilab | Continuous recognition of multi-touch gestures |
WO2010015070A1 (en) | 2008-08-07 | 2010-02-11 | Research In Motion Limited | System and method for providing content on a mobile device by controlling an application independent of user action |
US8924892B2 (en) | 2008-08-22 | 2014-12-30 | Fuji Xerox Co., Ltd. | Multiple selection on devices with many gestures |
JP4636141B2 (en) * | 2008-08-28 | 2011-02-23 | Sony Corporation | Information processing apparatus and method, and program |
US20100058248A1 (en) | 2008-08-29 | 2010-03-04 | Johnson Controls Technology Company | Graphical user interfaces for building management systems |
US20100070931A1 (en) | 2008-09-15 | 2010-03-18 | Sony Ericsson Mobile Communications Ab | Method and apparatus for selecting an object |
KR101548958B1 (en) | 2008-09-18 | 2015-09-01 | Samsung Electronics Co., Ltd. | A method for operating control in a mobile terminal with touch screen and apparatus thereof |
US20100075628A1 (en) | 2008-09-19 | 2010-03-25 | Verizon Data Services Llc | Method and apparatus for transmitting authenticated emergency messages |
US8296658B2 (en) | 2008-09-19 | 2012-10-23 | Cisco Technology, Inc. | Generator for personalization of electronic devices |
US8595371B2 (en) | 2008-09-19 | 2013-11-26 | Samsung Electronics Co., Ltd. | Sending a remote user interface |
US8352864B2 (en) | 2008-09-19 | 2013-01-08 | Cisco Technology, Inc. | Method of operating a design generator for personalization of electronic devices |
US8600446B2 (en) | 2008-09-26 | 2013-12-03 | HTC Corporation | Mobile device interface with dual windows |
US8176438B2 (en) | 2008-09-26 | 2012-05-08 | Microsoft Corporation | Multi-modal interaction for a screen magnifier |
US20100079413A1 (en) | 2008-09-29 | 2010-04-01 | Denso Corporation | Control device |
WO2010035491A1 (en) * | 2008-09-29 | 2010-04-01 | Panasonic Corporation | User interface device, user interface method, and recording medium |
US20100087169A1 (en) | 2008-10-02 | 2010-04-08 | Microsoft Corporation | Threading together messages with multiple common participants |
US20100087173A1 (en) | 2008-10-02 | 2010-04-08 | Microsoft Corporation | Inter-threading Indications of Different Types of Communication |
US9015616B2 (en) | 2008-10-22 | 2015-04-21 | Google Inc. | Search initiation |
US8385952B2 (en) | 2008-10-23 | 2013-02-26 | Microsoft Corporation | Mobile communications device user interface |
US8411046B2 (en) | 2008-10-23 | 2013-04-02 | Microsoft Corporation | Column organization of content |
US20100105424A1 (en) | 2008-10-23 | 2010-04-29 | Smuga Michael A | Mobile Communications Device User Interface |
US20100105441A1 (en) | 2008-10-23 | 2010-04-29 | Chad Aron Voss | Display Size of Representations of Content |
TW201023026A (en) | 2008-10-23 | 2010-06-16 | Microsoft Corp | Location-based display characteristics in a user interface |
US8086275B2 (en) | 2008-10-23 | 2011-12-27 | Microsoft Corporation | Alternative inputs of a mobile communications device |
US8477103B2 (en) | 2008-10-26 | 2013-07-02 | Microsoft Corporation | Multi-touch object inertia simulation |
US8108623B2 (en) | 2008-10-26 | 2012-01-31 | Microsoft Corporation | Poll based cache event notifications in a distributed cache |
US20100107067A1 (en) | 2008-10-27 | 2010-04-29 | Nokia Corporation | Input on touch based user interfaces |
KR101029627B1 (en) | 2008-10-31 | 2011-04-15 | SK Telesys Co., Ltd. | Method of operating functions of mobile terminal with touch screen and apparatus thereof |
WO2010055197A1 (en) | 2008-11-11 | 2010-05-20 | Nokia Corporation | Method and apparatus for managing advertising-enabled applications |
KR20100056350A (en) | 2008-11-18 | 2010-05-27 | Hwang Sun-won | Method and apparatus for automatically outputting updated music, letters, voice and pictures on the initial display window of portable display devices |
US8302026B2 (en) | 2008-11-28 | 2012-10-30 | Microsoft Corporation | Multi-panel user interface |
US20100145675A1 (en) | 2008-12-04 | 2010-06-10 | Microsoft Corporation | User interface having customizable text strings |
US20100146437A1 (en) | 2008-12-04 | 2010-06-10 | Microsoft Corporation | Glanceable animated notifications on a locked device |
US8331992B2 (en) | 2008-12-19 | 2012-12-11 | Verizon Patent And Licensing Inc. | Interactive locked state mobile communication device |
US8942767B2 (en) | 2008-12-19 | 2015-01-27 | Verizon Patent And Licensing Inc. | Communications convergence and user interface systems, apparatuses, and methods |
US8443303B2 (en) | 2008-12-22 | 2013-05-14 | Verizon Patent And Licensing Inc. | Gesture-based navigation |
US8291348B2 (en) | 2008-12-31 | 2012-10-16 | Hewlett-Packard Development Company, L.P. | Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis |
US8799806B2 (en) * | 2008-12-31 | 2014-08-05 | Verizon Patent And Licensing Inc. | Tabbed content view on a touch-screen device |
US20100175029A1 (en) | 2009-01-06 | 2010-07-08 | General Electric Company | Context switching zooming user interface |
US8499251B2 (en) | 2009-01-07 | 2013-07-30 | Microsoft Corporation | Virtual page turn |
US8433998B2 (en) | 2009-01-16 | 2013-04-30 | International Business Machines Corporation | Tool and method for annotating an event map, and collaborating using the annotated event map |
US8750906B2 (en) | 2009-02-20 | 2014-06-10 | T-Mobile Usa, Inc. | Dynamic elements on a map within a mobile device, such as elements that facilitate communication between users |
US8819570B2 (en) | 2009-03-27 | 2014-08-26 | Zumobi, Inc. | Systems, methods, and computer program products displaying interactive elements on a canvas |
US8238876B2 (en) | 2009-03-30 | 2012-08-07 | Microsoft Corporation | Notifications |
US20100248741A1 (en) | 2009-03-30 | 2010-09-30 | Nokia Corporation | Method and apparatus for illustrative representation of a text communication |
US8175653B2 (en) | 2009-03-30 | 2012-05-08 | Microsoft Corporation | Chromeless user interface |
US8355698B2 (en) | 2009-03-30 | 2013-01-15 | Microsoft Corporation | Unlock screen |
KR20100114572A (en) | 2009-04-16 | 2010-10-26 | Samsung Electronics Co., Ltd. | Method for displaying contents of terminal having touch screen and apparatus thereof |
WO2010124397A1 (en) | 2009-04-29 | 2010-11-04 | Torch Mobile Inc. | Software-based asynchronous tiled backingstore |
US20100281409A1 (en) | 2009-04-30 | 2010-11-04 | Nokia Corporation | Apparatus and method for handling notifications within a communications device |
US8669945B2 (en) | 2009-05-07 | 2014-03-11 | Microsoft Corporation | Changing of list views on mobile device |
US8368707B2 (en) | 2009-05-18 | 2013-02-05 | Apple Inc. | Memory management based on automatic full-screen detection |
KR101620874B1 (en) | 2009-05-19 | 2016-05-13 | Samsung Electronics Co., Ltd. | Searching method of a list and portable device using the same |
US20110004845A1 (en) | 2009-05-19 | 2011-01-06 | Intelliborn Corporation | Method and System For Notifying A User of An Event Or Information Using Motion And Transparency On A Small Screen Display |
US8269736B2 (en) | 2009-05-22 | 2012-09-18 | Microsoft Corporation | Drop target gestures |
US8836648B2 (en) | 2009-05-27 | 2014-09-16 | Microsoft Corporation | Touch pull-in gesture |
US9298336B2 (en) | 2009-05-28 | 2016-03-29 | Apple Inc. | Rotation smoothing of a user interface |
US20100302176A1 (en) | 2009-05-29 | 2010-12-02 | Nokia Corporation | Zoom-in functionality |
US8225193B1 (en) | 2009-06-01 | 2012-07-17 | Symantec Corporation | Methods and systems for providing workspace navigation with a tag cloud |
US8621387B2 (en) | 2009-06-08 | 2013-12-31 | Apple Inc. | User interface for multiple display regions |
KR101561703B1 (en) | 2009-06-08 | 2015-10-30 | LG Electronics Inc. | Method for executing menu and mobile terminal using the same |
KR101649098B1 (en) | 2009-06-30 | 2016-08-19 | Samsung Electronics Co., Ltd. | Apparatus and method for rendering using sensor in portable terminal |
US8239781B2 (en) | 2009-06-30 | 2012-08-07 | Sap Ag | Drag and drop of an application component to desktop |
US20110004839A1 (en) | 2009-07-02 | 2011-01-06 | Derek Cha | User-customized computer display method |
JP2011028524A (en) | 2009-07-24 | 2011-02-10 | Toshiba Corp | Information processing apparatus, program and pointing method |
US20110029904A1 (en) | 2009-07-30 | 2011-02-03 | Adam Miles Smith | Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function |
US8656314B2 (en) | 2009-07-30 | 2014-02-18 | Lenovo (Singapore) Pte. Ltd. | Finger touch gesture for joining and unjoining discrete touch objects |
US8521809B2 (en) | 2009-07-31 | 2013-08-27 | Z2Live, Inc. | Mobile device notification controls system and method |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8624933B2 (en) | 2009-09-25 | 2014-01-07 | Apple Inc. | Device, method, and graphical user interface for scrolling a multi-section document |
CA2681879A1 (en) | 2009-10-07 | 2011-04-07 | Research In Motion Limited | A method of controlling touch input on a touch-sensitive display when a display element is active and a portable electronic device configured for the same |
US20110087988A1 (en) | 2009-10-12 | 2011-04-14 | Johnson Controls Technology Company | Graphical control elements for building management systems |
KR101701492B1 (en) | 2009-10-16 | 2017-02-14 | Samsung Electronics Co., Ltd. | Terminal and method for displaying data thereof |
US8499253B2 (en) | 2009-10-13 | 2013-07-30 | Google Inc. | Individualized tab audio controls |
US9104275B2 (en) | 2009-10-20 | 2015-08-11 | LG Electronics Inc. | Mobile terminal to display an object on a perceived 3D space |
US8261212B2 (en) | 2009-10-20 | 2012-09-04 | Microsoft Corporation | Displaying GUI elements on natural user interfaces |
US8677284B2 (en) | 2009-11-04 | 2014-03-18 | Alpine Electronics, Inc. | Method and apparatus for controlling and displaying contents in a user interface |
US20110113363A1 (en) | 2009-11-10 | 2011-05-12 | James Anthony Hunt | Multi-Mode User Interface |
US9152318B2 (en) | 2009-11-25 | 2015-10-06 | Yahoo! Inc. | Gallery application for content viewing |
KR101725887B1 (en) | 2009-12-21 | 2017-04-11 | Samsung Electronics Co., Ltd. | Method and apparatus for searching contents in touch screen device |
US9189500B2 (en) * | 2009-12-31 | 2015-11-17 | Verizon Patent And Licensing Inc. | Graphical flash view of documents for data navigation on a touch-screen device |
US8786559B2 (en) | 2010-01-06 | 2014-07-22 | Apple Inc. | Device, method, and graphical user interface for manipulating tables using multi-contact gestures |
CA2786991A1 (en) | 2010-01-12 | 2011-07-21 | Crane Merchandising Systems, Inc. | Mechanism for a vending machine graphical user interface utilizing XML for a versatile customer experience |
US9542097B2 (en) * | 2010-01-13 | 2017-01-10 | Lenovo (Singapore) Pte. Ltd. | Virtual touchpad for a touch device |
EP2354914A1 (en) | 2010-01-19 | 2011-08-10 | LG Electronics Inc. | Mobile terminal and control method thereof |
US8930841B2 (en) | 2010-02-15 | 2015-01-06 | Motorola Mobility Llc | Methods and apparatus for a user interface configured to display event information |
US20110231796A1 (en) | 2010-02-16 | 2011-09-22 | Jose Manuel Vigil | Methods for navigating a touch screen device in conjunction with gestures |
US8539384B2 (en) | 2010-02-25 | 2013-09-17 | Microsoft Corporation | Multi-screen pinch and expand gestures |
US8751970B2 (en) | 2010-02-25 | 2014-06-10 | Microsoft Corporation | Multi-screen synchronous slide gesture |
US8473870B2 (en) | 2010-02-25 | 2013-06-25 | Microsoft Corporation | Multi-screen hold and drag gesture |
US9075522B2 (en) | 2010-02-25 | 2015-07-07 | Microsoft Technology Licensing, Llc | Multi-screen bookmark hold gesture |
US20110209089A1 (en) | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen object-hold and page-change gesture |
US9454304B2 (en) | 2010-02-25 | 2016-09-27 | Microsoft Technology Licensing, Llc | Multi-screen dual tap gesture |
US20110209101A1 (en) | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen pinch-to-pocket gesture |
US8589815B2 (en) | 2010-03-10 | 2013-11-19 | Microsoft Corporation | Control of timing for animations in dynamic icons |
US8458615B2 (en) | 2010-04-07 | 2013-06-04 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US20110252376A1 (en) | 2010-04-07 | 2011-10-13 | Imran Chaudhri | Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications |
FR2959037A1 (en) | 2010-04-14 | 2011-10-21 | Orange Vallee | Method for creating a media sequence by coherent groups of media files |
US8957920B2 (en) | 2010-06-25 | 2015-02-17 | Microsoft Corporation | Alternative semantics for zoom operations in a zoomable scene |
US8639747B2 (en) | 2010-07-01 | 2014-01-28 | Red Hat, Inc. | System and method for providing a cloud computing graphical user interface |
US8285258B2 (en) | 2010-07-07 | 2012-10-09 | Research In Motion Limited | Pushed content notification and display |
US20120050332A1 (en) | 2010-08-25 | 2012-03-01 | Nokia Corporation | Methods and apparatuses for facilitating content navigation |
US10140301B2 (en) | 2010-09-01 | 2018-11-27 | Apple Inc. | Device, method, and graphical user interface for selecting and using sets of media player controls |
EP2625660A4 (en) * | 2010-10-05 | 2014-06-11 | Centric Software Inc | Interactive collection book for mobile devices |
US20120102433A1 (en) | 2010-10-20 | 2012-04-26 | Steven Jon Falkenburg | Browser Icon Management |
US20120151397A1 (en) | 2010-12-08 | 2012-06-14 | Tavendo Gmbh | Access to an electronic object collection via a plurality of views |
US9239674B2 (en) | 2010-12-17 | 2016-01-19 | Nokia Technologies Oy | Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event |
US20120159395A1 (en) | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Application-launching interface for multiple modes |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US20120174029A1 (en) | 2010-12-30 | 2012-07-05 | International Business Machines Corporation | Dynamically magnifying logical segments of a view |
US9423951B2 (en) | 2010-12-31 | 2016-08-23 | Microsoft Technology Licensing, Llc | Content-based snap point |
US9311061B2 (en) | 2011-02-10 | 2016-04-12 | International Business Machines Corporation | Designing task execution order based on location of the task icons within a graphical user interface |
US9104288B2 (en) | 2011-03-08 | 2015-08-11 | Nokia Technologies Oy | Method and apparatus for providing quick access to media functions from a locked screen |
US9383917B2 (en) | 2011-03-28 | 2016-07-05 | Microsoft Technology Licensing, Llc | Predictive tiling |
US20120299968A1 (en) | 2011-05-27 | 2012-11-29 | Tsz Yan Wong | Managing an immersive interface in a multi-application immersive environment |
US9158445B2 (en) | 2011-05-27 | 2015-10-13 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US20120304118A1 (en) | 2011-05-27 | 2012-11-29 | Donahue Tyler J | Application Notification Display |
US20120304117A1 (en) | 2011-05-27 | 2012-11-29 | Donahue Tyler J | Application Notification Tags |
US20120304113A1 (en) | 2011-05-27 | 2012-11-29 | Patten Michael J | Gesture-based content-object zooming |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9104440B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US20120304068A1 (en) | 2011-05-27 | 2012-11-29 | Nazia Zaman | Presentation format for an application tile |
US9728164B2 (en) | 2011-05-31 | 2017-08-08 | Lenovo (Singapore) Pte. Ltd. | Moving a tile across multiple workspaces |
US8694603B2 (en) | 2011-06-20 | 2014-04-08 | International Business Machines Corporation | Geospatial visualization performance improvement for contiguous polylines with similar dynamic characteristics |
US8687023B2 (en) | 2011-08-02 | 2014-04-01 | Microsoft Corporation | Cross-slide gesture to select and rearrange |
US8700999B2 (en) | 2011-08-15 | 2014-04-15 | Google Inc. | Carousel user interface for document management |
US20130057587A1 (en) | 2011-09-01 | 2013-03-07 | Microsoft Corporation | Arranging tiles |
US9557909B2 (en) | 2011-09-09 | 2017-01-31 | Microsoft Technology Licensing, Llc | Semantic zoom linguistic helpers |
US20130067390A1 (en) | 2011-09-09 | 2013-03-14 | Paul J. Kwiatkowski | Programming Interface for Semantic Zoom |
US20130067398A1 (en) | 2011-09-09 | 2013-03-14 | Theresa B. Pittappilly | Semantic Zoom |
US8922575B2 (en) | 2011-09-09 | 2014-12-30 | Microsoft Corporation | Tile cache |
US20130067412A1 (en) | 2011-09-09 | 2013-03-14 | Microsoft Corporation | Grouping selectable tiles |
US20130067420A1 (en) | 2011-09-09 | 2013-03-14 | Theresa B. Pittappilly | Semantic Zoom Gestures |
US10353566B2 (en) | 2011-09-09 | 2019-07-16 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US9244802B2 (en) | 2011-09-10 | 2016-01-26 | Microsoft Technology Licensing, Llc | Resource user interface |
US8933952B2 (en) | 2011-09-10 | 2015-01-13 | Microsoft Corporation | Pre-rendering new content for an application-selectable user interface |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US8243102B1 (en) | 2011-10-12 | 2012-08-14 | Google Inc. | Derivative-based selection of zones for banded map display |
- 2011
- 2011-08-02 US US13/196,272 patent/US8687023B2/en active Active
- 2012
- 2012-07-17 JP JP2014523952A patent/JP5980924B2/en active Active
- 2012-07-17 MY MYPI2014700208A patent/MY167640A/en unknown
- 2012-07-17 IN IN621CHN2014 patent/IN2014CN00621A/en unknown
- 2012-07-17 AU AU2012290559A patent/AU2012290559B2/en active Active
- 2012-07-17 MX MX2014001342A patent/MX338046B/en active IP Right Grant
- 2012-07-17 CN CN201280048551.4A patent/CN103907087B/en active Active
- 2012-07-17 RU RU2014103238A patent/RU2623198C2/en active
- 2012-07-17 CA CA2843607A patent/CA2843607C/en active Active
- 2012-07-17 WO PCT/US2012/047091 patent/WO2013019404A1/en active Application Filing
- 2012-07-17 BR BR112014002379-4A patent/BR112014002379B1/en active IP Right Grant
- 2012-07-17 EP EP12820488.0A patent/EP2740022B1/en active Active
- 2012-07-17 KR KR1020147002433A patent/KR102052771B1/en active IP Right Grant
- 2012-10-19 US US13/656,354 patent/US20130044141A1/en not_active Abandoned
- 2014
- 2014-01-30 IL IL230724A patent/IL230724A0/en active IP Right Grant
- 2014-01-31 CL CL2014000244A patent/CL2014000244A1/en unknown
- 2014-02-28 CO CO14043066A patent/CO6890079A2/en active IP Right Grant
- 2014-12-30 HK HK14113051.3A patent/HK1199520A1/en unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110055773A1 (en) * | 2009-08-25 | 2011-03-03 | Google Inc. | Direct manipulation gestures |
US8429565B2 (en) * | 2009-08-25 | 2013-04-23 | Google Inc. | Direct manipulation gestures |
US20110074719A1 (en) * | 2009-09-30 | 2011-03-31 | Higgstec Inc. | Gesture detecting method for touch panel |
US20110157027A1 (en) * | 2009-12-30 | 2011-06-30 | Nokia Corporation | Method and Apparatus for Performing an Operation on a User Interface Object |
Cited By (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090106696A1 (en) * | 2001-09-06 | 2009-04-23 | Matias Duarte | Loop menu navigation apparatus and method |
US9665384B2 (en) | 2005-08-30 | 2017-05-30 | Microsoft Technology Licensing, Llc | Aggregation of computing device settings |
US9223412B2 (en) | 2008-10-23 | 2015-12-29 | Rovi Technologies Corporation | Location-based display characteristics in a user interface |
US9323424B2 (en) | 2008-10-23 | 2016-04-26 | Microsoft Corporation | Column organization of content |
US8970499B2 (en) | 2008-10-23 | 2015-03-03 | Microsoft Technology Licensing, Llc | Alternative inputs of a mobile communications device |
US10133453B2 (en) | 2008-10-23 | 2018-11-20 | Microsoft Technology Licensing, Llc | Alternative inputs of a mobile communications device |
US9606704B2 (en) | 2008-10-23 | 2017-03-28 | Microsoft Technology Licensing, Llc | Alternative inputs of a mobile communications device |
US8548431B2 (en) | 2009-03-30 | 2013-10-01 | Microsoft Corporation | Notifications |
US9977575B2 (en) | 2009-03-30 | 2018-05-22 | Microsoft Technology Licensing, Llc | Chromeless user interface |
US9696888B2 (en) | 2010-12-20 | 2017-07-04 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes |
US9430130B2 (en) | 2010-12-20 | 2016-08-30 | Microsoft Technology Licensing, Llc | Customization of an immersive environment |
US8990733B2 (en) | 2010-12-20 | 2015-03-24 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9766790B2 (en) | 2010-12-23 | 2017-09-19 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9864494B2 (en) | 2010-12-23 | 2018-01-09 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9870132B2 (en) | 2010-12-23 | 2018-01-16 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9213468B2 (en) | 2010-12-23 | 2015-12-15 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US9015606B2 (en) | 2010-12-23 | 2015-04-21 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US8560959B2 (en) | 2010-12-23 | 2013-10-15 | Microsoft Corporation | Presenting an application change through a tile |
US9229918B2 (en) | 2010-12-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US11126333B2 (en) | 2010-12-23 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9423951B2 (en) | 2010-12-31 | 2016-08-23 | Microsoft Technology Licensing, Llc | Content-based snap point |
US9383917B2 (en) | 2011-03-28 | 2016-07-05 | Microsoft Technology Licensing, Llc | Predictive tiling |
US9658766B2 (en) | 2011-05-27 | 2017-05-23 | Microsoft Technology Licensing, Llc | Edge gesture |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US9158445B2 (en) | 2011-05-27 | 2015-10-13 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US10303325B2 (en) | 2011-05-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9535597B2 (en) | 2011-05-27 | 2017-01-03 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US11698721B2 (en) | 2011-05-27 | 2023-07-11 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US11272017B2 (en) | 2011-05-27 | 2022-03-08 | Microsoft Technology Licensing, Llc | Application notifications manifest |
US9104440B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9052820B2 (en) | 2011-05-27 | 2015-06-09 | Microsoft Technology Licensing, Llc | Multi-application environment |
US8687023B2 (en) | 2011-08-02 | 2014-04-01 | Microsoft Corporation | Cross-slide gesture to select and rearrange |
US11740727B1 (en) | 2011-08-05 | 2023-08-29 | P4Tents1 Llc | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US8935631B2 (en) | 2011-09-01 | 2015-01-13 | Microsoft Corporation | Arranging tiles |
US10579250B2 (en) | 2011-09-01 | 2020-03-03 | Microsoft Technology Licensing, Llc | Arranging tiles |
US9557909B2 (en) | 2011-09-09 | 2017-01-31 | Microsoft Technology Licensing, Llc | Semantic zoom linguistic helpers |
US8922575B2 (en) | 2011-09-09 | 2014-12-30 | Microsoft Corporation | Tile cache |
US10353566B2 (en) | 2011-09-09 | 2019-07-16 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US10114865B2 (en) | 2011-09-09 | 2018-10-30 | Microsoft Technology Licensing, Llc | Tile cache |
US8830270B2 (en) | 2011-09-10 | 2014-09-09 | Microsoft Corporation | Progressively indicating new content in an application-selectable user interface |
US8933952B2 (en) | 2011-09-10 | 2015-01-13 | Microsoft Corporation | Pre-rendering new content for an application-selectable user interface |
US9244802B2 (en) | 2011-09-10 | 2016-01-26 | Microsoft Technology Licensing, Llc | Resource user interface |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US10254955B2 (en) | 2011-09-10 | 2019-04-09 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US10191633B2 (en) | 2011-12-22 | 2019-01-29 | Microsoft Technology Licensing, Llc | Closing applications |
US9223472B2 (en) | 2011-12-22 | 2015-12-29 | Microsoft Technology Licensing, Llc | Closing applications |
US9128605B2 (en) | 2012-02-16 | 2015-09-08 | Microsoft Technology Licensing, Llc | Thumbnail-image selection of applications |
US8814683B2 (en) | 2013-01-22 | 2014-08-26 | Wms Gaming Inc. | Gaming system and methods adapted to utilize recorded player gestures |
WO2014150611A1 (en) * | 2013-03-15 | 2014-09-25 | Google Inc. | Document scale and position optimization |
US9588675B2 (en) | 2013-03-15 | 2017-03-07 | Google Inc. | Document scale and position optimization |
US20170371846A1 (en) | 2013-03-15 | 2017-12-28 | Google Inc. | Document scale and position optimization |
US10691326B2 (en) | 2013-03-15 | 2020-06-23 | Google Llc | Document scale and position optimization |
US9767076B2 (en) | 2013-03-15 | 2017-09-19 | Google Inc. | Document scale and position optimization |
CN103246449A (en) * | 2013-04-16 | 2013-08-14 | 广东欧珀移动通信有限公司 | Mobile terminal and screen unlocking method of same |
US20140340335A1 (en) * | 2013-05-17 | 2014-11-20 | Elektrobit Automotive Gmbh | System and method for data selection by means of a touch-sensitive surface |
US9507515B2 (en) * | 2013-05-17 | 2016-11-29 | Elektrobit Automotive Gmbh | System and method for data selection by means of a touch-sensitive surface |
US20150026639A1 (en) * | 2013-07-19 | 2015-01-22 | Fuji Xerox Co., Ltd. | Information processing apparatus and method, and non-transitory computer readable medium |
US9965144B2 (en) * | 2013-07-19 | 2018-05-08 | Fuji Xerox Co., Ltd. | Information processing apparatus and method, and non-transitory computer readable medium |
US9176657B2 (en) | 2013-09-14 | 2015-11-03 | Changwat TUMWATTANA | Gesture-based selection and manipulation method |
US10459607B2 (en) | 2014-04-04 | 2019-10-29 | Microsoft Technology Licensing, Llc | Expandable application representation |
US9841874B2 (en) | 2014-04-04 | 2017-12-12 | Microsoft Technology Licensing, Llc | Expandable application representation |
US9769293B2 (en) | 2014-04-10 | 2017-09-19 | Microsoft Technology Licensing, Llc | Slider cover for computing device |
US9451822B2 (en) | 2014-04-10 | 2016-09-27 | Microsoft Technology Licensing, Llc | Collapsible shell cover for computing device |
US10592080B2 (en) | 2014-07-31 | 2020-03-17 | Microsoft Technology Licensing, Llc | Assisted presentation of application windows |
US10678412B2 (en) | 2014-07-31 | 2020-06-09 | Microsoft Technology Licensing, Llc | Dynamic joint dividers for application windows |
US10254942B2 (en) | 2014-07-31 | 2019-04-09 | Microsoft Technology Licensing, Llc | Adaptive sizing and positioning of application windows |
US10642365B2 (en) | 2014-09-09 | 2020-05-05 | Microsoft Technology Licensing, Llc | Parametric inertia and APIs |
US9674335B2 (en) | 2014-10-30 | 2017-06-06 | Microsoft Technology Licensing, Llc | Multi-configuration input device |
US10353575B2 (en) * | 2015-10-06 | 2019-07-16 | Canon Kabushiki Kaisha | Display control apparatus, method for controlling the same, and recording medium |
US11287967B2 (en) | 2016-11-03 | 2022-03-29 | Microsoft Technology Licensing, Llc | Graphical user interface list content density adjustment |
CN111095165A (en) * | 2017-08-31 | 2020-05-01 | 苹果公司 | Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments |
US11163417B2 (en) * | 2017-08-31 | 2021-11-02 | Apple Inc. | Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments |
US11740755B2 (en) | 2017-08-31 | 2023-08-29 | Apple Inc. | Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments |
US20190065027A1 (en) * | 2017-08-31 | 2019-02-28 | Apple Inc. | Systems, Methods, and Graphical User Interfaces for Interacting with Augmented and Virtual Reality Environments |
US11099707B2 (en) | 2018-01-24 | 2021-08-24 | Apple Inc. | Devices, methods, and graphical user interfaces for system-wide behavior for 3D models |
US12099692B2 (en) | 2018-01-24 | 2024-09-24 | Apple Inc. | Devices, methods, and graphical user interfaces for system-wide behavior for 3D models |
CN110971976A (en) * | 2019-11-22 | 2020-04-07 | 中国联合网络通信集团有限公司 | Audio and video file analysis method and device |
Also Published As
Publication number | Publication date |
---|---|
NZ620528A (en) | 2016-01-29 |
EP2740022A1 (en) | 2014-06-11 |
JP2014522054A (en) | 2014-08-28 |
KR102052771B1 (en) | 2019-12-05 |
MX2014001342A (en) | 2014-05-12 |
RU2623198C2 (en) | 2017-06-27 |
CA2843607C (en) | 2019-01-15 |
CN103907087B (en) | 2017-06-16 |
MY167640A (en) | 2018-09-21 |
AU2012290559B2 (en) | 2016-12-15 |
CO6890079A2 (en) | 2014-03-10 |
IN2014CN00621A (en) | 2015-04-03 |
RU2014103238A (en) | 2015-08-10 |
EP2740022A4 (en) | 2015-04-29 |
BR112014002379A2 (en) | 2018-09-25 |
CA2843607A1 (en) | 2013-02-07 |
IL230724A0 (en) | 2014-03-31 |
AU2012290559A1 (en) | 2014-02-20 |
US20130033525A1 (en) | 2013-02-07 |
EP2740022B1 (en) | 2021-04-21 |
WO2013019404A1 (en) | 2013-02-07 |
CL2014000244A1 (en) | 2014-08-22 |
HK1199520A1 (en) | 2015-07-03 |
US8687023B2 (en) | 2014-04-01 |
BR112014002379B1 (en) | 2021-07-27 |
CN103907087A (en) | 2014-07-02 |
KR20140058519A (en) | 2014-05-14 |
MX338046B (en) | 2016-03-31 |
JP5980924B2 (en) | 2016-08-31 |
Similar Documents
Publication | Title |
---|---|
US8687023B2 (en) | Cross-slide gesture to select and rearrange |
US9891795B2 (en) | Secondary actions on a notification |
US20130067392A1 (en) | Multi-Input Rearrange |
US20190310771A1 (en) | Information processing apparatus, information processing method, and program |
AU2011282997B2 (en) | Motion continuation of touch input |
EP2715491B1 (en) | Edge gesture |
EP2715485B1 (en) | Target disambiguation and correction |
US20140372923A1 (en) | High Performance Touch Drag and Drop |
US20130014053A1 (en) | Menu Gestures |
US9348498B2 (en) | Wrapped content interaction |
US20130019201A1 (en) | Menu Configuration |
US20170220243A1 (en) | Self-revealing gesture |
US20130067393A1 (en) | Interaction with Lists |
US9588679B2 (en) | Virtual viewport and fixed positioning with optical zoom |
NZ620528B2 (en) | Cross-slide gesture to select and rearrange |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 034766/0509; Effective date: 20141014 |