US20140359538A1 - Systems and methods for moving display objects based on user gestures

Info

Publication number
US20140359538A1
Authority
United States
Prior art keywords
output device
user
user gesture
gesture
display object
Prior art date
2013-05-28
Legal status
Abandoned
Application number
US13/903,056
Inventor
Pavan Kumar Singh Thakur
Robert William Grubbs
Justin V. John
Current Assignee
General Electric Co
Original Assignee
General Electric Co
Priority date
2013-05-28
Filing date
2013-05-28
Publication date
2014-12-04
Application filed by General Electric Co
Priority to US13/903,056
Assigned to General Electric Company (assignors: Pavan Kumar Singh Thakur, Robert William Grubbs, Justin V. John)
Publication of US20140359538A1
Application status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus

Abstract

Certain embodiments herein relate to systems and methods for moving display objects based on user gestures. In one embodiment, a system can include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device in order to identify a display object displayed on the output device. The instructions may be configured to detect a second user gesture adjacent to the output device in order to identify a location to move the display object. The instructions may be configured to update the output device to display the display object at the identified location on the output device.

Description

    FIELD OF THE DISCLOSURE
  • Embodiments of the disclosure generally relate to moving display objects displayed on an output device, and more particularly, to systems and methods for moving display objects based on user gestures.
  • BACKGROUND
  • It has become increasingly popular to provide touch-sensitive displays in mobile and communication-type computing devices, such as handheld tablets or smartphones. Typically, objects displayed on such devices can be moved by touching and dragging the object with one or more fingers. While users of relatively small computing devices can comfortably use conventional touch-and-drag gestures to move objects on the associated displays, relatively large surface computers have much larger display areas, making it uncomfortable for users to move display objects using conventional touch-and-drag gestures.
  • BRIEF SUMMARY OF THE DISCLOSURE
  • Some or all of the above needs and/or problems may be addressed by certain embodiments of the disclosure. Certain embodiments may include systems and methods for moving display objects based on user gestures, such as objects displayed on an output device of a surface computer. According to one embodiment of the disclosure, there is disclosed a system. The system may include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device of a surface computer to identify a display object displayed on the output device. The instructions may be further configured to detect a second user gesture adjacent to the output device identifying a location to move the display object on the output device. The instructions may further be configured to update the output device to display the display object at the identified location on the output device.
  • According to another embodiment of the disclosure, there is disclosed a method. The method can include detecting, by a control device, a first user gesture adjacent to an output device of a surface computer identifying at least one display object displayed on the output device. The method may further include detecting, by the control device, a second user gesture adjacent to the output device identifying a location to move the display object. The method may also include updating, by the control device, the output device to display the identified display object at the identified location on the output device.
  • Other embodiments, systems, methods, aspects, and features of the disclosure will become apparent to those skilled in the art from the following detailed description, the accompanying drawings, and the appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The detailed description is set forth with reference to the accompanying drawings, which are not necessarily drawn to scale. The use of the same reference numbers in different figures indicates similar or identical items.
  • FIG. 1 illustrates an example system for moving display objects based on user gestures, according to an embodiment of the disclosure.
  • FIG. 2 is a flow diagram of an example method for moving display objects based on user gestures, according to an embodiment of the disclosure.
  • FIG. 3A is an example method for identifying a display object based on user gestures, according to an embodiment of the disclosure.
  • FIG. 3B is an example method for identifying a location to move a display object based on user gestures, according to an embodiment of the disclosure.
  • FIG. 3C is an example method for updating an output device to display an identified display object at an identified location on the output device, according to an embodiment of the disclosure.
  • DETAILED DESCRIPTION
  • Illustrative embodiments of the disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the disclosure are shown. The disclosure may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements.
  • Certain embodiments disclosed herein relate to moving one or more display objects based on user gestures. Accordingly, a system can be provided to facilitate moving display objects based upon detecting a first user gesture and a second user gesture generated by one or more users interacting with the output device. For example, a user may interact with the output device via a finger stroke and/or a finger tap adjacent to the surface of the output device. Based upon the first and the second user gestures, a display object may be selected and a location to move the display object on the output device may be identified. Thereafter, the output device may be updated to display the identified display object at the identified location on the output device. One or more technical effects associated with certain embodiments herein may include, but are not limited to, reduced time and expense for a user to move display objects to new positions on a relatively large output device of a surface computer without employing the traditional touch-and-drag methods briefly described above.
  • FIG. 1 depicts a block diagram of one example system 100 that facilitates moving display objects based on user gestures. According to an embodiment of the disclosure, the system 100 may include a surface computer 110 that includes an output device 120. The output device 120 may be configured to display to a user one or more display objects 130, such as, for instance, user interface controls, that may include text, colors, images, icons, and the like.
  • With continued reference to FIG. 1, the surface computer 110 may further include one or more input devices 140 configured to detect and/or capture user gestures adjacent to the output device 120. In certain embodiments, the input devices 140 may include a user gesture capturing device, such as, for instance, one or more cameras and/or transparent ink pad controls disposed in close proximity to the output device 120. In certain embodiments, an input device 140 can include a gesture reader software module and/or a transparent ink pad user interface control. In any instance, the input devices 140 can be configured to detect a first user gesture and a second user gesture adjacent to the output device 120 and communicate them in real time or near real time to a control device, such as the control device 150 in FIG. 1, via a network, such as the network 105 in FIG. 1. In certain embodiments, the control device 150 may be configured to receive and analyze the first and the second user gestures from the input devices 140.
  • Based at least upon the first and the second user gestures, the control device 150 may also be configured to identify and/or select a display object 130 and a location on the output device 120 to move the display object 130, and/or to generate and transmit to the surface computer 110, via the network 105, an updated presentation of the output device 120 that displays the identified display object 130 at the identified location, as will be described.
  • The control device 150 may include any number of suitable computer processing components that may, among other things, analyze user gestures detected by the input devices 140. Examples of suitable processing devices that may be incorporated into the control device 150 include, but are not limited to, personal computers, server computers, application-specific circuits, microcontrollers, minicomputers, other computing devices, and the like. As such, the control device 150 may include any number of processors 155 that facilitate the execution of computer-readable instructions. By executing computer-readable instructions, the control device 150 may include or form a special purpose computer or particular machine that facilitates processing of user gestures in order to move display objects displayed on the output device 120.
  • In addition to the one or more processors 155, the control device 150 may include one or more memory devices 160, one or more input/output (“I/O”) interfaces 165, and/or one or more communications and/or network interfaces 170. The one or more memory devices 160 or memories may include any suitable memory devices, for example, caches, read-only memory devices, random access memory devices, magnetic storage devices, etc. The one or more memory devices 160 may store user gestures or other data, executable instructions, and/or various program modules utilized by the control device 150, for example, data files 170, an operating system (“OS”) 180, and/or a user gesture analyzer module 185. The data files 170 may include any suitable data that facilitates the operation of the control device 150 including, but not limited to, information associated with one or more detected user gestures and/or information associated with one or more control actions directed by the control device 150 based on detected user gestures. The OS 180 may include executable instructions and/or program modules that facilitate and/or control the general operation of the control device 150.
  • Additionally, the OS 180 may facilitate the execution of other software programs and/or program modules by the processors 155, such as, the user gesture analyzer module 185. The user gesture analyzer module 185 may be a suitable software module configured to analyze and/or process user gestures detected by the input devices 140. For instance, the user gesture analyzer module 185 may analyze user gestures detected by the input devices 140, which may be collected and stored in memory 160.
  • According to one embodiment, the control device 150 may be configured to detect a first user gesture via the one or more input devices 140. For instance, upon viewing one or more display objects 130 displayed on the output device 120, a first user may generate a first user gesture using one or more fingers in order to select, or otherwise identify, a display object 130 the user would like to move. To do so, in one embodiment, a user may tap the screen of the output device 120 with a finger where the display object 130 is displayed in order to indicate that the user would like to move the display object 130 to another location on the output device 120. As another non-limiting example, the user may generate a finger stroke gesture on the screen of the output device 120 in order to identify the display object 130 the user would like to move.
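  • By way of illustration only, the association between a tap and a display object could be implemented as a simple hit test over each object's on-screen bounds. The sketch below is a hypothetical reading of that step, not the patent's implementation; the DisplayObject and hit_test names are invented for this example.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DisplayObject:
    """A rectangular display object, e.g., a user interface control."""
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        """Return True if the point (px, py) falls within this object's bounds."""
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def hit_test(objects: List[DisplayObject], px: float, py: float) -> Optional[DisplayObject]:
    """Return the topmost display object under a tap, or None if the tap
    landed on empty screen; objects are assumed drawn in list order."""
    for obj in reversed(objects):
        if obj.contains(px, py):
            return obj
    return None
```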
  • In certain embodiments, the input devices 140 may include one or more program modules that facilitate capturing detected user gestures and any other information associated with the user gestures. For instance, the input devices 140 may include one or more cameras that detect a user gesture. Thereafter, a user gesture reader software module may be executed and configured to automatically, or in response to some other trigger, transmit the captured user gesture and any other information associated with the user gesture to the control device 150 via network 105. Similarly, in another example, the input devices 140 may include one or more transparent ink pad controls, where upon detecting a user gesture by the transparent ink pad controls, the transparent ink pad control interface transmits the user gesture to the control device 150 via network 105.
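  • For instance, an input device module might package a detected gesture into a small event record before transmitting it over the network 105. A minimal sketch follows, assuming a JSON payload; the patent does not define a wire format, and every field name here is an invented assumption.

```python
import json
import time

def make_gesture_event(kind: str, px: float, py: float, device_id: str) -> str:
    """Serialize a detected gesture as a JSON event for transmission to the
    control device; all field names are illustrative assumptions."""
    return json.dumps({
        "type": kind,           # e.g., "tap" or "stroke"
        "x": px,
        "y": py,
        "device": device_id,    # which input device captured the gesture
        "timestamp": time.time(),
    })
```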
  • Upon receiving the first user gesture, the control device 150 may be configured to execute the user gesture analyzer module 185. The user gesture analyzer module 185 may be configured to analyze the first user gesture. For instance, the user gesture analyzer module 185 may be configured to associate a location of the first user gesture on the output device 120 with the location of a display object 130 on the output device 120. In this way, the user gesture analyzer module 185 may determine the particular display object 130 the user would like to move. Having identified the display object 130 the user would like to move, in one embodiment, the user gesture analyzer module 185 may be configured to select the display object 130 and/or wait to receive a second user gesture detected by the input device 140.
  • Continuing with the same example, after the first user gesture by a first user, a second user gesture may be generated by a second user. According to one embodiment, the second user may generate a second user gesture, such as a finger tap gesture, using one or more fingers in order to select or otherwise identify a location on the output device 120 to move the identified display object 130. To do so, in one embodiment, the user may tap a location on the screen of the output device 120 with a finger in order to indicate the location to move the display object 130 on the output device 120.
  • Similar to the first user gesture, the input device 140 may be configured to automatically, or in response to some other trigger, transmit to the control device 150 via the network 105 the captured second user gesture and any other information associated with the second user gesture. Upon receiving the second user gesture, the control device 150 may be configured to execute the user gesture analyzer module 185. The user gesture analyzer module 185 may be configured to analyze the second user gesture. For instance, the user gesture analyzer module 185 may be configured to associate the second user gesture with the first user gesture. In this way, the user gesture analyzer module 185 may be configured to associate the display object 130 identified by the first user gesture with the location on the output device 120 identified by the second user gesture. Thereafter, the user gesture analyzer module 185 may be configured to update the output device 120 to display the identified display object 130 at the identified location on the output device 120. For instance, the user gesture analyzer module 185 may direct the communication by the control device 150 of an updated presentation of the display objects 130 to the surface computer 110 for display on the output device 120.
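  • To make the pairing concrete, the sketch below chains the two gestures using the hit_test helper from the earlier sketch: the first tap selects a display object, and the second tap supplies its destination. This is one plausible reading of the analyzer's behavior, not the patent's actual module.

```python
from typing import List, Optional  # reuses DisplayObject/hit_test from the sketch above

class UserGestureAnalyzer:
    """Illustrative 'select, then place' pairing of two tap gestures."""

    def __init__(self, objects: List[DisplayObject]):
        self.objects = objects
        self.selected: Optional[DisplayObject] = None

    def handle_tap(self, px: float, py: float) -> None:
        if self.selected is None:
            # First user gesture: identify the display object under the tap
            # (no-op if the tap lands on empty screen).
            self.selected = hit_test(self.objects, px, py)
        else:
            # Second user gesture: move the selected object to the tapped
            # location, then clear the selection for the next gesture pair.
            self.selected.x, self.selected.y = px, py
            self.selected = None
```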
  • In another non-limiting example, the user gesture analyzer module 185 may be configured to make more complex user gesture assessments. For instance, a first user gesture may be a finger stroke, such as a circle gesture, representing a command or a text character, such as the letter “o.” Upon receiving the first user gesture, according to one embodiment, the user gesture analyzer module 185 may be executed and configured to analyze the first user gesture in order to identify the command or the text character associated with it. Thereafter, in certain embodiments, the user gesture analyzer module 185 may search one or more data files 170 that identify, for each command or text character, a corresponding gesture action, such as selecting and/or moving the display object 130, to be executed by the control device 150 in response to detecting a second user gesture by the input device 140 or some other trigger.
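  • As a sketch of that lookup step, a recognized command or text character could simply key into a table of gesture actions. The characters and action names below are invented placeholders for whatever the data files would contain.

```python
from typing import Optional

# Hypothetical contents of a data file mapping a recognized text character
# or command to a gesture action; characters and action names are invented.
GESTURE_ACTIONS = {
    "o": "select_display_object",   # e.g., a circle stroke around an object
    "m": "move_display_object",
    "x": "remove_display_object",
}

def lookup_gesture_action(character: str) -> Optional[str]:
    """Return the gesture action for a recognized character, if any."""
    return GESTURE_ACTIONS.get(character)
```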
  • Upon detecting the second user gesture, the user gesture analyzer module 185 may then execute the gesture action and/or direct the communication from the control device 150 of an updated presentation of the display objects 130 to the surface computer 110 for display on the output device 120, reflecting the executed action.
  • As desired, embodiments of the disclosure may include a system 100 with more or fewer components than those illustrated in FIG. 1. Additionally, certain components of the system 100 may be combined in various embodiments of the disclosure. The system 100 of FIG. 1 is provided by way of example only.
  • Referring now to FIG. 2, shown is a flow diagram of an example method 200 for moving one or more display objects being displayed on an output device of a surface computer based on one or more user gestures, according to an illustrative embodiment of the disclosure. The method 200 may be utilized in association with various systems, such as the system 100 illustrated in FIG. 1.
  • The method 200 may begin at block 205. At block 205, a control device, such as 150 in FIG. 1, may detect a first user gesture adjacent to an output device, such as 120 in FIG. 1, of a surface computer, such as 110 in FIG. 1. In certain embodiments, the first user gesture may be analyzed by, for example, a user gesture analyzer module such as 185 in FIG. 1, in order to identify a display object, such as 130 in FIG. 1, on the output device that a user would like to move to another location on the output device. In certain embodiments, the first user gesture may be detected by an input device, such as the input device 140 illustrated in FIG. 1. As described above, the first user gesture may include a finger-based gesture, such as a finger stroke gesture, that may be generated by a first user.
  • Next, at block 210, the control device 150 may detect, via the input device 140, a second user gesture adjacent to the output device 120 of the surface computer 110 identifying a location to move the display object 130 on the output device 120. In certain embodiments, the second user gesture may be a finger tap gesture generated by a second user.
  • Lastly, at block 215, the control device 150 may update the output device 120 to display the identified display object 130 at the identified location on the output device 120. As described above, based on the detected first and second user gestures, the control device 150 may be configured to communicate an updated presentation of the display object 130 to the output device 120 for display to one or more users.
  • The method 200 of FIG. 2 may optionally end following block 215.
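  • Tying the blocks together, a minimal walk-through of method 200 using the sketches above might look like the following; the coordinates and object name are arbitrary.

```python
# Block 205: the first tap over the object identifies it; block 210: the
# second tap supplies the destination; block 215: the display is redrawn.
objects = [DisplayObject(name="icon", x=10, y=10, width=50, height=50)]
analyzer = UserGestureAnalyzer(objects)

analyzer.handle_tap(30, 30)     # first user gesture: selects "icon"
analyzer.handle_tap(200, 120)   # second user gesture: picks the new location

assert (objects[0].x, objects[0].y) == (200, 120)
```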
  • The operations described and shown in the method 200 of FIG. 2 may be carried out or performed in any suitable order as desired in various embodiments of the disclosure. Additionally, in certain embodiments, at least a portion of the operations may be carried out in parallel. Furthermore, in certain embodiments, fewer or more operations than those described in FIG. 2 may be performed. As desired, the operations set forth in FIG. 2 may also be performed in a loop as user gestures continue to be detected.
  • Referring now to FIG. 3A, shown is an example method for identifying a display object based on user gestures as described in block 205 of FIG. 2. As illustrated in FIG. 3A, one or more display objects 320a may be displayed on an output device 310a. A user may identify or otherwise select the display object 320a by generating a first user gesture. For example, as shown in FIG. 3A, a user may tap the screen of the output device 310a with a finger where the display object 320a is displayed in order to indicate that the user would like to move the display object 320a to another location on the output device 310a.
  • Next, in FIG. 3B, shown is an example method for identifying a location to move a display object based on user gestures as described in block 210 of FIG. 2. As shown in FIG. 3B, a user may generate a second user gesture using one or more fingers in order to identify a location on an output device 310b to move an identified display object 320b. For instance, as shown in FIG. 3B, a user may tap a location on the screen of the output device 310b with a finger in order to indicate the location to move the display object 320b on the output device 310b.
  • Lastly, in FIG. 3C, shown is an example method for updating an output device to display an identified display object at an identified location on the output device as described in block 215 of FIG. 2. According to one embodiment, based upon the first and second user gestures, an updated presentation of a selected display object 320c at an identified location on an output device 310c may be displayed to one or more users.
  • The disclosure is described above with reference to block and flow diagrams of systems, methods, apparatus, and/or computer program products according to example embodiments of the disclosure. It will be understood that one or more blocks of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, respectively, can be implemented by computer-executable program instructions. Likewise, some blocks of the block diagrams and flow diagrams may not necessarily need to be performed in the order presented, or may not necessarily need to be performed at all, according to some embodiments of the disclosure.
  • These computer-executable program instructions may be loaded onto a general purpose computer, a special purpose computer, a processor, or other programmable data processing apparatus to produce a particular machine, such that the instructions that execute on the computer, processor, or other programmable data processing apparatus create means for implementing one or more functions specified in the flow diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement one or more functions specified in the flow diagram block or blocks. As an example, embodiments of the disclosure may provide for a computer program product, comprising a computer usable medium having a computer-readable program code or program instructions embodied therein, said computer-readable program code adapted to be executed to implement one or more functions specified in the flow diagram block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational elements or steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide elements or steps for implementing the functions specified in the flow diagram block or blocks.
  • Accordingly, blocks of the block diagrams and flow diagrams support combinations of means for performing the specified functions, combinations of elements or steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, can be implemented by special purpose, hardware-based computer systems that perform the specified functions, elements or steps, or combinations of special purpose hardware and computer instructions.
  • While the disclosure has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
  • This written description uses examples to disclose the disclosure, including the best mode, and also to enable any person skilled in the art to practice the disclosure, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the disclosure is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims (20)

What is claimed is:
1. A method for moving display objects based on user gestures, the method comprising:
detecting, by at least one control device, a first user gesture adjacent to an output device to identify at least one display object displayed on the output device;
detecting, by the at least one control device, a second user gesture adjacent to the output device identifying a location to move the at least one display object; and
updating, by the at least one control device, the output device to display the at least one display object at the identified location on the output device.
2. The method of claim 1, wherein the first user gesture is generated by a first user.
3. The method of claim 1, wherein the first user gesture is a finger stroke gesture.
4. The method of claim 1, wherein either the first user gesture or the second user gesture is detected by the at least one control device via an input device disposed in close proximity to the output device.
5. The method of claim 4, wherein the input device comprises at least one of: a camera, a transparent ink pad, a gesture reader software module, or a transparent ink pad user interface control.
6. The method of claim 4, wherein detecting, by at least one control device, a first user gesture adjacent to the output device to identify at least one display object displayed on the output device further comprises:
receiving, by the at least one control device from the input device, the first user gesture;
determining, by the at least one control device, a text character or command associated with the first user gesture; and
identifying, by the at least one control device, a gesture action associated with the text character or command.
7. The method of claim 6, wherein the gesture action comprises selecting and moving the at least one display object on the output device.
8. The method of claim 1, wherein the second user gesture is generated by a second user.
9. The method of claim 1, wherein the second user gesture is a finger tap gesture.
10. The method of claim 6, wherein updating, by the at least one control device, the output device to display the at least one display object at the identified location on the output device further comprises:
executing, by the at least one control device, the gesture action to move the at least one display object to the identified location on the output device based at least in part on detecting the second user gesture.
11. A system for moving display objects being displayed on an output device of a computer based on one or more user gestures, the system comprising:
an input unit configured to detect at least one of a first user gesture or a second user gesture on an output device of a surface computer; and
at least one control device in communication with the input unit that is configured to:
detect a first user gesture adjacent to an output device in order to identify at least one display object displayed on the output device;
detect a second user gesture adjacent to the output device in order to identify a location to move the at least one display object; and
update the output device to display the at least one display object at the identified location on the output device.
12. The system of claim 11, wherein the first user gesture is generated by a first user.
13. The system of claim 11, wherein the first user gesture is a finger stroke gesture.
14. The system of claim 11, wherein the input unit is disposed in close proximity to the output device.
15. The system of claim 11, wherein the input unit comprises at least one of: a camera, a transparent ink pad, a gesture reader software module, or a transparent ink pad user interface control.
16. The system of claim 15, wherein the input unit detects the first user gesture or the second user gesture via the transparent ink pad user interface control.
17. The system of claim 11, wherein the at least one control device is further configured to select the at least one display object based on the first user gesture.
18. The system of claim 11, wherein the second user gesture is generated by a second user.
19. The system of claim 11, wherein the second user gesture is a finger tap gesture.
20. The system of claim 11, wherein the at least one control device is further configured to update the output device to display the at least one display object at the identified location on the output device in response to detecting the second user gesture.
US13/903,056, priority 2013-05-28, filed 2013-05-28: Systems and methods for moving display objects based on user gestures, US20140359538A1 (en), Abandoned

Priority Applications (1)

US13/903,056, priority 2013-05-28, filed 2013-05-28: Systems and methods for moving display objects based on user gestures (US20140359538A1)

Applications Claiming Priority (3)

US13/903,056, priority 2013-05-28, filed 2013-05-28: Systems and methods for moving display objects based on user gestures (US20140359538A1)
US14/136,840, priority 2013-05-28, filed 2013-12-20: Systems and methods for sharing a user interface element based on user gestures (US9459786B2)
PCT/US2014/038109, priority 2013-05-28, filed 2014-05-15: Systems and methods for moving display objects based on user gestures (WO2014193657A1)

Related Child Applications (1)

US14/136,840 (continuation), priority 2013-05-28, filed 2013-12-20: Systems and methods for sharing a user interface element based on user gestures (US9459786B2)

Publications (1)

US20140359538A1 (en): 2014-12-04

Family

ID=50942897

Family Applications (2)

US13/903,056 (Abandoned), priority 2013-05-28, filed 2013-05-28: Systems and methods for moving display objects based on user gestures (US20140359538A1)
US14/136,840 (Active, anticipated expiration 2034-04-09), priority 2013-05-28, filed 2013-12-20: Systems and methods for sharing a user interface element based on user gestures (US9459786B2)

Family Applications After (1)

US14/136,840 (Active, anticipated expiration 2034-04-09), priority 2013-05-28, filed 2013-12-20: Systems and methods for sharing a user interface element based on user gestures (US9459786B2)

Country Status (2)

US (2) US20140359538A1 (en)
WO (1) WO2014193657A1 (en)


Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10198252B2 (en) 2015-07-02 2019-02-05 Microsoft Technology Licensing, Llc Transformation chain application splitting
US10261985B2 (en) 2015-07-02 2019-04-16 Microsoft Technology Licensing, Llc Output rendering in dynamic redefining application
US9712472B2 (en) 2015-07-02 2017-07-18 Microsoft Technology Licensing, Llc Application spawning responsive to communication
US9733915B2 (en) 2015-07-02 2017-08-15 Microsoft Technology Licensing, Llc Building of compound application chain applications
US9785484B2 (en) 2015-07-02 2017-10-10 Microsoft Technology Licensing, Llc Distributed application interfacing across different hardware
US9733993B2 (en) 2015-07-02 2017-08-15 Microsoft Technology Licensing, Llc Application sharing using endpoint interface entities
US9658836B2 (en) 2015-07-02 2017-05-23 Microsoft Technology Licensing, Llc Automated generation of transformation chain compatible class
US9860145B2 (en) 2015-07-02 2018-01-02 Microsoft Technology Licensing, Llc Recording of inter-application data flow
US10031724B2 (en) 2015-07-08 2018-07-24 Microsoft Technology Licensing, Llc Application operation responsive to object spatial status
US10198405B2 (en) 2015-07-08 2019-02-05 Microsoft Technology Licensing, Llc Rule-based layout of changing information
US10277582B2 (en) 2015-08-27 2019-04-30 Microsoft Technology Licensing, Llc Application service architecture


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020018051A1 (en) * 1998-09-15 2002-02-14 Mona Singh Apparatus and method for moving objects on a touchscreen display
US20050091595A1 (en) 2003-10-24 2005-04-28 Microsoft Corporation Group shared spaces
US20060007174A1 (en) 2004-07-06 2006-01-12 Chung-Yi Shen Touch control method for a drag gesture and control module thereof
CA2635499A1 (en) * 2005-02-12 2006-08-24 Teresis Media Management, Inc. Methods and apparatuses for assisting the production of media works and the like
US8914733B2 (en) 2005-10-04 2014-12-16 International Business Machines Corporation User interface widget unit sharing for application user interface distribution
US8689115B2 (en) 2008-09-19 2014-04-01 Net Power And Light, Inc. Method and system for distributed computing interface
US8269736B2 (en) * 2009-05-22 2012-09-18 Microsoft Corporation Drop target gestures
US9065927B2 (en) * 2010-10-13 2015-06-23 Verizon Patent And Licensing Inc. Method and system for providing context based multimedia intercom services
TW201234223A (en) 2011-02-01 2012-08-16 Novatek Microelectronics Corp Moving point gesture determination method, touch control chip, touch control system and computer system
TW201237725A (en) 2011-03-04 2012-09-16 Novatek Microelectronics Corp Single-finger and multi-touch gesture determination method, touch control chip, touch control system and computer system
US20130290855A1 (en) 2012-04-29 2013-10-31 Britt C. Ashcraft Virtual shared office bulletin board

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060007166A1 (en) * 2004-07-06 2006-01-12 Jao-Ching Lin Method and controller for identifying a drag gesture
US20060238515A1 (en) * 2005-04-26 2006-10-26 Alps Electric Co., Ltd. Input device
US20080055272A1 (en) * 2006-09-06 2008-03-06 Freddy Allen Anzures Video Manager for Portable Multifunction Device
US20090066666A1 (en) * 2007-09-12 2009-03-12 Casio Hitachi Mobile Communications Co., Ltd. Information Display Device and Program Storing Medium
US20130073980A1 (en) * 2011-09-21 2013-03-21 Sony Corporation, A Japanese Corporation Method and apparatus for establishing user-specific windows on a multi-user interactive table
US20140282066A1 (en) * 2013-03-13 2014-09-18 Promontory Financial Group, Llc Distributed, interactive, collaborative, touchscreen, computing systems, media, and methods

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150067456A1 (en) * 2013-08-28 2015-03-05 Canon Kabushiki Kaisha Image display apparatus, control method therefor, and storage medium
US9563606B2 (en) * 2013-08-28 2017-02-07 Canon Kabushiki Kaisha Image display apparatus, control method therefor, and storage medium

Also Published As

US9459786B2: 2016-10-04
WO2014193657A1: 2014-12-04
US20140359478A1: 2014-12-04


Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:THAKUR, PAVAN KUMAR SINGH;GRUBBS, ROBERT WILLIAM;JOHN, JUSTIN V.;SIGNING DATES FROM 20130419 TO 20130502;REEL/FRAME:030497/0729

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION