KR101450231B1 - Touch gestures for remote control operations - Google Patents

Touch gestures for remote control operations

Info

Publication number
KR101450231B1
Authority
KR
South Korea
Prior art keywords
gesture
data
display
character
computing
Prior art date
Application number
KR1020137024833A
Other languages
Korean (ko)
Other versions
KR20130121992A (en)
Inventor
Yang Li
Hao Lu
Original Assignee
Google Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US13/033,295 (published as US20120216152A1)
Priority to US13/250,055 (published as US8271908B2)
Application filed by Google Inc.
Priority to PCT/US2012/025093 (published as WO2012115823A1)
Publication of KR20130121992A
Application granted
Publication of KR101450231B1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment; interaction with lists of selectable items, e.g. menus
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device; using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N21/4126 - The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 - User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 - User interfaces specially adapted for controlling a client device through a remote control device, characterized by hardware details
    • H04N21/42224 - Touch pad or touch panel provided on the remote control

Abstract

Generally, the present invention describes a technique for providing a user of a first computing device (e.g., a mobile device) with the ability to control a second computing device (e.g., a television) using the first computing device. Specifically, the techniques of the present invention may, in some instances, enable the user to remotely control and operate the second computing device using drawing gestures at the mobile computing device. Using a presence-sensitive user interface device (e.g., a touch screen), the user can draw gestures indicating characters associated with actions and commands to control the second computing device.

Description

TOUCH GESTURES FOR REMOTE CONTROL OPERATIONS

The present invention relates to a gesture-based user interface for a mobile device.

BACKGROUND OF THE INVENTION

Computing devices are constantly improving and becoming more commonly used. Additionally, touch-based interaction using the touch screen of a computing device is becoming a more common and primary interaction method for the user interfaces of mobile devices. The touch-based interaction may be, for example, finger-based touch input.

Further, computing devices are increasingly used to interact with other devices and to perform operations beyond the simple tasks traditionally associated with computing devices. In some cases, a computing device may be used to remotely control the operation of another device.

Generally, the present invention describes a technique for providing a user of a first computing device with the ability to control a second computing device using the first computing device. In particular, the techniques of the present invention may, in some instances, enable a user to remotely control and operate another device using drawing gestures at a mobile computing device. Using a presence-sensitive user interface device (e.g., a touch screen), a user may draw a gesture to indicate a character associated with an operation or command, thereby controlling the second computing device.

In one example, the invention provides a method comprising: receiving, via a presence-sensitive user interface device of a first device, a first user input comprising a first drawing gesture; transmitting, by the first device, first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device including a display for displaying one or more selectable elements, wherein at least one of the selectable elements is graphically-highlighted on the display based on the first data; receiving, via the presence-sensitive user interface device of the first device, a second user input comprising a selection gesture; and transmitting, by the first device, second data indicative of the selection gesture, wherein the at least one graphically-highlighted element is selected in response to the second data.

In another example, the invention provides a computer-readable storage medium encoded with instructions that, when executed, cause one or more processors of a first device to perform operations comprising: receiving, via a presence-sensitive user interface device of the first device, a first user input comprising a first drawing gesture; transmitting, by the first device, first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device including a display for displaying one or more selectable elements, wherein at least one of the selectable elements is graphically-highlighted on the display based on the first data; receiving, via the presence-sensitive user interface device of the first device, a second user input comprising a selection gesture; and transmitting, by the first device, second data indicative of the selection gesture, wherein the at least one graphically-highlighted element is selected in response to the second data.

In yet another example, the present invention provides a system comprising: one or more processors; a presence-sensitive user interface device; a gesture module operable by the one or more processors to receive, via the presence-sensitive user interface device, a first user input comprising a first drawing gesture; means for transmitting first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device comprising a display for displaying one or more selectable elements, wherein at least one of the selectable elements is graphically-highlighted on the display based on the first data; and a user interface controller operable by the one or more processors to receive, via the presence-sensitive user interface device, a second user input comprising a selection gesture, wherein the means for transmitting transmits second data representative of the selection gesture, and wherein the at least one graphically-highlighted element is selected in response to the second data.

In yet another example, the present invention provides a system comprising: a first device comprising a presence-sensitive user interface device that receives a first user input comprising a first drawing gesture, and means for transmitting data; and a second device coupled to the first device and including a display to display one or more selectable elements. The means for transmitting data transmits first data representing the first drawing gesture to the second device, wherein at least one of the selectable elements is graphically-highlighted on the display based on the first data. The presence-sensitive user interface device receives a second user input comprising a selection gesture, and the means for transmitting transmits second data indicative of the selection gesture, wherein the at least one graphically-highlighted element is selected in response to the second data.

As a non-limiting example, certain techniques of the present invention may allow a user of a computing device to perform a particular remote operation using a drawing gesture on the touch screen of the computing device. The user can remotely control another device by drawing different gesture patterns on the touch screen, each indicating a desired action.

The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.

Figure 1 is a block diagram illustrating an exemplary system in accordance with the teachings of the present invention.
Figure 2 is a block diagram illustrating additional details of an example of the first computing device shown in Figure 1.
Figures 3A and 3B are block diagrams illustrating an exemplary screen display of a first computing device when a user interacts with the first computing device to remotely control a second computing device in accordance with one or more aspects of the present invention.
Figures 4A and 4B are block diagrams illustrating another exemplary screen display of a first computing device when a user interacts with the first computing device to remotely control a second computing device in accordance with one or more aspects of the present invention.
Figure 5 is a flow diagram illustrating a method that may be performed by a computing device according to one or more aspects of the present invention.
Figure 6 is a state diagram illustrating exemplary state transition functionality associated with one or more aspects of the present invention.

Generally, the present invention describes techniques for providing a user with the ability to remotely control the operations of a second computing device using gestures on a first computing device (e.g., using a presence-sensitive user interface device, such as a touch screen user interface). These techniques may allow a user to select an item on the second computing device through interaction with the touch screen of the first computing device using gestures. These techniques may be integrated with existing systems to allow a user to draw a gesture on the touch screen, where the gesture may be associated with a character corresponding to an item displayed on the second computing device. With such a touch screen, a user can interact with the second computing device by drawing a gesture anywhere on the screen, regardless of which second computing device is being controlled or what is being displayed on it. In this manner, the user is not limited to a predefined keypad corresponding to a particular second computing device, but can interact with multiple computing devices using the same user interface. Additionally, as long as the user draws the gesture somewhere within the gesture area of the user interface, in some instances, the user can operate the second computing device without having to look at the first computing device.

In one example, at least a portion of the touch screen of the computing device may be assigned for drawing gestures. Using the assigned portion, the user can draw a character corresponding to an item displayed on the second computing device. In one example, the items may be menu items, and the character that the user draws may correspond to the first letter of one or more of the items or to a shape associated with an item. An item corresponding to the drawn character can be highlighted, and the user can refine the selection by drawing additional characters using the same technique until the desired item is highlighted. The user may then select an item (e.g., a selectable menu item) or operate an item (e.g., a control item) using the computing device. When the user draws a gesture on the computing device, a representation of the drawn gesture may be displayed on the computing device and/or on the display of the second computing device. The user may also use gestures on the computing device to delete the last character or a sequence of characters.
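
The following sketch illustrates one way the character-to-item association described above could behave. It is a minimal illustration, not the patented implementation: the menu names are hypothetical, and the matching rule (a case-insensitive prefix match on item names, highlighting the first match in list order) is an assumption chosen to mirror the examples discussed later.

    # Minimal sketch of incremental character-to-item matching (illustrative only).
    MENU_ITEMS = ["Bookmarks", "Spotlight", "Applications", "AM Radio", "Queue"]  # hypothetical menu

    def highlighted_item(character_sequence, items=MENU_ITEMS):
        """Return the first item whose name starts with the drawn character sequence."""
        prefix = character_sequence.lower()
        if not prefix:
            return None  # no characters drawn yet, so nothing is highlighted
        for item in items:
            if item.lower().startswith(prefix):
                return item
        return None  # no item matches the drawn characters

    if __name__ == "__main__":
        print(highlighted_item("A"))   # "Applications" (first item starting with "A")
        print(highlighted_item("AM"))  # "AM Radio" (refined by the second character)
        print(highlighted_item(""))    # None (deleting all characters clears the highlight)

Deleting the last character and re-running the function models the editing behavior described below: the highlight simply falls back to whatever the remaining prefix matches.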

Some computing devices provide a user interface that allows a user to use a first computing device as a remote control for a second computing device. Often, the user interface displays a keypad that resembles a physical remote control. Additionally, the user interface may display a remote control keypad specific to the second computing device. For example, the remote control keypad may be an application that the user must obtain for a particular second computing device; thus, if the user desires to remotely control a different second computing device using the first computing device, a different remote control keypad may need to be acquired, because different computing devices are associated with different keypads. Furthermore, some computing devices may require a keypad with many keys to enable different functions. However, due to the limited size of the computing device, the remote control keypad may be too small for convenient navigation or selection. This can frustrate the user.

The techniques of the present invention allow a user to control a second computing device (e.g., a Google TV device, a television, a projector, etc.) by drawing gestures on a presence-sensitive user interface device (e.g., a touch screen) of a first computing device (e.g., a mobile phone). Instead of using a keypad that resembles a remote control to control the second computing device, the techniques allow the user to draw a gesture indicating the desired action on at least a portion of the touch screen of the first computing device. The gesture may be converted into an action by associating the character represented by the drawn gesture with one or more items appearing on the second computing device.

Figure 1 is a block diagram illustrating an exemplary system 100 in accordance with the teachings of the present invention. In the example of Figure 1, the system 100 includes a first computing device 105 and a second computing device 110. Using the techniques of the present invention, the first computing device 105 may be used as a remote control device for the second computing device 110. In other examples, the system 100 may include one or more first computing devices 105 and/or one or more second computing devices 110. In the system 100, the devices may interact with each other via a communication link, such as the connection 112, when properly configured. In one example, the first computing device 105 and the second computing device 110 may include Wi-Fi or Bluetooth capabilities and may be configured to communicate wirelessly over the connection 112.

In some examples, the first computing device 105 may comprise a mobile device. For example, the first computing device 105 may be a wireless communication device (e.g., a wireless mobile handset or device), a video phone, a digital multimedia player, a personal digital assistant (PDA), a video game console, or another device, or a portion thereof. In some instances, the first computing device 105 may communicate with external, separate devices via one or more networks (not shown), such as one or more wired or wireless networks, which may in some cases provide access to the Internet. The first computing device 105 may communicate with one or more second computing devices 110, such as, for example, a standalone smart television or a set-top box connected to a television set.

As shown in the example of Figure 1, the first computing device 105 may include a user interface 102. At least a portion of the user interface 102 may be a presence-sensitive user interface device. The presence-sensitive user interface device may be, for example, a touch screen of the first computing device 105 that responds to tactile input, for example, from a user's finger or a stylus pen. The first computing device 105 may execute a remote control application 104 that allows a user of the first computing device 105 to interact with the second computing device 110 and control its operations. During execution, the user's interaction with the remote control application 104 enables control of the operations associated with the second computing device 110.

In one example, a user may initiate a connection between the first computing device 105 and the second computing device 110 by establishing communication over the connection 112. When the remote control application 104 is initially launched, the application may prompt the user with a list of second computing devices 110 that the first computing device 105 can communicate with and control using the remote control application 104. The user may use the first computing device 105 to select the second computing device that he or she wishes to connect to and control. When this connection is established between the first computing device 105 and the second computing device 110, the second computing device 110 may be added to the list of devices that the user can control using the remote control application 104.
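
The device-selection step above can be pictured with the short sketch below. It is illustrative only: the discover_devices() helper is a placeholder standing in for whatever discovery mechanism the connection 112 actually uses (e.g., Wi-Fi or Bluetooth scanning), and the device names are made up.

    # Illustrative prompt for choosing which second computing device to control.
    def discover_devices():
        # Placeholder: a real application would discover devices over the network.
        return ["Living room TV", "Bedroom set-top box", "Projector"]

    def prompt_for_device(devices):
        """Show the list of controllable devices and return the user's choice."""
        for index, name in enumerate(devices, start=1):
            print(f"{index}. {name}")
        choice = int(input("Select a device to control: "))
        return devices[choice - 1]

    if __name__ == "__main__":
        selected = prompt_for_device(discover_devices())
        print(f"Connecting to {selected} ...")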

Once a connection is established between the first computing device 105 and the second computing device 110, the remote control application 104 may be configured on the first computing device 105 so that input the user provides to the remote control application 104 can control operations associated with the second computing device 110. In one example, the remote control application 104 on the first computing device 105 may be configured to control one or more second computing devices 110. In one example, when the remote control application 104 is executed on the first computing device 105, a list of associated second computing devices 110 may be presented to the user so that the user can select the second computing device to control. In another example, when the remote control application 104 is executed on the first computing device 105, the second computing device 110 closest to the first computing device 105 may be selected automatically. In one example, the user may switch between second computing devices 110 using a gesture while the remote control application 104 is running.

In one example, the remote control application 104 may operate on the first computing device 105 to perform functions in accordance with the techniques of the present invention during execution. For example, the remote control application 104 may interact with and/or exchange data with a device external to the first computing device 105, such as the second computing device 110. The first computing device 105 may, in various instances, download or otherwise obtain the remote control application 104 from an external server via one or more networks (not shown). For example, a web browser hosted by the first computing device 105 may download one or more applications, such as the remote control application 104, when accessing one or more websites hosted by an external server (e.g., a web server).

During execution, the remote control application 104 may implement, invoke, execute, or otherwise use the user interface 102 as a mechanism for obtaining user input. For example, during an initial interaction, the remote control application 104 may prompt the user, via the user interface 102, for setup information associated with the user and any second computing devices 110 that the user wishes to interact with using the remote control application 104. In another example, during execution, the remote control application 104 may present a list of one or more second computing devices 110 to the user via the user interface 102, from which the user may select the second computing device that he or she wishes to interact with. The remote control application 104 may operate the user interface 102 in a gesture-based mode in which the user draws gestures in the user interface 102 to indicate the remote control operations the user wishes to use to control the operation of the second computing device 110.

During execution of the remote control application 104, the user interface 102 provides the user with at least one presence-sensitive portion with which the user can interact (e.g., with a finger or a stylus pen) to draw a gesture corresponding to an item on the second computing device 110. A representation of the drawn gesture may be displayed on the user interface 102 as the user draws the gesture. The remote control application 104 may include gesture recognition capabilities that recognize the drawn gesture and convert it into a matching character, e.g., a letter. When the user draws a gesture on the portion of the user interface 102 dedicated to gesture input, the remote control application 104 establishes communication with the second computing device 110 via the connection 112 and transmits information about the gesture to the second computing device 110. In one example, the first computing device 105 may perform gesture recognition by executing an algorithm to determine the corresponding character, and may send the drawn gesture and the corresponding character to the second computing device 110. In another example, the first computing device 105 may send the drawn gesture to the second computing device 110, and the second computing device 110 may perform the gesture recognition to determine the corresponding character. In this example, the second computing device 110 may include one or more processors operable to execute gesture recognition algorithms, which may be stored in the memory and/or storage of the second computing device 110.
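
The two variants described above (recognition on the first device versus recognition on the second device) differ mainly in what is transmitted. The sketch below shows, under assumed field names, how the first device might package either payload; the JSON layout and the recognize_character() stub are illustrative assumptions, not details taken from the patent.

    # Illustrative packaging of gesture data for transmission over the connection.
    import json

    def recognize_character(stroke_points):
        """Stub recognizer; a real implementation would classify the stroke shape."""
        return "B"  # pretend the stroke was recognized as the letter "B"

    def build_gesture_message(stroke_points, recognize_locally=True):
        """Build the payload the first device sends to the second device."""
        message = {"type": "drawing_gesture", "stroke": stroke_points}
        if recognize_locally:
            # Variant 1: the first device recognizes the character and sends it along.
            message["character"] = recognize_character(stroke_points)
        # Variant 2 (recognize_locally=False): only the raw stroke is sent, and the
        # second device performs the recognition itself.
        return json.dumps(message)

    if __name__ == "__main__":
        stroke = [(10, 12), (11, 30), (25, 20), (11, 22)]  # made-up touch coordinates
        print(build_gesture_message(stroke))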

A character corresponding to the drawn gesture may be associated with an item displayed on the display of the second computing device 110. An indication may be displayed on the second computing device 110 to identify the associated item. In one example, the associated item may be highlighted by displaying a box around the item, by displaying the item in a different color, by using a different pattern, or the like. The user may indicate, using a gesture (e.g., by touching the user interface 102), that he or she wants to select the highlighted item. In one example, selecting the highlighted item may result in the display of additional items if the selected item (e.g., item 2 in Figure 1) has sub-items 114, as shown in Figure 1. The user may select one of the sub-items 114 using the same technique described above. In another example, selecting a highlighted item may result in executing an action associated with the item (e.g., changing a channel, recording a program, or launching a website).

In one example, more than one item displayed on the second computing device 110 may be associated with a character. For example, the drawn gesture may correspond to the letter A, and two or more items displayed on the second computing device 110 may start with the letter A. In this example, the item listed first (e.g., the item closest to the top of the list) may be highlighted. The user can determine that the highlighted item is not the one he or she wants to interact with, and use the first computing device 105 to draw a gesture associated with the second character of the desired item. In the same manner as described above, the character corresponding to the newly drawn gesture can be sent to the second computing device 110, and the item associated with both the first character and the second character can be highlighted for user selection as described above.

In one example, the user may use different gestures for editing purposes. For example, if the user draws a wrong gesture or draws a gesture that highlights an unintended item, the user can swipe across the user interface 102 to delete the character corresponding to the last drawn gesture (e.g., by swiping horizontally to the left on the screen) or to delete the entire sequence of characters corresponding to the drawn gestures (e.g., by swiping horizontally to the right on the screen). When the user deletes a character or a sequence of characters, the items highlighted on the screen change accordingly. For example, if the user deletes one character from a sequence of characters, the item corresponding to the remaining characters may be highlighted. In another example, if the user deletes the entire sequence of characters, none of the items is highlighted.

In one example, as described above, the user can use gestures on the screen to invoke specific actions, such as deleting the characters corresponding to drawn gestures. For example, the user may swipe horizontally to the left across the user interface 102 to delete the character corresponding to the last drawn gesture. In another example, the user may swipe horizontally to the right across the user interface 102 to delete the sequence of characters corresponding to the sequence of drawn gestures. As described above, the user can select a highlighted item by tapping in the user interface 102. Other operations can be performed using different gestures in the user interface 102, such as moving a highlight box from one item to another by swiping vertically up or down in the user interface 102.
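
The mapping just described can be summarized in a small dispatch function. This is a sketch under assumed gesture labels ("swipe_left", "tap", and so on); how those labels are produced from raw touch input is outside its scope.

    # Sketch of dispatching editing gestures to actions on the drawn character sequence.
    def apply_edit_gesture(gesture, characters):
        """Return (updated character list, action name) for a recognized edit gesture."""
        if gesture == "swipe_left":    # delete the character from the last drawn gesture
            return characters[:-1], "delete_last_character"
        if gesture == "swipe_right":   # delete the entire drawn sequence
            return [], "clear_sequence"
        if gesture == "tap":           # select the currently highlighted item
            return characters, "select_highlighted_item"
        if gesture in ("swipe_up", "swipe_down"):  # move the highlight box between items
            return characters, "move_highlight_" + gesture.split("_")[1]
        return characters, "ignore"

    if __name__ == "__main__":
        chars = ["A", "M"]
        chars, action = apply_edit_gesture("swipe_left", chars)
        print(chars, action)   # ['A'] delete_last_character
        chars, action = apply_edit_gesture("swipe_right", chars)
        print(chars, action)   # [] clear_sequence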

In another example, the user may invoke these actions, including deleting the character corresponding to a drawn gesture, using a movement gesture, i.e., by moving the computing device 105 in a particular direction. For example, to delete the character corresponding to the last drawn gesture, the user may move the computing device 105 horizontally to the left, and to delete the entire sequence of characters, the user may move the computing device 105 horizontally to the right. In this example, the user may move the computing device 105 vertically upward or downward to move the highlight box from one item to another.
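
One plausible way to turn such device movements into the same actions is to classify the dominant displacement direction, as sketched below. The axis conventions and the 0.5 threshold are assumptions for illustration; a real implementation would read an accelerometer or a similar sensor.

    # Sketch of classifying a movement gesture from device displacement (dx, dy).
    def classify_movement(dx, dy, threshold=0.5):
        """Map a dominant horizontal/vertical displacement to an action name."""
        if abs(dx) < threshold and abs(dy) < threshold:
            return "none"                      # movement too small to count as a gesture
        if abs(dx) >= abs(dy):                 # horizontal movement dominates
            return "delete_last_character" if dx < 0 else "clear_sequence"
        return "move_highlight_up" if dy > 0 else "move_highlight_down"

    if __name__ == "__main__":
        print(classify_movement(-0.8, 0.1))  # leftward move -> delete_last_character
        print(classify_movement(0.9, 0.2))   # rightward move -> clear_sequence
        print(classify_movement(0.1, 0.7))   # upward move -> move_highlight_up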

Figure 2 is a block diagram illustrating additional details of the computing device 105 shown in Figure 1. Figure 2 illustrates only one particular example of the computing device 105, and many other exemplary embodiments of the computing device 105 may be used in other instances. As shown in Figure 2, the computing device 105 may include one or more processors 122, a memory 124, a network interface 126, one or more storage devices 128, and a user interface 130. For example, if the computing device 105 comprises a mobile device, the computing device 105 may also include a battery 132. Each of the components 122, 124, 126, 128, 130, and 132 may be interconnected via one or more buses for inter-component communication. The processor 122 may be configured to implement functions and/or process instructions for execution within the computing device 105. The processor 122 may process instructions stored in the memory 124 or instructions stored in the storage device 128.

The user interface 130 may include, for example, a monitor or other display device for providing visual information to a user of the computing device 105. The user interface 130 may further include one or more input devices, e.g., a manual keyboard, a mouse, a touchpad, a trackpad, etc., that allow the user to input data. In some instances, the user interface 130 may include a presence-sensitive user interface device, e.g., a touch screen, which may be used both to receive and process user input and to display output information. The user interface 130 may further include a printer or other device for outputting information. In various instances in the detailed description included herein, references to the user interface 130 may refer to the portion of the user interface 130 (e.g., the touch screen) that provides user input functionality. In one example, the user interface 130 may be a touch screen that responds to tactile input from the user (e.g., by the user's finger or a stylus pen).

The memory 124 may be configured to store information in the computing device 105 during operation. The memory 124 may be described as a computer-readable storage medium in some examples. In some instances, the memory 124 is a temporary memory meaning that the primary purpose of the memory 124 is not long term storage. The memory 124 may also be described as a volatile memory, meaning that the memory 124 does not retain the stored content when the computer is turned off. Examples of volatile memory include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), and other types of volatile memory known in the art. In some instances, the memory 124 may be used to store program instructions to be executed by the processor 122. The memory 124 may be used by software or applications running on the computing device 105 (e.g., the remote application 104 shown in Figure 1) to temporarily store information during program execution.

The storage device 128 may also include one or more computer-readable storage media. The storage device 128 may be configured to store larger amounts of information than the memory 124. The storage device 128 may further be configured to store information for long periods of time. In some examples, the storage device 128 may comprise non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard disks, optical disks, floppy disks, flash memories, or forms of electrically programmable memory (EPROM) or electrically erasable and programmable memory (EEPROM).

The computing device 105 also includes a network interface 126. The computing device 105 may use the network interface 126 to communicate with external devices (e.g., one or more servers, a web server, or the second computing device 110) via one or more networks. The computing device 105 may use the network interface 126, for example, in response to execution of one or more applications that require data to be transmitted to and/or received from another device (e.g., another computing device or a server). The computing device 105 may include, for example, Wi-Fi or Bluetooth capabilities, which may be configured to establish communication with the second computing device 110 (Figure 1) via the network interface 126.

Any application implemented in or executed by the computing device 105 (e.g., the remote control application 104 shown in Figure 1) may be implemented or contained within, operable by, executed by, and/or operatively coupled to the processor 122, the memory 124, the network interface 126, the storage devices 128, and/or the user interface 130.

In one example, the computing device 105 may include a remote control application 104 that allows a user to enter gestures at the computing device 105 to control the operation of the second computing device 110. The remote control application 104 may include a display module 142, a user interface controller 144, and a gesture module 146. The remote control application 104 may provide or display the user interface 102 through which the user can provide tactile input to operate the second computing device 110. The remote control application 104 may be stored in the memory 124 and/or the storage devices 128, and may be operable by the processor 122 to perform various tasks during execution.

In one example, during implementation or execution of the remote control application 104, the display module 142 may be operated by the processor 122 to define at least a portion of the user interface 130 that receives gestures via tactile user input. The user interface controller 144 may be operated by the processor 122 to receive, via the user interface 130, user input that specifies a drawn gesture intended to define an operation for controlling the second computing device 110. The user input may include contact with the user interface 130 (e.g., contact with the touch screen) to draw a gesture corresponding to an action associated with the second computing device 110.

The gesture module 146 may determine, based on the gesture the user draws at the user interface 130, a suitably matching character or action that may subsequently be associated with the appropriate action or item at the second computing device 110. In one example, the display module 142 may define at least a portion of the user interface 130 for inputting gestures. In one example, the gesture module 146 may display the drawn gesture in the user interface 130 and determine the corresponding character. In another example, the gesture module 146 may transmit the drawn gesture for display at the second computing device 110.

In some instances, the gesture module 146 may also determine an action corresponding to a motion, based on the direction in which the user moves the computing device 105. The action may, for example, delete characters previously defined by the user's drawn gestures. In one example, this deleting operation may instead be defined using a gesture drawn in the user interface 130 rather than a motion. Whether the gesture module 146 interprets motion gestures in addition to drawn gestures may be based on a user selection indicating the user's desire to use motion gestures, or may be the default setting of the remote control application 104. In examples where motion gestures are used, the computing device 105 may also include components capable of detecting changes in the motion and position of the computing device 105, e.g., an accelerometer, a compass, or the like.

The remote control application 104 may define at least a portion of the user interface 130 on which the user may draw, using gestures, characters associated with operations on the second computing device 110. The characters may correspond to letters or shapes associated with items and actions of the second computing device 110. Additionally, using particular drawn gestures or motions, the user can apply an action to already-entered characters, for example, deleting one or more characters.

In one example, when the user uses gestures to draw characters and actions, the gesture module 146 may determine the matching characters and actions. The display module 142 may be operable to receive data for the drawn gesture and display it on the user interface 130. In this manner, the display module 142 may be operable to display the gesture as the user draws it on the user interface 130. In some examples, the gesture module 146 may send data for the drawn gesture to the second computing device 110, which may display the drawn gesture. The second computing device 110 may determine a corresponding character or action based on the data for the drawn gesture. In one example, the gesture module 146 may also send, to the second computing device 110, data for the character or action corresponding to the drawn gesture.

The processor 122 may be operable to execute one or more algorithms including, for example, a gesture-interpretation algorithm. In one example, the gesture-interpretation algorithm may determine the character or action corresponding to the drawn gesture. In some examples, the algorithm may associate the characters and actions corresponding to the drawn gesture with items and actions of the second computing device 110. In another example, the algorithm enables the characters and actions corresponding to the drawn gesture to be sent to the second computing device 110, where the characters and actions may be associated with items and actions on the second computing device 110.

Figures 3A and 3B are block diagrams illustrating an exemplary screen display of a first computing device when a user interacts with the first computing device to remotely control a second computing device in accordance with one or more aspects of the present invention. The first computing device 305 may operate in the same manner as the first computing device 105 of Figures 1 and 2, and the second computing device 310 may operate in the same manner as the second computing device 110 of Figure 1. The first computing device 305 may include a presence-sensitive user interface device, such as, for example, a user interface 302 (e.g., a touch screen).

As shown in Figure 3A, the first computing device 305 is wirelessly connected to the second computing device 310 (e.g., via a connection 312) using one of many available wireless technologies. The second computing device 310 may list target elements, each of which may be associated with a different operation or function, on a display device 316 associated with the second computing device 310. In one example, the target elements displayed on the display device 316 of the second computing device 310 may be textual objects, as shown in Figure 3A, indexed by their names, e.g., bookmarks, spotlight, applications, and so on. A target element may also be, for example, a graphical object, such as a slider, a progress bar, a volume knob, etc., indexed by a name such as slider, bar, volume, knob, and the like. In the example of Figure 3A, the user may use the first computing device 305 to draw a gesture associated with the target element that the user desires to select. In this example, the user can draw a gesture representing the letter "B". When the user draws the gesture at the user interface 302 of the first computing device 305, a representation of the drawn gesture can be displayed in the user interface, as shown in Figure 3A.

As shown in Figure 3B, when the user draws a gesture for the letter "B", data associated with the drawn gesture may be transmitted to the second computing device 310. In one example, the first computing device 305 may interpret the drawn gesture to determine the corresponding character. In this example, the data associated with the drawn gesture may include data for the drawn gesture and the corresponding character. In another example, the first computing device 305 may send data for the drawn gesture to the second computing device 310, which may determine the corresponding character. The second computing device 310 may display a representation of the drawn gesture on the display device 316. Additionally, the second computing device 310 may associate the corresponding character with a target element displayed on the display device 316.

In another example, the second computing device 310 may communicate, to the first computing device 305, data associated with the target elements displayed on the display device 316. In this example, the first computing device 305 may interpret the drawn gesture to determine the corresponding character and associate the corresponding character with a target element based on the data communicated from the second computing device 310. The first computing device 305 may display an indication of the associated target element to the user and request confirmation. Upon confirmation, the first computing device 305 may communicate, to the second computing device 310, data indicating the selection of the associated target element. The second computing device 310 may proceed with the appropriate action associated with the selection of the displayed target element.

In one example, the second computing device 310 may associate the character with the first indexed element that begins with the same character as the character corresponding to the drawn gesture. In this example, the first indexed target element beginning with "B" is "bookmarks", so it may be associated with the letter "B" corresponding to the gesture drawn at the user interface 302 of the first computing device 305. The second computing device 310 may highlight the associated target element by displaying a box around the item (as shown in Figure 3B), by displaying the item in a different color, by using a different pattern, or the like.

If the user desires to select the highlighted target element, the user may provide a gesture to the first computing device 305 indicating a request to activate the highlighted target element. For example, the user may tap the user interface 302 of the first computing device 305 to activate "bookmarks". In one example, activating the highlighted target element may result in different actions depending on the highlighted element and its associated operation. For example, activating the "bookmarks" element may result in displaying another screen on the display device 316 that lists the bookmarks associated with the second computing device 310. The bookmarks can be listed and indexed, and the user can select one of the bookmarks in the same way described above for selecting one of the target elements. In another example, activating a highlighted target element may result in launching a website or screen associated with the highlighted target element. In another example, activating a highlighted target element may result in performing an action associated with the highlighted target element, e.g., increasing or decreasing the volume, changing display settings, fast-forwarding, and the like.
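
The different outcomes of activation described above can be pictured as a small dispatch on the kind of target element. The element table, the kind names, and the example URL below are hypothetical; they only illustrate the branching behavior, not an actual device menu.

    # Sketch of the second device acting on an activation, by kind of target element.
    TARGET_ELEMENTS = {
        "Bookmarks": {"kind": "submenu", "children": ["News", "Sports", "Weather"]},
        "AM Radio":  {"kind": "action",  "action": "tune_am_radio"},
        "Spotlight": {"kind": "website", "url": "https://example.com/spotlight"},
    }

    def activate(element_name):
        element = TARGET_ELEMENTS[element_name]
        if element["kind"] == "submenu":
            # Display a new indexed list; the user selects from it with further gestures.
            return ("show_list", element["children"])
        if element["kind"] == "website":
            return ("launch", element["url"])
        # Otherwise perform the action bound to the element (change channel, volume, ...).
        return ("perform", element["action"])

    if __name__ == "__main__":
        print(activate("Bookmarks"))  # ('show_list', ['News', 'Sports', 'Weather'])
        print(activate("AM Radio"))   # ('perform', 'tune_am_radio')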

Figures 4A and 4B are block diagrams illustrating another exemplary screen display of a computing device when the user interacts with the computing device to remotely control the second computing device in accordance with one or more aspects of the present invention. In some instances, more than one target element may correspond to a drawn gesture, as shown in Figure 4A. For example, the user may draw a gesture, as shown in Figure 4A, that may correspond to the letter "A". In this example, the second computing device 310 may highlight the first target element starting with the letter "A", e.g., "applications". If the highlighted target element is the one the user desires to select, the user can activate the highlighted target element as described above.

In one example, the user may have wanted to select the target element "AM Radio". In this example, the user can draw another gesture to indicate the desired target element. As shown in Figure 4B, the user can draw a gesture corresponding to the letter "M" and add it to the character sequence. In this example, the second computing device 310 may determine the corresponding target element by combining the characters corresponding to the drawn gestures in the order they were drawn, thereby associating "AM Radio" with the drawn gestures. As the user draws more gestures, the highlighted target element is updated accordingly; here, "AM Radio" is highlighted, as shown in Figure 4B. The user may activate the highlighted target element as described above.

In one example, if the user makes a mistake and wants to delete one or more gestures that have already been drawn, the user may use a gesture to delete the drawn gestures. For example, the user may swipe horizontally to the left in the user interface 302 to delete the last drawn gesture (e.g., "M"), or swipe horizontally to the right in the user interface 302 to delete the entire gesture sequence (e.g., "AM"). When the user deletes one or more drawn gestures, the highlighted target element may be updated accordingly. For example, in the example of Figure 4B, if the user deletes the gesture corresponding to the letter "M", the second computing device 310 may update the highlighted target element back to "applications". If the user deletes the entire sequence of drawn gestures, there will be no highlighted target element.

In one example, the user may wish to move the highlight to a different target element without drawing a gesture in the user interface 302. The user may use a movement gesture to move the highlight from one target element to another, where the movement may be, for example, an upward, downward, leftward, or rightward movement. In the example of Figure 4B, the user may want to select the target element "Queue". The user may move the computing device 305 upward to move the highlight box upward from "AM Radio" until the desired target element, e.g., "Queue", is highlighted.

Figure 5 is a flow diagram illustrating a method that may be performed by a computing device according to one or more aspects of the present invention. For example, the illustrated exemplary method may be performed, at least in part, by the first computing device 105 (Figures 1 and 2). In some instances, a computer-readable storage medium (e.g., a medium included in the storage device 128 of Figure 2) may store instructions that, when executed, cause one or more processors (e.g., the processor 122) to perform one or more of the actions shown in the method of Figure 5.

The method of Figure 5 includes receiving (502) a first user input comprising a first drawing gesture using a presence-sensitive user interface device (e.g., the touch screen 102) connected to a first computing device. As described above, the first drawing gesture may define a first characteristic associated with one or more target elements or items displayed on a display device associated with a second computing device (e.g., the second computing device 110). The first characteristic may be the first character of the one or more target elements or a shape associated with the one or more target elements. The first drawing gesture may be associated with at least one of the target elements based on the first characteristic, where the association may be performed by the first computing device and/or the second computing device (504). When an element or item displayed at the second computing device is associated with the drawing gesture, the element or item may be graphically highlighted (506) to indicate the associated element or item.

The method also includes receiving (508) a second user input comprising an activation gesture using the presence-sensitive user interface device. In response to receiving the activation gesture, the highlighted target element or item may be activated. In one example, activation may result in an action associated with the highlighted target element or item, such as, for example, changing a channel or adjusting a volume or display setting. In another example, activation may result in displaying a subset of elements or items associated with the highlighted target element or item, e.g., a submenu. When activation results in displaying a subset of elements, the same method described above may be used to select an element from the subset of elements.

In one example, more than one element or item may correspond to the first drawing gesture that defines a characteristic associated with the target element or item. In accordance with the teachings of the present invention, the highlighted target element or item may be the first listed element or item, or the one closest to the top of a menu or list of elements or items. In one example, the user may provide one or more additional drawing gestures corresponding to one or more additional characteristics associated with the desired target element or item. As the user inputs further drawing gestures, the associated target element or item may be updated and highlighted appropriately. In another example, if the highlighted element or item is not the desired one, the user may move the highlight to another element or item (e.g., by drawing a gesture in a particular direction or by moving the first computing device in a particular direction). In another example, if the highlighted element or item is not the desired one, the user may use a drawing gesture to delete one or more previously drawn gestures, as described above.

Figure 6 is a state diagram illustrating exemplary state transition functionality associated with one or more aspects of the present invention. As shown in Figure 6, there may be two states associated with the techniques of the present invention: an initial state 602 and an anchor state 604. During the initial state 602, the user has not yet drawn any gestures on the user interface of the first computing device (e.g., the first computing device 105 of Figures 1 and 2, or the first computing device 305 of Figures 3 and 4). When the user adds a gesture by drawing it on the user interface of the first computing device, the drawn gesture is interpreted to determine the corresponding character. The second computing device communicating with the first computing device may associate the character with a target element displayed on a display device associated with the second computing device. The associated target element may be highlighted, for example, using a focus box as described above. When a target element is highlighted, the associated state is the anchor state 604, which indicates that a gesture has been drawn and the corresponding character is associated with a target element.

While in the anchor state 604, several actions may occur. For example, the user may draw another gesture corresponding to a second character, and the highlighted target element is updated accordingly to correspond to the sequence of characters represented by the sequence of drawn gestures. In another example, from the anchor state 604, the user may change the highlighted target element by moving the highlight from one target element to another using a movement gesture. In this example, the user can move the highlight from one element to another without drawing additional or new gestures. In another example, from the anchor state 604, the user may use a gesture to delete the last drawn gesture if more than one gesture has been drawn. Deleting the last drawn gesture may change the corresponding sequence of characters and cause the highlighted target element to change to the target element corresponding to the remaining drawn gestures.

From the anchor state 604, the user may perform certain actions that cause a return to the initial state 602. For example, the user may activate the highlighted target element using a gesture indicating activation, for example, by tapping the user interface of the first computing device. Activating a highlighted target element may, for example, display a new list of target elements associated with the highlighted target element, or may perform an action associated with the highlighted target element. The user interface of the first computing device may then be cleared, and the user may start drawing gestures again to perform an operation or to select a target element associated with the updated display.

In one example, from the anchor state 604, the user may clear the entire sequence of drawn gestures by swiping to the right, which may remove the highlight from the highlighted target element and allow the user to start again from a cleared user interface. In another example, when the user has drawn only a single gesture, swiping to the left in the user interface of the computing device clears that single drawn gesture and thereby removes the highlight from the highlighted target element, returning to the initial state 602 with a cleared user interface.
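
The transitions described for Figure 6 can be condensed into a small two-state model, sketched below. The event names ("draw_gesture", "clear_all", and so on) are assumptions for illustration; the highlight is represented indirectly by whether any characters remain.

    # Compact sketch of the initial/anchor state behavior described for Figure 6.
    class RemoteGestureState:
        INITIAL, ANCHOR = "initial", "anchor"

        def __init__(self):
            self.state = self.INITIAL
            self.characters = []      # characters recognized from drawn gestures

        def handle(self, event, character=None):
            if event == "draw_gesture":
                self.characters.append(character)
            elif event == "delete_last" and self.characters:
                self.characters.pop()
            elif event in ("clear_all", "activate"):
                # Clearing all gestures or activating an element returns to the initial state.
                self.characters = []
            # Anchor state whenever at least one character is associated with a target element.
            self.state = self.ANCHOR if self.characters else self.INITIAL
            return self.state

    if __name__ == "__main__":
        machine = RemoteGestureState()
        print(machine.handle("draw_gesture", "A"))  # anchor
        print(machine.handle("draw_gesture", "M"))  # anchor
        print(machine.handle("delete_last"))        # anchor (one character remains)
        print(machine.handle("clear_all"))          # initial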

The techniques described in the present invention may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more microprocessors, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combination of such components. The term "processor" or "processing circuitry" may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit comprising hardware may also perform one or more of the techniques of the present invention.

Such hardware, software, and firmware may be implemented in the same device or in separate devices that support the various technologies described in the present invention. Moreover, any of the described units, modules, or components may be implemented separately or in conjunction with a separate but interoperable logic device. Depicting different features in a module or unit is intended only to highlight different functional aspects and does not mean that such a module or unit must be realized by separate hardware, firmware, or software components. Rather, the functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or may be integrated within common or separate hardware, firmware, or software components.

The techniques described in the present invention may also be embodied or encoded in a computer-readable medium, such as a computer-readable storage medium, containing instructions. Instructions embedded or encoded in a computer-readable medium, including a computer-readable storage medium, may cause one or more programmable processors or other processors to implement one or more of the techniques described herein, for example, when the instructions are executed by the one or more processors. The computer-readable storage medium may include random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electronically erasable programmable read-only memory (EEPROM), a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer-readable media. In some instances, an article of manufacture may comprise one or more computer-readable storage media.

In some examples, the computer-readable storage medium may comprise non-transitory media. The term "non-transitory" may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in RAM or cache).

Several embodiments of the present invention have been described. These and other embodiments are within the scope of the following claims.

Claims (20)

  1. A method for remotely controlling a device, the method comprising:
    Receiving, at a first device, via a presence-sensing display of the first device, a first user input comprising a first drawing gesture, the first drawing gesture representing a drawing corresponding to a first character;
    Transmitting, by the first device, first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device including a display for displaying one or more selectable elements, wherein at least a first element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the first data;
    Receiving, at the first device, a second user input comprising a second drawing gesture via the presence-sensing display, the second drawing gesture representing a drawing corresponding to a second character;
    Transmitting, by the first device, second data representing the second drawing gesture to the second device, wherein at least a second element of the selectable elements includes both the first character and the second character and is visually represented on the display of the second device based on the second data;
    Receiving, at the first device, after transmitting the second data, a third user input comprising a motion gesture through the presence-sensing display; and
    Transmitting, by the first device, third data representing the motion gesture to the second device,
    Wherein at least a third element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the third data.
  2. The method according to claim 1,
    Wherein the display of the second device outputs a representation of the first drawing gesture based on the first data.
  3. The method according to claim 1,
    Wherein the motion gesture comprises a first motion gesture,
    Receiving, at the first device, a second motion gesture, the second motion gesture defining a direction of motion;
    Transmitting, by the first device, motion data representing the second motion gesture to the second device
    Further comprising:
    Wherein, based on the motion data representing the second motion gesture, at least a fourth element of the selectable elements, arranged along the defined direction relative to at least the third element of the selectable elements, is visually represented on the display of the second device.
  4. The method according to claim 1,
    Receiving, at the first device, a fourth user input comprising a selection gesture via the presence-sensing display of the first device; and
    Transmitting, by the first device, fourth data representing the selection gesture to the second device
    Further comprising:
    Wherein at least a third element of the selectable elements is selected in response to the fourth data,
    Wherein the display of the second device outputs a new list of one or more selectable elements in response to the fourth data.
  5. The method according to claim 1,
    Receiving a fourth user input comprising a selection gesture through the presence-sensing display of the first device; and
    Transmitting, by the first device, fourth data representing the selection gesture to the second device
    Further comprising:
    Wherein at least a third element of the selectable elements is selected in response to the fourth data,
    Wherein an operation associated with at least a third element of the selectable elements is performed by the second device in response to the fourth data.
  6. The method according to claim 1,
    Wherein the one or more selectable elements comprise at least one of one or more words, one or more graphical user interface elements, or one or more operations of the second device.
  7. The method according to claim 1,
    Wherein the first data comprises data representing the first drawing gesture and the second device determines the first character based on the first data.
  8. The method according to claim 1,
    Further comprising determining, by the first device, the first character based on the first drawing gesture,
    Wherein the first data comprises the first character.
  9. A computer readable storage medium encoded with instructions that, when executed, cause one or more processors of a first device to perform operations for remotely controlling a device, the operations comprising:
    Receiving, at the first device, a first user input comprising a first drawing gesture via the presence-sensing display of the first device, the first drawing gesture representing a drawing corresponding to a first character;
    Sending, by the first device, first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device including a display for outputting one or more selectable elements, wherein at least a first element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the first data;
    Receiving, at the first device, a second user input comprising a second drawing gesture through the presence-sensing display of the first device, the second drawing gesture representing a drawing corresponding to a second character;
    Transmitting, by the first device, second data representing the second drawing gesture to the second device, wherein at least a second element of the selectable elements includes both the first character and the second character and is visually represented on the display of the second device based on the second data;
    Receiving, at the first device, after transmitting the second data, a third user input comprising a motion gesture through the presence-sensing display of the first device; and
    Transmitting, by the first device, third data representing the motion gesture to the second device,
    Wherein at least a third element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the third data.
  10. The computer readable storage medium of claim 9,
    Wherein the display of the second device outputs a representation of the first drawing gesture based on the first data.
  11. A first device for remotely controlling a second device, the first device comprising:
    One or more processors;
    Presence-sensing display;
    A gesture module operable by the one or more processors to receive, via the presence-sensing display, a first user input comprising a first drawing gesture and a second user input comprising a second drawing gesture, the first drawing gesture representing a drawing corresponding to a first character and the second drawing gesture representing a drawing corresponding to a second character;
    A network interface adapted to transmit first data representing the first drawing gesture to a second device wirelessly connected to the first device, the second device comprising a display for outputting one or more selectable elements, wherein at least a first element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the first data;
    Wherein the network interface is further adapted to transmit second data representing the second drawing gesture to the second device, wherein at least a second element of the selectable elements includes both the first character and the second character and is visually represented on the display of the second device based on the second data; and
    A user interface controller operable by the one or more processors to receive a third user input comprising a motion gesture via the presence-sensing display,
    Wherein the network interface is further adapted to transmit third data representing the motion gesture to the second device, and wherein at least a third element of the selectable elements includes the first character and is visually represented on the display of the second device based on the third data.
  12. A system for a first device to remotely control a second device, the system comprising:
    The first device, the first device comprising:
    A presence-sensing display adapted to receive user input, and
    A network interface adapted to transmit data; and
    A second device connected to the first device,
    The second device comprising a display adapted to output one or more selectable elements,
    Wherein the presence-sensing display receives a first user input comprising a first drawing gesture, the first drawing gesture representing a drawing corresponding to a first character,
    Wherein the network interface transmits first data representing the first drawing gesture to the second device,
    Wherein at least a first element of the selectable elements comprises the first character and is visually represented on a display of the second device based on the first data,
    Wherein the presence-sensing display receives a second user input comprising a second drawing gesture, the second drawing gesture representing a drawing corresponding to a second character,
    Wherein the network interface transmits second data representing the second drawing gesture to the second device,
    Wherein at least a second element of the selectable elements comprises both the first character and the second character and is visually represented on the display of the second device based on the second data,
    After the second data is transmitted, the presence-sensing display receives a third user input comprising a motion gesture,
    The network interface transmits third data representing the motion gesture to the second device, and
    Wherein at least a third element of the selectable elements comprises the first character and is visually represented on the display of the second device based on the third data.
  13. delete
  14. delete
  15. delete
  16. delete
  17. delete
  18. delete
  19. delete
  20. delete
KR1020137024833A 2011-02-23 2012-02-14 Touch gestures for remote control operations KR101450231B1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/033,295 2011-02-23
US13/033,295 US20120216152A1 (en) 2011-02-23 2011-02-23 Touch gestures for remote control operations
US13/250,055 2011-09-30
US13/250,055 US8271908B2 (en) 2011-02-23 2011-09-30 Touch gestures for remote control operations
PCT/US2012/025093 WO2012115823A1 (en) 2011-02-23 2012-02-14 Touch gestures for remote control operations

Publications (2)

Publication Number Publication Date
KR20130121992A KR20130121992A (en) 2013-11-06
KR101450231B1 (en) 2014-10-13

Family

ID=46653796

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020137024833A KR101450231B1 (en) 2011-02-23 2012-02-14 Touch gestures for remote control operations

Country Status (4)

Country Link
US (2) US20120216152A1 (en)
EP (1) EP2678765A1 (en)
KR (1) KR101450231B1 (en)
WO (1) WO2012115823A1 (en)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5327017B2 (en) * 2009-11-24 2013-10-30 ソニー株式会社 Remote operation device, remote operation system, information processing method and program using remote operation device
US9607505B2 (en) 2010-09-22 2017-03-28 Apple Inc. Closed loop universal remote control
US20120096354A1 (en) * 2010-10-14 2012-04-19 Park Seungyong Mobile terminal and control method thereof
KR20130007811A (en) * 2011-07-11 2013-01-21 삼성전자주식회사 Method and apparatus for displaying screen of portable terminal connected with external device
US20140068526A1 (en) * 2012-02-04 2014-03-06 Three Bots Ltd Method and apparatus for user interaction
US20130207901A1 (en) * 2012-02-10 2013-08-15 Nokia Corporation Virtual Created Input Object
US9817479B2 (en) * 2012-02-24 2017-11-14 Nokia Technologies Oy Method and apparatus for interpreting a gesture
WO2014073825A1 (en) * 2012-11-09 2014-05-15 Lg Electronics Inc. Portable device and control method thereof
US20130298071A1 (en) * 2012-05-02 2013-11-07 Jonathan WINE Finger text-entry overlay
US20130339859A1 (en) * 2012-06-15 2013-12-19 Muzik LLC Interactive networked headphones
CN103677377A (en) * 2012-09-26 2014-03-26 北京鼎元丰和科技有限公司 System and device for achieving terminal display
US8584049B1 (en) * 2012-10-16 2013-11-12 Google Inc. Visual feedback deletion
EP2722744A1 (en) * 2012-10-16 2014-04-23 Advanced Digital Broadcast S.A. Method for generating a graphical user interface.
EP2722745A1 (en) * 2012-10-17 2014-04-23 Advanced Digital Broadcast S.A. A method for operating a gesture-controlled graphical user interface
US8640046B1 (en) * 2012-10-23 2014-01-28 Google Inc. Jump scrolling
CN103778549A (en) * 2012-10-23 2014-05-07 北京鼎元丰和科技有限公司 Mobile application popularizing system and method
US10001918B2 (en) 2012-11-21 2018-06-19 Algotec Systems Ltd. Method and system for providing a specialized computer input device
US20140168098A1 (en) * 2012-12-17 2014-06-19 Nokia Corporation Apparatus and associated methods
US20140189602A1 (en) * 2012-12-28 2014-07-03 Mediatek Inc. Method and associated system for displaying graphic content on extension screen
US20140223382A1 (en) * 2013-02-01 2014-08-07 Barnesandnoble.Com Llc Z-shaped gesture for touch sensitive ui undo, delete, and clear functions
US20140325568A1 (en) * 2013-04-26 2014-10-30 Microsoft Corporation Dynamic creation of highlight reel tv show
KR102203885B1 (en) * 2013-04-26 2021-01-15 삼성전자주식회사 User terminal device and control method thereof
US10180728B2 (en) * 2013-05-17 2019-01-15 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
KR102063103B1 (en) * 2013-08-23 2020-01-07 엘지전자 주식회사 Mobile terminal
KR102130797B1 (en) 2013-09-17 2020-07-03 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
US20150106764A1 (en) * 2013-10-15 2015-04-16 Apple Inc. Enhanced Input Selection
JP2015176186A (en) * 2014-03-13 2015-10-05 ソニー株式会社 Information processing apparatus, information processing method and information processing system
US20150371529A1 (en) * 2014-06-24 2015-12-24 Bose Corporation Audio Systems and Related Methods and Devices
US9426203B2 (en) 2014-06-27 2016-08-23 Microsoft Technology Licensing, Llc Remote application control interface
US9554189B2 (en) 2014-06-30 2017-01-24 Microsoft Technology Licensing, Llc Contextual remote control interface
US10747426B2 (en) * 2014-09-01 2020-08-18 Typyn, Inc. Software for keyboard-less typing based upon gestures
CN107210950A (en) 2014-10-10 2017-09-26 沐择歌有限责任公司 Equipment for sharing user mutual
US10678326B2 (en) 2015-09-25 2020-06-09 Microsoft Technology Licensing, Llc Combining mobile devices with people tracking for large display interactions
US10572497B2 (en) 2015-10-05 2020-02-25 International Business Machines Corporation Parsing and executing commands on a user interface running two applications simultaneously for selecting an object in a first application and then executing an action in a second application to manipulate the selected object in the first application
US10228775B2 (en) * 2016-01-22 2019-03-12 Microsoft Technology Licensing, Llc Cross application digital ink repository
US9712863B1 (en) 2016-08-09 2017-07-18 Le Technology, Inc. Remote control device with programming guide information
CN108319422A (en) * 2017-01-18 2018-07-24 中兴通讯股份有限公司 A kind of multi-screen interactive touch control display method, device, storage medium and terminal
US10433134B2 (en) * 2017-01-24 2019-10-01 Arris Enterprises Llc Video gateway as an internet of things mesh enhancer apparatus and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010118247A1 (en) 2009-04-10 2010-10-14 Google Inc. Glyph entry on computing device
US20100293462A1 (en) 2008-05-13 2010-11-18 Apple Inc. Pushing a user interface to a remote device

Family Cites Families (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5726669A (en) * 1988-06-20 1998-03-10 Fujitsu Limited Multi-window communication system
US5649104A (en) * 1993-03-19 1997-07-15 Ncr Corporation System for allowing user of any computer to draw image over that generated by the host computer and replicating the drawn image to other computers
JP2863428B2 (en) * 1993-05-18 1999-03-03 富士通株式会社 Conversational graphics system
US5729687A (en) * 1993-12-20 1998-03-17 Intel Corporation System for sending differences between joining meeting information and public meeting information between participants in computer conference upon comparing annotations of joining and public meeting information
DE4446139C2 (en) * 1993-12-30 2000-08-17 Intel Corp Method and device for highlighting objects in a conference system
JP3486459B2 (en) * 1994-06-21 2004-01-13 キヤノン株式会社 Electronic information equipment and control method thereof
US5880743A (en) * 1995-01-24 1999-03-09 Xerox Corporation Apparatus and method for implementing visual animation illustrating results of interactive editing operations
JPH08305663A (en) * 1995-04-28 1996-11-22 Hitachi Ltd Teamwork support system
US7406214B2 (en) * 1999-05-19 2008-07-29 Digimarc Corporation Methods and devices employing optical sensors and/or steganography
US5867156A (en) * 1995-11-08 1999-02-02 Intel Corporation Automatic viewport display synchronization during application sharing
US6049329A (en) * 1996-06-04 2000-04-11 International Business Machines Corporartion Method of and system for facilitating user input into a small GUI window using a stylus
US5923323A (en) * 1996-06-26 1999-07-13 Xerox Corporation Method and apparatus for organizing and displaying long lists of data items on a work space of a computer controlled display system
US5748185A (en) * 1996-07-03 1998-05-05 Stratos Product Development Group Touchpad with scroll and pan regions
IL119498A (en) * 1996-10-27 2003-02-12 Advanced Recognition Tech Application launching system
US5923307A (en) * 1997-01-27 1999-07-13 Microsoft Corporation Logical monitor configuration in a multiple monitor environment
WO1999006909A1 (en) * 1997-08-01 1999-02-11 Muse Technologies, Inc. Shared multi-user interface for multi-dimensional synthetic environments
US6057845A (en) * 1997-11-14 2000-05-02 Sensiva, Inc. System, method, and apparatus for generation and recognizing universal commands
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US6476834B1 (en) * 1999-05-28 2002-11-05 International Business Machines Corporation Dynamic creation of selectable items on surfaces
US6992702B1 (en) * 1999-09-07 2006-01-31 Fuji Xerox Co., Ltd System for controlling video and motion picture cameras
US6956562B1 (en) * 2000-05-16 2005-10-18 Palmsource, Inc. Method for controlling a handheld computer by entering commands onto a displayed feature of the handheld computer
US20020116205A1 (en) * 2000-05-19 2002-08-22 Ankireddipally Lakshmi Narasimha Distributed transaction processing system
US7750891B2 (en) 2003-04-09 2010-07-06 Tegic Communications, Inc. Selective input system based on tracking of motion parameters of an input device
US7821503B2 (en) 2003-04-09 2010-10-26 Tegic Communications, Inc. Touch screen and graphical user interface
US7289110B2 (en) * 2000-07-17 2007-10-30 Human Messaging Ab Method and arrangement for identifying and processing commands in digital images, where the user marks the command, for example by encircling it
US7202861B2 (en) * 2001-06-25 2007-04-10 Anoto Ab Control of a unit provided with a processor
US6938222B2 (en) * 2002-02-08 2005-08-30 Microsoft Corporation Ink gestures
US6842795B2 (en) * 2002-06-10 2005-01-11 Siemens Communications, Inc. Methods and apparatus for shifting focus between multiple devices
US7225131B1 (en) * 2002-06-14 2007-05-29 At&T Corp. System and method for accessing and annotating electronic medical records using multi-modal interface
US7353453B1 (en) * 2002-06-28 2008-04-01 Microsoft Corporation Method and system for categorizing data objects with designation tools
US20040145574A1 (en) * 2003-01-29 2004-07-29 Xin Zhen Li Invoking applications by scribing an indicium on a touch screen
US7831933B2 (en) * 2004-03-17 2010-11-09 Leapfrog Enterprises, Inc. Method and system for implementing a user interface for a device employing written graphical elements
KR20040083788A (en) * 2003-03-25 2004-10-06 삼성전자주식회사 Portable communication terminal capable of operating program using a gesture command and program operating method using thereof
US7886236B2 (en) * 2003-03-28 2011-02-08 Microsoft Corporation Dynamic feedback for gestures
EP1625474A4 (en) 2003-04-30 2012-03-07 Disney Entpr Inc Cell phone multimedia controller
US20040240739A1 (en) * 2003-05-30 2004-12-02 Lu Chang Pen gesture-based user interface
EP1639441A1 (en) * 2003-07-01 2006-03-29 Nokia Corporation Method and device for operating a user-input area on an electronic display device
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7532196B2 (en) * 2003-10-30 2009-05-12 Microsoft Corporation Distributed sensing techniques for mobile devices
US7685134B2 (en) * 2003-12-31 2010-03-23 Nokia Corporation Media file sharing, correlation of metadata related to shared media files and assembling shared media file collections
US7277726B2 (en) 2004-05-03 2007-10-02 Motorola, Inc. Controlling wireless mobile devices from a remote device
US20060004834A1 (en) * 2004-06-30 2006-01-05 Nokia Corporation Dynamic shortcuts
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US7724242B2 (en) * 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US7557774B2 (en) * 2004-08-13 2009-07-07 Microsoft Corporation Displaying visually correct pointer movements on a multi-monitor display system
US7761814B2 (en) * 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
KR20060070280A (en) 2004-12-20 2006-06-23 한국전자통신연구원 Apparatus and its method of user interface using hand gesture recognition
US7533189B2 (en) * 2005-06-21 2009-05-12 Microsoft Corporation Enabling a graphical window modification command to be applied to a remotely generated graphical window
US20070050054A1 (en) 2005-08-26 2007-03-01 Sony Ericssson Mobile Communications Ab Mobile communication terminal with virtual remote control
JP2007109118A (en) * 2005-10-17 2007-04-26 Hitachi Ltd Input instruction processing apparatus and input instruction processing program
US7636794B2 (en) * 2005-10-31 2009-12-22 Microsoft Corporation Distributed sensing techniques for mobile devices
US7817991B2 (en) * 2006-02-14 2010-10-19 Microsoft Corporation Dynamic interconnection of mobile devices
US7567233B2 (en) * 2006-09-06 2009-07-28 Stereotaxis, Inc. Global input device for multiple computer-controlled medical systems
US20080065722A1 (en) 2006-09-11 2008-03-13 Apple Computer, Inc. Media device playlists
US7698660B2 (en) * 2006-11-13 2010-04-13 Microsoft Corporation Shared space for communicating information
US8060841B2 (en) * 2007-03-19 2011-11-15 Navisense Method and device for touchless media searching
JP4560062B2 (en) * 2007-03-29 2010-10-13 株式会社東芝 Handwriting determination apparatus, method, and program
US7693842B2 (en) * 2007-04-09 2010-04-06 Microsoft Corporation In situ search for active note taking
JP5282333B2 (en) * 2008-01-07 2013-09-04 株式会社Isowa Conveyor device
US9311115B2 (en) * 2008-05-13 2016-04-12 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US20100216508A1 (en) * 2009-02-23 2010-08-26 Augusta Technology, Inc. Systems and Methods for Driving an External Display Device Using a Mobile Phone Device
US8243983B2 (en) * 2009-08-14 2012-08-14 Microsoft Corporation Graphically encoded data copy and paste
US20110239114A1 (en) * 2010-03-24 2011-09-29 David Robbins Falkenburg Apparatus and Method for Unified Experience Across Different Devices
US9235268B2 (en) * 2010-04-09 2016-01-12 Nokia Technologies Oy Method and apparatus for generating a virtual interactive workspace

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100293462A1 (en) 2008-05-13 2010-11-18 Apple Inc. Pushing a user interface to a remote device
WO2010118247A1 (en) 2009-04-10 2010-10-14 Google Inc. Glyph entry on computing device

Also Published As

Publication number Publication date
US20120216154A1 (en) 2012-08-23
KR20130121992A (en) 2013-11-06
US8271908B2 (en) 2012-09-18
EP2678765A1 (en) 2014-01-01
US20120216152A1 (en) 2012-08-23
WO2012115823A1 (en) 2012-08-30

Similar Documents

Publication Publication Date Title
JP2019220237A (en) Method and apparatus for providing character input interface
US20190369823A1 (en) Device, method, and graphical user interface for manipulating workspace views
US10754517B2 (en) System and methods for interacting with a control environment
EP3041201B1 (en) User terminal device and control method thereof
US10156974B2 (en) Information processing apparatus, display control method, and display control program
US10037130B2 (en) Display apparatus and method for improving visibility of the same
US9547391B2 (en) Method for processing input and electronic device thereof
AU2017268569B2 (en) Device, method, and graphical user interface for synchronizing two or more displays
KR101720849B1 (en) Touch screen hover input handling
RU2633367C2 (en) Method and device for operating and controlling intelligent device
KR102188097B1 (en) Method for operating page and electronic device thereof
US9898179B2 (en) Method and apparatus for scrolling a screen in a display apparatus
US10825456B2 (en) Method and apparatus for performing preset operation mode using voice recognition
KR102052771B1 (en) Cross-slide gesture to select and rearrange
RU2666236C2 (en) Method of block display operation and terminal therefor
JP6038925B2 (en) Semantic zoom animation
JP6042892B2 (en) Programming interface for semantic zoom
JP5964429B2 (en) Semantic zoom
US9323444B2 (en) Device, method, and storage medium storing program
EP2787683A1 (en) Apparatus and method for providing private chat in group chat
US10168864B2 (en) Gesture menu
KR102091235B1 (en) Apparatus and method for editing a message in a portable terminal
EP2682853B1 (en) Mobile device and operation method control available for using touch and drag
US7558600B2 (en) Mobile communication terminal and method of control through pattern recognition
US20140111451A1 (en) User interface (ui) display method and apparatus of touch-enabled device

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20170927

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20180927

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20190925

Year of fee payment: 6