US20150317055A1 - Remote device control using gestures on a touch sensitive device - Google Patents
- Publication number
- US20150317055A1 (application US14/798,216)
- Authority
- US
- United States
- Prior art keywords
- remote device
- computing device
- touch
- gesture
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01H—ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
- H01H9/00—Details of switching devices, not covered by groups H01H1/00 - H01H7/00
- H01H9/02—Bases, casings, or covers
- H01H9/0214—Hand-held casings
- H01H9/0235—Hand-held casings specially adapted for remote control, e.g. of audio or video apparatus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
- H04L67/025—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
Definitions
- This description relates to systems, methods and computer program products for remote device control using gestures on a touch sensitive device.
- a remote control device may be used to control one or more consumer electronic devices such as, for example, a television, a stereo receiver, a digital video disk (DVD) player or computer devices.
- the remote control device may be considered a universal remote control device that is capable of controlling different types of devices made by different manufacturers. With a single remote control device capable of controlling many different types of other devices, it may be desirable for a user to be able to access and control those devices in a secure and user-friendly manner.
- an apparatus includes a touch sensitive area that is arranged and configured to receive one or more gestures, a memory that is arranged and configured to store one or more device gestures, where the stored device gestures correspond to a selection of one of one or more remote devices and a processor that is operably coupled to the touch sensitive area and the memory.
- the processor is arranged and configured to compare the gestures received in the touch sensitive area to the stored device gestures, determine a selected remote device based on the comparison and initiate contact with the selected remote device.
- Implementations may include one or more of the following features.
- the processor initiating contact with the selected remote device may include communicating a wake up signal to the selected remote device.
- the processor initiating contact with the selected remote device may include communicating an unlock signal to the selected remote device.
- the processor initiating contact with the selected remote device may include communicating a wake up signal and an unlock signal to the selected remote device.
- the apparatus may include a display that is operably coupled to the processor, where the processor is arranged and configured to cause the display to display a custom screen for the selected remote device.
- the apparatus may include a display that is operably coupled to the processor, where the touch sensitive area may include at least a portion of the display.
- the apparatus may include a display that is operably coupled to the processor and where the touch sensitive area may include an area other than the display.
- a device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices.
- a device gesture may include multiple touches that uniquely correspond to one of the remote devices.
- a device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices.
- a device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices.
- the touch sensitive area may include multiple different touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- a computer-readable storage medium has recorded and stored thereon instructions that, when executed by a processor, cause the processor to perform a method, where the method includes receiving one or more gestures in a touch sensitive area of a device, comparing the gestures received in the touch sensitive area to one or more device gestures stored in a memory, where the stored device gestures correspond to a selection of one of one or more remote devices, determining a selected remote device based on the comparison and initiating contact with the selected remote device.
- Implementations may include one or more of the following features.
- initiating contact may include communicating a wake up signal to the selected remote device.
- Initiating contact may include communicating an unlock signal to the selected remote device.
- Initiating contact may include communicating a wake up signal and an unlock signal to the selected remote device.
- the computer-readable storage medium may further include instructions that, when executed by the processor, cause the processor to perform the method of displaying on the device a custom screen for the selected remote device.
- the touch sensitive area may include at least a portion of a display of the device.
- the touch sensitive area may include an area other than a display of the device.
- a device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices.
- a device gesture may include multiple touches that uniquely correspond to one of the remote devices.
- a device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices.
- a device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices.
- the touch sensitive area may include multiple touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- a computer-implemented method may include receiving one or more gestures in a touch sensitive area of a device, comparing the gestures received in the touch sensitive area to one or more device gestures stored in a memory, where the stored device gestures correspond to a selection of one of one or more remote devices, determining a selected remote device based on the comparison and initiating contact with the selected remote device.
- Implementations may include one or more of the following features.
- initiating contact may include communicating a wake up signal to the selected remote device.
- Initiating contact may include communicating an unlock signal to the selected remote device.
- Initiating contact may include communicating a wake up signal and an unlock signal to the selected remote device.
- the computer-implemented method may further include displaying on the device a custom screen for the selected remote device.
- the touch sensitive area may include at least a portion of a display of the device.
- the touch sensitive area may include an area other than a display of the device.
- a device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices.
- a device gesture may include multiple touches that uniquely correspond to one of the remote devices.
- a device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices.
- a device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices.
- the touch sensitive area may include multiple touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- FIGS. 1A-1F are exemplary block diagrams of a touch-sensitive device illustrating a gesture on a touch-sensitive area of the device.
- FIGS. 2A and 2B are exemplary block diagrams illustrating exemplary components of the touch-sensitive device of FIGS. 1A-1F .
- FIG. 3 is a flowchart illustrating example operations of the touch sensitive device of FIGS. 1A-1F and 2A-2B .
- FIG. 4 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described here.
- This document describes systems and techniques for using gestures on a touch sensitive device to initiate contact with a remote device, where a particular gesture corresponds to a particular device.
- a specific gesture received on the touch sensitive device may correspond to a specific remote device such that the touch sensitive device initiates contact with the specific remote device.
- the touch sensitive device may communicate a wake up signal to the remote device and/or may communicate an unlock signal to the remote device.
- the wake up signal may cause the remote device to transition from an off or sleep state to an on state.
- the unlock signal may cause the remote device to unlock and accept and process additional inputs from the touch sensitive device or directly on the remote device itself.
- Such techniques may include receiving one or more gestures on a touch sensitive area of the device and comparing the received gestures to one or more device gestures, which may be stored in a memory of the device.
- the device gestures correspond to a selection of one of one or more remote devices. If the received gesture matches a stored device gesture, then the touch sensitive device determines a selected remote device based on the comparison and initiates contact with the selected remote device.
- the touch sensitive device also may display a custom display screen that is specific to the selected remote device. In this manner, a user may communicate control signals and/or commands to the selected remote device using the custom display screen for that specific remote device.
- the touch sensitive device 102 may include any type of device that includes one or more touch sensitive areas.
- the touch sensitive device 102 may include a mobile phone, a laptop, a tablet device, a game device, a music player, a personal digital assistant, a smart phone and any combinations of these devices, where the device includes one or more touch sensitive areas.
- the touch sensitive area may include, for example, a display, a keypad, a track pad, and a portion of the device housing.
- the touch sensitive device 102 may include a display 104 .
- at least a part of the display 104 may be touch sensitive.
- at least a portion of the display 104 may be a touch screen that enables user interaction with the device using the touch screen display.
- the touch sensitive device 102 may include a button 106 for navigating and selecting items displayed on the display 104 .
- the button 106 may be considered a touch sensitive area, where slide movements of a user's finger on the button 106 may be used to move a selector on the display 104 .
- the housing 108 of the device 102 may be a touch sensitive area. Any or all portions of the housing 108 may be touch sensitive.
- the touch sensitive portions of the housing 108 may be used to control a selector, a cursor or other movement on the display 104 .
- a user may be able to slide a finger on the touch sensitive portions of the housing 108 to interact with the display 104 .
- the touch sensitive device 102 may include other components, which are not illustrated in FIG. 1A .
- the touch sensitive device 102 may include a physical keyboard (not shown), which may be used to enter information into the device 102 .
- the physical keyboard (not shown) may be implemented on the front of the device 102 along with the display 104 or may be implemented as a slide out keyboard from behind the display 104 .
- the physical keyboard also may include one or more touch sensitive areas.
- One or more of the touch sensitive areas may be configured to receive one or more gestures.
- a user may input a gesture on one or more of the touch sensitive areas to select a specific remote device that corresponds to a specific gesture.
- the touch sensitive device 102 may be configured to compare the received gestures with device gestures that are stored on the device 102 .
- a device gesture includes a gesture that corresponds to a selection of one of multiple remote devices.
- a device gesture is a gesture that uniquely corresponds to selection of a remote device and any customized displays for communicating with that remote device.
- a specific gesture may be associated with a specific remote device by default or by user configuration. The specific gesture associated with the specific remote device may be customized and configured by the user.
- a remote device may include a computing device such as, for example, a desktop computer, a server, a laptop or notebook computer, a tablet device or other types of computing devices.
- a remote device may include electronic equipment such as, for example, a television, a radio, a tuner, a stereo receiver, a compact disc player, a DVD player, a video cassette recorder (VCR), other audio and visual electronic devices and other consumer electronic devices.
- a gesture may include, for example, a single touch and movement of a user's finger or an apparatus, such as a stylus, on the touch sensitive area.
- a gesture may be a single touch of the touch sensitive area followed by a movement of the finger across the touch sensitive area, without lifting the finger from the touch sensitive area between the touch and the movement.
- the movement may include a continuous, sliding motion on the touch sensitive area.
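- One way to represent a single touch-and-slide gesture is as an ordered sequence of touch points captured without lifting the finger. The coarse direction-string encoding below is an assumption for illustration, not the patent's method; it simply shows how a continuous slide could be reduced to a comparable form.

```python
# Illustrative encoding of a single touch-and-slide gesture: the continuous
# motion is sampled as (x, y) points, then reduced to a string of coarse
# directions (R/L/D/U). The encoding scheme is an assumption for the sketch.

def encode_stroke(points: list[tuple[int, int]]) -> str:
    """Encode a continuous slide as a collapsed string of coarse directions."""
    directions = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            directions.append("R" if dx > 0 else "L")
        else:
            directions.append("D" if dy > 0 else "U")
    # Collapse consecutive repeats: e.g. "RRRDD" -> "RD"
    collapsed = []
    for d in directions:
        if not collapsed or collapsed[-1] != d:
            collapsed.append(d)
    return "".join(collapsed)
```

For example, the across-the-top-then-down-the-right-side gesture of FIG. 1B would reduce to "RD" under this encoding.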
- the touch sensitive device 102 is illustrated.
- a user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the display 104 .
- the user 110 touches the display 104 in the upper left hand corner of the display 104 with a single touch and then moves her finger across the top edge of the display and then down the right side of the display, as indicated by the arrows.
- the display 104 is configured to receive the gesture.
- the gesture may be performed in one continuous motion without lifting the finger from the display 104 .
- the gesture across the top of the display 104 and then down the right side of the display 104 may correspond to selecting a particular remote device such as, for example, a desktop computer.
- the device 102 receives this particular gesture, compares the received gesture to device gestures stored on the device, determines that the desktop computer is the selected device and initiates contact with the desktop computer.
- the device 102 may communicate a wake up signal to the desktop computer to transition the desktop computer to an on or active state.
- the device may communicate an unlock signal to the desktop computer to unlock the desktop computer for use by the user. In some instances, without the unlock signal from the device 102 , the desktop computer would remain locked.
- the gesture entered by the user may function as both a wake up signal and a security mechanism for the desktop device.
- This particular gesture entered on the device 102 is unique to identify and select the desktop computer and may function as a password to unlock the desktop computer for use.
- the device 102 may display a custom display screen related to the desktop computer. In this manner, the device 102 may be used to communicate control signals and other commands to the desktop computer such that the device 102 may function as a remote control for the desktop computer.
- Other single touch and movement gestures on the touch sensitive area may be used to select one of multiple other remote devices and to initiate contact with the other remote devices.
- a single touch and movement gesture in the shape of the letter “C” may be used to select and initiate contact with a CD player.
- a gesture in the shape of the letter “D” may be used to select and initiate contact with a DVD player.
- the device 102 may include touch sensitive areas other than just the display 104 .
- the touch sensitive device 102 is illustrated.
- the user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the housing 108 .
- the user 110 touches the housing 108 on the top, left corner with a single touch and then moves her finger across the top of the housing 108 and then down the right side of the housing 108 , as indicated by the arrows.
- the housing 108 is configured to receive the gesture.
- the gesture may be performed in one continuous motion without lifting the finger from the housing 108 .
- this particular gesture on a touch sensitive area other than the display may be associated with a particular remote device.
- the device 102 compares the gesture with other stored device gestures, determines the particular remote device based on the comparison and initiates contact with the particular remote device.
- the touch sensitive device 102 is illustrated.
- the user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the button 106 .
- the user 110 touches the button 106 on the left edge with a single touch and then moves her finger to the right edge of and then down to the bottom edge of the button 106 , as indicated by the arrows.
- the button 106 is configured to receive the gesture.
- the gesture may be performed in one continuous motion without lifting the finger from the button 106 .
- this particular gesture on the button 106 may be associated with a particular remote device.
- the device 102 compares the gesture with other stored device gestures, determines the particular remote device based on the comparison and initiates contact with the particular remote device.
- a gesture also may include multiple touches and one or more movements of a user's finger or an apparatus, such as a stylus, on the touch sensitive area.
- a gesture may include a first touch and movement on the touch sensitive area followed by a second touch and movement on the touch sensitive area.
- the first touch and movement and the second touch and movement can be on the same areas of the touch sensitive area or may be on different areas of the touch sensitive area.
- the first touch and movement and the second touch and movement may be on different touch sensitive areas.
- a particular gesture that includes multiple touches and one or more movements may be used to select a particular device.
- the multiple touches may include simultaneous touches.
- the user may touch the touch sensitive area in two or more places at the same time.
- a particular gesture that includes multiple simultaneous touches may be used to select a particular device.
- the touch sensitive device 102 is illustrated.
- the user 110 enters a gesture comprising a first touch and movement followed by a second touch and movement on the touch sensitive area, which in this example is the display 104 .
- the user 110 first touches the display 104 in the upper left hand corner and slides her finger across the top and then diagonally towards the center, as indicated by the arrows.
- the user 110 then touches the display 104 in the lower right hand corner and slides her finger up the right side of the display 104 .
- This exemplary gesture may be used to select a particular remote device such that the device 102 initiates contact with the remote device. While this example gesture is illustrated on the display 104 , other gestures that include multiple touches and one or more movements may be performed on the other touch sensitive areas of the device 102 .
- a gesture may include various combinations of touches and movements, other than those described above in the example of FIG. 1E .
- the touches may include a simultaneous touch in two or more places on the display 104 , such as, touching the display 104 with one finger on one side of the display 104 and with another finger on the other side of the display 104 at the same time.
- the simultaneous touches may be followed by a movement of one or more of the fingers on the display 104 .
- the touches may include a simultaneous touch in two or more places on different touch sensitive areas. For instance, the user may touch the housing 108 with one finger and touch the display 104 with another finger at the same time.
- the simultaneous touches on the housing 108 and the display 104 may be followed by a movement of one or more of the fingers on the housing 108 or the display 104 .
- a gesture also may include multiple touches on the touch sensitive area.
- the multiple touches on the touch sensitive area may be within a specific period of time, where the period of time may be pre-defined and/or configurable.
- the multiple touches may not be in combination with a movement of the finger or apparatus on the touch sensitive area.
- the multiple touches may be in the same location on the touch sensitive area or the multiple touches may be on different locations on the touch sensitive area.
- the multiple touches may be simultaneous or nonsimultaneous touches on either the same or on different touch sensitive areas.
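- The timing constraints above — multiple touches within a configurable period, and simultaneity as a special case — can be sketched with simple timestamp checks. The window and tolerance values below are assumptions, not values from the patent.

```python
# Illustrative timing checks for multi-touch gestures: all touches of a
# gesture must fall within a configurable time window, and "simultaneous"
# touches are those within a small tolerance. Defaults are assumptions.

def touches_within_window(timestamps: list[float],
                          window_s: float = 2.0) -> bool:
    """True if all touches occur within the configurable period of time."""
    return bool(timestamps) and (max(timestamps) - min(timestamps)) <= window_s

def is_simultaneous(timestamps: list[float], tol_s: float = 0.05) -> bool:
    """Treat touches as simultaneous if they land within a small tolerance."""
    return touches_within_window(timestamps, tol_s)
```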
- the touch sensitive device 102 is illustrated.
- the user 110 enters a gesture comprising multiple touches on the touch sensitive area, which in this example is the display 104 .
- the user 110 may first touch the upper left corner of the display 104 twice and then touch the lower right corner of the display 104 three times. This combination of touches may be considered a gesture.
- the multiple touches may include multiple simultaneous touches.
- the user 110 may touch two different locations on the display 104 at the same time and then touch the same locations again.
- the user 110 may touch two different locations on the display 104 at the same time and then touch two other locations on the display 104 at the same time.
- the user 110 may touch two different locations on two different touch sensitive areas, such as the display 104 and the housing 108 , at the same time and then touch either those same locations again or different locations on the same or different touch sensitive areas.
- FIGS. 1A-1F illustrate example touches and/or movements on the touch sensitive areas, it is understood that these are examples and many other touches and movements are possible.
- these exemplary gestures may be uniquely associated with a particular remote device such that the device 102 may initiate contact with the remote device associated with a particular gesture.
- the gestures that are received on the touch sensitive area may be compared to gestures that are stored in the memory of the device 102 .
- a processor of the device 102 may be configured to compare the received gesture with a stored gesture.
- One or more of the stored gestures may be referred to as stored device gestures, where a device gesture is associated with a selection of a particular remote device. Each device gesture may uniquely correspond to a particular remote device. If the gestures match, then the processor may be configured to determine a selected remote device and initiate contact with the selected remote device. If the gestures do not match, then the device 102 may not take any action.
- the processor may be configured to monitor the touch sensitive areas of the device 102 for gestures.
- the touch sensitive device 102 may include various types of devices, the components of the device may vary. Some devices may include transceivers, antennas, radios, controllers, memory, processors and other components that are used to perform the functions of the device. Some of these components are described in more detail below with respect to FIG. 4 . Referring to FIG. 2A , a couple of common components for the touch sensitive device 102 are illustrated.
- the touch sensitive device 102 may include a display 104 , a memory 212 , a processor 214 and a communication module 215 .
- the display 104 , the memory 212 , the processor 214 and the communication module 215 may be operably coupled to each other and may be configured to communicate with one another.
- the memory 212 may store instructions that are communicated to the processor 214 for execution by the processor 214 .
- the instructions that are executed by the processor 214 may cause the device 102 to perform certain actions.
- the memory 212 may store one or more device gestures.
- the stored gestures may be referred to as device gestures because they are compared against gestures received on the touch sensitive area and when a match is confirmed, the processor 214 may determine which one of multiple remote devices 202 a and 202 b has been selected and initiate contact with the selected remote device using the communication module 215 .
- the processor 214 also may cause the display 104 to show a custom screen related to the selected remote device.
- the stored device gestures may include any type of the gestures discussed above with respect to FIGS. 1A-1F .
- one or more default device gestures may be stored in the memory 212 .
- the default device gestures may be pre-defined gestures such that when a user makes the gesture on the touch sensitive area, the processor 214 initiates contact with the particular remote device associated with the particular pre-defined gesture.
- the device gestures may be programmed by the user and stored in the memory 212 .
- a settings option for the device 102 may allow the user to program one or more device gestures.
- the user may be able to configure the number of touches and/or movements, the locations of the touches and/or movements, and the period of time in which the gesture needs to be completed.
- the settings option allows the user to associate a configured device gesture with a particular remote device.
- the user programmed device gestures, including the remote device associated with each device gesture, may be stored in the memory 212 .
- a menu of predefined device gestures and associated remote devices may be presented to the user for selection to use.
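The configurable gesture properties described above (number of touches, movements, locations, and a completion time window, each tied to a particular remote device) can be sketched as a simple record. The class and field names below are illustrative assumptions for this sketch, not terms defined by the patent.

```python
from dataclasses import dataclass, field

# Hypothetical record for a user-programmed device gesture, mirroring the
# configurable properties described above: number of touches, the movements,
# their locations, the time window for completing the gesture, and the
# remote device the gesture selects.
@dataclass
class DeviceGesture:
    name: str                      # label shown in the settings menu
    num_touches: int               # number of simultaneous touches
    movements: list = field(default_factory=list)   # ordered directions, e.g. ["right", "down"]
    start_region: str = "display"  # touch sensitive area where the gesture begins
    max_duration_ms: int = 2000    # period of time in which the gesture must be completed
    remote_device_id: str = ""     # the remote device associated with this gesture

# A settings option could store the programmed gestures in memory, keyed by name.
gesture_store = {}

def register_gesture(g: DeviceGesture):
    gesture_store[g.name] = g

register_gesture(DeviceGesture(
    name="L-shape",
    num_touches=1,
    movements=["right", "down"],
    remote_device_id="desktop-pc",
))

print(gesture_store["L-shape"].remote_device_id)  # desktop-pc
```

A default gesture set could be pre-populated into `gesture_store` at first boot, with the settings menu overwriting or adding entries.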
- the processor 214 may be operably coupled to the display 104 , the touch sensitive area(s), the memory 212 and the communication module 215 .
- the processor 214 may be configured to monitor the touch sensitive areas for received gestures.
- the processor 214 may be configured to compare gestures received on the touch sensitive area to the device gestures stored in the memory 212 . When a received gesture matches a stored device gesture, the processor 214 is configured to determine which remote device (e.g., remote device 202 a or 202 b ) is associated with the gesture and to initiate contact with the selected remote device using the communication module 215 .
- if a received gesture does not match a stored device gesture, the processor 214 may not take any action.
- the processor 214 may continue monitoring the touch sensitive areas for gestures.
- the communication module 215 may include a radio frequency transceiver, a Bluetooth transceiver, a Wi-Fi transceiver, an infrared transceiver or other type of transceiver that is configured to communicate with multiple different remote devices 202 a and 202 b.
- the processor 214 may communicate a wake up and/or a power on signal (or command) to a selected remote device using the communication module 215 .
- a particular wake up signal and/or power on signal may be associated with a particular remote device and stored in the memory 212 .
- the wake up signal may cause the selected remote device to transition from a sleep state to an on state such that the remote device is ready for use by the user.
- the power on signal may cause the remote device to transition from an off state to an on state.
- the processor may communicate an unlock signal (or command) to a selected remote device using the communication module 215 .
- a particular unlock signal may be associated with a particular remote device and stored in the memory 212 .
- the unlock signal may cause the selected remote device to be accessible for operation by a user directly or by the remote device itself.
- the unlock signal may be a password or passcode or other type of security code that unlocks the selected remote device using the device 102 .
- the touch sensitive areas may include a separate controller or processor, which may be configured to monitor the touch sensitive area for received gestures.
- the touch sensitive device 102 also may include a touch sensitive controller 216 .
- the touch sensitive controller 216 may be a processor that is configured to control one or more of the touch sensitive areas of the device 102 .
- the touch sensitive controller 216 may be operably coupled to the display 104 and any other touch sensitive areas.
- the touch sensitive areas may include, for example, the display 104 , the button 106 and portions of the housing 108 .
- the touch sensitive controller 216 also may be configured to perform other actions related to one or more of the device components.
- the touch sensitive controller 216 may be configured to perform actions related to the display 104 and to control the functionality of the display 104 .
- the touch sensitive controller 216 also may be operably coupled to the memory 212 , the processor 214 and the communication module 215 .
- the touch sensitive controller 216 also may monitor the display 104 for gestures received on the display 104 .
- the touch sensitive controller 216 may be configured to compare a received gesture with a device gesture stored in the memory 212 and to determine a particular remote device associated with the gesture when a match occurs.
- the touch sensitive controller 216 also may be configured to send a signal to the communication module 215 to initiate contact with the selected remote device. In this manner, the main processor 214 may be off or in a reduced power state and the touch sensitive controller 216 may be in a power state with enough power to monitor the touch sensitive areas.
- Process 300 includes receiving one or more gestures on a touch sensitive area ( 301 ).
- the device 102 may include one or more touch sensitive areas on which one or more gestures may be received including, for example, the display 104 , the button 106 and the housing 108 ( 301 ).
- the processor 214 and/or the touch sensitive controller 216 may be configured to monitor the touch sensitive areas for received gestures, even when the touch sensitive area is otherwise in a sleep state and/or even when the device 102 is in a sleep state.
- Process 300 includes comparing gestures received on the touch sensitive areas to one or more device gestures stored in memory, where the stored device gestures correspond to a selection of one of multiple remote devices ( 303 ).
- the processor 214 may be configured to compare the gestures received on the touch sensitive areas to device gestures stored in the memory 212 ( 303 ).
- the touch sensitive controller 216 may be configured to compare the gestures received on the touch sensitive areas to the device gestures stored in the memory 212 ( 303 ).
- Process 300 includes determining a selected remote device based on the comparison ( 305 ). For example, each device gesture stored in the memory 212 may be uniquely associated with the selection of a particular remote device. Based on the comparison between a received gesture and a stored device gesture, the processor 214 or the touch sensitive controller 216 may determine which particular remote device has been selected.
- Process 300 includes initiating contact with the selected remote device ( 307 ).
- the processor 214 or the touch sensitive controller 216 may initiate contact with the selected remote device using the communication module 215 .
- Initiating contact with the selected remote device may include communicating a wake up signal and/or an unlock signal to the selected remote device.
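The four steps of Process 300 ( 301 ), ( 303 ), ( 305 ) and ( 307 ) can be sketched end to end. The assumption that a gesture normalizes to a tuple of stroke directions, and the exact-equality matching rule, are illustrative simplifications; a real recognizer would tolerate noise in the touch coordinates.

```python
# Minimal sketch of Process 300. Each stored device gesture is uniquely
# associated with the selection of one remote device, as described above.
DEVICE_GESTURES = {
    ("right", "down"): "desktop-pc",   # e.g. trace the top edge, then the right side
    ("down", "down"):  "tv",
}

def process_300(received_gesture, send_fn):
    # (301) a gesture has been received on a touch sensitive area
    # (303) compare the received gesture to the stored device gestures
    selected = DEVICE_GESTURES.get(tuple(received_gesture))
    if selected is None:
        return None                 # no match: take no action, keep monitoring
    # (305) the selected remote device was determined by the comparison
    send_fn(selected, "wake")       # (307) initiate contact: wake up signal
    send_fn(selected, "unlock")     # (307) initiate contact: unlock signal
    return selected

log = []
result = process_300(["right", "down"], lambda dev, sig: log.append((dev, sig)))
print(result, log)
```

The same function could run on either the main processor 214 or the touch sensitive controller 216, since both are described as performing the comparison.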
- FIG. 4 shows an example of a generic computer device 400 and a generic mobile computer device 450 , which may be used with the techniques described here.
- Computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
- Computing device 450 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, tablets, and other similar computing devices.
- the components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
- Both computing device 400 and computing device 450 may include the features and functionality of the touch sensitive device 102 described above with respect to FIGS. 1A-1F , 2 A, 2 B and 3 .
- the descriptions below include other exemplary components and additional functionality, which may be incorporated into the touch sensitive device 102 .
- Computing device 400 includes a processor 402 , memory 404 , a storage device 406 , a high-speed interface 408 connecting to memory 404 and high-speed expansion ports 410 , and a low speed interface 412 connecting to low speed bus 414 and storage device 406 .
- Each of the components 402 , 404 , 406 , 408 , 410 , and 412 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
- the processor 402 can process instructions for execution within the computing device 400 , including instructions stored in the memory 404 or on the storage device 406 to display graphical information for a GUI on an external input/output device, such as display 416 coupled to high speed interface 408 .
- multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
- multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
- the memory 404 stores information within the computing device 400 . In one implementation, the memory 404 is a volatile memory unit or units. In another implementation, the memory 404 is a non-volatile memory unit or units. The memory 404 may also be another form of computer-readable medium, such as a magnetic or optical disk.
- the storage device 406 is capable of providing mass storage for the computing device 400 . In one implementation, the storage device 406 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
- a computer program product can be tangibly embodied in an information carrier.
- the computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above.
- the information carrier is a non-transitory computer- or machine-readable medium, such as the memory 404 , the storage device 406 , or memory on processor 402 .
- the high speed controller 408 manages bandwidth-intensive operations for the computing device 400 , while the low speed controller 412 manages lower bandwidth-intensive operations.
- the high-speed controller 408 is coupled to memory 404 , display 416 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 410 , which may accept various expansion cards (not shown).
- low-speed controller 412 is coupled to storage device 406 and low-speed expansion port 414 .
- the low-speed expansion port 414 , which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
- the computing device 400 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 420 , or multiple times in a group of such servers. It may also be implemented as part of a rack server system 424 . In addition, it may be implemented in a personal computer such as a laptop computer 422 . Alternatively, components from computing device 400 may be combined with other components in a mobile device (not shown), such as device 450 . Each of such devices may contain one or more of computing device 400 , 450 , and an entire system may be made up of multiple computing devices 400 , 450 communicating with each other.
- Computing device 450 includes a processor 452 , memory 464 , an input/output device such as a display 454 , a communication interface 466 , and a transceiver 468 , among other components.
- the device 450 may also be provided with a storage device, such as a micro drive or other device, to provide additional storage.
- Each of the components 452 , 464 , 454 , 466 , and 468 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
- the processor 452 can execute instructions within the computing device 450 , including instructions stored in the memory 464 .
- the processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
- the processor may provide, for example, for coordination of the other components of the device 450 , such as control of user interfaces, applications run by device 450 , and wireless communication by device 450 .
- Processor 452 may communicate with a user through control interface 458 and display interface 456 coupled to a display 454 .
- the display 454 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
- the display interface 456 may comprise appropriate circuitry for driving the display 454 to present graphical and other information to a user.
- the control interface 458 may receive commands from a user and convert them for submission to the processor 452 .
- an external interface 462 may be provided in communication with processor 452 , so as to enable near area communication of device 450 with other devices. External interface 462 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
- the memory 464 stores information within the computing device 450 .
- the memory 464 can be implemented as one or more of a non-transitory computer readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
- Expansion memory 474 may also be provided and connected to device 450 through expansion interface 472 , which may include, for example, a SIMM (Single In Line Memory Module) card interface.
- expansion memory 474 may provide extra storage space for device 450 , or may also store applications or other information for device 450 .
- expansion memory 474 may include instructions to carry out or supplement the processes described above, and may include secure information also.
- expansion memory 474 may be provided as a security module for device 450 , and may be programmed with instructions that permit secure use of device 450 .
- secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
- the memory may include, for example, flash memory and/or NVRAM memory, as discussed below.
- a computer program product is tangibly embodied in an information carrier.
- the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
- the information carrier is a non-transitory computer- or machine-readable medium, such as the memory 464 , expansion memory 474 , or memory on processor 452 , that may be received, for example, over transceiver 468 or external interface 462 .
- Device 450 may communicate wirelessly through communication interface 466 , which may include digital signal processing circuitry where necessary. Communication interface 466 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 468 . In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 470 may provide additional navigation- and location-related wireless data to device 450 , which may be used as appropriate by applications running on device 450 .
- Device 450 may also communicate audibly using audio codec 460 , which may receive spoken information from a user and convert it to usable digital information. Audio codec 460 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 450 . Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 450 .
- the computing device 450 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 480 . It may also be implemented as part of a smart phone 482 , personal digital assistant, or other similar mobile device.
- implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
- These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Computational Linguistics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application is a Continuation of and claims priority benefit to U.S. patent application Ser. No. 13/097,790, filed on Apr. 29, 2011, the contents of which are hereby incorporated by reference as if fully set forth below.
- This description relates to systems, methods and computer program products for remote device control using gestures on a touch sensitive device.
- A remote control device may be used to control one or more consumer electronic devices such as, for example, a television, a stereo receiver, a digital video disk (DVD) player or computer devices. The remote control device may be considered a universal remote control device that is capable of controlling different types of devices made by different manufacturers. With a single remote control device capable of controlling many different types of other devices, it may be desirable for a user to be able to access and control those devices in a secure and user-friendly manner.
- According to one general aspect, an apparatus includes a touch sensitive area that is arranged and configured to receive one or more gestures, a memory that is arranged and configured to store one or more device gestures, where the stored device gestures correspond to a selection of one of one or more remote devices and a processor that is operably coupled to the touch sensitive area and the memory. The processor is arranged and configured to compare the gestures received in the touch sensitive area to the stored device gestures, determine a selected remote device based on the comparison and initiate contact with the selected remote device.
- Implementations may include one or more of the following features. For example, the processor initiating contact with the selected remote device may include communicating a wake up signal to the selected remote device. The processor initiating contact with the selected remote device may include communicating an unlock signal to the selected remote device. The processor initiating contact with the selected remote device may include communicating a wake up signal and an unlock signal to the selected remote device. The apparatus may include a display that is operably coupled to the processor, where the processor is arranged and configured to cause the display to display a custom screen for the selected remote device.
- The apparatus may include a display that is operably coupled to the processor, where the touch sensitive area may include at least a portion of the display. The apparatus may include a display that is operably coupled to the processor and where the touch sensitive area may include an area other than the display.
- A device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices. A device gesture may include multiple touches that uniquely correspond to one of the remote devices. A device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices. A device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices. The touch sensitive area may include multiple different touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- In another general aspect, a computer-readable storage medium has recorded and stored thereon instructions that, when executed by a processor, cause the processor to perform a method, where the method includes receiving one or more gestures in a touch sensitive area of a device, comparing the gestures received in the touch sensitive area to one or more device gestures stored in a memory, where the stored device gestures correspond to a selection of one or more remote devices, determining a selected remote device based on the comparison and initiating contact with the selected remote device.
- Implementations may include one or more of the following features. For example, initiating contact may include communicating a wake up signal to the selected remote device. Initiating contact may include communicating an unlock signal to the selected remote device. Initiating contact may include communicating a wake up signal and an unlock signal to the selected remote device. The computer-readable storage medium may further include instructions that, when executed by the processor, cause the processor to perform the method of displaying on the device a custom screen for the selected remote device.
- The touch sensitive area may include at least a portion of a display of the device. The touch sensitive area may include an area other than a display of the device. A device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices. A device gesture may include multiple touches that uniquely correspond to one of the remote devices. A device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices. A device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices. The touch sensitive area may include multiple touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- In another general aspect, a computer-implemented method may include receiving one or more gestures in a touch sensitive area of a device, comparing the gestures received in the touch sensitive area to one or more device gestures stored in a memory, where the stored device gestures correspond to a selection of one of one or more remote devices, determining a selected remote device based on the comparison and initiating contact with the selected remote device.
- Implementations may include one or more of the following features. For example, initiating contact may include communicating a wake up signal to the selected remote device. Initiating contact may include communicating an unlock signal to the selected remote device. Initiating contact may include communicating a wake up signal and an unlock signal to the selected remote device. The computer-implemented method may further include displaying on the device a custom screen for the selected remote device.
- The touch sensitive area may include at least a portion of a display of the device. The touch sensitive area may include an area other than a display of the device. A device gesture may include a single touch and slide movement that uniquely corresponds to one of the remote devices. A device gesture may include multiple touches that uniquely correspond to one of the remote devices. A device gesture may include multiple touches within a configurable period of time that uniquely correspond to one of the remote devices. A device gesture may include multiple simultaneous touches that uniquely correspond to one of the remote devices. The touch sensitive area may include multiple touch sensitive areas and a device gesture may include multiple simultaneous touches on the different touch sensitive areas that uniquely correspond to one of the remote devices.
- The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
- FIGS. 1A-1F are exemplary block diagrams of a touch-sensitive device illustrating a gesture on a touch-sensitive area of the device.
- FIGS. 2A and 2B are exemplary block diagrams illustrating exemplary components of the touch-sensitive device of FIGS. 1A-1F .
- FIG. 3 is a flowchart illustrating example operations of the touch sensitive device of FIGS. 1A-1F and 2A-2B .
- FIG. 4 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described here.
- This document describes systems and techniques for using gestures on a touch sensitive device to initiate contact with a remote device, where a particular gesture corresponds to a particular device. For instance, a specific gesture received on the touch sensitive device may correspond to a specific remote device such that the touch sensitive device initiates contact with the specific remote device. In this manner, the touch sensitive device may communicate a wake up signal to the remote device and/or may communicate an unlock signal to the remote device. The wake up signal may cause the remote device to transition from an off or sleep state to an on state. The unlock signal may cause the remote device to unlock and accept and process additional inputs from the touch sensitive device or directly on the remote device itself.
- Such techniques may include receiving one or more gestures on a touch sensitive area of the device and comparing the received gestures to one or more device gestures, which may be stored in a memory of the device. The device gestures correspond to a selection of one of one or more remote devices. If the received gesture matches a stored device gesture, then the touch sensitive device determines a selected remote device based on the comparison and initiates contact with the selected remote device. The touch sensitive device also may display a custom display screen that is specific to the selected remote device. In this manner, a user may communicate control signals and/or commands to the selected remote device using the custom display screen for that specific remote device.
- Referring to FIG. 1A , an exemplary touch sensitive device 102 is illustrated. The touch sensitive device 102 may include any type of device that includes one or more touch sensitive areas. For example, the touch sensitive device 102 may include a mobile phone, a laptop, a tablet device, a game device, a music player, a personal digital assistant, a smart phone and any combinations of these devices, where the device includes one or more touch sensitive areas. The touch sensitive area may include, for example, a display, a keypad, a track pad, and a portion of the device housing. For example, the touch sensitive device 102 may include a display 104 . In one exemplary implementation, at least a part of the display 104 may be touch sensitive. For instance, at least a portion of the display 104 may be a touch screen that enables user interaction with the device using the touch screen display.
- Other areas of the touch
sensitive device 102 may be touch sensitive. For example, the touch sensitive device may include a button 106 for navigating and selecting items displayed on the display 104 . The button 106 may be considered a touch sensitive area, where slide movements of a user's finger on the button 106 may be used to move a selector on the display 104 . Also, for example, the housing 108 of the device 102 may be a touch sensitive area. Any or all portions of the housing 108 may be touch sensitive. For instance, the touch sensitive portions of the housing 108 may be used to control a selector, a cursor or other movement on the display 104 . A user may be able to slide a finger on the touch sensitive portions of the housing 108 to interact with the display 104 .
- The touch
sensitive device 102 may include other components, which are not illustrated in FIG. 1A . For instance, the touch sensitive device 102 may include a physical keyboard (not shown), which may be used to enter information into the device 102 . The physical keyboard (not shown) may be implemented on the front of the device 102 along with the display 104 or may be implemented as a slide out keyboard from behind the display 104 . In the various implementations of a physical keyboard, the physical keyboard also may include one or more touch sensitive areas.
- One or more of the touch sensitive areas may be configured to receive one or more gestures. In one exemplary implementation, a user may input a gesture on one or more of the touch sensitive areas to select a specific remote device that corresponds to a specific gesture. The touch
sensitive device 102 may be configured to compare the received gestures with device gestures that are stored on the device 102 . A device gesture includes a gesture that corresponds to a selection of one of multiple remote devices. A device gesture is a gesture that uniquely corresponds to selection of a remote device and any customized displays for communicating with that remote device. A specific gesture may be associated with a specific remote device by default or by user configuration. The specific gesture associated with the specific remote device may be customized and configured by the user.
- A gesture may include, for example, a single touch and movement of a user's finger or an apparatus, such as a stylus, on the touch sensitive area. For example, a gesture may be a single touch of the touch sensitive area followed by a movement of the finger across the touch sensitive area, without lifting the finger from the touch sensitive area between the touch and the movement. The movement may include a continuous, sliding motion on the touch sensitive area.
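One way to encode such a continuous sliding motion, sketched here under the assumption that the touch sensitive area reports a stream of (x, y) samples, is to reduce the path to a sequence of dominant directions that can later be compared against stored gestures. The movement threshold is an illustrative value.

```python
def strokes_from_path(points, min_move=5.0):
    """Reduce a continuous slide, given as (x, y) samples, to a list of
    dominant directions, collapsing consecutive duplicates."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) < min_move and abs(dy) < min_move:
            continue  # ignore jitter below the movement threshold
        if abs(dx) >= abs(dy):
            direction = "right" if dx > 0 else "left"
        else:
            direction = "down" if dy > 0 else "up"  # screen y grows downward
        if not strokes or strokes[-1] != direction:
            strokes.append(direction)
    return strokes
```

A slide across the top edge and then down the right side, for example, reduces to the two strokes "right" and "down".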
- Referring to
FIG. 1B, the touch sensitive device 102 is illustrated. In this example illustration, a user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the display 104. The user 110 touches the display 104 in the upper left hand corner of the display 104 with a single touch and then moves her finger across the top edge of the display and then down the right side of the display, as indicated by the arrows. The display 104 is configured to receive the gesture. The gesture may be performed in one continuous motion without lifting the finger from the display 104. - In one exemplary implementation, the gesture across the top of the
display 104 and then down the right side of the display 104 may correspond to selecting a particular remote device such as, for example, a desktop computer. The device 102 receives this particular gesture, compares the received gesture to device gestures stored on the device, determines that the desktop computer is the selected device and initiates contact with the desktop computer. The device 102 may communicate a wake up signal to the desktop computer to transition the desktop computer to an on or active state. The device may communicate an unlock signal to the desktop computer to unlock the desktop computer for use by the user. In some instances, without the unlock signal from the device 102, the desktop computer would remain locked. In this manner, the gesture entered by the user may function as both a wake up signal and a security mechanism for the desktop device. This particular gesture entered on the device 102 is unique to identify and select the desktop computer and may function as a password to unlock the desktop computer for use. - In one exemplary implementation, when this particular gesture for the desktop computer is entered, the
device 102 may display a custom display screen related to the desktop computer. In this manner, the device 102 may be used to communicate control signals and other commands to the desktop computer such that the device 102 may function as a remote control for the desktop computer. - Other single touch and movement gestures on the touch sensitive area may be used to select one of multiple other remote devices and to initiate contact with the other remote devices. For example, a single touch and movement gesture in the shape of the letter "C" may be used to select and initiate contact with a CD player. A gesture in the shape of the letter "D" may be used to select and initiate contact with a DVD player. These different gestures may be configured by the user so that the user can easily associate a particular gesture with a particular remote device.
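The disclosure does not name a specific wake-up protocol; one widely used realization of a wake up signal for a desktop computer is a Wake-on-LAN magic packet, sketched below. The MAC address, broadcast address and port are illustrative assumptions.

```python
import socket

def make_magic_packet(mac):
    """Build a standard Wake-on-LAN magic packet: six 0xFF bytes
    followed by the target MAC address repeated 16 times (102 bytes)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must contain exactly 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wake_up_signal(mac, broadcast="255.255.255.255", port=9):
    """Send the magic packet over UDP broadcast so the sleeping
    machine's network interface can wake the host."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(make_magic_packet(mac), (broadcast, port))
```

An unlock signal, by contrast, would carry a per-device secret rather than a fixed packet format, so it is not sketched here.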
- As discussed above, the
device 102 may include touch sensitive areas other than just the display 104. Referring to FIG. 1C, the touch sensitive device 102 is illustrated. In this example illustration, the user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the housing 108. The user 110 touches the housing 108 on the top, left corner with a single touch and then moves her finger across the top of the housing 108 and then down the right side of the housing 108, as indicated by the arrows. The housing 108 is configured to receive the gesture. The gesture may be performed in one continuous motion without lifting the finger from the housing 108. - In this exemplary implementation, this particular gesture on a touch sensitive area other than the display may be associated with a particular remote device. When this gesture is received on the
device 102, the device 102 compares the gesture with other stored device gestures, determines the particular remote device based on the comparison and initiates contact with the particular remote device. - Referring to
FIG. 1D, the touch sensitive device 102 is illustrated. In this example illustration, the user 110 enters a gesture comprising a single touch and movement on a touch sensitive area, which in this example is the button 106. The user 110 touches the button 106 on the left edge with a single touch and then moves her finger to the right edge and then down to the bottom edge of the button 106, as indicated by the arrows. The button 106 is configured to receive the gesture. The gesture may be performed in one continuous motion without lifting the finger from the button 106. - In this exemplary implementation, this particular gesture on the
button 106 may be associated with a particular remote device. When this gesture is received on the device 102, the device 102 compares the gesture with other stored device gestures, determines the particular remote device based on the comparison and initiates contact with the particular remote device. - In another exemplary implementation, a gesture also may include multiple touches and one or more movements of a user's finger or an apparatus, such as a stylus, on the touch sensitive area. For example, a gesture may include a first touch and movement on the touch sensitive area followed by a second touch and movement on the touch sensitive area. The first touch and movement and the second touch and movement can be on the same areas of the touch sensitive area or may be on different areas of the touch sensitive area. The first touch and movement and the second touch and movement may be on different touch sensitive areas. A particular gesture that includes multiple touches and one or more movements may be used to select a particular device.
- In another example, the multiple touches may include simultaneous touches. For instance, the user may touch the touch sensitive area in two or more places at the same time. A particular gesture that includes multiple simultaneous touches may be used to select a particular device.
- Referring to
FIG. 1E, the touch sensitive device 102 is illustrated. In this example illustration, the user 110 enters a gesture comprising a first touch and movement followed by a second touch and movement on the touch sensitive area, which in this example is the display 104. The user 110 first touches the display 104 in the upper left hand corner and slides her finger across the top and then diagonally towards the center, as indicated by the arrows. The user 110 then touches the display 104 in the lower right hand corner and slides her finger up the right side of the display 104. This exemplary gesture may be used to select a particular remote device such that the device 102 initiates contact with the remote device. While this example gesture is illustrated on the display 104, other gestures that include multiple touches and one or more movements may be performed on the other touch sensitive areas of the device 102. - A gesture may include various combinations of touches and movements, other than those described above in the example of
FIG. 1E. For example, the touches may include a simultaneous touch in two or more places on the display 104, such as touching the display 104 with one finger on one side of the display 104 and with another finger on the other side of the display 104 at the same time. The simultaneous touches may be followed by a movement of one or more of the fingers on the display 104. In another example, the touches may include a simultaneous touch in two or more places on different touch sensitive areas. For instance, the user may touch the housing 108 with one finger and touch the display 104 with another finger at the same time. The simultaneous touches on the housing 108 and the display 104 may be followed by a movement of one or more of the fingers on the housing 108 or the display 104. - In one exemplary implementation, a gesture also may include multiple touches on the touch sensitive area. The multiple touches on the touch sensitive area may be within a specific period of time, where the period of time may be pre-defined and/or configurable. The multiple touches may not be in combination with a movement of the finger or apparatus on the touch sensitive area. The multiple touches may be in the same location on the touch sensitive area or the multiple touches may be on different locations on the touch sensitive area. The multiple touches may be simultaneous or nonsimultaneous touches on either the same or on different touch sensitive areas. Each of these various exemplary implementations of gestures may be used to select a particular remote device.
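The time-window check described above can be sketched minimally, assuming each touch arrives with a timestamp in seconds; the default window value is an illustrative assumption, since the disclosure only says the period is pre-defined and/or configurable.

```python
def taps_form_gesture(timestamps, window=1.5):
    """Return True when all taps of a multiple-touch gesture fall
    within the configurable time window, measured from the first touch."""
    if len(timestamps) < 2:
        return False  # a multiple-touch gesture needs at least two taps
    timestamps = sorted(timestamps)
    return timestamps[-1] - timestamps[0] <= window
```

Simultaneous touches are simply the special case where two or more timestamps are (nearly) equal, so the same check accepts them.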
- Referring to
FIG. 1F, the touch sensitive device 102 is illustrated. In this example illustration, the user 110 enters a gesture comprising multiple touches on the touch sensitive area, which in this example is the display 104. For example, the user 110 may first touch the upper left corner of the display 104 twice and then touch the lower right corner of the display 104 three times. This combination of touches may be considered a gesture. - Also, as discussed above, the multiple touches may include multiple simultaneous touches. For example, the
user 110 may touch two different locations on the display 104 at the same time and then touch the same locations again. In another example, the user 110 may touch two different locations on the display 104 at the same time and then touch two other locations on the display 104 at the same time. In another example, the user 110 may touch two different locations on two different touch sensitive areas, such as the display 104 and the housing 108, at the same time and then touch either those same locations again or different locations on the same or different touch sensitive areas. - While the examples provided in
FIGS. 1A-1F illustrate example touches and/or movements on the touch sensitive areas, it is understood that these are examples and many other touches and movements are possible. As discussed above, these exemplary gestures may be uniquely associated with a particular remote device such that the device 102 may initiate contact with the remote device associated with a particular gesture. - The gestures that are received on the touch sensitive area may be compared to gestures that are stored in the memory of the
device 102. In this manner, a processor of the device 102 may be configured to compare the received gesture with a stored gesture. One or more of the stored gestures may be referred to as stored device gestures, where a device gesture is associated with a selection of a particular remote device. Each device gesture may uniquely correspond to a particular remote device. If the gestures match, then the processor may be configured to determine a selected remote device and initiate contact with the selected remote device. If the gestures do not match, then the device 102 may not take any action. The processor may be configured to monitor the touch sensitive areas of the device 102 for gestures. - Since the touch
sensitive device 102 may include various types of devices, the components of the device may vary. Some devices may include transceivers, antennas, radios, controllers, memory, processors and other components that are used to perform the functions of the device. Some of these components are described in more detail below with respect to FIG. 4. Referring to FIG. 2A, several common components of the touch sensitive device 102 are illustrated. The touch sensitive device 102 may include a display 104, a memory 212, a processor 214 and a communication module 215. The display 104, the memory 212, the processor 214 and the communication module 215 may be operably coupled to each other and may be configured to communicate with one another. - The
memory 212 may store instructions that are communicated to the processor 214 for execution by the processor 214. The instructions that are executed by the processor 214 may cause the device 102 to perform certain actions. - In one exemplary implementation, the
memory 212 may store one or more device gestures. The stored gestures may be referred to as device gestures because they are compared against gestures received on the touch sensitive area and, when a match is confirmed, the processor 214 may determine which one of multiple remote devices has been selected and may initiate contact with the selected remote device using the communication module 215. The processor 214 also may cause the display 104 to show a custom screen related to the selected remote device. The stored device gestures may include any type of the gestures discussed above with respect to FIGS. 1A-1F. - In one exemplary implementation, one or more default device gestures may be stored in the
memory 212. The default device gestures may be pre-defined gestures such that when a user makes the gesture on the touch sensitive area, the processor 214 initiates contact with the particular remote device associated with the particular pre-defined gesture. - In one exemplary implementation, the device gestures may be programmed by the user and stored in the
memory 212. For example, a settings option for the device 102 may allow the user to program one or more device gestures. For each device gesture, the user may be able to configure the number of touches and/or movements, the movements themselves, the location of the touches and/or the movements, and the period of time in which the gesture needs to be completed. The settings option allows the user to associate a configured device gesture with a particular remote device. The user programmed device gestures, including the remote device associated with each device gesture, may be stored in the memory 212. In another implementation, a menu of predefined device gestures and associated remote devices may be presented to the user for selection. - The
processor 214 may be operably coupled to the display 104, the touch sensitive area(s), the memory 212 and the communication module 215. The processor 214 may be configured to monitor the touch sensitive areas for received gestures. The processor 214 may be configured to compare gestures received on the touch sensitive area to the device gestures stored in the memory 212. When a received gesture matches a stored device gesture, the processor 214 is configured to determine which remote device has been selected and to initiate contact with the selected remote device using the communication module 215. - If the received gesture does not match a stored device gesture, then the
processor 214 may not take any action. The processor 214 may continue monitoring the touch sensitive areas for gestures. - The
communication module 215 may include a radio frequency transceiver, a Bluetooth transceiver, a Wi-Fi transceiver, an infrared transceiver or other type of transceiver that is configured to communicate with multiple different remote devices. - In one exemplary implementation, the
processor 214 may communicate a wake up and/or a power on signal (or command) to a selected remote device using the communication module 215. A particular wake up signal and/or power on signal may be associated with a particular remote device and stored in the memory 212. The wake up signal may cause the selected remote device to transition from a sleep state to an on state such that the remote device is ready for use by the user. The power on signal may cause the remote device to transition from an off state to an on state. - In one exemplary implementation, the processor may communicate an unlock signal (or command) to a selected remote device using the
communication module 215. A particular unlock signal may be associated with a particular remote device and stored in the memory 212. The unlock signal may cause the selected remote device to be accessible for operation by a user directly or by the remote device itself. The unlock signal may be a password, a passcode or another type of security code that unlocks the selected remote device using the device 102. - In one exemplary implementation, the touch sensitive areas may include a separate controller or processor, which may be configured to monitor the touch sensitive area for received gestures. Referring to
FIG. 2B, the touch sensitive device 102 also may include a touch sensitive controller 216. The touch sensitive controller 216 may be a processor that is configured to control one or more of the touch sensitive areas of the device 102. The touch sensitive controller 216 may be operably coupled to the display 104 and any other touch sensitive areas. As discussed above, the touch sensitive areas may include, for example, the display 104, the button 106 and portions of the housing 108. The touch sensitive controller 216 also may be configured to perform other actions related to one or more of the device components. For example, the touch sensitive controller 216 may be configured to perform actions related to the display 104 and to control the functionality of the display 104. - The touch
sensitive controller 216 also may be operably coupled to the memory 212, the processor 214 and the communication module 215. The touch sensitive controller 216 also may monitor the display 104 for gestures received on the display 104. In one exemplary implementation, the touch sensitive controller 216 may be configured to compare a received gesture with a device gesture stored in the memory 212 and to determine a particular remote device associated with the gesture when a match occurs. The touch sensitive controller 216 also may be configured to send a signal to the communication module 215 to initiate contact with the selected remote device. In this manner, the main processor 214 may be off or in a reduced power state while the touch sensitive controller 216 remains in a power state with enough power to monitor the touch sensitive areas. - Referring to
FIG. 3, an exemplary flow chart illustrates a process 300. Process 300 includes receiving one or more gestures on a touch sensitive area (301). For example, the device 102 may include one or more touch sensitive areas on which one or more gestures may be received including, for example, the display 104, the button 106 and the housing 108 (301). The processor 214 and/or the touch sensitive controller 216 may be configured to monitor the touch sensitive areas for received gestures, even when the touch sensitive area is otherwise in a sleep state and/or even when the device 102 is in a sleep state. - Process 300 includes comparing gestures received on the touch sensitive areas to one or more device gestures stored in memory, where the stored device gestures correspond to a selection of one of multiple remote devices (303). For example, in one exemplary implementation, the
processor 214 may be configured to compare the gestures received on the touch sensitive areas to device gestures stored in the memory 212 (303). In another exemplary implementation, the touch sensitive controller 216 may be configured to compare the gestures received on the touch sensitive areas to the device gestures stored in the memory 212 (303). - Process 300 includes determining a selected remote device based on the comparison (305). For example, each device gesture stored in the
memory 212 may be uniquely associated with the selection of a particular remote device. Based on the comparison between a received gesture and a stored device gesture, the processor 214 or the touch sensitive controller 216 may determine which particular remote device has been selected. - Process 300 includes initiating contact with the selected remote device (307). For example, the
processor 214 or the touch sensitive controller 216 may initiate contact with the selected remote device using the communication module 215. Initiating contact with the selected remote device may include communicating a wake up signal and/or an unlock signal to the selected remote device. -
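Steps 301 through 307 of process 300 can be sketched end to end in Python; the gesture encoding and the contact callback are illustrative assumptions, not the claimed implementation.

```python
def process_300(received_gesture, stored_device_gestures, initiate_contact):
    """Sketch of process 300: receive a gesture (301), compare it to the
    stored device gestures (303), determine the selected remote device
    (305), and initiate contact, e.g. a wake up or unlock signal (307)."""
    selected = stored_device_gestures.get(received_gesture)  # 303, 305
    if selected is None:
        return None  # no match: the device takes no action
    initiate_contact(selected)  # 307: via the communication module
    return selected
```

If no stored device gesture matches, the function returns None and no contact is initiated, mirroring the no-match behavior described above.

-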
FIG. 4 shows an example of a generic computer device 400 and a generic mobile computer device 450, which may be used with the techniques described here. Computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Computing device 450 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, tablets, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document. Both computing device 400 and computing device 450 may include the features and functionality of the touch sensitive device 102 described above with respect to FIGS. 1A-1F, 2A, 2B and 3. The descriptions below include other exemplary components and additional functionality, which may be incorporated into the touch sensitive device 102. -
Computing device 400 includes a processor 402, memory 404, a storage device 406, a high-speed interface 408 connecting to memory 404 and high-speed expansion ports 410, and a low speed interface 412 connecting to low speed bus 414 and storage device 406. Each of the components may be interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 402 can process instructions for execution within the computing device 400, including instructions stored in the memory 404 or on the storage device 406 to display graphical information for a GUI on an external input/output device, such as display 416 coupled to high speed interface 408. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system). - The
memory 404 stores information within the computing device 400. In one implementation, the memory 404 is a volatile memory unit or units. In another implementation, the memory 404 is a non-volatile memory unit or units. The memory 404 may also be another form of computer-readable medium, such as a magnetic or optical disk. - The
storage device 406 is capable of providing mass storage for the computing device 400. In one implementation, the storage device 406 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in an information carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a non-transitory computer- or machine-readable medium, such as the memory 404, the storage device 406, or memory on processor 402. - The
high speed controller 408 manages bandwidth-intensive operations for the computing device 400, while the low speed controller 412 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one implementation, the high-speed controller 408 is coupled to memory 404, display 416 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 410, which may accept various expansion cards (not shown). In the implementation, low-speed controller 412 is coupled to storage device 406 and low-speed expansion port 414. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter. - The
computing device 400 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 420, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 424. In addition, it may be implemented in a personal computer such as a laptop computer 422. Alternatively, components from computing device 400 may be combined with other components in a mobile device (not shown), such as device 450. Each of such devices may contain one or more of computing device 400, 450, and an entire system may be made up of multiple computing devices 400, 450 communicating with each other. -
Computing device 450 includes a processor 452, memory 464, an input/output device such as a display 454, a communication interface 466, and a transceiver 468, among other components. The device 450 may also be provided with a storage device, such as a micro drive or other device, to provide additional storage. Each of the components may be interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate. - The
processor 452 can execute instructions within the computing device 450, including instructions stored in the memory 464. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, for coordination of the other components of the device 450, such as control of user interfaces, applications run by device 450, and wireless communication by device 450. -
Processor 452 may communicate with a user through control interface 458 and display interface 456 coupled to a display 454. The display 454 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 456 may comprise appropriate circuitry for driving the display 454 to present graphical and other information to a user. The control interface 458 may receive commands from a user and convert them for submission to the processor 452. In addition, an external interface 462 may be provided in communication with processor 452, so as to enable near area communication of device 450 with other devices. External interface 462 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used. - The
memory 464 stores information within the computing device 450. The memory 464 can be implemented as one or more of a non-transitory computer readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 474 may also be provided and connected to device 450 through expansion interface 472, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 474 may provide extra storage space for device 450, or may also store applications or other information for device 450. Specifically, expansion memory 474 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, expansion memory 474 may be provided as a security module for device 450, and may be programmed with instructions that permit secure use of device 450. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner. - The memory may include, for example, flash memory and/or NVRAM memory, as discussed below. In one implementation, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a non-transitory computer- or machine-readable medium, such as the
memory 464, expansion memory 474, or memory on processor 452, that may be received, for example, over transceiver 468 or external interface 462. -
Device 450 may communicate wirelessly through communication interface 466, which may include digital signal processing circuitry where necessary. Communication interface 466 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 468. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 470 may provide additional navigation- and location-related wireless data to device 450, which may be used as appropriate by applications running on device 450. -
Device 450 may also communicate audibly using audio codec 460, which may receive spoken information from a user and convert it to usable digital information. Audio codec 460 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 450. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 450. - The
computing device 450 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 480. It may also be implemented as part of a smart phone 482, personal digital assistant, or other similar mobile device. - Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
- To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
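The client-server relationship described above can be sketched in miniature: a client (such as the touch sensitive device) reports a recognized gesture over a socket, and a server (standing in for the remote device) translates it into a command. The gesture names, command strings, and loopback transport below are illustrative assumptions, not details taken from this specification.

```python
import socket
import threading

# Hypothetical mapping from a recognized touch gesture to a remote-device
# command; the names are illustrative only.
GESTURE_COMMANDS = {
    "swipe_left": "PREVIOUS_TRACK",
    "swipe_right": "NEXT_TRACK",
    "double_tap": "PLAY_PAUSE",
}

def remote_device_server(sock):
    """Accept one connection, translate the received gesture, reply with a command."""
    conn, _ = sock.accept()
    with conn:
        gesture = conn.recv(1024).decode("utf-8")
        command = GESTURE_COMMANDS.get(gesture, "UNKNOWN")
        conn.sendall(command.encode("utf-8"))

def send_gesture(port, gesture):
    """Client side: report a gesture and return the command the server selected."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall(gesture.encode("utf-8"))
        return conn.recv(1024).decode("utf-8")

# Demo: client and server on one machine, interacting through a socket.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))  # bind to any free port
server_sock.listen(1)
port = server_sock.getsockname()[1]

worker = threading.Thread(target=remote_device_server, args=(server_sock,))
worker.start()
print(send_gesture(port, "double_tap"))  # PLAY_PAUSE
worker.join()
server_sock.close()
```

As the paragraph notes, the client-server relationship arises purely from the programs running on each end; the same two functions could run on separate machines across a LAN, WAN, or the Internet by replacing the loopback address.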
- A number of embodiments have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention.
- In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. Moreover, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/798,216 US20150317055A1 (en) | 2011-04-29 | 2015-07-13 | Remote device control using gestures on a touch sensitive device |
US16/890,125 US11543956B2 (en) | 2011-04-29 | 2020-06-02 | Remote device control using gestures on a touch sensitive device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/097,790 US9081810B1 (en) | 2011-04-29 | 2011-04-29 | Remote device control using gestures on a touch sensitive device |
US14/798,216 US20150317055A1 (en) | 2011-04-29 | 2015-07-13 | Remote device control using gestures on a touch sensitive device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/097,790 Continuation US9081810B1 (en) | 2011-04-29 | 2011-04-29 | Remote device control using gestures on a touch sensitive device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/890,125 Continuation US11543956B2 (en) | 2011-04-29 | 2020-06-02 | Remote device control using gestures on a touch sensitive device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150317055A1 true US20150317055A1 (en) | 2015-11-05 |
Family
ID=53506768
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/097,790 Active 2033-02-18 US9081810B1 (en) | 2011-04-29 | 2011-04-29 | Remote device control using gestures on a touch sensitive device |
US14/798,216 Abandoned US20150317055A1 (en) | 2011-04-29 | 2015-07-13 | Remote device control using gestures on a touch sensitive device |
US16/890,125 Active 2032-01-20 US11543956B2 (en) | 2011-04-29 | 2020-06-02 | Remote device control using gestures on a touch sensitive device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/097,790 Active 2033-02-18 US9081810B1 (en) | 2011-04-29 | 2011-04-29 | Remote device control using gestures on a touch sensitive device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/890,125 Active 2032-01-20 US11543956B2 (en) | 2011-04-29 | 2020-06-02 | Remote device control using gestures on a touch sensitive device |
Country Status (1)
Country | Link |
---|---|
US (3) | US9081810B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210181887A1 (en) * | 2019-12-13 | 2021-06-17 | Hyundai Motor Company | Input device and operation method thereof |
US11543956B2 (en) | 2011-04-29 | 2023-01-03 | Google Llc | Remote device control using gestures on a touch sensitive device |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662576B (en) * | 2012-03-29 | 2015-04-29 | 华为终端有限公司 | Method and device for sending out information based on touch |
US10289205B1 (en) | 2015-11-24 | 2019-05-14 | Google Llc | Behind the ear gesture control for a head mountable device |
US10437464B2 (en) * | 2016-11-18 | 2019-10-08 | Adobe Inc. | Content filtering system for touchscreen devices |
US10698498B2 (en) * | 2017-11-30 | 2020-06-30 | Komodo OpenLab Inc. | Configurable device switching mechanism that enables seamless interactions with multiple devices |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050046584A1 (en) * | 1992-05-05 | 2005-03-03 | Breed David S. | Asset system control arrangement and method |
US20050212911A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture identification of controlled devices |
US20050212753A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Motion controlled remote controller |
US20120162073A1 (en) * | 2010-12-28 | 2012-06-28 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040082409A (en) | 2002-02-05 | 2004-09-24 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Method of activating a remotely controllable device |
US7103006B2 (en) | 2002-05-01 | 2006-09-05 | International Business Machines Corporation | Method, system, and article of manufacture for data transmission |
US7176902B2 (en) | 2003-10-10 | 2007-02-13 | 3M Innovative Properties Company | Wake-on-touch for vibration sensing touch input devices |
WO2005109215A2 (en) * | 2004-04-30 | 2005-11-17 | Hillcrest Laboratories, Inc. | Methods and devices for removing unintentional movement in free space pointing devices |
US8629836B2 (en) * | 2004-04-30 | 2014-01-14 | Hillcrest Laboratories, Inc. | 3D pointing devices with orientation compensation and improved usability |
US7499985B2 (en) | 2004-06-22 | 2009-03-03 | Nokia Corporation | Intuitive energy management of a short-range communication transceiver associated with a mobile terminal |
US7480790B2 (en) | 2005-07-29 | 2009-01-20 | Hewlett-Packard Development Company, L.P. | Sleep state resume |
US7606552B2 (en) | 2005-11-10 | 2009-10-20 | Research In Motion Limited | System and method for activating an electronic device |
US7549583B2 (en) | 2007-03-07 | 2009-06-23 | Metrologic Instruments, Inc. | Method for waking up in-counter fixed scanners |
US20080284739A1 (en) | 2007-05-17 | 2008-11-20 | Microsoft Corporation | Human Interface Device |
US7889175B2 (en) | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
US8766925B2 (en) | 2008-02-28 | 2014-07-01 | New York University | Method and apparatus for providing input to a processor, and a sensor pad |
US20100287513A1 (en) * | 2009-05-05 | 2010-11-11 | Microsoft Corporation | Multi-device gesture interactivity |
US9081810B1 (en) | 2011-04-29 | 2015-07-14 | Google Inc. | Remote device control using gestures on a touch sensitive device |
- 2011
  - 2011-04-29 US US13/097,790 patent/US9081810B1/en active Active
- 2015
  - 2015-07-13 US US14/798,216 patent/US20150317055A1/en not_active Abandoned
- 2020
  - 2020-06-02 US US16/890,125 patent/US11543956B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US11543956B2 (en) | 2023-01-03 |
US9081810B1 (en) | 2015-07-14 |
US20200293172A1 (en) | 2020-09-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11543956B2 (en) | Remote device control using gestures on a touch sensitive device | |
AU2020201096B2 (en) | Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium | |
US10209821B2 (en) | Computing devices having swiping interfaces and methods of operating the same | |
US11237724B2 (en) | Mobile terminal and method for split screen control thereof, and computer readable storage medium | |
US8599105B2 (en) | Method and apparatus for implementing a multiple display mode | |
KR102028724B1 (en) | User terminal device and display method thereof | |
WO2020007147A1 (en) | Application switching method and apparatus for split screen, storage medium, and electronic device | |
US9377868B2 (en) | Sliding control method and terminal device thereof | |
AU2013276998B2 (en) | Mouse function provision method and terminal implementing the same | |
CN105359086A (en) | Method for controlling chat window and electronic device implementing the same | |
KR20170076357A (en) | User terminal device, and mode conversion method and sound system for controlling volume of speaker thereof | |
WO2015100569A1 (en) | Sidebar menu display method, device and terminal | |
WO2015043194A1 (en) | Virtual keyboard display method and apparatus, and terminal | |
US11262911B2 (en) | Integrated home key and virtual key area for a smart terminal | |
KR20140111790A (en) | Method and apparatus for inputting keys using random valuable on virtual keyboard | |
WO2020007144A1 (en) | Switching method and device for split screen application, storage medium and electronic device | |
CN103870133A (en) | Method and apparatus for scrolling screen of display device | |
KR20200051783A (en) | Method and terminal for displaying multiple content cards | |
US9727131B2 (en) | User input method for use in portable device using virtual input area | |
US20130050094A1 (en) | Method and apparatus for preventing malfunction of touchpad in electronic device | |
CN108604161A (en) | A kind of method, apparatus and terminal device of locking list object | |
CN104238931B (en) | Information input method and device and electronic equipment | |
KR102106354B1 (en) | Method and apparatus for controlling operation in a electronic device | |
US20170269687A1 (en) | Methods and apparatus to provide haptic feedback for computing devices | |
US9977567B2 (en) | Graphical user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SMITH, GREGORY PAUL;REEL/FRAME:036073/0074 Effective date: 20110428 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044695/0115 Effective date: 20170929 |
|
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |