US20170235479A1 - Executing a default action on a touchscreen device - Google Patents
- Publication number
- US20170235479A1 (U.S. application Ser. No. 15/587,587)
- Authority
- US
- United States
- Prior art keywords
- gesture
- context
- action
- default
- application
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
Definitions
- The present disclosure generally relates to operating a touchscreen device, and more particularly to executing a predefined action on a touchscreen device using a predefined gesture.
- Touchscreen devices interact with users by receiving input through touch operations.
- Such touchscreen devices may include, for example, desktop computers, laptop computers, tablet computers, smartphones, and televisions.
- The disclosed subject technology relates to a computer-implemented method for executing a default action on a touchscreen device.
- The method includes receiving a touch input from a user on a touchscreen device and determining a context associated with the touch input.
- The context is associated with one or more actions including a default action.
- The method also includes determining that the received touch input comprises a default gesture, and performing the default action associated with the determined context.
- The disclosed subject technology further relates to a system for executing a default action on a touchscreen device.
- The system includes a memory storing executable instructions.
- The system also includes a processor coupled to the memory, configured to execute the stored executable instructions to receive a touch input from a user on a touchscreen device and determine a context associated with the touch input.
- The context is associated with one or more actions including a default action predetermined by the user.
- The processor is further configured to determine whether the received touch input comprises a default gesture, and if the received touch input comprises the default gesture, perform the default action associated with the determined context.
- The disclosed subject technology also relates to a machine-readable storage medium comprising machine-readable instructions for causing a processor to execute a method for executing a default action on a touchscreen device.
- The method includes receiving a touch input from a user on a touchscreen device and determining a context associated with the touch input based on information available to a user within a predetermined distance from a location on the touchscreen device at which the touch input is received.
- The context is associated with one or more actions including a default action predetermined by the user.
- The method also includes determining whether the received touch input comprises a two-finger double-tap gesture, and if the received touch input comprises the two-finger double-tap gesture, executing a predetermined application associated with the determined context.
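The claimed method can be sketched as follows. This is an illustrative stand-in only: the function names, the string-based context detection, and the default-application table are assumptions, not details from the patent.

```python
def determine_context(touched_text):
    """Crude stand-in for the patent's context/intent determination,
    based on the content touched on screen."""
    if "@" in touched_text:
        return "email"
    if touched_text.startswith(("http://", "https://")):
        return "url"
    if touched_text.replace("-", "").replace(" ", "").isdigit():
        return "phone"
    return "generic"

# Default applications assumed to be predetermined by the user, per context.
DEFAULT_APPS = {"phone": "voip_app", "email": "mail_client", "url": "browser"}

def handle_touch(touched_text, gesture):
    """Return the action taken for a touch input."""
    context = determine_context(touched_text)
    if gesture == "two_finger_double_tap":
        # Default gesture: execute the predetermined application directly.
        return ("execute", DEFAULT_APPS.get(context, "none"))
    # Any other gesture falls through to the usual selection behavior.
    return ("menu", context)
```

A two-finger double tap on a phone number would thus launch the preconfigured VoIP application, while any other gesture leaves the normal menu path intact.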
- FIG. 1 illustrates an example architecture for executing a default action on a touchscreen device.
- FIG. 2 is a block diagram illustrating an example system for executing a default action on a touchscreen device.
- FIG. 3 is a diagram illustrating example operations for executing a default action on a touchscreen device.
- FIG. 4 illustrates an example flow diagram of example processes for executing a default action on a touchscreen device.
- FIG. 5 conceptually illustrates an electronic system with which some implementations of the subject technology are implemented.
- The subject technology relates to touchscreen devices such as, for example, smartphones, tablet computers, and laptop and desktop computers with touchscreens.
- Platforms on which the touchscreen devices run may allow applications to associate themselves with different contexts (which may also be referred to as “intents” herein).
- Platforms may be, for example, mobile operating systems.
- A context or an intent can be determined based on a touch input made on a phone number displayed on a screen of a mobile device.
- Many applications installed or running on the touchscreen device may be associated with such context (determined based on touch input on a phone number). When such context is determined, the associated applications may perform actions that are associated with the context.
- For example, many voice-over-IP (VoIP) applications may associate themselves with this context.
- A touch on an email address may trigger determination of another context, with which many email client applications may associate themselves.
- Another example in which a context may be determined is a touch on URL links which many web browser applications may associate themselves with.
- Contexts may also be associated with different system actions. For example, a context may be determined based on the highlighting input on a text displayed on the screen of the touchscreen device. Such context may be associated with system actions such as copy, cut and paste. A touch input on the screen in general may give rise to determination of another context which may be associated with system actions such as paste or zoom in/out of displayed contents.
- A default application may be specified for a given context (e.g., determined based on touch input on a phone number, email address or URL link). For example, a user may specify that whenever a phone number is touched, a specific VoIP application responds. However, specifying the default application may bury easy access to other options which may be associated with the given context. For example, if a default application is associated with telephone numbers, touching a telephone number would no longer bring up a menu for selecting an application for responding to the touch. Rather, the default application automatically responds to the touch. Therefore, if the user later wishes for a different application to respond to the context, the user would have to go through the process of clearing the default, which may often require navigating multiple levels of settings menus.
- A two-finger double-tap gesture is a type of touch input which is relatively easy to perform, and there are currently no generally accepted actions associated with this gesture.
- A default action may be associated with a two-finger double tap. This way, the user may conveniently execute the default action by performing the two-finger double-tap gesture, without hiding easy access to other available actions. For example, telephone numbers may be associated with multiple VoIP applications, and a default action may be associated with an application “A”.
- To execute application “A”, the user may perform the two-finger double-tap gesture. If the user wishes to select another application for responding to a touch on a telephone number, the user may perform the traditional single-finger single-tap gesture, which may bring up the menu for selecting other applications to respond to the touch gesture.
- The two-finger double tap may perform the option that is most commonly used, or the most reasonable given the context, while the single-finger single-tap gesture may bring up the list of options, as usual.
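One way to model “the default runs immediately, while a single tap lists the options” is an ordered action list per context whose first entry is the default. The context and action names below are hypothetical, chosen only to mirror the telephone-number example above.

```python
# Each context keeps an ordered list of candidate actions; by convention
# here, the first entry is the user's predetermined default (an assumption).
CONTEXT_ACTIONS = {
    "phone": ["call_with_app_a", "call_with_app_b", "add_to_contacts"],
    "email": ["compose_in_mail_client", "copy_address"],
}

def on_gesture(context, gesture):
    actions = CONTEXT_ACTIONS.get(context, [])
    if gesture == "two_finger_double_tap" and actions:
        return actions[0]            # perform the default without a menu
    return {"choose_from": actions}  # single-finger single tap: show the list
```

Keeping the full list alongside the default preserves the easy access to alternatives that a hard-wired default would otherwise bury.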
- A two-finger double tap may also activate a “web intents” action. For example, a two-finger double tap on an image may add it to a default photo sharing site.
- While the subject technology is described above with a two-finger double-tap gesture, other gestures may also be used.
- The term “application” as used herein encompasses its plain and ordinary meaning including, but not limited to, a piece of software.
- An application may be stored and run on a touchscreen device, the Internet, or other electronic devices.
- An application may also be stored on the internet and be run on a touchscreen device.
- An application may also be specifically designed to run on a specific type of electronic device. For example, an application may be designed specifically to run on a mobile touchscreen device such as a smartphone.
- FIG. 1 illustrates an example architecture 100 for executing a default action on a touchscreen device.
- The architecture 100 includes servers 110 and clients 120 connected over a network 130. Each of the clients 120 may interact with users, and communicate with the servers 110 to execute a default action on a touchscreen device.
- The servers 110 may be any device having a processor, memory, and communications capability for communicating with the clients 120 for distributing applications associated with executing a default action on a touchscreen device. For example, the servers 110 may distribute applications to the clients 120, and the clients 120 may execute a default action using the distributed applications in response to a user input.
- The servers 110 may also receive instructions for executing a default action from the clients 120, which in turn may receive user input for executing the default action.
- The clients 120 may be touchscreen devices such as, for example, a desktop computer, a laptop computer, a mobile device (e.g., a smartphone, tablet computer, or PDA), a set top box (e.g., for a television), a television, a video game console, a home appliance (e.g., a refrigerator, microwave oven, washer or dryer) or any other device having a touch interface, processor, memory, and communications capabilities for interacting with the user, receiving applications for executing a default action and/or communicating with the servers 110 to execute the default action.
- The clients 120 may communicate with the servers 110 as described above, but the clients need not communicate with the servers to execute the default action on a touchscreen device.
- The applications associated with executing a default action on a touchscreen device may be installed at the clients 120 locally (e.g., via a USB device or a local network such as, for example, LAN, Bluetooth, or near field communication). Further, the clients 120 may receive the user input for executing the default action and execute the default action at the clients.
- The network 130 may include, for example, any one or more of a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a broadband network (BBN), the Internet, and the like. Further, the network 130 can include, but is not limited to, any one or more of the following network topologies, including a bus network, a star network, a ring network, a mesh network, a star-bus network, a tree or hierarchical network, and the like.
- FIG. 2 is a block diagram 200 illustrating an example system 202 for executing a default action on a touchscreen device.
- The system 202 may be implemented, for example, at a touchscreen device (e.g., a client 120).
- The system 202 includes a processor 204 and a memory 206.
- The system 202 may also include a communications module 208, and may be connected to the network 230 via the communications module 208.
- The network 230 may be, for example, the network 130.
- The communications module 208 is configured to interface with the network 230 to send and receive information, such as data, requests, responses, and commands, to other devices (e.g., servers 110) or systems on the network 230.
- The data sent and received through the communications module 208 may also include applications which may be used for executing default actions.
- The communications module 208 may be, for example, a modem, an Ethernet card, or a mobile broadband adaptor.
- The system 202 also includes a touch interface 212 through which a user may interact with the system 202 using touch input.
- Touch input may include various gestures, for example, single or multiple taps using one finger, single or multiple taps using multiple fingers, a swipe gesture using one or more fingers and a pinching or expanding gesture using multiple fingers.
- The touch interface 212 may be any type of interface for receiving and recognizing touch input such as, for example, a capacitive-type touchscreen, a resistive-type touchscreen, or an optical touch recognition system.
- The system 202 may receive a touch input through the touch interface 212 and determine the type of touch input received. For example, the system 202 may determine whether the received touch input is a single-finger single-tap gesture or a two-finger double-tap gesture.
- Based on the type of the touch input and the context, the system 202 may perform an appropriate action. Such action may include executing an application 222 or performing a system action associated with the type of input and the context.
- The application 222 may be received from an external server (e.g., servers 110) through the network 230 or locally installed on the system 202 using a local connection (e.g., USB connection, Bluetooth or WiFi), and stored in the memory 206.
- System 202 may also include a data store 210 , which may also store the application 222 .
- The data store 210 may be integrated with the memory 206, or may be independent from the memory and in communication with the processor 204 and the memory.
- The data store 210 may also be implemented to be independent from the system 202 and in communication with the system.
- The processor 204 is configured to execute instructions, such as instructions physically coded into the processor, instructions received in the form of software from the memory 206, or a combination of both.
- The processor 204 is configured to execute instructions to receive a touch input from a user through the touch interface 212 and determine a context associated with the touch input, where the context is associated with one or more actions including a default action.
- The context may be determined based on information displayed on the touch interface 212, information stored in the memory 206 or data store 210, and the location and type of gesture of the touch input.
- The default action may be executing a predetermined application or executing a system action, depending on the determined context and the received touch input.
- The processor is also configured to determine whether the received touch input comprises a default gesture associated with a default action, and if so, perform the default action associated with the determined context.
- The processor is further configured to determine whether the received touch input is an action-select gesture not associated with a default action, and if so, prompt the user to select an action from among the one or more actions associated with the determined context.
- FIG. 3 is a diagram 300 illustrating example operations for executing a default action on a touchscreen device.
- The operations may be performed, for example, by a touchscreen device 302 incorporating a system for executing a default action on a touchscreen device (e.g., system 202).
- The touchscreen device 302 includes a touchscreen 304 (e.g., touch interface 212).
- The touchscreen 304 displays information which a user may perceive, and also receives touch input from a user. Touch input may take various forms including different types of gestures.
- Gestures may be, for example, a single tap with a single finger, a double tap with one finger, a single tap with more than one finger, a double tap with more than one finger, a pinching gesture with more than one finger, an expanding gesture with more than one finger, or a swipe gesture with one or more fingers.
- The touchscreen device 302 interprets the touch input differently depending on the type of gesture used for the touch input.
- The touch input may also be interpreted differently depending on the context in which the touch input is received.
- The context may be determined based on the location on the touchscreen 304 at which the touch input is received.
- The touch input may be interpreted differently depending on the physical location of the touch input made on the touchscreen 304, or depending on the relative location of the touch input with respect to the information displayed on the touchscreen, even though the same type of gesture is used.
- For example, a single tap with a single finger on an “OK” button may indicate approval of information displayed on the touchscreen 304, while the same gesture made on a “cancel” button may indicate disapproval of the information displayed on the touchscreen.
- The context may also be determined based on the type of information displayed on the touchscreen 304 at the time the touch input is received.
- A touch input at a location where a phone number is displayed may be interpreted differently as compared to a touch input received where an email address is displayed.
- A single tap with a single finger on a telephone number may initiate a phone call; the same type of touch input on an email address may bring up a screen for composing an email.
- The context may also include information stored in a memory (e.g., memory 206 or data store 210) of the touchscreen device 302 at the time the touch input is received. For example, if a touch input is received at a location where the user may enter text while a selection of copied text is stored in the memory, the touchscreen device 302 may provide an option to paste the copied text at the location the touch input is received.
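The clipboard-dependent behavior just described can be sketched as follows; the action names and the single-argument shape are illustrative assumptions.

```python
def actions_for_text_location(clipboard_text):
    """Actions offered for a touch on a text-entry location.

    'paste' is offered only when copied text is currently held in memory,
    mirroring the clipboard-aware context described above.
    """
    actions = ["place_cursor", "select_all"]
    if clipboard_text:  # copied text stored in memory -> offer paste
        actions.append("paste")
    return actions
```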
- Based on the touch input, a context 306 is determined.
- The context 306 may be determined based on a touch input received on a telephone number displayed on the touchscreen 304. A determination is also made as to the type of touch input received.
- The touch input may be an action-select gesture 308 or a default gesture 310.
- The user may utilize the action-select gesture 308 to generally interact with the touchscreen device 302, and the action-select gesture may often be the primary means for interacting with the touchscreen device. For example, most of the user interface elements of the touchscreen device 302 may be pre-programmed to respond to an action-select gesture 308. As specific examples, an action-select gesture 308 on an icon representing an application executes the application, and an action-select gesture 308 on a touchscreen 304 places a cursor at the location at which the gesture is received. Depending on the context 306, the action-select gesture 308 may also prompt the user to select a desired action from a list of available actions that may be associated with the context.
- The action-select gesture 308 may be, for example, a single-finger single tap.
- The default gesture 310 may be a gesture that the user may use to perform a specific action. If a context 306 is associated with more than one possible action, then the default gesture 310 may be predetermined to be associated with a specific action, and the user may use the default gesture to perform that predetermined action.
- The default gesture 310 may be, for example, a two-finger double tap. Other types of gestures may also be used for the action-select gesture 308 and the default gesture 310.
- When the default gesture 310 is received, the default action associated with the context is performed.
- Many applications (e.g., application 222) may be associated with the context 306. For example, these applications may be VoIP applications for making telephone calls, or contact management applications for managing contact information including phone numbers.
- Among these applications, a default application may be predetermined.
- When the default gesture 310 is received, the predetermined default application is executed to perform the actions that the application was designed to perform, given the context 306.
- Upon receiving an action-select gesture 308, all available actions for the context 306 may be made available to the user. For example, a list of all applications that are associated with the displayed telephone number is displayed to the user, and the user is prompted to select an application for handling the telephone number. After a selection is received from the user, the selected application is executed to perform the actions that the application was designed to perform, given the context 306.
- The context 306 may also be determined based on an email address or a URL link displayed to the user, which are associated with applications capable of handling email addresses and URL links, respectively.
- For an email address, receiving a default gesture 310 executes a predetermined application capable of handling emails (e.g., an email client application).
- For a URL link, receiving a default gesture 310 executes a predetermined application capable of handling URL links (e.g., a web browser application).
- The context 306 may also be associated with system actions.
- System actions include, for example, actions related to editing text or graphical objects, such as cut, copy, paste, delete and select.
- The context 306 may also be associated with other actions specific to the context. For example, where the context 306 is determined based on an input received on a combo box or a drop-down menu, the context may be associated with displaying the list of selectable items. Where the context 306 is determined based on an input received on a photo, the context may be associated with executing an application capable of handling photos, as well as the system actions discussed above.
- Receiving an action-select gesture 308 may allow the user the option of selecting a desired action or application to be executed which is associated with the context 306, whereas receiving a default gesture 310 performs a predetermined action or executes a predetermined application associated with the context.
- An action or an application to be executed upon receiving a default gesture 310 on a given context 306 may be determined based on the most logical or the most likely action/application that the user may wish to perform/execute, given the context. Examples include: a default gesture 310 on a selected (highlighted) text, which copies the selected text into the memory; a default gesture 310 on an empty input box while copied text is stored in the memory, which pastes the copied text into the input box; a default gesture 310 on a URL link, which executes a predetermined web browser application that opens the link; a default gesture 310 on an image, which shares the image on a predetermined photo sharing service; a default gesture 310 on an email address, which executes a predetermined email client application that opens an email compose interface; a default gesture 310 on an address, which executes a map tool application that shows the area around the address; and a default gesture 310 on the name of a person or business, which opens a web browser or a social networking application showing information associated with the name.
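The per-context defaults enumerated above can be collected into a simple lookup table. The keys and action names below are illustrative stand-ins for the examples given, not identifiers from the patent.

```python
# Hypothetical mapping from context to the "most logical" default action
# performed on receiving the default gesture.
DEFAULT_ACTION = {
    "selected_text": "copy_to_memory",
    "empty_input_box": "paste_copied_text",
    "url_link": "open_in_default_browser",
    "image": "share_on_default_photo_service",
    "email_address": "compose_in_default_email_client",
    "street_address": "show_area_in_map_tool",
}

def default_action_for(context):
    # Fall back to the usual action-selection prompt when no default exists.
    return DEFAULT_ACTION.get(context, "prompt_for_action")
```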
- FIG. 4 illustrates an example flow diagram 400 of example processes for executing a default action on a touchscreen device.
- The operations of FIG. 4 may be performed, for example, by the system 202.
- The operations of FIG. 4 are not limited to such a system, however, and may be performed using other systems/configurations.
- In step 402, a touch input is received on a touchscreen device (e.g., touchscreen device 302).
- In step 404, a context (e.g., context 306) associated with the touch input is determined.
- The context is also associated with one or more actions, including a default action.
- In step 406, a determination is made as to the type of the touch input received in step 402. If the touch input is a default gesture (e.g., default gesture 310), then in step 408, the default action associated with the context determined in step 404 is performed.
- If, in step 406, the touch input is determined to be an action-select gesture (e.g., action-select gesture 308), then in step 410, the user is prompted to select an action from among the one or more actions associated with the context determined in step 404.
- In certain aspects, the process may be similar to that described above with reference to FIG. 4, except that after receiving the touch input (e.g., step 402), the determination of the type of the touch input is made first. If the touch input is a default gesture (e.g., default gesture 310), then a context associated with the default gesture is determined. Next, the default action associated with the determined context is performed.
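This alternative ordering, in which the gesture type is classified first and a context is determined only when the default gesture is seen, might look like the sketch below. All names are hypothetical.

```python
# Predetermined default applications per context (assumed configuration).
DEFAULTS = {"phone": "voip_app"}

def classify(finger_count, tap_count):
    """Coarse gesture classification from finger and tap counts."""
    if finger_count == 2 and tap_count == 2:
        return "default_gesture"
    return "action_select_gesture"

def determine_context(touched_text):
    """Toy context determination: digits (with dashes) look like a phone number."""
    return "phone" if touched_text.replace("-", "").isdigit() else "generic"

def handle(finger_count, tap_count, touched_text):
    gesture = classify(finger_count, tap_count)    # step 1: gesture type first
    if gesture == "default_gesture":
        context = determine_context(touched_text)  # step 2: context, only now
        return DEFAULTS.get(context, "none")       # step 3: default action
    return "prompt_for_action"
```

Classifying the gesture first avoids the cost of context determination on the common action-select path; the observable behavior matches the FIG. 4 ordering.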
- A computer-readable storage medium is also referred to herein as a computer-readable medium.
- When instructions stored on such media are executed by one or more processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions.
- Examples of computer-readable media include, but are not limited to, CD-ROMs, flash drives, RAM chips, hard drives, EPROMs, etc.
- The computer-readable media do not include carrier waves and electronic signals passing wirelessly or over wired connections.
- The term “software” is meant to include, but is not limited to, firmware residing in read-only memory or applications stored in magnetic storage, which can be read into memory for processing by a processor.
- Multiple software aspects of the subject disclosure can be implemented as sub-parts of a larger program while remaining distinct software aspects of the subject disclosure.
- Multiple software aspects can also be implemented as separate programs.
- Any combination of separate programs that together implement a software aspect described here is within the scope of the subject disclosure.
- The software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
- A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
- A computer program may, but need not, correspond to a file in a file system.
- A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
- A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- FIG. 5 conceptually illustrates an electronic system with which some implementations of the subject technology are implemented.
- Electronic system 500 can be a computer, phone, PDA, or any other sort of electronic device. Such an electronic system includes various types of computer-readable media and interfaces for various other types of computer-readable media.
- Electronic system 500 includes a bus 508 , processing unit(s) 512 , a system memory 504 , a read-only memory (ROM) 510 , a permanent storage device 502 , an input device interface 514 , an output device interface 506 , and a network interface 516 .
- Bus 508 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of electronic system 500 .
- bus 508 communicatively connects processing unit(s) 512 with ROM 510 , system memory 504 , and permanent storage device 502 .
- processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of the subject disclosure.
- the processing unit(s) can be a single processor or a multi-core processor in different implementations.
- ROM 510 stores static data and instructions that are needed by processing unit(s) 512 and other modules of the electronic system.
- Permanent storage device 502 is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when electronic system 500 is off. Some implementations of the subject disclosure use a mass-storage device (such as a magnetic or optical disk and its corresponding disk drive) as permanent storage device 502 .
- system memory 504 is a read-and-write memory device. However, unlike storage device 502 , system memory 504 is a volatile read-and-write memory, such as a random access memory. System memory 504 stores some of the instructions and data that the processor needs at runtime. In some implementations, the processes of the subject disclosure are stored in system memory 504 , permanent storage device 502 , and/or ROM 510 . From these various memory units, processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of some implementations.
- Bus 508 also connects to input and output device interfaces 514 and 506 .
- Input device interface 514 enables the user to communicate information and select commands to the electronic system.
- Input devices used with input device interface 514 include, for example, alphanumeric keyboards and pointing devices (also called “cursor control devices”).
- Output device interface 506 enables, for example, the display of images generated by the electronic system 500 .
- Output devices used with output device interface 506 include, for example, printers and display devices, such as cathode ray tubes (CRT) or liquid crystal displays (LCD). Some implementations include devices such as a touchscreen that functions as both input and output devices.
- bus 508 also couples electronic system 500 to a network (not shown) through a network interface 516 .
- the computer can be a part of a network of computers (such as a local area network (“LAN”), a wide area network (“WAN”), an intranet, or a network of networks, such as the Internet). Any or all components of electronic system 500 can be used in conjunction with the subject disclosure.
- Some implementations include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media).
- computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks.
- the computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations.
- Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
- Some implementations are performed by one or more integrated circuits, such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs).
- in some implementations, such integrated circuits execute instructions that are stored on the circuit itself.
- the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people.
- the terms “display” or “displaying” mean displaying on an electronic device.
- the terms “computer-readable medium” and “computer-readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
- implementations of the subject technology described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user, for example, by sending web pages to a web browser on the user's device in response to requests received from the web browser.
- aspects of the subject technology described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject technology described in this specification, or any combination of one or more such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
- Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device).
- Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- any specific order or hierarchy of steps in the processes disclosed is an illustration of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that not all illustrated steps need be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the aspects described above should not be understood as requiring such separation in all aspects, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
- a phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology.
- a disclosure relating to an aspect may apply to all configurations, or one or more configurations.
- a phrase such as an aspect may refer to one or more aspects and vice versa.
- a phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology.
- a disclosure relating to a configuration may apply to all configurations, or one or more configurations.
- a phrase such as a configuration may refer to one or more configurations and vice versa.
Abstract
Description
- This application is a continuation, and claims the benefit under 35 U.S.C. §120, of U.S. patent application Ser. No. 13/567,828, filed 6 Aug. 2012, of which the entire contents and substance are hereby incorporated by reference as if fully set forth below.
- The present disclosure generally relates to operating a touchscreen device, and more particularly to executing a predefined action on a touchscreen device using a predefined gesture.
- Touchscreen devices interact with users by receiving input through touch operations. Such touchscreen devices may include, for example, desktop computers, laptop computers, tablet computers, smartphones, and televisions.
- The disclosed subject technology relates to a computer-implemented method for executing a default action on a touchscreen device. The method includes receiving a touch input from a user on a touchscreen device and determining a context associated with the touch input. The context is associated with one or more actions including a default action. The method also includes determining that the received touch input comprises a default gesture, and performing the default action associated with the determined context.
- The disclosed subject technology further relates to a system for executing a default action on a touchscreen device. The system includes a memory storing executable instructions. The system also includes a processor coupled to the memory configured to execute the stored executable instructions to receive a touch input from a user on a touchscreen device and determine a context associated with the touch input. The context is associated with one or more actions including a default action predetermined by the user. The processor is further configured to determine whether the received touch input comprises a default gesture, and if the received touch input comprises the default gesture, perform the default action associated with the determined context.
- The disclosed subject technology also relates to a machine-readable storage medium comprising machine-readable instructions for causing a processor to execute a method for executing a default action on a touchscreen device. The method includes receiving a touch input from a user on a touchscreen device and determining a context associated with the touch input based on information available to a user within a predetermined distance from a location on the touchscreen device at which the touch input is received. The context is associated with one or more actions including a default action predetermined by the user. The method also includes determining whether the received touch input comprises a two-finger double-tap gesture, and if the received touch input comprises the two-finger double-tap gesture, executing a predetermined application associated with the determined context.
- It is understood that other configurations of the subject technology will become readily apparent to those skilled in the art from the following detailed description, wherein various configurations of the subject technology are shown and described by way of illustration. As will be realized, the subject technology is capable of other and different configurations and its several details are capable of modification in various other respects, all without departing from the scope of the subject technology. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
- Certain features of the subject technology are set forth in the appended claims. However, for purposes of explanation, several aspects of the subject technology are set forth in the following figures.
-
FIG. 1 illustrates an example architecture for executing a default action on a touchscreen device. -
FIG. 2 is a block diagram illustrating an example system for executing a default action on a touchscreen device. -
FIG. 3 is a diagram illustrating example operations for executing a default action on a touchscreen device. -
FIG. 4 illustrates an example flow diagram of example processes for executing a default action on a touchscreen device. -
FIG. 5 conceptually illustrates an electronic system with which some implementations of the subject technology are implemented. - The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be clear and apparent to those skilled in the art that the subject technology is not limited to the specific details set forth herein and may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.
- Many applications may be installed on touchscreen devices such as, for example, smartphones, tablet computers, and laptop and desktop computers with touchscreens. Platforms on which the touchscreen devices run (for example, mobile operating systems) may allow these applications to associate themselves with different contexts (which may also be referred to as “intents” herein). For example, a context or an intent can be determined based on a touch input made on a phone number displayed on a screen of a mobile device. Many applications installed or running on the touchscreen device may be associated with such a context, and when the context is determined, the associated applications may perform actions that are associated with it. Specifically, many voice-over-IP (VoIP) applications may register themselves with phone numbers, and when the user touches a phone number, the platform allows the user to select, from among the many VoIP applications which have registered themselves with phone numbers, one for responding to the touch input. A touch on an email address may trigger determination of another context with which many email client applications may associate themselves. Another example in which a context may be determined is a touch on a URL link, with which many web browser applications may associate themselves.
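The registration pattern described above can be sketched as follows. This is an illustrative assumption, not the platform's actual API: the class name, context-type strings, and application names are all hypothetical.

```python
# Illustrative sketch (not from the disclosure): applications register
# themselves for a context type; the platform later lists all registered
# handlers so the user can select one to respond to the touch input.

class Registry:
    def __init__(self):
        self._handlers = {}  # context type -> list of registered app names

    def register(self, context_type, app_name):
        # e.g., a VoIP application registering itself with phone numbers
        self._handlers.setdefault(context_type, []).append(app_name)

    def handlers_for(self, context_type):
        # returned to the user as a selection menu on a touch input
        return list(self._handlers.get(context_type, []))
```

With two VoIP applications registered for phone numbers, `handlers_for("phone-number")` would return both, mirroring the selection menu the platform presents.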
- Contexts may also be associated with different system actions. For example, a context may be determined based on the highlighting input on a text displayed on the screen of the touchscreen device. Such context may be associated with system actions such as copy, cut and paste. A touch input on the screen in general may give rise to determination of another context which may be associated with system actions such as paste or zoom in/out of displayed contents.
- In existing platforms, a default application may be specified for a given context (e.g., one determined based on a touch input on a phone number, email address, or URL link). For example, a user may specify that whenever a phone number is touched, a specific VoIP application should respond each time. However, specifying the default application may bury easy access to other options which may be associated with the given context. For example, if a default application is associated with telephone numbers, touching a telephone number would no longer bring up a menu for selecting an application for responding to the touch. Rather, the default application automatically responds to the touch. Therefore, if the user later wishes for a different application to respond to the context, the user would have to go through the process of clearing the default, which may often require accessing multiple levels of settings menus.
- Furthermore, for contexts that are associated with different system actions (e.g., touching a text leading to a menu for Cut, Copy, Paste, Delete and Select All), in existing platforms, the most commonly used of these available actions (e.g., Copy) or the most reasonable in a given context (e.g., Paste, if a blank space is touched) is given no shortcut.
- According to various aspects of the subject technology, a method and system for executing a default action on a touchscreen device is provided. The two-finger double-tap gesture is a type of touch input which is relatively easy to perform, and there are currently no generally accepted actions associated with this gesture. In addition to the regular touch action (e.g., a single-finger single tap), which may bring up the conventional list of options for responding to a touch gesture, a default action may be associated with a two-finger double tap. This way, the user may conveniently execute the default action by performing the two-finger double-tap gesture, without hiding easy access to other available actions. For example, telephone numbers may be associated with multiple VoIP applications, and a default action may be associated with an application “A”. When the user wishes to use application A when touching a telephone number, the user may perform the two-finger double-tap gesture. If the user wishes to select other applications for responding to a touch on a telephone number, the user may perform the traditional single-finger single-tap gesture, which may bring up the menu for selecting other applications to respond to the touch gesture.
- Likewise, where a user operation reveals a series of system actions (e.g., touching near a text brings up a menu for Cut, Copy, Paste, Delete and Select All), the two-finger double tap may perform the option that is most commonly used, or the most reasonable given the context, while the single-finger single-tap gesture may bring up the list of options, as usual.
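The two-gesture dispatch described in the preceding paragraphs can be sketched as follows. The gesture names and the dictionary layout of a context (a list of available actions plus one default) are illustrative assumptions, not terms from the disclosure.

```python
# Minimal sketch of the dispatch: a two-finger double tap triggers the
# context's default action directly, while a single-finger single tap
# surfaces the full menu of available actions, as usual.

DEFAULT_GESTURE = "two-finger double-tap"
ACTION_SELECT_GESTURE = "single-finger single-tap"

def respond(gesture, context):
    """Return the default action directly, or the full menu of actions."""
    if gesture == DEFAULT_GESTURE:
        return context["default"]   # perform immediately, no menu
    if gesture == ACTION_SELECT_GESTURE:
        return context["actions"]   # prompt the user to choose
    return None                     # gesture not handled here
```

For the text-menu example above, the default gesture would return "Copy" directly, while the action-select gesture would return the whole Cut/Copy/Paste/Delete/Select All list for the user to pick from.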
- By way of non-limiting example, following are some default actions which may be performed by a two-finger double-tap gesture for various contexts:
- 1. Two-finger double tap selected text to copy.
- 2. Two-finger double tap an empty input box to paste.
- 3. Two-finger double tap to follow a link using a default browser.
- 4. Two-finger double tap to activate a “web intents” action. For example, two-finger double tap an image to add it to a default photo sharing site.
- 5. Two-finger double tap an email address to open up a default email client application.
- 6. Two-finger double tap an address to open up a default maps tool with that address.
- 7. Two-finger double tap a name to go to a profile page on a default social network.
- 8. Two-finger double tap a form element to auto-fill previously stored information.
- 9. Two-finger double tap a combo box (drop-down menu) to switch back to the default element.
- While the subject technology is described above with a two-finger double-tap gesture, other gestures may also be used.
- The term “application” as used herein encompasses its plain and ordinary meaning including, but not limited to, a piece of software. An application may be stored and run on a touchscreen device, the Internet, or other electronic devices. An application may also be stored on the Internet and be run on a touchscreen device. An application may also be specifically designed to run on a specific type of electronic device. For example, an application may be designed specifically to run on a mobile touchscreen device such as a smartphone.
-
FIG. 1 illustrates an example architecture 100 for executing a default action on a touchscreen device. The architecture 100 includes servers 110 and clients 120 connected over a network 130. Each of the clients 120 may interact with users, and communicate with the servers 110 to execute a default action on a touchscreen device. The servers 110 may be any device having a processor, memory, and communications capability for communicating with the clients 120 for distributing applications associated with executing a default action on a touchscreen device. For example, the servers 110 may distribute applications to the clients 120, and the clients 120 may execute a default action using the distributed applications in response to a user input. The servers 110 may also receive instructions for executing a default action from the clients 120, which in turn may receive user input for executing the default action. The clients 120 may be the touchscreen device such as, for example, a desktop computer, a laptop computer, a mobile device (e.g., a smartphone, tablet computer, or PDA), a set top box (e.g., for a television), television, video game console, home appliance (e.g., a refrigerator, microwave oven, washer or dryer) or any other device having a touch interface, processor, memory, and communications capabilities for interacting with the user, receiving applications for executing a default action and/or communicating with the servers 110 to execute the default action. - The
clients 120 may communicate with the servers 110 as described above, but the clients need not communicate with the servers to execute the default action on a touchscreen device. For example, the applications associated with executing a default action on a touchscreen device may be installed at the clients 120 locally (e.g., via a USB device or a local network such as, for example, LAN, Bluetooth, or near field communication). Further, the clients 120 may receive the user input for executing the default action and execute the default action at the clients. - The
network 130 may include, for example, any one or more of a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a broadband network (BBN), the Internet, and the like. Further, the network 130 can include, but is not limited to, any one or more of the following network topologies, including a bus network, a star network, a ring network, a mesh network, a star-bus network, tree or hierarchical network, and the like. -
FIG. 2 is a block diagram 200 illustrating an example system 202 for executing a default action on a touchscreen device. The system 202 may be implemented, for example, at a touchscreen device (e.g., a client 120). The system 202 includes a processor 204, and a memory 206. The system 202 may also include a communications module 208, and may be connected to the network 230 via the communications module 208. The network 230 may be, for example, the network 130. The communications module 208 is configured to interface with the network 230 to send and receive information, such as data, requests, responses, and commands to other devices (e.g., servers 110) or systems on the network 230. The data sent and received through the communications module 208 may also include applications which may be used for executing default actions. The communications module 208 may be, for example, modems, Ethernet cards or mobile broadband adaptors. - The
system 202 also includes a touch interface 212 through which a user may interact with the system 202 using touch input. Touch input may include various gestures, for example, single or multiple taps using one finger, single or multiple taps using multiple fingers, a swipe gesture using one or more fingers and a pinching or expanding gesture using multiple fingers. The touch interface 212 may be any type of interface for receiving and recognizing touch input such as, for example, capacitive-type touchscreens, resistive-type touchscreens, or optical touch recognition systems. The system 202 may receive a touch input through the touch interface 212 and determine the type of touch input received. For example, the system 202 may determine whether the received touch input is a single-finger single-tap gesture, or a two-finger double-tap gesture. Depending on the type of determined touch input and the context in which the touch input is received, the system 202 may perform an appropriate action. Such action may include executing an application 222 or performing a system action associated with the type of input and the context. The application 222 may be received from an external server (e.g., servers 110) through the network 230 or locally installed on the system 202 using a local connection (e.g., USB connection, Bluetooth or WiFi), and stored in the memory 206. -
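Distinguishing a single-finger single tap from a two-finger double tap, as the touch interface above must, can be sketched as follows. The event format (timestamp in milliseconds, finger count) and the tap-window threshold are illustrative assumptions; a real touch interface would be considerably more involved.

```python
# Minimal sketch of gesture classification from raw tap events, where
# each event is a (timestamp_ms, finger_count) tuple. Taps landing
# within the window are grouped into a multi-tap gesture.

DOUBLE_TAP_WINDOW_MS = 300  # illustrative threshold

def classify(events):
    """Classify a sequence of tap events into a gesture name."""
    if not events:
        return None
    fingers = events[0][1]
    taps = 1
    for prev, cur in zip(events, events[1:]):
        if cur[1] != fingers:
            return None  # mixed finger counts are not handled here
        if cur[0] - prev[0] <= DOUBLE_TAP_WINDOW_MS:
            taps += 1
    if fingers == 1 and taps == 1:
        return "single-finger single-tap"
    if fingers == 2 and taps == 2:
        return "two-finger double-tap"
    return None
```

For example, two two-finger taps 150 ms apart classify as the two-finger double-tap gesture, while a lone one-finger tap classifies as the single-finger single tap.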
System 202 may also include a data store 210, which may also store the application 222. The data store 210 may be integrated with the memory 206, or may be independent from the memory and be in communication with the processor 204 and the memory. The data store 210 may also be implemented to be independent from the system 202 and in communication with the system. - The
processor 204 is configured to execute instructions, such as instructions physically coded into the processor, instructions received in the form of software from the memory 206, or a combination of both. For example, the processor 204 is configured to execute instructions to receive a touch input from a user through the touch interface 212 and determine a context associated with the touch input, where the context is associated with one or more actions including a default action. The context may be determined based on information displayed on the touch interface 212, information stored in the memory 206 or data store 210, and location and type of gesture of the touch input. The default action may be executing a predetermined application or executing a system action, depending on the determined context and the received touch input. The processor is also configured to determine whether the received touch input comprises a default gesture associated with a default action, and if so, perform the default action associated with the determined context. The processor is further configured to determine whether the received touch input is an action-select gesture not associated with a default action, and if so, provide for prompting the user to select an action from among the one or more actions associated with the determined context. -
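Determining a context from the information under the touch location, as described above, can be sketched with simple pattern matching on the displayed text. The patterns below are simplified illustrative assumptions, not production-grade parsers and not part of the disclosure.

```python
# Illustrative sketch: derive a context type from the text displayed at
# the touch location (phone number, email address, URL, or general).
import re

def detect_context(text):
    """Return a context type for the text under the touch location."""
    if re.fullmatch(r"\+?[\d\-\s\(\)]{7,}", text):
        return "phone-number"
    if re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", text):
        return "email-address"
    if text.startswith(("http://", "https://")):
        return "url"
    return "general"
```

A touch on "(555) 123-4567" would thus yield the phone-number context, which the processor could then pair with the received gesture type to pick the default application or the selection menu.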
FIG. 3 is a diagram 300 illustrating example operations for executing a default action on a touchscreen device. The operations may be performed, for example, by a touchscreen device 302 incorporating a system for executing a default action on a touchscreen device (e.g., system 202). The touchscreen device 302 includes a touchscreen 304 (e.g., touch interface 212). The touchscreen 304 displays information which a user may perceive, and also receives touch input from a user. Touch input may take various forms including different types of gestures. These gestures may be, for example, a single tap with a single finger, a double tap with one finger, a single tap with more than one finger, a double tap with more than one finger, a pinching gesture with more than one finger, an expanding gesture with more than one finger or a swipe gesture with one or more fingers. The touchscreen device 302 interprets the touch input differently depending on the type of gestures used for the touch input. - The touch input may also be interpreted differently depending on the context in which the touch input is received. The context may be determined based on the location on the
touchscreen 304 at which the touch input is received. For example, the touch input may be interpreted differently depending on the physical location of the touch input made on the touchscreen 304, or depending on the relative location of the touch input with respect to the information displayed on the touchscreen, even though the same type of gesture is used. Specifically, a single tap with a single finger touch input on an “OK” button may indicate approval of information displayed on the touchscreen 304, whereas the same gesture made on a “cancel” button may indicate disapproval of the information displayed on the touchscreen. The context may also be determined based on the type of information displayed on the touchscreen 304 at the time touch input is received. For example, a touch input at a location where phone numbers are displayed may be interpreted differently as compared to a touch input received where an email address is displayed. Specifically, a single tap with a single finger at a telephone number may initiate a phone call; the same type of touch input at an email address may bring up a screen for composing an email. - The context may also include information stored in a memory (e.g.,
memory 206 or data store 210) of the touchscreen device 302 at the time touch input is received. For example, if a touch input is received at a location where the user may enter text while a selection of copied text is stored in the memory, the touchscreen device 302 may provide for an option to paste the copied text at the location the touch input is received. - When a touch input is received, a
context 306 is determined. For example, the context 306 may be determined based on a touch input received on a telephone number displayed on the touchscreen 304. Determination is also made as to the type of touch input received. For example, the touch input may be an action-select gesture 308 or a default gesture 310. - The user may utilize the action-
select gesture 308 to generally interact with the touchscreen device 302, and the action-select gesture may often be the primary means for interacting with the touchscreen device. For example, most of the user interface elements of the touchscreen device 302 may be pre-programmed to respond to an action-select gesture 308. As specific examples, an action-select gesture 308 on an icon representing an application executes the application; and an action-select gesture 308 on a touchscreen 304 places a cursor at the location the general gesture is received. Depending on the context 306, the action-select gesture 308 may also provide for prompting the user to select a desired action from a list of available actions that may be associated with the context. The action-select gesture 308 may be, for example, a single-finger single tap. The default gesture 310 may be a gesture that the user may use to perform a specific action. If a context 306 is associated with more than one possible action, then the default gesture 310 may be predetermined to be associated with a specific action, and the user may use the default gesture to perform that predetermined action. The default gesture 310 may be, for example, a two-finger double tap. Other types of gestures may also be used for the action-select gesture 308 and the default gesture 310. - In the example where the
context 306 is determined based on a touch input received on a telephone number, if the touch input is determined to be a default gesture 310, then the default action associated with the context is performed. For example, many applications (e.g., application 222) that are associated with phone numbers may be installed on the touchscreen device 302. These applications may be VoIP applications for making telephone calls, or contact management applications for managing contact information, including phone numbers. Out of the many applications installed on the touchscreen device 302 that are associated with phone numbers, a default application may be predetermined. When the default gesture 310 is received and the context 306 is determined based on a touch input received on a telephone number, the predetermined default application is executed to perform the actions that the application was designed to perform, given the context 306. - For the
same context 306 as in the above example, where the touch input is received on a telephone number, if the received touch input is determined to be an action-select gesture 308, then all available actions for the context 306 may be made available to the user. For example, a list of all applications that are associated with the displayed telephone number is displayed to the user, and the user is prompted to select an application for handling the telephone number. After a selection is received from the user, the selected application is executed to perform the actions that the application was designed to perform, given the context 306. - While the above paragraphs discuss an example of the
context 306 involving a telephone number, other types of contexts may also be determined, which may be associated with various other types of actions. For example, the context 306 may be determined based on an email address or a URL link displayed to the user, which are associated with applications capable of handling email addresses and URL links, respectively. In the case where the context 306 is determined based on an input received on an email address, receiving a default gesture 310 executes a predetermined application capable of handling emails (e.g., an email client application). In the case where the context 306 is determined based on an input received on a URL link, receiving a default gesture 310 executes a predetermined application capable of handling URL links (e.g., a web browser application). - The
context 306 may also be associated with system actions. System actions include, for example, actions related to the editing of text or graphical objects, such as cut, copy, paste, delete, and select. The context 306 may also be associated with other actions specific to the context. For example, where the context 306 is determined based on an input received on a combo box or a drop-down menu, the context may be associated with displaying the list of selectable items. Where the context 306 is determined based on an input received on a photo, the context may be associated with executing an application capable of handling photos, as well as with the system actions discussed above. - As discussed in the above examples, for the
same context 306, receiving an action-select gesture 308 may allow the user to select a desired action or an application to be executed that is associated with the context 306, whereas receiving a default gesture 310 performs a predetermined action or executes a predetermined application associated with the context. - An action or an application to be executed upon receiving a
default gesture 310 on a given context 306 may be determined based on the most logical or most likely action or application that the user may wish to perform or execute, given the context. Examples include: a default gesture 310 on selected (highlighted) text, which copies the selected text into the memory; a default gesture 310 on an empty input box while copied text is stored in the memory, which pastes the copied text into the input box; a default gesture 310 on a URL link, which executes a predetermined web browser application that opens the link; a default gesture 310 on an image, which shares the image on a predetermined photo-sharing service; a default gesture 310 on an email address, which executes a predetermined email client application that opens an email-compose interface; a default gesture 310 on an address, which executes a map tool application that shows the area around the address; a default gesture 310 on the name of a person or business, which opens a web browser or a social networking application showing the profile page for that name; a default gesture 310 on an empty form field, which automatically fills in the field with previously entered information; and a default gesture 310 on a combo box or drop-down menu, which selects a default menu element from among the menu's items. -
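The pairings of contexts with predetermined default actions listed above can be sketched as a simple lookup table. The sketch below is illustrative only: the context keys and action names are hypothetical labels chosen to mirror the examples in this description, not identifiers from any particular platform.

```python
# Illustrative default-action registry: each context is mapped to the single
# predetermined action performed when the default gesture is received.
# All keys and action names are hypothetical labels.
DEFAULT_ACTIONS = {
    "selected-text": "copy-to-memory",
    "empty-input-box": "paste-from-memory",
    "url-link": "open-in-default-browser",
    "image": "share-on-default-photo-service",
    "email-address": "compose-in-default-email-client",
    "address": "show-area-in-map-tool",
    "person-or-business-name": "open-profile-page",
    "empty-form-field": "fill-with-previous-entry",
    "combo-box": "select-default-menu-element",
}

def default_action(context):
    """Return the predetermined default action for a context, or None if
    no default has been predetermined for that context."""
    return DEFAULT_ACTIONS.get(context)
```

Under this sketch, `default_action("url-link")` yields the browser action, while an unrecognized context yields `None`, in which case a device might instead fall back to treating the gesture like an action-select gesture.
-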
FIG. 4 illustrates an example flow diagram 400 of example processes for executing a default action on a touchscreen device. The operations of FIG. 4 may be performed, for example, by the system 202. However, the operations of FIG. 4 are not limited to such a system, and may be performed using other systems or configurations. - The operation begins in
step 402, where a touch input is received on a touchscreen device (e.g., touchscreen device 302). In step 404, a context (e.g., context 306) associated with the touch input is determined. The context is also associated with one or more actions, including a default action. In step 406, a determination is made as to the type of the touch input received in step 402. If the touch input is a default gesture (e.g., default gesture 310), then in step 408, the default action associated with the context determined in step 404 is performed. - If in
step 406, the touch input is determined to be an action-select gesture (e.g., action-select gesture 308), then in step 410, the user is prompted to select an action from among the one or more actions associated with the context determined in step 404. - Another example process is similar to that described above with reference to
FIG. 4, except that after the touch input is received (e.g., step 402), the type of the touch input is determined first. If the touch input is a default gesture (e.g., default gesture 310), then a context associated with the default gesture is determined. Next, the default action associated with the determined context is performed. - Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer-readable storage medium (also referred to as a computer-readable medium). When these instructions are executed by one or more processing units (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer-readable media include, but are not limited to, CD-ROMs, flash drives, RAM chips, hard drives, EPROMs, etc. Computer-readable media do not include carrier waves and electronic signals passing wirelessly or over wired connections.
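The two example processes described above with reference to FIG. 4 (determining the context first, and the alternative that determines the gesture type first) can be sketched as follows. This is a minimal illustration under stated assumptions: the function names, the actions table, and prompt_user are hypothetical stand-ins, with the assumed convention that the first action listed for a context is its predetermined default.

```python
# Illustrative sketch of the FIG. 4 flow. `actions` maps each context to its
# list of available actions; by the convention assumed here, the first entry
# is the predetermined default action for that context. `prompt_user` stands
# in for whatever interface element lets the user pick from a list.

def handle_touch_input(gesture, context, actions, prompt_user):
    available = actions.get(context, [])
    if not available:
        return None
    if gesture == "default":
        # Step 408: perform the predetermined default action for the context.
        return available[0]
    if gesture == "action-select":
        # Step 410: prompt the user to select from all available actions.
        return prompt_user(available)
    return None
```

The alternative process differs only in ordering: the gesture type is checked first, and the context is determined only after a default gesture is recognized, which can avoid computing the context for gestures that do not need it.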
- In this specification, the term “software” is meant to include, but is not limited to, firmware residing in read-only memory or applications stored in magnetic storage, which can be read into memory for processing by a processor. Also, in some implementations, multiple software aspects of the subject disclosure can be implemented as sub-parts of a larger program while remaining distinct software aspects of the subject disclosure. In some implementations, multiple software aspects can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software aspect described here is within the scope of the subject disclosure. In some implementations, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
- A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
-
FIG. 5 conceptually illustrates an electronic system with which some implementations of the subject technology are implemented. Electronic system 500 can be a computer, phone, PDA, or any other sort of electronic device. Such an electronic system includes various types of computer-readable media and interfaces for various other types of computer-readable media. Electronic system 500 includes a bus 508, processing unit(s) 512, a system memory 504, a read-only memory (ROM) 510, a permanent storage device 502, an input device interface 514, an output device interface 506, and a network interface 516. -
Bus 508 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of electronic system 500. For instance, bus 508 communicatively connects processing unit(s) 512 with ROM 510, system memory 504, and permanent storage device 502. - From these various memory units, processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of the subject disclosure. The processing unit(s) can be a single processor or a multi-core processor in different implementations.
-
ROM 510 stores static data and instructions that are needed by processing unit(s) 512 and other modules of the electronic system. Permanent storage device 502, on the other hand, is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when electronic system 500 is off. Some implementations of the subject disclosure use a mass-storage device (such as a magnetic or optical disk and its corresponding disk drive) as permanent storage device 502. - Other implementations use a removable storage device (such as a floppy disk, flash drive, and its corresponding disk drive) as
permanent storage device 502. Like permanent storage device 502, system memory 504 is a read-and-write memory device. However, unlike storage device 502, system memory 504 is a volatile read-and-write memory, such as a random access memory. System memory 504 stores some of the instructions and data that the processor needs at runtime. In some implementations, the processes of the subject disclosure are stored in system memory 504, permanent storage device 502, and/or ROM 510. From these various memory units, processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of some implementations. -
Bus 508 also connects to input and output device interfaces 514 and 506. Input device interface 514 enables the user to communicate information and select commands to the electronic system. Input devices used with input device interface 514 include, for example, alphanumeric keyboards and pointing devices (also called “cursor control devices”). Output device interface 506 enables, for example, the display of images generated by the electronic system 500. Output devices used with output device interface 506 include, for example, printers and display devices, such as cathode ray tubes (CRT) or liquid crystal displays (LCD). Some implementations include devices, such as a touchscreen, that function as both input and output devices. - Finally, as shown in
FIG. 5, bus 508 also couples electronic system 500 to a network (not shown) through a network interface 516. In this manner, the computer can be a part of a network of computers (such as a local area network (“LAN”), a wide area network (“WAN”), an intranet, or a network of networks, such as the Internet). Any or all components of electronic system 500 can be used in conjunction with the subject disclosure. - The functions described above can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware. The techniques can be implemented using one or more computer program products. Programmable processors and computers can be included in or packaged as mobile devices. The processes and logic flows can be performed by one or more programmable processors and by one or more programmable logic circuits. General and special purpose computing devices and storage devices can be interconnected through communication networks.
- Some implementations include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
- While the above discussion primarily refers to microprocessors or multi-core processors that execute software, some implementations are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some implementations, such integrated circuits execute instructions that are stored on the circuit itself.
- As used in this specification and any claims of this application, the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of this specification, the terms “display” or “displaying” mean displaying on an electronic device. As used in this specification and any claims of this application, the terms “computer-readable medium” and “computer-readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
- To provide for interaction with a user, implementations of the subject technology described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user.
- Aspects of the subject technology described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject technology described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some aspects, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- It is understood that any specific order or hierarchy of steps in the processes disclosed is an illustration of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that not all illustrated steps be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the aspects described above should not be understood as requiring such separation in all aspects, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
- The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the subject disclosure.
- A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A phrase such as a configuration may refer to one or more configurations and vice versa.
- The word “exemplary” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
- All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims.
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/587,587 US20170235479A1 (en) | 2012-08-06 | 2017-05-05 | Executing a default action on a touchscreen device |
US16/656,297 US11243683B2 (en) | 2012-08-06 | 2019-10-17 | Context based gesture actions on a touchscreen |
US17/561,015 US11599264B2 (en) | 2012-08-06 | 2021-12-23 | Context based gesture actions on a touchscreen |
US18/174,503 US11789605B2 (en) | 2012-08-06 | 2023-02-24 | Context based gesture actions on a touchscreen |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/567,828 US9684398B1 (en) | 2012-08-06 | 2012-08-06 | Executing a default action on a touchscreen device |
US15/587,587 US20170235479A1 (en) | 2012-08-06 | 2017-05-05 | Executing a default action on a touchscreen device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/567,828 Continuation US9684398B1 (en) | 2012-08-06 | 2012-08-06 | Executing a default action on a touchscreen device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/656,297 Continuation US11243683B2 (en) | 2012-08-06 | 2019-10-17 | Context based gesture actions on a touchscreen |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170235479A1 true US20170235479A1 (en) | 2017-08-17 |
Family
ID=59034381
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/567,828 Active 2032-09-28 US9684398B1 (en) | 2012-08-06 | 2012-08-06 | Executing a default action on a touchscreen device |
US15/587,587 Abandoned US20170235479A1 (en) | 2012-08-06 | 2017-05-05 | Executing a default action on a touchscreen device |
US16/656,297 Active 2032-09-28 US11243683B2 (en) | 2012-08-06 | 2019-10-17 | Context based gesture actions on a touchscreen |
US17/561,015 Active US11599264B2 (en) | 2012-08-06 | 2021-12-23 | Context based gesture actions on a touchscreen |
US18/174,503 Active US11789605B2 (en) | 2012-08-06 | 2023-02-24 | Context based gesture actions on a touchscreen |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/567,828 Active 2032-09-28 US9684398B1 (en) | 2012-08-06 | 2012-08-06 | Executing a default action on a touchscreen device |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/656,297 Active 2032-09-28 US11243683B2 (en) | 2012-08-06 | 2019-10-17 | Context based gesture actions on a touchscreen |
US17/561,015 Active US11599264B2 (en) | 2012-08-06 | 2021-12-23 | Context based gesture actions on a touchscreen |
US18/174,503 Active US11789605B2 (en) | 2012-08-06 | 2023-02-24 | Context based gesture actions on a touchscreen |
Country Status (1)
Country | Link |
---|---|
US (5) | US9684398B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11243683B2 (en) | 2012-08-06 | 2022-02-08 | Google Llc | Context based gesture actions on a touchscreen |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9483755B2 (en) | 2008-03-04 | 2016-11-01 | Apple Inc. | Portable multifunction device, method, and graphical user interface for an email client |
US11068128B2 (en) | 2013-09-03 | 2021-07-20 | Apple Inc. | User interface object manipulations in a user interface |
EP3605286B1 (en) | 2013-09-03 | 2021-02-17 | Apple Inc. | User interface for manipulating user interface objects |
US10503388B2 (en) | 2013-09-03 | 2019-12-10 | Apple Inc. | Crown input for a wearable electronic device |
CN116301544A (en) | 2014-06-27 | 2023-06-23 | 苹果公司 | Reduced size user interface |
CN105830028B (en) * | 2014-07-11 | 2020-06-26 | 华为技术有限公司 | Man-machine interaction function execution method and terminal |
TWI676127B (en) * | 2014-09-02 | 2019-11-01 | 美商蘋果公司 | Method, system, electronic device and computer-readable storage medium regarding electronic mail user interface |
CN110072131A (en) | 2014-09-02 | 2019-07-30 | 苹果公司 | Music user interface |
US10082892B2 (en) | 2014-09-02 | 2018-09-25 | Apple Inc. | Button functionality |
US10049087B2 (en) * | 2016-07-19 | 2018-08-14 | International Business Machines Corporation | User-defined context-aware text selection for touchscreen devices |
US11435830B2 (en) | 2018-09-11 | 2022-09-06 | Apple Inc. | Content-based tactile outputs |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110066976A1 (en) * | 2009-09-15 | 2011-03-17 | Samsung Electronics Co., Ltd. | Function executing method and apparatus for mobile terminal |
US20110169760A1 (en) * | 2008-09-22 | 2011-07-14 | Stantum | Device for control of electronic apparatus by manipulation of graphical objects on a multicontact touch screen |
US20120297041A1 (en) * | 2011-05-20 | 2012-11-22 | Citrix Systems, Inc. | Shell Integration on a Mobile Device for an Application Executing Remotely on a Server |
US20140298244A1 (en) * | 2013-03-26 | 2014-10-02 | Samsung Electronics Co., Ltd. | Portable device using touch pen and application control method using the same |
Family Cites Families (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5010605B2 (en) | 2006-08-11 | 2012-08-29 | パナソニック株式会社 | Event processing device |
US8677285B2 (en) * | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
KR101526965B1 (en) | 2008-02-29 | 2015-06-11 | 엘지전자 주식회사 | Terminal and method for controlling the same |
EP2297685A1 (en) | 2008-07-04 | 2011-03-23 | Yogesh Chunilal Rathod | Methods and systems for brands social networks (bsn) platform |
US20100107067A1 (en) | 2008-10-27 | 2010-04-29 | Nokia Corporation | Input on touch based user interfaces |
JP4885938B2 (en) | 2008-12-25 | 2012-02-29 | 京セラ株式会社 | Input device |
US9213687B2 (en) | 2009-03-23 | 2015-12-15 | Lawrence Au | Compassion, variety and cohesion for methods of text analytics, writing, search, user interfaces |
US8994666B2 (en) * | 2009-12-23 | 2015-03-31 | Colin J. Karpfinger | Tactile touch-sensing interface system |
CN101778147B (en) | 2009-12-29 | 2013-08-21 | 大唐微电子技术有限公司 | Menu display method and communication intelligent card |
JP5413673B2 (en) * | 2010-03-08 | 2014-02-12 | ソニー株式会社 | Information processing apparatus and method, and program |
US20110296333A1 (en) * | 2010-05-25 | 2011-12-01 | Bateman Steven S | User interaction gestures with virtual keyboard |
US8645866B2 (en) | 2010-06-29 | 2014-02-04 | Exelis Inc. | Dynamic icon overlay system and method of producing dynamic icon overlays |
US8473289B2 (en) | 2010-08-06 | 2013-06-25 | Google Inc. | Disambiguating input based on context |
CN102479028A (en) | 2010-11-24 | 2012-05-30 | 上海三旗通信科技股份有限公司 | Method for realizing intelligent standby function |
US8982045B2 (en) | 2010-12-17 | 2015-03-17 | Microsoft Corporation | Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device |
JP2012256099A (en) * | 2011-06-07 | 2012-12-27 | Sony Corp | Information processing terminal and method, program, and recording medium |
US20130050131A1 (en) * | 2011-08-23 | 2013-02-28 | Garmin Switzerland Gmbh | Hover based navigation user interface control |
KR20130023954A (en) * | 2011-08-30 | 2013-03-08 | 삼성전자주식회사 | Apparatus and method for changing icon in portable terminal |
JP2013084233A (en) * | 2011-09-28 | 2013-05-09 | Kyocera Corp | Device, method, and program |
US9189252B2 (en) | 2011-12-30 | 2015-11-17 | Microsoft Technology Licensing, Llc | Context-based device action prediction |
USD696266S1 (en) | 2012-01-19 | 2013-12-24 | Pepsico, Inc. | Display screen with graphical user interface |
US9685160B2 (en) | 2012-04-16 | 2017-06-20 | Htc Corporation | Method for offering suggestion during conversation, electronic device using the same, and non-transitory storage medium |
US20130332512A1 (en) * | 2012-06-10 | 2013-12-12 | Apple Inc. | Creating and publishing image streams |
US9684398B1 (en) | 2012-08-06 | 2017-06-20 | Google Inc. | Executing a default action on a touchscreen device |
US9189064B2 (en) | 2012-09-05 | 2015-11-17 | Apple Inc. | Delay of display event based on user gaze |
KR102045841B1 (en) | 2012-10-09 | 2019-11-18 | 삼성전자주식회사 | Method for creating an task-recommendation-icon in electronic apparatus and apparatus thereof |
USD711395S1 (en) | 2012-11-02 | 2014-08-19 | Bank Of America Corporation | Display screen for a communication device |
US10386992B2 (en) | 2012-12-06 | 2019-08-20 | Samsung Electronics Co., Ltd. | Display device for executing a plurality of applications and method for controlling the same |
KR20140111495A (en) | 2013-03-11 | 2014-09-19 | 삼성전자주식회사 | Method for controlling display and an electronic device thereof |
KR102164454B1 (en) | 2013-03-27 | 2020-10-13 | 삼성전자주식회사 | Method and device for providing a private page |
US20140372896A1 (en) | 2013-06-14 | 2014-12-18 | Microsoft Corporation | User-defined shortcuts for actions above the lock screen |
CN104346024A (en) | 2013-07-23 | 2015-02-11 | 北京千橡网景科技发展有限公司 | Method and device for selecting shortcuts |
USD740303S1 (en) | 2013-10-11 | 2015-10-06 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD747352S1 (en) | 2013-12-09 | 2016-01-12 | Lg Electronics Inc. | Display screen of a television receiver with a graphical user interface |
USD744505S1 (en) | 2014-01-10 | 2015-12-01 | Aliphcom | Display screen or portion thereof with graphical user interface |
USD788785S1 (en) | 2014-04-11 | 2017-06-06 | Johnson Controls Technology Company | Display having a graphical user interface |
US20150379558A1 (en) | 2014-06-26 | 2015-12-31 | Celtra Inc. | Detecting unintentional user input to mobile advertisements |
USD760773S1 (en) | 2014-08-29 | 2016-07-05 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
USD761812S1 (en) | 2014-09-30 | 2016-07-19 | Salesforce.Com, Inc. | Display screen or portion thereof with animated graphical user interface |
CN104598109A (en) | 2015-01-08 | 2015-05-06 | 天津三星通信技术研究有限公司 | Method and equipment for previewing application in portable terminal |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
USD783676S1 (en) | 2015-06-18 | 2017-04-11 | Samsung Electronics Co., Ltd | Display screen or portion thereof with animated graphical user interface |
US10747554B2 (en) | 2016-03-24 | 2020-08-18 | Google Llc | Contextual task shortcuts |
US9965530B2 (en) | 2016-04-20 | 2018-05-08 | Google Llc | Graphical keyboard with integrated search features |
US10261666B2 (en) | 2016-05-31 | 2019-04-16 | Microsoft Technology Licensing, Llc | Context-independent navigation of electronic content |
USD823862S1 (en) | 2016-07-13 | 2018-07-24 | Google Llc | Display screen with graphical user interface |
US20200057541A1 (en) | 2017-12-22 | 2020-02-20 | Google Llc | Dynamically generated task shortcuts for user interactions with operating system user interface elements |
US10922715B2 (en) | 2019-04-26 | 2021-02-16 | Criteo Sa | Dynamically modifying activation behavior of a computerized graphical advertisement display |
- 2012
  - 2012-08-06 US US13/567,828 patent/US9684398B1/en active Active
- 2017
  - 2017-05-05 US US15/587,587 patent/US20170235479A1/en not_active Abandoned
- 2019
  - 2019-10-17 US US16/656,297 patent/US11243683B2/en active Active
- 2021
  - 2021-12-23 US US17/561,015 patent/US11599264B2/en active Active
- 2023
  - 2023-02-24 US US18/174,503 patent/US11789605B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110169760A1 (en) * | 2008-09-22 | 2011-07-14 | Stantum | Device for control of electronic apparatus by manipulation of graphical objects on a multicontact touch screen |
US20110066976A1 (en) * | 2009-09-15 | 2011-03-17 | Samsung Electronics Co., Ltd. | Function executing method and apparatus for mobile terminal |
US20120297041A1 (en) * | 2011-05-20 | 2012-11-22 | Citrix Systems, Inc. | Shell Integration on a Mobile Device for an Application Executing Remotely on a Server |
US20140298244A1 (en) * | 2013-03-26 | 2014-10-02 | Samsung Electronics Co., Ltd. | Portable device using touch pen and application control method using the same |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11243683B2 (en) | 2012-08-06 | 2022-02-08 | Google Llc | Context based gesture actions on a touchscreen |
US11599264B2 (en) | 2012-08-06 | 2023-03-07 | Google Llc | Context based gesture actions on a touchscreen |
US11789605B2 (en) | 2012-08-06 | 2023-10-17 | Google Llc | Context based gesture actions on a touchscreen |
Also Published As
Publication number | Publication date |
---|---|
US9684398B1 (en) | 2017-06-20 |
US11243683B2 (en) | 2022-02-08 |
US20200050358A1 (en) | 2020-02-13 |
US11789605B2 (en) | 2023-10-17 |
US20220113864A1 (en) | 2022-04-14 |
US20230221860A1 (en) | 2023-07-13 |
US11599264B2 (en) | 2023-03-07 |
Similar Documents
Publication | Title |
---|---|
US11789605B2 (en) | Context based gesture actions on a touchscreen |
US20150199082A1 (en) | Displaying actionable items in an overscroll area |
US9195368B2 (en) | Providing radial menus with touchscreens |
US9261989B2 (en) | Interacting with radial menus for touchscreens |
US20210311597A1 (en) | Multi-spatial overview mode |
US10437425B2 (en) | Presenting a menu at a mobile device |
US8451246B1 (en) | Swipe gesture classification |
US20140164989A1 (en) | Displaying windows on a touchscreen device |
US20180260085A1 (en) | Autofill user interface for mobile device |
US10409420B1 (en) | Touch interpretation for displayed elements |
US20150153949A1 (en) | Task selections associated with text inputs |
US9740393B2 (en) | Processing a hover event on a touchscreen device |
US20150220151A1 (en) | Dynamically change between input modes based on user input |
US9335905B1 (en) | Content selection feedback |
US20150205516A1 (en) | System and method for processing touch input |
US9606720B1 (en) | System and method for providing a preview of a digital photo album |
US20130265237A1 (en) | System and method for modifying content display size |
US20150199083A1 (en) | Consolidated system tray |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: GOOGLE INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAMUEL,FADY;JAIN,VARUN;REEL/FRAME:042270/0476; Effective date: 20120803 |
| AS | Assignment | Owner name: GOOGLE LLC, CALIFORNIA; Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044129/0001; Effective date: 20170929 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |