GB2623597A - Configuration of a touch-sensing device - Google Patents
Configuration of a touch-sensing device
- Publication number
- GB2623597A GB2623597A GB2218164.8A GB202218164A GB2623597A GB 2623597 A GB2623597 A GB 2623597A GB 202218164 A GB202218164 A GB 202218164A GB 2623597 A GB2623597 A GB 2623597A
- Authority
- GB
- United Kingdom
- Prior art keywords
- touch
- sensing device
- configuring
- computing device
- application
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
Configuring a touch-sensing device operable with a computing device comprises determining a running application processing inputs received from the touch-sensing device, selecting a dataset for configuring the touch-sensing device depending on the determined running application and loading the selected dataset from a storage storing several such datasets, and configuring the touch-sensing device with the selected dataset. Each dataset contains configuration parameters for associating touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device. At least one configuration parameter of at least one dataset is user configurable. Associations may be applied for a single application or system-wide across multiple applications. The dataset may additionally be selected based on the current user. The touch-sensing device may be pre-configured, e.g. prior to the running application being determined. There may also be predefined gestures that are not user configurable. A configuration application may allow the user to configure the datasets, e.g. to associate gestures with actions for a single application or multiple applications. A device may comprise a touch-sensing device (e.g. a capacitive multi-touch-sensing device) and a computing device, which may be integrated with the touch-sensing device.
Description
CONFIGURATION OF A TOUCH-SENSING DEVICE
TECHNICAL FIELD
This specification relates to a configuration of a touch-sensing device, particularly to a configuration of one or more touch gestures to be detected by a touch-sensing device.
BACKGROUND
A touch-sensing device such as a touchpad or touchscreen may be implemented to detect certain touch inputs made by users of the device. Touch-sensing devices are commonly implemented using capacitive sensing technology, which may comprise a capacitive touch surface such as a capacitive touchpad or touchscreen and a controller for processing touch inputs detected on the touch surface often by means of some digital signal processing technology.
Touch inputs may comprise gestures made by a user when touching a touch surface of a touch-sensing device with one or more fingers. Gestures are widely used in HMI (Human Machine Interface) devices, for example, to perform control actions in industrial environments.
Gestures made with two or more fingers are called multi-finger gestures and require a multi-touch-sensing device. Gestures may help to minimize the actions that a user must make for performing certain input commands with a touch-sensing device. For example, additional functionality such as selecting text, zooming in and out in pictures, moving selected text, and rotating images may be assigned to system-specific predefined gestures.
Typical predefined gestures may be for example a tap gesture, i.e. when a user briefly touches a touch surface of a touch-sensing device with a fingertip, a double tap gesture, i.e. when a user rapidly touches the touch surface twice with fingertips, a drag gesture, i.e. when a user moves a fingertip over the touch surface without losing contact, a flick gesture, i.e. when a user quickly brushes the touch surface with a fingertip, a pinch gesture and spread gesture, i.e. when a user touches the touch surface with two fingers and brings them closer together or moves them apart, respectively, a press gesture and a press and tap gesture, i.e. when a user presses the touch surface with one finger for an extended period of time or additionally briefly touches the touch surface with a second finger, respectively, a press and drag gesture, i.e. when a user presses the touch surface with one finger and moves a second finger over the surface without losing contact, and a rotate gesture, i.e. when a user touches the touch surface with two fingers and moves them in a clockwise or counterclockwise direction.
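For illustration only, the predefined gestures listed above could be catalogued as an enumeration; the names and the single-/multi-touch split below are assumptions for this sketch, not taken from the patent:

```python
from enum import Enum, auto

class Gesture(Enum):
    """Hypothetical catalogue of the predefined gestures described above."""
    TAP = auto()             # brief touch with one fingertip
    DOUBLE_TAP = auto()      # two rapid touches
    DRAG = auto()            # fingertip moved without losing contact
    FLICK = auto()           # quick brush over the surface
    PINCH = auto()           # two fingers brought closer together
    SPREAD = auto()          # two fingers moved apart
    PRESS = auto()           # extended one-finger press
    PRESS_AND_TAP = auto()   # press plus brief touch of a second finger
    PRESS_AND_DRAG = auto()  # press plus drag of a second finger
    ROTATE = auto()          # two fingers moved (counter)clockwise

# Gestures requiring a multi-touch-sensing device (two or more fingers).
MULTI_TOUCH = {Gesture.PINCH, Gesture.SPREAD, Gesture.PRESS_AND_TAP,
               Gesture.PRESS_AND_DRAG, Gesture.ROTATE}
```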
The US patent US10551984B2 describes methods, devices, and non-transitory processor-readable media of various embodiments that may enable contextual operation of a mobile computing device including a capacitive input sensor, which may be a rear area capacitive input sensor. In various embodiments, a processor of a mobile computing device including a rear area capacitive input sensor may monitor sensor measurements and generate an interaction profile based on the sensor measurements. The processor of the mobile computing device may determine whether the interaction profile is inconsistent with in-hand operation and may increase the sensitivity of the capacitive input sensor in response to determining that the interaction profile is inconsistent with in-hand operation.

The US patent US9710151B2 describes systems and methods, which are provided for evaluating the quality of automatically composed digital content based on non-intentional user feedback obtained through a haptic interface. For example, a method includes accessing non-intentional user feedback collected by a haptic interface executing on a computing device, wherein the non-intentional user feedback comprises information regarding user interaction with elements of digital content rendered by the computing device. The digital content is content that is automatically generated using content generation rules. The method further includes evaluating a quality of the digital content based on non-intentional user feedback and generating an evaluation report that includes information regarding the quality of the digital content.
SUMMARY
This specification describes solutions for a configuration of a touch-sensing device so that a user may for example configure a specific gesture to perform different actions in different applications. Thus, the described solutions enable a more versatile use of a touch-sensing device by allowing users to determine their preferred touch-sensing device configuration instead of a system-wide configuration, which is the same for all users of the touch-sensing device and over which users have no control. The configuration of the touch-sensing device as described herein particularly relates to gesture configuration, and more particularly to the association of gestures with actions to be performed by applications being executed on a computing device. A user may particularly configure the association of gestures with actions depending on the applications running on or being executed by the computing device. For example, a rotate gesture may be associated with different actions depending on the application running on a computing device, such as rotating an image in a drawing application and controlling the speed of an electric motor in a control application in an industrial environment.
According to an aspect of this specification, a method for configuring a touch-sensing device is provided, wherein the method comprises a) determining, by a computing device that is operably coupled with the touch-sensing device, a running application processing inputs received from the touch-sensing device, b) selecting, by the computing device, a dataset for configuring the touch-sensing device depending on the determined running application and loading the selected dataset from a storage storing several datasets for configuring the touch-sensing device; and c) configuring, by the computing device, the touch-sensing device with the selected dataset; wherein each dataset for configuring the touch-sensing device contains one or more configuration parameters for associating one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device; and wherein at least one of the one or more configuration parameters of at least one of the datasets for configuring the touch-sensing device is user configurable.
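Steps a) to c) can be sketched as follows. This is a minimal illustration; the function name, the application names, and the gesture-to-action mappings are invented for the example and do not come from the patent:

```python
# Storage of several datasets, keyed by application name; each dataset
# holds configuration parameters mapping gestures to actions.
DATASET_STORAGE = {
    "drawing_app": {"rotate": "rotate_image"},
    "motor_control_app": {"rotate": "set_motor_speed"},
}

def configure_touch_device(running_app, storage=DATASET_STORAGE):
    """Select and load the dataset matching the determined running
    application (steps b), then configure the device with it (step c).
    Here 'configuring' is represented by simply returning the mapping."""
    dataset = storage.get(running_app)
    if dataset is None:
        raise KeyError(f"no configuration dataset for {running_app!r}")
    return dataset

# The same gesture resolves to different actions in different applications.
drawing_config = configure_touch_device("drawing_app")
motor_config = configure_touch_device("motor_control_app")
```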
The method enables an application-dependent configuration of a touch-sensing device. For example, a user may configure different associations of a gesture for different applications. Thus, a system-wide configuration of a touch-sensing device can be changed to an application-specific user configuration, which allows more flexibility for touch-sensing devices. A running application processing inputs received from the touch-sensing device may particularly comprise a computer program designed to carry out a specific task other than one relating to the operation of the computer itself, such as an application program typically used by end-users (for example a control program for devices controlled via the touch-sensing device as user interface, or standard application programs like word processors or media players), or utility software such as a program for controlling and modifying settings of a computer or computing device.
In an embodiment, associations contained in the one or more configuration parameters are provided for configuring the touch-sensing device system-wide and/or application-wide, wherein system-wide means that an association of one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device is configured for several applications and application-wide means that an association of one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device is configured for a single application. Thus, touch gestures may be configured by users to configure the touch-sensing device on a system-level, i.e., touch gestures which can be used in several or even all applications, and/or on an application-level, i.e., touch gestures which can be used in only one application. The one or more configuration parameters may be for example defined with a configuration program provided by the computing device allowing users to set up the configuration parameters as desired.
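The two scopes could be resolved as in the following sketch. The structure and the precedence rule (application-wide entries consulted before system-wide ones) are assumptions for illustration; the patent does not prescribe a resolution order:

```python
# System-wide associations apply in several (here: all) applications.
SYSTEM_WIDE = {"double_tap": "open_application"}

# Application-wide associations apply in a single application only.
APP_WIDE = {"paint": {"rotate": "rotate_image"}}

def resolve_action(gesture, app):
    """Look up an application-wide association first and fall back to the
    system-wide one; returns None when the gesture is not configured."""
    return APP_WIDE.get(app, {}).get(gesture, SYSTEM_WIDE.get(gesture))
```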
In an embodiment, the method may further comprise determining, by the computing device, a user profile and selecting the dataset for configuring the touch-sensing device depending on the determined running application and the determined user profile.
Thus, the configuration of the touch-sensing device may be connected with a user profile, which may be helpful in multi-user environments since a specific user configuration of the touch-sensing device may be automatically used when the respective user logs into his account for example in the computing device.
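Keying the dataset selection on both the user profile and the running application might look like the following sketch (user names, application names, and the fallback behavior are invented for illustration):

```python
# Datasets keyed by (user profile, running application).
DATASETS = {
    ("alice", "cad_app"): {"pinch": "zoom"},
    ("bob", "cad_app"): {"pinch": "select"},
}

# A predefined configuration used when no user-specific dataset exists.
DEFAULT = {"pinch": "zoom"}

def select_dataset(user, app, datasets=DATASETS, default=DEFAULT):
    """Select the dataset for the determined running application and the
    determined user profile; fall back to a predefined configuration."""
    return datasets.get((user, app), default)
```

In a multi-user environment, this lookup would run automatically when the respective user logs into their account, as described above.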
In a further embodiment, the method may further comprise determining, by the computing device, a predefined configuration of the touch-sensing device and configuring the touch-sensing device with the predefined configuration. The predefined configuration may for example define standard associations of gestures with actions, which may then be adapted according to user preferences.
In a yet further embodiment, the configuring of the touch-sensing device with the predefined configuration may be performed before step a). This may ensure that the touch-sensing device is configured even if no user configuration is available.
In a yet further embodiment, the method may further comprise providing, by the computing device, a configuration application for configuring the at least one of the datasets for configuring the touch-sensing device by a user. The configuration application may be for example implemented with a graphical user interface to easily allow the user to configure associations of gestures with actions according to his preferences. In an embodiment, the configuration application may enable users to configure for each gesture to be detected by the touch-sensing device an association with an action to be performed by the computing device and to define for each association whether it is configured for several applications and/or a single application.
In a still further embodiment, the one or more touch gestures may comprise single- and/or multi-touch gestures. Single-touch gestures comprise gestures requiring only a touch input from a single finger, such as tap, double tap, drag, flick, and press, while multi-touch gestures require touch inputs from several fingers, such as pinch, spread, press and tap, press and drag, and rotate.
According to a further aspect of this specification, a device is described, which comprises a touch-sensing device and a computing device operably coupled with the touch-sensing device, wherein the computing device is configured to perform a method as described herein. The computing device may comprise one or more processors and one or more storages storing a program comprising instructions for execution by the one or more processors and configuring the one or more processors to perform a method as described herein.
In an embodiment, the touch-sensing device may be one of the following: a touchscreen; a touchpad.
In a further embodiment, the touch-sensing device may be a multi-touch-sensing device; particularly a capacitive multi-touch-sensing device.
In a yet further embodiment, the computing device may be integrated in the touch-sensing device. For example, the computing device may be implemented as a controller comprising one or more processors and one or more storages, particularly non-volatile memories such as ROM, EPROM, EEPROM, Flash-ROM, storing a program comprising instructions for execution by the one or more processors and configuring the one or more processors to perform a method as described herein.
The details of more embodiments are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the
description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS
Fig. 1 shows a flowchart of an embodiment of a method for configuring a touch-sensing device; Fig. 2 shows an embodiment of a touch-sensing model application; Fig. 3 shows a flowchart of a further embodiment of a method for configuring a touch-sensing device; and Fig. 4 shows an embodiment of a gesture configuration application.
DETAILED DESCRIPTION
In the following, functionally similar or identical elements may have the same reference numerals. Absolute values are shown below by way of example only and should not be construed as limiting.
Fig. 1 shows a flowchart of an embodiment of a method for configuring a touch-sensing device. Optional steps are shown in dashed lines in Fig. 1. The method may be applied to a touch-sensing model application as shown in Fig. 2, which comprises a touch screen HMI (Human Machine Interface) panel, for example, a capacitive multi-touch display, for displaying information, a touch screen gesture application, i.e. an application configured to process touch gestures received via the touch screen HMI panel, and a gesture configuration mapping storage for storing one or more datasets for configuring the touch screen HMI panel. As shown in Fig. 2, the one or more datasets may form a gesture map comprising configuration parameters associating different touch gestures with (actions of) applications. In the gesture map from Fig. 2, applications App 1 and App 2 are associated with Gesture 1, applications App 3 and App 4 with Gesture 6, and application App 5 with Gesture 2. When a touch gesture is received as input from the touch screen HMI panel, the touch screen gesture application, which is currently running, processes the gesture map with the running application and reverts with an action to be performed by the running application and being associated with the touch gesture in the gesture map.
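The gesture map of Fig. 2 can be modelled as a simple lookup table from application to associated gesture; the check below is a sketch of how the running touch screen gesture application might decide whether a detected gesture triggers an action (the data mirrors Fig. 2, the function is an assumption):

```python
# Gesture map as in Fig. 2: each application and its associated gesture.
GESTURE_MAP = {
    "App 1": "Gesture 1",
    "App 2": "Gesture 1",
    "App 3": "Gesture 6",
    "App 4": "Gesture 6",
    "App 5": "Gesture 2",
}

def gesture_triggers_action(running_app, detected_gesture,
                            gesture_map=GESTURE_MAP):
    """Return True when the detected gesture is associated with the
    currently running application in the gesture map, i.e. the running
    application should perform the associated action."""
    return gesture_map.get(running_app) == detected_gesture
```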
The method of Fig. 1 may be executed as part of middleware software executed by a computing device and may enhance the input subsystem of a kernel of an operating system of the computing device. The computing device may be operably coupled with a touch-sensing device such as the touch display panel. The touch-sensing device and the computing device may form at least a part of an HMI, for example, an HMI of an industrial control system.
As an optional step 6 after starting the method, a predefined configuration of the touch-sensing device may be determined, for example, a system-wide configuration of touch gestures, and the touch-sensing device may be configured with this predefined configuration.
As a further optional step 8, a user profile may be determined, if any exists. For example, the user profile may be determined based on the currently logged-in user of the computing device, if a user account management is implemented. It is also possible to actively request from a user an input determining a user profile, for example by displaying on a screen one or more buttons for selecting available user profiles or for entering user profile data.
In step 10, an application running on or executed by the computing device is determined, which processes inputs received from the touch-sensing device. For example, it may be determined that a control program for equipment of an industrial plant is executed by or running on the computing device, which is prepared to receive and process user inputs such as inputs to control machines etc. Depending on the determined running application (and the determined user profile, if any), a dataset for configuring the touch-sensing device is selected in step 12. The selecting may be for example performed by retrieving from a database storing configuration datasets for the touch-sensing device and assignments of applications executable by the computing device to the configuration datasets.
In step 14, a selected dataset is then loaded from a storage storing several datasets for configuring the touch-sensing device, for example, a database containing the several datasets for configuring the touch-sensing device.
The touch-sensing device is then configured in step 16 with the selected and loaded dataset.
The method of Fig. 1 may be continuously running on or executed by the computing device in that after step 16 the method continues with either optional step 8 or step 10.
A dataset for configuring the touch-sensing device contains one or more configuration parameters for associating one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device. For example, gestures such as (double) tap, drag, flick, pinch and spread, press (and tap), press and drag, and rotate, may be associated with specific actions to be performed by the computing device, and these associations may be contained in the configuration parameters. The associations may be predefined system-wide and/or application-wide, for example with a predefined configuration as in optional step 6. The predefined configuration may be an initial configuration, which is like the system-wide configurations of touch-sensing devices known in the art, which are not user configurable.
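Such a dataset of configuration parameters could be serialized for storage, for example as JSON; the field names and the `user_configurable` flag are a hypothetical format for this sketch, as the patent does not prescribe one:

```python
import json

# A hypothetical dataset: configuration parameters for one application,
# each associating a gesture with an action; some are user configurable.
dataset = {
    "application": "motor_control",
    "parameters": [
        {"gesture": "rotate", "action": "set_motor_speed",
         "user_configurable": True},
        {"gesture": "double_tap", "action": "emergency_stop",
         "user_configurable": False},
    ],
}

# Round-trip through the storage representation.
serialized = json.dumps(dataset)
restored = json.loads(serialized)
```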
At least one of the one or more configuration parameters of at least one of the datasets for configuring the touch-sensing device is user configurable, which means that a user can change the association of a specific gesture to a specific action depending on the running application. For example, a user may configure a configuration parameter determining that the rotate gesture is associated with a certain action in a certain application, such as controlling the speed of an electric motor in a control application of the motor.
Fig. 3 shows in a flowchart the processing of a gesture detected with a touch-sensing device (input panel). A touch application (application with touch input) running on a computing device receives the gesture and identifies the received gesture, for example as a tap. The touch application checks the identified gesture with a user configuration of the touch-sensing device for the touch application, particularly which action is associated with the identified gesture in the touch application. The touch application sends the identified gesture and associated action to the touch library stored in a memory of the computing device. The touch library then executes the action associated with the identified gesture with the processor of the computing device. The processor may control a display to output on a GUI (Graphical User Interface) information on the identified gesture and/or the associated action.
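The processing chain of Fig. 3 — identify the gesture, resolve the configured action, execute it — can be sketched as follows. The function names and the callable-based "touch library" are assumptions for this illustration:

```python
def identify_gesture(raw_event):
    """Stand-in for the touch application's gesture identification."""
    return raw_event.get("type")

def process_touch_event(raw_event, user_config, touch_library):
    """Identify the gesture, look up its configured action, and have the
    'touch library' (here: a dict of callables) execute it."""
    gesture = identify_gesture(raw_event)
    action_name = user_config.get(gesture)
    if action_name is None:
        return None  # gesture not configured for this touch application
    return touch_library[action_name]()

executed = []
result = process_touch_event(
    {"type": "tap"},
    user_config={"tap": "select"},
    touch_library={"select": lambda: executed.append("select") or "selected"},
)
```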
Fig. 4 shows a gesture control application with user profiles. The gesture control application serves to configure the specific associations of gestures to actions.
Particularly, the gesture control application serves as a configuration program enabling users to configure, for each gesture to be detected by the touch-sensing device, an association with an action to be performed by the computing device, system-wide and/or application-wide. In Fig. 4, profile 1 contains application-specific configurations for gesture 2 and a user-defined gesture with the association "rotate 180", whereas gestures 1, 3, and 4 carry the system-wide associations "open application", "screen lock", and "voice assist". Profile 2 contains application-specific configurations for gestures 1, 2, and 3 with the association "rotate 180", whereas gesture 4 and the user-defined gesture carry the system-wide associations "open application", "screen lock", and "open system settings". User 1 as well as user 2 can configure the gesture behavior with the control application by selecting one or more applications and configuring the configuration parameters for associating the gestures to be detected by the touch-sensing device with actions to be performed by the computing device as shown in Fig. 4.
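The per-user profiles of Fig. 4 could be stored as plain data; the structure below and the exact pairing of gestures to actions are assumptions loosely adapted from the figure:

```python
# Profiles loosely adapted from Fig. 4; app-specific entries shadow
# system-wide entries within a profile.
PROFILES = {
    "profile 1": {
        "app_specific": {"gesture 2": "rotate 180",
                         "user-defined": "rotate 180"},
        "system_wide": {"gesture 1": "open application",
                        "gesture 3": "screen lock",
                        "gesture 4": "voice assist"},
    },
    "profile 2": {
        "app_specific": {"gesture 1": "rotate 180",
                         "gesture 2": "rotate 180",
                         "gesture 3": "rotate 180"},
        "system_wide": {"gesture 4": "open application",
                        "user-defined": "open system settings"},
    },
}

def lookup(profile, gesture):
    """Resolve a gesture within a profile, preferring the app-specific
    association over the system-wide one (an assumed precedence rule)."""
    p = PROFILES[profile]
    return p["app_specific"].get(gesture, p["system_wide"].get(gesture))
```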
Claims (10)
- CLAIMS
- 1. A method for configuring a touch-sensing device comprising a) determining (10), by a computing device that is operably coupled with the touch-sensing device, a running application processing inputs received from the touch-sensing device, b) selecting (12), by the computing device, a dataset for configuring the touch-sensing device depending on the determined running application and loading (14) the selected dataset from a storage storing several datasets for configuring the touch-sensing device, and c) configuring (16), by the computing device, the touch-sensing device with the selected dataset, wherein each dataset for configuring the touch-sensing device contains one or more configuration parameters for associating one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device; and wherein at least one of the one or more configuration parameters of at least one of the datasets for configuring the touch-sensing device is user configurable.
- 2. The method of claim 1, wherein associations contained in the one or more configuration parameters are provided for configuring the touch-sensing device system-wide and/or application-wide, wherein system-wide means that an association of one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device is configured for several applications and application-wide means that an association of one or more touch gestures to be detected by the touch-sensing device with actions to be performed by the computing device is configured for a single application.
- 3. The method of claim 1 or 2, further comprising determining (8), by the computing device, a user profile and selecting (12) the dataset for configuring the touch-sensing device depending on the determined running application and the determined user profile.
- 4. The method of claim 1, 2 or 3, further comprising determining (6), by the computing device, a predefined configuration of the touch-sensing device and configuring the touch-sensing device with the predefined configuration.
- 5. The method of claim 4, wherein the configuring (6) of the touch-sensing device with the predefined configuration is performed before step a).
- 6. The method of any preceding claim, further comprising providing, by the computing device, a configuration application for configuring the at least one of the datasets for configuring the touch-sensing device by a user.
- 7. The method of claim 6, wherein the configuration application enables users to configure for each gesture to be detected by the touch-sensing device an association with an action to be performed by the computing device and to define for each association whether it is configured for several applications and/or a single application.
- 8. A device comprising a touch-sensing device and a computing device operably coupled with the touch-sensing device, wherein the computing device is configured to perform a method of any preceding claim.
- 9. The device of claim 8, wherein the touch-sensing device is a multi-touch-sensing device, particularly a capacitive multi-touch-sensing device.
- 10. The device of claim 8 or 9, wherein the computing device is integrated in the touch-sensing device.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202211059685 | 2022-10-19 |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202218164D0 GB202218164D0 (en) | 2023-01-18 |
GB2623597A true GB2623597A (en) | 2024-04-24 |
Family
ID=84926495
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2218164.8A Pending GB2623597A (en) | 2022-10-19 | 2022-12-02 | Configuration of a touch-sensing device |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2623597A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130120279A1 (en) * | 2009-11-20 | 2013-05-16 | Jakub Plichta | System and Method for Developing and Classifying Touch Gestures |
EP2685367A2 (en) * | 2012-07-09 | 2014-01-15 | Samsung Electronics Co., Ltd | Method and apparatus for operating additional function in mobile device |
US9063576B1 (en) * | 2013-04-04 | 2015-06-23 | Amazon Technologies, Inc. | Managing gesture input information |
US20170147176A1 (en) * | 2015-11-23 | 2017-05-25 | Google Inc. | Recognizing gestures and updating display by coordinator |
Also Published As
Publication number | Publication date |
---|---|
GB202218164D0 (en) | 2023-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7181697B2 (en) | Method of implementing a plurality of system tray areas | |
US10133396B2 (en) | Virtual input device using second touch-enabled display | |
EP3660670B1 (en) | Gesture recognizers for controlling and modifying complex gesture recognition | |
US9501175B2 (en) | Techniques and apparatus for managing touch interface | |
US20150153897A1 (en) | User interface adaptation from an input source identifier change | |
EP3491506B1 (en) | Systems and methods for a touchscreen user interface for a collaborative editing tool | |
US9128548B2 (en) | Selective reporting of touch data | |
TWI512601B (en) | Electronic device, controlling method thereof, and computer program product | |
US11150797B2 (en) | Method and device for gesture control and interaction based on touch-sensitive surface to display | |
GB2509599A (en) | Identification and use of gestures in proximity to a sensor | |
WO2015088883A1 (en) | Controlling interactions based on touch screen contact area | |
WO2015088882A1 (en) | Resolving ambiguous touches to a touch screen interface | |
KR102198596B1 (en) | Disambiguation of indirect input | |
CN116483246A (en) | Input control method and device, electronic equipment and storage medium | |
US20130135232A1 (en) | Processing method for touch signal and computing device thereof | |
US11755200B2 (en) | Adjusting operating system posture for a touch-enabled computing device based on user input modality signals | |
GB2623597A (en) | Configuration of a touch-sensing device | |
US9791956B2 (en) | Touch panel click action | |
KR20150111651A (en) | Control method of favorites mode and device including touch screen performing the same | |
WO2019024507A1 (en) | Touch control method and device, and terminal | |
KR102296968B1 (en) | Control method of favorites mode and device including touch screen performing the same | |
EP2956839A1 (en) | Methods and systems for multimodal interaction | |
US20170010868A1 (en) | Method for handling user-level events for programming an application | |
CN114939275B (en) | Method, device, equipment and storage medium for object interaction | |
KR102205235B1 (en) | Control method of favorites mode and device including touch screen performing the same |