WO2017047929A1 - Touch screen device capable of executing an event based on a gesture combination and operating method thereof - Google Patents

Touch screen device capable of executing an event based on a gesture combination and operating method thereof

Info

Publication number
WO2017047929A1
WO2017047929A1 PCT/KR2016/008712 KR2016008712W WO2017047929A1 WO 2017047929 A1 WO2017047929 A1 WO 2017047929A1 KR 2016008712 W KR2016008712 W KR 2016008712W WO 2017047929 A1 WO2017047929 A1 WO 2017047929A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch
input
touch inputs
combination
Prior art date
Application number
PCT/KR2016/008712
Other languages
English (en)
Korean (ko)
Inventor
이창일
김홍식
Original Assignee
주식회사 한컴플렉슬
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020150162259A external-priority patent/KR101718070B1/ko
Application filed by 주식회사 한컴플렉슬 filed Critical 주식회사 한컴플렉슬
Priority to US15/529,986 priority Critical patent/US10540088B2/en
Publication of WO2017047929A1 publication Critical patent/WO2017047929A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Embodiments of the present invention relate to a technique that enables a touch screen device to perform various control operations based on a user's touch input.
  • Because smart devices are usually equipped with a touch screen, many applications that provide an interface based on the user's touch input have emerged.
  • For example, such applications provide functions that allow the screen to be scrolled by tracking the movement trajectory of the user's touch input, or enlarged and reduced based on a multi-touch input.
  • However, existing devices equipped with a touch screen are often designed to perform only the operation corresponding to a single touch input applied by the user.
  • In other words, although the user may apply a touch input on the touch screen to control the touch screen device, the functions that can be controlled in this way are limited to simple ones such as scrolling and page turning.
  • If the user could execute the various events provided by the touch screen device simply by applying a specific touch input, much as a shortcut key is used, user convenience would be improved. As described above, however, existing touch screen devices are built to execute only a simple event corresponding to a single touch input or a simple multi-touch, so there is a limit to executing various events based on touch input.
  • According to an embodiment of the present invention, a touch screen device capable of executing an event based on a gesture combination, and a method of operating the same, maintain a command database in which a plurality of predetermined gesture combinations, each formed by sequentially combining two or more gestures, and the commands corresponding to those gesture combinations are stored. When a user enters two or more touch inputs on the touch screen at intervals within a predetermined time, the gesture combination according to the two or more touch inputs is confirmed, the command corresponding to that gesture combination is extracted from the command database, and the event corresponding to the extracted command is executed.
  • According to an embodiment of the present invention, a touch screen device capable of executing an event based on a gesture combination includes: a command database storing a plurality of predetermined gesture combinations (a gesture combination meaning that two or more gestures are sequentially combined) and a command corresponding to each of the plurality of predetermined gesture combinations; a touch checker which, when two or more touch inputs are sequentially input at intervals within a predetermined time, confirms a first gesture combination according to the two or more touch inputs based on the gesture corresponding to each of the two or more touch inputs; a command extractor which extracts a first command stored corresponding to the first gesture combination from the command database; and an event execution unit which executes an event according to the extracted first command.
  • According to an embodiment of the present invention, an operating method of a touch screen device capable of executing an event based on a gesture combination includes: maintaining a command database in which a plurality of predetermined gesture combinations (a gesture combination meaning that two or more gestures are sequentially combined) and a command corresponding to each of the plurality of predetermined gesture combinations are stored; when two or more touch inputs are sequentially input at intervals within a predetermined time, identifying a first gesture combination according to the two or more touch inputs based on the gesture corresponding to each of the two or more touch inputs; extracting a first command stored corresponding to the first gesture combination from the command database; and executing an event according to the extracted first command.
  • Accordingly, a touch screen device capable of executing an event based on a gesture combination and a method of operating the same maintain a command database in which a plurality of predetermined gesture combinations, each formed by sequentially combining two or more gestures, and their corresponding commands are stored, and when a user enters two or more touch inputs on the touch screen at intervals within a predetermined time, the gesture combination according to the two or more touch inputs is confirmed, the corresponding command is extracted from the command database, and the corresponding event is executed, thereby supporting the execution of various events based on touch input.
  • FIG. 1 illustrates a structure of a touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention.
  • FIG. 2 is a view for explaining an operation of a touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a method of operating a touch screen device capable of executing an event based on a gesture combination.
  • FIG. 1 illustrates a structure of a touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention.
  • Referring to FIG. 1, the touch screen device 110 capable of executing an event based on a gesture combination according to an embodiment of the present invention includes a command database 111, a touch checker 112, a command extractor 113, and an event execution unit 114.
  • The command database 111 stores a plurality of predetermined gesture combinations and a command corresponding to each of the plurality of predetermined gesture combinations. For example, information may be stored in the command database 111 as shown in Table 1 below.
  • Table 1:

    Gesture combination   | Gesture combination method         | Command
    Gesture combination 1 | Gesture 1 + Gesture 2              | Command 1
    Gesture combination 2 | Gesture 3 + Gesture 4 + Gesture 5  | Command 2
    Gesture combination 3 | Gesture 1 + Gesture 2 + Gesture 4  | Command 3
    ...                   | ...                                | ...
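  • To make the structure of Table 1 concrete, the following sketch models the command database as a mapping from an ordered sequence of gestures to a command. The gesture and command labels are the placeholders from Table 1; the variable and function names are illustrative assumptions, not identifiers from the patent.

```python
# Minimal sketch of the command database in Table 1.
# A gesture combination is an ordered tuple of gesture labels; the value is the stored command.
COMMAND_DATABASE = {
    ("gesture 1", "gesture 2"): "command 1",               # gesture combination 1
    ("gesture 3", "gesture 4", "gesture 5"): "command 2",  # gesture combination 2
    ("gesture 1", "gesture 2", "gesture 4"): "command 3",  # gesture combination 3
}

def lookup_command(gesture_sequence):
    """Return the command stored for a gesture combination, or None if it is not registered."""
    return COMMAND_DATABASE.get(tuple(gesture_sequence))
```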
  • Here, a gesture combination means that two or more gestures, such as a gesture of drawing a circle, a gesture of moving from left to right, or a gesture of moving from bottom to top, are combined in sequence.
  • In Table 1, "gesture combination 1" means a gesture combination in which "gesture 1" and "gesture 2" are sequentially combined, "gesture combination 2" means a gesture combination in which "gesture 3", "gesture 4", and "gesture 5" are sequentially combined, and "gesture combination 3" means a gesture combination in which "gesture 1", "gesture 2", and "gesture 4" are sequentially combined.
  • When two or more touch inputs are sequentially input on the touch screen at intervals within a predetermined time, the touch checker 112 confirms a first gesture combination according to the two or more touch inputs based on the gesture corresponding to each of the two or more touch inputs.
  • The command extractor 113 then extracts a first command stored corresponding to the first gesture combination from the command database 111, and the event execution unit 114 executes an event according to the extracted first command.
  • For example, when "touch input 1" and "touch input 2" are sequentially input on the touch screen at intervals within the predetermined time, the touch checker 112 may confirm the first gesture combination according to "touch input 1" and "touch input 2" based on the gestures corresponding to "touch input 1" and "touch input 2".
  • If the gesture corresponding to "touch input 1" is "gesture 1" and the gesture corresponding to "touch input 2" is "gesture 2", the touch checker 112 may confirm the gesture combination in which "gesture 1" and "gesture 2" are sequentially combined as the first gesture combination according to "touch input 1" and "touch input 2".
  • In this case, the command extractor 113 may extract "command 1", stored in the command database 111 in correspondence with "gesture combination 1" (the gesture combination in which "gesture 1" and "gesture 2" are sequentially combined) as shown in Table 1 above, as the first command.
  • The event execution unit 114 then executes the event according to the extracted "command 1", so that the touch screen device 110 capable of executing an event based on a gesture combination performs the operation corresponding to "command 1".
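  • As a rough illustration of this flow, the sketch below strings the three roles together: collecting the gesture recognized for each touch input, looking up the stored command once the combination is complete, and running the associated event. The class, method, and parameter names are illustrative assumptions rather than identifiers from the patent, and the timing that triggers finalize_combination (no further touch input within the predetermined time) is left to the caller. With the Table 1 mapping above and a handler registered under "command 1", entering "gesture 1" followed by "gesture 2" and then letting the input window expire would run that handler.

```python
class GestureCombinationDevice:
    """Rough sketch of the touch checker / command extractor / event execution flow."""

    def __init__(self, command_database, event_handlers):
        self.command_database = command_database  # gesture-combination tuple -> command name
        self.event_handlers = event_handlers      # command name -> callable that runs the event
        self.pending_gestures = []                # gestures from the touch inputs entered so far

    def on_touch_gesture(self, gesture):
        # Touch checker: collect the gesture recognized for each consecutive touch input.
        self.pending_gestures.append(gesture)

    def finalize_combination(self):
        # Assumed to be called once no further touch input arrives within the predetermined time.
        combination = tuple(self.pending_gestures)
        self.pending_gestures.clear()
        # Command extractor: look up the command stored for this gesture combination.
        command = self.command_database.get(combination)
        if command is None:
            return
        # Event execution unit: run the event associated with the extracted command.
        handler = self.event_handlers.get(command)
        if handler is not None:
            handler()
```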
  • In other words, the touch screen device 110 capable of executing an event based on a gesture combination maintains the command database 111, in which a plurality of predetermined gesture combinations, each formed by sequentially combining two or more gestures, and their corresponding commands are stored, and when the user enters two or more touch inputs on the touch screen at intervals within a predetermined time, it confirms the gesture combination according to the two or more touch inputs, extracts the corresponding command from the command database 111, and executes the corresponding event, thereby supporting the execution of various events based on touch input.
  • According to an embodiment of the present invention, the touch checker 112 may include a gesture combination generator 115, and the command extractor 113 may include a determiner 116 and an extractor 117.
  • Whenever the two or more touch inputs are sequentially input at intervals within the predetermined time, the gesture combination generator 115 tracks the trajectory of each touch input, generates gesture information corresponding to each touch input based on its trajectory, and generates the first gesture combination according to the two or more touch inputs based on the gesture information corresponding to each touch input.
  • When the two or more touch inputs have been sequentially input at intervals within the predetermined time and no additional touch input is received within the predetermined time, the determiner 116 may determine whether the first gesture combination exists among the plurality of predetermined gesture combinations stored in the command database 111.
  • When it is determined that the first gesture combination exists among the plurality of predetermined gesture combinations stored in the command database 111, the extractor 117 may extract the first command stored in the command database 111 in correspondence with the first gesture combination.
  • According to an embodiment of the present invention, the touch checker 112 may further include a free line display unit 118 and a free line removal unit 119.
  • Whenever the two or more touch inputs are sequentially input at intervals within the predetermined time, the free line display unit 118 tracks the trajectory of each touch input, generates a free line according to each trajectory, and displays it on the touch screen.
  • After the two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, if no additional touch input is received within the predetermined time, the free line removal unit 119 removes the free lines displayed on the touch screen.
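  • A minimal sketch of this free line behavior is shown below, assuming each trajectory is kept as a list of points and that actual rendering happens in a separate graphics layer; the class and method names are illustrative.

```python
class FreeLineOverlay:
    """Sketch of the free line display and removal units: show each touch trajectory, then clear."""

    def __init__(self):
        self.free_lines = []  # one list of (x, y) points per touch input

    def begin_touch(self):
        # Free line display unit: start a new free line when a new touch input begins.
        self.free_lines.append([])

    def add_point(self, x, y):
        # Track the trajectory of the current touch input and extend its free line.
        self.free_lines[-1].append((x, y))

    def clear(self):
        # Free line removal unit: called when no additional touch input arrives within
        # the predetermined time, i.e. the gesture combination is complete.
        self.free_lines.clear()
```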
  • FIG. 2 is a view for explaining the operation of the touch screen device 110 capable of executing an event based on a gesture combination according to an embodiment of the present invention.
  • First, the command database 111 stores a plurality of predetermined gesture combinations and a command corresponding to each of the plurality of predetermined gesture combinations, as shown in Table 1 above. When two or more touch inputs are sequentially input on the touch screen 210 at intervals within the predetermined time, the touch checker 112 may confirm the first gesture combination according to the two or more touch inputs based on the gesture corresponding to each of the two or more touch inputs.
  • In this regard, whenever the two or more touch inputs are sequentially input on the touch screen 210 at intervals within the predetermined time, the gesture combination generator 115 tracks the trajectory of each touch input and generates gesture information corresponding to each touch input based on its trajectory.
  • For example, if touch input 1 211 draws a circle on the touch screen 210, gesture information indicating a circle-drawing gesture may be generated as the gesture information corresponding to touch input 1 211 based on its trajectory, and if touch input 2 212 then moves from left to right, gesture information indicating a left-to-right gesture may be generated as the gesture information corresponding to touch input 2 212 based on its trajectory.
  • Hereinafter, the gesture of drawing a circle is referred to as "gesture 1" and the gesture of moving from left to right is referred to as "gesture 2".
  • Then, based on the information about "gesture 1" and "gesture 2", the gesture combination generator 115 may generate the first gesture combination according to touch input 1 211 and touch input 2 212; that is, it may generate "gesture combination 1", the gesture combination in which "gesture 1" and "gesture 2" are sequentially combined.
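  • As a toy illustration of turning a tracked trajectory into gesture information, the heuristic below labels a roughly closed stroke as "gesture 1" (drawing a circle) and a predominantly rightward stroke as "gesture 2" (moving from left to right). The thresholds and the approach itself are assumptions made for the sketch; the patent does not specify how gestures are recognized from trajectories.

```python
import math

def classify_gesture(trajectory):
    """Toy classifier mapping a touch trajectory (a list of (x, y) points) to a gesture label."""
    if len(trajectory) < 2:
        return None
    (x0, y0), (xn, yn) = trajectory[0], trajectory[-1]
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    width, height = max(xs) - min(xs), max(ys) - min(ys)
    closure = math.hypot(xn - x0, yn - y0)  # distance between start and end of the stroke

    if width > 0 and height > 0 and closure < 0.2 * max(width, height):
        return "gesture 1"   # stroke returns near its starting point: circle-like
    if (xn - x0) > 2 * abs(yn - y0):
        return "gesture 2"   # mostly rightward movement: left to right
    return None              # not one of the example gestures
```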
  • If the predetermined time is assumed to be 1 second, whenever touch input 1 211 and touch input 2 212 are input on the touch screen 210 at intervals within 1 second, the free line display unit 118 tracks the trajectories of touch input 1 211 and touch input 2 212, generates a free line corresponding to touch input 1 211 and a free line corresponding to touch input 2 212, and displays each free line on the touch screen 210 as shown in FIG. 2.
  • When "gesture combination 1" generated in this way is passed to the command extractor 113, the determiner 116 included in the command extractor 113 may determine whether "gesture combination 1" exists among the plurality of predetermined gesture combinations stored in the command database 111.
  • In addition, if no additional touch input is received within 1 second after touch input 1 211 and touch input 2 212 have been input on the touch screen 210 at intervals within 1 second, the free line removal unit 119 may remove all of the free lines displayed on the touch screen 210.
  • In other words, because the touch screen device 110 capable of executing an event based on a gesture combination according to the present invention includes the free line display unit 118, the user can see which touch inputs have been applied on the touch screen 210, and because it includes the free line removal unit 119, the removal of the free lines lets the user recognize that the entry of the two or more touch inputs is complete and that the corresponding command is about to be executed.
  • If it is determined that "gesture combination 1" exists in the command database 111, the extractor 117 can extract "command 1", the command stored in correspondence with "gesture combination 1", from the command database 111, and the event execution unit 114 may then execute an event according to the extracted "command 1".
  • According to an embodiment of the present invention, the gesture combination generator 115 may include: a queue storage unit 120 which, whenever the two or more touch inputs are sequentially input on the touch screen at intervals within the predetermined time, tracks the trajectory of each touch input, generates gesture information corresponding to each touch input based on its trajectory, and sequentially stores the gesture information corresponding to each touch input in a memory queue; and a combination generator 121 which, when the gesture information corresponding to each of the two or more touch inputs has been stored in the memory queue, combines the gesture information corresponding to each of the two or more touch inputs according to its storage order in the memory queue to generate the first gesture combination according to the two or more touch inputs.
  • For example, when touch input 1 211 is input on the touch screen 210, the queue storage unit 120 may generate "gesture 1", the gesture information corresponding to touch input 1 211, and store it in the memory queue; when touch input 2 212 is then input on the touch screen 210, it may generate "gesture 2", the gesture information corresponding to touch input 2 212, and additionally store it in the memory queue.
  • The combination generator 121 may then combine the information about "gesture 1" and "gesture 2" stored in the memory queue in the order "gesture 1" followed by "gesture 2", according to their storage order in the memory queue, to produce "gesture combination 1".
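  • A minimal sketch of the queue storage unit and combination generator is given below, assuming gesture information is represented by plain strings; the names are illustrative. Storing "gesture 1" and then "gesture 2" and calling build_combination() would yield ("gesture 1", "gesture 2"), which corresponds to "gesture combination 1" in Table 1.

```python
from collections import deque

class GestureQueue:
    """Sketch of the queue storage unit and the combination generator."""

    def __init__(self):
        self.memory_queue = deque()  # gesture information in the order it was stored

    def store(self, gesture):
        # Queue storage unit: append the gesture information for each touch input in turn.
        self.memory_queue.append(gesture)

    def build_combination(self):
        # Combination generator: combine the stored gesture information
        # according to its storage order to form the gesture combination.
        return tuple(self.memory_queue)
```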
  • According to an embodiment of the present invention, the touch checker 112 may further include an input cancel unit 122.
  • After two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, whenever a first cancel touch input corresponding to first preset cancel gesture information for executing a cancel command for a touch input is input at an interval within the predetermined time, the input cancel unit 122 may delete the gesture information corresponding to each of the two or more touch inputs stored in the memory queue one piece at a time, in the order opposite to the order in which it was stored in the memory queue.
  • For example, suppose that information about "gesture 1" and "gesture 2" has been sequentially stored in the memory queue and that, within 1 second after touch input 2 212 was input on the touch screen 210, a first cancel touch input corresponding to the first preset cancel gesture information for executing a cancel command for a touch input is input. In this case, the input cancel unit 122 may delete the information about "gesture 2" from the memory queue based on the first cancel touch input, and if the first cancel touch input is input once more, it may also delete the information about "gesture 1" from the memory queue.
  • If, after the information about "gesture 2" has been deleted, the user applies a new "touch input 3" on the touch screen 210, gesture information "gesture 3" corresponding to "touch input 3" may be additionally stored in the memory queue, so that the memory queue then holds gesture information for "gesture 1" and "gesture 3".
  • In this way, the touch screen device 110 capable of executing an event based on a gesture combination according to the present invention allows a touch input that has already been entered on the touch screen 210 to be canceled, so that the touch inputs used to execute an event can themselves be modified.
  • In addition, after the two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, if a second cancel touch input corresponding to second preset cancel gesture information for executing an input cancel command for all of the two or more touch inputs is input within the predetermined time, the input cancel unit 122 may delete all of the gesture information corresponding to each of the two or more touch inputs stored in the memory queue based on the second cancel touch input.
  • For example, suppose that information about "gesture 1" and "gesture 2" has been sequentially stored in the memory queue and that, within 1 second after touch input 2 212 was input on the touch screen 210, a second cancel touch input corresponding to the second preset cancel gesture information for executing a cancel command for all touch inputs is input. In this case, the input cancel unit 122 may delete both the information about "gesture 1" and the information about "gesture 2" from the memory queue based on the second cancel touch input.
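  • Continuing the memory-queue sketch above, the two cancel behaviors can be modeled as operations on the same queue: removing the most recent entry for the first cancel touch input and clearing everything for the second. Recognizing the preset cancel gestures themselves is outside this sketch, and the names are illustrative. In the example above, storing "gesture 1" and "gesture 2", calling cancel_last(), and then storing "gesture 3" leaves the queue holding "gesture 1" and "gesture 3", while cancel_all() would have emptied it.

```python
from collections import deque

class CancelableGestureQueue:
    """Sketch of the memory queue together with the input cancel unit's two operations."""

    def __init__(self):
        self.memory_queue = deque()

    def store(self, gesture):
        # Queue storage unit: gesture information is appended in input order.
        self.memory_queue.append(gesture)

    def cancel_last(self):
        # First cancel touch input: delete the most recently stored gesture information,
        # i.e. remove entries one at a time in the reverse of their storage order.
        if self.memory_queue:
            self.memory_queue.pop()

    def cancel_all(self):
        # Second cancel touch input: delete all gesture information from the memory queue.
        self.memory_queue.clear()

    def build_combination(self):
        return tuple(self.memory_queue)
```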
  • FIG. 3 is a flowchart illustrating a method of operating a touch screen device capable of executing an event based on a gesture combination.
  • In step S310, a command database storing a plurality of predetermined gesture combinations (a gesture combination meaning that two or more gestures are sequentially combined) and a command corresponding to each of the plurality of predetermined gesture combinations is maintained.
  • In step S320, when two or more touch inputs are sequentially input at intervals within a predetermined time, a first gesture combination according to the two or more touch inputs is confirmed based on the gesture corresponding to each of the two or more touch inputs.
  • In step S330, a first command stored corresponding to the first gesture combination is extracted from the command database, and an event according to the extracted first command is then executed.
  • According to an embodiment of the present invention, step S320 may include: whenever the two or more touch inputs are sequentially input at intervals within the predetermined time, tracking the trajectory of each touch input and generating gesture information corresponding to each touch input based on its trajectory; and generating the first gesture combination according to the two or more touch inputs based on the gesture information corresponding to each touch input.
  • According to an embodiment of the present invention, step S330 may include: if the two or more touch inputs have been sequentially input at intervals within the predetermined time and no additional touch input is received within the predetermined time, determining whether the first gesture combination exists among the plurality of predetermined gesture combinations stored in the command database; and, if it is determined that the first gesture combination exists among the plurality of predetermined gesture combinations stored in the command database, extracting the first command stored in the command database in correspondence with the first gesture combination.
  • According to an embodiment of the present invention, step S320 may further include: each time the two or more touch inputs are sequentially input at intervals within the predetermined time, tracking the trajectory of each touch input, generating a free line according to each trajectory, and displaying it on the touch screen; and, after the two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, if no additional touch input is received within the predetermined time, removing the free lines displayed on the touch screen.
  • According to an embodiment of the present invention, the generating of the first gesture combination may include: whenever the two or more touch inputs are sequentially input on the touch screen at intervals within the predetermined time, tracking the trajectory of each touch input, generating gesture information corresponding to each touch input based on its trajectory, and sequentially storing the gesture information corresponding to each touch input in a memory queue; and, when the gesture information corresponding to each of the two or more touch inputs has been stored in the memory queue, combining the gesture information corresponding to each of the two or more touch inputs according to its storage order in the memory queue to generate the first gesture combination according to the two or more touch inputs.
  • According to an embodiment of the present invention, step S320 may further include: after the two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, whenever a first cancel touch input corresponding to first preset cancel gesture information for executing a cancel command for a touch input is input at an interval within the predetermined time, deleting the gesture information corresponding to each of the two or more touch inputs stored in the memory queue one piece at a time, in the order opposite to the order in which it was stored in the memory queue.
  • According to an embodiment of the present invention, step S320 may further include: after the two or more touch inputs have been sequentially input on the touch screen at intervals within the predetermined time, if a second cancel touch input corresponding to second preset cancel gesture information for executing an input cancel command for all of the two or more touch inputs is input within the predetermined time, deleting all of the gesture information corresponding to each of the two or more touch inputs stored in the memory queue based on the second cancel touch input.
  • The operating method of the touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention has been described above with reference to FIG. 3. Since this operating method may correspond to the configuration of the touch screen device 110 capable of executing an event based on a gesture combination described with reference to FIGS. 1 and 2, a more detailed description thereof is omitted.
  • The operating method of a touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention may be implemented as a computer program stored in a storage medium and executed in combination with a computer.
  • The operating method of a touch screen device capable of executing an event based on a gesture combination according to an embodiment of the present invention may also be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine code generated by a compiler but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment of the present invention, a touch screen device capable of executing an event based on a gesture combination, and an operating method thereof, allow a command database to be maintained in which a plurality of predetermined gesture combinations, each having two or more gestures combined in sequence, and the commands corresponding to the predetermined gesture combinations are stored; and, when two or more touch inputs are entered on a touch screen by a user at intervals within a predetermined time, a gesture combination according to the two or more touch inputs to be confirmed and then a command corresponding to the gesture combination to be extracted from the command database, such that an event corresponding to the extracted command is executed.
PCT/KR2016/008712 2015-09-17 2016-08-08 Touch screen device capable of executing an event based on a gesture combination and operating method thereof WO2017047929A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/529,986 US10540088B2 (en) 2015-09-17 2016-08-08 Touch screen device capable of executing event based on gesture combination and operating method thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2015-0131469 2015-09-17
KR20150131469 2015-09-17
KR10-2015-0162259 2015-11-19
KR1020150162259A KR101718070B1 (ko) 2015-09-17 2015-11-19 Touch screen device capable of executing an event based on a gesture combination and operating method thereof

Publications (1)

Publication Number Publication Date
WO2017047929A1 true WO2017047929A1 (fr) 2017-03-23

Family

ID=58289056

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/008712 WO2017047929A1 (fr) Touch screen device capable of executing an event based on a gesture combination and operating method thereof

Country Status (1)

Country Link
WO (1) WO2017047929A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100020025A1 (en) * 2008-07-25 2010-01-28 Intuilab Continuous recognition of multi-touch gestures
KR20140069359A (ko) * 2009-06-10 2014-06-09 닛본 덴끼 가부시끼가이샤 전자 기기, 제스처 처리 방법, 및 제스처 처리 프로그램
KR101275040B1 (ko) * 2012-12-06 2013-06-17 주식회사 한글과컴퓨터 자유선 입력 기반의 전자 문서 구동 장치 및 방법
KR20140083303A (ko) * 2012-12-26 2014-07-04 전자부품연구원 멀티 포인트 터치를 이용한 사용자 인터페이스 제공 방법 및 이를 위한 장치
KR20140083302A (ko) * 2012-12-26 2014-07-04 전자부품연구원 투 포인트 터치를 이용한 사용자 인터페이스 제공 방법 및 이를 위한 장치

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112506379A (zh) * 2020-12-21 2021-03-16 北京百度网讯科技有限公司 Touch event processing method, apparatus, device, and storage medium

Similar Documents

Publication Publication Date Title
  • WO2013141464A1 (fr) Touch input control method
  • WO2013125914A1 (fr) Method and apparatus for adjusting the size of an object on a screen
  • WO2013125804A1 (fr) Method and apparatus for moving content in a terminal
  • WO2014065499A1 (fr) Editing method based on defining a block of text through multiple touches
  • WO2011081371A1 (fr) Password processing method and apparatus
  • EP2673701A2 (fr) Information display apparatus having at least two touch screens and associated information display method
  • AU2012214924A1 (en) Information display apparatus having at least two touch screens and information display method thereof
  • KR101718070B1 (ko) Touch screen device capable of executing an event based on a gesture combination and operating method thereof
  • WO2012153914A1 (fr) Method and apparatus for providing a graphical user interface with an item deletion function
  • WO2014035041A1 (fr) Interaction method and interaction device for integrating augmented reality technology and bulk data
  • WO2015174597A1 (fr) Voice-controlled image display device and voice control method for an image display device
  • WO2016080596A1 (fr) Method and system for providing a prototyping tool, and non-transitory computer-readable recording medium
  • WO2014148689A1 (fr) Display device for capturing digital content and control method thereof
  • WO2013005901A1 (fr) Apparatus and method for inputting characters on a touch screen
  • WO2014003448A1 (fr) Terminal device and control method thereof
  • WO2011081354A2 (fr) Korean input method and apparatus using a touch screen, and portable terminal including a key input apparatus
  • WO2015167072A1 (fr) Digital device providing touch rejection and control method therefor
  • WO2017164584A1 (fr) HMD device capable of gesture-based user authentication and gesture-based user authentication method for an HMD device
  • WO2017047929A1 (fr) Touch screen device capable of executing an event based on a gesture combination and operating method thereof
  • WO2014003276A1 (fr) Device and method for inputting Chinese words
  • WO2017047931A1 (fr) Touch screen device for moving or copying an object on the basis of a touch input, and operating method thereof
  • WO2018070657A1 (fr) Electronic apparatus and display apparatus
  • WO2014168385A1 (fr) Method for selecting character data, and electronic device for processing the same
  • WO2021025369A1 (fr) Method, device, program, and computer-readable recording medium for controlling interaction scrolling
  • WO2017047930A1 (fr) Touch screen device capable of selectively inputting a free line and method for supporting selective free line input of a touch screen device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16846750

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16846750

Country of ref document: EP

Kind code of ref document: A1